Remote sensing as a source of land cover information utilized in the universal soil loss equation
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.; Scarpace, F. L.
1979-01-01
In this study, methods for gathering the land use/land cover information required by the USLE were investigated with medium altitude, multi-date color and color infrared 70-mm positive transparencies using human and computer-based interpretation techniques. Successful results, which compare favorably with traditional field study methods, were obtained within the test site watershed with airphoto data sources and human airphoto interpretation techniques. Computer-based interpretation techniques were not capable of identifying soil conservation practices but were successful to varying degrees in gathering other types of desired land use/land cover information.
Infusing Counseling Skills in Test Interpretation.
ERIC Educational Resources Information Center
Rawlins, Melanie E.; And Others
1991-01-01
Presents an instructional model based on Neurolinguistic Programming that links counseling student course work in measurement and test interpretation with counseling techniques and theory. A process incorporating Neurolinguistic Programming patterns is outlined for teaching graduate students the counseling skills helpful in test interpretation.…
Acting the Intangible: Hints of Politeness in Non-Verbal Form
ERIC Educational Resources Information Center
Jumanto, Jumanto; Rizal, Sarif Syamsu; Nugroho, Raden Arief
2017-01-01
This review paper explores politeness in non-verbal form to arrive at hints that indicate the underlying ideology. Non-verbal politeness is researched by reviewing verbal politeness theories through interpretive techniques; the data, in the form of hints interpreted from the reviews, are then analyzed using a coding technique. The six…
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Pen-based Interfaces for Engineering and Education
NASA Astrophysics Data System (ADS)
Stahovich, Thomas F.
Sketches are an important problem-solving tool in many fields. This is particularly true of engineering design, where sketches facilitate creativity by providing an efficient medium for expressing ideas. However, despite the importance of sketches in engineering practice, current engineering software still relies on traditional mouse and keyboard interfaces, with little or no capabilities to handle free-form sketch input. With recent advances in machine-interpretation techniques, it is now becoming possible to create practical interpretation-based interfaces for such software. In this chapter, we report on our efforts to create interpretation techniques to enable pen-based engineering applications. We describe work on two fundamental sketch understanding problems. The first is sketch parsing, the task of clustering pen strokes or geometric primitives into individual symbols. The second is symbol recognition, the task of classifying symbols once they have been located by a parser. We have used the techniques that we have developed to construct several pen-based engineering analysis tools. These are used here as examples to illustrate our methods. We have also begun to use our techniques to create pen-based tutoring systems that scaffold students in solving problems in the same way they would ordinarily solve them with paper and pencil. The chapter concludes with a brief discussion of these systems.
Modeling and managing risk early in software development
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.
1993-01-01
In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
Data selection techniques in the interpretation of MAGSAT data over Australia
NASA Technical Reports Server (NTRS)
Johnson, B. D.; Dampney, C. N. G.
1983-01-01
The MAGSAT data require critical selection in order to produce a self-consistent data set suitable for map construction and subsequent interpretation. Interactive data selection techniques are described which involve the use of a special-purpose profile-oriented data base and a colour graphics display. The careful application of these data selection techniques permits validation of every data value and ensures that the best possible self-consistent data set is used to construct the maps of the magnetic field measured at satellite altitudes over Australia.
Role Of Synchrotron Techniques In USEPA Regulatory And Remediation Decisions
Science provides the foundation for credible decision making. Science is observation followed by an interpretation and understanding of the result of the measurement. Observations may not be correct, complete, or fully descriptive of the phenomena. Interpretation based on avai...
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar (SAR) data has been attempted in the present study for mapping various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area have been enhanced using a Fast Fourier Transform (FFT) based filtering approach and, separately, a Frost filtering technique. Both enhanced SAR images have then been fused with a histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. The Feature-oriented Principal Components Selection (FPCS) technique has subsequently been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been used for change detection analysis of the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas were identified that warrant further detailed ground-based geological study.
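As a rough illustration of the PCA-based fusion step described above, the sketch below substitutes a statistics-matched SAR band for the first principal component of a co-registered multispectral image and inverts the transform. This is a minimal numpy sketch of generic PCA image fusion, not the authors' exact processing chain; array shapes, the moment-matching shortcut, and all parameters are assumptions.

    # Minimal sketch of PCA-based image fusion (not the paper's exact chain):
    # the first principal component of the multispectral bands is replaced by
    # a moment-matched SAR image before inverting the transform.
    import numpy as np

    def pca_fuse(ms, sar):
        """ms: (rows, cols, 3) multispectral array; sar: (rows, cols) co-registered SAR."""
        r, c, b = ms.shape
        X = ms.reshape(-1, b).astype(float)
        mean = X.mean(axis=0)
        Xc = X - mean
        cov = np.cov(Xc, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1]          # principal components, largest first
        V = eigvecs[:, order]
        pcs = Xc @ V
        s = sar.reshape(-1).astype(float)
        # match SAR mean/variance to PC1 so the substitution preserves dynamics
        s = (s - s.mean()) / (s.std() + 1e-12) * pcs[:, 0].std() + pcs[:, 0].mean()
        pcs[:, 0] = s
        fused = pcs @ V.T + mean
        return fused.reshape(r, c, b)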
NASA Astrophysics Data System (ADS)
Lewis, Donna L.; Phinn, Stuart
2011-01-01
Aerial photography interpretation is the most common mapping technique in the world. However, unlike an algorithm-based classification of satellite imagery, accuracy of aerial photography interpretation generated maps is rarely assessed. Vegetation communities covering an area of 530 km2 on Bullo River Station, Northern Territory, Australia, were mapped using an interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4 year period (2006 to 2009) and the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps was 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and accuracy assessment of aerial photography interpretation generated vegetation community maps.
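The overall accuracy and Kappa coefficient quoted above come from standard error (confusion) matrices; the sketch below shows the usual computation. The 3-class matrix in the example is illustrative only, not the paper's data.

    import numpy as np

    def overall_accuracy_and_kappa(conf):
        """conf[i, j]: number of sites mapped as class i that are class j in the field."""
        conf = np.asarray(conf, dtype=float)
        n = conf.sum()
        po = np.trace(conf) / n                                    # observed (overall) accuracy
        pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n**2    # chance agreement
        kappa = (po - pe) / (1 - pe)
        return po, kappa

    # Illustrative 3-class matrix (not the paper's data):
    acc, kappa = overall_accuracy_and_kappa([[40, 5, 5], [4, 30, 6], [2, 3, 25]])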
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Kumar, Pramod; Rani, Raj; Turbelin, Grégory
2017-04-01
The study highlights a theoretical comparison and various interpretations of a recent inversion technique, called renormalization, developed for the reconstruction of unknown tracer emissions from their measured concentrations. The comparative interpretations are presented in relation to the other inversion techniques based on principle of regularization, Bayesian, minimum norm, maximum entropy on mean, and model resolution optimization. It is shown that the renormalization technique can be interpreted in a similar manner to other techniques, with a practical choice of a priori information and error statistics, while eliminating the need of additional constraints. The study shows that the proposed weight matrix and weighted Gram matrix offer a suitable deterministic choice to the background error and measurement covariance matrices, respectively, in the absence of statistical knowledge about background and measurement errors. The technique is advantageous since it (i) utilizes weights representing a priori information apparent to the monitoring network, (ii) avoids dependence on background source estimates, (iii) improves on alternative choices for the error statistics, (iv) overcomes the colocalization problem in a natural manner, and (v) provides an optimally resolved source reconstruction. A comparative illustration of source retrieval is made by using the real measurements from a continuous point release conducted in Fusion Field Trials, Dugway Proving Ground, Utah.
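For orientation, the weighted minimum-norm family that the renormalization technique is compared against has the schematic closed form below. The notation (linear model relating measurements to sources, diagonal weight matrix carrying the a priori information, weighted Gram matrix) is assumed here for illustration and is not quoted from the paper.

    % Schematic weighted minimum-norm estimate (assumed notation):
    % measurements \mu, unknown source field s, linear model \mu = G s,
    % diagonal weight matrix W encoding a priori information,
    % G W G^{\top} the weighted Gram matrix.
    \hat{s} \;=\; W G^{\top} \left( G\, W\, G^{\top} \right)^{-1} \mu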
Characterizing multivariate decoding models based on correlated EEG spectral features
McFarland, Dennis J.
2013-01-01
Objective: Multivariate decoding methods are popular techniques for analysis of neurophysiological data. The present study explored potential interpretative problems with these techniques when predictors are correlated. Methods: Data from sensorimotor rhythm-based cursor control experiments was analyzed offline with linear univariate and multivariate models. Features were derived from autoregressive (AR) spectral analysis of varying model order which produced predictors that varied in their degree of correlation (i.e., multicollinearity). Results: The use of multivariate regression models resulted in much better prediction of target position as compared to univariate regression models. However, with lower order AR features interpretation of the spectral patterns of the weights was difficult. This is likely to be due to the high degree of multicollinearity present with lower order AR features. Conclusions: Care should be exercised when interpreting the pattern of weights of multivariate models with correlated predictors. Comparison with univariate statistics is advisable. Significance: While multivariate decoding algorithms are very useful for prediction their utility for interpretation may be limited when predictors are correlated. PMID:23466267
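A minimal numpy sketch of the multicollinearity issue flagged in the conclusions: two nearly collinear predictors yield a huge condition number and individually unstable multivariate weights, while univariate correlations remain easy to read. The data are synthetic and illustrative; nothing here reproduces the study's AR-feature pipeline.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    latent = rng.normal(size=n)
    # two nearly collinear "spectral" predictors plus noise (illustrative only)
    X = np.column_stack([latent + 0.05 * rng.normal(size=n),
                         latent + 0.05 * rng.normal(size=n)])
    y = latent + 0.1 * rng.normal(size=n)

    # multivariate least-squares weights: individually unstable under collinearity
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    cond = np.linalg.cond(X.T @ X)                 # large value => multicollinearity

    # univariate statistics remain easy to interpret
    r = [np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])]
    print(w, cond, r)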
Ball, Lyndsay B.; Kress, Wade H.; Steele, Gregory V.; Cannia, James C.; Andersen, Michael J.
2006-01-01
In the North Platte River Basin, a ground-water model is being developed to evaluate the effectiveness of using water leakage from selected irrigation canal systems to enhance ground-water recharge. The U.S. Geological Survey, in cooperation with the North Platte Natural Resources District, used land-based capacitively coupled and water-borne direct-current continuous resistivity profiling techniques to map the lithology of the upper 8 meters and to interpret the relative canal leakage potential of 110 kilometers of the Interstate and Tri-State Canals in western Nebraska and eastern Wyoming. Lithologic descriptions from 25 test holes were used to evaluate the effectiveness of both techniques for indicating relative grain size. An interpretive color scale was developed that symbolizes contrasting resistivity features indicative of different grain-size categories. The color scale was applied to the vertically averaged resistivity and used to classify areas of the canals as having either high, moderate, or low canal leakage potential. When results were compared with the lithologic descriptions, both land-based and water-borne continuous resistivity profiling techniques were determined to be effective at differentiating coarse-grained from fine-grained sediment. Both techniques were useful for producing independent, similar interpretations of canal leakage potential.
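The interpretive color scale amounts to thresholding the vertically averaged resistivity into leakage classes (coarse-grained, resistive sediment leaks more readily than fine-grained, conductive sediment). A toy sketch follows; the cutoff values are invented placeholders, since the study's scale was calibrated against test-hole lithology.

    def leakage_class(avg_resistivity_ohm_m, low_cut=30.0, high_cut=100.0):
        """Illustrative thresholds only; the study's interpretive color scale was
        calibrated against lithologic descriptions from 25 test holes."""
        if avg_resistivity_ohm_m < low_cut:
            return "low leakage potential (fine-grained)"
        if avg_resistivity_ohm_m > high_cut:
            return "high leakage potential (coarse-grained)"
        return "moderate leakage potential"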
A 'digital' technique for manual extraction of data from aerial photography
NASA Technical Reports Server (NTRS)
Istvan, L. B.; Bondy, M. T.
1977-01-01
The interpretation procedure described uses a grid-cell approach in which a random point is also located in each cell. The procedure requires that the cell/point grid be established on a base map, and that identical grids be made to precisely match the scale of the photographic frames. The grid is then positioned on the photography by visual alignment to obvious features. Several alignments on one frame are sometimes required to make a precise match of all points to be interpreted. This system inherently corrects for distortions in the photography. Interpretation is then done cell by cell. In order to meet the time constraints, first-order interpretation should be maintained. The data are put onto coding forms, along with other appropriate data, if desired. This 'digital' manual interpretation technique has proven efficient and both time- and cost-effective, while meeting strict requirements for data format and accuracy.
Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs
NASA Technical Reports Server (NTRS)
Raimondi, Franco; Lomuscio, Alessio
2004-01-01
We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDDs, as they offer a compact and efficient representation for boolean formulae.
Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.
Jiménez, Fernando; Sánchez, Gracia; Juárez, José M
2014-03-01
This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization over a patient data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA), and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), obtaining with ENORA a classification rate of 0.9298, specificity of 0.9385, and sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers, compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is non-combinational, based on real-parameter optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature based on combinational optimization. Copyright © 2014 Elsevier B.V. All rights reserved.
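The core multi-objective idea, retaining only classifiers not dominated in both accuracy (to maximize) and rule count (to minimize), can be sketched in a few lines. This is a generic Pareto-front filter, not the ENORA or NSGA-II implementations used in the paper.

    def pareto_front(candidates):
        """candidates: list of (accuracy, n_rules); maximize accuracy, minimize rules."""
        front = []
        for a, r in candidates:
            dominated = any(a2 >= a and r2 <= r and (a2 > a or r2 < r)
                            for a2, r2 in candidates)
            if not dominated:
                front.append((a, r))
        return front

    # e.g. pareto_front([(0.93, 14), (0.90, 8), (0.91, 20), (0.88, 8)])
    # -> [(0.93, 14), (0.90, 8)]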
Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2016-01-01
The recently introduced Orbitrap Fusion mass spectrometer permits various types of MS2 acquisition methods. To date, these different MS2 strategies and the optimal data interpretation approach for each have not been adequately evaluated. This study comprehensively investigated four MS2 strategies: HCD-OT (higher-energy collisional dissociation with Orbitrap detection), HCD-IT (HCD with ion trap, IT), CID-IT (collision-induced dissociation with IT) and CID-OT on the Orbitrap Fusion. To achieve an extensive comparison and identify the optimal data interpretation method for each technique, several search engines (SEQUEST and Mascot) and post-processing methods (score-based, PeptideProphet, and Percolator) were assessed for all techniques in the analysis of a human cell proteome. It was found that divergent conclusions could be drawn from the same dataset when different data interpretation approaches were used, underscoring the need for a fair comparison among techniques. Percolator was chosen for the comparison of techniques because it performed best across all search engines and MS2 strategies. For the analysis of the human cell proteome using individual MS2 strategies, the highest number of identifications was achieved by HCD-OT, followed by HCD-IT and CID-IT. Based on these results, we concluded that a relatively fair platform for data interpretation is necessary to avoid divergent conclusions from the same dataset, and that HCD-OT and HCD-IT may be preferable for protein/peptide identification using the Orbitrap Fusion. PMID:27472422
What defines an Expert? - Uncertainty in the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Bond, C. E.
2008-12-01
Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice, and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based, for example, on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance for effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data, and the participants' prior experience have been combined and analysed to answer the question: who is, and what defines, an expert?
On the Power of Abstract Interpretation
NASA Technical Reports Server (NTRS)
Reddy, Uday S.; Kamin, Samuel N.
1991-01-01
Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code is altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Sekar et al. Hence, our proofs are much simpler and should be useful for future improvements.
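A toy illustration of strictness analysis as abstract interpretation over a totally ordered domain, the first-order setting discussed above: values are abstracted to the two-point domain {0, 1} with 0 <= 1, where 0 means "definitely bottom (divergent)" and 1 means "possibly defined". The abstract operators and the strictness test below are minimal renderings of the standard textbook construction, not the paper's formalism.

    # Toy strictness analysis over the two-point domain {0, 1} with 0 <= 1.
    # A function is strict in an argument if feeding abstract bottom in that
    # position forces abstract bottom out.

    def abs_plus(a, b):          # strict in both arguments
        return min(a, b)

    def abs_const(a):            # ignores its argument: not strict
        return 1

    def abs_cond(c, t, e):       # strict in the condition, joins the branches
        return min(c, max(t, e))

    def is_strict(f, arity, pos):
        args = [1] * arity
        args[pos] = 0
        return f(*args) == 0

    print(is_strict(abs_plus, 2, 0))   # True
    print(is_strict(abs_const, 1, 0))  # False
    print(is_strict(abs_cond, 3, 1))   # False: only one branch may be taken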
Minervini, Andrea; Carini, Marco; Uzzo, Robert G; Campi, Riccardo; Smaldone, Marc C; Kutikov, Alexander
2014-11-01
A standardized reporting system of nephron-sparing surgery resection techniques is lacking. The surface-intermediate-base scoring system represents a formal reporting instrument to assist in interpretation of reported data and to facilitate comparisons in the urologic literature. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Quantitative non-destructive evaluation of composite materials based on ultrasonic wave propagation
NASA Technical Reports Server (NTRS)
Miller, J. G.
1986-01-01
The application and interpretation of specific ultrasonic nondestructive evaluation techniques are studied. The Kramers-Kronig or generalized dispersion relationships are applied to nondestructive techniques. Progress was made on an improved determination of material properties of composites inferred from elastic constant measurements.
Computer enhancement through interpretive techniques
NASA Technical Reports Server (NTRS)
Foster, G.; Spaanenburg, H. A. E.; Stumpf, W. E.
1972-01-01
The improvement in the usage of the digital computer through interpretation rather than compilation of higher-order languages was investigated by studying the efficiency of coding and execution of programs written in FORTRAN, ALGOL, PL/I, and COBOL. FORTRAN was selected as the high-level language for examining programs which were compiled, and A Programming Language (APL) was chosen for the interpretive language. It is concluded that APL is competitive, not because it and the algorithms being executed are well written, but rather because batch processing is less efficient than has been admitted. There is not a broad base of experience founded on trying different implementation strategies targeted at open competition with traditional processing methods.
Simulation of target interpretation based on infrared image features and psychology principle
NASA Astrophysics Data System (ADS)
Lin, Wei; Chen, Yu-hua; Gao, Hong-sheng; Wang, Zhan-feng; Wang, Ji-jun; Su, Rong-hua; Huang, Yan-ping
2009-07-01
Target feature extraction and identification is an important and complicated process in target interpretation: it directly affects the interpreter's psychosensory response to the target infrared image and ultimately decides target viability. Using statistical decision theory and psychological principles, and designing four psychophysical experiments, an interpretation model of the infrared target is established. The model obtains the target detection probability by calculating the degree of similarity of four features between target and background regions delineated on the infrared image. Verified against a large number of practical target interpretations, the model can effectively simulate the target interpretation and detection process and produce objective interpretation results, which can provide technical support for target extraction, identification, and decision-making.
Application of AIS Technology to Forest Mapping
NASA Technical Reports Server (NTRS)
Yool, S. R.; Star, J. L.
1985-01-01
Concerns about the environmental effects of large-scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation is required to form a basis for the development of new techniques and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest has demonstrated the complexity of relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation, based on its relatively high spectral resolution.
An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska
Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.
2009-01-01
Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentration by each analytical method along the 2007 traverse. A simple element-versus-method matrix, populated with values based on the significance classification, provides a means of ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate several possible interpretations of the data.
Cerebral microbleeds: a guide to detection and interpretation.
Greenberg, Steven M; Vernooij, Meike W; Cordonnier, Charlotte; Viswanathan, Anand; Al-Shahi Salman, Rustam; Warach, Steven; Launer, Lenore J; Van Buchem, Mark A; Breteler, Monique Mb
2009-02-01
Cerebral microbleeds (CMBs) are increasingly recognised neuroimaging findings in individuals with cerebrovascular disease and dementia, and in normal ageing. There has been substantial progress in the understanding of CMBs in recent years, particularly in the development of newer MRI methods for the detection of CMBs and the application of these techniques to population-based samples of elderly people. In this Review, we focus on these recent developments and their effects on two main questions: how CMBs are detected, and how CMBs should be interpreted. The number of CMBs detected depends on MRI characteristics, such as pulse sequence, sequence parameters, spatial resolution, magnetic field strength, and image post-processing, emphasising the importance of taking into account MRI technique in the interpretation of study results. Recent investigations with sensitive MRI techniques have indicated a high prevalence of CMBs in community-dwelling elderly people. We propose a procedural guide for identification of CMBs and suggest possible future approaches for elucidating the role of these common lesions as markers for, and contributors to, small-vessel brain disease.
NASA Astrophysics Data System (ADS)
Bauer, K.; Muñoz, G.; Moeck, I.
2012-12-01
The combined interpretation of different models as derived from seismic tomography and magnetotelluric (MT) inversion represents a more efficient approach to determine the lithology of the subsurface compared with the separate treatment of each discipline. Such models can be developed independently or by application of joint inversion strategies. After the step of model generation using different geophysical methodologies, a joint interpretation work flow includes the following steps: (1) adjustment of a joint earth model based on the adapted, identical model geometry for the different methods, (2) classification of the model components (e.g. model blocks described by a set of geophysical parameters), and (3) re-mapping of the classified rock types to visualise their distribution within the earth model, and petrophysical characterization and interpretation. One possible approach for the classification of multi-parameter models is based on statistical pattern recognition, where different models are combined and translated into probability density functions. Classes of rock types are identified in these methods as isolated clusters with high probability density function values. Such techniques are well-established for the analysis of two-parameter models. Alternatively we apply self-organizing map (SOM) techniques, which have no limitations in the number of parameters to be analysed in the joint interpretation. Our SOM work flow includes (1) generation of a joint earth model described by so-called data vectors, (2) unsupervised learning or training, (3) analysis of the feature map by adopting image processing techniques, and (4) application of the knowledge to derive a lithological model which is based on the different geophysical parameters. We show the usage of the SOM work flow for a synthetic and a real data case study. Both tests rely on three geophysical properties: P velocity and vertical velocity gradient from seismic tomography, and electrical resistivity from MT inversion. The synthetic data are used as a benchmark test to demonstrate the performance of the SOM method. The real data were collected along a 40 km profile across parts of the NE German basin. The lithostratigraphic model from the joint SOM interpretation consists of eight litho-types and covers Cenozoic, Mesozoic and Paleozoic sediments down to 5 km depth. There is a remarkable agreement between the SOM based model and regional marker horizons interpolated from surrounding 2D industrial seismic data. The most interesting results include (1) distinct properties of the Jurassic (low P velocity gradients, low resistivities) interpreted as the signature of shaly clastics, and (2) a pattern within the Upper Permian Zechstein with decreased resistivities and increased P velocities within the salt depressions on the one hand, and increased resistivities and decreased P velocities in the salt pillows on the other hand. In our interpretation this pattern is related with flow of less dense salt matrix components into the pillows and remaining brittle evaporites within the depressions.
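A schematic rendering of steps (1)-(3) of the SOM workflow for three-parameter data vectors (e.g. P velocity, vertical velocity gradient, log resistivity): a small numpy self-organizing map is trained, after which map nodes would be clustered into litho-types and mapped back onto the earth model. Grid size, learning schedule, and all other parameters are assumptions, not values from the study.

    import numpy as np

    def train_som(data, grid=(8, 8), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
        """data: (n_samples, n_features) vectors, e.g. (Vp, dVp/dz, log resistivity)."""
        rng = np.random.default_rng(seed)
        h, w = grid
        nodes = rng.normal(size=(h, w, data.shape[1]))
        yy, xx = np.mgrid[0:h, 0:w]
        for t in range(iters):
            x = data[rng.integers(len(data))]
            d = np.linalg.norm(nodes - x, axis=2)
            bi, bj = np.unravel_index(np.argmin(d), (h, w))   # best-matching unit
            frac = t / iters
            lr = lr0 * (1 - frac)                             # decaying learning rate
            sigma = sigma0 * (1 - frac) + 0.5                 # shrinking neighbourhood
            g = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * sigma ** 2))
            nodes += lr * g[..., None] * (x - nodes)          # pull neighbourhood toward x
        return nodes

    # Afterwards, cluster the node weights (e.g. by k-means or U-matrix analysis)
    # into litho-type classes and assign each earth-model cell the class of its
    # best-matching node.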
Determination of the Conservation Time of Periodicals for Optimal Shelf Maintenance of a Library.
ERIC Educational Resources Information Center
Miyamoto, Sadaaki; Nakayama, Kazuhiko
1981-01-01
Presents a method based on a constrained optimization technique that determines the time of removal of scientific periodicals from the shelf of a library. A geometrical interpretation of the theoretical result is given, and a numerical example illustrates how the technique is applicable to real bibliographic data. (FM)
An evaluation of consensus techniques for diagnostic interpretation
NASA Astrophysics Data System (ADS)
Sauter, Jake N.; LaBarre, Victoria M.; Furst, Jacob D.; Raicu, Daniela S.
2018-02-01
Learning diagnostic labels from image content has been the standard in computer-aided diagnosis. Most computer-aided diagnosis systems use low-level image features extracted directly from image content to train and test machine learning classifiers for diagnostic label prediction. When the ground truth for the diagnostic labels is not available, reference truth is generated from the experts' diagnostic interpretations of the image/region of interest. More specifically, when the label is uncertain, e.g. when multiple experts label an image and their interpretations are different, techniques to handle the label variability are necessary. In this paper, we compare three consensus techniques that are typically used to encode the variability in the experts' labeling of the medical data: mean, median and mode, and their effects on simple classifiers that can handle deterministic labels (decision trees) and probabilistic vectors of labels (belief decision trees). Given that the NIH/NCI Lung Image Database Consortium (LIDC) data provides interpretations for lung nodules by up to four radiologists, we leverage the LIDC data to evaluate and compare these consensus approaches when creating computer-aided diagnosis systems for lung nodules. First, low-level image features of nodules are extracted and paired with their radiologists' semantic ratings (1 = most likely benign, …, 5 = most likely malignant); second, machine learning multi-class classifiers that handle deterministic labels (decision trees) and probabilistic vectors of labels (belief decision trees) are built to predict the lung nodules' semantic ratings. We show that the mean-based consensus generates the most robust classifier overall when compared to the median- and mode-based consensus. Lastly, the results of this study show that, when building CAD systems with uncertain diagnostic interpretations, it is important to evaluate different strategies for encoding and predicting the diagnostic label.
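The three consensus rules compared in the paper reduce, per nodule, to simple statistics over the radiologists' 1-5 ratings. A minimal sketch (illustrative ratings, not LIDC data):

    import statistics

    def consensus(ratings):
        """ratings: per-nodule malignancy scores from up to four radiologists (1..5)."""
        return {
            "mean": statistics.fmean(ratings),     # may be fractional
            "median": statistics.median(ratings),
            "mode": statistics.mode(ratings),      # returns first mode on ties (Python 3.8+)
        }

    print(consensus([3, 4, 4, 5]))   # {'mean': 4.0, 'median': 4.0, 'mode': 4}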
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.
Shattuck-Hufnagel, S.; Choi, J. Y.; Moro-Velázquez, L.; Gómez-García, J. A.
2017-01-01
Although a large number of acoustic indicators have already been proposed in the literature to evaluate the hypokinetic dysarthria of people with Parkinson's Disease, the goal of this work is to identify and interpret new reliable and complementary articulatory biomarkers that could be applied to predict/evaluate Parkinson's Disease from a diadochokinetic test, contributing to the possibility of a further multidimensional analysis of the speech of parkinsonian patients. The new biomarkers proposed are based on the kinetic behaviour of the envelope trace, which is directly linked with the articulatory dysfunctions introduced by the disease from the early stages. The interest of these new articulatory indicators lies in their ease of identification and interpretation, and in their potential to be translated into computer-based automatic methods to screen the disease from the speech. Throughout this paper, the accuracy provided by these acoustic kinetic biomarkers is compared with that obtained with a baseline system based on speaker identification techniques. Results show accuracies around 85%, in line with those obtained with complex state-of-the-art speaker recognition techniques, but with an easier physical interpretation, which opens the possibility of transfer to a clinical setting. PMID:29240814
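One plausible way to obtain the envelope trace on which such kinetic biomarkers are built is the Hilbert-transform amplitude envelope, low-passed to the syllable rate. The sketch below assumes scipy and a 10 Hz cutoff; it is not the paper's exact feature definition.

    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    def amplitude_envelope(x, fs, cutoff_hz=10.0):
        """Smooth amplitude envelope of a speech signal x sampled at fs Hz."""
        env = np.abs(hilbert(x))                      # analytic-signal magnitude
        b, a = butter(4, cutoff_hz / (fs / 2))        # low-pass to syllable rate
        return filtfilt(b, a, env)

    # Kinetic descriptors of the envelope (velocity, acceleration) can then be
    # taken as finite differences, e.g. np.diff(env) * fs.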
Effects of Interpretation as a Counseling Technique
ERIC Educational Resources Information Center
Helner, Philip A.; Jessell, John C.
1974-01-01
This research was an inquiry into the effects of interpretation in counseling. The feelings of subjects toward interpretation were compared with their feelings toward the techniques of reflection, advice giving, and probing. The implications of the use of interpretation in counseling are discussed. (Author)
Hepatocellular carcinoma: Advances in diagnostic imaging.
Sun, Haoran; Song, Tianqiang
2015-10-01
Thanks to growing knowledge of the biological behavior of hepatocellular carcinoma (HCC), as well as continuous improvement in imaging techniques and experienced interpretation of the imaging features of nodules in the cirrhotic liver, the detection and characterization of HCC have improved in the past decade. A number of practice guidelines for imaging diagnosis have been developed to reduce interpretation variability and standardize management of HCC, and they are constantly updated with advances in imaging techniques and evidence-based data from clinical series. In this article, we strive to review the imaging techniques and the characteristic features of hepatocellular carcinoma associated with the cirrhotic liver, with emphasis on the diagnostic value of advanced magnetic resonance imaging (MRI) techniques and the utilization of hepatocyte-specific MRI contrast agents. We also briefly describe the concept of liver imaging reporting and data systems and discuss the consensus and controversy of major practice guidelines.
Dhurjad, Pooja Sukhdev; Marothu, Vamsi Krishna; Rathod, Rajeshwari
2017-08-01
Metabolite identification is a crucial part of the drug discovery process. LC-MS/MS-based metabolite identification has gained widespread use, but the data acquired by the LC-MS/MS instrument is complex, and thus the interpretation of data becomes troublesome. Fortunately, advancements in data mining techniques have simplified the process of data interpretation with improved mass accuracy and provide a potentially selective, sensitive, accurate and comprehensive way for metabolite identification. In this review, we have discussed the targeted (extracted ion chromatogram, mass defect filter, product ion filter, neutral loss filter and isotope pattern filter) and untargeted (control sample comparison, background subtraction and metabolomic approaches) post-acquisition data mining techniques, which facilitate the drug metabolite identification. We have also discussed the importance of integrated data mining strategy.
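As an example of one targeted technique listed above, a mass defect filter keeps ions whose mass defect (the decimal part of the m/z) lies near the parent drug's, since common biotransformations shift the defect only slightly. The window values in this sketch are illustrative, not recommended settings.

    def mass_defect_filter(mz_values, parent_mz, nominal_window=50.0, defect_window=0.040):
        """Keep m/z values whose mass defect lies near the parent drug's defect."""
        parent_defect = parent_mz - round(parent_mz)
        kept = []
        for mz in mz_values:
            defect = mz - round(mz)
            if (abs(mz - parent_mz) <= nominal_window and
                    abs(defect - parent_defect) <= defect_window):
                kept.append(mz)
        return kept

    # e.g. mass_defect_filter([310.141, 326.136, 355.070], parent_mz=310.141)
    # keeps the first two (hydroxylation-like shift) and drops the third.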
Artificial intelligence and signal processing for infrastructure assessment
NASA Astrophysics Data System (ADS)
Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif
2015-04-01
Ground Penetrating Radar (GPR) is being recognized as an effective nondestructive evaluation technique to improve the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of using this technique, mainly because a trained, experienced person is needed to interpret images obtained by the GPR system. In this paper, an algorithm to classify and assess the condition of infrastructure utilizing image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defective and healthy slabs are used to train a computer vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.
The Mediated Museum: Computer-Based Technology and Museum Infrastructure.
ERIC Educational Resources Information Center
Sterman, Nanette T.; Allen, Brockenbrough S.
1991-01-01
Describes the use of computer-based tools and techniques in museums. The integration of realia with media-based advice and interpretation is described, electronic replicas of ancient Greek vases in the J. Paul Getty Museum are explained, examples of mediated exhibits are presented, and the use of hypermedia is discussed. (five references) (LRW)
Transonic flight flutter tests of a control surface utilizing an impedance response technique
NASA Technical Reports Server (NTRS)
Mirowitz, L. I.
1975-01-01
Transonic flight flutter tests of the XF3H-1 Demon Airplane were conducted utilizing a frequency response technique in which the oscillating rudder provides the means of system excitation. These tests were conducted as a result of a rudder flutter incident in the transonic speed range. The technique employed is presented including a brief theoretical development of basic concepts. Test data obtained during the flight are included and the method of interpretation of these data is indicated. This method is based on an impedance matching technique. It is shown that an artificial stabilizing device, such as a damper, may be incorporated in the system for test purposes without complicating the interpretation of the test results of the normal configuration. Data are presented which define the margin of stability introduced to the originally unstable rudder by design changes which involve higher control system stiffness and external damper. It is concluded that this technique of flight flutter testing is a feasible means of obtaining flutter stability information in flight.
Interaction techniques for radiology workstations: impact on users' productivity
NASA Astrophysics Data System (ADS)
Moise, Adrian; Atkins, M. Stella
2004-04-01
As radiologists progress from reading images presented on film to modern computer systems with images presented on high-resolution displays, many new problems arise. Although the digital medium has many advantages, the radiologist's job becomes cluttered with many new tasks related to image manipulation. This paper presents our solution for supporting radiologists' interpretation of digital images by automating image presentation during sequential interpretation steps. Our method supports scenario-based interpretation, which groups data temporally according to the mental paradigm of the physician. We extended current hanging protocols with support for "stages". A stage reflects the presentation of digital information required to complete a single step within a complex task. We demonstrated the benefits of staging in a user study with 20 lay subjects involved in a visual conjunctive search for targets, similar to a radiology task of identifying anatomical abnormalities. We designed a task and a set of stimuli which allowed us to simulate the interpretation workflow from a typical radiology scenario - reading a chest computed radiography exam when a prior study is also available. The simulation was made possible by abstracting the radiologist's task and the basic workstation navigation functionality. We introduced "Stages," an interaction technique attuned to the radiologist's interpretation task. Compared to the traditional user interface, Stages generated a 14% reduction in average interpretation time.
Quantitative ultrasonic evaluation of concrete structures using one-sided access
NASA Astrophysics Data System (ADS)
Khazanovich, Lev; Hoegh, Kyle
2016-02-01
Nondestructive diagnostics of concrete structures is an important and challenging problem. The recent introduction of array ultrasonic dry point contact transducer systems offers opportunities for quantitative assessment of the subsurface condition of concrete structures, including detection of defects and inclusions. The methods described in this paper are developed for signal interpretation of shear wave impulse response time histories from multiple fixed-distance transducer pairs in a self-contained ultrasonic linear array. This included generalizing Kirchhoff migration-based synthetic aperture focusing technique (SAFT) reconstruction methods to handle the spatially diverse transducer pair locations, creating expanded virtual arrays with associated reconstruction methods, and creating automated reconstruction interpretation methods for reinforcement detection and stochastic flaw detection. Interpretation of the reconstruction techniques developed in this study was validated using the results of laboratory and field forensic studies. The applicability of the developed methods for solving practical engineering problems was demonstrated.
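At its core, SAFT reconstruction is delay-and-sum: each image pixel accumulates, over all transmitter-receiver pairs, the A-scan sample at the pixel's two-way travel time. The sketch below shows this for a single-sided linear array; it is a plain delay-and-sum rendering, not the paper's generalized Kirchhoff migration or virtual-array extensions.

    import numpy as np

    def saft(signals, tx_x, rx_x, xs, zs, c, fs):
        """Delay-and-sum SAFT image from one-sided access.
        signals: (n_pairs, n_samples) A-scans; tx_x, rx_x: transducer x-positions (m);
        xs, zs: image-pixel coordinate vectors (m); c: shear wave speed (m/s); fs: Hz."""
        img = np.zeros((len(zs), len(xs)))
        n = signals.shape[1]
        for p in range(signals.shape[0]):
            for i, z in enumerate(zs):
                for j, x in enumerate(xs):
                    # two-way travel time: transmitter -> pixel -> receiver
                    t = (np.hypot(x - tx_x[p], z) + np.hypot(x - rx_x[p], z)) / c
                    k = int(round(t * fs))
                    if k < n:
                        img[i, j] += signals[p, k]
        return img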
Cerebral Microbleeds: A Field Guide to their Detection and Interpretation
Greenberg, Steven M.; Vernooij, Meike W.; Cordonnier, Charlotte; Viswanathan, Anand; Salman, Rustam Al-Shahi; Warach, Steven; Launer, Lenore J.; Van Buchem, Mark A.; Breteler, Monique M.B.
2012-01-01
Cerebral microbleeds (CMB) are increasingly recognized neuroimaging findings, occurring with cerebrovascular disease, dementia, and normal aging. Recent years have seen substantial progress, particularly in developing newer MRI methodologies for CMB detection and applying them to population-based elderly samples. This review focuses on these recent developments and their impact on two major questions: how CMB are detected, and how they should be interpreted. There is now ample evidence that the prevalence and number of detected CMB vary with MRI characteristics such as pulse sequence, sequence parameters, spatial resolution, magnetic field strength, and post-processing, underlining the importance of MRI technique in interpreting studies. Recent investigations using sensitive techniques find the prevalence of CMB detected in community-dwelling elderly to be surprisingly high. We propose procedural guidelines for identifying CMB and suggest possible future approaches for elucidating the role of these common lesions as markers for, and potential contributors to, small vessel brain disease. PMID:19161908
Applications of Infrared and Raman Spectroscopies to Probiotic Investigation
Santos, Mauricio I.; Gerbino, Esteban; Tymczyszyn, Elizabeth; Gomez-Zavaglia, Andrea
2015-01-01
In this review, we overview the most important contributions of vibrational spectroscopy based techniques in the study of probiotics and lactic acid bacteria. First, we briefly introduce the fundamentals of these techniques, together with the main multivariate analytical tools used for spectral interpretation. Then, four main groups of applications are reported: (a) bacterial taxonomy (Subsection 4.1); (b) bacterial preservation (Subsection 4.2); (c) monitoring processes involving lactic acid bacteria and probiotics (Subsection 4.3); (d) imaging-based applications (Subsection 4.4). A final conclusion, underlying the potentialities of these techniques, is presented. PMID:28231205
Assessing the validity of discourse analysis: transdisciplinary convergence
NASA Astrophysics Data System (ADS)
Jaipal-Jamani, Kamini
2014-12-01
Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.
Interpretable Decision Sets: A Joint Framework for Description and Prediction
Lakkaraju, Himabindu; Bach, Stephen H.; Jure, Leskovec
2016-01-01
One of the most important obstacles to deploying predictive models is the fact that humans do not understand and trust them. Knowing which variables are important in a model’s prediction and how they are combined can be very powerful in helping people understand and trust automatic decision making systems. Here we propose interpretable decision sets, a framework for building predictive models that are highly accurate, yet also highly interpretable. Decision sets are sets of independent if-then rules. Because each rule can be applied independently, decision sets are simple, concise, and easily interpretable. We formalize decision set learning through an objective function that simultaneously optimizes accuracy and interpretability of the rules. In particular, our approach learns short, accurate, and non-overlapping rules that cover the whole feature space and pay attention to small but important classes. Moreover, we prove that our objective is a non-monotone submodular function, which we efficiently optimize to find a near-optimal set of rules. Experiments show that interpretable decision sets are as accurate at classification as state-of-the-art machine learning techniques. They are also three times smaller on average than rule-based models learned by other methods. Finally, results of a user study show that people are able to answer multiple-choice questions about the decision boundaries of interpretable decision sets and write descriptions of classes based on them faster and more accurately than with other rule-based models that were designed for interpretability. Overall, our framework provides a new approach to interpretable machine learning that balances accuracy, interpretability, and computational efficiency. PMID:27853627
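Because each if-then rule in a decision set applies independently, prediction reduces to finding a rule whose condition fires; no rule depends on the outcome of another, so any single rule can be read in isolation. A toy sketch (rules and default class invented for illustration):

    # Each rule is an independent (predicate, class) pair.
    rules = [
        (lambda x: x["age"] > 50 and x["bmi"] > 30, "high-risk"),      # illustrative
        (lambda x: x["exercise_hours"] >= 5, "low-risk"),
    ]

    def predict(x, rules, default="low-risk"):
        votes = [label for cond, label in rules if cond(x)]
        # non-overlapping rules fire at most once; fall back to a default class
        return votes[0] if votes else default

    print(predict({"age": 62, "bmi": 33, "exercise_hours": 1}, rules))  # high-risk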
The dendritic spine story: an intriguing process of discovery.
DeFelipe, Javier
2015-01-01
Dendritic spines are key components of a variety of microcircuits and they represent the majority of postsynaptic targets of glutamatergic axon terminals in the brain. The present article will focus on the discovery of dendritic spines, which was possible thanks to the application of the Golgi technique to the study of the nervous system, and will also explore the early interpretation of these elements. This discovery represents an interesting chapter in the history of neuroscience as it shows us that progress in the study of the structure of the nervous system is based not only on the emergence of new techniques but also on our ability to exploit the methods already available and correctly interpret their microscopic images.
The development of selected data base applications for the crustal dynamics data information system
NASA Technical Reports Server (NTRS)
Noll, C. E.
1981-01-01
The development of a data base and its accompanying software for the data information system of the Crustal Dynamics Project is described. Background information concerning this project, and a definition of the techniques used in the implementation of an operational data base, are presented. Examples of key applications are included and interpreted.
What's so different about Lacan's approach to psychoanalysis?
Fink, Bruce
2011-12-01
Clinical work based on Lacanian principles is rarely compared in the psychoanalytic literature with that based on other principles. The author attempts to highlight a few important theoretical differences regarding language, desire, affect, and time between a Lacanian approach and certain others that lead to differences in focus and technique, related, for example, to interpretation, scansion, and countertransference. Lacanian techniques are illustrated with brief clinical vignettes. In the interest of confidentiality, identifying information and certain circumstances have been changed or omitted in the material presented.
Multispectral multisensor image fusion using wavelet transforms
Lemeshewsky, George P.
1999-01-01
Fusion techniques can be applied to multispectral and higher spatial resolution panchromatic images to create a composite image that is easier to interpret than the individual images. Wavelet transform-based multisensor, multiresolution fusion (a type of band sharpening) was applied to Landsat thematic mapper (TM) multispectral and coregistered higher resolution SPOT panchromatic images. The objective was to obtain increased spatial resolution, false color composite products to support the interpretation of land cover types wherein the spectral characteristics of the imagery are preserved to provide the spectral clues needed for interpretation. Since the fusion process should not introduce artifacts, a shift-invariant implementation of the discrete wavelet transform (SIDWT) was used. These results were compared with those using the shift-variant discrete wavelet transform (DWT). Overall, the process includes a hue, saturation, and value color space transform to minimize color changes, and a reported point-wise maximum selection rule to combine transform coefficients. The performance of fusion based on the SIDWT and DWT was evaluated with a simulated TM 30-m spatial resolution test image and a higher resolution reference. Simulated imagery was made by blurring higher resolution color-infrared photography with the TM sensors' point spread function. The SIDWT-based technique produced imagery with fewer artifacts and lower error between fused images and the full resolution reference. Image examples with TM and 10-m SPOT panchromatic data illustrate the reduction in artifacts due to SIDWT-based fusion.
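As a rough illustration of the fusion scheme, the sketch below uses PyWavelets' stationary wavelet transform as a stand-in for the SIDWT and applies a point-wise maximum selection rule to the detail coefficients; the Haar wavelet, single decomposition level, and averaging of approximation bands are assumptions, not details taken from the paper:

```python
import numpy as np
import pywt  # PyWavelets

def fuse_sidwt(img_a, img_b, wavelet="haar", level=1):
    """Fuse two coregistered images with a shift-invariant (stationary)
    wavelet transform and point-wise maximum selection of detail bands."""
    # Image sides must be divisible by 2**level for swt2.
    ca = pywt.swt2(img_a.astype(float), wavelet, level=level)
    cb = pywt.swt2(img_b.astype(float), wavelet, level=level)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    fused = [((aA + bA) / 2.0,  # average the approximation bands
              (pick(aH, bH), pick(aV, bV), pick(aD, bD)))
             for (aA, (aH, aV, aD)), (bA, (bH, bV, bD)) in zip(ca, cb)]
    return pywt.iswt2(fused, wavelet)

fused = fuse_sidwt(np.random.rand(128, 128), np.random.rand(128, 128))
```

Per the abstract, a hue-saturation-value transform would precede this step in the full pipeline, so that fusion alters intensity while minimizing color changes.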
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
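As a toy illustration of the exploratory end of this spectrum, the sketch below ordinates a hypothetical sample-by-taxon abundance table with PCA; the data, relative-abundance normalization, and library choice are illustrative assumptions, not recommendations from the review:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical sample x taxon abundance table (20 samples, 50 taxa).
rng = np.random.default_rng(0)
counts = rng.poisson(10, size=(20, 50)).astype(float)

# Normalize to relative abundances, then ordinate with PCA (exploratory);
# interpretive/constrained methods such as CCA or RDA would additionally
# bring in environmental covariates.
rel = counts / counts.sum(axis=1, keepdims=True)
scores = PCA(n_components=2).fit_transform(rel)
print(scores.shape)  # (20, 2) ordination coordinates for plotting
```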
NASA Astrophysics Data System (ADS)
Hayat, T.; Ullah, Siraj; Khan, M. Ijaz; Alsaedi, A.; Zaigham Zia, Q. M.
2018-03-01
Here modeling and computations are presented to introduce the novel concept of Darcy-Forchheimer three-dimensional flow of water-based carbon nanotubes with nonlinear thermal radiation and heat generation/absorption. A bidirectionally stretching surface induces the flow. Darcy's law is replaced by the Forchheimer relation. The Xue model is implemented for the nanoliquid transport mechanism. A nonlinear formulation based upon the conservation laws of mass, momentum and energy is first modeled and then solved by an optimal homotopy analysis technique. Optimal estimations of auxiliary variables are obtained. The importance of influential variables for the velocity and thermal fields is interpreted graphically. Moreover, velocity and temperature gradients are discussed and analyzed. The physical interpretation of influential variables is examined.
Digital image classification approach for estimating forest clearing and regrowth rates and trends
NASA Technical Reports Server (NTRS)
Sader, Steven A.
1987-01-01
A technique is presented to monitor vegetation changes for a selected study area in Costa Rica. A normalized difference vegetation index was computed for three dates of Landsat satellite data and a modified parallelepiped classifier was employed to generate a multitemporal greenness image representing all three dates. A second-generation image was created by partitioning the intensity levels at each date into high, medium, and low and thereby reducing the number of classes to 21. A sampling technique was applied to describe forest and other land cover change occurring between time periods based on interpretation of aerial photography that closely matched the dates of satellite acquisition. Comparison of the Landsat-derived classes with the photo-interpreted sample areas can provide a basis for evaluating the satellite monitoring technique and the accuracy of estimating forest clearing and regrowth rates and trends.
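A sketch of the multitemporal greenness encoding might look as follows; the NDVI formula is standard, while the thresholds are invented and the sketch keeps all 27 three-date combinations rather than the 21 merged classes of the paper:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized difference vegetation index, with a small epsilon for safety.
    return (nir - red) / (nir + red + 1e-9)

def greenness_level(x):
    # Partition greenness into low/medium/high (0/1/2); thresholds invented.
    return np.digitize(x, [0.2, 0.5])

# Three coregistered dates of (NIR, red) bands; toy arrays stand in for Landsat.
dates = [(np.random.rand(64, 64), np.random.rand(64, 64)) for _ in range(3)]
levels = [greenness_level(ndvi(nir, red)) for nir, red in dates]

# Encode each pixel's three-date greenness trajectory as a single class code.
multitemporal_class = levels[0] * 9 + levels[1] * 3 + levels[2]  # 27 codes
```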
Interpretable functional principal component analysis.
Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo
2016-09-01
Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
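To make the quaternion idea concrete, the sketch below packs four EEG channels into a quaternion-valued series q = w + xi + yj + zk and extracts simple magnitude and angle features; this generic construction is offered for illustration only and is not necessarily the authors' QSA definition:

```python
import numpy as np

def quaternion_features(eeg):
    """eeg: array of shape (4, n_samples); one channel per quaternion part."""
    w, x, y, z = eeg
    norm = np.sqrt(w**2 + x**2 + y**2 + z**2)        # instantaneous magnitude
    angle = 2 * np.arccos(np.clip(w / (norm + 1e-12), -1.0, 1.0))
    return np.array([norm.mean(), norm.std(), angle.mean(), angle.std()])

features = quaternion_features(np.random.randn(4, 512))
```

Feature vectors of this kind could then be handed to the DT, SVM, and KNN classifiers compared in the study.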
Yielding physically-interpretable emulators - A Sparse PCA approach
NASA Astrophysics Data System (ADS)
Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.
2015-12-01
Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach for surrogating high-fidelity process-based models with lower-order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots' (generated with the high-fidelity model), to project the entire set of input and state variables of this model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally complex and can hardly be given a physically meaningful interpretation, as each basis is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less variance of the snapshots, the presence of only a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insights into the main process dynamics.
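The contrast between dense POD bases and sparse ones is easy to demonstrate; the sketch below runs scikit-learn's SparsePCA on a synthetic low-rank stand-in for the snapshot matrix (matrix sizes and the sparsity penalty are arbitrary assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

# Synthetic snapshot matrix: 200 snapshots of 30 model states, low rank + noise.
rng = np.random.default_rng(1)
snapshots = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 30)) \
            + 0.1 * rng.standard_normal((200, 30))

pod = PCA(n_components=5).fit(snapshots)
spca = SparsePCA(n_components=5, alpha=1.0, random_state=1).fit(snapshots)

# POD bases mix every state variable; SPCA zeroes many coefficients, so each
# basis can be read in terms of a few physical states.
print(np.count_nonzero(pod.components_))   # 150 (fully dense)
print(np.count_nonzero(spca.components_))  # typically far fewer
```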
Developments in signal processing and interpretation in laser tapping
NASA Astrophysics Data System (ADS)
Perton, M.; Neron, C.; Blouin, A.; Monchalin, J.-P.
2013-01-01
A novel technique, called laser tapping, based on laser thermoelastic excitation as in laser-ultrasonics, has been previously introduced for inspecting honeycomb and foam core structures. If the top skin is delaminated or detached from the substrate, the detached layer is driven into vibration. The interpretation of the vibrations in terms of Lamb wave resonances is first discussed for a flat-bottom-hole configuration and then used to determine appropriate signal processing for samples such as honeycomb structures.
NASA Astrophysics Data System (ADS)
Schrott, Lothar; Sass, Oliver
2008-01-01
During the last decade, the use of geophysical techniques has become popular in many geomorphological studies. However, the correct handling of geophysical instruments and the subsequent processing of the data they yield are difficult tasks. Furthermore, the description and interpretation of geomorphological settings to which they apply can significantly influence the data gathering and subsequent modelling procedure (e.g. achieving a maximum depth of 30 m requires a certain profile length and geophone spacing, or a particular antenna frequency). For more than three decades geophysical techniques have been successfully applied, for example, in permafrost studies. However, in many cases complex or more heterogeneous subsurface structures could not be adequately interpreted due to limited computer facilities and time-consuming calculations. As a result of recent technical improvements, geophysical techniques have been applied to a wider spectrum of geomorphological and geological settings. This paper aims to present some examples of geomorphological studies that demonstrate the powerful integration of geophysical techniques and highlight some of the limitations of these techniques. A focus has been given to the three most frequently used techniques in geomorphology to date, namely ground-penetrating radar, seismic refraction and DC resistivity. Promising applications are reported for a broad range of landforms and environments, such as talus slopes, block fields, landslides, complex valley fill deposits, karst and loess covered landforms. A qualitative assessment highlights suitable landforms and environments. The techniques can help to answer yet unsolved questions in geomorphological research regarding, for example, sediment thickness and internal structures. However, based on case studies it can be shown that the use of a single geophysical technique or a single interpretation tool is not recommended for many geomorphological surface and subsurface conditions, as this may lead to significant errors in interpretation. Because of changing physical properties of the subsurface material (e.g. sediment, water content), in many cases only a combination of two or sometimes even three geophysical methods gives sufficient insight to avoid serious misinterpretation. A "good practice guide" has been framed that provides recommendations to enable the successful application of three important geophysical methods in geomorphology and to help users avoid making serious mistakes.
NASA Astrophysics Data System (ADS)
Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke
2016-05-01
Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the application of novel approaches for the visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is insufficient for classifying GSCs and paving the way to improved therapeutics for the heterogeneous glioma.
NASA Technical Reports Server (NTRS)
Viezee, W.; Russell, P. B.; Hake, R. D., Jr.
1974-01-01
The matching method of lidar data analysis is explained, and the results from two flights studying the stratospheric aerosol using lidar techniques are summarized and interpreted. The results lend support to the matching method of lidar data analysis, but it is not yet apparent that the analysis technique leads to acceptable results on all nights in all seasons.
Delfiner, Matthew S; Martinez, Luis R; Pavia, Charles S
2016-01-01
Laboratory diagnostic tests have an essential role in patient care, and the increasing number of medical and health professions schools focusing on teaching laboratory medicine to pre-clinical students reflects this importance. However, data validating the pedagogical methods that best influence students' comprehension and interpretation of diagnostic tests have not been well described. The Gram stain is a simple yet significant and frequently used diagnostic test in the clinical setting that helps classify bacteria into two major groups, Gram positive and Gram negative, based on their cell wall structure. We used this technique to assess which educational strategies may improve students' learning and competency in medical diagnostic techniques. Hence, in this randomized controlled study, we compared the effectiveness of several educational strategies (e.g. workshop, discussion, or lecture) on first-year medical students' competency in comprehension and interpretation of the Gram stain procedure. We demonstrated that a hands-on practical workshop significantly enhances students' competency in memorization and overall comprehension of the technique. Interestingly, most students, irrespective of their cohort, showed difficulty in answering Gram stain-related analytical questions, suggesting that more emphasis should be allocated by instructors to clearly explaining the interpretation of diagnostic test results to students in medical and health professional schools. This proof-of-principle study highlights the need for practical experience with laboratory medical techniques during pre-clinical training to facilitate future medical doctors' and healthcare professionals' basic understanding and competency in diagnostic testing for better patient care.
ERIC Educational Resources Information Center
Borsuk, Ellen R.; Watkins, Marley W.; Canivez, Gary L.
2006-01-01
Although often applied in practice, clinically based cognitive subtest profile analysis has failed to achieve empirical support. Nonlinear multivariate subtest profile analysis may have benefits over clinically based techniques, but the psychometric properties of these methods must be studied prior to their implementation and interpretation. The…
ERIC Educational Resources Information Center
Anae, Nicole
2014-01-01
This paper presents a theorised classroom-based account discussing the author's interdisciplinary approach to engaging first-year teacher-education students in self-critical inquiry using creative writing techniques as an entry point into Arts-based three-dimensional storytelling. Via an interpretation of Lacan's "speaking…
What if? On alternative conceptual models and the problem of their implementation
NASA Astrophysics Data System (ADS)
Neuberg, Jurgen
2015-04-01
Seismic and other monitoring techniques rely on a set of conceptual models on the basis of which data sets can be interpreted. In order to do this at an operational level in volcano observatories, these models need to be tested and ready for interpretation in a timely manner. Once such models are established, the scientists in charge of advising stakeholders and decision makers often stick firmly to them, to avoid confusing non-experts with alternative versions of interpretations. This talk gives an overview of widely accepted conceptual models to interpret seismic and deformation data, and highlights in a few case studies some of the arising problems. Aspects covered include knowledge transfer between research institutions and observatories, data sharing, the problem of the uptake of advice, and some hidden problems which turn out to be much more critical in assessing volcanic hazard than the actual data interpretation.
The Q sort theory and technique.
Nyatanga, L
1989-10-01
This paper is based on the author's experience of using the Q sort technique with BA Social Sciences (BASS) students and the community psychiatric nursing (CPN, ENB No 811) course. The paper focuses on two main issues: 1. The theoretical assumptions underpinning the Q sort technique; Carl Rogers' self theory and some of the values of humanistic psychology are summarised. 2. The actual technique procedure and the meaning of its results. As the Q sort technique is potentially useful in a variety of settings, some of which are listed in this paper, the emphasis has deliberately been placed on understanding the theoretical underpinning and the operationalisation (sensitive interpretation) of the theory in practice.
Machine Translation in Post-Contemporary Era
ERIC Educational Resources Information Center
Lin, Grace Hui Chin
2010-01-01
This article focusing on translating techniques via personal computer or laptop reports updated artificial intelligence progresses before 2010. Based on interpretations and information for field of MT [Machine Translation] by Yorick Wilks' book, "Machine Translation, Its scope and limits," this paper displays understandable theoretical frameworks…
Educational principles and techniques for interpreters.
F. David Boulanger; John P. Smith
1973-01-01
Interpretation is in large part education, since it attempts to convey information, concepts, and principles while creating attitude changes and such emotional states as wonder, delight, and appreciation. Although interpreters might profit greatly by formal training in the principles and techniques of teaching, many have not had such training. Some means of making the...
Passive Super-Low Frequency electromagnetic prospecting technique
NASA Astrophysics Data System (ADS)
Wang, Nan; Zhao, Shanshan; Hui, Jian; Qin, Qiming
2017-03-01
The Super-Low Frequency (SLF) electromagnetic prospecting technique, adopted as a non-imaging remote sensing tool for depth sounding, is systematically proposed for subsurface geological survey. In this paper, we propose and theoretically illustrate natural-source magnetic amplitudes as SLF responses as a first step. In order to directly calculate multi-dimensional theoretical SLF responses, modeling algorithms were developed and evaluated using the finite difference method. The theoretical results of three-dimensional (3-D) models show that the average normalized SLF magnetic amplitude responses were numerically stable and appropriate for practical interpretation. To explore the depth resolution, three-layer models were configured. The modeling results prove that the SLF technique is more sensitive to conductive objective layers than to highly resistive ones, with the SLF responses of conductive objective layers showing clearly rising amplitudes in the low frequency range. Afterwards, we proposed an improved Frequency-Depth transformation based on Bostick inversion to realize the depth sounding by empirically adjusting two parameters. The SLF technique has already been successfully applied in geothermal exploration and coalbed methane (CBM) reservoir interpretation, which demonstrates that the proposed methodology is effective in revealing low-resistivity distributions. Furthermore, it significantly contributes to reservoir identification with electromagnetic radiation anomaly extraction. Meanwhile, the SLF interpretation results are in accordance with the dynamic production status of CBM reservoirs, which means it could provide an economical, convenient and promising method for exploring and monitoring subsurface geo-objects.
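For orientation, the textbook Bostick transform that underlies such Frequency-Depth conversions can be sketched as follows; the magnetotelluric-style formulas are a standard approximation, and the two empirically adjusted parameters of the paper's improved variant are not reproduced here:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def bostick(freq, rho_a):
    """Textbook Bostick transform: apparent resistivity rho_a(f) in ohm-m
    and frequency f in Hz mapped to (depth in m, resistivity in ohm-m)."""
    depth = np.sqrt(rho_a / (2 * np.pi * freq * MU0))
    # Slope of log apparent resistivity versus log period T = 1/f.
    m = np.gradient(np.log(rho_a), np.log(1.0 / freq))
    rho_b = rho_a * (1 + m) / (1 - m)
    return depth, rho_b

freq = np.logspace(3, -1, 50)            # lower frequencies probe deeper
rho_app = 100.0 * np.ones_like(freq)     # toy half-space: 100 ohm-m
depth, rho_b = bostick(freq, rho_app)    # rho_b == rho_a for a half-space
```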
Antibody detection tests improve the sensitivity of tuberculosis diagnosis in cattle.
Casal, C; Infantes, J A; Risalde, M A; Díez-Guerrier, A; Domínguez, M; Moreno, I; Romero, B; de Juan, L; Sáez, J L; Juste, R; Gortázar, C; Domínguez, L; Bezos, J
2017-06-01
We evaluated the sensitivity (Se) of the single cervical intradermal tuberculin (SIT) test, two interferon-gamma (IFN-γ) assays and three different antibody detection techniques for bovine tuberculosis (bTB) diagnosis in 131 mixed beef breed cattle. The results of the diagnostic techniques performed over the whole herd, and over the animals confirmed as infected based on the presence of lesions compatible with the disease and/or M. bovis isolation, were compared to determine apparent prevalence (AP) and Se. The Se of the SIT test (severe interpretation) was 63.7% (95% CI, 54.54-72.00), while the Se of the IFN-γ assays ranged between 60.2% and 92%. The proportion of infected cattle detected by the different antibody detection techniques ranged from 65.5% to 87.6%. Three of the antibody detection techniques yielded a significantly higher (p<0.05) Se than that achieved with the official diagnostic techniques. In addition, the interpretation in parallel of cellular and antibody detection techniques reached the highest Se: 98.2% (95% CI, 93.78-99.51), suggesting that the use of diagnostic techniques detecting both cellular and humoral responses could be considered as an alternative in the control of bTB outbreaks in high prevalence settings. Copyright © 2017 Elsevier Ltd. All rights reserved.
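As a rough cross-check of that figure, combining the SIT test (Se = 63.7%) with the best-performing antibody technique (87.6%) in parallel under an idealized assumption of conditionally independent tests gives:

```latex
% Parallel interpretation: positive if any test is positive.
% Se values are taken from the abstract; independence is an idealization.
\[
  Se_{\mathrm{par}} = 1 - \prod_{i}\left(1 - Se_i\right),
  \qquad
  1 - (1 - 0.637)(1 - 0.876) \approx 0.955 .
\]
```

The reported 98.2% exceeds this idealized estimate, consistent with the cellular and humoral tests detecting partly complementary subsets of infected animals.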
Reconstruction and 3D visualisation based on objective real 3D based documentation.
Bolliger, Michael J; Buck, Ursula; Thali, Michael J; Bolliger, Stephan A
2012-09-01
Reconstructions based directly upon forensic evidence alone are called primary information. Historically this consists of documentation of findings by verbal protocols, photographs and other visual means. Currently, modern imaging techniques such as 3D surface scanning and radiological methods (computed tomography, magnetic resonance imaging) are also applied. Secondary interpretation is based on facts and the examiner's experience. Usually such reconstructive expert reports are given in written form, and are often enhanced by sketches. However, narrative interpretations can, especially in complex courses of action, be difficult to present and can be misunderstood. In this report we demonstrate the use of graphic reconstruction of secondary interpretation with supporting pictorial evidence, applying digital visualisation (using 'Poser') or scientific animation (using '3D Studio Max', 'Maya'), and present methods of clearly distinguishing between factual documentation and examiners' interpretation, based on three cases. The first case involved a pedestrian who was initially struck by a car on a motorway and was then run over by a second car. The second case involved a suicidal gunshot to the head with a rifle, in which the trigger was pushed with a rod. The third case dealt with a collision between two motorcycles. Pictorial reconstruction of the secondary interpretation of these cases has several advantages. The images enable an immediate overview, give rise to enhanced clarity, and compel the examiner to look at all details if he or she is to create a complete image.
NASA Technical Reports Server (NTRS)
Sabol, Donald E., Jr.; Roberts, Dar A.; Adams, John B.; Smith, Milton O.
1993-01-01
An important application of remote sensing is to map and monitor changes over large areas of the land surface. This is particularly significant with the current interest in monitoring vegetation communities. Most traditional methods for mapping different types of plant communities are based upon statistical classification techniques (i.e., parallelepiped, nearest-neighbor, etc.) applied to uncalibrated multispectral data. Classes from these techniques are typically difficult to interpret (particularly for a field ecologist/botanist). Also, classes derived for one image can be very different from those derived from another image of the same area, making interpretation of observed temporal changes nearly impossible. More recently, neural networks have been applied to classification. Neural network classification, based upon spectral matching, is weak in dealing with spectral mixtures (a condition prevalent in images of natural surfaces). Another approach to mapping vegetation communities is based on spectral mixture analysis, which can provide a consistent framework for image interpretation. Roberts et al. (1990) mapped vegetation using the band residuals from a simple mixing model (the same spectral endmembers applied to all image pixels). Sabol et al. (1992b) and Roberts et al. (1992) used different methods to apply the most appropriate spectral endmembers to each image pixel, thereby allowing mapping of vegetation based upon the different endmember spectra. In this paper, we describe a new approach to classification of vegetation communities based upon the spectral fractions derived from spectral mixture analysis. This approach was applied to three 1992 AVIRIS images of Jasper Ridge, California to observe seasonal changes in surface composition.
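The core of spectral mixture analysis is a constrained linear inversion per pixel. Here is a minimal sketch with synthetic endmembers; the nonnegative least-squares solver and the appended sum-to-one row are common implementation choices, not the authors' specific algorithm:

```python
import numpy as np
from scipy.optimize import nnls

# Solve pixel = E @ f for endmember fractions f, with f >= 0 and an
# (approximate) sum-to-one constraint added as an extra equation.
bands, n_end = 6, 3
E = np.abs(np.random.rand(bands, n_end))   # endmember spectra (columns)
pixel = E @ np.array([0.6, 0.3, 0.1])      # synthetic mixed pixel

A = np.vstack([E, np.ones((1, n_end))])    # append sum-to-one row
b = np.append(pixel, 1.0)
fractions, residual = nnls(A, b)
print(fractions)                           # approximately [0.6, 0.3, 0.1]
```

Classifying pixels by these fraction vectors, rather than by raw radiance, is what gives the approach its consistent, physically interpretable framework.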
Integrated Artificial Intelligence Approaches for Disease Diagnostics.
Vashistha, Rajat; Chhabra, Deepak; Shukla, Pratyoosh
2018-06-01
Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information in medical data and converting it into optimized and organized information for diagnostics. This is possible due to substantial advances in artificial intelligence, computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality and AI-based genome editing technologies. Such techniques serve as products for diagnosing emerging microbial and non-microbial diseases. This article presents a combinatory approach to utilizing these techniques in disease diagnostics and providing therapeutic solutions.
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.
Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi
2013-01-01
The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable for tackling disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces the fuzzy support vector machine, a learning algorithm based on a combination of fuzzy classifiers and kernel machines for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that the fuzzy support vector machine, applied in combination with filter or wrapper feature selection methods, develops a robust model with higher accuracy than conventional microarray classification models such as support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule base inferred from the fuzzy support vector machine helps extract biological knowledge from microarray data. The fuzzy support vector machine, as a new classification model with high generalization power, robustness, and good interpretability, seems to be a promising tool for gene expression microarray classification.
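One common reading of a fuzzy SVM is that each training sample carries a fuzzy membership that down-weights unreliable points. The sketch below approximates that idea with scikit-learn's per-sample weights on toy data; the membership function is a hypothetical choice, and a full fuzzy SVM would also fuzzify the rule base used for interpretation:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 20))          # toy "expression" matrix
y = (X[:, 0] + 0.3 * rng.standard_normal(100) > 0).astype(int)

# Membership: samples far from their class mean get lower weight (invented).
membership = np.ones(100)
for c in (0, 1):
    d = np.linalg.norm(X[y == c] - X[y == c].mean(axis=0), axis=1)
    membership[y == c] = 1.0 - 0.9 * d / d.max()

clf = SVC(kernel="rbf").fit(X, y, sample_weight=membership)
```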
NASA Astrophysics Data System (ADS)
Sridhar, M.; Markandeyulu, A.; Chaturvedi, A. K.
2017-01-01
Mapping of subtrappean sediments is a complex geological problem attempted by many interpreters applying different geophysical techniques. Variations in the thickness and resistivity of the traps and underlying sediments, respectively, result in considerable uncertainty in the interpretation of geophysical data. It is proposed that the transient electromagnetic technique is an effective geophysical tool for delineation of the sub-trappean sediments, due to the marked resistivity contrast between the Deccan trap and the underlying sediments and/or basement. The northern margin of the Kaladgi basin is covered under trap. A heliborne time domain electromagnetic survey was conducted to demarcate the basin extent and map the sub-trappean sediments. Conductivity depth transformations were used to map the interface between the conductive trap and resistive 'basement'. Two resistivity contrast boundaries are picked: the first corresponds to the bottom of the shallow conductive unit, interpreted as the base of the Deccan Volcanics, and the second, picked at the base of a deeper subsurface conductive zone, is interpreted as the weathered paleo-surface of the crystalline basement. This second boundary can only be seen in areas where the volcanics are thin or absent, suggesting that the volcanics are masking the EM signal and preventing deeper penetration. An interesting feature, which shows prominently in the EM data but is less clearly imaged in the magnetic data, is observed in the vicinity of Mudhol. The surface geology interpreted from satellite imagery shows Deccan trap cover around Mudhol. Modelling of the TDEM data suggests the presence of a synclinal basin structure. The depth of penetration of the heliborne TDEM data is estimated to be approximately 350 m for the study area. This suggests that heliborne TDEM could penetrate significant thicknesses of conductive Deccan trap cover to delineate structure below in the Bagalkot Group.
Fjodorova, Natalja; Novič, Marjana
2012-01-01
The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals, using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using an intrinsic mapping technique, specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules, such as their shape and the electronic surrounding related to the reactivity of molecules. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity with a recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of the safety of chemicals. PMID:24688639
Interpreting Medical Information Using Machine Learning and Individual Conditional Expectation.
Nohara, Yasunobu; Wakata, Yoshifumi; Nakashima, Naoki
2015-01-01
Recently, machine-learning techniques have spread to many fields. However, machine learning is still not popular in the medical research field due to the difficulty of interpreting its models. In this paper, we introduce a method of interpreting medical information using machine-learning techniques. The method gives a new explanation of the partial dependence plot and the individual conditional expectation plot for the medical research field.
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
2010-01-01
Background: Proton Magnetic Resonance (MR) Spectroscopy (MRS) is a widely available technique for those clinical centres equipped with MR scanners. Unlike other MR-based techniques, MRS yields not images but spectra of metabolites in the tissues. In pathological situations, the MRS profile changes, and this has been particularly well described for brain tumours. However, radiologists are frequently not familiar with the interpretation of MRS data, and for this reason the usefulness of decision-support systems (DSS) in MRS data analysis has been explored. Results: This work presents the INTERPRET DSS version 3.0, analysing the improvements made since its first release in 2002. Version 3.0 is aimed to be a program that, first, can be easily used with any new case from any MR scanner manufacturer and, second, improves the initial analysis capabilities of the first version. The main improvements are an embedded database, user accounts, more diagnostic discrimination capabilities and the possibility to analyse data acquired under additional data acquisition conditions. Other improvements include a customisable graphical user interface (GUI). Most diagnostic problems included have been addressed through a pattern-recognition based approach, in which classifiers based on linear discriminant analysis (LDA) were trained and tested. Conclusions: The INTERPRET DSS 3.0 allows radiologists, medical physicists, biochemists or, generally speaking, any person with a minimum knowledge of what an MR spectrum is, to enter their own SV raw data, acquired at 1.5 T, and to analyse them. The system is expected to help in the categorisation of MR spectra from abnormal brain masses. PMID:21114820
ERIC Educational Resources Information Center
Cobb, Jeanne B.
2012-01-01
This study utilized a qualitative, interpretative, analytic technique based on image-based research. This descriptive study was designed to investigate children's perceptions of "good readers" as portrayed in their representational drawings. Children in grades kindergarten through 6, 156 total, in 14 schools in a small, rural school…
Husak, G.J.; Marshall, M. T.; Michaelsen, J.; Pedreros, Diego; Funk, Christopher C.; Galu, G.
2008-01-01
Reliable estimates of cropped area (CA) in developing countries with chronic food shortages are essential for emergency relief and the design of appropriate market-based food security programs. Satellite interpretation of CA is an effective alternative to extensive and costly field surveys, which fail to represent the spatial heterogeneity at the country level. Bias-corrected, texture-based classifications show little deviation from actual crop inventories when estimates derived from aerial photographs or field measurements are used to remove systematic errors in medium-resolution estimates. In this paper, we demonstrate a hybrid high-medium resolution technique for Central Ethiopia that combines spatially limited unbiased estimates from IKONOS images with spatially extensive Landsat ETM+ interpretations, land-cover, and SRTM-based topography. Logistic regression is used to derive the probability of a location being crop. These individual points are then aggregated to produce regional estimates of CA. District-level analysis of the Landsat-based estimates showed CA totals which supported the estimates of the Bureau of Agriculture and Rural Development. Continued work will evaluate the technique in other parts of Africa, while segmentation algorithms will be evaluated in order to automate classification of medium-resolution imagery for routine CA estimation in the future.
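The probability-to-area aggregation step can be sketched as follows; the feature choices, toy training labels, and 30 m pixel size are assumptions standing in for the IKONOS-calibrated Landsat workflow described above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-pixel predictors, e.g. image texture, a land-cover flag,
# and SRTM-derived slope; labels come from unbiased high-resolution samples.
rng = np.random.default_rng(3)
X_train = rng.standard_normal((500, 3))
y_train = (X_train[:, 0] - 0.5 * X_train[:, 2] > 0).astype(int)  # 1 = crop

model = LogisticRegression().fit(X_train, y_train)

# Regional cropped-area estimate: sum per-pixel crop probabilities over the
# district and convert pixel counts to hectares (30 m pixels assumed).
p_crop = model.predict_proba(rng.standard_normal((10_000, 3)))[:, 1]
ca_hectares = p_crop.sum() * (30 * 30) / 10_000.0
```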
Yang, Lixia; Mu, Yuming; Quaglia, Luiz Augusto; Tang, Qi; Guan, Lina; Wang, Chunmei; Shih, Ming Chi
2012-01-01
The study aim was to compare two different stress echocardiography interpretation techniques based on the correlation with thrombosis in myocardial infarction (TIMI) flow grading in acute coronary syndrome (ACS) patients. Forty-one patients with suspected ACS were studied before diagnostic coronary angiography with myocardial contrast echocardiography (MCE) at rest and at stress. The correlation between visual interpretation of MCE and TIMI flow grade was significant, as was the correlation between the quantitative analysis (myocardial perfusion parameters: A, β, and A × β) and TIMI flow grade. MCE visual interpretation and TIMI flow grade had a high degree of agreement in diagnosing myocardial perfusion abnormality. If one considers TIMI flow grade <3 as abnormal, MCE visual interpretation at rest had 73.1% accuracy with 58.2% sensitivity and 84.2% specificity, and at stress had 80.4% accuracy with 76.6% sensitivity and 83.3% specificity. The MCE quantitative analysis had better accuracy, with 100% agreement across the different TIMI flow grades. MCE quantitative analysis at stress showed a direct correlation with TIMI flow grade, more significant than the visual interpretation technique. Further studies could measure the clinical relevance of this more objective approach to managing acute coronary syndrome patients before percutaneous coronary intervention (PCI). PMID:22778555
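The A and β parameters above are consistent with the standard microbubble destruction-replenishment model commonly fitted in quantitative MCE, stated here as background rather than as the paper's exact formulation:

```latex
% Replenishment of contrast signal y(t) after microbubble destruction:
% plateau A tracks myocardial blood volume, rate beta tracks red-cell
% velocity, and A*beta serves as a myocardial blood-flow index.
\[
  y(t) = A\left(1 - e^{-\beta t}\right),
  \qquad \text{flow index} \propto A \times \beta .
\]
```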
Raposo, Letícia M; Nobre, Flavio F
2017-08-30
Resistance to antiretrovirals (ARVs) is a major problem faced by HIV-infected individuals. Different rule-based algorithms were developed to infer HIV-1 susceptibility to antiretrovirals from genotypic data. However, there is discordance between them, resulting in difficulties for clinical decisions about which treatment to use. Here, we developed ensemble classifiers integrating three interpretation algorithms: Agence Nationale de Recherche sur le SIDA (ANRS), Rega, and the genotypic resistance interpretation system from Stanford HIV Drug Resistance Database (HIVdb). Three approaches were applied to develop a classifier with a single resistance profile: stacked generalization, a simple plurality vote scheme and the selection of the interpretation system with the best performance. The strategies were compared with the Friedman's test and the performance of the classifiers was evaluated using the F-measure, sensitivity and specificity values. We found that the three strategies had similar performances for the selected antiretrovirals. For some cases, the stacking technique with naïve Bayes as the learning algorithm showed a statistically superior F-measure. This study demonstrates that ensemble classifiers can be an alternative tool for clinical decision-making since they provide a single resistance profile from the most commonly used resistance interpretation systems.
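The stacking strategy can be sketched by treating the three systems' resistance calls as input features for a naive Bayes meta-learner; the data below are random stand-ins for ANRS, Rega, and HIVdb outputs, and the reference labels are synthetic:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Per-sequence calls from three rule-based systems (0 = susceptible,
# 1 = resistant), plus a toy reference label for training.
rng = np.random.default_rng(4)
calls = rng.integers(0, 2, size=(300, 3)).astype(float)
truth = (calls.sum(axis=1) >= 2).astype(int)   # synthetic ground truth

# Stacked generalization: the base systems' outputs feed a meta-learner.
meta = GaussianNB().fit(calls, truth)
print(meta.predict([[1.0, 0.0, 1.0]]))         # single consensus profile

# Plurality vote, the simplest of the compared strategies.
vote = (calls.sum(axis=1) >= 2).astype(int)
```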
Tutorial review of seismic surface waves' phenomenology
NASA Astrophysics Data System (ADS)
Levshin, A. L.; Barmin, M. P.; Ritzwoller, M. H.
2018-03-01
In recent years, surface wave seismology has become one of the leading directions in seismological investigations of the Earth's structure and seismic sources. Various applications cover a wide spectrum of goals, dealing with differences in sources of seismic excitation, penetration depths, frequency ranges, and interpretation techniques. Observed seismic data demonstrate a great variety of phenomena, which can produce difficulties in interpretation for beginners. This tutorial review is based on the authors' many years of experience in processing and interpreting seismic surface wave observations, and on the lectures of one of the authors (ALL) at the Workshops on Seismic Wave Excitation, Propagation and Interpretation held at the Abdus Salam International Centre for Theoretical Physics (Trieste, Italy) in 1990-2012. We present some typical examples of wave patterns which could be encountered in different applications and which can serve as a guide to the analysis of observed seismograms.
Liao, David; Tlsty, Thea D.
2014-01-01
The use of mathematical equations to analyse population dynamics measurements is being increasingly applied to elucidate complex dynamic processes in biological systems, including cancer. Purely ‘empirical’ equations may provide sufficient accuracy to support predictions and therapy design. Nevertheless, interpretation of fitting equations in terms of physical and biological propositions can provide additional insights that can be used both to refine models that prove inconsistent with data and to understand the scope of applicability of models that validate. The purpose of this tutorial is to assist readers in mathematically associating interpretations with equations and to provide guidance in choosing interpretations and experimental systems to investigate based on currently available biological knowledge, techniques in mathematical and computational analysis and methods for in vitro and in vivo experiments. PMID:25097752
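As a small illustration of associating interpretations with fitted equations, the sketch below fits a logistic growth law and reads its parameters as biological propositions; the data, model choice, and parameter meanings are generic assumptions, not an example from the tutorial:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, n0, r, K):
    # r: per-capita growth rate; K: carrying capacity; n0: initial count.
    return K / (1.0 + (K / n0 - 1.0) * np.exp(-r * t))

t = np.linspace(0, 10, 25)
counts = logistic(t, 50, 0.9, 1e4) * (1 + 0.05 * np.random.randn(25))

(n0, r, K), _ = curve_fit(logistic, t, counts, p0=(10, 0.5, 5e3))
print(f"r = {r:.2f} per day, K = {K:.0f} cells")
```

If the fit validates, r and K become testable propositions; if it fails, the residual pattern suggests how to refine the model, which is the loop the tutorial formalizes.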
NASA Astrophysics Data System (ADS)
Hramov, Alexander; Musatov, Vyacheslav Yu.; Runnova, Anastasija E.; Efremova, Tatiana Yu.; Koronovskii, Alexey A.; Pisarchik, Alexander N.
2018-04-01
In this paper we propose an approach based on artificial neural networks for the recognition of different human brain states associated with distinct visual stimuli. Based on the developed numerical technique and the analysis of the experimental multichannel EEG data obtained, we optimize the spatiotemporal representation of multichannel EEG to provide close to 97% accuracy in recognition of the EEG brain states during visual perception. Different interpretations of an ambiguous image produce different oscillatory patterns in the human EEG, with similar features for every interpretation. Since these features are inherent to all subjects, a single artificial network can classify with high quality the associated brain states of other subjects.
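A minimal stand-in for the classification stage might use a feed-forward network on flattened EEG windows; the window size, architecture, and synthetic labels below are assumptions, and the paper's spatiotemporal optimization is not reproduced:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
X = rng.standard_normal((200, 32 * 64))        # 200 windows: 32 ch x 64 samples
y = (X[:, :64].mean(axis=1) > 0).astype(int)   # toy "interpretation" label

# One hidden layer suffices for this toy problem; the study's contribution
# lies in optimizing the spatiotemporal representation fed to the network.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```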
Influence of dipolar interactions on the superparamagnetic relaxation time of γ-Fe2O3
NASA Astrophysics Data System (ADS)
Labzour, A.; Housni, A.; Limame, K.; Essahlaoui, A.; Sayouri, S.
2017-03-01
The influence of dipolar interactions on the Néel superparamagnetic relaxation time, τ, of an assembly of ultrafine ferromagnetic particles (γ-Fe2O3) with uniaxial anisotropy and of different sizes has been widely studied using the Mössbauer technique. These studies, based on different analytical approaches, have shown that τ decreases with increasing interactions between particles. To interpret these results, we propose a model where interaction effects are considered as being due to a constant, randomly oriented external magnetic field B(Ψ, ϕ). The model is based on the resolution of the Fokker-Planck equation (FPE), generalizes previous calculations and gives a satisfactory interpretation of the relaxation phenomenon in such systems.
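For context, the superparamagnetic relaxation time follows the Néel-Arrhenius law, and a field along the easy axis lowers the anisotropy barrier; the field-modified expression below is the standard Stoner-Wohlfarth result, given as background rather than as the authors' model:

```latex
% Neel-Arrhenius relaxation over the anisotropy barrier KV:
\[
  \tau = \tau_0 \exp\!\left(\frac{KV}{k_B T}\right).
\]
% A field h (reduced by the anisotropy field) along the easy axis lowers
% the barrier, shortening tau, consistent with the observed decrease of
% tau with increasing interparticle interactions:
\[
  \tau = \tau_0 \exp\!\left(\frac{KV\,(1-h)^2}{k_B T}\right),
  \qquad h = \frac{\mu_0 H M_s}{2K}.
\]
```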
Methodology of remote sensing data interpretation and geological applications. [Brazil
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Veneziani, P.; Dosanjos, C. E.
1982-01-01
Elements of photointerpretation discussed include the analysis of photographic texture and structure as well as film tonality. The method used is based on conventional techniques developed for interpreting aerial black and white photographs. By defining the properties which characterize the form and individuality of dual images, homologous zones can be identified. Guy's logic method (1966) was adapted and used on functions of resolution, scale, and spectral characteristics of remotely sensed products. Applications of LANDSAT imagery are discussed for regional geological mapping, mineral exploration, hydrogeology, and geotechnical engineering in Brazil.
Fault diagnosis model for power transformers based on information fusion
NASA Astrophysics Data System (ADS)
Dong, Ming; Yan, Zhang; Yang, Li; Judd, Martin D.
2005-07-01
Methods used to assess the insulation status of power transformers before they deteriorate to a critical state include dissolved gas analysis (DGA), partial discharge (PD) detection and transfer function techniques, etc. All of these approaches require experience in order to correctly interpret the observations. Artificial intelligence (AI) is increasingly used to improve interpretation of the individual datasets. However, a satisfactory diagnosis may not be obtained if only one technique is used. For example, the exact location of PD cannot be predicted if only DGA is performed. However, using diverse methods may result in different diagnosis solutions, a problem that is addressed in this paper through the introduction of a fuzzy information fusion model. An inference scheme is proposed that yields consistent conclusions and manages the inherent uncertainty in the various methods. With the aid of information fusion, a framework is established that allows different diagnostic tools to be combined in a systematic way. The application of the information fusion technique to insulation diagnostics of transformers is shown to be promising by means of examples.
Remote Sensing Applications with High Reliability in Changjiang Water Resource Management
NASA Astrophysics Data System (ADS)
Ma, L.; Gao, S.; Yang, A.
2018-04-01
Remote sensing technology has been widely used in many fields. But most of the applications cannot get information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which can achieve over 85% correctness in Water Resource Management from the viewpoints of photogrammetry and expert knowledge. The techniques comprise spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang Water Resource Management, including water pollution investigation, illegal construction inspection, and water conservation monitoring, etc.
NASA Technical Reports Server (NTRS)
Eppler, Dean B.; Bleacher, Jacob F.; Evans, Cynthia A.; Feng, Wanda; Gruener, John; Hurwitz, Debra M.; Skinner, J. A., Jr.; Whitson, Peggy; Janoiko, Barbara
2013-01-01
Geologic maps integrate the distributions, contacts, and compositions of rock and sediment bodies as a means to interpret local to regional formative histories. Applying terrestrial mapping techniques to other planets is challenging because data is collected primarily by orbiting instruments, with infrequent, spatially limited in situ human and robotic exploration. Although geologic maps developed using remote data sets and limited "Apollo-style" field access likely contain inaccuracies, the magnitude, type, and occurrence of these are only marginally understood. This project evaluates the interpretative and cartographic accuracy of both field- and remote-based mapping approaches by comparing two 1:24,000 scale geologic maps of the San Francisco Volcanic Field (SFVF), north-central Arizona. The first map is based on traditional field mapping techniques, while the second is based on remote data sets, augmented with limited field observations collected during NASA Desert Research & Technology Studies (RATS) 2010 exercises. The RATS mission used Apollo-style methods not only for pre-mission traverse planning but also to conduct geologic sampling as part of science operation tests. Cross-comparison demonstrates that the Apollo-style map identifies many of the same rock units and determines a similar broad history as the field-based map. However, field mapping techniques allow markedly improved discrimination of map units, particularly unconsolidated surficial deposits, and recognize a more complex eruptive history than was possible using Apollo-style data. Further, the distribution of unconsolidated surface units was more obvious in the remote sensing data to the field team after conducting the fieldwork. The study raises questions about the most effective approach to balancing mission costs with the rate of knowledge capture, suggesting that there is an inflection point in the "knowledge capture curve" beyond which additional resource investment yields progressively smaller gains in geologic knowledge.
Skorupa, Agnieszka; Wicher, Magdalena; Banasik, Tomasz; Jamroz, Ewa; Paprocka, Justyna; Kiełtyka, Aleksandra; Sokół, Maria; Konopka, Marek
2014-05-08
The primary purpose of this work was to assess the long-term in vitro reproducibility of metabolite levels measured using 1H MRS (proton magnetic resonance spectroscopy). The secondary purpose was to use the in vitro results for the interpretation of 1H MRS in vivo spectra acquired from patients diagnosed with Canavan disease. 1H MRS measurements were performed in the period from April 2006 to September 2010. 118 short-echo and 116 long-echo spectra were acquired from a stable phantom during this period. Change-point analysis of the in vitro N-acetylaspartate levels was exploited in the computation of the fT factor (the ratio of the actual to the reference N-acetylaspartate level, normalized by the reciprocity principle). This coefficient was utilized in the interpretation of in vivo spectra analyzed using the absolute reference technique. The monitored time period was divided into six time intervals based on the short-echo in vitro data (seven time intervals based on the long-echo in vitro data), characterized by fT coefficients ranging from 0.97 to 1.09 (short-echo data) and from 1.0 to 1.11 (long-echo data). Application of this coefficient to the interpretation of in vivo spectra confirmed an increased N-acetylaspartate level in Canavan disease. Long-term monitoring of MRS system reproducibility, allowing for absolute referencing of metabolite levels, facilitates the interpretation of metabolic changes in white matter disorders.
Gibert, Karina; García-Rudolph, Alejandro; García-Molina, Alberto; Roig-Rovira, Teresa; Bernabeu, Montse; Tormos, José María
2008-01-01
The objective was to develop a classificatory tool to identify different populations of patients with Traumatic Brain Injury based on the characteristics of deficit and response to treatment. A KDD framework was used in which descriptive statistics of every variable, data cleaning, and selection of relevant variables were performed first. The data were then mined using a generalization of Clustering Based on Rules (CIBR), a hybrid AI-and-statistics technique which combines inductive learning (AI) and clustering (statistics). A prior Knowledge Base (KB) is considered to properly bias the clustering; semantic constraints implied by the KB hold in the final clusters, guaranteeing interpretability of the results. A generalization (Exogenous Clustering Based on Rules, ECIBR) is presented, which allows the KB to be defined in terms of variables that are not themselves considered in the clustering process, providing greater flexibility. Several tools, such as the class panel graph, are introduced in the methodology to assist final interpretation. A set of five classes was recommended by the system, and interpretation permitted the labeling of profiles. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing response to rehabilitation treatment. All patients who were initially assessable form a single group. Severely impaired patients are subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial-response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were substantially improved over classical clustering, supporting our view that hybrid AI-and-statistics techniques are more powerful for KDD than pure ones.
Subsurface Mapping: A Question of Position and Interpretation
ERIC Educational Resources Information Center
Kellie, Andrew C.
2009-01-01
This paper discusses the character and challenges inherent in the graphical portrayal of features in subsurface mapping. Subsurface structures are, by their nature, hidden and must be mapped based on drilling and/or geophysical data. Efficient use of graphical techniques is central to effectively communicating the results of expensive exploration…
Using, Abusing, and Understanding Research: A Guide for Counselors.
ERIC Educational Resources Information Center
Campbell, Patricia B.
This brochure highlights the role of the school counselor in using educational research, and emphasizes the relationship between bias (racism and sexism) and past and present research. The tests counselors use, the ways test results are interpreted, and counseling techniques are all based on educational research. Counselors are cautioned to…
Distributed acoustic sensing technique and its field trial in SAGD well
NASA Astrophysics Data System (ADS)
Han, Li; He, Xiangge; Pan, Yong; Liu, Fei; Yi, Duo; Hu, Chengjun; Zhang, Min; Gu, Lijuan
2017-10-01
Steam assisted gravity drainage (SAGD) is a very promising technique for the development of heavy oil, extra heavy oil, and tight oil reservoirs. Proper monitoring of SAGD operations is essential to avoid operational issues and improve efficiency. Among monitoring techniques, micro-seismic monitoring and its related interpretation methods can give useful information about steam chamber development and have been extensively studied. The distributed acoustic sensor (DAS), based on Rayleigh backscattering, is a newly developed technique that can measure the acoustic signal at all points along a sensing fiber. In this paper, we demonstrate a DAS system based on a dual-pulse heterodyne demodulation technique and report a field trial in a SAGD well located in the Xinjiang Oilfield, China. The field trial results validated the performance of the DAS system and indicated its applicability to steam-chamber monitoring and hydraulic monitoring.
Karl, Jason W.; Gillan, Jeffrey K.; Barger, Nichole N.; Herrick, Jeffrey E.; Duniway, Michael C.
2014-01-01
The use of very high resolution (VHR; ground sampling distances < ∼5 cm) aerial imagery to estimate site vegetation cover and to detect changes from management has been well documented. However, as the purpose of monitoring is to document change over time, the ability to detect changes from imagery at the same or better level of accuracy and precision as those measured in situ must be assessed for image-based techniques to become reliable tools for ecosystem monitoring. Our objective with this study was to quantify the relationship between field-measured and image-interpreted changes in vegetation and ground cover measured one year apart in a Piñon and Juniper (P–J) woodland in southern Utah, USA. The study area was subject to a variety of fuel removal treatments between 2009 and 2010. We measured changes in plant community composition and ground cover along transects in a control area and three different treatments prior to and following P–J removal. We compared these measurements to vegetation composition and change based on photo-interpretation of ∼4 cm ground sampling distance imagery along similar transects. Estimates of cover were similar between field-based and image-interpreted methods in 2009 and 2010 for woody vegetation, no vegetation, herbaceous vegetation, and litter (including woody litter). Image-interpretation slightly overestimated cover for woody vegetation and no-vegetation classes (average difference between methods of 1.34% and 5.85%) and tended to underestimate cover for herbaceous vegetation and litter (average difference of −5.18% and 0.27%), but the differences were significant only for litter cover in 2009. Level of agreement between the field-measurements and image-interpretation was good for woody vegetation and no-vegetation classes (r between 0.47 and 0.89), but generally poorer for herbaceous vegetation and litter (r between 0.18 and 0.81) likely due to differences in image quality by year and the difficulty in discriminating fine vegetation and litter in imagery. Our results show that image interpretation to detect vegetation changes has utility for monitoring fuels reduction treatments in terms of woody vegetation and no-vegetation classes. The benefits of this technique are that it provides objective and repeatable measurements of site conditions that could be implemented relatively inexpensively and easily without the need for highly specialized software or technical expertise. Perhaps the biggest limitations of image interpretation to monitoring fuels treatments are challenges in estimating litter and herbaceous vegetation cover and the sensitivity of herbaceous cover estimates to image quality and shadowing.
NASA Technical Reports Server (NTRS)
1974-01-01
A comprehensive land use planning process model is being developed in Meade County, South Dakota, using remote sensing technology. The proper role of remote sensing in the land use planning process is being determined by interaction of remote sensing specialists with local land use planners. The data that were collected by remote sensing techniques are as follows: (1) level I land use data interpreted at a scale of 1:250,000 from false color enlargement prints of ERTS-1 color composite transparencies; (2) detailed land use data interpreted at a scale of 1:24,000 from enlargement color prints of high altitude RB-57 photography; and (3) general soils map interpreted at a scale of 1:250,000 from false color enlargement prints of ERTS-1 color composite transparencies. In addition to use of imagery as an interpretation aid, the utility of using photographs as base maps was demonstrated.
Evolutionary fuzzy modeling human diagnostic decisions.
Peña-Reyes, Carlos Andrés
2004-05-01
Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems of both high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool called COBRA for aiding radiologists in mammography interpretation.
Grauch, V.J.S.; Phillips, Jeffrey D.; Koning, Daniel J.; Johnson, Peggy S.; Bankey, Viki
2009-01-01
The southern Espanola basin consists of a westward- and northward-thickening wedge of rift fill, composed primarily of Santa Fe Group sediments, that serves as an important aquifer for the city of Santa Fe and surrounding areas. Detailed aeromagnetic surveys were flown to better understand ground-water resources in this aquifer. This report presents a synthesis of these data with gravity data and other constraints. The interpretations were accomplished using qualitative interpretation, state-of-the-art data analysis techniques, and two- and three-dimensional modeling. The results depict the presence of and depth to many geologic features that have hydrogeologic significance, including shallow faults, different types of igneous units, and basement rocks. The results are presented as map interpretations, geophysical profile models, and a digital surface that represents the base and thickness of Santa Fe Group sediments, as well as vector files of some volcanic features and faults.
Learning accurate and interpretable models based on regularized random forests regression
2014-01-01
Background Many biology-related research works combine data from multiple sources in an effort to understand the underlying problems. It is important to identify and interpret the most relevant information from these sources. Thus it will be beneficial to have an effective algorithm that can simultaneously extract decision rules and select critical features for good interpretation while preserving the prediction performance. Methods In this study, we focus on regression problems for biological data where target outcomes are continuous. In general, models constructed from linear regression approaches are relatively easy to interpret. However, many practical biological applications are nonlinear in essence, and we can hardly find a direct linear relationship between input and output. Nonlinear regression techniques can reveal nonlinear relationships in data but are generally hard for humans to interpret. We propose a rule-based regression algorithm that uses 1-norm regularized random forests. The proposed approach simultaneously extracts a small number of rules from generated random forests and eliminates unimportant features. Results We tested the approach on several biological data sets. The proposed approach is able to construct a significantly smaller set of regression rules using a subset of attributes while achieving prediction performance comparable to that of random forests regression. Conclusion It demonstrates high potential in aiding prediction and interpretation of nonlinear relationships of the subject being studied. PMID:25350120
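The abstract gives no implementation, but the general recipe (generate candidate rules with a random forest, then keep a sparse subset via a 1-norm penalty) can be sketched as follows; the dataset, the leaf-indicator encoding, and the regularization strength are illustrative assumptions, not the authors' actual algorithm.

```python
# Minimal sketch of rule extraction via 1-norm regularization:
# encode each tree leaf as a binary rule feature, then let Lasso's
# L1 penalty keep only a small set of rules (a RuleFit-style idea).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso
from sklearn.preprocessing import OneHotEncoder

X, y = make_regression(n_samples=300, n_features=20, noise=5.0, random_state=0)

forest = RandomForestRegressor(n_estimators=50, max_depth=3, random_state=0)
forest.fit(X, y)

# apply() gives the leaf index each sample falls into, per tree;
# one-hot encoding turns every leaf (a conjunction of splits) into
# a binary "rule fires / does not fire" feature.
leaves = forest.apply(X)
encoder = OneHotEncoder(handle_unknown="ignore")
rule_features = encoder.fit_transform(leaves)

lasso = Lasso(alpha=0.5)  # alpha controls how aggressively rules are pruned
lasso.fit(rule_features.toarray(), y)

n_kept = np.count_nonzero(lasso.coef_)
print(f"{n_kept} of {rule_features.shape[1]} candidate rules retained")
```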
In with the new, out with the old? Auto-extraction for remote sensing archaeology
NASA Astrophysics Data System (ADS)
Cowley, David C.
2012-09-01
This paper explores aspects of the inter-relationships between traditional archaeological interpretation of remotely sensed data (principally visual examination of aerial photographs and satellite imagery) and approaches drawing on automated feature extraction and processing. Established approaches to archaeological interpretation of aerial photographs are heavily reliant on individual observation (eye/brain) in an experience- and knowledge-based process. Increasingly, however, much more complex and extensive datasets are becoming available to archaeology, and these require critical reflection on analytical and interpretative processes. Archaeological applications of Airborne Laser Scanning (ALS) are becoming increasingly routine, and as the spatial resolution of hyper-spectral data improves, its potentially massive implications for archaeological site detection may prove to be a sea-change. These complex datasets demand new approaches, as traditional methods based on direct observation by an archaeological interpreter will never do more than scratch the surface, and will fail to fully extend the boundaries of knowledge. Inevitably, changing analytical and interpretative processes can create tensions, especially, as has been the case in archaeology, when the innovations in data and analysis come from outside the discipline. These tensions often centre on the character of the information produced, and a lack of clarity on the place of archaeological interpretation in the workflow. This is especially true for ALS data and auto-extraction techniques, and carries implications for all forms of remotely sensed archaeological datasets, including hyperspectral data and aerial photographs.
NASA Astrophysics Data System (ADS)
Pandey, Rishi Kumar; Mishra, Hradyesh Kumar
2017-11-01
In this paper, a semi-analytic numerical technique for the solution of the time-space fractional telegraph equation is applied. The technique is based on coupling the homotopy analysis method with the Sumudu transform. It shows a clear advantage over mesh-based methods such as the finite difference method, and also over polynomial methods such as the perturbation and Adomian decomposition methods. It easily transforms the complex fractional-order derivatives into the simple time domain and interprets the results in the same sense.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loveday, D.L.; Craggs, C.
Box-Jenkins-based multivariate stochastic modeling is carried out using data recorded from a domestic heating system. The system comprises an air-source heat pump sited in the roof space of a house, solar assistance being provided by the conventional tile roof acting as a radiation absorber. Multivariate models are presented which illustrate the time-dependent relationships between three air temperatures: at external ambient, at entry to, and at exit from, the heat pump evaporator. Using a deterministic modeling approach, physical interpretations are placed on the results of the multivariate technique. It is concluded that the multivariate Box-Jenkins approach is a suitable technique for building thermal analysis. Application to multivariate model-based control is discussed, with particular reference to building energy management systems. It is further concluded that stochastic modeling of data drawn from a short monitoring period offers a means of retrofitting an advanced model-based control system in existing buildings, which could be used to optimize energy savings. An approach to system simulation is suggested.
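For readers unfamiliar with multivariate Box-Jenkins modeling, the sketch below fits a vector autoregression to three coupled temperature series using `statsmodels`; the synthetic data and lag order are placeholders, not the study's recorded measurements.

```python
# Minimal sketch: fit a multivariate (vector) autoregressive model to
# three coupled air-temperature series, in the spirit of multivariate
# Box-Jenkins modeling. Data here are synthetic placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 500
ambient = np.cumsum(rng.normal(0, 0.1, n)) + 5.0
evap_in = 0.8 * ambient + rng.normal(0, 0.2, n) + 2.0
evap_out = 0.6 * evap_in + rng.normal(0, 0.2, n) - 3.0

data = pd.DataFrame({
    "ambient": ambient,
    "evaporator_inlet": evap_in,
    "evaporator_outlet": evap_out,
})

model = VAR(data)
fit = model.fit(maxlags=4, ic="aic")  # choose lag order by AIC
print(fit.summary())
# One-step-ahead forecast from the last observed lags:
print(fit.forecast(data.values[-fit.k_ar:], steps=1))
```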
Tools to Support Interpreting Multiple Regression in the Face of Multicollinearity
Kraha, Amanda; Turner, Heather; Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K.
2012-01-01
While multicollinearity may increase the difficulty of interpreting multiple regression (MR) results, it should not cause undue problems for the knowledgeable researcher. In the current paper, we argue that rather than using one technique to investigate regression results, researchers should consider multiple indices to understand the contributions that predictors make not only to a regression model, but to each other as well. Some of the techniques to interpret MR effects include, but are not limited to, correlation coefficients, beta weights, structure coefficients, all possible subsets regression, commonality coefficients, dominance weights, and relative importance weights. This article will review a set of techniques to interpret MR effects, identify the elements of the data on which the methods focus, and identify statistical software to support such analyses. PMID:22457655
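As a concrete illustration of two of the indices listed above, the sketch below computes standardized beta weights and structure coefficients for a small regression; the data are simulated and the variable names are hypothetical.

```python
# Minimal sketch: two indices for interpreting multiple regression
# under multicollinearity -- standardized beta weights and structure
# coefficients (correlations of each predictor with yhat).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.3 * rng.normal(size=n)  # deliberately collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 1.0 * x1 + 0.5 * x3 + rng.normal(size=n)

# Standardize so the regression weights are beta weights.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
yz = (y - y.mean()) / y.std()
betas = LinearRegression().fit(Xz, yz).coef_

# Structure coefficients: correlation of each predictor with yhat.
yhat = LinearRegression().fit(X, y).predict(X)
structure = [np.corrcoef(X[:, j], yhat)[0, 1] for j in range(X.shape[1])]

for j, (b, s) in enumerate(zip(betas, structure), start=1):
    print(f"x{j}: beta = {b:+.3f}, structure coefficient = {s:+.3f}")
```

A collinear predictor such as x2 can show a near-zero beta weight yet a large structure coefficient, which is exactly why the paper recommends consulting multiple indices.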
NASA Technical Reports Server (NTRS)
Harwood, P. (Principal Investigator); Malin, P.; Finley, R.; Mcculloch, S.; Murphy, D.; Hupp, B.; Schell, J. A.
1977-01-01
The author has identified the following significant results. Four LANDSAT scenes were analyzed for the Harbor Island area test sites to produce land cover and land use maps using both image interpretation and computer-assisted techniques. When evaluated against aerial photography, the mean accuracy for three scenes was 84% for the image interpretation product and 62% for the computer-assisted classification maps. Analysis of the fourth scene was not completed using the image interpretation technique because of a poor-quality false color composite, but was available from the computer technique. Preliminary results indicate that these LANDSAT products can be applied to a variety of planning and management activities in the Texas coastal zone.
Natural Resource Information System, remote sensing studies
NASA Technical Reports Server (NTRS)
Leachtenauer, J.; Hirsch, R.; Williams, V.; Tucker, R.
1972-01-01
Potential applications of remote sensing data were reviewed, and available imagery was interpreted to provide input to a demonstration data base. A literature review was conducted to determine the types and qualities of imagery required to satisfy identified data needs. Ektachrome imagery available over the demonstration areas was reviewed to establish the feasibility of interpreting cultural features, range condition, and timber type. Using the same imagery, a land use map was prepared for the demonstration area. The feasibility of identifying commercial timber areas using a density slicing technique was tested on multispectral imagery available for a portion of the demonstration area.
Strategy for an Extensible Microcomputer-Based Mumps System for Private Practice
Walters, Richard F.; Johnson, Stephen L.
1979-01-01
A macro expander technique has been adopted to generate a machine independent single user version of ANSI Standard MUMPS running on an 8080 Microcomputer. This approach makes it possible to have the medically oriented MUMPS language available on inexpensive systems suitable for small group practice settings. Substitution of another macro expansion set allows the same interpreter to be implemented on another computer, thereby providing compatibility with comparable or larger scale systems. Furthermore, since the global file handler can be separated from the interpreter, this approach permits development of a distributed MUMPS system with no change in applications software.
Technique for ranking potential predictor layers for use in remote sensing analysis
Andrew Lister; Mike Hoppus; Rachel Riemann
2004-01-01
Spatial modeling using GIS-based predictor layers often requires that extraneous predictors be culled before conducting analysis. In some cases, using extraneous predictor layers might improve model accuracy, but at the expense of increased complexity and reduced interpretability. In other cases, using extraneous layers can dilute the relationship between predictors and target...
Observations of the Geometry of Horizon-Based Optical Navigation
NASA Technical Reports Server (NTRS)
Christian, John; Robinson, Shane
2016-01-01
NASA's Orion Project has sparked a renewed interest in horizon-based optical navigation(OPNAV) techniques for spacecraft in the Earth-Moon system. Some approaches have begun to explore the geometry of horizon-based OPNAV and exploit the fact that it is a conic section problem. Therefore, the present paper focuses more deeply on understanding and leveraging the various geometric interpretations of horizon-based OPNAV. These results provide valuable insight into the fundamental workings of OPNAV solution methods, their convergence properties, and associated estimate covariance. Most importantly, the geometry and transformations uncovered in this paper lead to a simple and non-iterative solution to the generic horizon-based OPNAV problem. This represents a significant theoretical advancement over existing methods. Thus, we find that a clear understanding of geometric relationships is central to the prudent design, use, and operation of horizon-based OPNAV techniques.
Applications of Advanced, Waveform Based AE Techniques for Testing Composite Materials
NASA Technical Reports Server (NTRS)
Prosser, William H.
1996-01-01
Advanced, waveform based acoustic emission (AE) techniques have been previously used to evaluate damage progression in laboratory tests of composite coupons. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite structures, the effects of wave propagation over larger distances and through structural complexities must be well characterized and understood. In this research, measurements were made of the attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels. As these materials have applications in a cryogenic environment, the effects of cryogenic insulation on the attenuation of plate mode AE signals were also documented.
Application of remote sensing to estimating soil erosion potential
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Kiefer, R. W.
1980-01-01
A variety of remote sensing data sources and interpretation techniques has been tested in a 6136 hectare watershed with agricultural, forest and urban land cover to determine the relative utility of alternative aerial photographic data sources for gathering the desired land use/land cover data. The principal photographic data sources are high altitude 9 x 9 inch color infrared photos at 1:120,000 and 1:60,000 and multi-date medium altitude color and color infrared photos at 1:60,000. Principal data for estimating soil erosion potential include precipitation, soil, slope, crop, crop practice, and land use/land cover data derived from topographic maps, soil maps, and remote sensing. A computer-based geographic information system organized on a one-hectare grid cell basis is used to store and quantify the information collected using different data sources and interpretation techniques. Research results are compared with traditional Universal Soil Loss Equation field survey methods.
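Since the abstract revolves around the Universal Soil Loss Equation, a minimal per-cell computation is sketched below; the factor values are invented for illustration, and real applications derive them from the precipitation, soil, slope, crop, and cover data the abstract lists.

```python
# Minimal sketch of the Universal Soil Loss Equation applied on a
# grid-cell basis: A = R * K * LS * C * P, where A is the predicted
# annual soil loss. All factor grids below are invented examples.
import numpy as np

R = np.full((3, 3), 125.0)           # rainfall erosivity factor
K = np.array([[0.28, 0.28, 0.31],    # soil erodibility factor
              [0.28, 0.31, 0.31],
              [0.31, 0.31, 0.35]])
LS = np.array([[0.5, 0.7, 1.2],      # slope length-steepness factor
               [0.6, 1.0, 1.8],
               [0.8, 1.4, 2.5]])
C = np.full((3, 3), 0.25)            # cover-management factor (e.g., row crop)
P = np.full((3, 3), 1.0)             # support practice factor (none)

A = R * K * LS * C * P               # soil loss per cell (tons/acre/year in
                                     # customary US units)
print(np.round(A, 2))
```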
Probabilistic registration of an unbiased statistical shape model to ultrasound images of the spine
NASA Astrophysics Data System (ADS)
Rasoulian, Abtin; Rohling, Robert N.; Abolmaesumi, Purang
2012-02-01
The placement of an epidural needle is among the most difficult regional anesthetic techniques. Ultrasound has been proposed to improve success of placement. However, it has not become the standard-of-care because of limitations in the depictions and interpretation of the key anatomical features. We propose to augment the ultrasound images with a registered statistical shape model of the spine to aid interpretation. The model is created with a novel deformable group-wise registration method which utilizes a probabilistic approach to register groups of point sets. The method is compared to a volume-based model building technique and it demonstrates better generalization and compactness. We instantiate and register the shape model to a spine surface probability map extracted from the ultrasound images. Validation is performed on human subjects. The achieved registration accuracy (2-4 mm) is sufficient to guide the choice of puncture site and trajectory of an epidural needle.
NASA Astrophysics Data System (ADS)
Vrbancich, Julian
2011-09-01
Helicopter time-domain airborne electromagnetic (AEM) methodology is being investigated as a reconnaissance technique for bathymetric mapping in shallow coastal waters, especially in areas affected by water turbidity where light detection and ranging (LIDAR) and hyperspectral techniques may be limited. Previous studies in Port Lincoln, South Australia, used a floating AEM time-domain system to provide an upper limit to the expected bathymetric accuracy based on current technology for AEM systems. The survey lines traced by the towed floating system were also flown with an airborne system using the same transmitter and receiver electronic instrumentation, on two separate occasions. On the second occasion, significant improvements had been made to the instrumentation to reduce the system self-response at early times. A comparison of the interpreted water depths obtained from the airborne and floating systems is presented, showing the degradation in bathymetric accuracy obtained from the airborne data. An empirical data correction method based on modelled and observed EM responses over deep seawater (i.e. a quasi half-space response) at varying survey altitudes, combined with known seawater conductivity measured during the survey, can lead to significant improvements in interpreted water depths and serves as a useful method for checking system calibration. Another empirical data correction method based on observed and modelled EM responses in shallow water was shown to lead to similar improvements in interpreted water depths; however, this procedure is notably inferior to the quasi half-space response because more parameters need to be assumed in order to compute the modelled EM response. A comparison between the results of the two airborne surveys in Port Lincoln shows that uncorrected data obtained from the second airborne survey gives good agreement with known water depths without the need to apply any empirical corrections to the data. This result significantly decreases the data-processing time thereby enabling the AEM method to serve as a rapid reconnaissance technique for bathymetric mapping.
Using Analytical Techniques to Interpret Financial Statements.
ERIC Educational Resources Information Center
Walters, Donald L.
1986-01-01
Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)
Direct push driven in situ color logging tool (CLT): technique, analysis routines, and application
NASA Astrophysics Data System (ADS)
Werban, U.; Hausmann, J.; Dietrich, P.; Vienken, T.
2014-12-01
Direct push technologies have recently seen broad development, providing several tools for in situ parameterization of unconsolidated sediments. One of these techniques is the measurement of soil colors - a proxy that relates to soil/sediment properties. We introduce the direct push driven color logging tool (CLT) for real-time and depth-resolved investigation of soil colors within the visible spectrum. Until now, no routines have existed for handling highly resolved (mm-scale) soil color data. To develop such a routine, we transform raw data (CIEXYZ) into soil color surrogates of selected color spaces (CIExyY, CIEL*a*b*, CIEL*c*h*, sRGB) and denoise small-scale natural variability by Haar and Daublet4 wavelet transformation, gathering interpretable color logs over depth. However, interpreting color log data as a single application remains challenging. Additional information, such as site-specific knowledge of the geological setting, is required to correlate soil color data to the properties of specific layers. Hence, we provide exemplary results from a joint interpretation of in situ-obtained soil color data and 'state-of-the-art' direct push based profiling tool data and discuss the benefit of the additional data. The developed routine is capable of transferring the colorimetric data into interpretable color surrogates. Soil color data proved to correlate with small-scale lithological/chemical changes (e.g., grain size, oxidative and reductive conditions), especially when combined with additional direct push vertical high-resolution data (e.g., cone penetration testing and soil sampling). Thus, the technique allows enhanced profiling by providing another reproducible high-resolution parameter for analyzing subsurface conditions. This opens potential new areas of application and new outputs for such data in site investigation. It is our intention to improve color measurements in terms of method application and data interpretation, so as to better characterize vadose zone/soil/sediment properties.
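The color-space transformations named above are standard; as an illustration, the following sketch converts a CIEXYZ reading to CIEL*a*b* under a D65 white point (the sample value is arbitrary, not CLT output).

```python
# Minimal sketch: convert a CIEXYZ color reading to CIEL*a*b*,
# one of the transformations named above. D65 reference white assumed.
import numpy as np

def xyz_to_lab(xyz, white=(95.047, 100.0, 108.883)):  # D65, 2-degree observer
    def f(t):
        delta = 6.0 / 29.0
        return np.where(t > delta**3, np.cbrt(t), t / (3 * delta**2) + 4.0 / 29.0)

    x, y, z = np.asarray(xyz, dtype=float) / np.asarray(white)
    L = 116.0 * f(y) - 16.0
    a = 500.0 * (f(x) - f(y))
    b = 200.0 * (f(y) - f(z))
    return float(L), float(a), float(b)

# Arbitrary example reading (not measured data):
print(tuple(round(v, 2) for v in xyz_to_lab((41.24, 21.26, 1.93))))
```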
Use of global ionospheric maps for HF Doppler measurements interpretation
NASA Astrophysics Data System (ADS)
Petrova, I. R.; Bochkarev, V. V.; Latypov, R. R.
2018-04-01
The HF Doppler technique, a method of measuring the Doppler frequency shift of an ionospheric signal, is one of the well-known and widely used techniques of ionosphere research. It allows investigation of various disturbances in the ionosphere. There are different sources of disturbances in the ionosphere, such as geomagnetic storms, solar flares, meteorological effects, and atmospheric waves. The HF Doppler technique allows us to detect the influence on the ionosphere of earthquakes, explosions, and other processes that occur near the Earth's surface. The technique has high sensitivity to small frequency variations and high time resolution, but interpretation of the results is difficult. In this paper, we attempt to use GPS data for the interpretation of Doppler measurements. Modeling Doppler frequency shift variations with the use of TEC allows the separation of medium-scale ionospheric disturbances.
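The link between TEC and the measured Doppler shift can be made explicit. To first order, the ionospheric phase refractive index gives a phase path P = L - (40.3/f^2) TEC, and differentiating yields the ionospheric contribution to the Doppler shift; this is the standard first-order relation, not a formula quoted from the paper.

```latex
% First-order ionospheric contribution to the HF Doppler shift,
% from \Delta f = -(f/c)\, dP/dt with P = L - (40.3/f^{2})\,\mathrm{TEC}:
\Delta f_{\mathrm{ion}} = \frac{40.3}{c\, f}\,\frac{d(\mathrm{TEC})}{dt},
\qquad \mathrm{TEC}\ \text{in}\ \mathrm{el/m^{2}},\quad f\ \text{in}\ \mathrm{Hz}.
```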
Applied photo interpretation for airbrush cartography
NASA Technical Reports Server (NTRS)
Inge, J. L.; Bridges, P. M.
1976-01-01
New techniques of cartographic portrayal have been developed for the compilation of maps of lunar and planetary surfaces. Conventional photo interpretation methods utilizing size, shape, shadow, tone, pattern, and texture are applied to computer processed satellite television images. The variety of the image data allows the illustrator to interpret image details by inter-comparison and intra-comparison of photographs. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The validity of the interpretation process is tested by making a representational drawing by an airbrush portrayal technique. Production controls insure the consistency of a map series. Photo interpretive cartographic portrayal skills are used to prepare two kinds of map series and are adaptable to map products of different kinds and purposes.
Scalable and Accurate SMT-based Model Checking of Data Flow Systems
2013-10-30
guided by the semantics of the description language. In this project we developed instead a complementary and novel approach based on a somewhat brute...believe that our approach could help considerably in expanding the reach of abstract interpretation techniques to a variety of target languages, as...project. We worked on developing a framework for compositional verification that capitalizes on the fact that data-flow languages, such as Lustre, have
Biggs, Jason D.; Voll, Judith A.; Mukamel, Shaul
2012-01-01
Two types of diagrammatic approaches for the design and simulation of nonlinear optical experiments (closed-time path loops based on the wave function and double-sided Feynman diagrams for the density matrix) are presented and compared. We give guidelines for the assignment of relevant pathways and provide rules for the interpretation of existing nonlinear experiments in carotenoids. PMID:22753822
Analysis and Interpretation of Findings Using Multiple Regression Techniques
ERIC Educational Resources Information Center
Hoyt, William T.; Leierer, Stephen; Millington, Michael J.
2006-01-01
Multiple regression and correlation (MRC) methods form a flexible family of statistical techniques that can address a wide variety of different types of research questions of interest to rehabilitation professionals. In this article, we review basic concepts and terms, with an emphasis on interpretation of findings relevant to research questions…
DOT National Transportation Integrated Search
2012-06-01
The objective of this study was to develop an approach for incorporating techniques to interpret and evaluate deflection : data for network-level pavement management system (PMS) applications. The first part of this research focused on : identifying ...
Non-local means denoising of dynamic PET images.
Dutta, Joyita; Leahy, Richard M; Li, Quanzheng
2013-01-01
Dynamic positron emission tomography (PET), which reveals information about both the spatial distribution and temporal kinetics of a radiotracer, enables quantitative interpretation of PET data. Model-based interpretation of dynamic PET images by means of parametric fitting, however, is often a challenging task due to high levels of noise, thus necessitating a denoising step. The objective of this paper is to develop and characterize a denoising framework for dynamic PET based on non-local means (NLM). NLM denoising computes weighted averages of voxel intensities assigning larger weights to voxels that are similar to a given voxel in terms of their local neighborhoods or patches. We introduce three key modifications to tailor the original NLM framework to dynamic PET. Firstly, we derive similarities from less noisy later time points in a typical PET acquisition to denoise the entire time series. Secondly, we use spatiotemporal patches for robust similarity computation. Finally, we use a spatially varying smoothing parameter based on a local variance approximation over each spatiotemporal patch. To assess the performance of our denoising technique, we performed a realistic simulation on a dynamic digital phantom based on the Digimouse atlas. For experimental validation, we denoised PET images from a mouse study and a hepatocellular carcinoma patient study. We compared the performance of NLM denoising with four other denoising approaches: Gaussian filtering, PCA, HYPR, and conventional NLM based on spatial patches. The simulation study revealed significant improvement in bias-variance performance achieved using our NLM technique relative to all the other methods. The experimental data analysis revealed that our technique leads to clear improvement in contrast-to-noise ratio in Patlak parametric images generated from denoised preclinical and clinical dynamic images, indicating its ability to preserve image contrast and high intensity details while lowering the background noise variance.
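A bare-bones version of the core NLM computation (without the paper's three dynamic-PET modifications) can be sketched as follows; the patch size, search window, and smoothing parameter h are illustrative choices.

```python
# Minimal sketch of non-local means (NLM) denoising on a 2D image:
# each pixel becomes a weighted average of pixels whose surrounding
# patches look similar. This is the basic scheme the paper builds on,
# not the authors' dynamic-PET variant.
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    half_p, half_s = patch // 2, search // 2
    padded = np.pad(img, half_p + half_s, mode="reflect")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + half_p + half_s, j + half_p + half_s
            ref = padded[ci - half_p:ci + half_p + 1,
                         cj - half_p:cj + half_p + 1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - half_p:ni + half_p + 1,
                                  nj - half_p:nj + half_p + 1]
                    d2 = np.mean((ref - cand) ** 2)  # patch dissimilarity
                    weights.append(np.exp(-d2 / h**2))
                    values.append(padded[ni, nj])
            weights = np.asarray(weights)
            out[i, j] = np.dot(weights, values) / weights.sum()
    return out

noisy = np.clip(np.eye(16) + 0.2 * np.random.default_rng(0).normal(size=(16, 16)), 0, 1)
print(nlm_denoise(noisy).round(2))
```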
An integrated use of topography with RSI in gully mapping, Shandong Peninsula, China.
He, Fuhong; Wang, Tao; Gu, Lijuan; Li, Tao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
Taking Quickbird optical satellite imagery of the small watershed of Beiyanzigou valley of Qixia city, Shandong province, as the study data, we propose a new method that uses an image fusing topography with remote sensing imagery (RSI) to achieve high-precision interpretation of gully edge lines. The technique first transforms the remote sensing imagery from RGB color space into HSV color space. Then the slope threshold values of the gully edge line and gully thalweg are obtained through field survey, and the slope data are segmented by thresholding. Based on the fused image in combination with the gully thalweg thresholding vectors, the gully thalweg thresholding vectors are amended. Lastly, the gully edge line can be interpreted based on the amended gully thalweg vectors, the fused image, the gully edge line thresholding vectors, and the slope data. A testing region was selected in the study area to assess accuracy. Accuracy assessment of the gully information interpreted both from the remote sensing imagery only and from the fused image was performed using the deviation, kappa coefficient, and overall accuracy of the error matrix. Compared with interpreting remote sensing imagery only, the overall accuracy and kappa coefficient increased by 24.080% and 264.364%, respectively. The average deviations of the gully head and gully edge line were reduced by 60.448% and 67.406%, respectively. The test results show that the thematic and positional accuracy of gullies interpreted by the new method are significantly higher. Finally, the error sources for interpretation accuracy by the two methods were analyzed.
NASA Astrophysics Data System (ADS)
Sailhac, P.; Marquis, G.; Darnet, M.; Szalai, S.
2003-04-01
Surface self-potential (SP) measurements are useful to characterize underground fluid flow or chemical reactions (such as redox) and can be used in addition to NMR and electrical prospecting in hydrological investigations. Assuming that the SP anomalies have an electrokinetic origin, the source of the SP data is the divergence of the underground fluid flow; one important problem with surface SP data is then its interpretation in terms of fluid flow geometry. Some integral transform techniques have been shown to be powerful for SP interpretation (e.g., Fournier, 1989; Patella, 1997; Sailhac & Marquis, 2001). All these techniques are based upon Green's functions to characterize underground water flow, but they assume a constant electrical conductivity in the subsurface. This unrealistic approximation results in the appearance of non-electrokinetic sources at strong lateral electrical conductivity contrasts. We present here new Green's functions suitable for media of heterogeneous electrical conductivity. This new approach allows the joint interpretation of electrical resistivity tomography and SP measurements to detect electrokinetic sources caused by fluid flow. Tests on synthetic examples show that it gives more realistic results than when a constant electrical conductivity is assumed.
Inquiry-based experiments for large-scale introduction to PCR and restriction enzyme digests.
Johanson, Kelly E; Watt, Terry J
2015-01-01
Polymerase chain reaction and restriction endonuclease digest are important techniques that should be included in all Biochemistry and Molecular Biology laboratory curriculums. These techniques are frequently taught at an advanced level, requiring many hours of student and faculty time. Here we present two inquiry-based experiments that are designed for introductory laboratory courses and combine both techniques. In both approaches, students must determine the identity of an unknown DNA sequence, either a gene sequence or a primer sequence, based on a combination of PCR product size and restriction digest pattern. The experimental design is flexible, and can be adapted based on available instructor preparation time and resources, and both approaches can accommodate large numbers of students. We implemented these experiments in our courses with a combined total of 584 students and have an 85% success rate. Overall, students demonstrated an increase in their understanding of the experimental topics, ability to interpret the resulting data, and proficiency in general laboratory skills. © 2015 The International Union of Biochemistry and Molecular Biology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trexler, D.T.; Flynn, T.; Koenig, B.A.
1982-01-01
Geological, geophysical and geochemical surveys were used in conjunction with temperature gradient hole drilling to assess the geothermal resources in Pumpernickel Valley and Carlin, Nevada. This program is based on a statewide assessment of geothermal resources that was completed in 1979. The exploration techniques are based on previous federally-funded assessment programs that were completed in six other areas in Nevada and include: literature search and compilation of existing data, geologic reconnaissance, chemical sampling of thermal and non-thermal fluids, interpretation of satellite imagery, interpretation of low-sun angle aerial photographs, two-meter depth temperature probe survey, gravity survey, seismic survey, soil-mercury survey, and temperature gradient drilling.
Toward sensor-based context aware systems.
Sakurai, Yoshitaka; Takada, Kouhei; Anisetti, Marco; Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Tsuruta, Setsuo
2012-01-01
This paper proposes a methodology for sensor data interpretation that can combine sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to take a decision. The feasibility of our idea is demonstrated via a case study where a context-reasoning engine has been connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper context of operation of a system and trigger decision-making based on context information.
A simulation-based evaluation of methods for inferring linear barriers to gene flow
Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol
2012-01-01
Different analytical techniques used on the same data set may lead to different conclusions about the existence and strength of genetic structure. Therefore, reliable interpretation of the results from different methods depends on the efficacy and reliability of different statistical methods. In this paper, we evaluated the performance of multiple analytical methods to...
Using Evidence-Based Decision Trees Instead of Formulas to Identify At-Risk Readers. REL 2014-036
ERIC Educational Resources Information Center
Koon, Sharon; Petscher, Yaacov; Foorman, Barbara R.
2014-01-01
This study examines whether the classification and regression tree (CART) model improves the early identification of students at risk for reading comprehension difficulties compared with the more difficult to interpret logistic regression model. CART is a type of predictive modeling that relies on nonparametric techniques. It presents results in…
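To make the CART idea concrete, here is a small sketch fitting a classification tree and printing its rules with scikit-learn; the screening-score features and the risk labels are hypothetical stand-ins for real reading-assessment data.

```python
# Minimal sketch of a CART-style model: fit a decision tree on
# hypothetical screening scores and print the learned rules, which is
# what makes CART easier to interpret than logistic regression output.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 400
fluency = rng.normal(100, 15, n)        # hypothetical screening measures
vocabulary = rng.normal(100, 15, n)
# Hypothetical rule generating "at risk" labels, plus a little noise:
at_risk = ((fluency < 90) & (vocabulary < 95)) ^ (rng.random(n) < 0.05)

X = np.column_stack([fluency, vocabulary])
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, at_risk)

# The printed tree reads directly as an evidence-based decision tree:
print(export_text(tree, feature_names=["fluency", "vocabulary"]))
```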
ERIC Educational Resources Information Center
Whitworth, David E.
2016-01-01
Laboratory-based practical classes are a common feature of life science teaching, during which students learn how to perform experiments and generate/interpret data. Practical classes are typically instructional, concentrating on providing topic- and technique-specific skills, however to produce research-capable graduates it is also important to…
ERIC Educational Resources Information Center
Whittaker, Tiffany A.; Khojasteh, Jam
2017-01-01
Latent growth modeling (LGM) is a popular and flexible technique that may be used when data are collected across several different measurement occasions. Modeling the appropriate growth trajectory has important implications with respect to the accurate interpretation of parameter estimates of interest in a latent growth model that may impact…
ERTS evaluation for land use inventory
NASA Technical Reports Server (NTRS)
Hardy, E. E. (Principal Investigator)
1973-01-01
The author has identified the following significant results. The feasibility of accomplishing a general inventory of any given region based on spectral categories from satellite data has been demonstrated in a pilot study for an area of 6300 square kilometers in central New York State. This was accomplished by developing special processing techniques to improve and balance contrast and density for each spectral band of an image scene to compare with a standard range of density and contrast found to be acceptable for interpretation of the scene. Diazo film transparencies were made from enlarged black and white transparencies of each spectral band. Color composites were constructed from these diazo films in combinations of hue and spectral bands to enhance different spectral features in the scene. Interpretation and data takeoff was accomplished manually by translating interpreted areas onto an overlay to construct a spectral map. The minimum area interpreted was 25 hectares. The minimum area geographically referenced was one square kilometer. The interpretation and referencing of data from ERTS-1 was found to be about 88% accurate for eight primary spectral categories.
A fuzzy hill-climbing algorithm for the development of a compact associative classifier
NASA Astrophysics Data System (ADS)
Mitra, Soumyaroop; Lam, Sarah S.
2012-02-01
Classification, a data mining technique, has widespread applications including medical diagnosis, targeted marketing, and others. Knowledge discovery from databases in the form of association rules is one of the important data mining tasks. An integrated approach, classification based on association rules, has drawn the attention of the data mining community over the last decade. While attention has been mainly focused on increasing classifier accuracies, not much effort has been devoted to building interpretable and less complex models. This paper discusses the development of a compact associative classification model using a hill-climbing approach and fuzzy sets. The proposed methodology builds the rule base by selecting rules which contribute towards increasing training accuracy, thus balancing classification accuracy with the number of classification association rules. The results indicated that the proposed associative classification model can achieve competitive accuracies on benchmark datasets with continuous attributes and lend better interpretability when compared with other rule-based systems.
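The greedy rule-selection idea can be sketched in a few lines; the candidate rules, data, and stopping criterion below are invented for illustration and omit the fuzzy-set machinery of the actual method.

```python
# Minimal sketch of hill-climbing rule selection for an associative
# classifier: greedily add the candidate rule that most improves
# training accuracy, stopping when no rule helps. First matching rule
# wins; unmatched samples get a default class.
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((200, 4))
y = ((X[:, 0] > 0.5) & (X[:, 2] < 0.4)).astype(int)

# Candidate association rules: (condition on a sample, predicted class).
candidates = [
    (lambda x: x[0] > 0.5 and x[2] < 0.4, 1),
    (lambda x: x[0] > 0.5, 1),
    (lambda x: x[1] > 0.9, 1),
    (lambda x: x[3] > 0.5, 0),
]

def predict(rules, x, default=0):
    for cond, label in rules:
        if cond(x):
            return label
    return default

def accuracy(rules):
    return np.mean([predict(rules, x) == t for x, t in zip(X, y)])

selected, best = [], accuracy([])
improved = True
while improved and candidates:
    improved = False
    gains = [accuracy(selected + [r]) for r in candidates]
    if max(gains) > best:
        best_idx = int(np.argmax(gains))
        selected.append(candidates.pop(best_idx))
        best = gains[best_idx]
        improved = True

print(f"kept {len(selected)} rules, training accuracy = {best:.2f}")
```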
Ugurbil, Kamil
2011-01-01
Magnetic resonance spectroscopy-based magnetization transfer techniques (MT) are commonly used to assess the rate of oxidative (i.e., mitochondrial) ATP synthesis in intact tissues. Physiologically appropriate interpretation of MT rate data depends on accurate appraisal of the biochemical events that contribute to a specific MT rate measurement. The relative contributions of the specific enzymatic reactions that can contribute to a MT Pi→ATP rate measurement are tissue dependent; nonrecognition of this fact can bias the interpretation of MT Pi→ATP rate data. The complexities of MT-based measurements of mitochondrial ATP synthesis rates made in striated muscle and other tissues are reviewed, following which, the adverse impacts of erroneous Pi→ATP rate data analyses on the physiological inferences presented in selected published studies of cardiac and skeletal muscle are considered. PMID:21368294
Opto-electronic characterization of third-generation solar cells.
Neukom, Martin; Züfle, Simon; Jenatsch, Sandra; Ruhstaller, Beat
2018-01-01
We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified.
Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project
Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.
2011-01-01
Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA.
The use of interactive graphic displays for interpretation of surface design parameters
NASA Technical Reports Server (NTRS)
Talcott, N. A., Jr.
1981-01-01
An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Display Data method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.
Supporting flight data analysis for Space Shuttle Orbiter Experiments at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Green, M. J.; Budnick, M. P.; Yang, L.; Chiasson, M. P.
1983-01-01
The Space Shuttle Orbiter Experiments program is responsible for collecting flight data to extend the research and technology base for future aerospace vehicle design. The Infrared Imagery of Shuttle (IRIS), Catalytic Surface Effects, and Tile Gap Heating experiments sponsored by Ames Research Center are part of this program. The paper describes the software required to process the flight data which support these experiments. In addition, data analysis techniques, developed in support of the IRIS experiment, are discussed. Using the flight data base, the techniques have provided information useful in analyzing and correcting problems with the experiment, and in interpreting the IRIS image obtained during the entry of the third Shuttle mission.
Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation
Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin
2013-01-01
With the wide application of vision-based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144
Informatics and computational strategies for the study of lipids.
Yetukuri, Laxman; Ekroos, Kim; Vidal-Puig, Antonio; Oresic, Matej
2008-02-01
Recent advances in mass spectrometry (MS)-based techniques for lipidomic analysis have empowered us with the tools that afford studies of lipidomes at the systems level. However, these techniques pose a number of challenges for lipidomic raw data processing, lipid informatics, and the interpretation of lipidomic data in the context of lipid function and structure. Integration of lipidomic data with other systemic levels, such as genomic or proteomic, in the context of molecular pathways and biophysical processes provides a basis for the understanding of lipid function at the systems level. The present report, based on the limited literature, is an update on a young but rapidly emerging field of lipid informatics and related pathway reconstruction strategies.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Moreira, M. A.; Novaes, R. A.
1984-01-01
The development of a methodology for annual estimates of irrigated rice crop in the State of Rio Grande do Sul, Brazil, using remote sensing techniques is proposed. The project involves interpretation, digital analysis, and sampling techniques of LANDSAT imagery. Results are discussed from a preliminary phase for identifying and evaluating irrigated rice crop areas in four counties of the State, for the crop year 1982/1983. This first phase involved just visual interpretation techniques of MSS/LANDSAT images.
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma), and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper investigates the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited tests suggest that PGNAA logging data can be used to predict lithology: a success rate of 73% was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
Image processing via level set curvature flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malladi, R.; Sethian, J.A.
We present a controlled image smoothing and enhancement method based on a curvature flow interpretation of the geometric heat equation. Compared to existing techniques, the model has several distinct advantages. (i) It contains just one enhancement parameter. (ii) The scheme naturally inherits a stopping criterion from the image; continued application of the scheme produces no further change. (iii) The method is one of the fastest possible schemes based on a curvature-controlled approach. 15 ref., 6 figs.
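As an illustration of the scheme summarized above, a minimal numerical sketch of level-set curvature flow (u_t = kappa |grad u|) on a grayscale image; the step size, iteration count, and finite-difference stencil are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def curvature_flow(u, iters=50, dt=0.1, eps=1e-8):
    """Smooth a grayscale image by evolving u_t = kappa * |grad u|,
    where kappa is the curvature of the level sets of u."""
    for _ in range(iters):
        uy, ux = np.gradient(u)                  # axis 0 = y, axis 1 = x
        uyy, uyx = np.gradient(uy)
        uxy, uxx = np.gradient(ux)
        grad2 = ux**2 + uy**2
        num = uxx * uy**2 - 2.0 * ux * uy * uxy + uyy * ux**2
        u = u + dt * num / (grad2 + eps)         # equals kappa * |grad u|
    return u

noisy = np.random.default_rng(0).random((64, 64))   # noisy test image
smoothed = curvature_flow(noisy)
```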
The Optimum Text in Simultaneous Interpreting: A Cognitive Approach to Interpreters' Training.
ERIC Educational Resources Information Center
Alexieva, Bistra
A discussion of text translatability in simultaneous interpreting (SI) looks at semantic redundancy, the repetition of semantic components essential to creating an utterance, and offers some classroom techniques for teaching interpreting skills. It is proposed that the translatability of a text in SI should be studied in terms of the experiential…
ERIC Educational Resources Information Center
Drallny, Ines
1987-01-01
Describes the purpose and appropriate methodology for various levels of interpreter training, for both consecutive and simultaneous interpretation. The importance of relating the intent of the text to the explicit language forms through which that intent is realized is discussed, and appropriate criteria for evaluation of student interpreters are…
Methods of collecting and interpreting ground-water data
Bentall, Ray
1963-01-01
Because ground water is hidden from view, ancient man could only theorize as to its sources of replenishment and its behavior. His theories held sway until the latter part of the 17th century, which marked the first experimental work to determine the source and movement of ground water. Thus founded, the science of ground-water hydrology grew slowly and not until the 19th century is there substantial evidence of conclusions having been based on observational data. The 20th century has witnessed tremendous advances in the science in the methods of field investigation and interpretation of collected data, in the methods of determining the hydrologic characteristics of water-bearing material, and in the methods of inventorying ground-water supplies. Now, as is true of many other disciplines, the science of ground-water hydrology is characterized by frequent advancement of new ideas and techniques, refinement of old techniques, and an increasing wealth of data awaiting interpretation. So that its widely scattered staff of professional hydrologists could keep abreast of new ideas and advances in the techniques of groundwater investigation, it has been the practice in the U.S. Geological Survey to distribute such information for immediate internal use. As the methods become better established and developed, they are described in formal publications. Six papers pertaining to widely different phases of ground-water investigation comprise this particular contribution. For the sake of clarity and conformity, the original papers have been revised and edited by the compiler.
Patterns of Communication through Interpreters: A Detailed Sociolinguistic Analysis
Aranguri, Cesar; Davidson, Brad; Ramirez, Robert
2006-01-01
BACKGROUND Numerous articles have detailed how the presence of an interpreter leads to less satisfactory communication with physicians; few have studied how actual communication takes place through an interpreter in a clinical setting. OBJECTIVE Record and analyze physician-interpreter-patient interactions. DESIGN Primary care physicians with high-volume Hispanic practices were recruited for a communication study. Dyslipidemic Hispanic patients, either monolingual Spanish or bilingual Spanish-English, were recruited on the day of a normally scheduled appointment and, once consented, recorded without a researcher present in the room. Separate postvisit interviews were conducted with the patient and the physician. All interactions were fully transcribed and analyzed. PARTICIPANTS Sixteen patients were recorded interacting with 9 physicians. Thirteen patients used an interpreter with 8 physicians, and 3 patients spoke Spanish with the 1 bilingual physician. APPROACH Transcript analysis based on sociolinguistic and discourse analytic techniques, including but not limited to time speaking, analysis of questions asked and answered, and the loss of semantic information. RESULTS Speech was significantly reduced and revised by the interpreter, resulting in an alteration of linguistic features such as content, meaning, reinforcement/validation, repetition, and affect. In addition, visits that included an interpreter had virtually no rapport-building “small talk,” which typically enables the physician to gain comprehensive patient history, learn clinically relevant information, and increase emotional engagement in treatment. CONCLUSIONS The presence of an interpreter increases the difficulty of achieving good physician-patient communication. Physicians and interpreters should be trained in the process of communication and interpretation, to minimize conversational loss and maximize the information and relational exchange with interpreted patients. PMID:16808747
Kianmehr, Keivan; Alhajj, Reda
2008-09-01
In this study, we aim at building a classification framework, namely the CARSVM model, which integrates association rule mining and support vector machines (SVM). The goal is to benefit from the advantages of both: the discriminative knowledge represented by class association rules and the classification power of the SVM algorithm. The result is an efficient and accurate classifier model that improves the interpretability of SVM as a traditional machine learning technique and overcomes the efficiency issues of associative classification algorithms. In our proposed framework, instead of using the original training set, a set of rule-based feature vectors, generated from the discriminative ability of class association rules over the training samples, is presented to the learning component of the SVM algorithm. We show that rule-based feature vectors are a high-quality source of discriminative knowledge that can substantially improve the predictive power of SVM and associative classification techniques, while offering users greater understandability and interpretability. We used four datasets from the UCI ML repository to evaluate the performance of the developed system against five well-known existing classification methods. Because of the importance and popularity of gene expression analysis as a real-world application of classification models, we present an extension of CARSVM combined with feature selection that can be applied to gene expression data, and we describe how this combination provides biologists with an efficient and understandable classifier model. The reported test results and their biological interpretation demonstrate the applicability, efficiency, and effectiveness of the proposed model. From the results, it can be concluded that a considerable increase in classification accuracy can be obtained when the rule-based feature vectors are integrated into the learning process of the SVM algorithm. In the context of applicability, according to the results obtained from gene expression analysis, the CARSVM system can be utilized in a variety of real-world applications with some adjustments.
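To make the rule-based feature idea concrete, a hedged sketch: each class association rule antecedent becomes a binary feature, and an SVM is trained on the resulting vectors. The two toy rules and the data are invented for illustration and are not from the CARSVM paper.

```python
import numpy as np
from sklearn.svm import SVC

# Toy rule antecedents: each fires (1.0) when a sample satisfies the rule.
rules = [lambda x: x[0] > 0.5, lambda x: x[1] < 0.2]

def rule_features(X):
    """Map raw samples to binary rule-based feature vectors."""
    return np.array([[float(r(x)) for r in rules] for x in X])

X = np.random.default_rng(0).random((200, 2))    # toy training samples
y = (X[:, 0] > 0.5).astype(int)                  # toy class labels
clf = SVC(kernel="linear").fit(rule_features(X), y)
print(clf.score(rule_features(X), y))            # 1.0 on this separable toy
```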
ERIC Educational Resources Information Center
Gliddon, C. M.; Rosengren, R. J.
2012-01-01
This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…
Techniques in teaching statistics: linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I .; Smith, A.; Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Analysis of pulse thermography using similarities between wave and diffusion propagation
NASA Astrophysics Data System (ADS)
Gershenson, M.
2017-05-01
Pulse thermography, or thermal wave imaging, is commonly used as a nondestructive evaluation (NDE) method. While the technical aspects have evolved with time, theoretical interpretation has lagged behind: interpretation still relies on curve fitting on a log-log scale. A new approach based directly on the governing differential equation is introduced. By using relationships between wave propagation and the diffusive propagation of thermal excitation, it is shown that one can transform solutions of one type of propagation into the other. The method is based on the similarities between the Laplace transforms of the diffusion equation and the wave equation: for diffusive propagation the Laplace variable s appears to the first power, while for wave propagation similar equations occur with s^2. For discrete time, the transformation between the domains is performed by multiplying the temperature data vector by a matrix; the transform is local. The performance of the technique is tested on synthetic data, and the application of common back-projection techniques used in the processing of wave data is also demonstrated. The combined use of the transform and back projection makes it possible to improve both the depth and lateral resolution of transient thermography.
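The correspondence the abstract appeals to can be stated compactly; a minimal LaTeX rendering for a 1-D homogeneous medium with zero initial conditions (an assumed simplification):

```latex
% Laplace transforms of the two governing equations:
\begin{align*}
  \partial_t T = \alpha\,\partial_x^2 T
    &\;\xrightarrow{\;\mathcal{L}\;}\;
    s\,\tilde{T}(x,s) = \alpha\,\partial_x^2 \tilde{T}(x,s), \\
  \partial_t^2 u = c^2\,\partial_x^2 u
    &\;\xrightarrow{\;\mathcal{L}\;}\;
    s^2\,\tilde{u}(x,s) = c^2\,\partial_x^2 \tilde{u}(x,s).
\end{align*}
% The wave equation in s^2 has the same form as the diffusion equation in s,
% so the substitution s -> sqrt(s) (with suitable scaling of alpha and c)
% carries solutions from one domain to the other; for sampled data this is
% what the matrix applied to the temperature vector implements.
```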
Geophysical monitoring technology for CO2 sequestration
NASA Astrophysics Data System (ADS)
Ma, Jin-Feng; Li, Lin; Wang, Hao-Fan; Tan, Ming-You; Cui, Shi-Ling; Zhang, Yun-Yin; Qu, Zhi-Peng; Jia, Ling-Yun; Zhang, Shu-Hai
2016-06-01
Geophysical techniques play key roles in measuring, monitoring, and verifying the safety of CO2 sequestration and in identifying the efficiency of CO2-enhanced oil recovery. Although geophysical monitoring techniques for CO2 sequestration have grown out of conventional oil and gas geophysical exploration techniques, geophysical monitoring takes a long time to conduct, and there are many barriers and challenges. In this paper, with the initial objective of performing CO2 sequestration, we studied the geophysical tasks associated with evaluating geological storage sites and monitoring CO2 sequestration. Based on our review of the scope of geophysical monitoring techniques and our experience in domestic and international carbon capture and sequestration projects, we analyzed the inherent difficulties of and our experiences with geophysical monitoring techniques, especially with respect to 4D seismic acquisition, processing, and interpretation.
Microbial Growth and Metabolism in Soil - Refining the Interpretation of Carbon Use Efficiency
NASA Astrophysics Data System (ADS)
Geyer, K.; Frey, S. D.
2016-12-01
Carbon use efficiency (CUE) describes a critical step in the terrestrial carbon cycle where microorganisms partition organic carbon (C) between stabilized organic forms and CO2. Application of this concept, however, begins with accurate measurements of CUE. Both traditional and developing approaches still depend on numerous assumptions that render them difficult to interpret and potentially incompatible with one another. Here we explore the soil processes inherent to traditional (e.g., substrate-based, biomass-based) and emerging (e.g., growth rate-based, calorimetry) CUE techniques in order to better understand the information they provide. Soil from the Harvard Forest Long Term Ecological Research (LTER) site in Massachusetts, USA, was amended with both 13C-glucose and 18O-water and monitored over 72 h for changes in dissolved organic carbon (DOC), respiration (R), microbial biomass (MB), DNA synthesis, and heat flux (Q). Four different CUE estimates were calculated: 1) (ΔDOC - R)/ΔDOC (substrate-based), 2) Δ13C-MB/(Δ13C-MB + R) (biomass-based), 3) Δ18O-DNA/(Δ18O-DNA + R) (growth rate-based), 4) Q/R (energy-based). Our results indicate that microbial growth (estimated by both 13C and 18O techniques) was delayed for 40 h after amendment even though DOC had declined to pre-amendment levels within 48 h. Respiration and heat flux also peaked after 40 h. Although these soils have a relatively high organic C content (5% C), respired CO2 was greater than 88% glucose-derived throughout the experiment. All estimates of microbial growth (Spearman's ρ >0.83, p<0.01) and efficiency (Spearman's ρ >0.65, p<0.05) were positively correlated, but strong differences in the magnitude of CUE suggest incomplete C accounting. This work increases the transparency of CUE techniques for researchers looking to choose the most appropriate measure for their scale of inquiry or to use CUE estimates in modeling applications.
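The four estimators reduce to simple ratios of measured quantities; a minimal sketch with illustrative argument names (all inputs are cumulative changes over the incubation, in consistent units):

```python
def cue_estimates(d_doc, resp, d13c_mb, d18o_dna, heat):
    """The four CUE estimates listed above; argument names are
    illustrative stand-ins for the measured quantities."""
    return {
        "substrate_based":   (d_doc - resp) / d_doc,
        "biomass_based":     d13c_mb / (d13c_mb + resp),
        "growth_rate_based": d18o_dna / (d18o_dna + resp),
        "energy_based":      heat / resp,
    }

print(cue_estimates(d_doc=10.0, resp=4.0, d13c_mb=3.5, d18o_dna=3.0, heat=2.0))
```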
NASA Technical Reports Server (NTRS)
Harwood, P. (Principal Investigator); Finley, R.; Mcculloch, S.; Malin, P. A.; Schell, J. A.
1977-01-01
The author has identified the following significant results. Image interpretation and computer-assisted techniques were developed to analyze LANDSAT scenes in support of resource inventory and monitoring requirements for the Texas coastal region. Land cover and land use maps, at a scale of 1:125,000 for the image interpretation product and 1:24,000 for the computer-assisted product, were generated covering four Texas coastal test sites. Classification schemes which parallel national systems were developed for each procedure, including 23 classes for image interpretation technique and 13 classes for the computer-assisted technique. Results indicate that LANDSAT-derived land cover and land use maps can be successfully applied to a variety of planning and management activities on the Texas coast. Computer-derived land/water maps can be used with tide gage data to assess shoreline boundaries for management purposes.
A manual for inexpensive methods of analyzing and utilizing remote sensor data
NASA Technical Reports Server (NTRS)
Elifrits, C. D.; Barr, D. J.
1978-01-01
Instructions are provided for inexpensive methods of using remote sensor data to assist in meeting the need to observe the earth's surface. Where possible, relative costs are included. Equipment needed for the analysis of remote sensor data is described, along with methods of use for these equipment items and the advantages and disadvantages of each. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, land cover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, together with simple techniques for creating an enhanced data item; the techniques described include additive and subtractive (Diazo process) color techniques and enlargement of photos or images. Applications of these processes, including mapping of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.
The Elicitation Interview Technique: Capturing People's Experiences of Data Representations.
Hogan, Trevor; Hinrichs, Uta; Hornecker, Eva
2016-12-01
Information visualization has become a popular tool to facilitate sense-making, discovery and communication in a large range of professional and casual contexts. However, evaluating visualizations is still a challenge. In particular, we lack techniques to help understand how visualizations are experienced by people. In this paper we discuss the potential of the Elicitation Interview technique to be applied in the context of visualization. The Elicitation Interview is a method for gathering detailed and precise accounts of human experience. We argue that it can be applied to help understand how people experience and interpret visualizations as part of exploration and data analysis processes. We describe the key characteristics of this interview technique and present a study we conducted to exemplify how it can be applied to evaluate data representations. Our study illustrates the types of insights this technique can bring to the fore, for example, evidence for deep interpretation of visual representations and the formation of interpretations and stories beyond the represented data. We discuss general visualization evaluation scenarios where the Elicitation Interview technique may be beneficial and specify what needs to be considered when applying this technique in a visualization context specifically.
Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon
2011-01-01
Probabilistic forecasts are produced from a variety of outlets to help predict rainfall, and other meteorological events, for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatologic statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is used with consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed, and discussed in comparison to the first simulation-based technique.
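One plausible way to realize the simulation-based adjustment, sketched under assumed names and a simple resampling scheme (not necessarily the authors' exact algorithm): draw from the climatological gamma distribution, reweight draws by the forecast tercile probabilities, and refit the parameters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
shape, scale = 2.0, 50.0                       # assumed climatological gamma fit (mm)
terciles = stats.gamma.ppf([1 / 3, 2 / 3], shape, scale=scale)
p_forecast = np.array([0.40, 0.35, 0.25])      # assumed forecast tercile probabilities

draws = rng.gamma(shape, scale, size=100_000)  # simulate climatology
bins = np.digitize(draws, terciles)            # 0/1/2 = lower/middle/upper third
weights = (p_forecast * 3)[bins]               # climatological third has prob 1/3

sample = rng.choice(draws, size=50_000, p=weights / weights.sum())
new_shape, _, new_scale = stats.gamma.fit(sample, floc=0)
print(new_shape, new_scale)                    # forecast-conditioned parameters
```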
ERIC Educational Resources Information Center
Thomas, Noel, Ed.; Towell, Richard, Ed.
Papers presented at a conference on the use of simultaneous, consecutive, and other forms of interpreting as features of foreign language teaching and learning in British higher education include the following: "Liaison Interpreting as a Communicative Language-Learning Exercise" (H. A. Keith); "Interpreting and Communicating:…
Tipu, Hamid Nawaz; Bashir, Muhammad Mukarram; Noman, Muhammad
2016-10-01
Serology and DNA techniques are employed for Human Leukocyte Antigen (HLA) typing in different transplant centers. Results may not always correlate well and may require retyping with a different technique. All patients (with aplastic anemia, thalassemia, or immunodeficiency) and their donors requiring HLA typing for bone marrow transplant were enrolled in the study. Serological HLA typing was done by complement-dependent lymphocytotoxicity, while DNA-based typing was done with sequence-specific primers (SSP). Serology identified 167 HLA A and 165 HLA B antigens, while SSP in the same samples identified 181 HLA A and 184 HLA B alleles. A11 and B51 were the commonest antigens/alleles by both methods. There were a total of 21 misreads and 32 dropouts on serology across both HLA A and B loci, with HLA A32, B52, and B61 being the most ambiguous antigens. Inherent limitations of serological techniques warrant careful interpretation or the use of DNA-based methods for the resolution of ambiguous typing.
Opto-electronic characterization of third-generation solar cells
Jenatsch, Sandra
2018-01-01
We present an overview of opto-electronic characterization techniques for solar cells, including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction, and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities such as charge injection barriers, traps, and low mobilities, among others, manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques, thereby identifying a route to comprehensive and accurate parameter extraction. PMID:29707069
Bowler, J O; Hoppitt, L; Illingworth, J; Dalgleish, T; Ononaiye, M; Perez-Olivas, G; Mackintosh, B
2017-03-01
It is well established that attention bias and interpretation bias each have a key role in the development and continuation of anxiety. How the biases may interact with one another in anxiety is, however, poorly understood. Using cognitive bias modification techniques, the present study examined whether training a more positive interpretation bias or attention bias resulted in transfer of effects to the untrained cognitive domain. Differences in anxiety reactivity to a real-world stressor were also assessed. Ninety-seven first year undergraduates who had self-reported anxiety were allocated to one of four groups: attention bias training (n = 24), interpretation bias training (n = 26), control task training (n = 25) and no training (n = 22). Training was computer-based and comprised eight sessions over four weeks. Baseline and follow-up measures of attention and interpretation bias, anxiety and depression were taken. A significant reduction in threat-related attention bias and an increase in positive interpretation bias occurred in the attention bias training group. The interpretation bias training group did not exhibit a significant change in attention bias, only interpretation bias. The effect of attention bias training on interpretation bias was significant as compared with the two control groups. There were no effects on self-report measures. The extent to which interpretive training can modify attentional processing remains unclear. Findings support the idea that attentional training might have broad cognitive consequences, impacting downstream on interpretive bias. However, they do not fully support a common mechanism hypothesis, as interpretive training did not impact on attentional bias. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Imagery interpretation is a time-tested technique for extracting landscape-level information from aerial photographs and other types of remotely sensed data. The U.S. Environmental Protection Agency's Environmental Photographic Interpretation Center (EPIC) has a 25+ year history...
What Do the Numbers Say? Clarifying Our Interpretation of Carbon Use Efficiency Data from Soil.
NASA Astrophysics Data System (ADS)
Geyer, K.; Dijkstra, P.; Sinsabaugh, R. L.; Frey, S. D.
2017-12-01
Carbon use efficiency (CUE) is the proportion of carbon resources that a microorganism commits towards cellular growth and thus affects the dynamics of soil organic matter pools. While numerous approaches exist for estimating CUE, no attempts have been made to simultaneously compare methods and reconcile their inherent biases. Such work is necessary to partition the observed variation in CUE estimates (commonly between 0.3 and 0.7) as biological or technical in origin. Here we review our results from experimental work aimed at comparing both traditional and emerging CUE techniques. Soil from the Harvard Forest Long Term Ecological Research site in Massachusetts, USA, was amended with 13C-glucose and 18O-water in laboratory mesocosms and monitored for changing rates of soil dissolved organic carbon (DOC) uptake, respiration (R), microbial biomass (MBC) production, DNA synthesis, and heat (Q) flux over 72 h. Three CUE estimates were generated from these data: 1) Δ13MBC/(Δ13MBC + 13R), 2) Δ18DNA/(Δ18DNA + R), 3) Q/R. CUE was also measured via two indirect techniques: metabolic flux modeling and stoichiometric modeling. Results indicate that the 18O technique is able to discern gross growth of soil microbes while the 13C technique indicates net growth. As a result, 18O-based CUE remains unchanged (~0.45) for the incubation duration at low amendment rates (0.0 and 0.05 mg glucose-C/g) while 13C-based CUE declines with time (0.75 to 0.5). The 13C technique likely overestimates CUE because the numerator (Δ13MBC) integrates any label (whether or not destined for growth) residing within the cell. High amendment rates (2.0 mg glucose-C/g) cause dramatic declines in 18O- and 13C-based CUE for the first 24 hr of incubation, but neither modeling approach was able to detect these dynamics. In summary, our results suggest that 13C-based estimates of CUE are best interpreted as net changes in the residence of labeled substrate within MBC pools while 18O-based estimates directly capture gross growth dynamics. Metabolic flux modeling and stoichiometric modeling appear to be suited for conditions of steady-state MBC only, where they approximate the CUE magnitude of the 13C- and 18O-based approaches, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sternberg, B.K.; Thomas, S.J.
1992-12-01
The overall objective of the project was to apply a new high-resolution imaging system to water resource investigations. This imaging system measures the ellipticity of received magnetic-field components. The source of the magnetic field is a long-line transmitter emitting frequencies from 30 Hz to 30 kHz. A new high-accuracy calibration method was used to enhance the resolution of the measurements. The specific objectives included: (1) refine the system hardware and software based on these investigations, (2) learn the limitations of this technology in practical water resource investigations, and (3) improve interpretation techniques to extract the highest possible resolution. Successful field surveys were run at: (1) San Xavier Mine, Arizona, where the flow of injected fluid was monitored with the system; and (2) Avra Valley, Arizona, where subsurface stratigraphy was imaged. A survey at a third site was less successful; the interpreted resistivity section does not agree with nearby well logs. Surveys are continuing at this site.
EANM-EORTC general recommendations for sentinel node diagnostics in melanoma.
Chakera, Annette H; Hesse, Birger; Burak, Zeynep; Ballinger, James R; Britten, Allan; Caracò, Corrado; Cochran, Alistair J; Cook, Martin G; Drzewiecki, Krzysztof T; Essner, Richard; Even-Sapir, Einat; Eggermont, Alexander M M; Stopar, Tanja Gmeiner; Ingvar, Christian; Mihm, Martin C; McCarthy, Stanley W; Mozzillo, Nicola; Nieweg, Omgo E; Scolyer, Richard A; Starz, Hans; Thompson, John F; Trifirò, Giuseppe; Viale, Giuseppe; Vidal-Sicart, Sergi; Uren, Roger; Waddington, Wendy; Chiti, Arturo; Spatz, Alain; Testori, Alessandro
2009-10-01
The accurate diagnosis of a sentinel node in melanoma includes a sequence of procedures from different medical specialities (nuclear medicine, surgery, oncology, and pathology). The items covered are presented in 11 sections and a reference list: (1) definition of a sentinel node, (2) clinical indications, (3) radiopharmaceuticals and activity injected, (4) dosimetry, (5) injection technique, (6) image acquisition and interpretation, (7) report and display, (8) use of dye, (9) gamma probe detection, (10) surgical techniques in sentinel node biopsy, and (11) pathological evaluation of melanoma-draining sentinel lymph nodes. If specific recommendations given cannot be based on evidence from original, scientific studies, referral is given to "general consensus" and similar expressions. The recommendations are designed to assist in the practice of referral to, performance, interpretation and reporting of all steps of the sentinel node procedure in the hope of setting state-of-the-art standards for good-quality evaluation of possible spread to the lymphatic system in intermediate-to-high risk melanoma without clinical signs of dissemination.
Neural networks: Alternatives to conventional techniques for automatic docking
NASA Technical Reports Server (NTRS)
Vinz, Bradley L.
1994-01-01
Automatic docking of orbiting spacecraft is a crucial operation involving the identification of vehicle orientation as well as complex approach dynamics. The chaser spacecraft must be able to recognize the target spacecraft within a scene and achieve accurate closing maneuvers. In a video-based system, a target scene must be captured and transformed into a pattern of pixels. Successful recognition lies in the interpretation of this pattern. Due to their powerful pattern recognition capabilities, artificial neural networks offer a potential role in interpretation and automatic docking processes. Neural networks can reduce the computational time required by existing image processing and control software. In addition, neural networks are capable of recognizing and adapting to changes in their dynamic environment, enabling enhanced performance, redundancy, and fault tolerance. Most neural networks are robust to failure, capable of continued operation with a slight degradation in performance after minor failures. This paper discusses the particular automatic docking tasks neural networks can perform as viable alternatives to conventional techniques.
NASA Astrophysics Data System (ADS)
Ivonin, D. V.; Skrunes, S.; Brekke, C.; Ivanov, A. Yu.
2016-03-01
A simple automatic multipolarization technique for discriminating the main types of thin oil films (of thickness less than the radio-wave skin depth) from natural films is proposed. It is based on a new multipolarization parameter related to the ratio between the damping in the slick of specially normalized resonant and nonresonant signals, calculated using the normalized radar cross-section model proposed by Kudryavtsev et al. (2003a). The technique is tested on RADARSAT-2 copolarization (VV/HH) synthetic aperture radar images of slicks of a priori known provenance (mineral oils, e.g., emulsion and crude oil, and plant oil, which served to model a natural slick) released during annual oil-on-water exercises in the North Sea in 2011 and 2012. It is shown that the suggested multipolarization parameter gives new capabilities for interpreting slicks visible on synthetic aperture radar images, allowing discrimination between mineral oil and plant oil slicks.
Konevskikh, Tatiana; Ponossov, Arkadi; Blümel, Reinhold; Lukacs, Rozalia; Kohler, Achim
2015-06-21
The appearance of fringes in the infrared spectroscopy of thin films seriously hinders the interpretation of chemical bands because fringes change the relative peak heights of chemical spectral bands. Thus, for the correct interpretation of chemical absorption bands, physical properties need to be separated from chemical characteristics. In the paper at hand we revisit the theory of the scattering of infrared radiation at thin absorbing films. Although, in general, scattering and absorption are connected by a complex refractive index, we show that for the scattering of infrared radiation at thin biological films, fringes and chemical absorbance can in good approximation be treated as additive. We further introduce a model-based pre-processing technique for separating fringes from chemical absorbance by extended multiplicative signal correction (EMSC). The technique is validated by simulated and experimental FTIR spectra. It is further shown that EMSC, as opposed to other suggested filtering methods for the removal of fringes, does not remove information related to chemical absorption.
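A hedged sketch of the general EMSC idea applied to fringe removal: fit the spectrum as a baseline plus a scaled reference (chemical) spectrum plus sine/cosine terms standing in for the fringe oscillation, then subtract the fitted fringe. The fringe basis, component count, and synthetic data are assumptions; the authors' model components differ in detail.

```python
import numpy as np

def emsc_defringe(spectrum, wavenumbers, reference, n_freqs=6):
    """EMSC-style fit: baseline + scaled reference spectrum + sine/cosine
    pairs standing in for the fringe; returns the corrected spectrum."""
    span = np.ptp(wavenumbers)
    cols = [np.ones_like(wavenumbers), reference]
    for k in range(1, n_freqs + 1):
        cols.append(np.sin(2 * np.pi * k * wavenumbers / span))
        cols.append(np.cos(2 * np.pi * k * wavenumbers / span))
    D = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(D, spectrum, rcond=None)
    fringe = D[:, 2:] @ coef[2:]                 # fitted fringe component
    return (spectrum - coef[0] - fringe) / coef[1]

wn = np.linspace(1000, 1800, 400)                # wavenumber axis (1/cm)
ref = np.exp(-((wn - 1650) / 30) ** 2)           # synthetic chemical band
raw = 0.1 + 0.8 * ref + 0.05 * np.sin(2 * np.pi * 6 * wn / 800 + 0.7)
corrected = emsc_defringe(raw, wn, ref)          # approximately recovers ref
```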
NASA Technical Reports Server (NTRS)
Camci, C.; Kim, K.; Hippensteele, S. A.
1992-01-01
A new image processing based color capturing technique for the quantitative interpretation of liquid crystal images used in convective heat transfer studies is presented. This method is highly applicable to the surfaces exposed to convective heating in gas turbine engines. It is shown that, in the single-crystal mode, many of the colors appearing on the heat transfer surface correlate strongly with the local temperature. A very accurate quantitative approach using an experimentally determined linear hue vs temperature relation is found to be possible. The new hue-capturing process is discussed in terms of the strength of the light source illuminating the heat transfer surface, the effect of the orientation of the illuminating source with respect to the surface, crystal layer uniformity, and the repeatability of the process. The present method is more advantageous than the multiple filter method because of its ability to generate many isotherms simultaneously from a single-crystal image at a high resolution in a very time-efficient manner.
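A minimal sketch of a hue-based readout under the linear hue-vs-temperature calibration described above; the slope and intercept are invented placeholders that an in-situ calibration would supply, not values from the paper.

```python
import colorsys

T_INTERCEPT = 30.0   # degrees C at hue = 0 (assumed calibration)
T_SLOPE = 12.5       # degrees C per unit hue (assumed calibration)

def pixel_temperature(r, g, b):
    """Map an 8-bit RGB pixel to a temperature via its hue."""
    hue, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return T_INTERCEPT + T_SLOPE * hue

print(pixel_temperature(40, 200, 120))   # one sample of an isotherm map
```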
Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny
2017-10-21
Modern molecular-dynamics-based techniques are extremely powerful to investigate the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
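In the same spirit, a simplified stand-in for Perron-cluster analysis: estimate a Markov transition matrix from a state trajectory and partition states by the sign of its second (slowest) eigenvector. Real PCCA+ applies a simplex transformation to several eigenvectors; this two-cluster toy is only illustrative.

```python
import numpy as np

def two_cluster_states(traj, n_states):
    """Group states of a trajectory into two metastable sets via the
    second eigenvector of the estimated transition matrix."""
    T = np.zeros((n_states, n_states))
    for a, b in zip(traj[:-1], traj[1:]):
        T[a, b] += 1.0
    T /= np.maximum(T.sum(axis=1, keepdims=True), 1.0)  # row-stochastic
    vals, vecs = np.linalg.eig(T)
    order = np.argsort(-vals.real)
    second = vecs[:, order[1]].real          # slowest non-stationary mode
    return (second > 0).astype(int)          # cluster label per state

traj = [0, 0, 1, 0, 1, 2, 3, 2, 3, 3, 2]     # toy 4-state trajectory
print(two_cluster_states(traj, 4))
```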
Ultrasonic nondestructive evaluation, microstructure, and mechanical property interrelations
NASA Technical Reports Server (NTRS)
Vary, A.
1984-01-01
Ultrasonic techniques for mechanical property characterizations are reviewed and conceptual models are advanced for explaining and interpreting the empirically based results. At present, the technology is generally empirically based and is emerging from the research laboratory. Advancement of the technology will require establishment of theoretical foundations for the experimentally observed interrelations among ultrasonic measurements, mechanical properties, and microstructure. Conceptual models are applied to ultrasonic assessment of fracture toughness to illustrate an approach for predicting correlations found among ultrasonic measurements, microstructure, and mechanical properties.
CFD in the 1980's from one point of view
NASA Technical Reports Server (NTRS)
Lomax, Harvard
1991-01-01
The present interpretive treatment of the development history of CFD in the 1980s gives attention to advancements in such algorithmic techniques as flux Jacobian-based upwind differencing, total variation-diminishing and essentially nonoscillatory schemes, multigrid methods, unstructured grids, and nonrectangular structured grids. At the same time, computational turbulence research gave attention to turbulence modeling on the bases of increasingly powerful supercomputers and meticulously constructed databases. The major future developments in CFD will encompass such capabilities as structured and unstructured three-dimensional grids.
NASA Technical Reports Server (NTRS)
Schrumpf, B. J. (Principal Investigator); Johnson, J. R.; Mouat, D. A.; Pyott, W. T.
1974-01-01
The author has identified the following significant results. A vegetation classification, with 31 types and compatible with remote sensing applications, was developed for the test site. Terrain features can be used to discriminate vegetation types. Elevation and macrorelief interpretations were successful on ERTS photos, although for macrorelief, high sun angle stereoscopic interpretations were better than low sun angle monoscopic interpretations. Using spectral reflectivity, several vegetation types were characterized in terms of patterns of signature change. ERTS MSS digital data were used to discriminate vegetation classes at the association level and at the alliance level when image contrasts were high or low, respectively. An imagery comparison technique was developed to test image complexity and image groupability. In two stage sampling of vegetation types, ERTS plus high altitude photos were highly satisfactory for estimating kind and extent of types present, and for providing a mapping base.
Support vector machine for automatic pain recognition
NASA Astrophysics Data System (ADS)
Monwar, Md Maruf; Rezaei, Siamak
2009-02-01
Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognizing a specific expression, pain, from human faces. We employ an automatic face detector which detects faces from stored video frames using a skin-color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
Verifying Multi-Agent Systems via Unbounded Model Checking
NASA Technical Reports Server (NTRS)
Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.
2004-01-01
We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.
1976-01-01
Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.
NASA Technical Reports Server (NTRS)
Brooner, W. G.; Nichols, D. A.
1972-01-01
Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management program applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.
Trautwein, C.M.; Rowan, L.C.
1987-01-01
Linear structural features and hydrothermally altered rocks that were interpreted from Landsat data have been used by the U.S. Geological Survey (USGS) in regional mineral resource appraisals for more than a decade. In the past, linear features and alterations have been incorporated into models for assessing mineral resources potential by manually overlaying these and other data sets. Recently, USGS research into computer-based geographic information systems (GIS) for mineral resources assessment programs has produced several new techniques for data analysis, quantification, and integration to meet assessment objectives.
Improved interpretation of satellite altimeter data using genetic algorithms
NASA Technical Reports Server (NTRS)
Messa, Kenneth; Lybanon, Matthew
1992-01-01
Genetic algorithms (GAs) are optimization techniques based on the mechanics of evolution and natural selection. They take advantage of the power of cumulative selection, in which successive incremental improvements in a solution structure become the basis for continued development. A GA is an iterative procedure that maintains a 'population' of 'organisms' (candidate solutions). Through successive 'generations' (iterations) the population as a whole improves, in simulation of Darwin's 'survival of the fittest'. GAs have been shown to be successful where noise significantly reduces the ability of other search techniques to work effectively. Satellite altimetry provides useful information about oceanographic phenomena. It provides rapid global coverage of the oceans and is not as severely hampered by cloud cover as infrared imagery. Despite these and other benefits, several factors lead to significant difficulty in interpretation. The GA approach to the improved interpretation of satellite data involves representing the ocean surface model as a string of parameters or coefficients from the model. The GA searches, in parallel, a population of such representations (organisms) to obtain the individual best suited to 'survive', that is, the fittest as measured with respect to some 'fitness' function. The fittest organism is the one that best represents the ocean surface model with respect to the altimeter data.
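A toy genetic algorithm in the style described, where organisms are coefficient vectors of a surface model and fitness is the negative misfit to (here, synthetic) data; the model, population sizes, and rates are invented placeholders, not the authors' configuration.

```python
import random

random.seed(1)
TRUE = [0.3, -1.2, 0.8]                                   # hidden coefficients

def fitness(org):                                          # negative squared misfit
    return -sum((a - b) ** 2 for a, b in zip(org, TRUE))

pop = [[random.uniform(-2, 2) for _ in range(3)] for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                     # survival of the fittest
    children = []
    while len(children) < 30:
        p, q = random.sample(parents, 2)
        cut = random.randrange(1, 3)                       # one-point crossover
        child = p[:cut] + q[cut:]
        if random.random() < 0.3:                          # occasional mutation
            i = random.randrange(3)
            child[i] += random.gauss(0, 0.1)
        children.append(child)
    pop = parents + children
print(max(pop, key=fitness))                               # best coefficient vector
```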
42 CFR 493.911 - Bacteriology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... laboratories: (1) Those that interpret Gram stains or perform primary inoculation, or both; and refer cultures...; (2) Those that use direct antigen techniques to detect an organism and may also interpret Gram stains... interpreting Gram stains, performing primary inoculations, and using direct antigen tests, also isolate and...
42 CFR 493.911 - Bacteriology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... laboratories: (1) Those that interpret Gram stains or perform primary inoculation, or both; and refer cultures...; (2) Those that use direct antigen techniques to detect an organism and may also interpret Gram stains... interpreting Gram stains, performing primary inoculations, and using direct antigen tests, also isolate and...
42 CFR 493.911 - Bacteriology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... laboratories: (1) Those that interpret Gram stains or perform primary inoculation, or both; and refer cultures...; (2) Those that use direct antigen techniques to detect an organism and may also interpret Gram stains... interpreting Gram stains, performing primary inoculations, and using direct antigen tests, also isolate and...
NASA Astrophysics Data System (ADS)
Zvezhinskiy, D. S.; Butterling, M.; Wagner, A.; Krause-Rehberg, R.; Stepanov, S. V.
2013-06-01
Recent development of the Gamma-induced Positron Spectroscopy (GiPS) setup significantly extends the applicability of the Age-Momentum Correlation (AMOC) technique for studies of bulk samples. It also provides many advantages compared with conventional positron annihilation experiments in liquids, such as an extremely low annihilation fraction in the vessel walls and the absence of a positron source and of positron annihilations within it. We have developed a new approach for the processing and interpretation of AMOC-GiPS data based on the diffusion recombination model of intratrack radiolytic processes. This approach is verified in the case of liquid water, which is considered a reference medium in positron and positronium chemistry.
Clinical decision making using teleradiology in urology.
Lee, B R; Allaf, M; Moore, R; Bohlman, M; Wang, G M; Bishoff, J T; Jackman, S V; Cadeddu, J A; Jarrett, T W; Khazan, R; Kavoussi, L R
1999-01-01
Using a personal computer-based teleradiology system, we compared accuracy, confidence, and diagnostic ability in the interpretation of digitized radiographs to determine if teleradiology-imported studies convey sufficient information to make relevant clinical decisions involving urology. Variables of diagnostic accuracy, confidence, image quality, interpretation, and the impact of clinical decisions made after viewing digitized radiographs were compared with those of original radiographs. We evaluated 956 radiographs that included 94 IV pyelograms, four voiding cystourethrograms, and two nephrostograms. The radiographs were digitized and transferred over an Ethernet network to a remote personal computer-based viewing station. The digitized images were viewed by urologists and graded according to confidence in making a diagnosis, image quality, diagnostic difficulty, clinical management based on the image itself, and brief patient history. The hard-copy radiographs were then interpreted immediately afterward, and diagnostic decisions were reassessed. All analog radiographs were reviewed by an attending radiologist. Ninety-seven percent of the decisions made from the digitized radiographs did not change after reviewing conventional radiographs of the same case. When comparing the variables of clinical confidence, quality of the film on the teleradiology system versus analog films, and diagnostic difficulty, we found no statistical difference (p > .05) between the two techniques. Overall accuracy in interpreting the digitized images on the teleradiology system was 88% by urologists compared with that of the attending radiologist's interpretation of the analog radiographs. However, urologists detected findings on five (5%) analog radiographs that had been previously unreported by the radiologist. Viewing radiographs transmitted to a personal computer-based viewing station is an appropriate means of reviewing films with sufficient quality on which to base clinical decisions. Our focus was whether decisions made after viewing the transmitted radiographs would change after viewing the hard-copy images of the same case. In 97% of the cases, the decision did not change. In those cases in which management was altered, recommendation of further imaging studies was the most common factor.
Beyond Academia - Interrogating Research Impact in the Research Excellence Framework.
Terama, Emma; Smallman, Melanie; Lock, Simon J; Johnson, Charlotte; Austwick, Martin Zaltz
2016-01-01
Big changes to the way in which research funding is allocated to UK universities were brought about in the Research Excellence Framework (REF), overseen by the Higher Education Funding Council, England. Replacing the earlier Research Assessment Exercise, the purpose of the REF was to assess the quality and reach of research in UK universities, and to allocate funding accordingly. For the first time, this included an assessment of research 'impact', accounting for 20% of the funding allocation. In this article we use a text mining technique to investigate the interpretations of impact put forward via impact case studies in the REF process. We find that institutions have developed a diverse interpretation of impact, ranging from commercial applications to public and cultural engagement activities. These interpretations of impact vary from discipline to discipline and between institutions, with more broad-based institutions depicting a greater variety of impacts. Comparing the interpretations with the score given by REF, we found no evidence of one particular interpretation being more highly rewarded than another. Importantly, we also found a positive correlation between impact score and [overall research] quality score, suggesting that impact is not being achieved at the expense of research excellence.
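For flavor, a hedged sketch of one way such a text-mining step might look: cluster case-study texts by their TF-IDF profiles to surface distinct interpretations of impact. The documents are invented stand-ins, and the study's actual technique may differ.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy stand-ins for REF impact case-study texts.
docs = [
    "spin-out company licensed patents generating commercial revenue",
    "museum exhibition and public engagement with cultural heritage",
    "clinical guidelines changed national health policy",
    "software licensed to industry generating sales",
]
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # one "interpretation of impact" label per case study
```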
NASA Astrophysics Data System (ADS)
Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi
2010-03-01
X-ray CT images have been widely used in clinical routine in recent years. CT images scanned by a modern CT scanner can show the details of various organs and tissues, which means various organs and tissues can be simultaneously interpreted on CT images. However, CT image interpretation requires a lot of time and energy, so support for interpreting CT images based on image-processing techniques is expected. The interpretation of spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme for spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) automated extraction of the skeletal region based on CT number thresholding, (2) automated extraction of the center line of the spinal canal, (3) generation of the median plane image of the spine, reformatted based on the spinal canal, and (4) quantification of the spinal curvature. The proposed scheme was applied to 10 cases and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation (95% confidence interval for lumbar lordosis: 0.81-0.99) between values obtained by the proposed (vector) method and the Cobb angle. The proposed method also provides reproducible results (inter- and intra-observer variability: within 2°). These experimental results suggest that the proposed method is efficient for quantifying spinal curvature on CT images.
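A minimal sketch of a vector-based curvature angle of the kind the abstract suggests: take tangent directions of the spinal-canal center line at the ends of a curve and measure the angle between them (a Cobb-like measure). The center-line points below are fabricated for illustration.

```python
import numpy as np

# Fabricated (x, z) points along a spinal-canal center line, bottom to top.
centerline = np.array([[0, 0], [2, 10], [3, 20], [2, 30], [0, 40]], float)

v_bot = centerline[1] - centerline[0]        # tangent near the curve bottom
v_top = centerline[-1] - centerline[-2]      # tangent near the curve top
cosang = v_bot @ v_top / (np.linalg.norm(v_bot) * np.linalg.norm(v_top))
angle_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
print(f"curvature angle = {angle_deg:.1f} degrees")
```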
Hybrid modeling method for a DEP based particle manipulation.
Miled, Mohamed Amine; Gagne, Antoine; Sawan, Mohamad
2013-01-30
In this paper, a new modeling approach for dielectrophoresis (DEP)-based particle manipulation is presented. The proposed method fills missing links in finite element modeling between the multiphysics simulation and the biological behavior. This technique is among the first steps toward developing a more complex platform covering several types of manipulation, such as magnetophoresis and optical manipulation. The modeling approach is based on a hybrid interface using both ANSYS and MATLAB to link the propagation of the electric field in the micro-channel to the particle motion. ANSYS simulates the electrical propagation while MATLAB interprets the results to calculate cell displacement and sends the new information back to ANSYS for the next iteration. The beta version of the proposed technique takes into account particle shape, weight, and electrical properties. Initial results are consistent with experimental results. PMID:23364197
System equivalent model mixing
NASA Astrophysics Data System (ADS)
Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis
2018-05-01
This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM), frequency-based models, either numerical or experimental in nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques, namely DoF-expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions to the method. One of the main applications is tested in a practical case, performed on a validated benchmark structure; it will emphasize the practicality of the method.
Wimmer, Helge; Gundacker, Nina C; Griss, Johannes; Haudek, Verena J; Stättner, Stefan; Mohr, Thomas; Zwickl, Hannes; Paulitschke, Verena; Baron, David M; Trittner, Wolfgang; Kubicek, Markus; Bayer, Editha; Slany, Astrid; Gerner, Christopher
2009-06-01
Interpretation of proteome data with a focus on biomarker discovery largely relies on comparative proteome analyses. Here, we introduce a database-assisted interpretation strategy based on proteome profiles of primary cells. Both 2-D-PAGE and shotgun proteomics are applied, and we obtain high data concordance with these two different techniques. When applying mass analysis of tryptic spot digests from 2-D gels of cytoplasmic fractions, we typically identify several hundred proteins. Using the same protein fractions, we usually identify more than a thousand proteins by shotgun proteomics. The data consistency obtained when comparing these independent data sets exceeds 99% of the proteins identified in the 2-D gels. Many characteristic differences in protein expression of different cells can thus be independently confirmed. Our self-designed SQL database (CPL/MUW - database of the Clinical Proteomics Laboratories at the Medical University of Vienna, accessible via www.meduniwien.ac.at/proteomics/database) facilitates (i) quality management of MS-based protein identification data, (ii) the detection of cell type-specific proteins, and (iii) the detection of molecular signatures of specific functional cell states. Here, we demonstrate how the interpretation of proteome profiles obtained from human liver tissue and hepatocellular carcinoma tissue is assisted by the CPL/MUW database. Therefore, we suggest that the use of reference experiments supported by a tailored database may substantially facilitate data interpretation of proteome profiling experiments.
Fuzzy logic and image processing techniques for the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.
2011-06-01
Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool to deal with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of a MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and various image processing techniques are presented. We conclude that this is a problem well suited to fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation.
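To make the rule-based machinery concrete, here is a minimal Mamdani-style fuzzy inference sketch in Python with triangular memberships and centroid defuzzification. The single if-then rule, the attribute names (amplitude, continuity), and all membership parameters are invented for illustration; the paper's actual rule base for MTC detection is richer.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mtc_score(amplitude, continuity):
    """Toy Mamdani rule: IF amplitude is high AND continuity is low THEN MTC is high."""
    fire = min(tri(amplitude, 0.4, 1.0, 1.6), tri(continuity, -0.6, 0.0, 0.6))
    out = np.linspace(0, 1, 101)                         # output universe
    clipped = np.minimum(tri(out, 0.5, 1.0, 1.5), fire)  # min implication
    if clipped.sum() == 0:
        return 0.0
    return float((out * clipped).sum() / clipped.sum())  # centroid defuzzification

print(mtc_score(amplitude=0.9, continuity=0.2))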
Interface Problems: Structural Constraints on Interpretation?
ERIC Educational Resources Information Center
Frazier, Lyn; Clifton, Charles; Rayner, Keith; Deevy, Patricia; Koh, Sungryong; Bader, Markus
2005-01-01
Five experiments investigated the interpretation of quantified noun phrases in relation to discourse structure. They demonstrated, using questionnaire and on-line reading techniques, that readers in English prefer to give a quantified noun phrase in (VP-external) subject position a presuppositional interpretation, in which the noun phrase limits…
Decision trees in epidemiological research.
Venkatasubramaniam, Ashwini; Wolfson, Julian; Mitchell, Nathan; Barnes, Timothy; JaKa, Meghan; French, Simone
2017-01-01
In many studies, it is of interest to identify population subgroups that are relatively homogeneous with respect to an outcome. The nature of these subgroups can provide insight into effect mechanisms and suggest targets for tailored interventions. However, identifying relevant subgroups can be challenging with standard statistical methods. We review the literature on decision trees, a family of techniques for partitioning the population, on the basis of covariates, into distinct subgroups who share similar values of an outcome variable. We compare two decision tree methods, the popular Classification and Regression tree (CART) technique and the newer Conditional Inference tree (CTree) technique, assessing their performance in a simulation study and using data from the Box Lunch Study, a randomized controlled trial of a portion size intervention. Both CART and CTree identify homogeneous population subgroups and offer improved prediction accuracy relative to regression-based approaches when subgroups are truly present in the data. An important distinction between CART and CTree is that the latter uses a formal statistical hypothesis testing framework in building decision trees, which simplifies the process of identifying and interpreting the final tree model. We also introduce a novel way to visualize the subgroups defined by decision trees. Our novel graphical visualization provides a more scientifically meaningful characterization of the subgroups identified by decision trees. Decision trees are a useful tool for identifying homogeneous subgroups defined by combinations of individual characteristics. While all decision tree techniques generate subgroups, we advocate the use of the newer CTree technique due to its simplicity and ease of interpretation.
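A minimal CART example using scikit-learn's DecisionTreeRegressor shows how a tree recovers subgroups defined by combinations of covariates; CTree itself is implemented in R's partykit package, so this sketch covers only the CART side. The synthetic data and thresholds are invented for illustration, not taken from the Box Lunch Study.

import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                      # covariates
# true subgroups: the outcome jumps when x0 > 0 and x1 > 0.5
y = 2.0 * ((X[:, 0] > 0) & (X[:, 1] > 0.5)) + rng.normal(0, 0.3, n)

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=30).fit(X, y)
print(export_text(tree, feature_names=["x0", "x1", "x2"]))

The printed rules correspond directly to the population subgroups the abstract describes, which is what makes tree output easy to communicate relative to regression coefficients.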
NASA Astrophysics Data System (ADS)
de los Santos-Villalobos, Sergio; Bravo-Linares, Claudio; dos Anjos, Roberto Meigikos; Cardoso, Renan; Gibbs, Max; Swales, Andrew; Mabit, Lionel; Dercon, Gerd
Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track down sediments and specify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool for working with data sets generated by the use of the CSSI technique to assess soil apportionment, called the CSSIARv1.00 Software.
Emotion-prints: interaction-driven emotion visualization on multi-touch interfaces
NASA Astrophysics Data System (ADS)
Cernea, Daniel; Weber, Christopher; Ebert, Achim; Kerren, Andreas
2015-01-01
Emotions are one of the unique aspects of human nature, and sadly at the same time one of the elements that our technological world is failing to capture and consider due to their subtlety and inherent complexity. But with the current dawn of new technologies that enable the interpretation of emotional states based on techniques involving facial expressions, speech and intonation, electrodermal response (EDS) and brain-computer interfaces (BCIs), we are finally able to access real-time user emotions in various system interfaces. In this paper we introduce emotion-prints, an approach for visualizing user emotional valence and arousal in the context of multi-touch systems. Our goal is to offer a standardized technique for representing user affective states in the moment when and at the location where the interaction occurs in order to increase affective self-awareness, support awareness in collaborative and competitive scenarios, and offer a framework for aiding the evaluation of touch applications through emotion visualization. We show that emotion-prints are not only independent of the shape of the graphical objects on the touch display, but also that they can be applied regardless of the acquisition technique used for detecting and interpreting user emotions. Moreover, our representation can encode any affective information that can be decomposed or reduced to Russell's two-dimensional space of valence and arousal. Our approach is supported by a BCI-based user study and a follow-up discussion of advantages and limitations.
Mortimer, Duncan; Segal, Leonie
2008-01-01
Algorithms for converting descriptive measures of health status into quality-adjusted life year (QALY) weights are now widely available, and their application in economic evaluation is increasingly commonplace. The objective of this study is to describe and compare existing conversion algorithms and to highlight issues bearing on the derivation and interpretation of the QALY-weights so obtained. Systematic review of algorithms for converting descriptive measures of health status into QALY-weights. The review identified a substantial body of literature comprising 46 derivation studies and 16 studies that provided evidence or commentary on the validity of conversion algorithms. Conversion algorithms were derived using 1 of 4 techniques: 1) transfer to utility regression, 2) response mapping, 3) effect size translation, and 4) "revaluing" outcome measures using preference-based scaling techniques. Although these techniques differ in their methodological/theoretical tradition, data requirements, and ease of derivation and application, the available evidence suggests that the sensitivity and validity of derived QALY-weights may be more dependent on the coverage and sensitivity of measures and the disease area/patient group under evaluation than on the technique used in derivation. Despite the recent proliferation of conversion algorithms, a number of questions bearing on the derivation and interpretation of derived QALY-weights remain unresolved. These unresolved issues suggest directions for future research in this area. In the meantime, analysts seeking guidance in selecting derived QALY-weights should consider the validity and feasibility of each conversion algorithm in the disease area and patient group under evaluation rather than restricting their choice to weights from a particular derivation technique.
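Of the four techniques listed, transfer-to-utility regression is the simplest to sketch: regress observed utility weights on descriptive health-status scores, then cap predictions at 1 (full health). The sketch below uses fabricated data and hypothetical variable names; it is illustrative only and does not reproduce any published algorithm.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 300
descriptive = rng.uniform(0, 100, size=(n, 2))         # two descriptive subscale scores
true_u = 0.4 + 0.004 * descriptive[:, 0] + 0.002 * descriptive[:, 1]
utility = np.clip(true_u + rng.normal(0, 0.05, n), None, 1.0)

ttu = LinearRegression().fit(descriptive, utility)      # transfer-to-utility fit
pred = np.clip(ttu.predict(descriptive), None, 1.0)     # QALY-weights capped at 1
print(ttu.coef_, pred[:5])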
Single-particle imaging for biosensor applications
NASA Astrophysics Data System (ADS)
Yorulmaz, Mustafa; Isil, Cagatay; Seymour, Elif; Yurdakul, Celalettin; Solmaz, Berkan; Koc, Aykut; Ünlü, M. Selim
2017-10-01
Current state-of-the-art technology for in-vitro diagnostics employs laboratory tests such as ELISA that consist of a multi-step test procedure and give results in analog format. Results of these tests are interpreted by the color change in a set of diluted samples in a multi-well plate. However, detection of minute changes in the color poses challenges and can lead to false interpretations. Instead, a technique that allows individual counting of specific binding events would be useful to overcome such challenges. Digital imaging has been applied recently for diagnostics applications. SPR is one of the techniques allowing quantitative measurements. However, the limit of detection in this technique is on the order of nM. The currently required detection limit, which is already achieved with the analog techniques, is around pM. Optical techniques that are simple to implement and can offer better sensitivities have great potential to be used in medical diagnostics. Interference microscopy is one of the tools that have been investigated over years in the optics field. Most of these studies have been performed in confocal geometry, where each individual nanoparticle was observed separately. Here, we achieve wide-field imaging of individual nanoparticles in a large field-of-view (~166 μm × 250 μm) on a micro-array based sensor chip in a fraction of a second. We tested the sensitivity of our technique on dielectric nanoparticles because they exhibit optical properties similar to viruses and cells. We can detect non-resonant dielectric polystyrene nanoparticles of 100 nm. Moreover, we perform post-processing to further enhance visibility.
A bird's eye view: the cognitive strategies of experts interpreting seismic profiles
NASA Astrophysics Data System (ADS)
Bond, C. E.; Butler, R.
2012-12-01
Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels, from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that techniques and strategies are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments we have focused on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al. (2007). 'What do you think this is?: "Conceptual uncertainty" in geoscience interpretation'. GSA Today, 17, 4-10.
Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C
2018-06-01
The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity.
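The contrast between PCA and an autoencoder as pre-processing for EEM-style data can be sketched as follows; a single-hidden-layer MLPRegressor trained to reproduce its own input acts as a crude autoencoder, with the bottleneck activations recovered manually from the fitted weights. Matrix sizes and all settings are arbitrary placeholders, not those of the study.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = np.abs(rng.normal(size=(200, 30 * 20)))          # 200 flattened toy EEMs (ex x em)

pca_scores = PCA(n_components=5).fit_transform(X)    # linear dimensionality reduction

# autoencoder proxy: train an MLP to reproduce its input, keep the bottleneck
ae = MLPRegressor(hidden_layer_sizes=(5,), activation="relu",
                  max_iter=300, random_state=0).fit(X, X)
latent = np.maximum(X @ ae.coefs_[0] + ae.intercepts_[0], 0.0)  # hidden activations
print(pca_scores.shape, latent.shape)

Either 5-dimensional representation could then feed a downstream regressor for DBP formation, which is the pipeline the abstract compares.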
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problems (generalized traveling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. For solving the GTSP we propose to use the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
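The paper's GTSP model with megalopolises and dynamic programming is beyond a short sketch, but a greedy nearest-neighbour ordering of pierce points illustrates the underlying tool-path problem of shortening idle (air) moves between contours. Everything here (function name, toy coordinates) is hypothetical.

import numpy as np

def nearest_neighbour_path(pierce_points, start=0):
    """Greedy ordering of contour pierce points to shorten idle (air) moves."""
    pts = np.asarray(pierce_points, dtype=float)
    todo = set(range(len(pts)))
    order = [start]
    todo.remove(start)
    while todo:
        last = pts[order[-1]]
        nxt = min(todo, key=lambda i: np.linalg.norm(pts[i] - last))
        order.append(nxt)
        todo.remove(nxt)
    return order

pierce = [(0, 0), (50, 10), (5, 40), (48, 42), (20, 20)]
print(nearest_neighbour_path(pierce))

A dynamic-programming GTSP solver would replace this heuristic when precedence constraints between contours (e.g. inner before outer) must be respected.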
Interpretive Responses in Reading History and Biology: An Exploratory Study
ERIC Educational Resources Information Center
Fareed, Ahmed A.
1971-01-01
Explores the interpretive processes of 12 sixth-grade pupils, using the recorded interview technique. Concludes that readers use the processes of reproduction, inquiry, emotional reaction, rational judgment, appreciation, association, and illumination, and that the nature of the reading material influences the types of interpretive responses. (VJ)
Production and Comprehension of Pantomimes Used to Depict Objects
van Nispen, Karin; van de Sandt-Koenderman, W. Mieke. E.; Krahmer, Emiel
2017-01-01
Pantomime, gesture in absence of speech, has no conventional meaning. Nevertheless, individuals seem to be able to produce pantomimes and derive meaning from pantomimes. A number of studies have addressed the use of co-speech gesture, but little is known about pantomime. Therefore, the question of how people construct and understand pantomimes arises in gesture research. To determine how people use pantomimes, we asked participants to depict a set of objects using pantomimes only. We annotated what representation techniques people produced. Furthermore, using judgment tasks, we assessed the pantomimes' comprehensibility. Analyses showed that similar techniques were used to depict objects across individuals. Objects with a default depiction method were better comprehended than objects for which there was no such default. More specifically, tools and objects depicted using a handling technique were better understood. The open-answer experiment showed low interpretation accuracy. Conversely, the forced-choice experiment showed ceiling effects. These results suggest that across individuals, similar strategies are deployed to produce pantomime, with the handling technique as the apparent preference. This might indicate that the production of pantomimes is based on mental representations which are intrinsically similar. Furthermore, pantomime conveys semantically rich, but ambiguous, information, and its interpretation is much dependent on context. This pantomime database is available online: https://dataverse.nl/dataset.xhtml?persistentId=hdl:10411/QZHO6M. This can be used as a baseline with which we can compare clinical groups. PMID:28744232
Clinical veterinary proteomics: Techniques and approaches to decipher the animal plasma proteome.
Ghodasara, P; Sadowski, P; Satake, N; Kopp, S; Mills, P C
2017-12-01
Over the last two decades, technological advancements in the field of proteomics have advanced our understanding of the complex biological systems of living organisms. Techniques based on mass spectrometry (MS) have emerged as powerful tools to contextualise existing genomic information and to create quantitative protein profiles from plasma, tissues or cell lines of various species. Proteomic approaches have been used increasingly in veterinary science to investigate biological processes responsible for growth, reproduction and pathological events. However, the adoption of proteomic approaches by veterinary investigators lags behind that of researchers in the human medical field. Furthermore, in contrast to human proteomics studies, interpretation of veterinary proteomic data is difficult due to the limited protein databases available for many animal species. This review article examines the current use of advanced proteomics techniques for evaluation of animal health and welfare and covers the current status of clinical veterinary proteomics research, including successful protein identification and data interpretation studies. It includes a description of an emerging tool, sequential window acquisition of all theoretical fragment ion mass spectra (SWATH-MS), available on selected mass spectrometry instruments. This newly developed data acquisition technique combines advantages of discovery and targeted proteomics approaches, and thus has the potential to advance the veterinary proteomics field by enhancing identification and reproducibility of proteomics data.
Sonification of optical coherence tomography data and images
Ahmad, Adeel; Adie, Steven G.; Wang, Morgan; Boppart, Stephen A.
2010-01-01
Sonification is the process of representing data as non-speech audio signals. In this manuscript, we describe the auditory presentation of OCT data and images. OCT acquisition rates frequently exceed our ability to visually analyze image-based data, and multi-sensory input may therefore facilitate rapid interpretation. This conversion will be especially valuable in time-sensitive surgical or diagnostic procedures. In these scenarios, auditory feedback can complement visual data without requiring the surgeon to constantly monitor the screen, or provide additional feedback in non-imaging procedures such as guided needle biopsies which use only axial-scan data. In this paper we present techniques to translate OCT data and images into sound based on the spatial and spatial frequency properties of the OCT data. Results obtained from parameter-mapped sonification of human adipose and tumor tissues are presented, indicating that audio feedback of OCT data may be useful for the interpretation of OCT images. PMID:20588846
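A parameter-mapped sonification in the spirit described above can be sketched in a few lines: map one A-scan's mean intensity to pitch and its variability to loudness, producing a short tone. The mapping choices and names below are assumptions for illustration, not the authors' actual mapping.

import numpy as np

def sonify_ascan(ascan, sr=44100, dur=0.5, fmin=220.0, fmax=1760.0):
    """Parameter-mapped sonification: mean A-scan intensity -> pitch,
    intensity spread -> loudness. Returns one audio tone as float samples."""
    a = np.asarray(ascan, dtype=float)
    level = (a.mean() - a.min()) / (np.ptp(a) + 1e-12)   # 0..1
    freq = fmin * (fmax / fmin) ** level                 # logarithmic pitch mapping
    amp = np.clip(a.std() / (np.ptp(a) + 1e-12), 0.05, 1.0)
    t = np.arange(int(sr * dur)) / sr
    return amp * np.sin(2 * np.pi * freq * t)

tone = sonify_ascan(np.random.rand(1024))
print(tone.shape, tone.max())

Played back in sequence, such tones would let a surgeon hear tissue changes without watching the screen, which is the use case the abstract motivates.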
Tian, Xin; Xin, Mingyuan; Luo, Jian; Liu, Mingyao; Jiang, Zhenran
2017-02-01
The selection of relevant genes for breast cancer metastasis is critical for the treatment and prognosis of cancer patients. Although much effort has been devoted to gene selection procedures using different statistical analysis methods or computational techniques, the interpretation of the variables in the resulting survival models has been limited so far. This article proposes a new Random Forest (RF)-based algorithm to identify important variables highly related to breast cancer metastasis, based on the importance scores of two variable selection algorithms: the mean decrease Gini (MDG) criterion of Random Forest and the GeneRank algorithm with protein-protein interaction (PPI) information. The new gene selection algorithm is called PPIRF. The improved prediction accuracy fully illustrates the reliability and high interpretability of the gene list selected by the PPIRF approach.
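A hedged sketch of the two ingredients: scikit-learn's feature_importances_ supplies mean-decrease-in-impurity scores, and a PageRank-style power iteration over a toy PPI adjacency matrix stands in for GeneRank. The 0.85 damping factor, the 50/50 combination, and the random network are placeholders; the paper's exact PPIRF weighting may differ.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n, p = 200, 30
X = rng.normal(size=(n, p))                       # toy expression matrix
y = (X[:, 0] + X[:, 1] > 0).astype(int)           # toy metastasis label

mdg = RandomForestClassifier(n_estimators=300, random_state=0) \
    .fit(X, y).feature_importances_               # mean decrease in impurity

A = (rng.random((p, p)) < 0.1).astype(float)      # toy PPI adjacency
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)
deg = np.maximum(A.sum(1), 1)
r = np.full(p, 1.0 / p)                           # GeneRank-style power iteration,
for _ in range(100):                              # personalized by RF importance
    r = (1 - 0.85) * mdg / mdg.sum() + 0.85 * A.T @ (r / deg)

combined = 0.5 * mdg / mdg.sum() + 0.5 * r / r.sum()
print(np.argsort(combined)[::-1][:5])             # top-ranked genes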
Statistical normalization techniques for magnetic resonance imaging.
Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M
2014-01-01
While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
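One simple normalization satisfying the comparability idea is z-scoring intensities against statistics computed over a tissue mask, which puts every scan in standard-deviation units. The sketch below, with its synthetic image and crude percentile mask, is illustrative only; the paper compares several such biologically motivated methods.

import numpy as np

def zscore_normalize(image, mask):
    """Normalize MR intensities to standard-deviation units relative to the
    mean over a tissue mask, giving comparable units across scans/subjects."""
    vals = image[mask]
    return (image - vals.mean()) / vals.std()

img = np.random.default_rng(4).gamma(2.0, 50.0, size=(64, 64, 64))
mask = img > np.percentile(img, 60)               # crude stand-in for a brain mask
norm = zscore_normalize(img, mask)
print(norm[mask].mean().round(3), norm[mask].std().round(3))   # ~0.0, ~1.0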
A web-based instruction module for interpretation of craniofacial cone beam CT anatomy.
Hassan, B A; Jacobs, R; Scarfe, W C; Al-Rawi, W T
2007-09-01
To develop a web-based module for learner instruction in the interpretation and recognition of osseous anatomy on craniofacial cone-beam CT (CBCT) images. Volumetric datasets from three CBCT systems were acquired (i-CAT, NewTom 3G and AccuiTomo FPD) for various subjects using equipment-specific scanning protocols. The datasets were processed using multiple software packages to provide two-dimensional (2D) multiplanar reformatted (MPR) images (e.g. sagittal, coronal and axial) and three-dimensional (3D) visual representations (e.g. maximum intensity projection, minimum intensity projection, ray sum, surface and volume rendering). Distinct didactic modules which illustrate the principles of CBCT systems, guided navigation of the volumetric dataset, and anatomic correlation of 3D models and 2D MPR graphics were developed using a hybrid combination of web authoring and image analysis techniques. Interactive web multimedia instruction was facilitated by the use of dynamic highlighting and labelling, and rendered video illustrations, supplemented with didactic textual material. HTML coding and JavaScript were implemented extensively to integrate the educational modules. An interactive, multimedia educational tool for visualizing the morphology and interrelationships of osseous craniofacial anatomy, as depicted on CBCT MPR and 3D images, was designed and implemented. The present design of a web-based instruction module may assist radiologists and clinicians in learning how to recognize and interpret the craniofacial anatomy of CBCT based images more efficiently.
Major Fault Patterns in Zanjan State of Iran Based on the GECO Global Geoid Model
NASA Astrophysics Data System (ADS)
Beheshty, Sayyed Amir Hossein; Abrari Vajari, Mohammad; Raoufikelachayeh, SeyedehSusan
2016-04-01
A new Earth Gravitational Model (GECO) to degree 2190 has been developed that incorporates EGM2008 and the latest GOCE-based satellite solutions. Satellite gradiometry data are more sensitive to the long- and medium-wavelength components of the gravity field than conventional satellite tracking data. Hence, by utilizing this new technique, more accurate, reliable and higher degrees/orders of the spherical harmonic expansion of the gravity field can be achieved. Gravity gradients can also be useful in geophysical interpretation and prospecting. We have presented the concept of gravity gradients with some simple interpretations. MATLAB-based computer programs were developed and utilized for determining the gravity and gradient components of the gravity field using the GGMs, followed by a case study in Zanjan State of Iran. Our numerical studies show strong (more than 72%) correlations between gravity anomalies and the diagonal elements of the gradient tensor. Also, strong correlations were revealed between the components of the deflection of the vertical and the off-diagonal elements, as well as between the horizontal gradient and the magnitude of the deflection of the vertical. We clearly distinguished two big faults in the north and south of Zanjan city based on the current information. Also, several minor faults were detected in the study area. Therefore, the same geophysical interpretation can be stated for gravity gradient components too. Our mathematical derivations support some of these correlations.
Sparse network-based models for patient classification using fMRI
Rosa, Maria J.; Portugal, Liana; Hahn, Tim; Fallgatter, Andreas J.; Garrido, Marta I.; Shawe-Taylor, John; Mourao-Miranda, Janaina
2015-01-01
Pattern recognition applied to whole-brain neuroimaging data, such as functional Magnetic Resonance Imaging (fMRI), has proved successful at discriminating psychiatric patients from healthy participants. However, predictive patterns obtained from whole-brain voxel-based features are difficult to interpret in terms of the underlying neurobiology. Many psychiatric disorders, such as depression and schizophrenia, are thought to be brain connectivity disorders. Therefore, pattern recognition based on network models might provide deeper insights and potentially more powerful predictions than whole-brain voxel-based approaches. Here, we build a novel sparse network-based discriminative modeling framework, based on Gaussian graphical models and L1-norm regularized linear Support Vector Machines (SVM). In addition, the proposed framework is optimized in terms of both predictive power and reproducibility/stability of the patterns. Our approach aims to provide better pattern interpretation than voxel-based whole-brain approaches by yielding stable brain connectivity patterns that underlie discriminative changes in brain function between the groups. We illustrate our technique by classifying patients with major depressive disorder (MDD) and healthy participants, in two (event- and block-related) fMRI datasets acquired while participants performed a gender discrimination and emotional task, respectively, during the visualization of emotional valent faces. PMID:25463459
Forest and range mapping in the Houston area with ERTS-1
NASA Technical Reports Server (NTRS)
Heath, G. R.; Parker, H. D.
1973-01-01
ERTS-1 data acquired over the Houston area has been analyzed for applications to forest and range mapping. In the field of forestry the Sam Houston National Forest (Texas) was chosen as a test site, (Scene ID 1037-16244). Conventional imagery interpretation as well as computer processing methods were used to make classification maps of timber species, condition and land-use. The results were compared with timber stand maps which were obtained from aircraft imagery and checked in the field. The preliminary investigations show that conventional interpretation techniques indicated an accuracy in classification of 63 percent. The computer-aided interpretations made by a clustering technique gave 70 percent accuracy. Computer-aided and conventional multispectral analysis techniques were applied to range vegetation type mapping in the gulf coast marsh. Two species of salt marsh grasses were mapped.
The Recent Developments in Sample Preparation for Mass Spectrometry-Based Metabolomics.
Gong, Zhi-Gang; Hu, Jing; Wu, Xi; Xu, Yong-Jiang
2017-07-04
Metabolomics is a critical component of systems biology. Although great progress has been achieved in metabolomics, there are still some problems in sample preparation, data processing and data interpretation. In this review, we explore the roles, challenges and trends in sample preparation for mass spectrometry- (MS-) based metabolomics. Newly emerged sample preparation methods are also critically examined, including laser microdissection, in vivo sampling, dried blood spot, microwave-, ultrasound- and enzyme-assisted extraction, as well as microextraction techniques. Finally, we provide some conclusions and perspectives for sample preparation in MS-based metabolomics.
Bouktif, Salah; Hanna, Eileen Marie; Zaki, Nazar; Abu Khousa, Eman
2014-01-01
Prediction and classification techniques have been well studied by machine learning researchers and developed for several real-world problems. However, the level of acceptance and success of prediction models is still below expectation due to some difficulties, such as the low performance of prediction models when they are applied in different environments. Such a problem has been addressed by many researchers, mainly from the machine learning community. A second problem, principally raised by model users in different communities, such as managers, economists, engineers, biologists, and medical practitioners, is the prediction models' interpretability. The latter is the ability of a model to explain its predictions and exhibit the causality relationships between the inputs and the outputs. In the case of classification, a successful way to alleviate the low performance is to use ensemble classifiers. It is an intuitive strategy to activate collaboration between different classifiers towards a better performance than an individual classifier. Unfortunately, ensemble classifier methods do not take into account the interpretability of the final classification outcome. They even worsen the original interpretability of the individual classifiers. In this paper we propose a novel implementation of a classifier combination approach that not only promotes the overall performance but also preserves the interpretability of the resulting model. We propose a solution based on Ant Colony Optimization and tailored for the case of Bayesian classifiers. We validate our proposed solution with case studies from the medical domain, namely heart disease and cardiotocography-based predictions, problems where interpretability is critical to make appropriate clinical decisions. The datasets, prediction models and software tool together with supplementary materials are available at http://faculty.uaeu.ac.ae/salahb/ACO4BC.htm.
Yatsushiro, Satoshi; Hirayama, Akihiro; Matsumae, Mitsunori; Kajiwara, Nao; Abdullah, Afnizanfaizal; Kuroda, Kagayaki
2014-01-01
Correlation time mapping based on magnetic resonance (MR) velocimetry has been applied to pulsatile cerebrospinal fluid (CSF) motion to visualize the pressure transmission between CSF at different locations and/or between CSF and arterial blood flow. Healthy volunteer experiments demonstrated that the technique reveals pulsatile CSF motion transmitted from the CSF space in the vicinity of blood vessels with short delays and relatively high correlation coefficients. Experiments on patients and healthy volunteers indicated that the properties of CSF motion in patients differed from those in healthy volunteers. Resultant images in healthy volunteers implied that there were slight individual differences in the locations of the CSF driving sources. Clinical interpretation of these preliminary results is required before the present technique can be applied to classifying the status of hydrocephalus.
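The core of correlation time mapping, estimating the delay and peak correlation between two pulsatile waveforms, can be sketched with a normalized cross-correlation. The toy sinusoidal waveforms and the 60-ms delay are fabricated, and the per-voxel application to MR velocimetry data is omitted.

import numpy as np

def lag_and_correlation(ref, sig, dt):
    """Delay (s) and peak normalized cross-correlation between two
    velocity waveforms sampled at interval dt."""
    ref0 = (ref - ref.mean()) / ref.std()
    sig0 = (sig - sig.mean()) / sig.std()
    xc = np.correlate(sig0, ref0, mode="full") / len(ref)
    k = np.argmax(xc)
    return (k - (len(ref) - 1)) * dt, xc[k]

t = np.linspace(0, 2, 400)                  # two cardiac cycles
ref = np.sin(2 * np.pi * 1.0 * t)           # driving waveform (e.g. arterial flow)
sig = np.sin(2 * np.pi * 1.0 * (t - 0.06))  # CSF motion delayed by 60 ms
print(lag_and_correlation(ref, sig, t[1] - t[0]))

Applied voxel-by-voxel against a reference waveform, the delay and correlation values form exactly the kind of map the abstract describes.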
NASA Technical Reports Server (NTRS)
Hall, M. J.
1981-01-01
An inventory technique based upon using remote sensing technology, interpreting both high altitude aerial photography and LANDSAT multispectral scanner imagery, is discussed. It is noted that once the final land use inventory maps of irrigated agricultural lands are available and appropriately scaled, they may be overlaid directly onto either multispectral scanner or return beam vidicon prints, thereby providing an inexpensive updating procedure.
Signal Processing and Interpretation Using Multilevel Signal Abstractions.
1986-06-01
mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data, which can be processed either as individual one-dimensional waveforms or as multichannel data for source detection and direction finding from microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves…
Qamar, Anthony I.; Malone, Stephen; Moran, Seth C.; Steele, William P.; Thelen, Weston A.; Sherrod, David R.; Scott, William E.; Stauffer, Peter H.
2008-01-01
The rapid onset of energetic seismicity on September 23, 2004, at Mount St. Helens caused seismologists at the Pacific Northwest Seismic Network and the Cascades Volcano Observatory to quickly improve and develop techniques that summarized and displayed seismic parameters for use by scientists and the general public. Such techniques included webicorders (Web-based helicorder-like displays), graphs showing RSAM (real-time seismic amplitude measurements), RMS (root-mean-square) plots, spectrograms, location maps, automated seismic-event detectors, focal mechanism solutions, automated approximations of earthquake magnitudes, RSAM-based alarms, and time-depth plots for seismic events. Many of these visual-information products were made available publicly as Web pages generated and updated routinely. The graphs and maps included short written text that explained the concepts behind them, which increased their value to the nonseismologic community that was tracking the eruption. Laypeople could read online summaries of the scientific interpretations and, if they chose, review some of the basic data, thereby providing a better understanding of the data used by scientists to make interpretations about ongoing eruptive activity, as well as a better understanding of how scientists worked to monitor the volcano.
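RSAM and its RMS cousin reduce a continuous seismic trace to one amplitude value per fixed window, which is what made the web plots described above cheap to generate and easy to read. A minimal sketch, with an invented 100-Hz trace and a synthetic energetic burst:

import numpy as np

def rsam(trace, sr, window_s=60.0):
    """Real-time seismic amplitude measurement: mean absolute amplitude of
    successive fixed-length windows (RMS variant noted in the comment)."""
    n = int(sr * window_s)
    m = len(trace) // n
    w = trace[: m * n].reshape(m, n)
    return np.abs(w).mean(axis=1)          # RSAM
    # RMS alternative: np.sqrt((w ** 2).mean(axis=1))

sr = 100.0                                  # 100 Hz seismometer
x = np.random.default_rng(5).normal(size=int(sr * 600))
x[30000:33000] *= 8                         # a 30-s energetic burst
print(rsam(x, sr).round(2))                 # the burst window stands out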
NASA Astrophysics Data System (ADS)
Kolb, Thomas; Fuchs, Markus; Zöller, Ludwig
2015-04-01
River terraces are widespread geomorphic features of Quaternary landscapes. Besides tectonics, their formation is predominantly controlled by climatic conditions. Changes in either cause changes in fluvial discharge and sediment load. Therefore, fluvial terraces are widely used as important non-continuous sedimentary archives for paleotectonic and paleoenvironmental reconstruction. The informative value of fluvial archives and their significance for paleoenvironmental research, however, strongly depend on a precise dating of the terrace formation. Over the last decades, various luminescence dating techniques have successfully been applied to fluvial deposits and were able to provide reliable age information. In contrast to radiocarbon dating, modern luminescence dating techniques provide an extended dating range, which enables the determination of age information for fluvial and other terrestrial archives far beyond the last glacial-interglacial cycle. Due to the general abundance of quartz and feldspar minerals, there is almost no limitation of dateable material, so that luminescence dating methods can be applied to a wide variety of deposits. When using luminescence dating techniques, however, some methodological difficulties have to be considered. Due to the mechanism of fluvial transport, this is especially true for fluvial sediments, for which two major problems have been identified as the main reasons for incorrect age estimations: (1) incomplete resetting of the luminescence signal during transport and (2) dosimetric inaccuracies as a result of the heterogeneity of terrace gravels. Thus, luminescence dating techniques are still far from being standard methods for dating fluvial archives, and the calculated sedimentation ages always demand a careful interpretation. This contribution reveals some of the difficulties that may occur when luminescence dating techniques are applied to river terraces and illustrates several strategies used for overcoming these problems and for determining correct sedimentation ages. The presented results are based on a case study located in the headwaters of the River Main, the longest right bank tributary of the Rhine drainage system. Here, within an oversized dry valley in Northern Bavaria (Germany), five Pleistocene terraces are distinguished. The terraces are interpreted as the result of a complex landscape evolution, which is characterized by multiple river deflections. The need for a careful interpretation of luminescence results is illustrated by some optically stimulated luminescence (OSL) ages calculated for the youngest of these five Pleistocene terraces. These results show different sedimentation ages for samples originating from the same morphological unit. Thus, these ages may be interpreted as evidence for a diachronic character of river incision and, hence, point to the complexity of fluvial systems' response to climatically and/or tectonically forced changes in local and regional paleoenvironmental conditions.
NASA Astrophysics Data System (ADS)
Grawe, M.; Makela, J. J.
2016-12-01
Airglow imaging of the 630.0-nm redline emission has emerged as a useful tool for studying the properties of tsunami-ionospheric coupling in recent years, offering spatially continuous coverage of the sky with a single instrument. Past studies have shown that airglow signatures induced by tsunamis are inherently anisotropic due to the observation geometry and effects from the geomagnetic field. Here, we present details behind the techniques used to determine the parameters of the signature (orientation, wavelength, etc.), with potential extensions to real or quasi-real time, and a tool for interpreting the location and strength of the signatures in the field of view. We demonstrate application of the techniques to ground-based optical measurements of several tsunami-induced signatures observed over the past five years with an imaging system in Hawaii. Additionally, these methods are extended for use on space-based observation platforms, offering advantages over ground-based installations.
Interpreting Tools by Imagining Their Uses
ERIC Educational Resources Information Center
Kwan, Alistair
2017-01-01
By prompting imagined or actual bodily experience, we can guide interpretation of tools to emphasize the action that those tools perform. The technique requires little more than an extension from looking at an object, to imagining how the body engages with it, perhaps even trying out those specialist postures, to nourish an interpretation centered…
ERIC Educational Resources Information Center
Morgan, Mark
1996-01-01
Describes a field experiment that was designed to test the effects of three different interpretive programs on students' attitudes toward live, nonpoisonous snakes. One of the treatments measured the effectiveness of using "hands-on" interpretive techniques. Direct contact with snakes improved students' attitudes but only slightly. Females'…
A direct-measurement technique for estimating discharge-chamber lifetime. [for ion thrusters
NASA Technical Reports Server (NTRS)
Beattie, J. R.; Garvin, H. L.
1982-01-01
The use of short-term measurement techniques for predicting the wearout of ion thrusters resulting from sputter-erosion damage is investigated. The laminar-thin-film technique is found to provide high precision erosion-rate data, although the erosion rates are generally substantially higher than those found during long-term erosion tests, so that the results must be interpreted in a relative sense. A technique for obtaining absolute measurements is developed using a masked-substrate arrangement. This new technique provides a means for estimating the lifetimes of critical discharge-chamber components based on direct measurements of sputter-erosion depths obtained during short-duration (approximately 1 hr) tests. Results obtained using the direct-measurement technique are shown to agree with sputter-erosion depths calculated for the plasma conditions of the test. The direct-measurement approach is found to be applicable to both mercury and argon discharge-plasma environments and will be useful for estimating the lifetimes of inert gas and extended performance mercury ion thrusters currently under development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norgaard, J.V.; Olsen, D.; Springer, N.
1995-12-31
A new technique for obtaining water-oil capillary pressure curves, based on NMR imaging of the saturation distribution in flooded cores is presented. In this technique, a steady state fluid saturation profile is developed by flooding the core at a constant flow rate. At the steady state situation where the saturation distribution no longer changes, the local pressure difference between the wetting and non-wetting phases represents the capillary pressure. The saturation profile is measured using an NMR technique and for a drainage case, the pressure in the non-wetting phase is calculated numerically. The paper presents the NMR technique and the procedure for calculating the pressure distribution in the sample. Inhomogeneous samples produce irregular saturation profiles, which may be interpreted in terms of variation in permeability, porosity, and capillary pressure. Capillary pressure curves for North Sea chalk obtained by the new technique show good agreement with capillary pressure curves obtained by traditional techniques.
A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis
NASA Astrophysics Data System (ADS)
Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui
2015-07-01
Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences of heart valves. Adapting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five various abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
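A sketch of the envelope-morphological idea: band-limit a toy heart-sound signal with a discrete wavelet transform (PyWavelets), then compute a smoothed Shannon energy envelope from which murmur features could be read off. The sampling rate, wavelet choice, and synthetic signal are assumptions for illustration, not the paper's settings.

import numpy as np
import pywt

def shannon_envelope(x, smooth=64):
    """Normalized Shannon energy envelope: -x^2 log(x^2), then smoothed."""
    xn = x / (np.abs(x).max() + 1e-12)
    e = -xn**2 * np.log(xn**2 + 1e-12)
    kernel = np.ones(smooth) / smooth
    return np.convolve(e, kernel, mode="same")

fs = 2000
t = np.arange(0, 1.0, 1 / fs)
hs = np.sin(2 * np.pi * 60 * t) * (np.exp(-((t - 0.1) ** 2) / 1e-3)
                                   + np.exp(-((t - 0.45) ** 2) / 1e-3))

coeffs = pywt.wavedec(hs, "db6", level=4)            # DWT decomposition
for i in range(2, len(coeffs)):                      # drop fine-scale detail bands
    coeffs[i] = np.zeros_like(coeffs[i])
denoised = pywt.waverec(coeffs, "db6")[: len(hs)]

env = shannon_envelope(denoised)
print(env.argmax() / fs)                             # time of the first sound peak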
Nonideal ultrathin mantle cloak for electrically large conducting cylinders.
Liu, Shuo; Zhang, Hao Chi; Xu, He-Xiu; Cui, Tie Jun
2014-09-01
Based on the concept of the scattering cancellation technique, we propose a nonideal ultrathin mantle cloak that can efficiently suppress the total scattering cross section of an electrically large conducting cylinder (over one free-space wavelength). The cloaking mechanism is investigated in depth based on Mie scattering theory and is simultaneously interpreted from the perspective of far-field bistatic scattering and near-field distributions. We remark that, unlike the perfect transformation-optics-based cloak, this nonideal cloaking technique is mainly designed to minimize simultaneously several scattering multipoles of a relatively large geometry over a considerably broad bandwidth. Numerical simulations and experimental results show that the antiscattering ability of the metasurface gives rise to excellent total scattering reduction of the electrically large cylinder and remarkable electric-field restoration around the cloak. The outstanding cloaking performance, together with the features of ultralow profile, flexibility, and easy fabrication, predicts promising applications at microwave frequencies.
Monitoring real-time navigation processes using the automated reasoning tool (ART)
NASA Technical Reports Server (NTRS)
Maletz, M. C.; Culbert, C. J.
1985-01-01
An expert system is described for monitoring and controlling navigation processes in real-time. The ART-based system features data-driven computation, accommodation of synchronous and asynchronous data, temporal modeling for individual time intervals and chains of time intervals, and hypothetical reasoning capabilities that consider alternative interpretations of the state of navigation processes. The concept is illustrated in terms of the NAVEX system for monitoring and controlling the high speed ground navigation console for Mission Control at Johnson Space Center. The reasoning processes are outlined, including techniques used to consider alternative data interpretations. Installation of the system has permitted using a single operator, instead of three, to monitor the ascent and entry phases of a Shuttle mission.
Multi-quasiparticle excitations in 145Tb
NASA Astrophysics Data System (ADS)
Zheng, Y.; Zhou, X. H.; Zhang, Y. H.; Hayakawa, T.; Oshima, M.; Toh, Y.; Shizuma, T.; Katakura, J.; Hatsukawa, Y.; Matsuda, M.; Kusakari, H.; Sugawara, M.; Furuno, K.; Komatsubara, T.
2004-04-01
High-spin states in 145Tb have been investigated by means of in-beam γ-ray spectroscopy techniques with the 118Sn(32S, 1p4n) reaction. Excitation functions, X-γ-t and γ-γ-t coincidences, and γ-ray anisotropies were measured. A level scheme of 145Tb was established up to Ex ≈ 7 MeV. The level structure shows characteristics of a spherical nucleus. Based on the systematics of level structure in the odd-A N = 80 isotones, the level structure below 2 MeV excitation is interpreted by coupling an h11/2 proton to the excitations in the even-even 144Gd core. Above 2 MeV excitation, most of the yrast levels are interpreted with multi-quasiparticle shell-model configurations.
Geology of the Sklodowska Region, Lunar Farside. M.S. Thesis Final Report
NASA Technical Reports Server (NTRS)
Kauffman, J. D.
1974-01-01
Investigation of an area on the lunar farside has resulted in a geologic map, development of a regional stratigraphic sequence, and interpretation of surface materials. Apollo 15 metric photographs were used in conjunction with photogrammetric techniques to produce a base map to which geologic units were later added. Geologic units were first delineated on the metric photographs and then transferred to the base map. Materials were defined and described from selected Lunar Orbiter and Apollo 15 metric, panoramic, and Hasselblad photographs on the basis of distinctive morphologic characteristics.
Decision Tree Approach for Soil Liquefaction Assessment
Gandomi, Amir H.; Fridline, Mark M.; Roke, David A.
2013-01-01
In the current study, the performances of some decision tree (DT) techniques are evaluated for postearthquake soil liquefaction assessment. A database containing 620 records of seismic parameters and soil properties is used in this study. Three decision tree techniques are used here in two different ways, considering statistical and engineering points of view, to develop decision rules. The DT results are compared to the logistic regression (LR) model. The results of this study indicate that the DTs not only successfully predict liquefaction but they can also outperform the LR model. The best DT models are interpreted and evaluated based on an engineering point of view. PMID:24489498
Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti
2017-08-11
In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for quantitative estimation of material properties.
Functional programming interpreter. M. S. thesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robison, A.D.
1987-03-01
Functional Programming (FP) [BAC87] is an alternative to conventional imperative programming languages. This thesis describes an FP interpreter implementation. Superficially, FP appears to be a simple but very inefficient language. Its simplicity, however, allows it to be interpreted quickly. Much of the inefficiency can be removed by simple interpreter techniques. This thesis describes the Illinois Functional Programming (IFP) interpreter, an interactive functional programming implementation which runs under both MS-DOS and UNIX. The IFP interpreter allows functions to be created, executed, and debugged in an environment very similar to UNIX. IFP's speed is competitive with other interpreted languages such as BASIC.
Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L
2003-11-01
The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.
Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis
NASA Astrophysics Data System (ADS)
Zhou, J.
2018-06-01
The paper presents a novel frequency-domain interpretation of Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and dispenses with open-loop pole distribution, contour/locus orientation, and prior frequency sweeping. Exploiting the technique to alternatively reveal positive realness of transfer functions, a re-interpretation of the Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions are conformable in both scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.
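For reference, the classical scalar Popov condition being re-interpreted can be written in its standard textbook form (not quoted from the paper): for a Lur'e system with stable linear part G(s) and a memoryless nonlinearity confined to the sector [0, k], absolute stability holds if there exists q ≥ 0 such that

$$\operatorname{Re}\bigl[(1 + j\omega q)\,G(j\omega)\bigr] + \frac{1}{k} > 0 \qquad \text{for all } \omega \ge 0.$$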
VISUALIZATION OF CELLULAR PHOSPHOINOSITIDE POOLS WITH GFP-FUSED PROTEIN-DOMAINS
Balla, Tamas; Várnai, Péter
2011-01-01
This unit describes the method of following phosphoinositide dynamics in live cells. Inositol phospholipids have emerged as universal signaling molecules present in virtually every membrane of eukaryotic cells. Phosphoinositides are present only in tiny amounts compared to structural lipids but are metabolically very active, as they are produced and degraded by numerous inositide kinase and phosphatase enzymes. Phosphoinositides control the membrane recruitment and activity of many protein signaling complexes in specific membrane compartments and have been implicated in the regulation of a variety of signaling and trafficking pathways. It has been a challenge to develop methods that allow detection of phosphoinositides at the single-cell level. The only technique available for live-cell applications is based on the use of the same protein domains selected by evolution to recognize cellular phosphoinositides. Some of these isolated protein modules, when fused to fluorescent proteins, can follow dynamic changes in phosphoinositides. While this technique can provide information on phosphoinositide dynamics in live cells with subcellular resolution and has rapidly gained popularity, it also has several limitations that must be taken into account when interpreting the data. Here, we summarize the design and practical use of these constructs and also review important considerations for the interpretation of the data obtained by this technique. PMID:19283730
NASA Astrophysics Data System (ADS)
Sabol, Bruce M.
2005-09-01
There has been a longstanding need for an objective and cost-effective technique to detect, characterize, and quantify submersed aquatic vegetation at spatial scales between direct physical sampling and remote aerial-based imaging. Acoustic-based approaches for doing so are reviewed, and an explicit approach, using a narrow, single-beam echosounder, is described in detail. This heuristic algorithm is based on the spatial distribution of a thresholded signal generated from a high-frequency, narrow-beam echosounder operated in a vertical orientation from a survey boat. The physical basis, rationale, and implementation of this algorithm are described, and data documenting performance are presented. Using this technique, it is possible to generate orders of magnitude more data than would be available using previous techniques with a comparable level of effort. Thus, new analysis and interpretation approaches are called for which can make full use of these data. Several analysis examples are shown from environmental effects application studies. Current operational window and performance limitations are identified, and thoughts on potential processing approaches to improve performance are discussed.
Interpretation reduces ecological impacts of visitors to world heritage site.
Littlefair, Carolyn; Buckley, Ralf
2008-07-01
Minimal-impact interpretation is widely used to reduce the ecological impacts of visitors to protected areas. We tested whether verbal appeals and/or role-model demonstrations of minimal-impact behavior by a trained guide reduced noise, litter, and trampling impacts on hiking trails in a subtropical rainforest. Interpretation did reduce impacts significantly. Different interpretive techniques were more effective for different impacts. The experimental groups were mature, well-educated professionals; interpretation may differ in effectiveness for different visitors. Interpretation by skilled guides can indeed reduce visitor impacts in protected areas, especially if role modeling is combined with verbal appeals.
NASA Technical Reports Server (NTRS)
Cardamone, P.; Lechi, G. M.; Cavallin, A.; Marino, C. M.; Zanferrari, A.
1977-01-01
The results obtained in the study of linears derived from the analysis of LANDSAT 2 images recorded over Friuli during 1975 are described. Particular attention is devoted to the comparison of several passes in different bands, scales, and photographic supports. Moreover, reference is made to aerial photographic interpretation in selected sites and to the information obtained by laser techniques.
Applied photo interpretation for airbrush cartography
NASA Technical Reports Server (NTRS)
Inge, J. L.; Bridges, P. M.
1976-01-01
Lunar and planetary exploration has required the development of new techniques of cartographic portrayal. Conventional photo-interpretive methods employing size, shape, shadow, tone, pattern, and texture are applied to computer-processed satellite television images. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The portrayal of tonal densities in a relief illustration is performed using a unique airbrush technique derived from hill-shading of contour maps. The control of tone and line quality is essential because the mid-gray to dark tone densities must be finalized prior to the addition of highlights to the drawing; the highlights are then added with an electric eraser until the drawing is completed. The drawing density is controlled with a reflectance-reading densitometer to meet certain density guidelines. The versatility of planetary photo-interpretive methods for airbrushed map portrayals is demonstrated by the application of these techniques to the synthesis of nonrelief data.
Application of an artificial neural network to pump card diagnosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashenayi, K.; Lea, J.F.; Kemp, F.
1994-12-01
Beam pumping is the most frequently used artificial-lift technique for oil production. Downhole pump cards are used to evaluate performance of the pumping unit. Pump cards can be generated from surface dynamometer cards using a 1D wave equation with viscous damping, as suggested by Gibbs and Neely. Pump cards contain significant information describing the behavior of the pump. However, interpretation of these cards is tedious and time-consuming; hence, an automated system capable of interpreting these cards could speed interpretation and warn of pump failures. This work presents the results of a DOS-based computer program capable of correctly classifying pump cards. The program uses a hybrid artificial neural network (ANN) to identify significant features of the pump card. The hybrid ANN uses classical and sinusoidal perceptrons. The network is trained using an error-back-propagation technique. The program correctly identified pump problems for more than 180 different training and test pump cards. The ANN takes a total of 80 data points as input. Sixty data points are collected from the pump card perimeter, and the remaining 20 data points represent the slope at selected points on the pump card perimeter. Pump problem conditions are grouped into 11 distinct classes. The network is capable of identifying one or more of these problem conditions for each pump card. Eight examples are presented and discussed.
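A hedged sketch of the classification stage (a plain multilayer perceptron on synthetic cards; the paper's hybrid classical/sinusoidal perceptron network is not reproduced here):

```python
# Hedged sketch: 80-input network (60 perimeter points + 20 slopes per card)
# mapped to 11 problem classes, multi-label as in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(180, 80))                 # 180 synthetic pump cards
Y = (rng.random((180, 11)) > 0.9).astype(int)  # synthetic multi-label problem indicators

clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000).fit(X, Y)
print(clf.predict(X[:3]))                      # 11-bit problem-condition vector per card
```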
Image analysis by integration of disparate information
NASA Technical Reports Server (NTRS)
Lemoigne, Jacqueline
1993-01-01
Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on some remotely sensed imagery from the Landsat Thematic Mapper (TM) sensor.
Comments on "Failures in detecting volcanic ash from a satellite-based technique"
Prata, F.; Bluth, G.; Rose, B.; Schneider, D.; Tupper, A.
2001-01-01
The recent paper by Simpson et al. [Remote Sens. Environ. 72 (2000) 191] on failures to detect volcanic ash using the 'reverse' absorption technique provides a timely reminder of the danger that volcanic ash presents to aviation and the urgent need for some form of effective remote detection. The paper unfortunately suffers from a fundamental flaw in its methodology and numerous errors of fact and interpretation. For the moment, the 'reverse' absorption technique provides the best means for discriminating volcanic ash clouds from meteorological clouds. The purpose of our comment is not to defend any particular algorithm; rather, we point out some problems with Simpson et al.'s analysis and re-state the conditions under which the 'reverse' absorption algorithm is likely to succeed. © 2001 Elsevier Science Inc. All rights reserved.
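The 'reverse' absorption idea itself is compact enough to sketch (standard brightness-temperature-difference form from the literature; the channels and threshold here are illustrative assumptions, not values from the comment):

```python
# Hedged sketch: volcanic ash tends to give a negative 11-minus-12 micron
# brightness temperature difference, whereas meteorological cloud gives a
# positive one.
import numpy as np

def ash_mask(bt11, bt12, threshold=0.0):
    """Flag pixels whose brightness temperature difference T11 - T12 (kelvin)
    falls below the threshold as candidate volcanic ash."""
    return (bt11 - bt12) < threshold

bt11 = np.array([[265.0, 240.0], [255.0, 250.0]])
bt12 = np.array([[263.0, 242.5], [254.0, 252.0]])
print(ash_mask(bt11, bt12))  # True where the difference is negative
```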
The Design of a Quantitative Western Blot Experiment
Taylor, Sean C.; Posch, Anton
2014-01-01
Western blotting is a technique that has been in practice for more than three decades and that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
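The fold-change arithmetic at the heart of the quantitative interpretation can be sketched as follows (an assumed normalization workflow with made-up densitometry values, not the authors' published protocol):

```python
# Hedged sketch: target band intensities are normalized to a loading control
# before fold changes between samples are computed.
def fold_change(target_a, control_a, target_b, control_b):
    """Fold change of sample B relative to sample A after normalization."""
    return (target_b / control_b) / (target_a / control_a)

# e.g. densitometry units read off the blot image
print(fold_change(target_a=1200.0, control_a=800.0,
                  target_b=3000.0, control_b=900.0))  # ~2.2-fold increase
```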
Automated lidar-derived canopy height estimates for the Upper Mississippi River System
Hlavacek, Enrika
2015-01-01
Land cover/land use (LCU) classifications serve as important decision support products for researchers and land managers. The LCU classifications produced by the U.S. Geological Survey’s Upper Midwest Environmental Sciences Center (UMESC) include canopy height estimates that are assigned through manual aerial photography interpretation techniques. In an effort to improve upon these techniques, this project investigated the use of high-density lidar data for the Upper Mississippi River System to determine canopy height. An ArcGIS tool was developed to automatically derive height modifier information based on the extent of land cover features for forest classes. The measurement of canopy height included a calculation of the average height from lidar point cloud data as well as the inclusion of a local maximum filter to identify individual tree canopies. Results were compared to original manually interpreted height modifiers and to field survey data from U.S. Forest Service Forest Inventory and Analysis plots. This project demonstrated the effectiveness of utilizing lidar data to more efficiently assign height modifier attributes to LCU classifications produced by the UMESC.
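A minimal sketch of the two height measurements mentioned above (an assumed implementation on a synthetic canopy height model; UMESC's ArcGIS tool is not reproduced):

```python
# Hedged sketch: polygon-average canopy height plus individual tree tops via a
# local maximum filter on a gridded canopy height model (CHM).
import numpy as np
from scipy.ndimage import maximum_filter

chm = np.random.default_rng(2).uniform(0, 30, size=(50, 50))  # synthetic CHM in meters
mean_height = chm.mean()                                      # average height modifier
local_max = (chm == maximum_filter(chm, size=5)) & (chm > 2)  # tree tops above 2 m
print(mean_height, int(local_max.sum()), "candidate canopies")
```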
NASA Technical Reports Server (NTRS)
Smith, Nathanial T.; Durston, Donald A.; Heineck, James T.
2017-01-01
In support of NASA's Commercial Supersonics Technology (CST) project, a test was conducted in the 9-by-7 ft supersonic section of the NASA Ames Unitary Plan Wind Tunnel (UPWT). The tests were designed to study the interaction of shocks with a supersonic jet characteristic of those that may occur on a commercial supersonic aircraft. Multiple shock generating geometries were tested to examine the interaction dynamics as they pertain to sonic boom mitigation. An integral part of the analyses of these interactions is the interpretation of the data generated from the retroreflective Background Oriented Schlieren (RBOS) imaging technique employed for this test. The regularization-based optical flow methodology used to generate these data is described. Sample results are compared to those using normalized cross-correlation. The reduced noise, additional feature detail, and fewer false artifacts provided by the optical flow technique produced clearer time-averaged images, allowing for better interpretation of the underlying flow phenomena. These images, coupled with pressure signatures in the near field, are used to provide an overview of the detailed interaction flowfields.
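As a stand-in for the displacement estimation step (the paper uses a regularization-based optical flow solver; OpenCV's dense Farneback method, a different algorithm, illustrates the idea of turning a reference/distorted BOS image pair into a displacement field):

```python
# Hedged sketch: dense optical flow between a speckle reference image and a
# synthetically shifted copy, mimicking a refraction-induced displacement.
import numpy as np
import cv2

rng = np.random.default_rng(3)
ref = (rng.random((128, 128)) * 255).astype(np.uint8)  # speckle background, no flow
dist = np.roll(ref, shift=1, axis=1)                   # fake 1-pixel refraction shift

flow = cv2.calcOpticalFlowFarneback(ref, dist, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
print(flow[..., 0].mean())   # mean horizontal displacement, ~1 pixel
```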
Bonner, W.J.; English, T.C.; Haas, R.H.; Feagan, T.R.; McKinley, R.A.
1987-01-01
The Bureau of Indian Affairs (BIA) is responsible for the natural resource management of approximately 52 million acres of Trust lands in the contiguous United States. The lands are distributed in a "patchwork" fashion throughout the country. Management responsibilities on these areas include: minerals, range, timber, fish and wildlife, agricultural, cultural, and archaeological resources. In an age of decreasing natural resources and increasing natural resource values, effective multiple resource management is critical. BIA has adopted a "systems approach" to natural resource management which utilizes Geographic Information System (GIS) technology. The GIS encompasses a continuum of spatial and relational data elements, and includes functional capabilities such as: data collection, data entry, database development, data analysis, database management, display, and report generation. In support of database development activities, BIA and BLM/TGS conducted a cooperative effort to investigate the potential of 1:100,000 scale Thematic Mapper (TM) False Color Composites (FCCs) for providing vegetation information suitable for input to the GIS, to be later incorporated into a generalized Bureau-wide land cover map. Land cover information is critical, as the majority of reservations currently have no land cover information in either map or digital form. This poster outlines an approach which includes the manual interpretation of land cover using TM FCCs, the digitizing of interpreted polygons, and the editing of digital data, based upon ground-truthing exercises. An efficient and cost-effective methodology for generating large-area land cover information is illustrated for the Mineral Strip area on the San Carlos Indian Reservation in Arizona. Techniques which capitalize on the knowledge of local natural resources professionals, while minimizing machine processing requirements, are suggested.
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Thompson, L.G.
1997-02-01
Fracture-closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures (BHPs) during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test, where strong indications of fracture closure are rarely seen. Various techniques are used to extract closure pressure from the flowback-pressure response. Unfortunately, these techniques give different estimates for closure pressure, and their theoretical bases are not well established. The authors present results that place the PIFB test on a firmer foundation. A numerical model is used to simulate the PIFB test and glean physical mechanisms contributing to the response. On the basis of their simulation results, they propose interpretation techniques that give better estimates of closure pressure than existing techniques.
Wavelet Analyses of Oil Prices, USD Variations and Impact on Logistics
NASA Astrophysics Data System (ADS)
Melek, M.; Tokgozlu, A.; Aslan, Z.
2009-07-01
This paper is related to temporal variations of historical oil prices and Dollar and Euro exchange rates in Turkey. Daily data based on OECD and Central Bank of Turkey records beginning from 1946 have been considered. 1D continuous wavelet and wavelet packet analysis techniques have been applied to the data. Wavelet techniques help to detect abrupt changes and increasing and decreasing trends in the data. Estimation of variables has been presented by using linear regression estimation techniques. The results of this study have been compared with the small- and large-scale effects. Truck transportation costs show a variation similar to fuel prices. The second part of the paper is related to estimation of imports, exports, costs, total number of vehicles, and annual variations by considering the temporal variation of oil prices and the Dollar currency in Turkey. Wavelet techniques offer a user-friendly methodology to interpret some local effects on the increasing trend of imports and exports data.
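A minimal sketch of the 1D continuous wavelet step (assumed setup with a synthetic series; PyWavelets stands in for whatever software the authors used):

```python
# Hedged sketch: continuous wavelet transform of a synthetic daily price
# series, exposing trend changes across scales.
import numpy as np
import pywt

t = np.arange(1024)
price = np.cumsum(np.random.default_rng(4).normal(size=t.size)) + 0.02 * t  # drifting series

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(price, scales, 'morl')   # Morlet continuous wavelet transform
print(coeffs.shape)                               # (63, 1024): scale-by-time coefficient map
```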
NASA Astrophysics Data System (ADS)
Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.
Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli, it is shown how agent-based organisational modelling techniques can indeed be used to simulate and analyse E. coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.
Imaging nanoscale lattice variations by machine learning of x-ray diffraction microscopy data
Laanait, Nouamane; Zhang, Zhan; Schlepütz, Christian M.
2016-08-09
In this paper, we present a novel methodology based on machine learning to extract lattice variations in crystalline materials, at the nanoscale, from an x-ray Bragg diffraction-based imaging technique. By employing a full-field microscopy setup, we capture real space images of materials, with imaging contrast determined solely by the x-ray diffracted signal. The data sets that emanate from this imaging technique are a hybrid of real space information (image spatial support) and reciprocal lattice space information (image contrast), and are intrinsically multidimensional (5D). By a judicious application of established unsupervised machine learning techniques and multivariate analysis to this multidimensional data cube, we show how to extract features that can be ascribed physical interpretations in terms of common structural distortions, such as lattice tilts and dislocation arrays. Finally, we demonstrate this 'big data' approach to x-ray diffraction microscopy by identifying structural defects present in an epitaxial ferroelectric thin-film of lead zirconate titanate.
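The unsupervised step might look like the following sketch (an illustration with PCA and k-means on synthetic images; the paper does not name its exact algorithms):

```python
# Hedged sketch: flatten a stack of diffraction-contrast images into per-pixel
# feature vectors, reduce with PCA, and cluster with k-means so clusters can
# be inspected for lattice-tilt-like signatures.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

stack = np.random.default_rng(5).random((40, 64, 64))   # 40 images on a rocking/energy grid
X = stack.reshape(40, -1).T                             # one 40-dim spectrum per pixel

Z = PCA(n_components=5).fit_transform(X)                # linear de-redundancy
labels = KMeans(n_clusters=4, n_init=10).fit_predict(Z) # candidate structural domains
print(labels.reshape(64, 64)[:2])
```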
Color imaging of Mars by the High Resolution Imaging Science Experiment (HiRISE)
Delamere, W.A.; Tornabene, L.L.; McEwen, A.S.; Becker, K.; Bergstrom, J.W.; Bridges, N.T.; Eliason, E.M.; Gallagher, D.; Herkenhoff, K. E.; Keszthelyi, L.; Mattson, S.; McArthur, G.K.; Mellon, M.T.; Milazzo, M.; Russell, P.S.; Thomas, N.
2010-01-01
HiRISE has been producing a large number of scientifically useful color products of Mars and other planetary objects. The three broad spectral bands, coupled with the highly sensitive 14-bit detectors and time delay integration, enable detection of subtle color differences. The very high spatial resolution of HiRISE can augment the mineralogic interpretations based on multispectral (THEMIS) and hyperspectral datasets (TES, OMEGA and CRISM) and thereby enable detailed geologic and stratigraphic interpretations at meter scales. In addition to providing some examples of color images and their interpretation, we describe the processing techniques used to produce them and note some of the minor artifacts in the output. We also provide an example of how HiRISE color products can be effectively used to expand mineral and lithologic mapping provided by CRISM data products that are backed by other spectral datasets. The utility of high quality color data for understanding geologic processes on Mars has been one of the major successes of HiRISE. © 2009 Elsevier Inc.
Asfahani, J; Ahmad, Z; Ghani, B Abdul
2018-07-01
An approach based on self-organizing map (SOM) artificial neural networks is proposed herewith, oriented towards interpreting nuclear and electrical well logging data. The well logging measurements of the Kodana well in Southern Syria have been interpreted by applying the proposed approach. A lithological cross-section model of the basaltic environment has been derived, and four different kinds of basalt have consequently been distinguished. The four basalts are hard massive basalt, hard basalt, pyroclastic basalt, and the alteration basalt product, clay. The results obtained by SOM artificial neural networks are in good agreement with previously published results obtained by other techniques. The SOM approach is applied successfully in the case study of the Kodana well logging data, and can therefore be recommended as a suitable and effective approach for handling huge well logging data sets with a higher number of variables required for lithological discrimination purposes. Copyright © 2018 Elsevier Ltd. All rights reserved.
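A compact SOM training loop, for orientation (a generic implementation on synthetic log vectors; grid size, iterations, and input logs are assumptions, not the paper's configuration):

```python
# Hedged sketch: each depth sample is a vector of log responses; the trained
# map clusters samples into lithology-like groups.
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.random((grid[0], grid[1], data.shape[1]))          # codebook vectors
    gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = ((w - x) ** 2).sum(axis=2)
        by, bx = np.unravel_index(d.argmin(), d.shape)         # best-matching unit
        lr = lr0 * np.exp(-t / iters)                          # decaying learning rate
        sig = sigma0 * np.exp(-t / iters)                      # shrinking neighbourhood
        h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sig ** 2))
        w += lr * h[..., None] * (x - w)                       # pull neighbourhood toward x
    return w

logs = np.random.default_rng(1).random((500, 4))   # e.g. gamma, neutron, density, resistivity
som = train_som(logs)
print(som.shape)                                   # (6, 6, 4) trained codebook
```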
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and one of the most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The responsibility of the treating physician is the proper formulation of the analytical request and providing the laboratory with complete and precise patient information, which are essential prerequisites of proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques, and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of being cured. Besides the gold-standard blood culture technique, microbiologic methods that decrease the time to a relevant result are increasingly utilized today. In the case of certain pathogens, the pathogen can be identified directly from the blood culture bottle after propagation with serological or automated/semi-automated systems, molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption-ionization time-of-flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately about all relevant information when a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnoses, the microbiologist and the clinician should interact directly.
NASA Astrophysics Data System (ADS)
Schneiderwind, S.; Mason, J.; Wiatr, T.; Papanikolaou, I.; Reicherter, K.
2015-09-01
Two normal faults on the island of Crete and mainland Greece were studied to create and test an innovative workflow to make palaeoseismic trench logging more objective and to visualise the sedimentary architecture within the trench wall in 3-D. This is achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to results of iso-cluster analysis of a true colour photomosaic representing the spectrum of visible light. Disadvantages of passive data collection (e.g. illumination) were addressed by complementing the dataset with an active near-infrared backscatter signal image from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log. Accordingly, adjacent stratigraphic units could be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of GPR data collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements, and geometries at depth within the trench wall. Thus, misinterpretation due to cutting effects is minimised. Sedimentary feature geometries related to earthquake magnitude can be used to improve the accuracy of seismic hazard assessments. Therefore, this manuscript combines multiparametric approaches and shows: (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages and a higher objectivity in the interpretation of palaeoseismic and stratigraphic information. The multispectral datasets are stored, allowing unbiased input for future (re-)investigations.
Visualization of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2007-04-01
We developed four new techniques to visualize hyperspectral image data for man-in-the-loop target detection. The methods respectively: (1) display the subsequent bands as a movie ("movie"), (2) map the data onto three channels and display these as a colour image ("colour"), (3) display the correlation between the pixel signatures and a known target signature ("match"), and (4) display the output of a standard anomaly detector ("anomaly"). The movie technique requires no assumptions about the target signature and involves no information loss. The colour technique produces a single image that can be displayed in real-time; a disadvantage of this technique is loss of information. A display of the match between a target signature and the pixel signatures can be interpreted easily and quickly, but this technique relies on precise knowledge of the target signature. The anomaly detector signifies pixels with signatures that deviate from the (local) background. We performed a target detection experiment with human observers to determine their relative performance with the four techniques. The results show that the "match" presentation yields the best performance, followed by "movie" and "anomaly", while performance with the "colour" presentation was the poorest. Each scheme has its advantages and disadvantages and is more or less suited for real-time and post-hoc processing. The rationale is that the final interpretation is best done by a human observer. In contrast to automatic target recognition systems, the interpretation of hyperspectral imagery by the human visual system is robust to noise and image transformations and requires a minimal number of assumptions (about the signature of target and background, target shape, etc.). When more knowledge about target and background is available, this may be used to help the observer interpret the data (aided target detection).
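The "match" display reduces to a per-pixel normalized correlation, sketched below (synthetic cube and signature; the authors' exact formulation is an assumption):

```python
# Hedged sketch: cosine similarity between each pixel's spectrum and a known
# target signature, shown to the observer as a single grey-scale image.
import numpy as np

cube = np.random.default_rng(6).random((100, 100, 50))   # rows x cols x bands
target = np.random.default_rng(7).random(50)             # known target signature

pix = cube.reshape(-1, 50)
num = (pix * target).sum(axis=1)
den = np.linalg.norm(pix, axis=1) * np.linalg.norm(target)
match = (num / den).reshape(100, 100)                    # similarity per pixel
print(match.min(), match.max())                          # display e.g. with plt.imshow(match)
```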
Satellite monitoring of vegetation and geology in semi-arid environments. [Tanzania
NASA Technical Reports Server (NTRS)
Kihlblom, U.; Johansson, D. (Principal Investigator)
1980-01-01
The possibility of mapping various characteristics of the natural environment of Tanzania by various LANDSAT techniques was assessed. Interpretation and mapping were carried out using black and white as well as color infrared images at the scale of 1:250,000. The advantages of several computer techniques were also assessed, including contrast-stretched ratioing, differential edge enhancement, supervised classification, multitemporal classification, and change detection. Results show that the most useful image for interpretation comes from band 5, with additional information being obtained from either band 6 or band 7. The advantages of using color infrared images for interpreting vegetation and geology are so great that black and white should be used only to supplement the colored images.
Research in space physics at the University of Iowa
NASA Technical Reports Server (NTRS)
Vanallen, J. A.
1976-01-01
Energetic particles in outer space and their relationships to the electric, magnetic, and electromagnetic fields associated with the earth, sun, moon, planets, and the interplanetary medium are investigated. Special attention was given to observations from earth and moon satellites and interplanetary spacecraft; phenomenological analysis and interpretation were emphasized. Data also cover ground-based radio astronomical and optical techniques and theoretical problems in plasma physics as relevant to solar, planetary, and interplanetary phenomena.
2016-08-01
[Fragment of a prostate cancer research progress report (award W81XWH-15-1-0202); only partial text was recovered. Trainees gained experience with interpreting generated MS data, synthesis of HPMA polymer, conjugation of a targeting peptide to the polymer, and the effect of drug treatment on prostate cancer cells. Subject terms: prostate cancer, co-polymer, anti-androgen, peptide-based targeting.]
Classification of air quality using fuzzy synthetic multiplication.
Abdullah, Lazim; Khalid, Noor Dalina
2012-11-01
Proper identification of the environment's air quality based on limited observations is an essential task to meet the goals of environmental management. Various classification methods have been used to estimate the change of air quality status and health. However, discrepancies frequently arise from the lack of clear distinction between each air quality class, the uncertainty in the quality criteria employed, and the vagueness or fuzziness embedded in the decision-making output values. Owing to inherent imprecision, difficulties always exist in some conventional methodologies when describing integrated air quality conditions with respect to various pollutants. Therefore, this paper presents two fuzzy synthetic multiplication techniques to establish a classification of air quality. The fuzzy multiplication technique employs the max-min operations for "or" and "and" in executing the fuzzy arithmetic operations. A set of air pollutant data (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter (PM10)) collected from a network of 51 stations in the Klang Valley, East Malaysia, Sabah, and Sarawak was utilized in this evaluation. The two fuzzy multiplication techniques consistently classified Malaysia's air quality as "good." The findings indicated that the techniques may have successfully harmonized inherent discrepancies and interpreted complex conditions. It was demonstrated that fuzzy synthetic multiplication techniques are quite appropriate for air quality management.
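A hedged sketch of fuzzy synthetic evaluation with max-min composition (illustrative weights and membership grades, not the paper's calibrated values):

```python
# Hedged sketch: b_j = max_i min(w_i, R_ij) combines pollutant weights with
# membership grades in each air quality class.
import numpy as np

# rows: CO, SO2, NO2, O3, PM10; columns: membership in {good, moderate, poor}
R = np.array([[0.8, 0.2, 0.0],
              [0.7, 0.3, 0.0],
              [0.6, 0.3, 0.1],
              [0.5, 0.4, 0.1],
              [0.4, 0.4, 0.2]])
w = np.array([0.5, 0.4, 0.6, 0.6, 0.8])        # hypothetical pollutant weights

b = np.max(np.minimum(w[:, None], R), axis=0)  # max-min composition over pollutants
print(b, "->", ["good", "moderate", "poor"][int(b.argmax())])
```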
Cox, Martine Elizabeth; Small, Hannah Julie; Boyes, Allison W; O'Brien, Lorna; Rose, Shiho Karina; Baker, Amanda L; Henskens, Frans A; Kirkwood, Hannah Naomi; Roach, Della M
2017-01-01
Background Web-based typed exchanges are increasingly used by professionals to provide emotional support to patients. Although some empirical evidence exists to suggest that various strategies may be used to convey emotion during Web-based text communication, there has been no critical review of these data in patients with chronic conditions. Objectives The objective of this review was to identify the techniques used to convey emotion in written or typed Web-based communication and assess the empirical evidence regarding impact on communication and psychological outcomes. Methods An electronic search of databases, including MEDLINE, CINAHL, PsycINFO, EMBASE, and the Cochrane Library, was conducted to identify literature published from 1990 to 2016. Searches were also conducted using Google Scholar, manual searching of reference lists of identified papers, and manual searching of tables of contents for selected relevant journals. Data extraction and coding were completed by 2 reviewers (10.0% [573/5731] of screened papers at the abstract/title screening stage; 10.0% [69/694] of screened papers at the full-text screening stage). Publications were assessed against the eligibility criteria and excluded if they were duplicates, were not published in English, were published before 1990, referenced animal or nonhuman subjects, did not describe original research, were not journal papers, or did not empirically test the effect of one or more nonverbal communication techniques (e.g., smileys, emoticons, emotional bracketing, voice accentuation, trailers [ellipsis], and pseudowords) as part of Web-based or typed communication on communication-related variables, including message interpretation, social presence, the nature of the interaction (e.g., therapeutic alliance), patient perceptions of the interaction (e.g., participant satisfaction), or psychological outcomes, including depression, anxiety, and distress. Results A total of 6902 unique publications were identified. Of these, six publications met the eligibility criteria and were included in a narrative synthesis. All six studies addressed the effect of smileys or emoticons on participant responses, message interpretation, or social presence of the writer. None of these studies specifically targeted chronic conditions. It was found that emoticons were more effective in influencing the emotional impact of a message than no cue, and that smileys and emoticons were able to convey a limited amount of emotion. No studies addressed other techniques for conveying emotion in written communication. No studies addressed the effects of any techniques on the nature of the interaction (e.g., therapeutic alliance), patient perceptions of the interaction (e.g., participant satisfaction), or psychological outcomes (depression, anxiety, or distress). Conclusions There is a need for greater empirical attention to the effects of the various proposed techniques for conveying emotion in Web-based typed communications to inform health service providers regarding best-practice communication skills in this setting. PMID:29066426
NASA Astrophysics Data System (ADS)
Bauer, Klaus; Pussak, Marcin; Stiller, Manfred; Bujakowski, Wieslaw
2014-05-01
Self-organizing maps (SOM) are neural network techniques which can be used for the joint interpretation of multi-disciplinary data sets. In this investigation we apply SOM within a geothermal exploration project using 3D seismic reflection data. The study area is located in the central part of the Polish basin. Several sedimentary target horizons were identified at this location based on fluid flow rate measurements in the geothermal research well Kompina-2. The general objective is a seismic facies analysis and characterization of the major geothermal target reservoir. A 3D seismic reflection experiment with a sparse acquisition geometry was carried out around well Kompina-2. Conventional signal processing (amplitude corrections, filtering, spectral whitening, deconvolution, static corrections, muting) was followed by normal-moveout (NMO) stacking, and, alternatively, by common-reflection-surface (CRS) stacking. Different signal attributes were then derived from the stacked images including root-mean-square (RMS) amplitude, instantaneous frequency and coherency. Furthermore, spectral decomposition attributes were calculated based on the continuous wavelet transform. The resulting attribute maps along major target horizons appear noisy after the NMO stack and clearly structured after the CRS stack. Consequently, the following SOM-based multi-parameter signal attribute analysis was applied only to the CRS images. We applied our SOM work flow, which includes data preparation, unsupervised learning, segmentation of the trained SOM using image processing techniques, and final application of the learned knowledge. For the Lower Jurassic target horizon Ja1 we derived four different clusters with distinct seismic attribute signatures. As the most striking feature, a corridor parallel to a fault system was identified, which is characterized by decreased RMS amplitudes and low frequencies. In our interpretation we assume that this combination of signal properties can be explained by increased fracture porosity and enhanced fluid saturation within this part of the Lower Jurassic sandstone horizon. Hence, we suggest that a future drilling should be carried out within this compartment of the reservoir.
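Two of the named attributes have standard definitions that can be sketched directly (assumed to match the paper's usage; synthetic trace):

```python
# Hedged sketch: windowed RMS amplitude and instantaneous frequency from the
# analytic (Hilbert-transformed) trace.
import numpy as np
from scipy.signal import hilbert

dt = 0.004                                             # 4 ms sampling
trace = np.sin(2 * np.pi * 30 * np.arange(500) * dt)   # synthetic 30 Hz trace

rms = np.sqrt(np.mean(trace[100:150] ** 2))            # windowed RMS amplitude
analytic = hilbert(trace)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi * dt)          # ~30 Hz everywhere
print(rms, inst_freq[200])
```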
NASA Astrophysics Data System (ADS)
Weidlich, O.; Bernecker, M.
2004-04-01
Measurements of laminations from marine and limnic sediments are commonly a time-consuming procedure. However, the resulting quantitative proxies are of importance for the interpretation of both climate changes and paleo-seismic activities. Digital image analysis accelerates the generation and interpretation of large data sets from laminated sediments based on contrasting grey values of dark and light laminae. Statistical transformation and correlation of the grey value signals reflect high-frequency cycles due to changing mean laminae thicknesses, and thus provide data for monitoring climate change. Perturbations (e.g., slumping structures, seismites, and tsunamites) of the commonly continuous laminae record seismic activities and yield proxies for paleo-earthquake frequency. Using outcrop data from (i) the Pleistocene Lisan Formation of Jordan (Dead Sea Basin) and (ii) the Carboniferous-Permian Copacabana Formation of Bolivia (Lake Titicaca), we present a two-step approach to gain high-resolution time series based on field data for both purposes from unconsolidated and lithified outcrops. Step 1 concerns the construction of a continuous digital phototransect, and step 2 covers the creation of a grey density curve based on digital photos along a line transect using image analysis. The applied automated image analysis technique provides a continuous digital record of the studied sections and therefore serves as a useful tool for the evaluation of further proxy data. Analysing the obtained grey signal of the light and dark laminae of varves using phototransects, we discuss the potential and limitations of the proposed technique.
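Step 2 can be sketched as follows (an assumed implementation on a synthetic grey-value transect, not the authors' code):

```python
# Hedged sketch: sample grey values along a line transect and pick dark
# laminae as minima of the signal; their spacing gives a thickness series.
import numpy as np
from scipy.signal import find_peaks

depth = np.linspace(0, 1, 2000)
grey = 128 + 60 * np.sin(2 * np.pi * 40 * depth)       # synthetic laminated signal
grey += np.random.default_rng(8).normal(0, 5, depth.size)

dark_laminae, _ = find_peaks(-grey, distance=20)       # dark bands = grey-value minima
thickness = np.diff(depth[dark_laminae])               # lamina thickness series
print(len(dark_laminae), thickness.mean())
```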
MR signal intensity: staying on the bright side in MR image interpretation
Bloem, Johan L; Reijnierse, Monique; Huizinga, Tom W J
2018-01-01
In 2003, the Nobel Prize for Medicine was awarded for contributions to the invention of MRI, reflecting the incredible value of MRI for medicine. Since 2003, enormous technical advancements have been made in acquiring MR images. However, MRI has a complicated, accident-prone dark side: images are not calibrated, and the resulting images depend on all kinds of subjective choices in the settings of the machine, acquisition technique parameters, reconstruction techniques, data transmission, filtering, and postprocessing techniques. The bright side is that understanding MR techniques increases opportunities to unravel characteristics of tissue. In this viewpoint, we summarise the different subjective choices that can be made to generate MR images and stress the importance of communication between radiologists and rheumatologists to correctly interpret images.
Mapping land cover from satellite images: A basic, low cost approach
NASA Technical Reports Server (NTRS)
Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.
1978-01-01
Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique, was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover, especially relative to the small investment in skill, time, and money needed to produce the products.
Logan, Heather; Wolfaardt, Johan; Boulanger, Pierre; Hodgetts, Bill; Seikaly, Hadi
2013-06-19
It is important to understand the perceived value of surgical design and simulation (SDS) amongst surgeons, as this will influence its implementation in clinical settings. The purpose of the present study was to examine the application of the convergent interview technique in the field of surgical design and simulation and evaluate whether the technique would uncover new perceptions of virtual surgical planning (VSP) and medical models not discovered by other qualitative case-based techniques. Five surgeons were asked to participate in the study. Each participant was interviewed following the convergent interview technique. After each interview, the interviewer interpreted the information by seeking agreements and disagreements among the interviewees in order to understand the key concepts in the field of SDS. Fifteen important issues were extracted from the convergent interviews. In general, the convergent interview was an effective technique in collecting information about the perception of clinicians. The study identified three areas where the technique could be improved upon for future studies in the SDS field.
Three-dimensional waveform sensitivity kernels
NASA Astrophysics Data System (ADS)
Marquering, Henk; Nolet, Guust; Dahlen, F. A.
1998-03-01
The sensitivity of intermediate-period (~10-100s) seismic waveforms to the lateral heterogeneity of the Earth is computed using an efficient technique based upon surface-wave mode coupling. This formulation yields a general, fully fledged 3-D relationship between data and model without imposing smoothness constraints on the lateral heterogeneity. The calculations are based upon the Born approximation, which yields a linear relation between data and model. The linear relation ensures fast forward calculations and makes the formulation suitable for inversion schemes; however, higher-order effects such as wave-front healing are neglected. By including up to 20 surface-wave modes, we obtain Fréchet, or sensitivity, kernels for waveforms in the time frame that starts at the S arrival and which includes direct and surface-reflected body waves. These 3-D sensitivity kernels provide new insights into seismic-wave propagation, and suggest that there may be stringent limitations on the validity of ray-theoretical interpretations. Even recently developed 2-D formulations, which ignore structure out of the source-receiver plane, differ substantially from our 3-D treatment. We infer that smoothness constraints on heterogeneity, required to justify the use of ray techniques, are unlikely to hold in realistic earth models. This puts the use of ray-theoretical techniques into question for the interpretation of intermediate-period seismic data. The computed 3-D sensitivity kernels display a number of phenomena that are counter-intuitive from a ray-geometrical point of view: (1) body waves exhibit significant sensitivity to structure up to 500km away from the source-receiver minor arc; (2) significant near-surface sensitivity above the two turning points of the SS wave is observed; (3) the later part of the SS wave packet is most sensitive to structure away from the source-receiver path; (4) the sensitivity of the higher-frequency part of the fundamental surface-wave mode is wider than for its faster, lower-frequency part; (5) delayed body waves may considerably influence fundamental Rayleigh and Love waveforms. The strong sensitivity of waveforms to crustal structure due to fundamental-mode-to-body-wave scattering precludes the use of phase-velocity filters to model body-wave arrivals. Results from the 3-D formulation suggest that the use of 2-D and 1-D techniques for the interpretation of intermediate-period waveforms should seriously be reconsidered.
Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min
2017-10-25
Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening the LSPIT's turnaround time, are of great business value and in great demand. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction.
NASA Technical Reports Server (NTRS)
Nichols, J. D.; Gialdini, M.; Jaakkola, S.
1974-01-01
A quasi-operational study was conducted, demonstrating that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs per acre for the inventory procedure, at 1.1 cents/acre, compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.
Nandigam, Ravi K; Kim, Sangtae; Singh, Juswinder; Chuaqui, Claudio
2009-05-01
The desire to exploit structural information to aid structure-based design and virtual screening led to the development of the interaction fingerprint for analyzing, mining, and filtering the binding patterns underlying complex 3D data. In this paper we introduce a new approach, weighted SIFt (or w-SIFt), extending the concept of SIFt to capture the relative importance of different binding interactions. The methodology presented here for determining the weights in w-SIFt involves utilizing a dimensionality reduction technique for eliminating linear redundancies in the data, followed by a stochastic optimization. We find that the relative weights of the fingerprint bits provide insight into what interactions are critical in determining inhibitor potency. Moreover, the weighted interaction fingerprint can serve as an interpretable, position-dependent scoring function for ligand-protein interactions.
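The scoring itself reduces to a weight-bit dot product, sketched below (hypothetical weights and bits; the paper derives its weights via dimensionality reduction plus stochastic optimization, which is not reproduced here):

```python
# Hedged sketch: a weighted interaction fingerprint score as a dot product of
# learned per-interaction weights with a ligand's SIFt bits.
import numpy as np

w = np.array([0.9, 0.1, 0.6, 0.0, 0.4])   # hypothetical learned weights
fp_ligand = np.array([1, 1, 0, 1, 1])     # SIFt bits, e.g. H-bond or hydrophobic contacts

score = float(w @ fp_ligand)              # interpretable, position-dependent score
print(score)
```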
Litho-structural analysis of eastern part of Ilesha schist belt, Southwestern Nigeria
NASA Astrophysics Data System (ADS)
Fagbohun, Babatunde Joseph; Adeoti, Blessing; Aladejana, Olabanji Odunayo
2017-09-01
The Ilesha schist belt is an excellent example of a high-strain shear belt within the basement complex of southwestern Nigeria, which is part of the larger West African Shield. The Ilesha schist belt is characterised by metasediment-metavolcanic, migmatite-gneiss, and older granite rocks and by the occurrence of a shear zone which has been traced to and correlated with the central Hoggar Neoproterozoic shear zone as part of the Trans-Saharan Belt. Although the area is interesting in terms of geologic-tectonic setting, detailed geological assessment and structural interpretation of features in this area is lacking due to accessibility problems. For these reasons, we applied principal component analysis (PCA) and band ratio (BR) techniques to Landsat 8 OLI data for lithological discrimination, while for structural interpretation, filtering techniques of edge enhancement and edge detection were applied to a digital elevation model (DEM) acquired by the shuttle radar topographic mission (SRTM) sensor. The PCA outperformed BR for discrimination between quartzite and granite, which are the most exposed rock units in the area. For structural interpretation, the DEM was used to generate a shaded relief model and edge maps, which enabled detailed structural interpretation. Geologic fieldwork was further conducted to validate structures and units identified from image processing. Based on image interpretation, three deformation events were identified. The first event (D1), which is mainly a ductile deformation, produced foliations and folds whose axial planes trend NNE-SSW. The second event (D2) resulted in reactivation and rotation of the D1 structures, particularly the folds, in the NE-SW direction. The third event (D3) produced a transgressive deformation starting with ductile deformation, resulting in the development of sigmoidal structures oriented NE-SW to E-W, and brittle deformation occurring at later stages, producing fractures oriented E-W to NE-SW. These results have important implications for regional tectonics and geological mapping, as well as in land-use planning and other areas such as hydrogeology or geotechnics.
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on programming of logic controllers. It is important that the programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach, we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control-interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
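A toy sketch of the rule-to-C idea (not the authors' generator; place and transition names are hypothetical), emitting C if-statements as strings from a two-transition net:

```python
# Hedged sketch: each Petri net transition becomes a C if-statement that
# tests its input places and fires by moving tokens, emitted as text for an
# AVR-style main loop.
transitions = [
    {"name": "t1", "inputs": ["p0"], "outputs": ["p1"]},
    {"name": "t2", "inputs": ["p1"], "outputs": ["p0"]},
]

def emit_c(transitions):
    lines = ["/* generated transition rules */"]
    for t in transitions:
        cond = " && ".join(t["inputs"])                     # all input places marked
        body = "".join(f" {p} = 0;" for p in t["inputs"])   # consume tokens
        body += "".join(f" {p} = 1;" for p in t["outputs"]) # produce tokens
        lines.append(f"if ({cond}) {{{body} }} /* {t['name']} */")
    return "\n".join(lines)

print(emit_c(transitions))
```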
Application of borehole geophysics to water-resources investigations
Keys, W.S.; MacCary, L.M.
1971-01-01
This manual is intended to be a guide for hydrologists using borehole geophysics in ground-water studies. The emphasis is on the application and interpretation of geophysical well logs, and not on the operation of a logger. It describes in detail those logging techniques that have been utilized within the Water Resources Division of the U.S. Geological Survey, and those used in petroleum investigations that have potential application to hydrologic problems. Most of the logs described can be made by commercial logging service companies, and many can be made with small water-well loggers. The general principles of each technique and the rules of log interpretation are the same, regardless of differences in instrumentation. Geophysical well logs can be interpreted to determine the lithology, geometry, resistivity, formation factor, bulk density, porosity, permeability, moisture content, and specific yield of water-bearing rocks, and to define the source, movement, and chemical and physical characteristics of ground water. Numerous examples of logs are used to illustrate applications and interpretation in various ground-water environments. The interrelations between various types of logs are emphasized, and the following aspects are described for each of the important logging techniques: Principles and applications, instrumentation, calibration and standardization, radius of investigation, and extraneous effects.
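As one concrete example of the kind of log-interpretation rule the manual covers, Archie's relation links the formation factor from resistivity logs to porosity. This is a standard petrophysical relation, not code from the manual; the constants a and m are assumed illustrative values that vary by rock type.

```python
def porosity_from_logs(Ro, Rw, a=1.0, m=2.0):
    """Archie's relation: formation factor F = Ro/Rw = a * phi**(-m).

    Ro: resistivity of the fully water-saturated rock (ohm-m)
    Rw: resistivity of the formation water (ohm-m)
    Returns the porosity phi implied by the measured formation factor.
    """
    F = Ro / Rw
    return (a / F) ** (1.0 / m)

print(porosity_from_logs(Ro=40.0, Rw=0.5))   # ~0.11 for this toy pair of readings
```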
Strategies for Dealing with Missing Accelerometer Data.
Stephens, Samantha; Beyene, Joseph; Tremblay, Mark S; Faulkner, Guy; Pullnayegum, Eleanor; Feldman, Brian M
2018-05-01
Missing data is a universal research problem that can affect studies examining the relationship between physical activity measured with accelerometers and health outcomes. Statistical techniques are available to deal with missing data; however, available techniques have not been synthesized. A scoping review was conducted to summarize the advantages and disadvantages of identified methods of dealing with missing data from accelerometers. Missing data poses a threat to the validity and interpretation of trials using physical activity data from accelerometry. Imputation using multiple imputation techniques is recommended to deal with missing data and improve the validity and interpretation of studies using accelerometry. Copyright © 2018 Elsevier Inc. All rights reserved.
Visualization of volumetric seismic data
NASA Astrophysics Data System (ADS)
Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk
2015-04-01
Mostly driven by demands of high-quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods have proved to be valuable for the analysis of 3D seismic data cubes - especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steeply dipping faults, causing a lot of diffractions. Without further preprocessing these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use of the commercially available general-purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.
de la Vega de León, Antonio; Bajorath, Jürgen
2016-09-01
The concept of chemical space is of fundamental relevance for medicinal chemistry and chemical informatics. Multidimensional chemical space representations are coordinate-based. Chemical space networks (CSNs) have been introduced as a coordinate-free representation. A computational approach is presented for the transformation of multidimensional chemical space into CSNs. The design of transformation CSNs (TRANS-CSNs) is based upon a similarity function that directly reflects distance relationships in original multidimensional space. TRANS-CSNs provide an immediate visualization of coordinate-based chemical space and do not require the use of dimensionality reduction techniques. At low network density, TRANS-CSNs are readily interpretable and make it possible to evaluate structure-activity relationship information originating from multidimensional chemical space.
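A hedged sketch of the general idea (not the authors' TRANS-CSN design): derive pairwise similarities directly from distances in a multidimensional descriptor space, then keep only the top few percent of pairs as edges so the resulting coordinate-free network stays at low density and remains interpretable. The descriptor values and the distance-to-similarity map are invented.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
coords = rng.normal(size=(60, 10))            # 60 compounds in a 10-D descriptor space (toy)

d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
sim = 1.0 / (1.0 + d)                         # one simple distance-to-similarity mapping

G = nx.Graph()
G.add_nodes_from(range(len(coords)))
thr = np.quantile(sim[np.triu_indices(60, 1)], 0.97)   # keep only the ~3% most similar pairs
for i in range(60):
    for j in range(i + 1, 60):
        if sim[i, j] >= thr:
            G.add_edge(i, j)
print(G.number_of_edges(), nx.number_connected_components(G))
```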
Quantitative Image Analysis Techniques with High-Speed Schlieren Photography
NASA Technical Reports Server (NTRS)
Pollard, Victoria J.; Herron, Andrew J.
2017-01-01
Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
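A hedged sketch of the kind of frequency-content analysis described: track the mean intensity of a pixel window across high-speed schlieren frames and take its power spectrum to find the dominant oscillation frequency. The frame rate and the toy shock-oscillation signal standing in for the windowed image series are assumptions.

```python
import numpy as np

fps = 10000.0                                     # assumed camera frame rate (frames/s)
t = np.arange(2048) / fps
roi_mean = 0.5 + 0.1 * np.sin(2 * np.pi * 850.0 * t)   # toy ROI-averaged intensity series

sig = roi_mean - roi_mean.mean()                  # remove DC level
spec = np.abs(np.fft.rfft(sig * np.hanning(sig.size))) ** 2   # windowed power spectrum
freqs = np.fft.rfftfreq(sig.size, d=1.0 / fps)
print(freqs[np.argmax(spec)])                     # ~850 Hz dominant unsteady-flow frequency
```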
NASA Technical Reports Server (NTRS)
Schmer, F. A. (Principal Investigator); Isakson, R. E.; Eidenshink, J. C.
1977-01-01
The author has identified the following significant results. Successful operational applications of LANDSAT data were found for level 1 land use mapping, drainage network delineation, and aspen mapping. Visual LANDSAT interpretation using 1:125,000 color composite imagery was the least expensive method of obtaining timely level 1 land use data. With an average agricultural/rangeland interpretation accuracy in excess of 80%, such a data source was considered the most cost effective of those sources available to state agencies. Costs do not compare favorably with those incurred using the present method of extracting land use data from historical tabular summaries. The cost increase in advancing from the present procedure to a satellite-based data source was justified in terms of expanded data content.
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1994-01-01
The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
Interpretative Communities in Conflict: A Master Syllabus for Political Communication.
ERIC Educational Resources Information Center
Smith, Craig Allen
1992-01-01
Advocates the interpretive communities approach to teaching political communication. Discusses philosophical issues in the teaching of political communication courses, and pedagogical techniques (including concepts versus cases, clustering examples, C-SPAN video examples, and simulations and games). (SR)
Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.
2017-01-01
This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based, plastic-based autoradiography and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performances of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution and feasibility of activity quantification. The obtained autoradiographs presented similar actinide distribution at low magnification. Out of the three techniques, emulsion autoradiography is the only one to provide a highly-resolved image of the actinide distribution inherently superimposed on the biological sample. Emulsion autoradiography is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled the differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy. The most recent technique, the iQID camera, presents several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary and the choice of the technique depends on the purpose of the imaging experiment. PMID:29023595
Techniques for interpretation of geoid anomalies
NASA Technical Reports Server (NTRS)
Chapman, M. E.
1979-01-01
For purposes of geological interpretation, techniques are developed to compute directly the geoid anomaly over models of density within the earth. Ideal bodies such as line segments, vertical sheets, and rectangles are first used to calculate the geoid anomaly. Realistic bodies are modeled with formulas for two-dimensional polygons and three-dimensional polyhedra. By using Fourier transform methods the two-dimensional geoid is seen to be a filtered version of the gravity field, in which the long-wavelength components are magnified and the short-wavelength components diminished.
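A small sketch consistent with the filtering relationship described: in the wavenumber domain the geoid is N(k) = Δg(k)/(γ|k|), so dividing the gravity spectrum by |k| magnifies long wavelengths and diminishes short ones. The 1-D toy profile and amplitudes are invented; γ is mean gravity.

```python
import numpy as np

gamma = 9.81                          # mean gravity, m/s^2
x = np.linspace(0, 400e3, 1024)       # 400 km profile
dg = 1e-3 * (np.sin(2*np.pi*x/200e3)  # long-wavelength gravity anomaly component
             + np.sin(2*np.pi*x/20e3))  # short-wavelength component (both ~0.1 mGal scale)

k = np.fft.rfftfreq(x.size, d=x[1]-x[0]) * 2*np.pi   # wavenumber, rad/m
G = np.fft.rfft(dg)
Nk = np.zeros_like(G)
Nk[1:] = G[1:] / (gamma * k[1:])      # 1/k filter; skip k=0 (mean level undefined)
N = np.fft.irfft(Nk, n=x.size)        # geoid anomaly in metres
print(N.max())                        # the 200 km component dominates the geoid
```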
Lewiss, Resa E; Chan, Wilma; Sheng, Alexander Y; Soto, Jorge; Castro, Alexandra; Meltzer, Andrew C; Cherney, Alan; Kumaravel, Manickam; Cody, Dianna; Chen, Esther H
2015-12-01
The appropriate selection and accurate interpretation of diagnostic imaging is a crucial skill for emergency practitioners. To date, the majority of the published literature and research on competency assessment comes from the subspecialty of point-of-care ultrasound. A group of radiologists, physicists, and emergency physicians convened at the 2015 Academic Emergency Medicine consensus conference to discuss and prioritize a research agenda related to education, assessment, and competency in ordering and interpreting diagnostic imaging. A set of questions for the continued development of an educational curriculum on diagnostic imaging for trainees and competency assessment using specific assessment methods based on current best practices was delineated. The research priorities were developed through an iterative consensus-driven process using a modified nominal group technique that culminated in an in-person breakout session. The four recommendations are: 1) develop a diagnostic imaging curriculum for emergency medicine (EM) residency training; 2) develop, study, and validate tools to assess competency in diagnostic imaging interpretation; 3) evaluate the role of simulation in education, assessment, and competency measures for diagnostic imaging; and 4) study the American College of Radiology Appropriateness Criteria, an evidence-based, peer-reviewed resource for determining the use of diagnostic imaging, to maximize its value in EM. In this article, the authors review the supporting reliability and validity evidence and make specific recommendations for future research on the education, competency, and assessment of learning diagnostic imaging. © 2015 by the Society for Academic Emergency Medicine.
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is, what heterogeneity is and how it affects meta-analysis, effect size, the modeling techniques of meta-analysis, and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and available software, special cases for meta-analysis (subgroup analysis, individual patient data, and meta-regression), and a discussion of criticisms are included.
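A minimal numerical sketch of the quantities the guide explains: an inverse-variance fixed-effect pooled estimate, Cochran's Q and I² for heterogeneity, and a DerSimonian-Laird random-effects estimate (one standard random-effects model). The effect sizes and variances are invented.

```python
import numpy as np

y = np.array([0.30, 0.45, 0.10, 0.60])      # study effect sizes (toy values)
v = np.array([0.02, 0.04, 0.03, 0.05])      # within-study variances

w = 1 / v
fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled estimate
Q = np.sum(w * (y - fixed) ** 2)            # Cochran's Q statistic
df = len(y) - 1
I2 = max(0.0, (Q - df) / Q) * 100           # I^2: % of variation due to heterogeneity

# DerSimonian-Laird between-study variance, then random-effects pooling.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (v + tau2)
random_eff = np.sum(w_re * y) / np.sum(w_re)
print(round(fixed, 3), round(I2, 1), round(random_eff, 3))
```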
Protein free energy landscapes from long equilibrium simulations
NASA Astrophysics Data System (ADS)
Piana-Agostinetti, Stefano
Many computational techniques based on molecular dynamics (MD) simulation can be used to generate data to aid in the construction of protein free energy landscapes with atomistic detail. Unbiased, long, equilibrium MD simulations--although computationally very expensive--are particularly appealing, as they can provide direct kinetic and thermodynamic information on the transitions between the states that populate a protein free energy surface. It can be challenging to know how to analyze and interpret even results generated by this direct technique, however. I will discuss approaches we have employed, using equilibrium MD simulation data, to obtain descriptions of the free energy landscapes of proteins ranging in size from tens to thousands of amino acids.
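A sketch of the most basic landscape construction from an equilibrium trajectory: bin a reaction coordinate and convert populations to free energies via F = -kT ln P. The synthetic double-well trajectory stands in for real MD output; the choice of coordinate and binning are analysis decisions of the kind the abstract alludes to.

```python
import numpy as np

kT = 2.494                                   # kJ/mol at 300 K
rng = np.random.default_rng(1)
# Toy trajectory sampling two basins with 80/20 populations:
traj = np.concatenate([rng.normal(-1, 0.3, 80000), rng.normal(1, 0.3, 20000)])

hist, edges = np.histogram(traj, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0                              # avoid log(0) in empty bins
F = -kT * np.log(hist[mask])                 # Boltzmann inversion
F -= F.min()                                 # set the global minimum to zero
print(centers[mask][np.argmin(F)])           # location of the dominant basin
```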
Insight, working through, and practice: the role of procedural knowledge.
Rosenblatt, Allan
2004-01-01
A conception of insight is proposed, based on a systems and information-processing framework and using current neuroscience concepts, as an integration of information that results in a new symbolization of experience with a significant change in self-image and a transformation of non-declarative procedural knowledge into declarative knowledge. Since procedural memory and knowledge, seen to include emotional and relationship issues, is slow to change, durable emotional and behavioral change often requires repeated practice, a need not explicitly addressed in standard psychoanalytic technique. Working through is thus seen as also encompassing nondynamic factors. The application of these ideas to therapeutic technique suggests possible therapeutic interventions beyond interpretation. An illustrative clinical vignette is presented.
Solidification kinetics of a Cu-Zr alloy: ground-based and microgravity experiments
NASA Astrophysics Data System (ADS)
Galenko, P. K.; Hanke, R.; Paul, P.; Koch, S.; Rettenmayr, M.; Gegner, J.; Herlach, D. M.; Dreier, W.; Kharanzhevski, E. V.
2017-04-01
Experimental and theoretical results obtained in the MULTIPHAS project (ESA-European Space Agency and DLR-German Aerospace Center) are critically discussed regarding the solidification kinetics of congruently melting and glass-forming Cu50Zr50 alloy samples. The samples are investigated during solidification using a containerless technique in the Electromagnetic Levitation Facility [1]. Applying elaborated methodologies for ground-based and microgravity experimental investigations [2], the kinetics of primary dendritic solidification is quantitatively evaluated. The Electromagnetic Levitator in microgravity (parabolic flights and on board the International Space Station) and the Electrostatic Levitator on ground are employed. The solidification kinetics is determined using a high-speed camera and applying two evaluation methods: “Frame by Frame” (FFM) and “First Frame - Last Frame” (FLM). In the theoretical interpretation of the solidification experiments, special attention is given to the behavior of the cluster structure in Cu50Zr50 samples with increasing undercooling. Experimental results on solidification kinetics are interpreted using a theoretical model of diffusion-controlled dendrite growth.
Age diagnosis based on incremental lines in dental cementum: a critical reflection.
Grosskopf, Birgit; McGlynn, George
2011-01-01
Age estimation based on the counting of incremental lines in dental cementum is a method frequently used for the estimation of the age at death for humans in bioarchaeology, and increasingly, forensic anthropology. Assessment of applicability, precision, and method reproducibility continue to be the focus of research in this area, and are occasionally accompanied by significant controversy. Differences in methodological techniques for data collection (e.g. number of sections, factor of magnification for counting or interpreting "outliers") are presented. Potential influences on method reliability are discussed, especially for their applicability in forensic contexts.
NASA Technical Reports Server (NTRS)
Edwards, H. D.
1976-01-01
Data collected by the Georgia Tech Radio Meteor Wind Facility during the fall and winter of 1975 are analyzed indicating a relationship between lower thermospheric circulation at mid latitudes and polar stratospheric dynamics. Techniques of measurement of mixing processes in the upper atmosphere and the interpretation of those measurements are described along with a diffusion simulation program based on the Global Reference Atmosphere program.
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretation. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional tradeoffs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
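A simplified sketch of the tau-map idea, under strong assumptions: advect one sample point per grid cell forward for time tau (the paper maps whole triangle images, not single points), record cell-to-cell directed edges, and take nontrivial strongly connected components as candidate Morse sets. The vector field, grid, and integrator are toy choices.

```python
import numpy as np
import networkx as nx

n = 24
xs = np.linspace(-1.0, 1.0, n)

def flow(p, tau=0.5, steps=25):
    # Forward tau-map via explicit Euler on a damped-rotation field v = (-y-0.1x, x-0.1y).
    p = p.copy()
    h = tau / steps
    for _ in range(steps):
        p += h * np.array([-p[1] - 0.1 * p[0], p[0] - 0.1 * p[1]])
    return p

def cell(p):
    # Grid cell containing point p, clamped to the domain.
    i = int(np.clip(np.searchsorted(xs, p[0]) - 1, 0, n - 1))
    j = int(np.clip(np.searchsorted(xs, p[1]) - 1, 0, n - 1))
    return (i, j)

G = nx.DiGraph()
for i in range(n):
    for j in range(n):
        G.add_edge((i, j), cell(flow(np.array([xs[i], xs[j]]))))

# Morse sets = recurrent dynamics: SCCs of size > 1 or cells mapping to themselves.
morse = [c for c in nx.strongly_connected_components(G)
         if len(c) > 1 or G.has_edge(next(iter(c)), next(iter(c)))]
print(len(morse), "candidate Morse set(s); the rest of the flow is gradient-like")
```

Increasing tau lets trajectories leave more cells before being re-binned, which is the fineness/cost tradeoff the abstract describes.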
A comparison of two- and three-dimensional stochastic models of regional solute movement
Shapiro, A.M.; Cvetkovic, V.D.
1990-01-01
Recent models of solute movement in porous media that are based on a stochastic description of the porous medium properties have been dedicated primarily to a three-dimensional interpretation of solute movement. In many practical problems, however, it is more convenient and consistent with measuring techniques to consider flow and solute transport as an areal, two-dimensional phenomenon. The physics of solute movement, however, is dependent on the three-dimensional heterogeneity in the formation. A comparison of two- and three-dimensional stochastic interpretations of solute movement in a porous medium having a statistically isotropic hydraulic conductivity field is investigated. To provide an equitable comparison between the two- and three-dimensional analyses, the stochastic properties of the transmissivity are defined in terms of the stochastic properties of the hydraulic conductivity. The variance of the transmissivity is shown to be significantly reduced in comparison to that of the hydraulic conductivity, and the transmissivity is spatially correlated over larger distances. These factors influence the two-dimensional interpretations of solute movement by underestimating the longitudinal and transverse growth of the solute plume in comparison to its description as a three-dimensional phenomenon. Although this analysis is based on small perturbation approximations and the special case of a statistically isotropic hydraulic conductivity field, it casts doubt on the use of a stochastic interpretation of the transmissivity in describing regional scale movement. However, by assuming the transmissivity to be the vertical integration of the hydraulic conductivity field at a given position, the stochastic properties of the hydraulic conductivity can be estimated from the stochastic properties of the transmissivity and applied to obtain a more accurate interpretation of solute movement. © 1990 Kluwer Academic Publishers.
Phase contrast imaging of buccal mucosa tissues-Feasibility study
NASA Astrophysics Data System (ADS)
Fatima, A.; Tripathi, S.; Shripathi, T.; Kulkarni, V. K.; Banda, N. R.; Agrawal, A. K.; Sarkar, P. S.; Kashyap, Y.; Sinha, A.
2015-06-01
The Phase Contrast Imaging (PCI) technique has been used to interpret physical parameters obtained from images taken of normal buccal mucosa tissue extracted from the cheek of a patient. The advantages of this method over conventional imaging techniques are discussed. The PCI technique uses the X-ray phase shift at edges differentiated by very minute density differences, and the edge-enhanced, high-contrast images reveal details of soft tissues. The contrast in the images produced is related to changes in the X-ray refractive index of the tissues, resulting in higher clarity compared with conventional absorption-based X-ray imaging. The results show that this type of imaging is better able to visualize microstructures of biological soft tissues with good contrast, which can lead to the diagnosis of lesions at an early stage of disease.
Black Interpretation, Black American Literature, and Grey Audiences.
ERIC Educational Resources Information Center
Washington, Earl M.
1981-01-01
Defines and illustrates language techniques used by Black authors writing to and for Blacks in the 1960s and 1970s. Suggests how language and theme barriers of such literature might be overcome in a contemporary integrated oral interpretation classroom. (PD)
Refining the 'cucumber' technique for laryngeal biopsy.
Robertson, S; Cooper, L; McPhaden, A; MacKenzie, K
2011-06-01
To refine the case selection process for the 'cucumber' mounting system for laryngeal biopsies. We conducted a retrospective audit of cucumber technique specimens taken between January 2002 and December 2008. We analysed the clinical indications for biopsy and the pathological diagnosis, for each specimen, in order to inform our case selection process. The cucumber technique was used for 125 laryngeal specimens. 60 specimens were taken for diagnostic sampling, 46 were taken during endoscopic laser resection, and 19 for overtly benign pathology. The cucumber technique was most useful for the interpretation of margins in endoscopic laser resection specimens. The cucumber technique is most useful for endoscopic resection cases in which tumour, dysplasia or suspicious lesions have been excised. Detailed information on resection margins is invaluable during multidisciplinary team discussions on patient management. Detailed photography of mounted specimens enables both laryngologist and pathologist to orientate and interpret specimens accurately.
Applications of Geodesy to Geodynamics, an International Symposium
NASA Technical Reports Server (NTRS)
Mueller, I. I. (Editor)
1978-01-01
Geodetic techniques in detecting and monitoring geodynamic phenomena are reviewed. Specific areas covered include: rotation of the earth and polar motion; tectonic plate movements and crustal deformations (space techniques); horizontal crustal movements (terrestrial techniques); vertical crustal movements (terrestrial techniques); gravity field, geoid, and ocean surface by space techniques; surface gravity and new techniques for the geophysical interpretation of gravity and geoid undulation; and earth tides and geodesy.
Presentation and Impact of Experimental Techniques in Chemistry
ERIC Educational Resources Information Center
Sojka, Zbigniew; Che, Michel
2008-01-01
Laboratory and practical courses, where students become familiar with experimental techniques and learn to interpret data and relate them to appropriate theory, play a vital role in chemical education. In the large panoply of currently available techniques, it is difficult to find a rational and easy way to classify the techniques in relation to…
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making of decisions, specification of work to be performed, interpretation of data, setting of goals, and issuance of alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate sharing of tools that implement computable clinical guidelines. PMID:11080007
Interpreting BOLD: towards a dialogue between cognitive and cellular neuroscience.
Hall, Catherine N; Howarth, Clare; Kurth-Nelson, Zebulun; Mishra, Anusha
2016-10-05
Cognitive neuroscience depends on the use of blood oxygenation level-dependent (BOLD) functional magnetic resonance imaging (fMRI) to probe brain function. Although commonly used as a surrogate measure of neuronal activity, BOLD signals actually reflect changes in brain blood oxygenation. Understanding the mechanisms linking neuronal activity to vascular perfusion is, therefore, critical in interpreting BOLD. Advances in cellular neuroscience demonstrating differences in this neurovascular relationship in different brain regions, conditions or pathologies are often not accounted for when interpreting BOLD. Meanwhile, within cognitive neuroscience, the increasing use of high magnetic field strengths and the development of model-based tasks and analyses have broadened the capability of BOLD signals to inform us about the underlying neuronal activity, but these methods are less well understood by cellular neuroscientists. In 2016, a Royal Society Theo Murphy Meeting brought scientists from the two communities together to discuss these issues. Here, we consolidate the main conclusions arising from that meeting. We discuss areas of consensus about what BOLD fMRI can tell us about underlying neuronal activity, and how advanced modelling techniques have improved our ability to use and interpret BOLD. We also highlight areas of controversy in understanding BOLD and suggest research directions required to resolve these issues. This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. © 2016 The Author(s).
Domínguez Hernández, Karem R.; Aguilar Lasserre, Alberto A.; Posada Gómez, Rubén; Palet Guzmán, José A.; González Sánchez, Blanca E.
2013-01-01
Cervical cancer is the second largest cause of death among women worldwide. Nowadays, this disease is preventable and curable at low cost and low risk when an accurate diagnosis is made in due time, since it is the neoplasm with the highest prevention potential. This work describes the development of an expert system able to diagnose cervical neoplasia (CN) precursor injuries through the integration of fuzzy logic and image interpretation techniques. The key contribution of this research focuses on atypical cases, specifically on atypical glandular cells (AGC). The expert system consists of 3 phases: (1) risk diagnosis, which consists of the interpretation of a patient's clinical background and the risks of contracting CN according to specialists; (2) cytology image detection, which consists of image interpretation (IM) and the Bethesda system for cytology interpretation; and (3) determination of cancer precursor injuries, which consists of retrieving the information from the prior phases and integrating the expert system by means of a fuzzy logic (FL) model. During the validation stage of the system, 21 already diagnosed cases were tested with a positive correlation in which 100% effectiveness was obtained. The main contribution of this work relies on the reduction of false positives and false negatives by providing a more accurate diagnosis for CN. PMID:23690881
Cui, De-Mi; Wang, Xiao-Quan; Lu, Lie-Min
2017-01-01
Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can finish interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening LSPIT's turnaround time, are of great business value and urgently needed. Motivated by this need, in this paper we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction. PMID:29068431
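A rough sketch in the spirit of the screening step described, not the authors' CARI pipeline: extract a few signal-processing features from each reflectogram and use a trained classifier to flag suspected piles for expert review. The feature set, sampling rate, placeholder signals, and labels are all invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def features(sig, fs=50000.0):
    """Simple time/frequency features of one reflectogram (velocity trace)."""
    spec = np.abs(np.fft.rfft(sig))
    return [sig.max(), sig.min(),
            np.argmax(np.abs(sig[100:])) / fs,    # delay of strongest echo after the pulse
            spec.argmax() * fs / len(sig),        # dominant frequency
            spec.std()]

rng = np.random.default_rng(0)
X = np.array([features(rng.normal(0, 1, 2000)) for _ in range(200)])  # placeholder signals
y = rng.integers(0, 2, 200)          # 1 = suspected defect (toy labels; real ones from experts)

clf = RandomForestClassifier(n_estimators=100).fit(X[:150], y[:150])
flags = clf.predict(X[150:])         # piles flagged for manual, in-depth interpretation
print(int(flags.sum()), "of 50 screened piles flagged")
```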
The Importance of Quality in Ventilation-Perfusion Imaging.
Mann, April; DiDea, Mario; Fournier, France; Tempesta, Daniel; Williams, Jessica; LaFrance, Norman
2018-06-01
As the health care environment continues to change and morph into a system focusing on increased quality and evidence-based outcomes, nuclear medicine technologists must be reminded that they play a critical role in achieving high-quality, interpretable images used to drive patient care, treatment, and best possible outcomes. A survey performed by the Quality Committee of the Society of Nuclear Medicine and Molecular Imaging Technologist Section demonstrated that a clear knowledge gap exists among technologists regarding their understanding of quality, how it is measured, and how it should be achieved by all practicing technologists regardless of role and education level. Understanding of these areas within health care, in conjunction with the growing emphasis on evidence-based outcomes, quality measures, and patient satisfaction, will ultimately elevate the role of nuclear medicine technologists today and into the future. The nuclear medicine role now requires technologists to demonstrate patient assessment skills, practice safety procedures with regard to staff and patients, provide patient education and instruction, and provide physicians with information to assist with the interpretation and outcome of the study. In addition, the technologist must be able to evaluate images by performing technical analysis, knowing the demonstrated anatomy and pathophysiology, and assessing overall quality. Technologists must also be able to triage and understand the disease processes being evaluated and how nuclear medicine diagnostic studies may drive care and treatment. Therefore, it is imperative that nuclear medicine technologists understand their role in the achievement of a high-quality, interpretable study by applying quality principles and understanding and using imaging techniques beyond just basic protocols for every type of disease or system being imaged. This article focuses on quality considerations related to ventilation-perfusion imaging. It provides insight on appropriate imaging techniques and protocols, true imaging variants and tracer distributions versus artifacts that may result in a lower-quality or misinterpreted study, and the use of SPECT and SPECT/CT as an alternative providing a high-quality, interpretable study with better diagnostic accuracy and fewer nondiagnostic procedures than historical planar imaging. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.
Identification of P/S-wave successions for application in microseismicity
NASA Astrophysics Data System (ADS)
Deflandre, J.-P.; Dubesset, M.
1992-09-01
Interpretation of P/S-wave successions is used in induced or passive microseismicity. It makes the location of microseismic events possible when the triangulation technique cannot be used. To improve the reliability of the method, we propose a technique that identifies the P/S-wave successions among recorded wave successions. Polarization software is used to verify the orthogonality between the P and S polarization axes. The polarization parameters are computed all along the 3-component acoustic signal. The algorithm then detects time windows within which the signal polarization axis is perpendicular to the polarization axis of the wave in the reference time window (representative of the P wave). The technique is demonstrated for a synthetic event, and three application cases are presented. The first corresponds to a calibration shot in which the arrivals of perpendicularly polarized waves are correctly detected in spite of their moderate amplitude. The second example presents a microseismic event recorded during gas withdrawal from an underground gas storage reservoir. The last example is chosen as a counter-example, concerning a microseismic event recorded during a hydraulic fracturing job. The detection algorithm reveals that, in this case, the wave succession does not correspond to a P/S succession. This implies that such an event must not be located by the method based on the interpretation of a P/S-wave succession, as no such succession is confirmed.
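A hedged sketch of the orthogonality test described: estimate the dominant polarization axis in a window as the leading eigenvector of the 3-component covariance matrix, then measure the angle between the candidate window's axis and the reference (P-wave) axis. The synthetic windows below stand in for real 3-C recordings.

```python
import numpy as np

def polarization_axis(window):
    """Dominant polarization axis of a (samples x 3) window via covariance eigenanalysis."""
    C = np.cov(window, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)
    return eigvecs[:, -1]                # eigenvector of the largest eigenvalue

rng = np.random.default_rng(2)
carrier = np.sin(np.linspace(0, 20, 200))
p_win = rng.normal(0, 0.05, (200, 3)) + np.outer(carrier, [1, 0, 0])  # P polarized along x
s_win = rng.normal(0, 0.05, (200, 3)) + np.outer(carrier, [0, 1, 0])  # S polarized along y

p_axis = polarization_axis(p_win)        # reference-window (P-wave) axis
s_axis = polarization_axis(s_win)        # candidate later window
angle = np.degrees(np.arccos(abs(p_axis @ s_axis)))
print(f"{angle:.1f} deg")                # near 90 deg supports a P/S succession
```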
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristensen, L.; Dons, T.; Schioler, P.
1995-11-01
Correlation of wireline log data from the North Sea chalk reservoirs is frequently hampered by rather subtle log patterns in the chalk section due to the apparently monotonous nature of the chalk sediments, which may lead to ambiguous correlations. This study deals with a correlation technique based on an integration of biostratigraphic data, seismic interpretation, and wireline log correlation; this technique aims at producing a consistent reservoir subdivision that honors both the well data and the seismic data. This multidisciplinary approach has been used to subdivide and correlate the Maastrichtian chalk in the Dan field. The biostratigraphic subdivision is based on a new detailed dinoflagellate study of core samples from eight wells. Integrating the biostratigraphic results with three-dimensional seismic data allows recognition of four stratigraphic units within the Maastrichtian, bounded by assumed chronostratigraphic horizons. This subdivision is further refined by adding a seismic horizon and four horizons from wireline log correlations, establishing a total of nine reservoir units. The approximate chronostratigraphic nature of these units provides an improved interpretation of the depositional and structural patterns in this area. The three upper reservoir units pinch out and disappear in a northeasterly direction across the field. We interpret this stratal pattern as reflecting a relative sea level fall or regional basinal subsidence during the latest Maastrichtian, possibly combined with local synsedimentary uplift due to salt tectonics. Isochore maps indicate that the underlying six non-wedging units are unaffected by salt tectonics.
Codetype-based interpretation of the MMPI-2 in an outpatient psychotherapy sample.
Koffmann, Andrew
2015-01-01
In an evaluation of the codetype-based interpretation of the MMPI-2, 48 doctoral student psychotherapists rated their clients' (N = 120) standardized interpretations as more accurate when based on the profile's codetype, in comparison with ratings for interpretations based on alternate codetypes. Effect sizes ranged from nonsignificant to large, depending on the degree of proximity between the profile's codetype and the alternate codetype. There was weak evidence to suggest that well-defined profiles yielded more accurate interpretations than undefined profiles. It appears that codetype-based interpretation of the MMPI-2 is generally valid, but there might be little difference in the accuracy of interpretations based on nearby codetypes.
Interpreting Low Spatial Resolution Thermal Data from Active Volcanoes on Io and the Earth
NASA Technical Reports Server (NTRS)
Keszthelyi, L.; Harris, A. J. L.; Flynn, L.; Davies, A. G.; McEwen, A.
2001-01-01
The style of volcanism was successfully determined at a number of active volcanoes on Io and the Earth using the same techniques to interpret thermal remote sensing data. Additional information is contained in the original extended abstract.
Neurolinguistic Programming: Add It To Your Tool Chest of Interpretive Techniques.
ERIC Educational Resources Information Center
Parratt, Smitty
1997-01-01
Highlights the importance of using verbal and nonverbal neurolinguistic programming to maximize the potential of interactions between interpreters and the general public and to improve long-term interactions. Discusses the power of mirroring and representational systems. Contains 29 references. (JRH)
What We Do and Do Not Know about Teaching Medical Image Interpretation.
Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F
2017-01-01
Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.
NASA Astrophysics Data System (ADS)
Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.
2015-02-01
Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments concerning on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to test different acquisition systems and their effectiveness, considering each methodology individually or in integration. This research builds on the awareness that integrating different survey methodologies can in fact increase the representative efficacy of the final representations, which are then based on a wider, verified set of georeferenced metric data. In particular, the integration of methods allows reducing or neutralizing issues related to the survey of composite and complex objects, since the most appropriate tools and techniques can be chosen considering the characteristics of each part of an archaeological site (i.e., urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of the city and of monument structures, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). This acquisition was planned in order to obtain not only the basic support for interpretation analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with point clouds from Structure from Motion techniques in the same reference system, given by a topographical and GPS survey. These 3D models are not only the final results of the metric survey, but also the starting point, from the research point of view, for the whole reconstruction of the city and its urban context. This reconstruction process will also concern some areas that have not yet been excavated, where the application of procedural modelling can offer important support to the reconstructive hypotheses.
Biochemical Imaging of Gliomas Using MR Spectroscopic Imaging for Radiotherapy Treatment Planning
NASA Astrophysics Data System (ADS)
Heikal, Amr Ahmed
This thesis discusses the main obstacles facing wide clinical implementation of magnetic resonance spectroscopic imaging (MRSI) as a tumor delineation tool for radiotherapy treatment planning, particularly for gliomas. These main obstacles are identified as (1) observer bias and poor interpretational reproducibility of the results of MRSI scans, and (2) the long scan times required to conduct MRSI scans. An examination of an existing user-independent MRSI tumor delineation technique known as the choline-to-NAA index (CNI) is conducted to assess its utility in providing a tool for reproducible interpretation of MRSI results. While working with spatial resolutions typically twice those on which the CNI model was originally designed, a region of statistical uncertainty was discovered between the tumor and normal tissue populations, and as such a modification to the CNI model was introduced to clearly identify that region. To address the issue of long scan times, a series of studies was conducted to adapt a scan acceleration technique, compressed sensing (CS), to work with MRSI and to quantify the effects of this novel technique on the modulation transfer function (MTF), an important quantitative imaging metric. The studies included the development of the first phantom-based method of measuring the MTF for MRSI data, a study of the correlation between the k-space sampling patterns used for compressed sensing and the resulting MTFs, and the introduction of a technique circumventing some of the side-effects of compressed sensing by exploiting the conjugate symmetry property of k-space. The work in this thesis provides two essential steps towards wide clinical implementation of MRSI-based tumor delineation. The proposed modifications to the CNI method coupled with the application of CS to MRSI address the two main obstacles outlined. However, there continues to be room for improvement and questions that need to be answered by future research.
Liu, Yu; Zhou, Haibo; Hu, Ziwei; Yu, Guangxia; Yang, Danting; Zhao, Jinshun
2017-08-15
Rapid, accurate detection of pathogen bacteria is a highly topical research area for the sake of food safety and public health. Surface-enhanced Raman scattering (SERS) is being considered as a powerful and attractive technique for pathogen bacteria detection, due to its sensitivity, high speed, comparatively low cost, multiplexing ability and portability. This contribution aims to give a comprehensive overview of SERS as a technique for rapid detection of pathogen bacteria based on label and label-free strategies. A brief tutorial on SERS is given first of all. Then we summarize the recent trends and developments of label and label-free based SERS applied to detection of pathogen bacteria, including the relatively complete interpretation of SERS spectra. In addition, multifunctional SERS platforms for pathogen bacteria in matrix are discussed as well. Furthermore, an outlook of the work done and a perspective on the future directions of SERS as a reliable tool for real-time pathogen bacteria detection are given. Copyright © 2017 Elsevier B.V. All rights reserved.
Development of fuzzy air quality index using soft computing approach.
Mandal, T; Gorai, A K; Pathak, G
2012-10-01
Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, a discrepancy frequently arises from the air quality criteria employed and the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, difficulties always exist in conventional methodologies such as the air quality index when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been shown to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
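A minimal Mamdani-style sketch of a fuzzy air quality index for a single pollutant: fuzzify the concentration with triangular membership functions, fire min-max rules onto an index universe, and defuzzify by centroid. The membership breakpoints, rule base, and PM10 reading are invented; the paper's actual FIS covers multiple pollutants and exposure times.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0)

pm10 = 85.0                                   # observed concentration (ug/m3, toy reading)
low  = tri(pm10, 0, 0, 60)                    # fuzzified input memberships
mod  = tri(pm10, 30, 75, 120)
high = tri(pm10, 90, 150, 300)

# Rules map input sets to index sets on a 0-100 universe; aggregate with max,
# clip each consequent at its rule's firing strength (Mamdani min), defuzzify by centroid.
u = np.linspace(0, 100, 501)
agg = np.maximum.reduce([np.minimum(low,  tri(u, 0, 15, 35)),
                         np.minimum(mod,  tri(u, 30, 50, 70)),
                         np.minimum(high, tri(u, 65, 85, 100))])
fuzzy_aqi = np.sum(u * agg) / np.sum(agg)
print(round(fuzzy_aqi, 1))                    # crisp fuzzy air quality index
```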
Measurements of strain at plate boundaries using space based geodetic techniques
NASA Technical Reports Server (NTRS)
Robaudo, Stefano; Harrison, Christopher G. A.
1993-01-01
We have used the space-based geodetic techniques of Satellite Laser Ranging (SLR) and VLBI to study strain along subduction and transform plate boundaries and have interpreted the results using a simple elastic dislocation model. Six stations located behind island arcs were analyzed as representative of subduction zones, while 13 sites located on either side of the San Andreas fault were used for the transcurrent zones. The deformation length scale was then calculated for both tectonic margins by fitting the relative strain to an exponentially decreasing function of distance from the plate boundary. Results show that space-based data for the transcurrent boundary along the San Andreas fault help to better define the deformation length scale in the area while fitting the elastic half-space earth model well. For subduction-type boundaries the analysis indicates that there is no single scale length which uniquely describes the deformation. This is mainly due to the difference in subduction characteristics among the different areas.
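A sketch of the length-scale estimation described: fit normalized station deformation to an exponential decay with distance from the boundary and read off the scale length. The station distances and strain values below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

dist = np.array([5.0, 20.0, 45.0, 80.0, 150.0, 300.0])        # km from the fault (toy)
strain = np.array([0.95, 0.80, 0.55, 0.33, 0.12, 0.03])       # normalized deformation (toy)

model = lambda x, L: np.exp(-x / L)           # exponentially decreasing with distance
(L_fit,), _ = curve_fit(model, dist, strain, p0=[50.0])
print(f"deformation length scale ~ {L_fit:.0f} km")
```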
Reduced modeling of signal transduction – a modular approach
Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter
2007-01-01
Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which allows to dissect the model into smaller modules that are called layers and can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable. Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494
NASA Astrophysics Data System (ADS)
Vasantrao, Baride Mukund; Bhaskarrao, Patil Jitendra; Mukund, Baride Aarti; Baburao, Golekar Rushikesh; Narayan, Patil Sanjaykumar
2017-12-01
The area chosen for the present study is Dhule district, which belongs to the drought-prone area of Maharashtra State, India. Dhule district suffers from water scarcity, and therefore no extra water is available to supply agricultural and industrial growth. To understand the lithological characters in terms of hydro-geological conditions, it is necessary to understand the geology of the area, and it is now an established fact that geophysical methods give better information about subsurface geology. Geophysical electrical surveys with four-electrode configurations, i.e., the Wenner and Schlumberger methods, were carried out at the same selected sites to observe their similarity and to compare the two configurations in terms of use and handling in the field. A total of 54 VES soundings were carried out, spread over the Dhule district and representing different lithological units. The VES curves were interpreted using the inverse slope method for the Wenner configuration, while IPI2win software and curve-matching techniques were used for the Schlumberger configuration. Region-wise lithologs were prepared from the resistivities and thicknesses obtained with the Wenner method, and region-wise curves were prepared from the resistivity layers for the Schlumberger method. Comparing the two, it is observed that both the Wenner and Schlumberger methods have merits and demerits. From the field point of view, the Wenner inverse slope method is handier for calculation and interpretation but requires a long lateral spread, which is a constraint; the Schlumberger method is easier in application but more unwieldy in interpretation. The work amply demonstrates the applicability of geophysical techniques in water resource evaluation, and the approach should be suitable for areas with a similar geological setup elsewhere.
Secure steganographic communication algorithm based on self-organizing patterns.
Saunoriene, Loreta; Ragulskis, Minvydas
2011-11-01
A secure steganographic communication algorithm based on patterns evolving in a Beddington-DeAngelis-type predator-prey model with self- and cross-diffusion is proposed in this paper. Small perturbations of the initial states of the system around the state of equilibrium result in the evolution of self-organizing patterns. Small differences between initial perturbations result in correspondingly slight differences in the evolving patterns. It is shown that the generation of interpretable target patterns cannot be considered a secure means of communication, because contours of the secret image can be retrieved from the cover image using statistical techniques if it represents only small perturbations of the initial states of the system. An alternative approach, in which the cover image is the self-organizing pattern that has evolved from initial states perturbed using the dot-skeleton representation of the secret image, can be considered a safe visual communication technique that protects both the secret image and the communicating parties.
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Miska, S.
1995-12-31
Fracture closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test, where strong indications of fracture closure are rarely seen. Various techniques exist for extracting closure pressure from the flowback pressure response. Unfortunately, these procedures give different estimates for closure pressure and their theoretical bases are not well established. We present results that place the PIFB test on a more solid foundation. A numerical model is used to simulate the PIFB test and glean the physical mechanisms contributing to the response. Based on our simulation results, we propose an interpretation procedure which gives better estimates for closure pressure than existing techniques.
On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.
Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N
2016-04-01
An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.
Secure Recognition of Voice-Less Commands Using Videos
NASA Astrophysics Data System (ADS)
Yau, Wai Chee; Kumar, Dinesh Kant; Weghorn, Hans
Interest in voice recognition technologies for internet applications is growing due to the flexibility of speech-based communication. The major drawback with the use of sound for internet access with computers is that the commands will be audible to other people in the vicinity. This paper examines a secure and voice-less method for recognition of speech-based commands using video without evaluating sound signals. The proposed approach represents mouth movements in the video data using 2D spatio-temporal templates (STT). Zernike moments (ZM) are computed from STT and fed into support vector machines (SVM) to be classified into one of the utterances. The experimental results demonstrate that the proposed technique produces a high accuracy of 98% in a phoneme classification task. The proposed technique is demonstrated to be invariant to global variations of illumination level. Such a system is useful for securely interpreting user commands for internet applications on mobile devices.
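A sketch of the feature-extraction and classification stages, assuming the spatio-temporal templates are already available as 2-D arrays; the mahotas library is used here for Zernike moments and scikit-learn for the SVM, and the data shapes, radius/degree settings, and four-class setup are all assumptions rather than the paper's actual configuration:

```python
import numpy as np
import mahotas
from sklearn.svm import SVC

def zernike_features(stt, radius=32, degree=8):
    # stt: 2-D spatio-temporal template (grayscale mouth-motion image).
    return mahotas.features.zernike_moments(stt, radius, degree=degree)

# Hypothetical training data: one STT per utterance sample.
rng = np.random.default_rng(0)
templates = rng.random((40, 64, 64))   # stand-in for real STTs
labels = np.repeat(np.arange(4), 10)   # four utterance classes

X = np.array([zernike_features(t) for t in templates])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:3]))
```

Zernike moments are attractive here because their magnitudes are rotation-invariant, which complements the illumination invariance the paper reports.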
Photocurrent measurements of pentacene-based devices
NASA Astrophysics Data System (ADS)
Masurkar, Amrita; Kymissis, Ioannis
2015-09-01
Photocurrent spectroscopy (PCS) and photocurrent microscopy (PCM) are powerful tools that can probe the underlying mechanisms of charge generation and transport in organic semiconductor devices. There has been significant progress in the use of these techniques, which has yielded a number of insights into the underlying materials and operation of the devices. Despite the potential for PCS and PCM to become standard tools, however, a consensus has not been reached on (1) its uses and (2) the underlying mechanisms which produce the photoresponse. This is particularly true for measurements of pentacene devices, as the energy dynamics of pentacene are complex. Accordingly, here we report the current body of PCS and PCM of pentacene devices, offer interpretations of the data, and discuss which questions remain unanswered. We have divided the reviewed work into four categories based on the goals of the study and the technique used: photocurrent spectroscopy, scanning photocurrent microscopy, mobility, and trap density-of-states.
Quantitative Seismic Interpretation: Applying Rock Physics Tools to Reduce Interpretation Risk
NASA Astrophysics Data System (ADS)
Sondergeld, Carl H.
This book is divided into seven chapters that cover rock physics, statistical rock physics, seismic inversion techniques, case studies, and work flows. On balance, the emphasis is on rock physics. Included are 56 color figures that greatly help in the interpretation of more complicated plots and displays.The domain of rock physics falls between petrophysics and seismics. It is the basis for interpreting seismic observations and therefore is pivotal to the understanding of this book. The first two chapters are dedicated to this topic (109 pages).
Introducing version 5 of Interpreting Indicators of Rangeland Health
USDA-ARS?s Scientific Manuscript database
Interpreting Indicators of Rangeland Health was initiated in 1994 as a qualitative, rapid assessment technique to evaluate rangeland health. Seventeen field indicators are used to rate three attributes of rangeland health: 1) soil/site stability, 2) hydrologic function, and 3) biotic integrity. The ...
Tokens: Facts and Interpretation.
ERIC Educational Resources Information Center
Schmandt-Besserat, Denise
1986-01-01
Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…
Remote sensing of wildland resources: A state-of-the-art review
Robert C. Aldrich
1979-01-01
A review, with literature citations, of current remote sensing technology, applications, and costs for wildland resource management, including collection, interpretation, and processing of data gathered through photographic and nonphotographic techniques for classification and mapping, interpretive information for specific applications, measurement of resource...
Film-Making and the Curriculum.
ERIC Educational Resources Information Center
Schwartz, Elizabeth
A guide to filmmaking techniques and the use of class-made films in the curriculum covers techniques of both animated and live-action films. The purposes of single concept, documentary, interpretive, and time-lapse films are discussed briefly. Production techniques covered include organization of personnel, scripting, filming, directing, editing,…
Fenaille, François; Visani, Piero; Fumeaux, René; Milo, Christian; Guy, Philippe A
2003-04-23
Two headspace techniques based on mass spectrometry (MS) detection, an electronic nose and solid-phase microextraction coupled to gas chromatography-mass spectrometry (SPME-GC/MS), were evaluated for their ability to differentiate various infant formula powders based on changes in their volatiles upon storage. The electronic nose gave unresolved MS fingerprints of the samples' gas phases, which were further submitted to principal component analysis (PCA). Such direct MS recording combined with multivariate treatment enabled rapid differentiation of the infant formulas over a 4-week storage test. Although the advantages of the MS-based electronic nose are its ease of use and meaningful data interpretation at high throughput (100 samples per 24 h), its greatest disadvantage is that the compounds present cannot be identified or quantified. For these reasons, SPME-GC/MS measurement was also investigated. This technique allowed the identification of saturated aldehydes as the main volatiles present in the headspace of infant milk powders. An isotope dilution assay was further developed to quantify hexanal as a potential indicator of infant milk powder oxidation. Hexanal content was found to vary from roughly 500 to 3500 microg/kg for relatively non-oxidized and oxidized infant formulas, respectively.
Integrating multi-omic features exploiting Chromosome Conformation Capture data.
Merelli, Ivan; Tordini, Fabio; Drocco, Maurizio; Aldinucci, Marco; Liò, Pietro; Milanesi, Luciano
2015-01-01
The representation, integration, and interpretation of omic data is a complex task, in particular considering the huge amount of information that is produced daily in molecular biology laboratories all around the world. The reason is that sequencing data regarding expression profiles, methylation patterns, and chromatin domains are difficult to harmonize in a systems biology view, since genome browsers only allow coordinate-based representations, discarding the functional clusters created by the spatial conformation of the DNA in the nucleus. In this context, recent progress in high-throughput molecular biology techniques and bioinformatics has provided insights into chromatin interactions on a larger scale and offers formidable support for the interpretation of multi-omic data. In particular, a novel sequencing technique called Chromosome Conformation Capture allows the analysis of chromosome organization in the cell's natural state; when performed genome-wide, this technique is usually called Hi-C. Inspired by service applications such as Google Maps, we developed NuChart, an R package that integrates Hi-C data to describe the chromosomal neighborhood starting from information about gene positions, with the possibility of mapping onto the resulting graphs genomic features such as methylation patterns and histone modifications, along with expression profiles. In this paper we show the importance of the NuChart application for the integration of multi-omic data in a systems biology fashion, with particular interest in cytogenetic applications of these techniques. Moreover, we demonstrate how the integration of multi-omic data can provide useful information in understanding why genes occupy certain specific positions inside the nucleus and how epigenetic patterns correlate with their expression.
Sowa, Mandy; Hiemann, Rico; Schierack, Peter; Reinhold, Dirk; Conrad, Karsten; Roggenbuck, Dirk
2017-08-01
Occurrence of autoantibodies (autoAbs) is a hallmark of autoimmune diseases, and the analysis thereof is an essential part in the diagnosis of organ-specific autoimmune and systemic autoimmune rheumatic diseases (SARD), especially connective tissue diseases (CTDs). Due to the appearance of autoAb profiles in SARD patients and the complexity of the corresponding serological diagnosis, different diagnostic strategies have been suggested for appropriate autoAb testing. Thus, evolving assay techniques and the continuous discovery of novel autoantigens have greatly influenced the development of these strategies. Antinuclear antibody (ANA) analysis by indirect immunofluorescence (IIF) on tissue and later cellular substrates was one of the first tests introduced into clinical routine and is still an indispensable tool for CTD serology. Thus, screening for ANA by IIF is recommended to be followed by confirmatory testing of positive findings employing different assay techniques. Given the continuous growth in the demand for autoAb testing, IIF has been challenged as the standard method for ANA and other autoAb analyses due to lacking automation, standardization, modern data management, and human bias in IIF pattern interpretation. To address these limitations of autoAb testing, the CytoBead® technique has been introduced recently which enables automated interpretation of cell-based IIF and quantitative autoAb multiplexing by addressable microbead immunoassays in one reaction environment. Thus, autoAb screening and confirmatory testing can be combined for the first time. The present review discusses the history of autoAb assay techniques in this context and gives an overview and outlook of the recent progress in emerging technologies.
A technique for the reduction of banding in Landsat Thematic Mapper Images
Helder, Dennis L.; Quirk, Bruce K.; Hood, Joy J.
1992-01-01
The radiometric difference between forward and reverse scans in Landsat thematic mapper (TM) images, referred to as "banding," can create problems when enhancing the image for interpretation or when performing quantitative studies. Recent research has led to the development of a method that reduces the banding in Landsat TM data sets. It involves passing a one-dimensional spatial kernel over the data set. This kernel is developed from the statistics of the banding pattern and is based on the Wiener filter. It has been implemented on both a DOS-based microcomputer and several UNIX-based computer systems. The algorithm has successfully reduced the banding in several test data sets.
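The paper's kernel is derived from the Wiener filter and the measured banding statistics, which are not reproduced here. As a much simpler stand-in that only illustrates the forward/reverse scan correction idea, one can equalize the mean levels of alternating scan groups (the 16-line grouping and the synthetic image below are assumptions):

```python
import numpy as np

def reduce_banding(img, lines_per_scan=16):
    """Crude destriping: remove the mean offset between alternate
    (forward/reverse) scan groups. A stand-in for the Wiener-based
    kernel described in the paper, for illustration only."""
    out = img.astype(float).copy()
    scans = out.reshape(-1, lines_per_scan, out.shape[1])
    fwd, rev = scans[0::2], scans[1::2]
    offset = rev.mean() - fwd.mean()   # banding amplitude estimate
    scans[1::2] -= offset              # shift reverse scans onto forward level
    return out

# Synthetic banded image: alternating 16-line groups offset by 0.2.
img = np.random.rand(128, 128) + np.tile(
    np.repeat([0.0, 0.2], 16), 4)[:, None]
print(reduce_banding(img).std() < img.std())   # destriping lowers variance
```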
Helical Axis Data Visualization and Analysis of the Knee Joint Articulation.
Millán Vaquero, Ricardo Manuel; Vais, Alexander; Dean Lynch, Sean; Rzepecki, Jan; Friese, Karl-Ingo; Hurschler, Christof; Wolter, Franz-Erich
2016-09-01
We present processing methods and visualization techniques for accurately characterizing and interpreting kinematical data of flexion-extension motion of the knee joint based on helical axes. We make use of the Lie group of rigid body motions, and particularly its Lie algebra, for a natural representation of motion sequences. This allows the finite helical axis (FHA) and instantaneous helical axis (IHA) to be analyzed and computed in a unified way, without redundant degrees of freedom or singularities. A polynomial fitting based on Legendre polynomials within the Lie algebra is applied to provide a smooth description of a given discrete knee motion sequence, which is essential for obtaining stable instantaneous helical axes for further analysis. Moreover, this allows for an efficient overall similarity comparison across several motion sequences in order to differentiate among several cases. Our approach combines this with a specifically designed, patient-specific three-dimensional visualization based on the processed helical axis information, incorporating computed tomography (CT) scans for an intuitive interpretation of the axes and their geometrical relation to the knee joint anatomy. In addition, in the context of the study of diseases affecting musculoskeletal articulation, we propose to integrate the above tools into a multiscale framework for exploring related data sets distributed across multiple spatial scales. We demonstrate the utility of our methods by processing a collection of motion sequences acquired from experimental data involving several surgery techniques. Our approach enables accurate analysis, visualization, and comparison of knee joint articulation, contributing to evaluation and diagnosis in medical applications.
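A minimal sketch of the finite-helical-axis computation via the matrix logarithm, i.e., the Lie-algebra representation the paper builds on; it returns only the axis direction, rotation angle, and translation along the axis, leaving out the axis' spatial location and the Legendre smoothing:

```python
import numpy as np
from scipy.linalg import logm

def finite_helical_axis(T1, T2):
    """Finite helical axis of the motion taking pose T1 to pose T2
    (4x4 homogeneous matrices). Assumes a non-zero rotation."""
    T = T2 @ np.linalg.inv(T1)     # relative rigid-body motion
    R, t = T[:3, :3], T[:3, 3]
    W = np.real(logm(R))           # element of so(3), skew-symmetric
    w = np.array([W[2, 1], W[0, 2], W[1, 0]])
    angle = np.linalg.norm(w)
    axis = w / angle
    slide = axis @ t               # translation component along the axis
    return axis, angle, slide

# Toy check: 30 degree rotation about z plus a 5 mm slide along z.
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
T2 = np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 5], [0, 0, 0, 1.]])
print(finite_helical_axis(np.eye(4), T2))
```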
Commentary: "re-programming or selecting adult stem cells?".
Trosko, James E
2008-01-01
The recent observations that embryonic stemness-associated genes could assist in the "de-differentiation" of adult skin fibroblast cells to "embryonic-like stem cells", using "somatic cell nuclear transfer" techniques, have been interpreted as indicating a "re-programming" of genes. These reports have demonstrated a "proof of principle" approach to bypass many, but not all, of the ethical, scientific, and medical limitations of the "therapeutic cloning" of embryonic stem cells from embryos. However, while the interpretation that real "re-programming" of all those somatic fibroblast differentiation genes occurred might be correct, an alternative hypothesis for these exciting results does exist. Based on the fact that multipotent adult stem cells exist in most, if not all, adult organs, the possibility exists that these recent "re-programming" results, obtained using somatic nuclear transfer techniques, actually resulted from the transfer of rare nuclear material from adult stem cells residing in the skin of the mouse, monkey, and human samples. The rationale for this challenging hypothesis is drawn from the "stem cell theory of cancer" as well as from the field of human adult stem cell research.
Measuring energy expenditure in clinical populations: rewards and challenges
Psota, T; Chen, KY
2013-01-01
The measurement of energy expenditure (EE) is recommended as an important component of comprehensive clinical nutrition assessments in patients with altered metabolic states, in those who have failed to respond to nutrition support, and in those with critical illness requiring individualized nutrition support. There is evidence that EE is variable in patients with metabolic diseases, such as chronic renal disease, cirrhosis, HIV, cancer cachexia, and cystic fibrosis, and in patients under intensive care. By using appropriate techniques and interpretations of basal or resting EE, clinicians can provide adequate nutrition support with minimal negative impact from under- or overfeeding in these patients. This review is based on our current understanding of the different components of EE and the techniques to measure them, and re-examines advances and challenges in determining energy needs in clinical populations, with particular focus on obese, pediatric, and elderly patients. In addition, technological advances have expanded the choice of commercially available equipment for assessing EE, which brings specific challenges and rewards in selecting the right equipment against specific performance criteria. Lastly, analytical considerations for interpreting the results of EE in the context of changing body composition are presented and discussed. PMID:23443826
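For readers unfamiliar with how resting EE is derived from indirect calorimetry, the abbreviated Weir equation (as commonly cited, without the urinary-nitrogen protein correction) converts measured gas exchange into energy expenditure; the gas values below are merely illustrative:

```python
def resting_ee_kcal_per_day(vo2_l_min, vco2_l_min):
    # Abbreviated Weir equation (no protein correction), as commonly
    # used in indirect calorimetry: kcal/min scaled to kcal/day.
    kcal_per_min = 3.941 * vo2_l_min + 1.106 * vco2_l_min
    return kcal_per_min * 1440.0

# Typical resting adult gas exchange (illustrative values, L/min).
print(round(resting_ee_kcal_per_day(0.25, 0.20)))  # roughly 1700 kcal/day
```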
Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques
NASA Astrophysics Data System (ADS)
Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein
2017-10-01
Groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large volume of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness in classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This yielded two important clusters, viz., cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), whose constituents are released to the study area from different sources. The application of multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From the PCA, it is clear that the first factor (factor 1), accounting for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
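A sketch of the HCA-plus-PCA workflow with standard Python tooling; the data matrix here is random, standing in for the 30-station hydrochemical measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical hydrochemical matrix: 30 samples x 10 variables
# (e.g., pH, EC, TDS, Ca, Mg, Na, K, Cl, SO4, HCO3).
rng = np.random.default_rng(1)
X = rng.random((30, 10))

Xs = StandardScaler().fit_transform(X)   # put variables on a common scale

# Hierarchical cluster analysis (Ward linkage), cut into 4 groups.
groups = fcluster(linkage(Xs, method="ward"), t=4, criterion="maxclust")

# PCA: loadings of the first factor show which variables drive the variance.
pca = PCA(n_components=2).fit(Xs)
print("explained variance:", pca.explained_variance_ratio_)
print("PC1 loadings:", pca.components_[0].round(2))
print("cluster sizes:", np.bincount(groups)[1:])
```

Standardizing the variables first matters because ions measured in very different units would otherwise dominate both the clustering distances and the principal components.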
NASA Technical Reports Server (NTRS)
Goodrich, Charles C.
1993-01-01
The goal of this project is to investigate the use of visualization software based on the visual programming and data-flow paradigms to meet the needs of the SPOF and through it the International Solar Terrestrial Physics (ISTP) science community. Specific needs we address include science planning, data interpretation, comparisons of data with simulation and model results, and data acquisition. Our accomplishments during the twelve month grant period are discussed below.
Subsurface damage distribution in the lapping process.
Wang, Zhuo; Wu, Yulie; Dai, Yifan; Li, Shengyi
2008-04-01
To systematically investigate the influence of lapping parameters on subsurface damage (SSD) depth and to characterize the damage comprehensively, the maximum depth and distribution of SSD generated in the optical lapping process were measured with the magnetorheological finishing wedge technique. The interaction of adjacent indentations was then used to interpret the generation of the maximum SSD depth. Finally, a lapping procedure based on the influence of lapping parameters on the material removal rate and SSD depth was proposed to improve lapping efficiency.
Selective field evaporation in field-ion microscopy for ordered alloys
NASA Astrophysics Data System (ADS)
Ge, Xi-jin; Chen, Nan-xian; Zhang, Wen-qing; Zhu, Feng-wu
1999-04-01
Semiempirical pair potentials, obtained by applying the Chen-inversion technique to a cohesion equation of Rose et al. [Phys. Rev. B 29, 2963 (1984)], are employed to assess the bonding energies of surface atoms of intermetallic compounds. This provides a new calculational model of selective field evaporation in field-ion microscopy (FIM). Based on this model, a successful interpretation of FIM image contrasts for Fe3Al, PtCo, Pt3Co, Ni4Mo, Ni3Al, and Ni3Fe is given.
Data Mining for Financial Applications
NASA Astrophysics Data System (ADS)
Kovalerchuk, Boris; Vityaev, Evgenii
This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
Liver Masses: What Physicians Need to Know About Ordering and Interpreting Liver Imaging.
Sheybani, Arman; Gaba, Ron C; Lokken, R Peter; Berggruen, Senta M; Mar, Winnie A
2017-10-18
This paper reviews diagnostic imaging techniques used to characterize liver masses and the imaging characteristics of the most common liver masses. The role of recently adopted ultrasound and magnetic resonance imaging contrast agents will be emphasized. Contrast-enhanced ultrasound is an inexpensive exam which can confirm benignity of certain liver masses without ionizing radiation. Magnetic resonance imaging using hepatocyte-specific gadolinium-based contrast agents can help confirm or narrow the differential diagnosis of liver masses.
Utility of correlation techniques in gravity and magnetic interpretation
NASA Technical Reports Server (NTRS)
Chandler, V. W.; Koski, J. S.; Braile, L. W.; Hinze, W. J.
1977-01-01
Two methods of quantitative combined analysis, internal correspondence and clustering, are presented. Model studies are used to illustrate implementation and interpretation procedures for these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province-size zones, validating long-wavelength anomalies, and isolating geomagnetic field removal problems. Thus, these techniques are useful in considering regional data acquired by satellites.
NASA Technical Reports Server (NTRS)
Driscoll, R. S.; Francis, R. E.
1970-01-01
A description of space and supporting aircraft photography for the interpretation and analyses of non-forest (shrubby and herbaceous) native vegetation is presented. The research includes the development of a multiple sampling technique to assign quantitative area values of specific plant community types included within an assigned space photograph map unit. Also, investigations of aerial film type, scale, and season of photography for identification and quantity measures of shrubby and herbaceous vegetation were conducted. Some work was done to develop automated interpretation techniques with film image density measurement devices.
Semantic wireless localization of WiFi terminals in smart buildings
NASA Astrophysics Data System (ADS)
Ahmadi, H.; Polo, A.; Moriyama, T.; Salucci, M.; Viani, F.
2016-06-01
The wireless localization of mobile terminals in indoor scenarios by means of a semantic interpretation of the environment is addressed in this work. A training-less approach based on the real-time calibration of a simple path loss model is proposed, which combines (i) the received signal strength information measured by the wireless terminal and (ii) the topological features of the localization domain. A customized evolutionary optimization technique has been designed to estimate the optimal target position that fits both the complex indoor wireless propagation and the semantic target-environment relation. The proposed approach is experimentally validated in a real building area where the available WiFi network is opportunistically exploited for data collection. The presented results point out a reduction of the localization error obtained by introducing even a very simple semantic interpretation of the considered scenario.
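The evolutionary optimizer and the semantic constraints are specific to the paper; the underlying path-loss fitting step, however, can be sketched as a plain least-squares position estimate under the log-distance model RSSI(d) = P0 - 10*n*log10(d). All access-point positions, RSSI readings, and model parameters below are invented:

```python
import numpy as np
from scipy.optimize import minimize

# Known WiFi access-point positions (m) and RSSI measured by the terminal (dBm).
aps  = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 15.0], [20.0, 15.0]])
rssi = np.array([-55.0, -68.0, -62.0, -73.0])

P0, n = -40.0, 2.5   # assumed path-loss parameters at a 1 m reference distance

def model_rssi(pos):
    d = np.maximum(np.linalg.norm(aps - pos, axis=1), 0.1)
    return P0 - 10.0 * n * np.log10(d)

def misfit(pos):
    # The paper calibrates the model online and adds semantic constraints;
    # here the model is fixed and only position is estimated.
    return np.sum((model_rssi(pos) - rssi) ** 2)

est = minimize(misfit, x0=np.array([10.0, 7.5]), method="Nelder-Mead")
print("estimated position (m):", est.x.round(2))
```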
Abraham, Jared D.; Bedrosian, Paul A.; Asch, Theodore H.; Ball, Lyndsay B.; Cannia, James C.; Phillips, Jeffery D.; Lackey, Susan
2012-01-01
Surface audio-magnetotelluric and time-domain electromagnetic methods achieved sufficient depth of penetration and indicated that the paleochannel was much more complex than the original geological model. Simulated and observed gravity anomalies indicate that imaging sand and gravel aquifers near Oakland, Nebraska, would be difficult due to the complex basement density contrasts. Interpretation of the magnetic data indicates no magnetic sources from geologic units above the bedrock surface. Based upon the analysis and interpretation of the four methods evaluated, we suggest a large-scale survey using a high-powered time-domain airborne system. This is the most efficient and cost-effective path forward for the Eastern Nebraska Water Assessment group to map paleochannels that lie beneath thick clay-rich glacial tills.
Bello, Ajediran I; Ofori, Eric K; Alabi, Oluwasegun J; Adjei, David N
2014-03-29
Objective physical assessment of patients with lumbar spondylosis involves plain film radiograph (PFR) viewing and interpretation by radiologists. Physiotherapists also routinely assess PFR within the scope of their practice. However, studies appraising the level of agreement of physiotherapists' PFR interpretation with radiologists' are not common in Ghana. Forty-one (41) physiotherapists took part in this cross-sectional survey. An assessment guide was developed from the findings of a radiologist's interpretation of three PFR of patients with lumbar spondylosis. The three PFR were selected from a pool of radiographs based on clarity, common visible pathological features, coverage of body segments, and short post-production period. The physiotherapists were required to view the same PFR, after which they were scored with the assessment guide according to the number of features identified correctly or incorrectly. The score range on the assessment form was 0-24, interpreted as follows: 0-8 points (low), 9-16 points (moderate), and 17-24 points (high) levels of agreement. Data were analyzed using the one-sample t-test and Fisher's exact test at α = 0.05. The mean interpretation score of the physiotherapists was 12.7 ± 2.6 points, compared with the radiologist's reference of 24 points (assessment guide). The physiotherapists' levels of agreement were found to be significantly associated with their academic qualification (p = 0.006) and sex (p = 0.001). However, their levels of agreement were not significantly associated with their age group (p = 0.098), work settings (p = 0.171), experience (p = 0.666), preferred PFR view (p = 0.088), or continuing education (p = 0.069). The physiotherapists' skills fall short of expectation for interpreting PFR of patients with lumbar spondylosis, and their levels of agreement with the radiologist's interpretation show no link with years of clinical practice, age, work settings, or continuing education. Thus, routine PFR viewing techniques should be made a priority in physiotherapists' continuing professional education.
NASA Astrophysics Data System (ADS)
Cheyney, S.; Fishwick, S.; Hill, I. A.; Linford, N. T.
2015-08-01
Despite the development of advanced processing and interpretation tools for magnetic data sets in the fields of mineral and hydrocarbon industries, these methods have not achieved similar levels of adoption for archaeological or very near surface surveys. Using a synthetic data set we demonstrate that certain methodologies and assumptions used to successfully invert more regional-scale data can lead to large discrepancies between the true and recovered depths when applied to archaeological-type anomalies. We propose variations to the current approach, analysing the choice of the depth-weighting function, mesh design and parameter constraints, to develop an appropriate technique for the 3-D inversion of archaeological-scale data sets. The results show a successful recovery of a synthetic scenario, as well as a case study of a Romano-Celtic temple in the UK. For the case study, the final susceptibility model is compared with two coincident ground penetrating radar surveys, showing a high correlation with the comparative depth slices. The new approach takes interpretation of archaeological data sets beyond a simple 2-D visual interpretation based on pattern recognition.
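One of the inversion ingredients the paper revisits is the depth-weighting function; in the commonly used Li-Oldenburg form it looks as follows (parameter values here are illustrative, and re-tuning them for sub-metre archaeological depths is precisely the adjustment at issue):

```python
import numpy as np

def depth_weighting(z, z0=0.1, beta=3.0):
    """Li-Oldenburg-style depth weighting w(z) = (z + z0)^(-beta/2),
    which counteracts the natural decay of magnetic kernels so that
    recovered susceptibility is not forced to the surface. The choices
    of beta and z0 shown here are illustrative only."""
    return (z + z0) ** (-beta / 2.0)

depths = np.array([0.25, 0.5, 1.0, 2.0])   # cell depths in metres
print(depth_weighting(depths).round(3))
```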
NASA Astrophysics Data System (ADS)
Salvini, F.; Arragoni, S.; Cianfarra, P.; Maggi, M.
2017-10-01
The comment by Berra et al. (2017) on the evidence of Alpine tectonics in Eastern Sardinia proposed by Arragoni et al. (2016) is based on the sedimentological interpretations of few local outcrops in a marginal portion of the study area. The Cenozoic Alpine fold-and-thrust setting, which characterizes this region, presents flat-over-flat shear planes acting along originally stratigraphic contacts, where stratigraphic continuity is obviously maintained. The ramp sectors present steeply dipping bedding attitudes, and there is no need to invoke and to force prograding clinoforms with unrealistic angles to justify them. The balanced geological cross section proposed by Arragoni et al. (2016) is fully supported by robust newly collected structural data and is compatible with the overall tectonic setting, while the interpretation proposed by Berra et al. (2017) lacks a detailed structural investigation. We believe that the partial application of the techniques available to modern geology may lead to incorrect interpretations, thus representing an obstacle for the progress of knowledge in the Earth sciences.
Hramov, Alexander E.; Maksimenko, Vladimir A.; Pchelintseva, Svetlana V.; Runnova, Anastasiya E.; Grubov, Vadim V.; Musatov, Vyacheslav Yu.; Zhuravlev, Maksim O.; Koronovskii, Alexey A.; Pisarchik, Alexander N.
2017-01-01
In order to classify different human brain states related to visual perception of ambiguous images, we use an artificial neural network (ANN) to analyze multichannel EEG. The classifier, built on the basis of a multilayer perceptron, achieves up to 95% accuracy in classifying EEG patterns corresponding to two different interpretations of the Necker cube. An important feature of our classifier is that, trained on one subject, it can be used for the classification of the EEG traces of other subjects. This result suggests the existence of common features in the EEG structure associated with distinct interpretations of bistable objects. We firmly believe that the significance of our results is not limited to visual perception of Necker cube images; the proposed experimental approach and the developed computational technique based on ANNs can also be applied to study and classify different brain states using neurophysiological data recordings. This may give new directions for future research in the field of cognitive and pathological brain activity, and for the development of brain-computer interfaces. PMID:29255403
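A sketch of the classification stage with a scikit-learn multilayer perceptron; the feature layout (channels times spectral bands) and all data here are stand-ins, so the printed accuracy sits at chance level rather than the 95% the paper reports on real EEG:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Stand-in data: EEG feature vectors (e.g., band powers per channel)
# labelled with one of two Necker-cube interpretations.
rng = np.random.default_rng(42)
X = rng.normal(size=(400, 31 * 4))   # e.g., 31 channels x 4 spectral bands
y = rng.integers(0, 2, size=400)     # interpretation A or B

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```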
Lintott, Paul R; Davison, Sophie; van Breda, John; Kubasiewicz, Laura; Dowse, David; Daisley, Jonathan; Haddy, Emily; Mathews, Fiona
2018-01-01
Acoustic surveys of bats are one of the techniques most commonly used by ecological practitioners. The results are used in Ecological Impact Assessments to assess the likely impacts of future developments on species that are widely protected in law, and to monitor developments' postconstruction. However, there is no standardized methodology for analyzing or interpreting these data, which can make the assessment of the ecological value of a site very subjective. Comparisons of sites and projects are therefore difficult for ecologists and decision-makers, for example, when trying to identify the best location for a new road based on relative bat activity levels along alternative routes. Here, we present a new web-based, data-driven tool, Ecobat, which addresses the need for a more robust way of interpreting ecological data. Ecobat offers users an easy, standardized, and objective method for analyzing bat activity data. It allows ecological practitioners to compare bat activity data at regional and national scales and to generate a numerical indicator of the relative importance of a night's worth of bat activity. The tool is free and open-source; because the underlying algorithms are already developed, it could easily be expanded to new geographical regions and species. Data donation is required to ensure the robustness of the analyses; we use a positive feedback mechanism to encourage ecological practitioners to share data by providing in return high quality, contextualized data analysis, and graphical visualizations for direct use in ecological reports.
NGS-based likelihood ratio for identifying contributors in two- and three-person DNA mixtures.
Chan Mun Wei, Joshua; Zhao, Zicheng; Li, Shuai Cheng; Ng, Yen Kaow
2018-06-01
DNA fingerprinting, also known as DNA profiling, serves as a standard procedure in forensics to identify a person by the short tandem repeat (STR) loci in their DNA. By comparing the STR loci between DNA samples, practitioners can calculate a probability of a match to identify the contributors of a DNA mixture. Most existing methods are based on the 13 core STR loci identified by the Federal Bureau of Investigation (FBI). Forensic analyses of DNA mixtures based on these loci are highly variable in procedure and suffer from subjectivity as well as bias in complex mixture interpretation. With the emergence of next-generation sequencing (NGS) technologies, the sequencing of billions of DNA molecules can be parallelized, greatly increasing throughput and reducing the associated costs. This allows the creation of new techniques that incorporate more loci to enable complex mixture interpretation. In this paper, we propose a likelihood-ratio computation that uses NGS data for DNA testing on mixed samples. We have applied the method to 4480 simulated DNA mixtures, consisting of various mixture proportions of 8 unrelated whole-genome sequencing data sets. The results confirm the feasibility of utilizing NGS data in DNA mixture interpretation. We observed an average likelihood ratio as high as 285,978 for two-person mixtures. Using our method, all 224 identity tests for two-person and three-person mixtures were correctly identified. Copyright © 2018 Elsevier Ltd. All rights reserved.
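The paper's NGS mixture model is not reproduced here; as background, the textbook single-source, single-locus likelihood ratio under Hardy-Weinberg equilibrium shows the quantity being generalized (the allele frequencies below are invented):

```python
def single_locus_lr(p_a, p_b):
    """Textbook likelihood ratio for a single-source stain whose
    heterozygous genotype A/B matches the suspect's, assuming
    Hardy-Weinberg equilibrium: LR = 1 / (2 * pA * pB). This is the
    classical STR case, far simpler than the paper's mixture model."""
    random_match_prob = 2.0 * p_a * p_b
    return 1.0 / random_match_prob

# Independent loci combine by multiplying per-locus LRs.
loci = [(0.11, 0.07), (0.20, 0.05), (0.09, 0.13)]
lr = 1.0
for p_a, p_b in loci:
    lr *= single_locus_lr(p_a, p_b)
print(f"combined LR ~ {lr:,.0f}")
```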
Paillet, Frederick L.; Crowder, R.E.
1996-01-01
Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) the continuous depth scale of the measurements along the well bore; (2) the in situ measurement of lithologic properties and their correlation with the hydraulic properties of the formations over a finite sample volume; and (3) multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach decides whether quantitative inversion is statistically warranted by formulating an over-determined inversion; if no such inversion is consistent with the inversion model, quantitative inversion is judged not to be possible with the given data set. Additional statistical criteria, such as the statistical significance of regressions, are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
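The over-determined inversion the authors describe can be sketched as a small linear least-squares problem: each log reading is modeled as a mixture of end-member responses, and a closure condition (fractions summing to one) makes the system over-determined. All response values below are illustrative, not calibrated:

```python
import numpy as np

# Toy petrophysical inversion: each log reading is a linear mix of
# end-member responses (rows of R), weighted by the volume fractions
# of quartz, clay, and water-filled porosity (the unknowns).
R = np.array([[2.65, 2.45, 1.00],     # density log (g/cc)
              [55.0, 90.0, 189.0],    # sonic log (us/ft)
              [0.02, 0.25, 1.00],     # neutron porosity (frac)
              [1.00, 1.00, 1.00]])    # closure: fractions sum to 1
d = np.array([2.35, 85.0, 0.21, 1.00])  # readings at one depth level

# Four equations, three unknowns: an over-determined least-squares fit.
frac, *_ = np.linalg.lstsq(R, d, rcond=None)
print("quartz, clay, porosity fractions:", frac.round(3))
```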
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Adam; Piquette, Kathryn E.; Bergmann, Uwe
2018-02-26
Ancient Egyptian mummies were often covered with an outer casing, panels and masks made from cartonnage: a lightweight material made from linen, plaster, and recycled papyrus held together with adhesive. Egyptologists, papyrologists, and historians aim to recover and read extant text on the papyrus contained within cartonnage layers, but some methods, such as dissolving mummy casings, are destructive. The use of an advanced range of different imaging modalities was investigated to test the feasibility of non-destructive approaches applied to multi-layered papyrus found in ancient Egyptian mummy cartonnage. Eight different techniques were compared by imaging four synthetic phantoms designed to provide robust, well-understood, yet relevant sample standards using modern papyrus and replica inks. The techniques include optical (multispectral imaging with reflection and transillumination, and optical coherence tomography), X-ray (X-ray fluorescence imaging, X-ray fluorescence spectroscopy, X-ray micro computed tomography and phase contrast X-ray) and terahertz-based approaches. Optical imaging techniques were able to detect inks on all four phantoms, but were unable to significantly penetrate papyrus. X-ray-based techniques were sensitive to iron-based inks with excellent penetration but were not able to detect carbon-based inks. However, using terahertz imaging, it was possible to detect carbon-based inks with good penetration but with less sensitivity to iron-based inks. The phantoms allowed reliable and repeatable tests to be made at multiple sites on three continents. Finally, the tests demonstrated that each imaging modality needs to be optimised for this particular application: it is, in general, not sufficient to repurpose an existing device without modification. Furthermore, it is likely that no single imaging technique will be able to robustly detect and enable the reading of text within ancient Egyptian mummy cartonnage. However, by carefully selecting, optimising and combining techniques, text contained within these fragile and rare artefacts may eventually be open to non-destructive imaging, identification, and interpretation.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1991-01-01
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
A Face Attention Technique for a Robot Able to Interpret Facial Expressions
NASA Astrophysics Data System (ADS)
Simplício, Carlos; Prado, José; Dias, Jorge
Automatic recognition of facial expressions using vision is an important subject in human-robot interaction. Here we propose a human-face focus-of-attention technique and a facial expression classifier (a Dynamic Bayesian Network) to incorporate into an autonomous mobile agent whose hardware comprises a robotic platform and a robotic head. The focus-of-attention technique is based on the symmetry presented by human faces. Using the output of this module, the autonomous agent always keeps the human face targeted frontally; to accomplish this, the robot platform performs an arc centered on the human while the robotic head, when necessary, moves in synchrony. In the proposed probabilistic classifier, information is propagated from the previous instant, in a lower level of the network, to the current instant. Moreover, not only positive but also negative evidence is used to recognize facial expressions.
Tomographic reconstruction of tokamak plasma light emission using wavelet-vaguelette decomposition
NASA Astrophysics Data System (ADS)
Schneider, Kai; Nguyen van Yen, Romain; Fedorczak, Nicolas; Brochard, Frederic; Bonhomme, Gerard; Farge, Marie; Monier-Garbet, Pascale
2012-10-01
Images acquired by cameras installed in tokamaks are difficult to interpret because the three-dimensional structure of the plasma is flattened in a non-trivial way. Nevertheless, taking advantage of the slow variation of the fluctuations along magnetic field lines, the optical transformation may be approximated by a generalized Abel transform, for which we proposed in Nguyen van yen et al., Nucl. Fus., 52 (2012) 013005, an inversion technique based on the wavelet-vaguelette decomposition. After validation of the new method using an academic test case and numerical data obtained with the Tokam 2D code, we present an application to an experimental movie obtained in the tokamak Tore Supra. A comparison with a classical regularization technique for ill-posed inverse problems, the singular value decomposition, allows us to assess the efficiency. The superiority of the wavelet-vaguelette technique is reflected in preserving local features, such as blobs and fronts, in the denoised emissivity map.
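The singular value decomposition used as the benchmark regularizer is easy to sketch: for a discretized ill-posed operator, truncating the small singular values before inversion suppresses noise amplification, at the cost of smoothing local features such as blobs and fronts (the operator and data below are synthetic, not the Abel transform itself):

```python
import numpy as np

def tsvd_inverse(A, b, k):
    # Truncated-SVD regularized solution of A x = b: keep only the
    # k largest singular values, discarding noise-amplifying modes.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

# Synthetic ill-posed problem: a cumulative smoothing operator.
n = 100
x_true = np.zeros(n); x_true[40:60] = 1.0        # a "blob" to recover
A = np.tril(np.ones((n, n))) / n                 # smoothing forward operator
b = A @ x_true + 1e-3 * np.random.default_rng(0).normal(size=n)

x_rec = tsvd_inverse(A, b, k=25)
print("relative recovery error:",
      np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```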
NASA Astrophysics Data System (ADS)
Nguyen van yen, R.; Fedorczak, N.; Brochard, F.; Bonhomme, G.; Schneider, K.; Farge, M.; Monier-Garbet, P.
2012-01-01
Images acquired by cameras installed in tokamaks are difficult to interpret because the three-dimensional structure of the plasma is flattened in a non-trivial way. Nevertheless, taking advantage of the slow variation of the fluctuations along magnetic field lines, the optical transformation may be approximated by a generalized Abel transform, for which we propose an inversion technique based on the wavelet-vaguelette decomposition. After validation of the new method using an academic test case and numerical data obtained with the Tokam 2D code, we present an application to an experimental movie obtained in the tokamak Tore Supra. A comparison with a classical regularization technique for ill-posed inverse problems, the singular value decomposition, allows us to assess the efficiency. The superiority of the wavelet-vaguelette technique is reflected in preserving local features, such as blobs and fronts, in the denoised emissivity map.
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major elements (sodium, potassium, calcium, magnesium), minor elements (iron, copper, zinc, manganese), and one heavy metal (lead) in Cavendish banana flour and Dream banana flour were determined, and the data were analyzed using the multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium contributed strongly to discriminating the two types of banana flour, affording 100% correct assignment. This study demonstrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex mineral content data from banana flours of different varieties.
Combined magnetic and gravity analysis
NASA Technical Reports Server (NTRS)
Hinze, W. J.; Braile, L. W.; Chandler, V. W.; Mazella, F. E.
1975-01-01
Efforts are made to identify methods of decreasing magnetic interpretation ambiguity by combined gravity and magnetic analysis, to evaluate these techniques in a preliminary manner, to consider the geologic and geophysical implications of correlation, and to recommend a course of action for evaluating methods of correlating gravity and magnetic anomalies. The major thrust of the study was a search and review of the literature. The literature of geophysics, geology, geography, and statistics was searched for articles dealing with spatial correlation of independent variables, and an annotated bibliography referencing the germane articles and books is presented. The methods of combined gravity and magnetic analysis are identified and reviewed, and a more comprehensive evaluation of two types of techniques is presented: internal correspondence of anomaly amplitudes is examined, and a combined analysis is performed using Poisson's theorem. The geologic and geophysical implications of gravity and magnetic correlation, based on both theoretical and empirical relationships, are discussed.
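For reference, the Poisson-theorem technique rests on the classical relation between the magnetic potential V and the gravitational potential U of a body with uniform magnetization M and uniform density ρ; stated here from standard potential theory, with the caveat that sign and unit constants differ among texts:

```latex
% Poisson's relation (convention-dependent form; gamma is the
% gravitational constant, \hat{m} the magnetization direction):
V \;=\; -\,\frac{M}{\gamma\,\rho}\,\frac{\partial U}{\partial \hat{m}}
```

Because the magnetic potential is proportional to a directional derivative of the gravitational potential along the magnetization direction, amplitude correlation between the two anomaly fields has a direct physical basis.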
NASA Technical Reports Server (NTRS)
Jameson, Arthur R.
1997-01-01
The effort involved three elements all related to the measurement of rain and clouds using microwaves: (1) Examine recently proposed techniques for measuring rainfall rate and rain water content using data from ground-based radars and the TRMM microwave link in order to develop improved ground validation and radar calibration techniques; (2) Develop dual-polarization, multiple frequency radar techniques for estimating rain water content and cloud water content to interpret the vertical profiles of radar reflectivity factors (Z) measured by the TRMM Precipitation Radar; and (3) Investigate theoretically and experimentally the potential biases in TRMM Z measurements due to spatial inhomogeneities in precipitation. The research succeeded in addressing all of these topics, resulting in several refereed publications. In addition, the research indicated that the effects of non-Rayleigh statistics resulting from the nature of the precipitation inhomogeneities will probably not result in serious errors for the TRMM radar measurements, but the TRMM radiometers may be subject to significant bias due to the inhomogeneities.
Nonlinear Acoustic and Ultrasonic NDT of Aeronautical Components
NASA Astrophysics Data System (ADS)
Van Den Abeele, Koen; Katkowski, Tomasz; Mattei, Christophe
2006-05-01
In response to the demand for innovative microdamage inspection systems with high sensitivity and unambiguous accuracy, we are currently investigating the use and robustness of several acoustic and ultrasonic NDT techniques based on Nonlinear Elastic Wave Spectroscopy (NEWS) for the characterization of microdamage in aeronautical components. In this report, we illustrate the results of an amplitude-dependent analysis of the resonance behaviour, both in the time domain (signal reverberation) and in the frequency domain (sweep). The technique is applied to intact and damaged samples of carbon fiber reinforced plastic (CFRP) composites after thermal loading or mechanical fatigue. Compared with linear characteristics, the nonlinear signatures show a considerable gain in sensitivity and allow an incontestable interpretation of the results. For highly fatigued samples, slow dynamical effects are observed.
A Quantitative Technique for Beginning Microscopists.
ERIC Educational Resources Information Center
Sundberg, Marshall D.
1984-01-01
Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)
Integration of geological remote-sensing techniques in subsurface analysis
Taranik, James V.; Trautwein, Charles M.
1976-01-01
Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, and (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by the development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.
Dobos, Gustav; Overhamm, Tatiana; Büssing, Arndt; Ostermann, Thomas; Langhorst, Jost; Kümmel, Sherko; Paul, Anna; Cramer, Holger
2015-10-01
The aim of this study was to investigate the effects of a mindfulness-based day care clinic group program for cancer survivors on health-related quality of life and mental health; and to investigate which psychological variables are associated with changes in health variables. One hundred seventeen cancer survivors (91.0 % female; mean age 53.9 ± 10.7 years; 65.0 % breast cancer; mean time since diagnosis 27.2 ± 46.5 months) participated in an 11-week mindfulness-based day care clinic group program, 6 h per week. The intervention incorporated mindfulness-based meditation, yoga, cognitive-behavioral techniques, and lifestyle modification. Outcome measures including health-related quality of life (EORTC QLQ-C30), depression and anxiety (HADS); and psychological variables including life satisfaction (BMLSS), mindfulness (FMI), adaptive coping styles (AKU), spiritual/religious attitudes in dealing with illness (SpREUK), and interpretation of illness (IIQ) were assessed before, after, and 3 months after the intervention. Using mixed linear models, significant improvements in global health status, physical functioning, role functioning, emotional functioning, cognitive functioning, and social functioning were found. Cancer-related symptoms, including fatigue, pain, insomnia, constipation, anxiety, and depression, also improved significantly. Mindfulness, life satisfaction, health satisfaction, all coping styles, all spiritual/religious attitudes, and interpretation of illness as something of value increased; interpretation of illness as punishment decreased significantly (all p < 0.05). Improved outcomes were associated with increases in psychological variables, mainly life satisfaction, health satisfaction, and trust in medical help (R² = 7.3-43.6 %). Supportive mindfulness-based interventions can be considered as an effective means to improve cancer survivors' physical and mental health. Functional improvements are associated with improved satisfaction and coping styles.
Mapping ecological states in a complex environment
NASA Astrophysics Data System (ADS)
Steele, C. M.; Bestelmeyer, B.; Burkett, L. M.; Ayers, E.; Romig, K.; Slaughter, A.
2013-12-01
The vegetation of northern Chihuahuan Desert rangelands is sparse and heterogeneous and, for most of the year, consists largely of non-photosynthetic material. The soils in this area are spectrally bright and variable in their reflectance properties. Both factors provide challenges to the application of remote sensing for estimating canopy variables (e.g., leaf area index, biomass, percentage canopy cover, primary production). Additionally, with reference to current paradigms of rangeland health assessment, remotely-sensed estimates of canopy variables have limited practical use to the rangeland manager if they are not placed in the context of ecological site and ecological state. To address these challenges, we created a multifactor classification system based on the USDA-NRCS ecological site schema and associated state-and-transition models to map ecological states on desert rangelands in southern New Mexico. Applying this system using per-pixel image processing techniques and multispectral, remotely sensed imagery raised other challenges. Per-pixel image classification relies upon the spectral information in each pixel alone; it makes no reference to the spatial context of the pixel and its relationship with its neighbors. Ecological state classes may have direct relevance to managers, but the non-unique spectral properties of different ecological state classes in our study area mean that per-pixel classification of multispectral data performs poorly in discriminating between different ecological states. We found that image interpreters who are familiar with the landscape and its associated ecological site descriptions perform better than per-pixel classification techniques in assigning ecological states. However, two important issues affect manual classification methods: subjectivity of interpretation and reproducibility of results. An alternative to per-pixel classification and manual interpretation is object-based image analysis, which provides a platform for classification that more closely resembles human recognition of objects within a remotely sensed image. The analysis presented here compares multiple thematic maps created for test locations on the USDA-ARS Jornada Experimental Range. Three study sites in different pastures, each 300 ha in size, were selected for comparison on the basis of their ecological site type ('Clayey', 'Sandy' and a combination of both) and the degree of complexity of vegetation cover. Thematic maps were produced for each study site using (i) manual interpretation of digital aerial photography (by five independent interpreters); (ii) object-oriented, decision-tree classification of fine and moderate spatial resolution imagery (Quickbird; Landsat Thematic Mapper); and (iii) ground survey. To identify areas of uncertainty, we compared agreement in location, areal extent and class assignation among the five independently produced, manually-digitized ecological state maps and against the map created from ground survey. The location, areal extent and class assignation of the map produced by object-oriented classification were also assessed with reference to the ground survey map.
NASA Technical Reports Server (NTRS)
Erb, R. B.
1974-01-01
The Coastal Analysis Team of the Johnson Space Center conducted a 1-year investigation of ERTS-1 MSS data to determine its usefulness in coastal zone management. Galveston Bay, Texas, was the study area for evaluating both conventional image interpretation and computer-aided techniques. There was limited success in detecting, identifying, and measuring the areal extent of water bodies, turbidity zones, phytoplankton blooms, salt marshes, grasslands, swamps, and low wetlands using image interpretation techniques. Computer-aided techniques were generally successful in identifying these features; areal measurement accuracies for salt marshes ranged from 89 to 99 percent. Overall classification accuracy across all study sites was 89 percent for Level 1 and 75 percent for Level 2.
Manin, Alex N; Voronin, Alexander P; Drozd, Ksenia V; Manin, Nikolay G; Bauer-Brandl, Annette; Perlovich, German L
2014-12-18
The main problem occurring at the early stages of cocrystal search is the choice of an effective screening technique. Among the most popular techniques for obtaining cocrystals are crystallization from solution, crystallization from melt and solvent-drop grinding. This paper presents a comparative analysis of the following screening techniques: the DSC cocrystal screening method, thermal microscopy and the saturation temperature method. The efficiency of the different cocrystal screening techniques was assessed in 18 systems. Benzamide and benzoic acid derivatives were chosen as model systems due to their ability to form the acid-amide supramolecular heterosynthon. The screening confirmed the formation of 6 new cocrystals. Screening by the saturation temperature method has the highest screen-out rate but the smallest range of application. DSC screening has satisfactory accuracy and allows screening over a short time. Thermal microscopy is most efficient as an additional technique used to interpret ambiguous DSC screening results. The study also included an analysis of the influence of solvent type and component solubility on cocrystal formation.
Spectroscopic and Statistical Techniques for Information Recovery in Metabonomics and Metabolomics
NASA Astrophysics Data System (ADS)
Lindon, John C.; Nicholson, Jeremy K.
2008-07-01
Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.
NASA Astrophysics Data System (ADS)
Hammud, Hassan H.; Ghannoum, Amer; Masoud, Mamdouh S.
2006-02-01
Sixteen Schiff bases obtained from the condensation of benzaldehyde or salicylaldehyde with various amines (aniline, 4-carboxyaniline, phenylhydrazine, 2,4-dinitrophenylhydrazine, ethylenediamine, hydrazine, o-phenylenediamine and 2,6-pyridinediamine) are studied with UV-vis spectroscopy to observe the effect of solvents, substituents and other structural factors on the spectra. The bands involving different electronic transitions are interpreted. Computerized analysis and multiple regression techniques were applied to calculate the regression and correlation coefficients based on the equation that relates peak position λmax to the solvent parameters that depend on the H-bonding ability, refractive index and dielectric constant of solvents.
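The multiple-regression step can be pictured as an ordinary least-squares fit of peak position against solvent functions built from the refractive index and dielectric constant; the solvents, parameter functions, and wavelengths below are hypothetical illustrations, not the paper's data:

```python
# Least-squares regression of lambda_max (nm) on solvent parameter
# functions of refractive index n and dielectric constant eps.
import numpy as np

# hypothetical solvents: (n, eps, observed lambda_max in nm)
data = np.array([
    (1.333, 78.4, 268.0),   # water-like
    (1.359, 24.5, 271.5),   # ethanol-like
    (1.426, 37.5, 274.0),   # DMF-like
    (1.496, 2.3,  279.5),   # toluene-like
])
n, eps, lam = data[:, 0], data[:, 1], data[:, 2]

# common polarity / polarizability functions of the solvent
f_eps = (eps - 1) / (2 * eps + 1)        # dielectric (Kirkwood) function
f_n = (n**2 - 1) / (2 * n**2 + 1)        # refractive-index function

# fit: lambda_max = a0 + a1*f_eps + a2*f_n
A = np.column_stack([np.ones_like(lam), f_eps, f_n])
coef, *_ = np.linalg.lstsq(A, lam, rcond=None)
print("regression coefficients (a0, a1, a2):", coef.round(2))
pred = A @ coef
print("correlation coefficient R:", round(np.corrcoef(lam, pred)[0, 1], 3))
```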
ERIC Educational Resources Information Center
Bowers, Amy M.
2010-01-01
This study examined the results of a uniquely constructed literacy assessment technique, combining Dehn's (2006) interpretation of psychological processing assessment and the Reading Rockets (Greater Washington Educational Telecommunications Association, Inc., 2005) interpretation of academic achievement. The study employed a quantitative…
Executive-Attentional Uncertainty Responses by Rhesus Macaques ("Macaca mulatta")
ERIC Educational Resources Information Center
Smith, J. David; Coutinho, Mariana V. C.; Church, Barbara A.; Beran, Michael J.
2013-01-01
The uncertainty response has been influential in studies of human perception, and it is crucial in the growing research literature that explores animal metacognition. However, the uncertainty response's interpretation is still sharply debated. The authors sought to clarify this interpretation using the dissociative technique of cognitive loads…
Applications of three-dimensional modeling in electromagnetic exploration
NASA Astrophysics Data System (ADS)
Pellerin, Louise Donna
Numerical modeling is used in geophysical exploration to understand physical mechanisms of a geophysical method, compare different exploration techniques, and interpret field data. Exploring the physics of a geophysical response enhances the geophysicist's insight, resulting in better survey design and interpretation. Comparing exploration methods numerically can eliminate the use of a technique that cannot resolve the exploration target. Interpreting field data to determine the structure of the earth is the ultimate goal of the exploration geophysicist. Applications of three-dimensional (3-D) electromagnetic (EM) modeling in mining, geothermal and environmental exploration demonstrate the importance of numerical modeling as a geophysical tool. Detection of a confined, conductive target with a vertical electric source (VES) can be an effective technique if properly used. The vertical magnetic field response is due solely to multi-dimensional structures, and current channeling is the dominant mechanism. A VES is deployed in a bore hole, hence the orientation of the hole is critical to the response. A deviation of more than a degree from the vertical can result in a host response that overwhelms the target response. Only the in-phase response at low frequencies can be corrected to a purely vertical response. The geothermal system studied consists of a near-surface clay cap and a deep reservoir. The magnetotelluric (MT), controlled-source audio magnetotelluric (CSAMT), long-offset time-domain electromagnetic (LOTEM) and central-loop transient electromagnetic (TEM) methods are appraised for their ability to detect the reservoir and delineate the cap. The reservoir anomaly is supported by boundary charges and therefore is detectable only with the deep-sounding electric-field measurements (MT and LOTEM). The cap is easily delineated with all techniques. For interpretation I developed an approximate 3-D inversion that refines a 1-D interpretation by removing lateral distortions. An iterative inverse procedure invokes EM reciprocity while operating on a localized portion of the survey area, thereby greatly reducing the computational requirements. The scheme is illustrated with three synthetic data sets representative of problems in environmental geophysics.
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulating several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set, and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets, allowing an improved use of information for effective evaluation of control strategies.
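A compact sketch of such an evaluation-matrix analysis, with a random stand-in matrix in place of BSM2 simulation output (strategy count, criteria, and clustering settings are illustrative):

```python
# Hierarchical cluster analysis plus PCA on a hypothetical
# (control strategy x evaluation criterion) matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# 12 hypothetical control strategies scored on 8 criteria
# (e.g. effluent quality, aeration energy, sludge production, ...)
E = rng.normal(size=(12, 8))
E[6:] += 2.0                              # two loose groups, for illustration

Es = StandardScaler().fit_transform(E)
Z = linkage(Es, method="ward")            # cluster analysis (CA)
groups = fcluster(Z, t=2, criterion="maxclust")
print("strategy clusters:", groups)

pca = PCA(n_components=2).fit(Es)         # PCA for interpretation
print("explained variance:", pca.explained_variance_ratio_.round(2))
scores = pca.transform(Es)                # strategies in PC space
```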
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, J.; Jones, G.L.
1996-01-01
Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.
Accelerometer Data Analysis and Presentation Techniques
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy
1997-01-01
The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
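Two of the listed frequency-domain products can be sketched with standard signal-processing tools; the sampling rate and signal content below are invented, not SAMS/OARE values:

```python
# Power spectral density (Welch) and spectrogram of a synthetic
# accelerometer record, plus band-limited RMS from the PSD.
import numpy as np
from scipy import signal

fs = 250.0                                  # Hz, hypothetical sampling rate
t = np.arange(0, 60, 1 / fs)
accel = (1e-3 * np.sin(2 * np.pi * 17 * t)  # a 17 Hz structural tone
         + 1e-4 * np.random.default_rng(2).normal(size=t.size))

# power spectral density versus frequency (Welch's method)
f, Pxx = signal.welch(accel, fs=fs, nperseg=4096)

# root-mean-square acceleration over a band of interest, from the PSD
band = (f >= 10) & (f <= 25)
rms = np.sqrt(np.sum(Pxx[band]) * (f[1] - f[0]))
print(f"10-25 Hz band RMS: {rms:.2e} (same units as input)")

# power spectral density versus frequency versus time (spectrogram)
f_s, t_s, Sxx = signal.spectrogram(accel, fs=fs, nperseg=1024)
```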
Optimal control of a harmonic oscillator: Economic interpretations
NASA Astrophysics Data System (ADS)
Janová, Jitka; Hampel, David
2013-10-01
Optimal control is a popular technique for modelling and solving dynamic decision problems in economics. A standard interpretation of the criterion function and Lagrange multipliers in the profit maximization problem is well known. Using a particular example, we aim at a deeper understanding of the possible economic interpretations of further mathematical and solution features of the optimal control problem: we focus on the solution of the optimal control problem for a harmonic oscillator serving as a model for the Phillips business cycle. We discuss the economic interpretations of the arising mathematical objects with respect to the well-known reasoning for these in other problems.
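To make the setting concrete, a generic quadratic-cost statement of such a problem (an illustrative form only; the paper's exact criterion, constants, and boundary terms may differ):

```latex
% Illustrative optimal control of a damped harmonic oscillator standing in
% for the Phillips business-cycle model (constants are generic):
\min_{u}\; J[u] = \int_{0}^{T} \bigl( y(t)^{2} + c\,u(t)^{2} \bigr)\,dt
\quad \text{subject to} \quad
\ddot{y}(t) + 2\beta\,\dot{y}(t) + \omega^{2} y(t) = u(t),
\qquad y(0)=y_{0},\ \dot{y}(0)=v_{0}.
```

In the standard reading, the costate attached to the state equation acts as a shadow price of the output deviation y: the marginal change in the optimal value of J per unit perturbation of the state at time t, which is what gives the Lagrange multipliers their economic interpretation.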
Bond, R R; Kligfield, P D; Zhu, T; Finlay, D D; Drew, B; Guldenring, D; Breen, C; Clifford, G D; Wagner, G S
2015-01-01
The 12-lead electrocardiogram (ECG) is a complex set of cardiac signals that require a high degree of skill and clinical knowledge to interpret. Therefore, it is imperative to record and understand how expert readers interpret the 12-lead ECG. This short paper showcases how eye tracking technology and audio data can be fused together and visualised to gain insight into the interpretation techniques employed by an eminent ECG champion, namely Dr Rory Childers.
Ambros, P F; Ambros, I M; Brodeur, G M; Haber, M; Khan, J; Nakagawara, A; Schleiermacher, G; Speleman, F; Spitz, R; London, W B; Cohn, S L; Pearson, A D J; Maris, J M
2009-01-01
Neuroblastoma serves as a paradigm for utilising tumour genomic data for determining patient prognosis and treatment allocation. However, before the establishment of the International Neuroblastoma Risk Group (INRG) Task Force in 2004, international consensus on markers, methodology, and data interpretation did not exist, compromising the reliability of decisive genetic markers and inhibiting translational research efforts. The objectives of the INRG Biology Committee were to identify highly prognostic genetic aberrations to be included in the new INRG risk classification schema and to develop precise definitions, decisive biomarkers, and technique standardisation. The review of the INRG database (n=8800 patients) by the INRG Task Force finally enabled the identification of the most significant neuroblastoma biomarkers. In addition, the Biology Committee compared the standard operating procedures of different cooperative groups to arrive at international consensus for methodology, nomenclature, and future directions. Consensus was reached to include MYCN status, 11q23 allelic status, and ploidy in the INRG classification system on the basis of an evidence-based review of the INRG database. Standardised operating procedures for analysing these genetic factors were adopted, and criteria for proper nomenclature were developed. Neuroblastoma treatment planning is highly dependent on tumour cell genomic features, and it is likely that a comprehensive panel of DNA-based biomarkers will be used in future risk assignment algorithms applying genome-wide techniques. Consensus on methodology and interpretation is essential for uniform INRG classification and will greatly facilitate international and cooperative clinical and translational research studies. PMID:19401703
DFP: a Bioconductor package for fuzzy profile identification and gene reduction of microarray data
Glez-Peña, Daniel; Álvarez, Rodrigo; Díaz, Fernando; Fdez-Riverola, Florentino
2009-01-01
Background: Expression profiling assays done using DNA microarray technology generate enormous data sets that are not amenable to simple analysis. The greatest challenge in maximizing the use of this huge amount of data is to develop algorithms to interpret and interconnect results from different genes under different conditions. In this context, fuzzy logic can provide a systematic and unbiased way to both (i) find biologically significant insights relating to meaningful genes, thereby removing the need for expert knowledge in preliminary steps of microarray data analyses and (ii) reduce the cost and complexity of later applied machine learning techniques being able to achieve interpretable models. Results: DFP is a new Bioconductor R package that implements a method for discretizing and selecting differentially expressed genes based on the application of fuzzy logic. DFP takes advantage of fuzzy membership functions to assign linguistic labels to gene expression levels. The technique builds a reduced set of relevant genes (FP, Fuzzy Pattern) able to summarize and represent each underlying class (pathology). A last step constructs a biased set of genes (DFP, Discriminant Fuzzy Pattern) by intersecting existing fuzzy patterns in order to detect discriminative elements. In addition, the software provides new functions and visualisation tools that summarize achieved results and aid in the interpretation of differentially expressed genes from multiple microarray experiments. Conclusion: DFP integrates with other packages of the Bioconductor project, uses common data structures and is accompanied by ample documentation. It has the advantage that its parameters are highly configurable, facilitating the discovery of biologically relevant connections between sets of genes belonging to different pathologies. This information makes it possible to automatically filter irrelevant genes thereby reducing the large volume of data supplied by microarray experiments. Based on these contributions GENECBR, a successful tool for cancer diagnosis using microarray datasets, has recently been released. PMID:19178723
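The fuzzy-pattern construction can be sketched independently of the package; the following is a re-implementation of the idea as described, not the DFP package's actual API, and the thresholds and data are invented:

```python
# Fuzzy-pattern sketch: label each gene Low/Medium/High per sample,
# keep labels that dominate within a class (Fuzzy Pattern), and collect
# genes whose class patterns conflict (Discriminant Fuzzy Pattern).
import numpy as np

def linguistic_labels(X):
    """Assign 'L'/'M'/'H' per gene using tertiles across samples."""
    lo, hi = np.percentile(X, [33, 66], axis=0)
    return np.where(X <= lo, "L", np.where(X >= hi, "H", "M"))

def fuzzy_pattern(labels, threshold=0.6):
    """Per gene: the label covering >= threshold of a class's samples."""
    pattern = {}
    for g in range(labels.shape[1]):
        vals, counts = np.unique(labels[:, g], return_counts=True)
        if counts.max() / labels.shape[0] >= threshold:
            pattern[g] = vals[counts.argmax()]
    return pattern

rng = np.random.default_rng(3)
healthy = rng.normal(0, 1, (15, 50))          # 15 samples x 50 genes
disease = rng.normal(0, 1, (15, 50))
disease[:, :5] += 3.0                         # genes 0-4 are informative

L = linguistic_labels(np.vstack([healthy, disease]))
fp_h, fp_d = fuzzy_pattern(L[:15]), fuzzy_pattern(L[15:])

# discriminant pattern: genes present in both patterns with conflicting labels
dfp = {g for g in fp_h.keys() & fp_d.keys() if fp_h[g] != fp_d[g]}
print("discriminant genes:", sorted(dfp))     # should recover roughly genes 0-4
```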
Semantic Space as a Metapopulation System: Modelling the Wikipedia Information Flow Network
NASA Astrophysics Data System (ADS)
Masucci, A. Paolo; Kalampokis, Alkiviadis; Eguíluz, Víctor M.; Hernández-García, Emilio
The meaning of a word can be defined as an indefinite set of interpretants, which are other words that circumscribe the semantic content of the word they represent (Derrida 1982). In the same way each interpretant has a set of interpretants representing it and so on. Hence the indefinite chain of meaning assumes a rhizomatic shape that can be represented and analysed via the modern techniques of network theory (Dorogovtsev and Mendes 2013).
NASA Astrophysics Data System (ADS)
Walker, J. I.; Blodgett, D. L.; Suftin, I.; Kunicki, T.
2013-12-01
High-resolution data for use in environmental modeling are increasingly becoming available at broad spatial and temporal scales. Downscaled climate projections, remotely sensed landscape parameters, and land-use/land-cover projections are examples of datasets that may exceed an individual investigation's data management and analysis capacity. To allow projects on limited budgets to work with many of these datasets, the burden of working with them must be reduced. The approach being pursued at the U.S. Geological Survey Center for Integrated Data Analytics uses standard self-describing web services that allow machine-to-machine data access and manipulation. These techniques have been implemented and deployed in production-level server-based Web Processing Services that can be accessed from a web application or scripted workflow. Data publication techniques that allow machine interpretation of large collections of data have also been implemented for numerous datasets at U.S. Geological Survey data centers as well as partner agencies and academic institutions. Discovery of data services is accomplished using a method in which a machine-generated metadata record holds content, derived from the data's source web service, that is intended for human as well as machine interpretation. A distributed search application has been developed that demonstrates the utility of a decentralized search of data-owner metadata catalogs from multiple agencies. The integrated but decentralized system of metadata, data, and server-based processing capabilities will be presented. The design, utility, and value of these solutions will be illustrated with applied science examples and success stories. Datasets such as the EPA's Integrated Climate and Land Use Scenarios, USGS/NASA MODIS-derived land cover attributes, and downscaled climate projections from several sources are examples of data this system includes. These and other datasets have been published as standard, self-describing web services that provide the ability to inspect and subset the data. This presentation will demonstrate this file-to-web-service concept and how it can be used from script-based workflows or web applications.
NASA Astrophysics Data System (ADS)
Ibraheem, Ismael M.; Elawadi, Eslam A.; El-Qady, Gad M.
2018-03-01
The Wadi El Natrun area in Egypt is located west of the Nile Delta on both sides of the Cairo-Alexandria desert road, between 30°00′ and 30°40′N latitude, and 29°40′ and 30°40′E longitude. The name refers to the NW-SE trending depression located in the area, which contains lakes that produce natron salt. Although the area is promising for oil and gas exploration as well as agricultural projects, geophysical studies there have been limited to regional seismic surveys carried out by oil companies. This study presents an interpretation of airborne magnetic data to map the structural architecture and depth to basement of the study area. The interpretation was facilitated by applying different data enhancement and processing techniques, including filters (regional-residual separation), derivatives, and depth estimation using spectral analysis and Euler deconvolution. The results were refined using 2-D forward modeling along three profiles. Based on the depth estimation techniques, the estimated depth to the basement surface ranges from 2.25 km to 5.43 km, while the two-dimensional forward modeling shows basement depths ranging from 2.2 km to 4.8 km. The dominant tectonic trends in the study area at deep levels are NW (Suez trend), NNW, NE, and ENE (Syrian Arc System trend). The older ENE trend, which dominates the northwestern desert, is overprinted in the study area by relatively recent NW and NE trends, whereas the tectonic trends at shallow levels are NW, ENE, NNE (Aqaba trend), and NE. The predominant trend for both deep and shallow structures is NW. The results of this study can be used to better understand deep-seated basement structures and to support decisions on the development of agriculture, industrial areas, and oil and gas exploration in northern Egypt.
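Of the depth-estimation techniques named, the spectral-analysis approach is straightforward to sketch: for an ensemble of sources at depth h, the radially averaged power spectrum decays roughly as exp(-2hk), so depth follows from the slope of ln(power) versus angular wavenumber. A synthetic illustration (grid size, spacing, and depth are invented):

```python
# Spector-Grant style depth estimate from the radially averaged
# power spectrum of a synthetic magnetic anomaly grid.
import numpy as np

def radial_spectrum(grid, dx):
    """Radially averaged log power spectrum of a 2D anomaly grid."""
    P = np.abs(np.fft.fftshift(np.fft.fft2(grid))) ** 2
    ny, nx = grid.shape
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx)) * 2 * np.pi
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx)) * 2 * np.pi
    K = np.hypot(*np.meshgrid(kx, ky))            # angular wavenumber, rad/km
    bins = np.linspace(K[K > 0].min(), K.max() / 4, 30)
    k_mid, logP = [], []
    for a, b in zip(bins[:-1], bins[1:]):
        m = (K >= a) & (K < b)
        if m.any():
            k_mid.append(0.5 * (a + b))
            logP.append(np.log(P[m].mean()))
    return np.array(k_mid), np.array(logP)

# synthetic field whose amplitude spectrum decays as exp(-h*k)
rng = np.random.default_rng(4)
dx, h_true = 1.0, 3.0                             # km grid step, km depth
kx, ky = np.meshgrid(np.fft.fftfreq(256, dx) * 2 * np.pi,
                     np.fft.fftfreq(256, dx) * 2 * np.pi)
spec = np.exp(-h_true * np.hypot(kx, ky)) * np.exp(2j * np.pi * rng.random((256, 256)))
field = np.real(np.fft.ifft2(spec))

k, lp = radial_spectrum(field, dx)
slope = np.polyfit(k, lp, 1)[0]                   # ln P ~ -2*h*k
print(f"estimated depth: {-slope / 2:.2f} km (true {h_true} km)")
```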
Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie
2009-08-01
We present a novel knowledge-based system to automatically convert real-life engineering drawings to content-oriented high-level descriptions. The proposed method essentially turns the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
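The descriptor idea can be sketched as a tiny rule tree interpreted depth-first; the object and rule names below are invented examples, not the paper's descriptor set:

```python
# Minimal sketch of an EBNF-like rule tree for engineering objects,
# interpreted by depth-first search over recognized primitive shapes.
GRAMMAR = {
    "drawing":     ["title_block", "part"],
    "title_block": ["frame", "text"],
    "part":        ["contour", "dimension"],
    "dimension":   ["arrow", "text", "arrow"],
}

def interpret(symbol, primitives, depth=0):
    """Depth-first match of `symbol` against available primitives."""
    print("  " * depth + symbol)
    if symbol not in GRAMMAR:                 # leaf: must be a primitive
        return symbol in primitives
    return all(interpret(s, primitives, depth + 1) for s in GRAMMAR[symbol])

found = {"frame", "text", "contour", "arrow"}  # output of low-level vision
print("drawing recognized:", interpret("drawing", found))
```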
Probabilistic Magnetotelluric Inversion with Adaptive Regularisation Using the No-U-Turns Sampler
NASA Astrophysics Data System (ADS)
Conway, Dennis; Simpson, Janelle; Didana, Yohannes; Rugari, Joseph; Heinson, Graham
2018-04-01
We present the first inversion of magnetotelluric (MT) data using a Hamiltonian Monte Carlo algorithm. The inversion of MT data is an underdetermined problem which leads to an ensemble of feasible models for a given dataset. A standard approach in MT inversion is to perform a deterministic search for the single solution which is maximally smooth for a given data-fit threshold. An alternative approach is to use Markov Chain Monte Carlo (MCMC) methods, which have been used in MT inversion to explore the entire solution space and produce a suite of likely models. This approach has the advantage of assigning confidence to resistivity models, leading to better geological interpretations. Recent advances in MCMC techniques include the No-U-Turns Sampler (NUTS), an efficient and rapidly converging method which is based on Hamiltonian Monte Carlo. We have implemented a 1D MT inversion which uses the NUTS algorithm. Our model includes a fixed number of layers of variable thickness and resistivity, as well as probabilistic smoothing constraints which allow sharp and smooth transitions. We present the results of a synthetic study and show the accuracy of the technique, as well as the fast convergence, independence of starting models, and sampling efficiency. Finally, we test our technique on MT data collected from a site in Boulia, Queensland, Australia to show its utility in geological interpretation and ability to provide probabilistic estimates of features such as depth to basement.
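The layered forward model at the heart of such an inversion is the standard 1D MT impedance recursion; a sketch follows (layer values are arbitrary, and the probabilistic NUTS machinery itself is not reproduced here):

```python
# Standard 1D magnetotelluric forward model: recursive impedance of a
# layered half-space, returning apparent resistivity and phase.
import numpy as np

def mt1d(rho, h, freqs):
    """rho: layer resistivities (ohm*m), length n;
    h: thicknesses of the n-1 upper layers (m)."""
    mu0 = 4e-7 * np.pi
    rho_a, phase = [], []
    for f in freqs:
        w = 2 * np.pi * f
        Z = np.sqrt(1j * w * mu0 * rho[-1])        # bottom half-space
        for j in range(len(h) - 1, -1, -1):
            zj = np.sqrt(1j * w * mu0 * rho[j])    # intrinsic impedance
            kj = zj / rho[j]                       # propagation constant
            t = np.tanh(kj * h[j])
            Z = zj * (Z + zj * t) / (zj + Z * t)   # recurse upward
        rho_a.append(abs(Z) ** 2 / (w * mu0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

freqs = np.logspace(-3, 2, 40)                     # Hz
rho_a, ph = mt1d(rho=[100.0, 10.0, 1000.0], h=[500.0, 2000.0], freqs=freqs)
print(rho_a[:3].round(1), ph[:3].round(1))
```

In an MCMC setting, this forward model is evaluated inside the likelihood at each proposed set of layer resistivities and thicknesses; NUTS then explores that posterior efficiently using its gradients.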
Measurements of Solar Vector Magnetic Fields
NASA Technical Reports Server (NTRS)
Hagyard, M. J. (Editor)
1985-01-01
Various aspects of the measurement of solar magnetic fields are presented. The four major subdivisions of the study are: (1) theoretical understanding of solar vector magnetic fields; (3) techniques for interpretation of observational data; and (4) techniques for data display.
A technique for extracting Radiolaria from radiolarian cherts.
NASA Technical Reports Server (NTRS)
Pessagno, E. A., Jr.; Newport, R. L.
1972-01-01
Differential solution of Mesozoic radiolarian cherts with hydrofluoric acid has yielded well-preserved, matrix-free Radiolaria. This technique allows the full utilization of Radiolaria in interpreting the stratigraphy of ophiolite sequences and of other successions where cherts are prevalent.
NASA Astrophysics Data System (ADS)
Haryono, E.; Widartono, B. S.; Lukito, H.; Kusumayuda, S. B.
2016-11-01
This paper aims at exploring the interpretability of the panchromatic band of Landsat Enhanced Thematic Mapper Plus (ETM+), compared to panchromatic aerial photographs, for lineament and fracture-trace extraction. Special attention is given to karst terrain, where lineaments and fracture traces are expressed as aligned valleys and closed depressions. The study area is a single aerial photographic coverage in the Gunungsewu Karst, Java, Indonesia, which is characterized by lineament-controlled cone karst and labyrinth-cone karst. The results show that the recording time of the Landsat ETM+, through the shadows resulting from the sun illumination angle, is the key factor in the performance of lineament and fracture-trace extraction. Directional filtering and slicing techniques significantly enhance the lineament interpretability of the panchromatic band of Landsat ETM+. The two methods yield more lineaments and fracture traces, which t-tests confirm at the 0.001 and 0.004 significance levels. Length-based lineament analysis attains better results than frequency-based analysis.
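Directional filtering of the kind mentioned can be sketched as convolution with an oriented gradient kernel followed by density slicing; the kernel and synthetic scene below are illustrative, not the paper's processing chain:

```python
# Directional filtering + density slicing on a synthetic panchromatic band
# containing one NE-SW trending dark "valley".
import numpy as np
from scipy.ndimage import convolve, rotate

base = np.ones((128, 128))
base[:, 62:66] = 0.2                        # start with an N-S dark line
img = rotate(base, angle=45, reshape=False, mode="nearest")  # rotate to NE-SW

# NE-oriented 3x3 directional kernel (a diagonal gradient operator)
k_ne = np.array([[ 0.0,  1.0,  2.0],
                 [-1.0,  0.0,  1.0],
                 [-2.0, -1.0,  0.0]])

enhanced = convolve(img, k_ne, mode="nearest")

# density slicing: keep only the strongest responses as lineament pixels
threshold = np.abs(enhanced).mean() + 2 * np.abs(enhanced).std()
lineaments = np.abs(enhanced) > threshold
print("candidate lineament pixels:", int(lineaments.sum()))
```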
A data mining based approach to predict spatiotemporal changes in satellite images
NASA Astrophysics Data System (ADS)
Boulila, W.; Farah, I. R.; Ettabaa, K. Saheb; Solaiman, B.; Ghézala, H. Ben
2011-06-01
The interpretation of remotely sensed images in a spatiotemporal context is becoming a valuable research topic. However, the constant growth of data volume in remote sensing imaging makes reaching conclusions based on the collected data a challenging task. Recently, data mining has emerged as a promising research field, leading to several interesting discoveries in areas such as marketing, surveillance, fraud detection and scientific discovery. By integrating data mining and image interpretation techniques, accurate and relevant information (i.e. the functional relation between observed parcels and a set of informational contents) can be elicited automatically. This study presents a new approach to predict spatiotemporal changes in satellite image databases. The proposed method exploits fuzzy sets and data mining concepts to build predictions and decisions for several remote sensing fields. It takes into account imperfections related to the spatiotemporal mining process in order to provide more accurate and reliable information about land cover changes in satellite images. The proposed approach is validated using SPOT images representing the Saint-Denis region, capital of Reunion Island. Results show good performance of the proposed framework in predicting change for the urban zone.
NASA Astrophysics Data System (ADS)
Munch, Federico; Grayver, Alexander; Khan, Amir; Kuvshinov, Alexey
2017-04-01
As most of Earth's interior remains geochemically unsampled, geophysical techniques based on seismology, geodesy, gravimetry, and electromagnetic studies play prominent roles because of their ability to sense structure at depth. Although seismic tomography maps show a variety of structures, separating thermal and compositional contributions from seismic velocities alone still remains a challenging task. Alternatively, as electrical conductivity is sensitive to temperature, chemical composition, oxygen fugacity, water content, and the presence of melt, it can serve for determining chemistry, mineralogy, and physical structure of the deep mantle. In this work we estimate and invert local C-responses (period range 3-100 days) for a number of worldwide geomagnetic observatories to map lateral variations of electrical conductivity in Earth's mantle (400-1600 km depth). The obtained conductivity profiles are interpreted in terms of basalt fraction in a basalt-harzburgite mixture, temperature structure, and water content variations. Interpretation is based on a self-consistent thermodynamic calculation of mineral phase equilibria, electrical conductivity databases, and probabilistic inverse methods.
Park, Yu Rang; Chung, Tae Su; Lee, Young Joo; Song, Yeong Wook; Lee, Eun Young; Sohn, Yeo Won; Song, Sukgil; Park, Woong Yang
2012-01-01
Infection by microorganisms may cause fatally erroneous interpretations in biological research based on cell culture. Contamination of cell cultures by microorganisms is quite frequent (5% to 35%). However, current approaches to identifying contamination have many limitations, such as high costs in time and labor and difficulty in interpreting the results. In this paper, we propose a model to predict cell infection using a microarray technique, which gives an overview of the whole-genome profile. By analyzing 62 microarray expression profiles under various experimental conditions altering cell type, source of infection and collection time, we discovered 5 marker genes: NM_005298, NM_016408, NM_014588, S76389, and NM_001853. In addition, we discovered that two of these genes, S76389 and NM_001853, are involved in a Mycoplasma-specific infection process. We also suggest models to predict the source of infection, cell type or time after infection. We implemented a web-based prediction tool for microarray data, named Prediction of Microbial Infection (http://www.snubi.org/software/PMI). PMID:23091307
An Online Synchronous Test for Professional Interpreters
ERIC Educational Resources Information Center
Chen, Nian-Shing; Ko, Leong
2010-01-01
This article is based on an experiment designed to conduct an interpreting test for multiple candidates online, using web-based synchronous cyber classrooms. The test model was based on the accreditation test for Professional Interpreters produced by the National Accreditation Authority of Translators and Interpreters (NAATI) in Australia.…
Structural damage detection using deep learning of ultrasonic guided waves
NASA Astrophysics Data System (ADS)
Melville, Joseph; Alguri, K. Supreet; Deemer, Chris; Harley, Joel B.
2018-04-01
Structural health monitoring using ultrasonic guided waves relies on accurate interpretation of guided wave propagation to distinguish damage state indicators. However, traditional physics-based models do not provide an accurate representation, and classic data-driven techniques, such as a support vector machine, are too simplistic to capture the complex nature of ultrasonic guided waves. To address this challenge, this paper uses a deep learning interpretation of ultrasonic guided waves to achieve fast, accurate, and automated structural damage detection. To achieve this, full wavefield scans of thin metal plates are used, half from the undamaged state and half from the damaged state. These data are used to train our deep network to predict the damage state of a plate with 99.98% accuracy given signals from just 10 spatial locations on the plate, compared with the 62% accuracy achieved by a support vector machine (SVM).
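A minimal sketch of such a network, assuming signals from 10 sensing locations per plate; the architecture, sizes, and training snippet are illustrative, not the paper's model:

```python
# Small 1D CNN mapping multi-sensor guided-wave signals to a binary
# damage label (undamaged vs damaged).
import torch
import torch.nn as nn

class DamageNet(nn.Module):
    def __init__(self, n_sensors=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_sensors, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # pool over time
        )
        self.head = nn.Linear(32, 2)          # undamaged vs damaged

    def forward(self, x):                     # x: (batch, sensors, time)
        return self.head(self.features(x).squeeze(-1))

net = DamageNet()
waves = torch.randn(8, 10, 1024)              # 8 dummy multi-sensor records
logits = net(waves)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
loss.backward()                               # one illustrative training step
print(logits.shape)                           # torch.Size([8, 2])
```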
In Situ Observations of Electric-Field Induced Nanoparticle Aggregation
NASA Astrophysics Data System (ADS)
Woehl, T. J.; Browning, N. D.; Ristenpart, W. D.
2010-11-01
Nanoparticles have been widely observed to aggregate laterally on electrodes in response to applied electric fields. The mechanism driving this behavior, however, is unclear. Several groups have interpreted the aggregation in terms of electrohydrodynamic or electroosmotic fluid motion, but little corroborating evidence has been presented. Notably, work to date has relied on post situ observations using electron microscopy. Here we present a fluorescence microscopy technique to track the dynamics of nanoparticle aggregation in situ. Fluorescent 20-nm polystyrene nanoparticles are observed to form optically visible aggregates in response to an applied AC field. Although single particle resolution is lost, the existence of aggregates on the electrode surface is marked by growing clusters of increasingly bright intensity. We present a systematic investigation of the effects of applied potential and frequency on the aggregation rate, and we interpret the behavior in terms of a mechanism based on electrically induced convective flow.
NASA Technical Reports Server (NTRS)
Palosz, W.
2003-01-01
Residual gases present in closed ampoules may affect crystal growth processes in several ways: they can compromise techniques requiring low pressures and can degrade crystal quality. For these reasons, a good understanding and control of residual gas formation is important for optimal design and meaningful interpretation of crystal growth experiments. Our extensive experimental and theoretical study covers degassing of silica glass and generation of gases from various source materials. Different materials-processing conditions, such as outgassing under vacuum, annealing in hydrogen, resublimation, different material preparation procedures, multiple annealings, and different processing times, were applied, and their effects on the amount and composition of gas were analyzed. The experimental results were interpreted based on theoretical calculations of diffusion in silica glass and source materials and on the thermochemistry of the system. Procedures for reducing the amount of gas are also discussed.
Incorrect interpretation of carbon mass balance biases global vegetation fire emission estimates.
Surawski, N C; Sullivan, A L; Roxburgh, S H; Meyer, C P Mick; Polglase, P J
2016-05-05
Vegetation fires are a complex phenomenon in the Earth system with many global impacts, including influences on global climate. Estimating carbon emissions from vegetation fires relies on a carbon mass balance technique that has evolved with two different interpretations. Databases of global vegetation fire emissions use an approach based on 'consumed biomass', which is an approximation to the biogeochemically correct 'burnt carbon' approach. Here we show that applying the 'consumed biomass' approach to global emissions from vegetation fires leads to annual overestimates of carbon emitted to the atmosphere by 4.0% or 100 Tg compared with the 'burnt carbon' approach. The required correction is significant and represents ∼9% of the net global forest carbon sink estimated annually. Vegetation fire emission studies should use the 'burnt carbon' approach to quantify and understand the role of this burnt carbon, which is not emitted to the atmosphere, as a sink enriched in carbon.
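The distinction can be made concrete with a toy mass balance; all quantities below are hypothetical and chosen only to show why counting char carbon as emitted inflates the estimate by a few percent:

```python
# Toy comparison of the two carbon mass-balance interpretations.
biomass_consumed = 1000.0   # t dry biomass consumed by the fire
c_fraction = 0.50           # carbon fraction of the biomass
char_carbon = 20.0          # t carbon retained on site in char/ash

# 'consumed biomass' approach: all carbon in consumed fuel counted as emitted
emitted_consumed = biomass_consumed * c_fraction             # 500 t C

# 'burnt carbon' approach: carbon left on site as char is not emitted
emitted_burnt = biomass_consumed * c_fraction - char_carbon  # 480 t C

bias = (emitted_consumed - emitted_burnt) / emitted_burnt
print(f"overestimate: {100 * bias:.1f}%")   # ~4% for these example numbers
```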
U.S. Geological Survey water resources activities in Florida, 1985-86
Glenn, M. E.
1986-01-01
This report contains summary statements of water resources activities in Florida conducted by the Water Resources Division of the U.S. Geological Survey in cooperation with Federal, State, and local agencies during 1985-86. These activities are part of the Federal program of appraising the Nation's water resources. Water resources appraisals in Florida are highly diversified, ranging from hydrologic records networks to interpretive appraisals of water resources and applied research to develop investigative techniques. Thus, water resource investigations range from basic descriptive water-availability studies for areas of low-intensity water development and management to sophisticated cause-and-effect studies in areas of high-intensity water development and management. The interpretive reports and records that are products of the investigations are a principal hydrologic foundation upon which the plans for development, management, and protection of Florida's water resources may be based. (Lantz-PTT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, Ashley; Hunt, Kristopher; Bernstein, Hans C.
Interest in microbial communities for bioprocessing has surged in recent years based on the potential to optimize multiple tasks simultaneously and to enhance process productivity and stability. The presence and magnitude of these desirable system properties often result from interactions between functionally distinct community members. The importance of interactions, while appreciated by some disciplines for decades, has gained interest recently due to the development of 'omics techniques, polymicrobial culturing approaches, and computational methods, which have made the systems-level analysis of interacting components more tractable. This review defines and categorizes natural and engineered system components, interactions, and emergent properties, and presents three ecological theories relevant to microbial communities. Case studies are interpreted to illustrate components, interactions, emergent properties and agreement with theoretical concepts. A general foundation is laid to facilitate interpretation of current systems and to aid in future design of microbial systems for the next generation of bioprocesses.
Diffusion processes in tumors: A nuclear medicine approach
NASA Astrophysics Data System (ADS)
Amaya, Helman
2016-07-01
The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; that information is not true metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to make a correct interpretation of the gradient magnitude and Laplacian of the counts images.
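The two derived images can be sketched with standard operators on a synthetic uptake image; the Gaussian hot spot is a stand-in, and the Fick's-law reading in the comments is the interpretive step described above:

```python
# Gradient magnitude and Laplacian of a synthetic "concentration" image.
import numpy as np
from scipy import ndimage

# synthetic Gaussian hot spot standing in for an 18F-FDG uptake region
y, x = np.mgrid[-32:32, -32:32]
img = np.exp(-(x**2 + y**2) / (2 * 8.0**2))

gx = ndimage.sobel(img, axis=1)
gy = ndimage.sobel(img, axis=0)
grad_mag = np.hypot(gx, gy)        # where concentration changes fastest
lap = ndimage.laplace(img)         # sign ~ local net inflow/outflow (Fick)

peak = np.unravel_index(grad_mag.argmax(), grad_mag.shape)
print("strongest gradient at pixel:", peak)
```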
NASA Technical Reports Server (NTRS)
Westmoreland, Sally; Stow, Douglas A.
1992-01-01
A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework was applied as part of a case study to update a portion of the land-use layer in a regional-scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.
From Phonomecanocardiography to Phonocardiography computer aided
NASA Astrophysics Data System (ADS)
Granados, J.; Tavera, F.; López, G.; Velázquez, J. M.; Hernández, R. T.; López, G. A.
2017-01-01
Because many doctors lack the training to identify heart disorders by conventional auscultation, an objective and methodological analysis is needed to support this technique. In order to obtain information on the performance of the heart, and to diagnose heart disease through a simple, cost-effective procedure based on a data acquisition system, we have obtained phonocardiograms (PCG), which are graphic records of the sounds emitted by the heart. A program of acoustic, visual, and artificial-vision recognition was developed to interpret them. Based on the results of previous research by cardiologists, a code for interpreting PCGs and their associated diseases was developed, and a site for experimental sampling of cardiac data was created within the university campus. Computer-aided phonocardiography is a viable, low-cost procedure that provides additional medical information for the diagnosis of complex heart diseases. We show some preliminary results.
Utility of correlation techniques in gravity and magnetic interpretation
NASA Technical Reports Server (NTRS)
Chandler, V. W.; Koski, J. S.; Braice, L. W.; Hinze, W. J.
1977-01-01
Internal correspondence uses Poisson's theorem in a moving-window linear regression analysis between the anomalous first vertical derivative of gravity and the total magnetic field reduced to the pole. The regression parameters provide critical information on source characteristics: the correlation coefficient indicates the strength of the relation between magnetics and gravity; the slope gives estimates of the magnetization-to-density contrast ratio (Δj/Δσ) of the anomalous source; and the intercept furnishes information on anomaly interference. Cluster analysis consists of classifying subsets of data into groups of similarity based on correlation of selected characteristics of the anomalies. Model studies are used to illustrate implementation and interpretation procedures of these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province-size zones, validating long-wavelength anomalies, and isolating geomagnetic field removal problems.
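A sketch of the internal-correspondence computation on a synthetic profile; the window length, noise level, and the contrast ratio of 2.5 are invented for illustration:

```python
# Moving-window linear regression of magnetics (reduced to pole) against
# the first vertical derivative of gravity, per Poisson's theorem.
import numpy as np

n = 400
x = np.linspace(0, 100, n)                        # km along profile
gz_dz = np.exp(-((x - 50) / 8) ** 2)              # 1st vert. derivative of g
mag = 2.5 * gz_dz + 0.05 * np.random.default_rng(5).normal(size=n)

win = 41                                          # samples per window
half = win // 2
slope, corr = np.full(n, np.nan), np.full(n, np.nan)
for i in range(half, n - half):
    g = gz_dz[i - half:i + half + 1]
    m = mag[i - half:i + half + 1]
    A = np.column_stack([g, np.ones(win)])
    (b1, b0), *_ = np.linalg.lstsq(A, m, rcond=None)
    slope[i] = b1                                 # ~ delta j / delta sigma
    corr[i] = np.corrcoef(g, m)[0, 1]             # strength of the relation

i0 = n // 2
print(f"center window: slope={slope[i0]:.2f}, r={corr[i0]:.2f}")
```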
NASA Astrophysics Data System (ADS)
Maurya, S. P.; Singh, K. H.; Singh, N. P.
2018-05-01
In the present study, three recently developed geostatistical methods, single-attribute analysis, multi-attribute analysis, and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find a suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate, higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
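The probabilistic-neural-network style estimator can be sketched as Gaussian-kernel (Parzen) regression from seismic attributes to porosity; the attributes, well counts, and porosity relation below are hypothetical:

```python
# Kernel (Parzen) regression: predict porosity at new traces from their
# seismic attributes, trained on well-tied samples.
import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Gaussian-kernel weighted average (Nadaraya-Watson / GRNN style)."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(6)
# hypothetical training data: 30 well-tied samples, 3 seismic attributes
# (e.g. inverted impedance, amplitude envelope, instantaneous frequency)
X = rng.normal(size=(30, 3))
phi = 0.15 - 0.04 * X[:, 0] + 0.01 * rng.normal(size=30)   # porosity, frac

X_test = rng.normal(size=(5, 3))
print("predicted porosity:", pnn_predict(X, phi, X_test).round(3))
```

The kernel width sigma plays the role of the smoothing parameter that such methods typically optimize by cross-validation against the well logs.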
3D Tracking Based Augmented Reality for Cultural Heritage Data Management
NASA Astrophysics Data System (ADS)
Battini, C.; Landi, G.
2015-02-01
The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations, augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about the status of cultural heritage is crucial for its interpretation, its conservation, and during restoration processes. The application of digital-imaging solutions for feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results, and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware are rapidly evolving, and complex three-dimensional models can be interactively visualised and explored in applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that allows interaction with the physical object for its study and analysis, using 3D-tracking-based augmented reality techniques.
Evaluation of Skylab (EREP) data for forest and rangeland surveys
Robert C. Aldrich
1976-01-01
Data products from the Skylab Earth Resources Experiment Package were examined monocularly or stereoscopically using a variety of magnifying interpretation devices. Land use, forest types, physiographic sites, and plant communities, as well as forest stress, were interpreted and mapped at sites in Georgia, South Dakota, and Colorado. Microdensitometric techniques and...
Using Real World Experience to Teach Science and Environmental Writing.
ERIC Educational Resources Information Center
Friedman, Sharon M.
The use of interpretive reporting techniques and programs offering real world training to writers may provide solutions to the problems encountered in writing about science for the mass media. Both science and environmental writers have suggested that the problems they face would be decreased by the use of more interpretive and investigative…
Locus of Control and Student Perceptions of Three Counseling Techniques
ERIC Educational Resources Information Center
Dougherty, A. Michael; And Others
1978-01-01
The use of advice-giving, Adlerian interpretation, and analytically derived interpretation was investigated, with regard to whether feelings of approach, attack, or withdrawal were elicited, by having subjects respond to eight videotaped role-played counseling segments. Subjects were 242 fourth-graders and 191 tenth-graders, grouped by locus of control.…
Landsat and SPOT data for oil exploration in North-Western China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nishidai, Takashi
1996-07-01
Satellite remote sensing technology has been employed by Japex to provide information related to oil exploration programs for many years. Since the beginning of the 1980s, regional geological interpretation through to advanced studies using satellite imagery with high spectral and spatial resolutions (such as Landsat TM and SPOT HRV) have been carried out, both for exploration programs and for scientific research. Advanced techniques (including analysis of airborne hyper-multispectral imaging sensor data) as well as conventional photogeological techniques were used throughout these programs. The first program using remote sensing technology in China focused on the Tarim Basin, Xinjiang Uygur Autonomous Region, and was carried out using Landsat MSS data. Landsat MSS imagery allows us to gain useful preliminary geological information about an area of interest prior to field studies. About 90 Landsat scenes cover the entire Xinjiang Uygur Autonomous Region; this allowed us to give comprehensive overviews of three hydrocarbon-bearing basins (Tarim, Junggar, and Turpan-Hami) in NW China. The overviews were based on the interpretations and assessments of the satellite imagery and on a synthesis of the most up-to-date accessible geological and geophysical data, as well as some field work. Pairs of stereoscopic SPOT HRV images were used to generate digital elevation data with a 40 m grid covering part of the Tarim Basin. Topographic contour maps created from this digital elevation data, at scales of 1:250,000 and 1:100,000 with contour intervals of 100 m and 50 m, allowed us to make precise geological interpretations and to carry out swift and efficient geological field work. Satellite imagery was also utilized to make medium- to large-scale image maps, not only to interpret geological features but also to support field workers and seismic survey field operations.
NASA Astrophysics Data System (ADS)
Lwin, A.; Khaing, M. M.
2012-07-01
The Yangon river, also known as the Rangoon river, is about 40 km (25 miles) long and flows from southern Myanmar as an outlet of the Irrawaddy (Ayeyarwady) river into the Ayeyarwady delta. The Yangon river drains the Pegu Mountains; both the Yangon and the Pathein rivers enter the Ayeyarwady at the delta. Fluvial geomorphology is based primarily on rivers of manageable dimensions. The emphasis here is on the geomorphology and sedimentology of the Yangon river and on techniques for their identification and management. Present techniques such as remote sensing have made it easier to investigate and interpret river geomorphology in detail. In this paper, an attempt has been made to study the complicated issues of geomorphology, sedimentation patterns, and the management and evolution of the river system. An analysis was carried out of the impact of land use/land cover (LULC) changes on stream flow patterns. The hydrologic response to intense, flood-producing rainfall events bears the signatures of the geomorphic structure of the channel network and of the characteristic slope lengths defining the drainage density of the basin. The interpretation of the hydrologic response as the travel-time distribution of a water particle randomly injected in a distributed manner across the landscape has inspired many geomorphic insights. In 2008, Cyclone Nargis seriously damaged the mangrove area and its biodiversity in and around the Yangon river terraces. A combination of digital image processing techniques was employed for the enhancement and classification process. It is observed from the study that the middle infrared band (0.77 µm - 0.86 µm) is highly suitable for mapping mangroves. Two major classes of mangroves, dense and open, were delineated from the digital data.
2012-01-01
Background: Establishing the distribution of materials in paintings, and that of their degradation products, by imaging techniques is fundamental to understanding the painting technique and can improve our knowledge of the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, with the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans is presented as a suitable approach to obtain a detailed characterisation of the materials in a paint sample while assuring their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist of international recognition whose canvases adorned international buildings. Results: The pigments used by the painter, as well as the organic materials used as binders and varnishes, could be identified by means of conventional techniques. Mapping the distribution of these materials by means of Synchrotron Radiation based techniques allowed us to establish the mixtures used by the painter depending on the purpose. Conclusions: The results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable for the challenge of micro-heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples, layer by layer, as well as the establishment of the painting techniques used by Sert in the works of art under study. PMID:22616949
Kroll, Lars Eric; Schumann, Maria; Müters, Stephan; Lampert, Thomas
2017-12-01
Nationwide health surveys can be used to estimate regional differences in health. With traditional estimation techniques, the spatial depth of these estimates is limited by the constrained sample size; so far, without special refreshment samples, results have only been available for the larger, more populated federal states of Germany. An alternative is regression-based small-area estimation techniques. These models can generate smaller-scale data but are also subject to greater statistical uncertainty because of their model assumptions. In the present article, exemplary regionalized estimates of respondents' self-rated health, based on the studies "Gesundheit in Deutschland aktuell" (GEDA studies) 2009, 2010, and 2012, are compared. The aim of the article is to analyze the range of regional estimates in order to assess the usefulness of these techniques for health reporting more adequately. The results show that the estimated prevalence is relatively stable across different samples. Important determinants of the variation of the estimates are the achieved sample size at the district level and the type of district (cities vs. rural regions). Overall, the present study shows that small-area modeling of prevalence is associated with additional uncertainty compared to conventional estimates, which should be taken into account when interpreting the corresponding findings.
Interpretation of ERTS-MSS images of a Savanna area in eastern Colombia
NASA Technical Reports Server (NTRS)
Elberson, G. W. W.
1973-01-01
The application of ERTS-1 imagery for extrapolating existing soil maps into unmapped areas of the Llanos Orientales of Colombia, South America is discussed. Interpretations of ERTS-1 data were made according to conventional photointerpretation techniques. Most units delineated in the existing reconnaissance soil map at a scale of 1:250,000 could be recognized and delineated in the ERTS image. The methods of interpretation are described and the results obtained for specific areas are analyzed.
Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M
2016-01-01
Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative, compound-cognizant interpretation beyond target compound analysis, with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers, with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on the biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal component analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit a statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak-location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. A detailed description of the simulation experiment and a discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. The proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
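The PTM match score itself is not spelled out in the abstract; as a loose stand-in, a minimal similarity measure between two aligned vectors of biomarker peak heights could look like the following (purely illustrative, not the published score):

```python
import numpy as np

def match_score(peaks_a, peaks_b):
    # Cosine similarity between aligned biomarker peak-height vectors;
    # the published PTM score partitions the GCxGC topography and is
    # considerably more elaborate than this stand-in.
    a = np.asarray(peaks_a, dtype=float)
    b = np.asarray(peaks_b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```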
Visualization rhetoric: framing effects in narrative visualization.
Hullman, Jessica; Diakopoulos, Nicholas
2011-12-01
Narrative visualizations combine conventions of communicative and exploratory information visualization to convey an intended story. We demonstrate visualization rhetoric as an analytical framework for understanding how design techniques that prioritize particular interpretations in visualizations that "tell a story" can significantly affect end-user interpretation. We draw a parallel between narrative visualization interpretation and evidence from framing studies in political messaging, decision-making, and literary studies. Devices for understanding the rhetorical nature of narrative information visualizations are presented, informed by the rigorous application of concepts from critical theory, semiotics, journalism, and political theory. We draw attention to how design tactics represent additions or omissions of information at various levels (the data, visual representation, textual annotations, and interactivity) and how visualizations denote and connote phenomena with reference to unstated viewing conventions and codes. Classes of rhetorical techniques identified via a systematic analysis of recent narrative visualizations are presented and characterized according to their rhetorical contribution to the visualization. We describe how designers and researchers can benefit from the potentially positive aspects of visualization rhetoric in designing engaging, layered narrative visualizations and how our framework can shed light on how a visualization design prioritizes specific interpretations. We identify areas where future inquiry into visualization rhetoric can improve understanding of visualization interpretation. © 2011 IEEE
NASA Astrophysics Data System (ADS)
Olurin, Oluwaseun T.; Ganiyu, Saheed A.; Hammed, Olaide S.; Aluko, Taiwo J.
2016-10-01
This study presents the results of spectral analysis of magnetic data over the Abeokuta area, southwestern Nigeria, using the fast Fourier transform (FFT) in Microsoft Excel. The study deals with the quantitative interpretation of airborne magnetic data (Sheet No. 260) acquired by the Nigerian Geological Survey Agency in 2009. To minimise aliasing error, the aeromagnetic data were gridded at a spacing of 1 km. Spectral analysis was used to estimate the magnetic basement depths, and the results show that the magnetic sources are mainly distributed at two levels. The shallow sources range in depth from 0.103 to 0.278 km below ground level and are inferred to be due to intrusions within the region. The deeper sources range in depth from 2.739 to 3.325 km below ground and are attributed to the underlying basement.
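For readers unfamiliar with the spectral-depth method, the sketch below estimates two source depths from the slopes of the log power spectrum of a single profile (ln P falls off roughly as -2hk for angular wavenumber k and source depth h); the split wavenumber, array names, and 1D formulation are illustrative, not taken from the paper.

```python
import numpy as np

def spectral_depths(profile, dx_km=1.0, k_split=0.5):
    """Depth estimates from the slope of the log power spectrum: deep
    sources dominate the low wavenumbers, shallow sources the high
    wavenumbers (each segment needs at least two spectral points)."""
    F = np.fft.rfft(profile - np.mean(profile))
    k = 2 * np.pi * np.fft.rfftfreq(len(profile), d=dx_km)  # rad/km
    lnP = np.log(np.abs(F) ** 2 + 1e-12)
    lo = (k > 0) & (k <= k_split)
    hi = k > k_split
    deep = -np.polyfit(k[lo], lnP[lo], 1)[0] / 2.0     # km
    shallow = -np.polyfit(k[hi], lnP[hi], 1)[0] / 2.0  # km
    return shallow, deep
```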
NASA Technical Reports Server (NTRS)
Sung, Q. C.; Miller, L. D.
1977-01-01
Because of the difficulty of retrospectively collecting representative ground control data, three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was performed based on comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results in closer agreement with the airphoto estimates than does stepwise discriminant analysis.
Blood oxygenation level-dependent MRI for assessment of renal oxygenation
Neugarten, Joel; Golestaneh, Ladan
2014-01-01
Blood oxygen level-dependent magnetic resonance imaging (BOLD MRI) has recently emerged as an important noninvasive technique to assess intrarenal oxygenation under physiologic and pathophysiologic conditions. Although this tool represents a major addition to our armamentarium of methodologies to investigate the role of hypoxia in the pathogenesis of acute kidney injury and progressive chronic kidney disease, numerous technical limitations confound interpretation of data derived from this approach. BOLD MRI has been utilized to assess intrarenal oxygenation in numerous experimental models of kidney disease and in human subjects with diabetic and nondiabetic chronic kidney disease, acute kidney injury, renal allograft rejection, contrast-associated nephropathy, and obstructive uropathy. However, confidence in conclusions based on data derived from BOLD MRI measurements will require continuing advances and technical refinements in the use of this technique. PMID:25473304
Infrared thermography for wood density estimation
NASA Astrophysics Data System (ADS)
López, Gamaliel; Basterra, Luis-Alfonso; Acuña, Luis
2018-03-01
Infrared thermography (IRT) is becoming a commonly used technique for the non-destructive inspection and evaluation of wood structures. Based on the radiation emitted by all objects, this technique enables remote visualization of surface temperature, without contact, using a thermographic device. The process of transforming radiant energy into temperature depends on many parameters, and interpreting the results is usually complicated. However, some works have analyzed the operation of IRT and expanded its applications, as found in the latest literature. This work analyzes the effect of density on the thermodynamic behavior of timber as determined by IRT. The cooling of various wood samples was recorded, and a statistical procedure was designed that enables quantitative estimation of timber density. This procedure represents a new method to physically characterize this material.
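The paper's statistical procedure is not detailed in the abstract; one plausible ingredient, sketched here under that assumption, is fitting a Newtonian cooling constant to each sample's IRT record and then regressing density on that constant across calibration samples (all names and starting values are hypothetical).

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling_constant(t_s, temp_c, t_ambient_c):
    """Fit T(t) = T_amb + dT * exp(-k t) to a thermographic cooling
    record and return the cooling constant k, which could then be
    related to wood density across a set of calibration samples."""
    model = lambda t, dT, k: t_ambient_c + dT * np.exp(-k * t)
    (dT, k), _ = curve_fit(model, t_s, temp_c,
                           p0=(temp_c[0] - t_ambient_c, 0.01))
    return k
```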
Extraction of Greenhouse Areas with Image Processing Methods in Karabuk Province
NASA Astrophysics Data System (ADS)
Yildirima, M. Z.; Ozcan, C.
2017-11-01
Greenhouses allow environmental conditions to be controlled and regulated as desired, so that agricultural products can be produced without being affected by external environmental conditions. High-quality and a wide variety of agricultural products can thus be produced throughout the year. In addition, mapping and detecting these areas is of great importance for factors such as yield analysis, natural resource management, and environmental impact. Various remote sensing techniques are currently available for the extraction of greenhouse areas; these are based on the automatic detection and interpretation of objects in remotely sensed images. In this study, greenhouse areas were determined from optical images obtained from Landsat. The study was carried out on the greenhouse areas in Karabuk province, and the results obtained are presented in figures and tables.
Research in space physics at the University of Iowa
NASA Technical Reports Server (NTRS)
Vanallen, J. A.
1979-01-01
Current investigations relating to energetic particles and the electric, magnetic, and electromagnetic fields associated with the earth, the sun, the moon, the planets, comets, and the interplanetary medium are reported. Primary emphasis is on observational work using a wide diversity of instruments on satellites of the earth and the moon and on planetary and interplanetary spacecraft, and on phenomenological analysis and interpretation. Secondary emphasis is given to closely related observational work by ground-based radio-astronomical and optical techniques, and to theoretical problems in plasma physics relevant to solar, planetary, and interplanetary phenomena.
The role of networks and artificial intelligence in nanotechnology design and analysis.
Hudson, D L; Cohen, M E
2004-05-01
Techniques with their origins in artificial intelligence have had a great impact on many areas of biomedicine. Expert-based systems have been used to develop computer-assisted decision aids. Neural networks have been used extensively in disease classification and more recently in many bioinformatics applications including genomics and drug design. Network theory in general has proved useful in modeling all aspects of biomedicine from healthcare organizational structure to biochemical pathways. These methods show promise in applications involving nanotechnology both in the design phase and in interpretation of system functioning.
IPL Processing of the Viking Orbiter Images of Mars
NASA Technical Reports Server (NTRS)
Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.
1977-01-01
The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alekseev, I. S.; Ivanov, I. E.; Strelkov, P. S., E-mail: strelkov@fpl.gpi.ru
A method based on the detection of emission of a dielectric screen with metal microinclusions in open air is applied to visualize the transverse structure of a high-power microwave beam. In contrast to other visualization techniques, the results obtained in this work provide qualitative information not only on the electric field strength, but also on the structure of electric field lines in the microwave beam cross section. The interpretation of the results obtained with this method is confirmed by numerical simulations of the structure of electric field lines in the microwave beam cross section by means of the CARAT code.
NASA Astrophysics Data System (ADS)
Jain, A.
2017-08-01
Computer-based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, saving both time and cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing, and storing models of complex molecular structures that can help to interpret structure-activity relationships. The use of various molecular mechanics and dynamics techniques and software in computer-aided drug design, along with statistical analysis, is a powerful tool for medicinal chemists seeking to synthesize effective therapeutic drugs with minimal side effects.
Interpretation of geographic patterns in simulated orbital television imagery of earth resources
NASA Technical Reports Server (NTRS)
Latham, J. P.; Cross, C. I.; Kuyper, W. H.; Witmer, R. E.
1972-01-01
In order to better determine the effects of television imagery characteristics upon the interpretation of geographic patterns obtainable from orbital television sensors, and to better evaluate the influence of alternative sensor system parameters such as changes in orbital altitude or scan-line rate, a team of three professional interpreters independently mapped thematically the geographic phenomena they could detect in orbital television imagery displayed on a fourteen-inch monitor and recorded photographically for analysis. Three thematic maps were compiled by each interpreter: (1) transportation patterns; (2) other land use; and (3) physical regions. The results from the three interpreters are compared, agreements noted, and differences analyzed for causes such as disagreement on identification of phenomena, visual acuity, differences in interpretation techniques, and differing professional backgrounds.
Structural interpretation of seismic data and inherent uncertainties
NASA Astrophysics Data System (ADS)
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretive basis for the science is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought faults existed at all or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations: experts are successful because of their application of these techniques. A new set of experiments focuses on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary, with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, postgraduate, industry, and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and assess the uncertainty in these interpretations. Bond, C. E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al. (2007). 'What do you think this is?: "Conceptual uncertainty" in geoscience interpretation'. GSA Today, 17, 4-10.
Efficient Analysis of Mass Spectrometry Data Using the Isotope Wavelet
NASA Astrophysics Data System (ADS)
Hussong, Rene; Tholey, Andreas; Hildebrandt, Andreas
2007-09-01
Mass spectrometry (MS) has become today's de facto standard for high-throughput analysis in proteomics research. Its applications range from toxicity analysis to MS-based diagnostics. Often, the time spent on the MS experiment itself is significantly less than the time necessary to interpret the measured signals, since the amount of data can easily exceed several gigabytes. In addition, automated analysis is hampered by baseline artifacts, chemical as well as electrical noise, and an irregular spacing of data points. Thus, filtering techniques originating from signal and image analysis are commonly employed to address these problems. Unfortunately, smoothing, baseline reduction, and in particular a resampling of data points can affect important characteristics of the experimental signal. To overcome these problems, we propose a new family of wavelet functions based on the isotope wavelet, which is hand-tailored for the analysis of mass spectrometry data. The resulting technique is theoretically well founded and compares very well with standard peak-picking tools, since it is highly robust against noise spoiling the data but at the same time sufficiently sensitive to detect even low-abundance peptides.
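The isotope wavelet itself is hand-tailored and not reproduced here; as a generic stand-in, CWT-based peak picking over a raw, noisy spectrum can be sketched with SciPy's Ricker-wavelet matcher (synthetic data, illustrative widths):

```python
import numpy as np
from scipy.signal import find_peaks_cwt

rng = np.random.default_rng(1)
mz = np.linspace(1000.0, 1010.0, 4000)
# One synthetic peak at m/z 1003 plus additive noise.
spectrum = np.exp(-(mz - 1003.0) ** 2 / 0.002) + 0.05 * rng.standard_normal(mz.size)
# Match peak shapes across a range of widths (in samples); the published
# method replaces the Ricker wavelet with the isotope wavelet.
idx = find_peaks_cwt(spectrum, widths=np.arange(5, 40))
print(mz[idx])
```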
NASA Astrophysics Data System (ADS)
Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.
2012-06-01
Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).
Edge compression techniques for visualization of dense directed graphs.
Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher
2013-12-01
We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules' (groups of nodes) such that the new edges imply aggregate connectivity. We only consider techniques that offer lossless compression, that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all have the same goal, to compress the set of edges that need to be rendered to fully convey connectivity, but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothesized trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming; this enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by, and discuss in particular, the application to software dependency analysis.
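The first and simplest of the three techniques, grouping nodes with identical neighbor sets, can be sketched in a few lines (directed graph, hypothetical edge list):

```python
from collections import defaultdict

def modules_by_neighbor_set(edges):
    """Merge nodes whose (direction-tagged) neighbor sets are identical;
    one module-level edge then losslessly implies every pairwise edge
    it replaces."""
    nbrs = defaultdict(set)
    for u, v in edges:
        nbrs[u].add(('out', v))
        nbrs[v].add(('in', u))
    groups = defaultdict(list)
    for node, ns in nbrs.items():
        groups[frozenset(ns)].append(node)
    return sorted(sorted(g) for g in groups.values())

print(modules_by_neighbor_set([('a', 'x'), ('a', 'y'), ('b', 'x'), ('b', 'y')]))
# [['a', 'b'], ['x', 'y']] -- two modules joined by a single implied edge
```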
Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei
2017-09-11
Independent Component Analysis (ICA) is a method that models gene expression data as the action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or varying effective dimensions) and in multiple independent datasets. To this end, we introduce a ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of qualitative change in the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components, with ranks below the MSTD, have a greater chance of being reproduced in independent studies than the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol for applying ICA to transcriptomics data with the possibility of prioritizing components with respect to their reproducibility, which strengthens the biological interpretation. Computing too few components (much fewer than the MSTD) is not optimal for interpretability of the results. Components ranked within the MSTD range have a greater chance of being reproduced in independent studies.
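A minimal version of stability ranking, assuming a samples-by-genes matrix X and using scikit-learn's FastICA (the authors' actual protocol and stability index are more refined than this), could look like the following:

```python
import numpy as np
from sklearn.decomposition import FastICA

def stability_profile(X, n_comp, n_runs=10, seed=0):
    """For each component of a reference ICA run, record its mean best
    absolute correlation with components from repeated runs; sorting
    gives a stability profile whose drop-off locates an MSTD-like
    dimension."""
    runs = [FastICA(n_components=n_comp, random_state=seed + r,
                    max_iter=1000).fit_transform(X) for r in range(n_runs)]
    ref = runs[0]
    stab = np.empty(n_comp)
    for j in range(n_comp):
        best = [max(abs(np.corrcoef(ref[:, j], other[:, k])[0, 1])
                    for k in range(n_comp)) for other in runs[1:]]
        stab[j] = np.mean(best)
    return np.sort(stab)[::-1]  # most stable components first
```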
Effects of salt-related mode conversions on subsalt prospecting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ogilvie, J.S.; Purnell, G.W.
1996-03-01
Mode conversion of waves during seismic reflection surveys has generally been considered a minor phenomenon that could be neglected in data processing and interpretation. However, in subsalt prospecting, the contrast in material properties at the salt/sediment interface is often great enough that significant P-to-S and/or S-to-P conversion occurs. The resulting converted waves can be both a help and a hindrance for subsalt prospecting. A case history from the Mississippi Canyon area of the Gulf of Mexico demonstrates strong converted-wave reflections from the base of salt that complicate the evaluation of a subsalt prospect using 3-D seismic data. Before and after stack, the converted-wave reflections are evident in 2-D and 3-D surveys across the prospect. Ray-tracing synthetic common-midpoint (CMP) gathers provides some useful insights about the occurrence of these waves, but elastic-wave-equation modeling is even more useful. While the latter is more time-consuming, even in 2-D, it also provides a more realistic simulated seismic survey across the prospect, which helps to reveal how some converted waves survive the processes of CMP stack and migration and thereby present possible pitfalls to an unwary interpreter. The insights gained from the synthetic data suggest some simple techniques that can assist an interpreter in the 3-D interpretation of subsalt events.
[Considerations when using creatinine as a measure of kidney function].
Drion, I Iefke; Fokkert, M J Marion; Bilo, H J G Henk
2013-01-01
Reported serum creatinine concentrations can sometimes vary considerably, even when renal function varies little or not at all. This variation is partly due to true changes in the actual serum concentration and partly due to interferences in the measurement technique that do not reflect a true change in concentration. Increased or decreased endogenous creatinine production, ingested creatinine sources (meat or certain creatine formulations), and interference, by chromogenic substances in Jaffe measurement techniques or by promoters and inhibitors of enzymatic reaction methods, all play a role. Reliable serum creatinine measurements are needed for renal-function estimating equations. In screening circumstances and in daily practice, chronic kidney disease staging is based on these estimated glomerular filtration rate values. Given the possible influences on reported serum creatinine concentrations, it is important for health care workers to remain critical when interpreting the outcomes of renal-function estimating equations and not to regard every reported equation-based result as a true reflection of renal function.
Spacecraft momentum management procedures. [large space telescope
NASA Technical Reports Server (NTRS)
Chen, L. C.; Davenport, P. B.; Sturch, C. R.
1980-01-01
Techniques appropriate for implementation onboard the Space Telescope and other spacecraft to manage the accumulation of momentum in reaction wheel control systems using magnetic torquing coils are described. Generalized analytical equations are derived for momentum control laws that command the magnetic torquers. These control laws naturally fall into two main categories according to the method used for updating the magnetic dipole command: closed-loop, in which the update is based on current measurements to achieve a desired torque instantaneously, and open-loop, in which the update is based on predicted information to achieve a desired momentum at the end of a period of time. Physical interpretations of control laws in general, and of the Space Telescope cross-product and minimum-energy control laws in particular, are presented, and their merits and drawbacks are discussed. A technique for retaining the advantages of both the open-loop and the closed-loop control laws is introduced. Simulation results are presented to compare the performance of these control laws in the Space Telescope environment.
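As a hedged sketch of the closed-loop cross-product idea mentioned above (the abstract does not give the Space Telescope's actual gains or axes, so all values here are illustrative), the dipole command that produces a torque opposing the excess wheel momentum is:

```python
import numpy as np

def cross_product_dipole(h_excess, b_body, gain=1.0):
    """Closed-loop cross-product law: command dipole m = k (h x B)/|B|^2
    so the magnetic torque m x B has a component opposing h_excess
    (no torque can be produced along B itself)."""
    return gain * np.cross(h_excess, b_body) / np.dot(b_body, b_body)

b = np.array([0.0, 2e-5, 3e-5])                     # field, Tesla, body frame
m = cross_product_dipole(np.array([0.5, 0.0, 0.0]), b)  # excess momentum, N*m*s
torque = np.cross(m, b)                             # resulting unload torque
```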
Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.
El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher
2018-01-01
Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
Robot path planning using expert systems and machine vision
NASA Astrophysics Data System (ADS)
Malone, Denis E.; Friedrich, Werner E.
1992-02-01
This paper describes a system developed for the robotic processing of naturally variable products. In order to plan the robot motion path, it was necessary to use a sensor system, in this case a machine vision system, to observe the variations occurring in workpieces and to interpret them with a knowledge-based expert system. The knowledge base was acquired by carrying out an in-depth study of the product using examination procedures not available in the robotic workplace; it relates the nature of the required path to the information obtainable from the machine vision system. The practical application of this system to the processing of fish fillets is described and used to illustrate the techniques.
Application of Semantic Tagging to Generate Superimposed Information on a Digital Encyclopedia
NASA Astrophysics Data System (ADS)
Garrido, Piedad; Tramullas, Jesus; Martinez, Francisco J.
Several works in the literature address the automatic or semi-automatic processing of textual documents containing historical information using free software technologies. However, more research is needed to integrate analysis of the context and to cover the peculiarities of the Spanish language from a semantic point of view. This research work proposes a novel knowledge-based strategy combining subject-centric computing, a topic-oriented approach, and superimposed information. Its subsequent combination with artificial intelligence techniques led to an automatic analysis, after implementing a made-to-measure interpreted algorithm which, in turn, produced a good number of associations and events with 90% reliability.
ERIC Educational Resources Information Center
Abate, Marie A.
The education of students in the techniques of critical appraisal of drug studies has been identified as a deficiency in many health sciences curricula. Errors in research design and inconsistencies in the reporting of study results persist in professional pharmacy and medical journals. Thus, thorough and accurate review and interpretation of…
ERIC Educational Resources Information Center
Liew, Chong-Wah; Treagust, David F.
This study involves action research to explore the effectiveness of the Predict-Observe-Explain (POE) technique in diagnosing students' understanding of science and identifying their levels of achievement. A multidimensional interpretive framework is used to interpret students' understanding of science. The research methodology incorporated…
ERIC Educational Resources Information Center
Georgakopoulos, Alexia
2009-01-01
This study challenges narrow definitions of teacher effectiveness and uses a systems approach to investigate teacher effectiveness as a multi-dimensional, holistic phenomenon. The methods of Nominal Group Technique and Interpretive Structural Modeling were used to assist U.S. and Japanese students separately construct influence structures during…
TOF-SIMS imaging technique with information entropy
NASA Astrophysics Data System (ADS)
Aoyagi, Satoka; Kawashima, Y.; Kudo, Masahiro
2005-05-01
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is, in principle, capable of chemical imaging of proteins on insulated samples. However, selecting the specific peaks related to a particular protein, which are necessary for chemical imaging, out of numerous candidates had been difficult without an appropriate spectrum-analysis technique. Therefore, multivariate analysis techniques, such as principal component analysis (PCA), and analysis with mutual information as defined by information theory, have been applied to interpret SIMS spectra of protein samples. In this study, mutual information was applied to select protein-related peaks in order to obtain chemical images. Proteins on insulated materials were measured with TOF-SIMS, and the SIMS spectra were then analyzed by means of comparison using mutual information. Chemical maps of each protein were obtained using specific peaks selected on the basis of their mutual information values. The resulting TOF-SIMS images of proteins on the materials provide useful information on the properties of protein adsorption, the optimality of immobilization processes, and reactions between proteins. Chemical imaging of proteins by TOF-SIMS thus contributes to understanding interactions between material surfaces and proteins and to developing sophisticated biomaterials.
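Peak selection by mutual information can be sketched as follows, treating each candidate peak's intensity across labelled spectra as a feature; the arrays below are synthetic placeholders, whereas the real analysis works on measured TOF-SIMS peak lists.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
peaks = rng.random((60, 200))        # 60 spectra x 200 candidate peaks
protein = np.repeat([0, 1, 2], 20)   # which protein each spectrum is from
mi = mutual_info_classif(peaks, protein, random_state=0)
top_peaks = np.argsort(mi)[::-1][:10]  # peaks most specific to a protein,
print(top_peaks)                       # usable as chemical-image channels
```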
The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science
NASA Astrophysics Data System (ADS)
Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.
2017-12-01
The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.
1987-01-01
Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.
2002-01-01
Enhanced false-color images from the mid-IR, near-IR (NIR), and visible bands of the Landsat Thematic Mapper (TM) are commonly used for visually interpreting land cover type. Described here is a technique for sharpening, or fusion, of NIR with higher-resolution panchromatic (Pan) data that uses a shift-invariant implementation of the discrete wavelet transform (SIDWT) and a reported pixel-based selection rule to combine coefficients. There can be contrast reversals (e.g., at soil-vegetation boundaries between NIR and visible-band images) and consequently degraded sharpening and edge artifacts. To improve performance under these conditions, a local area-based correlation technique, originally reported for comparing image-pyramid-derived edges, was used for the adaptive processing of wavelet-derived edge data. Using the redundant data of the SIDWT also improves edge-data generation, with additional improvement because sharpened subband imagery is used with the edge-correlation process. A previously reported technique for sharpening three-band spectral imagery used forward and inverse intensity, hue, and saturation transforms and wavelet-based sharpening of intensity; that technique had limitations with opposite-contrast data, and in this study sharpening was applied to single-band multispectral-Pan image pairs. Sharpening used simulated 30-m NIR imagery produced by degrading the spatial resolution of a higher-resolution reference. Performance, evaluated by comparison between the sharpened and reference images, improved when sharpened subband data were used with edge correlation.
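A bare-bones version of the SIDWT fusion step, using PyWavelets' stationary wavelet transform and the max-absolute pixel-based selection rule (the paper's area-correlation refinement of the edge data is omitted; image sides must be divisible by 2**level):

```python
import numpy as np
import pywt

def sidwt_fuse(pan, nir, wavelet='db2', level=2):
    """Fuse a Pan band into an NIR band: per pixel, keep the larger-
    magnitude detail coefficient, retain the NIR approximation to
    preserve its spectral content, then invert the transform."""
    cp = pywt.swt2(pan, wavelet, level=level)
    cn = pywt.swt2(nir, wavelet, level=level)
    fused = []
    for (_, dp), (an, dn) in zip(cp, cn):
        details = tuple(np.where(np.abs(p) >= np.abs(q), p, q)
                        for p, q in zip(dp, dn))
        fused.append((an, details))
    return pywt.iswt2(fused, wavelet)
```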
NASA Astrophysics Data System (ADS)
Olayanju, G. M.; Mogaji, K. A.; Lim, H. S.; Ojo, T. S.
2017-06-01
The determination of parameters comprising the exact depth to bedrock and its lithological type, lateral changes in lithology, and the detection of fractures, cracks, or faults is essential to designing sound foundations and assessing the integrity of civil engineering structures. In this study, soil and site characterization in a typical hard-rock geologic terrain in southwestern Nigeria was carried out employing integrated geophysical and geotechnical techniques to address tragedies in civil engineering infrastructural development. The geophysical measurements involved running both very low frequency electromagnetic (VLF-EM) and electrical resistivity surveys (dipole-dipole imaging and vertical electrical sounding (VES) techniques) along the established traverses, while the geotechnical work entailed geological laboratory sieve analyses and Atterberg limit-index tests on soil samples collected in the area. The results of the geophysical measurements, based on the interpreted VLF-EM and dipole-dipole data, revealed conductive zones and linear features interpreted as fractures/faults that endanger the foundations of public infrastructure. Four distinct geoelectric layers, comprising topsoil, lateritic/clayey substratum, weathered layer, and bedrock, were delineated in the area based on the VES results. Strong evidence, including a high degree of decomposition and fracturing of the underlying bedrock revealed by the VES results, confirmed the VLF-EM and dipole-dipole results. Furthermore, values in the ranges of 74.2%-77.8%, 55%-62.5%, 23.4%-24.5%, 7.7%-8.2%, 19.5%-22.4%, and 31.65%-38.25% were obtained from the geotechnical analysis of the soil samples for the parameters soil percentage passing the 0.075 mm sieve, liquid limit, plasticity index, linear shrinkage, natural moisture content, and plastic limit, respectively. The comparatively analyzed geophysical and geotechnical results revealed strong weathering of charnockitic rocks producing plastic clay material mapped with a mean resistivity of 73 Ohm-m; in agreement with the obtained geotechnical parameters, this material fails to meet the standard specifications for subsoil foundation materials, which can in turn negatively impact the foundational integrity of infrastructure. Based on these results, the competence of the area's subsoils for foundations is rated poor to low. More broadly, this study demonstrates the effective application of integrated geophysical and geotechnical methods in the assessment of subsoil competence.
NASA Astrophysics Data System (ADS)
Molina-Viedma, A. J.; Felipe-Sesé, L.; López-Alba, E.; Díaz, F.
2018-03-01
High-speed video cameras provide valuable information on dynamic events, and mechanical characterisation has been improved by interpreting behaviour in slow-motion visualisations. In modal analysis, videos contribute to the evaluation of mode shapes but, generally, the motion is too subtle to be interpreted. In recent years, image-processing algorithms have been developed to generate a magnified version of the motion that can be interpreted by the naked eye. Nevertheless, optical techniques such as Digital Image Correlation (DIC) are able to provide quantitative information on the motion with higher sensitivity than the naked eye. For vibration analysis, mode-shape characterisation is one of the most interesting DIC applications: full-field measurements provide higher spatial density than classical instrumentation or Scanning Laser Doppler Vibrometry. However, the accuracy of DIC is reduced at high frequencies as a consequence of the low displacements, and hence it is habitually employed in the low-frequency spectrum. In the current work, the combination of DIC and motion magnification is explored in order to provide numerical information in magnified videos and to perform DIC mode-shape characterisation at unprecedentedly high frequencies by increasing the amplitude of displacements.
de Mio, Giuliano; Giacheti, Heraldo L
2007-03-01
Correlations between mapping units of coastal sedimentary basins and the interpretation of piezocone test results are presented and discussed based on examples from the Caravelas strandplain (State of Bahia), Paranaguá Bay (State of Paraná), and Guarujá Bay (State of São Paulo), Brazil. Recognizing that the sedimentary environment was mainly controlled by sea-level fluctuations led to the interpretation of transgressive and regressive sedimentary sequences, in good agreement with the sea-level fluctuation curves currently accepted for these regions. The interpretation of piezocone test results shows that the sedimentary sequences of the Caravelas and Guarujá sites are similar and correlate well with the sea-level fluctuation curve accepted for the Salvador region, State of Bahia. The piezocone test results from the Paranaguá site, on the other hand, indicate a sedimentary sequence different from the previous ones, relating to the sea-level fluctuation curve accepted for the Paranaguá region. The results show the high applicability of piezocone testing for stratigraphic logging and suggest that it can be integrated with other current techniques used for paleo-environmental studies in Brazil, in accordance with recent approaches in international research on the subject.
Assessment of the performance of electrode arrays using an image processing technique
NASA Astrophysics Data System (ADS)
Usman, N.; Khiruddin, A.; Nawawi, Mohd
2017-08-01
Interpreting inverted resistivity sections is time-consuming and tedious and requires other sources of information to be geologically relevant. An image processing technique was used to perform post-inversion processing, which makes geophysical data interpretation easier. The inverted data sets were imported into PCI Geomatica 9.0.1 for further processing, where they were clipped and merged in order to match the coordinates of the three layers and permit pixel-to-pixel analysis. The dipole-dipole array is more sensitive to resistivity variation with depth than the Wenner-Schlumberger and pole-dipole arrays. Image processing thus serves as a good post-inversion tool in geophysical data processing.
A generalization of the theory of fringe patterns containing displacement information
NASA Astrophysics Data System (ADS)
Sciammarella, C. A.; Bhat, G.
The theory that interprets interferometric fringes as frequency-modulated signals is used to show that the electro-optical system used to analyze fringe patterns can be considered a simultaneous Fourier spectrum analyzer. This interpretation generalizes the quasi-heterodyning techniques. It is pointed out that the same equations that yield the discrete Fourier transform as summations yield correct values for a reduced number of steps. Examples of the application of the proposed technique to electronic holography are given. It is found that for a uniform field the standard deviation of the individual readings is 1/20 of the fringe spacing.
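As an illustration of treating phase-stepped fringe intensities as the summations of one discrete Fourier transform bin, the sketch below recovers wrapped phase from N equally spaced steps; all names and the synthetic data are illustrative, not the authors' implementation.

```python
# A sketch of phase recovery from N equally spaced phase steps, in the
# spirit of the quasi-heterodyne interpretation described above.
import numpy as np

def phase_from_steps(I):
    """I has shape (N, ...): N phase-stepped intensities per pixel."""
    N = I.shape[0]
    k = np.arange(N).reshape((N,) + (1,) * (I.ndim - 1))
    num = np.sum(I * np.sin(2 * np.pi * k / N), axis=0)
    den = np.sum(I * np.cos(2 * np.pi * k / N), axis=0)
    return np.arctan2(-num, den)   # wrapped phase in (-pi, pi]

# Synthetic check: a known phase ramp is recovered from four steps.
x = np.linspace(0, 4 * np.pi, 256)
steps = np.stack([1 + np.cos(x + 2 * np.pi * k / 4) for k in range(4)])
phi = phase_from_steps(steps)      # equals x, wrapped to (-pi, pi]
```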
NASA Technical Reports Server (NTRS)
Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.
1982-01-01
Problems associated with the development of the Measurement of Air Pollution from Satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low-altitude aircraft flights (one to six km); mid-altitude aircraft flights (eight to twelve km); and orbiting space platforms. Extensive research work in four major areas of data management formed the framework for the implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.
Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas
2016-11-01
Clinically, the detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparisons with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters. A comparative analysis of the advantages and disadvantages of the techniques in the detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) or diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although a reduced screening time and clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality and sharper cytomorphologic features, which are more critical from the point of view of cytologic interpretation. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.
Application of seismic-refraction techniques to hydrologic studies
Haeni, F.P.
1988-01-01
During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations and, to some extent, in hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high-seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for seismic-refraction methods. These methods allow economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of these techniques in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Furthermore, examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.
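As a concrete illustration of the kind of interpretation such a manual covers, the sketch below evaluates the classic two-layer crossover-distance relation, z = (x_c/2) * sqrt((v2 - v1)/(v2 + v1)); the velocities and crossover distance are illustrative values, not figures from the manual.

```python
# A worked example of the two-layer refraction relation: depth z to the
# refractor from the crossover distance x_c and layer velocities v1 < v2.
import math

v1, v2 = 500.0, 1500.0   # m/s: unsaturated over saturated deposits (assumed)
x_c = 60.0               # m: crossover distance from the travel-time plot

z = (x_c / 2.0) * math.sqrt((v2 - v1) / (v2 + v1))
print(f"depth to refractor = {z:.1f} m")   # about 21.2 m
```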
Advanced Neuroimaging in Traumatic Brain Injury
Edlow, Brian L.; Wu, Ona
2013-01-01
Advances in structural and functional neuroimaging have occurred at a rapid pace over the past two decades. Novel techniques for measuring cerebral blood flow, metabolism, white matter connectivity, and neural network activation have great potential to improve the accuracy of diagnosis and prognosis for patients with traumatic brain injury (TBI), while also providing biomarkers to guide the development of new therapies. Several of these advanced imaging modalities are currently being implemented into clinical practice, whereas others require further development and validation. Ultimately, for advanced neuroimaging techniques to reach their full potential and improve clinical care for the many civilians and military personnel affected by TBI, it is critical for clinicians to understand the applications and methodological limitations of each technique. In this review, we examine recent advances in structural and functional neuroimaging and the potential applications of these techniques to the clinical care of patients with TBI. We also discuss pitfalls and confounders that should be considered when interpreting data from each technique. Finally, given the vast amounts of advanced imaging data that will soon be available to clinicians, we discuss strategies for optimizing data integration, visualization and interpretation. PMID:23361483
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandes, Justin L.; Rappaport, Carey M.; Sheen, David M.
2011-05-01
The cylindrical millimeter-wave imaging technique, developed at Pacific Northwest National Laboratory (PNNL) and commercialized by L-3 Communications/Safeview in the ProVision system, is currently being deployed in airports and other high-security locations to meet person-borne weapon and explosive detection requirements. While this system is efficient and effective in its current form, there are a number of areas in which detection performance may be improved through different reconstruction algorithms and sensing configurations. PNNL and Northeastern University have teamed together to investigate higher-order imaging artifacts produced by the current cylindrical millimeter-wave imaging technique using full-wave forward modeling and laboratory experimentation. Based on imaging results and scattered-field visualizations using the full-wave forward model, a new imaging system is proposed. The new system combines a multistatic sensor configuration with the generalized synthetic aperture focusing technique (GSAFT). Initial results show an improved ability to image areas of the body where target shading and specular and higher-order reflections make images produced by the monostatic system difficult to interpret.
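For orientation only, here is a much-simplified two-dimensional, monostatic, delay-and-sum sketch of the synthetic-aperture focusing idea behind GSAFT; the real system is cylindrical, multistatic and millimeter-wave, so the geometry, names and values below are illustrative stand-ins, not the PNNL algorithm.

```python
# A 2D monostatic delay-and-sum reconstruction sketch.
import numpy as np

def saft_image(traces, xs, ts, c, grid_x, grid_z):
    """traces[i]: time signal recorded at transceiver position (xs[i], 0)."""
    img = np.zeros((len(grid_z), len(grid_x)))
    for i, x0 in enumerate(xs):
        for ix, x in enumerate(grid_x):
            for iz, z in enumerate(grid_z):
                t = 2.0 * np.hypot(x - x0, z) / c       # round-trip delay
                j = np.searchsorted(ts, t)
                if j < len(ts):
                    img[iz, ix] += traces[i][j]         # coherent summation
    return img

# Toy usage: one point scatterer at (0.05 m, 0.10 m) seen from 16 positions.
c, fs = 1500.0, 1e6
ts = np.arange(4000) / fs
xs = np.linspace(-0.2, 0.2, 16)
traces = []
for x0 in xs:
    tr = np.zeros_like(ts)
    tr[np.searchsorted(ts, 2 * np.hypot(0.05 - x0, 0.10) / c)] = 1.0
    traces.append(tr)
img = saft_image(traces, xs, ts, c,
                 np.linspace(-0.1, 0.1, 41), np.linspace(0.05, 0.15, 41))
```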
Advantage of spatial map ion imaging in the study of large molecule photodissociation
NASA Astrophysics Data System (ADS)
Lee, Chin; Lin, Yen-Cheng; Lee, Shih-Huang; Lee, Yin-Yu; Tseng, Chien-Ming; Lee, Yuan-Tseh; Ni, Chi-Kung
2017-07-01
The original ion imaging technique has low velocity resolution, and currently, photodissociation is mostly investigated using velocity map ion imaging. However, separating signals from the background (resulting from undissociated excited parent molecules) is difficult when velocity map ion imaging is used for the photodissociation of large molecules (number of atoms ≥ 10). In this study, we used the photodissociation of phenol at the S1 band origin as an example to demonstrate how our multimass ion imaging technique, based on modified spatial map ion imaging, can overcome this difficulty. The photofragment translational energy distribution obtained when multimass ion imaging was used differed considerably from that obtained when velocity map ion imaging and Rydberg atom tagging were used. We used conventional translational spectroscopy as a second method to further confirm the experimental results, and we conclude that data should be interpreted carefully when velocity map ion imaging or Rydberg atom tagging is used in the photodissociation of large molecules. Finally, we propose a modified velocity map ion imaging technique without the disadvantages of the current velocity map ion imaging technique.
NASA Astrophysics Data System (ADS)
Su, Zhongqing; Ye, Lin
2004-08-01
The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded by complicated wave dispersion phenomena, the existence of multiple wave modes, high susceptibility to diverse interferences, bulky sampled data and the difficulty of signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed and actualized in a signal processing package (SPP). The ISPPR technique integrates signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, and is capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and to the construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that elastic wave propagation-based damage assessment could be dramatically streamlined by the introduction of the ISPPR technique.
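A minimal sketch of such a chain follows: wavelet filtration and compression of a raw signal into concise features, then neural-network pattern recognition. The wavelet, decomposition level, network size and the synthetic "damage" signal are illustrative assumptions, not the SPP's internals.

```python
# Wavelet sub-band energies as compressed features, fed to a small MLP.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_features(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])  # sub-band energies

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 1024)
signals = [rng.normal(size=1024) + a * np.sin(t)
           for a in [0.0] * 50 + [2.0] * 50]
X = np.array([wavelet_features(s) for s in signals])
y = np.array([0] * 50 + [1] * 50)     # 0 = pristine, 1 = damaged
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```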
2014-01-01
Background: Objective physical assessment of patients with lumbar spondylosis involves plain film radiograph (PFR) viewing and interpretation by radiologists. Physiotherapists also routinely assess PFR within the scope of their practice. However, studies appraising the level of agreement of physiotherapists' PFR interpretation with radiologists are not common in Ghana. Method: Forty-one (41) physiotherapists took part in the cross-sectional survey. An assessment guide was developed from the findings of the interpretation of three PFR of patients with lumbar spondylosis by a radiologist. The three PFR were selected from a pool of different radiographs based on clarity, common visible pathological features, coverage of body segments and short post-production period. Physiotherapists were required to view the same PFR, after which they were assessed with the assessment guide according to the number of features identified correctly or incorrectly. The score range on the assessment form was 0-24, interpreted as follows: 0-8 points (low), 9-16 points (moderate) and 17-24 points (high) levels of agreement. Data were analyzed using the one-sample t-test and Fisher's exact test at α = 0.05. Results: The mean interpretation score of the physiotherapists was 12.7 ± 2.6 points, compared to the radiologist's interpretation of 24 points (assessment guide). The physiotherapists' levels of agreement were found to be significantly associated with their academic qualification (p = 0.006) and sex (p = 0.001). However, their levels of agreement were not significantly associated with their age group (p = 0.098), work settings (p = 0.171), experience (p = 0.666), preferred PFR view (p = 0.088) or continuing education (p = 0.069). Conclusions: The physiotherapists' skills fall short of expectation for interpreting PFR of patients with lumbar spondylosis. Their levels of agreement with the radiologist's interpretation have no link with years of clinical practice, age, work settings or continuing education. Thus, routine PFR viewing techniques should be made a priority in physiotherapists' continuing professional education. PMID:24678695
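For readers wishing to reproduce the headline comparison, the sketch below reconstructs a one-sample t-test of the reported mean score against the 24-point reference from the summary statistics above; that the authors used exactly this formulation is an assumption.

```python
# One-sample t-test rebuilt from the reported summary statistics.
import math
from scipy import stats

n, mean, sd, ref = 41, 12.7, 2.6, 24.0
t = (mean - ref) / (sd / math.sqrt(n))       # about -27.8
p = 2 * stats.t.sf(abs(t), df=n - 1)         # vanishingly small
print(t, p)
```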
Code of Federal Regulations, 2010 CFR
2010-07-01
..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...
Immunohistochemistry as an Important Tool in Biomarkers Detection and Clinical Practice
de Matos, Leandro Luongo; Trufelli, Damila Cristina; de Matos, Maria Graciela Luongo; da Silva Pinhal, Maria Aparecida
2010-01-01
The immunohistochemistry technique is used in the search for cell or tissue antigens ranging from amino acids and proteins to infectious agents and specific cellular populations. The technique comprises two phases: (1) slide preparation and the stages involved in the reaction; (2) interpretation and quantification of the obtained expression. Immunohistochemistry is an important tool for scientific research and also a complementary technique for the elucidation of differential diagnoses that cannot be determined by conventional analysis with hematoxylin and eosin. In the last couple of decades there has been an exponential increase in publications on immunohistochemistry and immunocytochemistry techniques. This review covers the immunohistochemistry technique: its history, applications, importance, limitations, difficulties and problems, and some aspects related to the interpretation and quantification of results. Future developments in the immunohistochemistry technique and its expression quantification should not be disseminated in two languages, that of the pathologist and that of the clinician or surgeon. The scientific, diagnostic and prognostic applications of this methodology must be explored to benefit the patient. In order to achieve this goal, collaboration and pooling of knowledge from both of these valuable medical areas is vital. PMID:20212918
NASA Astrophysics Data System (ADS)
Wilgocka, Aleksandra; Rączkowski, Włodzimierz; Kostyrko, Mikołaj; Ruciński, Dominik
2016-08-01
Years of experience in air-photo interpretation lead us to conclude that we know what we are looking at, we know why we can see cropmarks, and we can even estimate when the best opportunities to observe them occur. But even today cropmarks may be a subject of misinterpretation or wishful thinking. The same problems appear when working with aerial photographs, satellite imagery, ALS, geophysics, etc. In this paper we present several case studies based on data acquired for and within ArchEO - an archaeological applications of Earth Observation techniques project - to discuss the complexity and consequences of archaeological interpretations. While testing the usefulness of satellite imagery in Poland on various types of sites, cropmarks were the most frequent indicators of past landscapes as well as of archaeological and natural features. Hence, new archaeological sites have been discovered mainly thanks to cropmarks. This situation has given us an opportunity to test not only satellite imagery as a source of data but also to confront it with the results of other non-invasive methods of data acquisition. When working with a variety of data we have encountered several issues that raise problems of interpretation. Consequently, questions related to the cognitive value of remote sensing data appear and should be discussed. What do the data represent? To what extent do the imagery, cropmarks or other visualizations represent the past? How should we deal with the ambiguity of data? What can we learn from pitfalls in the interpretation of cropmarks, soilmarks, etc., so as to share more of Sherlock's methodology rather than run around with Don Quixote's delusions?
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for the computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To apply the correct model properly, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for the interpretation of water-resources data and for the prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable.
To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
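A minimal sketch of the regression strategy the report recommends follows: fit a log-transformed load model, then inspect residuals before trusting predictions. The variable names and synthetic data are illustrative, not from the report.

```python
# Log-linear regression of synthetic runoff loads against drainage area.
import numpy as np

rng = np.random.default_rng(2)
area = rng.uniform(1, 100, 200)                          # explanatory variable
load = 3.0 * area ** 0.8 * rng.lognormal(0.0, 0.3, 200)  # synthetic loads

# Least squares on log-transformed data: log(load) = b0 + b1*log(area) + e
X = np.column_stack([np.ones(len(area)), np.log(area)])
b, *_ = np.linalg.lstsq(X, np.log(load), rcond=None)
residuals = np.log(load) - X @ b
# Diagnostics: residuals should look homoscedastic and roughly normal
# before the model is used to predict loads at unmonitored sites.
```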
Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent
2012-07-01
In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is a greater need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to the different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and the same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods of interpreting synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A. baumannii has been the subject of a considerable amount of research on antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred in A. baumannii infections, namely imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A. baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method. Subsequently, the activity of the two combinations was tested over the dilution range of 4 × MIC to 0.03 × MIC in 96-well checkerboard plates. The results were evaluated separately using the four different interpretation methods frequently preferred by researchers, in order to detect to what extent the rates of synergistic, indifferent and antagonistic interactions were affected by the different interpretation methods. The differences between the interpretation methods were tested by chi-square analysis for each combination used. Statistically significant differences were detected between the four interpretation methods for the determination of synergistic and indifferent interactions (p < 0.0001). The highest rates of synergy were observed with both combinations by the method that used the lowest fractional inhibitory concentration index of all the non-turbid wells along the turbidity/non-turbidity interface. There was no statistically significant difference between the four methods for the detection of antagonism (p > 0.05). In conclusion, although there is a standard procedure for checkerboard synergy testing, it fails to yield standard results owing to the different methods of interpreting the results. Thus, there is a need to standardise the interpretation method for checkerboard synergy testing. To determine the most appropriate interpretation method, further studies investigating the clinical benefits of synergistic combinations and comparing the consistency of the results with those obtained from other standard combination tests, such as time-kill studies, are required.
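The quantity at the heart of these interpretation methods is the fractional inhibitory concentration index (FICI). The worked example below computes it for one well, with illustrative MIC values and the commonly used breakpoints, not figures from this study.

```python
# FICI for one checkerboard well: sum of the fractional MICs of both drugs.
def fici(mic_a_combo, mic_a_alone, mic_b_combo, mic_b_alone):
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

idx = fici(0.5, 4.0, 2.0, 16.0)   # 0.125 + 0.125 = 0.25
# Commonly used breakpoints: <= 0.5 synergy, > 4 antagonism, in between
# indifference. As the study shows, which well(s) along the turbidity
# interface the index is read from is exactly where methods diverge.
print(idx)
```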
Fent, Graham; Gosai, Jivendra; Purva, Makani
2016-01-01
Accurate interpretation of the electrocardiogram (ECG) remains an essential skill for medical students and junior doctors. While many techniques for teaching ECG interpretation have been described, no single method has been shown to be superior. This randomized controlled trial is the first to investigate whether teaching ECG interpretation using a computer simulator program or traditional teaching leads to improved scores in a test of ECG interpretation among medical students and postgraduate doctors immediately after and 3 months following teaching. Participants' opinions of the program were assessed using a questionnaire. There were no differences in ECG interpretation test scores immediately after or 3 months after teaching in the lecture or simulator groups. At present, therefore, there is insufficient evidence to suggest that ECG simulator programs are superior to traditional teaching. Copyright © 2016 Elsevier Inc. All rights reserved.
Keefe, John R; Solomonov, Nili; Derubeis, Robert J; Phillips, Alexander C; Busch, Fredric N; Barber, Jacques P; Chambless, Dianne L; Milrod, Barbara L
2018-04-18
This study examines whether, in panic-focused psychodynamic psychotherapy (PFPP), interpretations of conflicts that underlie anxiety (panic-focused or PF-interpretations) are specifically associated with subsequent panic disorder (PD) symptom improvement, over and above the provision of non-symptom-focused interpretations. Technique use in Sessions 2 and 10 of a 24-session PFPP protocol was assessed for the 65 patients with complete outcome data randomized to PFPP in a two-site trial of psychotherapies for PD. Sessions were rated in 15-min segments for therapists' use of PF-interpretations, non-PF-interpretations, and PF-clarifications. Robust regressions were conducted to examine the relationship between these interventions and symptom change subsequent to the sampled session. Interpersonal problems were examined as a moderator of the relationship of PF-interpretations to symptom change. At Session 10, but not at Session 2, patients who received a higher degree of PF-interpretations experienced greater subsequent improvement in panic symptoms. Non-PF-interpretations were not predictive. Patients with more interpersonal distress benefitted particularly from the use of PF-interpretations at Session 10. By the middle phase of PFPP, panic-focused interpretations may drive subsequent improvements in panic symptoms, especially among patients with higher interpersonal distress. Interpretations of conflict absent a panic focus may not be especially helpful.
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for the characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to characterise it better than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving the increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, since combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques that are sufficiently well established that the instrumentation is commercially available, examining physical properties including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely intended to identify and quantify chemical species. The review will therefore address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.
Interpretation and classification of microvolt T wave alternans tests
NASA Technical Reports Server (NTRS)
Bloomfield, Daniel M.; Hohnloser, Stefan H.; Cohen, Richard J.
2002-01-01
Measurement of microvolt-level T wave alternans (TWA) during routine exercise stress testing is now possible as a result of sophisticated noise reduction techniques and analytic methods that have become commercially available. Even though this technology is new, the available data suggest that microvolt TWA is a potent predictor of arrhythmia risk in diverse disease states. As this technology becomes more widely available, physicians will be called upon to interpret microvolt TWA tracings. This review seeks to establish uniform standards for the clinical interpretation of microvolt TWA tracings.
A self-trained classification technique for producing 30 m percent-water maps from Landsat data
Rover, Jennifer R.; Wylie, Bruce K.; Ji, Lei
2010-01-01
Small bodies of water can be mapped with moderate-resolution satellite data using methods where water is mapped as subpixel fractions using field measurements or high-resolution images as training datasets. A new method, developed from a regression-tree technique, uses a 30 m Landsat image for training the regression tree that, in turn, is applied to the same image to map subpixel water. The self-trained method was evaluated by comparing the percent-water map with three other maps generated from established percent-water mapping methods: (1) a regression-tree model trained with a 5 m SPOT 5 image, (2) a regression-tree model based on endmembers and (3) a linear unmixing classification technique. The results suggest that subpixel water fractions can be accurately estimated when high-resolution satellite data or intensively interpreted training datasets are not available, which increases our ability to map small water bodies or small changes in lake size at a regional scale.
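A minimal sketch of the self-trained idea follows, assuming the training water fractions come from confidently pure pixels of the same 30 m image (here via a thresholded water index) rather than from high-resolution reference data; the band indices, thresholds and index are illustrative stand-ins, not the authors' procedure.

```python
# Self-trained regression tree for subpixel percent-water mapping.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
bands = rng.uniform(0, 1, (5000, 6))        # six-band reflectances per pixel
ndwi = (bands[:, 1] - bands[:, 4]) / (bands[:, 1] + bands[:, 4])

# Self-training labels: confidently pure pixels are 100% or 0% water.
pure = (ndwi > 0.4) | (ndwi < -0.2)
frac = np.where(ndwi > 0.4, 1.0, 0.0)[pure]

tree = DecisionTreeRegressor(max_depth=8).fit(bands[pure], frac)
percent_water = 100.0 * tree.predict(bands)  # subpixel estimates everywhere
```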
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques, David A.; Streamer, Margaret; Rowland, Susan L.
2009-09-02
The crystal structure of the DNA-damage checkpoint inhibitor of sporulation, Sda, from Bacillus subtilis, has been solved by the MAD technique using selenomethionine-substituted protein. The structure closely resembles that previously solved by NMR, as well as the structure of a homologue from Geobacillus stearothermophilus solved in complex with the histidine kinase KinB. The structure contains three molecules in the asymmetric unit. The unusual trimeric arrangement, which lacks simple internal symmetry, appears to be preserved in solution based on an essentially ideal fit to previously acquired scattering data for Sda in solution. This interpretation contradicts previous findings that Sda was monomeric or dimeric in solution. This study demonstrates the difficulties that can be associated with the characterization of small proteins and the value of combining multiple biophysical techniques. It also emphasizes the importance of understanding the physical principles behind these techniques and therefore their limitations.
3D Modeling of Ultrasonic Wave Interaction with Disbonds and Weak Bonds
NASA Technical Reports Server (NTRS)
Leckey, C.; Hinders, M.
2011-01-01
Ultrasonic techniques, such as the use of guided waves, can be ideal for finding damage in the plate and pipe-like structures used in aerospace applications. However, the interaction of waves with real flaw types and geometries can lead to experimental signals that are difficult to interpret. 3-dimensional (3D) elastic wave simulations can be a powerful tool in understanding the complicated wave scattering involved in flaw detection and for optimizing experimental techniques. We have developed and implemented parallel 3D elastodynamic finite integration technique (3D EFIT) code to investigate Lamb wave scattering from realistic flaws. This paper discusses simulation results for an aluminum-aluminum diffusion disbond and an aluminum-epoxy disbond and compares results from the disbond case to the common artificial flaw type of a flat-bottom hole. The paper also discusses the potential for extending the 3D EFIT equations to incorporate physics-based weak bond models for simulating wave scattering from weak adhesive bonds.
NASA Technical Reports Server (NTRS)
Blackwell, R. J.
1982-01-01
The use of remote sensing data analysis for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface-acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface-acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost-effective manner. Image data base technology is used to great advantage in characterizing other effects contributing to water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.
Computational Biology Methods for Characterization of Pluripotent Cells.
Araúzo-Bravo, Marcos J
2016-01-01
Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol of pluripotency assessment remains to be established. Here the use of high-throughput techniques helps, in particular the employment of gene expression microarrays, which has become a complementary technique for cellular characterization. Research has shown that transcriptomics comparison with an Embryonic Stem Cell (ESC) of reference is a good approach to assess pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain, line by line, a software protocol coded in R-Bioconductor for pluripotency assessment based on the comparison of transcriptomics data of pluripotent cells with an ESC of reference. I provide advice for experimental design, warn about possible pitfalls, and give guides for the interpretation of results.
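The published protocol is written in R-Bioconductor; purely to illustrate the core comparison it performs, here is a Python sketch that scores a candidate transcriptome by rank correlation against an ESC reference. The scoring function and synthetic data are assumptions, not the chapter's code.

```python
# Rank-correlation comparison of a candidate transcriptome with an ESC
# reference over the same genes; values near 1 support pluripotency.
import numpy as np
from scipy.stats import spearmanr

def pluripotency_score(candidate, esc_reference):
    rho, _ = spearmanr(candidate, esc_reference)
    return rho   # near 1 supports a pluripotent-like expression profile

rng = np.random.default_rng(4)
esc = rng.normal(8, 2, 20000)                  # reference log-expression
candidate = esc + rng.normal(0, 0.5, 20000)    # a closely matching sample
print(pluripotency_score(candidate, esc))
```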
NASA Astrophysics Data System (ADS)
Louie, J. N.; Basler-Reeder, K.; Kent, G. M.; Pullammanappallil, S. K.
2015-12-01
Simultaneous joint seismic-gravity optimization improves P-wave velocity models in areas with sharp lateral velocity contrasts. Optimization is achieved using simulated annealing, a metaheuristic global optimization algorithm that does not require an accurate initial model. Balancing the seismic-gravity objective function is accomplished by a novel approach based on the analysis of Pareto charts. Gravity modeling uses a newly developed convolution algorithm, while seismic modeling utilizes the highly efficient Vidale eikonal equation traveltime generation technique. Synthetic tests show that joint optimization improves velocity model accuracy and provides velocity control below the deepest headwave raypath. Detailed first-arrival picking followed by trial velocity modeling remediates inconsistent data. We use a set of highly refined first-arrival picks to compare the results of a convergent joint seismic-gravity optimization to the Plotrefa™ and SeisOpt® Pro™ velocity modeling packages. Plotrefa™ uses a nonlinear least squares approach that is dependent on the initial model and produces shallow velocity artifacts. SeisOpt® Pro™ utilizes the simulated annealing algorithm and is limited to depths above the deepest raypath. Joint optimization increases the depth of constrained velocities, improving reflector coherency at depth. Kirchhoff prestack depth migrations reveal that joint optimization ameliorates shallow velocity artifacts caused by limitations in refraction ray coverage. Seismic and gravity data from the San Emidio Geothermal field of the northwest Basin and Range province demonstrate that joint optimization changes interpretation outcomes. The prior shallow-valley interpretation gives way to a deep-valley model, while shallow antiformal reflectors that could have been interpreted as antiformal folds are flattened. Furthermore, joint optimization provides a clearer image of the range-front fault. This technique can readily be applied to existing datasets and could replace the existing strategy of forward modeling to match gravity data.
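A toy sketch of the optimization loop described above follows: simulated annealing over a weighted sum of seismic and gravity misfits, with the weight standing in for the Pareto-chart balancing. The one-parameter forward models and the cooling schedule are illustrative placeholders, not the authors' algorithms.

```python
# Simulated annealing on a joint seismic-gravity objective (toy model).
import math, random

def seismic_misfit(m):  return (m - 2.0) ** 2    # placeholder forward model
def gravity_misfit(m):  return (m - 2.4) ** 2    # placeholder forward model

def joint(m, lam=0.5):
    return (1 - lam) * seismic_misfit(m) + lam * gravity_misfit(m)

random.seed(0)
m, T = 5.0, 1.0                                  # a poor initial model is fine
for _ in range(5000):
    trial = m + random.gauss(0, 0.1)
    dE = joint(trial) - joint(m)
    if dE < 0 or random.random() < math.exp(-dE / T):
        m = trial                                # accept downhill, or uphill with prob exp(-dE/T)
    T *= 0.999                                   # geometric cooling
print(m)   # settles near the compromise of the two objectives
```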
Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge
2014-01-01
Background: Combining different sources of knowledge to build improved structure-activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework for interoperating between learning techniques. Most current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility of directly combining these sources at the knowledge level, with the aim of harvesting potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results: To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of a hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge into a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge, and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems, along with an illustrative application to the prediction of mutagenicity. Conclusion: It is possible to represent knowledge in the unified form of a hypothesis network, allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high-level reasoning and meta-learning can be applied; these perspectives will be explored in future work. PMID:24959206
Report on Concepts & Approaches for SSBD for eCHEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, Chantell Lynne-Marie
The verification of special nuclear material (SNM) in spent fuel pyroprocessing is an important safeguards challenge. The detection of spontaneous fission (SF) neutrons from curium is an accepted, non-destructive technique that has been applied to verify SNM content in used fuel and other materials in the fuel cycle. The nuclear material accounting (NMA) technique at the Korea Atomic Energy Research Institute's Reference Engineering-scale Pyroprocessing Facility (REPF) is based on the Cm balance technique. Several publications have demonstrated the safeguards benefit of using process monitoring (PM) in nuclear facilities as a complementary measure to NMA. More recently, this concept was expanded and preliminarily demonstrated for pyroprocessing. The concept of Signature Based Safeguards (SBS) is part of this expansion and is built around the interpretation of input from various sensors in a declared facility, coupled with complementary NMA methods, to increase confidence and lower standard error inventory differences (SEID). The SBS methodology was conceptually developed and relies on near-real-time analysis of process monitoring data to detect material diversion, complemented by robust containment and surveillance (C/S) measures. This work demonstrates one example of how the SBS framework can be used in the electrorefiner. In this SBS application, a combination of cyclic voltammetry (CV) and neutron counting is applied to track and monitor the Pu mass balance. The main purpose of this experiment is to determine whether meaningful information can be gained from CV measurements with regard to the Mg/Gd ratio. These data will be coupled with ICP-MS to verify Gd concentrations and analyzed for statistical significance. It is expected that the CV data will register a significant change under off-normal operating conditions. Knowing how to identify and interpret those changes may help inform how to target more traditional neutron counting methods, which could support a more efficient safeguards system. The experimental results will be compared with theoretical calculations and the ERAD simulations.
NASA Astrophysics Data System (ADS)
Oh, J.; Min, D.; Kim, W.; Huh, C.; Kang, S.
2012-12-01
Recently, CCS (Carbon Capture and Storage) has emerged as one of the promising methods to reduce CO2 emissions. To evaluate the success of a CCS project, various geophysical monitoring techniques have been applied. Among them, time-lapse seismic monitoring is one of the most effective methods to investigate the migration of the CO2 plume. To monitor the injected CO2 plume accurately, seismic monitoring data need to be interpreted using not only imaging techniques but also full waveform inversion, because subsurface material properties can be estimated through the inversion. However, previous work on interpreting seismic monitoring data has mainly been based on imaging techniques. In this study, we perform frequency-domain full waveform inversion on synthetic data obtained by acoustic-elastic coupled modeling for a geological model built after the Ulleung Basin, which is one of the CO2 storage prospects in Korea. We suppose the injection layer is located in fault-related anticlines in the Dolgorae Deformed Belt and, for a more realistic situation, we contaminate the synthetic monitoring data with random noise and outliers. We perform the time-lapse full waveform inversion under two scenarios. In one scenario, the injected CO2 plume migrates within the injection layer and is stably captured. In the other, the injected CO2 plume leaks through a weak part of the cap rock. Using the inverted P- and S-wave velocities and Poisson's ratio, we were able to detect the migration of the injected CO2 plume. Acknowledgment: This work was financially supported by the Brain Korea 21 project of Energy Systems Engineering, the "Development of Technology for CO2 Marine Geological Storage" program funded by the Ministry of Land, Transport and Maritime Affairs (MLTM) of Korea, and the Korea CCS R&D Center (KCRC) grant funded by the Korea government (Ministry of Education, Science and Technology) (No. 2012-0008926).
Adaptive inferential sensors based on evolving fuzzy models.
Angelov, Plamen; Kordon, Arthur
2010-04-01
A new approach to the design and use of inferential sensors in the process industry is proposed in this paper, based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop adaptive and self-calibrating online inferential sensors that reduce maintenance costs while keeping high precision and interpretability/transparency. The proposed new methodology makes it possible for inferential sensors to recalibrate automatically, which significantly reduces the life-cycle effort for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with an evolving and self-developing structure learned from data streams; (2) a new methodology for online automatic selection of the input variables that are most relevant for the prediction; (3) a technique to detect automatically a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely eSensors, were used for predicting the chemical properties of different products at The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that well-interpretable inferential sensors with a simple structure can automatically be designed from the data stream in real time to predict various process variables of interest. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the challenges of the modern advanced process industry.
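As one concrete piece of such a design, the sketch below shows an online standardization step of the kind item (4) refers to: running mean and variance maintained per sample with Welford's method, so streaming inputs can be scaled without storing history. This is a generic sketch, not the eSensors implementation.

```python
# Online (streaming) standardization via Welford's running statistics.
class OnlineStandardizer:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def scale(self, x):
        std = (self.m2 / (self.n - 1)) ** 0.5 if self.n > 1 else 1.0
        return (x - self.mean) / (std if std > 0 else 1.0)

s = OnlineStandardizer()
for value in [4.1, 3.9, 4.4, 4.0]:
    s.update(value)
print(s.scale(4.2))   # standardized against the running statistics
```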
Empirical Approach to Understanding the Fatigue Behavior of Metals Made Using Additive Manufacturing
NASA Astrophysics Data System (ADS)
Witkin, David B.; Albright, Thomas V.; Patel, Dhruv N.
2016-08-01
High-cycle fatigue measurements were performed on alloys prepared by powder-bed fusion additive manufacturing techniques. Selective laser melted (SLM) nickel-based superalloy 625 and electron beam melted (EBM) Ti-6Al-4V specimens were prepared as round fatigue specimens and tested with as-built surfaces at stress ratios of -1, 0.1 and 0.5. Data collected at R = -1 were used to construct Goodman diagrams that correspond closely to measured experimental data collected at R > 0. A second way to interpret the HCF data is based on the influence of surface roughness on fatigue, approximating the surface feature size as a notch. On this basis, the data were interpreted using the fatigue notch factor k_f and average stress models relating k_f to the stress concentration factor K_t. The depth and root radius of surface features associated with fatigue crack initiation were used to estimate a K_t of 2.8 for SLM 625. For Ti-6Al-4V, a direct estimate of K_t from HCF data was not possible, but approximate values of k_f based on HCF data and K_t from crack-initiation-site geometry are found to explain other published EBM Ti-6Al-4V data.
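The average-stress models mentioned relate K_t to k_f through a material length constant; one common Peterson-type form is k_f = 1 + (K_t - 1)/(1 + a/r). The worked example below uses the paper's K_t estimate of 2.8 for SLM 625 together with assumed values for the material constant a and the feature root radius r.

```python
# Peterson-type average-stress relation between K_t and k_f.
K_t = 2.8      # estimated from surface-feature depth and root radius
a = 0.1        # mm, material characteristic length (assumed)
r = 0.05       # mm, surface-feature root radius (assumed)

k_f = 1 + (K_t - 1) / (1 + a / r)
print(k_f)     # 1.6: the feature is less damaging in fatigue than K_t implies
```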
NASA Astrophysics Data System (ADS)
Asiedu, Mercy Nyamewaa; Simhal, Anish; Lam, Christopher T.; Mueller, Jenna; Chaudhary, Usamah; Schmitt, John W.; Sapiro, Guillermo; Ramanujam, Nimmi
2018-02-01
The World Health Organization recommends visual inspection with acetic acid (VIA) and/or Lugol's iodine (VILI) for cervical cancer screening in low-resource settings. Human interpretation of diagnostic indicators for visual inspection is qualitative and subjective and has high inter-observer discordance, which can lead both to adverse outcomes for the patient and to unnecessary follow-ups. In this work, we present a simple method for automatic feature extraction and classification for Lugol's iodine cervigrams acquired with a low-cost, miniature, digital colposcope. Algorithms to preprocess expert-physician-labelled cervigrams and to extract simple but powerful color-based features are introduced. The features are used to train a support vector machine model to classify cervigrams based on expert physician labels. The selected framework achieved a sensitivity, specificity, and accuracy of 89.2%, 66.7% and 80.6%, respectively, against the majority diagnosis of the expert physicians in discriminating cervical intraepithelial neoplasia (CIN+) from normal tissue. The proposed classifier also achieved an area under the curve of 84% when trained with the majority diagnosis of the expert physicians. The results suggest that utilizing simple color-based features may enable unbiased automation of VILI cervigrams, opening the door to a full system of low-cost data acquisition complemented by automatic interpretation.
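A minimal sketch of the pipeline described above follows: simple colour features from each cervigram feeding a support vector machine trained on expert labels. The feature choice (per-channel mean and spread) and the random stand-in data are illustrative, not the paper's exact features.

```python
# Colour features + SVM classification of cervigram-like images.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def color_features(image):
    """image: (H, W, 3) RGB array -> per-channel mean and std."""
    return np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])

rng = np.random.default_rng(5)
images = rng.uniform(0, 1, (120, 64, 64, 3))   # stand-in cervigrams
labels = rng.integers(0, 2, 120)               # expert labels: CIN+ vs normal
X = np.array([color_features(im) for im in images])
print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```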
ERIC Educational Resources Information Center
Holman, Garvin L.
This report documents the training effectiveness of a map interpretation and terrain analysis course (MITAC) developed to enhance the ability of helicopter pilots to navigate accurately during low altitude terrain following flight. A study comparing student aviators taught by the MITAC technique with a control group of students taught by…
Bladed disc crack diagnostics using blade passage signals
NASA Astrophysics Data System (ADS)
Hanachi, Houman; Liu, Jie; Banerjee, Avisekh; Koul, Ashok; Liang, Ming; Alavi, Elham
2012-12-01
One of the major potential faults in a turbofan engine is crack initiation and propagation in bladed discs under cyclic loads, which could result in the breakdown of the engine if not detected at an early stage. Reliable fault detection techniques are therefore in demand to reduce maintenance cost and prevent catastrophic failures. Although a number of approaches have been reported in the literature, it remains very challenging to develop a reliable technique to accurately estimate the health condition of a rotating bladed disc. Correspondingly, this paper presents a novel technique for bladed disc crack detection through two sequential signal processing stages: (1) signal preprocessing, which aims to eliminate the noise in the blade passage signals; (2) signal postprocessing, which intends to identify the crack location. In the first stage, physics-based modeling and interpretation are established to help characterize the noise. Crack initiation can be determined from a health monitoring index calculated from the sinusoidal effects. In the second stage, the crack is located through advanced detrended fluctuation analysis of the preprocessed data. The proposed technique is validated using a set of spin rig test data (i.e. tip clearance and time of arrival) acquired during a test conducted on a bladed military engine fan disc. The test results demonstrate that the developed technique is an effective approach for identifying and locating an incipient crack at the root of a bladed disc.
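A minimal sketch of detrended fluctuation analysis (DFA), the second-stage tool named above, follows: integrate the series, split it into windows, remove a local linear trend per window, and track RMS fluctuation against window size. The scales and data are illustrative, and the paper's "advanced" variant may differ from this textbook DFA-1.

```python
# Textbook first-order detrended fluctuation analysis.
import numpy as np

def dfa(signal, scales):
    y = np.cumsum(signal - np.mean(signal))       # integrated profile
    F = []
    for s in scales:
        n_win = len(y) // s
        msq = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)   # a local crack signature alters the scaling of F(s)

fluct = dfa(np.random.default_rng(6).normal(size=4096), [16, 32, 64, 128])
```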
Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates
Malone, Brian J.
2017-01-01
Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
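A minimal sketch of the two stages compared above follows: a raw spike-triggered average, a per-pixel gain threshold, then a cluster-mass criterion over contiguous surviving pixels (found here with scipy.ndimage labelling). Passing threshold values directly, rather than deriving them from shuffled spike trains as the study does, is an illustrative simplification.

```python
# Spike-triggered average with gain and cluster-mass thresholding.
import numpy as np
from scipy import ndimage

def sta(stim, spikes, lags=20):
    """stim: (T, F) spectrogram; spikes: (T,) counts -> (lags, F) STA."""
    out = np.zeros((lags, stim.shape[1]))
    for t in np.nonzero(spikes)[0]:
        if t >= lags:
            out += spikes[t] * stim[t - lags:t][::-1]
    return out / max(spikes.sum(), 1)

def cluster_threshold(strf, pixel_thresh, mass_thresh):
    mask = np.abs(strf) > pixel_thresh                  # stage 1: gain threshold
    labels, n = ndimage.label(mask)
    keep = np.zeros_like(mask)
    for k in range(1, n + 1):
        cluster = labels == k
        if np.abs(strf[cluster]).sum() > mass_thresh:   # stage 2: cluster mass
            keep |= cluster
    return np.where(keep, strf, 0.0)
```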
Protein Modelling: What Happened to the “Protein Structure Gap”?
Schwede, Torsten
2013-01-01
Computational modeling and prediction of three-dimensional macromolecular structures and complexes from their sequence has been a long-standing vision in structural biology, as it holds the promise of bypassing part of the laborious process of experimental structure solution. Over the last two decades, a paradigm shift has occurred: starting from a situation where the “structure knowledge gap” between the huge number of protein sequences and the small number of known structures hampered the widespread use of structure-based approaches in life science research, today some form of structural information, either experimental or computational, is available for the majority of amino acids encoded by common model organism genomes. Template-based homology modeling techniques have matured to a point where they are now routinely used to complement experimental techniques. With the scientific focus of interest moving towards larger macromolecular complexes and dynamic networks of interactions, the integration of computational modeling methods with low-resolution experimental techniques allows studying large and complex molecular machines. Computational modeling and prediction techniques still face a number of challenges that hamper their more widespread use by the non-expert scientist. For example, it is often difficult to convey the underlying assumptions of a computational technique, as well as the expected accuracy and structural variability of a specific model. However, these aspects are crucial to understanding the limitations of a model and to deciding which interpretations and conclusions it can support. PMID:24010712
How adolescent girls interpret weight-loss advertising.
Hobbs, Renee; Broder, Sharon; Pope, Holly; Rowe, Jonelle
2006-10-01
While they demonstrate some ability to critically analyze the more obvious forms of deceptive weight-loss advertising, many girls do not recognize how advertising evokes emotional responses or how visual and narrative techniques are used to increase identification in weight-loss advertising. This study examined how girls aged 9-17 years interpreted magazine advertising, television (TV) advertising and infomercials for weight-loss products in order to determine whether deceptive advertising techniques were recognized and to assess pre-existing media-literacy skills. A total of 42 participants were interviewed in seven geographic regions of the United States. In groups of three, participants were shown seven print and TV advertisements (ads) for weight-loss products and asked to share their interpretations of each ad. Common factors in girls' interpretation of weight-loss advertising included responding to texts emotionally by identifying with characters; comparing and contrasting persuasive messages with real-life experiences with family members; using prior knowledge about nutrition management and recognizing obvious deceptive claims like 'rapid' or 'permanent' weight loss. Girls were less able to demonstrate skills including recognizing persuasive construction strategies including message purpose, target audience and subtext and awareness of economic factors including financial motives, credibility enhancement and branding.
NASA Technical Reports Server (NTRS)
Bleacher, J. E.; Eppler, D. B.; Skinner, J. A.; Evans, C. A.; Feng, W.; Gruener, J. E.; Hurwitz, D. M.; Whitson, P.; Janoiko, B.
2014-01-01
Terrestrial geologic mapping techniques are regularly used for "photogeologic" mapping of other planets, but these approaches are complicated by the diverse type, areal coverage, and spatial resolution of available data sets. When available, spatially-limited in-situ human and/or robotic surface observations can sometimes introduce a level of detail that is difficult to integrate with regional or global interpretations. To assess best practices for utilizing observations acquired from orbit and on the surface, our team conducted a comparative study of geologic mapping and interpretation techniques. We compared maps generated for the same area in the San Francisco Volcanic Field (SFVF) in northern Arizona using 1) data collected for reconnaissance before and during the 2010 Desert Research And Technology Studies campaign, and 2) during a traditional, terrestrial field geology study. The operations, related results, and direct mapping comparisons are discussed in companion LPSC abstracts. Here we present new geologic interpretations for a volcanic cone and related lava flows as derived from all approaches involved in this study. Mapping results indicate a need for caution when interpreting past eruption conditions on other planetary surfaces from orbital data alone.
Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote
2016-01-01
Metabolome profiling of biological systems provides powerful insight into their metabolic functional states as they respond to environmental factors or other perturbations. A large body of metabolomics data has accumulated since the pre-metabolomics era, driven largely by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases built for the metabolomics community provide useful information such as chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we aim to introduce metabolomics studies overall, from the pre- to the post-metabolomics era, and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, starting from preprocessing and moving toward functional interpretation. The framework provided can further serve as a scaffold for translational biomedical research, which can in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.
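As a flavor of the preprocessing-to-interpretation pipeline this chapter describes, the following minimal sketch (Python with NumPy, SciPy, statsmodels, and scikit-learn; the toy data, scaling choices, and cutoffs are assumptions, not taken from the chapter) runs a peak-intensity table through normalization, univariate statistics, and ordination:

```python
# Sketch of a post-metabolomics informatics pipeline, assuming a
# peak-intensity table X (samples x metabolites) and binary group labels y.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(20, 100))   # toy intensities
y = np.array([0] * 10 + [1] * 10)                        # two conditions

# 1. Preprocessing: total-ion normalization, log transform, Pareto scaling.
X = X / X.sum(axis=1, keepdims=True)                 # correct sample loading
X = np.log1p(X)                                      # stabilize variance
X = (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0))    # Pareto scaling

# 2. Univariate statistics with false-discovery-rate control.
t, p = ttest_ind(X[y == 0], X[y == 1], axis=0)
reject, q, *_ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} candidate biomarkers at FDR 0.05")

# 3. Unsupervised overview of metabolic states via PCA.
scores = PCA(n_components=2).fit_transform(X)
print("PC1/PC2 scores for the first sample:", scores[0])
```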
Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis
Wind, Stefanie A.; Engelhard, George
2015-01-01
Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement properties, such as invariance, in contexts where response processes are not well understood. Because rater-mediated assessments involve complex interactions among many variables, including assessment contexts, student artifacts, rubrics, individual rater characteristics, and others, rater-assigned scores are suitable candidates for Mokken scale analysis. The purposes of this study are to describe a suite of indices that can be used to explore the psychometric quality of data from rater-mediated assessments and to illustrate the substantive interpretation of Mokken-based statistics and displays in this context. Techniques that are commonly used in polytomous applications of Mokken scaling are adapted for use with rater-mediated assessments, with a focus on the substantive interpretation related to individual raters. Overall, the findings suggest that indices of rater monotonicity, rater scalability, and invariant rater ordering based on Mokken scaling provide diagnostic information at the level of individual raters related to the requirements for invariant measurement. These Mokken-based indices serve as an additional suite of diagnostic tools for exploring the quality of data from rater-mediated assessments that can supplement rating quality indices based on parametric models. PMID:29795883
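For readers unfamiliar with the mechanics, a minimal sketch of Mokken scalability coefficients for the dichotomous case follows (Python; the study itself uses polytomous extensions adapted to raters, so this illustrates only the underlying Guttman-error logic, and all data are simulated):

```python
# Sketch: Mokken scalability coefficients H from observed vs. expected
# Guttman errors, treating each rater/item column as dichotomous.
import numpy as np

def mokken_H(X):
    """X: (n_persons, n_items) binary matrix. Returns overall H and H_i."""
    n, k = X.shape
    p = X.mean(axis=0)                 # item popularities
    F = np.zeros((k, k))               # observed Guttman errors
    E = np.zeros((k, k))               # expected errors under independence
    for i in range(k):
        for j in range(i + 1, k):
            easy, hard = (i, j) if p[i] >= p[j] else (j, i)
            # A Guttman error: fail the easier item, pass the harder one.
            F[i, j] = np.sum((X[:, easy] == 0) & (X[:, hard] == 1))
            E[i, j] = n * (1 - p[easy]) * p[hard]
    H = 1 - F.sum() / E.sum()
    Fi, Ei = F + F.T, E + E.T          # symmetrize for per-item sums
    H_items = 1 - Fi.sum(axis=1) / Ei.sum(axis=1)
    return H, H_items

# Toy monotone data: 200 persons, 5 items following a logistic model.
rng = np.random.default_rng(2)
theta = rng.normal(size=200)
diff = np.linspace(-1, 1, 5)
X = (rng.random((200, 5)) < 1 / (1 + np.exp(-(theta[:, None] - diff)))).astype(int)
H, H_items = mokken_H(X)
print(f"overall H = {H:.2f}; per-item H:", np.round(H_items, 2))
```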
McQueen, Peter; Gates, Lucy; Marshall, Michelle; Doherty, Michael; Arden, Nigel; Bowen, Catherine
2017-01-01
The prevalence of foot osteoarthritis (OA) is much less well understood than that of hip, knee and hand OA. The foot is anatomically complex, and different researchers have investigated different joints with a lack of methodological standardisation across studies. The La Trobe Foot Atlas (LFA) is the first instrument to address these issues by providing quantitative assessment of radiographic foot OA, but it has not been tested externally. The aim of this study was to evaluate three different interpretive approaches to using the LFA for grading OA when scoring is difficult due to indistinct views of interosseous space and joint contour. Foot radiographs of all remaining participants (n = 218) assessed at the Chingford Women Study 23-year visit (mean (SD) age: 75.5 years (5.1)) were scored using the LFA-defined protocol (Technique 1). Two revised scoring strategies were applied to the radiographs in addition to the standard LFA analyses: Technique 2 categorised joints that were difficult to grade as 'missing', while Technique 3 assigned such joints an over-estimated score. Radiographic OA prevalence was defined for the foot both collectively and separately for individual joints. When radiographs were scored using the LFA (Technique 1), radiographic foot OA was present in 89.9% of participants; for Technique 2 the figure was 83.5% and for Technique 3 it was 97.2%. At the individual joint level, prevalence estimates from Technique 1 spanned a wider range (18.3-74.3%) that was higher than Technique 2 (17.9-46.3%) and lower than Technique 3 (39.9-79.4%). The three different ways of interpreting the LFA scoring system when grading of individual joints is technically difficult thus result in very different estimates of foot OA prevalence at both the individual joint and global foot level. Agreement on the best strategy is required to improve comparability between studies.
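The mechanism behind these diverging estimates, that foot-level prevalence aggregates over joints, can be illustrated with a toy calculation (Python; the grade scale, OA cutoff, proportions, and joint counts are assumptions, not the study's data):

```python
# Toy illustration (not the study's code) of how the three interpretive
# techniques shift foot-level OA prevalence when a foot counts as OA if
# any gradable joint meets an assumed cutoff (>= 2 on a 0-3 scale).
import numpy as np

rng = np.random.default_rng(3)
n_feet, n_joints, cutoff = 218, 8, 2
grades = rng.choice(4, size=(n_feet, n_joints), p=[0.45, 0.25, 0.2, 0.1])
hard = rng.random((n_feet, n_joints)) < 0.15   # joints difficult to grade

def foot_prevalence(g, valid=None):
    """A foot has radiographic OA if any gradable joint meets the cutoff."""
    ok = np.ones_like(g, bool) if valid is None else valid
    return np.mean(np.any((g >= cutoff) & ok, axis=1))

t1 = foot_prevalence(grades)                      # grade as best possible
t2 = foot_prevalence(grades, valid=~hard)         # difficult -> missing
t3 = foot_prevalence(np.where(hard, 3, grades))   # difficult -> over-estimate
print(f"T1 {t1:.1%}  T2 {t2:.1%}  T3 {t3:.1%}")   # expect T2 <= T1 <= T3
```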
Martin, Antony; Yong, Alan K.; Salomone, Larry A.
2014-01-01
Active-source Love waves, recorded by the multi-channel analysis of Love waves (MASLW) technique, were recently analyzed in two site characterization projects. Between 2010 and 2012, the 2009 American Recovery and Reinvestment Act (ARRA) funded GEOVision to conduct geophysical investigations at 191 seismographic stations in California and the Central Eastern U.S. (CEUS). The original project plan was to utilize active and passive Rayleigh wave-based techniques to obtain shear-wave velocity (VS) profiles to a minimum depth of 30 m and the time-averaged VS of the upper 30 meters (VS30). Early in this investigation it became clear that Rayleigh wave techniques, such as multi-channel analysis of Rayleigh waves (MASRW), were not suited to characterizing all sites. Shear-wave seismic refraction and MASLW techniques were therefore applied. In 2012, the Electric Power Research Institute funded characterization of 33 CEUS station sites. Based on experience from the ARRA investigation, both MASRW and MASLW data were acquired by GEOVision at 24 CEUS sites. At shallow rock sites, sites with steep velocity gradients, and sites with a thin, low-velocity surficial soil layer overlying stiffer sediments, Love wave data generally were found to be easier to interpret: they typically yielded unambiguous fundamental-mode dispersion curves and thus reduced uncertainty in the resultant VS model. These types of velocity structure often excite dominant higher modes in Rayleigh wave data, but not in Love wave data. It is possible to model Rayleigh wave data using multi-mode or effective-mode techniques; however, extraction of Rayleigh wave dispersion data was found to be difficult in many cases. These results imply that field procedures should include careful scrutiny of Rayleigh wave-based dispersion data so that Love wave data can also be collected when warranted.
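The velocity structure highlighted here, a thin low-velocity soil layer over a much stiffer half-space, is exactly the case where the classical single-layer Love-wave dispersion relation is simple to evaluate. A hedged sketch of computing the fundamental-mode phase-velocity curve follows (Python/SciPy; the layer parameters are illustrative, not values from the study):

```python
# Sketch: fundamental-mode Love-wave dispersion for one layer over a
# half-space, via the root of tan(nu1*h) = mu2*nu2 / (mu1*nu1).
import numpy as np
from scipy.optimize import brentq

h, b1, r1 = 10.0, 200.0, 1800.0    # layer thickness (m), Vs (m/s), density
b2, r2 = 800.0, 2200.0             # half-space Vs (m/s) and density
mu1, mu2 = r1 * b1**2, r2 * b2**2  # shear moduli

def fundamental_c(freq):
    """Phase velocity (m/s) of the fundamental Love mode at `freq` Hz."""
    w = 2 * np.pi * freq
    def g(c):
        nu1 = w * np.sqrt(1 / b1**2 - 1 / c**2)  # vertical wavenumber, layer
        nu2 = w * np.sqrt(1 / c**2 - 1 / b2**2)  # decay rate, half-space
        # Mode-0 form of the dispersion relation (principal arctan branch);
        # g is monotonic in c, so the bracket holds exactly one root.
        return nu1 * h - np.arctan2(mu2 * nu2, mu1 * nu1)
    return brentq(g, b1 * 1.0001, b2 * 0.9999)

freqs = np.linspace(2, 50, 25)
curve = [fundamental_c(f) for f in freqs]
print(f"c({freqs[0]:.0f} Hz) = {curve[0]:.0f} m/s,",
      f"c({freqs[-1]:.0f} Hz) = {curve[-1]:.0f} m/s")
```

At low frequency the phase velocity approaches the half-space Vs and at high frequency the layer Vs, which is why such profiles yield a clean, unambiguous fundamental-mode curve.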
Usage and applications of Semantic Web techniques and technologies to support chemistry research
2014-01-01
Background: The drug discovery process is now highly dependent on the management, curation and integration of large amounts of potentially useful data. Semantics are necessary in order to interpret the information and derive knowledge. Advances in recent years have mitigated concerns that the lack of robust, usable tools has inhibited the adoption of methodologies based on semantics. Results: This paper presents three examples of how Semantic Web techniques and technologies can be used in order to support chemistry research: a controlled vocabulary for quantities, units and symbols in physical chemistry; a controlled vocabulary for the classification and labelling of chemical substances and mixtures; and a database of chemical identifiers. This paper also presents a Web-based service that uses the datasets in order to assist with the completion of risk assessment forms, along with a discussion of the legal implications and the value proposition for the use of such a service. Conclusions: We have introduced the Semantic Web concepts, technologies, and methodologies that can be used to support chemistry research, and have demonstrated the application of those techniques in three areas very relevant to modern chemistry research, generating three new datasets that we offer as exemplars of an extensible portfolio of advanced data integration facilities. We have thereby established the importance of Semantic Web techniques and technologies for meeting Wild’s fourth “grand challenge”. PMID:24855494
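As a flavor of the approach, a minimal sketch using rdflib follows (Python; the namespace URI and the two-term SKOS vocabulary are hypothetical stand-ins for the paper's much larger physical-chemistry units dataset):

```python
# Sketch: a tiny controlled vocabulary of physical-chemistry units as
# SKOS concepts in RDF, queried with SPARQL via rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/chem-units/")   # hypothetical namespace
g = Graph()

# Two unit concepts, each with an English label and a notation (symbol).
for name, label, symbol in [("pascal", "pascal", "Pa"),
                            ("joulePerMole", "joule per mole", "J/mol")]:
    unit = EX[name]
    g.add((unit, RDF.type, SKOS.Concept))
    g.add((unit, SKOS.prefLabel, Literal(label, lang="en")))
    g.add((unit, SKOS.notation, Literal(symbol)))

# SPARQL over the vocabulary: list every concept with its symbol, the
# kind of lookup a risk-assessment form service could build on.
query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?label ?symbol WHERE {
    ?unit a skos:Concept ;
          skos:prefLabel ?label ;
          skos:notation ?symbol .
}
"""
for row in g.query(query):
    print(f"{row.label} -> {row.symbol}")
```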