Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the three main phases of the application: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed, as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
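The thesis's own algorithms are not reproduced in this record; as a hedged illustration, the three phases map naturally onto standard SciPy operations, with a synthetic image standing in for real data:

```python
# Hedged sketch of the three phases using SciPy stand-ins (not the thesis code):
# segmentation by global threshold, object recognition by connected-component
# labeling, quantitative analysis of per-object geometry.
import numpy as np
from scipy import ndimage

image = np.random.rand(128, 128)        # synthetic stand-in for a grayscale image
binary = image > 0.7                    # phase 1: segmentation

labels, count = ndimage.label(binary)   # phase 2: object recognition
print(f"objects found: {count}")

idx = np.arange(1, count + 1)           # phase 3: quantitative analysis
areas = ndimage.sum(binary, labels, idx)                  # pixel area per object
centroids = ndimage.center_of_mass(binary, labels, idx)   # centroid per object
```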
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.
2011-09-22
Analysis of metal objects is a necessary step in establishing an appropriate conservation treatment for an object or in following up the results of a suggested treatment. The main considerations in selecting a method for the investigation and analysis of metal objects are diagnostic power, representative sampling, reproducibility, destructiveness/invasiveness of the analysis, and accessibility of the appropriate instrument. This study aims at evaluating the usefulness of the Laser-Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. To evaluate the usefulness of the suggested analytical protocol, the same metal objects were also investigated by other methods, such as scanning electron microscopy with an energy-dispersive X-ray analyzer (SEM-EDX) and X-ray diffraction (XRD). This study confirms that LIBS is a very useful technique that can be used safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is a practically non-destructive technique with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. We have to take into consideration, however, that the corrosion of metal leads to material alteration and possible loss of certain metals in the form of soluble salts. Certain corrosion products are known to leach out of the object; their low content therefore does not necessarily reflect the composition of the metal at the time of the object's manufacture. Another point to take into consideration is the heterogeneity of a metal alloy object, which often results from poor mixing of the alloy components. Further research is needed to investigate and determine the most appropriate and effective approaches and methods for the conservation of these metal objects.
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Because of spatial non-uniformity, different locations in an image are of different importance to the perception of that image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM build on image structural information, VIF on the information the human brain can ideally gain from the reference image, and FSIM on low-level features that assign different importance to each location in the image. Still, none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROIs were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROIs in fine quality while the rest of the image is reconstructed in low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and by subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
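As a point of reference for the metrics named above, a minimal SSIM computation is sketched here (assuming scikit-image; the paper's ROI-weighted evaluation is not reproduced):

```python
# Minimal SSIM scoring of a distorted image against its reference; synthetic
# data stands in for demosaiced output.
import numpy as np
from skimage.metrics import structural_similarity

reference = np.random.rand(256, 256)
distorted = reference + 0.05 * np.random.randn(256, 256)

score = structural_similarity(reference, distorted, data_range=1.0)
print(f"SSIM = {score:.3f}")   # 1.0 = identical; lower = more distortion
```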
From fields to objects: A review of geographic boundary analysis
NASA Astrophysics Data System (ADS)
Jacquez, G. M.; Maruca, S.; Fortin, M.-J.
Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for, geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.
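As a hedged illustration of the field-to-object step (not the GEM implementation), one common boundary statistic in this literature is the local rate of change of the field, with candidate boundary elements taken where that rate is steepest:

```python
# Gradient-based boundary delineation sketch on a synthetic spatial field.
import numpy as np

field = np.random.rand(100, 100)          # stand-in for an interpolated attribute field
gy, gx = np.gradient(field)
magnitude = np.hypot(gx, gy)              # boundary likelihood per grid cell

threshold = np.percentile(magnitude, 90)  # keep the steepest 10% as candidates
boundary = magnitude > threshold
print(f"candidate boundary cells: {boundary.sum()}")
```

Statistical significance would then be judged by recomputing the statistic under a null spatial model, as described above.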
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Watkinson, D; Rimmer, M; Kasztovszky, Z; Kis, Z; Maróti, B; Szentmiklósi, L
2014-01-01
Chloride (Cl) ions diffuse into iron objects during burial and drive corrosion after excavation. Located under corrosion layers, Cl is inaccessible to many analytical techniques. Neutron analysis offers non-destructive avenues for determining Cl content and distribution in objects. A pilot study used prompt gamma activation analysis (PGAA) and prompt gamma activation imaging (PGAI) to analyse the bulk concentration and longitudinal distribution of Cl in archaeological iron objects. This correlated with the object corrosion rate measured by oxygen consumption, and compared well with Cl measurement using a specific ion meter. High-Cl areas were linked with visible damage to the corrosion layers and attack of the iron core. Neutron techniques have significant advantages in the analysis of archaeological metals, including penetration depth and low detection limits. PMID:26028670
The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
2010-05-01
This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
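A hedged sketch of that experimental loop, with synthetic features standing in for object-code measurements and scikit-learn's PCA and logistic regression standing in for the paper's reduction method and classifier:

```python
# Track a simple classifier's cross-validated accuracy as the number of
# retained PCA dimensions grows.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                               # stand-in code features
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)   # 1 = cryptographic

for k in (2, 4, 8, 16, 32):
    Xk = PCA(n_components=k).fit_transform(X)
    acc = cross_val_score(LogisticRegression(max_iter=1000), Xk, y, cv=5).mean()
    print(f"{k:2d} dims: accuracy {acc:.2f}")
```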
Modeling 3-D objects with planar surfaces for prediction of electromagnetic scattering
NASA Technical Reports Server (NTRS)
Koch, M. B.; Beck, F. B.; Cockrell, C. R.
1992-01-01
Electromagnetic scattering analysis of objects at resonance is difficult because low frequency techniques are slow and computer intensive, and high frequency techniques may not be reliable. A new technique for predicting the electromagnetic backscatter from electrically conducting objects at resonance is studied. This technique is based on modeling three dimensional objects as a combination of flat plates where some of the plates are blocking the scattering from others. A cube is analyzed as a simple example. The preliminary results compare well with the Geometrical Theory of Diffraction and with measured data.
Using object-oriented analysis techniques to support system testing
NASA Astrophysics Data System (ADS)
Zucconi, Lin
1990-03-01
Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determining the concurrent validity of computer-graphic models is real-time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, a minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer-assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Analysis of micro computed tomography images; a look inside historic enamelled metal objects
NASA Astrophysics Data System (ADS)
van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen
2010-02-01
In this study the usefulness of micro-computed tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or to analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metalwork was studied without removing the decorative enamel layer. Moreover, visible defects such as cracks were measured in both width and depth, and as-yet invisible defects and weaker areas were visualised. All these features are of great interest to restorers and conservators, as they allow a view inside these objects without so much as touching them.
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective
NASA Astrophysics Data System (ADS)
Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.
2016-06-01
We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).
1984-02-01
Keywords: prediction; extratropical cyclones; objective analysis; bogus techniques. A quasi-objective statistical method for deriving 300 mb geopotential heights and 1000/300 mb thicknesses in the vicinity of extratropical cyclones with the aid of satellite imagery is presented. The technique utilizes satellite-observed extratropical spiral cloud pattern parameters in conjunction …
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10(exp 7) galaxies and 10(exp 8) stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
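GID3* and O-BTree are not packaged publicly; a hedged stand-in using scikit-learn's decision tree shows the same learn-from-examples, classify-new-detections workflow on synthetic features:

```python
# Decision-tree star/galaxy classification on synthetic per-object features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 3))   # e.g., magnitude, ellipticity, image FWHM
y_train = (X_train[:, 1] + 0.3 * rng.normal(size=500) > 0).astype(int)  # 1 = galaxy

tree = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)
X_new = rng.normal(size=(5, 3))
print(tree.predict(X_new))            # automated catalog labels for new objects
```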
Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.
Kemp, Pavlina S; VanderVeen, Deborah K
2016-01-01
The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
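One vascular parameter that recurs in this literature is tortuosity, often summarized as traced arc length over straight chord length; a hedged sketch on a toy centerline (not any specific published system):

```python
# Arc-over-chord tortuosity of a traced vessel centerline.
import numpy as np

centerline = np.array([[0, 0], [1, 0.4], [2, -0.2], [3, 0.5], [4, 0.0]], float)

arc = np.sum(np.linalg.norm(np.diff(centerline, axis=0), axis=1))
chord = np.linalg.norm(centerline[-1] - centerline[0])
print(f"tortuosity = {arc / chord:.3f}")   # 1.0 = straight; larger = more tortuous
```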
Determination of shell content by activation analysis : final report.
DOT National Transportation Integrated Search
1978-08-01
The objective of this study is to determine if the neutron activation analysis technique, developed under Research Project 70-1ST, can be used to determine the shell content of a sand-shell mixture. In order to accomplish this objective, samples of san...
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
...analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters... systematic fashion under a unifying theoretical and algorithmic framework. Keywords: optimization, complex networks, social network analysis, computational...
Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John N.
1997-01-01
A multidisciplinary design optimization procedure which couples formal multiobjective-based techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) was developed. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are attached to each objective function during the transformation process. This enhanced procedure provides the designer with the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
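The K-S aggregation itself is compact; a hedged sketch with toy quadratic objectives standing in for the CFD and sonic-boom evaluations:

```python
# Kreisselmeier-Steinhauser envelope of weighted objectives, minimized with BFGS.
import numpy as np
from scipy.optimize import minimize

rho = 50.0                    # draw-down factor: larger hugs max() more closely
w = np.array([1.0, 2.0])      # designer-chosen emphasis per objective

def objectives(x):
    f1 = (x[0] - 1.0) ** 2 + x[1] ** 2       # stand-in aerodynamic penalty
    f2 = x[0] ** 2 + (x[1] + 0.5) ** 2       # stand-in sonic-boom penalty
    return w * np.array([f1, f2])

def ks(x):
    f = objectives(x)
    fmax = f.max()
    return fmax + np.log(np.exp(rho * (f - fmax)).sum()) / rho  # smooth maximum

result = minimize(ks, x0=np.zeros(2), method="BFGS")
print(result.x, result.fun)
```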
NASA Technical Reports Server (NTRS)
Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel
2012-01-01
In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
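A hedged sketch of per-object shape description, using scikit-image's solidity (area over convex-hull area) as a convexity proxy and extent (area over bounding-box area) as a rectangularity proxy; the study's exact definitions may differ:

```python
# Shape descriptors for connected regions of a change-detection mask.
import numpy as np
from skimage.measure import label, regionprops

change_mask = np.zeros((64, 64), bool)
change_mask[10:30, 10:25] = True        # a rectangular, building-like change
change_mask[40:60, 35:60] = True

for region in regionprops(label(change_mask)):
    print(f"area={region.area}  solidity={region.solidity:.2f}  extent={region.extent:.2f}")
```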
Using object-oriented analysis to design a multi-mission ground data system
NASA Technical Reports Server (NTRS)
Shames, Peter
1995-01-01
This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.
Objective analysis of tidal fields in the Atlantic and Indian Oceans
NASA Technical Reports Server (NTRS)
Sanchez, B. V.; Rao, D. B.; Steenrod, S. D.
1986-01-01
An objective analysis technique has been developed to extrapolate tidal amplitudes and phases over entire ocean basins using existing gauge data and the altimetric measurements now beginning to be provided by satellite oceanography. The technique was previously tested in the Lake Superior basin. The method has now been developed and applied in the Atlantic-Indian ocean basins using a 6 deg x 6 deg grid to test its essential features. The functions used in the interpolation are the eigenfunctions of the velocity potential (Proudman functions), which are computed numerically from a knowledge of the basin's bottom topography, the horizontal plan form, and the necessary boundary conditions. These functions are characteristic of the particular basin. The gravitational normal modes of the basin are computed as part of the investigation; they are used to obtain the theoretical forced solutions for the tidal constituents, which provide the simulated data for testing the method and serve as a guide in choosing the most energetic modes for the objective analysis. The results of the objective analysis of the M2 and K1 tidal constituents indicate the possibility of recovering the tidal signal with a degree of accuracy well within the error bounds of present-day satellite techniques.
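The Proudman functions themselves are basin-specific; this hedged one-dimensional sketch shows only the generic step: least-squares fitting of a few precomputed modes to sparse gauge samples, then evaluating the fit basin-wide:

```python
# Fit modal coefficients to sparse "gauge" data and extrapolate to the grid.
import numpy as np

x = np.linspace(0, 1, 200)                                  # 1-D stand-in basin
modes = np.stack([np.sin((k + 1) * np.pi * x) for k in range(4)], axis=1)

true_field = modes @ np.array([1.0, 0.5, 0.0, 0.2])         # synthetic tide
gauges = np.random.default_rng(2).choice(200, size=15, replace=False)

coef, *_ = np.linalg.lstsq(modes[gauges], true_field[gauges], rcond=None)
reconstructed = modes @ coef                                # basin-wide tidal field
print(f"max error: {np.abs(reconstructed - true_field).max():.2e}")
```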
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
NASA Technical Reports Server (NTRS)
1974-01-01
The use of optical data processing (ODP) techniques for motion analysis in two-dimensional imagery was studied. The basic feasibility of this approach was demonstrated, but inconsistent performance of the photoplastic used for recording spatial filters prevented totally automatic operation. Promising solutions to the problems encountered are discussed, and it is concluded that ODP techniques could be quite useful for motion analysis.
Artificial Intelligence Techniques: Applications for Courseware Development.
ERIC Educational Resources Information Center
Dear, Brian L.
1986-01-01
Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…
Radiograph and passive data analysis using mixed variable optimization
Temple, Brian A.; Armstrong, Jerawan C.; Buescher, Kevin L.; Favorite, Jeffrey A.
2015-06-02
Disclosed herein are representative embodiments of methods, apparatus, and systems for performing radiography analysis. For example, certain embodiments perform radiographic analysis using mixed variable computation techniques. One exemplary system comprises a radiation source, a two-dimensional detector for detecting radiation transmitted through an object between the radiation source and the detector, and a computer. In this embodiment, the computer is configured to input the radiographic image data from the two-dimensional detector and to determine one or more materials that form the object by using an iterative analysis technique that selects the one or more materials from hierarchically arranged solution spaces of discrete material possibilities and selects the layer interfaces from the optimization of the continuous interface data.
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Anderson, M. R.; Schmidt, D. K.
1986-01-01
In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.
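A hedged sketch of the optimal-control step: once the task is cast as a quadratic objective, feedback gains follow from the algebraic Riccati equation; a toy second-order plant stands in for the real vehicle dynamics:

```python
# Linear-quadratic gains for a stand-in vehicle model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [0.0, -0.5]])   # toy dynamics (position, rate)
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                  # penalty on tracking error and rate
R = np.array([[1.0]])                     # control penalty (a workload proxy)

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)           # optimal feedback gains
print(K)
```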
NASA Technical Reports Server (NTRS)
Liu, H. K.
1978-01-01
A phase modulated triple exposure technique was incorporated into a holographic nondestructive test (HNDT) system. The technique was able to achieve a goal of simultaneously identifying the zero-order fringe and determining the direction of motion (or displacement). Basically, the technique involves the addition of one more exposure, during the loading of the tested object, to the conventional double-exposure hologram. A phase shifter is added to either the object beam or the reference beam during the second and third exposure. Theoretical analysis with the assistance of computer simulation illustrated the feasibility of implementing the phase modulation and triple-exposure in the HNDT systems. Main advantages of the technique are the enhancement of accuracy in data interpretation and a better determination of the nature of the flaws in the tested object.
From Phenomena to Objects: Segmentation of Fuzzy Objects and its Application to Oceanic Eddies
NASA Astrophysics Data System (ADS)
Wu, Qingling
A challenging image analysis problem that has received limited attention to date is the isolation of fuzzy objects---i.e. those with inherently indeterminate boundaries---from continuous field data. This dissertation seeks to bridge the gap between, on the one hand, the recognized need for Object-Based Image Analysis of fuzzy remotely sensed features, and on the other, the optimization of existing image segmentation techniques for the extraction of more discretely bounded features. Using mesoscale oceanic eddies as a case study of a fuzzy object class evident in Sea Surface Height Anomaly (SSHA) imagery, the dissertation demonstrates firstly, that the widely used region-growing and watershed segmentation techniques can be optimized and made comparable in the absence of ground truth data using the principle of parsimony. However, they both have significant shortcomings, with the region growing procedure creating contour polygons that do not follow the shape of eddies while the watershed technique frequently subdivides eddies or groups together separate eddy objects. Secondly, it was determined that these problems can be remedied by using a novel Non-Euclidian Voronoi (NEV) tessellation technique. NEV is effective in isolating the extrema associated with eddies in SSHA data while using a non-Euclidian cost-distance based procedure (based on cumulative gradients in ocean height) to define the boundaries between fuzzy objects. Using this procedure as the first stage in isolating candidate eddy objects, a novel "region-shrinking" multicriteria eddy identification algorithm was developed that includes consideration of shape and vorticity. Eddies identified by this region-shrinking technique compare favorably with those identified by existing techniques, while simplifying and improving existing automated eddy detection algorithms. However, it also tends to find a larger number of eddies as a result of its ability to separate what other techniques identify as connected eddies. The research presented here is of significance not only to eddy research in oceanography, but also to other areas of Earth System Science for which the automated detection of features lacking rigid boundary definitions is of importance.
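For contrast with the dissertation's NEV and region-shrinking procedures, here is a hedged sketch of the watershed baseline it evaluates, run on a synthetic SSHA-like field:

```python
# Watershed segmentation seeded at local maxima of a synthetic height field.
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

y, x = np.mgrid[0:128, 0:128]
ssha = np.exp(-((x - 40) ** 2 + (y - 60) ** 2) / 200.0) \
     + 0.8 * np.exp(-((x - 90) ** 2 + (y - 50) ** 2) / 300.0)   # two "eddies"

peaks = peak_local_max(ssha, min_distance=10)        # extrema as eddy cores
markers = np.zeros(ssha.shape, int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

segments = watershed(-ssha, markers, mask=ssha > 0.1)
print(np.unique(segments))                           # 0 = background, 1..N = eddies
```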
Applying machine learning classification techniques to automate sky object cataloguing
NASA Astrophysics Data System (ADS)
Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav
1993-08-01
We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-BTree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. The development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is consistency of classification. The classification rules which are the product of the inductive learning techniques will form an objective, examinable basis for classifying sky objects. A final, not to be underestimated benefit is that astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems based on automatically catalogued data.
ERIC Educational Resources Information Center
Mosier, Nancy R.
Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…
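The four techniques named are simple formulas; a hedged sketch with invented cash flows (an outlay of 1000 followed by five annual inflows of 300):

```python
# Net present value and payback time for a toy investment.
cash_flows = [-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]
rate = 0.08                                    # assumed discount rate

npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

cumulative, payback = 0.0, None
for t, cf in enumerate(cash_flows):
    cumulative += cf
    if payback is None and cumulative >= 0:
        payback = t                            # first year the outlay is recovered

print(f"NPV = {npv:.2f}, payback = {payback} years")
```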
Study of photon correlation techniques for processing of laser velocimeter signals
NASA Technical Reports Server (NTRS)
Mayo, W. T., Jr.
1977-01-01
The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals which would be capable of both the first-order measurements of mean flow and turbulence intensity and also the second-order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.
NASA Astrophysics Data System (ADS)
Lachinova, Svetlana L.; Vorontsov, Mikhail A.; Filimonov, Grigory A.; LeMaster, Daniel A.; Trippel, Matthew E.
2017-07-01
Computational efficiency and accuracy of wave-optics-based Monte-Carlo and brightness function numerical simulation techniques for incoherent imaging of extended objects through atmospheric turbulence are evaluated. Simulation results are compared with theoretical estimates based on known analytical solutions for the modulation transfer function of an imaging system and the long-exposure image of a Gaussian-shaped incoherent light source. It is shown that the accuracy of both techniques is comparable over the wide range of path lengths and atmospheric turbulence conditions, whereas the brightness function technique is advantageous in terms of the computational speed.
NASA Technical Reports Server (NTRS)
Glick, B. J.
1985-01-01
Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving the classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
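Of the spatial autocorrelation measures alluded to, Moran's I is the standard interval-scale choice; a hedged numpy sketch with inverse-distance weights:

```python
# Moran's I for attribute values at scattered point locations.
import numpy as np

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, size=(30, 2))
values = coords[:, 0] + rng.normal(0, 1, 30)      # west-east trend => clustering

d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
w = np.where(d > 0, 1.0 / d, 0.0)                 # inverse-distance weights

z = values - values.mean()
n, W = len(values), w.sum()
moran_i = (n / W) * (w * np.outer(z, z)).sum() / (z @ z)
print(f"Moran's I = {moran_i:.3f}")   # above -1/(n-1) suggests positive autocorrelation
```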
Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.
Gdeisat, Munther A; Burton, David R; Lalor, Michael J
2002-09-10
A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between the fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for non-contact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this respect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.
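The DPLL tracker itself is specific to the paper; as a hedged baseline, this sketch implements the Fourier fringe analysis the authors compare against (isolate a carrier sideband in the spectrum, shift it to baseband, take the angle):

```python
# One-dimensional Fourier fringe analysis of a synthetic carrier fringe pattern.
import numpy as np

x = np.arange(1024)
carrier = 0.1                                           # cycles per pixel
phase_obj = 3.0 * np.exp(-((x - 512) / 150.0) ** 2)     # stand-in surface phase
fringe = 0.5 + 0.5 * np.cos(2 * np.pi * carrier * x + phase_obj)

spectrum = np.fft.fft(fringe)
freqs = np.fft.fftfreq(x.size)
sideband = np.where((freqs > 0.05) & (freqs < 0.15), spectrum, 0)  # + carrier lobe

analytic = np.fft.ifft(sideband)
wrapped = np.angle(analytic * np.exp(-2j * np.pi * carrier * x))   # remove carrier
unwrapped = np.unwrap(wrapped)                          # object phase (to a constant)
```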
A Marketing Case History Profile
ERIC Educational Resources Information Center
Weirick, Margaret C.
1978-01-01
A current marketing plan from Temple University illustrates many marketing techniques, including those dealing with enrollment objectives, market objectives, demographic characteristics of Temple students, market share analysis, and the marketing plan. Specific guidelines are provided. (LBH)
A comparison of autonomous techniques for multispectral image analysis and classification
NASA Astrophysics Data System (ADS)
Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso
2012-10-01
Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. During the last years, a variety of algorithms have been developed to work with multispectral data, whose main purpose has been to perform the correct classification of the objects in the scene. The present study provides a brief review of some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed here. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundation, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where some similarities appear in the spectral responses.
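A hedged sketch of the classical pipeline reviewed above (the lattice auto-associative memory method is not reproduced): PCA compresses the spectral bands, then K-means groups pixels into material classes:

```python
# PCA + K-means classification of synthetic multispectral pixels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
pixels = np.vstack([rng.normal(m, 0.05, size=(300, 8))   # 8 bands per pixel,
                    for m in (0.2, 0.5, 0.8)])           # 3 "materials"

scores = PCA(n_components=3).fit_transform(pixels)       # spectral compression
labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)
print(np.bincount(labels))                               # pixels per material class
```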
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
NASA Technical Reports Server (NTRS)
Djorgovski, Stanislav
1992-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multi parameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
Surface Location In Scene Content Analysis
NASA Astrophysics Data System (ADS)
Hall, E. L.; Tio, J. B. K.; McPherson, C. A.; Hwang, J. J.
1981-12-01
The purpose of this paper is to describe techniques and algorithms for the location in three dimensions of planar and curved object surfaces using a computer vision approach. Stereo imaging techniques are demonstrated for planar object surface location using automatic segmentation, vertex location, and relational table matching. For curved surfaces, the location of corresponding points is very difficult. However, an example using a grid projection technique for the location of the surface of a curved cup is presented to illustrate a solution. This method consists of first obtaining the perspective transformation matrices from the images, then using these matrices to compute the three-dimensional point locations of the grid points on the surface. These techniques may be used in object location for such applications as missile guidance, robotics, and medical diagnosis and treatment.
Non-destructive investigation of a time capsule using neutron radiography and X-ray fluorescence
NASA Astrophysics Data System (ADS)
MacDonald, B. L.; Vanderstelt, J.; O'Meara, J.; McNeill, F. E.
2016-01-01
Non-destructive analytical techniques are becoming increasingly important for the study of objects of cultural heritage interest. This study applied two techniques: X-ray fluorescence and neutron radiography, for the investigation of a capped, tubular metal object recovered from an urban construction site in Gore Park, Hamilton, Canada. The site is an urban park containing a World War I commemorative monument that underwent renovation and relocation. Historical documentation suggested that the object buried underneath the monument was a time capsule containing a paper document listing the names of 1800 Canadians who died during WWI. The purpose of this study was to assess the condition of the object, and to verify if it was what the historical records purported. XRF analysis was used to characterize the elemental composition of the metal artifact, while neutron radiography revealed that its contents were congruent with historical records and remained intact after being interred for 91 years. Results of this study demonstrate the value of non-destructive techniques for the analysis and preservation of cultural heritage.
An objective isobaric/isentropic technique for upper air analysis
NASA Technical Reports Server (NTRS)
Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.
1981-01-01
An objective meteorological analysis technique is presented whereby both horizontal and vertical upper-air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface as on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface passing through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally correspond fairly well with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
3D Tracking Based Augmented Reality for Cultural Heritage Data Management
NASA Astrophysics Data System (ADS)
Battini, C.; Landi, G.
2015-02-01
The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.
Nonlinear dynamic phase contrast microscopy for microfluidic and microbiological applications
NASA Astrophysics Data System (ADS)
Denz, C.; Holtmann, F.; Woerdemann, M.; Oevermann, M.
2008-08-01
In the life sciences, the observation and analysis of moving living cells, molecular motors, or the motion of micro- and nano-objects is a current field of research. At the same time, microfluidic innovations are needed for biological and medical applications on the micro- and nano-scale. Conventional microscopy techniques are reaching considerable limits with respect to these issues. A promising approach to this challenge is nonlinear dynamic phase contrast microscopy. It is an alternative full-field approach that allows the detection of motion as well as phase changes of living unstained micro-objects in real time, thereby being marker-free, contact-free, and non-destructive, i.e., fully biocompatible. The generality of this system allows it to be combined with several other microscopy techniques such as conventional bright-field or fluorescence microscopy. In this article we present the dynamic phase contrast technique and its applications in the analysis of micro-organism dynamics, micro-flow velocimetry, and micro-mixing analysis.
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
Earth rotation, station coordinates and orbit determination from satellite laser ranging
NASA Astrophysics Data System (ADS)
Murata, Masaaki
The Project MERIT, a special program of international collaboration to Monitor Earth Rotation and Intercompare the Techniques of observation and analysis, has come to an end with great success. Its major objective was to evaluate the ultimate potential of space techniques such as VLBI and satellite laser ranging, in contrast with the other conventional techniques, in the determination of the rotational dynamics of the earth. The National Aerospace Laboratory (NAL) officially participated in the project as an associate analysis center for the satellite laser ranging technique for the period of the MERIT Main Campaign (September 1983-October 1984). In this paper, the NAL analysis center results are presented.
An Object Oriented Analysis Method for Ada and Embedded Systems
1989-12-01
...expansion of the paradigm from the coding and designing activities into the earlier activity of requirements analysis. This chapter begins by discussing the application of... response time: 0.1 seconds. Step 1e: Identify Known Restrictions on the Software. "The cruise control system object code must fit within 16K of memory."... application of object-oriented techniques to the coding and design phases of the life cycle, as well as various approaches to requirements analysis.
NASA Technical Reports Server (NTRS)
Smith, D. R.
1982-01-01
The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of usage. The code was rewritten for an interactive computer environment. Furthermore, a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta-scale phenomena.
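A hedged sketch of a Barnes-type successive-correction analysis of the kind PROAM implements: a Gaussian-weighted first pass, then correction passes that spread the station residuals with a tightened length scale:

```python
# Successive-correction objective analysis on a synthetic station network.
import numpy as np

rng = np.random.default_rng(6)
obs_xy = rng.uniform(0, 100, size=(40, 2))                 # station locations
obs = np.sin(obs_xy[:, 0] / 15.0) + 0.1 * rng.normal(size=40)

gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])

def barnes_pass(targets, values, kappa):
    d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / kappa)
    return (w * values).sum(1) / w.sum(1)

kappa = 400.0
grid_anal = barnes_pass(grid_xy, obs, kappa)   # first pass on the grid
stat_anal = barnes_pass(obs_xy, obs, kappa)    # ...and at the stations
for _ in range(2):                             # multiple-iteration refinement
    kappa *= 0.3                               # tighten the response
    residual = obs - stat_anal                 # current misfit at the stations
    grid_anal += barnes_pass(grid_xy, residual, kappa)
    stat_anal += barnes_pass(obs_xy, residual, kappa)
```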
ERIC Educational Resources Information Center
Vives, Robert
1983-01-01
Based on a literature review and analysis of teaching methods and objectives, it is proposed that the emphasis on communicative competence ascendant in French foreign language instruction is closely related to, and borrows from, expressive techniques taught in French native language instruction in the 1960s. (MSE)
Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.
ERIC Educational Resources Information Center
Jackson, Robert B.; And Others
1995-01-01
Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…
Modular techniques for dynamic fault-tree analysis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Dugan, Joanne B.
1992-01-01
It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
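A hedged sketch of the combinatoric half of such a hierarchy: gates as objects combining basic-event failure probabilities (independence assumed); a Markov submodule would slot in as a leaf returning a time-dependent probability instead:

```python
# Minimal object-oriented fault tree with AND/OR gates.
from dataclasses import dataclass, field
from typing import List, Union
import math

@dataclass
class Event:
    p: float                                   # basic-event failure probability
    def prob(self) -> float:
        return self.p

@dataclass
class Gate:
    kind: str                                  # "AND" or "OR"
    children: List[Union["Gate", Event]] = field(default_factory=list)
    def prob(self) -> float:
        ps = [c.prob() for c in self.children]
        if self.kind == "AND":
            return math.prod(ps)
        return 1.0 - math.prod(1.0 - p for p in ps)

tree = Gate("OR", [Event(1e-4), Gate("AND", [Event(1e-2), Event(1e-2)])])
print(f"top-event probability: {tree.prob():.3e}")
```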
Scalable Conjunction Processing using Spatiotemporally Indexed Ephemeris Data
NASA Astrophysics Data System (ADS)
Budianto-Ho, I.; Johnson, S.; Sivilli, R.; Alberty, C.; Scarberry, R.
2014-09-01
The collision warnings produced by the Joint Space Operations Center (JSpOC) are of critical importance in protecting U.S. and allied spacecraft against destructive collisions and protecting the lives of astronauts during space flight. As the Space Surveillance Network (SSN) improves its sensor capabilities for tracking small and dim space objects, the number of tracked objects increases from thousands to hundreds of thousands of objects, while the number of potential conjunctions increases with the square of the number of tracked objects. Classical filtering techniques such as apogee and perigee filters have proven insufficient. Novel and orders of magnitude faster conjunction analysis algorithms are required to find conjunctions in a timely manner. Stellar Science has developed innovative filtering techniques for satellite conjunction processing using spatiotemporally indexed ephemeris data that efficiently and accurately reduces the number of objects requiring high-fidelity and computationally-intensive conjunction analysis. Two such algorithms, one based on the k-d Tree pioneered in robotics applications and the other based on Spatial Hash Tables used in computer gaming and animation, use, at worst, an initial O(N log N) preprocessing pass (where N is the number of tracked objects) to build large O(N) spatial data structures that substantially reduce the required number of O(N^2) computations, substituting linear memory usage for quadratic processing time. The filters have been implemented as Open Services Gateway initiative (OSGi) plug-ins for the Continuous Anomalous Orbital Situation Discriminator (CAOS-D) conjunction analysis architecture. We have demonstrated the effectiveness, efficiency, and scalability of the techniques using a catalog of 100,000 objects, an analysis window of one day, on a 64-core computer with 1TB shared memory. Each algorithm can process the full catalog in 6 minutes or less, almost a twenty-fold performance improvement over the baseline implementation running on the same machine. We will present an overview of the algorithms and results that demonstrate the scalability of our concepts.
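At a single time step, the k-d tree filter reduces to a fixed-radius pair query; a hedged sketch with SciPy (not the CAOS-D plug-in itself), repeated across the ephemeris window in the real system:

```python
# Screen a 100,000-object catalog snapshot for close pairs.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
positions_km = rng.uniform(-8000, 8000, size=(100_000, 3))  # stand-in positions

tree = cKDTree(positions_km)                 # O(N log N) build
candidate_pairs = tree.query_pairs(r=10.0)   # pairs within a 10 km screening radius
print(f"{len(candidate_pairs)} pairs kept for high-fidelity conjunction analysis")
```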
How to Determine the Centre of Mass of Bodies from Image Modelling
ERIC Educational Resources Information Center
Dias, Marco Adriano; Carvalho, Paulo Simeão; Rodrigues, Marcelo
2016-01-01
Image modelling is a recent technique in physics education that includes digital tools for image treatment and analysis, such as digital stroboscopic photography (DSP) and video analysis software. It is commonly used to analyse the motion of objects. In this work we show how to determine the position of the centre of mass (CM) of objects with…
NASA Technical Reports Server (NTRS)
Doyle, James D.; Warner, Thomas T.
1987-01-01
Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis, were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques was discussed.
Allen, Edwin B; Walls, Richard T; Reilly, Frank D
2008-02-01
This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible to (i) determine natural groups or clusters of control strategies with similar behaviour, (ii) find and interpret hidden, complex and causal relationships in the data set, and (iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for the analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
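As an illustration of the kind of workflow described (not the authors' code), the following hedged sketch applies standardization, PCA, and clustering to a hypothetical strategy-by-criteria evaluation matrix using scikit-learn.

```python
# A minimal sketch of the multivariable analysis described above, applied to a
# hypothetical evaluation matrix (rows = control strategies, columns = criteria).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8))            # 30 strategies scored on 8 criteria (toy data)

Xs = StandardScaler().fit_transform(X)  # criteria on different scales must be standardized
scores = PCA(n_components=2).fit_transform(Xs)  # project onto first two components
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

for k in range(3):
    print(f"cluster {k}: strategies {np.where(labels == k)[0].tolist()}")
```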
Recognition Of Complex Three Dimensional Objects Using Three Dimensional Moment Invariants
NASA Astrophysics Data System (ADS)
Sadjadi, Firooz A.
1985-01-01
A technique for the recognition of complex three-dimensional objects is presented. The complex 3-D objects are represented in terms of their 3-D moment invariants, algebraic expressions that remain invariant under changes in the 3-D objects' orientations and locations in the field of view. The technique of 3-D moment invariants has been used successfully for simple 3-D object recognition in the past. In this work we have extended the method to the representation of more complex objects. Two complex objects are represented digitally, their 3-D moment invariants are calculated, and the invariance of these moment expressions is then verified by changing the orientation and the location of the objects in the field of view. The results of this study have significant impact on 3-D robotic vision, 3-D target recognition, scene analysis, and artificial intelligence.
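A hedged numerical illustration of the underlying idea: the simplest second-order 3-D moment invariant, the trace of the central second-moment tensor, is unchanged when a toy point-cloud object is rotated and translated. This is only one of the invariants such a method would use.

```python
# The trace of the central second-moment tensor (mu200 + mu020 + mu002) is
# invariant under rotation and translation. The object here is a toy point
# cloud, not the paper's data.
import numpy as np
from scipy.spatial.transform import Rotation

def second_moment_trace(points):
    centered = points - points.mean(axis=0)        # translation invariance
    return np.sum(centered ** 2)                   # mu200 + mu020 + mu002

rng = np.random.default_rng(2)
obj = rng.normal(size=(500, 3)) * [3.0, 1.0, 0.5]  # elongated "object"

R = Rotation.from_euler("xyz", [30, 45, 60], degrees=True).as_matrix()
moved = obj @ R.T + np.array([10.0, -5.0, 2.0])    # rotate and translate

print(second_moment_trace(obj), second_moment_trace(moved))  # equal up to rounding
```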
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
NASA Astrophysics Data System (ADS)
Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.
2009-08-01
A virtual-reality real-time simulation of surgical operations that incorporates a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.
[An object-oriented intelligent engineering design approach for lake pollution control].
Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng
2013-03-01
To address the shortcomings of traditional lake pollution control engineering techniques, a new engineering approach based on object-oriented intelligent design (OOID) was proposed in this study. It provides a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach were compared. The key points of OOID are described as the object perspective, a cause-and-effect foundation, extending points into surfaces, and temporal and spatial optimization. Blue-algae control in a lake was taken as the example in this study. The effects on algae control and water quality improvement were analyzed in detail from the OOID perspective for two engineering techniques (a vertical hydrodynamic mixer and pumped algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach here is based on the object perspective and quantitative causal analysis. This approach identified that mixers are much more efficient than pumps in achieving low to moderate water quality improvement. However, when the water quality objective exceeds a certain value (in this experimental water, a control objective for peak Chla concentration above 100 microg x L(-1)), the mixer cannot achieve the goal; the pump technique can, but at higher cost. Combining the two techniques is more efficient than using either alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on the actual project benefits and costs.
Signal-to-noise ratio analysis and evaluation of the Hadamard imaging technique
NASA Technical Reports Server (NTRS)
Jobson, D. J.; Katzberg, S. J.; Spiers, R. B., Jr.
1977-01-01
The signal-to-noise ratio performance of the Hadamard imaging technique is analyzed, and an experimental evaluation of a laboratory Hadamard imager is presented. A comparison between the performances of Hadamard and conventional imaging techniques shows that the Hadamard technique is superior only when the imaging objective lens is required to have an effective F-number (focal ratio) of about 2 or slower.
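The multiplex advantage at the heart of this comparison can be illustrated with a toy simulation (not the paper's experiment): under signal-independent detector noise, demultiplexing Hadamard-encoded measurements reduces the noise variance by roughly the number of multiplexed elements.

```python
# Toy simulation of the Hadamard multiplex advantage. Assumes detector noise
# independent of signal level (the regime where Hadamard imaging wins);
# numbers are illustrative only.
import numpy as np
from scipy.linalg import hadamard

n = 64
rng = np.random.default_rng(3)
scene = rng.uniform(0, 1, n)
sigma = 0.1                                  # additive detector noise

direct = scene + rng.normal(0, sigma, n)     # one noisy reading per element

H = hadamard(n)                              # +/-1 multiplexing matrix
y = H @ scene + rng.normal(0, sigma, n)      # one noisy reading per combination
demux = (H.T @ y) / n                        # H is orthogonal: H^T H = n I

print("direct RMS error:  ", np.sqrt(np.mean((direct - scene) ** 2)))
print("Hadamard RMS error:", np.sqrt(np.mean((demux - scene) ** 2)))  # ~ sigma/sqrt(n)
```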
Myocardial blood flow: Roentgen videodensitometry techniques
NASA Technical Reports Server (NTRS)
Smith, H. C.; Robb, R. A.; Wood, E. H.
1975-01-01
The current status of roentgen videodensitometric techniques that provide an objective assessment of blood flow at selected sites within the coronary circulation is described. Roentgen videodensitometry employs conventional radiopaque indicators, radiological equipment, and coronary angiographic techniques. The techniques were developed in the laboratory over the past nine years and, for the past three years, have been applied to the analysis of angiograms in the clinical cardiac catheterization laboratory.
NASA Astrophysics Data System (ADS)
Bychkov, P. S.; Chentsov, A. V.; Kozintsev, V. M.; Popov, A. L.
2018-04-01
A combined computational and experimental technique is developed for identifying the shrinkage stresses generated in objects during additive manufacturing by layer-by-layer photopolymerization. The technique is based on the analysis of bending shrinkage deformations occurring in a series of strip-shaped plate samples of identical size but with different polymerization times, predetermined during their production on the 3D printer.
D'Elia, Caio Oliveira; Bitar, Alexandre Carneiro; Castropil, Wagner; Garofo, Antônio Guilherme Padovani; Cantuária, Anita Lopes; Orselli, Maria Isabel Veras; Luques, Isabela Ugo; Duarte, Marcos
2015-01-01
Objective: The objective of this study was to describe the methodology of knee rotation analysis using biomechanics laboratory instruments and to present the preliminary results from a comparative study on patients who underwent anterior cruciate ligament (ACL) reconstruction using the double-bundle technique. Methods: The protocol currently used in our laboratory was described. Three-dimensional kinematic analysis was performed and knee rotation amplitude was measured on eight normal patients (control group) and 12 patients who were operated using the double-bundle technique, by means of three tasks in the biomechanics laboratory. Results: No significant differences between operated and non-operated sides were shown in relation to the mean amplitudes of gait, gait with change in direction or gait with change in direction when going down stairs (p > 0.13). Conclusion: The preliminary results did not show any difference in the double-bundle ACL reconstruction technique in relation to the contralateral side and the control group. PMID:27027003
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous-equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at minimum 2-year follow-up was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous-equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperative status, arthroscopic rotator cuff repair using SR, DR, or TOE techniques yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
NASA Astrophysics Data System (ADS)
Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.
2013-12-01
The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial photography from 2011.
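As a rough illustration of the object-based workflow (a hedged sketch using SLIC superpixels as a stand-in for eCognition's segmentation; the imagery, band order, and thresholds are invented):

```python
# Group pixels into image objects, then apply a simple rule set per object.
# Stand-in for an OBIA impervious-surface classification; not the project's rules.
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(4)
image = rng.uniform(0, 1, size=(200, 200, 4))   # stand-in for 4-band NAIP (R,G,B,NIR)

segments = slic(image, n_segments=400, compactness=10, channel_axis=-1)

red, nir = image[..., 0], image[..., 3]
ndvi = (nir - red) / (nir + red + 1e-9)

impervious = np.zeros(segments.shape, dtype=bool)
for label in np.unique(segments):
    mask = segments == label
    # Rule set: low vegetation signal and high brightness -> impervious object
    if ndvi[mask].mean() < 0.1 and image[mask].mean() > 0.4:
        impervious[mask] = True

print("impervious fraction:", impervious.mean())
```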
Moire technique utilization for detection and measurement of scoliosis
NASA Astrophysics Data System (ADS)
Zawieska, Dorota; Podlasiak, Piotr
1993-02-01
The moire projection method enables non-contact measurement of the shape or deformation of different surfaces and constructions by fringe-pattern analysis. Acquisition of the fringe map over the whole surface of the object under test is one of the main advantages compared with point-by-point methods. The computer analyzes the shape of the whole surface, and the user can then select different points or cross-sections of the object map. In this paper a few typical examples of the application of the moire technique to different medical problems are presented. We also describe the equipment; the moire pattern analysis is done in real time using the phase-stepping method with a CCD camera.
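The phase-stepping analysis mentioned at the end can be sketched with the standard four-step algorithm; the fringe patterns below are synthetic, and the geometry of a real moire projector is omitted.

```python
# Four-step phase-shifting sketch: fringes at shifts 0, pi/2, pi, 3*pi/2.
import numpy as np

x = np.linspace(0, 4 * np.pi, 256)
phase_true = 1.5 * np.sin(x / 3.0)              # "surface" phase to recover
bias, mod = 0.5, 0.4

steps = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
I1, I2, I3, I4 = [bias + mod * np.cos(x + phase_true + s) for s in steps]

# Four-step formula: phi = atan2(I4 - I2, I1 - I3), wrapped to (-pi, pi]
phase_wrapped = np.arctan2(I4 - I2, I1 - I3)
phase = np.unwrap(phase_wrapped) - x            # remove the carrier fringes

print("max recovery error:", np.max(np.abs(phase - phase_true)))
```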
Analysis of the DORIS, GNSS, SLR, VLBI and gravimetric time series at the GGOS core sites
NASA Astrophysics Data System (ADS)
Moreaux, G.; Lemoine, F. G.; Luceri, V.; Pavlis, E. C.; MacMillan, D. S.; Bonvalot, S.; Saunier, J.
2017-12-01
Since June 2016 and the installation of a new DORIS station in Wettzell (Germany), four geodetic sites (Badary, Greenbelt, Wettzell and Yarragadee) are equipped with the four space geodetic techniques (DORIS, GNSS, SLR and VLBI). In line with the GGOS (Global Geodetic Observing System) objective of achieving a terrestrial reference frame at the millimetric level of accuracy, the combination centers of the four space techniques initiated a joint study to assess the level of agreement among these space geodetic techniques. In addition to the four sites, we will consider all the GGOS core sites including the seven sites with at least two space geodetic techniques in addition to DORIS. Starting from the coordinate time series, we will estimate and compare the mean positions and velocities of the co-located instruments. The temporal evolution of the coordinate differences will also be evaluated with respect to the local tie vectors and discrepancies will be investigated. Then, the analysis of the signal content of the time series will be carried out. Amplitudes and phases of the common signals among the techniques, and eventually from gravity data, will be compared. The first objective of this talk is to describe our joint study: the sites, the data, and the objectives. The second purpose is to present the first results obtained from the GGAO (Goddard Geophysical and Astronomic Observatory) site of Greenbelt.
Different techniques of multispectral data analysis for vegetation fraction retrieval
NASA Astrophysics Data System (ADS)
Kancheva, Rumiana; Georgiev, Georgi
2012-07-01
Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system's spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
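A minimal sketch of the first technique in the list, linear spectral unmixing, solving for soil and vegetation fractions by non-negative least squares with invented endmember spectra:

```python
# Linear spectral unmixing sketch; endmember spectra and noise are toy values.
import numpy as np
from scipy.optimize import nnls

bands = 6
soil = np.array([0.18, 0.22, 0.26, 0.30, 0.34, 0.36])   # toy soil spectrum
veg = np.array([0.05, 0.08, 0.04, 0.45, 0.48, 0.50])    # toy vegetation spectrum
E = np.column_stack([soil, veg])                         # endmember matrix

true_fractions = np.array([0.35, 0.65])
mixed = E @ true_fractions + np.random.default_rng(5).normal(0, 0.005, bands)

fractions, _ = nnls(E, mixed)        # non-negativity constraint
fractions /= fractions.sum()         # renormalize to enforce sum-to-one
print("estimated vegetation fraction:", fractions[1])
```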
RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS
The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...
MPATHav: A software prototype for multiobjective routing in transportation risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.; Smith, J.D.
Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. "Objectives" to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid "multiobjective spatial analysis" (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
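One common way to explore such trade-offs, shown here as a hedged sketch rather than the MPATHav implementation, is to scalarize the objectives with weights and solve a series of shortest-path problems; the network and attribute values are hypothetical.

```python
# Weighted-sum multiobjective routing sketch: each edge carries distance and
# population exposure; varying the weights traces out compromise routes.
import networkx as nx

G = nx.Graph()
edges = [  # (from, to, km, exposure)
    ("A", "B", 10, 500), ("B", "D", 12, 800), ("A", "C", 15, 100),
    ("C", "D", 18, 150), ("B", "C", 4, 300),
]
for u, v, km, exposure in edges:
    G.add_edge(u, v, km=km, exposure=exposure)

for w_dist, w_exp in [(1.0, 0.0), (0.5, 0.5), (0.0, 1.0)]:
    cost = lambda u, v, d: w_dist * d["km"] + w_exp * d["exposure"] / 100.0
    path = nx.shortest_path(G, "A", "D", weight=cost)
    print(f"weights ({w_dist}, {w_exp}): route {path}")
```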
Mathematical analysis techniques for modeling the space network activities
NASA Technical Reports Server (NTRS)
Foster, Lisa M.
1992-01-01
The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
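A small linear-programming example in the same spirit (a toy resource-allocation model, not the TDRSS formulation):

```python
# Toy LP: allocate limited antenna and ground-processing hours between two
# hypothetical service types to maximize serviced user hours.
from scipy.optimize import linprog

# Maximize 3*x0 + 2*x1  ->  minimize the negative
c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],   # total antenna hours available
        [2.0, 1.0]]   # ground-processing hours available
b_ub = [8.0, 12.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal allocation:", res.x, "objective:", -res.fun)
```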
On-Error Training (Book Excerpt).
ERIC Educational Resources Information Center
Fukuda, Ryuji
1985-01-01
This excerpt from "Managerial Engineering: Techniques for Improving Quality and Productivity in the Workplace" describes the development, objectives, and use of On-Error Training (OET), a method which trains workers to learn from their errors. Also described is New Joharry's Window, a performance-error data analysis technique used in…
Critical evaluation of sample pretreatment techniques.
Hyötyläinen, Tuulia
2009-06-01
Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.
NASA Astrophysics Data System (ADS)
Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang
2017-05-01
Mapping plant communities and documenting their changes is critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. An object-based change analysis technique was incorporated into the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class such as cattail (an invasive species in the Everglades) from the object-based classifications of two dates of imagery. The study confirmed the results in the literature that cattail expanded substantially during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data proved valuable for documenting vegetation changes in the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Kareem, O.; Khedr, A.; Abdelhamid, M.
Analysis of the composition of an object is a necessary step in documenting the properties of that object and estimating its condition. It is also an important task for establishing an appropriate conservation treatment of an object or for following up the result of the application of the suggested treatments. There has been an important evolution in the methods used for analysis of metal threads since the second half of the twentieth century. Today, the main considerations in selecting a method are based on the diagnostic power, representative sampling, reproducibility, destructive nature/invasiveness of analysis and accessibility to the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for analysis of historical metal threads. In this study various historical metal threads collected from different museums were investigated using the LIBS technique. For evaluating the usefulness of the suggested analytical protocol, the same metal thread samples were investigated with a Scanning Electron Microscope with an energy-dispersive X-ray analyzer (SEM-EDX), which is reported in the conservation field as the best method for determining the chemical composition and corrosion of metal threads. The results show that all metal threads investigated in the present study are heavily soiled, strongly damaged, and corroded with different types of corrosion products. The LIBS technique is considered very useful and can be used safely for investigating historical metal threads; it is, in fact, a very useful noninvasive tool for their analysis. The first few laser shots are very useful for investigating the corrosion and dirt layer, the following shots are effective for investigating the coating layer, and a higher number of laser shots probes the main composition of the metal thread. Further research is needed to investigate and determine the most appropriate and effective approaches and methods for conservation of these metal threads.
Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'
DOT National Transportation Integrated Search
2001-02-01
This research's objective is to present a methodology to supplement conventional traffic safety analysis techniques. The methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...
A review of intelligent systems for heart sound signal analysis.
Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S
2017-10-01
Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques and the state of the art of phonocardiogram (PCG) signal analysis. The published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that CAD systems for PCG signal analysis are still an open problem. Related studies are compared with respect to their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.
Covariate selection with iterative principal component analysis for predicting physical
USDA-ARS?s Scientific Manuscript database
Local and regional soil data can be improved by coupling new digital soil mapping techniques with high resolution remote sensing products to quantify both spatial and absolute variation of soil properties. The objective of this research was to advance data-driven digital soil mapping techniques for ...
DOT National Transportation Integrated Search
2012-06-01
The objective of this study was to develop an approach for incorporating techniques to interpret and evaluate deflection : data for network-level pavement management system (PMS) applications. The first part of this research focused on : identifying ...
USDA-ARS?s Scientific Manuscript database
Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...
Objective analysis of pseudostress over the Indian Ocean using a direct-minimization approach
NASA Technical Reports Server (NTRS)
Legler, David M.; Navon, I. M.; O'Brien, James J.
1989-01-01
A technique not previously used in the objective analysis of meteorological data is used here to produce monthly average surface pseudostress data over the Indian Ocean. An initial guess field is derived, and a cost functional is constructed with five terms: approximation to the initial guess, approximation to climatology, a smoothness term, and two kinematic terms. The functional is minimized using a conjugate-gradient technique, and the weight for the climatology term controls the overall balance of influence between the climatology and the initial guess. Results from various weight combinations are presented for January and July 1984. Quantitative and qualitative comparisons to the subjective analysis are made to find which weight combination provides the best results. The weight on the approximation to climatology is found to balance the influence of the original field and the climatology.
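A scaled-down sketch of the direct-minimization idea follows; the fields, weights, and grid are toy stand-ins, and the two kinematic terms of the paper's functional are omitted.

```python
# Minimize a cost functional (fit to first guess + fit to climatology +
# smoothness) with a conjugate-gradient method. Toy fields and weights.
import numpy as np
from scipy.optimize import minimize

n = 20
rng = np.random.default_rng(6)
guess = rng.normal(0, 1, (n, n))          # first-guess pseudostress component
clim = np.zeros((n, n))                   # climatology
w_guess, w_clim, w_smooth = 1.0, 0.3, 0.5

def cost(flat):
    u = flat.reshape(n, n)
    fit_guess = np.sum((u - guess) ** 2)
    fit_clim = np.sum((u - clim) ** 2)
    smooth = np.sum(np.diff(u, axis=0) ** 2) + np.sum(np.diff(u, axis=1) ** 2)
    return w_guess * fit_guess + w_clim * fit_clim + w_smooth * smooth

res = minimize(cost, guess.ravel(), method="CG")
analysis = res.x.reshape(n, n)
print("converged:", res.success, "final cost:", res.fun)
```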
NASA Astrophysics Data System (ADS)
Bártová, H.; Trojek, T.; Čechák, T.; Šefců, R.; Chlumská, Š.
2017-10-01
Heavy chemical elements in old pigments can be identified in historical paintings using X-ray fluorescence analysis (XRF). This is a non-destructive analytical method frequently used in the examination of objects that require in situ analysis, where it is necessary to avoid damaging the object by taking samples. Different modalities are available, such as microanalysis, scanning of selected areas, or depth-profiling techniques. Surface scanning is particularly valuable since 2D element distribution maps are much more understandable than the results of individual analyses. Information on the layered structure of a painting can also be obtained by handheld portable systems. The results presented in our paper combine 2D element distribution maps obtained by scanning analysis with depth profiling using conventional XRF. The latter is very suitable for objects of art, as it can be evaluated from data measured with a portable XRF device. Depth profiling by conventional XRF is based on the differences in X-ray absorption in paint layers. The XRF technique was applied to the analysis of panel paintings of the Master of the St George Altarpiece, who was active in Prague in the 1470s and 1480s. The results were evaluated by taking micro-samples and performing a material analysis.
2012-05-18
by the AWAC. It is a surface-penetrating device that measures continuous changes in the water elevations over time at much higher sampling rates of... background subtraction, a technique based on detecting change from a background scene. Their study highlights the difficulty in object detection and tracking... movements (Zhang et al. 2009). Alternatively, another common object detection method, known as Optical Flow Analysis, may be utilized for vessel
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to facilitate the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
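As a minimal illustration (synthetic data, not from the review), a few simple time-domain statistical measures of variability can be computed as follows:

```python
# Simple statistical variability measures on a toy RR-interval series:
# standard deviation (SDNN), coefficient of variation, and RMSSD
# (root mean square of successive differences).
import numpy as np

rng = np.random.default_rng(7)
rr_intervals = 0.8 + 0.05 * rng.standard_normal(500)   # toy RR intervals (s)

sd = rr_intervals.std(ddof=1)
cv = sd / rr_intervals.mean()
rmssd = np.sqrt(np.mean(np.diff(rr_intervals) ** 2))

print(f"SDNN={sd*1000:.1f} ms  CV={cv:.3f}  RMSSD={rmssd*1000:.1f} ms")
```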
A Survey Version of Full-Profile Conjoint Analysis.
ERIC Educational Resources Information Center
Chrzan, Keith
Two studies were conducted to test the viability of a survey version of full-profile conjoint analysis. Conjoint analysis describes a variety of analytic techniques for measuring subjects' "utilities," or preferences for the individual attributes or levels of attributes that constitute objects under study. The first study compared the…
NASA Technical Reports Server (NTRS)
Smith, D. R.; Leslie, F. W.
1984-01-01
The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.
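A compact sketch of the general successive-correction pass structure (Cressman-style weights and a shrinking influence radius; the stations, values, and radii are synthetic, not the PROAM configuration):

```python
# Successive-correction analysis sketch: each pass spreads observation
# residuals onto the grid with distance-dependent weights.
import numpy as np

rng = np.random.default_rng(8)
obs_xy = rng.uniform(0, 10, (40, 2))                     # station coordinates
obs_val = np.sin(obs_xy[:, 0]) + 0.1 * rng.standard_normal(40)

gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
analysis = np.zeros_like(gx)                             # flat first guess

for radius in [4.0, 2.0, 1.0]:                           # multiple passes
    # Evaluate the current analysis at stations (nearest grid point)
    ix = np.clip(np.rint(obs_xy[:, 0] / 10 * 24).astype(int), 0, 24)
    iy = np.clip(np.rint(obs_xy[:, 1] / 10 * 24).astype(int), 0, 24)
    residual = obs_val - analysis[iy, ix]
    # Cressman weights: w = (R^2 - r^2) / (R^2 + r^2) inside radius R
    r2 = (gx[..., None] - obs_xy[:, 0]) ** 2 + (gy[..., None] - obs_xy[:, 1]) ** 2
    w = np.clip((radius**2 - r2) / (radius**2 + r2), 0, None)
    analysis += (w * residual).sum(-1) / np.maximum(w.sum(-1), 1e-9)

print("RMS fit to obs:", np.sqrt(np.mean((analysis[iy, ix] - obs_val) ** 2)))
```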
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, L.; Lanza, R.C.
1999-12-01
The authors have developed a near-field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. For existing fast neutron techniques, in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency; in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA) system the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n^3 volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n^2, where the total number of voxels in the object being examined is n^3. Compared to FNA, the improvement for gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n^2/2, where n^2 is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system depends also on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. They have performed analysis, Monte Carlo simulations, and preliminary experiments using low- and high-energy gamma-ray sources. The results show that a high-sensitivity 3-D contraband imaging and detection system can be realized by using CAFNA.
A technology roadmap of smart biosensors from conventional glucose monitoring systems.
Shende, Pravin; Sahu, Pratiksha; Gaud, Ram
2017-06-01
The objective of this review article is to focus on the technology roadmap of smart biosensors from conventional glucose monitoring systems. The estimation of glucose with commercially available devices involves analysis of blood samples that are obtained by pricking the finger or extracting blood from the forearm. Since pain and discomfort are associated with invasive methods, non-invasive measurement techniques have been investigated. The non-invasive methods show advantages such as non-exposure to sharp objects like needles and syringes, leading to an increase in testing frequency, improved control of glucose concentration, and the absence of pain and biohazard materials. This review also describes recent invasive techniques and the major noninvasive techniques, viz. biosensors, optical techniques, and sensor-embedded contact lenses for glucose estimation.
Imaging of conductivity distributions using audio-frequency electromagnetic data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Ki Ha; Morrison, H.F.
1990-10-01
The objective of this study has been to develop mathematical methods for mapping conductivity distributions between boreholes using low frequency electromagnetic (em) data. In relation to this objective, this paper presents two recent developments in high-resolution crosshole em imaging techniques: (1) audio-frequency diffusion tomography, and (2) a transform method in which low frequency data are first transformed into a wave-like field. The idea in the second approach is that the transformed field can then be treated using conventional techniques designed for wave field analysis.
Shape-based human detection for threat assessment
NASA Astrophysics Data System (ADS)
Lee, Dah-Jye; Zhan, Pengcheng; Thomas, Aaron; Schoenberger, Robert B.
2004-07-01
Detection of intrusions for early threat assessment requires the capability of distinguishing whether the intrusion is a human, an animal, or another object. Most low-cost security systems use simple electronic motion detection sensors to monitor motion or the location of objects within the perimeter. Although cost effective, these systems suffer from high rates of false alarm, especially when monitoring open environments: any moving object, including an animal, can falsely trigger the security system. Other security systems that utilize video equipment require human interpretation of the scene in order to make real-time threat assessments. A shape-based human detection technique has been developed for accurate early threat assessment in open and remote environments. Potential threats are isolated from the static background scene using differential motion analysis, and contours of the intruding objects are extracted for shape analysis. Contour points are simplified by removing redundant points connecting short and straight line segments and preserving only those with shape significance. Contours are represented in tangent space for comparison with shapes stored in a database. A power cepstrum technique has been developed to search for the best-matched contour in the database and to distinguish a human from other objects at different viewing angles and distances.
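The front end of such a pipeline can be sketched as follows; this hedged example covers only the differential-motion and contour-extraction stages on a synthetic frame, not the tangent-space or power-cepstrum matching.

```python
# Isolate a moving object by differencing against a static background,
# extract its contour, and compute a crude shape descriptor.
import numpy as np
from skimage import measure

background = np.zeros((120, 120))
frame = background.copy()
frame[30:90, 50:70] = 1.0                      # tall upright blob, a stand-in "person"

diff = np.abs(frame - background)              # differential motion analysis
mask = diff > 0.5

contours = measure.find_contours(mask.astype(float), 0.5)
contour = max(contours, key=len)               # largest moving object

height = np.ptp(contour[:, 0])
width = np.ptp(contour[:, 1])
print("aspect ratio (h/w):", height / width)   # upright humans tend to be > 1
```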
Sensitivity of Forecast Skill to Different Objective Analysis Schemes
NASA Technical Reports Server (NTRS)
Baker, W. E.
1979-01-01
Numerical weather forecasts are characterized by rapidly declining skill in the first 48 to 72 h. Recent estimates of the sources of forecast error indicate that the inaccurate specification of the initial conditions contributes substantially to this error. The sensitivity of the forecast skill to the initial conditions is examined by comparing a set of real-data experiments whose initial data were obtained with two different analysis schemes. Results are presented to emphasize the importance of the objective analysis techniques used in the assimilation of observational data.
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
1984-01-01
Some objectives of this geodynamics program are: (1) optimal utilization of laser and VLBI observations as reference frames for geodynamics, (2) utilization of range difference observations in geodynamics, and (3) estimation techniques in crustal deformation analysis. The determination of Earth rotation parameters from different space geodetic systems is studied. Also reported on is the utilization of simultaneous laser range differences for the determination of baseline variation. An algorithm for the analysis of regional or local crustal deformation measurements is proposed along with other techniques and testing procedures. Some results of the reference frame comparisons, in terms of the pole coordinates from different techniques, are presented.
USDA-ARS?s Scientific Manuscript database
Objective: To prepare a new fluorescent tracer against common mycotoxins such as fumonisin B1 in order to replace 6-(4,6-Dichlorotriazinyl) aminofluorescein (6-DTAF), an expensive marker, and to develop a technique for quick detection of fumonisin B1 based on the principle of fluorescence polarizati...
Multiprocessor smalltalk: Implementation, performance, and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallas, J.I.
1990-01-01
Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes, and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
The generation and use of numerical shape models for irregular Solar System objects
NASA Technical Reports Server (NTRS)
Simonelli, Damon P.; Thomas, Peter C.; Carcich, Brian T.; Veverka, Joseph
1993-01-01
We describe a procedure that allows the efficient generation of numerical shape models for irregular Solar System objects, where a numerical model is simply a table of evenly spaced body-centered latitudes and longitudes and their associated radii. This modeling technique uses a combination of data from limbs, terminators, and control points, and produces shape models that have some important advantages over analytical shape models. Accurate numerical shape models make it feasible to study irregular objects with a wide range of standard scientific analysis techniques. These applications include the determination of moments of inertia and surface gravity, the mapping of surface locations and structural orientations, photometric measurement and analysis, the reprojection and mosaicking of digital images, and the generation of albedo maps. The capabilities of our modeling procedure are illustrated through the development of an accurate numerical shape model for Phobos and the production of a global, high-resolution, high-pass-filtered digital image mosaic of this Martian moon. Other irregular objects that have been modeled, or are being modeled, include the asteroid Gaspra and the satellites Deimos, Amalthea, Epimetheus, Janus, Hyperion, and Proteus.
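One such application can be sketched directly from the definition of a numerical shape model: given radii on an even latitude/longitude grid, the volume follows from integrating r^3/3 over solid angle. The radius table below is a synthetic lumpy body, not the Phobos model.

```python
# Volume of a star-shaped body from a numerical shape model:
# V = integral of (r^3 / 3) dOmega, with dOmega = cos(lat) dlat dlon.
import numpy as np

nlat, nlon = 90, 180
lat = np.linspace(-np.pi / 2, np.pi / 2, nlat)
lon = np.linspace(0, 2 * np.pi, nlon, endpoint=False)
LON, LAT = np.meshgrid(lon, lat)

# Toy irregular body: mean radius 11 km with lumpy harmonics
r = 11.0 + 1.5 * np.cos(2 * LON) * np.cos(LAT) + 0.8 * np.sin(3 * LAT)

dlat = lat[1] - lat[0]
dlon = lon[1] - lon[0]
volume = np.sum((r ** 3) / 3.0 * np.cos(LAT) * dlat * dlon)
print(f"model volume: {volume:.0f} km^3")
```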
NASA Technical Reports Server (NTRS)
Crawford, Daniel J.; Burdette, Daniel W.; Capron, William R.
1993-01-01
The methodology and techniques used to collect and analyze look-point position data from a real-time ATC display-format comparison experiment are documented. That study compared the delivery precision and controller workload of three final approach spacing aid display formats. Using an oculometer, controller lookpoint position data were collected, associated with gaze objects (e.g., moving aircraft) on the ATC display, and analyzed to determine eye-scan behavior. The equipment involved and algorithms for saving, synchronizing with the ATC simulation output, and filtering the data are described. Target (gaze object) and cross-check scanning identification algorithms are also presented. Data tables are provided of total dwell times, average dwell times, and cross-check scans. Flow charts, block diagrams, file record descriptors, and source code are included. The techniques and data presented are intended to benefit researchers in other studies that incorporate non-stationary gaze objects and oculometer equipment.
Single-organelle tracking by two-photon conversion
NASA Astrophysics Data System (ADS)
Watanabe, Wataru; Shimada, Tomoko; Matsunaga, Sachihiro; Kurihara, Daisuke; Fukui, Kiichi; Arimura, Shin-Ichi; Tsutsumi, Nobuhiro; Isobe, Keisuke; Itoh, Kazuyoshi
2007-03-01
Spatial and temporal information about intracellular objects and their dynamics within a living cell is essential for the dynamic analysis of such objects in cell biology. A specific intracellular object can be discriminated by photoactivatable fluorescent proteins that exhibit pronounced light-induced spectral changes. Here, we report on selective labeling and tracking of a single organelle using two-photon conversion of a photoconvertible fluorescent protein with near-infrared femtosecond laser pulses. We performed selective labeling of a single mitochondrion in a living tobacco BY-2 cell using two-photon photoconversion of Kaede. Using this technique, we demonstrated that, in plants, the directed movement of individual mitochondria along the cytoskeleton is mediated by actin filaments, whereas microtubules are not required for the movement of mitochondria. This single-organelle labeling technique enabled us to track the dynamics of a single organelle, revealing the mechanisms involved in organelle dynamics. The technique has potential application in the direct tracking of selected cellular and intracellular structures.
Facilitating LOS Debriefings: A Training Manual
NASA Technical Reports Server (NTRS)
McDonnell, Lori K.; Jobe, Kimberly K.; Dismukes, R. Key
1997-01-01
This manual is a practical guide to help airline instructors effectively facilitate debriefings of Line Oriented Simulations (LOS). It is based on a recently completed study of Line Oriented Flight Training (LOFT) debriefings at several U.S. airlines. This manual presents specific facilitation tools instructors can use to achieve debriefing objectives. The approach of the manual is flexible so it can be tailored to the individual needs of each airline. Part One clarifies the purpose and objectives of facilitation in the LOS setting. Part Two provides recommendations for clarifying roles and expectations and presents a model for organizing discussion. Part Three suggests techniques for eliciting active crew participation and in-depth analysis and evaluation. Finally, in Part Four, these techniques are organized according to the facilitation model. Examples of how to effectively use the techniques are provided throughout, including strategies to try when the debriefing objectives are not being fully achieved.
NASA Technical Reports Server (NTRS)
McDowell, Mark
2004-01-01
An integrated algorithm for decomposing overlapping particle images (multi-particle objects) and determining each object's constituent particle centroid(s) has been developed using image analysis techniques. The centroid-finding algorithm uses a modified eight-direction search method for finding the perimeter of any enclosed object. The centroid is calculated using the intensity-weighted center of mass of the object. The overlap decomposition algorithm further analyzes the object data and breaks the object down into its constituent particle centroid(s). This is accomplished with an artificial neural network, feature-based technique and provides an efficient way of decomposing overlapping particles. Combining the centroid-finding and overlap decomposition routines into a single algorithm allows us to accurately predict the error associated with finding the centroid(s) of particles in our experiments. This algorithm has been tested using real, simulated, and synthetic data, and the results are presented and discussed.
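A miniature of the centroid stage follows (the perimeter search and neural-network overlap decomposition are not reproduced; the particle image is synthetic):

```python
# Intensity-weighted centroids of labeled particles in a synthetic image.
import numpy as np
from scipy import ndimage

image = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
image += np.exp(-(((yy - 20) ** 2 + (xx - 30) ** 2) / 30.0))   # particle 1
image += np.exp(-(((yy - 45) ** 2 + (xx - 40) ** 2) / 30.0))   # particle 2

mask = image > 0.2
labels, nobj = ndimage.label(mask)
centroids = ndimage.center_of_mass(image, labels, range(1, nobj + 1))
print(f"{nobj} objects, intensity-weighted centroids: {centroids}")
```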
S-192 analysis: Conventional and special data processing techniques. [Michigan
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.
1975-01-01
The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special unresolved-object processing techniques were disappointing due, at least in part, to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.
Automatic archaeological feature extraction from satellite VHR images
NASA Astrophysics Data System (ADS)
Jahjah, Munzer; Ulivieri, Carlo
2010-05-01
Archaeological applications need a methodological approach on a variable scale able to satisfy both intra-site (excavation) and inter-site (survey, environmental research) needs. The increased availability of high resolution and micro-scale data has substantially favoured archaeological applications and the consequent use of GIS platforms for reconstruction of archaeological landscapes based on remotely sensed data. Feature extraction from multispectral remote sensing images is an important task before any further processing. High resolution remote sensing data, especially panchromatic, is an important input for the analysis of various types of image characteristics; it plays an important role in visual systems for recognition and interpretation of given data. The methods proposed rely on an object-oriented approach based on a theory for the analysis of spatial structures called mathematical morphology. The term "morphology" stems from the fact that it aims at analysing object shapes and forms. It is mathematical in the sense that the analysis is based on set theory, integral geometry, and lattice algebra. Mathematical morphology has proven to be a powerful image analysis technique; two-dimensional grey tone images are seen as three-dimensional sets by associating each image pixel with an elevation proportional to its intensity level. An object of known shape and size, called the structuring element, is then used to investigate the morphology of the input set. This is achieved by positioning the origin of the structuring element at every possible position of the space and testing, for each position, whether the structuring element either is included in or has a nonempty intersection with the studied set. The shape and size of the structuring element must be selected according to the morphology of the searched image structures. Two other feature extraction techniques, eCognition and the ENVI SW module, were used in order to compare the results. These techniques were applied to different archaeological sites in Turkmenistan (Nisa) and in Iraq (Babylon); a further change detection analysis was applied to the Babylon site using two HR images acquired before and after the second Gulf War. The results differed, since the operative scale of the sensed data determines the final result of the processing and the quality of the information output, and each technique was sensitive to specific shapes in each input image; we mapped linear and nonlinear objects, updated the archaeological cartography, and performed an automatic change detection analysis for the Babylon site. The discussion of these techniques aims to provide the archaeological team with new instruments for the orientation and planning of a remote sensing application.
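A small morphology example in the spirit of the approach (synthetic image, not the paper's rule sets): opening with a disk structuring element removes features narrower than the disk, a typical step when separating broad structures from thin linear ones and speckle.

```python
# Morphological opening with a disk structuring element on a synthetic image.
import numpy as np
from skimage.morphology import disk, opening

image = np.zeros((100, 100), dtype=np.uint8)
image[48:52, 10:90] = 1                 # a thin linear feature ("wall")
image[20:40, 20:40] = 1                 # a broad blob
image[np.random.default_rng(9).uniform(size=(100, 100)) > 0.995] = 1  # speckle

selem = disk(3)
opened = opening(image, selem)          # removes speckle and the thin line
print("pixels before/after opening:", image.sum(), opened.sum())
```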
Three Object-Oriented Enhancements for EPICS
NASA Astrophysics Data System (ADS)
Osberg, E. A.; Dohan, D. A.; Richter, R.; Biggs, R.; Chillara, K.; Wade, D.; Bossom, J.
1994-12-01
In line with our group's intention of producing software using, where possible, Object-Oriented methodologies and techniques in the development of RF control systems, we have undertaken three projects to enhance the EPICS software environment. Two of the projects involve interfaces to EPICS Channel Access from Object-Oriented languages. The third is an enhancement to the EPICS State Notation Language to better support the Shlaer-Mellor Object-Oriented Analysis and Design methodology. This paper discusses the motivation, approaches, results, and future directions of these three projects.
How Factor Analysis Can Be Used in Classification.
ERIC Educational Resources Information Center
Harman, Harry H.
This is a methodological study that suggests a taxometric technique for objective classification of yeasts. It makes use of the minres method of factor analysis and groups strains of yeast according to their factor profiles. The similarities are judged in the higher-dimensional space determined by the factor analysis, but otherwise rely on the…
Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Llames, Rene Lim
1991-01-01
Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effects of the number of generations and generation capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.
Shape and Color Features for Object Recognition Search
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Duong, Vu A.; Stubberud, Allen R.
2012-01-01
A bio-inspired shape feature of an object of interest emulates the integration of saccadic eye movement and the horizontal layer in the vertebrate retina for object recognition search, where a single object can be processed at a time. An optimal computational model for shape-extraction-based principal component analysis (PCA) was also developed to reduce processing time and enable real-time adaptive system capability. A color feature of the object is employed for color segmentation to complement shape-based recognition in heterogeneous environments, where a single technique, shape or color alone, may encounter difficulties. To make the system effective, an adaptive architecture and autonomous mechanism were developed to recognize and adapt to the shape and color features of a moving object. Object recognition based on bio-inspired shape and color features can be effective for recognizing a person of interest in heterogeneous environments where a single technique would struggle to perform effective recognition. Moreover, this work also demonstrates the mechanism and architecture of an autonomous adaptive system, a step toward a realistic system for practical use.
SSBRP User Operations Facility (UOF) Overview and Development Strategy
NASA Technical Reports Server (NTRS)
Picinich, Lou; Stone, Thom; Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)
1995-01-01
This paper will present the Space Station Biological Research Project (SSBRP) User Operations Facility (UOF) architecture and development strategy. A major element of the UOF at NASA Ames Research Center, the Communication and Data System (CDS), will be the primary focus of the discussion. CDS operational, telescience, security, and development objectives will be discussed along with the CDS implementation strategy. The implementation strategy discussion will cover Object-Oriented Analysis & Design, system and software prototyping, and technology utilization. A CDS design overview will also be presented, including the CDS context diagram, CDS architecture, object models, use cases, and user interfaces. CDS development brings together "cutting edge" technologies and techniques such as object-oriented development, network security, multimedia networking, web-based data distribution, Java, and graphical user interfaces. Use of these technologies and techniques translates directly into lower development and operations costs.
Lightweight Active Object Retrieval with Weak Classifiers.
Czúni, László; Rashad, Metwally
2018-03-07
In the last few years, there has been a steadily growing interest in autonomous vehicles and robotic systems. While many of these agents are expected to have limited resources, these systems should be able to dynamically interact with other objects in their environment. We present an approach where lightweight sensory and processing techniques, requiring very limited memory and processing power, can be successfully applied to the task of object retrieval using sensors of different modalities. We use the Hough framework to fuse optical and orientation information of the different views of the objects. In the presented spatio-temporal perception technique, we apply active vision, where, based on the analysis of initial measurements, the direction of the next view is determined to increase the hit-rate of retrieval. The performance of the proposed methods is shown on three datasets loaded with heavy noise.
Frame sequences analysis technique of linear objects movement
NASA Astrophysics Data System (ADS)
Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.
2017-12-01
Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the motion of the objects being studied becomes an important component of the research. This work discusses the analysis of the motion of linear objects in the two-dimensional plane. The complexity of this problem increases when a frame contains numerous objects whose images may overlap. This study uses a sequence of 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of object motion. This velocity was found as an average over 8-12 objects, with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro, with subsequent approximation of the obtained data using the Hill equation.
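A minimal sketch of the velocity estimate described above, assuming each object has already been tracked to a per-frame list of (x, y) centroids (the tracking itself, done in GMimPro in the study, is not reproduced here); the tracks and pixel units are illustrative.

```python
import numpy as np

FRAME_RATE_HZ = 2.0  # as stated for the 30-frame sequence

def mean_speed(track):
    """Average speed of one object from its per-frame (x, y) positions."""
    track = np.asarray(track, dtype=float)
    steps = np.linalg.norm(np.diff(track, axis=0), axis=1)  # pixels per frame
    return steps.mean() * FRAME_RATE_HZ                      # pixels per second

# Average over several tracked objects (8-12 in the study).
tracks = [
    [(1.0, 1.0), (2.0, 1.5), (3.1, 2.0)],
    [(10.0, 5.0), (11.2, 5.1), (12.3, 5.4)],
]
speeds = [mean_speed(t) for t in tracks]
print(f"average velocity: {np.mean(speeds):.2f} +/- {np.std(speeds):.2f} px/s")
```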
Silicon ribbon technology assessment 1978-1986 - A computer-assisted analysis using PECAN
NASA Technical Reports Server (NTRS)
Kran, A.
1978-01-01
The paper presents a 1978-1986 economic outlook for silicon ribbon technology based on the capillary action shaping technique. The outlook is presented within the framework of two sets of scenarios, which develop strategy for approaching the 1986 national energy capacity cost objective of $0.50/WE peak. The PECAN (Photovoltaic Energy Conversion Analysis) simulation technique is used to develop a 1986 sheet material price ($50/sq m) which apparently can be attained without further scientific breakthrough.
2013-09-01
Result Analysis: In this phase, users and analysts check all the results per objective-question. Then, they consolidate all these results to form... the CRUD technique. By using both the CRUD and the user goal techniques, we identified all the use cases the iFRE system must perform. Table 3... corresponding Focus Area or Critical Operation Issue to simplify the user tasks, and exempts the user from remembering the identifying codes/numbers of...
The Universe at Ultraviolet Wavelengths: The first two years of International Ultraviolet Explorer
NASA Technical Reports Server (NTRS)
Chapman, R. D. (Editor)
1981-01-01
Highlights of the results obtained from the IUE satellite are addressed. Specific topics discussed include the solar system, O-A stars, F-M stars, binary stars and highly evolved objects, nebulae and the interstellar medium, and extragalactic objects. Data reduction techniques employed in the analysis of the varied data are also discussed.
DOT National Transportation Integrated Search
2016-03-02
The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...
DOT National Transportation Integrated Search
2015-06-01
The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...
Discriminant forest classification method and system
Chen, Barry Y.; Hanley, William G.; Lemmond, Tracy D.; Hiller, Lawrence J.; Knapp, David A.; Mugge, Marshall J.
2012-11-06
A hybrid machine learning methodology and system for classification that combines classical random forest (RF) methodology with discriminant analysis (DA) techniques to provide enhanced classification capability. A DA technique which uses feature measurements of an object to predict its class membership, such as linear discriminant analysis (LDA) or Andersen-Bahadur linear discriminant technique (AB), is used to split the data at each node in each of its classification trees to train and grow the trees and the forest. When training is finished, a set of n DA-based decision trees of a discriminant forest is produced for use in predicting the classification of new samples of unknown class.
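The patent text above describes the general scheme; the following is a simplified, hypothetical sketch of the core idea, not the patented implementation: each node of each bootstrap-trained tree splits the data with an LDA classifier fit on the samples reaching that node, instead of a single-feature threshold. The depth limit, stopping rules, and the class-0-versus-rest split for multiclass data are illustrative assumptions.

```python
import numpy as np
from collections import Counter
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class DANode:
    """One tree node whose split is a discriminant-analysis classifier."""
    def __init__(self, X, y, depth=0, max_depth=5, min_samples=5):
        self.label = Counter(y).most_common(1)[0][0]  # majority-class fallback
        self.lda = None
        if depth < max_depth and len(y) >= min_samples and len(set(y)) > 1:
            self.lda = LinearDiscriminantAnalysis().fit(X, y)
            side = self.lda.predict(X)
            left = side == self.lda.classes_[0]
            if left.any() and (~left).any():
                self.left = DANode(X[left], y[left], depth + 1, max_depth, min_samples)
                self.right = DANode(X[~left], y[~left], depth + 1, max_depth, min_samples)
            else:
                self.lda = None  # degenerate split: become a leaf

    def predict_one(self, x):
        if self.lda is None:
            return self.label
        side = self.lda.predict(x.reshape(1, -1))[0]
        child = self.left if side == self.lda.classes_[0] else self.right
        return child.predict_one(x)

def discriminant_forest(X, y, n_trees=25, rng=np.random.default_rng(0)):
    """Grow n_trees DA-split trees on bootstrap samples of (X, y)."""
    return [DANode(*map(lambda a: a[rng.integers(0, len(y), len(y))], (X, y)))
            for _ in range(n_trees)]

def forest_predict(trees, x):
    """Majority vote over the discriminant forest."""
    return Counter(t.predict_one(x) for t in trees).most_common(1)[0][0]
```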
The Educational Institution as a System--A Proposed Generalized Procedure for Analysis.
ERIC Educational Resources Information Center
REISMAN, ARNOLD; TAFT, MARTIN I.
A unified approach to the analysis and synthesis of the functions and operations in educational institutions is presented. Systems analysis techniques used in other areas, such as CRAFT, PERT, CERBS, and operations research, are suggested as potentially adaptable for use in higher education. The major objective of a school is to allocate available…
ERIC Educational Resources Information Center
Morris, Phillip; Thrall, Grant
2010-01-01
Geographic analysis has been adopted by businesses, especially the retail sector, since the early 1990s (Thrall, 2002). Institutional research can receive the same benefits businesses have by adopting geographic analysis and technology. The commonalities between businesses and higher education institutions include the existence of trade areas, the…
Automatic Topography Using High Precision Digital Moire Methods
NASA Astrophysics Data System (ADS)
Yatagai, T.; Idesawa, M.; Saito, S.
1983-07-01
Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.
Digital imaging and image analysis applied to numerical applications in forensic hair examination.
Brooks, Elizabeth; Comber, Bruce; McNaught, Ian; Robertson, James
2011-03-01
A non-destructive method that provides objective data to complement the hair analyst's microscopic observations would be of obvious benefit in the forensic examination of hairs. This paper reports on the use of objective colour measurement and image analysis techniques on auto-montaged images. Brown Caucasian telogen scalp hairs were chosen as a stern test of the utility of these approaches. The results show the value of using auto-montaged images and the potential for objective numerical measures of colour and pigmentation to complement microscopic observations.
NASA Astrophysics Data System (ADS)
Sudra, Gunther; Speidel, Stefanie; Fritz, Dominik; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger
2007-03-01
Minimally invasive surgery is a highly complex medical discipline with various risks for surgeon and patient, but it also has numerous advantages on the patient's side. The surgeon has to adopt special operation techniques and deal with difficulties like complex hand-eye coordination, a limited field of view, and restricted mobility. To alleviate these problems, we propose to support the surgeon's spatial cognition by using augmented reality (AR) techniques to directly visualize virtual objects in the surgical site. In order to generate intelligent support, it is necessary to have an intraoperative assistance system that recognizes the surgical skills during the intervention and provides context-aware assistance to the surgeon using AR techniques. With MEDIASSIST we bundle our research activities in the field of intraoperative intelligent support and visualization. Our experimental setup consists of a stereo endoscope, an optical tracking system and a head-mounted display for 3D visualization. The framework will be used as a platform for the development and evaluation of our research in the field of skill recognition and context-aware assistance generation. This includes methods for surgical skill analysis, skill classification, context interpretation as well as assistive visualization and interaction techniques. In this paper we present the objectives of MEDIASSIST and first results in the fields of skill analysis, visualization and multi-modal interaction. In detail we present a markerless instrument tracking method for surgical skill analysis as well as visualization techniques and the recognition of interaction gestures in an AR environment.
Analyzing and designing object-oriented missile simulations with concurrency
NASA Astrophysics Data System (ADS)
Randorf, Jeffrey Allen
2000-11-01
A software object model for the six-degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was conducted, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique (OMT). The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, as well as other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered: a reference structure, a recursive structure, and a reference structure augmented for flight vehicle modeling. The reference OCA design option was chosen for maintaining simplicity while not compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the OCA reference design intent or structure. The implementation language was the 1995 release of Ada, and it was shown how OCA software components can be expressed as Ada child packages. While acceleration of several low-level and higher operations-level tasks is possible on suitable hardware, a 33% degradation in the performance of a 4th-order Runge-Kutta integration of two simultaneous ordinary differential equations was observed when using Ada tasking on a single-processor machine (a sketch of such an integrator follows this abstract). The Defense Department's High Level Architecture (HLA) was introduced and explained in context with the OCA. It was shown that the HLA and OCA are not mutually exclusive but complementary architectures: HLA was presented as an interoperability solution, with the OCA as an architectural vehicle for software reuse. Further directions for implementing a 6-DOF missile modeling environment are discussed.
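As referenced above, here is a minimal sketch of the classical 4th-order Runge-Kutta step for two simultaneous ODEs, in Python rather than Ada; the harmonic-oscillator right-hand side is an illustrative stand-in for the missile equations of motion.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Two coupled ODEs: y = [position, velocity], y' = [velocity, -position].
f = lambda t, y: np.array([y[1], -y[0]])
y, t, h = np.array([1.0, 0.0]), 0.0, 0.01
for _ in range(628):  # integrate to t ~ 2*pi, one full oscillation period
    y = rk4_step(f, t, y, h)
    t += h
print(y)  # should be close to [1, 0] after one full period
```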
DOT National Transportation Integrated Search
1983-12-01
This research was performed to complete and advance the status of recently developed procedures for the analysis and design of weaving sections (known as the Leisch method and initially published in the 1979 issue of ITE Journal). The objective was to ...
DOT National Transportation Integrated Search
2016-03-11
The principal objective of this project, Connected Vehicle Impacts on Transportation Planning, is to comprehensively assess how connected vehicles should be considered across the range of transportation planning processes and products developed...
Job Analysis, Job Descriptions, and Performance Appraisal Systems.
ERIC Educational Resources Information Center
Sims, Johnnie M.; Foxley, Cecelia H.
1980-01-01
Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…
Sensitivity analysis for large-scale problems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Whitworth, Sandra L.
1987-01-01
The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
Correcting the lobule in otoplasty using the fillet technique.
Sadick, Haneen; Artinger, Verena M; Haubner, Frank; Gassner, Holger G
2014-01-01
Correction of the protruded lobule in otoplasty continues to represent an important challenge. The lack of skeletal elements within the lobule makes controlled lobule repositioning less predictable. OBJECTIVE: To present a new surgical technique for lobule correction in otoplasty. Human cadaver studies were performed for detailed anatomical analysis of lobule deformities. In addition, we evaluated a novel algorithmic approach to correction of the lobule in 12 consecutive patients. INTERVENTIONS/EXPOSURES: Otoplasty with surgical correction of the lobule using the fillet technique. The surgical outcome in the 12 most recent consecutive patients with at least 3 months of follow-up was assessed retrospectively, and the postsurgical results were independently reviewed by a panel of noninvolved experts. The 3 major anatomic components of lobular deformities are the axial angular protrusion, the coronal angular protrusion, and the inherent shape. The fillet technique described in the present report addresses all 3 aspects in an effective way. Clinical data analysis revealed no immediate or long-term complications associated with this new surgical method. The patients' subjective ratings and the panel's objective ratings revealed "good" to "very good" postoperative results. This newly described fillet technique represents a safe and efficient method to correct protruded ear lobules in otoplasty. It allows precise and predictable positioning of the lobule with an excellent safety profile. Level of evidence: 4.
Systematic procedure for designing processes with multiple environmental objectives.
Kim, Ki-Joo; Smith, Raymond L
2005-04-01
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
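A minimal sketch of the Pareto screening implicit in the evaluation step above, assuming all objectives are expressed as minimizations (profit can be negated); the candidate designs are illustrative.

```python
import numpy as np

def pareto_front(F):
    """Boolean mask of nondominated rows of an (n_designs, n_objectives) array."""
    F = np.asarray(F, dtype=float)
    nondominated = np.ones(F.shape[0], dtype=bool)
    for i in range(F.shape[0]):
        if not nondominated[i]:
            continue
        # j dominates i if j is <= i in every objective and < in at least one.
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominated.any():
            nondominated[i] = False
    return nondominated

# Example: columns = (environmental impact, negated profit), both minimized.
designs = np.array([[3.0, -5.0], [2.0, -4.0], [4.0, -6.0], [2.5, -5.5]])
print(pareto_front(designs))  # marks the designs on the Pareto front
```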
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for the quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. In the traditional method, fingerprints of all the pure chemical components are needed. In practice, only the objective components in a mixture and their absorption features are known, so it is necessary to present a more practical technique for detection and identification. Our new method for the quantitative inspection of mixtures of illicit drugs is based on the derivative spectrum. In this method, the ratio of objective components in a mixture can be obtained on the assumption that all objective components in the mixture and their absorption features are known, while knowledge of the unknown components is not needed. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for the experiment. The experimental result verified the effectiveness of the method, suggesting that it could be an effective approach for the quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world applications of quantitative illicit drug analysis, and it could be an effective method in the field of security and pharmaceutical inspection.
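A minimal sketch of the derivative-spectrum idea described above: differentiation suppresses the slowly varying contribution of unknown components, after which the fraction of a known objective component follows from a least-squares fit against its reference derivative spectrum. All spectra below are synthetic illustrations, not THz measurements.

```python
import numpy as np

freq = np.linspace(0.2, 2.5, 500)                 # THz axis (assumed range)
ref = np.exp(-((freq - 1.2) / 0.05) ** 2)         # reference absorption feature
background = 0.3 + 0.1 * freq                     # smooth, unknown adulterant
mixture = 0.4 * ref + background                  # 40% objective component

# Differentiation removes the constant and nearly removes the linear slope.
d_ref = np.gradient(ref, freq)
d_mix = np.gradient(mixture, freq)

# Least-squares estimate of the mixing ratio from the derivative spectra.
ratio = np.dot(d_ref, d_mix) / np.dot(d_ref, d_ref)
print(f"estimated fraction: {ratio:.3f}")         # approximately 0.4
```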
NASA Astrophysics Data System (ADS)
Madrigal, Carlos A.; Restrepo, Alejandro; Branch, John W.
2016-09-01
3D reconstruction of small objects is used in applications of surface analysis, forensic analysis, and tissue reconstruction in medicine. In this paper, we propose a strategy for the 3D reconstruction of small objects and the identification of some superficial defects. We applied a technique of structured light pattern projection, specifically sinusoidal fringes, together with a phase unwrapping algorithm. A CMOS camera was used to capture images and a DLP digital light projector for synchronous projection of the sinusoidal pattern onto the objects. We implemented a calibration procedure based on a 2D flat pattern, from which the intrinsic and extrinsic parameters of the camera and the DLP were determined. Experimental tests were performed on samples of artificial teeth, coal particles, welding defects, and surfaces marked by Vickers indentation; areas of less than 5 cm were studied. The objects were reconstructed in 3D with densities of about one million points per sample. In addition, the steps of 3D description, primitive identification, training, and classification were implemented to recognize defects such as holes, cracks, rough textures, and bumps. We found that pattern recognition strategies are useful for quality supervision of surfaces when enough points are available to evaluate the defective region, because the identification of defects in small objects is a demanding visual inspection task.
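A minimal sketch of fringe-projection phase recovery, assuming a standard four-step phase-shifting scheme (the paper's exact algorithm is not specified here); I0..I3 are four images of the scene with the projected sinusoid shifted by 90 degrees each time, and the unwrapping is a simple row-wise pass.

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Classic four-bucket formula; result lies in (-pi, pi]."""
    return np.arctan2(I3 - I1, I0 - I2)

def unwrap_rows(phi):
    """Row-by-row 1-D unwrapping; robust 2-D unwrapping is a harder problem."""
    return np.unwrap(phi, axis=1)

# Synthetic test: a linear phase ramp observed through shifted fringes.
x = np.linspace(0, 6 * np.pi, 256)
true_phase = np.tile(x, (64, 1))
frames = [1 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]
phi = unwrap_rows(wrapped_phase(*frames))
# Up to a constant offset per row, the recovered phase matches the truth.
print(np.allclose(phi - phi[:, :1], true_phase - true_phase[:, :1], atol=1e-6))
```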
Van Lierde, Kristiane M; De Bodt, Marc; Dhaeseleer, Evelien; Wuyts, Floris; Claeys, Sofie
2010-05-01
The purpose of the present study is to measure the effectiveness of two treatment techniques--vocalization with abdominal breath support and manual circumlaryngeal therapy (MCT)--in patients with muscle tension dysphonia (MTD). The vocal quality before and after the two treatment techniques was measured by means of the Dysphonia Severity Index (DSI), which is designed to establish an objective and quantitative correlate of the perceived vocal quality. The DSI is based on a weighted combination of the following set of voice measurements: maximum phonation time (MPT), highest frequency, lowest intensity, and jitter. The repeated-measures analysis of variance (ANOVA) revealed a significant difference between the objective overall vocal quality before and after MCT. No significant differences were measured between the objective overall vocal quality before and after vocalization with abdominal breath support. This study provides evidence that MCT is an effective treatment technique for patients with elevated laryngeal position, increased laryngeal muscle tension, and MTD. The precise way in which MCT affects vocal quality was not addressed in this experiment and merits study. Further research into this topic could focus on electromyography (EMG) recordings in relation to vocal improvements with a larger sample of subjects.
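For orientation, the weighted combination underlying the DSI is commonly cited (after Wuyts and colleagues, 2000) as:

DSI = 0.13 × MPT (s) + 0.0053 × F0-high (Hz) − 0.26 × I-low (dB) − 1.18 × jitter (%) + 12.4

where a value around +5 corresponds to a perceptually normal voice and around −5 to a severely dysphonic voice. The abstract itself does not quote these weights, so they are given here only as the commonly cited form.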
MeV ion-beam analysis of optical data storage films
NASA Technical Reports Server (NTRS)
Leavitt, J. A.; Mcintyre, L. C., Jr.; Lin, Z.
1993-01-01
Our objectives are threefold: (1) to accurately characterize optical data storage films by MeV ion-beam analysis (IBA) for ODSC collaborators; (2) to develop new and/or improved analysis techniques; and (3) to expand the capabilities of the IBA facility itself. Using H-1(+), He-4(+), and N-15(++) ion beams in the 1.5 MeV to 10 MeV energy range from a 5.5 MV Van de Graaff accelerator, film thickness (in atoms/sq cm), stoichiometry, impurity concentration profiles, and crystalline structure were determined by Rutherford backscattering (RBS), high-energy backscattering, channeling, nuclear reaction analysis (NRA) and proton induced X-ray emission (PIXE). Most of these techniques are discussed in detail in the ODSC Annual Report (February 17, 1987), p. 74. The PIXE technique is briefly discussed in the ODSC Annual Report (March 15, 1991), p. 23.
Land Cover Analysis by Using Pixel-Based and Object-Based Image Classification Method in Bogor
NASA Astrophysics Data System (ADS)
Amalisana, Birohmatin; Rokhmatullah; Hernina, Revi
2017-12-01
The advantage of image classification is that it provides earth-surface information such as land cover and its time-series changes. Nowadays, pixel-based image classification is commonly performed with a variety of algorithms such as minimum distance, parallelepiped, maximum likelihood, and Mahalanobis distance. On the other hand, land cover classification can also be obtained by using object-based image classification. Object-based classification uses image segmentation driven by parameters such as scale, form, colour, smoothness, and compactness. This research aims to compare the results of land cover classification and its change detection between the parallelepiped pixel-based and the object-based classification methods. The location of this research is Bogor, with a 20-year observation period from 1996 until 2016. This region is known as an urban area that changes continuously due to rapid development, so its time-series land cover information is of particular interest.
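A minimal sketch of the minimum-distance-to-means rule, one of the pixel-based algorithms listed above; the band count and class means are illustrative assumptions.

```python
import numpy as np

def minimum_distance_classify(image, class_means):
    """image: (H, W, bands); class_means: (n_classes, bands) training means."""
    H, W, B = image.shape
    pixels = image.reshape(-1, B).astype(float)
    # Squared Euclidean distance of every pixel to every class mean.
    d2 = ((pixels[:, None, :] - class_means[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1).reshape(H, W)

# Example with 4 spectral bands and 3 land-cover classes.
rng = np.random.default_rng(0)
image = rng.random((100, 100, 4))
means = np.array([[0.2] * 4, [0.5] * 4, [0.8] * 4])
labels = minimum_distance_classify(image, means)
print(np.bincount(labels.ravel()))  # pixel count per class
```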
Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.
Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang
2015-10-14
Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated.
Roy Mann
1979-01-01
Drilling rigs, confined dredged material disposal sites, power and sewage treatment facilities, and other built objects on or near shorelines have often created appreciable impacts on the aesthetic perceptions of residents and recreational users. Techniques for assessing such impacts that are reviewed in this paper include viewscape analysis for large-scale shore...
Three-dimensional scanner based on fringe projection
NASA Astrophysics Data System (ADS)
Nouri, Taoufik
1995-07-01
This article presents a way of scanning 3D objects using noninvasive, contactless techniques. The principle is to project parallel fringes onto an object and then to record the object from two viewing angles. With appropriate processing one can reconstruct the 3D object even when it has no symmetry planes. The 3D surface data are available immediately in digital form for computer visualization and for analysis software tools. The optical setup for recording the object, the data extraction and processing, and the reconstruction of the object are reported and commented on. Applications are proposed for reconstructive/cosmetic surgery, CAD, animation, and research.
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
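A minimal sketch of the basic quantitative step described above, the mean spectral reflectance curve over a user-defined region of interest; the 70 bands over 365-1100 nm follow the abstract, while the image size, data, and ROI are illustrative.

```python
import numpy as np

wavelengths = np.linspace(365, 1100, 70)                 # nm, per the abstract
cube = np.random.default_rng(3).random((256, 256, 70))   # stand-in reflectance cube

roi = np.zeros((256, 256), dtype=bool)
roi[100:140, 80:120] = True                              # user-defined region

mean_curve = cube[roi].mean(axis=0)                      # (70,) mean reflectance
std_curve = cube[roi].std(axis=0)                        # per-band spread statistic
print(wavelengths[mean_curve.argmax()], mean_curve.max())
```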
Foster, J D; Miskovic, D; Allison, A S; Conti, J A; Ockrim, J; Cooper, E J; Hanna, G B; Francis, N K
2016-06-01
Laparoscopic rectal resection is technically challenging, with outcomes dependent upon technical performance. No robust objective assessment tool exists for laparoscopic rectal resection surgery. This study aimed to investigate the application of the objective clinical human reliability analysis (OCHRA) technique for assessing technical performance of laparoscopic rectal surgery and to explore the validity and reliability of this technique. Laparoscopic rectal cancer resection operations were described in the format of a hierarchical task analysis, and potential technical errors were defined. The OCHRA technique was used to identify technical errors enacted in videos of twenty consecutive laparoscopic rectal cancer resection operations from a single site. The procedural task, spatial location, and circumstances of all identified errors were logged. Clinical validity was assessed through correlation with clinical outcomes; reliability was assessed by test-retest. A total of 335 execution errors were identified, with a median of 15 per operation. More errors were observed during pelvic tasks than during abdominal tasks (p < 0.001). Within the pelvis, more errors were observed during dissection on the right side than on the left (p = 0.03). Test-retest confirmed reliability (r = 0.97, p < 0.001). A significant correlation was observed between error frequency and mesorectal specimen quality (rs = 0.52, p = 0.02) and with blood loss (rs = 0.609, p = 0.004). OCHRA offers a valid and reliable method for evaluating the technical performance of laparoscopic rectal surgery.
Method for high resolution magnetic resonance analysis using magic angle technique
Wind, Robert A.; Hu, Jian Zhi
2003-12-30
A method of performing a magnetic resonance analysis of a biological object that includes placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. The object may be reoriented about the magic angle axis between three predetermined positions that are related to each other by 120°. The main magnetic field may be rotated mechanically or electronically. Methods for magnetic resonance imaging of the object are also described.
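The quoted axis orientation is the standard NMR magic angle, the angle at which the second-order Legendre term vanishes:

3 cos²θm − 1 = 0, so θm = arccos(1/√3) ≈ 54.7356° ≈ 54°44'.

This is a well-known constant of magic angle spinning and not specific to these patents.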
Method for high resolution magnetic resonance analysis using magic angle technique
Wind, Robert A.; Hu, Jian Zhi
2004-12-28
A method of performing a magnetic resonance analysis of a biological object that includes placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. The object may be reoriented about the magic angle axis between three predetermined positions that are related to each other by 120°. The main magnetic field may be rotated mechanically or electronically. Methods for magnetic resonance imaging of the object are also described.
C++, object-oriented programming, and astronomical data models
NASA Technical Reports Server (NTRS)
Farris, A.
1992-01-01
Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
NASA Astrophysics Data System (ADS)
Daly, Aoife; Streeton, Noëlle L. W.
2017-06-01
A technique for non-invasive dendrochronological analysis of oak was developed for archaeological material, using an industrial CT scanner. Since 2013, this experience has been extended within the scope of the research project 'After the Black Death: Painting and Polychrome Sculpture in Norway'. The source material for the project is a collection of late-medieval winged altarpieces, shrines, polychrome sculpture, and fragments from Norwegian churches, which are owned by the Museum of Cultural History, University of Oslo. The majority cannot be sampled, and many are too large to fit into the CT scanner. For these reasons, a combined approach was adopted, utilizing CT scanning where possible, but preceded by an 'exposed-wood' imaging technique. Both non-invasive techniques have yielded reliable results, and CT scanning has confirmed the reliability of the imaging technique alone. This paper presents the analytical methods, along with results from two of the 13 objects under investigation. Results for reliable dates and provenances provide new foundations for historical interpretations.
POPA: A Personality and Object Profiling Assistant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreicer, J.S.
POPA: A Personality and Object Profiling Assistant system utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: (1) to develop a technique that models personality types in a quantitative and organized manner, (2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, (3) to determine the feasibility of quantifying feelings and intuition, and (4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of bruise healing. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables a direct comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
Exploring Characterizations of Learning Object Repositories Using Data Mining Techniques
NASA Astrophysics Data System (ADS)
Segura, Alejandra; Vidal, Christian; Menendez, Victor; Zapata, Alfredo; Prieto, Manuel
Learning object repositories provide a platform for the sharing of Web-based educational resources. As these repositories evolve independently, it is difficult for users to have a clear picture of the kind of contents they give access to. Metadata can be used to automatically extract a characterization of these resources by using machine learning techniques. This paper presents an exploratory study carried out in the contents of four public repositories that uses clustering and association rule mining algorithms to extract characterizations of repository contents. The results of the analysis include potential relationships between different attributes of learning objects that may be useful to gain an understanding of the kind of resources available and eventually develop search mechanisms that consider repository descriptions as a criteria in federated search.
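A minimal sketch of the clustering half of the analysis described above; the metadata attributes and values are illustrative assumptions, not the schema of any actual repository.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy learning-object metadata: (size_kb, interactivity_level, learner_age).
metadata = np.array([
    [120, 1, 10], [90, 1, 11], [2048, 3, 16],
    [1900, 3, 17], [15, 2, 8], [22, 2, 9],
], dtype=float)

# Standardize attributes so no single scale dominates the distance metric.
X = StandardScaler().fit_transform(metadata)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # each cluster is one "kind" of resource in the repository
```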
ERIC Educational Resources Information Center
Henry, Anna E.; Story, Mary
2009-01-01
Objective: To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Design: Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and…
Extending unbiased stereology of brain ultrastructure to three-dimensional volumes
NASA Technical Reports Server (NTRS)
Fiala, J. C.; Harris, K. M.; Koslow, S. H. (Principal Investigator)
2001-01-01
OBJECTIVE: Analysis of brain ultrastructure is needed to reveal how neurons communicate with one another via synapses and how disease processes alter this communication. In the past, such analyses have usually been based on single or paired sections obtained by electron microscopy. Reconstruction from multiple serial sections provides a much needed, richer representation of the three-dimensional organization of the brain. This paper introduces a new reconstruction system and new methods for analyzing in three dimensions the location and ultrastructure of neuronal components, such as synapses, which are distributed non-randomly throughout the brain. DESIGN AND MEASUREMENTS: Volumes are reconstructed by defining transformations that align the entire area of adjacent sections. Whole-field alignment requires rotation, translation, skew, scaling, and second-order nonlinear deformations. Such transformations are implemented by a linear combination of bivariate polynomials. Computer software for generating transformations based on user input is described. Stereological techniques for assessing structural distributions in reconstructed volumes are the unbiased bricking, disector, unbiased ratio, and per-length counting techniques. A new general method, the fractional counter, is also described. This unbiased technique relies on the counting of fractions of objects contained in a test volume. A volume of brain tissue from stratum radiatum of hippocampal area CA1 is reconstructed and analyzed for synaptic density to demonstrate and compare the techniques. RESULTS AND CONCLUSIONS: Reconstruction makes practicable volume-oriented analysis of ultrastructure using such techniques as the unbiased bricking and fractional counter methods. These analysis methods are less sensitive to the section-to-section variations in counts and section thickness, factors that contribute to the inaccuracy of other stereological methods. In addition, volume reconstruction facilitates visualization and modeling of structures and analysis of three-dimensional relationships such as synaptic connectivity.
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
Real-time color image processing for forensic fiber investigations
NASA Astrophysics Data System (ADS)
Paulsson, Nils
1995-09-01
This paper describes a system for automatic fiber debris detection based on color identification. The properties of the system are fast analysis and high selectivity, a necessity when analyzing forensic fiber samples: an ordinary investigation separates the material into well above 100,000 video images to analyze. The system is based on standard techniques, with a CCD camera, a motorized sample table, and an IBM-compatible PC/AT with add-on boards for video frame digitization and stepping-motor control as the main parts. It is possible to operate the instrument at full video rate (25 images/s) with the aid of the HSI (hue-saturation-intensity) color system and software optimization. High selectivity is achieved by separating the analysis into several steps. The first step is fast, direct color identification of objects in the analyzed video images; the second, more complex and time-consuming step analyzes the detected objects to identify single fiber fragments for subsequent analysis with more selective techniques.
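A minimal sketch of the hue-based first step described above, in the HSI/HSV spirit of the system; the target hue range, thresholds, and frame source are illustrative assumptions (a real system would grab 25 frames/s from the camera).

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def detect_fiber_pixels(frame_rgb, hue_range=(0.55, 0.65), min_sat=0.3, min_val=0.2):
    """Return a boolean mask of pixels whose hue falls in the target range."""
    hsv = rgb_to_hsv(frame_rgb.astype(float) / 255.0)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (h >= hue_range[0]) & (h <= hue_range[1]) & (s >= min_sat) & (v >= min_val)

# Stand-in video frame; candidate objects would go on to the slower second step.
frame = (np.random.default_rng(1).random((480, 640, 3)) * 255).astype(np.uint8)
mask = detect_fiber_pixels(frame)
print(f"candidate pixels: {mask.sum()}")
```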
NASA Astrophysics Data System (ADS)
Abualrob, Marwan M. A.; Gnanamalar Sarojini Daniel, Esther
2013-10-01
This article outlines how learning objectives based upon science, technology and society (STS) elements for Palestinian ninth grade science textbooks were identified, which was part of a bigger study to establish an STS foundation in the ninth grade science curriculum in Palestine. First, an initial list of STS elements was determined. Second, using this list, ninth grade science textbooks and curriculum document contents were analyzed. Third, based on this content analysis, a possible list of 71 learning objectives for the integration of STS elements was prepared. This list of learning objectives was refined by using a two-round Delphi technique. The Delphi study was used to rate and to determine the consensus regarding which items (i.e. learning objectives for STS in the ninth grade science textbooks in Palestine) are to be accepted for inclusion. The results revealed that of the initial 71 objectives in round one, 59 objectives within round two had a mean score of 5.683 or higher, which indicated that the learning objectives could be included in the development of STS modules for ninth grade science in Palestine.
NASA Technical Reports Server (NTRS)
Alston, D. W.
1981-01-01
The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the evolved statistical model are considered. The least-squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to remove the quadratic effect in the residuals. The analysis of variance is used to determine the magnitude and effect of the error factor associated with the experimental data.
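A minimal sketch of the order-selection logic described above: fit least-squares polynomials of increasing order and use an F-test on the residual sums of squares to decide whether the extra term is justified. The data are synthetic stand-ins for wind tunnel measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 40)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(0, 0.05, x.size)  # "force" data

def rss(order):
    """Residual sum of squares of a least-squares polynomial fit."""
    coeffs = np.polyfit(x, y, order)
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

for order in (1, 2, 3):
    rss_lo, rss_hi = rss(order), rss(order + 1)
    df_hi = x.size - (order + 2)                # residual dof of the larger model
    F = (rss_lo - rss_hi) / (rss_hi / df_hi)    # one extra parameter tested
    p = 1 - stats.f.cdf(F, 1, df_hi)
    print(f"order {order} -> {order + 1}: F = {F:.1f}, p = {p:.3f}")
```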
Segmentation of Unstructured Datasets
NASA Technical Reports Server (NTRS)
Bhat, Smitha
1996-01-01
Datasets generated by computer simulations and experiments in Computational Fluid Dynamics tend to be extremely large and complex. It is difficult to visualize these datasets using standard techniques like Volume Rendering and Ray Casting. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This thesis explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and from Finite Element Analysis.
Analysis of body form using biostereometrics
NASA Technical Reports Server (NTRS)
1979-01-01
The general objective of the research was to provide the space and life sciences directorate with an improved biostereometric measurement capability. This objective arose from the usefulness of stereophotogrammetric techniques developed during the Apollo and Skylab missions to measure body conformation, surface area, volume, and relative density of astronauts. These noninvasive anthropometric measurements provided invaluable data concerning the physiological, biochemical, and nutritional effects of the space environment upon the human body. The indirect nature of the technique has many advantages over other methods and has potential for many other applications. The stereophotographs contain an enormous amount of data which can later be reexamined should the need arise.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
Thunderstorm monitoring and lightning warning, operational applications of the Safir system
NASA Technical Reports Server (NTRS)
Richard, Philippe
1991-01-01
During the past years a new range of studies has been opened by the application of electromagnetic localization techniques to the field of thunderstorm remote sensing. VHF localization techniques were used in particular for the analysis of lightning discharges and gave access to time-resolved 3-D images of lightning discharges within thunderclouds. The detection and localization techniques developed have been applied to the design of the SAFIR system. The main objective of this development was an operational system capable of assessing and warning in real time of lightning hazards and potential thunderstorm hazards. The SAFIR system's main detection technique is long-range interferometric localization of thunderstorm electromagnetic activity; the system performs the localization of intracloud and cloud-to-ground lightning discharges and the analysis of the characteristics of the activity.
ERIC Educational Resources Information Center
Parks, R. B.
This volume is an explanatory volume referencing the Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department project accomplishments, and it precedes the 16 separately bound attachments. Contained in this publication are: the Introduction; Philosophies and Objectives; First Year Activities; The…
The International Workshop on Genetic Toxicology (IWGT) meets every four years with an objective to reach consensus recommendations on difficult or conflicting approaches to genotoxicity testing based upon practical experience and newly available data and data analysis techniques...
USDA-ARS?s Scientific Manuscript database
The objective of this study was to fill in additional knowledge gaps with respect to the extraction, storage, and analysis of airborne endotoxin, with a specific focus on samples from a dairy production facility. We utilized polycarbonate filters to collect total airborne endotoxins, sonication as ...
Tuan Pham; Julia Jones; Ronald Metoyer; Frederick Colwell
2014-01-01
The study of the diversity of multivariate objects shares common characteristics and goals across disciplines, including ecology and organizational management. Nevertheless, subject-matter experts have adopted somewhat separate diversity concepts and analysis techniques, limiting the potential for sharing and comparing across disciplines. Moreover, while large and...
ERIC Educational Resources Information Center
Blackman, Tim
2008-01-01
Objective: To investigate how smoking cessation services could be more effectively targeted to tackle socioeconomic inequalities in health. Design: Secondary analysis of data from a household interview survey undertaken for Middlesbrough Council in north east England using the technique of Qualitative Comparative Analysis. Setting: Home-based…
A Meta-Analysis of Previous Research on the Treatment of Hyperactivity. Final Report.
ERIC Educational Resources Information Center
White, Karl R.; And Others
Using meta-analysis techniques, the study sought to identify, integrate, and synthesize the literature from 61 articles which review the efficacy of various treatments for hyperactive children. The major objectives were to determine if drugs can be used effectively with hyperactive children, what child and intervention characteristics covary with…
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides the design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the author's techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The author's method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
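As a minimal sketch of the flavor of estimator described above (my own toy construction, not the thesis's algorithm): model the global average load as an AR(1) stochastic process observed through noisy local samples, and apply Bayesian (Kalman) updating. All parameter values here are hypothetical.

```python
import numpy as np

# Hypothetical AR(1) model of the global average load:
# load_t = mu + phi * (load_{t-1} - mu) + process noise.
mu, phi = 0.6, 0.9
q = 0.01   # process-noise variance (load drift between samples)
r = 0.05   # observation-noise variance (local samples are noisy)

def kalman_load_estimate(observations, est0=0.5, var0=1.0):
    """Bayesian (Kalman) estimate of the instantaneous average load."""
    est, var = est0, var0
    history = []
    for z in observations:
        # Predict: propagate the AR(1) load model one step forward.
        est = mu + phi * (est - mu)
        var = phi**2 * var + q
        # Update: fold in the noisy local observation.
        k = var / (var + r)          # Kalman gain
        est = est + k * (z - est)
        var = (1.0 - k) * var
        history.append(est)
    return np.array(history)

rng = np.random.default_rng(1)
true_load = 0.6 + 0.2 * np.sin(np.linspace(0, 6, 200))
samples = true_load + rng.normal(0, np.sqrt(r), size=200)
print(kalman_load_estimate(samples)[-5:])   # smoothed recent load estimates
```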
Multi-object segmentation using coupled nonparametric shape and relative pose priors
NASA Astrophysics Data System (ADS)
Uzunbas, Mustafa Gökhan; Soldea, Octavian; Çetin, Müjdat; Ünal, Gözde; Erçil, Aytül; Unay, Devrim; Ekin, Ahmet; Firat, Zeynep
2009-02-01
We present a new method for multi-object segmentation in a maximum a posteriori estimation framework. Our method is motivated by the observation that neighboring or coupling objects in images generate configurations and co-dependencies which could potentially aid in segmentation if properly exploited. Our approach employs coupled shape and inter-shape pose priors that are computed using training images in a nonparametric multi-variate kernel density estimation framework. The coupled shape prior is obtained by estimating the joint shape distribution of multiple objects and the inter-shape pose priors are modeled via standard moments. Based on such statistical models, we formulate an optimization problem for segmentation, which we solve by an algorithm based on active contours. Our technique provides significant improvements in the segmentation of weakly contrasted objects in a number of applications. In particular for medical image analysis, we use our method to extract brain Basal Ganglia structures, which are members of a complex multi-object system posing a challenging segmentation problem. We also apply our technique to the problem of handwritten character segmentation. Finally, we use our method to segment cars in urban scenes.
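A minimal sketch of the nonparametric (kernel density) prior at the heart of such a method, not the authors' full coupled shape and pose formulation: training shapes are reduced to a few joint descriptors, their joint density is estimated with a Gaussian kernel, and candidate configurations are scored under that prior. The descriptors and values are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical training data: each column holds joint descriptors for two
# coupled objects (two areas and one relative-pose moment).
rng = np.random.default_rng(0)
n = 200
area_a = rng.normal(100.0, 10.0, n)
area_b = 0.5 * area_a + rng.normal(0.0, 5.0, n)   # coupled to object A
offset = rng.normal(20.0, 2.0, n)                  # inter-shape pose term
train = np.vstack([area_a, area_b, offset])        # shape (3, n)

prior = gaussian_kde(train)   # nonparametric multivariate density estimate

# Score two candidate configurations under the learned joint prior;
# the plausible (properly coupled) one receives much higher density.
plausible = np.array([[100.0], [50.0], [20.0]])
implausible = np.array([[100.0], [5.0], [40.0]])
print(prior(plausible), prior(implausible))
```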
A methodology for commonality analysis, with applications to selected space station systems
NASA Technical Reports Server (NTRS)
Thomas, Lawrence Dale
1989-01-01
The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.
SISSY: An example of a multi-threaded, networked, object-oriented databased application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scipioni, B.; Liu, D.; Song, T.
1993-05-01
The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.
3D visualization techniques for the STEREO-mission
NASA Astrophysics Data System (ADS)
Wiegelmann, T.; Podlipnik, B.; Inhester, B.; Feng, L.; Ruan, P.
The forthcoming STEREO mission will observe the Sun from two different viewpoints. We expect about 2 GB of data per day, which calls for suitable data presentation techniques. A key feature of STEREO is that it will provide, for the first time, a 3D view of the Sun and the solar corona. In our normal environment we see objects three-dimensionally because the light from real 3D objects needs different travel times to reach our left and right eyes. As a consequence, each eye sees a slightly different image, which gives us information about the depth of objects and a corresponding 3D impression. Techniques for the 3D visualization of scientific and other data on paper, TV, computer screens, in cinemas, etc., are well known, e.g., the two-colour anaglyph technique, shutter glasses, polarization filters, and head-mounted displays. We discuss the advantages and disadvantages of these techniques and how they can be applied to STEREO data. The 3D visualization techniques are not limited to visual images but can also be used to show the reconstructed coronal magnetic field and the energy and helicity distribution. In advance of STEREO we test the method with data from SOHO, which provides us different viewpoints through the solar rotation. This restricts the analysis to structures which remain stationary for several days; real STEREO data will not be affected by these limitations, however.
Portable XRF and principal component analysis for bill characterization in forensic science.
Appoloni, C R; Melquiades, F L
2014-02-01
Several modern techniques have been applied to prevent counterfeiting of money bills. The objective of this study was to demonstrate the potential of the Portable X-ray Fluorescence (PXRF) technique and the multivariate analysis method of Principal Component Analysis (PCA) for the classification of bills for use in forensic science. Bills of Dollar, Euro and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectra interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA analysis separated the bills into three groups, with subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for the differentiation and the basic pigment composition. PXRF allied to multivariate discriminant methods is a promising technique for the rapid and non-destructive identification of false bills in forensic science.
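A compact sketch of the analytical idea, using synthetic stand-ins for PXRF spectra rather than real measurements (the peak-intensity vectors below are invented for illustration; sklearn is assumed available):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)

# Hypothetical stand-ins for PXRF spectra: 8 elemental peak intensities
# (Ca, Ti, Fe, Cu, Sr, Y, Zr, Pb) per measured bill region.
currencies = {"dollar": [5, 1, 8, 2, 1, 0, 1, 3],
              "euro":   [2, 4, 3, 7, 2, 1, 0, 1],
              "real":   [6, 2, 2, 2, 5, 3, 2, 0]}
X, labels = [], []
for name, mean in currencies.items():
    X.append(rng.normal(mean, 0.5, size=(30, 8)))
    labels += [name] * 30
X = np.vstack(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
for name in currencies:
    grp = scores[[l == name for l in labels]]
    print(name, grp.mean(axis=0).round(2))   # groups separate in PC space
print("explained variance:", pca.explained_variance_ratio_.round(2))
```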
Using decision analysis to choose phosphorus targets for Lake Erie.
Anderson, R M; Hobbs, B F; Koonce, J F; Locci, A B
2001-02-01
Lake Erie water quality has improved dramatically since the degraded conditions of the 1960s. Additional gains could be made, but at the expense of further investment and reductions in fishery productivity. In facing such cross-jurisdictional issues, natural resource managers in Canada and the United States must grapple with conflicting objectives and important uncertainties, while considering the priorities of the public that live in the basin. The techniques and tools of decision analysis have been used successfully to deal with such decision problems in a range of environmental settings, but infrequently in the Great Lakes. The objective of this paper is to illustrate how such techniques might be brought to bear on an important, real decision currently facing Lake Erie resource managers and stakeholders: the choice of new phosphorus loading targets for the lake. The heart of our approach is a systematic elicitation of stakeholder preferences and an investigation of the degree to which different phosphorus-loading policies might satisfy ecosystem objectives. Results show that there are potential benefits to changing the historical policy of reducing phosphorus loads in Lake Erie.
A New Computational Technique for the Generation of Optimised Aircraft Trajectories
NASA Astrophysics Data System (ADS)
Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto
2017-12-01
A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two objectives simultaneously. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
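A minimal sketch of the ɛ-constraint idea on a toy bi-objective problem (not the aircraft dynamics): minimize one objective subject to the other being below a bound ɛ, then sweep ɛ to trace the Pareto front. The paper's adaptive bisection places the ɛ values adaptively; the uniform sweep below is a simplification, and both objective functions are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem standing in for (fuel, heat): both convex in x.
f1 = lambda x: (x[0] - 1.0)**2 + x[1]**2          # e.g., "fuel"
f2 = lambda x: x[0]**2 + (x[1] - 2.0)**2          # e.g., "heat"

def eps_constraint(eps):
    """Minimize f1 subject to f2(x) <= eps (one epsilon-constraint solve)."""
    cons = {"type": "ineq", "fun": lambda x: eps - f2(x)}
    res = minimize(f1, x0=[0.0, 0.0], constraints=[cons])
    return res.x, f1(res.x), f2(res.x)

# Sweeping eps traces out the Pareto front; adaptive bisection would instead
# place eps values where the front needs refinement.
for eps in np.linspace(0.5, 4.5, 5):
    x, v1, v2 = eps_constraint(eps)
    print(f"eps={eps:.2f}  f1={v1:.3f}  f2={v2:.3f}")
```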
Optimization Techniques for Clustering,Connectivity, and Flow Problems in Complex Networks
2012-10-01
…discrete optimization and for analysis of the performance of algorithm portfolios; introducing a metaheuristic framework of variable objective search that… The results of an empirical evaluation of the proposed algorithm are also included. Theoretical analysis of heuristics and the design of new metaheuristic approaches: …analysis of heuristics for inapproximable problems and designing new metaheuristic approaches for the problems of interest; (IV) developing new models…
Researching Mental Health Disorders in the Era of Social Media: Systematic Review
Vadillo, Miguel A; Curcin, Vasa
2017-01-01
Background: Mental illness is quickly becoming one of the most prevalent public health problems worldwide. Social network platforms, where users can express their emotions, feelings, and thoughts, are a valuable source of data for researching mental health, and techniques based on machine learning are increasingly used for this purpose. Objective: The objective of this review was to explore the scope and limits of cutting-edge techniques that researchers are using for predictive analytics in mental health and to review associated issues, such as ethical concerns, in this area of research. Methods: We performed a systematic literature review in March 2017, using keywords to search articles on data mining of social network data in the context of common mental health disorders, published between 2010 and March 8, 2017 in medical and computer science journals. Results: The initial search returned a total of 5386 articles. Following a careful analysis of the titles, abstracts, and main texts, we selected 48 articles for review. We coded the articles according to key characteristics, techniques used for data collection, data preprocessing, feature extraction, feature selection, model construction, and model verification. The most common analytical method was text analysis, with several studies using different flavors of image analysis and social interaction graph analysis. Conclusions: Despite an increasing number of studies investigating mental health issues using social network data, some common problems persist. Assembling large, high-quality datasets of social media users with mental disorders is problematic, not only due to biases associated with the collection methods, but also with regard to managing consent and selecting appropriate analytics techniques. PMID:28663166
Smart Sensor-Based Motion Detection System for Hand Movement Training in Open Surgery.
Sun, Xinyao; Byrns, Simon; Cheng, Irene; Zheng, Bin; Basu, Anup
2017-02-01
We introduce a smart sensor-based motion detection technique for objective measurement and assessment of surgical dexterity among users at different experience levels. The goal is to allow trainees to evaluate their performance based on a reference model shared through communication technology, e.g., the Internet, without the physical presence of an evaluating surgeon. While in the current implementation we used a Leap Motion Controller to obtain motion data for analysis, our technique can be applied to motion data captured by other smart sensors, e.g., OptiTrack. To differentiate motions captured from different participants, measurement and assessment in our approach are achieved using two strategies: (1) low level descriptive statistical analysis, and (2) Hidden Markov Model (HMM) classification. Based on our surgical knot tying task experiment, we can conclude that finger motions generated from users with different surgical dexterity, e.g., expert and novice performers, display differences in path length, number of movements and task completion time. In order to validate the discriminatory ability of HMM for classifying different movement patterns, a non-surgical task was included in our analysis. Experimental results demonstrate that our approach had 100 % accuracy in discriminating between expert and novice performances. Our proposed motion analysis technique applied to open surgical procedures is a promising step towards the development of objective computer-assisted assessment and training systems.
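A hedged sketch of the HMM-based classification strategy described above, using synthetic stand-ins for fingertip motion features rather than Leap Motion data (the feature construction and all parameters are invented; the hmmlearn package is assumed available):

```python
import numpy as np
from hmmlearn import hmm   # pip install hmmlearn

rng = np.random.default_rng(3)

def make_traces(step, noise, n_traces=20, length=150):
    """Synthetic stand-ins for 2-D fingertip motion traces."""
    return [np.cumsum(rng.normal(step, noise, size=(length, 2)), axis=0)
            for _ in range(n_traces)]

expert = make_traces(step=0.05, noise=0.1)    # smooth, economical motion
novice = make_traces(step=0.05, noise=0.4)    # noisier, longer paths

def fit(traces):
    X = np.vstack(traces)
    lengths = [len(t) for t in traces]
    model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=30)
    model.fit(X, lengths)
    return model

m_expert, m_novice = fit(expert), fit(novice)

# Classify a held-out trace by comparing log-likelihoods under each model.
test = make_traces(step=0.05, noise=0.1, n_traces=1)[0]
label = "expert" if m_expert.score(test) > m_novice.score(test) else "novice"
print(label)
```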
Real time 3D scanner: investigations and results
NASA Astrophysics Data System (ADS)
Nouri, Taoufik; Pflug, Leopold
1993-12-01
This article presents a concept for the reconstruction of 3-D objects using non-invasive, touchless techniques. The principle of the method is to project parallel optical interference fringes onto an object and then to record the object from two viewing angles. After appropriate processing, the 3-D object is reconstructed even when it has no plane of symmetry. The 3-D surface data are available immediately in digital form for computer visualization and for analysis software tools. The optical set-up for recording the 3-D object, the 3-D data extraction and treatment, as well as the reconstruction of the 3-D object are reported and commented on. This application is intended for reconstructive/cosmetic surgery, CAD, animation and research purposes.
Analysis of Some Potential Manpower Policies for the All-Volunteer Navy. Final Report.
ERIC Educational Resources Information Center
Battelle, R. Bard; And Others
This report describes an analysis of Navy personnel as a subsystem of the Navy, functioning with the overall objective of maintaining Fleet readiness within the constraints of budget and manpower supply limitations. Manpower utilization and management techniques and options were examined and evaluated for their usefulness to an all volunteer Navy…
ERIC Educational Resources Information Center
Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong
2015-01-01
Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…
USDA-ARS?s Scientific Manuscript database
The objective was to characterize texture properties of raw and cooked broiler fillets (Pectoralis major) with the wooden breast condition (WBC) using the instrumental texture techniques of Meullenet-Owens Razor Shear (MORS) and Texture Profile Analysis (TPA). Deboned (3 h post-mortem) broiler fille...
Interactive tele-radiological segmentation systems for treatment and diagnosis.
Zimeras, S; Gortzis, L G
2012-01-01
Telehealth is the exchange of health information and the provision of health care services through electronic information and communications technology, where participants are separated by geographic, time, social and cultural barriers. The shift of telemedicine from desktop platforms to wireless and mobile technologies is likely to have a significant impact on healthcare in the future. It is therefore crucial to develop a general information exchange e-medical system that enables its users to perform online and offline medical consultations through diagnosis. During the medical diagnosis, image analysis techniques combined with doctors' opinions could be useful for final medical decisions. Quantitative analysis of digital images requires detection and segmentation of the borders of the object of interest. In medical images, segmentation has traditionally been done by human experts. Even with the aid of image processing software (computer-assisted segmentation tools), manual segmentation of 2D and 3D CT images is tedious, time-consuming, and thus impractical, especially in cases where a large number of objects must be specified. Substantial computational and storage requirements become especially acute when object orientation and scale have to be considered. Therefore, automated or semi-automated segmentation techniques are essential if these software applications are ever to gain widespread clinical use. The main purpose of this work is to analyze segmentation techniques for the definition of anatomical structures under telemedical systems.
Meleo, Deborah; Baggi, Luigi; Di Girolamo, Michele; Di Carlo, Fabio; Pecci, Raffaella; Bedini, Rossella
2012-01-01
X-ray micro-tomography (micro-CT) is a miniaturized form of conventional computed axial tomography (CAT) able to investigate small radio-opaque objects at a resolution of a few microns, in a non-destructive, non-invasive, three-dimensional way. Compared to traditional optical and electron microscopy techniques, which provide two-dimensional images, this innovative investigation technology enables three-dimensional analysis of a sample without cutting, coating, or exposing the object to any particular chemical treatment. X-ray micro-tomography matches ideal 3D microscopy features: the possibility of investigating an object in natural conditions and without any preparation or alteration; non-invasive, non-destructive, and sufficiently magnified 3D reconstruction; and reliable measurement of numeric data of the internal structure (morphology, structure and ultra-structure). Hence, this technique has applications in a wide range of fields, not only in medical and odontostomatologic areas, but also in biomedical engineering, materials science, biology, electronics, geology, archaeology, the oil industry, and the semiconductor industry. This study shows possible applications of micro-CT in dental implantology to analyze 3D micro-features of the dental implant-to-abutment interface. Indeed, implant-abutment misfit is known to increase mechanical stress on connection structures and surrounding bone tissue. This condition may cause not only screw preload loss or screw fracture, but also biological issues in peri-implant tissues.
Tamura, Naomi; Terashita, Takayoshi; Ogasawara, Katsuhiko
2014-03-01
In general, it is difficult to objectively evaluate the results of an educational program. The semantic differential (SeD) technique, a methodology used to measure the connotative meaning of objects, words, and concepts, can, however, be applied to the evaluation of students' attitudes. In this study, we aimed to achieve an objective evaluation of the effects of radiological technology education. We therefore investigated the attitude of radiological students using the SeD technique. We focused on X-ray examinations in the field of radiological technology science. Bipolar adjective scales were used for the SeD questionnaire. To create the questionnaire, appropriate adjectives were selected from past reports of X-ray examination practice. The participants were 32 senior students at Hokkaido University at the Division of Radiological Technology at the School of Medicine's Department of Health Sciences. All the participants completed the questionnaire. The study was conducted in early June 2012. Attitudes toward X-ray examination were identified using a factor analysis of 11 adjectives. The factor analysis revealed the following three attitudes: feelings of expectation, responsibility, and resistance. Knowledge regarding the attitudes that students have toward X-ray examination will prove useful for evaluating the effects of educational intervention. In this study, a sampling bias may have occurred due to the small sample size; however, no other biases were observed.
Application and Validation of Workload Assessment Techniques
1993-03-01
…technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three… development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is… operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures
Determining the Optimal Number of Clusters with the Clustergram
NASA Technical Reports Server (NTRS)
Fluegemann, Joseph K.; Davies, Misty D.; Aguirre, Nathan D.
2011-01-01
Cluster analysis aids research in many different fields, from business to biology to aerospace. It consists of using statistical techniques to group objects in large sets of data into meaningful classes. However, this process of ordering data points presents much uncertainty because it involves several steps, many of which are subject to researcher judgment as well as inconsistencies depending on the specific data type and research goals. These steps include the method used to cluster the data, the variables on which the cluster analysis will be operating, the number of resulting clusters, and parts of the interpretation process. In most cases, the number of clusters must be guessed or estimated before employing the clustering method. Many remedies have been proposed, but none is unassailable, and certainly not for all data types. Thus, the aim of current research for better techniques of determining the number of clusters is generally confined to demonstrating that the new technique outperforms other methods for several disparate data types. Our research makes use of a new cluster-number-determination technique based on the clustergram: a graph that shows how the number of objects in the cluster and the cluster mean (the ordinate) change with the number of clusters (the abscissa). We use the features of the clustergram to make the best determination of the cluster-number.
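A minimal clustergram sketch under simplifying assumptions (my own construction, not the authors' implementation): for each candidate k, run k-means and plot each cluster's mean, projected onto the first principal component, against k, with marker size indicating cluster membership. Synthetic data with three latent groups makes the stabilization at k = 3 visible.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
# Synthetic data with three latent groups.
X = np.vstack([rng.normal(c, 0.4, size=(60, 4))
               for c in ([0, 0, 0, 0], [3, 3, 0, 0], [0, 3, 3, 3])])

proj = PCA(n_components=1).fit_transform(X).ravel()  # 1-D axis for means

for k in range(1, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    for c in range(k):
        members = proj[labels == c]
        # One point per cluster: abscissa = k, ordinate = cluster mean,
        # marker size proportional to cluster membership.
        plt.scatter(k, members.mean(), s=4 * len(members), c="steelblue")
plt.xlabel("number of clusters k")
plt.ylabel("cluster mean (first principal component)")
plt.title("clustergram sketch")
plt.show()
```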
Joint Center for Operational Analysis Journal. Volume 12, Issue 3, Winter 2010-2011
2011-01-01
…building. One technique that worked well involved demonstration projects such as greenhouses, center-pivot and drip irrigation, and grain silos… or other professional objectives such that failure is not really verifiable. A common example from foreign assistance project objectives is… professionals. Indeed, there is a long litany of far more egregious and damaging projects wholly planned and executed by development professionals
Methodologies for launcher-payload coupled dynamic analysis
NASA Astrophysics Data System (ADS)
Fransen, S. H. J. A.
2012-06-01
An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, and center of gravity (CoG) accelerations, as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids, and time-integration problems. The aim of this paper is to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.
Correlating Subjective and Objective Sleepiness: Revisiting the Association Using Survival Analysis
Aurora, R. Nisha; Caffo, Brian; Crainiceanu, Ciprian; Punjabi, Naresh M.
2011-01-01
Study Objectives: The Epworth Sleepiness Scale (ESS) and multiple sleep latency test (MSLT) are the most commonly used measures of subjective and objective sleepiness, respectively. The strength of the association between these measures as well as the optimal ESS threshold that indicates objective sleepiness remains a topic of significant interest in the clinical and research arenas. The current investigation sought to: (a) examine the association between the ESS and the average sleep latency from the MSLT using the techniques of survival analysis; (b) determine whether specific patient factors influence the association; (c) examine the utility of each ESS question; and (d) identify the optimal ESS threshold that indicates objective sleepiness. Design: Cross-sectional study. Patients and Settings: Patients (N = 675) referred for polysomnography and MSLT. Measurements and Results: Using techniques of survival analysis, a significant association was noted between the ESS score and the average sleep latency. The adjusted hazard ratios for sleep onset during the MSLT for the ESS quartiles were 1.00 (ESS < 9), 1.32 (ESS: 10–13), 1.85 (ESS: 14-17), and 2.53 (ESS ≥ 18), respectively. The association was independent of several patient factors and was distinct for the 4 naps. Furthermore, most of the ESS questions were individually predictive of the average sleep latency except the tendency to doze off when lying down to rest in the afternoon, which was only predictive in patients with less than a college education. Finally, an ESS score ≥ 13 optimally predicted an average sleep latency < 8 minutes. Conclusions: In contrast to previous reports, the association between the ESS and the average sleep latency is clearly apparent when the data are analyzed by survival analysis, and most of the ESS questions are predictive of objective sleepiness. An ESS score ≥ 13 most effectively predicts objective sleepiness, which is higher than what has typically been used in clinical practice. Given the ease of administering the ESS, it represents a relatively simple and cost-effective method for identifying individuals at risk for daytime sleepiness. Citation: Aurora RN; Caffo B; Crainiceanu C; Punjabi NM. Correlating subjective and objective sleepiness: revisiting the association using survival analysis. SLEEP 2011;34(12):1707-1714. PMID:22131609
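The survival-analysis treatment described above has a standard computational form: sleep latency is the time-to-event, naps without sleep onset are censored at the protocol limit, and the ESS enters as a covariate in a Cox proportional hazards model. The sketch below uses entirely synthetic data and the lifelines package (an assumption, not the authors' software), with a 20-minute censoring limit chosen for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # pip install lifelines

rng = np.random.default_rng(5)
n = 500
ess = rng.integers(0, 24, size=n)

# Synthetic latencies: higher ESS -> shorter latency on average.
latency = rng.exponential(scale=18.0 * np.exp(-0.05 * ess))
observed = latency < 20.0                 # naps end at 20 min -> censoring
latency = np.minimum(latency, 20.0)

df = pd.DataFrame({"latency": latency,
                   "sleep_onset": observed.astype(int),
                   "ess": ess})

cph = CoxPHFitter()
cph.fit(df, duration_col="latency", event_col="sleep_onset")
cph.print_summary()   # hazard ratio for sleep onset per ESS point
```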
Variation objective analyses for cyclone studies
NASA Technical Reports Server (NTRS)
Achtemeier, G. L.; Kidder, S. Q.; Ochs, H. T.
1985-01-01
The objectives were: (1) to develop an objective analysis technique that will maximize the information content of data available from diverse sources, with particular emphasis on the incorporation of observations from satellites with those from more traditional immersion techniques; and (2) to develop a diagnosis of the state of the synoptic scale atmosphere on a much finer scale over a much broader region than is presently possible, to permit studies of the interactions and energy transfers between global, synoptic and regional scale atmospheric processes. The variational objective analysis model consists of the two horizontal momentum equations, the hydrostatic equation, and the integrated continuity equation for a dry hydrostatic atmosphere. Preliminary tests of the model with the SESMAE I data set are underway for 12 GMT 10 April 1979. At this stage the purpose of the analysis is not the diagnosis of atmospheric structures but rather the validation of the model. Model runs for rawinsonde data and with the precision modulus weights set to force most of the adjustment of the wind field to the mass field have produced 90 to 95 percent reductions in the imbalance of the initial data after only four cycles through the Euler-Lagrange equations. Sensitivity tests for linear stability of the 11 Euler-Lagrange equations that make up the VASP Model 1 indicate that there will be a lower limit to the scales of motion that can be resolved by this method. Linear stability criteria are violated where there is large horizontal wind shear near the upper tropospheric jet.
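As a hedged sketch of the general machinery (a generic Sasaki-type functional, not the exact VASP Model 1 equations): a variational objective analysis minimizes a weighted misfit to the observations subject to dynamical constraints,

$$ J[u,v,\phi,\lambda] \;=\; \iint \left[ \alpha\,(u-\tilde u)^2 + \beta\,(v-\tilde v)^2 + \gamma\,(\phi-\tilde\phi)^2 + \lambda\, G(u,v,\phi) \right] dA, $$

where tildes mark observed fields, $\alpha,\beta,\gamma$ are the precision-modulus weights (a large weight forces a small adjustment of that field, which is how the wind field can be made to adjust toward the mass field), $G=0$ stands for the dynamical constraints (here the momentum, hydrostatic, and continuity equations), and $\lambda$ is a Lagrange multiplier. Setting $\delta J = 0$ yields the Euler-Lagrange equations that the model cycles through until the imbalance is removed.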
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective: Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Previously, source imaging techniques have been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods: Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results: Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were approximately 20%, in estimating underlying brain networks in the simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion: Our study indicates that combining source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity). Significance: The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
The Evolution of 3D Microimaging Techniques in Geosciences
NASA Astrophysics Data System (ADS)
Sahagian, D.; Proussevitch, A.
2009-05-01
In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent with the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additionally, new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
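The sphere result mentioned above can be stated as a worked example (standard stereology, assuming a cutting plane at uniformly distributed height $h \in [0,R]$ through a sphere of radius $R$): the circular cross-section has radius $r = \sqrt{R^2 - h^2}$, so

$$ F(r) \;=\; 1 - \frac{\sqrt{R^2 - r^2}}{R}, \qquad f(r) \;=\; \frac{r}{R\,\sqrt{R^2 - r^2}}, \qquad 0 \le r < R, $$

and the density diverges as $r \to R$: most random sections are near-equatorial, which is the analytic basis for unfolding measured 2-D section-size statistics into 3-D size distributions.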
Object oriented studies into artificial space debris
NASA Technical Reports Server (NTRS)
Adamson, J. M.; Marshall, G.
1988-01-01
A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link object-oriented programming, intelligent knowledge-based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object- and rule-based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an object-oriented discrete-event simulation tool built as a domain-specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.
The physical and empirical basis for a specific clear-air turbulence risk index
NASA Technical Reports Server (NTRS)
Keller, J. L.
1985-01-01
An improved operational CAT detection and forecasting technique, the specific clear-air turbulence risk (SCATR) index, is developed and detailed, and it shows some promising results. The improvements seen using hand-analyzed data, which stem from a more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into even more sophisticated objective analysis schemes.
Combined elemental and microstructural analysis of genuine and fake copper-alloy coins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoli, L; Agresti, J; Mascalchi, M
2011-07-31
Innovative noninvasive material analysis techniques are applied to determine archaeometallurgical characteristics of copper-alloy coins from Florence's National Museum of Archaeology. Three supposedly authentic Roman coins and three hypothetically fraudulent imitations are thoroughly investigated using laser-induced plasma spectroscopy and time-of-flight neutron diffraction along with 3D videomicroscopy and electron microscopy. Material analyses are aimed at collecting data allowing for objective discrimination between genuine Roman productions and late fakes. The results show that the mentioned techniques provide quantitative compositional and textural data, which are strictly related to the manufacturing processes and aging of copper alloys.
Temporal Methods to Detect Content-Based Anomalies in Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skryzalin, Jacek; Field, Jr., Richard; Fisher, Andrew N.
Here, we develop a method for time-dependent topic tracking and meme trending in social media. Our objective is to identify time periods whose content differs significantly from normal, and we utilize two techniques to do so. The first is an information-theoretic analysis of the distributions of terms emitted during different periods of time. In the second, we cluster documents from each time period and analyze the tightness of each clustering. We also discuss a method of combining the scores created by each technique, and we provide ample empirical analysis of our methodology on various Twitter datasets.
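A minimal sketch of the first, information-theoretic component (my own illustration of the idea, not the authors' code): compare each period's term distribution to a baseline with the Jensen-Shannon distance and flag periods above a threshold. The corpora and threshold below are hypothetical.

```python
import numpy as np
from collections import Counter
from scipy.spatial.distance import jensenshannon

def term_distribution(docs, vocab):
    counts = Counter(w for d in docs for w in d.split())
    v = np.array([counts[w] for w in vocab], dtype=float)
    return v / v.sum() if v.sum() else np.full(len(vocab), 1 / len(vocab))

# Hypothetical corpora: a baseline period and a candidate period.
baseline = ["traffic weather sports", "weather news sports traffic"]
candidate = ["outage outage failure", "failure outage weather"]

vocab = sorted({w for d in baseline + candidate for w in d.split()})
p = term_distribution(baseline, vocab)
q = term_distribution(candidate, vocab)

# jensenshannon returns the JS *distance* (sqrt of the divergence, base 2 here).
score = jensenshannon(p, q, base=2)
print(f"JS distance = {score:.3f}", "-> anomalous" if score > 0.5 else "-> normal")
```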
Method for high resolution magnetic resonance analysis using magic angle technique
Wind, Robert A.; Hu, Jian Zhi
2003-11-25
A method of performing a magnetic resonance analysis of a biological object that includes placing the biological object in a main magnetic field and in a radio frequency field, the main magnetic field having a static field direction; rotating the biological object at a rotational frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. According to another embodiment, the radio frequency is pulsed to provide a sequence capable of producing a spectrum that is substantially free of spinning sideband peaks.
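The quoted angle of about 54°44' is the magic angle. As a worked note (standard NMR background, not text from the patent): anisotropic interactions such as dipolar coupling and chemical-shift anisotropy scale with the second Legendre polynomial of the angle between the rotation axis and the static field, so spinning about its root averages them toward zero:

$$ P_2(\cos\theta) \;=\; \tfrac12\left(3\cos^2\theta - 1\right) \;=\; 0 \;\Longrightarrow\; \theta_m \;=\; \arccos\frac{1}{\sqrt{3}} \;\approx\; 54.74^\circ \;\approx\; 54^\circ 44'. $$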
Research Reports: 1988 NASA/ASEE Summer Faculty Fellowship Program
NASA Technical Reports Server (NTRS)
Freeman, L. Michael (Editor); Chappell, Charles R. (Editor); Cothran, Ernestine K. (Editor); Karr, Gerald R. (Editor)
1988-01-01
The basic objectives are to further the professional knowledge of qualified engineering and science faculty members; to stimulate an exchange of ideas between participants and NASA; to enrich and refresh the research and teaching activities of the participants' institutions; and to contribute to the research objectives of the NASA centers. Topics addressed include: cryogenics; thunderstorm simulation; computer techniques; computer-assisted instruction; system analysis; weather forecasting; rocket engine design; crystal growth; control systems design; turbine pumps for the Space Shuttle Main Engine; electron mobility; heat transfer predictions; rotor dynamics; mathematical models; computational fluid dynamics; and structural analysis.
Rare event computation in deterministic chaotic systems using genealogical particle analysis
NASA Astrophysics Data System (ADS)
Wouters, J.; Bouchet, F.
2016-09-01
In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
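A minimal fixed-level splitting sketch in the same genealogical particle family, applied to an Ornstein-Uhlenbeck process as in the paper's illustration; the levels, parameters, and restart rule here are illustrative choices of mine, not the authors' algorithm. It estimates the small probability that the process crosses a high threshold before time T by multiplying level-by-level conditional crossing fractions, resampling (cloning) the survivors at each level.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_to_level(x0, t0, level, T, dt=1e-2, theta=1.0, sigma=0.5):
    """Euler-Maruyama for dX = -theta*X dt + sigma dW until X >= level or t >= T."""
    x, t = x0, t0
    while t < T:
        x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= level:
            return x, t, True
    return x, t, False

def splitting_estimate(levels, N=200, T=5.0):
    """Estimate P(X crosses levels[-1] before time T) by multilevel splitting."""
    particles = [(0.0, 0.0)] * N          # (state, time) pairs
    p_hat = 1.0
    for a in levels:
        survivors = [(x, t) for x0, t0 in particles
                     for x, t, crossed in [run_to_level(x0, t0, a, T)] if crossed]
        if not survivors:
            return 0.0
        p_hat *= len(survivors) / N       # conditional crossing probability
        # Genealogical step: resample survivors back up to N particles.
        particles = [survivors[i] for i in rng.integers(0, len(survivors), N)]
    return p_hat

print(splitting_estimate(levels=[0.5, 0.8, 1.1]))
```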
3D Feature Extraction for Unstructured Grids
NASA Technical Reports Server (NTRS)
Silver, Deborah
1996-01-01
Visualization techniques provide tools that help scientists identify observed phenomena in scientific simulation. To be useful, these tools must allow the user to extract regions, classify and visualize them, abstract them for simplified representations, and track their evolution. Object Segmentation provides a technique to extract and quantify regions of interest within these massive datasets. This article explores basic algorithms to extract coherent amorphous regions from two-dimensional and three-dimensional scalar unstructured grids. The techniques are applied to datasets from Computational Fluid Dynamics and those from Finite Element Analysis.
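As a hedged illustration of the extract-and-quantify step: the paper targets unstructured grids, but the structured-grid analogue (threshold the scalar field, label connected components, then quantify each region) is compact enough to sketch with scipy.ndimage. The field and threshold below are synthetic.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# Hypothetical 3-D scalar field with a few smooth "blobs" of interest.
field = ndimage.gaussian_filter(rng.normal(size=(40, 40, 40)), sigma=3)

mask = field > field.mean() + 1.5 * field.std()   # isolate coherent regions
labels, n_regions = ndimage.label(mask)           # connected components

# Quantify each extracted region: voxel count and centroid.
sizes = ndimage.sum(mask, labels, index=range(1, n_regions + 1))
centroids = ndimage.center_of_mass(mask, labels, range(1, n_regions + 1))
for i, (s, c) in enumerate(zip(sizes, centroids), start=1):
    print(f"region {i}: {int(s)} voxels, centroid {np.round(c, 1)}")
```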
MSIX - A general and user-friendly platform for RAM analysis
NASA Astrophysics Data System (ADS)
Pan, Z. J.; Blemel, Peter
The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.
NASA Technical Reports Server (NTRS)
Tholen, David J.; Barucci, M. Antonietta
1989-01-01
The spectral reflectivity of asteroid surfaces over the wavelength range of 0.3 to 1.1 micron can be used to classify these objects into several broad groups with similar spectral characteristics. The three most recently developed taxonomies group the asteroids into 9, 11, or 14 different classes, depending on the technique used to perform the analysis. The distribution of the taxonomic classes shows that darker and redder objects become more dominant at larger heliocentric distances, while the rare asteroid types are found more frequently among the small objects of the planet-crossing population.
Model authoring system for fail safe analysis
NASA Technical Reports Server (NTRS)
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
Exploring novel objective functions for simulating muscle coactivation in the neck.
Mortensen, J; Trkov, M; Merryweather, A
2018-04-11
Musculoskeletal modeling allows for analysis of individual muscles in various situations. However, current techniques to realistically simulate muscle response when significant amounts of intentional coactivation are required are inadequate. This would include stiffening the neck or spine through muscle coactivation in preparation for perturbations or impacts. Muscle coactivation has been modeled previously in the neck and spine using optimization techniques that seek to maximize the joint stiffness by maximizing total muscle activation or muscle force. These approaches have not sought to replicate human response, but rather to explore the possible effects of active muscle. Coactivation remains a challenging feature to include in musculoskeletal models, and may be improved by extracting optimization objective functions from experimental data. However, the components of such an objective function must be known before fitting to experimental data. This study explores the effect of components in several objective functions, in order to recommend components to be used for fitting to experimental data. Four novel approaches to modeling coactivation through optimization techniques are presented, two of which produce greater levels of stiffness than previous techniques. Simulations were performed using OpenSim and MATLAB cooperatively. Results show that maximizing the moment generated by a particular muscle appears analogous to maximizing joint stiffness. The approach of optimizing for maximum moment generated by individual muscles may be a good candidate for developing objective functions that accurately simulate muscle coactivation in complex joints. This new approach will be the focus of future studies with human subjects.
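A toy single-joint illustration of the "maximize one muscle's moment subject to a fixed net joint moment" objective described above, posed as a linear program; this is my own simplification, not the OpenSim/MATLAB pipeline, and all moment arms and forces are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Toy single-joint model: 4 muscles with signed moment arms r (m, agonist
# positive, antagonist negative) and maximum isometric forces f_max (N).
r = np.array([0.03, 0.025, -0.02, -0.015])
f_max = np.array([400.0, 300.0, 350.0, 250.0])
M_net = 5.0   # required net joint moment (N*m)

# Decision variables: activations a in [0, 1].
# Equality constraint: sum_i r_i * f_max_i * a_i = M_net.
A_eq = (r * f_max)[np.newaxis, :]
b_eq = [M_net]

k = 0                            # maximize muscle 0's moment contribution
c = np.zeros(4)
c[k] = -(r * f_max)[k]           # linprog minimizes, so negate

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0.0, 1.0)] * 4)
print("activations:", res.x.round(3))
# Antagonists activate to cancel the surplus agonist moment: coactivation.
```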
Surrogate-based Analysis and Optimization
NASA Technical Reports Server (NTRS)
Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin
2005-01-01
A major challenge to the successful full-scale development of modem aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
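A compact instance of the loop the paper formalizes (a sketch under simplifying assumptions, with a one-dimensional toy objective standing in for a high-fidelity model): sample the expensive function at a small design of experiments, fit a Gaussian-process surrogate, and optimize the cheap surrogate instead.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive(x):
    """Stand-in for a costly high-fidelity simulation."""
    return np.sin(3.0 * x) + 0.5 * x**2

# Design of experiments: a handful of high-fidelity samples.
X_train = np.linspace(-2.0, 2.0, 8).reshape(-1, 1)
y_train = expensive(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                              normalize_y=True).fit(X_train, y_train)

# Optimize the cheap surrogate instead of the expensive model.
surrogate = lambda x: gp.predict(np.atleast_2d(x))[0]
res = minimize(surrogate, x0=np.array([0.0]), bounds=[(-2.0, 2.0)])
print("surrogate minimum at x =", res.x.round(3),
      "| true value there:", expensive(res.x).round(3))
```

In practice the surrogate is refined by adding new high-fidelity samples near promising regions, which is where the design-of-experiments and convergence issues the paper discusses come into play.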
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of the three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae) respectively. After sampling, extrapolating real world population numbers from trap capture data are possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
Multi-objects recognition for distributed intelligent sensor networks
NASA Astrophysics Data System (ADS)
He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.
2008-04-01
This paper proposes an innovative approach to multi-object recognition for homeland security and defense based intelligent sensor networks. Unlike the conventional way of information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military based network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches to fuse different information resources to understand dynamic environments, to support decision making processes, and finally to achieve the goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in the traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects will come with different feature sizes, we propose a feature scaling method to represent each object in the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and such knowledge will be adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
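A hedged sketch of that pipeline: segmented regions of varying size are resampled to a common descriptor dimension (zoom-based rescaling is one plausible realization of the feature-scaling step, not necessarily the authors'), then an SVM is trained on the fixed-length descriptors. The patch generator and class shapes below are entirely synthetic.

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.svm import SVC

rng = np.random.default_rng(11)

def make_region(kind, size):
    """Hypothetical segmented image patch for an object class."""
    patch = rng.normal(0.0, 0.2, size=(size, size))
    if kind == "tank":
        patch[size // 3: 2 * size // 3, :] += 1.0      # broad horizontal mass
    else:  # "soldier"
        patch[:, size // 2 - 1: size // 2 + 1] += 1.0  # thin vertical mass
    return patch

def to_fixed_dim(patch, d=16):
    """Feature scaling: resample any region to a d x d descriptor."""
    factor = (d / patch.shape[0], d / patch.shape[1])
    return zoom(patch, factor).ravel()

X, y = [], []
for _ in range(80):
    kind = rng.choice(["tank", "soldier"])
    size = rng.integers(12, 40)          # objects come in different sizes
    X.append(to_fixed_dim(make_region(kind, size)))
    y.append(kind)

clf = SVC(kernel="rbf").fit(X[:60], y[:60])
print("held-out accuracy:", np.mean(clf.predict(X[60:]) == np.array(y[60:])))
```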
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
Mueller, I. I.
1985-01-01
The current technical objectives for the geodynamics program consist of (1) optimal utilization of laser and Very Long Baseline Interferometry (VLBI) observations for reference frames for geodynamics; (2) utilization of range difference observations in geodynamics; and (3) estimation techniques in crustal deformation analysis.
NASA Astrophysics Data System (ADS)
Campbell, Norah; Deane, Cormac; Murphy, Padraig
2017-07-01
Public perceptions of nanotechnology are shaped by sound in surprising ways. Our analysis of the audiovisual techniques employed by nanotechnology stakeholders shows that well-chosen sounds can help to win public trust, create value and convey the weird reality of objects on the nanoscale.
Digital image analysis techniques for fiber and soil mixtures.
DOT National Transportation Integrated Search
1999-05-01
The objective of image processing is to visually enhance, quantify, and/or statistically evaluate some aspect of an image not readily apparent in its original form. Processed digital image data can be analyzed in numerous ways. In order to summarize ...
DOT National Transportation Integrated Search
2014-04-01
The first objective of this study was to develop procedures for determining bracing forces during bridge construction. : Numerical finite element models and analysis techniques were developed for evaluating brace forces induced by construction loads ...
Transgender Phonosurgery: A Systematic Review and Meta-analysis.
Song, Tara Elena; Jiang, Nancy
2017-05-01
Objectives Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.
On the Optimization of Aerospace Plane Ascent Trajectory
NASA Astrophysics Data System (ADS)
Al-Garni, Ahmed; Kassem, Ayman Hamdy
A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascent trajectory (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis of the hybrid technique compares it with both basic genetic algorithms and particle swarm optimization with respect to convergence and execution time. Genetic algorithm optimization showed better execution-time performance, while particle swarm optimization showed better convergence performance. The hybrid optimization technique, benefiting from both, showed robust overall performance, balancing convergence trends and execution time.
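The following toy sketch shows one plausible GA/PSO hybrid loop on a stand-in quadratic cost; the operator choices and constants are assumptions for illustration, not the authors' exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
cost = lambda x: np.sum(x**2, axis=-1)          # stand-in for trajectory cost

pop = rng.uniform(-5, 5, (40, 2))               # candidate control settings
vel = np.zeros_like(pop)
pbest, pbest_f = pop.copy(), cost(pop)

for _ in range(100):
    gbest = pbest[np.argmin(pbest_f)]
    # PSO step: move particles toward personal and global bests.
    vel = 0.7*vel + 1.5*rng.random(pop.shape)*(pbest - pop) \
                  + 1.5*rng.random(pop.shape)*(gbest - pop)
    pop = pop + vel
    # GA step: arithmetic crossover among the better half, plus mutation.
    order = np.argsort(cost(pop))
    parents = pop[order[:20]]
    alpha = rng.random((20, 1))
    children = alpha*parents + (1 - alpha)*parents[::-1]
    pop[order[20:]] = children + 0.1*rng.normal(size=children.shape)
    # Update personal bests.
    f = cost(pop)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pop[improved], f[improved]

print("best cost:", pbest_f.min())
```

Each generation applies a swarm velocity update to all particles and then replaces the weaker half by recombined offspring, which is the general pattern such hybrids use to trade convergence speed against execution time.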
Directly manipulated free-form deformation image registration.
Tustison, Nicholas J; Avants, Brian B; Gee, James C
2009-03-01
Previous contributions to both the research and open-source software communities detailed a generalization of a fast scalar field fitting technique for cubic B-splines based on the work originally proposed by Lee. One advantage of our proposed generalized B-spline fitting approach is its immediate application to a class of nonrigid registration techniques frequently employed in medical image analysis. Specifically, these registration techniques fall under the rubric of free-form deformation (FFD) approaches, in which the object to be registered is embedded within a B-spline object. The deformation of the B-spline object describes the transformation of the image registration solution. Representative of this class of techniques, and often cited within the relevant community, is the formulation of Rueckert, who employed cubic splines with normalized mutual information to study breast deformation. Similar techniques from various groups provided incremental novelty in the form of disparate explicit regularization terms, as well as the employment of various image metrics and tailored optimization methods. For several algorithms, the underlying gradient-based optimization retained the essential characteristics of Rueckert's original contribution. The contribution we provide in this paper is two-fold: 1) the observation that the generic FFD framework is intrinsically susceptible to problematic energy topographies, and 2) that the standard gradient used in FFD image registration can be modified to a well-understood preconditioned form which substantially improves performance. This is demonstrated with theoretical discussion and comparative evaluation experimentation.
A switched systems approach to image-based estimation
NASA Astrophysics Data System (ADS)
Parikh, Anup
With the advent of technological improvements in imaging systems and computational resources, as well as the development of image-based reconstruction techniques, it is necessary to understand algorithm performance when subject to real-world conditions. Specifically, this dissertation focuses on the stability and performance of a class of image-based observers in the presence of intermittent measurements, caused by, e.g., occlusions, limited FOV, feature tracking losses, communication losses, or finite frame rates. Observers or filters that are exponentially stable under persistent observability may have unbounded error growth during intermittent sensing, even while providing seemingly accurate state estimates. In Chapter 3, dwell time conditions are developed to guarantee state estimation error convergence to an ultimate bound for a class of observers while undergoing measurement loss. Bounds are developed on the unstable growth of the estimation errors during the periods when the object being tracked is not visible. A Lyapunov-based analysis for the switched system is performed to develop an inequality in terms of the duration of time the observer can view the moving object and the duration of time the object is out of the field of view. In Chapter 4, a motion model is used to predict the evolution of the states of the system while the object is not visible. This reduces the growth rate of the bounding function to an exponential and enables the use of traditional switched systems Lyapunov analysis techniques. The stability analysis results in an average dwell time condition to guarantee state error convergence with a known decay rate. In comparison with the results in Chapter 3, the estimation errors converge to zero rather than a ball, with relaxed switching conditions, at the cost of requiring additional information about the motion of the feature. In some applications, a motion model of the object may not be available. Numerous adaptive techniques have been developed to compensate for unknown parameters or functions in system dynamics; however, persistent excitation (PE) conditions are typically required to ensure parameter convergence, i.e., learning. Since the motion model is needed in the predictor, model learning is desired; however, PE is difficult to ensure a priori and infeasible to check online for nonlinear systems. Concurrent learning (CL) techniques have been developed to use recorded data and a relaxed excitation condition to ensure convergence. In CL, excitation is only required for a finite period of time, and the recorded data can be checked to determine if it is sufficiently rich. However, traditional CL requires knowledge of state derivatives, which are typically not measured and require extensive filter design and tuning to develop satisfactory estimates. In Chapter 5 of this dissertation, a novel formulation of CL is developed in terms of an integral (ICL), removing the need to estimate state derivatives while preserving parameter convergence properties. Using ICL, an estimator is developed in Chapter 6 for simultaneously estimating the pose of an object as well as learning a model of its motion for use in a predictor when the object is not visible. A switched systems analysis is provided to demonstrate the stability of the estimation and prediction with learning scheme. Dwell time conditions as well as excitation conditions are developed to ensure estimation errors converge to an arbitrarily small bound.
Experimental results are provided to illustrate the performance of each of the developed estimation schemes. The dissertation concludes with a discussion of the contributions and limitations of the developed techniques, as well as avenues for future extensions.
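As a toy illustration of the dwell-time reasoning (my own construction with made-up rates; the dissertation's actual bounds come from Lyapunov analysis), suppose the estimation error norm contracts at rate lam while the object is visible and grows at rate mu while it is not; the bound shrinks each switching cycle whenever lam*t_on exceeds mu*t_off:

```python
import numpy as np

lam, mu = 2.0, 0.5           # assumed decay/growth rates
t_on, t_off = 1.0, 1.5       # assumed dwell times per switching cycle
e = 1.0                      # initial error-norm bound
for cycle in range(10):
    e *= np.exp(-lam * t_on) # measurement available: error contracts
    e *= np.exp(mu * t_off)  # measurement lost: error expands
    print(f"cycle {cycle}: error bound {e:.4f}")
# Here lam*t_on = 2.0 > mu*t_off = 0.75, so the bound shrinks every cycle.
```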
Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem
NASA Astrophysics Data System (ADS)
Zhang, Caiyun
2015-06-01
Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, 1-m digital aerial photography was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with overall accuracies of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
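A minimal sketch of the preclassification-plus-ensemble step with scikit-learn, using stand-in data in place of the fused hyperspectral/aerial/bathymetry features (soft voting is an assumption; the paper's exact ensemble rule may differ):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for fused object features and 3-class habitat labels.
X, y = make_classification(n_samples=300, n_features=10, n_classes=3,
                           n_informative=6, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("svm", SVC(probability=True, random_state=0)),
                ("knn", KNeighborsClassifier())],
    voting="soft")  # combine the three classifiers' outcomes
ensemble.fit(X, y)
print(ensemble.score(X, y))
```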
Study of pipe thickness loss using a neutron radiography method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohamed, Abdul Aziz; Wahab, Aliff Amiru Bin; Yazid, Hafizal B.
2014-02-12
The purpose of this preliminary work is to study thickness changes in objects using neutron radiography. In doing the project, the radiography technique was studied. The experiment was performed at the NUR-2 facility of the TRIGA research reactor at the Malaysian Nuclear Agency, Malaysia. Test samples of varying materials were used. The samples were radiographed using the direct technique, and radiographic images were recorded on nitrocellulose film. The films obtained were digitized to be processed and analyzed. Digital processing was done on the images using the Isee! software to produce better images for analysis. The thickness changes in the images were measured and compared with the real thicknesses of the objects. From the data collected, the percentage differences between measured and real thickness are below 2%, a very low variation from the original values, thereby verifying the neutron radiography technique used in this project.
A new blood vessel extraction technique using edge enhancement and object classification.
Badsha, Shahriar; Reza, Ahmed Wasif; Tan, Kim Geok; Dimyati, Kaharudin
2013-12-01
Diabetic retinopathy (DR) is increasing progressively, pushing the demand for automatic extraction and classification of disease severity. Blood vessel extraction from the fundus image is a vital and challenging task. Therefore, this paper presents a new, computationally simple, and automatic method to extract the retinal blood vessels. The proposed method comprises several basic image processing techniques, namely edge enhancement by a standard template, noise removal, thresholding, morphological operations, and object classification. The proposed method has been tested on a set of retinal images collected from the DRIVE database, and we have employed robust performance analysis to evaluate the accuracy. The results obtained from this study reveal that the proposed method offers an average accuracy of about 97%, sensitivity of 99%, specificity of 86%, and predictive value of 98%, which is superior to various well-known techniques.
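A rough sketch of the pipeline stages named above (edge enhancement by a template, noise removal, thresholding, morphology, and size-based object classification) on a synthetic image; the kernel and thresholds are placeholders, not the paper's values:

```python
import numpy as np
import cv2

img = np.zeros((128, 128), np.uint8)
cv2.line(img, (10, 10), (110, 100), 80, 2)                        # synthetic "vessel"
img = img + np.random.randint(0, 20, img.shape).astype(np.uint8)  # noise

kernel = np.array([[-1, -1, -1],                  # simple edge-enhancement
                   [-1,  9, -1],                  # template (assumption)
                   [-1, -1, -1]], np.float32)
enhanced = cv2.filter2D(img, -1, kernel)
blurred = cv2.GaussianBlur(enhanced, (3, 3), 0)   # noise removal
_, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

# "Object classification": keep only components large enough to be vessels.
n, lab, stats, _ = cv2.connectedComponentsWithStats(opened)
vessels = np.isin(lab, [i for i in range(1, n)
                        if stats[i, cv2.CC_STAT_AREA] > 30])
print("vessel pixels:", int(vessels.sum()))
```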
Steam generator tubing NDE performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, G.; Welty, C.S. Jr.
1997-02-01
Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube-leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique and of the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g., system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber-optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE and identifies several important issues to be addressed.
Medium-range, objective predictions of thunderstorm location and severity for aviation
NASA Technical Reports Server (NTRS)
Wilson, G. S.; Turner, R. E.
1981-01-01
This paper presents a computerized technique for medium-range (12-48 h) prediction of both the location and severity of thunderstorms utilizing atmospheric predictions from the National Meteorological Center's limited-area fine-mesh model (LFM). A regional-scale analysis scheme is first used to examine the spatial and temporal distributions of forecasted variables associated with the structure and dynamics of mesoscale systems over an area of approximately 10 to the 6th sq km. The final prediction of thunderstorm location and severity is based upon an objective combination of these regionally analyzed variables. Medium-range thunderstorm predictions are presented for the late afternoon period of April 10, 1979, the day of the Wichita Falls, Texas, tornado. Conventional medium-range thunderstorm forecasts, made from observed data, are presented with the case study to demonstrate the possible application of this objective technique in improving 12-48 h thunderstorm forecasts for aviation.
Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009
Soller, David R.
2011-01-01
As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
ERIC Educational Resources Information Center
Plonsky, Luke; Gonulal, Talip
2015-01-01
Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…
A uniform technique for flood frequency analysis.
Thomas, W.O.
1985-01-01
This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
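A brief sketch of the log-Pearson Type III fit by the method of moments, using a made-up annual peak series; SciPy's standardized Pearson III quantile supplies the frequency factor K:

```python
import numpy as np
from scipy import stats

peaks = np.array([1200., 3400., 870., 2100., 5600.,
                  1500., 980., 4300., 2600., 1900.])  # hypothetical peaks
logq = np.log10(peaks)

m, s = logq.mean(), logq.std(ddof=1)
g = stats.skew(logq, bias=False)          # station skew (method of moments)

# 100-year flood: frequency factor K from the standardized Pearson III,
# then Q = 10**(mean + K*std) on the log scale.
K = stats.pearson3.ppf(0.99, g)
q100 = 10 ** (m + K * s)
print(f"estimated 100-year peak discharge: {q100:.0f}")
```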
Information transfer satellite concept study. Volume 2: Technical
NASA Technical Reports Server (NTRS)
Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.
1971-01-01
The ITS concept study is preceded by two requirements studies whose primary objectives are to identify viable demands and to develop the functional requirements associated with these demands. In addition to continuing this basic activity, the ITS concept study objectives are to: (1) develop tools and techniques for planning advanced information transfer satellite communications systems; and (2) select viable systems for further analysis in both their near-term and far-term aspects.
Multiscale Analysis of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C. A.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO, and TRACE have shown us the Sun with amazing clarity but have also cursed us with a greater volume of more complex data than previous missions. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for the analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO, and TRACE images. This work was supported by NASA contract NAS5-00220.
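As an illustration of the multiscale idea, a short sketch of a 2-d wavelet decomposition with PyWavelets, using a random array as a stand-in for an EIT/LASCO/TRACE frame:

```python
import numpy as np
import pywt

image = np.random.rand(256, 256)                 # placeholder image data

coeffs = pywt.wavedec2(image, wavelet="haar", level=3)
approx = coeffs[0]                               # coarsest-scale structure
for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    # Detail energy per scale gives a quantitative multiscale measure.
    energy = np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2)
    print(f"level {lvl}: detail energy {energy:.2f}")
```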
The application of LIBS for the analysis of archaeological ceramic and metal artifacts
NASA Astrophysics Data System (ADS)
Melessanaki, Kristalia; Mateo, Maripaz; Ferrence, Susan C.; Betancourt, Philip P.; Anglos, Demetrios
2002-09-01
A bench-top laser-induced breakdown spectroscopy (LIBS) system has been used in the examination of pottery, jewelry, and metal artifacts found in archaeological excavations in central and eastern Crete, Greece. The objects date from the Middle and Late Minoan periods (ca. 20th-13th century B.C.) through Byzantine and Venetian to Ottoman times (ca. 5th-19th century A.D.). The spectral data indicate the qualitative and often the semi-quantitative elemental composition of the examined materials. In the case of colored glazed ceramics, the identity of pigments was established, while in the case of metal and jewelry analysis, the type of metal or metal alloy used was determined. The analyses demonstrate the potential of the LIBS technique for performing routine, rapid, on-site analysis of archaeological objects, which leads to the quick characterization or screening of different types of objects.
Microstructural characterization of multiphase chocolate using X-ray microtomography.
Frisullo, Pierangelo; Licciardello, Fabio; Muratore, Giuseppe; Del Nobile, Matteo Alessandro
2010-09-01
In this study, X-ray microtomography (μCT) was used for the image analysis of the microstructure of 12 types of Italian aerated chocolate chosen to exhibit variability in cocoa mass content. Appropriate quantitative 3-dimensional parameters describing the microstructure were calculated, for example, the structure thickness (ST), object structure volume ratio (OSVR), and percentage object volume (POV). Chemical analysis was also performed to correlate the microstructural data with the chemical composition of the samples. Correlation between the μCT parameters acquired for the pore microstructure evaluation and the chemical analysis revealed that the sugar crystal content does not influence the pore structure and content; on the other hand, there is a strong correlation between the POV and the sugar content obtained by chemical analysis. The results from this study show that μCT is a suitable technique for the microstructural analysis of confectionery products such as chocolates: not only does it provide an accurate analysis of the pores and microstructure, but the data obtained could also be used to aid in assessing composition and consistency with label specifications. X-ray microtomography is a noninvasive and nondestructive 3-D imaging technique that has several advantages over other methods, including the ability to image low-moisture materials. Given the enormous success of μCT in medical applications, material science, chemical engineering, geology, and biology, it is not surprising that in recent years much attention has been focused on extending this imaging technique to food science as a useful aid in the study of food microstructure. X-ray microtomography provides in-depth information on the microstructure of the food product being tested and therefore a better understanding of its physical structure; from an engineering perspective, knowledge about the microstructure of foods can be used to identify the important processing parameters that affect product quality.
3D shape representation with spatial probabilistic distribution of intrinsic shape keypoints
NASA Astrophysics Data System (ADS)
Ghorpade, Vijaya K.; Checchin, Paul; Malaterre, Laurent; Trassoudaine, Laurent
2017-12-01
The accelerated advancement in modeling, digitizing, and visualizing techniques for 3D shapes has led to an increasing amount of 3D model creation and usage, thanks to 3D sensors which are readily available and easy to use. As a result, determining the similarity between 3D shapes has become consequential and is a fundamental task in shape-based recognition, retrieval, clustering, and classification. Several decades of research in Content-Based Information Retrieval (CBIR) have resulted in diverse techniques for 2D and 3D shape or object classification/retrieval and many benchmark data sets. In this article, a novel technique for 3D shape representation and object classification is proposed based on analysis of the spatial, geometric distribution of 3D keypoints. These distributions capture the intrinsic geometric structure of 3D objects. The result of the approach is a probability distribution function (PDF) produced from the spatial disposition of 3D keypoints, keypoints which are stable on the object surface and invariant to pose changes. Each class/instance of an object can be uniquely represented by a PDF. This shape representation is robust yet conceptually simple, easy to implement, and fast to compute. Both Euclidean and topological spaces on the object's surface are considered in building the PDFs; topology-based geodesic distances between keypoints exploit the non-planar surface properties of the object. The performance of the novel shape signature is tested in terms of object classification accuracy. The classification efficacy of the new shape analysis method is evaluated on a new dataset acquired with a Time-of-Flight camera, and a comparative evaluation against state-of-the-art methods is performed on a standard benchmark dataset. Experimental results demonstrate superior classification performance of the new approach on the RGB-D dataset and depth data.
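A conceptual sketch of such a signature using Euclidean pairwise distances between hypothetical keypoints (the paper also uses geodesic distances, which require the surface mesh):

```python
import numpy as np
from scipy.spatial.distance import pdist

keypoints = np.random.rand(50, 3)        # hypothetical stable 3D keypoints

d = pdist(keypoints)                     # all pairwise Euclidean distances
hist, edges = np.histogram(d, bins=32, density=True)  # PDF estimate
signature = hist / hist.sum()            # normalized shape signature
print(signature)

# Two shapes can then be compared by a distance between their signatures,
# e.g., chi-squared or L1.
```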
Zhang, Yang; Xu, Caiqi; Dong, Shiqui; Shen, Peng; Su, Wei; Zhao, Jinzhong
2016-09-01
To provide an up-to-date assessment of the difference between anatomic double-bundle anterior cruciate ligament (ACL) reconstruction (DB-ACLR) and anatomic single-bundle ACL reconstruction (SB-ACLR). We hypothesized that anatomic SB-ACLR using an independent femoral drilling technique would achieve kinematic stability comparable to that of anatomic DB-ACLR. A comprehensive Internet search was performed to identify all therapeutic trials of anatomic DB-ACLR versus anatomic SB-ACLR. Only clinical studies of Level I and II evidence were included. The comparative outcomes were instrument-measured anterior laxity, Lachman test, pivot shift, clinical outcomes including objective/subjective International Knee Documentation Committee (IKDC) score, Lysholm score, and Tegner activity scale, and complication rates of extension/flexion deficits, graft failure, and early osteoarthritis. Subgroup analyses were performed for femoral tunnel drilling techniques, including independent drilling and transtibial (TT) drilling. Twenty-two clinical trials of 2,261 anatomically ACL-reconstructed patients were included in the meta-analysis. Via the TT drilling technique, anatomic DB-ACLR led to improved instrument-measured anterior laxity with a standard mean difference (SMD) of -0.42 (95% confidence interval [CI] = -0.81 to -0.02), less rotational instability measured by pivot shift (SMD = 2.76, 95% CI = 1.24 to 6.16), and a higher objective IKDC score with an odds ratio (OR) of 2.28 (95% CI = 1.19 to 4.36). Via the independent drilling technique, anatomic DB-ACLR yielded a better pivot shift (SMD = 2.04, 95% CI = 1.36 to 3.05). Anatomic DB-ACLR also revealed statistical significance in subjective IKDC score compared with anatomic SB-ACLR (SMD = 0.27, 95% CI = 0.05 to 0.49). Anatomic DB-ACLR showed better anterior and rotational stability and a higher objective IKDC score than anatomic SB-ACLR via the TT drilling technique. Via the independent drilling technique, however, anatomic DB-ACLR only showed superiority in rotational stability. All clinical function outcomes except subjective IKDC score were not significantly different between anatomic DB-ACLR and SB-ACLR. Level II, meta-analysis of Level I and II studies. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Seventh symposium on systems analysis in forest resources; 1997 May 28-31; Traverse City, MI.
J. Michael Vasievich; Jeremy S. Fried; Larry A. Leefers
2000-01-01
This international symposium included presentations by representatives from government, academic, and private institutions. Topics covered management objectives; information systems: modeling, optimization, simulation and decision support techniques; spatial methods; timber supply; and economic and operational analyses.
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
Structuring an Internal Evaluation Process.
ERIC Educational Resources Information Center
Gordon, Sheila C.; Heinemann, Harry N.
1980-01-01
The design of an internal program evaluation system requires (1) formulation of program, operational, and institutional objectives; (2) establishment of evaluation criteria; (3) choice of data collection and evaluation techniques; (4) analysis of results; and (5) integration of the system into the mainstream of operations. (SK)
Electromagnetic Induction E-Sensor for Underwater UXO Detection
2011-12-01
Statistical discrimination techniques based on model analysis, such as the Time-Domain Three Dipole (TD3D) model, can separate UXO-like objects.
D'Elia, Caio Oliveira; Bitar, Alexandre Carneiro; Castropil, Wagner; Garofo, Antônio Guilherme Padovani; Cantuária, Anita Lopes; Orselli, Maria Isabel Veras; Luques, Isabela Ugo; Duarte, Marcos
2011-01-01
The objective of this study was to describe the methodology of knee rotation analysis using biomechanics laboratory instruments and to present preliminary results from a comparative study on patients who underwent anterior cruciate ligament (ACL) reconstruction using the double-bundle technique. The protocol currently used in our laboratory is described. Three-dimensional kinematic analysis was performed and knee rotation amplitude was measured in eight normal subjects (control group) and 12 patients operated on using the double-bundle technique, by means of three tasks in the biomechanics laboratory. No significant differences between the operated and non-operated sides were shown in relation to the mean amplitudes of gait, gait with change in direction, or gait with change in direction when going down stairs (p > 0.13). The preliminary results did not show any difference for the double-bundle ACL reconstruction technique in relation to the contralateral side or the control group.
Software Process Assessment (SPA)
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.
1994-01-01
NASA's environment mirrors the changes taking place in the nation at large, i.e., workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effect of this change is that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
View subspaces for indexing and retrieval of 3D models
NASA Astrophysics Data System (ADS)
Dutagaci, Helin; Godil, Afzal; Sankur, Bülent; Yemez, Yücel
2010-02-01
View-based indexing schemes for 3D object retrieval are gaining popularity since they provide good retrieval results. These schemes are coherent with the theory that humans recognize objects based on their 2D appearances. The view-based techniques also allow users to search with various queries such as binary images, range images, and even 2D sketches. Previous view-based techniques use classical 2D shape descriptors such as Fourier invariants, Zernike moments, Scale Invariant Feature Transform-based local features, and 2D Digital Fourier Transform coefficients. These methods describe each object independently of the others. In this work, we explore data-driven subspace models, such as Principal Component Analysis, Independent Component Analysis, and Nonnegative Matrix Factorization, to describe the shape information of the views. We treat the depth images obtained from various points on the view sphere as 2D intensity images and train a subspace to extract the inherent structure of the views within a database. We also show the benefit of categorizing shapes according to their eigenvalue spread. Both the shape categorization and data-driven feature set conjectures are tested on the PSB database and compared with competing view-based 3D shape retrieval algorithms.
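A minimal sketch of the PCA variant of this idea, with random arrays standing in for depth images rendered from the view sphere:

```python
import numpy as np
from sklearn.decomposition import PCA

views = np.random.rand(100, 64, 64)          # hypothetical depth images
X = views.reshape(len(views), -1)            # one row per view

pca = PCA(n_components=20).fit(X)            # subspace of the view set
codes = pca.transform(X)                     # low-dimensional descriptors

# A query view (binary image, range image, or sketch) is projected onto
# the same basis and matched by nearest neighbor in coefficient space.
query = pca.transform(np.random.rand(1, 64*64))
nearest = np.argmin(np.linalg.norm(codes - query, axis=1))
print("best-matching stored view:", nearest)
```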
A practical approach to object based requirements analysis
NASA Technical Reports Server (NTRS)
Drew, Daniel W.; Bishop, Michael
1988-01-01
Presented here is an approach developed at the Unisys Houston Operations Division which supports the early identification of objects. This domain-oriented analysis and development concept is based on entity-relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object-oriented design is also promoted by having requirements described in terms of objects. A five-step process is presented by which objects are identified from the requirements to create a problem definition model. This process involves establishing a baseline requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model, and there is a brief discussion of how this approach might be used in a large-scale development effort.
The principles of the Brazilian Unified Health System, studied based on similitude analysis
de Pontes, Ana Paula Munhen; de Oliveira, Denize Cristina; Gomes, Antonio Marcos Tosoli
2014-01-01
Objectives to analyze and compare the incorporation of the ethical-doctrinal and organizational principles into the social representations of the Unified Health System (SUS) among health professionals. Method a study grounded in Social Representations Theory, undertaken with 125 subjects, in eight health institutions in Rio de Janeiro. The free word association technique was applied to the induction term "SUS", the words evoked being analyzed using the techniques of the Vergès matrix and similitude analysis. Results it was identified that the professionals' social representations vary depending on their level of education, and that those with higher education represent a subgroup responsible for the process of representational change identified. This result was confirmed through similitude analysis. Conclusion a process of representational change is ongoing, in which it was ascertained that the professionals incorporated the principles of the SUS into their symbolic constructions. The similitude analysis was shown to be a fruitful technique for research in nursing. PMID:24553704
NASA Astrophysics Data System (ADS)
Schneiderwind, S.; Mason, J.; Wiatr, T.; Papanikolaou, I.; Reicherter, K.
2015-09-01
Two normal faults on the Island of Crete and mainland Greece were studied to create and test an innovative workflow to make palaeoseismic trench logging more objective and to visualise the sedimentary architecture within the trench wall in 3-D. This is achieved by combining classical palaeoseismic trenching techniques with multispectral approaches. A conventional trench log was first compared to the results of iso-cluster analysis of a true-colour photomosaic representing the spectrum of visible light. Passive data collection disadvantages (e.g., illumination) were addressed by complementing the dataset with active near-infrared backscatter signal images from t-LiDAR measurements. The multispectral analysis shows that distinct layers can be identified, and it compares well with the conventional trench log. Accordingly, adjacent stratigraphic units could be distinguished by their particular multispectral composition signatures. Based on the trench log, a 3-D interpretation of GPR data collected on the vertical trench wall was then possible. This is highly beneficial for measuring representative layer thicknesses, displacements, and geometries at depth within the trench wall; thus, misinterpretation due to cutting effects is minimised. Sedimentary feature geometries related to earthquake magnitude can be used to improve the accuracy of seismic hazard assessments. Therefore, this manuscript combines multiparametric approaches and shows: (i) how a 3-D visualisation of palaeoseismic trench stratigraphy and logging can be accomplished by combining t-LiDAR and GPR techniques, and (ii) how a multispectral digital analysis can offer additional advantages and a higher objectivity in the interpretation of palaeoseismic and stratigraphic information. The multispectral datasets are stored, allowing unbiased input for future (re-)investigations.
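A sketch of the unsupervised classification step, with k-means standing in for the iso-cluster (ISODATA-style) algorithm and a synthetic array in place of the photomosaic:

```python
import numpy as np
from sklearn.cluster import KMeans

photo = np.random.randint(0, 255, (120, 200, 3), np.uint8)  # stand-in mosaic
pixels = photo.reshape(-1, 3).astype(float)

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)
classes = km.labels_.reshape(photo.shape[:2])   # one label per pixel

# Each cluster approximates a stratigraphic unit with a distinct spectral
# composition signature; comparing this map with the conventional trench
# log tests the layer distinction.
print(np.bincount(classes.ravel()))
```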
Estimation of surface curvature from full-field shape data using principal component analysis
NASA Astrophysics Data System (ADS)
Sharma, Sameer; Vinuchakravarthy, S.; Subramanian, S. J.
2017-01-01
Three-dimensional digital image correlation (3D-DIC) is a popular image-based experimental technique for estimating the surface shape, displacements, and strains of deforming objects. In this technique, a calibrated stereo rig is used to obtain and stereo-match pairs of images of the object of interest, from which the shapes of the imaged surface are then computed using the calibration parameters of the rig. Displacements are obtained by performing an additional temporal correlation of the shapes obtained at various stages of deformation, and strains by smoothing and numerically differentiating the displacement data. Since strains are of primary importance in solid mechanics, significant efforts have been put into the computation of strains from the measured displacement fields; however, much less attention has been paid to date to the computation of curvature from the measured 3D surfaces. In this work, we address this gap by proposing a new method of computing curvature from full-field shape measurements using principal component analysis (PCA), along the lines of a similar approach recently proposed to measure strains (Grama and Subramanian 2014 Exp. Mech. 54 913-33). PCA is a multivariate analysis tool that is widely used to reveal relationships between a large number of variables, reduce dimensionality, and achieve significant denoising. This technique is applied here to identify dominant principal components in the shape fields measured by 3D-DIC, and these principal components are then differentiated systematically to obtain the first and second fundamental forms used in the curvature calculation. The proposed method is first verified using synthetically generated noisy surfaces and then validated experimentally on real-world objects with known ground-truth curvatures.
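A simplified sketch of the idea for a height field z = f(x, y): denoise with a truncated SVD (a PCA-style low-rank reconstruction), then differentiate to evaluate the mean curvature from the fundamental forms; this illustrates the principle rather than the paper's full procedure:

```python
import numpy as np

x = y = np.linspace(-1, 1, 100)
X, Y = np.meshgrid(x, y)
Z = 0.5*(X**2 + Y**2) + 0.01*np.random.randn(100, 100)  # noisy paraboloid

# PCA-style denoising: keep the dominant singular components of Z.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
Zs = (U[:, :5] * s[:5]) @ Vt[:5]                 # rank-5 reconstruction

h = x[1] - x[0]
fy, fx = np.gradient(Zs, h, h)                   # first derivatives
fyy, fyx = np.gradient(fy, h, h)                 # second derivatives
fxy, fxx = np.gradient(fx, h, h)

# Mean curvature of a Monge patch from the fundamental forms.
H = ((1 + fx**2)*fyy - 2*fx*fy*fxy + (1 + fy**2)*fxx) \
    / (2*(1 + fx**2 + fy**2)**1.5)
print("mean curvature at center ~", H[50, 50])   # ~1 for this paraboloid
```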
Multidisciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
Multi-Disciplinary System Reliability Analysis
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Han, Song
1997-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
A Comparative Study between Universal Eclectic Septoplasty Technique and Cottle
Amaral Neto, Odim Ferreira do; Mizoguchi, Flavio Massao; Freitas, Renato da Silva; Maniglia, João Jairney; Maniglia, Fábio Fabrício; Maniglia, Ricardo Fabrício
2017-01-01
Introduction Since the last century, surgical correction of nasal septum deviation has been steadily improved. The Universal Eclectic Technique was recently reported, and there are still few studies dedicated to addressing this surgical approach. Objective The objective of this study is to compare the results of septal deviation correction achieved using the Universal Eclectic Technique (UET) with those obtained through Cottle's Technique. Methods This is a prospective study with two consecutive case series totaling 90 patients (40 women and 50 men), aged between 18 and 55 years. We divided patients into two groups according to the surgical approach. Fifty-three patients underwent septoplasty through the Universal Eclectic Technique (UET) and thirty-seven patients were submitted to the classical Cottle's septoplasty technique. All patients answered the Nasal Obstruction Symptom Evaluation Scale (NOSE) questionnaire to assess pre- and postoperative nasal obstruction. Results Statistical analysis showed a significantly shorter operating time for the UET group. Nasal edema assessment performed seven days after surgery showed a prevalence of mild edema in the UET group and moderate edema in the Cottle's technique group. In regard to complication rates, UET presented a single case of septal hematoma, while in the Cottle's technique group we observed two cases of severe edema, one case of incapacitating headache, and one complaint of nasal pain. Conclusion The Universal Eclectic Technique (UET) has proven to be a safe and effective surgical technique with faster symptomatic improvement, low complication rates, and reduced surgical time when compared with the classical Cottle's technique. PMID:28680499
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.
1997-12-31
Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and great social, environmental, and economic losses. Annually, according to official data, several dozen large pipeline accidents take place in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development, and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical, and human factors. Mathematical and computer simulations are safe, rather effective, and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of failure occurrence and development, to assess the consequences, and to give recommendations to prevent failures. Besides the investigation of failure cases, numerical simulation techniques play an important role in interpreting the diagnostic results for the objects and in constructing mathematical prognostic simulations of object behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics, and optimization are implemented at the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed at DB#5 for the solution of such tasks. Almost all of them are calibrated on calculations of the simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that over long years of work, a fruitful and effective collaboration of the institute's theoreticians, mathematicians, and experimentalists has been established to solve such tasks.
Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research
Ercius, Peter; Alaidi, Osama; Rames, Matthew J.; ...
2015-06-18
Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both the physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. Here, this review discusses the common basis for 3D characterization and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. Electron tomography produces quantitative 3D reconstructions for the biological and physical sciences from sets of 2D projections acquired at different tilting angles in a transmission electron microscope. Finally, state-of-the-art techniques capable of producing 3D representations of objects such as Pt-Pd core-shell nanoparticles and IgG1 antibody molecules are reviewed.
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
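A toy example of the ANOVA viewpoint on feature detection: test whether pixel means differ across image strips, which flags a step edge in noise (the strip layout and parameters are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
img = rng.normal(100, 5, (64, 64))
img[:, 32:] += 10                                # step edge in the right half

strips = np.array_split(img, 4, axis=1)          # treatment groups
F, p = stats.f_oneway(*[s.ravel() for s in strips])
print(f"F = {F:.1f}, p = {p:.3g}")   # large F: strip means differ,
                                     # i.e., a feature/edge is present
```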
Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy
Tate, Jim; Moens, Luc
2006-01-01
Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. (Figure caption: experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.) PMID:16953310
Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.
Vandenabeele, Peter; Tate, Jim; Moens, Luc
2007-02-01
Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. (Figure caption: experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.)
Islam, Naz Niamul; Hannan, M A; Shareef, Hussain; Mohamed, Azah; Salam, M A
2014-01-01
Power oscillation damping controllers are designed on a linearized system model using heuristic optimization techniques. Selection of the objective function is crucial for damping controller design by optimization algorithms. In this research, a comparative analysis has been carried out to evaluate the effectiveness of popular objective functions used in power system oscillation damping. A two-stage lead-lag damping controller, realized by means of power system stabilizers, is optimized using the differential search algorithm for different objective functions. Linearized model simulations are performed to compare the dominant modes' performance, and the nonlinear model is then used to evaluate the damping performance over power system oscillations. All the simulations are conducted on a two-area four-machine power system to provide a detailed analysis. The results prove that the multiobjective D-shaped function is an effective objective function in terms of moving unstable and lightly damped electromechanical modes into the stable region. The D-shaped function thus ultimately improves overall system damping and concurrently enhances power system reliability.
Information management system study results. Volume 1: IMS study results
NASA Technical Reports Server (NTRS)
1971-01-01
The information management system (IMS) special emphasis task was performed as an adjunct to the modular space station study, with the objective of providing extended depth of analysis and design in selected key areas of the information management system. Specific objectives included: (1) in-depth studies of IMS requirements and design approaches; (2) design and fabrication of breadboard hardware for demonstration and verification of design concepts; (3) provision of a technological base to identify potential design problems and influence long-range planning; (4) development of hardware and techniques to permit long-duration, low-cost, manned space operations; (5) support of SR&T areas where techniques or equipment are considered inadequate; and (6) an overall understanding of the IMS as an integrated component of the space station.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness for drawing conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications and explores their historic and potential further use for air transportation analysis.
Kalal, M; Nugent, K A; Luther-Davies, B
1987-05-01
An interferometric technique which enables simultaneous phase and amplitude imaging of optically transparent objects is discussed with respect to its application for the measurement of spontaneous toroidal magnetic fields generated in laser-produced plasmas. It is shown that this technique can replace the normal independent pair of optical systems (interferometry and shadowgraphy) by one system and use computer image processing to recover both the plasma density and magnetic field information with high accuracy. A fully automatic algorithm for the numerical analysis of the data has been developed and its performance demonstrated for the case of simulated as well as experimental data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalal, M.; Nugent, K.A.; Luther-Davies, B.
1987-05-01
An interferometric technique which enables simultaneous phase and amplitude imaging of optically transparent objects is discussed with respect to its application for the measurement of spontaneous toroidal magnetic fields generated in laser-produced plasmas. It is shown that this technique can replace the normal independent pair of optical systems (interferometry and shadowgraphy) by one system and use computer image processing to recover both the plasma density and magnetic field information with high accuracy. A fully automatic algorithm for the numerical analysis of the data has been developed and its performance demonstrated for the case of simulated as well as experimental data.
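For illustration, the sketch below recovers wrapped phase and fringe amplitude from a synthetic carrier-fringe interferogram using textbook Fourier-transform fringe analysis; this is a generic stand-in, not necessarily the authors' exact algorithm:

```python
import numpy as np

N = 256
x = np.arange(N)
X, Y = np.meshgrid(x, x)
phi = 2*np.pi*((X - 128)**2 + (Y - 128)**2)/3e4  # synthetic object phase
f0 = 0.2                                         # carrier frequency (cyc/px)
igram = 1 + np.cos(2*np.pi*f0*X + phi)           # simulated interferogram

spec = np.fft.fftshift(np.fft.fft2(igram))
cx = N//2 + int(round(f0*N))                     # locate the +1 carrier order
win = np.zeros_like(spec)
win[:, cx-20:cx+20] = 1                          # isolate that sideband
side = np.fft.ifft2(np.fft.ifftshift(spec*win))

amp = 2*np.abs(side)                             # recovered fringe amplitude
wrapped = np.angle(side * np.exp(-2j*np.pi*f0*X))  # carrier-removed phase
print("wrapped phase range:", wrapped.min(), wrapped.max())
```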
15 CFR 292.3 - Technical tools, techniques, practices, and analyses projects.
Code of Federal Regulations, 2011 CFR
2011-01-01
... understanding of existing organizations and resources relevant to the proposed project; adequate linkages and... the governing or managing organization to conduct the proposed activities; qualifications of the... objective. The purpose of these projects is to support the initial development, implementation, and analysis...
Applying the ICF framework to study changes in quality-of-life for youth with chronic conditions
McDougall, Janette; Wright, Virginia; Schmidt, Jonathan; Miller, Linda; Lowry, Karen
2011-01-01
Objective The objective of this paper is to describe how the ICF framework was applied as the foundation for a longitudinal study of changes in quality-of-life (QoL) for youth with chronic conditions. Method This article will describe the study’s aims, methods, measures and data analysis techniques. It will point out how the ICF framework was used—and expanded upon—to provide a model for studying the impact of factors on changes in QoL for youth with chronic conditions. Further, it will describe the instruments that were chosen to measure the components of the ICF framework and the data analysis techniques that will be used to examine the impact of factors on changes in youths’ QoL. Conclusions Qualitative and longitudinal designs for studying QoL based on the ICF framework can be useful for unraveling the complex ongoing inter-relationships among functioning, contextual factors and individuals’ perceptions of their QoL. PMID:21034288
Near-infrared imaging spectroscopy for counterfeit drug detection
NASA Astrophysics Data System (ADS)
Arnold, Thomas; De Biasio, Martin; Leitner, Raimund
2011-06-01
Pharmaceutical counterfeiting is a significant issue in the healthcare community as well as for the pharmaceutical industry worldwide. The use of counterfeit medicines can result in treatment failure or even death. A rapid screening technique such as near infrared (NIR) spectroscopy could aid in the search for and identification of counterfeit drugs. This work presents a comparison of two laboratory NIR imaging systems and the chemometric analysis of the acquired spectroscopic image data. The first imaging system utilizes a NIR liquid crystal tuneable filter and is designed for the investigation of stationary objects. The second imaging system utilizes a NIR imaging spectrograph and is designed for the fast analysis of moving objects on a conveyor belt. Several drugs in the form of tablets and capsules were analyzed. Spectral unmixing techniques were applied to the mixed reflectance spectra to identify constituent parts of the investigated drugs. The results show that NIR spectroscopic imaging can be used for contact-less detection and identification of a variety of counterfeit drugs.
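To make the unmixing step concrete: the abstract does not give the authors' algorithm, so the following is a minimal, hypothetical sketch of linear spectral unmixing with non-negative least squares, assuming a small library of known constituent (endmember) spectra. All arrays and names are illustrative.

```python
# Hedged sketch of linear spectral unmixing (not the authors' code).
# Assumes a measured pixel spectrum is a non-negative mixture of known
# constituent spectra: y ≈ E @ a with a >= 0.
import numpy as np
from scipy.optimize import nnls

def unmix(pixel_spectrum, endmembers):
    """Estimate endmember abundances for one measured NIR spectrum.

    pixel_spectrum : (n_bands,) measured reflectance
    endmembers     : (n_bands, n_endmembers) library of pure spectra
    """
    abundances, _residual = nnls(endmembers, pixel_spectrum)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances  # fractional abundances

# Toy example: two hypothetical constituents sampled over five bands.
E = np.array([[0.9, 0.1], [0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.2, 0.8]])
y = 0.7 * E[:, 0] + 0.3 * E[:, 1]
print(unmix(y, E))  # approximately [0.7, 0.3]
```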
Applying object-oriented software engineering at the BaBar collaboration
NASA Astrophysics Data System (ADS)
Jacobsen, Bob; BaBar Collaboration Reconstruction Software Group
1997-02-01
The BaBar experiment at SLAC will start taking data in 1999. We are attempting to build its reconstruction software using good software engineering practices, including the use of object-oriented technology. We summarize our experience to date with analysis and design activities, training, CASE and documentation tools, C++ programming practice and similar topics. The emphasis is on the practical issues of simultaneously introducing new techniques to a large collaboration while under a deadline for system delivery.
NASA Astrophysics Data System (ADS)
Wang, Min; Cui, Qi; Sun, Yujie; Wang, Qiao
2018-07-01
In object-based image analysis (OBIA), object classification performance is jointly determined by image segmentation, sample or rule setting, and classifiers. Typically, as a crucial step to obtain object primitives, image segmentation quality significantly influences subsequent feature extraction and analyses. By contrast, template matching extracts specific objects from images and prevents shape defects caused by image segmentation. However, creating or editing templates is tedious and sometimes results in incomplete or inaccurate templates. In this study, we combine OBIA and template matching techniques to address these problems and aim for accurate photovoltaic panel (PVP) extraction from very high-resolution (VHR) aerial imagery. The proposed method is based on the previously proposed region-line primitive association framework, in which complementary information between region (segment) and line (straight line) primitives is utilized to achieve more powerful performance than routine OBIA. Several novel concepts, including the mutual fitting ratio and the best-fitting template based on region-line primitive association analyses, are proposed. An automatic template generation and matching method for PVP extraction from VHR imagery is designed for concept and model validation. Results show that the proposed method can successfully extract PVPs without any user-specified matching template or training sample. High user independency and accuracy are the main characteristics of the proposed method in comparison with routine OBIA and template matching techniques.
Multiscale Image Processing of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun yet we have not improved our analysis techniques. The standard techniques used for the analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods may be suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by observers' eyes and brains. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.
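As one hedged illustration of the multiresolution-support idea mentioned above, the sketch below decomposes an image with a 2-D discrete wavelet transform (PyWavelets) and keeps only coefficients above a noise-scaled threshold; the curvelet variant is not shown, and the robust noise estimate and 3-sigma rule are assumptions, not the authors' exact procedure.

```python
# Hedged sketch: 2-D wavelet decomposition of an image and a boolean
# "multiresolution support" of coefficients exceeding a noise-scaled threshold.
import numpy as np
import pywt

def multiresolution_support(image, wavelet="haar", levels=4, k=3.0):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    support = []
    for detail in coeffs[1:]:                           # (cH, cV, cD) at each scale
        sigma = np.median(np.abs(detail[2])) / 0.6745   # noise from diagonal band
        support.append(tuple(np.abs(d) > k * sigma for d in detail))
    return support                                      # masks of significant coefficients

image = np.random.rand(256, 256)                        # stand-in for an EIT/LASCO frame
masks = multiresolution_support(image)
print([m[0].shape for m in masks])
```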
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.
2015-01-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242
NASA Technical Reports Server (NTRS)
Hsieh, Shang-Hsien
1993-01-01
The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.
ERIC Educational Resources Information Center
Bagaiolo, Leila F.; Mari, Jair de J.; Bordini, Daniela; Ribeiro, Tatiane C.; Martone, Maria Carolina C.; Caetano, Sheila C.; Brunoni, Decio; Brentani, Helena; Paula, Cristiane S.
2017-01-01
Video modeling using applied behavior analysis techniques is one of the most promising and cost-effective ways to improve social skills for parents of children with autism spectrum disorder. The main objectives were: (1) to elaborate/describe videos to improve eye contact and joint attention, and to decrease disruptive behaviors of autism spectrum…
Multivariate Quantitative Chemical Analysis
NASA Technical Reports Server (NTRS)
Kinchen, David G.; Capezza, Mary
1995-01-01
Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
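The brief does not name the calibration method, so the following is a minimal stand-in: classical least squares on two pure-component spectra to recover mixture proportions. The spectra and names here are invented for illustration.

```python
# Stand-in classical least-squares calibration (the brief's exact
# multivariate method is unspecified): measured ≈ fa*pure_a + fb*pure_b.
import numpy as np

def component_fractions(measured, pure_a, pure_b):
    basis = np.column_stack([pure_a, pure_b])
    (fa, fb), *_ = np.linalg.lstsq(basis, measured, rcond=None)
    total = fa + fb
    return fa / total, fb / total

pure_a = np.array([1.0, 0.8, 0.2, 0.1])   # invented component spectra
pure_b = np.array([0.1, 0.3, 0.9, 1.0])
mix = 0.6 * pure_a + 0.4 * pure_b
print(component_fractions(mix, pure_a, pure_b))   # approximately (0.6, 0.4)
```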
Parallel object-oriented, denoising system using wavelet multiresolution analysis
Kamath, Chandrika; Baldwin, Chuck H.; Fodor, Imola K.; Tang, Nu A.
2005-04-12
The present invention provides a data de-noising system utilizing multiple processors and wavelet denoising techniques. Data is read and displayed in different formats. The data is partitioned into regions and the regions are distributed onto the processors. Communication requirements among the processors are determined according to the wavelet denoising technique and the partitioning of the data. The data is transformed onto different multiresolution levels with the wavelet transform according to the wavelet denoising technique and the communication requirements, with the transformed data containing the wavelet coefficients. The denoised data is then transformed back into its original format for reading and display.
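A serial sketch of the denoising step that the patent distributes across processors: forward wavelet transform, soft-threshold the detail coefficients, inverse transform. The partitioning and inter-processor communication the claims describe are omitted, and the universal threshold used here is an assumption, not the patent's rule.

```python
# Serial sketch of the core wavelet-denoising step (illustrative only).
import numpy as np
import pywt

def wavelet_denoise(data, wavelet="db4", level=3):
    coeffs = pywt.wavedec(data, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(data)))      # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
clean = wavelet_denoise(noisy)
print(clean.shape)
```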
Use of nuclear techniques to determine the fill of found unexploded ordnance.
Steward, Scott; Forsht, Denice
2005-01-01
The PELAN is a man-portable device that uses pulsed neutrons to interrogate objects in order to determine their filler. The neutrons initiate several types of nuclear reactions within the object under scrutiny, which result in the formation of gamma rays. The energy of the resulting gamma rays provides information about the elements (carbon, hydrogen, oxygen, nitrogen) contained within the object; in addition the number of gamma rays detected provides information about how much of each element is present. An analysis of the elements present and their ratios to one another allows for identification of the filler material.
Ur Rehman, Yasar Abbas; Tariq, Muhammad; Khan, Omar Usman
2015-01-01
Object localization plays a key role in many popular applications of Wireless Multimedia Sensor Networks (WMSN) and, as a result, it has acquired a significant status for the research community. A significant body of research performs this task without considering node orientation, object geometry and environmental variations. As a result, the localized object does not reflect real-world scenarios. In this paper, a novel object localization scheme for WMSN has been proposed that utilizes range-free localization, computer vision, and principal component analysis based algorithms. The proposed approach provides the best possible approximation of the distance between a WMSN sink and an object, and the orientation of the object, using image-based information. Simulation results report 99% efficiency and an error ratio of 0.01 (around 1 ft) when compared to other popular techniques. PMID:26528919
Cameli, Matteo; Ciccone, Marco M; Maiello, Maria; Modesti, Pietro A; Muiesan, Maria L; Scicchitano, Pietro; Novo, Salvatore; Palmiero, Pasquale; Saba, Pier S; Pedrinelli, Roberto
2016-05-01
Speckle tracking echocardiography (STE) is an imaging technique applied to the analysis of left atrial function. STE provides a non-Doppler, angle-independent and objective quantification of left atrial myocardial deformation. Data regarding feasibility, accuracy and clinical applications of left atrial strain are rapidly gathering. This review describes the fundamental concepts of left atrial STE, illustrates its pathophysiological background and discusses its emerging role in systemic arterial hypertension.
Detection of Glaucoma Using Image Processing Techniques: A Critique.
Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi
2018-01-01
The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.
Microextraction techniques combined with capillary electrophoresis in bioanalysis.
Kohler, Isabelle; Schappler, Julie; Rudaz, Serge
2013-01-01
Over the past two decades, many environmentally sustainable sample-preparation techniques have been proposed, with the objective of reducing the use of toxic organic solvents or substituting these with environmentally friendly alternatives. Microextraction techniques (MEs), in which only a small amount of organic solvent is used, have several advantages, including reduced sample volume, analysis time, and operating costs. Thus, MEs are well adapted in bioanalysis, in which sample preparation is mandatory because of the complexity of a sample that is available in small quantities (mL or even μL only). Capillary electrophoresis (CE) is a powerful and efficient separation technique in which no organic solvents are required for analysis. Combination of CE with MEs is regarded as a very attractive environmentally sustainable analytical tool, and numerous applications have been reported over the last few decades for bioanalysis of low-molecular-weight compounds or for peptide analysis. In this paper we review the use of MEs combined with CE in bioanalysis. The review is divided into two sections: liquid and solid-based MEs. A brief practical and theoretical description of each ME is given, and the techniques are illustrated by relevant applications.
TH-EF-BRC-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2016-06-15
This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis: Process mapping, Failure-Modes and Effects Analysis and fault-tree analysis will be introduced with a 5 minute refresher presentation and each presentation will be followed by a 30 minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.
Objective Assessment of Patient Inhaler User Technique Using an Audio-Based Classification Approach.
Taylor, Terence E; Zigel, Yaniv; Egan, Clarice; Hughes, Fintan; Costello, Richard W; Reilly, Richard B
2018-02-01
Many patients make critical user technique errors when using pressurised metered dose inhalers (pMDIs) which reduce the clinical efficacy of respiratory medication. Such critical errors include poor actuation coordination (poor timing of medication release during inhalation) and inhaling too fast (peak inspiratory flow rate over 90 L/min). Here, we present a novel audio-based method that objectively assesses patient pMDI user technique. The Inhaler Compliance Assessment device was employed to record inhaler audio signals from 62 respiratory patients as they used a pMDI with an In-Check Flo-Tone device attached to the inhaler mouthpiece. Using a quadratic discriminant analysis approach, the audio-based method generated a total frame-by-frame accuracy of 88.2% in classifying sound events (actuation, inhalation and exhalation). The audio-based method estimated the peak inspiratory flow rate and volume of inhalations with an accuracy of 88.2% and 83.94% respectively. It was detected that 89% of patients made at least one critical user technique error even after tuition from an expert clinical reviewer. This method provides a more clinically accurate assessment of patient inhaler user technique than standard checklist methods.
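A hedged sketch of the classification stage: quadratic discriminant analysis over per-frame audio features, as the abstract describes, but with random placeholder features and labels standing in for the authors' acoustic feature set.

```python
# Hedged sketch: QDA over per-frame audio features (placeholder data).
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 8))        # rows = audio frames, cols = features
y_train = rng.integers(0, 3, size=300)     # 0=actuation, 1=inhalation, 2=exhalation

qda = QuadraticDiscriminantAnalysis()
qda.fit(X_train, y_train)

X_new = rng.normal(size=(10, 8))           # frames from a new recording
print(qda.predict(X_new))                  # per-frame event labels
```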
NASA Technical Reports Server (NTRS)
Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.
1996-01-01
The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques.
A Study on Re-entry Predictions of Uncontrolled Space Objects for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Choi, Eun-Jung; Cho, Sungki; Lee, Deok-Jin; Kim, Siwoo; Jo, Jung Hyun
2017-12-01
The key risk analysis technologies for the re-entry of space objects into Earth's atmosphere are divided into four categories: cataloguing and databases of the re-entry of space objects, lifetime and re-entry trajectory predictions, break-up models after re-entry and multiple debris distribution predictions, and ground impact probability models. In this study, we focused on re-entry prediction, including orbital lifetime assessments, for space situational awareness systems. Re-entry predictions are very difficult and are affected by various sources of uncertainty. In particular, during uncontrolled re-entry, large spacecraft may break into several pieces of debris, and the surviving fragments can be a significant hazard for persons and properties on the ground. In recent years, specific methods and procedures have been developed to provide clear information for predicting and analyzing the re-entry of space objects and for ground-risk assessments. Representative tools include the object reentry survival analysis tool (ORSAT) and debris assessment software (DAS) developed by the National Aeronautics and Space Administration (NASA), the spacecraft atmospheric re-entry and aerothermal break-up (SCARAB) and debris risk assessment and mitigation analysis (DRAMA) tools developed by the European Space Agency (ESA), and the semi-analytic tool for end of life analysis (STELA) developed by the Centre National d'Etudes Spatiales (CNES). In this study, various surveys of existing re-entry space objects are reviewed, and an efficient re-entry prediction technique is suggested based on STELA, the life-cycle analysis tool for satellites, and DRAMA, a re-entry analysis tool. To verify the proposed method, the re-entry of the Tiangong-1 Space Lab, which is expected to re-enter Earth's atmosphere shortly, was simulated. Eventually, these results will provide a basis for space situational awareness risk analyses of the re-entry of space objects.
Three-dimensional reconstruction with x-ray shape-from-silhouette
NASA Astrophysics Data System (ADS)
Simioni, E.; Ratti, F.; Calliari, I.; Poletto, L.
2010-09-01
In the field of restoration of ancient handworks, X-ray tomography is a powerful method for reconstructing the internal structure of an object in a non-invasive way. In some cases, such as small objects fully realized in hard metals and completely hidden by clay or products of oxidation, tomography, although necessary to obtain the 3D appearance of the object, does not give any additional information on its internal monolithic structure. We present here the application of the shape-from-silhouette technique to X-ray images to reconstruct the 3D profile of handworks. The acquisition technique is similar to tomography, since several X-ray images are taken while the object is rotated. Some reference points are placed on a structure co-rotating with the object and are acquired in the images for calibration and registration. The shape-from-silhouette algorithm finally gives the 3D appearance of the handwork. We present the analysis of a tin pendant of the VI-VIII century B.C. (Venetian area) completely hidden by solid ground. The 3D reconstruction shows, surprisingly, that the pendant is a very elaborate piece, with two embraced figures that were completely invisible before restoration.
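A toy sketch of the shape-from-silhouette (visual hull) idea under an orthographic-projection assumption: a voxel survives only if it projects inside the silhouette in every rotated view. The real system calibrates each projection from the co-rotating reference points; the silhouette function below is a placeholder for a thresholded X-ray image.

```python
# Toy visual-hull carving with an orthographic projection (illustrative only).
import numpy as np

n = 64
grid = np.ones((n, n, n), dtype=bool)                    # start from a solid block
coords = np.indices((n, n, n)).reshape(3, -1).T - n / 2  # voxel centers about origin

def silhouette_at(angle):
    """Hypothetical binary silhouette (rows = v, cols = u) for one view."""
    sil = np.zeros((n, n), dtype=bool)
    sil[16:48, 16:48] = True
    return sil

for angle in np.linspace(0, np.pi, 18, endpoint=False):
    c, s = np.cos(angle), np.sin(angle)
    u = (c * coords[:, 0] + s * coords[:, 1] + n / 2).astype(int).clip(0, n - 1)
    v = (coords[:, 2] + n / 2).astype(int).clip(0, n - 1)
    keep = silhouette_at(angle)[v, u]                    # voxel projects inside silhouette?
    grid &= keep.reshape(n, n, n)                        # carve away everything outside

print(grid.sum(), "voxels remain in the visual hull")
```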
Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis
NASA Astrophysics Data System (ADS)
George, E.; Chan, K.
2012-09-01
Several relationships are developed relating object size, initial covariance and range at closest approach to probability of collision. These relationships address the following questions:
- Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)?
- Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
- Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
- Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present-day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1x10^-6. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element-set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals that the screening volumes for small objects are much larger than needed, while the screening volumes for pairs of large objects may be inadequate. These relationships may also form the basis of an important metric for catalog maintenance by defining the maximum allowable covariance size for effective conjunction analysis. The application of these techniques promises to greatly improve the efficiency and completeness of conjunction analysis.
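The paper derives closed-form relationships; as a purely numerical illustration of the quantity involved, the sketch below evaluates the 2-D encounter-plane probability of collision by integrating a diagonal-covariance Gaussian over the combined hard-body circle, then scans miss distance for the maximum Pc a given covariance permits. Units and values are illustrative assumptions.

```python
# Numerical illustration of the encounter-plane Pc (not the paper's results).
import numpy as np
from scipy import integrate

def pc(miss, radius, sx, sy):
    """Pc for miss distance `miss` (m) along x, combined hard-body radius
    `radius` (m), and 1-sigma covariance axes sx, sy (m)."""
    def density(y, x):
        return np.exp(-0.5 * (((x - miss) / sx) ** 2 + (y / sy) ** 2)) / (2 * np.pi * sx * sy)
    val, _ = integrate.dblquad(
        density, -radius, radius,
        lambda x: -np.sqrt(max(radius**2 - x**2, 0.0)),
        lambda x: np.sqrt(max(radius**2 - x**2, 0.0)))
    return val

# Scan miss distance for the maximum Pc a 1 km x 3 km covariance permits
# with a 10 m combined hard-body radius.
misses = np.linspace(0.0, 5000.0, 51)
print(max(pc(m, 10.0, 1000.0, 3000.0) for m in misses))
```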
NASA Astrophysics Data System (ADS)
Levan, P.
2010-09-01
Geosynchronous objects appear as unresolved blurs even when observed with the largest ground-based telescopes. Due to the lack of any spatial detail, two or more objects appearing at similar brightness levels within the spectral bandpass in which they are observed are difficult to distinguish. Observing a changing pattern of such objects from one time epoch to another showcases the deficiencies in associating individual objects before and after the configuration change. This paper explores solutions to this deficiency in the form of spectral analyses (developed under small business innovative research) and phase curve analyses. The extension of the technique to phase curves proves to be a powerful new capability.
NASA Astrophysics Data System (ADS)
Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.
2012-04-01
Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses the object's color (spectral information), size, texture, shape and occurrence relative to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels together into objects, and next uses a wide range of object properties to classify the objects or to extract object properties from the image. Significant advances and improvements in image analysis and interpretation have been made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at Ghent University after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that have been worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping and land cover change to feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected and are presented in this special issue. Selection was done on the basis of quality and topic of the studies. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012, where we hope to welcome even more scientists working in the field of GEOBIA.
Instrumental texture characteristics of broiler pectoralis major with the woody breast condition
USDA-ARS?s Scientific Manuscript database
The objective was to characterize texture properties of raw and cooked broiler fillets (pectoralis major) with the woody breast condition (WBC) using instrumental texture techniques Meullenet-Owens Razor Shear (MORS) and texture profile analysis (TPA). Deboned (3 h postmortem) broiler fillets were c...
Latin-square three-dimensional gage master
Jones, L.
1981-05-12
A gage master for coordinate measuring machines has an n×n array of objects distributed in the Z coordinate utilizing the concept of a Latin square experimental design. Using analysis of variance techniques, the invention may be used to identify sources of error in machine geometry and quantify machine accuracy.
Latin square three dimensional gage master
Jones, Lynn L.
1982-01-01
A gage master for coordinate measuring machines has an n×n array of objects distributed in the Z coordinate utilizing the concept of a Latin square experimental design. Using analysis of variance techniques, the invention may be used to identify sources of error in machine geometry and quantify machine accuracy.
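A small illustration of the analysis-of-variance step on a synthetic 4×4 Latin square, separating row, column, and object (letter) effects as the patent describes; the measurement model below is invented.

```python
# Synthetic 4x4 Latin square ANOVA: each letter appears once per row and
# column, so row, column, and object effects can be separated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

square = [["A", "B", "C", "D"],
          ["B", "C", "D", "A"],
          ["C", "D", "A", "B"],
          ["D", "A", "B", "C"]]
rng = np.random.default_rng(1)
records = []
for i, row in enumerate(square):
    for j, letter in enumerate(row):
        # Invented measurement model: small row/column geometry errors plus noise.
        error = 0.10 * i - 0.05 * j + rng.normal(scale=0.02)
        records.append({"row": i, "col": j, "letter": letter, "error": error})
df = pd.DataFrame(records)

model = ols("error ~ C(row) + C(col) + C(letter)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # variance attributed to each source
```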
A Primer on Simulation and Gaming.
ERIC Educational Resources Information Center
Barton, Richard F.
In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…
Rismanchian, Farhood; Lee, Young Hoon
2017-07-01
This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
Adaptive Morphological Feature-Based Object Classifier for a Color Imaging System
NASA Technical Reports Server (NTRS)
McDowell, Mark; Gray, Elizabeth
2009-01-01
Utilizing a Compact Color Microscope Imaging System (CCMIS), a unique algorithm has been developed that combines human intelligence with machine vision techniques to produce an autonomous microscope tool for biomedical, industrial, and space applications. The technique is based on an adaptive, morphological, feature-based mapping function comprising 24 mutually inclusive feature metrics that are used to characterize complex cells/objects derived from color image analysis. The features include:
- Area (total number of non-background pixels inside and including the perimeter)
- Bounding Box (smallest rectangle that bounds an object)
- centerX, centerY (coordinates of the intensity-weighted center of mass of an entire object or multi-object blob)
- Circumference (a measure of circumference that accounts for neighboring pixels joined diagonally, a longer distance than horizontally or vertically joined pixels)
- Elongation (a measure of particle elongation given as a number between 0 and 1; if equal to 1, the particle bounding box is square, and as the value decreases from 1 the particle becomes more elongated)
- Ext_vector (extremal vector)
- Major Axis and Minor Axis (the lengths of the axes of the smallest ellipse encompassing an object)
- Partial (indicates whether the particle extends beyond the field of view)
- Perimeter Points (the points that make up a particle perimeter)
- Roundness ((4π × area) / perimeter², a measure of object roundness, or compactness, given as a value between 0 and 1; the greater the ratio, the rounder the object)
- Thin in center (determines whether an object becomes thin in the center, i.e., figure-eight-shaped)
- Theta (orientation of the major axis)
- Smoothness and color metrics (for each component (red, green, blue), the minimum, maximum, average, and standard deviation within the particle are tracked)
These metrics can be used for autonomous analysis of color images from a microscope, video camera, or digital still image. The algorithm can also automatically identify tumor morphology in stained images and has been used to detect stained-cell phenomena.
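Several of these metrics can be reproduced with scikit-image's regionprops, shown here as a hedged stand-in (the CCMIS flight code itself is not public); the roundness formula follows the definition quoted above.

```python
# Hedged stand-in using scikit-image regionprops (not the CCMIS code).
import numpy as np
from skimage import measure

mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:40, 15:45] = 1                      # one synthetic "cell" blob

for region in measure.regionprops(measure.label(mask)):
    roundness = 4 * np.pi * region.area / region.perimeter ** 2
    print(region.area, region.bbox, region.centroid,
          region.major_axis_length, region.minor_axis_length,
          region.orientation, round(roundness, 3))
```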
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion following intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
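As an illustration of the final quantification step, the sketch below estimates propulsion frequency from a fluorescence-intensity time series at a fixed, already motion-stabilized location using a peak finder; the frame rate, signal, and thresholds are assumptions, not the authors' parameters.

```python
# Illustrative propulsion-frequency estimate from an intensity time series.
import numpy as np
from scipy.signal import find_peaks

fps = 10.0                                   # assumed frame rate
t = np.arange(0, 60, 1 / fps)
intensity = np.sin(2 * np.pi * 0.1 * t) ** 8 + 0.05 * np.random.randn(t.size)

peaks, _ = find_peaks(intensity, height=0.5, distance=int(2 * fps))
per_minute = len(peaks) / (t[-1] / 60.0)     # propulsion events per minute
print(f"{len(peaks)} propulsion events, {per_minute:.1f} per minute")
```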
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miron, M.S.; Christopher, C.; Hirshfield, S.
1978-05-01
Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
Na, X D; Zang, S Y; Wu, C S; Li, W L
2015-11-01
Knowledge of the spatial extent of forested wetlands is essential to many studies, including wetland functioning assessment, greenhouse gas flux estimation, and identification of suitable wildlife habitat. For discriminating forested wetlands from adjacent land cover types, researchers have applied image analysis techniques to numerous remotely sensed data sets. While these have had some success, there is still no consensus on the optimal approaches for mapping forested wetlands. To address this problem, we examined two machine learning approaches, random forest (RF) and K-nearest neighbor (KNN) algorithms, and applied them within both pixel-based and object-based classification frameworks. The RF and KNN algorithms were constructed using predictors derived from Landsat 8 imagery, Radarsat-2 advanced synthetic aperture radar (SAR), and topographical indices. The results show that the object-based classifications performed better than per-pixel classifications using the same algorithm (RF) in terms of overall accuracy, and the difference between their kappa coefficients is statistically significant (p<0.01). There were noticeable omissions of forested and herbaceous wetlands in the per-pixel classifications using the RF algorithm. For the object-based image analysis, there was also a statistically significant difference (p<0.01) in kappa coefficient between the results of the RF and KNN algorithms. The object-based classification using RF provided a more visually adequate distribution of the land cover types of interest, while the object-based classification using the KNN algorithm showed noticeable commissions of forested wetlands and omissions of agricultural land. This research demonstrates that object-based classification with RF using optical, radar, and topographical data improves land cover mapping accuracy and provides a feasible approach for discriminating forested wetlands from other land cover types in forested areas.
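A minimal sketch of the classifier comparison, with random placeholder features standing in for the Landsat 8, Radarsat-2 SAR, and topographic predictors used in the study.

```python
# Placeholder comparison of the two algorithms (illustrative data only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 12))     # per-object (or per-pixel) predictors
y = rng.integers(0, 4, size=500)   # e.g., forested wetland, herbaceous wetland, ...

for name, clf in [("RF", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```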
2012-01-01
Session 3 - C2 Framework, OR Methods, MOOs/MOEs/MOPs Development Case Study; Session 4 - Findings. Objective 1: Understand the impact of the application of traditional operational research techniques to networked C2 systems. Objective 2: Develop ... for the network. Cost measures include the cost and time to implement the solution ...
Solving a product safety problem using a recycled high density polyethylene container
NASA Technical Reports Server (NTRS)
Liu, Ping; Waskom, T. L.
1993-01-01
The objectives are to introduce basic problem-solving techniques for product safety, including problem identification, definition, solution criteria, test process and design, and data analysis. The students are given a recycled milk jug made of high density polyethylene (HDPE) by blow molding. Their task is to design and perform appropriate material tests so they can evaluate the product's safety when the milk jug is used in the way specified in the description of the procedure for this investigation.
Setup calibration and optimization for comparative digital holography
NASA Astrophysics Data System (ADS)
Baumbach, Torsten; Osten, Wolfgang; Kebbel, Volker; von Kopylow, Christoph; Jueptner, Werner
2004-08-01
With increasing globalization many enterprises decide to produce the components of their products at different locations all over the world. Consequently, new technologies and strategies for quality control are required. In this context the remote comparison of objects with regard to their shape or response to certain loads is getting more and more important for a variety of applications. For such a task the novel method of comparative digital holography is a suitable tool with interferometric sensitivity. With this technique the comparison in shape or deformation of two objects does not require the presence of both objects at the same place. In contrast to the well known incoherent techniques based on inverse fringe projection, this new approach uses a coherent mask for the illumination of the sample object. The coherent mask is created by digital holography to enable instant access to the complete optical information of the master object at any desired place. The reconstruction of the mask is done by a spatial light modulator (SLM). The transmission of the digital master hologram to the place of comparison can be done via digital telecommunication networks. Contrary to other interferometric techniques, this method enables the comparison of objects with different microstructure. Continuing earlier reports, our investigations here focus on the analysis of the constraints of the setup with respect to the quality of the hologram reconstruction with a spatial light modulator. For successful measurements, the selection of the appropriate reconstruction method and an adequate optical setup is mandatory. In addition, the use of an SLM for the reconstruction requires knowledge of its properties. Results for display properties such as display curvature and phase shift, and their consequences for the technique, are presented. The optimization and calibration of the setup and its components lead to improved results in comparative digital holography with respect to resolution. Examples of measurements before and after the optimization and calibration are presented.
Bridging the gap between high and low acceleration for planetary escape
NASA Astrophysics Data System (ADS)
Indrikis, Janis; Preble, Jeffrey C.
With the exception of the often time consuming analysis by numerical optimization, no single orbit transfer analysis technique exists that can be applied over a wide range of accelerations. Using the simple planetary escape (parabolic trajectory) mission some of the more common techniques are considered as the limiting bastions at the high and the extremely low acceleration regimes. The brachistochrone, the minimum time of flight path, is proposed as the technique to bridge the gap between the high and low acceleration regions, providing a smooth bridge over the entire acceleration spectrum. A smooth and continuous velocity requirement is established for the planetary escape mission. By using these results, it becomes possible to determine the effect of finite accelerations on mission performance and target propulsion and power system designs which are consistent with a desired mission objective.
Statistical Analysis of Fort Hood Quality-of-Life Questionnaire.
1978-10-01
The objective of this work was to provide supplementary data analyses of data abstracted from the Quality-of-Life questionnaire developed earlier at ... the Fort Hood Field Unit at the request of Headquarters, TRADOC Combined Arms Test Activity (TCATA). The Quality-of-Life questionnaire data were ... to the Quality-of-Life questionnaire. These data were then intensively analyzed using analysis of variance and correlational techniques. The results
Technical Training on High-Order Spectral Analysis and Thermal Anemometry Applications
NASA Technical Reports Server (NTRS)
Maslov, A. A.; Shiplyuk, A. N.; Sidirenko, A. A.; Bountin, D. A.
2003-01-01
The topics of thermal anemometry and high-order spectral analyses were the subject of the technical training. Specifically, the objective of the technical training was to study: (i) the recently introduced constant voltage anemometer (CVA) for high-speed boundary layer; and (ii) newly developed high-order spectral analysis techniques (HOSA). Both CVA and HOSA are relevant tools for studies of boundary layer transition and stability.
Instantiating the art of war for effects-based operations
NASA Astrophysics Data System (ADS)
Burns, Carla L.
2002-07-01
Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.
Analysis and modelling of septic shock microarray data using Singular Value Decomposition.
Allanki, Srinivas; Dixit, Madhulika; Thangaraj, Paul; Sinha, Nandan Kumar
2017-06-01
Because microarrays are a high-throughput technique, enormous amounts of microarray data have been generated, and there arises a need for more efficient techniques of analysis, in terms of speed and accuracy. Finding the differentially expressed genes based on just fold change and p-value might not extract all the vital biological signals that occur at a lower gene expression level. Besides this, numerous mathematical models have been generated to predict the clinical outcome from microarray data, while very few, if any, aim at predicting the vital genes that are important in a disease progression. Such models help a basic researcher narrow down and concentrate on a promising set of genes, which leads to the discovery of gene-based therapies. In this article, as a first objective, we have used the less widely known Singular Value Decomposition (SVD) technique to build a microarray data analysis tool that works with gene expression patterns and the intrinsic structure of the data in an unsupervised manner. We have re-analysed microarray data over the clinical course of septic shock from Cazalis et al. (2014) and have shown that our proposed analysis provides additional information compared to the conventional method. As a second objective, we developed a novel mathematical model that predicts a set of vital genes in the disease progression; it works by generating samples in the continuum between health and disease, using a simple normal-distribution-based random number generator. We also verify that most of the predicted genes are indeed related to septic shock.
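A minimal sketch of the SVD step: decompose a (genes × samples) expression matrix into sample-space "eigengenes" and gene loadings. The data here are synthetic and serve only to show the shapes involved.

```python
# Minimal SVD sketch: genes x samples matrix -> "eigengenes" (rows of Vt)
# and gene loadings (columns of U).
import numpy as np

expression = np.random.randn(2000, 24)                    # genes x samples
expression -= expression.mean(axis=1, keepdims=True)      # center each gene

U, s, Vt = np.linalg.svd(expression, full_matrices=False)
variance_explained = s**2 / np.sum(s**2)
print(variance_explained[:3])                             # weight of leading modes
eigengene_1 = Vt[0]                                       # dominant pattern across samples
top_genes = np.argsort(np.abs(U[:, 0]))[::-1][:20]        # genes loading most on mode 1
print(top_genes)
```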
Spectral features based tea garden extraction from digital orthophoto maps
NASA Astrophysics Data System (ADS)
Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun
2018-05-01
The advancements in photogrammetry and remote sensing technologies have made it possible to extract useful tangible information from data, which plays a pivotal role in various applications such as the management and monitoring of forests and agricultural lands. This study aimed to evaluate the effectiveness of spectral signatures for the extraction of tea gardens from 1:5000 scaled digital orthophoto maps obtained from Rize city in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress the non-vegetation areas. NDVI values less than zero were discarded and the output image was normalized in the range 0-255. Individual pixels were then mapped into meaningful objects using a global region growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometric constraints were applied to remove small objects (less than 500 pixels), followed by a morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, for the classification stage, a range of spectral values was empirically calculated for each band and applied to candidate objects to extract tea gardens. For accuracy assessment, we employed an area-based similarity metric, overlapping the obtained tea garden boundaries with tea garden boundaries manually digitized by experts in photogrammetry. The overall accuracy of the proposed method was 89% for tea gardens across 10 sample orthophoto maps. We conclude that exploiting spectral signatures using object-based analysis is an effective technique for the extraction of dominant tree species from digital orthophoto maps.
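A hedged sketch of the early pipeline stages (NDVI, zero-thresholding, rescaling, small-object removal, morphological opening); the paper's region-growing step and its empirically derived per-band spectral ranges are not reproduced, and the band arrays below are placeholders.

```python
# Hedged sketch of the early pipeline stages; red/nir stand in for bands.
import numpy as np
from skimage import measure, morphology

red = np.random.rand(512, 512)
nir = np.random.rand(512, 512)

ndvi = (nir - red) / (nir + red + 1e-9)
ndvi[ndvi < 0] = 0                                           # discard non-vegetation
scaled = (255 * (ndvi - ndvi.min()) / (np.ptp(ndvi) + 1e-9)).astype(np.uint8)

mask = scaled > 0
mask = morphology.remove_small_objects(mask, min_size=500)   # drop objects < 500 px
mask = morphology.binary_opening(mask, morphology.disk(3))   # morphological enhancement
print(measure.label(mask).max(), "candidate objects")
```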
Mladinich, C.
2010-01-01
Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification rather than pixel-based techniques have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid.
Glaser, E M; Van der Loos, H
1981-08-01
Exceptionally clear Golgi-Nissl sections of 300 micron thickness have been morphometrically studied by light microscopy using oil immersion objectives. The clarity results from a new variation of a staining procedure that combines Golgi and Nissl images in one section. A viewing technique has been developed that permits a histologic preparation to be examined from its obverse (or normally viewed) side and its reverse (or under) side. The technique was designed for use with a computer microscope but can be employed with any light microscope whose stage position can be measured to within 100 microns. Sections thicker than 300 micron can be studied, depending on the working distance of the objective lens, provided that the clarity of the material permits it.
Does Parent-Child Interaction Therapy Reduce Future Physical Abuse? A Meta-Analysis
ERIC Educational Resources Information Center
Kennedy, Stephanie C.; Kim, Johnny S.; Tripodi, Stephen J.; Brown, Samantha M.; Gowdy, Grace
2016-01-01
Objective: To use meta-analytic techniques to evaluate the effectiveness of parent-child interaction therapy (PCIT) at reducing future physical abuse among physically abusive families. Methods: A systematic search identified six eligible studies. Outcomes of interest were physical abuse recurrence, child abuse potential, and parenting stress.…
Stigmatizing Attributions and Vocational Rehabilitation Outcomes of People with Disabilities
ERIC Educational Resources Information Center
Chan, Jacob Yui-Chung; Keegan, John P.; Ditchman, Nicole; Gonzalez, Rene; Zheng, Lisa Xi; Chan, Fong
2011-01-01
Objective: To determine whether employment outcomes of people with disabilities can be predicted by the social-cognitive/attribution theory of stigmatization. Design: Ex post facto design using data mining technique and logistic regression analysis. Participants: Data from 40,585 vocational rehabilitation (VR) consumers were extracted from the…
Gamma Ray Astrophysics: New insight into the universe
NASA Technical Reports Server (NTRS)
Fichtel, C. E.; Trombka, J. I.
1981-01-01
Gamma ray observations of the solar system, the galaxy and extragalactic radiation are reported. Topics include: planets, comets, and asteroids; solar observations; interstellar medium and galactic structure; compact objects; cosmology; and diffuse radiation. The instrumentation used in gamma ray astronomy is covered along with techniques for the analysis of observational spectra.
Leisman, Gerald; Ashkenazi, Maureen
1979-01-01
Objective psychophysical techniques for investigating visual fields are described. The paper concerns methods for the collection and analysis of evoked potentials using a small laboratory computer and provides efficient methods for obtaining information about the conduction pathways of the visual system.
PERIPHYTON AND SEDIMENT BIOASSESSMENT AS INDICATORS OF THE EFFECT OF A COASTAL PULP MILL WASTEWATER
A two year study was conducted near Port St. Joe, Florida, in a coastal transportation canal and bay receiving combined municipal and pulp mill wastewater. The objective of the study was to determine the effectiveness of periphyton analysis techniques and sediment toxicity as ind...
Visual reconciliation of alternative similarity spaces in climate modeling
J Poco; A Dasgupta; Y Wei; William Hargrove; C.R. Schwalm; D.N. Huntzinger; R Cook; E Bertini; C.T. Silva
2015-01-01
Visual data analysis often requires grouping of data objects based on their similarity. In many application domains researchers use algorithms and techniques like clustering and multidimensional scaling to extract groupings from data. While extracting these groups using a single similarity criteria is relatively straightforward, comparing alternative criteria poses...
Cognitive Mapping Tobacco Control Advice for Dentistry: A Dental PBRN Study
ERIC Educational Resources Information Center
Qu, Haiyan; Houston, Thomas K.; Williams, Jessica H.; Gilbert, Gregg H.; Shewchuk, Richard M.
2011-01-01
Objective: To identify facilitative strategies that could be used in developing a tobacco cessation program for community dental practices. Methods: Nominal group technique (NGT) meetings and a card-sort task were used to obtain formative data. A cognitive mapping approach involving multidimensional scaling and hierarchical cluster analysis was…
Although amplification of source-specific molecular markers from Bacteroidales fecal bacteria can identify several different kinds of fecal contamination in water, it remains unclear how this technique relates to fecal indicator measurements in natural waters. The objectives of t...
Child-Parent Interventions for Childhood Anxiety Disorders: A Systematic Review and Meta-Analysis
ERIC Educational Resources Information Center
Brendel, Kristen Esposito; Maynard, Brandy R.
2014-01-01
Objective: This study compared the effects of direct child-parent interventions to the effects of child-focused interventions on anxiety outcomes for children with anxiety disorders. Method: Systematic review methods and meta-analytic techniques were employed. Eight randomized controlled trials examining effects of family cognitive behavior…
Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus upon statistical objectives, paradigms, inferences, and methods. Its primary purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
Evaluating the Skill of Students to Determine Soil Morphology Characteristics
ERIC Educational Resources Information Center
Post, Donald F.; Parikh, Sanjai J.; Papp, Rae Ann; Ferriera, Laerta
2006-01-01
Precise and accurate pedon descriptions prepared by field scientists using standard techniques with defined terminology and methodology are essential in describing soil pedons. The accuracy of field measurements generally are defined in terms of how well they agree with objective criteria (e.g., laboratory analysis), such as mechanical analysis…
From Periodic Properties to a Periodic Table Arrangement
ERIC Educational Resources Information Center
Besalú, Emili
2013-01-01
A periodic table is constructed from the consideration of periodic properties and the application of the principal components analysis technique. This procedure is useful for object classification and data reduction and has been used in the field of chemistry for many applications, such as the classification of lanthanides, molecules, or conformers.…
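As a hedged illustration of the PCA step described above (not the paper's own data or code), the following numpy sketch orders a handful of elements along the first principal component of a few made-up periodic properties:

```python
import numpy as np

# Illustrative (made-up) property rows: [atomic radius, ionization energy,
# electronegativity] for a few elements -- placeholders, not the paper's data.
elements = ["Li", "Na", "K", "F", "Cl", "Br"]
X = np.array([
    [1.52, 5.39, 0.98],
    [1.86, 5.14, 0.93],
    [2.27, 4.34, 0.82],
    [0.64, 17.42, 3.98],
    [0.99, 12.97, 3.16],
    [1.14, 11.81, 2.96],
])

# Standardize so each property contributes comparably, then PCA via SVD.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt.T  # principal-component scores

# Ordering elements along the first principal component separates the alkali
# metals from the halogens, mimicking a periodicity-driven arrangement.
for name, pc1 in sorted(zip(elements, scores[:, 0]), key=lambda t: t[1]):
    print(f"{name}: PC1 = {pc1:+.2f}")
```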
Le Climat d'Apprentissage; Analyse Conceptuelle=Learning Climate: A Conceptual Analysis.
ERIC Educational Resources Information Center
Michaud, Pierre; And Others
1989-01-01
Analyzes and defines the concept of "learning climate." Discusses the conceptual models of Biddle and Brookover. Considers the use of observation techniques and surveys to measure learning climate. Reviews research on the relationship between learning climate and the attainment of course and institutional objectives. (DMM)
Rotation covariant image processing for biomedical applications.
Skibbe, Henrik; Reisert, Marco
2013-01-01
With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general framework based on concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences.
TH-EF-BRC-04: Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yorke, E.
2016-06-15
This hands-on workshop will focus on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss their experience and any challenges encountered with each other and the faculty. Learning Objectives: To review the principles of process mapping, failure modes and effects analysis, and fault tree analysis. To gain familiarity with these three techniques in a small-group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.
Using EIGER for Antenna Design and Analysis
NASA Technical Reports Server (NTRS)
Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.
2007-01-01
EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package built upon a flexible framework designed using object-oriented techniques. The analysis methods used include moment-method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with reasonable effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required to facilitate the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.
Strain-energy release rate analysis of a laminate with a postbuckled delamination
NASA Technical Reports Server (NTRS)
Whitcomb, John D.; Shivakumar, K. N.
1987-01-01
The objectives are to present the derivation of the new virtual crack closure technique, evaluate the accuracy of the technique, and finally to present the results of a limited parametric study of laminates with a postbuckled delamination. Although the new virtual crack closure technique is general, only homogeneous, isotropic laminates were analyzed. This eliminated the variation of flexural stiffness with orientation, which occurs even for quasi-isotropic laminates, and made it easier to identify the effect of geometrical parameters on the strain-energy release rate G. The new virtual crack closure technique is derived. Then the specimen configurations are described. Next, the stress analysis is discussed. Finally, the virtual crack closure technique is evaluated and then used to calculate the distribution of G along the delamination front of several laminates with a postbuckled delamination.
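The abstract does not reproduce the paper's derivation, but the classical one-step virtual crack closure estimate it builds on can be sketched as follows; all numbers are hypothetical:

```python
def vcct_mode_i(F_y, delta_v, da, b):
    """Classical virtual crack closure estimate of the mode I strain-energy
    release rate: G_I = F_y * delta_v / (2 * da * b), where F_y is the nodal
    force at the crack tip, delta_v the relative opening displacement one
    element behind the tip, da the element length, and b the specimen width."""
    return F_y * delta_v / (2.0 * da * b)

# Hypothetical values purely for illustration (N, m); result in J/m^2.
print(vcct_mode_i(F_y=120.0, delta_v=2.0e-5, da=1.0e-3, b=0.01))
```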
Task Oriented Evaluation of Module Extraction Techniques
NASA Astrophysics Data System (ADS)
Palmisano, Ignazio; Tamma, Valentina; Payne, Terry; Doran, Paul
Ontology modularization techniques identify coherent and often reusable regions within an ontology. The ability to identify such modules, and thus potentially reduce the size or complexity of an ontology for a given task or set of concepts, is increasingly important in the Semantic Web as domain ontologies increase in size, complexity, and expressivity. To date, many techniques have been developed, but evaluation of their results is sketchy and somewhat ad hoc. Theoretical properties of modularization algorithms have only been studied in a small number of cases. This paper presents an empirical analysis of a number of modularization techniques, and the modules they identify over a number of diverse ontologies, by utilizing objective, task-oriented measures to evaluate the fitness of the modules for a number of statistical classification problems.
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA), firmly grounded in the economic model, is developed both conceptually and mathematically. The primary objective in developing the VODCA methodology was to improve the process of extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible uses of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Novel optimization technique of isolated microgrid with hydrogen energy storage.
Beshr, Eman Hassan; Abdelghany, Hazem; Eteiba, Mahmoud
2018-01-01
This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs): a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, and is supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization is used through a non-dominated sorting genetic algorithm to suit the load requirements under the given constraints. A novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled using the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed during both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15-bus system is used to validate the proposed algorithm.
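As a minimal sketch of the dominance test at the heart of non-dominated sorting (the inner loop of NSGA-II, not the paper's implementation), with made-up (fuel cost, line loss) pairs:

```python
import numpy as np

def dominates(a, b):
    """True if solution a Pareto-dominates b (minimizing all objectives)."""
    return np.all(a <= b) and np.any(a < b)

def nondominated_front(F):
    """Return indices of the non-dominated solutions in objective matrix F."""
    return [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]

# Made-up (fuel cost, line loss) pairs for five candidate dispatch schedules.
F = np.array([[10.0, 4.0], [8.0, 5.0], [10.5, 4.2], [12.0, 3.0], [11.0, 6.0]])
print(nondominated_front(F))  # -> [0, 1, 3]: the Pareto front members
```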
Ioannou, Ioanna; Kazmierczak, Edmund; Stern, Linda
2015-01-01
The use of virtual reality (VR) simulation for surgical training has gathered much interest in recent years. Despite increasing popularity and usage, limited work has been carried out in the use of automated objective measures to quantify the extent to which performance in a simulator resembles performance in the operating theatre, and the effects of simulator training on real world performance. To this end, we present a study exploring the effects of VR training on the performance of dentistry students learning a novel oral surgery task. We compare the performance of trainees in a VR simulator and in a physical setting involving ovine jaws, using a range of automated metrics derived by motion analysis. Our results suggest that simulator training improved the motion economy of trainees without adverse effects on task outcome. Comparison of surgical technique on the simulator with the ovine setting indicates that simulator technique is similar, but not identical to real world technique.
NASA Technical Reports Server (NTRS)
Schutz, Bob E.; Baker, Gregory A.
1997-01-01
The recovery of a high resolution geopotential from satellite gradiometer observations motivates the examination of high performance computational techniques. The primary subject matter addresses specifically the use of satellite gradiometer and GPS observations to form and invert the normal matrix associated with a large degree and order geopotential solution. Memory-resident and out-of-core parallel linear algebra techniques along with data parallel batch algorithms form the foundation of the least squares application structure. A secondary topic includes the adoption of object oriented programming techniques to enhance modularity and reusability of code. Applications implementing the parallel and object oriented methods successfully calculate the degree variance for a degree and order 110 geopotential solution on 32 processors of the Cray T3E. The memory-resident gradiometer application exhibits an overall application performance of 5.4 Gflops, and the out-of-core linear solver exhibits an overall performance of 2.4 Gflops. The combination solution derived from a sun-synchronous gradiometer orbit produces average geoid height variances of 17 millimeters.
An object-oriented approach to data display and storage: 3 years experience, 25,000 cases.
Sainsbury, D A
1993-11-01
Object-oriented programming techniques were used to develop computer based data display and storage systems. These have been operating in the 8 anaesthetising areas of the Adelaide Children's Hospital for 3 years. The analogue and serial outputs from an array of patient monitors are connected to IBM compatible PC-XT computers. The information is displayed on a colour screen as wave-form and trend graphs and digital format in 'real time'. The trend data is printed simultaneously on a dot matrix printer. This data is also stored for 24 hours on 'hard' disk. The major benefit has been the provision of a single visual focus for all monitored variables. The automatic logging of data has been invaluable in the analysis of critical incidents. The systems were made possible by recent, rapid improvements in computer hardware and software. This paper traces the development of the program and demonstrates the advantages of object-oriented programming techniques.
Jennane, Rachid; Aufort, Gabriel; Benhamou, Claude Laurent; Ceylan, Murat; Ozbay, Yüksel; Ucan, Osman Nuri
2012-04-01
Curve and surface thinning are widely used skeletonization techniques for modeling objects in three dimensions. In the case of disordered porous media analysis, however, neither is really efficient, since the internal geometry of the object is usually composed of both rod and plate shapes. This paper presents an alternative to compute a hybrid shape-dependent skeleton and its application to porous media. The resulting skeleton combines 2D surfaces and 1D curves to represent respectively the plate-shaped and rod-shaped parts of the object. For this purpose, a new technique based on neural networks is proposed: cascade combinations of the complex wavelet transform (CWT) and a complex-valued artificial neural network (CVANN). The ability of the skeleton to characterize hybrid-shaped porous media is demonstrated on a trabecular bone sample. Results show that the proposed method achieves high accuracy rates of about 99.78-99.97%. In particular, the CWT (2nd level)-CVANN structure converges to optimum results with high accuracy and minimum time consumption.
Automatic classification of minimally invasive instruments based on endoscopic image sequences
NASA Astrophysics Data System (ADS)
Speidel, Stefanie; Benzko, Julia; Krappe, Sebastian; Sudra, Gunther; Azad, Pedram; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger
2009-02-01
Minimally invasive surgery is nowadays a frequently applied technique and can be regarded as a major breakthrough in surgery. The surgeon has to adopt special operation techniques and deal with difficulties like complex hand-eye coordination and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing context-aware assistance using augmented reality techniques. To analyze the current situation for context-aware assistance, we need intraoperatively gained sensor data and a model of the intervention. A situation consists of information about the performed activity, the used instruments, the surgical objects, and the anatomical structures, and defines the state of an intervention for a given moment in time. The endoscopic images provide a rich source of information which can be used for image-based analysis. Different visual cues are observed in order to perform an image-based analysis with the objective of gaining as much information as possible about the current situation. An important visual cue is the automatic recognition of the instruments which appear in the scene. In this paper we present the classification of minimally invasive instruments using the endoscopic images. The instruments are not modified by markers. The system segments the instruments in the current image and recognizes the instrument type based on three-dimensional instrument models.
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
NASA Astrophysics Data System (ADS)
Rodriguez-Hervas, Berta; Maile, Michael; Flores, Benjamin C.
2014-05-01
In recent years, the automotive industry has experienced an evolution toward more powerful driver assistance systems that provide enhanced vehicle safety. These systems typically operate in the optical and microwave regions of the electromagnetic spectrum and have demonstrated high efficiency in collision and risk avoidance. Microwave radar systems are particularly relevant due to their operational robustness under adverse weather or illumination conditions. Our objective is to study different signal processing techniques suitable for extraction of accurate micro-Doppler signatures of slow moving objects in dense urban environments. Selection of the appropriate signal processing technique is crucial for the extraction of accurate micro-Doppler signatures that will lead to better results in a radar classifier system. For this purpose, we perform simulations of typical radar detection responses in common driving situations and conduct the analysis with several signal processing algorithms, including the short-time Fourier transform, continuous wavelet, and kernel-based analysis methods. We take into account factors such as the relative movement between the host vehicle and the target, and the non-stationary nature of the target's movement. A comparison of results reveals that the short-time Fourier transform would be the best approach for detection and tracking purposes, while the continuous wavelet would be the best suited for classification purposes.
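A minimal scipy sketch of the short-time Fourier transform step on a toy micro-Doppler-like return; the limb-modulation model and all parameter values are illustrative assumptions, not real radar data:

```python
import numpy as np
from scipy.signal import stft

fs = 2000.0                      # sample rate (Hz), toy value
t = np.arange(0, 2.0, 1.0 / fs)

# Toy micro-Doppler model: a slow body return plus a sinusoidally
# modulated limb component, with a little complex noise.
body = np.exp(1j * 2 * np.pi * 60 * t)
limb = 0.5 * np.exp(1j * 2 * np.pi * (60 * t + 40 * np.sin(2 * np.pi * 1.5 * t)))
x = body + limb + 0.05 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

# Short-time Fourier transform: the window length trades Doppler resolution
# against time resolution, the key tuning knob for a downstream classifier.
f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=192, return_onesided=False)
print(Z.shape)  # (freq bins, time frames): the micro-Doppler spectrogram
```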
Top-level modeling of an als system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging its utilization. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here are the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
NASA Technical Reports Server (NTRS)
Turso, James; Lawrence, Charles; Litt, Jonathan
2004-01-01
The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
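A hedged sketch of the general idea, using per-band wavelet energies as features of an accelerometer trace; the synthetic signal and the choice of band energies as the feature vector are assumptions, not the paper's ROM data or exact feature definition:

```python
import numpy as np
import pywt

fs = 10_000.0
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic accelerometer trace: broadband noise with a short transient
# "FOD-like" burst injected at t = 0.5 s (illustrative, not engine data).
x = 0.2 * np.random.randn(t.size)
x += (np.abs(t - 0.5) < 0.002) * np.sin(2 * np.pi * 3000 * (t - 0.5))

# Multilevel discrete wavelet decomposition; the energy in each detail band
# serves as a simple feature vector sensitive to transient events.
coeffs = pywt.wavedec(x, "db4", level=5)
features = [float(np.sum(c ** 2)) for c in coeffs[1:]]  # detail bands only
print(features)
```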
NASA Technical Reports Server (NTRS)
Wiegman, E. J.; Evans, W. E.; Hadfield, R.
1975-01-01
Measurements are examined of snow coverage during the snow-melt seasons of 1973 and 1974 from LANDSAT imagery for three Columbia River subbasins. Satellite-derived snow cover inventories for the three test basins were obtained as an alternative to inventories performed with the current operational practice of using small aircraft flights over selected snow fields. The accuracy and precision versus cost of several different interactive image analysis procedures were investigated using a display device, the Electronic Satellite Image Analysis Console. Single-band radiance thresholding was the principal technique employed in the snow detection, although this technique was supplemented by an editing procedure involving reference to hand-generated elevation contours. For each date and view measured, a binary thematic map or "mask" depicting the snow cover was generated by a combination of objective and subjective procedures. Photographs of data analysis equipment (displays) are shown.
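A minimal numpy sketch of single-band radiance thresholding with an elevation-based editing step, on synthetic arrays; both threshold values are assumptions, not the study's calibrated settings:

```python
import numpy as np

# Synthetic single-band radiance image and elevation grid (illustrative only).
rng = np.random.default_rng(0)
radiance = rng.uniform(0, 255, size=(100, 100))
elevation = rng.uniform(0, 3000, size=(100, 100))

# Single-band radiance thresholding: bright pixels are candidate snow.
snow = radiance > 180.0

# Editing step in the spirit of the elevation-contour reference: discard
# candidate snow below a cutoff elevation.
snow &= elevation > 1200.0

print(f"snow-covered fraction: {snow.mean():.3f}")  # binary thematic "mask"
```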
Frequency response of electrochemical cells
NASA Technical Reports Server (NTRS)
Thomas, Daniel L.
1990-01-01
The main objective was to examine the feasibility of using frequency response techniques (1) as a tool in destructive physical analysis of batteries, particularly for estimating electrode structural parameters such as specific area, porosity, and tortuosity, and (2) as a non-destructive testing technique for obtaining information such as state of charge and acceptability for space flight. The phenomena that contribute to the frequency response of an electrode include: (1) double-layer capacitance; (2) Faradaic reaction resistance; (3) mass transfer (Warburg) impedance; and (4) ohmic solution resistance. Nickel-cadmium cells were investigated in solutions of KOH. A significant amount of data was acquired. Quantitative data analysis, using the developed software, is planned for the future.
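The four listed phenomena combine in the standard Randles equivalent circuit; a hedged sketch of its impedance, with placeholder parameter values rather than fitted cell data:

```python
import numpy as np

def randles_impedance(freq, R_s=0.05, R_ct=0.5, C_dl=0.1, sigma=0.2):
    """Impedance of a Randles cell: ohmic resistance R_s in series with the
    double-layer capacitance C_dl, which is in parallel with the Faradaic
    branch (charge-transfer resistance R_ct plus a Warburg element
    sigma*(1-1j)/sqrt(w)). Parameter values are illustrative placeholders."""
    w = 2 * np.pi * np.asarray(freq)
    Z_w = sigma * (1 - 1j) / np.sqrt(w)           # Warburg (mass transfer)
    Z_f = R_ct + Z_w                              # Faradaic branch
    return R_s + 1.0 / (1j * w * C_dl + 1.0 / Z_f)

freqs = np.logspace(-2, 4, 7)                     # 10 mHz .. 10 kHz sweep
for f, Z in zip(freqs, randles_impedance(freqs)):
    print(f"{f:10.2f} Hz  Z = {Z.real:8.4f} {Z.imag:+8.4f}j ohm")
```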
On surface analysis and archaeometallurgy
NASA Astrophysics Data System (ADS)
Giumlia-Mair, Alessandra
2005-09-01
The tasks and problems which the study of ancient artefacts involves are manifold and almost as numerous as the different classes of materials and objects studied by modern specialists. This happens especially because the conservation of artefacts depends not only on their material and manufacturing techniques, but also very much on the environment and the type of soil in which they were deposited, sometimes for millennia. A number of archaeological materials and objects, dated to different periods, from the earliest use of metals to historical times, and also coming from very different geographical contexts in the whole Ancient World are discussed in this paper. The most common surface analysis methods will be evaluated and discussed in the framework of different applications.
Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.
Noll, R; Haas, C R; Weikl, B; Herziger, G
1986-03-01
Schlieren techniques are commonly used methods for quantitative analysis of cylindrical or spherical index of refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of index of refraction n(r,z). Assuming straight ray paths in the schlieren object we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.
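A minimal sketch of the straight-ray deflection calculation on a synthetic index field; the Gaussian profile and magnitudes are assumptions, not the plasma-focus measurements:

```python
import numpy as np

# Synthetic refractive-index field n(y, z) = 1 + dn * Gaussian -- an
# illustrative stand-in for one meridional slice of a rotationally
# symmetric schlieren object.
y = np.linspace(-5e-3, 5e-3, 201)   # ray height (m)
z = np.linspace(-5e-3, 5e-3, 201)   # propagation direction (m)
Y, Z = np.meshgrid(y, z, indexing="ij")
n = 1.0 + 1e-4 * np.exp(-(Y**2 + Z**2) / (1.0e-3) ** 2)

# Straight-ray approximation: a ray travelling along z at height y is
# deflected by theta(y) ~ integral of (dn/dy) dz, since n ~ 1.
dn_dy = np.gradient(n, y, axis=0)
dz = z[1] - z[0]
theta = (dn_dy * dz).sum(axis=1)    # one deflection angle (rad) per ray

print(f"peak deflection: {np.abs(theta).max():.3e} rad")
```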
Chao, T.T.; Sanzolone, R.F.
1992-01-01
Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor for sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here, nor are fire assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods.
Characterization of an Indian sword: classic and noninvasive methods of investigation in comparison
NASA Astrophysics Data System (ADS)
Barzagli, E.; Grazzi, F.; Williams, A.; Edge, D.; Scherillo, A.; Kelleher, J.; Zoppi, M.
2015-04-01
The evolution of metallurgy in history is one of the most interesting topics in Archaeometry. The production of steel and its forging methods to make tools and weapons are topics of great interest in the field of the history of metallurgy. In the production of weapons, we almost always find the highest level of technology. These were generally produced by skilled craftsmen who used the best quality materials available. Indian swords are an outstanding example in this field and one of the most interesting classes of objects for the study of the evolution of metallurgy. This work presents the study of a Shamsheer (a sword with a curved, single-edged blade) made available by the Wallace Collection in London. The purpose of this study was to determine the composition, the microstructure, and the level and direction of residual strain and their distribution in the blade. We have used two different approaches: the classical one (metallography) and a nondestructive technique (neutron diffraction). In this way, we can test the differences and complementarities of these two techniques. To obtain a good characterization of artifacts studied by traditional analytical methods, an invasive approach is required. However, the most ancient objects are scarce in number, and the most interesting ones are usually in an excellent state of conservation, so it is unthinkable to apply techniques with a destructive approach. The analysis of blades performed by metallographic microscopy has demonstrated the specificity of the production of this type of steel. However, metallographic analysis can give only limited information about the structural characteristics of these high-quality artifacts, and it is limited to the sampled areas. The best approach for nondestructive analysis is therefore to use neutron techniques.
Assessment methods for the evaluation of vitiligo.
Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K
2012-12-01
There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials.
Histology image analysis for carcinoma detection and grading
He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.
2012-01-01
This paper presents an overview of the image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research, attempting to significantly reduce the labor and subjectivity of traditional manual intervention with histology images. The task of automated histology image analysis is usually not simple due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas of cervix, prostate, breast, and lung are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890
The robot's eyes - Stereo vision system for automated scene analysis
NASA Technical Reports Server (NTRS)
Williams, D. S.
1977-01-01
Attention is given to the robot stereo vision system, which maintains the image produced by solid-state detector television cameras in a dynamic random access memory called RAPID. The imaging hardware consists of sensors (two solid-state image arrays using a charge injection technique), a video-rate analog-to-digital converter, the RAPID memory, various types of computer-controlled displays, and preprocessing equipment (for reflexive actions, processing aids, and object detection). The software is aimed at locating objects and determining traversability. An object-tracking algorithm is discussed and it is noted that tracking speed is in the 50-75 pixels/s range.
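As a hedged illustration of the depth-from-disparity relation any such stereo system ultimately relies on (the camera numbers are hypothetical, not the RAPID system's geometry):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from a rectified stereo pair: Z = f * B / d, with focal length f
    in pixels, baseline B in metres, and disparity d in pixels."""
    return focal_px * baseline_m / disparity_px

# Hypothetical camera geometry, purely for illustration: ~20 m range.
print(stereo_depth(disparity_px=12.0, focal_px=800.0, baseline_m=0.3))
```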
Analysis of localized fringes in the holographic optical Schlieren system
NASA Technical Reports Server (NTRS)
Kurtz, R. L.
1980-01-01
The relation between localization of interference fringes in classical and holographic interferometry is reviewed and an application of holographic interferometry is considered for which the object is a transparent medium with nonhomogeneous refractive index. The technique is based on the analysis of the optical path length change of the object wave as it propagates through a transparent medium. Phase shifts due to variations of the speed of light within the medium give rise to an interference pattern. The resulting interferogram can be used to determine the physical properties of the medium or transparent object. Such properties include the mass density of fluids, electron densities of plasmas, the temperature of fluids, the chemical species concentration of fluids, and the state of stress in solids. The optical wave used can be either a simple plane or spherical wave, or it may be a complicated spatial wave scattered by a diffusing screen. The mathematical theory on the formation and analysis of localized fringes, the general theoretical concepts used, and a computer code for analysis are included along with the inversion of fringe order data.
NASA Technical Reports Server (NTRS)
Hablani, H. B.
1985-01-01
Real disturbances and real sensors have finite bandwidths. The first objective of this paper is to incorporate this finiteness in the 'open-loop modal cost analysis' as applied to a flexible spacecraft. Analysis based on residue calculus shows that among other factors, significance of a mode depends on the power spectral density of disturbances and the response spectral density of sensors at the modal frequency. The second objective of this article is to compare performances of an optimal and a suboptimal output feedback controller, the latter based on 'minimum error excitation' of Kosut. Both the performances are found to be nearly the same, leading us to favor the latter technique because it entails only linear computations. Our final objective is to detect an instability due to truncated modes by representing them as a multiplicative and an additive perturbation in a nominal transfer function. In an example problem it is found that this procedure leads to a narrow range of permissible controller gains, and that it labels a wrong mode as a cause of instability. A free beam is used to illustrate the analysis in this work.
Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.
2016-09-16
The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. We performed an objective review of the state of practice and emerging application of analytical techniques for nuclear forensic analysis based upon the outcome of this most recent exercise.
Pattern recognition of satellite cloud imagery for improved weather prediction
NASA Technical Reports Server (NTRS)
Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.
1986-01-01
The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples
Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma
2014-01-01
In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, on one side, toward automation and online coupling of sample preparation units. The primary objective of this review is to present the recent developments in microextraction sample preparation methods for analysis of drugs in biological fluids. Microextraction techniques allow for less consumption of solvent, reagents, and packing materials, and small sample volumes can be used. In this review the use of solid phase microextraction (SPME), microextraction in packed sorbent (MEPS), and stir-bar sorptive extraction (SBSE) in drug analysis will be discussed. In addition, the use of new sorbents such as monoliths and molecularly imprinted polymers will be presented. PMID:24688797
A novel approach to segmentation and measurement of medical image using level set methods.
Chen, Yao-Tien
2017-06-01
The study proposes a novel approach to segmentation and visualization, with value-added surface area and volume measurements, for brain medical image analysis. The proposed method contains edge detection and Bayesian-based level set segmentation, surface and volume rendering, and surface area and volume measurements for 3D objects of interest (i.e., brain tumor, brain tissue, or whole brain). Two extensions based on edge detection and the Bayesian level set are first used to segment 3D objects. Ray casting and a modified marching cubes algorithm are then adopted to facilitate volume and surface visualization of the medical-image dataset. To provide physicians with more useful information for diagnosis, the surface area and volume of an examined 3D object are calculated by the techniques of linear algebra and surface integration. Experimental results are finally reported in terms of 3D object extraction, surface and volume rendering, and surface area and volume measurements for medical image analysis.
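A minimal sketch of surface-area and divergence-theorem volume measurement over a closed triangle mesh such as marching-cubes output; the unit tetrahedron is a toy stand-in for a segmented 3D object, not the paper's method verbatim:

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed, consistently outward-
    wound triangle mesh (e.g., marching-cubes output). Volume uses the
    divergence theorem: V = sum over triangles of v0 . (v1 x v2) / 6."""
    v = np.asarray(vertices, dtype=float)
    tri = v[np.asarray(faces)]                      # (n_faces, 3, 3)
    cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    volume = np.einsum("ij,ij->i", tri[:, 0],
                       np.cross(tri[:, 1], tri[:, 2])).sum() / 6.0
    return area, volume

# Unit right tetrahedron as a toy closed surface (outward winding).
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(mesh_area_volume(verts, faces))  # -> (~2.366, ~0.1667)
```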
Baldasso, Rosane Pérez; Tinoco, Rachel Lima Ribeiro; Vieira, Cristina Saft Matos; Fernandes, Mário Marques; Oliveira, Rogério Nogueira
2016-10-01
The process of forensic facial analysis may be founded on several scientific techniques and imaging modalities, such as digital signal processing, photogrammetry and craniofacial anthropometry. However, one of the main limitations in this analysis is the comparison of images acquired with different angles of incidence. The present study aimed to explore a potential approach for the correction of the planar perspective projection (PPP) in geometric structures traced from the human face. A technique for the correction of the PPP was calibrated within photographs of two geometric structures obtained with angles of incidence distorted to 80°, 60° and 45°. The technique was performed using ImageJ® 1.46r (National Institutes of Health, Bethesda, Maryland). The corrected images were compared with photographs of the same object obtained at 90° (reference). In a second step, the technique was validated on a digital human face created using the MakeHuman® 1.0.2 (Free Software Foundation, Massachusetts, USA) and Blender® 2.75 (Blender Foundation, Amsterdam, the Netherlands) software packages. The images registered with angular distortion presented a gradual decrease in height when compared to the reference. The digital technique for the correction of the PPP is a valuable tool for forensic applications using photographic imaging modalities, such as forensic facial analysis.
Qualitative computer aided evaluation of dental impressions in vivo.
Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H
2006-01-01
Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase, and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin and the mantle and occlusal surfaces). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis regarding the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.
Application of Bayesian Approach in Cancer Clinical Trial
Bhattacharjee, Atanu
2014-01-01
The application of the Bayesian approach in clinical trials is becoming more useful than the classical method. It is beneficial from the design phase through the analysis phase. A straightforward statement about the drug treatment effect can be obtained through the Bayesian approach. Complex computational problems are simple to handle with Bayesian techniques. The technique is feasible only in the presence of prior information about the data. Inference can be established through posterior estimates. However, some limitations are present in this method. The objective of this work was to explore the merits and demerits of the Bayesian approach in cancer research. This review of the technique will help clinical researchers involved in oncology to explore the limitations and power of Bayesian techniques. PMID:29147387
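As a hedged, minimal example of the posterior-estimate idea mentioned above, a conjugate Beta-Binomial update for a tumour response rate; all numbers are hypothetical, not from any trial:

```python
from scipy import stats

# Conjugate Beta-Binomial sketch: a prior belief about a response rate is
# updated by observed trial data to give the posterior directly.
a_prior, b_prior = 2, 8          # prior: response rate centred near 20%
responders, n = 14, 40           # hypothetical observed trial outcome

posterior = stats.beta(a_prior + responders, b_prior + n - responders)
print(f"posterior mean response rate: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```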
Image sequence analysis workstation for multipoint motion analysis
NASA Astrophysics Data System (ADS)
Mostafavi, Hassan
1990-08-01
This paper describes an application-specific engineering workstation designed and developed to analyze motion of objects from video sequences. The system combines the software and hardware environment of a modern graphics-oriented workstation with digital image acquisition, processing and display techniques. In addition to automation and increased throughput of data reduction tasks, the objective of the system is to provide less invasive methods of measurement by offering the ability to track objects that are more complex than reflective markers. Grey-level image processing and spatial/temporal adaptation of the processing parameters are used for location and tracking of more complex features of objects under uncontrolled lighting and background conditions. The applications of such an automated and noninvasive measurement tool include analysis of the trajectory and attitude of rigid bodies such as human limbs, robots, aircraft in flight, etc. The system's key features are: 1) acquisition and storage of image sequences by digitizing and storing real-time video; 2) computer-controlled movie loop playback, freeze-frame display, and digital image enhancement; 3) multiple leading-edge tracking in addition to object centroids at up to 60 fields per second from either live input video or a stored image sequence; 4) model-based estimation and tracking of the six degrees of freedom of a rigid body; 5) field-of-view and spatial calibration; 6) image sequence and measurement database management; and 7) offline analysis software for trajectory plotting and statistical analysis.
Higher Level Visual Cortex Represents Retinotopic, Not Spatiotopic, Object Location
Kanwisher, Nancy
2012-01-01
The crux of vision is to identify objects and determine their locations in the environment. Although initial visual representations are necessarily retinotopic (eye centered), interaction with the real world requires spatiotopic (absolute) location information. We asked whether higher level human visual cortex—important for stable object recognition and action—contains information about retinotopic and/or spatiotopic object position. Using functional magnetic resonance imaging multivariate pattern analysis techniques, we found information about both object category and object location in each of the ventral, dorsal, and early visual regions tested, replicating previous reports. By manipulating fixation position and stimulus position, we then tested whether these location representations were retinotopic or spatiotopic. Crucially, all location information was purely retinotopic. This pattern persisted when location information was irrelevant to the task, and even when spatiotopic (not retinotopic) stimulus position was explicitly emphasized. We also conducted a “searchlight” analysis across our entire scanned volume to explore additional cortex but again found predominantly retinotopic representations. The lack of explicit spatiotopic representations suggests that spatiotopic object position may instead be computed indirectly and continually reconstructed with each eye movement. Thus, despite our subjective impression that visual information is spatiotopic, even in higher level visual cortex, object location continues to be represented in retinotopic coordinates. PMID:22190434
A Taxonomy of 3D Occluded Objects Recognition Techniques
NASA Astrophysics Data System (ADS)
Soleimanizadeh, Shiva; Mohamad, Dzulkifli; Saba, Tanzila; Al-ghamdi, Jarallah Saleh
2016-03-01
The overall performance of object recognition techniques under different conditions (e.g., occlusion, viewpoint, and illumination) has improved significantly in recent years. New applications and hardware have shifted towards digital photography and digital media, and growing Internet usage requires object recognition for certain applications, particularly for occluded objects. However, occlusion remains an unhandled issue, as it entangles the relations between feature points extracted from an image; research is therefore ongoing to develop efficient techniques and easy-to-use algorithms that help users source images despite occlusion. The aim of this research is to review algorithms for recognizing occluded objects and to weigh their pros and cons for solving the occlusion problem, in which features extracted from an occluded object must distinguish it from other co-existing objects, and to identify new techniques that can differentiate the occluded fragments and sections inside an image.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2 MeV linear-accelerator-based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Magnetic Braking: A Video Analysis
NASA Astrophysics Data System (ADS)
Molina-Bolívar, J. A.; Abella-Palacios, A. J.
2012-10-01
This paper presents a laboratory exercise that introduces students to the use of video analysis software through a demonstration of Lenz's law. Digital techniques have proved very useful for the understanding of physical concepts. In particular, the availability of affordable digital video offers students the opportunity to actively engage in kinematics in introductory-level physics [1,2]. By using a digital video's frame-advance feature and "marking" the position of a moving object in each frame, students are able to determine the position of an object at much smaller time increments than would be possible with common timing devices. Once the student collects data consisting of positions and times, these values may be manipulated to determine velocity and acceleration. There are a variety of commercial and free applications that can be used for video analysis. Because the relevant technology has become inexpensive, video analysis has become a prevalent tool in introductory physics courses.
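A minimal sketch of the data-reduction step described above: positions marked frame by frame are numerically differentiated to obtain velocity and acceleration. The frame rate and positions below are invented (a free-fall example), not data from the exercise itself.

```python
# Differentiate marked positions from a video to get velocity/acceleration.
# Positions here are synthetic free-fall data for illustration.
import numpy as np

fps = 30.0                                  # assumed camera frame rate
t = np.arange(0, 1.0, 1.0 / fps)            # time stamps of the marked frames
y = 0.5 * 9.81 * t**2                       # example positions (m)

v = np.gradient(y, t)                       # central-difference velocity (m/s)
a = np.gradient(v, t)                       # acceleration (m/s^2)
print("mean acceleration ~ g:", a.mean())
```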
Sensor planning for moving targets
NASA Astrophysics Data System (ADS)
Musman, Scott A.; Lehner, Paul; Elsaesser, Chris
1994-10-01
Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability of detection assessments with computational search heuristics to generate sensor plans which approximately maximize either the probability of detection or a user-specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to supercomputer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation as well as the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus attention on the sensor reports most pertinent to current needs. Our planning approach is implemented in a three-layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. Using these layers, it is possible to describe the physical, spatial, and temporal characteristics of a scenario in the first two layers, and customize the final analysis to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.
Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.
2010-01-01
The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
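The sketch below illustrates, in simplified form, the Gabor-wavelet representation the study found most effective: an image patch is convolved with a small bank of complex Gabor kernels and the response magnitudes are kept as features. The kernel parameters and the random stand-in patch are assumptions for illustration, not the paper's exact configuration.

```python
# Gabor filter-bank feature extraction sketch; parameters are illustrative.
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma=3.0, size=21):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)          # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))  # Gaussian envelope
    return envelope * np.exp(2j * np.pi * freq * xr)    # complex carrier

def gabor_features(image, freqs=(0.1, 0.2), n_orient=4):
    feats = []
    for f in freqs:
        for k in range(n_orient):
            kern = gabor_kernel(f, theta=k * np.pi / n_orient)
            resp = fftconvolve(image, kern, mode="same")
            feats.append(np.abs(resp).mean())           # magnitude response
    return np.array(feats)

patch = np.random.rand(64, 64)   # stand-in for a face image patch
print(gabor_features(patch))
```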
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
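The rule stated above maps directly onto test selection. Below is a minimal, illustrative dispatch (not a complete decision procedure) pairing each level of measurement with one common test from scipy; the sample data are invented.

```python
# Level-of-measurement -> statistical technique, as a toy dispatch.
from scipy import stats

def compare_two_groups(a, b, level):
    if level == "nominal":
        # a, b are category counts -> chi-square test of independence
        chi2, p, dof, expected = stats.chi2_contingency([a, b])
        return "chi-square", p
    if level == "ordinal":
        # rankable data -> nonparametric Mann-Whitney U
        return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue
    # interval or ratio -> parametric t test
    return "t test", stats.ttest_ind(a, b).pvalue

print(compare_two_groups([3, 1, 4, 5], [2, 2, 3, 1], "ordinal"))
```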
Potvin, Christopher M; Zhou, Hongde
2011-11-01
The objective of this study was to demonstrate the complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides, in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA greatly improved compensation for SMP recovery, and thus measurement accuracy, by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed with extraction technique, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1 d. Copyright © 2011 Elsevier Ltd. All rights reserved.
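A minimal sketch of the standard addition calculation: the response is regressed on the spiked analyte amount, and the unknown concentration is read from the magnitude of the x-intercept. The responses below are invented for illustration.

```python
# Standard addition (SA): signal = slope * (c_unknown + spike), so the
# x-intercept is -c_unknown. Data are synthetic.
import numpy as np

spike = np.array([0.0, 5.0, 10.0, 15.0])      # added analyte (mg/L)
signal = np.array([0.21, 0.40, 0.58, 0.77])   # measured response of spiked sample

slope, intercept = np.polyfit(spike, signal, 1)
c_unknown = intercept / slope                 # |x-intercept| = original concentration
print(f"estimated sample concentration: {c_unknown:.2f} mg/L")
```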
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The objectives are to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use, providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order not to arrive at erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
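For readers wanting a concrete starting point, the sketch below runs a small EFA using scikit-learn's FactorAnalysis with varimax rotation on simulated two-factor data; it is one possible implementation, not the workflow prescribed by the article.

```python
# Toy exploratory factor analysis on simulated two-factor data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))                   # two hidden factors
loadings = rng.normal(size=(2, 6))                   # true item loadings
items = latent @ loadings + 0.3 * rng.normal(size=(200, 6))  # six observed items

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print("estimated loadings (items x factors):\n", fa.components_.T)
```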
Fast and objective detection and analysis of structures in downhole images
NASA Astrophysics Data System (ADS)
Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick
2017-09-01
Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour-intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, to improve efficiency and to assist, rather than replace, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis and further detection of structures, e.g., limited to specific orientations.
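A hedged sketch of the core geometric idea behind sinusoid detection: a planar structure intersecting a cylindrical borehole unwraps to a sinusoid in depth versus azimuth, so a candidate structure can be fit by linear least squares on a sine/cosine basis. The picked points below are synthetic, and this is not the authors' detection algorithm itself.

```python
# Fit depth = offset + a*sin(az) + b*cos(az) to synthetic unwrapped picks.
import numpy as np

az = np.linspace(0, 2 * np.pi, 36, endpoint=False)   # unwrapped azimuths (rad)
depth = 120.0 + 0.4 * np.sin(az + 0.8) + 0.02 * np.random.randn(az.size)

A = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az)])
offset, a, b = np.linalg.lstsq(A, depth, rcond=None)[0]

amplitude = np.hypot(a, b)    # relates to structure dip for a known hole radius
phase = np.arctan2(b, a)      # relates to dip azimuth
print(f"offset={offset:.2f} m, amplitude={amplitude:.2f} m, phase={phase:.2f} rad")
```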
The Systems Analysis Approach to Satellite Education in Brazil.
ERIC Educational Resources Information Center
Cusack, Mary Ann
The SACI project in Brazil has as a main target the country's primary teachers. The SACI Project objectives are: (1) to test the efficiency of an educational program using audiovisual media (particularly television, radio, and slow scan) at the primary level; (2) to develop television production techniques; (3) to train teachers in the utilization…
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
ERIC Educational Resources Information Center
Hoz, Ron; Bowman, Dan; Chacham, Tova
1997-01-01
Students (N=14) in a geomorphology course took an objective geomorphology test, the tree construction task, and the Standardized Concept Structuring Analysis Technique (SConSAT) version of concept mapping. Results suggest that the SConSAT knowledge structure dimensions have moderate to good construct validity. Contains 82 references. (DDR)
Using the Technique of Journal Writing to Learn Emergency Psychiatry
ERIC Educational Resources Information Center
Bhuvaneswar, Chaya; Stern, Theodore; Beresin, Eugene
2009-01-01
Objective: The authors discuss journal writing in learning emergency psychiatry. Methods: The journal of a psychiatry intern rotating through an emergency department is used as sample material for analysis that could take place in supervision or a resident support group. A range of articles are reviewed that illuminate the relevance of journal…
Wavelet-based hierarchical surface approximation from height fields
Sang-Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt
2004-01-01
This paper presents a novel hierarchical approach to triangular mesh generation from height fields. A wavelet-based multiresolution analysis technique is used to estimate local shape information at different levels of resolution. Using predefined templates at the coarsest level, the method constructs an initial triangulation in which underlying object shapes are well...
The radiographic investigation of two Egyptian mummies.
Fodor, J; Malott, J C; King, A Y
1983-01-01
Radiography is a well-recognized method of nondestructive analysis of art objects and ancient relics. The methods and techniques used in the examination of two ancient Egyptian mummies are presented here. Additionally, the use of radiographic findings to help substantiate alleged historical information and to establish sex, age, and pathology of each specimen is discussed.
Studying Urban History through Oral History and Q Methodology: A Comparative Analysis.
ERIC Educational Resources Information Center
Jimenez, Rebecca S.
Oral history and Q methodology (a social science technique designed to document objectively and numerically the reactions of individuals to selected issues) were used to investigate urban renewal in Waco, Texas. Nineteen persons directly involved in the city's relocation and rehabilitation projects granted interviews. From these oral histories, 70…
Methods for magnetic resonance analysis using magic angle technique
Hu, Jian Zhi [Richland, WA; Wind, Robert A [Kennewick, WA; Minard, Kevin R [Kennewick, WA; Majors, Paul D [Kennewick, WA
2011-11-22
Methods of performing a magnetic resonance analysis of a biological object are disclosed that include placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. In particular embodiments the method includes pulsing the radio frequency to provide at least two of a spatially selective read pulse, a spatially selective phase pulse, and a spatially selective storage pulse. Further disclosed methods provide pulse sequences that provide extended imaging capabilities, such as chemical shift imaging or multiple-voxel data acquisition.
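As a quick numerical check of the angle quoted in the claim, the magic angle is arccos(1/√3), the root of 3cos²θ − 1; the snippet below just verifies the 54°44' figure.

```python
# Verify the magic angle value quoted in the patent text.
import math

theta = math.degrees(math.acos(1.0 / math.sqrt(3.0)))
deg = int(theta)
minutes = (theta - deg) * 60
print(f"magic angle = {deg} deg {minutes:.0f} min")   # ~54 deg 44 min
```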
Quantification of oxidation on the surface of a polymer through photography
NASA Astrophysics Data System (ADS)
Yáñez M., J.; Estrada M., A.
2009-09-01
Oxidation in polymeric materials, especially polyurethane, manifests as a yellow color, highly visible in white shoe soles, and is accompanied by changes in the material's properties. Its importance varies according to the application for which the material was created. The most common way to detect this process is through a visual color change on the surface. In the present proposal we present a technique using digital photography for quantifying the color change in the polymer. The analysis of the photograph relies on projective geometry, which relates the plane of the object to the plane of its image. This allows the area of the studied object to be determined, and a histogram, computed each time, records the progress of oxidation on the surface of the material. We present results of the visual analysis and describe its behavior through a mathematical model.
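A hedged sketch of the pipeline described above, assuming OpenCV is available: a homography rectifies the object plane in the photograph, after which a simple yellowness index and histogram track the color change. The file name, corner coordinates, and yellow index are illustrative assumptions, not the authors' exact model.

```python
# Rectify the object plane with a homography, then histogram a yellow index.
import cv2
import numpy as np

img = cv2.imread("sole.jpg")                       # hypothetical photograph
if img is None:
    raise SystemExit("image not found")

src = np.float32([[102, 80], [530, 95], [515, 390], [90, 370]])  # object corners
dst = np.float32([[0, 0], [400, 0], [400, 300], [0, 300]])       # rectified plane (px)

H = cv2.getPerspectiveTransform(src, dst)
plane = cv2.warpPerspective(img, H, (400, 300))    # fronto-parallel view

b, g, r = cv2.split(plane.astype(np.float32))
yellowness = (r + g) / 2.0 - b                     # crude yellow index per pixel
hist, _ = np.histogram(yellowness, bins=64, range=(-255, 255))
print("mean yellowness:", float(yellowness.mean()))
```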
Near-Earth Object Astrometric Interferometry
NASA Technical Reports Server (NTRS)
Werner, Martin R.
2005-01-01
Using astrometric interferometry on near-Earth objects (NEOs) poses many interesting and difficult challenges. Poor reflectance properties and potentially no significant active emissions lead to NEOs having intrinsically low visual magnitudes. Using worst case estimates for signal reflection properties leads to NEOs having visual magnitudes of 27 and higher. Today the most sensitive interferometers in operation have limiting magnitudes of 20 or less. The main reason for this limit is due to the atmosphere, where turbulence affects the light coming from the target, limiting the sensitivity of the interferometer. In this analysis, the interferometer designs assume no atmosphere, meaning they would be placed at a location somewhere in space. Interferometer configurations and operational uncertainties are looked at in order to parameterize the requirements necessary to achieve measurements of low visual magnitude NEOs. This analysis provides a preliminary estimate of what will be required in order to take high resolution measurements of these objects using interferometry techniques.
Remote Safety Monitoring for Elderly Persons Based on Omni-Vision Analysis
Xiang, Yun; Tang, Yi-ping; Ma, Bao-qing; Yan, Hang-chen; Jiang, Jun; Tian, Xu-yuan
2015-01-01
Remote monitoring service for elderly persons is important as the aged populations in most developed countries continue growing. To monitor the safety and health of the elderly population, we propose a novel omni-directional vision sensor based system, which can detect and track object motion, recognize human posture, and analyze human behavior automatically. In this work, we have made the following contributions: (1) we develop a remote safety monitoring system which can provide real-time and automatic health care for elderly persons and (2) we design a novel motion history/energy image based algorithm for moving object tracking. Our system can accurately and efficiently collect, analyze, and transfer elderly activity information and provide health care in real time. Experimental results show that our technique can improve the data analysis efficiency by 58.5% for object tracking. Moreover, for the human posture recognition application, the success rate can reach 98.6% on average. PMID:25978761
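A minimal numpy sketch of the motion history image (MHI) idea referred to above: pixels where motion is detected are stamped with the current time, and stamps older than a fixed duration fade out. The frames here are random stand-ins, not the authors' algorithm or data.

```python
# Motion history image: stamp moving pixels with the timestamp, expire old ones.
import numpy as np

def update_mhi(mhi, motion_mask, timestamp, duration):
    mhi[motion_mask] = timestamp                 # stamp fresh motion
    mhi[mhi < timestamp - duration] = 0          # forget motion older than duration
    return mhi

h, w, duration = 120, 160, 1.0
mhi = np.zeros((h, w), dtype=np.float32)
prev = np.random.rand(h, w)
for t in np.arange(0.1, 2.0, 0.1):               # fake 10 Hz frame stream
    frame = np.random.rand(h, w)
    motion = np.abs(frame - prev) > 0.9          # threshold frame difference
    mhi = update_mhi(mhi, motion, t, duration)
    prev = frame
print("active motion pixels:", int((mhi > 0).sum()))
```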
Post-processing of auditory steady-state responses to correct spectral leakage.
Felix, Leonardo Bonato; de Sá, Antonio Mauricio Ferreira Leite Miranda; Mendes, Eduardo Mazoni Andrade Marçal; Moraes, Márcio Flávio Dutra
2009-06-30
Auditory steady-state responses (ASSRs) are electrical manifestations of the brain evoked by high-rate sound stimulation. These evoked responses can be used to assess the hearing capabilities of a subject in an objective, automatic fashion. Usually, the detection protocol is accomplished by frequency-domain techniques, such as magnitude-squared coherence, whose estimation is based on the fast Fourier transform (FFT) of several data segments. In practice, the FFT-based spectrum may spread the energy of a given frequency into its side bins; this escape of energy in the spectrum is called spectral leakage. The distortion of the spectrum due to leakage may severely compromise the statistical significance of objective detection. This work presents an offline, a posteriori method for spectral leakage minimization in the frequency-domain analysis of ASSRs using a coherent sampling criterion and interpolation in time. The technique was applied to the local field potentials of 10 Wistar rats and the results, together with those from simulated data, indicate that a leakage-free analysis of ASSRs is possible for any dataset if the methods shown in this paper are followed.
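A hedged sketch of the coherent-sampling idea: the record is trimmed and resampled in time so the epoch spans an integer number of stimulation cycles, which places the response exactly on an FFT bin and suppresses leakage. The signal and rates below are synthetic, not the paper's exact procedure.

```python
# Resample an epoch onto a coherent grid (integer cycles) before the FFT.
import numpy as np
from scipy.signal import resample

fs, f_stim, n = 1000.0, 80.7, 4096            # sample rate, ASSR frequency, samples
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_stim * t) + 0.5 * np.random.randn(n)

cycles = int(np.floor(f_stim * n / fs))       # whole stimulation cycles in the epoch
m = int(round(cycles / f_stim * fs))          # samples spanning exactly those cycles
x_coh = resample(x[:m], n)                    # interpolate onto an n-point coherent grid

X = np.abs(np.fft.rfft(x_coh))
print("peak bin:", X.argmax(), "expected bin:", cycles)
```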
Introducing Risk Management Techniques Within Project Based Software Engineering Courses
NASA Astrophysics Data System (ADS)
Port, Daniel; Boehm, Barry
2002-03-01
In 1996, USC moved its core two-semester software engineering course away from a hypothetical-project, homework-and-exam format based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.
Simulation/Emulation Techniques: Compressing Schedules With Parallel (HW/SW) Development
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Hoang, June
2014-01-01
NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's Kedalion engineering analysis lab has been validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture. Kedalion has validated many of the Orion HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, commercial-off-the-shelf (COTS) products, early rapid prototyping, in-house expertise and tools, and extensive use of simulators and emulators, NASA has achieved cost-effective paradigms that are currently serving the Orion program effectively. Elements of long-lead custom hardware on the Orion program have necessitated early use of simulators and emulators in advance of deliverable hardware to achieve parallel design and development on a compressed schedule.
Hergovich, Andreas; Gröbl, Kristian; Carbon, Claus-Christian
2011-01-01
Following Gustav Kuhn's inspiring technique of using magicians' acts as a source of insight into cognitive sciences, we used the 'paddle move' for testing the psychophysics of combined movement trajectories. The paddle move is a standard technique in magic consisting of a combined rotating and tilting movement. Careful control of the mutual speed parameters of the two movements makes it possible to inhibit the perception of the rotation, letting the 'magic' effect emerge: a sudden change of the tilted object. By using 3-D animated computer graphics we analysed the interaction of different angular speeds and the object shape/size parameters in evoking this motion disappearance effect. An angular speed of 540° s⁻¹ (1.5 rev s⁻¹) sufficed to inhibit the perception of the rotary movement, with the smallest object showing the strongest effect. 90.7% of the 172 participants were not able to perceive the rotary movement at an angular speed of 1125° s⁻¹ (3.125 rev s⁻¹). Further analysis by multiple linear regression revealed major influences of object height and object area on the effectiveness of the magic trick, demonstrating the applicability of analysing key factors of magic tricks to reveal limits of the perceptual system.
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
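As an illustration of the first of the three techniques, control charting, the sketch below flags readings outside mean ± 3σ limits computed from an assumed in-control window; the data are synthetic, not AGR measurements.

```python
# Shewhart-style control chart limits for a thermocouple stream (synthetic).
import numpy as np

readings = np.random.normal(loc=1123.0, scale=4.0, size=500)   # fake temps (K)
readings[450] = 1160.0                                          # injected fault

baseline = readings[:200]                    # assumed in-control period
center = baseline.mean()
sigma = baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

out = np.flatnonzero((readings > ucl) | (readings < lcl))
print(f"control limits: [{lcl:.1f}, {ucl:.1f}] K; flagged samples: {out}")
```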
A comparison of algorithms for inference and learning in probabilistic graphical models.
Frey, Brendan J; Jojic, Nebojsa
2005-09-01
Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
NASA Astrophysics Data System (ADS)
Setiyoko, A.; Dharma, I. G. W. S.; Haryanto, T.
2017-01-01
Multispectral and hyperspectral data acquired from satellite sensors have the ability to detect various objects on the earth, ranging from low-scale to high-scale modeling. These data are increasingly being used to produce geospatial information for rapid analysis by running feature extraction or classification processes. Applying the best-suited model for this data mining is still challenging because of issues regarding accuracy and computational cost. The aim of this research is to develop a better understanding of object feature extraction and classification applied to satellite imagery by systematically reviewing related recent research projects. The method used in this research is based on the PRISMA statement. After deriving important points from trusted sources, pixel-based and texture-based feature extraction techniques emerge as promising techniques to be analyzed further in recent developments in feature extraction and classification.
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2012-11-01
Fairly reasonable results were obtained for non-linear engineering problems in the past using optimization techniques such as neural networks, genetic algorithms, and fuzzy logic independently. Increasingly, hybrid techniques are being used to solve non-linear problems to obtain better output. This paper discusses the use of a neuro-genetic hybrid technique to optimize geological structure mapping, known as a seismic survey. It involves the minimization of an objective function subject to geophysical and operational constraints. In this work, the optimization was initially performed using genetic programming, followed by hybrid neuro-genetic programming approaches. Comparative studies and analysis were then carried out on the optimized results. The results indicate that the hybrid neuro-genetic technique produced better results compared to the stand-alone genetic programming method.
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil from biological and pharmaceutical samples. The study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices in real samples such as blood plasma and serum.
The monocular visual imaging technology model applied in the airport surface surveillance
NASA Astrophysics Data System (ADS)
Qin, Zhe; Wang, Jian; Huang, Chao
2013-08-01
At present, civil aviation airports use surface surveillance radar monitoring and positioning systems to monitor aircraft, vehicles and other moving objects. Surface surveillance radars can cover most of the airport scene, but because of the geometry of terminals, covered bridges and other buildings, surface surveillance radar systems inevitably have some small blind spots. This paper presents a monocular vision imaging technology model for airport surface surveillance, achieving the perception of moving objects in the scene, such as aircraft, vehicles and personnel, and their locations. This new model provides an important complement to airport surface surveillance that differs from traditional surface surveillance radar techniques. Such a technique not only provides a clear picture of object activity for the ATC, but also provides image recognition and positioning of moving targets in the area. Thereby it can improve the efficiency of airport operations and help avoid conflicts between aircraft and vehicles. This paper first introduces the monocular visual imaging technology model applied to airport surface surveillance and then analyzes the measurement accuracy of the model. The monocular visual imaging technology model is simple, low cost, and highly efficient. It is an advanced monitoring technique which can cover the blind spots of surface surveillance radar monitoring and positioning systems.
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
General Aviation Interior Noise. Part 3; Noise Control Measure Evaluation
NASA Technical Reports Server (NTRS)
Unruh, James F.; Till, Paul D.; Palumbo, Daniel L. (Technical Monitor)
2002-01-01
The work reported herein is an extension to the work accomplished under NASA Grant NAG1-2091 on the development of noise/source/path identification techniques for single engine propeller driven General Aviation aircraft. The previous work developed a Conditioned Response Analysis (CRA) technique to identify potential noise sources that contributed to the dominating tonal responses within the aircraft cabin. The objective of the present effort was to improve and verify the findings of the CRA and develop and demonstrate noise control measures for single engine propeller driven General Aviation aircraft.
Advanced computational tools for 3-D seismic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, J.; Glover, C.W.; Protopopescu, V.A.
1996-06-01
The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.
Dos Santos, Wellington P; de Assis, Francisco M; de Souza, Ricardo E; Dos Santos Filho, Plinio B
2008-01-01
Alzheimer's disease is the most common cause of dementia, yet hard to diagnose precisely without invasive techniques, particularly at the onset of the disease. This work approaches image analysis and classification of synthetic multispectral images composed of diffusion-weighted (DW) magnetic resonance (MR) cerebral images for the evaluation of cerebrospinal fluid area and for measuring the advance of Alzheimer's disease. A clinical 1.5 T MR imaging system was used to acquire all images presented. The classification methods are based on Objective Dialectical Classifiers, a new method based on Dialectics as defined in the Philosophy of Praxis. A 2-degree polynomial network with supervised training is used to generate the ground truth image. The classification results are used to improve the usual analysis of the apparent diffusion coefficient map.
Single Cell and Population Level Analysis of HCA Data.
Novo, David; Ghosh, Kaya; Burke, Sean
2018-01-01
High Content Analysis instrumentation has undergone tremendous hardware advances in recent years. It is now possible to obtain images of hundreds of thousands to millions of individual objects, across multiple wells, channels, and plates, in a reasonable amount of time. In addition, it is possible to extract dozens, or hundreds, of features per object using commonly available software tools. Analyzing these data presents new challenges to scientists. The magnitude of these numbers is reminiscent of flow cytometry, where practitioners have been taking what effectively amount to very low resolution, multi-parametric measurements from individual cells for many decades. Flow cytometrists have developed a wide range of tools to effectively analyze and interpret these types of data. This chapter will review the techniques used in flow cytometry and show how they can easily and effectively be applied to High Content Analysis.
NASA Astrophysics Data System (ADS)
Marston, B. K.; Bishop, M. P.; Shroder, J. F.
2009-12-01
Digital terrain analysis of mountain topography is widely utilized for mapping landforms, assessing the role of surface processes in landscape evolution, and estimating the spatial variation of erosion. Numerous geomorphometry techniques exist to characterize terrain surface parameters, although their utility for characterizing the spatial hierarchical structure of the topography and permitting an assessment of erosion/tectonic impact on the landscape is very limited due to scale and data integration issues. To address this problem, we apply scale-dependent geomorphometric and object-oriented analyses to characterize the hierarchical spatial structure of mountain topography. Specifically, we utilized a high resolution digital elevation model to characterize complex topography in the Shimshal Valley in the Western Himalaya of Pakistan. To accomplish this, we generate terrain objects (geomorphological features and landforms) including valley floors and walls, drainage basins, drainage network, ridge network, slope facets, and elemental forms based upon curvature. Object-oriented analysis was used to characterize object properties accounting for object size, shape, and morphometry. The spatial overlay and integration of terrain objects at various scales defines the nature of the hierarchical organization. Our results indicate that variations in the spatial complexity of the terrain hierarchical organization are related to the spatio-temporal influence of surface processes and landscape evolution dynamics. Terrain segmentation and the integration of multi-scale terrain information permit further assessment of process domains and erosion, tectonic impact potential, and natural hazard potential. We demonstrate this with landform mapping and geomorphological assessment examples.
Laboratory techniques and rhythmometry
NASA Technical Reports Server (NTRS)
Halberg, F.
1973-01-01
Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.
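A minimal sketch of one such specialized time-series analysis, the single-component cosinor fit associated with chronobiology: the mesor, amplitude, and acrophase of a circadian variable are recovered by linear least squares on a cosine/sine basis. The time series below is synthetic, not autorhythmometry data.

```python
# Single-component cosinor fit of a synthetic circadian series.
import numpy as np

period = 24.0                                    # hours (circadian)
t = np.arange(0, 72, 0.5)                        # three days, half-hour sampling
y = 80 + 12 * np.cos(2 * np.pi * (t - 16) / period) + np.random.randn(t.size)

w = 2 * np.pi * t / period
A = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
mesor, beta, gamma = np.linalg.lstsq(A, y, rcond=None)[0]

amplitude = np.hypot(beta, gamma)
acrophase_h = (np.arctan2(gamma, beta) * period / (2 * np.pi)) % period
print(f"mesor={mesor:.1f}, amplitude={amplitude:.1f}, acrophase={acrophase_h:.1f} h")
```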
Experimental research of digital holographic microscopic measuring
NASA Astrophysics Data System (ADS)
Zhu, Xueliang; Chen, Feifei; Li, Jicheng
2013-06-01
Digital holography is a new imaging technique developed on the basis of optical holography, digital processing, and computer techniques. It uses a CCD instead of conventional silver halide film to record the hologram, and then reproduces the 3D contour of the object by computer simulation. Compared with traditional optical holography, the whole process offers simple measurement, lower production cost, faster imaging, and the advantage of non-contact real-time measurement. At present, it can be used in the fields of morphology detection of tiny objects, micro-deformation analysis, and measurement of the shape of biological cells. It is a research hotspot at home and abroad. This paper introduces the basic principles and relevant theories of optical holography and digital holography, and investigates the basic questions that influence the reproduced images in the process of recording and reconstructing in digital holographic microscopy. In order to obtain a clear digital hologram, we analyzed the optical system structure and discussed the recording distance of the hologram. On the basis of the theoretical studies, we established a measurement system, analyzed the experimental conditions, and adjusted the system accordingly. To achieve precise three-dimensional measurement of a tiny object, we measured a MEMS micro-device as an example, obtained the reproduced three-dimensional contour, and realized three-dimensional profile measurement of a tiny object. Based on the experimental results, we analyzed the factors affecting the measurement, including the zero-order term and the pair of twin images, the choice of the object light and the reference light, the recording and reconstruction distances, and the characteristics of the reconstruction light, and the measurement errors were analyzed. The research results show that the device has a certain reliability.
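A hedged sketch of one common numerical reconstruction used in digital holography, the angular spectrum method; the hologram is random stand-in data, and the wavelength, pixel pitch, and propagation distance are illustrative assumptions rather than the experiment's actual parameters.

```python
# Angular spectrum propagation of a recorded hologram (synthetic data).
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field a distance z (units match wavelength, dx)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # drop evanescent part
    H = np.exp(1j * kz * z)                      # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

hologram = np.random.rand(512, 512)              # stand-in for a CCD hologram
recon = angular_spectrum(hologram, wavelength=632.8e-9, dx=4.65e-6, z=0.05)
intensity = np.abs(recon) ** 2                   # reconstructed object intensity
print(intensity.mean())
```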
Application of GEM-based detectors in full-field XRF imaging
NASA Astrophysics Data System (ADS)
Dąbrowski, W.; Fiutowski, T.; Frączek, P.; Koperny, S.; Lankosz, M.; Mendys, A.; Mindur, B.; Świentek, K.; Wiącek, P.; Wróbel, P. M.
2016-12-01
X-ray fluorescence spectroscopy (XRF) is a commonly used technique for non-destructive elemental analysis of cultural heritage objects. It can be applied to investigations of the provenance of historical objects as well as to studies of art techniques. While XRF analysis can easily be performed locally using standard available equipment, there is a growing interest in imaging the spatial distribution of specific elements. Spatial imaging of elemental distributions is usually realised by scanning an object with a narrow focused X-ray excitation beam and measuring the characteristic fluorescence radiation using a high energy resolution detector, usually a silicon drift detector. Such a technique, called macro-XRF imaging, is suitable for investigation of flat surfaces but is time consuming, because the spatial resolution is basically determined by the spot size of the beam. Another approach is full-field XRF, which is based on simultaneous irradiation and imaging of a large area of an object. The image of the investigated area is projected by a pinhole camera onto a position-sensitive and energy dispersive detector. The infinite depth of field of the pinhole camera allows one, in principle, to investigate non-flat surfaces. One possible detector for full-field XRF imaging is a GEM-based detector with 2-dimensional readout. In the paper we report on the development of an imaging system equipped with a standard 3-stage GEM detector of 10 × 10 cm2 with readout electronics based on dedicated full-custom ASICs and a DAQ system. With a demonstrator system we have obtained 2-D spatial resolution of the order of 100 μm and energy resolution at a level of 20% FWHM at 5.9 keV. Limitations of such a detector due to copper fluorescence radiation excited in the copper-clad drift electrode and GEM foils are discussed, and the performance of the detector using chromium-clad electrodes is reported.
NASA Astrophysics Data System (ADS)
Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun
2014-11-01
This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.
Integrating end-to-end threads of control into object-oriented analysis and design
NASA Technical Reports Server (NTRS)
Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.
1993-01-01
Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.
Computational Control of Flexible Aerospace Systems
NASA Technical Reports Server (NTRS)
Sharpe, Lonnie, Jr.; Shen, Ji Yao
1994-01-01
The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modelling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Codes.
Basic research planning in mathematical pattern recognition and image analysis
NASA Technical Reports Server (NTRS)
Bryant, J.; Guseman, L. F., Jr.
1981-01-01
Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.
2004-06-01
obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION The key...an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of 'calibration' in... (Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001) in which confidence ratings were subjected to SDT analysis to evaluate the
NASA Technical Reports Server (NTRS)
Trenchard, M. H. (Principal Investigator)
1980-01-01
Procedures and techniques for providing analyses of meteorological conditions at segments during the growing season were developed for the U.S./Canada Wheat and Barley Exploratory Experiment. The main product and analysis tool is the segment-level climagraph, which temporally depicts meteorological variables for the current year compared with climatological normals. The variable values for the segment are estimates derived through objective analysis of values obtained at first-order stations in the region. The procedures and products documented represent a baseline for future Foreign Commodity Production Forecasting experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-06-01
The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar), considering technical and economic benefits. The objective of the analysis was to develop guidelines for: reducing energy requirements; reducing conventional fuel use; and identifying economic alternatives for building owners. The analysis was done with a solar system in place, which makes the study unique in that it determines the interaction of energy conservation with a solar system. The study therefore established guidelines on how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. Finally, the lists of energy-conservation techniques and alternative energy sources were reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and installed costs determined. The alternative energy sources reduced to solar. Building construction characteristics were defined for each building for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)
NASA Astrophysics Data System (ADS)
Lee, Yu-Cheng; Yen, Tieh-Min; Tsai, Chih-Hung
This study provides an integrated model of Supplier Quality Performance Assessment (SQPA) activity for the semiconductor industry by introducing the ISO 9001 management framework, Importance-Performance Analysis (IPA), and Taguchi's Signal-to-Noise Ratio (S/N) techniques. This integrated model provides an SQPA methodology to create value for all members under mutual cooperation and trust in the supply chain. The method helps organizations build a complete SQPA framework, linking organizational objectives and SQPA activities and optimizing rating techniques to promote supplier quality improvement. The techniques used in SQPA activities are easily understood. A case involving a design house is illustrated to demonstrate our model.
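A minimal sketch of the Taguchi S/N computation used in the rating step, assuming the "larger-the-better" form for a supplier quality score; the supplier names and scores are invented for illustration.

```python
# Taguchi larger-the-better S/N ratio: rewards high and consistent scores.
import numpy as np

def sn_larger_the_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

suppliers = {"A": [92, 95, 90], "B": [97, 96, 98], "C": [85, 99, 70]}
for name, scores in suppliers.items():
    print(name, f"S/N = {sn_larger_the_better(scores):.2f} dB")
```

Note that supplier C, despite one high score, is penalized for inconsistency, which is exactly the behavior that makes the S/N ratio attractive as a rating statistic.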
Towards photometry pipeline of the Indonesian space surveillance system
NASA Astrophysics Data System (ADS)
Priyatikanto, Rhorom; Religia, Bahar; Rachman, Abdul; Dani, Tiar
2015-09-01
Optical observation through a sub-meter telescope equipped with a CCD camera has become an alternative method for increasing orbital debris detection and surveillance. This observational mode is expected to cover medium-sized objects in higher orbits (e.g. MEO, GTO, GSO and GEO), beyond the reach of the usual radar systems. However, such observation of fast-moving objects demands special treatment and analysis techniques. In this study, we performed photometric analysis of satellite track images photographed using the rehabilitated Schmidt Bima Sakti telescope at Bosscha Observatory. The Hough transformation was implemented to automatically detect linear streaks in the images. From this analysis and comparison to the USSPACECOM catalog, two satellites were identified and associated with the inactive Thuraya-3 satellite and Satcom-3 debris, which are located in geostationary orbit. Further aperture photometry analysis revealed the periodicity of the tumbling Satcom-3 debris. In the near future, a similar scheme may be applied to establish an analysis pipeline for an optical space surveillance system hosted in Indonesia.
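Streak detection of the kind described above is commonly implemented as an edge map followed by a probabilistic Hough transform. Below is a minimal Python sketch using scikit-image; the file name and thresholds are illustrative assumptions, not the parameters of the Bosscha pipeline.

```python
# Minimal streak-detection sketch: stars are point-like, satellite trails
# are long edges, so long Hough segments flag candidate streaks.
import numpy as np
from skimage import io
from skimage.feature import canny
from skimage.transform import probabilistic_hough_line

frame = io.imread("frame.png", as_gray=True)   # hypothetical CCD frame

edges = canny(frame, sigma=2.0)                # edge map of the frame
segments = probabilistic_hough_line(edges, threshold=10,
                                    line_length=100, line_gap=5)
for (x0, y0), (x1, y1) in segments:
    length = np.hypot(x1 - x0, y1 - y0)
    print(f"streak from ({x0},{y0}) to ({x1},{y1}), length {length:.0f} px")
```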
Clarifying Objectives and Results of Equivalent System Mass Analyses for Advanced Life Support
NASA Technical Reports Server (NTRS)
Levri, Julie A.; Drysdale, Alan E.
2003-01-01
This paper discusses some of the analytical decisions that an investigator must make during the course of a life support system trade study. Equivalent System Mass (ESM) is often applied to evaluate trade study options in the Advanced Life Support (ALS) Program. ESM can be used to identify which of several options that meet all requirements is most likely to have the lowest cost. It can also be used to identify which of the many interacting parts of a life support system have the greatest impact and sensitivity to assumptions. This paper summarizes recommendations made in the newly developed ALS ESM Guidelines Document and expands on some of the issues relating to trade studies that involve ESM. In particular, the following three points are expounded: 1) The importance of objectives: analysis objectives drive the approach to any trade study, including identification of assumptions, selection of characteristics to compare in the analysis, and the most appropriate techniques for reflecting those characteristics. 2) The importance of results interpretation: the accuracy desired in the results depends upon the analysis objectives, whereas the realized accuracy is determined by the data quality and degree of detail in the analysis methods. 3) The importance of analysis documentation: documentation of assumptions and data modifications is critical for effective peer evaluation of any trade study. ESM results are analysis-specific and should always be reported in context, rather than as solitary values. For this reason, results reporting should be done with adequate rigor to allow for verification by other researchers.
Aslam, Tariq Mehmood; Shakir, Savana; Wong, James; Au, Leon; Ashworth, Jane
2012-12-01
Mucopolysaccharidoses (MPS) can cause corneal opacification that is currently difficult to objectively quantify. With newer treatments for MPS comes an increased need for a more objective, valid and reliable index of disease severity for clinical and research use. Clinical evaluation by slit lamp is very subjective and techniques based on colour photography are difficult to standardise. In this article the authors present evidence for the utility of dedicated image analysis algorithms applied to images obtained by a highly sophisticated iris recognition camera that is small, manoeuvrable and adapted to achieve rapid, reliable and standardised objective imaging in a wide variety of patients while minimising artefactual interference in image quality.
NASA Astrophysics Data System (ADS)
Cagigal, Manuel P.; Valle, Pedro J.; Colodro-Conde, Carlos; Villó-Pérez, Isidro; Pérez-Garrido, Antonio
2016-01-01
Images of stars adopt shapes far from the ideal Airy pattern due to atmospheric density fluctuations. Hence, diffraction-limited images can only be achieved by telescopes free of atmospheric influence, e.g. space telescopes, or by using techniques like adaptive optics or lucky imaging. In this paper, we propose a new computational technique based on the evaluation of the COvariancE of Lucky Images (COELI). This technique allows us to discover companions to main stars by taking advantage of the atmospheric fluctuations. We describe the algorithm and carry out a theoretical analysis of the improvement in contrast. We have used images taken with the 2.2-m Calar Alto telescope as a test bed for the technique; the results show that, under certain conditions, the telescope diffraction limit is clearly reached.
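As a hedged illustration of the covariance idea behind COELI (not the authors' implementation), one can correlate every pixel of a cube of short exposures with the per-frame peak intensity of the main star; pixels hosting a companion brighten in sympathy with the star's lucky moments and stand out in the covariance map.

```python
# Toy covariance map over a stack of short exposures; the random cube
# is a stand-in for real lucky-imaging data.
import numpy as np

def coeli_map(cube):
    """cube: (n_frames, ny, nx) stack of short exposures."""
    peak = cube.reshape(len(cube), -1).max(axis=1)   # per-frame star peak
    mean_img = cube.mean(axis=0)
    mean_peak = peak.mean()
    # covariance of each pixel's time series with the peak time series
    cov = np.einsum('t,tij->ij', peak - mean_peak, cube - mean_img) / len(cube)
    return cov

rng = np.random.default_rng(0)
cube = rng.poisson(5.0, size=(500, 64, 64)).astype(float)
print(coeli_map(cube).shape)   # (64, 64)
```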
Hu, Fei; Cheng, Yayun; Gui, Liangqi; Wu, Liang; Zhang, Xinyi; Peng, Xiaohui; Su, Jinlong
2016-11-01
The polarization properties of thermal millimeter-wave emission capture inherent information of objects, e.g., material composition, shape, and surface features. In this paper, a polarization-based material-classification technique using passive millimeter-wave polarimetric imagery is presented. Linear polarization ratio (LPR) is created to be a new feature discriminator that is sensitive to material type and to remove the reflected ambient radiation effect. The LPR characteristics of several common natural and artificial materials are investigated by theoretical and experimental analysis. Based on a priori information about LPR characteristics, the optimal range of incident angle and the classification criterion are discussed. Simulation and measurement results indicate that the presented classification technique is effective for distinguishing between metals and dielectrics. This technique suggests possible applications for outdoor metal target detection in open scenes.
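For illustration only, a toy LPR-based decision rule might look like the following, assuming LPR is taken as the ratio of vertically to horizontally polarized brightness temperatures; the margin and temperature values are invented, not the paper's calibrated criterion.

```python
# Metals reflect both polarizations almost equally (LPR ~ 1); dielectrics
# emit differently in V and H at oblique incidence, pushing LPR away from 1.
import numpy as np

def classify(tb_v, tb_h, margin=0.05):
    lpr = tb_v / tb_h
    return np.where(np.abs(lpr - 1.0) < margin, "metal", "dielectric"), lpr

materials, lpr = classify(np.array([251.0, 210.0]), np.array([249.0, 260.0]))
print(list(zip(materials, np.round(lpr, 3))))
```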
Matrix evaluation of science objectives
NASA Technical Reports Server (NTRS)
Wessen, Randii R.
1994-01-01
The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.
The analysis of crystallographic symmetry types in finite groups
NASA Astrophysics Data System (ADS)
Sani, Atikah Mohd; Sarmin, Nor Haniza; Adam, Nooraishikin; Zamri, Siti Norziahidayu Amzee
2014-06-01
Undeniably, it is human nature to prefer objects which are considered beautiful. Most consider beauty to be perfection, hence they try to create objects which are perfectly balanced in shape and pattern. This creates a whole different kind of art, the kind that requires an object to be symmetrical, and leads to the study of symmetrical objects and patterns. Even mathematicians and ethnomathematicians are very interested in the essence of symmetry. One such study was conducted on the Malay traditional triaxial weaving culture. The patterns derived from this technique are symmetrical, and this allows for further research. In this paper, the 17 symmetry types in a plane, known as the wallpaper groups, are studied and discussed. The wallpaper groups are then applied to the triaxial patterns of food covers in Malaysia.
Evaluating Remapped Physical Reach for Hand Interactions with Passive Haptics in Virtual Reality.
Han, Dustin T; Suhail, Mohamed; Ragan, Eric D
2018-04-01
Virtual reality often uses motion tracking to incorporate physical hand movements into interaction techniques for selection and manipulation of virtual objects. To increase realism and allow direct hand interaction, real-world physical objects can be aligned with virtual objects to provide tactile feedback and physical grasping. However, unless a physical space is custom configured to match a specific virtual reality experience, the ability to perfectly match the physical and virtual objects is limited. Our research addresses this challenge by studying methods that allow one physical object to be mapped to multiple virtual objects that can exist at different virtual locations in an egocentric reference frame. We study two such techniques: one that introduces a static translational offset between the virtual and physical hand before a reaching action, and one that dynamically interpolates the position of the virtual hand during a reaching motion. We conducted two experiments to assess how the two methods affect reaching effectiveness, comfort, and ability to adapt to the remapping techniques when reaching for objects with different types of mismatches between physical and virtual locations. We also present a case study to demonstrate how the hand remapping techniques could be used in an immersive game application to support realistic hand interaction while optimizing usability. Overall, the translational technique performed better than the interpolated reach technique and was more robust for situations with larger mismatches between virtual and physical objects.
Rotation Covariant Image Processing for Biomedical Applications
Reisert, Marco
2013-01-01
With the advent of novel biomedical 3D image acquisition techniques, the efficient and reliable analysis of volumetric images has become more and more important. The amount of data is enormous and demands automated processing. The applications are manifold, ranging from image enhancement, image reconstruction, and image description to object/feature detection and high-level contextual feature extraction. In most scenarios, it is expected that geometric transformations alter the output in a mathematically well-defined manner. In this paper we focus on 3D translations and rotations. Many algorithms rely on intensity or low-order tensorial-like descriptions to fulfill this demand. This paper proposes a general mathematical framework based on concepts and theories transferred from mathematical physics and harmonic analysis into the domain of image analysis and pattern recognition. Based on two basic operations, spherical tensor differentiation and spherical tensor multiplication, we show how to design a variety of 3D image processing methods in an efficient way. The framework has already been applied to several biomedical applications ranging from feature and object detection tasks to image enhancement and image restoration techniques. In this paper, the proposed methods are applied to a variety of different 3D data modalities stemming from the medical and biological sciences. PMID:23710255
Transit Spectroscopy: new data analysis techniques and interpretation
NASA Astrophysics Data System (ADS)
Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan
2014-11-01
Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude, nor that these planets resemble anything but the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, there are a few serious hurdles that need to be overcome to progress in this area: in particular, instrument systematics are often difficult to disentangle from the signal, and data are sparse and often not recorded simultaneously, causing degeneracy of interpretation. We present here new data analysis and interpretation techniques developed by the “ExoLights” team at UCL to address the above-mentioned issues. These techniques include statistical tools, non-parametric machine-learning algorithms, optimized radiative transfer models, and spectroscopic line-lists. The new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.
NASA Astrophysics Data System (ADS)
MacDonald, Brandi Lee
2017-01-01
When considered as an object, a painting consists of multiple components that, when analyzed together, have a unique story to tell about the artist, their practice, and the history of the work of art. Techniques traditionally applied in physics, including neutron-, x-radiographic and near-infrared imaging, and surface elemental analysis via x-ray fluorescence, are useful for generating significant insight into works of art. By examining the supporting material, grounds, pigments, and varnishes that a painter chose to utilize, we generate new knowledge regarding the composition, context, and decision-making involved in the creation of a work. The project `The Unvarnished Truth: exploring the material history of paintings' is an interdisciplinary initiative that incorporated the expertise of forensic art historians, conservation scientists, physicists, and biomedical engineers. Through the technical analysis of nine paintings from the McMaster Museum of Art permanent collection, we explored research questions related to painting technique, attribution, authenticity, connoisseurship, and object condition and stability. The paintings span over 500 years of European art history, and include works from Vincent Van Gogh, Alexander Rodchenko, and A. van der Neer. This project highlights the multitude of ways in which micro- and non-destructive methods can be used to answer art historical questions. This project is supported by grants from Canadian Heritage and Ontario Arts Council.
From scenarios to domain models: processes and representations
NASA Astrophysics Data System (ADS)
Haddock, Gail; Harbison, Karan
1994-03-01
The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.
Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M
1999-07-30
The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.
NASA Astrophysics Data System (ADS)
Stosnach, Hagen
2010-09-01
Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basics and dietary supplement samples applying most simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.
nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab
Cajigas, I.; Malik, W.Q.; Brown, E.N.
2012-01-01
Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process Generalized Linear Model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with the problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT, an open-source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and supports decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
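nSTAT itself is a Matlab toolbox; purely for illustration, the core PP-GLM idea can be sketched in Python as a Poisson regression of binned spike counts on a covariate. The simulated stimulus and rate parameters below are made up for the example.

```python
# Spike counts in small bins modelled as Poisson with a log rate that is
# linear in the covariates (the essence of the PP-GLM framework).
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
n_bins = 5000
stimulus = rng.normal(size=n_bins)          # one covariate per time bin
true_rate = np.exp(-2.0 + 0.8 * stimulus)   # log-linear intensity
spikes = rng.poisson(true_rate)             # binned spike counts

model = PoissonRegressor(alpha=0.0).fit(stimulus.reshape(-1, 1), spikes)
print("intercept ~ -2:", round(model.intercept_, 2),
      "slope ~ 0.8:", round(model.coef_[0], 2))
```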
Murray C. Richardson; Carl P. J. Mitchell; Brian A. Branfireun; Randall K. Kolka
2010-01-01
A new technique for quantifying the geomorphic form of northern forested wetlands from airborne LiDAR surveys is introduced, demonstrating the unprecedented ability to characterize the geomorphic form of northern forested wetlands using high-resolution digital topography. Two quantitative indices are presented, including the lagg width index (LWI) which objectively...
Analysis of Docudrama Techniques and Negotiating One's Identity in David Edgar's "Pentecost"
ERIC Educational Resources Information Center
Al Sharadgeh, Samer Ziyad
2018-01-01
Edgar manages to invert the subordinate function of generally accepted objective indicators of membership of a particular national group--language, religion, common history, and territory--into the essential mode of imperative distinction shaping the unique national identity. In other words, it is the fresco and the value assigned to it that…
Methods for Identifying Object Class, Type, and Orientation in the Presence of Uncertainty
1990-08-01
A two-phase method for timber supply analysis
Stephen Smith
1978-01-01
There is an increasing need to clarify the long-term wood supply implications of current harvesting rates. To assess the wood supply and to set timber production objectives, different linear programming techniques are applied to the short and long term. The transportation method is applied to the short term and the B. C. Forest Service computer-assisted resource...
USDA-ARS?s Scientific Manuscript database
The objective of the current study was to evaluate transcript activity of motile-rich sperm collected from June (spring) or August (summer), stored as cooled-extended (ExT) or cryopreserved (FrZ), and selected for least or most sperm head shape change, using Fourier harmonic analysis techniques, bet...
Time Study of Harvesting Equipment Using GPS-Derived Positional Data
Tim McDonald
1999-01-01
The objectives in this study were to develop and test a data analysis system for calculating machine productivity from GPS-derived positional information alone. A technique was used where positions were `filtered' initially to locate specific events that were independent of what actually traveled the path, then these events were combined using user-specified rules...
ERIC Educational Resources Information Center
Berniklau, Vladimir V.
Focusing on management development of scientists and engineers within the Federal government, this study was done to form a framework of factors (mainly attitudes, motives or needs, and leadership styles) to be evaluated before choosing suitable techniques and alternatives. Such variables as differing program objectives, characteristics of…
INVESTIGATION OF THE USE OF STATISTICS IN COUNSELING STUDENTS.
ERIC Educational Resources Information Center
HEWES, ROBERT F.
THE OBJECTIVE WAS TO EMPLOY TECHNIQUES OF PROFILE ANALYSIS TO DEVELOP THE JOINT PROBABILITY OF SELECTING A SUITABLE SUBJECT MAJOR AND OF ASSURING TO A HIGH DEGREE GRADUATION FROM COLLEGE WITH THAT MAJOR. THE SAMPLE INCLUDED 1,197 MIT FRESHMEN STUDENTS IN 1952-53, AND THE VALIDATION GROUP INCLUDED 699 ENTRANTS IN 1954. DATA INCLUDED SECONDARY…
Classification and spatial analysis of eastern hemlock health using remote sensing and GIS
Laurent R. Bonneau; Kathleen S. Shields; Daniel L. Civco; David R. Mikus
2000-01-01
Over the past decade hemlock stands in southern Connecticut have undergone significant decline coincident with the arrival in 1985 of an exotic insect pest, the hemlock woolly adelgid (Adelges tsugae Annand). The objective of this study was to evaluate image enhancement techniques for rating the health of hemlocks at the landscape level using...
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
The spatial and temporal variability of ambient air concentrations of SO2, SO4(2-), NO3
Preliminary Analysis of Photoreading
NASA Technical Reports Server (NTRS)
McNamara, Danielle S.
2000-01-01
The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally showed an increase in reading time when using the PhotoReading technique in comparison to normal reading strategies. This increase in reading time was accompanied by a decrease in text comprehension.
Properties of resonant trans-Neptunian objects based on Herschel Space Observatory data
NASA Astrophysics Data System (ADS)
Farkas Anikó, Takácsné; Kiss, Csaba; Mueller, Thomas G.; Mommert, Michael; Vilenius, Esa
2016-10-01
The goal of our work is to determine the physical characteristics of resonant, detached and scattered disk objects in the trans-Neptunian region, observed in the framework of the "TNOs are Cool!" Herschel Open Time Key Program. Based on thermal emission measurements with the Herschel/PACS and Spitzer/MIPS instruments, we were able to determine size, albedo, and surface thermal properties for 23 objects using radiometric modelling techniques. This is the first analysis in which the physical properties of objects in the outer resonances are determined for a larger sample. In addition to the results for individual objects, we have compared these characteristics with the bulk properties of other populations of the trans-Neptunian region. The newly analysed objects show e.g. a large variety of beaming factors, indicating diverse surfaces, and in general they follow the albedo-colour clustering identified earlier for Kuiper belt objects and Centaurs, further strengthening the evidence for a compositional discontinuity in the young solar system.
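Radiometric size determinations of this kind rest on the standard size-albedo relation D = 1329 km × 10^(−H/5) / √p_V, linking diameter D, absolute magnitude H, and geometric albedo p_V. A short illustration with invented values:

```python
# Standard radiometric size-albedo relation; H and p_V below are invented.
import math

def diameter_km(H, p_V):
    return 1329.0 * 10 ** (-H / 5.0) / math.sqrt(p_V)

print(round(diameter_km(H=5.0, p_V=0.10), 1))   # ~ 420 km
```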
A CAD/CAE analysis of photographic and engineering data
NASA Technical Reports Server (NTRS)
Goza, S. Michael; Peterson, Wayne L.
1987-01-01
In the investigation of the STS 51L accident, NASA engineers were given the task of visual analysis of photographic data extracted from the tracking cameras located at the launch pad. An analysis of the rotations associated with the right Solid Rocket Booster (SRB) was also performed. The visual analysis involved pinpointing coordinates of specific areas on the photographs. The objective of the analysis on the right SRB was to duplicate the rotations provided by the SRB rate gyros and to determine the effects of the rotations on the launch configuration. To accomplish the objectives, computer aided design and engineering was employed. The solid modeler, GEOMOD, inside the Structural Dynamics Research Corp. I-DEAS package, proved invaluable. The problem areas that were encountered and the corresponding solutions that were obtained are discussed. A brief description detailing the construction of the computer generated solid model of the STS launch configuration is given. A discussion of the coordinate systems used in the analysis is provided for the purpose of positioning the model in coordinate space. The techniques and theory used in the model analysis are described.
Automated quantification of renal fibrosis with Sirius Red and polarization contrast microscopy
Street, Jonathan M.; Souza, Ana Carolina P.; Alvarez‐Prats, Alejandro; Horino, Taro; Hu, Xuzhen; Yuen, Peter S. T.; Star, Robert A.
2014-01-01
Interstitial fibrosis is commonly measured by histology. The Masson trichrome stain is widely used, with semiquantitative scores subjectively assigned by trained operators. We have developed an objective technique combining Sirius Red staining, polarization contrast microscopy, and automated analysis. Repeated analysis of the same sections by the same operator (r = 0.99) or by different operators (r = 0.98) was highly consistent for Sirius Red, while Masson trichrome performed less consistently (r = 0.61 and 0.72, respectively). These techniques performed equally well when comparing sections from the left and right kidneys of mice. Poor correlation between Sirius Red and Masson trichrome may reflect different specificities, as enhanced birefringence with Sirius Red staining is specific for collagen type I and III fibrils. Combining whole‐section imaging and automated image analysis with Sirius Red/polarization contrast is a rapid, reproducible, and precise technique that is complementary to Masson trichrome. It also prevents biased selection of fields, as fibrosis is measured on the entire kidney section. This new tool shall enhance our search for novel therapeutics and noninvasive biomarkers for fibrosis. PMID:25052492
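A hedged sketch of such an automated measurement: under crossed polarizers only birefringent collagen is bright, so fibrosis can be scored as a bright-pixel area fraction of the tissue section. The thresholds and file name below are assumptions standing in for the authors' calibration.

```python
# Area-fraction scoring of a polarization-contrast image of a Sirius
# Red-stained section; "polarized.png" is a hypothetical input.
import numpy as np
from skimage import io, filters

img = io.imread("polarized.png", as_gray=True)

tissue = img > 0.02                          # crude mask: drop empty background
collagen = img > filters.threshold_otsu(img[tissue])

fibrosis_fraction = collagen[tissue].sum() / tissue.sum()
print(f"collagen-positive area: {100 * fibrosis_fraction:.1f}% of tissue")
```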
NASA Astrophysics Data System (ADS)
Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu
2016-09-01
In this research work, a multi-response optimization technique has been developed using a traditional desirability analysis and a non-traditional particle swarm optimization technique (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse-on time (TON), pulse-off time (TOFF), peak current (IP), and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses has been done to satisfy the priorities of multiple users by using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) is also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement has been demonstrated.
NASA Astrophysics Data System (ADS)
Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.
2016-01-01
The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
The examination, analysis and conservation of a bronze Egyptian Horus statuette
NASA Astrophysics Data System (ADS)
Smith, A.; Botha, H.; de Beer, F. C.; Ferg, E.
2011-09-01
The production techniques, corrosive deterioration, conservation and questions regarding the authenticity of a small Egyptian bronze statuette of the Child Horus (in the collection of the Ditsong: National Museum of Cultural History in Pretoria) were scientifically examined and analysed. The statuette dates to Egypt's 12th Dynasty. When the statuette was damaged, it was considered the appropriate time to obtain valuable information about its history and background through scientific research. Neutron tomography (NT), a non-destructive technique (NDT) relatively new to the South African R&D community for studying museum objects, was applied to perform this research. The results from NT were supported by additional tests done through XRF and XRD analyses of samples taken from the damaged statuette. Results revealed that the lost-wax method was used in the manufacturing process. The extent of the restoration and the materials used could be verified, and as a result the deterioration of the object can now be monitored. This paper describes in detail the analytical techniques used in the study and how they contributed to the conservation of the statuette and its authenticity.
Study the effect of elevated dies temperature on aluminium and steel round deep drawing
NASA Astrophysics Data System (ADS)
Lean, Yeong Wei; Azuddin, M.
2016-02-01
Round deep drawing operations can conventionally only be realized by expensive multi-step production processes. To reduce the cost of these processes while expecting an acceptable result, round deep drawing can be done at elevated temperature. There are three common problems in deep drawing a round cup: fracture, wrinkling and earing. The main objective is to investigate the effect of die temperature on aluminium and steel round deep drawing, with a sub-objective of eliminating fracture and reducing the wrinkling effect. The experimental method is conducted with three different die-heating techniques: heating both the upper and lower dies, heating only the upper die, and heating only the lower die. Four different temperatures were chosen throughout the experiment. The experimental results are then compared with finite element analysis software. There is a positive result for the steel material when heating both upper and lower dies, where the simulation results are comparable to the experimental results. Heating both the upper and lower dies proves the best among the three heating techniques.
Comparison of segmentation algorithms for fluorescence microscopy images of cells.
Dima, Alden A; Elliott, John T; Filliben, James J; Halter, Michael; Peskin, Adele; Bernal, Javier; Kociolek, Marcin; Brady, Mary C; Tang, Hai C; Plant, Anne L
2011-07-01
The analysis of fluorescence microscopy of cells often requires the determination of cell edges. This is typically done using segmentation techniques that separate the cell objects in an image from the surrounding background. This study compares segmentation results from nine different segmentation techniques applied to two different cell lines and five different sets of imaging conditions. Significant variability in the results of segmentation was observed that was due solely to differences in imaging conditions or applications of different algorithms. We quantified and compared the results with a novel bivariate similarity index metric that evaluates the degree of underestimating or overestimating a cell object. The results show that commonly used threshold-based segmentation techniques are less accurate than k-means clustering with multiple clusters. Segmentation accuracy varies with imaging conditions that determine the sharpness of cell edges and with geometric features of a cell. Based on this observation, we propose a method that quantifies cell edge character to provide an estimate of how accurately an algorithm will perform. The results of this study will assist the development of criteria for evaluating interlaboratory comparability. Published 2011 Wiley-Liss, Inc.
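A two-sided comparison of this kind can be sketched as follows; the paper's bivariate similarity index may be defined differently, so this only illustrates scoring under- and overestimation of a cell object separately against a reference mask.

```python
# One axis for the fraction of the reference the algorithm missed
# (underestimation), one for the extra area it claimed (overestimation).
import numpy as np

def under_over(reference, segmented):
    ref = reference.astype(bool)
    seg = segmented.astype(bool)
    missed = (ref & ~seg).sum() / ref.sum()   # reference pixels not found
    extra = (seg & ~ref).sum() / ref.sum()    # claimed pixels outside reference
    return missed, extra

ref = np.zeros((100, 100), bool); ref[20:60, 20:60] = True
seg = np.zeros((100, 100), bool); seg[25:65, 25:65] = True
print("missed %.2f, extra %.2f" % under_over(ref, seg))
```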
Liang, Zhiting; Guan, Yong; Liu, Gang; Chen, Xiangyu; Li, Fahu; Guo, Pengfei; Tian, Yangchao
2016-03-01
The 'missing wedge', which is due to a restricted rotation range, is a major challenge for quantitative analysis of an object using tomography. With prior knowledge of the grey levels, the discrete algebraic reconstruction technique (DART) is able to reconstruct objects accurately from projections in a limited angle range. However, the quality of the reconstructions declines as the number of grey levels increases. In this paper, a modified DART (MDART) is proposed, in which each independent region of homogeneous material, rather than the grey values, is chosen as the research object. The grey values of each discrete region are estimated according to the solution of the linear projection equations. The iterative process of updating boundary pixels and correcting the grey values of each region is executed alternately. Simulation experiments on binary phantoms as well as phantoms with multiple grey levels show that MDART is capable of achieving high-quality reconstructions from projections in a limited angle range. The interesting advancement of MDART is that neither prior knowledge of the grey values nor the number of grey levels is necessary.
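The grey-value estimation step lends itself to a small sketch: with the image partitioned into homogeneous regions, each projection ray measures a weighted sum of the unknown region grey values, so those values solve a linear least-squares problem. The toy three-region geometry below is invented for illustration.

```python
# Rows: projection rays; columns: path length of each ray through each
# of three homogeneous regions. Grey values recovered by least squares.
import numpy as np

A = np.array([[2.0, 0.0, 0.5],
              [1.0, 1.0, 1.0],
              [0.0, 2.0, 0.5],
              [1.5, 0.5, 1.0]])
true_grey = np.array([1.0, 0.4, 0.0])
p = A @ true_grey                        # measured line integrals

grey, *_ = np.linalg.lstsq(A, p, rcond=None)
print(np.round(grey, 3))                 # recovers [1.0, 0.4, 0.0]
```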
Object tracking via background subtraction for monitoring illegal activity in crossroad
NASA Astrophysics Data System (ADS)
Ghimire, Deepak; Jeong, Sunghwan; Park, Sang Hyun; Lee, Joonwhoan
2016-07-01
In the field of intelligent transportation systems, a great number of vision-based techniques have been proposed to prevent pedestrians from being hit by vehicles. This paper presents a system that can perform pedestrian and vehicle detection and monitor illegal activity at zebra crossings. At a zebra crossing, a driver or pedestrian should be warned early, according to the traffic light status, if they make any illegal moves, so that a collision can be fully avoided. In this research, we first detect the pedestrian traffic light status and monitor the crossroad for vehicle and pedestrian moves. Background-subtraction-based object detection and tracking is performed to detect pedestrians and vehicles at crossroads. Shadow removal, blob segmentation, trajectory analysis, etc., are used to improve the object detection and classification performance. We demonstrate the experiment on several video sequences recorded at different times and in different environments, such as daytime and nighttime, sunny and rainy conditions. Our experimental results show that this simple and efficient technique can be used successfully as a traffic surveillance system to prevent accidents at zebra crossings.
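The detection stage can be sketched with standard OpenCV building blocks: MOG2 background subtraction, shadow suppression, morphological cleanup, and blob extraction. The video file name and area threshold are illustrative assumptions; trajectory analysis and rule checking would sit on top of this loop.

```python
# Background-subtraction detection loop; "crossing.mp4" is hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("crossing.mp4")
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
kernel = np.ones((3, 3), np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = bg.apply(frame)
    fg[fg == 127] = 0                    # MOG2 marks shadows as 127; drop them
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel, iterations=2)
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 400]
    # `blobs` holds candidate pedestrian/vehicle regions for the tracker
cap.release()
```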
Demidenko, Natalia V; Penin, Aleksey A
2012-01-01
qRT-PCR is a generally acknowledged method for gene expression analysis due to its precision and reproducibility. However, it is well known that the accuracy of qRT-PCR data varies greatly depending on the experimental design and data analysis. Recently, a set of guidelines has been proposed that aims to improve the reliability of qRT-PCR. However, there are additional factors that have not been taken into consideration in these guidelines that can seriously affect the data obtained using this method. In this study, we report the influence that object morphology can have on qRT-PCR data. We have used a number of Arabidopsis thaliana mutants with altered floral morphology as models for this study. These mutants have been well characterised (including in terms of gene expression levels and patterns) by other techniques. This allows us to compare the results from the qRT-PCR with the results inferred from other methods. We demonstrate that the comparison of gene expression levels in objects that differ greatly in their morphology can lead to erroneous results.
The MGDO software library for data analysis in Ge neutrinoless double-beta decay experiments
NASA Astrophysics Data System (ADS)
Agostini, M.; Detwiler, J. A.; Finnerty, P.; Kröninger, K.; Lenz, D.; Liu, J.; Marino, M. G.; Martin, R.; Nguyen, K. D.; Pandola, L.; Schubert, A. G.; Volynets, O.; Zavarise, P.
2012-07-01
The Gerda and Majorana experiments will search for neutrinoless double-beta decay of 76Ge using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both Gerda and Majorana.
Schlieren technique in soap film flows
NASA Astrophysics Data System (ADS)
Auliel, M. I.; Hebrero, F. Castro; Sosa, R.; Artana, G.
2017-05-01
We propose the use of the Schlieren technique as a tool to analyse the flows in soap film tunnels. The technique enables visualization of perturbations of the film produced by the interposition of an object in the flow. The variations of intensity in the image are produced as a consequence of the deviations of the light beam traversing the deformed surfaces of the film. The quality of the Schlieren image is compared to images produced by the conventional interferometric technique. The analysis of Schlieren images of a cylinder wake flow indicates that this technique enables an easy visualization of vortex centers. Post-processing of pairs of successive images of a grid turbulent flow with a dense motion estimator is used to derive the velocity fields. The results obtained with this self-seeded flow show good agreement with the statistical properties of 2D turbulent flows reported in the literature.
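As a sketch of the post-processing step, a generic dense estimator such as Farnebäck optical flow can be run on a pair of successive frames; the paper's estimator may differ, and the file names and parameters here are hypothetical.

```python
# Dense motion estimation between two successive Schlieren frames.
import cv2
import numpy as np

prev = cv2.imread("schlieren_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("schlieren_0002.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=21,
                                    iterations=3, poly_n=5, poly_sigma=1.1,
                                    flags=0)
u, v = flow[..., 0], flow[..., 1]
speed = np.hypot(u, v)          # pixels/frame; scale by frame rate for m/s
print("mean speed:", speed.mean())
```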
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
Muller, E; Gargani, D; Banuls, A L; Tibayrenc, M; Dollet, M
1997-10-01
The genetic polymorphism of 30 isolates of plant trypanosomatids colloquially referred to as plant trypanosomes was assayed by means of RAPD. The principal objectives of this study were to assess the discriminative power of RAPD analysis for studying plant trypanosomes and to determine whether the results obtained were comparable with those from a previous isoenzyme (MLEE) study. The principal groups of plant trypanosomes identified previously by isoenzyme analysis--intraphloemic trypanosomes, intralaticiferous trypanosomes and trypanosomes isolated from fruits--were also clearly separated by the RAPD technique. Moreover, the results showed a fair parity between MLEE and RAPD data (coefficient of correlation = 0.84), and the two techniques have comparable discriminative ability. Most of the separation between the clusters revealed by the two techniques was associated with major biological properties. However, the RAPD technique gave a more coherent separation than MLEE, because the intraphloemic isolates, which were biologically similar in terms of their specific localization in the sieve tubes of the plant, were grouped more closely by RAPD. For both techniques, the existence of the main clusters was correlated with the existence of synapomorphic characters, which could be used as powerful tools in taxonomy and epidemiology.
Assessing clutter reduction in parallel coordinates using image processing techniques
NASA Astrophysics Data System (ADS)
Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham
2018-01-01
Information visualization has emerged as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are one of the popular techniques for visualizing high-dimensional data. A problem with the PC technique is that it suffers from crowding, a clutter which hides important data and obfuscates the information. Earlier research has been conducted to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative orderings of coordinates for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one that has the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of PC visualizations.
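Both measures are easy to reproduce for a single rendered PC image. The sketch below computes a grey-level histogram mean and the GLCM contrast feature with scikit-image; the bin and level counts are illustrative, and the paper applies the same idea to colour histograms of alternative coordinate orderings.

```python
# Histogram-mean and GLCM-contrast features for one rendered PC image;
# "pc_plot.png" is a hypothetical rendering.
import numpy as np
from skimage import io, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops

img = img_as_ubyte(io.imread("pc_plot.png", as_gray=True))

hist, _ = np.histogram(img, bins=256, range=(0, 256))
hist_mean = (np.arange(256) * hist).sum() / hist.sum()

glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
contrast = graycoprops(glcm, "contrast")[0, 0]
print(f"histogram mean {hist_mean:.1f}, GLCM contrast {contrast:.1f}")
```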
Advances in Modal Analysis Using a Robust and Multiscale Method
NASA Astrophysics Data System (ADS)
Picard, Cécile; Frisson, Christian; Faure, François; Drettakis, George; Kry, Paul G.
2010-12-01
This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.
A simple Lagrangian forecast system with aviation forecast potential
NASA Technical Reports Server (NTRS)
Petersen, R. A.; Homan, J. H.
1983-01-01
A trajectory forecast procedure is developed which uses geopotential tendency fields obtained from a simple, multiple layer, potential vorticity conservative isentropic model. This model can objectively account for short-term advective changes in the mass field when combined with fine-scale initial analyses. This procedure for producing short-term, upper-tropospheric trajectory forecasts employs a combination of a detailed objective analysis technique, an efficient mass advection model, and a diagnostically proven trajectory algorithm, none of which require extensive computer resources. Results of initial tests are presented, which indicate an exceptionally good agreement for trajectory paths entering the jet stream and passing through an intensifying trough. It is concluded that this technique not only has potential for aiding in route determination, fuel use estimation, and clear air turbulence detection, but also provides an example of the types of short range forecasting procedures which can be applied at local forecast centers using simple algorithms and a minimum of computer resources.
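The trajectory step itself reduces to integrating parcel positions through analysed wind fields. A toy sketch with a two-stage (Heun/RK2) integrator follows; the analytic wind field is an invented stand-in for the isentropic model's analyses.

```python
# Advect a parcel through an idealized wind field with an RK2 step.
import numpy as np

def wind(x, y):                        # idealized jet: u grows with y
    return np.array([20.0 + 0.5 * y, 5.0 * np.sin(x / 500.0)])

def advect(pos, hours, dt=0.25):
    for _ in range(int(hours / dt)):
        k1 = wind(*pos)
        k2 = wind(*(pos + dt * k1))
        pos = pos + 0.5 * dt * (k1 + k2)   # Heun / RK2 step
    return pos

print(advect(np.array([0.0, 0.0]), hours=12.0))   # position after 12 h
```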
Non-destructive testing of ceramic materials using mid-infrared ultrashort-pulse laser
NASA Astrophysics Data System (ADS)
Sun, S. C.; Qi, Hong; An, X. Y.; Ren, Y. T.; Qiao, Y. B.; Ruan, Liming M.
2018-04-01
The non-destructive testing (NDT) of ceramic materials using a mid-infrared ultrashort-pulse laser is investigated in this study. The discrete ordinate method is applied to solve the transient radiative transfer equation in a 2D semitransparent medium, and the emerging radiative intensity on the boundary serves as input for the inverse analysis. The sequential quadratic programming algorithm is employed as the inverse technique to optimize the objective function, in which the gradient of the objective function with respect to the reconstruction parameters is calculated using the adjoint model. Two reticulated porous ceramics, partially stabilized zirconia and oxide-bonded silicon carbide, are tested. The retrieval results show that the main characteristics of defects, such as optical properties, geometric shapes and positions, can be accurately reconstructed by the present model. The proposed technique is effective and robust in the NDT of ceramics even with measurement errors.
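The inverse step can be sketched with an SQP-type optimizer on a toy least-squares mismatch, where a linear forward model stands in for the transient radiative transfer solver and an exact gradient plays the role of the adjoint-computed one; all values are invented.

```python
# SQP-style minimization of a measured-vs-modelled mismatch.
import numpy as np
from scipy.optimize import minimize

A = np.array([[3.0, 1.0], [1.0, 2.0], [0.5, 1.5]])
measured = A @ np.array([0.7, 0.2])       # synthetic "detector" data

def objective(x):
    r = A @ x - measured
    return 0.5 * r @ r

def gradient(x):                          # exact gradient (adjoint stand-in)
    return A.T @ (A @ x - measured)

res = minimize(objective, x0=np.zeros(2), jac=gradient, method="SLSQP",
               bounds=[(0.0, 1.0), (0.0, 1.0)])
print(res.x)                              # ~ [0.7, 0.2]
```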
Grape colour phenotyping: development of a method based on the reflectance spectrum.
Rustioni, Laura; Basilico, Roberto; Fiori, Simone; Leoni, Alessandra; Maghradze, David; Failla, Osvaldo
2013-01-01
The colour of fruit is an important quality factor for cultivar classification and phenotyping techniques. Besides subjective visual evaluation, new instruments and techniques can be used. This work aims at developing an objective, fast, easy and non-destructive method as a useful support for evaluating grape colour under different cultural and environmental conditions, as well as for breeding processes and germplasm evaluation, supporting plant characterization and biodiversity preservation. The colours of 120 grape varieties were studied using reflectance spectra. The classification was realized using cluster and discriminant analysis. Reflectance of the whole berry surface was also compared with the absorption properties of single skin extracts. A phenotyping method based on the reflectance spectra was developed, producing reliable colour classifications. A cultivar-independent index for pigment content evaluation has also been obtained. This work allowed the classification of berry colour using an objective method. Copyright © 2013 John Wiley & Sons, Ltd.
Industrial metrology as applied to large physics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veal, D.
1993-05-01
A physics experiment is a large complex 3-D object (typ. 1200 m³, 35000 tonnes), with sub-millimetric alignment requirements. Two generic survey alignment tasks can be identified: first, an iterative positioning of the apparatus subsystems in space and, second, a quantification of as-built parameters. The most convenient measurement technique is industrial triangulation, but the complexity of the measured object and measurement environment constraints frequently requires a more sophisticated approach. To enlarge the "survey alignment toolbox", measurement techniques commonly associated with other disciplines such as geodesy, applied geodesy for accelerator alignment, and mechanical engineering are also used. Disparate observables require a heavy reliance on least squares programs for campaign pre-analysis and calculation. This paper will offer an introduction to the alignment of physics experiments and will identify trends for the next generation of SSC experiments.
Collaborative real-time motion video analysis by human observer and image exploitation algorithms
NASA Astrophysics Data System (ADS)
Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen
2015-05-01
Motion video analysis is a challenging task, especially in real-time applications. In most safety and security critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer of a user interface which utilizes the human visual focus of attention, revealed by the eye gaze direction, for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on our prior work on automated target detection, segmentation, and tracking algorithms. Besides the system design, a first pilot study is presented, in which we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy-to-use interaction technique for performing selection operations on moving targets in videos in order to initialize an object tracking function.
Lee, E J; Lee, S K; Agid, R; Howard, P; Bae, J M; terBrugge, K
2009-10-01
The combined automatic tube current modulation (ATCM) technique adapts and modulates the x-ray tube current in the x-y-z axis according to the patient's individual anatomy. We compared image quality and radiation dose of the combined ATCM technique with those of a fixed tube current (FTC) technique in craniocervical CT angiography performed with a 64-section multidetector row CT (MDCT) system. A retrospective review of craniocervical CT angiograms (CTAs) by using combined ATCM (n = 25) and FTC techniques (n = 25) was performed. Other CTA parameters, such as kilovolt (peak), matrix size, FOV, section thickness, pitch, contrast agent, and contrast injection techniques, were held constant. We recorded objective image noise in the muscles at 2 anatomic levels; radiation exposure doses (CT dose index volume and dose-length product); and subjective image quality parameters, such as vascular delineation of various arterial vessels, visibility of small arterial detail, image artifacts, and certainty of diagnosis. The Mann-Whitney U test was used for statistical analysis. No significant difference was detected in subjective image quality parameters between the FTC and combined ATCM techniques. Most subjects in both study groups (49/50, 98%) had acceptable subjective artifacts. The objective image noise values at shoulder level did not show a significant difference, but the noise value at the upper neck was higher with the combined ATCM (P < .05) technique. Significant reduction in radiation dose (18% reduction) was noted with the combined ATCM technique (P < .05). The combined ATCM technique for craniocervical CTA performed at 64-section MDCT substantially reduced radiation exposure dose but maintained diagnostic image quality.
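A minimal sketch of the reported statistical comparison: the Mann-Whitney U test applied to objective image noise values from the two protocol groups. The noise values below are invented for illustration, not the study's measurements.

```python
# Hedged sketch: Mann-Whitney U test on image noise, ATCM vs FTC groups.
from scipy.stats import mannwhitneyu

noise_atcm = [8.1, 9.4, 7.8, 10.2, 8.9]  # hypothetical upper-neck noise (HU)
noise_ftc = [7.2, 7.9, 8.0, 7.5, 8.3]

stat, p = mannwhitneyu(noise_atcm, noise_ftc, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")  # p < .05 would match the reported finding
```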
Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S
2014-06-01
Recently, surveying large areas in an automatic way, for early detection of both harmful chemical agents and forest fires, has become a strategic objective of defence and public health organisations. The Lidar and Dial techniques are widely recognized as a cost-effective alternative to monitor large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator, is applied to the problem of automatically identifying the time location of peaks in Lidar and Dial measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from the resilience to drift in the laser sources to the increase of the system sensitivity. The method is also fully general, purely software, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with the help of data of various instruments acquired during several experimental campaigns in the field.
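The Universal Multi Event Locator itself is not described here in enough detail to reproduce; as a stand-in for the same peak time-location task, the sketch below finds peaks in a synthetic Lidar return with scipy.signal.find_peaks. The signal shape and thresholds are assumptions.

```python
# Hedged sketch: locating the time position of peaks in a Lidar-like trace.
# This uses generic peak detection, not the UMEL algorithm from the paper.
import numpy as np
from scipy.signal import find_peaks

t = np.linspace(0.0, 1.0, 2000)  # time axis, arbitrary units
signal = np.exp(-((t - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((t - 0.7) / 0.01) ** 2)
signal += 0.05 * np.random.default_rng(1).standard_normal(t.size)  # noise

peaks, props = find_peaks(signal, height=0.3, distance=50)
print("peak times:", t[peaks])
```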
Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission
NASA Astrophysics Data System (ADS)
Hampton, Jesse Clay
The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission (AE) monitoring, based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud-based techniques is studied in an effort to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud-based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations, including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 × 15 × 25 cm to 30 × 30 × 25 cm in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of the AE response throughout fracture interactions. Investigations at differing scales showed the usefulness of individual microcrack characterization as well as DFN and cloud-based techniques. Cloud-based techniques weighted with individual microcrack characterization correlated well with post-test damage evaluations.
Analysis of High Temporal and Spatial Observations of Hurricane Joaquin During TCI-15
NASA Technical Reports Server (NTRS)
Creasey, Robert; Elsberry, Russell L.; Velden, Chris; Cecil, Daniel J.; Bell, Michael; Hendricks, Eric A.
2016-01-01
Objectives: Provide an example of why analysis of high density soundings across Hurricane Joaquin also require highly accurate center positions; Describe technique for calculating 3-D zero-wind center positions from the highly accurate GPS positions of sequences of High-Density Sounding System (HDSS) soundings as they fall from 10 km to the ocean surface; Illustrate the vertical tilt of the vortex above 4-5 km during two center passes through Hurricane Joaquin on 4 October 2015.
Practical quality control tools for curves and surfaces
NASA Technical Reports Server (NTRS)
Small, Scott G.
1992-01-01
Curves and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties, and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamics analysis, including newly developed methods for examining surface parameterization and its effects.
3D shape measurement of moving object with FFT-based spatial matching
NASA Astrophysics Data System (ADS)
Guo, Qinghua; Ruan, Yuxi; Xi, Jiangtao; Song, Limei; Zhu, Xinjun; Yu, Yanguang; Tong, Jun
2018-03-01
This work presents a new technique for 3D shape measurement of a moving object in translational motion, which finds applications in online inspection, quality control, etc. A low-complexity 1D fast Fourier transform (FFT)-based spatial matching approach is devised to obtain accurate object displacement estimates, and it is combined with single-shot fringe pattern profilometry (FPP) techniques to achieve high measurement performance from multiple captured images through coherent combining. The proposed technique overcomes some limitations of existing ones. Specifically, the placement of marks on the object surface and synchronization between projector and camera are not needed, the velocity of the moving object is not required to be constant, and there is no restriction on the movement trajectory. Both simulation and experimental results demonstrate the effectiveness of the proposed technique.
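A minimal sketch of the 1D FFT-based matching idea: the displacement between two image rows is read off the peak of their circular cross-correlation computed in the Fourier domain. Sub-pixel refinement and the coherent combining step are omitted; the signal is synthetic.

```python
# Hedged sketch: 1D FFT-based spatial matching for displacement estimation.
import numpy as np

def estimate_shift(row_a: np.ndarray, row_b: np.ndarray) -> int:
    """Return the signed shift s such that np.roll(row_b, s) realigns with row_a."""
    A = np.fft.fft(row_a)
    B = np.fft.fft(row_b)
    xcorr = np.fft.ifft(A * np.conj(B)).real  # circular cross-correlation
    shift = int(np.argmax(xcorr))
    if shift > row_a.size // 2:               # map index to signed displacement
        shift -= row_a.size
    return shift

row = np.sin(np.linspace(0, 20 * np.pi, 1024))
print(estimate_shift(row, np.roll(row, 17)))  # prints -17: undoes the roll
```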
Meta-analysis of the clinical application on gasless laparoscopic cholecystectomy in China
Liu, Qian; Zhang, Guangyong; Zhong, Yong; Duan, Chongyang; Hu, Sanyuan
2015-01-01
Objective: We aim to perform a systematic review of the clinical effects of the abdominal wall suspension technique in laparoscopic cholecystectomy in China. Methods: We retrieved databases of literature on randomized controlled trials involving abdominal wall suspension laparoscopic cholecystectomy. We then conducted screenings, extracted data, and performed quality assessment and meta-analysis. Results: We analyzed 611 patients. Our analysis showed that the abdominal wall suspension group, compared to the traditional group, had a reduced length of hospital stay (SMD = -0.91, 95% CI = -1.76~-0.06, P = 0.04), a shortened postoperative first exhaust time (SMD = -0.65, 95% CI = -1.11~-0.20, P = 0.005), and a diminished incidence of postoperative complications (P < 0.001), which decreased the cost of hospitalization. Conclusions: Application of the abdominal wall suspension endoscopic technique can significantly speed up the rehabilitation of laparoscopic cholecystectomy patients; therefore, it is worthy of further research and clinical application. PMID:25932097
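For readers unfamiliar with the SMD figures quoted above, the sketch below computes a standardised mean difference (Cohen's d) and its 95% confidence interval for one hypothetical two-arm comparison; the pooling across trials is omitted and all numbers are invented.

```python
# Hedged sketch: SMD with 95% CI for a single two-arm comparison (toy data).
import math

def smd_with_ci(m1, s1, n1, m2, s2, n2):
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled                          # Cohen's d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical hospital-stay data (days), suspension vs traditional groups:
d, ci = smd_with_ci(4.1, 1.0, 30, 5.0, 1.1, 30)
print(f"SMD = {d:.2f}, 95% CI = {ci[0]:.2f} to {ci[1]:.2f}")
```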
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
Analysis of the particle stability in a new designed ultrasonic levitation device.
Baer, Sebastian; Andrade, Marco A B; Esen, Cemal; Adamowski, Julio Cezar; Schweiger, Gustav; Ostendorf, Andreas
2011-10-01
The use of acoustic levitation in the fields of analytical chemistry and in the containerless processing of materials requires good stability of the levitated particle. However, spontaneous oscillations and rotation of the levitated particle have been reported in the literature, which can reduce the applicability of the acoustic levitation technique. Aiming to reduce the particle oscillations, this paper presents an analysis of particle stability in a new acoustic levitator device. The new acoustic levitator consists of a piezoelectric transducer with a concave radiating surface and a concave reflector. The analysis is conducted by determining numerically the axial and lateral forces that act on the levitated object and by measuring the oscillations of a spherical particle with a laser Doppler vibrometer. It is shown that the new levitator design increases the lateral forces and significantly reduces the lateral oscillations of the levitated object.
Experimental evaluation of a system for human life detection under debris
NASA Astrophysics Data System (ADS)
Joju, Reshma; Konica, Pimplapure Ramya T.; Alex, Zachariah C.
2017-11-01
It is difficult to locate human beings trapped under debris or behind walls, a problem that also arises in military applications. Several rescue techniques, such as robotic systems, optical devices, and acoustic devices, have therefore been used, but these systems fail if the victim is unconscious. We conducted an experimental analysis of whether microwaves could detect the heartbeat and breathing signals of human beings trapped under collapsed debris. For our analysis we used a radar based on the Doppler shift effect. We calculated the minimum speed that the radar could detect. We checked the frequency variation by placing the radar at a fixed position and setting the object in motion at different distances, and by using objects of different materials as debris behind which the motion was made. The graphs of the different analyses were plotted.
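A minimal sketch of the Doppler relation behind such an experiment: a target moving at radial speed v shifts a continuous-wave carrier f0 by f_d = 2 v f0 / c, so the minimum detectable speed follows from the receiver's frequency resolution. The carrier frequency and resolution below are assumptions, not the paper's hardware.

```python
# Hedged sketch: minimum detectable radial speed for a CW Doppler radar.
f0 = 10.525e9   # Hz, a common CW Doppler module frequency (assumed)
c = 3.0e8       # m/s, speed of light
delta_f = 1.0   # Hz, assumed spectral resolution of the receiver

v_min = delta_f * c / (2 * f0)   # invert f_d = 2*v*f0/c at f_d = delta_f
print(f"minimum detectable radial speed ~ {v_min * 100:.2f} cm/s")
```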
Texture analysis of Napoleonic War Era copper bolts
NASA Astrophysics Data System (ADS)
Malamud, Florencia; Northover, Shirley; James, Jon; Northover, Peter; Kelleher, Joe
2016-04-01
Neutron diffraction techniques are suitable for volume texture analyses due to the high penetration of thermal neutrons in most materials. We have implemented a new data analysis methodology that employs the spatial resolution achievable by a time-of-flight neutron strain scanner to non-destructively determine the crystallographic texture at selected locations within a macroscopic sample. The method is based on defining the orientation distribution function of the crystallites from several incomplete pole figures, and it has been implemented on ENGIN-X, a neutron strain scanner at the ISIS Facility in the UK. Here, we demonstrate the application of this new texture analysis methodology in determining the crystallographic texture at selected locations within museum-quality archaeological objects up to 1 m in length. The results were verified using samples of similar, but less valuable, objects by comparing the results of applying this method with those obtained using both electron backscatter diffraction and X-ray diffraction on their cross sections.
3-D surface profilometry based on modulation measurement by applying wavelet transform method
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao; Wei, Yongchao
2017-01-01
A new analysis of 3-D surface profilometry based on the modulation measurement technique with the application of the Wavelet Transform method is proposed. As a tool excelling for its multi-resolution and localization in the time and frequency domains, the Wavelet Transform method, with good localized time-frequency analysis ability and effective de-noising capacity, can extract the modulation distribution more accurately than the Fourier Transform method. Especially for the analysis of complex objects, more details of the measured object are retained. In this paper, the theoretical derivation of the Wavelet Transform method that obtains the modulation values from a captured fringe pattern is given. Both computer simulation and an elementary experiment are used to show the validity of the proposed method by comparison with the results of the Fourier Transform method. The results show that the Wavelet Transform method performs better than the Fourier Transform method in modulation value retrieval.
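A rough sketch of modulation retrieval from a single fringe line with the continuous wavelet transform, using the PyWavelets package (which the paper does not name): the modulation at each pixel is taken as the magnitude of the CWT ridge. The fringe pattern and scale range are invented.

```python
# Hedged sketch: per-pixel fringe modulation from the CWT ridge magnitude.
import numpy as np
import pywt

x = np.arange(1024)
# Synthetic fringe line: DC level plus a fringe whose amplitude (the
# modulation) is strong at the centre and weak at the edges.
fringe = 100 + 80 * np.exp(-((x - 512) / 300) ** 2) * np.cos(2 * np.pi * x / 16)

scales = np.arange(4, 64)                       # illustrative scale range
coeffs, _ = pywt.cwt(fringe - fringe.mean(), scales, "morl")
modulation = np.abs(coeffs).max(axis=0)         # ridge magnitude per pixel
print(modulation[512] > modulation[0])          # True: stronger fringe, higher value
```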
NASA Astrophysics Data System (ADS)
Valach, J.; Cacciotti, R.; Kuneš, P.; Čerňanský, M.; Bláha, J.
2012-04-01
The paper presents a project aiming to develop a knowledge-based system for documentation and analysis of defects of cultural heritage objects and monuments. The MONDIS information system concentrates knowledge on damage to immovable structures due to various causes, and on the preventive or remedial actions performed to protect or repair them, where possible. The system currently under construction is intended to provide an understanding of the causal relationships between a defect, the materials, the external load, and the environment of the built object. The foundation of the knowledge-based system will be systemized and formalized knowledge on defects and their mitigation, acquired by analysing a representative set of cases documented in the past. On the basis of comparable designs, technologies, materials, and the nature of the external forces and surroundings, the software system can indicate the most likely risks of new defects occurring or of existing ones extending. The system will also allow a comparison of the actual failure with similar documented cases and will propose a suitable technical intervention plan. The system will provide conservationists, administrators and owners of historical objects with a toolkit for documenting defects in their objects. Advanced artificial intelligence methods will offer the accumulated knowledge to users and help them find their way among relevant techniques of preventive intervention and reconstruction based on similarity with their own case.
Full-color high-definition CGH reconstructing hybrid scenes of physical and virtual objects
NASA Astrophysics Data System (ADS)
Tsuchiyama, Yasuhiro; Matsushima, Kyoji; Nakahara, Sumio; Yamaguchi, Masahiro; Sakamoto, Yuji
2017-03-01
High-definition CGHs can reconstruct high-quality 3D images comparable to those of conventional optical holography. However, it has been difficult to exhibit full-color images reconstructed by these high-definition CGHs, because three CGHs for the RGB colors and a bulky image combiner were needed to produce full-color images. Recently, we reported a novel technique for full-color reconstruction using RGB color filters similar to those used in liquid-crystal panels. This technique allows us to produce full-color high-definition CGHs composed of a single plate and to place them on exhibition. In this paper, using this technique, we demonstrate full-color CGHs that reconstruct hybrid scenes comprised of real physical objects and CG-modeled virtual objects. Here, the wave field of the physical object is obtained from dense multi-viewpoint images by employing the ray-sampling (RS) plane technique. In addition to the technique for full-color capturing and reconstruction of real object fields, the principle and simulation technique for full-color CGHs using RGB color filters are presented.
NASA Astrophysics Data System (ADS)
Sukawattanavijit, Chanika; Srestasathiern, Panu
2017-10-01
Land Use and Land Cover (LULC) information is significant for observing and evaluating environmental change. LULC classification using remotely sensed data is a technique widely employed at global and local scales, particularly in urban areas, which have diverse land cover types and are essential components of the urban terrain and ecosystem. At present, object-based image analysis (OBIA) is becoming widely popular for land cover classification with high-resolution images. COSMO-SkyMed SAR data were fused with THAICHOTE (namely, THEOS: Thailand Earth Observation Satellite) optical data for object-based land cover classification. This paper presents a comparison between object-based and pixel-based approaches to image fusion. The per-pixel method, support vector machines (SVM), was applied to the fused image based on Principal Component Analysis (PCA). The object-based classification was applied to the fused images to separate land cover classes using the nearest neighbor (NN) classifier. Finally, accuracy was assessed by comparing the land cover maps generated from the fused image dataset and from the THAICHOTE image alone. Object-based classification of the fused COSMO-SkyMed and THAICHOTE images demonstrated the best classification accuracies, well over 85%. The results show that object-based data fusion provides higher land cover classification accuracy than per-pixel data fusion.
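A minimal sketch of the per-pixel branch under stated assumptions: SAR and optical bands are stacked, reduced with PCA, and classified with an SVM. The arrays are synthetic stand-ins for the COSMO-SkyMed and THAICHOTE data, and the object-based nearest-neighbor branch, which requires segmentation, is omitted.

```python
# Hedged sketch: PCA-based fusion of SAR + optical bands, then per-pixel SVM.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pixels = 500
sar = rng.random((n_pixels, 1))        # one SAR intensity band (toy)
optical = rng.random((n_pixels, 4))    # four optical bands (toy)
labels = rng.integers(0, 3, n_pixels)  # three land-cover classes (toy labels)

fused = PCA(n_components=3).fit_transform(np.hstack([sar, optical]))
clf = SVC(kernel="rbf").fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))
```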
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems which severely limit the capability and the accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of the absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
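A minimal sketch of the core PSPFP computation, assuming the common four-step algorithm with 90-degree shifts: the wrapped phase is recovered as atan2(I4 − I2, I1 − I3). Phase unwrapping and phase-to-height conversion, which the dissertation treats in detail, are omitted.

```python
# Hedged sketch: wrapped phase from four 90-degree phase-shifted fringe images.
import numpy as np

x = np.linspace(0, 4 * np.pi, 512)
phase_true = 0.5 * np.sin(x)                 # toy object-induced phase
# I_k = A + B*cos(theta + k*pi/2), k = 0..3, with theta = carrier + object phase
I = [100 + 50 * np.cos(x + phase_true + k * np.pi / 2) for k in range(4)]

phi = np.arctan2(I[3] - I[1], I[0] - I[2])   # wrapped phase in (-pi, pi]
# Check: phi equals the true phase up to 2*pi wrapping.
print(np.allclose(np.angle(np.exp(1j * (phi - (x + phase_true)))), 0, atol=1e-6))
```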
A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology
NASA Astrophysics Data System (ADS)
Lina, L.; Murata, K.
2006-12-01
In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique), one of the object-oriented techniques, which has advantages in maintainability, reuse, and long-term development of a system. At the Center for Information Technology, Ehime University, after designing STARS we started implementing it; the latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout the design, we took care that it remains flexible and applicable when other developers design software for similar purposes; if our model were particular only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g., hardware and software, and put the dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps other developers construct their own systems based on the present design: they simply modify their own application object models according to their system resources. This division of the design into three object models, separating the dependent and independent parts, is one of the advantages of the OMT. If the design of software is done completely along with the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates "another STARS" in a different programming language such as Java, the programmer can simply follow the present design as long as the language is object-oriented. Researchers may want to add their own data to STARS; in that case, they simply add their own data class in the domain object model, because any satellite data has properties such as time or date, which are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers join the STARS project, they have only to understand each model to obtain an overview of STARS; they then follow the designs and documents to implement the system. The OMT makes it easy for a newcomer to join a project already running.
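A toy sketch of the inheritance idea described above, not the actual STARS code: satellite data classes inherit common properties such as time and date from an upper class, so adding a new data set means adding one subclass. All class and field names are illustrative.

```python
# Hedged sketch: an upper class holding the time/date properties shared by
# all observation data, with satellite data sets added as subclasses.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ObservationData:            # upper class in a domain object model
    start: datetime
    end: datetime

    def covers(self, t: datetime) -> bool:
        return self.start <= t <= self.end

@dataclass
class SatelliteData(ObservationData):
    satellite: str                # a new data set only adds its own fields

geotail = SatelliteData(datetime(2006, 1, 1), datetime(2006, 1, 2), "GEOTAIL")
print(geotail.covers(datetime(2006, 1, 1, 12)))  # True: inherited behaviour
```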
Techniques for assessing relative values for multiple objective management on private forests
Donald F. Dennis; Thomas H. Stevens; David B. Kittredge; Mark G. Rickenbach
2003-01-01
Decision models for assessing multiple objective management of private lands will require estimates of the relative values of various nonmarket outputs or objectives that have become increasingly important. In this study, conjoint techniques are used to assess the relative values and acceptable trade-offs (marginal rates of substitution) among various objectives...
Research into a distributed fault diagnosis system and its application
NASA Astrophysics Data System (ADS)
Qian, Suxiang; Jiao, Weidong; Lou, Yongjian; Shen, Xiaomei
2005-12-01
CORBA (Common Object Request Broker Architecture) is a solution for distributed computing over heterogeneous systems that establishes a communication protocol between distributed objects, with great emphasis on interoperation between them. However, only after developing suitable application approaches and practical monitoring and diagnosis technology can customers share monitoring and diagnosis information, so that remote multi-expert cooperative diagnosis online can be achieved. This paper aims at building an open fault monitoring and diagnosis platform combining CORBA, the Web, and agents. Heterogeneous diagnosis objects interoperate in independent threads through the CORBA soft-bus, realizing resource sharing and online multi-expert cooperative diagnosis and overcoming shortcomings such as limited diagnosis knowledge, reliance on a single diagnosis technique, and incomplete analysis functions, so that more complicated and deeper diagnoses can be carried out. Taking a high-speed centrifugal air compressor set as an example, we demonstrate a distributed diagnosis based on CORBA. This shows that integrating CORBA, Web techniques, and an agent frame model allows more efficient approaches to problems such as real-time monitoring and diagnosis over the network and the decomposition of complicated tasks. In this system, a multi-diagnosis intelligent agent helps improve diagnosis efficiency. Besides, the system offers an open environment, which makes it easy for diagnosis objects to be upgraded and for new diagnosis server objects to join.
NASA Astrophysics Data System (ADS)
Salvemini, Filomena; Grazzi, Francesco; Kardjilov, Nikolay; Wieder, Frank; Manke, Ingo; Edge, David; Williams, Alan; Zoppi, Marco
2017-05-01
Non-invasive experimental methods play an important role in the field of cultural heritage. Benefiting from the technical progress in recent years, neutron imaging has been demonstrated to complement effectively studies based on surface analysis, allowing for a non-invasive characterization of the whole three-dimensional volume. This study focuses on a kris and a kanjar, two weapons from ancient Asia, to show the potential of the combined use of X-ray and neutron imaging techniques for the characterisation of the manufacturing methods and the authentication of objects of cultural and historical interest.
NASA Technical Reports Server (NTRS)
Hammond, Ernest C., Jr.
1989-01-01
Since the United States of America is moving into an age of reusable space vehicles, both electronic and photographic materials will continue to be an integral part of the recording techniques available. Film as a scientifically viable recording technique in astronomy is well documented, and there is a real need to expose various types of films to the Shuttle environment. Thus, the main objective was to examine the subtle densitometric changes of canisters of IIaO film that were placed aboard the third Space Shuttle flight (STS-3).
Vision sensing techniques in aeronautics and astronautics
NASA Technical Reports Server (NTRS)
Hall, E. L.
1988-01-01
The close relationship between sensing and other tasks in orbital space, and the integral role of vision sensing in practical aerospace applications, are illustrated. Typical space mission-vision tasks encompass the docking of space vehicles, the detection of unexpected objects, the diagnosis of spacecraft damage, and the inspection of critical spacecraft components. Attention is presently given to image functions, the 'windowing' of a view, the number of cameras required for inspection tasks, the choice of incoherent or coherent (laser) illumination, three-dimensional-to-two-dimensional model-matching, edge- and region-segmentation techniques, and motion analysis for tracking.
Trautwein, C.M.; Rowan, L.C.
1987-01-01
Linear structural features and hydrothermally altered rocks that were interpreted from Landsat data have been used by the U.S. Geological Survey (USGS) in regional mineral resource appraisals for more than a decade. In the past, linear features and alterations have been incorporated into models for assessing mineral resources potential by manually overlaying these and other data sets. Recently, USGS research into computer-based geographic information systems (GIS) for mineral resources assessment programs has produced several new techniques for data analysis, quantification, and integration to meet assessment objectives.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k_AB factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt.% range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt.%. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
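A minimal sketch of the ratio method named above: with a known Cliff-Lorimer factor k_AB, a binary composition follows from the measured intensities via C_A/C_B = k_AB (I_A/I_B) together with C_A + C_B = 1, ignoring the absorption correction discussed in the text. The counts and k factor are invented.

```python
# Hedged sketch: Cliff-Lorimer ratio method for a binary thin specimen.
def cliff_lorimer_binary(i_a: float, i_b: float, k_ab: float):
    ratio = k_ab * i_a / i_b       # C_A / C_B from measured intensities
    c_a = ratio / (1.0 + ratio)    # normalize with C_A + C_B = 1
    return c_a, 1.0 - c_a

c_a, c_b = cliff_lorimer_binary(i_a=1500.0, i_b=900.0, k_ab=1.1)  # toy counts
print(f"C_A = {c_a:.3f}, C_B = {c_b:.3f}")
```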
3D Texture Features Mining for MRI Brain Tumor Identification
NASA Astrophysics Data System (ADS)
Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra
2014-03-01
Medical image segmentation is a process to extract regions of interest and to divide an image into its individual meaningful, homogeneous components; these components have a strong relationship with the objects of interest in an image. For computer-aided diagnosis and therapy, medical image segmentation is a mandatory initial step. It is a sophisticated and challenging task because of the complex nature of medical images, and successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, and 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with SVM as the segmentation technique in the testing methodology.
A review of supervised object-based land-cover image classification
NASA Astrophysics Data System (ADS)
Ma, Lei; Li, Manchun; Ma, Xiaoxue; Cheng, Liang; Du, Peijun; Liu, Yongxue
2017-08-01
Object-based image classification for land-cover mapping purposes using remote-sensing imagery has attracted significant attention in recent years. Numerous studies conducted over the past decade have investigated a broad array of sensors, feature selection, classifiers, and other factors of interest. However, these research results have not yet been synthesized to provide coherent guidance on the effect of different supervised object-based land-cover classification processes. In this study, we first construct a database with 28 fields using qualitative and quantitative information extracted from 254 experimental cases described in 173 scientific papers. Second, the results of the meta-analysis are reported, including general characteristics of the studies (e.g., the geographic range of relevant institutes, preferred journals) and the relationships between factors of interest (e.g., spatial resolution and study area or optimal segmentation scale, accuracy and number of targeted classes), especially with respect to the classification accuracy of different sensors, segmentation scale, training set size, supervised classifiers, and land-cover types. Third, useful data on supervised object-based image classification are determined from the meta-analysis. For example, we find that supervised object-based classification is currently experiencing rapid advances, while development of the fuzzy technique is limited in the object-based framework. Furthermore, spatial resolution correlates with the optimal segmentation scale and study area, and Random Forest (RF) shows the best performance in object-based classification. The area-based accuracy assessment method can obtain stable classification performance, and indicates a strong correlation between accuracy and training set size, while the accuracy of the point-based method is likely to be unstable due to mixed objects. In addition, the overall accuracy benefits from higher spatial resolution images (e.g., unmanned aerial vehicle) or agricultural sites where it also correlates with the number of targeted classes. More than 95.6% of studies involve an area less than 300 ha, and the spatial resolution of images is predominantly between 0 and 2 m. Furthermore, we identify some methods that may advance supervised object-based image classification. For example, deep learning and type-2 fuzzy techniques may further improve classification accuracy. Lastly, scientists are strongly encouraged to report results of uncertainty studies to further explore the effects of varied factors on supervised object-based image classification.
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2015-01-01
Objective: This paper introduces the readers to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis in healthcare. Methods: The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results: The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. Conclusions: ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights. PMID:26361235
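A sketch in the spirit of the Monte Carlo step, not ELICIT's exact algorithm: weights for five strictly rank-ordered criteria are drawn uniformly from the simplex, the ordering is imposed by sorting (which, by symmetry, yields the uniform distribution on the ordered region), and each weight is summarised by its mean and a 95% credibility interval.

```python
# Hedged sketch: Monte Carlo summary of imprecise, rank-ordered weights.
import numpy as np

rng = np.random.default_rng(42)
n_criteria, n_draws = 5, 10000
w = rng.dirichlet(np.ones(n_criteria), size=n_draws)  # uniform on the simplex
w = np.sort(w, axis=1)[:, ::-1]   # impose the ranking w1 >= w2 >= ... >= w5

for i in range(n_criteria):
    lo, hi = np.percentile(w[:, i], [2.5, 97.5])
    print(f"criterion {i + 1}: weight = {w[:, i].mean():.3f} "
          f"(95% CrI {lo:.3f}-{hi:.3f})")
```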
Smoothed Particle Inference Analysis of SNR RCW 103
NASA Astrophysics Data System (ADS)
Frank, Kari A.; Burrows, David N.; Dwarkadas, Vikram
2016-04-01
We present preliminary results of applying a novel analysis method, Smoothed Particle Inference (SPI), to an XMM-Newton observation of SNR RCW 103. SPI is a Bayesian modeling process that fits a population of gas blobs ("smoothed particles") such that their superposed emission reproduces the observed spatial and spectral distribution of photons. Emission-weighted distributions of plasma properties, such as abundances and temperatures, are then extracted from the properties of the individual blobs. This technique has important advantages over analysis techniques which implicitly assume that remnants are two-dimensional objects in which each line of sight encompasses a single plasma. By contrast, SPI allows superposition of as many blobs of plasma as are needed to match the spectrum observed in each direction, without the need to bin the data spatially. This RCW 103 analysis is part of a pilot study for the larger SPIES (Smoothed Particle Inference Exploration of SNRs) project, in which SPI will be applied to a sample of 12 bright SNRs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasdekis, Andreas E.; Stephanopoulos, Gregory
The sampling and manipulation of cells down to the individual has been of substantial interest since the very beginning of the Life Sciences. Herein, our objective is to highlight the most recent developments in single cell manipulation, as well as pioneering ones. First, flow-through methods are discussed, namely methods in which single cells flow continuously in an ordered manner during their analysis. This section is followed by confinement techniques that enable cell isolation and confinement in one, two, or three dimensions. Flow cytometry and droplet microfluidics are the two most common methods of flow-through analysis. While both are high-throughput techniques, the difference lies in the fact that droplet-encapsulated cells experience a restricted and personal microenvironment, while in flow cytometry cells experience similar initial nutrient and stimuli concentrations. These methods are rather well established; however, they have recently enabled immense strides in single cell phenotypic analysis, namely the identification and analysis of metabolically distinct individuals from an isogenic population using both droplet microfluidics and flow cytometry.
BLIND EXTRACTION OF AN EXOPLANETARY SPECTRUM THROUGH INDEPENDENT COMPONENT ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldmann, I. P.; Tinetti, G.; Hollis, M. D. J.
2013-03-20
Blind-source separation techniques are used to extract the transmission spectrum of the hot Jupiter HD 189733b recorded by the Hubble/NICMOS instrument. Such a 'blind' analysis of the data is based on the concept of independent component analysis. The detrending of Hubble/NICMOS data using the sole assumption that non-Gaussian systematic noise is statistically independent from the desired light-curve signals is presented. By not assuming any prior or auxiliary information but the data themselves, it is shown that spectroscopic errors only about 10%-30% larger than parametric methods can be obtained for 11 spectral bins with bin sizes of ≈0.09 μm. This represents a reasonable trade-off between a higher degree of objectivity for the non-parametric methods and smaller standard errors for the parametric detrending. Results are discussed in light of previous analyses published in the literature. The fact that three very different analysis techniques yield comparable spectra is a strong indication of the stability of these results.
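A toy sketch of the blind-source idea, using scikit-learn's FastICA rather than the authors' pipeline: a transit-like light curve and a non-Gaussian systematic are mixed into several observed channels and then separated without any prior information. All signals and the mixing matrix are synthetic.

```python
# Hedged sketch: ICA separation of a transit signal from a systematic trend.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
transit = 1.0 - 0.01 * ((t > 0.4) & (t < 0.6))           # box-shaped transit
systematic = 0.02 * np.sign(np.sin(2 * np.pi * 7 * t))   # non-Gaussian trend

mixing = rng.random((4, 2)) + 0.5      # four observed channels, two sources
observed = np.column_stack([transit, systematic]) @ mixing.T
sources = FastICA(n_components=2, random_state=0).fit_transform(observed)
print(sources.shape)  # (500, 2): recovered transit and systematic, up to scale
```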
The Analysis of Image Segmentation Hierarchies with a Graph-based Knowledge Discovery System
NASA Technical Reports Server (NTRS)
Tilton, James C.; Cooke, Diane J.; Ketkar, Nikhil; Aksoy, Selim
2008-01-01
Currently available pixel-based analysis techniques do not effectively extract the information content from the increasingly available high spatial resolution remotely sensed imagery data. A general consensus is that object-based image analysis (OBIA) is required to effectively analyze this type of data. OBIA is usually a two-stage process: image segmentation followed by an analysis of the segmented objects. We are exploring an approach to OBIA in which hierarchical image segmentations provided by the Recursive Hierarchical Segmentation (RHSEG) software developed at NASA GSFC are analyzed by the Subdue graph-based knowledge discovery system developed by a team at Washington State University. In this paper we discuss our initial approach to representing the RHSEG-produced hierarchical image segmentations in a graphical form understandable by Subdue, and provide results on real and simulated data. We also discuss planned improvements designed to more effectively and completely convey the hierarchical segmentation information to Subdue and to improve processing efficiency.
Task 7: Endwall treatment inlet flow distortion analysis
NASA Technical Reports Server (NTRS)
Hall, E. J.; Topp, D. A.; Heidegger, N. J.; McNulty, G. S.; Weber, K. F.; Delaney, R. A.
1996-01-01
The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields, and to perform a series of detailed numerical predictions to assess the effectiveness of various endwall treatments for enhancing the efficiency and stall margin of modern high speed fan rotors. Particular attention was given to examining the effectiveness of endwall treatments to counter the undesirable effects of inflow distortion. Calculations were performed using three different gridding techniques based on the type of casing treatment being tested and the level of complexity desired in the analysis. In each case, the casing treatment itself is modeled as a discrete object in the overall analysis, and the flow through the casing treatment is determined as part of the solution. A series of calculations were performed for both treated and untreated modern fan rotors both with and without inflow distortion. The effectiveness of the various treatments were quantified, and several physical mechanisms by which the effectiveness of endwall treatments is achieved are discussed.
Advantages and limitations of potential methods for the analysis of bacteria in milk: a review.
Tabit, Frederick Tawi
2016-01-01
Contamination concerns in the dairy industry are motivated by outbreaks of disease in humans and the inability of thermal processes to eliminate bacteria completely in processed products. HACCP principles are an important tool used in the food industry to identify and control potential food safety hazards in order to meet customer demands and regulatory requirements. Milk testing is of importance to the milk industry with regard to quality assurance and the monitoring of processed products by researchers, manufacturers and regulatory agencies. Owing to the numerous methods in the literature for analysing the microbial quality of milk, and to differences in stakeholder priorities, it is sometimes confusing to choose an appropriate method for a particular analysis. The objective of this paper is to review the advantages and disadvantages of selected techniques that can be used in the analysis of bacteria in milk. SSC, HRMA, REP, and RAPD are the top four techniques which are quick and cost-effective and possess adequate discriminatory power for the detection and profiling of bacteria. The following conclusions were arrived at during this review: HRMA, REP and RFLP are the techniques with the most reproducible results, and the techniques with the most discriminatory power are AFLP, PFGE and Raman spectroscopy.
NASA Astrophysics Data System (ADS)
Maurya, S. P.; Singh, K. H.; Singh, N. P.
2018-05-01
In the present study, three recently developed geostatistical methods, single-attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region for the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
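A rough sketch of a probabilistic-network-style estimator under stated assumptions: a general regression neural network (Parzen-window kernel regression, a close relative of the probabilistic neural network used for regression) maps seismic attributes observed at wells to porosity and is then evaluated at inter-well locations. The attributes, porosities, and kernel bandwidth are toy values.

```python
# Hedged sketch: GRNN-style kernel regression from seismic attributes to porosity.
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    # Gaussian kernel weights between query and training attribute vectors
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)   # weighted average of well porosities

rng = np.random.default_rng(1)
attrs = rng.random((20, 3))             # e.g. impedance, amplitude, frequency
poro = 0.05 + 0.2 * attrs[:, 0]         # toy attribute-porosity relation
query = rng.random((5, 3))              # inter-well attribute samples
print(grnn_predict(attrs, poro, query))
```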
High Temperature Transparent Furnace Development
NASA Technical Reports Server (NTRS)
Bates, Stephen C.
1997-01-01
This report describes the use of novel techniques for heat containment that could be used to build a high temperature transparent furnace. The primary objective of the work was to experimentally demonstrate transparent furnace operation at 1200 C. Secondary objectives were to understand furnace operation and furnace component specification to enable the design and construction of a low power prototype furnace for delivery to NASA in a follow-up project. The basic approach of the research was to couple high temperature component design with simple concept demonstration experiments that modify a commercially available transparent furnace rated at a lower temperature. A detailed energy balance of the operating transparent furnace was performed, calculating heat losses through the furnace components as a result of conduction, radiation, and convection. The transparent furnace shells and furnace components were redesigned to permit furnace operation at 1200 C or above. Techniques were developed that are expected to lead to significantly improved heat containment compared with current transparent furnaces. The design of a thermal profile in a multizone high temperature transparent furnace was also addressed. Experiments were performed to verify the energy balance analysis, to demonstrate some of the major furnace improvement techniques developed, and to demonstrate the overall feasibility of a high temperature transparent furnace. The central objective of the research was achieved: demonstrating the feasibility of operating a transparent furnace at 1200 C.
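A minimal sketch of the three loss terms such an energy balance sums: conduction through the shell, radiation, and convection from the hot surface. The geometry and material values below are invented for illustration, not the furnace's specification.

```python
# Hedged sketch: conduction, radiation, and convection losses from a hot surface.
SIGMA = 5.670e-8   # W m^-2 K^-4, Stefan-Boltzmann constant

def furnace_losses(T_hot=1473.0, T_amb=293.0, area=0.05, k=1.4,
                   thickness=0.01, emissivity=0.8, h_conv=10.0):
    # All parameter values are assumptions (1200 C hot face ~ 1473 K).
    q_cond = k * area * (T_hot - T_amb) / thickness         # through the shell
    q_rad = emissivity * SIGMA * area * (T_hot**4 - T_amb**4)
    q_conv = h_conv * area * (T_hot - T_amb)
    return q_cond, q_rad, q_conv

print([f"{q:.0f} W" for q in furnace_losses()])
```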
Optical micro-profilometry for archaeology
NASA Astrophysics Data System (ADS)
Carcagni, Pierluigi; Daffara, Claudia; Fontana, Raffaella; Gambino, Maria Chiara; Mastroianni, Maria; Mazzotta, Cinazia; Pampaloni, Enrico; Pezzati, Luca
2005-06-01
A quantitative morphological analysis of archaeological objects represents an important element for historical evaluations, artistic studies and conservation projects. At present, a variety of contact instruments for high-resolution surface survey is available on the market, but because of their invasiveness they are not well received in the field of artwork conservation. On the contrary, optical testing techniques have seen successful growth in the last few years due to their effectiveness and safety. In this work we present a few examples of the application of high-resolution 3D techniques to the survey of archaeological objects. Measurements were carried out by means of an optical micro-profilometer composed of a commercial conoprobe mounted on a scanning device that allows a maximum sampled area of 280×280 mm². Measurements as well as roughness calculations were carried out on selected areas, representative of the differently degraded surface, of a Hellenistic bronze statue to document the surface corrosion before the restoration intervention started. Two highly corroded ancient coins and a limestone column were surveyed to enhance the relief of inscriptions and drawings for dating purposes. High-resolution 3D survey, beyond the faithful representation of objects, makes it possible to display the surface in an image format that can be processed with image processing software. The application of digital filters as well as rendering techniques eases the readability of the smallest details.
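A minimal sketch of the roughness figures typically derived from such scans: arithmetic-mean and root-mean-square roughness of a height map after removing a flat reference. The synthetic surface stands in for the profilometer data.

```python
# Hedged sketch: areal roughness (Sa, Sq) from a height map.
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(0.0, 2.0, (256, 256))  # heights in micrometres (toy data)

residual = z - z.mean()               # flat reference plane, for simplicity
sa = np.abs(residual).mean()          # arithmetic mean roughness
sq = np.sqrt((residual**2).mean())    # root-mean-square roughness
print(f"Sa = {sa:.2f} um, Sq = {sq:.2f} um")
```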
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuhauser, K.; Gates, C.
This research project evaluates post-retrofit performance measurements, energy use data and construction costs for 13 projects that participated in the National Grid Deep Energy Retrofit Pilot program. The projects implemented a package of measures defined by performance targets for building enclosure components and building enclosure air tightness. Nearly all of the homes reached a post-retrofit air tightness result of 1.5 ACH50. Homes that used the chainsaw retrofit technique along with roof insulation and wall insulation applied to the exterior had the best air tightness results and the lowest heating and cooling source energy use. Analysis of measure costs and project objectives yielded a categorization of costs relative to energy performance objectives. On average, about half of the energy-related measure costs correspond primarily to energy-related objectives, and 20% of energy-related measure costs relate primarily to non-energy objectives.
NASA Astrophysics Data System (ADS)
Prod'homme, Thibaut; Verhoeve, P.; Kohley, R.; Short, A.; Boudin, N.
2014-07-01
The science objectives of space missions using CCDs to carry out accurate astronomical measurements are put at risk by the radiation-induced increase in charge transfer inefficiency (CTI) that results from trapping sites in the CCD silicon lattice. A variety of techniques are used to obtain CTI values and derive trap parameters; however, they often differ in their results. To identify and understand these differences, we take advantage of an on-going comprehensive characterisation of an irradiated Euclid prototype CCD including the following techniques: X-ray, trap pumping, flat-field extended pixel edge response, and first pixel response. We then proceed to a comparative analysis of the results obtained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, J.S.; Brown, L.R.; Thieling, S.C.
1995-12-31
The Mississippi Office of Geology has conducted field trials using the surface exploration techniques of geomicrobial analysis, radiometrics, and free soil gas. The objective of these trials is to determine whether Mississippi oil and gas fields have surface hydrocarbon expression resulting from vertical microseepage migration. Six fields have been surveyed, ranging in depth from 3,330 ft to 18,500 ft. The fields differ in trapping styles and hydrocarbon type. The results so far indicate that these fields do have a surface expression and that geomicrobial analysis as well as radiometrics and free soil gas can detect hydrocarbon microseepage from pressurized reservoirs. All three exploration techniques located the reservoirs independent of depth, hydrocarbon type, or trapping style.
Empirical performance of interpolation techniques in risk-neutral density (RND) estimation
NASA Astrophysics Data System (ADS)
Bahaludin, H.; Abdullah, M. H.
2017-03-01
The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The results of the LOOCV pricing error show that interpolation using a fourth-order polynomial provides the best fit to option prices, as it has the lowest pricing error.
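A minimal sketch of the LOOCV pricing error used for interpolation selection: each option is held out in turn, the remaining strike-price pairs are fitted with a polynomial of the chosen order, and the held-out price is re-predicted; the RMSE over all leave-outs scores the interpolation. The data are toy values, not market prices.

```python
# Hedged sketch: leave-one-out cross-validation pricing error for
# polynomial interpolation of option prices across strikes.
import numpy as np

def loocv_error(strikes, prices, degree):
    errs = []
    for i in range(len(strikes)):
        mask = np.arange(len(strikes)) != i          # hold out option i
        coef = np.polyfit(strikes[mask], prices[mask], degree)
        errs.append(prices[i] - np.polyval(coef, strikes[i]))
    return np.sqrt(np.mean(np.square(errs)))         # RMSE over leave-outs

strikes = np.linspace(90, 110, 11)
prices = np.maximum(100 - strikes, 0) + 2.0 + 0.01 * (strikes - 100) ** 2
for deg in (2, 4):
    print(f"degree {deg}: RMSE = {loocv_error(strikes, prices, deg):.4f}")
```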
Colin D. MacLean
1980-01-01
Identification of opportunities for silvicultural treatment from inventory data is an important objective of Renewable Resources Evaluation in the Pacific Northwest. This paper describes the field plot design and data analysis procedure used by what used to be known as Forest Survey to determine the treatment opportunity associated with each inventory plot in western...
Assessing Resource Value and Relationships Between Objectives in Effects-Based Operations
2006-03-01
…terms of a set of desired end states for the campaign's system of systems. Value theory was used to identify the resource's value in terms of the direct… The report covers System of Systems Analysis (SoSA) definitions and notation, mathematical notation to describe an enemy system, and weighting techniques.
Acoustic Seaglider: PhilSea10 Data Analysis
2016-06-13
…and (simple) Kalman filtering techniques will be explored to utilize the unique time-space sound speed sampling of the Seagliders to generate snapshots… temperature and salinity were deployed (Figure 1). General objectives of the experiment are to understand the acoustic propagation in the… an acoustic recording system (ARS) to record the moored source transmissions, as well as temperature, salinity and pressure sensors (from which…
Training Effectiveness and Cost Iterative Technique (TECIT). Volume 2. Cost Effectiveness Analysis
1988-07-01
Moving Tank in a Field Exercise. The task cluster identified as tank commander's station/tank gunnery and the sub-task of firing an M250 grenade launcher… Firing Procedures, Task Number 171-126-1028. OBJECTIVE: Given an M1 tank with crew, a loaded M250 grenade launcher, and the commander's station powered up…
Wildlife habitats of the north coast of California: new techniques for extensive forest inventory.
Janet L. Ohmann
1992-01-01
A study was undertaken to develop methods for extensive inventory and analysis of wildlife habitats. The objective was to provide information about amounts and conditions of wildlife habitats from extensive, sample based inventories so that wildlife can be better considered in forest planning and policy decisions at the regional scale. The new analytical approach...
Design of an Orbital Inspection Satellite
1986-12-01
December 1986. Approved for public release; distribution… lends itself to the technique of multi-objective analysis. The final step is planning for action. This communicates the entire systems engineering…
Gender Positioning in Education: A Critical Image Analysis of ESL Texts.
ERIC Educational Resources Information Center
Giaschi, Peter
2000-01-01
This article is adapted from a project report prepared for the author's Master's degree in Education. The objective is to report the use of an adapted analytical technique for examining the images contained in contemporary English-as-a-Second-Language (ESL) textbooks. The point of departure for the study was the identification of the trend in mass…