Sample records for digital processing method

  1. 3D measurement by digital photogrammetry

    NASA Astrophysics Data System (ADS)

    Schneider, Carl T.

    1993-12-01

    Photogrammetry is well known in geodetic surveying as aerial photogrammetry and in close-range applications such as architectural photogrammetry. Photogrammetric methods and algorithms, combined with digital cameras and digital image processing, are now being introduced for industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods. These algorithms and methods are demonstrated with application examples: a digital photogrammetric workstation serving as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.

  2. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  3. Digital mammography, cancer screening: Factors important for image compression

    NASA Technical Reports Server (NTRS)

    Clarke, Laurence P.; Blaine, G. James; Doi, Kunio; Yaffe, Martin J.; Shtern, Faina; Brown, G. Stephen; Winfield, Daniel L.; Kallergi, Maria

    1993-01-01

    The use of digital mammography for breast cancer screening poses several novel problems, such as the development of digital sensors, computer-assisted diagnosis (CAD) methods for image noise suppression, enhancement, and pattern recognition, and compression algorithms for image storage, transmission, and remote diagnosis. X-ray digital mammography using novel direct digital detection schemes or film digitizers results in large data sets and, therefore, image compression methods will play a significant role in the image processing and analysis by CAD techniques. In view of the extensive compression required, the relative merit of 'virtually lossless' versus lossy methods should be determined. A brief overview is presented here of the developments of digital sensors, CAD, and compression methods currently proposed and tested for mammography. The objective of the NCI/NASA Working Group on Digital Mammography is to stimulate the interest of the image processing and compression scientific community in this medical application and to identify possible dual-use technologies within the NASA centers.

  4. Digital storytelling as a method in health research: a systematic review protocol.

    PubMed

    Rieger, Kendra L; West, Christina H; Kenny, Amanda; Chooniedass, Rishma; Demczuk, Lisa; Mitchell, Kim M; Chateau, Joanne; Scott, Shannon D

    2018-03-05

    Digital storytelling is an arts-based research method with potential to elucidate complex narratives in a compelling manner, increase participant engagement, and enhance the meaning of research findings. This method involves the creation of a 3- to 5-min video that integrates multimedia materials including photos, participant voices, drawings, and music. Given the significant potential of digital storytelling to meaningfully capture and share participants' lived experiences, a systematic review of its use in healthcare research is crucial to develop an in-depth understanding of how researchers have used this method, with an aim to refine and further inform future iterations of its use. We aim to identify and synthesize evidence on the use, impact, and ethical considerations of using digital storytelling in health research. The review questions are as follows: (1) What is known about the purpose, definition, use (processes), and contexts of digital storytelling as part of the research process in health research? (2) What impact does digital storytelling have upon the research process, knowledge development, and healthcare practice? (3) What are the key ethical considerations when using digital storytelling within qualitative, quantitative, and mixed method research studies? Key databases and the grey literature will be searched from 1990 to the present for qualitative, quantitative, and mixed methods studies that utilized digital storytelling as part of the research process. Two independent reviewers will screen and critically appraise relevant articles with established quality appraisal tools. We will extract narrative data from all studies with a standardized data extraction form and conduct a thematic analysis of the data. To facilitate innovative dissemination through social media, we will develop a visual infographic and three digital stories to illustrate the review findings, as well as methodological and ethical implications. In collaboration with national and international experts in digital storytelling, we will synthesize key evidence about digital storytelling that is critical to the development of methodological and ethical expertise about arts-based research methods. We will also develop recommendations for incorporating digital storytelling in a meaningful and ethical manner into the research process. PROSPERO registry number CRD42017068002 .

  5. Implementation and extension of the impulse transfer function method for future application to the space shuttle project. Volume 2: Program description and user's guide

    NASA Technical Reports Server (NTRS)

    Patterson, G.

    1973-01-01

    The data processing procedures and the computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes; (2) processing of the Phase I digital tapes to extract ITFs and store them in a permanent data bank; and (3) prediction of structural responses to a set of applied loads. The analog-to-digital conversion is performed by a standard package, which will be described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs have been developed to perform the digital processing.

  6. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    The Coriolis mass flowmeter (CMF) often suffers under two-phase flow conditions, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for the CMF are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital-to-analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
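
    A minimal sketch of the zero-crossing idea described above, assuming first-order (linear) Lagrange interpolation between the two samples that bracket each sign change; the paper's actual interpolation order, filtering, and DSP implementation are not reproduced here, and the signal names and sampling rate are illustrative.

    ```python
    import numpy as np

    def zero_crossing_times(signal, fs):
        """Estimate rising zero-crossing times using first-order Lagrange
        (linear) interpolation between the samples bracketing each sign change."""
        s = np.asarray(signal, dtype=float)
        idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]        # rising crossings
        frac = -s[idx] / (s[idx + 1] - s[idx])                # sub-sample position
        return (idx + frac) / fs

    def frequency_and_phase(sig_a, sig_b, fs):
        """Frequency from the mean period of sig_a; phase lag (rad) of sig_b
        relative to sig_a from the offset of their zero-crossing times."""
        ta, tb = zero_crossing_times(sig_a, fs), zero_crossing_times(sig_b, fs)
        freq = 1.0 / np.mean(np.diff(ta))
        n = min(len(ta), len(tb))
        dphi = 2 * np.pi * freq * np.mean(tb[:n] - ta[:n])
        dphi = (dphi + np.pi) % (2 * np.pi) - np.pi           # wrap to [-pi, pi)
        return freq, dphi

    # Example: two 100 Hz tones sampled at 10 kHz, b lagging a by 0.2 rad
    fs = 10_000
    t = np.arange(0, 0.1, 1 / fs)
    a = np.sin(2 * np.pi * 100 * t)
    b = np.sin(2 * np.pi * 100 * t - 0.2)
    print(frequency_and_phase(a, b, fs))                      # approximately (100.0, 0.2)
    ```

    Interpolating between samples gives sub-sample timing resolution, which is what makes frequency and phase-difference estimates of this kind usable for synthesizing a drive signal.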

  7. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B; Sturm, Benjamin W

    2014-11-11

    A scintillator radiation detector system according to one embodiment includes a scintillator; and a processing device for processing pulse traces corresponding to light pulses from the scintillator, wherein pulse digitization is used to improve energy resolution of the system. A scintillator radiation detector system according to another embodiment includes a processing device for fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times and performing a direct integration of fit parameters. A method according to yet another embodiment includes processing pulse traces corresponding to light pulses from a scintillator, wherein pulse digitization is used to improve energy resolution of the system. A method in a further embodiment includes fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times; and performing a direct integration of fit parameters. Additional systems and methods are also presented.

  8. Modular error embedding

    DOEpatents

    Sandford, II, Maxwell T.; Handel, Theodore G.; Ettinger, J. Mark

    1999-01-01

    A method of embedding auxiliary information into the digital representation of host data containing noise in the low-order bits. The method applies to digital data representing analog signals, for example digital images. The method reduces the error introduced by other methods that replace the low-order bits with auxiliary information. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user through use of a digital key. The modular error embedding method includes a process to permute the order in which the host data values are processed. The method doubles the amount of auxiliary information that can be added to host data values, in comparison with bit-replacement methods for high bit-rate coding. The invention preserves human perception of the meaning and content of the host data, permitting the addition of auxiliary data in the amount of 50% or greater of the original host data.

  9. Digital processing of radiographic images for print publication.

    PubMed

    Cockerill, James W

    2002-01-01

    Digital imaging of X-rays yields high-quality, evenly exposed negatives and prints. This article outlines the materials and methods of this technique and discusses the advantages of digital radiographic images.

  10. Optimization of digital image processing to determine quantum dots' height and density from atomic force microscopy.

    PubMed

    Ruiz, J E; Paciornik, S; Pinto, L D; Ptak, F; Pires, M P; Souza, P L

    2018-01-01

    An optimized method of digital image processing to interpret quantum dots' height measurements obtained by atomic force microscopy is presented. The method was developed by combining well-known digital image processing techniques and particle recognition algorithms. The properties of quantum dot structures strongly depend on dot height, among other features. Determination of their height is sensitive to small variations in the digital image processing parameters, which can generate misleading results. Comparing the results obtained with two image processing techniques - a conventional method and the new method proposed herein - with the data obtained by determining the height of quantum dots one by one within a fixed area showed that the optimized method leads to more accurate results. Moreover, the log-normal distribution, which is often used to represent natural processes, shows a better fit to the quantum dots' height histogram obtained with the proposed method. Finally, the quantum dots' heights obtained were used to calculate the predicted photoluminescence peak energies, which were compared with the experimental data. Again, a better match was observed when using the proposed method to evaluate the quantum dots' height. Copyright © 2017 Elsevier B.V. All rights reserved.
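
    As an illustration of the log-normal check mentioned above, the short sketch below fits a log-normal distribution to a set of dot heights and tests the fit. The heights are synthetic stand-ins (the study's AFM data are not reproduced), and the use of SciPy's generic lognorm fit is an assumption, not the authors' pipeline.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical AFM-derived quantum-dot heights (nm); real values would come
    # from the segmented AFM image, not from a random generator.
    rng = np.random.default_rng(0)
    heights_nm = rng.lognormal(mean=np.log(4.0), sigma=0.25, size=500)

    # Fit a log-normal distribution (location fixed at 0, as is usual for sizes)
    shape, loc, scale = stats.lognorm.fit(heights_nm, floc=0)
    print(f"median height ~ {scale:.2f} nm, sigma ~ {shape:.2f}")

    # Goodness of fit via a Kolmogorov-Smirnov test against the fitted model
    ks_stat, p_value = stats.kstest(heights_nm, "lognorm", args=(shape, loc, scale))
    print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
    ```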

  11. Spectroscopic analysis and control

    DOEpatents

    Tate, James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.

  12. Methods in Astronomical Image Processing

    NASA Astrophysics Data System (ADS)

    Jörsäter, S.

    Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future

  13. Review of digital holography reconstruction methods

    NASA Astrophysics Data System (ADS)

    Dovhaliuk, Rostyslav Yu.

    2018-01-01

    The development of digital holography has opened new ways for the non-destructive study of both transparent and opaque objects. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed, and the details of a software implementation of digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations about possible use cases for each of them are given.

  14. Three-dimensional image signals: processing methods

    NASA Astrophysics Data System (ADS)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

    Over the years, extensive studies have been carried out to apply coherent optics methods to real-time processing, communications, and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature survey of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms": holograms that can be stored on a computer and transmitted over conventional networks. We present some research methods to process "digital holograms" for Internet transmission, together with results.

  15. 78 FR 33098 - Prospective Grant of Co-Exclusive Licenses: Multi-Focal Structured Illumination Microscopy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-03

    ...-Exclusive Licenses: Multi-Focal Structured Illumination Microscopy Systems and Methods AGENCY: National... pertains to a system and method for digital confocal microscopy that rapidly processes enhanced images. In particular, the invention is a method for digital confocal microscopy that includes a digital mirror device...

  16. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consist of an all-digital, minimum-bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.

  17. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  18. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  19. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  20. Apparatus and method for managing digital resources by passing digital resource tokens between queues

    DOEpatents

    Crawford, H.J.; Lindenstruth, V.

    1999-06-29

    A method of managing digital resources of a digital system includes the step of reserving token values for certain digital resources in the digital system. A selected token value in a free-buffer-queue is then matched to an incoming digital resource request. The selected token value is then moved to a valid-request-queue. The selected token is subsequently removed from the valid-request-queue to allow a digital agent in the digital system to process the incoming digital resource request associated with the selected token. Thereafter, the selected token is returned to the free-buffer-queue. 6 figs.

  1. Apparatus and method for managing digital resources by passing digital resource tokens between queues

    DOEpatents

    Crawford, Henry J.; Lindenstruth, Volker

    1999-01-01

    A method of managing digital resources of a digital system includes the step of reserving token values for certain digital resources in the digital system. A selected token value in a free-buffer-queue is then matched to an incoming digital resource request. The selected token value is then moved to a valid-request-queue. The selected token is subsequently removed from the valid-request-queue to allow a digital agent in the digital system to process the incoming digital resource request associated with the selected token. Thereafter, the selected token is returned to the free-buffer-queue.
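
    A minimal sketch of the token-queue discipline described in the two records above, with Python deques standing in for the free-buffer-queue and valid-request-queue; the class and method names are illustrative and are not taken from the patent.

    ```python
    from collections import deque

    class TokenResourceManager:
        """Tokens reserved for digital resources circulate between a
        free-buffer-queue and a valid-request-queue (assumed semantics)."""

        def __init__(self, num_tokens):
            self.free_buffer_queue = deque(range(num_tokens))   # reserved token values
            self.valid_request_queue = deque()                  # tokens matched to requests
            self.pending = {}                                   # token -> request payload

        def submit(self, request):
            """Match an incoming resource request to a free token, if any."""
            if not self.free_buffer_queue:
                return None                                     # no resource available
            token = self.free_buffer_queue.popleft()
            self.pending[token] = request
            self.valid_request_queue.append(token)
            return token

        def process_next(self, agent):
            """A digital agent removes a token from the valid-request-queue,
            processes the associated request, then returns the token."""
            if not self.valid_request_queue:
                return None
            token = self.valid_request_queue.popleft()
            result = agent(self.pending.pop(token))
            self.free_buffer_queue.append(token)                # token recycled
            return result

    # Usage: three tokens, a trivial agent that echoes the request
    mgr = TokenResourceManager(3)
    for req in ["read A", "write B", "read C", "write D"]:
        print(req, "->", mgr.submit(req))                       # fourth request gets None
    print(mgr.process_next(lambda r: f"handled {r}"))
    ```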

  2. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  3. Comparison of traditional nondestructive analysis of RERTR fuel plates with digital radiographic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidsmeier, T.; Koehl, R.; Lanham, R.

    2008-07-15

    The current design and fabrication process for RERTR fuel plates utilizes film radiography during nondestructive testing and characterization. Digital radiographic methods offer a potential increase in efficiency and accuracy. The traditional and digital radiographic methods are described and demonstrated on a fuel plate constructed with an average of 51% fuel by volume using the dispersion method. Fuel loading data from each method are analyzed and compared to a third baseline method to assess accuracy. The new digital method is shown to be more accurate, to save hours of work, and to provide additional information not easily available with the traditional method. Additional possible improvements suggested by the new digital method are also raised. (author)

  4. A method for the processing and analysis of digital terrain elevation data. [Shiprock and Gallup Quadrangles, Arizona and New Mexico

    NASA Technical Reports Server (NTRS)

    Junkin, B. G. (Principal Investigator)

    1979-01-01

    A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.

  5. Physical Principles of the Method for Determination of Geometrical Characteristics and Particle Recognition in Digital Holography

    NASA Astrophysics Data System (ADS)

    Dyomin, V. V.; Polovtsev, I. G.; Davydova, A. Yu.

    2018-03-01

    The physical principles of a method for determining the geometrical characteristics of particles and for particle recognition are reported; the method is based on the concepts of digital holography, followed by processing of the particle images reconstructed from the digital hologram using a morphological parameter. An example of the application of this method to fast plankton particle recognition is given.

  6. Digital signal processing methods for biosequence comparison.

    PubMed Central

    Benson, D C

    1990-01-01

    A method is discussed for DNA or protein sequence comparison using a finite field fast Fourier transform, a digital signal processing technique, and statistical methods are discussed for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N, compared to N² for methods currently used. This makes it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
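
    A brief sketch of the N log N comparison idea, assuming an ordinary complex FFT over one-hot base indicators rather than the finite-field transform the paper actually uses; the example sequences are illustrative.

    ```python
    import numpy as np

    def one_hot(seq, alphabet="ACGT"):
        """Indicator tracks for each base, shape (len(alphabet), len(seq))."""
        return np.array([[c == a for c in seq] for a in alphabet], dtype=float)

    def match_counts(seq1, seq2):
        """Number of position-wise matches of seq2 against seq1 at every relative
        shift, computed by FFT-based cross-correlation in O(N log N)."""
        n = len(seq1) + len(seq2) - 1
        nfft = 1 << (n - 1).bit_length()                 # next power of two
        a, b = one_hot(seq1), one_hot(seq2)
        A = np.fft.rfft(a, nfft)
        B = np.fft.rfft(b[:, ::-1], nfft)                # reversal turns convolution into correlation
        corr = np.fft.irfft(A * B, nfft)[:, :n].sum(axis=0)
        return np.rint(corr).astype(int)

    counts = match_counts("ACGTACGTAC", "GTAC")
    best_shift = counts.argmax() - (len("GTAC") - 1)     # shift of seq2 relative to seq1
    print(best_shift, counts.max())                      # shift 2 with 4 matches
    ```

    The key point is that all shifts are scored at once by one transform pair, which is where the N log N cost comes from.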

  7. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in the digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper deals with the collaborative work of specialists in medicine, applied mathematics, and computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  8. Digital Microdroplet Ejection Technology-Based Heterogeneous Objects Prototyping

    PubMed Central

    Yang, Jiquan; Feng, Chunmei; Yang, Jianfei; Zhu, Liya; Guo, Aiqing

    2016-01-01

    An integrated fabrication framework is presented to build heterogeneous objects (HEO) using digital microdroplet injection technology and rapid prototyping. A design and manufacturing method for heterogeneous-material parts, covering both structure and material, is used in place of the traditional process. The net node method is used for digital modeling and can configure multiple materials over time. The relationship between material, color, and jetting nozzle is established. The main contributions are to combine structure, material, and visualization in one process and to provide the digital model for manufacture. From the given model, it is concluded that the method is effective for HEO: using microdroplet rapid prototyping and the model given in the paper, HEO can essentially be obtained. The model could be used in 3D biomanufacturing. PMID:26981110

  9. Digital Microdroplet Ejection Technology-Based Heterogeneous Objects Prototyping.

    PubMed

    Li, Na; Yang, Jiquan; Feng, Chunmei; Yang, Jianfei; Zhu, Liya; Guo, Aiqing

    2016-01-01

    An integrated fabrication framework is presented to build heterogeneous objects (HEO) using digital microdroplet injection technology and rapid prototyping. A design and manufacturing method for heterogeneous-material parts, covering both structure and material, is used in place of the traditional process. The net node method is used for digital modeling and can configure multiple materials over time. The relationship between material, color, and jetting nozzle is established. The main contributions are to combine structure, material, and visualization in one process and to provide the digital model for manufacture. From the given model, it is concluded that the method is effective for HEO: using microdroplet rapid prototyping and the model given in the paper, HEO can essentially be obtained. The model could be used in 3D biomanufacturing.

  10. Synthesis method from low-coherence digital holograms for improvement of image quality in holographic display.

    PubMed

    Mori, Yutaka; Nomura, Takanori

    2013-06-01

    In holographic displays, it is undesirable to observe speckle noise in the reconstructed images. A method for improving reconstructed image quality by synthesizing low-coherence digital holograms is proposed. Speckle-free reconstruction of holograms is possible owing to low-coherence digital holography. An image sensor records low-coherence digital holograms, and the holograms are synthesized by computational calculation. Two approaches, the threshold-processing and picking-a-peak methods, are proposed in order to reduce the random noise of low-coherence digital holograms. The reconstructed image quality of the proposed methods is compared with the case of high-coherence digital holography. A quantitative evaluation is given to confirm the proposed methods. In addition, a visual evaluation by 15 people is also shown.

  11. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

    Described here are the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.

  12. Fourier analysis and signal processing by use of the Moebius inversion formula

    NASA Technical Reports Server (NTRS)

    Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.

    1990-01-01

    A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
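
    For reference, the number-theoretic identity this approach rests on is the classical Moebius inversion formula; a standard statement (the divisor-sum form, together with the series form that is closer to what is used when recovering Fourier coefficients from averaged samples) is:

    ```latex
    % Moebius inversion, divisor-sum form:
    G(n) = \sum_{d \mid n} F(d) \ \ (n \ge 1)
    \quad\Longleftrightarrow\quad
    F(n) = \sum_{d \mid n} \mu(d)\, G\!\left(\frac{n}{d}\right),
    %
    % and, for suitably convergent series,
    F(x) = \sum_{n=1}^{\infty} G(nx)
    \quad\Longrightarrow\quad
    G(x) = \sum_{n=1}^{\infty} \mu(n)\, F(nx),
    %
    % where \mu(1) = 1, \mu(n) = (-1)^k if n is a product of k distinct primes,
    % and \mu(n) = 0 if n is divisible by a square.
    ```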

  13. Deformation analysis of MEMS structures by modified digital moiré methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwei; Lou, Xinhao; Gao, Jianxin

    2010-11-01

    Quantitative deformation analysis of micro-fabricated electromechanical systems is of importance for the design and functional control of microsystems. In this paper, two modified digital moiré processing methods, a Gaussian blurring algorithm combined with digital phase shifting and a geometrical phase analysis (GPA) technique based on the digital moiré method, are developed to quantitatively analyse the deformation behaviour of micro-electro-mechanical system (MEMS) structures. The measuring principles and experimental procedures of the two methods are described in detail. A digital moiré fringe pattern is generated by superimposing a specimen grating etched directly on a microstructure surface with a digital reference grating (DRG). Most of the grating noise is removed from the digital moiré fringes, which enables the phase distribution of the moiré fringes to be obtained directly. Strain measurement results of a MEMS structure demonstrate the feasibility of the two methods.

  14. Some Aspects in Photogrammetry Education at the Department of Geodesy and Cadastre of the VGTU

    NASA Astrophysics Data System (ADS)

    Ruzgienė, Birutė

    2008-03-01

    Education in photogrammetry is very important when applying photogrammetric methods for terrain mapping purposes, for spatial data modelling, for solving engineering tasks, for measuring architectural monuments, etc. Over time, traditional photogrammetric technologies have been changing to a modern, fully digital photogrammetric workflow. The number of potential users of photogrammetric methods tends to increase because of the high degree of automation in photograph (image) processing. The main subjects in the photogrammetry (particularly digital photogrammetry) educational process are discussed. Different methods and digital systems are demonstrated with examples of aerial photogrammetry products. The main objective is to explore the possibilities for training in photogrammetric measurements. Special attention is paid to stereo plotting from aerial photography, applying analytical technology modified for teaching. The integration of the functionality of digital photogrammetric systems and digital image processing is analysed as well, with the intention of extending the application areas and possibilities for the use of modern technologies in urban mapping and land cadastre. The practical presentation of photo geometry restitution is implemented as a significant part of the studies. Interactive teaching of the main photogrammetric procedures and controlling systems is highly desirable and without doubt improves the quality of the educational process.

  15. Methods of Adapting Digital Content for the Learning Process via Mobile Devices

    ERIC Educational Resources Information Center

    Lopez, J. L. Gimenez; Royo, T. Magal; Laborda, Jesus Garcia; Calvo, F. Garde

    2009-01-01

    This article analyses different methods of adapting digital content for its delivery via mobile devices, taking into account two aspects which are a fundamental part of the learning process: on the one hand, the functionality of the contents, and on the other, the actual controlled navigation requirements that the learner needs in order to acquire high…

  16. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image processing methods available, but with better control over the spurious fragments in the image.

  17. Digital Documentation: Using Computers to Create Multimedia Reports.

    ERIC Educational Resources Information Center

    Speitel, Tom; And Others

    1996-01-01

    Describes methods for creating integrated multimedia documents using recent advances in print, audio, and video digitization that bring added usefulness to computers as data acquisition, processing, and presentation tools. Discusses advantages of digital documentation. (JRH)

  18. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan [Livermore, CA]; Tran, Binh-Nien [San Ramon, CA]

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple, low-cost method to implement a high-speed digital IF modulator using field-programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up tables (LUTs). The high-speed input data stream is processed in parallel using the corresponding LUTs, which reduces the required processing speed and allows the use of low-cost FPGAs.
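
    A minimal software sketch of the look-up-table idea described above, assuming a 16-QAM constellation, eight IF samples per symbol, and two carrier cycles per symbol; these parameters, the bit mapping, and the software (rather than FPGA) setting are illustrative only.

    ```python
    import numpy as np

    SAMPLES_PER_SYMBOL = 8          # IF carrier samples per QAM symbol (assumed)
    CARRIER_CYCLES = 2              # carrier cycles per symbol (assumed)

    phase = 2 * np.pi * CARRIER_CYCLES * np.arange(SAMPLES_PER_SYMBOL) / SAMPLES_PER_SYMBOL
    COS_LUT = np.cos(phase)         # "ROM" tables, computed once
    SIN_LUT = np.sin(phase)

    # 16-QAM: 2 bits select an amplitude level on each of I and Q (plain binary mapping)
    LEVELS = np.array([-3, -1, 1, 3], dtype=float)

    def qam16_if_samples(bits):
        """Map groups of 4 bits to 16-QAM symbols and synthesize IF samples from
        the look-up tables (no per-sample multiplies by a live oscillator)."""
        bits = np.asarray(bits).reshape(-1, 4)
        i_lvl = LEVELS[bits[:, 0] * 2 + bits[:, 1]]
        q_lvl = LEVELS[bits[:, 2] * 2 + bits[:, 3]]
        # s = I*cos - Q*sin, one row of SAMPLES_PER_SYMBOL samples per symbol
        return (np.outer(i_lvl, COS_LUT) - np.outer(q_lvl, SIN_LUT)).ravel()

    if_signal = qam16_if_samples([0, 1, 1, 0, 1, 1, 0, 0])
    print(if_signal.shape)          # (16,) -> two symbols, eight samples each
    ```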

  19. Online Farsi digit recognition using their upper half structure

    NASA Astrophysics Data System (ADS)

    Ghods, Vahid; Sohrabi, Mohammad Karim

    2015-03-01

    In this paper, we investigated the efficiency of the upper-half structure of Farsi numerical digits. In other words, half of the data (the upper half of the digit shapes) was exploited for the recognition of Farsi numerical digits. This method can be used for both offline and online recognition. Using half of the data is more effective in terms of processing speed, data transfer and, in this application, accuracy. A hidden Markov model (HMM) was used to classify online Farsi digits. Evaluation was performed on the TMU dataset, which contains more than 1200 samples of online handwritten Farsi digits. The proposed method yielded a higher recognition rate.

  20. Apparatus and method for loading and unloading multiple digital tape cassettes utilizing a removable magazine

    DOEpatents

    Lindenmeyer, C.W.

    1993-01-26

    An apparatus and method to automate the handling of multiple digital tape cassettes for processing by commercially available cassette tape readers and recorders. A removable magazine rack stores a plurality of tape cassettes, and cooperates with a shuttle device that automatically inserts and removes cassettes from the magazine to the reader and vice-versa. Photocells are used to identify and index to the desired tape cassette. The apparatus allows digital information stored on multiple cassettes to be processed without significant operator intervention.

  1. Apparatus and method for loading and unloading multiple digital tape cassettes utilizing a removable magazine

    DOEpatents

    Lindenmeyer, Carl W.

    1993-01-01

    An apparatus and method to automate the handling of multiple digital tape cassettes for processing by commercially available cassette tape readers and recorders. A removable magazine rack stores a plurality of tape cassettes, and cooperates with a shuttle device that automatically inserts and removes cassettes from the magazine to the reader and vice-versa. Photocells are used to identify and index to the desired tape cassette. The apparatus allows digital information stored on multiple cassettes to be processed without significant operator intervention.

  2. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to the audience. It combines narrative and digital media content infused with multimedia elements. In order for the educators (i.e., the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest a variety of processes to guide them, some of which are redundant. The main aim of this study is to propose a single guide process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that use the concept of DST.

  3. Advanced Digital Forensic and Steganalysis Methods

    DTIC Science & Technology

    2009-02-01

    investigation is simultaneously cropped, scaled, and processed, extending the technology when the digital image is printed, developing technology capable ...or other common processing operations). TECHNOLOGY APPLICATIONS: 1. Determining the origin of digital images. 2. Matching an image to a camera...

  4. A defocus-information-free autostereoscopic three-dimensional (3D) digital reconstruction method using direct extraction of disparity information (DEDI)

    NASA Astrophysics Data System (ADS)

    Li, Da; Cheung, Chifai; Zhao, Xing; Ren, Mingjun; Zhang, Juan; Zhou, Liqiu

    2016-10-01

    Autostereoscopy-based three-dimensional (3D) digital reconstruction has been widely applied in medical science, entertainment, design, industrial manufacture, precision measurement and many other areas. The 3D digital model of the target can be reconstructed based on the series of two-dimensional (2D) information acquired by the autostereoscopic system, which consists of multiple lenses and can provide information about the target from multiple angles. This paper presents a generalized and precise autostereoscopic three-dimensional (3D) digital reconstruction method based on Direct Extraction of Disparity Information (DEDI), which can be applied to any autostereoscopic system and provides accurate 3D reconstruction results through an error elimination process based on statistical analysis. The feasibility of the DEDI method has been successfully verified through a series of optical 3D digital reconstruction experiments on different autostereoscopic systems; it is highly efficient in performing direct full 3D digital model construction based on a tomography-like operation upon every depth plane, with the exclusion of defocused information. With the absolutely focused information processed by the DEDI method, the 3D digital model of the target can be directly and precisely formed along the axial direction with the depth information.

  5. Securing Digital Audio using Complex Quadratic Map

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria Gunawan, Tjandra; Satria, Yudi

    2018-03-01

    In this digital era, exchanging data is common and easy to do; it is therefore vulnerable to attack and manipulation by unauthorized parties. One data type that is vulnerable to attack is digital audio. So we need a data-securing method that is fast and not vulnerable. One of the methods that matches all of those criteria is securing the data using a chaos function. The chaos function used in this research is the complex quadratic map (CQM). There are parameter values for which the key stream generated by the CQM function passes all 15 NIST tests, which means that the key stream generated using this CQM is shown to be random. In addition, samples of encrypted digital sound, when tested using a goodness-of-fit test, are shown to be uniform, so digital audio secured using this method is not vulnerable to frequency-analysis attacks. The key space is very large, about 8.1×10^31 possible keys, and the key sensitivity is very small, about 10^-10, so this method is also not vulnerable to brute-force attack. Finally, the processing speed for both encryption and decryption is on average about 450 times faster than the digital audio duration.
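
    A toy sketch of the keystream-plus-XOR idea described above: the map is iterated as z <- z^2 + c, with an added fold-back (modulo-1) step and illustrative parameters to keep the orbit bounded. It is not the paper's exact construction, key schedule, or parameter set.

    ```python
    import numpy as np

    def cqm_keystream(n_bytes, c=complex(-0.74543, 0.11301), z0=complex(0.1, 0.1)):
        """Illustrative keystream from the complex quadratic map z <- z**2 + c.
        The fold-back step and the parameter values are assumptions for this sketch."""
        z = z0
        out = np.empty(n_bytes, dtype=np.uint8)
        for i in range(n_bytes):
            z = z * z + c
            re, im = z.real % 1.0, z.imag % 1.0     # fold back into [0, 1) to avoid divergence
            z = complex(re, im)
            out[i] = int(re * 256) ^ int(im * 256)
        return out

    def xor_cipher(audio_bytes, key_bytes):
        """Encryption and decryption are the same XOR operation."""
        return bytes(a ^ b for a, b in zip(audio_bytes, key_bytes))

    samples = bytes(range(16))                      # stand-in for raw PCM audio bytes
    ks = cqm_keystream(len(samples))
    cipher = xor_cipher(samples, ks)
    assert xor_cipher(cipher, ks) == samples        # round trip recovers the audio
    ```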

  6. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
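
    One of the outputs mentioned above, the Allan deviation of phase, can be computed from phase samples with the standard second-difference estimator. The sketch below shows this at the base sampling interval only (longer averaging times would require decimating the phase data), with synthetic white-frequency-noise data standing in for real measurements.

    ```python
    import numpy as np

    def allan_deviation(phase, tau0):
        """Non-overlapping Allan deviation at averaging time tau0 from phase
        samples x[k] (seconds), using the second-difference form:
        AVAR(tau0) = <(x[k+2] - 2*x[k+1] + x[k])**2> / (2*tau0**2)."""
        x = np.asarray(phase, dtype=float)
        d2 = x[2:] - 2 * x[1:-1] + x[:-2]
        return np.sqrt(np.mean(d2**2) / (2 * tau0**2))

    # Illustrative check: white frequency noise at fractional level 1e-12
    tau0 = 1.0
    y = 1e-12 * np.random.default_rng(2).standard_normal(100_000)
    x = np.cumsum(y) * tau0                      # phase is the integral of frequency
    print(allan_deviation(x, tau0))              # close to 1e-12 for white FM at tau0
    ```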

  7. Rethinking the architectural design concept in the digital culture (in architecture's practice perspective)

    NASA Astrophysics Data System (ADS)

    Prawata, Albertus Galih

    2017-11-01

    The architectural design stages in architectural practices or in the architectural design studio consist of many aspects. One of them is the early phase of the design process, where the architects or designers try to interpret the project brief into the design concept. This paper reports on the use of digital tools in the early design process in an architectural practice in Jakarta. It targets principally the use of BIM and digital modeling to generate information and transform it into conceptual forms, which is not very common in Indonesian architectural practices. Traditionally, the project brief is transformed into conceptual forms by using sketches, drawings, and physical models. The new method using digital tools shows that it is possible to do the same thing during the initial stage of the design process to create early architectural design forms. Architects' traditional tools and methods are beginning to be replaced effectively by digital tools, which could drive bigger opportunities for innovation.

  8. Evaluation Digital Elevation Model Generated by Synthetic Aperture Radar Data

    NASA Astrophysics Data System (ADS)

    Makineci, H. B.; Karabörk, H.

    2016-06-01

    A digital elevation model (DEM), showing the physical and topographical situation of the earth, is defined as a three-dimensional digital model obtained from the elevation of the surface by using an appropriately selected interpolation method. DEMs are used in many areas such as management of natural resources, engineering and infrastructure projects, disaster and risk analysis, archaeology, security, aviation, forestry, energy, topographic mapping, landslide and flood analysis, and Geographic Information Systems (GIS). Digital elevation models, which are fundamental components of cartography, are calculated by many methods. In general, they can be obtained by terrestrial methods or from data obtained by digitizing maps on a digital platform. Today, digital elevation model data are generated by processing stereo optical satellite images, radar images (radargrammetry, interferometry) and lidar data using remote sensing and photogrammetric techniques, with the help of improving technology. Radar technology, one of the fundamental components of remote sensing, is very advanced nowadays, and in response to this progress it has begun to be used more frequently in various fields; determining the shape of the topography and creating digital elevation models are among the primary topics in these areas. This work aims to evaluate the quality differences between a DEM generated from a Sentinel-1A SAR image, provided by the European Space Agency (ESA) and acquired in the Interferometric Wide Swath imaging mode in the C band, and DTED-2 (Digital Terrain Elevation Data), and their application. The evaluation uses the RMS statistical method to assess the precision of the data. Results show that the variance of the points decreases strongly from mountainous areas to plains.

  9. The AAPM/RSNA physics tutorial for residents: digital fluoroscopy.

    PubMed

    Pooley, R A; McKinney, J M; Miller, D A

    2001-01-01

    A digital fluoroscopy system is most commonly configured as a conventional fluoroscopy system (tube, table, image intensifier, video system) in which the analog video signal is converted to and stored as digital data. Other methods of acquiring the digital data (eg, digital or charge-coupled device video and flat-panel detectors) will become more prevalent in the future. Fundamental concepts related to digital imaging in general include binary numbers, pixels, and gray levels. Digital image data allow the convenient use of several image processing techniques including last image hold, gray-scale processing, temporal frame averaging, and edge enhancement. Real-time subtraction of digital fluoroscopic images after injection of contrast material has led to widespread use of digital subtraction angiography (DSA). Additional image processing techniques used with DSA include road mapping, image fade, mask pixel shift, frame summation, and vessel size measurement. Peripheral angiography performed with an automatic moving table allows imaging of the peripheral vasculature with a single contrast material injection.
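
    For illustration, two of the processing steps named above (temporal frame averaging and subtraction for DSA, plus a simple edge enhancement) can be sketched as array operations; the frame data, sizes, and filter settings below are hypothetical, and the logarithmic step is shown as one common way of performing the subtraction, not as this article's specific recipe.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    # Hypothetical frame stacks; real data would come from the digital video chain.
    rng = np.random.default_rng(1)
    mask_frames = 1000 + 20 * rng.standard_normal((8, 256, 256))        # pre-contrast
    contrast_frames = mask_frames * 0.8 + 20 * rng.standard_normal((8, 256, 256))

    # Temporal frame averaging reduces quantum noise roughly by sqrt(N)
    mask = mask_frames.mean(axis=0)
    live = contrast_frames.mean(axis=0)

    # Subtraction after a logarithmic transform, so the result tracks the
    # injected contrast rather than the underlying anatomy
    dsa = np.log(np.clip(live, 1, None)) - np.log(np.clip(mask, 1, None))

    # Simple unsharp-mask edge enhancement (illustrative kernel size and gain)
    enhanced = dsa + 1.0 * (dsa - uniform_filter(dsa, size=9))
    print(enhanced.shape)
    ```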

  10. Demonstration of three gorges archaeological relics based on 3D-visualization technology

    NASA Astrophysics Data System (ADS)

    Xu, Wenli

    2015-12-01

    This paper mainly focuses on the digital demonstration of Three Gorges archaeological relics to exhibit the achievements of the protective measures. A novel and effective method based on 3D-visualization technology, which includes large-scale landscape reconstruction, a virtual studio, virtual panoramic roaming, etc., is proposed to create a digitized interactive demonstration system. The method comprises three stages: pre-processing, 3D modeling and integration. Firstly, abundant archaeological information is classified according to its historical and geographical information. Secondly, a 3D model library is built up with digital image processing and 3D modeling technology. Thirdly, virtual reality technology is used to display the archaeological scenes and cultural relics vividly and realistically. The present work promotes the application of virtual reality to digital projects and enriches the content of digital archaeology.

  11. Automatic registration of fused lidar/digital imagery (texel images) for three-dimensional image creation

    NASA Astrophysics Data System (ADS)

    Budge, Scott E.; Badamikar, Neeraj S.; Xie, Xuan

    2015-03-01

    Several photogrammetry-based methods have been proposed that derive three-dimensional (3-D) information from digital images from different perspectives, and lidar-based methods have been proposed that merge lidar point clouds and texture the merged point clouds with digital imagery. Image registration alone has difficulty with smooth regions with low contrast, whereas point cloud merging alone has difficulty with outliers and a lack of proper convergence in the merging process. This paper presents a method to create 3-D images that uses the unique properties of texel images (pixel-fused lidar and digital imagery) to improve the quality and robustness of fused 3-D images. The proposed method uses both image processing and point-cloud merging to combine texel images in an iterative technique. Since the digital image pixels and the lidar 3-D points are fused at the sensor level, more accurate 3-D images are generated because registration of image data automatically improves the merging of the point clouds, and vice versa. Examples illustrate the value of this method over other methods. The proposed method also includes modifications for the situation where an estimate of position and attitude of the sensor is known, when obtained from low-cost global positioning system and inertial measurement unit sensors.

  12. Unified Digital Image Display And Processing System

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Maguire, Gerald Q.; Noz, Marilyn E.; Schimpf, James H.

    1981-11-01

    Our institution, like many others, is faced with a proliferation of medical imaging techniques. Many of these methods give rise to digital images (e.g., digital radiography, computerized tomography (CT), nuclear medicine and ultrasound). We feel that a unified, digital system approach to image management (storage, transmission and retrieval), image processing and image display will help in integrating these new modalities into the present diagnostic radiology operations. Future techniques are likely to employ digital images, so such a system could readily be expanded to include other image sources. We presently have the core of such a system. We can both view and process digital nuclear medicine (conventional gamma camera) images, positron emission tomography (PET) and CT images on a single system. Images from our recently installed digital radiographic unit can be added. Our paper describes our present system, explains the rationale for its configuration, and describes the directions in which it will expand.

  13. Optimization of digitization procedures in cultural heritage preservation

    NASA Astrophysics Data System (ADS)

    Martínez, Bea; Mitjà, Carles; Escofet, Jaume

    2013-11-01

    The digitization of both volumetric and flat objects is nowadays the preferred method for preserving cultural heritage items. High-quality digital files obtained from photographic plates, films and prints, paintings, drawings, gravures, fabrics and sculptures allow not only a wider diffusion and on-line transmission, but also the preservation of the original items from future handling. Early digitization procedures used scanners for flat opaque or translucent objects and cameras only for volumetric or flat highly texturized materials. The technical obsolescence of high-end scanners and the improvement achieved by professional cameras have resulted in a wide use of cameras with digital backs to digitize any kind of cultural heritage item. Since the lens, the digital back, the software controlling the camera and the digital image processing provide a wide range of possibilities, it is necessary to standardize the methods used in the reproduction work so as to preserve the original item properties as faithfully as possible. This work presents an overview of methods used for camera system characterization, as well as the best procedures to identify and counteract the effects of residual lens aberrations, sensor aliasing, image illumination, color management and image optimization by means of parametric image processing. As a corollary, the work shows some examples of a reproduction workflow applied to the digitization of valuable art pieces and black-and-white glass plate photographic negatives.

  14. Digital signal processing at Bell Labs-Foundations for speech and acoustics research

    NASA Astrophysics Data System (ADS)

    Rabiner, Lawrence R.

    2004-05-01

    Digital signal processing (DSP) is a fundamental tool for much of the research that has been carried out at Bell Labs in the areas of speech and acoustics research. The fundamental bases for DSP include the sampling theorem of Nyquist, the method for digitization of analog signals by Shannon et al., methods of spectral analysis by Tukey, the cepstrum by Bogert et al., and the FFT by Tukey (and Cooley of IBM). Essentially all of these early foundations of DSP came out of the Bell Labs Research Lab in the 1930s, 1940s, 1950s, and 1960s. This fundamental research was motivated by fundamental applications (mainly in the areas of speech, sonar, and acoustics) that led to novel design methods for digital filters (Kaiser, Golden, Rabiner, Schafer), spectrum analysis methods (Rabiner, Schafer, Allen, Crochiere), fast convolution methods based on the FFT (Helms, Bergland), and advanced digital systems used to implement telephony channel banks (Jackson, McDonald, Freeny, Tewksbury). This talk summarizes the key contributions to DSP made at Bell Labs, and illustrates how DSP was utilized in the areas of speech and acoustics research. It also shows the vast, worldwide impact of this DSP research on modern consumer electronics.

  15. The influence of digital filter type, amplitude normalisation method, and co-contraction algorithm on clinically relevant surface electromyography data during clinical movement assessments.

    PubMed

    Devaprakash, Daniel; Weir, Gillian J; Dunne, James J; Alderson, Jacqueline A; Donnelly, Cyril J

    2016-12-01

    There is a large and growing body of surface electromyography (sEMG) research using laboratory-specific signal processing procedures (i.e., digital filter type and amplitude normalisation protocols) and data analysis methods (i.e., co-contraction algorithms) to acquire practically meaningful information from these data. As a result, the ability to compare sEMG results between studies is, and continues to be, challenging. The aim of this study was to determine if digital filter type, amplitude normalisation method, and co-contraction algorithm could influence the practical or clinical interpretation of processed sEMG data. Sixteen elite female athletes were recruited. During data collection, sEMG data were recorded from nine lower limb muscles while completing a series of calibration and clinical movement assessment trials (running and sidestepping). Three analyses were conducted: (1) signal processing with two different digital filter types (Butterworth or critically damped), (2) three amplitude normalisation methods, and (3) three co-contraction ratio algorithms. Results showed the choice of digital filter did not influence the clinical interpretation of sEMG; however, choice of amplitude normalisation method and co-contraction algorithm did influence the clinical interpretation of the running and sidestepping task. Care is recommended when choosing amplitude normalisation method and co-contraction algorithms if researchers/clinicians are interested in comparing sEMG data between studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
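
    To make the processing choices above concrete, the sketch below runs a typical sEMG linear-envelope pipeline (zero-lag Butterworth band-pass, rectification, low-pass) and one simple co-contraction ratio. The filter settings, normalisation to a calibration maximum, and the ratio definition are common choices assumed for illustration, not the study's exact protocols, and the signals are synthetic.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def linear_envelope(emg, fs, band=(20, 450), lowpass=6, order=4):
        """Common sEMG linear-envelope pipeline: band-pass, rectify, low-pass.
        Cut-offs and order here are typical values, not the study's settings."""
        b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
        band_passed = filtfilt(b, a, emg)            # zero-lag Butterworth
        rectified = np.abs(band_passed)
        b, a = butter(order, lowpass / (fs / 2), btype="low")
        return filtfilt(b, a, rectified)

    def normalise_and_ccr(agonist_env, antagonist_env, agonist_max, antagonist_max):
        """Amplitude-normalise each envelope to its calibration maximum and return
        one simple co-contraction ratio (antagonist / agonist)."""
        ago = agonist_env / agonist_max
        ant = antagonist_env / antagonist_max
        return np.mean(ant) / np.mean(ago)

    fs = 2000                                        # Hz, assumed sampling rate
    t = np.arange(0, 1, 1 / fs)
    rng = np.random.default_rng(3)
    quad = rng.standard_normal(t.size) * (1 + np.sin(2 * np.pi * t))   # synthetic agonist sEMG
    ham = rng.standard_normal(t.size) * 0.5                            # synthetic antagonist sEMG
    ccr = normalise_and_ccr(linear_envelope(quad, fs), linear_envelope(ham, fs), 1.0, 1.0)
    print(f"co-contraction ratio ~ {ccr:.2f}")       # calibration maxima assumed to be 1.0
    ```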

  16. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging on telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archiving and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 different methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations of the filmless images found them to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images as well as providing the means to produce high-quality radiographic images in a digital environment.

  17. Geology

    NASA Technical Reports Server (NTRS)

    Stewart, R. K.; Sabins, F. F., Jr.; Rowan, L. C.; Short, N. M.

    1975-01-01

    Papers from private industry reporting applications of remote sensing to oil and gas exploration were presented. Digitally processed LANDSAT images were successfully employed in several geologic interpretations. A growing interest in digital image processing among the geologic user community was shown. The papers covered a wide geographic range and a wide technical and application range. Topics included: (1) oil and gas exploration, by use of radar and multisensor studies as well as by use of LANDSAT imagery or LANDSAT digital data, (2) mineral exploration, by mapping from LANDSAT and Skylab imagery and by LANDSAT digital processing, (3) geothermal energy studies with Skylab imagery, (4) environmental and engineering geology, by use of radar or LANDSAT and Skylab imagery, (5) regional mapping and interpretation, and digital and spectral methods.

  18. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  19. Measurement methods to build up the digital optical twin

    NASA Astrophysics Data System (ADS)

    Prochnau, Marcel; Holzbrink, Michael; Wang, Wenxin; Holters, Martin; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    The realization of the Digital Optical Twin (DOT), which is in short the digital representation of the physical state of an optical system, is particularly useful in the context of an automated assembly process for optical systems. During the assembly process, the physical state of the optical system is continuously measured and compared with the digital model. In case of deviations between the physical state and the digital model, the latter is adapted to match the physical state. To reach this goal, measurement and characterization technologies must first be identified and evaluated with respect to their suitability for generating a precise digital twin of an existing optical system. This paper gives an overview of possible characterization methods and, finally, shows first results of the evaluated and compared methods (e.g. spot radius, MTF, Zernike polynomials) used to create a DOT. The focus initially lies on the unequivocalness of the optimization results as well as on the computational time required for the optimization to reach the characterized system state. Possible sources of error are the measurement accuracy (to characterize the system), the execution time of the measurement, the time needed to map the digital to the physical world (optimization step), as well as the interface possibilities for integrating the measurement tool into an assembly cell. Moreover, it is discussed whether the measurement methods used are suitable for a `seamless' integration into an assembly cell.

  20. Disposable world-to-chip interface for digital microfluidics

    DOEpatents

    Van Dam, R. Michael; Shah, Gaurav; Keng, Pei-Yuin

    2017-05-16

    The present disclosure sets forth microfluidic chip interfaces for use with digital microfluidic processes. Methods and devices according to the present disclosure utilize compact, integrated platforms that interface with a chip upstream and downstream of the reaction, as well as between intermediate reaction steps if needed. In some embodiments these interfaces are automated, including automation of a multiple-reagent process. Various reagent delivery systems and methods are also disclosed.

  1. On detection of median filtering in digital images

    NASA Astrophysics Data System (ADS)

    Kirchner, Matthias; Fridrich, Jessica

    2010-01-01

    In digital image forensics, it is generally accepted that intentional manipulations of the image content are most critical, and hence numerous forensic methods focus on the detection of such 'malicious' post-processing. However, it is also beneficial to know as much as possible about the general processing history of an image, including content-preserving operations, since they can affect the reliability of forensic methods in various ways. In this paper, we present a simple yet effective technique to detect median filtering in digital images, a widely used denoising and smoothing operator. As a great variety of forensic methods relies on some kind of linearity assumption, the detection of non-linear median filtering is of particular interest. The effectiveness of our method is backed with experimental evidence on a large image database.
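
    A minimal illustrative sketch (not the authors' detector) of one widely known symptom of median filtering, the 'streaking' effect: after median filtering, many neighbouring pixels become identical, so the fraction of zero-valued first-order differences rises. The threshold used below is a placeholder, not a published value.

```python
import numpy as np

def zero_diff_ratio(image: np.ndarray) -> float:
    """Fraction of horizontal first-order pixel differences that are exactly zero."""
    diffs = np.diff(image.astype(np.int32), axis=1)
    return float(np.mean(diffs == 0))

def looks_median_filtered(image: np.ndarray, threshold: float = 0.35) -> bool:
    # Placeholder threshold: an unprocessed photograph typically shows a much
    # lower zero-difference ratio than a median-filtered copy of the same scene.
    return zero_diff_ratio(image) > threshold
```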

  2. Digital redesign of anti-wind-up controller for cascaded analog system.

    PubMed

    Chen, Y S; Tsai, J S H; Shieh, L S; Moussighi, M M

    2003-01-01

    The cascaded conventional anti-wind-up (CAW) design method for an integral controller is discussed. Then, the prediction-based digital redesign methodology is utilized to find a new pulse amplitude modulated (PAM) digital controller for effective digital control of an analog plant with an input saturation constraint. The desired digital controller is determined from an existing or pre-designed CAW analog controller. The proposed method provides a novel methodology for the indirect digital design of a continuous-time unity output-feedback system with a cascaded analog controller, as in the case of PID controllers for industrial control processes in the presence of actuator saturation. It enables us to implement an existing or pre-designed cascaded CAW analog controller effectively via a digital controller.
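
    The prediction-based redesign itself is not reproduced here; as a hedged illustration of the underlying anti-wind-up idea, the following sketch shows a conventional discrete-time PI controller with back-calculation anti-wind-up under actuator saturation. All gains, limits and the sampling period are illustrative values.

```python
class AntiWindupPI:
    """Discrete-time PI controller with back-calculation anti-wind-up."""

    def __init__(self, kp=1.0, ki=0.5, kaw=1.0, ts=0.01, u_min=-1.0, u_max=1.0):
        self.kp, self.ki, self.kaw, self.ts = kp, ki, kaw, ts
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def step(self, error: float) -> float:
        u_unsat = self.kp * error + self.ki * self.integral
        u = min(max(u_unsat, self.u_min), self.u_max)   # actuator saturation
        # back-calculation: bleed the integrator when the actuator saturates
        self.integral += self.ts * (error + self.kaw * (u - u_unsat))
        return u
```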

  3. Art for the Ages.

    ERIC Educational Resources Information Center

    Casazza, Ornella; Franchi, Paolo

    1985-01-01

    Description of encoding of art works and digitization of paintings to preserve and restore them reviews experiments which used chromatic selection and abstraction as a painting restoration method. This method utilizes the numeric processing resulting from digitization to restore a painting and computer simulation to shorten the restoration…

  4. Method and apparatus for combinatorial logic signal processor in a digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, William K.; Zhou, Zhiquing

    1999-01-01

    A high speed, digitally based, signal processing system which accepts a digitized input signal and detects the presence of step-like pulses in this data stream, extracts filtered estimates of their amplitudes, inspects for pulse pileup, and records input pulse rates and system livetime. The system has two parallel processing channels: a slow channel, which filters the data stream with a long time constant trapezoidal filter for good energy resolution; and a fast channel which filters the data stream with a short time constant trapezoidal filter, detects pulses, inspects for pileups, and captures peak values from the slow channel for good events. The presence of a simple digital interface allows the system to be easily integrated with a digital processor to produce accurate spectra at high count rates and allow all spectrometer functions to be fully automated. Because the method is digitally based, it allows pulses to be binned based on time related values, as well as on their amplitudes, if desired.
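
    A minimal sketch (not the patented processor) of the trapezoidal shaping idea used by the slow and fast channels: the difference of two length-k moving sums separated by a gap of m samples turns a step-like pulse into a trapezoid whose flat top can be sampled for an amplitude estimate. The peaking times shown are hypothetical.

```python
import numpy as np

def trapezoidal_filter(x: np.ndarray, k: int, m: int) -> np.ndarray:
    """Difference of two length-k boxcar sums separated by an m-sample gap."""
    kernel = np.concatenate([np.ones(k), np.zeros(m), -np.ones(k)])
    return np.convolve(x, kernel, mode="same")

# Hypothetical usage: a long filter for energy estimation (slow channel) and a
# short one for pulse detection and pile-up inspection (fast channel).
# slow = trapezoidal_filter(stream, k=200, m=20)
# fast = trapezoidal_filter(stream, k=10, m=2)
```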

  5. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.

  6. Digital Correlation In Laser-Speckle Velocimetry

    NASA Technical Reports Server (NTRS)

    Gilbert, John A.; Mathys, Donald R.

    1992-01-01

    Periodic recording helps to eliminate spurious results. Improved digital-correlation process extracts velocity field of two-dimensional flow from laser-speckle images of seed particles distributed sparsely in flow. Method, which involves digital correlation of images recorded at unequal intervals, is completely automated and has potential to be fastest yet.

  7. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
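
    A minimal sketch of the verification idea described in the record: hash the same portion of a database at two moments in time and compare the digests to detect modification. The record layout and hash choice below are assumptions for illustration only.

```python
import hashlib

def portion_hash(rows) -> str:
    """Hash an iterable of serialized records in a stable (sorted) order."""
    digest = hashlib.sha256()
    for row in sorted(rows):
        digest.update(repr(row).encode("utf-8"))
    return digest.hexdigest()

first = portion_hash([("id1", "alpha"), ("id2", "beta")])   # initial moment
# ... the monitored portion of the dynamic database may change here ...
second = portion_hash([("id1", "alpha"), ("id2", "beta")])  # later moment
print("unchanged" if first == second else "modified")
```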

  8. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not for the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared.

  9. Method and Apparatus for Processing UDP Data Packets

    NASA Technical Reports Server (NTRS)

    Murphy, Brandon M. (Inventor)

    2017-01-01

    A method and apparatus for processing a plurality of data packets. A data packet is received. A determination is made as to whether a portion of the data packet follows a selected digital recorder standard protocol based on a header of the data packet. Raw data in the data packet is converted into human-readable information in response to a determination that the portion of the data packet follows the selected digital recorder standard protocol.

  10. An interactive method for digitizing zone maps

    NASA Technical Reports Server (NTRS)

    Giddings, L. E.; Thompson, E. J.

    1975-01-01

    A method is presented for digitizing maps that consist of zones, such as contour or climatic zone maps. A color-coded map is prepared by any convenient process. The map is then read into memory of an Image 100 computer by means of its table scanner, using colored filters. Zones are separated and stored in themes, using standard classification procedures. Thematic data are written on magnetic tape and these data, appropriately coded, are combined to make a digitized image on tape. Step-by-step procedures are given for digitization of crop moisture index maps with this procedure. In addition, a complete example of the digitization of a climatic zone map is given.

  11. Method and apparatus for digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, W.K.; Hubbard, B.

    1997-11-04

    A high speed, digitally based, signal processing system which accepts input data from a detector-preamplifier and produces a spectral analysis of the x-rays illuminating the detector. The system achieves high throughputs at low cost by dividing the required digital processing steps between a "hardwired" processor implemented in combinatorial digital logic, which detects the presence of the x-ray signals in the digitized data stream and extracts filtered estimates of their amplitudes, and a programmable digital signal processing computer, which refines the filtered amplitude estimates and bins them to produce the desired spectral analysis. One set of algorithms allow this hybrid system to match the resolution of analog systems while operating at much higher data rates. A second set of algorithms implemented in the processor allow the system to be self calibrating as well. The same processor also handles the interface to an external control computer. 19 figs.

  12. Method and apparatus for digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, William K.; Hubbard, Bradley

    1997-01-01

    A high speed, digitally based, signal processing system which accepts input data from a detector-preamplifier and produces a spectral analysis of the x-rays illuminating the detector. The system achieves high throughputs at low cost by dividing the required digital processing steps between a "hardwired" processor implemented in combinatorial digital logic, which detects the presence of the x-ray signals in the digitized data stream and extracts filtered estimates of their amplitudes, and a programmable digital signal processing computer, which refines the filtered amplitude estimates and bins them to produce the desired spectral analysis. One set of algorithms allow this hybrid system to match the resolution of analog systems while operating at much higher data rates. A second set of algorithms implemented in the processor allow the system to be self calibrating as well. The same processor also handles the interface to an external control computer.

  13. Digital photography provides a fast, reliable, and noninvasive method to estimate anthocyanin pigment concentration in reproductive and vegetative plant tissues.

    PubMed

    Del Valle, José C; Gallardo-López, Antonio; Buide, Mª Luisa; Whittall, Justen B; Narbona, Eduardo

    2018-03-01

    Anthocyanin pigments have become a model trait for evolutionary ecology as they often provide adaptive benefits for plants. Anthocyanins have been traditionally quantified biochemically or more recently using spectral reflectance. However, both methods require destructive sampling and can be labor intensive and challenging with small samples. Recent advances in digital photography and image processing make it the method of choice for measuring color in the wild. Here, we use digital images as a quick, noninvasive method to estimate relative anthocyanin concentrations in species exhibiting color variation. Using a consumer-level digital camera and a free image processing toolbox, we extracted RGB values from digital images to generate color indices. We tested petals, stems, pedicels, and calyces of six species, which contain different types of anthocyanin pigments and exhibit different pigmentation patterns. Color indices were assessed by their correlation to biochemically determined anthocyanin concentrations. For comparison, we also calculated color indices from spectral reflectance and tested the correlation with anthocyanin concentration. Indices perform differently depending on the nature of the color variation. For both digital images and spectral reflectance, the most accurate estimates of anthocyanin concentration emerge from anthocyanin content-chroma ratio, anthocyanin content-chroma basic, and strength of green indices. Color indices derived from both digital images and spectral reflectance strongly correlate with biochemically determined anthocyanin concentration; however, the estimates from digital images performed better than spectral reflectance in terms of r² and normalized root-mean-square error. This was particularly noticeable in a species with striped petals, but in the case of striped calyces, both methods showed a comparable relationship with anthocyanin concentration. Using digital images brings new opportunities to accurately quantify the anthocyanin concentrations in both floral and vegetative tissues. This method is efficient, completely noninvasive, applicable to both uniform and patterned color, and works with samples of any size.
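
    The exact index formulas are not given in the record, so the following 'strength of green'-style index (mean green share of the RGB sum over a tissue patch) is an illustrative assumption rather than the published definition; in practice such an index would be calibrated against biochemically measured anthocyanin concentrations.

```python
import numpy as np

def strength_of_green(rgb_patch: np.ndarray) -> float:
    """Mean G / (R + G + B) over an RGB image patch; lower values generally
    indicate redder, more strongly anthocyanin-pigmented tissue."""
    rgb = rgb_patch.astype(np.float64) + 1e-9   # avoid division by zero
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(g / (r + g + b)))
```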

  14. Interactive Digital Textbooks and Engagement: A Learning Strategies Framework

    ERIC Educational Resources Information Center

    Bikowski, Dawn; Casal, J. Elliott

    2018-01-01

    This mixed-methods study explored non-native English speaking students' learning processes and engagement as they used a customized interactive digital textbook housed on a mobile device. Think aloud protocols, surveys of anticipated and actual engagement with the digital textbook, reflective journals, and member checking constituted data…

  15. Handwritten digits recognition based on immune network

    NASA Astrophysics Data System (ADS)

    Li, Yangyang; Wu, Yunhui; Jiao, Lc; Wu, Jianshe

    2011-11-01

    With the development of society, handwritten digit recognition techniques have been widely applied in production and daily life, yet the task remains a difficult problem in the field of pattern recognition. In this paper, a new method is presented for handwritten digit recognition. The digit samples are first preprocessed and their features extracted. Based on these features, a novel immune network classification algorithm is designed and implemented for handwritten digit recognition. The proposed algorithm is developed from Jerne's immune network model for feature selection and the KNN method for classification. Its characteristic is a novel network with parallel commutating and learning. The performance of the proposed method is evaluated on the handwritten digit dataset MNIST and compared with other recognition algorithms (KNN, ANN and SVM). The results show that the novel classification algorithm based on an immune network gives promising performance and stable behavior for handwritten digit recognition.
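
    The immune network classifier itself is not reproduced here; as a hedged illustration of the KNN baseline mentioned above, this sketch trains a k-nearest-neighbour classifier on scikit-learn's small built-in digits dataset (used instead of MNIST to keep the example self-contained).

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)                 # 8x8 handwritten digits
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print("KNN test accuracy:", knn.score(X_test, y_test))
```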

  16. Using digital photo technology to improve visualization of gastric lumen CT images

    NASA Astrophysics Data System (ADS)

    Pyrgioti, M.; Kyriakidis, A.; Chrysostomou, S.; Panaritis, V.

    2006-12-01

    In order to better evaluate gastric lumen CT images, a new method is applied to the images using image processing software. During a 12-month period, 69 patients with various gastric symptoms and 20 volunteers who were normal as far as the upper gastrointestinal system is concerned underwent computed tomography of the upper gastrointestinal system. Just before the examination, the patients and the normal volunteers underwent preparation with 40 ml soda water and 10 ml gastrografin. All the CT images were digitized with an Olympus 3.2 Mpixel digital camera and further processed with image processing software. The administration per os of gastrografin and soda water resulted in distension of the stomach and consequently better visualization of all the anatomic parts. By using image processing software on a PC, all the pathological and normal images of the stomach could be evaluated better diagnostically. We believe that digital photo technology improves the diagnostic capacity not only of CT images but also of MRI and probably many other imaging methods.

  17. Discrete-time modelling of musical instruments

    NASA Astrophysics Data System (ADS)

    Välimäki, Vesa; Pakarinen, Jyri; Erkut, Cumhur; Karjalainen, Matti

    2006-01-01

    This article describes physical modelling techniques that can be used for simulating musical instruments. The methods are closely related to digital signal processing. They discretize the system with respect to time, because the aim is to run the simulation using a computer. The physics-based modelling methods can be classified as mass-spring, modal, wave digital, finite difference, digital waveguide and source-filter models. We present the basic theory and a discussion on possible extensions for each modelling technique. For some methods, a simple model example is chosen from the existing literature demonstrating a typical use of the method. For instance, in the case of the digital waveguide modelling technique a vibrating string model is discussed, and in the case of the wave digital filter technique we present a classical piano hammer model. We tackle some nonlinear and time-varying models and include new results on the digital waveguide modelling of a nonlinear string. Current trends and future directions in physical modelling of musical instruments are discussed.

  18. Some thoughts on cartographic and geographic information systems for the 1980's

    USGS Publications Warehouse

    Starr, L.E.; Anderson, Kirk E.

    1981-01-01

    The U.S. Geological Survey is adopting computer techniques to meet the expanding need for cartographic base category data. Digital methods are becoming increasingly important in the mapmaking process, and the demand is growing for physical, social, and economic data. Recognizing these emerging needs, the National Mapping Division began, several years ago, an active program to develop advanced digital methods to support cartographic and geographic data processing. An integrated digital cartographic database would meet the anticipated needs. Such a database would contain data from various sources, and could provide a variety of standard and customized map and digital data file products. This cartographic database soon will be technologically feasible. The present trends in the economics of cartographic and geographic data handling and the growing needs for integrated physical, social, and economic data make such a database virtually mandatory.

  19. Interference elimination in digital controllers of automation systems of oil and gas complex

    NASA Astrophysics Data System (ADS)

    Solomentsev, K. Yu; Fugarov, D. D.; Purchina, O. A.; Poluyan, A. Y.; Nesterchuk, V. V.; Petrenkova, S. B.

    2018-05-01

    This article considers problems arising in the development of digital controllers for automatic control systems. In the presence of interference, and also at high digitization frequencies, digital differentiation gives a large error, because the derivative is calculated as the difference of two close values. To reduce this error, a differentiation method is proposed in which the difference quotient is averaged over a series of values. A structure chart for the implementation of this differentiation method in controller construction is given.

  20. Method and apparatus for combinatorial logic signal processor in a digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, W.K.

    1999-02-16

    A high speed, digitally based, signal processing system is disclosed which accepts a digitized input signal and detects the presence of step-like pulses in this data stream, extracts filtered estimates of their amplitudes, inspects for pulse pileup, and records input pulse rates and system livetime. The system has two parallel processing channels: a slow channel, which filters the data stream with a long time constant trapezoidal filter for good energy resolution; and a fast channel which filters the data stream with a short time constant trapezoidal filter, detects pulses, inspects for pileups, and captures peak values from the slow channel for good events. The presence of a simple digital interface allows the system to be easily integrated with a digital processor to produce accurate spectra at high count rates and allow all spectrometer functions to be fully automated. Because the method is digitally based, it allows pulses to be binned based on time related values, as well as on their amplitudes, if desired. 31 figs.

  1. Realization of guitar audio effects using methods of digital signal processing

    NASA Astrophysics Data System (ADS)

    Buś, Szymon; Jedrzejewski, Konrad

    2015-09-01

    The paper is devoted to studies of the possibilities for realizing guitar audio effects by means of digital signal processing methods. As a result of this research, selected audio effects suited to the specifics of guitar sound were realized as a real-time system called the Digital Guitar Multi-effect. Before implementation in the system, the selected effects were investigated using a dedicated application with a graphical user interface created in the Matlab environment. In the second stage, the real-time system, based on a microcontroller and an audio codec, was designed and realized. The system is designed to apply audio effects to the output signal of an electric guitar.
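
    A minimal sketch of one effect of the kind discussed above: a memoryless soft-clipping 'overdrive' applied sample by sample. The tanh transfer curve and gain are illustrative choices, not the authors' design; in a real-time system such a function would run on fixed-size buffers delivered by the audio codec.

```python
import numpy as np

def overdrive(samples: np.ndarray, gain: float = 8.0) -> np.ndarray:
    """Soft-clipping nonlinearity for an audio signal normalised to [-1, 1]."""
    return np.tanh(gain * samples) / np.tanh(gain)

# e.g. processed_block = overdrive(input_block) for each codec buffer
```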

  2. [Digital x-ray image processing as an aid in forensic medicine].

    PubMed

    Buitrago-Tellez, C; Wenz, W; Friedrich, G

    1992-02-01

    Radiology plays an important role in the identification of unknown corpses. Positive radiographic identification by comparison with antemortem films is an established technique in this setting. Technical defects together with poorly preserved films sometimes make it difficult or even impossible to establish a confident comparison. Digital image processing after secondary digitization of ante- and postmortem films represents an important development and aid in forensic medicine. The application of this method is demonstrated in a single case.

  3. The research of laser marking control technology

    NASA Astrophysics Data System (ADS)

    Zhang, Qiue; Zhang, Rong

    2009-08-01

    In laser marking, the usual control method is to insert a control card into the computer's motherboard; this does not support hot swapping and makes assembly and maintenance difficult. Moreover, each marking system must be equipped with its own computer, and during marking the computer cannot perform other tasks besides transmitting the marking data, since doing so would affect marking precision. To address the problems of traditional control methods, a design is introduced in which the computer performs the marking-graphic editing and data processing while a high-speed digital signal processor (DSP) controls the whole marking process. The laser marking controller mainly contains a DSP2812, digital memory, a DAC (digital-to-analog converter) unit circuit, a USB interface control circuit, a man-machine interface circuit, and other logic control circuits. The marking information processed by the computer is downloaded to a USB flash drive; the DSP reads the information through the USB interface, processes it, uses its internal timer to control the marking time sequence, and outputs the scanner control signals through the D/A unit. Applying this technology enables offline marking, thereby reducing product cost and increasing production efficiency. The system performs well in actual marking units, with a marking speed about 20 percent faster than that of a PCI control card, and has practical application value.

  4. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades the energy resolution and positioning accuracy. Event pileups normally exist in a high count-rate system, and the baseline drift will create errors in the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog methods to digital filter solutions. However, a single-channel analog BLR can only work below a 500 kcps count rate, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. This method can self-track the baseline without a micro-controller involved. The circuit consists of two digital counter/timers, one comparator, one register and one subtraction unit. Simulations show that a single channel works at a 30 Mcps count rate under pileup conditions. A total of 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
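
    A simplified software sketch of the statistics-based idea described above (not the authors' FPGA circuit): samples that do not belong to a gamma pulse are averaged online to track the baseline, which is then subtracted from the digitized waveform. The pulse-exclusion threshold and averaging window are assumed parameters.

```python
import numpy as np

def restore_baseline(samples: np.ndarray, pulse_threshold: float = 50.0,
                     window: int = 1024) -> np.ndarray:
    restored = np.empty(samples.shape, dtype=np.float64)
    baseline = float(samples[0])
    quiet = []                                    # recent non-pulse samples
    for i, s in enumerate(samples):
        if abs(s - baseline) < pulse_threshold:   # exclude pulses from statistics
            quiet.append(float(s))
            if len(quiet) > window:
                quiet.pop(0)
            baseline = sum(quiet) / len(quiet)
        restored[i] = s - baseline
    return restored
```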

  5. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.
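
    A small hedged example of the preprocessing class of methods surveyed above: Savitzky-Golay smoothing and a first-derivative spectrum computed with SciPy on a synthetic absorption band. Window length and polynomial order are illustrative choices.

```python
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.linspace(400, 700, 301)                  # nm, synthetic grid
spectrum = np.exp(-((wavelengths - 550.0) / 30.0) ** 2)   # synthetic band
spectrum += 0.01 * np.random.default_rng(0).normal(size=wavelengths.size)

smoothed = savgol_filter(spectrum, window_length=11, polyorder=3)
first_derivative = savgol_filter(spectrum, 11, 3, deriv=1,
                                 delta=wavelengths[1] - wavelengths[0])
```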

  6. Digital methods of recording color television images on film tape

    NASA Astrophysics Data System (ADS)

    Krivitskaya, R. Y.; Semenov, V. M.

    1985-04-01

    Three methods are now available for recording color television images on film tape, directly or after appropriate signal processing. Conventional recording of images from the screens of three kinescopes with synthetic crystal face plates is still the most effective for high fidelity. This method was improved by digital preprocessing of the brightness and color-difference signals. Frame-by-frame storage of these signals in memory in digital form is followed by gamma and aperture correction and electronic correction of crossover distortions in the color layers of the film, with fixing in accordance with specific emulsion procedures. The newer method of recording color television images with line arrays of light-emitting diodes involves dichroic superposing mirrors and a movable scanning mirror. This method allows the use of standard movie cameras, simplifies interlacing-to-linewise conversion and the mechanical equipment, and lengthens exposure time while it shortens recording time. The latest image transform method requires an audio-video recorder, a memory disk, a digital computer, and a decoder. The 9-step procedure includes preprocessing the total color television signal with reduction of noise level and time errors, followed by frame frequency conversion and setting the number of lines. The total signal is then resolved into its brightness and color-difference components, and phase errors and image blurring are also reduced. After extraction of the R, G, B signals and colorimetric matching of the TV camera and film tape, the simultaneous R, G, B signals are converted from interlacing to sequential triads of color-quotient frames with linewise scanning at triple frequency. Color-quotient signals are recorded with an electron beam on a smoothly moving black-and-white film tape under vacuum. While digital techniques improve the signal quality and simplify the control of processes, not requiring stabilization of circuits, image processing is still analog.

  7. Digital enhancement of X-rays for NDT

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.

    1980-01-01

    Report is "cookbook" for digital processing of industrial X-rays. Computer techniques, previously used primarily in laboratory and developmental research, have been outlined and codified into step by step procedures for enhancing X-ray images. Those involved in nondestructive testing should find report valuable asset, particularly is visual inspection is method currently used to process X-ray images.

  8. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to developing cooperation between world-class experts, young scientists, students and post-graduate students, and to exchanging information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University), Saint Petersburg (Russia); Chernyshevsky Saratov State University, Saratov (Russia); and National Research Nuclear University "MEPhI" (NRNU MEPhI), Moscow (Russia). The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  9. On-line surveillance of a dynamic process by a moving system based on pulsed digital holographic interferometry.

    PubMed

    Pedrini, Giancarlo; Alexeenko, Igor; Osten, Wolfgang; Schnars, Ulf

    2006-02-10

    A method based on pulsed digital holographic interferometry for the measurement of dynamic deformations of a surface by using a moving system is presented. The measuring system may move with a speed of several meters per minute and can measure deformation of the surface with an accuracy of better than 50 nm. The deformation is obtained by comparison of the wavefronts recorded at different times with different laser pulses produced by a Nd:YAG laser. The effect due to the movement of the measuring system is compensated for by digital processing of the different holograms. The system is well suited for on-line surveillance of a dynamic process such as laser welding and friction stir welding. Experimental results are presented, and the advantages of the method are discussed.

  10. Dynamic deformation image de-blurring and image processing for digital imaging correlation measurement

    NASA Astrophysics Data System (ADS)

    Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.

    2017-11-01

    This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach, which is used to estimate the Point Spread Function (PSF) during the camera exposure window. The deconvolution process, which involves iterative matrix calculations over pixels, is then performed on the GPU to decrease the time cost. Compared to the Gauss method and the Lucy-Richardson method, it gives the best image restoration results. The proposed method has been evaluated using a Hopkinson bar loading system. In comparison to the blurry image, the proposed method successfully restores the image. Image processing applications also demonstrate that the de-blurring method can improve the accuracy and stability of digital imaging correlation measurement.
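
    The paper's dynamics-based PSF estimation is not reproduced here; as a hedged sketch of the baseline it is compared against, the following code blurs a test image with an assumed linear-motion PSF and deblurs it with the classical Richardson-Lucy deconvolution from scikit-image.

```python
import numpy as np
from scipy.signal import convolve2d
from skimage import color, data, restoration

def motion_psf(length: int = 15) -> np.ndarray:
    """Horizontal linear-motion blur kernel of the given length."""
    psf = np.zeros((length, length))
    psf[length // 2, :] = 1.0
    return psf / psf.sum()

image = color.rgb2gray(data.astronaut())      # stand-in test image
psf = motion_psf(15)
blurred = convolve2d(image, psf, mode="same", boundary="symm")
deblurred = restoration.richardson_lucy(blurred, psf, 30)
```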

  11. Image Segmentation Using Minimum Spanning Tree

    NASA Astrophysics Data System (ADS)

    Dewi, M. P.; Armiati, A.; Alvini, S.

    2018-04-01

    This research aims to segment digital images. The purpose of segmentation is to separate the object from the background so that the main object can be processed for other purposes. Along with the development of technology in digital image processing applications, the segmentation process becomes increasingly necessary. The segmented image, which is the result of the segmentation process, should be accurate, because the next processing step needs to interpret the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method is able to separate an object from the background and converts the image into a binary image. In this case, the object of interest is set to white, while the background is black, or vice versa.
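
    A compact sketch of the approach described above, not the authors' exact algorithm: build a 4-connected pixel graph weighted by intensity differences, compute its minimum spanning tree with SciPy, cut the heaviest tree edges and binarize by connected component. The cut threshold is an assumed parameter.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree

def mst_binary_segmentation(gray: np.ndarray, cut: float = 0.1) -> np.ndarray:
    gray = gray.astype(np.float64)
    h, w = gray.shape
    idx = np.arange(h * w).reshape(h, w)
    rows, cols, weights = [], [], []
    # horizontal and vertical neighbour edges of the pixel grid
    for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
        rows.append(a.ravel())
        cols.append(b.ravel())
        weights.append(np.abs(gray.ravel()[a.ravel()] - gray.ravel()[b.ravel()]))
    graph = coo_matrix((np.concatenate(weights) + 1e-6,
                        (np.concatenate(rows), np.concatenate(cols))),
                       shape=(h * w, h * w))
    mst = minimum_spanning_tree(graph).tocoo()
    keep = mst.data < cut                         # cut heavy MST edges
    forest = coo_matrix((mst.data[keep], (mst.row[keep], mst.col[keep])),
                        shape=(h * w, h * w))
    _, labels = connected_components(forest, directed=False)
    # object = all pixels outside the component containing the top-left pixel
    return np.where(labels.reshape(h, w) != labels[0], 255, 0).astype(np.uint8)
```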

  12. Digital pulse shape discrimination.

    PubMed

    Miller, L F; Preston, J; Pozzi, S; Flaska, M; Neal, J

    2007-01-01

    Pulse-shape discrimination (PSD) has been utilised for about 40 years as a method to obtain estimates of dose in mixed neutron and photon fields. Digitizers that operate close to GHz rates are currently available at a reasonable cost, and they can be used to directly sample signals from photomultiplier tubes. This permits one to perform digital PSD rather than the traditional, and well-established, analogue techniques. One issue that complicates PSD for neutrons in mixed fields is that the light output characteristics of typical scintillators available for PSD, such as BC501A, vary as a function of the energy deposited in the detector. This behaviour is more easily accommodated with digital processing of signals than with analogue signal processing. Results illustrate the effectiveness of digital PSD.
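
    A minimal sketch of digital charge-comparison PSD on a sampled scintillator pulse: the ratio of the delayed ('tail') integral to the total integral is larger for neutron events than for photon events in liquid scintillators such as BC501A. The gate positions and decision threshold below are assumed values, not calibrated ones.

```python
import numpy as np

def tail_to_total(pulse: np.ndarray, peak: int, tail_start: int = 20,
                  gate_end: int = 200) -> float:
    total = float(np.sum(pulse[peak:peak + gate_end]))
    tail = float(np.sum(pulse[peak + tail_start:peak + gate_end]))
    return tail / total if total > 0 else 0.0

def classify(pulse: np.ndarray, threshold: float = 0.18) -> str:
    peak = int(np.argmax(pulse))
    return "neutron" if tail_to_total(pulse, peak) > threshold else "photon"
```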

  13. The Commercial Challenges Of Pacs

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.

    1984-08-01

    The increasing use of digital imaging techniques creates a need for improved methods of digital processing, communication and archiving. However, the commercial opportunity depends on the resolution of a number of issues. These issues include proof that digital processes are more cost effective than present techniques, implementation of information system support in the imaging activity, implementation of industry standards, conversion of analog images to digital formats, definition of clinical needs, the implications of the purchase decision, and technology requirements. In spite of these obstacles, a market is emerging, served by new and existing companies, that may reach $500 million (U.S.) by 1990 for equipment and supplies.

  14. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNNs, we propose to use a digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute wavelet down-sampling filtering and half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core and realized on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, which prove the proposed digital CNN processor to be a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people.

  15. The Instructional Instrument SL-EDGE Student Library-Educational DiGital Environment.

    ERIC Educational Resources Information Center

    Kyriakopoulou, Antonia; Kalamboukis, Theodore

    An educational digital environment that will provide appropriate methods and techniques for the support and enhancement of the educational and learning process is a valuable tool for both educators and learners. In the context of such a mission, the educational tool SL-EDGE (Student Library-Educational DiGital Environment) has been developed. The…

  16. A study for watermark methods appropriate to medical images.

    PubMed

    Cho, Y; Ahn, B; Kim, J S; Kim, I Y; Kim, S I

    2001-06-01

    The network system, including the picture archiving and communication system (PACS), is essential in hospital and medical imaging fields these days. Many medical images are accessed and processed on the web, as well as in PACS. Therefore, any possible accidents caused by the illegal modification of medical images must be prevented. Digital image watermark techniques have been proposed as a method to protect against illegal copying or modification of copyrighted material. Invisible signatures made by a digital image watermarking technique can be a solution to these problems. However, medical images have some different characteristics from normal digital images in that one must not corrupt the information contained in the original medical images. In this study, we suggest modified watermark methods appropriate for medical image processing and communication system that prevent clinically important data contained in original images from being corrupted.

  17. Signal digitizing system and method based on amplitude-to-time optical mapping

    DOEpatents

    Chou, Jason; Bennett, Corey V; Hernandez, Vince

    2015-01-13

    A signal digitizing system and method based on analog-to-time optical mapping optically maps amplitude information of an analog signal of interest first into wavelength information, using an amplitude tunable filter (ATF) to impress spectral changes induced by the amplitude of the analog signal onto a carrier signal, i.e. a train of optical pulses, and next from wavelength information to temporal information, using a dispersive element, so that temporal information representing the amplitude information is encoded in the time domain in the carrier signal. Optical-to-electrical conversion of the optical pulses into voltage waveforms and subsequent digitization of the voltage waveforms into a digital image enables the temporal information to be resolved and quantized in the time domain. The digital image may then be digitally processed to reconstruct the analog signal from the temporal information with high fidelity.

  18. A frequency standard via spectrum analysis and direct digital synthesis

    NASA Astrophysics Data System (ADS)

    Li, Dawei; Shi, Daiting; Hu, Ermeng; Wang, Yigen; Tian, Lu; Zhao, Jianye; Wang, Zhong

    2014-11-01

    We demonstrated a frequency standard based on a detuned coherent population beating phenomenon. In this phenomenon, the beat frequency between the radio frequency used for laser modulation and the hyperfine splitting can be obtained by digital signal processing technology. After analyzing the spectrum of the beat frequency, the fluctuation information is obtained and applied to compensate for the frequency shift, generating the standard frequency by the digital synthesis method. A frequency instability of 2.6 × 10⁻¹² at 1000 s is observed in our preliminary experiment. By eliminating the phase-locking loop, the method will enable us to achieve a fully digital frequency standard with remarkable stability.
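
    A hedged sketch of the spectrum-analysis step described above: estimate the beat frequency of a digitized signal from the peak of its windowed FFT magnitude; the resulting fluctuation could then be fed to a direct digital synthesizer as a correction. The sample rate is an assumed parameter.

```python
import numpy as np

def beat_frequency(samples: np.ndarray, sample_rate: float) -> float:
    """Return the dominant (non-DC) frequency of a real-valued sample block."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(samples.size)))
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum[1:]) + 1])   # skip the DC bin
```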

  19. The digital language of amino acids.

    PubMed

    Kurić, L

    2007-11-01

    The subject of this paper is a digital approach to the investigation of the biochemical basis of genetic processes. The digital mechanism of nucleic acid and protein biosyntheses, the evolution of biomacromolecules and, especially, the biochemical evolution of genetic language have been analyzed by the application of cybernetic methods, information theory and system theory, respectively. This paper reports the discovery of new methods for developing new technologies in genetics. It concerns an advanced digital technology based on programs, cybernetics, and informational systems and laws. The results of the practical application of the new technology could be useful in bioinformatics, genetics, biochemistry, medicine and other natural sciences.

  20. A Cyber Security Self-Assessment Method for Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glantz, Clifford S.; Coles, Garill A.; Bass, Robert B.

    2004-11-01

    A cyber security self-assessment method (the Method) has been developed by Pacific Northwest National Laboratory. The development of the Method was sponsored and directed by the U.S. Nuclear Regulatory Commission. Members of the Nuclear Energy Institute Cyber Security Task Force also played a substantial role in developing the Method. The Method's structured approach guides nuclear power plants in scrutinizing their digital systems, assessing the potential consequences to the plant of a cyber exploitation, identifying vulnerabilities, estimating cyber security risks, and adopting cost-effective protective measures. The focus of the Method is on critical digital assets. A critical digital asset is a digital device or system that plays a role in the operation, maintenance, or proper functioning of a critical system (i.e., a plant system that can impact safety, security, or emergency preparedness). A critical digital asset may have a direct or indirect connection to a critical system. Direct connections include both wired and wireless communication pathways. Indirect connections include sneaker-net pathways by which software or data are manually transferred from one digital device to another. An indirect connection also may involve the use of instructions or data stored on a critical digital asset to make adjustments to a critical system. The cyber security self-assessment begins with the formation of an assessment team, and is followed by a six-stage process.

  1. Method and system for conserving power in a telecommunications network during emergency situations

    DOEpatents

    Conrad, Stephen H [Algodones, NM; O'Reilly, Gerard P [Manalapan, NJ

    2011-10-11

    Disclosed is a method and apparatus for conserving power in a telecommunications network during emergency situations. A permissible number list of emergency and/or priority numbers is stored in the telecommunications network. In the event of an emergency or power failure, input digits of a call to the telecommunications network are compared to the permissible number list. The call is processed in the telecommunications network and routed to its destination if the input digits match an entry in the permissible number list. The call is dropped without any further processing if the input digits do not match an entry in the permissible number list. Thus, power can be conserved in emergency situations by only allowing emergency and/or priority calls.
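
    A minimal sketch of the call-screening step described in the patent record: compare the dialled digits against a stored permissible number list, route matching emergency/priority calls, and drop everything else to conserve power. The numbers shown are made-up examples.

```python
PERMISSIBLE_NUMBERS = {"911", "5551000", "5551001"}   # example entries only

def route_call(dialled_digits: str) -> str:
    if dialled_digits in PERMISSIBLE_NUMBERS:
        return "route to destination"          # emergency or priority call
    return "drop without further processing"   # conserve switching power
```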

  2. Ultrasonic imaging system for in-process fabric defect detection

    DOEpatents

    Sheen, Shuh-Haw; Chien, Hual-Te; Lawrence, William P.; Raptis, Apostolos C.

    1997-01-01

    An ultrasonic method and system are provided for monitoring a fabric to identify a defect. A plurality of ultrasonic transmitters generate ultrasonic waves relative to the fabric. An ultrasonic receiver means responsive to the generated ultrasonic waves from the transmitters receives ultrasonic waves coupled through the fabric and generates a signal. An integrated peak value of the generated signal is applied to a digital signal processor and is digitized. The digitized signal is processed to identify a defect in the fabric. The digitized signal processing includes a median value filtering step to filter out high frequency noise. Then a mean value and standard deviation of the median value filtered signal is calculated. The calculated mean value and standard deviation are compared with predetermined threshold values to identify a defect in the fabric.
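
    A simplified sketch of the signal chain described above: median-filter the digitized integrated peak values to suppress high-frequency noise, then flag samples whose deviation from the mean exceeds an assumed multiple of the standard deviation as possible fabric defects.

```python
import numpy as np
from scipy.signal import medfilt

def detect_defects(peak_values: np.ndarray, k_sigma: float = 3.0) -> np.ndarray:
    filtered = medfilt(peak_values, kernel_size=5)   # remove high-frequency noise
    mean, std = filtered.mean(), filtered.std()
    return np.abs(filtered - mean) > k_sigma * std   # boolean defect mask
```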

  3. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  4. Using high-resolution digital aerial imagery to map land cover

    USGS Publications Warehouse

    Dieck, J.J.; Robinson, Larry

    2014-01-01

    The Upper Midwest Environmental Sciences Center (UMESC) has used aerial photography to map land cover/land use on federally owned and managed lands for over 20 years. Until recently, that process used 23- by 23-centimeter (9- by 9-inch) analog aerial photos to classify vegetation along the Upper Mississippi River System, on National Wildlife Refuges, and in National Parks. With digital aerial cameras becoming more common and offering distinct advantages over analog film, UMESC transitioned to an entirely digital mapping process in 2009. Though not without challenges, this method has proven to be much more accurate and efficient when compared to the analog process.

  5. Radar data processing and analysis

    NASA Technical Reports Server (NTRS)

    Ausherman, D.; Larson, R.; Liskow, C.

    1976-01-01

    Digitized four-channel radar images corresponding to particular areas from the Phoenix and Huntington test sites were generated in conjunction with prior experiments performed to collect X- and L-band synthetic aperture radar imagery of these two areas. The methods for generating this imagery are documented. A secondary objective was the investigation of digital processing techniques for extraction of information from the multiband radar image data. Following the digitization, the remaining resources permitted a preliminary machine analysis to be performed on portions of the radar image data. The results, although necessarily limited, are reported.

  6. Separation of overlapping dental arch objects using digital records of illuminated plaster casts.

    PubMed

    Yadollahi, Mohammadreza; Procházka, Aleš; Kašparová, Magdaléna; Vyšata, Oldřich; Mařík, Vladimír

    2015-07-11

    Plaster casts of individual patients are important for orthodontic specialists during the treatment process and their analysis is still a standard diagnostical tool. But the growing capabilities of information technology enable their replacement by digital models obtained by complex scanning systems. This paper presents the possibility of using a digital camera as a simple instrument to obtain the set of digital images for analysis and evaluation of the treatment using appropriate mathematical tools of image processing. The methods studied in this paper include the segmentation of overlapping dental bodies and the use of different illumination sources to increase the reliability of the separation process. The circular Hough transform, region growing with multiple seed points, and the convex hull detection method are applied to the segmentation of orthodontic plaster cast images to identify dental arch objects and their sizes. The proposed algorithm presents the methodology of improving the accuracy of segmentation of dental arch components using combined illumination sources. Dental arch parameters and distances between the canines and premolars for different segmentation methods were used as a measure to compare the results obtained. A new method of segmentation of overlapping dental arch components using digital records of illuminated plaster casts provides information with the precision required for orthodontic treatment. The distance between corresponding teeth was evaluated with a mean error of 1.38% and the Dice similarity coefficient of the evaluated dental bodies boundaries reached 0.9436 with a false positive rate [Formula: see text] and false negative rate [Formula: see text].
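
    An illustrative sketch of one step mentioned above, locating roughly circular tooth crowns in a grey-scale cast image with the circular Hough transform from scikit-image; the radius range and peak count are assumed parameters, and the full pipeline (multi-seed region growing, convex hulls, combined illumination) is not reproduced.

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

def find_tooth_circles(gray: np.ndarray, radii=np.arange(15, 40, 2)):
    edges = canny(gray, sigma=2.0)                 # edge map of the cast image
    accumulator = hough_circle(edges, radii)
    _, cx, cy, r = hough_circle_peaks(accumulator, radii, total_num_peaks=16)
    return list(zip(cx, cy, r))                    # candidate (x, y, radius) tuples
```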

  7. [Diagnosis and treatment of complicated anterior teeth esthetic defects by combination of whole-process digital esthetic rehabilitation with periodontic surgery].

    PubMed

    Li, Z; Liu, Y S; Ye, H Q; Liu, Y S; Hu, W J; Zhou, Y S

    2017-02-18

    To explore a new method of whole-process digital esthetic prosthodontic rehabilitation combined with periodontic surgery for complicated anterior teeth esthetic defects accompanied by unfavorable soft tissue morphology, and to provide an alternative choice for solving this problem under the guidance of a three-dimensional (3D) printed digital dental model and surgical guide, thus completing periodontic surgery and digital esthetic rehabilitation of anterior teeth. In this study, 12 patients with complicated esthetic problems accompanied by unfavorable soft tissue morphology in their anterior teeth were included. The dentition and facial images were obtained by intra-oral scanning and three-dimensional (3D) facial scanning and then calibrated. Two esthetic designs and prosthodontic outcome predictions were created by computer aided design/computer aided manufacturing (CAD/CAM) software combined with digital photography: one considering white esthetics only and one comprehensively considering pink-white esthetics. The predictive design of the prostheses and the facial appearances of the two designs were evaluated by the patients. If a patient chose the comprehensive pink-white esthetic design, the patient then chose whether to receive periodontic surgery before esthetic rehabilitation. The dentition design cast of those who chose periodontic surgery was 3D printed to guide the periodontic surgery accordingly. In light of the two digital designs based on intra-oral scanning, facial scanning and digital photography, the satisfaction rate of the patients was significantly higher for the comprehensive pink-white esthetic design (P<0.05), and more patients tended to choose periodontic surgery before esthetic rehabilitation. The 3D printed digital dental model and surgical guide provided significant instructions for periodontic surgery, and achieved a successful transfer from digital design to clinical application. The prostheses were fabricated by CAD/CAM, thus realizing whole-process digital esthetic rehabilitation. The new method for esthetic rehabilitation of complicated anterior teeth esthetic defects accompanied by unfavorable soft tissue morphology, including patient-involved digital esthetic analysis, design, esthetic outcome prediction, 3D printed surgical guides for periodontic surgery and digital fabrication, is a practical technology. This method improves the efficiency of clinical communication between doctors and patients, doctors and technicians, and doctors from different departments, and is conducive to multidisciplinary treatment of this complicated anterior teeth esthetic problem.

  8. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
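    To make the threshold-unit and pooling steps concrete, the sketch below converts blockwise DCT differences between a reference and a processed frame into threshold units and pools them into a single number. The 8×8 block size, the uniform threshold matrix and the Minkowski exponent are assumptions for illustration, not the patent's calibrated model (color transforms, temporal filtering and contrast masking are omitted).

    ```python
    import numpy as np
    from scipy.fft import dctn

    def block_dct(frame, block=8):
        h, w = frame.shape
        h, w = h - h % block, w - w % block                    # crop to a whole number of blocks
        tiles = frame[:h, :w].reshape(h // block, block, w // block, block).swapaxes(1, 2)
        return dctn(tiles, axes=(-2, -1), norm="ortho")        # 2D DCT of every block

    def dvq_score(reference, processed, thresholds, beta=4.0):
        err = (block_dct(reference) - block_dct(processed)) / thresholds   # threshold units
        return (np.abs(err) ** beta).mean() ** (1.0 / beta)                # Minkowski pooling

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))                           # stand-in for one reference frame
    proc = ref + 0.01 * rng.standard_normal(ref.shape)   # mildly distorted "processed" frame
    jnd = np.full((8, 8), 0.02)                          # assumed uniform visual-threshold matrix
    print("perceptual error in threshold units:", round(dvq_score(ref, proc, jnd), 3))
    ```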

  9. Development of Gravity Acceleration Measurement Using Simple Harmonic Motion Pendulum Method Based on Digital Technology and Photogate Sensor

    NASA Astrophysics Data System (ADS)

    Yulkifli; Afandi, Zurian; Yohandri

    2018-04-01

    A measurement of gravitational acceleration using the simple harmonic motion pendulum method, digital technology and a photogate sensor has been developed. Digital technology is more practical and optimizes the time of experimentation. The pendulum method calculates the acceleration of gravity using a solid ball connected to a rope attached to a stative pole. The pendulum is swung at a small angle, resulting in simple harmonic motion. The measurement system consists of a power supply, photogate sensors, an Arduino Pro Mini and a seven-segment display. The Arduino Pro Mini receives digital data from the photogate sensor and processes it into the timing data of the pendulum oscillation. The calculated pendulum oscillation time is displayed on the seven-segment display. Based on the measured data, the accuracy and precision of the experiment system are 98.76% and 99.81%, respectively. Based on the experimental data, the system can be operated in physics experiments, especially in the determination of gravitational acceleration.
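    The physics behind such a system is the standard small-angle pendulum relation g = 4π²L/T². The snippet below applies it to made-up photogate timings; the length and periods are illustrative values, not the authors' measurements.

    ```python
    import math

    L = 0.50                            # pendulum length in metres (assumed)
    periods = [1.418, 1.421, 1.419]     # oscillation periods timed by the photogate, in seconds (assumed)

    T = sum(periods) / len(periods)     # mean period
    g = 4 * math.pi ** 2 * L / T ** 2   # small-angle pendulum relation
    print(f"mean period T = {T:.3f} s  ->  g = {g:.2f} m/s^2")
    ```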

  10. Watermarking and copyright labeling of printed images

    NASA Astrophysics Data System (ADS)

    Hel-Or, Hagit Z.

    2001-07-01

    Digital watermarking is a labeling technique for digital images which embeds a code into the digital data so that the data are marked. Watermarking techniques previously developed deal with on-line digital data. These techniques have been designed to withstand digital attacks such as image processing, image compression and geometric transformations. However, one must also consider the readily available attack of printing and scanning. The available watermarking techniques are not reliable under printing and scanning. In fact, one must consider the availability of watermarks for printed images as well as for digital images. An important issue is to intercept and prevent forgery in printed material such as currency notes, bank checks, etc., and to track and validate sensitive and secret printed material. Watermarking in such printed material can be used not only for verification of ownership but as an indicator of the date and type of transaction or the date and source of the printed data. In this work we propose a method of embedding watermarks in printed images by inherently taking advantage of the printing process. The method is visually unobtrusive to the printed image, and the watermark is easily extracted and is robust under reconstruction errors. The decoding algorithm is automatic given the watermarked image.

  11. Linear programming phase unwrapping for dual-wavelength digital holography.

    PubMed

    Wang, Zhaomin; Jiao, Jiannan; Qu, Weijuan; Yang, Fang; Li, Hongru; Tian, Ailing; Asundi, Anand

    2017-01-20

    A linear programming phase unwrapping method in dual-wavelength digital holography is proposed and verified experimentally. The proposed method uses the square of height difference as a convergence standard and theoretically gives the boundary condition in a searching process. A simulation was performed by unwrapping step structures at different levels of Gaussian noise. As a result, our method is capable of recovering the discontinuities accurately. It is robust and straightforward. In the experiment, a microelectromechanical systems sample and a cylindrical lens were measured separately. The testing results were in good agreement with true values. Moreover, the proposed method is applicable not only in digital holography but also in other dual-wavelength interferometric techniques.
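    The key idea that the wrapped difference of two single-wavelength phase maps behaves like a measurement at the much longer synthetic wavelength Λ = λ₁λ₂/|λ₁−λ₂| can be checked numerically. The sketch below does so for an assumed pair of wavelengths and a synthetic height ramp; it is not a reconstruction of the paper's linear programming step.

    ```python
    import numpy as np

    lam1, lam2 = 532e-9, 633e-9                    # assumed wavelengths (m)
    lam_synth = lam1 * lam2 / abs(lam1 - lam2)     # synthetic wavelength, about 3.3 um

    height = np.linspace(0, 1.2e-6, 500)           # a ramp taller than either wavelength

    def wrap(p):
        return np.angle(np.exp(1j * p))            # wrap phase to (-pi, pi]

    phi1 = wrap(4 * np.pi * height / lam1)         # reflection geometry: phase = 4*pi*h/lambda
    phi2 = wrap(4 * np.pi * height / lam2)
    phi_beat = wrap(phi1 - phi2)                   # wrapped difference phase at the synthetic wavelength

    h_recovered = phi_beat * lam_synth / (4 * np.pi)
    h_recovered[h_recovered < 0] += lam_synth / 2  # negative phase maps to the upper half of the range
    print("max error vs true height (nm):", np.max(np.abs(h_recovered - height)) * 1e9)
    ```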

  12. Can digital stories go where palliative care research has never gone before? A descriptive qualitative study exploring the application of an emerging public health research method in an indigenous palliative care context.

    PubMed

    Williams, Lisa; Gott, Merryn; Moeke-Maxwell, Tess; Black, Stella; Kothari, Shuchi; Pearson, Sarina; Morgan, Tessa; Wharemate, Matua Rawiri; Hansen, Whaea Whio

    2017-09-04

    The World Health Organization (WHO) has called for global approaches to palliative care development. Yet it is questionable whether one-size-fits-all solutions can accommodate international disparities in palliative care need. More flexible research methods are called for in order to understand diverse priorities at local levels. This is especially imperative for Indigenous populations and other groups underrepresented in the palliative care evidence-base. Digital storytelling (DST) offers the potential to be one such method. Digital stories are short first-person videos that tell a story of great significance to the creator. The method has already found a place within public health research and has been described as a useful, emergent method for community-based participatory research. The aim of this study was to explore Māori participants' views on DST's usefulness, from an Indigenous perspective, as a research method within the discipline of palliative care. The digital storytelling method was adapted to include Māori cultural protocols. Data capturing participant experience of the study were collected using participant observation and anonymous questionnaires. Eight participants, seven women and one man, took part. Field notes and questionnaire data were analysed using critical thematic analysis. Two main themes were identified during analyses: 1) issues that facilitated digital storytelling's usefulness as a research method for Māori reporting on end of life caregiving; and 2) issues that hindered this process. All of the identified subthemes (recruitment, the pōwhiri process, a Māori formal welcome of visitors, and technology) related to both main themes and are presented in this way. Digital storytelling is an emerging method useful for exploring Indigenous palliative care issues. In line with a Health Promoting Palliative Care approach that centres research in communities, it helps meet the need for diverse approaches to involve underrepresented groups.

  13. Desolvation Induced Origami of Photocurable Polymers by Digit Light Processing.

    PubMed

    Zhao, Zeang; Wu, Jiangtao; Mu, Xiaoming; Chen, Haosen; Qi, H Jerry; Fang, Daining

    2017-07-01

    Self-folding origami is of great interest in current research on functional materials and structures, but it remains a challenge to develop a simple method to create freestanding, reversible, and complex origami structures. This communication provides a feasible solution to this challenge by developing a method based on the digital light processing (DLP) technique and desolvation-induced self-folding. In this new method, flat polymer sheets can be cured by a light field from a commercial projector with varying intensity, and the self-folding process is triggered by desolvation in water. Folded origami structures can be recovered once immersed in the swelling medium. The self-folding process is investigated both experimentally and theoretically. Diverse 3D origami shapes are demonstrated. This method can be used for responsive actuators and the fabrication of 3D electronic devices. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Fundamentals handbook of electrical and computer engineering. Volume 1 Circuits fields and electronics

    NASA Astrophysics Data System (ADS)

    Chang, S. S. L.

    State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.

  15. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
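    A minimal numpy sketch of the Fourier-space operations the patent describes (shift the spectrum so the spatial-heterodyne carrier sits at the origin, low-pass filter around it, inverse transform) is given below for a synthetic hologram. The carrier frequency, filter radius and test object are assumptions, and the two-hologram bookkeeping of the claims is omitted.

    ```python
    import numpy as np

    N = 256
    y, x = np.mgrid[0:N, 0:N]
    fx, fy = 38 / N, 26 / N                     # assumed carrier (cycles/pixel) set by the reference/object
                                                # beam angle; an integer number of fringes per frame
    obj = np.exp(1j * 2 * np.pi * ((x - N / 2) ** 2 + (y - N / 2) ** 2) / 8000)   # toy object wave
    hologram = np.abs(1 + obj * np.exp(2j * np.pi * (fx * x + fy * y))) ** 2      # recorded intensity

    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    # shift the original origin onto the spatial-heterodyne carrier frequency
    spectrum = np.roll(spectrum, (-int(round(fy * N)), -int(round(fx * N))), axis=(0, 1))
    # digital filter cutting off signals away from the (new) origin, then inverse transform
    u = np.fft.fftshift(np.fft.fftfreq(N))
    keep = (u[None, :] ** 2 + u[:, None] ** 2) < 0.05 ** 2
    recovered = np.fft.ifft2(np.fft.ifftshift(spectrum * keep))

    phase_err = np.abs(np.angle(recovered * np.conj(obj)))
    print("median phase error (rad):", round(float(np.median(phase_err[32:-32, 32:-32])), 4))
    ```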

  16. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  17. Integrating digital topology in image-processing libraries.

    PubMed

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information into image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the approach can be adapted with only minor modifications to other image-processing libraries.

  18. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    PubMed Central

    Ho, Jonhan; Aridor, Orly; Parwani, Anil V.

    2012-01-01

    Background: For decades, the anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users' needs and is utilized for collecting, interpreting, and aggregating detailed aspects of work. Objective: Contextual inquiry was utilized to document the current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists' needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. 1998. Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchal organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, namely technology, communication, synthesis/preparation, organization, and workflow. The current AP workflow was labor intensive and lacked scalability. A large number of processes that may improve following the introduction of digital pathology systems were identified. These work processes included case management, case examination and review, and final case reporting. Furthermore, a digital slide system should integrate with the anatomic pathology laboratory information system. Conclusions: To our knowledge, this is the first study that utilized the contextual inquiry method to document AP workflow. Findings were used to establish key requirements for the design of digital pathology systems. PMID:23243553

  19. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs.

    PubMed

    Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong

    2015-12-26

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4 bits after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
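    The 1/√N noise argument can be reproduced with a few lines of simulation. In the sketch below the single-conversion noise level is taken from the abstract, while the pixel level and the sampling counts are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    true_level = 0.5                 # pixel output in volts (assumed)
    sigma_single = 848.3e-6          # rms random noise of a single conversion (from the abstract)

    for n_samples in (1, 4, 9, 16):
        trials = rng.normal(true_level, sigma_single, size=(100_000, n_samples))
        measured = trials.mean(axis=1)          # multiple-sampling result per trial
        print(f"N={n_samples:2d}: rms noise = {measured.std() * 1e6:7.1f} uV "
              f"(1/sqrt(N) prediction = {sigma_single / np.sqrt(n_samples) * 1e6:7.1f} uV)")
    ```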

  20. Digital selective growth of a ZnO nanowire array by large scale laser decomposition of zinc acetate.

    PubMed

    Hong, Sukjoon; Yeo, Junyeob; Manorotkul, Wanit; Kang, Hyun Wook; Lee, Jinhwan; Han, Seungyong; Rho, Yoonsoo; Suh, Young Duk; Sung, Hyung Jin; Ko, Seung Hwan

    2013-05-07

    We develop a digital direct writing method for ZnO NW micro-patterned growth on a large scale by selective laser decomposition of zinc acetate. For ZnO NW growth, by replacing the bulk heating with the scanning focused laser as a fully digital local heat source, zinc acetate crystallites can be selectively activated as a ZnO seed pattern to grow ZnO nanowires locally on a larger area. Together with the selective laser sintering process of metal nanoparticles, more than 10,000 UV sensors have been demonstrated on a 4 cm × 4 cm glass substrate to develop all-solution processible, all-laser mask-less digital fabrication of electronic devices including active layer and metal electrodes without any conventional vacuum deposition, photolithographic process, premade mask, high temperature and vacuum environment.

  1. A simplified close range photogrammetry method for soil erosion assessment

    USDA-ARS?s Scientific Manuscript database

    With the increased affordability of consumer grade cameras and the development of powerful image processing software, digital photogrammetry offers a competitive advantage as a tool for soil erosion estimation compared to other technologies. One bottleneck of digital photogrammetry is its dependency...

  2. A novel method for the photographic recovery of fingermark impressions from ammunition cases using digital imaging.

    PubMed

    Porter, Glenn; Ebeyan, Robert; Crumlish, Charles; Renshaw, Adrian

    2015-03-01

    The photographic preservation of fingermark impression evidence found on ammunition cases remains problematic due to the cylindrical shape of the deposition substrate preventing complete capture of the impression in a single image. A novel method was developed for the photographic recovery of fingermarks from curved surfaces using digital imaging. The process involves the digital construction of a complete impression image made from several different images captured from multiple camera perspectives. Fingermark impressions deposited onto 9-mm and 0.22-caliber brass cartridge cases and a plastic 12-gauge shotgun shell were tested using various image parameters, including digital stitching method, number of images per 360° rotation of shell, image cropping, and overlap. The results suggest that this method may be successfully used to recover fingermark impression evidence from the surfaces of ammunition cases or other similar cylindrical surfaces. © 2014 American Academy of Forensic Sciences.

  3. Digital photography for the light microscope: results with a gated, video-rate CCD camera and NIH-image software.

    PubMed

    Shaw, S L; Salmon, E D; Quatrano, R S

    1995-12-01

    In this report, we describe a relatively inexpensive method for acquiring, storing and processing light microscope images that combines the advantages of video technology with the powerful medium now termed digital photography. Digital photography refers to the recording of images as digital files that are stored, manipulated and displayed using a computer. This report details the use of a gated video-rate charge-coupled device (CCD) camera and a frame grabber board for capturing 256 gray-level digital images from the light microscope. This camera gives high-resolution bright-field, phase contrast and differential interference contrast (DIC) images but, also, with gated on-chip integration, has the capability to record low-light level fluorescent images. The basic components of the digital photography system are described, and examples are presented of fluorescence and bright-field micrographs. Digital processing of images to remove noise, to enhance contrast and to prepare figures for printing is discussed.

  4. The optimal digital filters of sine and cosine transforms for geophysical transient electromagnetic method

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-wei; Zhu, Zi-qiang; Lu, Guang-yin; Han, Bo

    2018-03-01

    The sine and cosine transforms implemented with digital filters have been used in transient electromagnetic methods for a few decades. Kong (2007) proposed a method of obtaining the filter coefficients, which are computed in the sample domain by a Hankel transform pair. However, the curve shape of the Hankel transform pair changes with a parameter, which is usually set to 1 or 3 in the process of obtaining the digital filter coefficients of the sine and cosine transforms. First, this study investigates the influence of this parameter on the digital filter algorithm for sine and cosine transforms, based on the digital filter algorithm for the Hankel transform and the relationship between the sine and cosine functions and the ±1/2 order Bessel functions of the first kind. The results show that the selection of the parameter strongly influences the precision of the digital filter algorithm. Second, given the optimal selection of the parameter, it is found that an optimal sampling interval s also exists that achieves the best precision of the digital filter algorithm. Finally, this study proposes four groups of sine and cosine transform digital filter coefficients with different lengths, which may help to develop the digital filter algorithm of sine and cosine transforms and promote its application.
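    The relationship invoked above is the standard identity J_(1/2)(x) = sqrt(2/(πx))·sin x and J_(−1/2)(x) = sqrt(2/(πx))·cos x, which is what allows a Hankel-transform digital filter to be reused for sine and cosine transforms. The snippet below simply verifies it numerically; the sample points are arbitrary.

    ```python
    import numpy as np
    from scipy.special import jv

    x = np.linspace(0.1, 20.0, 200)
    lhs_sin = jv(0.5, x)                                 # J_{1/2}(x)
    rhs_sin = np.sqrt(2.0 / (np.pi * x)) * np.sin(x)
    lhs_cos = jv(-0.5, x)                                # J_{-1/2}(x)
    rhs_cos = np.sqrt(2.0 / (np.pi * x)) * np.cos(x)

    print("max |J_1/2  - sqrt(2/(pi x)) sin x| :", np.max(np.abs(lhs_sin - rhs_sin)))
    print("max |J_-1/2 - sqrt(2/(pi x)) cos x| :", np.max(np.abs(lhs_cos - rhs_cos)))
    ```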

  5. Processing Electromyographic Signals to Recognize Words

    NASA Technical Reports Server (NTRS)

    Jorgensen, C. C.; Lee, D. D.

    2009-01-01

    A recently invented speech-recognition method applies to words that are articulated by means of the tongue and throat muscles but are otherwise not voiced or, at most, are spoken sotto voce. This method could satisfy a need for speech recognition under circumstances in which normal audible speech is difficult, poses a hazard, is disturbing to listeners, or compromises privacy. The method could also be used to augment traditional speech recognition by providing an additional source of information about articulator activity. The method can be characterized as intermediate between (1) conventional speech recognition through processing of voice sounds and (2) a method, not yet developed, of processing electroencephalographic signals to extract unspoken words directly from thoughts. This method involves computational processing of digitized electromyographic (EMG) signals from muscle innervation acquired by surface electrodes under a subject's chin near the tongue and on the side of the subject's throat near the larynx. After preprocessing, digitization, and feature extraction, EMG signals are processed by a neural-network pattern classifier, implemented in software, that performs the bulk of the recognition task as described.

  6. Unfolding and unfoldability of digital pulses in the z-domain

    NASA Astrophysics Data System (ADS)

    Regadío, Alberto; Sánchez-Prieto, Sebastián

    2018-04-01

    The unfolding (or deconvolution) technique is used in the development of digital pulse processing systems applied to particle detection. This technique is applied to digital signals obtained by digitization of analog signals that represent the combined response of the particle detectors and the associated signal conditioning electronics. This work describes a technique to determine whether a signal is unfoldable. For unfoldable signals, the characteristics of the unfolding system (unfolder) are presented. Finally, examples of the method applied to a real experimental setup are discussed.
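    For the simplest unfoldable case, a detector response consisting of a single exponential decay, the unfolder is just the inverse system in the z-domain. The sketch below folds and unfolds a toy pulse train with scipy; the decay constant and event list are chosen arbitrarily, and the paper's unfoldability test is not reproduced.

    ```python
    import numpy as np
    from scipy.signal import lfilter

    p = np.exp(-1.0 / 20.0)              # pole of the detector response (decay over ~20 samples)
    events = np.zeros(200)
    events[[30, 90, 95, 160]] = [1.0, 0.6, 0.8, 0.3]   # underlying impulses (charge depositions)

    # folding: the digitized pulse train produced by detector + electronics, H(z) = 1 / (1 - p z^-1)
    pulse = lfilter([1.0], [1.0, -p], events)

    # unfolding: apply the inverse system, 1 - p z^-1 (a stable FIR filter)
    unfolded = lfilter([1.0, -p], [1.0], pulse)

    print("max reconstruction error:", np.max(np.abs(unfolded - events)))
    ```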

  7. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers applicable to digitized multispectral aerial photography, are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four band multispectral classification into healthy and declining categories.

  8. Digital Signal Processing Based on a Clustering Algorithm for Ir/Au TES Microcalorimeter

    NASA Astrophysics Data System (ADS)

    Zen, N.; Kunieda, Y.; Takahashi, H.; Hiramoto, K.; Nakazawa, M.; Fukuda, D.; Ukibe, M.; Ohkubo, M.

    2006-02-01

    In recent years, cryogenic microcalorimeters using their superconducting transition edge have been under development for possible application to research on astronomical X-ray observations. To improve the energy resolution of superconducting transition edge sensors (TES), several correction methods have been developed. Among them, a clustering method based on digital signal processing has recently been proposed. In this paper, we applied the clustering method to an Ir/Au bilayer TES. This method resulted in almost a 10% improvement in the energy resolution. In addition, from the point of view of imaging X-ray spectroscopy, we applied the clustering method to pixellated Ir/Au-TES devices. We thus show how a clustering method which sorts signals by their shapes is also useful for position identification.

  9. Two schemes for rapid generation of digital video holograms using PC cluster

    NASA Astrophysics Data System (ADS)

    Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il

    2017-12-01

    Computer-generated holography (CGH), which is a process of generating digital holograms, is computationally expensive. Recently, several methods/systems of parallelizing the process using graphic processing units (GPUs) have been proposed. Indeed, use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) enabled great improvements in the process speed. However, extant literature has less often explored systems involving rapid generation of multiple digital holograms and specialized systems for rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to more efficiently generate a video hologram. The proposed system is designed to simultaneously generate multiple frames and accelerate the generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.

  10. Automated grain extraction and classification by combining improved region growing segmentation and shape descriptors in electromagnetic mill classification system

    NASA Astrophysics Data System (ADS)

    Budzan, Sebastian

    2018-04-01

    In this paper, an automatic method of grain detection and classification is presented. As input, it uses a single digital image obtained from the milling process of copper ore with a high-quality digital camera. The grinding process is extremely energy and cost consuming, so the granularity evaluation should be performed efficiently and quickly. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with a proposed adaptive thresholding based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are improved using information about the shape of the detected grains via a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method has been evaluated using samples of nominal granularity and by comparison with other methods.
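    The sketch below illustrates the first stage only: an adaptive tolerance derived from the relative standard deviation (RSD = standard deviation / mean) of the grey levels, followed by region growing from seed points. skimage's flood fill stands in for the paper's SRG implementation, and the synthetic image, seed points and scaling factor are assumptions.

    ```python
    import numpy as np
    from skimage.segmentation import flood

    rng = np.random.default_rng(6)
    grains = rng.normal(0.2, 0.02, (200, 200))     # dark background (stand-in for the ore image)
    grains[40:90, 50:110] += 0.55                  # two synthetic bright grains
    grains[120:170, 120:160] += 0.60

    rsd = grains.std() / grains.mean()             # relative standard deviation of the grey levels
    tolerance = 0.5 * rsd * grains.mean()          # adaptive growing tolerance (assumed scaling)

    seeds = [(60, 80), (140, 140)]                 # assumed seed points inside the grains
    labels = np.zeros(grains.shape, dtype=int)
    for i, seed in enumerate(seeds, start=1):
        region = flood(grains, seed, tolerance=tolerance)
        labels[region & (labels == 0)] = i         # keep the first label where regions overlap

    print("RSD:", round(rsd, 3), " tolerance:", round(tolerance, 3))
    print("pixels per detected grain:", np.bincount(labels.ravel())[1:])
    ```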

  11. Digital image envelope: method and evaluation

    NASA Astrophysics Data System (ADS)

    Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

    2003-05-01

    Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines (such as the DICOM standard) published by organizing bodies in healthcare are beginning to focus on issues of data security; however, there has not been a systematic method developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
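    The integrity part of the scheme boils down to comparing a signature computed before transmission with one computed after. The sketch below shows that comparison with a SHA-256 hash over placeholder pixel bytes; the DICOM header encryption and the embedding of the envelope in the image are omitted, and the byte string is a stand-in, not real image data.

    ```python
    import hashlib

    def image_signature(pixel_bytes: bytes) -> str:
        # hash of the pixel data, used as the image signature inside the envelope
        return hashlib.sha256(pixel_bytes).hexdigest()

    original_pixels = bytes(range(256))                      # placeholder for the image pixel data
    envelope_signature = image_signature(original_pixels)    # created at the sending site

    received_pixels = original_pixels                        # what arrives after transmission
    if image_signature(received_pixels) == envelope_signature:
        print("signatures match: no alteration of the image detected")
    else:
        print("signatures differ: the image was altered in transit")
    ```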

  12. Advanced Recording and Preprocessing of Physiological Signals. [data processing equipment for flow measurement of blood flow by ultrasonics

    NASA Technical Reports Server (NTRS)

    Bentley, P. B.

    1975-01-01

    The measurement of the volume flow-rate of blood in an artery or vein requires both an estimate of the flow velocity and its spatial distribution and the corresponding cross-sectional area. Transcutaneous measurements of these parameters can be performed using ultrasonic techniques that are analogous to the measurement of moving objects by use of a radar. Modern digital data recording and preprocessing methods were applied to the measurement of blood-flow velocity by means of the CW Doppler ultrasonic technique. Only the average flow velocity was measured and no distribution or size information was obtained. Evaluations of current flowmeter design and performance, ultrasonic transducer fabrication methods, and other related items are given. The main thrust was the development of effective data-handling and processing methods by application of modern digital techniques. The evaluation resulted in useful improvements in both the flowmeter instrumentation and the ultrasonic transducers. Effective digital processing algorithms that provided enhanced blood-flow measurement accuracy and sensitivity were developed. Block diagrams illustrative of the equipment setup are included.

  13. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    NASA Astrophysics Data System (ADS)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  14. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology

    PubMed Central

    Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.

    2017-01-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625

  15. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology.

    PubMed

    Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M

    2017-04-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.

  16. Visual-area coding technique (VACT): optical parallel implementation of fuzzy logic and its visualization with the digital-halftoning process

    NASA Astrophysics Data System (ADS)

    Konishi, Tsuyoshi; Tanida, Jun; Ichioka, Yoshiki

    1995-06-01

    A novel technique, the visual-area coding technique (VACT), for the optical implementation of fuzzy logic with the capability of visualization of the results is presented. This technique is based on the microfont method and is considered to be an instance of digitized analog optical computing. Huge amounts of data can be processed in fuzzy logic with the VACT. In addition, real-time visualization of the processed result can be accomplished.

  17. Increasing signal-to-noise ratio of reconstructed digital holograms by using light spatial noise portrait of camera's photosensor

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Sergey N.

    2015-01-01

    Digital holography is a technique which includes recording of an interference pattern with a digital photosensor, processing of the obtained holographic data and reconstruction of the object wavefront. Increasing the signal-to-noise ratio (SNR) of reconstructed digital holograms is especially important in such fields as image encryption, pattern recognition, and static and dynamic display of 3D scenes. In this paper, compensation of the photosensor light spatial noise portrait (LSNP) is proposed to increase the SNR of reconstructed digital holograms. To verify the proposed method, numerical experiments with computer generated Fresnel holograms with resolution equal to 512×512 elements were performed. Registration of shots with a digital camera Canon EOS 400D was simulated. It is shown that use of the averaging over frames method alone allows the SNR to be increased only up to 4 times, and further increase of the SNR is limited by spatial noise. Application of the LSNP compensation method in conjunction with the averaging over frames method allows for a 10-fold SNR increase. This value was obtained for an LSNP measured with 20% error. If a more accurate LSNP is used, the SNR can be increased up to 20 times.
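    The saturation-and-compensation argument can be mimicked with a toy simulation: frame averaging removes temporal noise but stalls at the fixed-pattern floor, while subtracting an imperfect LSNP estimate keeps improving the result. All noise levels, frame counts and the 20% estimate error below are illustrative assumptions, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    truth = np.ones((256, 256))                                      # ideal noise-free frame
    lsnp = 0.05 * rng.standard_normal(truth.shape)                   # fixed-pattern (spatial) noise
    lsnp_estimate = lsnp + 0.01 * rng.standard_normal(truth.shape)   # LSNP known with ~20% error

    def snr_db(img):
        return 10 * np.log10(np.mean(truth ** 2) / np.mean((img - truth) ** 2))

    def capture():
        return truth + lsnp + 0.05 * rng.standard_normal(truth.shape)   # temporal noise changes per shot

    frames = np.stack([capture() for _ in range(64)])
    averaged = frames.mean(axis=0)

    print(f"single frame            : {snr_db(frames[0]):5.1f} dB")
    print(f"64-frame average        : {snr_db(averaged):5.1f} dB  (stalls at the spatial-noise floor)")
    print(f"average minus LSNP est. : {snr_db(averaged - lsnp_estimate):5.1f} dB")
    ```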

  18. New Trends of Emerging Technologies in Digital Pathology.

    PubMed

    Bueno, Gloria; Fernández-Carrobles, M Milagro; Deniz, Oscar; García-Rojo, Marcial

    2016-01-01

    The future paradigm of pathology will be digital. Instead of conventional microscopy, a pathologist will perform a diagnosis through interacting with images on computer screens and performing quantitative analysis. The fourth generation of virtual slide telepathology systems, so-called virtual microscopy and whole-slide imaging (WSI), has allowed for the storage and fast dissemination of image data in pathology and other biomedical areas. These novel digital imaging modalities encompass high-resolution scanning of tissue slides and derived technologies, including automatic digitization and computational processing of whole microscopic slides. Moreover, automated image analysis with WSI can extract specific diagnostic features of diseases and quantify individual components of these features to support diagnoses and provide informative clinical measures of disease. Therefore, the challenge is to apply information technology and image analysis methods to exploit the new and emerging digital pathology technologies effectively in order to process and model all the data and information contained in WSI. The final objective is to support the complex workflow from specimen receipt to anatomic pathology report transmission, that is, to improve diagnosis both in terms of pathologists' efficiency and with new information. This article reviews the main concerns about and novel methods of digital pathology discussed at the latest workshop in the field carried out within the European project AIDPATH (Academia and Industry Collaboration for Digital Pathology). © 2016 S. Karger AG, Basel.

  19. Full field study of strain distribution near the crack tip in the fracture of solid propellants via large strain digital image correlation and optical microscopy

    NASA Astrophysics Data System (ADS)

    Gonzalez, Javier

    A full field method for visualizing deformation around the crack tip in a fracture process with large strains is developed. A digital image correlation (DIC) program is used to incrementally compute strains and displacements between two consecutive images of a deformation process. Values of strain and displacement for consecutive deformations are added, thereby solving convergence problems in the DIC algorithm when large deformations are investigated. The method developed is used to investigate the strain distribution within 1 mm of the crack tip in a particulate composite solid (propellant) using microscopic visualization of the deformation process.

  20. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
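    A hedged sketch of the two ICHE ingredients named above is given below: each image's intensity histogram is centred on a common point and a contrast-limited adaptive histogram equalization is then applied, with skimage's equalize_adapthist standing in for the authors' modified CLAHE. The target centroid, clip limit and synthetic "slides" are assumptions.

    ```python
    import numpy as np
    from skimage import exposure

    def normalize_slide(img, target_centroid=0.6, clip_limit=0.01):
        img = img.astype(float)
        img = (img - img.min()) / (img.max() - img.min() + 1e-12)    # rescale to [0, 1]
        centroid = img.mean()                                        # histogram centroid (mean intensity)
        img = np.clip(img + (target_centroid - centroid), 0.0, 1.0)  # intensity centering
        return exposure.equalize_adapthist(img, clip_limit=clip_limit)  # CLAHE stand-in

    rng = np.random.default_rng(3)
    batch_a = rng.beta(2, 5, size=(256, 256))   # stand-ins for slides from two staining batches
    batch_b = rng.beta(5, 2, size=(256, 256))
    print("mean intensity before:", batch_a.mean().round(3), batch_b.mean().round(3))
    print("mean intensity after :",
          normalize_slide(batch_a).mean().round(3),
          normalize_slide(batch_b).mean().round(3))
    ```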

  1. A Situated Practice of Ethics for Participatory Visual and Digital Methods in Public Health Research and Practice: A Focus on Digital Storytelling

    PubMed Central

    Hill, Amy L.; Flicker, Sarah

    2014-01-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as “digital storytelling.” We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

  2. A situated practice of ethics for participatory visual and digital methods in public health research and practice: a focus on digital storytelling.

    PubMed

    Gubrium, Aline C; Hill, Amy L; Flicker, Sarah

    2014-09-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as "digital storytelling." We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health.

  3. Regeneration and repair of human digits and limbs: fact and fiction

    PubMed Central

    Cheng, Tsun‐Chih

    2015-01-01

    Abstract A variety of digit and limb repair and reconstruction methods have been used in different clinical settings, but regeneration remains an item on every plastic surgeon's “wish list.” Although surgical salvage techniques are continually being improved, unreplantable digits and limbs are still abundant. We comprehensively review the structural and functional salvage methods in clinical practice, from the peeling injuries of small distal fingertips to multisegmented amputated limbs, and the developmental and tissue engineering approaches for regenerating human digits and limbs in the laboratory. Although surgical techniques have forged ahead, there are still situations in which digits and limbs are unreplantable. Advances in the field are delineated, and the regeneration processes of salamander limbs, lizard tails, and mouse digits and each component of tissue engineering approaches for digit‐ and limb‐building are discussed. Although the current technology is promising, there are many challenges in human digit and limb regeneration. We hope this review inspires research on the critical gap between clinical and basic science, and leads to more sophisticated digit and limb loss rescue and regeneration innovations. PMID:27499873

  4. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Brad M.; Nathan, Diane L.; Wang Yan

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., 'FOR PROCESSING') and vendor postprocessed (i.e., 'FOR PRESENTATION'), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air, followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, an SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (r = 0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's kappa ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at the time of imaging as well as in retrospective studies.
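    As a simplified illustration of the clustering stage, the sketch below runs a minimal fuzzy c-means on grey-level intensities only and reports the membership of the brightest cluster as a crude density proxy. The cluster count, fuzzifier and synthetic intensities are assumptions, and the adaptive cluster selection and SVM step of the published algorithm are omitted.

    ```python
    import numpy as np

    def fuzzy_cmeans(x, n_clusters=4, m=2.0, n_iter=100, seed=0):
        # minimal 1D fuzzy c-means: alternate membership and center updates
        rng = np.random.default_rng(seed)
        centers = rng.choice(x, size=n_clusters, replace=False).astype(float)
        for _ in range(n_iter):
            dist = np.abs(x[:, None] - centers[None, :]) + 1e-12        # (n_samples, n_clusters)
            u = 1.0 / (dist ** (2.0 / (m - 1.0)))                       # unnormalized memberships
            u /= u.sum(axis=1, keepdims=True)
            centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
        return centers, u

    rng = np.random.default_rng(4)
    # synthetic breast-region intensities: mostly fatty (dark) plus some dense (bright) tissue
    pixels = np.concatenate([rng.normal(0.3, 0.05, 20000), rng.normal(0.7, 0.05, 5000)])
    centers, memberships = fuzzy_cmeans(pixels, n_clusters=4)
    dense_cluster = centers.argmax()                   # brightest cluster as a crude "dense" proxy
    pd_percent = 100 * (memberships.argmax(axis=1) == dense_cluster).mean()
    print("cluster centers:", np.sort(centers).round(2))
    print(f"crude percent-density proxy: {pd_percent:.1f}%")
    ```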

  5. Edge enhancement of color images using a digital micromirror device.

    PubMed

    Di Martino, J Matías; Flores, Jorge L; Ayubi, Gastón A; Alonso, Julia R; Fernández, Ariel; Ferrari, José A

    2012-06-01

    A method for orientation-selective enhancement of edges in color images is proposed. The method utilizes the capacity of digital micromirror devices to generate a positive and a negative color replica of the image used as input. When both images are slightly displaced and imaged together, one obtains an image with enhanced edges. The proposed technique does not require a coherent light source or precise alignment. The proposed method could be potentially useful for processing large image sequences in real time. Validation experiments are presented.
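    The principle is easy to verify numerically: adding a positive image to a slightly displaced negative of itself cancels everywhere except where the intensity changes, and only for edges perpendicular to the displacement. The sketch below uses a synthetic square and a one-pixel shift as stand-ins for the DMD replicas.

    ```python
    import numpy as np

    img = np.zeros((128, 128))
    img[32:96, 32:96] = 1.0                  # a bright square on a dark background

    negative = 1.0 - img                     # negative replica produced by the DMD
    shifted = np.roll(negative, 1, axis=1)   # small displacement along x

    enhanced = img + shifted                 # superposition of the two replicas
    edges = np.abs(enhanced - 1.0)           # deviation from the uniform background
    # only the vertical edges of the square (columns 32 and 96) survive; horizontal
    # edges are untouched, illustrating the orientation selectivity
    print("edge pixels found at columns:", np.unique(np.nonzero(edges)[1]))
    ```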

  6. Metrological digital audio reconstruction

    DOEpatents

    Fadeyev, Vitaliy [Berkeley, CA]; Haber, Carl [Berkeley, CA]

    2004-02-19

    Audio information stored in the undulations of grooves in a medium such as a phonograph record may be reconstructed, with little or no contact, by measuring the groove shape using precision metrology methods coupled with digital image processing and numerical analysis. The effects of damage, wear, and contamination may be compensated, in many cases, through image processing and analysis methods. The speed and data handling capacity of available computing hardware make this approach practical. Two examples used a general purpose optical metrology system to study a 50 year old 78 r.p.m. phonograph record and a commercial confocal scanning probe to study a 1920's celluloid Edison cylinder. Comparisons are presented with stylus playback of the samples and with a digitally re-mastered version of an original magnetic recording. There is also a more extensive implementation of this approach, with dedicated hardware and software.

  7. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  8. Higuchi’s Method applied to detection of changes in timbre of digital sound synthesis of string instruments with the functional transformation method

    NASA Astrophysics Data System (ADS)

    Kanjanapen, Manorth; Kunsombat, Cherdsak; Chiangga, Surasak

    2017-09-01

    The functional transformation method (FTM) is a powerful tool for detailed investigation of digital sound synthesis by physical modeling: the resulting sound, or the vibrational characteristics measured at discretized points on real instruments, is obtained by directly solving the underlying partial differential equation (PDE). In this paper, we apply Higuchi's method to examine differences in timbre and to estimate the fractal dimension of musical signals synthesized by the FTM, a quantity that carries information about their geometrical structure. The Higuchi-based analysis is simple and fast, requires no expertise in physics or musical training, and gives non-experts an accessible way to judge how similar two presented sounds are.
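    Higuchi's fractal-dimension estimator itself is a short, well-documented algorithm; a sketch in NumPy is given below. It is independent of the FTM synthesis described above, and the test signals and the k_max value in the example are purely illustrative.

      import numpy as np

      def higuchi_fd(x, k_max=10):
          """Estimate the Higuchi fractal dimension of a 1-D signal."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          ks = np.arange(1, k_max + 1)
          lengths = []
          for k in ks:
              lk = []
              for m in range(k):                              # offset of each sub-series
                  idx = np.arange(m, n, k)
                  if len(idx) < 2:
                      continue
                  # normalized curve length of the sub-sampled series
                  lk.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1)
                            / ((len(idx) - 1) * k * k))
              lengths.append(np.mean(lk))
          # fractal dimension = slope of log L(k) versus log(1/k)
          slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
          return slope

      # Example: a noisier tone yields a larger fractal dimension than a clean one
      t = np.linspace(0, 1, 4000)
      clean = np.sin(2 * np.pi * 440 * t)
      noisy = clean + 0.2 * np.random.randn(len(t))
      print(higuchi_fd(clean), higuchi_fd(noisy))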

  9. The elimination of zero-order diffraction of 10.6 μm infrared digital holography

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Yang, Chao

    2017-05-01

    A new method for eliminating the zero-order diffraction in infrared digital holography is presented in this paper. In digital holographic reconstruction, the spatial frequency of an infrared thermal imager such as a microbolometer cannot match that of common visible-light CCD or CMOS devices. The infrared imager suffers from large pixel size and low spatial resolution, so the zero-order diffraction severely degrades the reconstruction of digital holograms. The zero-order term carries very large energy and occupies the central region of the spectrum domain. We therefore design a new filtering strategy that combines two filtering processes: a Gaussian low-frequency filter and a high-pass phase-averaging filter. With correctly chosen parameters, this strategy works effectively on the holograms and fully eliminates the zero-order diffraction, as well as the two crossover bars seen in the spectrum domain. A detailed explanation and discussion of the new method are given, and experimental results are demonstrated to prove its performance.
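    The spectrum-domain suppression idea can be sketched with a single inverted-Gaussian mask; the paper's actual strategy combines a Gaussian low-frequency filter with a high-pass phase-averaging filter, so the version below is a simplified stand-in with illustrative parameters.

      import numpy as np

      def suppress_zero_order(hologram, sigma=8.0):
          """Attenuate the central (zero-order) region of a hologram spectrum (sketch)."""
          H = np.fft.fftshift(np.fft.fft2(hologram))
          ny, nx = hologram.shape
          y, x = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
          lowpass = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))   # Gaussian around DC
          H_filtered = H * (1.0 - lowpass)                            # remove zero-order energy
          return np.real(np.fft.ifft2(np.fft.ifftshift(H_filtered)))

      # Example: a synthetic off-axis fringe pattern with a strong DC (zero-order) term
      yy, xx = np.mgrid[0:256, 0:256]
      holo = 1.0 + 0.5 * np.cos(2 * np.pi * (0.10 * xx + 0.05 * yy))
      filtered = suppress_zero_order(holo)
      print(abs(holo.mean()), abs(filtered.mean()))   # the mean (zero-order) term collapses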

  10. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The proposed method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
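    A rough two-stage normalization in the spirit of ICHE can be sketched on grayscale images: shift the histogram centroid to a common target, then apply CLAHE. The published method operates on colour H&E slides with a modified CLAHE, so the scikit-image call and all parameters below are stand-ins.

      import numpy as np
      from skimage import exposure

      def iche_like_normalize(image, target_mean=0.5, clip_limit=0.01):
          """Centre the intensity histogram, then apply CLAHE (illustrative sketch)."""
          img = image.astype(float)
          img = (img - img.min()) / (img.max() - img.min() + 1e-12)   # rescale to [0, 1]
          img = np.clip(img + (target_mean - img.mean()), 0.0, 1.0)   # intensity centering
          return exposure.equalize_adapthist(img, clip_limit=clip_limit)

      # Example: two synthetic "slides" with a batch offset in staining intensity
      rng = np.random.default_rng(1)
      slide_a = rng.normal(0.35, 0.10, (256, 256))
      slide_b = rng.normal(0.65, 0.10, (256, 256))
      print(iche_like_normalize(slide_a).mean(), iche_like_normalize(slide_b).mean())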

  11. Integrated use of spatial and semantic relationships for extracting road networks from floating car data

    NASA Astrophysics Data System (ADS)

    Li, Jun; Qin, Qiming; Xie, Chao; Zhao, Yue

    2012-10-01

    The update frequency of digital road maps influences the quality of road-dependent services. However, digital road maps surveyed by probe vehicles or extracted from remotely sensed images still have a long updating cycle, and their cost remains high. With GPS technology and wireless communication technology maturing and their costs decreasing, floating car technology has been used in traffic monitoring and management, and the dynamic positioning data from floating cars have become a new data source for updating road maps. In this paper, we aim to update digital road maps using the floating car data from China's National Commercial Vehicle Monitoring Platform, and present an incremental road network extraction method suitable for the platform's GPS data, whose sampling frequency is low and which cover a large area. Based on both spatial and semantic relationships between a trajectory point and its associated road segment, the method classifies each trajectory point, and then merges every trajectory point into the candidate road network through an adding or modifying process according to its type. The road network is gradually updated until all trajectories have been processed. Finally, this method is applied in the updating of major roads in North China, and the experimental results reveal that it can accurately derive geometric information of roads under various scenes. This paper provides a highly efficient, low-cost approach to updating digital road maps.

  12. Spatially and temporally resolved diagnostics of dense sprays using gated, femtosecond, digital holography

    NASA Astrophysics Data System (ADS)

    Trolinger, James D.; Dioumaev, Andrei K.; Ziaee, Ali; Minniti, Marco; Dunn-Rankin, Derek

    2017-08-01

    This paper describes research that demonstrated gated, femtosecond, digital holography, enabling 3D microscopic viewing inside dense, almost opaque sprays, and providing a new and powerful diagnostics capability for viewing fuel atomization processes never seen before. The method works by exploiting the extremely short coherence and pulse length (approximately 30 micrometers in this implementation) provided by a femtosecond laser combined with digital holography to eliminate multiple and wide angle scattered light from particles surrounding the injection region, which normally obscures the image of interest. Photons that follow a path that differs in length by more than 30 micrometers from a straight path through the field to the sensor do not contribute to the holographic recording of photons that travel in a near straight path (ballistic and "snake" photons). To further enhance the method, off-axis digital holography was incorporated to enhance signal to noise ratio and image processing capability in reconstructed images by separating the conjugate images, which overlap and interfere in conventional in-line holography. This also enables digital holographic interferometry. Fundamental relationships and limitations were also examined. The project is a continuing collaboration between MetroLaser and the University of California, Irvine.

  13. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs

    PubMed Central

    Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong

    2015-01-01

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4 bits after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform complex calculations for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB. PMID:26712765
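    The quoted one-over-square-root-of-N noise scaling of multiple sampling is easy to check numerically; the sketch below simply averages repeated noisy conversions of one pixel value and is purely illustrative (it does not model the SAR ADC itself).

      import numpy as np

      rng = np.random.default_rng(0)
      pixel_level = 0.8        # "true" pixel output (arbitrary units)
      read_noise = 0.05        # assumed per-conversion random noise (standard deviation)

      for n_samples in (1, 4, 16):
          trials = pixel_level + read_noise * rng.standard_normal((100000, n_samples))
          averaged = trials.mean(axis=1)            # multiple sampling = averaging
          print(n_samples, averaged.std())          # shrinks roughly as read_noise / sqrt(N)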

  14. Processing of Digital Plates of the 1.2 m Baldone Observatory Schmidt Telescope

    NASA Astrophysics Data System (ADS)

    Eglitis, Ilgmars; Andruk, Vitaly

    2017-04-01

    The aim of this research is to evaluate the accuracy of the plate processing method and to perform a detailed study of the Epson Expression 10000XL scanner, which was used to digitize plates from the database collection of the 1.2 m Schmidt Telescope installed in the Baldone Observatory. Special software developed in the LINUX/MIDAS/ROMAFOT environment was used for processing the scans. Results of the digitized files with grey gradations of 8 and 16 bits were compared; an estimation of the accuracy of the developed method for rectangular coordinate determination and photometry was made. Errors in the instrumental system are ±0.026 pixels and ±0.024m for coordinates and stellar magnitudes respectively. To evaluate the repeatability of the scanner's astrometric and photometric errors, six consecutive scans of one plate were processed at a resolution of 1200 dpi. The following error estimations are obtained for stars brighter than U < 13.5m: σxy = ±0.021 to 0.027 pixels and σm = ±0.014m to 0.016m for rectangular coordinates and instrumental stellar magnitudes respectively.

  15. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
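    The core of the retinex computation is a log-ratio of the image to Gaussian-blurred "surround" estimates at several scales. The sketch below shows only that core on a grayscale image; MSRCR's colour-restoration term and automatic gain/offset corrections are omitted, and the scales are illustrative.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def multi_scale_retinex(gray, sigmas=(15, 80, 250), eps=1.0):
          """Multi-scale retinex on a grayscale image (core log-ratio only)."""
          img = gray.astype(float) + eps                     # avoid log(0)
          out = np.zeros_like(img)
          for sigma in sigmas:
              surround = gaussian_filter(img, sigma)         # local illumination estimate
              out += np.log(img) - np.log(surround + eps)    # single-scale retinex
          out /= len(sigmas)
          out = (out - out.min()) / (out.max() - out.min() + 1e-12)
          return (255 * out).astype(np.uint8)                # stretch back to 0..255

      # Example: low-contrast detail under a strong illumination gradient
      yy, xx = np.mgrid[0:256, 0:256]
      scene = 40 + 30 * np.sin(xx / 10.0)
      illumination = np.linspace(0.2, 1.0, 256)[None, :]
      enhanced = multi_scale_retinex(scene * illumination)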

  16. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    USDA-ARS?s Scientific Manuscript database

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  17. Digital-Gaming Trajectories and Second Language Development

    ERIC Educational Resources Information Center

    Scholz, Kyle W.; Schulze, Mathias

    2017-01-01

    Recent research in digital game-based language learning has been encouraging, yet it would benefit from research methods that focus on the gaming processes and second-language development (Larsen-Freeman, 2015) rather than learner/player reflection or individuals' beliefs about the validity of gameplay. This has proven challenging as research…

  18. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    NASA Technical Reports Server (NTRS)

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  19. Development of Generation System of Simplified Digital Maps

    NASA Astrophysics Data System (ADS)

    Uchimura, Keiichi; Kawano, Masato; Tokitsu, Hiroki; Hu, Zhencheng

    In recent years, digital maps have been used in a variety of scenarios, including car navigation systems and map information services over the Internet. These digital maps are formed by multiple layers of maps of different scales; the map data most suitable for the specific situation are used. Currently, the production of map data of different scales is done by hand due to constraints related to processing time and accuracy. We conducted research concerning technologies for automatic generation of simplified map data from detailed map data. In the present paper, the authors propose the following: (1) a method to transform data related to streets, rivers, etc. containing widths into line data, (2) a method to eliminate the component points of the data, and (3) a method to eliminate data that lie below a certain threshold. In addition, in order to evaluate the proposed method, a user survey was conducted; in this survey we compared maps generated using the proposed method with the commercially available maps. From the viewpoint of the amount of data reduction and processing time, and on the basis of the results of the survey, we confirmed the effectiveness of the automatic generation of simplified maps using the proposed methods.

  20. Image correlation method for DNA sequence alignment.

    PubMed

    Curilem Saldías, Millaray; Villarroel Sassarini, Felipe; Muñoz Poblete, Carlos; Vargas Vásquez, Asticio; Maureira Butler, Iván

    2012-01-01

    The complexity of searches and the volume of genomic data make sequence alignment one of bioinformatics' most active research areas. New alignment approaches have incorporated digital signal processing techniques. Among these, correlation methods are highly sensitive. This paper proposes a novel sequence alignment method based on 2-dimensional images, where each nucleic acid base is represented as a fixed gray intensity pixel. Query and known database sequences are coded to their pixel representation and sequence alignment is handled as an object-recognition-in-a-scene problem. Query and database become object and scene, respectively. An image correlation process is carried out in order to search for the best match between them. Given that this procedure can be implemented in an optical correlator, the correlation could eventually be accomplished at light speed. This paper shows an initial research stage where results were "digitally" obtained by simulating an optical correlation of DNA sequences represented as images. A total of 303 queries (variable lengths from 50 to 4500 base pairs) and 100 scenes represented by 100 x 100 images each (in total, a one million base pair database) were considered for the image correlation analysis. The results showed that correlations reached very high sensitivity (99.01%) and specificity (98.99%) and outperformed BLAST when mutation numbers increased. However, the digital correlation process was a hundred times slower than BLAST. We are currently starting an initiative to evaluate the correlation speed of a real experimental optical correlator. By doing this, we expect to fully exploit the light-speed properties of optical correlation. As the optical correlator works jointly with the computer, digital algorithms should also be optimized. The results presented in this paper are encouraging and support the study of image correlation methods for sequence alignment.
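    The matching principle can be sketched in one dimension: encode each base as a fixed gray level, slide the query over the database sequence by cross-correlation, and take the peak as the best alignment offset. This is a simplified digital analogue of the 2-D optical image correlation described above; the encoding values and the unnormalized correlation are illustrative choices.

      import numpy as np

      CODE = {"A": 0.25, "C": 0.50, "G": 0.75, "T": 1.00}    # fixed gray level per base

      def encode(seq):
          return np.array([CODE[b] for b in seq]) - np.mean(list(CODE.values()))

      def best_alignment_offset(query, database):
          """Locate a query inside a database sequence by cross-correlation (sketch)."""
          corr = np.correlate(encode(database), encode(query), mode="valid")
          return int(np.argmax(corr)), corr

      # Example: plant a 50-base query inside a random 500-base "database"
      rng = np.random.default_rng(3)
      database = "".join(rng.choice(list("ACGT"), size=500))
      query = database[137:187]                              # true offset is 137
      offset, _ = best_alignment_offset(query, database)
      print("estimated offset:", offset)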

  1. Reconstruction of a digital core containing clay minerals based on a clustering algorithm.

    PubMed

    He, Yanlong; Pu, Chunsheng; Jing, Cheng; Gu, Xiaoyu; Chen, Qingdong; Liu, Hongzhi; Khan, Nasir; Dong, Qiaoling

    2017-10-01

    It is difficult to obtain core samples and supporting information for digital core reconstruction of mature sandstone reservoirs around the world, especially for unconsolidated sandstone reservoirs. Reconstruction and division of clay minerals play a vital role in building digital cores, and reconstruction methods based on two-dimensional data are particularly suitable for microstructure simulation of sandstone reservoirs; however, accurately reconstructing the various clay minerals within digital cores remains a research challenge. In the present work, the content of clay minerals was considered on the basis of two-dimensional information about the reservoir. After application of the hybrid method, and in comparison with a model reconstructed by the process-based method, the output was a digital core containing clay clusters without labels for cluster number, size, or texture; the statistics and geometry of the reconstructed model were similar to those of the reference model. The Hoshen-Kopelman algorithm was then used to label the connected, unclassified clay clusters in the initial model, and the number and size of the clusters were recorded. The K-means clustering algorithm was subsequently applied to divide the large, labeled connected clusters into smaller clusters on the basis of differences in their characteristics. According to the clay minerals' characteristics, such as types, textures, and distributions, a digital core containing clay minerals was reconstructed by means of the clustering algorithm and judgment of the clay clusters' structure. The resulting distributions and textures of the clay minerals in the digital core were reasonable. The clustering approach improved the digital core reconstruction and provides an alternative method for simulating different clay minerals in digital cores.
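    The labelling-and-splitting step can be sketched with off-the-shelf tools: a connected-component labeller standing in for the Hoshen-Kopelman pass, followed by K-means on the coordinates of oversized clusters. This is not the authors' implementation; the size threshold, sub-cluster count and the 2-D toy mask are illustrative.

      import numpy as np
      from scipy import ndimage
      from sklearn.cluster import KMeans

      def split_large_clay_clusters(clay_mask, max_size=500, n_subclusters=3):
          """Label connected clusters and split the oversized ones (illustrative sketch)."""
          labels, n_clusters = ndimage.label(clay_mask)       # connected-component labels
          refined = labels.copy()
          next_label = n_clusters + 1
          for lab in range(1, n_clusters + 1):
              coords = np.argwhere(labels == lab)             # pixel coordinates of the cluster
              if len(coords) <= max_size:
                  continue
              km = KMeans(n_clusters=n_subclusters, n_init=10, random_state=0).fit(coords)
              for sub in range(1, n_subclusters):             # sub-cluster 0 keeps the old label
                  refined[tuple(coords[km.labels_ == sub].T)] = next_label
                  next_label += 1
          return refined

      # Example: one large square "cluster" gets split, the small one is left alone
      mask = np.zeros((128, 128), bool)
      mask[20:80, 20:80] = True
      mask[100:110, 100:110] = True
      refined = split_large_clay_clusters(mask)
      print("clusters before/after splitting:", ndimage.label(mask)[1], refined.max())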

  2. Comparison of Two Simplification Methods for Shoreline Extraction from Digital Orthophoto Images

    NASA Astrophysics Data System (ADS)

    Bayram, B.; Sen, A.; Selbesoglu, M. O.; Vārna, I.; Petersons, P.; Aykut, N. O.; Seker, D. Z.

    2017-11-01

    Coastal ecosystems are very sensitive to external influences. Coastal resources such as sand dunes, coral reefs and mangroves are vitally important for preventing coastal erosion, and human activities also threaten coastal areas; the change of coastal areas should therefore be monitored. Up-to-date, accurate shoreline information is indispensable for coastal managers and decision makers, and remote sensing and image processing techniques provide a valuable opportunity to obtain reliable shoreline information. In the presented study, the NIR bands of seven 1:5000 scaled digital orthophoto images of Riga Bay, Latvia, have been used. The object-oriented Simple Linear Clustering method was utilized to extract the shoreline of Riga Bay, and the Bend and Douglas-Peucker methods were then used to simplify the extracted shoreline in order to test the effect of each. A photogrammetrically digitized shoreline was taken as reference data for comparing the obtained results, and the accuracy assessment was carried out with the Digital Shoreline Analysis tool. As a result, the shoreline simplified by the Bend method was found to be closer to the shoreline extracted with the Simple Linear Clustering method.
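    Of the two simplification methods compared, Douglas-Peucker is compact enough to sketch directly; the tolerance and the synthetic "shoreline" below are illustrative, and real shoreline coordinates would of course be in map units.

      import numpy as np

      def douglas_peucker(points, tolerance):
          """Classic Douglas-Peucker polyline simplification (recursive sketch)."""
          points = np.asarray(points, dtype=float)
          if len(points) < 3:
              return points
          start, end = points[0], points[-1]
          chord = end - start
          chord_len = np.hypot(*chord) + 1e-12
          # perpendicular distance of every point to the chord start-end
          dists = np.abs(chord[0] * (points[:, 1] - start[1])
                         - chord[1] * (points[:, 0] - start[0])) / chord_len
          idx = int(np.argmax(dists))
          if dists[idx] > tolerance:
              left = douglas_peucker(points[: idx + 1], tolerance)
              right = douglas_peucker(points[idx:], tolerance)
              return np.vstack([left[:-1], right])            # drop the duplicated split point
          return np.vstack([start, end])

      # Example: simplify a noisy digitized line
      t = np.linspace(0, 4 * np.pi, 400)
      shoreline = np.column_stack([t, np.sin(t) + 0.02 * np.random.randn(len(t))])
      simplified = douglas_peucker(shoreline, tolerance=0.05)
      print(len(shoreline), "->", len(simplified), "vertices")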

  3. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitizing old globes, which makes it possible to use the globes in digital form. The created digital models are available to the general public over the Internet. This offers an opportunity to study old globes held in various historical collections while preventing damage to the originals. Another benefit of digitization is the possibility of comparing different models, both with one another and with current map data, by increasing the transparency of individual layers. Digitization is carried out with a special device that can digitize globes with diameters ranging from 5 cm to 120 cm. The device can be easily disassembled and is fully mobile, so globes can be digitized where they are stored. Image data of the globe surface are acquired by a digital camera firmly fastened to the device, and the acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways: as a 3D model through the JavaScript library Cesium or the Google Earth plug-in in a Web browser, or as a georeferenced map using a Tile Map Service.

  4. Evaluation of user input methods for manipulating a tablet personal computer in sterile techniques.

    PubMed

    Yamada, Akira; Komatsu, Daisuke; Suzuki, Takeshi; Kurozumi, Masahiro; Fujinaga, Yasunari; Ueda, Kazuhiko; Kadoya, Masumi

    2017-02-01

    To determine a quick and accurate user input method for manipulating tablet personal computers (PCs) in sterile techniques. We evaluated three different manipulation methods, (1) Computer mouse and sterile system drape, (2) Fingers and sterile system drape, and (3) Digitizer stylus and sterile ultrasound probe cover with a pinhole, in terms of the central processing unit (CPU) performance, manipulation performance, and contactlessness. A significant decrease in CPU score ([Formula: see text]) and an increase in CPU temperature ([Formula: see text]) were observed when a system drape was used. The respective mean times taken to select a target image from an image series (ST) and the mean times for measuring points on an image (MT) were [Formula: see text] and [Formula: see text] s for the computer mouse method, [Formula: see text] and [Formula: see text] s for the finger method, and [Formula: see text] and [Formula: see text] s for the digitizer stylus method, respectively. The ST for the finger method was significantly longer than for the digitizer stylus method ([Formula: see text]). The MT for the computer mouse method was significantly longer than for the digitizer stylus method ([Formula: see text]). The mean success rate for measuring points on an image was significantly lower for the finger method when the diameter of the target was equal to or smaller than 8 mm than for the other methods. No significant difference in the adenosine triphosphate amount at the surface of the tablet PC was observed before, during, or after manipulation via the digitizer stylus method while wearing starch-powdered sterile gloves ([Formula: see text]). Quick and accurate manipulation of tablet PCs in sterile techniques without CPU load is feasible using a digitizer stylus and sterile ultrasound probe cover with a pinhole.

  5. Direct digital conversion detector technology

    NASA Astrophysics Data System (ADS)

    Mandl, William J.; Fedors, Richard

    1995-06-01

    Future imaging sensors for the aerospace and commercial video markets will depend on low cost, high speed analog-to-digital (A/D) conversion to efficiently process optical detector signals. Current A/D methods place a heavy burden on system resources, increase noise, and limit the throughput. This paper describes a unique method for incorporating A/D conversion right on the focal plane array. This concept is based on Sigma-Delta sampling, and makes optimum use of the active detector real estate. Combined with modern digital signal processors, such devices will significantly increase data rates off the focal plane. Early conversion to digital format will also decrease the signal susceptibility to noise, lowering the communications bit error rate. Computer modeling of this concept is described, along with results from several simulation runs. A potential application for direct digital conversion is also reviewed. Future uses for this technology could range from scientific instruments to remote sensors, telecommunications gear, medical diagnostic tools, and consumer products.
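    The Sigma-Delta sampling the concept relies on can be illustrated with a toy first-order modulator: oversample the detector signal, quantize to one bit inside a feedback loop, and recover the slow signal by decimation. This is a behavioral sketch only, not the focal-plane circuit described in the paper.

      import numpy as np

      def first_order_sigma_delta(signal):
          """First-order sigma-delta modulator: samples in, 1-bit stream out (sketch)."""
          integrator, feedback = 0.0, 0.0
          bits = np.empty(len(signal))
          for i, x in enumerate(signal):
              integrator += x - feedback                  # accumulate the quantization error
              bits[i] = 1.0 if integrator >= 0 else -1.0  # 1-bit quantizer
              feedback = bits[i]                          # 1-bit DAC in the feedback path
          return bits

      # Example: oversample a slow sine, then recover it by simple averaging (decimation)
      oversampling = 64
      t = np.arange(0, 1, 1.0 / (50 * oversampling))
      x = 0.6 * np.sin(2 * np.pi * 2 * t)                 # stays inside the +/-1 range
      bitstream = first_order_sigma_delta(x)
      recovered = bitstream.reshape(-1, oversampling).mean(axis=1)
      reference = x.reshape(-1, oversampling).mean(axis=1)
      print(np.max(np.abs(recovered - reference)))        # small residual quantization noise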

  6. Computer aided design of digital controller for radial active magnetic bearings

    NASA Technical Reports Server (NTRS)

    Cai, Zhong; Shen, Zupei; Zhang, Zuming; Zhao, Hongbin

    1992-01-01

    A five degree of freedom Active Magnetic Bearing (AMB) system is developed which is controlled by digital controllers. The model of the radial AMB system is linearized and the state equation is derived. Based on state variable feedback theory, digital controllers are designed. The performance of the controllers is evaluated according to experimental results. The Computer Aided Design (CAD) method is used to design controllers for magnetic bearings. The controllers are implemented with a digital signal processing (DSP) system. The control algorithms are realized with real-time programs. It is very easy to change the controller by changing or modifying the programs. In order to identify the dynamic parameters of the controlled magnetic system, a special experiment was carried out. Also, the online Recursive Least Squares (RLS) parameter identification method is studied. It can be realized with the digital controllers. Online parameter identification is essential for the realization of an adaptive controller.
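    The online recursive least squares identification mentioned above follows a standard update; the sketch below assumes an ARX-type model structure and noise-free data, so the model order, forgetting factor and test system are illustrative rather than the bearing dynamics of the paper.

      import numpy as np

      def rls_identify(u, y, order=2, forgetting=0.99):
          """Recursive least squares estimate of an ARX model y[k] = a·y_past + b·u_past."""
          theta = np.zeros(2 * order)                     # parameter estimate [a1..an, b1..bn]
          P = 1e4 * np.eye(2 * order)                     # covariance matrix
          for k in range(order, len(y)):
              phi = np.concatenate([y[k - order:k][::-1], u[k - order:k][::-1]])
              K = P @ phi / (forgetting + phi @ P @ phi)  # gain vector
              theta = theta + K * (y[k] - phi @ theta)    # correct with the prediction error
              P = (P - np.outer(K, phi @ P)) / forgetting
          return theta

      # Example: recover the coefficients of a known second-order system
      rng = np.random.default_rng(0)
      u = rng.standard_normal(2000)
      y = np.zeros_like(u)
      for k in range(2, len(u)):
          y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.5 * u[k - 1] + 0.2 * u[k - 2]
      print(rls_identify(u, y))                           # approaches [1.5, -0.7, 0.5, 0.2]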

  7. Digital stethoscope system: the feasibility of cardiac auscultation

    NASA Astrophysics Data System (ADS)

    Pariaszewska, Katarzyna; Młyńczak, Marcel; Niewiadomski, Wiktor; Cybulski, Gerard

    2013-10-01

    The digital stethoscope system represents a new trend in methods of cardiac auscultation. Heart sounds, generated by fluctuations of blood velocity and vibrations of the muscle structure, are an important signal in the primary diagnosis of heart diseases. Since the 19th century the analog stethoscope has been used for physical examination, but advances in microelectronics have enabled the construction of digital stethoscopes, which started modern phonocardiography. The typical hardware of such a system can be divided into an analog part, consisting of a microphone and a pre-amplifier, and a digital part, containing a microcontroller with peripherals for data saving and transmission. Specialized software is usually applied for signal acquisition and digital signal processing (filtering, spectral analysis and others). This paper presents an overview of methods used in cardiac auscultation and their expected development in the future. It also describes our digital stethoscope system, which is planned to be used in polyphysiographical studies.

  8. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
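    The FFT approach can be sketched for a single channel: draw complex Gaussian amplitudes scaled by the target power spectral density and inverse-transform them. The multicorrelated case treated in the thesis additionally requires factorizing the cross-spectral-density matrix at every frequency (commonly by a Cholesky-type decomposition), which is omitted here; the flat spectrum in the example is illustrative.

      import numpy as np

      def gaussian_series_from_psd(psd_onesided, dt, seed=None):
          """Synthesize a zero-mean stationary Gaussian series with a given one-sided PSD."""
          rng = np.random.default_rng(seed)
          n_freq = len(psd_onesided)                   # bins 0 .. N/2
          n = 2 * (n_freq - 1)                         # number of time samples
          df = 1.0 / (n * dt)
          g = rng.standard_normal(n_freq) + 1j * rng.standard_normal(n_freq)
          spec = (n / 2.0) * np.sqrt(psd_onesided * df) * g
          spec[0] = 0.0                                # enforce zero mean
          spec[-1] = spec[-1].real                     # Nyquist bin must be real
          return np.fft.irfft(spec, n=n)

      # Example: flat (white) one-sided spectrum; sample variance should match its integral
      dt = 0.01
      freqs = np.fft.rfftfreq(4096, dt)
      psd = np.full_like(freqs, 2.0)                   # units^2 per Hz
      x = gaussian_series_from_psd(psd, dt, seed=1)
      print(x.var(), psd[0] * freqs[-1])               # both close to 100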

  9. Data processing method for a weak, moving telemetry signal

    NASA Technical Reports Server (NTRS)

    Kendall, W. B.; Levy, G. S.; Nixon, D. L.; Panson, P. L.

    1969-01-01

    Method of processing data from a spacecraft, where the carrier has a low signal-to-noise ratio and wide unpredictable frequency shifts, consists of analogue recording of the noisy signal along with a high-frequency tone that is used as a clock to trigger a digitizer.

  10. The Effect of Pinyin Input Experience on the Link Between Semantic and Phonology of Chinese Character in Digital Writing.

    PubMed

    Chen, Jingjun; Luo, Rong; Liu, Huashan

    2017-08-01

    With the development of ICT, digital writing is becoming much more common in everyday life. Unlike typing English words directly letter by letter, entering a Chinese character typically involves typing its phonetic alphabet and then selecting the glyph offered by Pinyin input-method software. Because this process does not require users to produce the orthographic spelling themselves, it differs from the traditional model of written language production based on handwriting. Much of the research in this domain has found that using a Pinyin input method benefits Chinese character recognition, but only a small part has explored the effects of an individual's Pinyin input experience on the Chinese character production process. We ask whether using a Pinyin input method strengthens the semantic-phonology linkage or the semantic-orthography linkage in the mental lexicon for Chinese characters. By recording the reaction time (RT) and accuracy of participants completing semantic-syllable and semantic-glyph consistency judgments, we found that the accuracy of semantic-syllable consistency judgments was higher in the group with high Pinyin input experience than in the low-experience group, while the pattern for RT was reversed; there were no significant differences between the two groups on semantic-glyph consistency judgments. We conclude that using a Pinyin input method in Chinese digital writing can strengthen the semantic-phonology linkage without weakening the semantic-orthography linkage in the mental lexicon, which means that the Pinyin input method is beneficial to lexical processing in Chinese cognition.

  11. Examining Factors of Engagement With Digital Interventions for Weight Management: Rapid Review

    PubMed Central

    2017-01-01

    Background Digital interventions for weight management provide a unique opportunity to target daily lifestyle choices and eating behaviors over a sustained period of time. However, recent evidence has demonstrated a lack of user engagement with digital health interventions, impacting on the levels of intervention effectiveness. Thus, it is critical to identify the factors that may facilitate user engagement with digital health interventions to encourage behavior change and weight management. Objective The aim of this study was to identify and synthesize the available evidence to gain insights about users’ perspectives on factors that affect engagement with digital interventions for weight management. Methods A rapid review methodology was adopted. The search strategy was executed in the following databases: Web of Science, PsycINFO, and PubMed. Studies were eligible for inclusion if they investigated users’ engagement with a digital weight management intervention and were published from 2000 onwards. A narrative synthesis of data was performed on all included studies. Results A total of 11 studies were included in the review. The studies were qualitative, mixed-methods, or randomized controlled trials. Some of the studies explored features influencing engagement when using a Web-based digital intervention, others specifically explored engagement when accessing a mobile phone app, and some looked at engagement after text message (short message service, SMS) reminders. Factors influencing engagement with digital weight management interventions were found to be both user-related (eg, perceived health benefits) and digital intervention–related (eg, ease of use and the provision of personalized information). Conclusions The findings highlight the importance of incorporating user perspectives during the digital intervention development process to encourage engagement. The review contributes to our understanding of what facilitates user engagement and points toward a coproduction approach for developing digital interventions for weight management. Particularly, it highlights the importance of thinking about user-related and digital tool–related factors from the very early stages of the intervention development process. PMID:29061557

  12. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    PubMed Central

    Keller, Brad M.; Nathan, Diane L.; Wang, Yan; Zheng, Yuanjie; Gee, James C.; Conant, Emily F.; Kontos, Despina

    2012-01-01

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., “FOR PROCESSING”) and vendor postprocessed (i.e., “FOR PRESENTATION”), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. 
These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies. PMID:22894417

  13. Colour in digital pathology: a review.

    PubMed

    Clarke, Emily L; Treanor, Darren

    2017-01-01

    Colour is central to the practice of pathology because of the use of coloured histochemical and immunohistochemical stains to visualize tissue features. Our reliance upon histochemical stains and light microscopy has evolved alongside a wide variation in slide colour, with little investigation into the implications of colour variation. However, the introduction of the digital microscope and whole-slide imaging has highlighted the need for further understanding and control of colour. This is because the digitization process itself introduces further colour variation which may affect diagnosis, and image analysis algorithms often use colour or intensity measures to detect or measure tissue features. The US Food and Drug Administration have released recent guidance stating the need to develop a method of controlling colour reproduction throughout the digitization process in whole-slide imaging for primary diagnostic use. This comprehensive review introduces applied basic colour physics and colour interpretation by the human visual system, before discussing the importance of colour in pathology. The process of colour calibration and its application to pathology are also included, as well as a summary of the current guidelines and recommendations regarding colour in digital pathology. © 2016 John Wiley & Sons Ltd.

  14. Digital holographic image fusion for a larger size object using compressive sensing

    NASA Astrophysics Data System (ADS)

    Tian, Qiuhong; Yan, Liping; Chen, Benyong; Yao, Jiabao; Zhang, Shihua

    2017-05-01

    Digital holographic image fusion for a larger size object using compressive sensing is proposed. In this method, the high frequency component of the digital hologram under discrete wavelet transform is represented sparsely by using compressive sensing so that the data redundancy of digital holographic recording can be resolved validly, the low frequency component is retained totally to ensure the image quality, and multiple reconstructed images with different clear parts corresponding to a laser spot size are fused to realize a high quality reconstructed image of a larger size object. In addition, a filter combining high-pass and low-pass filters is designed to remove the zero-order term from a digital hologram effectively. A digital holographic experimental setup based on off-axis Fresnel digital holography was constructed, and feasibility and comparison experiments were carried out. The fused image was evaluated by using the Tamura texture features. The experimental results demonstrated that the proposed method can improve the processing efficiency and visual characteristics of the fused image and enlarge the size of the measured object effectively.

  15. Interference mitigation for simultaneous transmit and receive applications on digital phased array systems

    NASA Astrophysics Data System (ADS)

    Snow, Trevor M.

    As analog-to-digital (ADC) and digital-to-analog conversion (DAC) technologies become cheaper and digital processing capabilities improve, phased array systems with digital transceivers at every element will become more commonplace. These architectures offer greater capability than traditional analog systems and enable advanced applications such as multiple-input, multiple-output (MIMO) communications, adaptive beamforming, space-time adaptive processing (STAP), and MIMO for radar. Capabilities for such systems are still limited by the need to isolate self-interference from transmitters at co-located receivers. The typical approach of time-sharing the antenna aperture between transmitters and receivers works but leaves the receivers blind for a period of time. For full-duplex operation, some systems use separate frequency bands for transmission and reception, but these require fixed filtering, which reduces the system's ability to adapt to its environment and is also an inefficient use of spectral resources. To that end, tunable, high quality-factor filters are used for sub-band isolation and protect receivers while allowing open reception at other frequencies. For more flexibility, another emergent area of related research has focused on co-located spatial isolation using multiple antennas and direct injection of interference cancellation signals into receivers, which enables same-frequency full-duplex operation. With all these methods, self-interference must be reduced by an amount that prevents saturation of the ADC. Intermodulation products generated in the receiver in this process can potentially be problematic, as certain intermodulation products may appear to come from a particular angle and cohere in the beamformer. This work explores various digital phased array architectures and how the flexibility afforded by an all-digital beamforming architecture, layered with other methods of isolation, can be used to reduce self-interference within the system. Specifically, for planar and cylindrical array symmetries, the energy coupled into receiving elements can be significantly reduced using digitally controlled near-field nulling, optimization of transmission frequencies for particular steering angles, and optimization of phase weights over restricted sets, without major impacts to the far-field performance of the system. Finally, a method for reducing in-band intermodulation that would ordinarily cohere in a system's receive beamformer is demonstrated using parallel cross-linearization of adjacent digital receivers in a phased array.

  16. Image based automatic water meter reader

    NASA Astrophysics Data System (ADS)

    Jawas, N.; Indrianto

    2018-01-01

    A water meter is a tool used to calculate water consumption. It works by utilizing water flow and shows the result on a mechanical digit counter. In everyday use, an operator manually checks the digit counter periodically and logs the number shown by the water meter to track consumption. This manual operation is time consuming and prone to human error. Therefore, in this paper we propose an automatic water meter digit reader that works from a digital image. The digit sequence is detected by utilizing contour information from the water meter front panel, and an OCR method is then used to recognize each digit character. Digit sequence detection is an important part of the overall process and determines the success of the whole system. The experiments show promising results, especially in sequence detection.
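    A rough version of the contour-plus-OCR pipeline can be sketched with OpenCV and Tesseract; the file name, thresholds and digit-shape filter below are assumptions, and a real meter reader would need tuning and a localized crop of the counter window.

      import cv2
      import pytesseract

      def read_water_meter(image_path):
          """Locate digit-shaped contours left to right and OCR each one (sketch)."""
          img = cv2.imread(image_path)
          gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
          _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
          contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          digits = []
          for c in sorted(contours, key=lambda c: cv2.boundingRect(c)[0]):
              x, y, w, h = cv2.boundingRect(c)
              if h < 15 or w < 5 or w > h:              # crude filter for digit-shaped boxes
                  continue
              roi = gray[y:y + h, x:x + w]
              text = pytesseract.image_to_string(
                  roi, config="--psm 10 -c tessedit_char_whitelist=0123456789")
              digits.append(text.strip())
          return "".join(digits)

      # Example call (hypothetical file name):
      # print(read_water_meter("meter_front_panel.jpg"))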

  17. Digital Video: Watch Me Do What I Say!

    ERIC Educational Resources Information Center

    Capraro, Robert M.; Capraro, Mary Margaret; Lamb, Charles E.

    This paper establishes a use for digital video in developing preservice teacher metacognition about the teaching process using a lesson plan-rating sheet as a guide. A lesson plan was developed to meet the specific needs of the methods instructors in a professional development program at a large public institution. The categories listed on the…

  18. Digital Immigrants in Distance Education

    ERIC Educational Resources Information Center

    Salazar-Márquez, Roberto

    2017-01-01

    The constant growth of methods of education that incorporate the Internet into teaching-learning processes has opened up a wide range of opportunities for students across the world to gain entry to undergraduate or graduate degree programs. However, if the enrolling student is a digital immigrant, the chances of success may be limited by the…

  19. Digital holographic 3D imaging spectrometry (a review)

    NASA Astrophysics Data System (ADS)

    Yoshimori, Kyu

    2017-09-01

    This paper reviews recent progress in the digital holographic 3D imaging spectrometry. The principle of this method is a marriage of incoherent holography and Fourier transform spectroscopy. Review includes principle, procedure of signal processing and experimental results to obtain a multispectral set of 3D images for spatially incoherent, polychromatic objects.

  20. Global rotational motion and displacement estimation of digital image stabilization based on the oblique vectors matching algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Fei; Hui, Mei; Zhao, Yue-jin

    2009-08-01

    An image block matching algorithm based on the motion vectors of correlated pixels in the oblique direction is presented for digital image stabilization. Digital image stabilization is a new generation of stabilization technique that obtains the relative motion among frames of a dynamic image sequence by means of digital image processing. In this method the matching parameters are calculated from vectors projected in the oblique direction; because these vectors simultaneously contain information about the transverse and vertical directions within the image blocks, better matching information can be obtained by performing the correlation operation in the oblique direction. An iterative weighted least squares method is used to eliminate block-matching errors, with weights related to the pixels' rotational angle. The center of rotation and the global motion estimate of the shaking image are obtained by weighted least squares from the estimates of blocks chosen evenly across the image, and the shaking image is then stabilized using the center of rotation and the global motion estimate. The algorithm can also run in real time by using simulated annealing in the block-matching search. An image processing system based on a DSP was used to test the algorithm; the core processor of the DSP system is a TI TMS320C6416, and a CCD camera with a resolution of 720×576 pixels provided the input video signal. Experimental results show that the algorithm runs on the real-time processing system with accurate matching precision.

  1. Preferred color correction for digital LCD TVs

    NASA Astrophysics Data System (ADS)

    Kim, Kyoung Tae; Kim, Choon-Woo; Ahn, Ji-Young; Kang, Dong-Woo; Shin, Hyun-Ho

    2009-01-01

    Instead of colorimetric color reproduction, preferred color correction is applied in digital TVs to improve subjective image quality. The first step of preferred color correction is to survey the preferred color coordinates of memory colors, which can be done through off-line human visual tests. The next step is to extract pixels of memory colors representing skin, grass and sky. For the detected pixels, colors are shifted towards the desired coordinates identified in advance. This correction process may result in undesirable contours on the boundaries between corrected and uncorrected areas. For digital TV applications, the extraction and correction process must be applied to every frame of the moving images. This paper presents a preferred color correction method in the LCH color space in which chroma and hue values are corrected independently and undesirable contours on the boundaries of correction are minimized. The proposed method shifts the coordinates of memory-color pixels towards the target color coordinates, with the amount of correction determined from the averaged coordinates of the extracted pixels, and it maintains the relative color differences within memory-color areas. Performance of the proposed method is evaluated using paired comparison, and the experimental results indicate that it reproduces perceptually pleasing images for viewers.

  2. A digital ISO expansion technique for digital cameras

    NASA Astrophysics Data System (ADS)

    Yoo, Youngjin; Lee, Kangeui; Choe, Wonhee; Park, SungChan; Lee, Seong-Deok; Kim, Chang-Yong

    2010-01-01

    Market demand for digital cameras with higher sensitivity under low-light conditions is increasing rapidly, and the digital camera market has become a tough race to provide higher ISO capability. In this paper, we explore an approach for increasing the maximum ISO capability of digital cameras without changing the structure of the image sensor or CFA. Our method is applied directly to the raw Bayer-pattern CFA image to avoid the non-linearity and noise amplification that are usually introduced after the ISP (image signal processor) of digital cameras. The proposed method fuses multiple short-exposure images, which are noisy but less blurred, and is designed to avoid the ghost artifacts caused by hand shake and object motion. To achieve the desired ISO image quality, both the low-frequency chromatic noise and the fine-grain noise that usually appear in high-ISO images are removed, and the different layers created by a two-scale non-linear decomposition of the image are then modified. Once our approach has been applied to an input Bayer-pattern CFA image, the resulting Bayer image is further processed by the ISP to obtain a fully processed RGB image. The performance of the proposed approach is evaluated by comparing SNR (signal-to-noise ratio), MTF50 (modulation transfer function), color error ΔE*ab and visual quality with reference images whose exposure times are properly extended to a variety of target sensitivities.

  3. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the need for purpose-built software for ECOUTER research.

  4. Retrieval of phase-derivative discontinuities in digital speckle pattern interferometry fringes using the Wigner-Ville distribution

    NASA Astrophysics Data System (ADS)

    Federico, Alejandro; Kaufmann, Guillermo H.

    2004-08-01

    We evaluate the application of the Wigner-Ville distribution (WVD) to measure phase gradient maps in digital speckle pattern interferometry (DSPI), when the generated correlation fringes present phase discontinuities. The performance of the WVD method is evaluated using computer-simulated fringes. The influence of the filtering process to smooth DSPI fringes and additional drawbacks that emerge when this method is applied are discussed. A comparison with the conventional method based on the continuous wavelet transform in the stationary phase approximation is also presented.

  5. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    PubMed

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.

  6. [Computer-aided method and rapid prototyping for the personalized fabrication of a silicone bandage digital prosthesis].

    PubMed

    Ventura Ferreira, Nuno; Leal, Nuno; Correia Sá, Inês; Reis, Ana; Marques, Marisa

    2014-01-01

    The fabrication of digital prostheses has acquired growing importance, not only because it can help the patient overcome psychosocial trauma but also because it promotes grip functionality. An application of three-dimensional computer-aided design technologies to the production of passive prostheses is presented through the clinical case of a fifth-finger amputee following bilateral hand replantation. Three-dimensional computerized tomography was used to collect anthropometric images of the hands. Computer-aided design techniques were used to develop the digital-file-based prosthesis from the reconstruction images by inverting and superimposing the contralateral finger images. Rapid prototyping was used to manufacture a silicone bandage prosthesis prototype. This approach replaces the traditional manual method with a virtual method that is the basis for the optimization of a high-speed, accurate and innovative process.

  7. Automatic rice crop height measurement using a field server and digital image processing.

    PubMed

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-07

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.
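    The height-measurement step can be sketched as follows: threshold the selected band to segment the bright marker bar, measure how much of the bar is still visible, and subtract from its initial height. The Otsu threshold, column window and synthetic strip below are illustrative stand-ins for the paper's band selection and filtering stages.

      import numpy as np
      from skimage.filters import threshold_otsu

      def crop_height_from_marker(band, column_range, initial_height_px):
          """Crop height (pixels) = part of the reference marker bar hidden by plants."""
          strip = band[:, column_range[0]:column_range[1]].astype(float)
          mask = strip > threshold_otsu(strip)          # bright marker vs darker canopy
          rows = np.where(mask.any(axis=1))[0]
          visible_px = rows.max() - rows.min() + 1 if len(rows) else 0
          return initial_height_px - visible_px

      # Example: the bar was 140 px tall; plants now hide its lowest 40 px
      band = np.zeros((200, 50))
      band[10:110, 20:30] = 1.0                         # rows 10..109 of the bar still visible
      print(crop_height_from_marker(band, (15, 35), initial_height_px=140))   # -> 40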

  8. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data is collected. This interpretive process that results in the final geological map is often supported by recording observations, ideas and alternative geological models in a field notebook, explored with the use of sketches and evolutionary diagrams. In combination, the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has changed smartphones into geological mapping tools that can be used to collect large amounts of geological data quickly. With GPS functionality this data is also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced in-field photography. In contrast, line drawing, for example for lithological boundary interpretation and sketching, is yet to find the digital flow that is achieved with pencil on notebook page or map. Free-form integrated sketching and notebook functionality in geological mapping software packages is in its nascence. Hence, the result is a tendency for digital geological mapping to focus on the ease of data collection rather than on the thoughts and careful observations that come from notebook sketching and interpreting boundaries on a map in the field. The final digital geological map can be assessed for when and where data was recorded, but the thought processes of the mapper are less easily assessed, and the use of observations and sketching to generate ideas and interpretations may be inhibited by reliance on digital mapping methods. All mapping methods used have their own distinct advantages and disadvantages, and with more recent technologies both hardware and software issues have arisen. We present field examples of using conventional fieldslip mapping, and compare these with more advanced technologies to highlight some of the main advantages and disadvantages of each method and discuss where geological mapping may be going in the future.

  9. Compressed digital holography: from micro towards macro

    NASA Astrophysics Data System (ADS)

    Schretter, Colas; Bettens, Stijn; Blinder, David; Pesquet-Popescu, Béatrice; Cagnazzo, Marco; Dufaux, Frédéric; Schelkens, Peter

    2016-09-01

    signal processing methods from software-driven computer engineering and applied mathematics. The compressed sensing theory in particular established a practical framework for reconstructing the scene content using few linear combinations of complex measurements and a sparse prior for regularizing the solution. Compressed sensing found direct applications in digital holography for microscopy. Indeed, the wave propagation phenomenon in free space mixes in a natural way the spatial distribution of point sources from the 3-dimensional scene. As the 3-dimensional scene is mapped to a 2-dimensional hologram, the hologram samples form a compressed representation of the scene as well. This overview paper discusses contributions in the field of compressed digital holography at the micro scale. This is followed by an outlook on future extensions towards the real-size macro scale. Thanks to advances in sensor technologies, increasing computing power and the recent improvements in sparse digital signal processing, holographic modalities are on the verge of practical high-quality visualization at a macroscopic scale where much higher resolution holograms must be acquired and processed on the computer.
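
    The compressed-sensing reconstruction idea mentioned above can be illustrated with a generic iterative soft-thresholding (ISTA) loop that recovers a sparse scene vector x from a small number of linear measurements y = Ax. The sensing matrix, regularization weight and toy dimensions below are arbitrary assumptions and are not tied to any particular holographic setup.

```python
# Generic sparse-recovery sketch (ISTA); not a holographic reconstruction pipeline.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.1, n_iter=200):
    """Minimise ||Ax - y||^2 + lam*||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy example: 40 measurements of a 10-sparse vector of length 200.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 200)) / np.sqrt(40)
x_true = np.zeros(200)
x_true[rng.choice(200, 10, replace=False)] = 1.0
x_hat = ista(A, A @ x_true, lam=0.05)
```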

  10. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    NASA Astrophysics Data System (ADS)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

    This article reports the proposal, realization and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, in particular of its combined imaging, detection, sampling and digitizing steps, which are influenced by additive and spatially discrete photodetector, aliasing and quantization noises. The method uses the camera's automatic still-image working regime and a static, two-dimensional, spatially continuous light-reflection random target with white-noise properties. The theoretical basis of the random-target method is derived from a proposed simulation model of the linear optical intensity response and from the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the measurable output and input power spectral densities. The random-target and resultant image data were obtained and processed on a PC with computation programs developed in MATLAB 6.5. The presented examples and the other results of the performed measurements demonstrate sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.
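
    The core of the random-target method, expressing the MTF as a normalized, smoothed ratio of output and input power spectral densities, can be sketched as follows. The Welch estimator, segment length and normalization at zero frequency are illustrative choices, not the paper's exact processing.

```python
# MTF estimate from input/output line profiles of a white-noise target (sketch).
import numpy as np
from scipy.signal import welch

def mtf_from_random_target(line_in, line_out, fs=1.0):
    f, psd_in = welch(line_in, fs=fs, nperseg=256)    # input (target) power spectral density
    _, psd_out = welch(line_out, fs=fs, nperseg=256)  # output (camera image) power spectral density
    mtf = np.sqrt(psd_out / psd_in)
    return f, mtf / mtf[0]                            # normalise to unity at zero frequency
```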

  11. Method and apparatus for analog signal conditioner for high speed, digital x-ray spectrometer

    DOEpatents

    Warburton, William K.; Hubbard, Bradley

    1999-01-01

    A signal processing system which accepts input from an x-ray detector-preamplifier and produces a signal of reduced dynamic range for subsequent analog-to-digital conversion. The system conditions the input signal to reduce the number of bits required in the analog-to-digital converter by removing that part of the input signal which varies only slowly in time and retaining the amplitude of the pulses which carry information about the x-rays absorbed by the detector. The parameters controlling the signal conditioner's operation can be readily supplied in digital form, allowing it to be integrated into a feedback loop as part of a larger digital x-ray spectroscopy system.

  12. Processing of 3-Dimensional Flash Lidar Terrain Images Generated From an Airborne Platform

    NASA Technical Reports Server (NTRS)

    Bulyshev, Alexander; Pierrottet, Diego; Amzajerdian, Farzin; Busch, George; Vanek, Michael; Reisse, Robert

    2009-01-01

    Data from the first Flight Test of the NASA Langley Flash Lidar system have been processed. Results of the analyses are presented and discussed. A digital elevation map of the test site is derived from the data, and is compared with the actual topography. The set of algorithms employed, starting from the initial data sorting, and continuing through to the final digital map classification is described. The accuracy, precision, and the spatial and angular resolution of the method are discussed.

  13. Automated feature detection and identification in digital point-ordered signals

    DOEpatents

    Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.

    1998-01-01

    A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, verification of the features is made using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically without initial operator set-up and without subjective operator feature judgement.
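
    A minimal sketch of morphological feature detection in a 1-D point-ordered signal is given below: an opening/closing pair estimates the baseline and a top-hat residual is thresholded against a robust noise estimate. The structuring-element length and threshold factor are assumptions, and this is not the patented algorithm itself.

```python
# 1-D morphological feature detector (illustrative sketch).
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def detect_features(signal, struct_len=15, k=3.0):
    baseline = grey_closing(grey_opening(signal, size=struct_len), size=struct_len)
    tophat = signal - baseline                                    # baseline-removed residual
    noise = np.median(np.abs(tophat - np.median(tophat))) * 1.4826  # robust sigma estimate
    return np.flatnonzero(np.abs(tophat) > k * noise)             # indices of candidate features
```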

  14. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW v2 and Siemens OPVIEW v1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the same six pairs of modalities were significantly different, but the JAFROC confidence intervals were about 32% smaller than the ROC confidence intervals. This study shows that image processing has a significant impact on the detection of microcalcifications in digital mammograms. Objective measurements, such as described here, should be used by the manufacturers to select the optimal image processing algorithm.

  15. TU-FG-209-11: Validation of a Channelized Hotelling Observer to Optimize Chest Radiography Image Processing for Nodule Detection: A Human Observer Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, A; Little, K; Chung, J

    Purpose: To validate the use of a Channelized Hotelling Observer (CHO) model for guiding image processing parameter selection and enable improved nodule detection in digital chest radiography. Methods: In a previous study, an anthropomorphic chest phantom was imaged with and without PMMA simulated nodules using a GE Discovery XR656 digital radiography system. The impact of image processing parameters was then explored using a CHO with 10 Laguerre-Gauss channels. In this work, we validate the CHO's trend in nodule detectability as a function of two processing parameters by conducting a signal-known-exactly, multi-reader-multi-case (MRMC) ROC observer study. Five naive readers scored confidence of nodule visualization in 384 images with 50% nodule prevalence. The image backgrounds were regions-of-interest extracted from 6 normal patient scans, and the digitally inserted simulated nodules were obtained from phantom data in previous work. Each patient image was processed with both a near-optimal and a worst-case parameter combination, as determined by the CHO for nodule detection. The same 192 ROIs were used for each image processing method, with 32 randomly selected lung ROIs per patient image. Finally, the MRMC data was analyzed using the freely available iMRMC software of Gallas et al. Results: The image processing parameters which were optimized for the CHO led to a statistically significant improvement (p=0.049) in human observer AUC from 0.78 to 0.86, relative to the image processing implementation which produced the lowest CHO performance. Conclusion: Differences in user-selectable image processing methods on a commercially available digital radiography system were shown to have a marked impact on performance of human observers in the task of lung nodule detection. Further, the effect of processing on humans was similar to the effect on CHO performance. Future work will expand this study to include a wider range of detection/classification tasks and more observers, including experienced chest radiologists.
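
    For readers unfamiliar with the model observer used here, the sketch below shows a generic channelized Hotelling observer with Laguerre-Gauss channels: signal-present and signal-absent ROIs are projected onto the channels and a detectability index is computed from the Hotelling template. The channel width, image size and function names are assumptions, not the study's code.

```python
# Generic channelized Hotelling observer with Laguerre-Gauss channels (sketch).
import numpy as np
from scipy.special import eval_laguerre

def lg_channels(dim=64, a=25.0, n_channels=10):
    """Radially symmetric Laguerre-Gauss channel images; 'a' is an assumed width."""
    y, x = np.mgrid[:dim, :dim] - (dim - 1) / 2.0
    g = 2.0 * np.pi * (x**2 + y**2) / a**2
    chans = [np.sqrt(2.0) / a * np.exp(-g / 2.0) * eval_laguerre(p, g) for p in range(n_channels)]
    return np.stack([c.ravel() / np.linalg.norm(c) for c in chans], axis=1)   # (dim*dim, n_channels)

def cho_detectability(signal_rois, noise_rois, U):
    """SKE detectability index d'; ROIs are (N, dim*dim) arrays of flattened images."""
    vs, vn = signal_rois @ U, noise_rois @ U                  # channel outputs
    S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    ds = vs.mean(axis=0) - vn.mean(axis=0)
    w = np.linalg.solve(S, ds)                                # Hotelling template in channel space
    return float(np.sqrt(ds @ w))
```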

  16. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems, and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.

  17. Concepts and algorithms in digital photogrammetry

    NASA Technical Reports Server (NTRS)

    Schenk, T.

    1994-01-01

    Despite much progress in digital photogrammetry, there is still a considerable lack of understanding of theories and methods which would allow a substantial increase in the automation of photogrammetric processes. The purpose of this paper is to raise awareness that the automation problem is one that cannot be solved in a bottom-up fashion by a trial-and-error approach. We present a short overview of concepts and algorithms used in digital photogrammetry. This is followed by a more detailed presentation of perceptual organization, a typical middle-level task.

  18. ROLES OF REMOTE SENSING AND CARTOGRAPHY IN THE USGS NATIONAL MAPPING DIVISION.

    USGS Publications Warehouse

    Southard, Rupert B.; Salisbury, John W.

    1983-01-01

    The inseparable roles of remote sensing and photogrammetry have been recognized to be consistent with the aims and interests of the American Society of Photogrammetry. In particular, spatial data storage, data merging and manipulation methods and other techniques originally developed for remote sensing applications also have applications for digital cartography. Also, with the introduction of much improved digital processing techniques, even relatively low resolution (80 m) traditional Landsat images can now be digitally mosaicked into excellent quality 1:250,000-scale image maps.

  19. Comparison of Image Generation And Processing Techniques For 3D Reconstruction of The Human Skull

    DTIC Science & Technology

    2001-10-25

    inexpensive Microscribe (3D digitizer) with a standard widely used and expensive CT-Scan and/or MRI for 3D reconstruction of a human skull, which will be... Microscribe 3D digitizing unit and another one using the CT-Scans (2D cross-sections) obtained from a GE scanner. Both models were then subjected to stress...these methods are still elaborate, expensive and not readily accessible. Using the hand-held digitizer, the Microscribe , X, Y and Z coordinates

  20. Advanced optical manufacturing digital integrated system

    NASA Astrophysics Data System (ADS)

    Tao, Yizheng; Li, Xinglan; Li, Wei; Tang, Dingyong

    2012-10-01

    Advanced optical manufacturing technology must be adapted to keep pace with the development of modern science and technology. To address the problems of low efficiency, low yield of finished products, poor repeatability, and inconsistency in the manufacturing of large, high-precision advanced optical components, this paper applies a business-driven approach and the Rational Unified Process method to study the advanced optical manufacturing process flow and the requirements of an advanced optical manufacturing integrated system, and puts forward its architecture and key technologies. The optical-component-centred, manufacturing-process-driven design of the Advanced Optical Manufacturing Digital Integrated System is presented. The results show that the system works effectively, realizing dynamic planning of the manufacturing process and information integration, and improving the production yield of the manufactory.

  1. Development of a digital method for neutron/gamma-ray discrimination based on matched filtering

    NASA Astrophysics Data System (ADS)

    Korolczuk, S.; Linczuk, M.; Romaniuk, R.; Zychor, I.

    2016-09-01

    Neutron/gamma-ray discrimination is crucial for measurements with detectors sensitive to both neutron and gamma-ray radiation. Different techniques to discriminate between neutrons and gamma-rays based on pulse shape analysis are widely used in many applications, e.g., homeland security, radiation dosimetry, environmental monitoring, fusion experiments, nuclear spectroscopy. A common requirement is to improve a radiation detection level with a high detection reliability. Modern electronic components, such as high speed analog to digital converters and powerful programmable digital circuits for signal processing, allow us to develop a fully digital measurement system. With this solution it is possible to optimize digital signal processing algorithms without changing any electronic components in an acquisition signal path. We report on results obtained with a digital acquisition system DNG@NCBJ designed at the National Centre for Nuclear Research. A 2'' × 2'' EJ309 liquid scintillator was used to register mixed neutron and gamma-ray radiation from PuBe sources. A dedicated algorithm for pulse shape discrimination, based on real-time filtering, was developed and implemented in hardware.
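
    A matched-filter discriminator of the kind named in the title can be sketched in a few lines: each digitized pulse is correlated with normalized neutron and gamma-ray template pulses and assigned to the class with the higher score. The templates are assumed to come from a calibration data set; thresholds, gating and the real-time hardware implementation of the paper are omitted.

```python
# Matched-filter pulse shape discrimination (illustrative sketch).
import numpy as np

def psd_matched_filter(pulse, tmpl_n, tmpl_g):
    """tmpl_n / tmpl_g: mean neutron and gamma pulses from calibration data (assumed)."""
    p = pulse / np.linalg.norm(pulse)
    score_n = np.dot(p, tmpl_n / np.linalg.norm(tmpl_n))   # correlation with neutron template
    score_g = np.dot(p, tmpl_g / np.linalg.norm(tmpl_g))   # correlation with gamma template
    return "neutron" if score_n > score_g else "gamma"
```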

  2. Educating Information Systems Students on Business Process Management (BPM) through Digital Gaming Metaphors of Virtual Reality

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2010-01-01

    Digital gaming continues to be an approach for enhancing methods of pedagogy. The study evaluates the effectiveness of a gaming product of a leading technology firm in engaging graduate students in an information systems course at a major northeast institution. Findings from a detailed perception survey of the students indicate favorable…

  3. Screen Capture Technology: A Digital Window into Students' Writing Processes

    ERIC Educational Resources Information Center

    Seror, Jeremie

    2013-01-01

    Technological innovations and the prevalence of the computer as a means of producing and engaging with texts have dramatically transformed how literacy is defined and developed in modern society. This rise in digital writing practices has led to a growing number of tools and methods that can be used to explore second language (L2) writing…

  4. The Music of the Spheres

    ERIC Educational Resources Information Center

    Lewicki, Martin; Hughes, Stephen

    2012-01-01

    This article describes a method for making a spectroscope from scrap materials, i.e. a fragment of compact disc, a cardboard box, a tube and a digital camera to record the spectrum. An image processing program such as ImageJ can be used to calculate the wavelength of emission and absorption lines from the digital photograph. Multiple images of a…

  5. Method for enhanced control of welding processes

    DOEpatents

    Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin

    2000-01-01

    Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100 × 100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus, create a uniform weld bead and high quality weld.
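
    The feedback idea, thresholding the backside weld-pool ROI, measuring its area, and correcting the welding current, can be summarized in the following sketch. The automatic threshold, the proportional gain and the function name are illustrative assumptions rather than the patented control law.

```python
# Backside weld-pool measurement and a simple proportional current correction (sketch).
import numpy as np

def weld_pool_feedback(roi, target_area_px, current_A, gain=0.002, thresh=None):
    if thresh is None:
        thresh = roi.mean() + roi.std()               # crude automatic threshold (assumption)
    pool = roi > thresh                               # thresholded 100 x 100 ROI
    area = int(pool.sum())                            # weld pool area in pixels
    cols = np.flatnonzero(pool.any(axis=0))
    width = int(cols[-1] - cols[0] + 1) if cols.size else 0
    new_current = current_A - gain * (area - target_area_px)   # proportional correction
    return area, width, new_current
```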

  6. Dual-channel in-line digital holographic double random phase encryption

    PubMed Central

    Das, Bhargab; Yelleswarapu, Chandra S; Rao, D V G L N

    2012-01-01

    We present a robust encryption method for the encoding of 2D/3D objects using digital holography and virtual optics. Using our recently developed dual-plane in-line digital holography technique, two in-line digital holograms are recorded at two different planes and are encrypted using two different double random phase encryption configurations, independently. The process of using two mutually exclusive encryption channels makes the system more robust against attacks since both the channels should be decrypted accurately in order to get a recognizable reconstruction. Results show that the reconstructed object is unrecognizable even when the portion of the correct phase keys used during decryption is close to 75%. The system is verified against blind decryptions by evaluating the SNR and MSE. Validation of the proposed method and sensitivities of the associated parameters are quantitatively analyzed and illustrated. PMID:23471012
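
    The underlying double random phase encoding step can be written compactly as below, in its textbook single-plane Fourier form rather than the dual-plane in-line configuration of the paper; key1 and key2 stand for the two random phase masks.

```python
# Classic double random phase encoding/decoding (textbook form, not the paper's exact scheme).
import numpy as np

def drpe_encrypt(field, key1, key2):
    """key1, key2: uniform random arrays in [0, 1) with the same shape as 'field'."""
    x = field * np.exp(2j * np.pi * key1)               # phase mask in the input plane
    X = np.fft.fft2(x) * np.exp(2j * np.pi * key2)      # phase mask in the Fourier plane
    return np.fft.ifft2(X)

def drpe_decrypt(cipher, key1, key2):
    X = np.fft.fft2(cipher) * np.exp(-2j * np.pi * key2)
    return np.fft.ifft2(X) * np.exp(-2j * np.pi * key1)
```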

  7. Preservation of Earth Science Data History with Digital Content Repository Technology

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Pan, J.; Shrestha, B.; Cook, R. B.

    2011-12-01

    An increasing need for derived and on-demand data product in Earth Science research makes the digital content more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, this increasing need presents additional challenges in managing data processing history information and delivering such information to end users. For example, the North American Carbon Program (NACP) Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) chose a modified SYNMAP land cover data as one of the input driver data for participating terrestrial biospheric models. The global 1km resolution SYNMAP data was created by harmonizing 3 remote sensing-based land cover products: GLCC, GLC2000, and the MODIS land cover product. The original SYNMAP land cover data was aggregated into half and quarter degree resolution. It was then enhanced with more detailed grassland and cropland types. Currently, there lacks an effective mechanism to convey this data processing information to different modeling teams for them to determine if a data product meets their needs. It still highly relies on offline human interaction. The NASA-sponsored ORNL DAAC has leveraged the contemporary digital object repository technology to promote the representation, management, and delivery of data processing history and provenance information. Within digital object repository, different data products are managed as objects, with metadata as attributes and content delivery and management services as dissemination methods. Derivation relationships among data products can be semantically referenced between digital objects. Within the repository, data users can easily track a derived data product back to its origin, explorer metadata and documents about each intermediate data product, and discover processing details involved in each derivation step. Coupled with Drupal Web Content Management System, the digital repository interface was enhanced to provide intuitive graphic representation of the data processing history. Each data product is also associated with a formal metadata record in FGDC standards, and the main fields of the FGDC record are indexed for search, and are displayed as attributes of the data product. These features enable data users to better understand and consume a data product. The representation of data processing history in digital repository can further promote long-term data preservation. Lineage information is a major aspect to make digital data understandable and usable long time into the future. Derivation references can be setup between digital objects not only within a single digital repository, but also across multiple distributed digital repositories. Along with emerging identification mechanisms, such as Digital Object Identifier (DOI), a flexible distributed digital repository network can be setup to better preserve digital content. In this presentation, we describe how digital content repository technology can be used to manage, preserve, and deliver digital data processing history information in Earth Science research domain, with selected data archived in ORNL DAAC and Model and Synthesis Thematic Data Center (MAST-DC) as testing targets.

  8. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation.

    PubMed

    Keller, Brad M; Nathan, Diane L; Wang, Yan; Zheng, Yuanjie; Gee, James C; Conant, Emily F; Kontos, Despina

    2012-08-01

    The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., "FOR PROCESSING") and vendor postprocessed (i.e., "FOR PRESENTATION"), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.
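
    A highly simplified sketch of the two main ingredients, fuzzy c-means clustering of breast-pixel intensities and an SVM decision on which clusters are fibroglandular, is given below. The fixed cluster count, the toy cluster features and the assumption of an already trained SVC are illustrative; the paper's adaptive, multiclass, per-image optimization is not reproduced.

```python
# Simplified FCM clustering + SVM cluster labelling for percent density (sketch).
import numpy as np
from sklearn.svm import SVC

def fuzzy_cmeans(x, c, m=2.0, n_iter=100):
    """Minimal fuzzy c-means on a 1-D array of pixel intensities."""
    rng = np.random.default_rng(0)
    centers = rng.choice(x, size=c, replace=False).astype(float)
    for _ in range(n_iter):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))              # membership before normalisation
        u /= u.sum(axis=1, keepdims=True)
        centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
    return centers, u

def percent_density(breast_pixels, svm: SVC, n_clusters=4):
    """'svm' is assumed to be already trained on toy cluster features [centre, size fraction]."""
    centers, u = fuzzy_cmeans(breast_pixels, n_clusters)
    labels = u.argmax(axis=1)
    feats = np.array([[centers[k], (labels == k).mean()] for k in range(n_clusters)])
    dense = svm.predict(feats)                          # 1 = fibroglandular, 0 = fatty (assumed labels)
    dense_mask = np.isin(labels, np.flatnonzero(dense == 1))
    return 100.0 * dense_mask.mean()
```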

  9. Dynamic neural network-based methods for compensation of nonlinear effects in multimode communication lines

    NASA Astrophysics Data System (ADS)

    Sidelnikov, O. S.; Redyuk, A. A.; Sygletos, S.

    2017-12-01

    We consider neural network-based schemes of digital signal processing. It is shown that the use of a dynamic neural network-based scheme of signal processing ensures an increase in the optical signal transmission quality in comparison with that provided by other methods for nonlinear distortion compensation.

  10. Demosaiced pixel super-resolution in digital holography for multiplexed computational color imaging on-a-chip (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wu, Yichen; Zhang, Yibo; Luo, Wei; Ozcan, Aydogan

    2017-03-01

    Digital holographic on-chip microscopy achieves large space-bandwidth-products (e.g., >1 billion) by making use of pixel super-resolution techniques. To synthesize a digital holographic color image, one can take three sets of holograms representing the red (R), green (G) and blue (B) parts of the spectrum and digitally combine them to synthesize a color image. The data acquisition efficiency of this sequential illumination process can be improved by 3-fold using wavelength-multiplexed R, G and B illumination that simultaneously illuminates the sample, and using a Bayer color image sensor with known or calibrated transmission spectra to digitally demultiplex these three wavelength channels. This demultiplexing step is conventionally used with interpolation-based Bayer demosaicing methods. However, because the pixels of different color channels on a Bayer image sensor chip are not at the same physical location, conventional interpolation-based demosaicing process generates strong color artifacts, especially at rapidly oscillating hologram fringes, which become even more pronounced through digital wave propagation and phase retrieval processes. Here, we demonstrate that by merging the pixel super-resolution framework into the demultiplexing process, such color artifacts can be greatly suppressed. This novel technique, termed demosaiced pixel super-resolution (D-PSR) for digital holographic imaging, achieves very similar color imaging performance compared to conventional sequential R,G,B illumination, with 3-fold improvement in image acquisition time and data-efficiency. We successfully demonstrated the color imaging performance of this approach by imaging stained Pap smears. The D-PSR technique is broadly applicable to high-throughput, high-resolution digital holographic color microscopy techniques that can be used in resource-limited-settings and point-of-care offices.

  11. Computer analysis and mapping of gypsy moth levels in Pennsylvania using LANDSAT-1 digital data

    NASA Technical Reports Server (NTRS)

    Williams, D. L.

    1975-01-01

    The effectiveness of using LANDSAT-1 multispectral digital data and imagery, supplemented by ground truth and aerial photography, was investigated as a new method of surveying gypsy moth (Porthetria dispar (L.)) (Lepidoptera; Lymantriidae) defoliation, which has greatly increased in Pennsylvania in recent years. Since the acreage and severity of gypsy moth defoliation reaches a peak from mid-June through the first few days of July, the July 8, 1973, LANDSAT-1 scene was chosen for analysis. Results indicate that LANDSAT-1 data can be used to discriminate between defoliated and healthy vegetation in Pennsylvania and that digital processing methods can be used to map the extent and degree of defoliation.

  12. Time-resolved gamma spectroscopy of single events

    NASA Astrophysics Data System (ADS)

    Wolszczak, W.; Dorenbos, P.

    2018-04-01

    In this article we present a method of characterizing scintillating materials by digitization of each individual scintillation pulse followed by digital signal processing. With this technique it is possible to measure the pulse shape and the energy of an absorbed gamma photon on an event-by-event basis. In contrast to the time-correlated single-photon counting technique, the digital approach provides a faster measurement and active noise suppression, and enables characterization of scintillation pulses simultaneously in two domains: time and energy. We applied this method to study the change of the pulse shape of a CsI(Tl) scintillator with the energy of the gamma excitation. We confirmed previously published results and revealed new details of the phenomenon.

  13. Kyiv UkrVO glass archives: new life

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Golovnya, V.; Andruk, V.; Shatokhina, S.; Yizhakevych, O.; Kazantseva, L.; Lukianchuk, V.

    In the framework of the UkrVO national project, new methods of digital processing of plate images have been developed. The photographic material of the UkrVO Joint Digital Archive (JDA) is used to solve the classic astrometric problem of positional and photometric determination of the objects registered on the plates. Tests of the methods show that the positional rms errors are better than ±150 mas in both coordinates and the photometric errors are better than ±0.20 mag, with the Tycho-2 catalogue as the reference.

  14. Life cycle cost evaluation of the digital opacity compliance system.

    PubMed

    McFarland, Michael J; Palmer, Glenn R; Olivas, Arthur C

    2010-01-01

    The US Environmental Protection Agency (EPA) has established EPA Reference Method 9 (Method 9) as the preferred enforcement approach for verifying compliance with federal visible opacity standards. While Method 9 has an extensive history of successful employment, reliance on human observers to quantify visible emissions is inherently subjective, a characteristic that exposes Method 9 results to claims of inaccuracy, bias and, in some cases, outright fraud. The Digital Opacity Compliance System (DOCS), which employs commercial-off-the-shelf digital photography coupled with simple computer processing, is a new approach for quantifying visible opacity. The DOCS technology has been previously demonstrated to meet and, in many cases, surpass the Method 9 accuracy and reliability standards (McFarland et al., 2006). Beyond its performance relative to Method 9, DOCS provides a permanent visual record of opacity, a vital feature in legal compliance challenges. In recent DOCS field testing, the opacity analysis of two hundred and forty one (241) regulated air emissions from the following industrial processes: 1) industrial scrubbers, 2) emergency generators, 3) asphalt paving, 4) steel production and 5) incineration indicated that Method 9 and DOCS were statistically equivalent at the 99% confidence level. However, a life cycle cost analysis demonstrated that implementation of DOCS could potentially save a facility $15,732 per trained opacity observer compared to utilization of Method 9. Copyright 2009 Elsevier Ltd. All rights reserved.

  15. Incorporating clinical metadata with digital image features for automated identification of cutaneous melanoma.

    PubMed

    Liu, Z; Sun, J; Smith, M; Smith, L; Warr, R

    2013-11-01

    Computer-assisted diagnosis (CAD) of malignant melanoma (MM) has been advocated to help clinicians to achieve a more objective and reliable assessment. However, conventional CAD systems examine only the features extracted from digital photographs of lesions. Failure to incorporate patients' personal information constrains the applicability in clinical settings. The objective was to develop a new CAD system that improves the performance of automatic diagnosis of melanoma and, for the first time, incorporates digital features of lesions with important patient metadata into the learning process. Thirty-two features were extracted from digital photographs to characterize skin lesions. Patients' personal information, such as age, gender and lesion site, and their combinations, was quantified as metadata. The integration of digital features and metadata was realized through an extended Laplacian eigenmap, a dimensionality-reduction method grouping lesions with similar digital features and metadata into the same classes. The diagnosis reached 82.1% sensitivity and 86.1% specificity when only multidimensional digital features were used, but improved to 95.2% sensitivity and 91.0% specificity after metadata were incorporated appropriately. The proposed system achieves a level of sensitivity comparable with experienced dermatologists aided by conventional dermoscopes. This demonstrates the potential of our method for assisting clinicians in diagnosing melanoma, and the benefit it could provide to patients and hospitals by greatly reducing unnecessary excisions of benign naevi. This paper proposes an enhanced CAD system incorporating clinical metadata into the learning process for automatic classification of melanoma. Results demonstrate that the additional metadata and the mechanism to incorporate them are useful for improving CAD of melanoma. © 2013 British Association of Dermatologists.

  16. Method and apparatus for data decoding and processing

    DOEpatents

    Hunter, Timothy M.; Levy, Arthur J.

    1992-01-01

    A system and technique are disclosed for automatically controlling the decoding and digitization of an analog tape. The system includes the use of a tape data format which includes a plurality of digital codes recorded on the analog tape in a predetermined proximity to a period of recorded analog data. The codes associated with each period of analog data include digital identification codes prior to the analog data, a start of data code coincident with the analog data recording, and an end of data code subsequent to the associated period of recorded analog data. The formatted tape is decoded in a processing and digitization system which includes an analog tape player coupled to a digitizer to transmit analog information from the recorded tape over at least one channel to the digitizer. At the same time, the tape player is coupled to a decoder and interface system which detects and decodes the digital codes on the tape corresponding to each period of recorded analog data and controls tape movement and digitizer initiation in response to preprogrammed modes. A host computer is also coupled to the decoder and interface system and the digitizer and programmed to initiate specific modes of data decoding through the decoder and interface system, including the automatic compilation and storage of digital identification information and digitized data for the period of recorded analog data corresponding to the digital identification data, compilation and storage of selected digitized data representing periods of recorded analog data, and compilation of digital identification information related to each of the periods of recorded analog data.

  17. Digital Signal Processing Methods for Ultrasonic Echoes.

    PubMed

    Sinding, Kyle; Drapaca, Corina; Tittmann, Bernhard

    2016-04-28

    Digital signal processing has become an important component of data analysis needed in industrial applications. In particular, for ultrasonic thickness measurements the signal to noise ratio plays a major role in the accurate calculation of the arrival time. For this application a band pass filter is not sufficient since the noise level cannot be significantly decreased such that a reliable thickness measurement can be performed. This paper demonstrates the abilities of two regularization methods - total variation and Tikhonov - to filter acoustic and ultrasonic signals. Both of these methods are compared to a frequency based filtering for digitally produced signals as well as signals produced by ultrasonic transducers. This paper demonstrates the ability of the total variation and Tikhonov filters to accurately recover signals from noisy acoustic signals faster than a band pass filter. Furthermore, the total variation filter has been shown to reduce the noise of a signal significantly for signals with clear ultrasonic echoes. Signal to noise ratios have been increased over 400% by using a simple parameter optimization. While frequency based filtering is efficient for specific applications, this paper shows that the reduction of noise in ultrasonic systems can be much more efficient with regularization methods.
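
    The two regularization approaches compared in the paper can be sketched as follows for a 1-D echo: a Tikhonov smoother solved as a sparse linear system and a total-variation filter from scikit-image. The regularization weights and the synthetic echo are arbitrary assumptions used only to show the form of the computation.

```python
# Tikhonov and total-variation filtering of a noisy synthetic echo (sketch).
import numpy as np
from scipy.sparse import eye, diags
from scipy.sparse.linalg import spsolve
from skimage.restoration import denoise_tv_chambolle

def tikhonov_1d(y, lam=5.0):
    """Minimise ||u - y||^2 + lam*||D u||^2 with a first-difference operator D."""
    n = len(y)
    D = diags([1.0, -1.0], [0, 1], shape=(n - 1, n))
    return spsolve((eye(n) + lam * (D.T @ D)).tocsc(), y)

# Toy ultrasonic-like echo plus noise; weights are illustrative.
t = np.linspace(0, 1, 2000)
echo = np.exp(-((t - 0.4) / 0.01) ** 2) * np.sin(2 * np.pi * 200 * t)
noisy = echo + 0.3 * np.random.default_rng(1).standard_normal(t.size)
tv_filtered = denoise_tv_chambolle(noisy, weight=0.2)
tik_filtered = tikhonov_1d(noisy, lam=5.0)
```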

  18. Digital Twin concept for smart injection molding

    NASA Astrophysics Data System (ADS)

    Liau, Y.; Lee, H.; Ryu, K.

    2018-03-01

    The injection molding industry has evolved over decades and has become the most common method of manufacturing plastic parts. Monitoring and improvement in the injection molding industry are usually performed separately in each stage, i.e. mold design, mold making and the injection molding process. However, in order to make a breakthrough and survive in the industrial revolution, all the stages in injection molding need to be linked and to communicate with each other. Any change in one stage will affect the other stages because they are correlated. Hence, the simulation should not only be based on historical input data, but also needs to include the current condition of the equipment and the prediction of future events in other stages in order to make responsive decisions. This can be achieved by implementing the Digital Twin concept, which models the entire process as a virtual model and enables bidirectional control with the physical process. This paper presents the types of data and technology required to build the Digital Twin for the injection molding industry. The concept includes a Digital Twin of each stage and the integration of these Digital Twin models into a comprehensive model of the injection molding industry.

  19. Color quality improvement of reconstructed images in color digital holography using speckle method and spectral estimation

    NASA Astrophysics Data System (ADS)

    Funamizu, Hideki; Onodera, Yusei; Aizu, Yoshihisa

    2018-05-01

    In this study, we report color quality improvement of reconstructed images in color digital holography using the speckle method and the spectral estimation. In this technique, an object is illuminated by a speckle field and then an object wave is produced, while a plane wave is used as a reference wave. For three wavelengths, the interference patterns of two coherent waves are recorded as digital holograms on an image sensor. Speckle fields are changed by moving a ground glass plate in an in-plane direction, and a number of holograms are acquired to average the reconstructed images. After the averaging process of images reconstructed from multiple holograms, we use the Wiener estimation method for obtaining spectral transmittance curves in reconstructed images. The color reproducibility in this method is demonstrated and evaluated using a Macbeth color chart film and staining cells of onion.

  20. Counting pollen grains using readily available, free image processing and analysis software.

    PubMed

    Costa, Clayton M; Yang, Suann

    2009-10-01

    Although many methods exist for quantifying the number of pollen grains in a sample, there are few standard methods that are user-friendly, inexpensive and reliable. The present contribution describes a new method of counting pollen using readily available, free image processing and analysis software. Pollen was collected from anthers of two species, Carduus acanthoides and C. nutans (Asteraceae), then illuminated on slides and digitally photographed through a stereomicroscope. Using ImageJ (NIH), these digital images were processed to remove noise and sharpen individual pollen grains, then analysed to obtain a reliable total count of the number of grains present in the image. A macro was developed to analyse multiple images together. To assess the accuracy and consistency of pollen counting by ImageJ analysis, counts were compared with those made by the human eye. Image analysis produced pollen counts in 60 s or less per image, considerably faster than counting with the human eye (5-68 min). In addition, counts produced with the ImageJ procedure were similar to those obtained by eye. Because count parameters are adjustable, this image analysis protocol may be used for many other plant species. Thus, the method provides a quick, inexpensive and reliable solution to counting pollen from digital images, not only reducing the chance of error but also substantially lowering labour requirements.
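
    A scripted equivalent of the ImageJ workflow (grey conversion, noise removal, thresholding, particle counting) might look like the sketch below; the Gaussian sigma, Otsu threshold, minimum particle size and the assumption that grains are darker than the background are illustrative choices.

```python
# Pollen counting from a digital photograph (illustrative scikit-image sketch).
from skimage import io, color, filters, morphology, measure

def count_pollen(image_path, min_size=30):
    grey = color.rgb2gray(io.imread(image_path))
    smooth = filters.gaussian(grey, sigma=1)               # remove noise
    mask = smooth < filters.threshold_otsu(smooth)         # grains assumed darker than background
    mask = morphology.remove_small_objects(mask, min_size=min_size)
    return measure.label(mask).max()                       # number of connected particles
```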

  1. Real-time monitoring of the solution concentration variation during the crystallization process of protein-lysozyme by using digital holographic interferometry.

    PubMed

    Zhang, Yanyan; Zhao, Jianlin; Di, Jianglei; Jiang, Hongzhen; Wang, Qian; Wang, Jun; Guo, Yunzhu; Yin, Dachuan

    2012-07-30

    We report a real-time measurement method for the solution concentration variation during the growth of protein (lysozyme) crystals based on digital holographic interferometry. A series of holograms containing the information on the solution concentration variation over the whole crystallization process is recorded by a CCD. Based on the principle of double-exposure holographic interferometry and the relationship between the phase difference of the reconstructed object wave and the solution concentration, the solution concentration variation with time for an arbitrary point in the solution can be obtained, and the two-dimensional concentration distribution of the solution during the crystallization process can also be determined, under the precondition that the refractive index is constant along the light propagation direction. The experimental results show that this method is feasible for in situ, full-field and real-time monitoring of the crystal growth process.
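
    The double-exposure evaluation step, turning the phase difference between two reconstructed object waves into a concentration change, can be sketched as below. The unwrapping routine, cell depth and dn/dC value are assumptions; hologram recording and numerical reconstruction are not shown.

```python
# Phase-difference to concentration-change map (double-exposure interferometry sketch).
import numpy as np
from skimage.restoration import unwrap_phase

def concentration_change(u_ref, u_now, wavelength, cell_depth, dn_dc):
    """u_ref, u_now: complex reconstructed object waves at the two exposure times."""
    dphi = unwrap_phase(np.angle(u_now * np.conj(u_ref)))   # unwrapped phase difference
    dn = dphi * wavelength / (2.0 * np.pi * cell_depth)     # refractive-index change
    return dn / dn_dc                                       # concentration change map
```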

  2. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities for use as thermal and hydrodynamic tracers. In biomedical situations, e.g. skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are also presented.

  3. Accurate reconstruction in digital holographic microscopy using Fresnel dual-tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolei; Zhang, Xiangchao; Yuan, He; Zhang, Hao; Xu, Min

    2018-02-01

    Digital holography is a promising measurement method in the fields of bio-medicine and micro-electronics. However, the captured images in digital holography are severely polluted by speckle noise because of optical scattering and diffraction. By analyzing the properties of Fresnel diffraction and the topographies of micro-structures, a novel reconstruction method based on the dual-tree complex wavelet transform (DT-CWT) is proposed. This algorithm is shift-invariant and capable of obtaining sparse representations for the diffracted signals of salient features, thus it is well suited for multiresolution processing of the interferometric holograms of directional morphologies. An explicit representation of orthogonal Fresnel DT-CWT bases and a specific filtering method are developed. This method can effectively remove the speckle noise without destroying the salient features. Finally, the proposed reconstruction method is compared with the conventional Fresnel diffraction integration and Fresnel wavelet transform with compressive sensing methods to validate its remarkable superiority in topography reconstruction and speckle removal.

  4. Improved Discrete Approximation of Laplacian of Gaussian

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L., Jr.

    2004-01-01

    An improved method of computing a discrete approximation of the Laplacian of a Gaussian convolution of an image has been devised. The primary advantage of the method is that without substantially degrading the accuracy of the end result, it reduces the amount of information that must be processed and thus reduces the amount of circuitry needed to perform the Laplacian-of-Gaussian (LOG) operation. Some background information is necessary to place the method in context. The method is intended for application to the LOG part of a process of real-time digital filtering of digitized video data that represent brightnesses in pixels in a square array. The particular filtering process of interest is one that converts pixel brightnesses to binary form, thereby reducing the amount of computation that must be performed in subsequent correlation processing (e.g., correlations between images in a stereoscopic pair for determining distances or correlations between successive frames of the same image for detecting motions). The Laplacian is often included in the filtering process because it emphasizes edges and textures, while the Gaussian is often included because it smooths out noise that might not be consistent between left and right images or between successive frames of the same image.
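
    The filtering step described here, a Laplacian-of-Gaussian followed by binarization of the result, can be written in two lines; the sigma value and the sign-based 1-bit quantization are illustrative assumptions rather than the specific discrete approximation devised in the article.

```python
# LOG filtering followed by sign binarisation, as a pre-step for correlation processing (sketch).
import numpy as np
from scipy.ndimage import gaussian_laplace

def binary_log(frame, sigma=2.0):
    log = gaussian_laplace(frame.astype(float), sigma=sigma)
    return (log > 0).astype(np.uint8)       # 1-bit image that keeps edge/texture structure
```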

  5. Digit-color synaesthesia only enhances memory for colors in a specific context: A new method of duration thresholds to measure serial recall.

    PubMed

    Teichmann, A Lina; Nieuwenstein, Mark R; Rich, Anina N

    2017-08-01

    For digit-color synaesthetes, digits elicit vivid experiences of color that are highly consistent for each individual. The conscious experience of synaesthesia is typically unidirectional: Digits evoke colors but not vice versa. There is an ongoing debate about whether synaesthetes have a memory advantage over non-synaesthetes. One key question in this debate is whether synaesthetes have a general superiority or whether any benefit is specific to a certain type of material. Here, we focus on immediate serial recall and ask digit-color synaesthetes and controls to memorize digit and color sequences. We developed a sensitive staircase method manipulating presentation duration to measure participants' serial recall of both overlearned and novel sequences. Our results show that synaesthetes can activate digit information to enhance serial memory for color sequences. When color sequences corresponded to ascending or descending digit sequences, synaesthetes encoded these sequences at a faster rate than their non-synaesthete counterparts and faster than non-structured color sequences. However, encoding color sequences is approximately 200 ms slower than encoding digit sequences directly, independent of group and condition, which shows that the translation process is time consuming. These results suggest that memory advantages in synaesthesia require a modified dual-coding account, in which secondary (synaesthetically linked) information is useful only if it is more memorable than the primary information to be recalled. Our study further shows that duration thresholds are a sensitive method to measure subtle differences in serial recall performance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Method for the visualization of landform by mapping using low altitude UAV application

    NASA Astrophysics Data System (ADS)

    Sharan Kumar, N.; Ashraf Mohamad Ismail, Mohd; Sukor, Nur Sabahiah Abdul; Cheang, William

    2018-05-01

    Unmanned Aerial Vehicles (UAVs) and digital photogrammetry are evolving rapidly in mapping technology, and the significance of and need for digital landform mapping have grown over the years. In this study, a mapping workflow is applied to obtain two different input data sets, an orthophoto and a DSM. Low Altitude Aerial Photography (LAAP) is captured with a low-altitude UAV (drone) fitted with a fixed camera, while digital photogrammetric processing using PhotoScan is applied for cartographic data collection. Data processing through photogrammetric and orthomosaic workflows is the main application. High image quality is essential for the effectiveness and quality of the usual mapping outputs, such as 3D models, Digital Elevation Models (DEM), Digital Surface Models (DSM) and orthoimages. The accuracy of Ground Control Points (GCPs), the flight altitude and the resolution of the camera are essential for a good-quality DEM and orthophoto.

  7. Research on control law accelerator of digital signal process chip TMS320F28035 for real-time data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Zhao, Shuangle; Zhang, Xueyi; Sun, Shengli; Wang, Xudong

    2017-08-01

    The TI C2000 series digital signal processing (DSP) chip has been widely used in electrical engineering, measurement and control, communications and other professional fields, and the TMS320F28035 is one of the most representative of its kind. A DSP program typically needs to perform both data acquisition and data processing; if ordinary C or assembly language programming is used, the program runs sequentially, the analogue-to-digital (AD) converter cannot acquire in real time, and a lot of data is often missed. The control law accelerator (CLA) coprocessor can run in parallel with the main central processing unit (CPU), operates at the same clock frequency as the main CPU, and supports floating-point operations. Therefore, the CLA coprocessor is used in the program: the CLA kernel is responsible for data processing, while the main CPU is responsible for the AD conversion. The advantage of this method is that it reduces data-processing time and achieves real-time data acquisition.

  8. Method and apparatus for analog signal conditioner for high speed, digital x-ray spectrometer

    DOEpatents

    Warburton, W.K.; Hubbard, B.

    1999-02-09

    A signal processing system which accepts input from an x-ray detector-preamplifier and produces a signal of reduced dynamic range for subsequent analog-to-digital conversion is disclosed. The system conditions the input signal to reduce the number of bits required in the analog-to-digital converter by removing that part of the input signal which varies only slowly in time and retaining the amplitude of the pulses which carry information about the x-rays absorbed by the detector. The parameters controlling the signal conditioner's operation can be readily supplied in digital form, allowing it to be integrated into a feedback loop as part of a larger digital x-ray spectroscopy system. 13 figs.
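    A rough numerical analogy of the signal-conditioning idea in this patent, under stated assumptions: the slowly varying part of a preamplifier-like trace is estimated with a running mean and subtracted so that only the pulse amplitudes remain, shrinking the dynamic range an ADC would have to cover. The running-mean estimator and all constants are illustrative, not the patented circuit.

    ```python
    # Illustrative baseline-removal analogy (not the patented conditioner):
    # subtract a slow running-mean estimate so only fast pulse content remains.
    import numpy as np

    def condition(signal: np.ndarray, baseline_window: int = 500) -> np.ndarray:
        kernel = np.ones(baseline_window) / baseline_window
        baseline = np.convolve(signal, kernel, mode="same")   # slow component
        return signal - baseline                               # pulse content

    n = 10_000
    slow_drift = np.linspace(0, 50, n)                         # large slow ramp
    pulses = np.zeros(n)
    pulses[2000] = pulses[6000] = 1.0
    fast = np.convolve(pulses, np.exp(-np.arange(200) / 30.0), mode="full")[:n]
    raw = slow_drift + fast

    # Compare the dynamic range before and after conditioning
    # (central region only, to ignore running-mean edge transients).
    print(np.ptp(raw), np.ptp(condition(raw)[500:-500]))
    ```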

  9. The Effect of Pinyin Input Experience on the Link between Semantic and Phonology of Chinese Character in Digital Writing

    ERIC Educational Resources Information Center

    Chen, Jingjun; Luo, Rong; Liu, Huashan

    2017-01-01

    With the development of ICT, digital writing is becoming much more common in people's lives. Unlike English words, which are input by keyboarding letters directly, Chinese characters are typically input by typing phonetic letters and then identifying the glyph provided by Pinyin input-method software, a process which does not need…

  10. Modeling the Influences of Upper-Elementary School Students' Digital Reading Literacy, Socioeconomic Factors, and Self-Regulated Learning Strategies

    ERIC Educational Resources Information Center

    Chen, Shin-Feng

    2017-01-01

    Background: Reading is an interactive and constructive process of making meaning by engaging a variety of materials and sources and by participating in reading communities at school or in daily life. Aim: The purpose of this study was to explore the factors affecting digital reading literacy among upper-elementary school students. Method: A…

  11. Hadamard multimode optical imaging transceiver

    DOEpatents

    Cooke, Bradly J; Guenther, David C; Tiee, Joe J; Kellum, Mervyn J; Olivas, Nicholas L; Weisse-Bernstein, Nina R; Judd, Stephen L; Braun, Thomas R

    2012-10-30

    Disclosed is a method and system for simultaneously acquiring and producing results for multiple image modes using a common sensor without optical filtering, scanning, or other moving parts. The system and method utilize the Walsh-Hadamard correlation detection process (e.g., functions/matrix) to provide an all-binary structure that permits seamless bridging between analog and digital domains. An embodiment may capture an incoming optical signal at an optical aperture, convert the optical signal to an electrical signal, pass the electrical signal through a Low-Noise Amplifier (LNA) to create an LNA signal, pass the LNA signal through one or more correlators where each correlator has a corresponding Walsh-Hadamard (WH) binary basis function, calculate a correlation output coefficient for each correlator as a function of the corresponding WH binary basis function in accordance with Walsh-Hadamard mathematical principles, digitize each of the correlation output coefficients by passing each correlation output coefficient through an Analog-to-Digital Converter (ADC), and perform image mode processing on the digitized correlation output coefficients as desired to produce one or more image modes. Some, but not all, potential image modes include: multi-channel access, temporal, range, three-dimensional, and synthetic aperture.
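    A minimal sketch of the Walsh-Hadamard correlation step named above, simulated in software: a sampled signal is correlated against the +1/-1 Walsh-Hadamard basis functions to produce one coefficient per correlator, and the orthogonality of the basis lets the signal be recovered from the coefficients. In the patent this correlation happens in analog hardware before the ADC; the array sizes here are assumptions.

    ```python
    # Software simulation of Walsh-Hadamard correlation coefficients.
    import numpy as np
    from scipy.linalg import hadamard

    def wh_coefficients(signal: np.ndarray) -> np.ndarray:
        n = signal.size
        H = hadamard(n)                 # n must be a power of two
        return H @ signal / n           # one coefficient per binary basis function

    rng = np.random.default_rng(0)
    x = rng.normal(size=64)             # stand-in for the sampled LNA signal
    c = wh_coefficients(x)

    # The transform is orthogonal (up to scale), so the signal is recoverable,
    # which is what allows later image-mode processing on the coefficients.
    x_rec = hadamard(64) @ c
    print(np.allclose(x, x_rec))
    ```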

  12. Digital pulse shape discrimination methods for n-γ separation in an EJ-301 liquid scintillation detector

    NASA Astrophysics Data System (ADS)

    Wan, Bo; Zhang, Xue-Ying; Chen, Liang; Ge, Hong-Lin; Ma, Fei; Zhang, Hong-Bin; Ju, Yong-Qin; Zhang, Yan-Bin; Li, Yan-Yan; Xu, Xiao-Wei

    2015-11-01

    A digital pulse shape discrimination system based on a programmable module NI-5772 has been established and tested with an EJ-301 liquid scintillation detector. The module was operated by running programs developed in LabVIEW, with a sampling frequency up to 1.6 GS/s. Standard gamma sources 22Na, 137Cs and 60Co were used to calibrate the EJ-301 liquid scintillation detector, and the gamma response function was obtained. Digital algorithms for the charge comparison method and zero-crossing method have been developed. The experimental results show that both digital signal processing (DSP) algorithms can discriminate neutrons from γ-rays. Moreover, the zero-crossing method shows better n-γ discrimination at 80 keVee and lower, whereas the charge comparison method gives better results at higher thresholds. In addition, the figures-of-merit (FOM) for detectors of two different dimensions were extracted at 9 energy thresholds, and it was found that the smaller detector presented better n-γ separation for fission neutrons. Supported by National Natural Science Foundation of China (91226107, 11305229) and the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03030300)
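    A hedged sketch of the charge-comparison discrimination idea mentioned above: the ratio of tail charge to total charge separates neutron-like pulses (slower scintillation tail) from gamma-like pulses. The gate lengths and synthetic pulse shapes are illustrative assumptions, not the values used with the NI-5772 system.

    ```python
    # Charge-comparison pulse shape discrimination on synthetic pulses.
    import numpy as np

    def charge_comparison(pulse, baseline_samples=20, short_gate=15, long_gate=120):
        base = pulse[:baseline_samples].mean()        # estimate the baseline
        p = pulse - base
        peak = int(np.argmax(p))
        total = p[peak:peak + long_gate].sum()        # total integrated charge
        tail = p[peak + short_gate:peak + long_gate].sum()  # tail charge only
        return tail / total if total > 0 else np.nan

    # Synthetic pulses: a "gamma-like" fast decay and a "neutron-like" slower tail.
    t = np.arange(300)
    gamma_pulse = np.exp(-t / 10.0)
    neutron_pulse = 0.8 * np.exp(-t / 10.0) + 0.2 * np.exp(-t / 80.0)

    print(charge_comparison(np.r_[np.zeros(20), gamma_pulse]),
          charge_comparison(np.r_[np.zeros(20), neutron_pulse]))
    ```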

  13. Overview of a FPGA-based nuclear instrumentation dedicated to primary activity measurements.

    PubMed

    Bobin, C; Bouchard, J; Pierre, S; Thiam, C

    2012-09-01

    In National Metrology Institutes like LNE-LNHB, renewal and improvement of the instrumentation is an important task. Nowadays, the current trend is to adopt digital boards, which present numerous advantages over the standard electronics. The feasibility of an on-line fulfillment of nuclear-instrumentation functionalities using a commercial FPGA-based (Field-Programmable Gate Array) board has been validated in the case of TDCR primary measurements (Triple to Double Coincidence Ratio method based on liquid scintillation). The new applications presented in this paper have been included to allow either an on-line processing of the information or a raw-data acquisition for an off-line treatment. Developed as a complementary tool for TDCR counting, a time-to-digital converter specifically designed for this technique has been added. In addition, the description is given of a spectrometry channel based on the connection between conventional shaping amplifiers and the analog-to-digital converter (ADC) input available on the same digital board. First results are presented in the case of α- and γ-counting related to, respectively, the defined solid angle and well-type NaI(Tl) primary activity techniques. The combination of two different channels (liquid scintillation and γ-spectrometry) implementing the live-time anticoincidence processing is also described for the application of the 4πβ-γ coincidence method. The need for an optimized coupling between the analog chain and the ADC stage is emphasized. The straight processing of the signals delivered by the preamplifier connected to a HPGe detector is also presented along with the first development of digital filtering. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Brain Decoding-Classification of Hand Written Digits from fMRI Data Employing Bayesian Networks

    PubMed Central

    Yargholi, Elahe'; Hossein-Zadeh, Gholam-Ali

    2016-01-01

    We are frequently exposed to hand written digits 0–9 in today's modern life. Success in decoding-classification of hand written digits helps us understand the corresponding brain mechanisms and processes and assists seriously in designing more efficient brain–computer interfaces. However, all digits belong to the same semantic category and similarity in appearance of hand written digits makes this decoding-classification a challenging problem. In the present study, for the first time, an augmented naïve Bayes classifier is used for classification of functional Magnetic Resonance Imaging (fMRI) measurements to decode the hand written digits, taking advantage of brain connectivity information in decoding-classification. fMRI was recorded from three healthy participants, with an age range of 25–30. Results in different brain lobes (frontal, occipital, parietal, and temporal) show that utilizing connectivity information significantly improves decoding-classification, and the capabilities of different brain lobes in decoding-classification of hand written digits were compared to each other. In addition, in each lobe the most contributing areas and brain connectivities were determined, and connectivities with short distances between their endpoints were recognized to be more efficient. Moreover, a data-driven method was applied to investigate the similarity of brain areas in responding to stimuli and this revealed both similarly active areas and active mechanisms during this experiment. An interesting finding was that during the experiment of watching hand written digits, there were some active networks (visual, working memory, motor, and language processing), but the most relevant one to the task was the language processing network according to the voxel selection. PMID:27468261
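    For orientation only, a plain Gaussian naive Bayes baseline on synthetic data, sketching the decode-and-classify workflow; the study itself uses an augmented naive Bayes classifier that additionally models connectivity between brain regions, and the feature matrix below is random, not fMRI data.

    ```python
    # Plain Gaussian naive Bayes decoding baseline on synthetic "voxel" features.
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_trials, n_voxels, n_classes = 200, 50, 10      # ten hand-written digits
    y = rng.integers(0, n_classes, size=n_trials)    # digit label per trial
    class_means = rng.normal(scale=0.5, size=(n_classes, n_voxels))
    X = class_means[y] + rng.normal(size=(n_trials, n_voxels))  # noisy responses

    clf = GaussianNB()
    scores = cross_val_score(clf, X, y, cv=5)        # cross-validated accuracy
    print("mean decoding accuracy:", scores.mean())
    ```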

  15. Information theoretic analysis of edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed one definitive theoretic analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene, and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is not a common tool that can be used to evaluate the performance of the different algorithms, and to give people a guide for selecting the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.

  16. Quantitative Infrared Spectroscopy in Challenging Environments: Applications to Passive Remote Sensing and Process Monitoring

    DTIC Science & Technology

    2012-12-01

    IR remote sensing offers a measurement method to detect gaseous species in the outdoor environment. Two major obstacles limit the application of this... method in quantitative analysis: (1) the effect of both temperature and concentration on the measured spectral intensities and (2) the difficulty and...crucial. In this research, particle swarm optimization, a population-based optimization method, was applied. Digital filtering and wavelet processing methods

  17. Dual-wavelength common-path digital holographic microscopy for quantitative phase imaging of biological cells

    NASA Astrophysics Data System (ADS)

    Di, Jianglei; Song, Yu; Xi, Teli; Zhang, Jiwei; Li, Ying; Ma, Chaojie; Wang, Kaiqiang; Zhao, Jianlin

    2017-11-01

    Biological cells are usually transparent with a small refractive index gradient. Digital holographic interferometry can be used in the measurement of biological cells. We propose a dual-wavelength common-path digital holographic microscopy method for the quantitative phase imaging of biological cells. In the proposed configuration, a parallel glass plate is inserted in the light path to create the lateral shearing, and two lasers with different wavelengths are used as the light source to form the dual-wavelength composite digital hologram. The information of biological cells for different wavelengths is separated and extracted in the Fourier domain of the hologram, and then combined to a shorter wavelength in the measurement process. This method could improve the system's temporal stability and reduce speckle noise simultaneously. Mouse osteoblastic cells and peony pollens are measured to show the feasibility of this method.
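    The dual-wavelength combination mentioned above is commonly summarized by the synthetic-wavelength relations below; the two laser wavelengths are assumed example values, not necessarily those used in the paper.

    ```python
    # Synthetic wavelengths from a dual-wavelength pair (example values only).
    lam1, lam2 = 532e-9, 632.8e-9            # metres (assumed laser wavelengths)

    beat = lam1 * lam2 / abs(lam1 - lam2)    # longer (beat) synthetic wavelength
    shorter = lam1 * lam2 / (lam1 + lam2)    # shorter synthetic wavelength
    print(f"beat synthetic wavelength   : {beat * 1e9:.1f} nm")
    print(f"shorter synthetic wavelength: {shorter * 1e9:.1f} nm")
    ```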

  18. Digital fabrication of textiles: an analysis of electrical networks in 3D knitted functional fabrics

    NASA Astrophysics Data System (ADS)

    Vallett, Richard; Knittel, Chelsea; Christe, Daniel; Castaneda, Nestor; Kara, Christina D.; Mazur, Krzysztof; Liu, Dani; Kontsos, Antonios; Kim, Youngmoo; Dion, Genevieve

    2017-05-01

    Digital fabrication methods are reshaping design and manufacturing processes through the adoption of pre-production visualization and analysis tools, which help minimize waste of materials and time. Despite the increasingly widespread use of digital fabrication techniques, comparatively few of these advances have benefited the design and fabrication of textiles. The development of functional fabrics such as knitted touch sensors, antennas, capacitors, and other electronic textiles could benefit from the same advances in electrical network modeling that revolutionized the design of integrated circuits. In this paper, the efficacy of using current state-of-the-art digital fabrication tools over the more common trial-and-error methods currently used in textile design is demonstrated. Gaps are then identified in the current state-of-the-art tools that must be resolved to further develop and streamline the rapidly growing field of smart textiles and devices, bringing textile production into the realm of 21st century manufacturing.

  19. Full-field wrist pulse signal acquisition and analysis by 3D Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; Su, Yong; Zhang, Chi; Xu, Xiaohai; Gao, Zeren; Wu, Shangquan; Zhang, Qingchuan; Wu, Xiaoping

    2017-11-01

    Pulse diagnosis is an essential part of the four basic diagnostic methods (inspection, listening, inquiring and palpation) in traditional Chinese medicine. Because it depends on long training and rich experience, computerized pulse acquisition has been proposed and studied to ensure objectivity. To imitate the process in which doctors use three fingertips at different pressures to feel fluctuations in certain areas containing three acupoints, we established a five-dimensional pulse signal acquisition system that adopts a non-contact optical metrology method, 3D digital image correlation, to record the full-field displacements of skin fluctuations under different pressures. The system realizes real-time full-field vibration mode observation at 10 frames per second. The maximum sampling frequency is 472 Hz for detailed post-processing. After acquisition, the signals are analyzed according to amplitude, pressure, and pulse wave velocity. The proposed system provides a novel optical approach for digitizing pulse diagnosis and for massive pulse signal data acquisition for various types of patients.

  20. Digital impression and jaw relation record for the fabrication of CAD/CAM custom tray.

    PubMed

    Kanazawa, Manabu; Iwaki, Maiko; Arakida, Toshio; Minakuchi, Shunsuke

    2018-03-16

    This article describes the protocol of a digital impression technique for making an impression and recording the jaw relationship of edentulous patients in order to fabricate a custom tray using computer-aided design and manufacturing (CAD/CAM) technology. Scan the maxillary and mandibular edentulous jaws using an intraoral scanner. Scan the silicone jig with the maxillary and mandibular jaws while keeping the jig between the jaws. Import the standard tessellation language data of the maxillary and mandibular jaws and jig to make a jaw relation record and fabricate custom trays (CAD/CAM trays) using a rapid prototyping system. Make a definitive impression of the maxillary and mandibular jaws using the CAD/CAM trays. Digitalization of the complete denture fabrication process can simplify the complicated treatment and laboratory process of conventional methods. In addition, the proposed method enables quality control regardless of the operator's experience and technique. Copyright © 2018. Published by Elsevier Ltd.

  1. A new digitized reverse correction method for hypoid gears based on a one-dimensional probe

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo

    2017-12-01

    In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.

  2. Evaluation of security algorithms used for security processing on DICOM images

    NASA Astrophysics Data System (ADS)

    Chen, Xiaomeng; Shuai, Jie; Zhang, Jianguo; Huang, H. K.

    2005-04-01

    In this paper, we developed a security approach to provide security measures and features in PACS image acquisition and tele-radiology image transmission. The security processing on medical images was based on public key infrastructure (PKI) and included digital signatures and data encryption to achieve the security features of confidentiality, privacy, authenticity, integrity, and non-repudiation. There are many algorithms which can be used in PKI for data encryption and digital signatures. In this research, we selected several algorithms to perform security processing on different DICOM images in a PACS environment, evaluated the security processing performance of these algorithms, and examined the relationship between performance and image types, sizes, and implementation methods.
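    A hedged sketch of the kind of measurement described above: sign and encrypt an image-sized byte payload and time each operation with Python's cryptography package. RSA-PSS and Fernet (AES) here are stand-ins for whichever PKI algorithms the study compared, and the payload is random bytes rather than real DICOM data.

    ```python
    # Timing digital signature and encryption of an image-sized payload.
    import os, time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.fernet import Fernet

    payload = os.urandom(1 << 20)                      # 1 MiB stand-in "image"

    # Digital signature (RSA-PSS with SHA-256)
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    t0 = time.perf_counter()
    signature = key.sign(payload,
                         padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                                     salt_length=padding.PSS.MAX_LENGTH),
                         hashes.SHA256())
    t_sign = time.perf_counter() - t0

    # Symmetric encryption (Fernet: AES + HMAC)
    f = Fernet(Fernet.generate_key())
    t0 = time.perf_counter()
    ciphertext = f.encrypt(payload)
    t_enc = time.perf_counter() - t0

    print(f"sign: {t_sign * 1e3:.1f} ms, encrypt: {t_enc * 1e3:.1f} ms, "
          f"ciphertext: {len(ciphertext)} bytes")
    ```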

  3. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  4. A novel method for pair-matching using three-dimensional digital models of bone: mesh-to-mesh value comparison.

    PubMed

    Karell, Mara A; Langstaff, Helen K; Halazonetis, Demetrios J; Minghetti, Caterina; Frelat, Mélanie; Kranioti, Elena F

    2016-09-01

    The commingling of human remains often hinders forensic/physical anthropologists during the identification process, as there are limited methods to accurately sort these remains. This study investigates a new method for pair-matching, a common individualization technique, which uses digital three-dimensional models of bone: mesh-to-mesh value comparison (MVC). The MVC method digitally compares the entire three-dimensional geometry of two bones at once to produce a single value to indicate their similarity. Two different versions of this method, one manual and the other automated, were created and then tested for how well they accurately pair-matched humeri. Each version was assessed using sensitivity and specificity. The manual mesh-to-mesh value comparison method was 100 % sensitive and 100 % specific. The automated mesh-to-mesh value comparison method was 95 % sensitive and 60 % specific. Our results indicate that the mesh-to-mesh value comparison method overall is a powerful new tool for accurately pair-matching commingled skeletal elements, although the automated version still needs improvement.
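    The sensitivity and specificity figures quoted above follow from the usual confusion-matrix definitions; the small helper below makes the computation explicit (the counts are made-up illustrations, not the study's data).

    ```python
    # Sensitivity/specificity from confusion-matrix counts.
    def sensitivity_specificity(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)   # true pair-matches correctly identified
        specificity = tn / (tn + fp)   # true non-matches correctly rejected
        return sensitivity, specificity

    # Hypothetical counts chosen so the result mirrors a 95 % / 60 % outcome.
    print(sensitivity_specificity(tp=19, fn=1, tn=12, fp=8))
    ```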

  5. A Subsystem Test Bed for Chinese Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Zhao, An; Yan, Yihua; Wang, Wei

    2014-11-01

    The Chinese Spectral Radioheliograph (CSRH) is a solar-dedicated radio interferometric array that will produce high spatial resolution, high temporal resolution, and high spectral resolution images of the Sun simultaneously in the decimetre and centimetre wave range. Digital processing of the intermediate frequency (IF) signal is an important part of a radio telescope. This paper describes a flexible, high-speed digital down-conversion (DDC) system for the CSRH that applies complex mixing, parallel filtering, and extraction (decimation) algorithms to process the IF signal, and incorporates canonic-signed-digit coding and a bit-plane method to improve program efficiency. The DDC system is intended to be a subsystem test bed for simulation and testing for the CSRH. Software algorithms for simulation and FPGA hardware-description implementations were written which use fewer hardware resources while achieving high performance, such as processing a high-speed data stream (1 GHz) with 10 MHz spectral resolution. An experiment with the test bed is illustrated using geostationary satellite data observed on March 20, 2014. Because the algorithms on the FPGA are easy to modify, the data can be recomputed with different digital signal processing algorithms in order to select the optimum algorithm.
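    A minimal software model of the digital down-conversion chain described above (complex mixing, low-pass filtering, decimation); the sample rate, centre frequency, decimation factor and filter length are assumed illustrative values, not the CSRH design parameters.

    ```python
    # Software digital down-conversion: mix to baseband, low-pass, decimate.
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 1.0e9            # input sample rate (assumed)
    f_c = 250.0e6         # centre frequency of the band to extract (assumed)
    decim = 50            # decimation factor -> 20 MHz complex output rate

    rng = np.random.default_rng(0)
    n = np.arange(100_000)
    x = np.cos(2 * np.pi * f_c / fs * n) + 0.1 * rng.normal(size=n.size)  # IF input

    lo = np.exp(-2j * np.pi * f_c / fs * n)              # complex local oscillator
    baseband = x * lo                                    # mix the band down to DC
    taps = firwin(129, cutoff=fs / (2 * decim), fs=fs)   # anti-alias low-pass FIR
    filtered = lfilter(taps, 1.0, baseband)
    y = filtered[::decim]                                # decimate

    print(y.shape, y.dtype)
    ```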

  6. Accuracy of DSM based on digital aerial image matching. (Polish Title: Dokładność NMPT tworzonego metodą automatycznego dopasowania cyfrowych zdjęć lotniczych)

    NASA Astrophysics Data System (ADS)

    Kubalska, J. L.; Preuss, R.

    2013-12-01

    Digital Surface Models (DSM) are increasingly used in GIS databases as standalone products. They are also necessary for creating other products such as 3D city models, true-orthophotos and object-oriented classification. This article presents the results of DSM generation for the classification of vegetation in urban areas. The source data allowed the DSM to be produced using both an image matching method and ALS data. The creation of the DSM from digital images, obtained with an UltraCam-D digital Vexcel camera, was carried out in Match-T by INPHO. This program optimizes the configuration of the image matching process, which ensures high accuracy and minimizes gap areas. The accuracy of this process was analysed by comparing the DSM generated in Match-T with the DSM generated from ALS data. Because of the intended further use of the generated DSM, it was decided to create the model in a GRID structure with a cell size of 1 m. With this parameter, a differential model from both DSMs was also built, which allowed the relative accuracy of the compared models to be determined. The analysis indicates that DSM generation with the multi-image matching method is competitive with surface model creation from ALS data. Thus, when digital images with high overlap are available, the additional registration of ALS data seems to be unnecessary.

  7. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models the sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those same parameters. The gathered optically sampled images with the tested imaging parameters are processed by three digital image processes: calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six JND subjective-assessment data sets can be validated against each other. The main conclusions are: image post-processing can improve image quality, even with lossy compression; image quality at higher compression ratios improves less than at lower ratios; and with our image post-processing method, image quality is better when the camera MTF lies within a small range.

  8. Targeted and untargeted-metabolite profiling to track the compositional integrity of ginger during processing using digitally-enhanced HPTLC pattern recognition analysis.

    PubMed

    Ibrahim, Reham S; Fathy, Hoda

    2018-03-30

    Tracking the impact of commonly applied post-harvest and industrial processing practices on the compositional integrity of ginger rhizome was implemented in this work. Untargeted metabolite profiling was performed using a digitally-enhanced HPTLC method in which the chromatographic fingerprints were extracted using ImageJ software and then analysed with multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, 10-gingerol and 6-shogaol in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying and storage employed during processing have a great influence on the ginger chemo-profile, and the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for comprehensive evaluation of ginger during processing. Copyright © 2018. Published by Elsevier B.V.

  9. Three-dimensional digital projection in neurosurgical education: technical note.

    PubMed

    Martins, Carolina; Ribas, Eduardo Carvalhal; Rhoton, Albert L; Ribas, Guilherme Carvalhal

    2015-10-01

    Three-dimensional images have become an important tool in teaching surgical anatomy, and its didactic power is enhanced when combined with 3D surgical images and videos. This paper describes the method used by the last author (G.C.R.) since 2002 to project 3D anatomical and surgical images using a computer source. Projecting 3D images requires the superposition of 2 similar but slightly different images of the same object. The set of images, one mimicking the view of the left eye and the other mimicking the view of the right eye, constitute the stereoscopic pair and can be processed using anaglyphic or horizontal-vertical polarization of light for individual use or presentation to larger audiences. Classically, 3D projection could be obtained by using a double set of slides, projected through 2 slide projectors, each of them equipped with complementary filters, shooting over a medium that keeps light polarized (a silver screen) and having the audience wear appropriate glasses. More recently, a digital method of 3D projection has been perfected. In this method, a personal computer is used as the source of the images, which are arranged in a Microsoft PowerPoint presentation. A beam splitter device is used to connect the computer source to 2 digital, portable projectors. Filters, a silver screen, and glasses are used, similar to the classic method. Among other advantages, this method brings flexibility to 3D presentations by allowing the combination of 3D anatomical and surgical still images and videos. It eliminates the need for using film and film developing, lowering the costs of the process. In using small, powerful digital projectors, this method substitutes for the previous technology, without incurring a loss of quality, and enhances portability.
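    A small sketch of one of the two encodings mentioned above, anaglyphic processing of a stereoscopic pair: the red channel is taken from the left-eye image and the green/blue channels from the right-eye image, so red/cyan glasses separate the two views. The arrays below are tiny synthetic placeholders, not anatomical images.

    ```python
    # Build a red/cyan anaglyph from a stereoscopic image pair.
    import numpy as np

    def make_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
        """left_rgb, right_rgb: HxWx3 uint8 arrays of the stereo pair."""
        out = right_rgb.copy()            # green and blue from the right eye
        out[..., 0] = left_rgb[..., 0]    # red from the left eye
        return out

    left = np.zeros((4, 4, 3), dtype=np.uint8);  left[..., 0] = 200
    right = np.zeros((4, 4, 3), dtype=np.uint8); right[..., 1:] = 150
    print(make_anaglyph(left, right)[0, 0])       # -> [200 150 150]
    ```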

  10. Estimating costs and performance of systems for machine processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Ballard, R. J.; Eastwood, L. F., Jr.

    1977-01-01

    This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.

  11. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    NASA Astrophysics Data System (ADS)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  12. A Study on Improving Information Processing Abilities Based on PBL

    ERIC Educational Resources Information Center

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  13. Survey of Munitions Response Technologies

    DTIC Science & Technology

    2006-06-01

    Record excerpt (table-of-contents fragments and text): 3.3.4 Digital Data Processing; 4.0 Source Data and Methods; 6.1.6 DGM versus Mag and Flag Processes; 6.1.7 Translation to... signatures, surface clutter, variances in operator technique, target selection, and data processing all degrade from and affect optimum performance

  14. Flexible and unique representations of two-digit decimals.

    PubMed

    Zhang, Li; Chen, Min; Lin, Chongde; Szűcs, Denes

    2014-09-01

    We examined the representation of two-digit decimals through studying distance and compatibility effects in magnitude comparison tasks in four experiments. Using number pairs with different leftmost digits, we found both the second digit distance effect and compatibility effect with two-digit integers but only the second digit distance effect with two-digit pure decimals. This suggests that both integers and pure decimals are processed in a compositional manner. In contrast, neither the second digit distance effect nor the compatibility effect was observed in two-digit mixed decimals, thereby showing no evidence for compositional processing of two-digit mixed decimals. However, when the relevance of the rightmost digit processing was increased by adding some decimals pairs with the same leftmost digits, both pure and mixed decimals produced the compatibility effect. Overall, results suggest that the processing of decimals is flexible and depends on the relevance of unique digit positions. This processing mode is different from integer analysis in that two-digit mixed decimals demonstrate parallel compositional processing only when the rightmost digit is relevant. Findings suggest that people probably do not represent decimals by simply ignoring the decimal point and converting them to natural numbers. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Digital floodplain mapping and an analysis of errors involved

    USGS Publications Warehouse

    Hamblen, C.S.; Soong, D.T.; Cai, X.

    2007-01-01

    Mapping floodplain boundaries using geographical information system (GIS) and digital elevation models (DEMs) was completed in a recent study. However convenient this method may appear at first, the resulting maps potentially can have unaccounted errors. Mapping the floodplain using GIS is faster than mapping manually, and digital mapping is expected to be more common in the future. When mapping is done manually, the experience and judgment of the engineer or geographer completing the mapping and the contour resolution of the surface topography are critical in determining the flood-plain and floodway boundaries between cross sections. When mapping is done digitally, discrepancies can result from the use of the computing algorithm and digital topographic datasets. Understanding the possible sources of error and how the error accumulates through these processes is necessary for the validation of automated digital mapping. This study will evaluate the procedure of floodplain mapping using GIS and a 3 m by 3 m resolution DEM with a focus on the accumulated errors involved in the process. Within the GIS environment of this mapping method, the procedural steps of most interest, initially, include: (1) the accurate spatial representation of the stream centerline and cross sections, (2) properly using a triangulated irregular network (TIN) model for the flood elevations of the studied cross sections, the interpolated elevations between them and the extrapolated flood elevations beyond the cross sections, and (3) the comparison of the flood elevation TIN with the ground elevation DEM, from which the appropriate inundation boundaries are delineated. The study area involved is of relatively low topographic relief, thereby making it representative of common suburban development and a prime setting for the need for accurately mapped floodplains. This paper emphasizes the impacts of integrating supplemental digital terrain data between cross sections on floodplain delineation. © 2007 ASCE.
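    A toy sketch of the final delineation step described above: compare a flood water-surface elevation grid (in practice interpolated from the cross-section TIN) with the ground-elevation DEM and flag cells where the water surface is higher. The grids below are tiny synthetic arrays, not study data.

    ```python
    # Delineate inundated cells by comparing flood surface with ground DEM.
    import numpy as np

    dem = np.array([[10.0, 10.5, 11.0],
                    [10.2, 10.8, 11.5],
                    [10.4, 11.2, 12.0]])          # ground elevation (m)
    flood_elev = np.full_like(dem, 10.9)          # interpolated flood surface (m)

    inundated = flood_elev > dem                  # boolean floodplain mask
    depth = np.where(inundated, flood_elev - dem, 0.0)
    print(inundated)
    print(depth.round(2))
    ```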

  16. Adaptive thresholding and dynamic windowing method for automatic centroid detection of digital Shack-Hartmann wavefront sensor.

    PubMed

    Yin, Xiaoming; Li, Xiang; Zhao, Liping; Fang, Zhongping

    2009-11-10

    A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and transfers the distorted wavefront detection into the centroid measurement. The accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of the wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on the adaptive thresholding and dynamic windowing method by utilizing image processing techniques for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noises, such as diffraction of the digital SHWS, unevenness and instability of the light source, as well as deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability compared with other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
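    A hedged sketch of the adaptive-thresholding-plus-windowing idea: the threshold is derived from local background statistics and the centroid is computed only inside a window around the brightest pixel. The parameter choices and the background/threshold rule are illustrative assumptions, not the authors' exact algorithm.

    ```python
    # Windowed, threshold-corrected centroid of a focal spot in a subaperture.
    import numpy as np

    def spot_centroid(sub_img: np.ndarray, k: float = 3.0, win: int = 7):
        bg = np.median(sub_img)                         # background estimate
        thresh = bg + k * sub_img.std()                 # adaptive threshold
        r0, c0 = np.unravel_index(np.argmax(sub_img), sub_img.shape)
        h = win // 2
        window = sub_img[max(r0 - h, 0):r0 + h + 1,
                         max(c0 - h, 0):c0 + h + 1].astype(float)
        window = np.clip(window - thresh, 0, None)      # suppress noise floor
        if window.sum() == 0:
            return None
        rows, cols = np.indices(window.shape)
        r = (rows * window).sum() / window.sum() + max(r0 - h, 0)
        c = (cols * window).sum() / window.sum() + max(c0 - h, 0)
        return r, c

    img = np.zeros((32, 32))
    img[12:15, 20:23] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]   # synthetic spot
    print(spot_centroid(img))
    ```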

  17. Digital Radiography Qualification of Tube Welding

    NASA Technical Reports Server (NTRS)

    Carl, Chad

    2012-01-01

    The Orion Project will be directing Lockheed Martin to perform orbital arc welding on commodities metallic tubing as part of the Multi Purpose Crew Vehicle assembly and integration process in the Operations and Checkout High bay at Kennedy Space Center. The current method of nondestructive evaluation is utilizing traditional film based x-rays. Due to the high number of welds that are necessary to join the commodities tubing (approx 470), a more efficient and expeditious method of nondestructive evaluation is desired. Digital radiography will be qualified as part of a broader NNWG project scope.

  18. Direct-to-digital holography reduction of reference hologram noise and fourier space smearing

    DOEpatents

    Voelkl, Edgar

    2006-06-27

    Systems and methods are described for reduction of reference hologram noise and reduction of Fourier space smearing, especially in the context of direct-to-digital holography (off-axis interferometry). A method of reducing reference hologram noise includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference image waves; and transforming the corresponding plurality of reference image waves into a reduced noise reference image wave. A method of reducing smearing in Fourier space includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference complex image waves; transforming the corresponding plurality of reference image waves into a reduced noise reference complex image wave; recording a hologram of an object; processing the hologram of the object into an object complex image wave; and dividing the complex image wave of the object by the reduced noise reference complex image wave to obtain a reduced smearing object complex image wave.
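    A minimal numerical sketch of the central idea of this patent: average several noisy reference complex image waves to obtain a reduced-noise reference, then divide the object complex wave by it. The waves below are synthetic complex arrays, not reconstructed holograms.

    ```python
    # Reference-wave averaging followed by division of the object wave.
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (256, 256)
    true_ref = np.exp(1j * 0.3)                     # idealized flat reference wave
    refs = [true_ref + 0.05 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
            for _ in range(16)]                     # noisy reference image waves
    ref_avg = np.mean(refs, axis=0)                 # reduced-noise reference wave

    obj = (1.0 + 0.2j) * true_ref                   # object wave carries the reference
    corrected = obj / ref_avg                       # reduced-noise object wave
    print(np.abs(corrected).mean(), np.angle(corrected).mean())
    ```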

  19. Digital Enhancement Of Pneumothoraces

    NASA Astrophysics Data System (ADS)

    Cocklin, M.; Kaye, G.; Kerr, I.; Lams, P.

    1982-11-01

    If a patient presents with symptoms indicative of a pneumothorax it is improbable that it would not be detected in a chest radiograph. However, detection on the radiograph can be difficult and a small pneumothorax may be missed when there is no clinical suspicion of its presence. This report presents some methods by which the characteristic pneumothorax edge may be enhanced by digital image processing. Various examples are given.
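    One generic enhancement of the kind alluded to above is unsharp masking, which subtracts a blurred copy of the radiograph so that faint edges (such as a pneumothorax line) stand out; this is an illustrative example, not necessarily one of the specific methods in the 1982 report.

    ```python
    # Unsharp masking: boost faint edges by subtracting a blurred copy.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image: np.ndarray, sigma: float = 3.0, amount: float = 1.5):
        blurred = gaussian_filter(image.astype(float), sigma)
        return image + amount * (image - blurred)

    img = np.zeros((64, 64))
    img[:, 32:] = 100.0                              # a faint step "edge"
    enhanced = unsharp_mask(img)
    print(enhanced[0, 30:35].round(1))               # overshoot marks the edge
    ```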

  20. Overview of machine vision methods in x-ray imaging and microtomography

    NASA Astrophysics Data System (ADS)

    Buzmakov, Alexey; Zolotov, Denis; Chukalina, Marina; Nikolaev, Dmitry; Gladkov, Andrey; Ingacheva, Anastasia; Yakimchuk, Ivan; Asadchikov, Victor

    2018-04-01

    Digital X-ray imaging has become widely used in science, medicine and non-destructive testing. This allows modern digital image analysis to be applied for automatic information extraction and interpretation. We give a short review of scientific applications of machine vision in scientific X-ray imaging and microtomography, including image processing, feature detection and extraction, image compression to increase camera throughput, microtomography reconstruction, visualization and setup adjustment.

  1. PA12 Is digital storytelling ka pai for new zealand māori? using digital storytelling as a method to explore whānau end of life caregiving experiences: a pilot study.

    PubMed

    Williams, Lisa; Moeke-Maxwell, Tess; Kothari, Shuchi; Pearson, Sarina; Gott, Merryn; Black, Stella; Frey, Rosemary; Wharemate, Rawiri; Hansen, Whio

    2015-04-01

    Māori regard stories as a preferred method for imparting knowledge through waiata (song), moteatea (poetry), kauwhau (moralistic tale), pakiwaitara (story) and purakau (myths). Storytelling is also an expression of tinorangatiratanga (self-determination); Māori have the right to manage their knowledge, which includes embodiment in forms transcending typical western formulations. Digital storytelling is a process by which 'ordinary people' create short autobiographical videos. It has found application in numerous disciplines including public health and has been used to articulate the experiences of those often excluded from knowledge production. To explore the use of digital storytelling as a research method for learning about whānau (family) experiences providing end of life care for kaumātua (older people). Eight Māori and their nominated co-creators attended a three-day digital storytelling workshop led by co-researchers Shuchi Kothari and Sarina Pearson. They were guided in the creation of first-person digital stories about caring for kaumātua. The videos were shared at a group screening, and participants completed questionnaires about the workshop and their videos. A Kaupapa Māori narrative analysis was applied to their stories to gain new perspectives on Māori end of life caregiving practices. (Kaupapa Maori research privileges Maori worldviews and indigenous knowledge systems.) Digital storytelling is an appropriate method as Māori is an oral/aural society. It allows Māori to share their stories with others, thus promoting community support at the end of life, befitting a public health approach. Digital storytelling can be a useful method for Māori to express their experiences providing end of life caregiving. © 2015, Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

    A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of the query image. More particularly, the retrieving step can include two data reductions, the first performed based upon a query vector extracted from a query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.

  3. A Q-Ising model application for linear-time image segmentation

    NASA Astrophysics Data System (ADS)

    Bentrem, Frank W.

    2010-10-01

    A computational method is presented which efficiently segments digital grayscale images by directly applying the Q-state Ising (or Potts) model. Since the Potts model was first proposed in 1952, physicists have studied lattice models to gain deep insights into magnetism and other disordered systems. For some time, researchers have realized that digital images may be modeled in much the same way as these physical systems (i.e., as a square lattice of numerical values). A major drawback in using Potts model methods for image segmentation is that, with conventional methods, it processes in exponential time. Advances have been made via certain approximations to reduce the segmentation process to power-law time. However, in many applications (such as for sonar imagery), real-time processing requires much greater efficiency. This article contains a description of an energy minimization technique that applies four Potts (Q-Ising) models directly to the image and processes in linear time. The result is analogous to partitioning the system into regions of four classes of magnetism. This direct Potts segmentation technique is demonstrated on photographic, medical, and acoustic images.
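    A hedged sketch of a Q-state Potts-style segmentation: each pixel takes one of Q labels, and a few iterated-conditional-modes sweeps minimize a data term (distance to a class grey level) plus a Potts smoothness term over 4-neighbours. This illustrates the energy model in general; it is not the author's specific four-model, linear-time construction, and Q, beta and the sweep count are assumptions.

    ```python
    # Potts-model image segmentation via simple iterated conditional modes.
    import numpy as np

    def potts_segment(img, Q=4, beta=2.0, sweeps=2):
        levels = np.linspace(img.min(), img.max(), Q)             # class grey levels
        labels = np.abs(img[..., None] - levels).argmin(axis=-1)  # initial labels
        H, W = img.shape
        for _ in range(sweeps):
            for r in range(H):
                for c in range(W):
                    nbrs = [labels[rr, cc] for rr, cc in
                            ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                            if 0 <= rr < H and 0 <= cc < W]
                    data = (img[r, c] - levels) ** 2               # data term
                    smooth = np.array([sum(q != n for n in nbrs)   # Potts penalty
                                       for q in range(Q)])
                    labels[r, c] = np.argmin(data + beta * smooth)
        return labels

    rng = np.random.default_rng(0)
    img = np.kron(np.array([[0., 1.], [2., 3.]]), np.ones((16, 16)))
    img += 0.3 * rng.normal(size=img.shape)            # noisy 4-region test image
    print(np.unique(potts_segment(img)))
    ```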

  4. FERMI: a digital Front End and Readout MIcrosystem for high resolution calorimetry

    NASA Astrophysics Data System (ADS)

    Alexanian, H.; Appelquist, G.; Bailly, P.; Benetta, R.; Berglund, S.; Bezamat, J.; Blouzon, F.; Bohm, C.; Breveglieri, L.; Brigati, S.; Cattaneo, P. W.; Dadda, L.; David, J.; Engström, M.; Genat, J. F.; Givoletti, M.; Goggi, V. G.; Gong, S.; Grieco, G. M.; Hansen, M.; Hentzell, H.; Holmberg, T.; Höglund, I.; Inkinen, S. J.; Kerek, A.; Landi, C.; Ledortz, O.; Lippi, M.; Lofstedt, B.; Lund-Jensen, B.; Maloberti, F.; Mutz, S.; Nayman, P.; Piuri, V.; Polesello, G.; Sami, M.; Savoy-Navarro, A.; Schwemling, P.; Stefanelli, R.; Sundblad, R.; Svensson, C.; Torelli, G.; Vanuxem, J. P.; Yamdagni, N.; Yuan, J.; Ödmark, A.; Fermi Collaboration

    1995-02-01

    We present a digital solution for the front-end electronics of high resolution calorimeters at future colliders. It is based on analogue signal compression, high speed A/D converters, a fully programmable pipeline and a digital signal processing (DSP) chain with local intelligence and system supervision. This digital solution is aimed at providing maximal front-end processing power by performing waveform analysis using DSP methods. For the system integration of the multichannel device a multi-chip, silicon-on-silicon multi-chip module (MCM) has been adopted. This solution allows a high level of integration of complex analogue and digital functions, with excellent flexibility in mixing technologies for the different functional blocks. This type of multichip integration provides a high degree of reliability and programmability at both the function and the system level, with the additional possibility of customising the microsystem to detector-specific requirements. For enhanced reliability in high radiation environments, fault tolerance strategies, i.e. redundancy, reconfigurability, majority voting and coding for error detection and correction, are integrated into the design.

  5. Active flutter suppression using optical output feedback digital controllers

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A method for synthesizing digital active flutter suppression controllers using the concept of optimal output feedback is presented. A convergent algorithm is employed to determine constrained control law parameters that minimize an infinite time discrete quadratic performance index. Low order compensator dynamics are included in the control law and the compensator parameters are computed along with the output feedback gain as part of the optimization process. An input noise adjustment procedure is used to improve the stability margins of the digital active flutter controller. Sample rate variation, prefilter pole variation, control structure variation and gain scheduling are discussed. A digital control law which accommodates computation delay can stabilize the wing with reasonable rms performance and adequate stability margins.

  6. Tailpulse signal generator

    DOEpatents

    Baker, John [Walnut Creek, CA; Archer, Daniel E [Knoxville, TN; Luke, Stanley John [Pleasanton, CA; Decman, Daniel J [Livermore, CA; White, Gregory K [Livermore, CA

    2009-06-23

    A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses, and the random Poisson distribution pulse timing for radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while outputting the digital values to a digital-to-analog converter (DAC), and pulse amplitudes of new pulses are added to decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
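    A software sketch of the simulation idea in this patent: pulses with pseudo-random amplitudes arrive at Poisson-distributed times, each decays exponentially, and overlapping tails add to produce pile-up. The rate, decay constant and sampling rate are illustrative assumptions, not parameters of the DSP/DAC implementation.

    ```python
    # Simulated tailpulse train with exponential decay, Poisson timing and pile-up.
    import numpy as np

    def simulate_tailpulses(rate_hz=5e4, decay_s=20e-6, fs=10e6,
                            duration_s=2e-3, seed=0):
        rng = np.random.default_rng(seed)
        n = int(duration_s * fs)
        out = np.zeros(n)
        k = np.exp(-1.0 / (decay_s * fs))          # per-sample decay factor
        t = 0.0
        while True:
            t += rng.exponential(1.0 / rate_hz)    # Poisson inter-arrival time
            i = int(t * fs)
            if i >= n:
                break
            amp = rng.uniform(0.1, 1.0)            # pseudo-random pulse amplitude
            out[i:] += amp * k ** np.arange(n - i) # decaying tail; overlaps pile up
        return out

    trace = simulate_tailpulses()
    print(trace.size, trace.max())
    ```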

  7. DIGITAL CARTOGRAPHY OF THE PLANETS: NEW METHODS, ITS STATUS, AND ITS FUTURE.

    USGS Publications Warehouse

    Batson, R.M.

    1987-01-01

    A system has been developed that establishes a standardized cartographic database for each of the 19 planets and major satellites that have been explored to date. Compilation of the databases involves both traditional and newly developed digital image processing and mosaicking techniques, including radiometric and geometric corrections of the images. Each database, or digital image model (DIM), is a digital mosaic of spacecraft images that have been radiometrically and geometrically corrected and photometrically modeled. During compilation, ancillary data files such as radiometric calibrations and refined photometric values for all camera lens and filter combinations and refined camera-orientation matrices for all images used in the mapping are produced.

  8. Analysis of accuracy of digital elevation models created from captured data by digital photogrammetry method

    NASA Astrophysics Data System (ADS)

    Hudec, P.

    2011-12-01

    A digital elevation model (DEM) is an important part of many geoinformatic applications. For the creation of a DEM, spatial data collected by geodetic measurements in the field, photogrammetric processing of aerial survey photographs, laser scanning and secondary sources (analogue maps) are used. From a user's point of view, it is very important to know the vertical accuracy of a DEM. The article describes the verification, based on geodetic measurements in the field, of the vertical accuracy of a DEM of the Medzibodrožie region that was created using digital photogrammetry for the purposes of water resources management, modeling and resolving flood cases.
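    The vertical-accuracy check described above reduces to comparing DEM elevations against geodetic check points; root-mean-square error is a common summary statistic. The check-point values below are made up for illustration.

    ```python
    # Vertical accuracy of a DEM at geodetic check points: mean error and RMSE.
    import numpy as np

    dem_elev   = np.array([101.2, 98.7, 105.4, 110.1, 99.9])   # DEM at check points (m)
    field_elev = np.array([101.0, 98.9, 105.0, 110.4, 100.2])  # geodetic measurements (m)

    diff = dem_elev - field_elev
    rmse = np.sqrt(np.mean(diff ** 2))
    print(f"mean error {diff.mean():+.2f} m, RMSE {rmse:.2f} m")
    ```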

  9. Color engineering in the age of digital convergence

    NASA Astrophysics Data System (ADS)

    MacDonald, Lindsay W.

    1998-09-01

    Digital color imaging has developed over the past twenty years from specialized scientific applications into the mainstream of computing. In addition to the phenomenal growth of computer processing power and storage capacity, great advances have been made in the capabilities and cost-effectiveness of color imaging peripherals. The majority of imaging applications, including the graphic arts, video and film have made the transition from analogue to digital production methods. Digital convergence of computing, communications and television now heralds new possibilities for multimedia publishing and mobile lifestyles. Color engineering, the application of color science to the design of imaging products, is an emerging discipline that poses exciting challenges to the international color imaging community for training, research and standards.

  10. Readiness for Delivering Digital Health at Scale: Lessons From a Longitudinal Qualitative Evaluation of a National Digital Health Innovation Program in the United Kingdom

    PubMed Central

    Lennon, Marilyn R; Bouamrane, Matt-Mouley; Devlin, Alison M; O'Connor, Siobhan; O'Donnell, Catherine; Chetty, Ula; Agbakoba, Ruth; Bikker, Annemieke; Grieve, Eleanor; Finch, Tracy; Watson, Nicholas; Wyke, Sally

    2017-01-01

    Background: Digital health has the potential to support care delivery for chronic illness. Despite positive evidence from localized implementations, new technologies have proven slow to become accepted, integrated, and routinized at scale. Objective: The aim of our study was to examine barriers and facilitators to implementation of digital health at scale through the evaluation of a £37m national digital health program: "Delivering Assisted Living Lifestyles at Scale" (dallas) from 2012-2015. Methods: The study was a longitudinal qualitative, multi-stakeholder, implementation study. The methods included interviews (n=125) with key implementers, focus groups with consumers and patients (n=7), project meetings (n=12), field work or observation in the communities (n=16), health professional survey responses (n=48), and cross program documentary evidence on implementation (n=215). We used a sociological theory called normalization process theory (NPT) and a longitudinal (3 years) qualitative framework analysis approach. This work did not study a single intervention or population. Instead, we evaluated the processes (of designing and delivering digital health), and our outcomes were the identified barriers and facilitators to delivering and mainstreaming services and products within the mixed sector digital health ecosystem. Results: We identified three main levels of issues influencing readiness for digital health: macro (market, infrastructure, policy), meso (organizational), and micro (professional or public). Factors hindering implementation included: lack of information technology (IT) infrastructure, uncertainty around information governance, lack of incentives to prioritize interoperability, lack of precedence on accountability within the commercial sector, and a market perceived as difficult to navigate. Factors enabling implementation were: clinical endorsement, champions who promoted digital health, and public and professional willingness. Conclusions: Although there is receptiveness to digital health, barriers to mainstreaming remain. Our findings suggest greater investment in national and local infrastructure, implementation of guidelines for the safe and transparent use and assessment of digital health, incentivization of interoperability, and investment in upskilling of professionals and the public would help support the normalization of digital health. These findings will enable researchers, health care practitioners, and policy makers to understand the current landscape and the actions required in order to prepare the market and accelerate uptake, and use of digital health and wellness services in context and at scale. PMID:28209558

  11. Evaluating a voice recognition system: finding the right product for your department.

    PubMed

    Freeh, M; Dewey, M; Brigham, L

    2001-06-01

    The Department of Radiology at the University of Utah Health Sciences Center has been in the process of transitioning from the traditional film-based department to a digital imaging department for the past 2 years. The department is now transitioning from the traditional method of dictating reports (dictation by radiologist to transcription to review and signing by radiologist) to a voice recognition system. The transition to digital operations will not be complete until we have the ability to directly interface the dictation process with the image review process. Voice recognition technology has advanced to the level where it can and should be an integral part of the new way of working in radiology and is an integral part of an efficient digital imaging department. The transition to voice recognition requires the task of identifying the product and the company that will best meet a department's needs. This report introduces the methods we used to evaluate the vendors and the products available as we made our purchasing decision. We discuss our evaluation method and provide a checklist that can be used by other departments to assist with their evaluation process. The criteria used in the evaluation process fall into the following major categories: user operations, technical infrastructure, medical dictionary, system interfaces, service support, cost, and company strength. Conclusions drawn from our evaluation process will be detailed, with the intention being to shorten the process for others as they embark on a similar venture. As more and more organizations investigate the many products and services that are now being offered to enhance the operations of a radiology department, it becomes increasingly important that solid methods are used to most effectively evaluate the new products. This report should help others complete the task of evaluating a voice recognition system and may be adaptable to other products as well.

  12. Computation of dynamic seismic responses to viscous fluid of digitized three-dimensional Berea sandstones with a coupled finite-difference method.

    PubMed

    Zhang, Yang; Toksöz, M Nafi

    2012-08-01

    The seismic response of saturated porous rocks is studied numerically using microtomographic images of three-dimensional digitized Berea sandstones. A stress-strain calculation is employed to compute the velocities and attenuations of rock samples whose sizes are much smaller than the seismic wavelength of interest. To compensate for the contributions of small cracks lost in the imaging process to the total velocity and attenuation, a hybrid method is developed to recover the crack distribution, in which the differential effective medium theory, the Kuster-Toksöz model, and a modified squirt-flow model are utilized in a two-step Monte Carlo inversion. In the inversion, the velocities of P- and S-waves measured for the dry and water-saturated cases, and the measured attenuation of P-waves for different fluids are used. By using such a hybrid method, both the velocities of saturated porous rocks and the attenuations are predicted accurately when compared to laboratory data. The hybrid method is a practical way to model numerically the seismic properties of saturated porous rocks until very high resolution digital data are available. Cracks lost in the imaging process are critical for accurately predicting velocities and attenuations of saturated porous rocks.

  13. A wavelet domain adaptive image watermarking method based on chaotic encryption

    NASA Astrophysics Data System (ADS)

    Wei, Fang; Liu, Jian; Cao, Hanqiang; Yang, Jun

    2009-10-01

    Digital watermarking is a specific branch of steganography that can be used in various applications and provides a novel way to solve security problems for multimedia information. In this paper, we propose a wavelet-domain adaptive image watermarking method that uses chaotic stream encryption and properties of human visual perception. The secret information, which can be seen as a watermark, is hidden in a host image that can be publicly accessed, so the transportation of the secret information does not attract the attention of an illegal receiver. The experimental results show that the watermark is invisible and robust against common image-processing operations.
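    A minimal sketch of the general idea described above, not the authors' exact scheme: a logistic-map keystream encrypts a binary watermark, which is then embedded additively in the diagonal detail coefficients of a one-level discrete wavelet transform. The function names, the embedding strength, and the map parameters are illustrative assumptions; the code requires NumPy and PyWavelets.

```python
import numpy as np
import pywt

def logistic_keystream(n_bits, x0=0.7, r=3.99):
    """Pseudo-random bit stream from the logistic map x -> r*x*(1-x)."""
    x, bits = x0, np.empty(n_bits, dtype=np.uint8)
    for i in range(n_bits):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x > 0.5 else 0
    return bits

def embed_watermark(host, wm_bits, strength=8.0, x0=0.7):
    """Embed a chaotically encrypted binary watermark into the HH sub-band of `host`."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
    enc = wm_bits ^ logistic_keystream(wm_bits.size, x0)   # chaotic stream encryption (XOR)
    flat = cD.ravel().copy()
    flat[: enc.size] += strength * (2.0 * enc - 1.0)       # +/- strength per bit
    return pywt.idwt2((cA, (cH, cV, flat.reshape(cD.shape))), "haar")

# Example: embed 256 watermark bits into a random 128x128 "host image".
host = np.random.randint(0, 256, (128, 128))
wm = np.random.randint(0, 2, 256).astype(np.uint8)
marked = embed_watermark(host, wm)
print(marked.shape)
```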

  14. Computer-assisted techniques to evaluate fringe patterns

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-01-01

    Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
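    The quasi-heterodyne (phase-stepping) idea mentioned above can be illustrated with a very small sketch: with four fringe images shifted by 90 degrees, the wrapped phase follows from a four-step arctangent formula. The synthetic fringes and parameter values below are illustrative only.

```python
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    """Wrapped phase from four pi/2-shifted interferograms."""
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic test: a known quadratic phase, sampled at four phase shifts.
y, x = np.mgrid[0:256, 0:256]
true_phase = 0.0005 * (x - 128) ** 2 + 0.1 * y
frames = [1 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]
wrapped = four_step_phase(*frames)
print(wrapped.min(), wrapped.max())   # values lie in (-pi, pi]
```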

  15. Developing criteria to establish Trusted Digital Repositories

    USGS Publications Warehouse

    Faundeen, John L.

    2017-01-01

    This paper details the drivers, methods, and outcomes of the U.S. Geological Survey’s quest to establish criteria by which to judge its own digital preservation resources as Trusted Digital Repositories. Drivers included recent U.S. legislation focused on data and asset management conducted by federal agencies spending $100M USD or more annually on research activities. The methods entailed seeking existing evaluation criteria from national and international organizations such as International Standards Organization (ISO), U.S. Library of Congress, and Data Seal of Approval upon which to model USGS repository evaluations. Certification, complexity, cost, and usability of existing evaluation models were key considerations. The selected evaluation method was derived to allow the repository evaluation process to be transparent, understandable, and defensible; factors that are critical for judging competing, internal units. Implementing the chosen evaluation criteria involved establishing a cross-agency, multi-disciplinary team that interfaced across the organization. 

  16. Simulation of digital mammography images

    NASA Astrophysics Data System (ADS)

    Workman, Adam

    2005-04-01

    A number of different technologies are available for digital mammography. However, it is not clear how differences in the physical performance aspects of the different imaging technologies affect clinical performance. Randomised controlled trials provide a means of gaining information on clinical performance; however, they do not provide a direct comparison of the different digital imaging technologies. This work describes a method of simulating the performance of different digital mammography systems. The method involves modifying the imaging performance parameters of images from a small-field-of-view (SFDM), high-resolution digital imaging system used for spot imaging. Under normal operating conditions this system produces images with higher signal-to-noise ratio (SNR) over a wide spatial frequency range than current full field digital mammography (FFDM) systems. The SFDM images can be 'degraded' by computer processing to simulate the characteristics of an FFDM system. Initial work characterised the physical performance (MTF, NPS) of the SFDM detector and developed a model and method for simulating signal transfer and noise properties of an FFDM system. It was found that the SNR properties of the simulated FFDM images were very similar to those measured from an actual FFDM system, verifying the methodology used. The application of this technique to clinical images from the small field system will allow the clinical performance of different FFDM systems to be simulated and directly compared using the same clinical image datasets.
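    A hedged illustration of the degradation step (not the paper's validated model): the high-resolution image is filtered by the ratio of a target MTF to the source MTF, both modeled here as Gaussians for simplicity, and noise is added to lower the SNR. All parameter values are illustrative assumptions.

```python
import numpy as np

def degrade(image, sigma_src=0.15, sigma_tgt=0.05, noise_std=5.0, rng=None):
    """Apply a target/source MTF ratio in the Fourier domain, then add noise."""
    rng = np.random.default_rng(rng)
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    f2 = fx ** 2 + fy ** 2
    mtf_src = np.exp(-f2 / (2 * sigma_src ** 2))   # wider MTF: high-resolution source
    mtf_tgt = np.exp(-f2 / (2 * sigma_tgt ** 2))   # narrower MTF: simulated FFDM target
    ratio = mtf_tgt / np.maximum(mtf_src, 1e-6)
    blurred = np.fft.ifft2(np.fft.fft2(image) * ratio).real
    return blurred + rng.normal(0.0, noise_std, image.shape)

# Usage on a synthetic stand-in for a high-SNR spot image.
simulated_ffdm = degrade(np.random.default_rng(0).random((256, 256)) * 1000)
print(simulated_ffdm.shape)
```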

  17. Focus measure method based on the modulus of the gradient of the color planes for digital microscopy

    NASA Astrophysics Data System (ADS)

    Hurtado-Pérez, Román; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso; Aguilar-Valdez, J. Félix; Ortega-Mendoza, Gabriel

    2018-02-01

    The modulus of the gradient of the color planes (MGC) is implemented to transform multichannel information to a grayscale image. This digital technique is used in two applications: (a) focus measurements during autofocusing (AF) process and (b) extending the depth of field (EDoF) by means of multifocus image fusion. In the first case, the MGC procedure is based on an edge detection technique and is implemented in over 15 focus metrics that are typically handled in digital microscopy. The MGC approach is tested on color images of histological sections for the selection of in-focus images. An appealing attribute of all the AF metrics working in the MGC space is their monotonic behavior even up to a magnification of 100×. An advantage of the MGC method is its computational simplicity and inherent parallelism. In the second application, a multifocus image fusion algorithm based on the MGC approach has been implemented on graphics processing units (GPUs). The resulting fused images are evaluated using a nonreference image quality metric. The proposed fusion method reveals a high-quality image independently of faulty illumination during the image acquisition. Finally, the three-dimensional visualization of the in-focus image is shown.
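    A minimal sketch of the MGC idea as described above: the gradient energy of each RGB plane is combined into one grayscale map, and its mean can serve as a focus score during autofocusing. The function names and the random image stack are illustrative assumptions.

```python
import numpy as np

def mgc(rgb):
    """Per-pixel modulus of the gradient accumulated over all color planes."""
    rgb = rgb.astype(float)
    g2 = np.zeros(rgb.shape[:2])
    for c in range(rgb.shape[2]):
        gy, gx = np.gradient(rgb[..., c])
        g2 += gx ** 2 + gy ** 2
    return np.sqrt(g2)

def focus_score(rgb):
    """Scalar focus metric: mean MGC value (larger = sharper)."""
    return float(mgc(rgb).mean())

# Usage: score a stack of candidate frames and keep the sharpest one.
stack = [np.random.rand(64, 64, 3) for _ in range(5)]
best = max(stack, key=focus_score)
```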

  18. A smartphone photogrammetry method for digitizing prosthetic socket interiors.

    PubMed

    Hernandez, Amaia; Lemaire, Edward

    2017-04-01

    Prosthetic CAD/CAM systems require accurate 3D limb models; however, difficulties arise when working from the person's socket since current 3D scanners have difficulties scanning socket interiors. While dedicated scanners exist, they are expensive and the cost may be prohibitive for a limited number of scans per year. A low-cost and accessible photogrammetry method for socket interior digitization is proposed, using a smartphone camera and cloud-based photogrammetry services. Fifteen two-dimensional images of the socket's interior are captured using a smartphone camera. A 3D model is generated using cloud-based software. Linear measurements were compared between sockets and the related 3D models. 3D reconstruction accuracy averaged 2.6 ± 2.0 mm and 0.086 ± 0.078 L, which was less accurate than models obtained by high-quality 3D scanners. However, this method would provide a viable 3D digital socket reproduction that is accessible and low-cost, after processing in prosthetic CAD software. Clinical relevance The described method provides a low-cost and accessible means to digitize a socket interior for use in prosthetic CAD/CAM systems, employing a smartphone camera and cloud-based photogrammetry software.

  19. An Innovative Method for Obtaining Consistent Images and Quantification of Histochemically Stained Specimens

    PubMed Central

    Sedgewick, Gerald J.; Ericson, Marna

    2015-01-01

    Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has had tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor software program does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images with a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) along with its features for creating color standardization, preserving linear tonal levels, providing automated white balancing and setting automated brightness to consistent levels. The resulting image consistency using this method will also streamline mean density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows. PMID:25575568

  20. The Goddard Profiling Algorithm (GPROF): Description and Current Applications

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Yang, Song; Stout, John E.; Grecu, Mircea

    2004-01-01

    Atmospheric scientists use different methods for interpreting satellite data. In the early days of satellite meteorology, the analysis of cloud pictures from satellites was primarily subjective. As computer technology improved, satellite pictures could be processed digitally, and mathematical algorithms were developed and applied to the digital images in different wavelength bands to extract information about the atmosphere in an objective way. The kind of mathematical algorithm one applies to satellite data may depend on the complexity of the physical processes that lead to the observed image, and how much information is contained in the satellite images both spatially and at different wavelengths. Imagery from satellite-borne passive microwave radiometers has limited horizontal resolution, and the observed microwave radiances are the result of complex physical processes that are not easily modeled. For this reason, a type of algorithm called a Bayesian estimation method is utilized to interpret passive microwave imagery in an objective, yet computationally efficient manner.
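    A hedged sketch of the Bayesian estimation idea outlined above: the retrieved quantity is a weighted average over an a priori database, with weights given by a Gaussian likelihood of the observed brightness temperatures. The database, channel count, noise level, and function names below are synthetic placeholders, not GPROF's actual configuration.

```python
import numpy as np

def bayesian_retrieval(tb_obs, tb_db, rain_db, sigma=2.0):
    """Estimate a rain rate from observed brightness temperatures (K)."""
    d2 = np.sum((tb_db - tb_obs) ** 2, axis=1)   # misfit to each database entry
    w = np.exp(-0.5 * d2 / sigma ** 2)           # Gaussian likelihood weights
    return float(np.sum(w * rain_db) / np.sum(w))

rng = np.random.default_rng(0)
tb_db = rng.normal(250, 20, (1000, 4))           # 1000 simulated profiles, 4 channels
rain_db = rng.gamma(2.0, 1.5, 1000)              # associated rain rates (mm/h)
obs = tb_db[10] + rng.normal(0, 1, 4)            # noisy observation near entry 10
print(bayesian_retrieval(obs, tb_db, rain_db))
```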

  1. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entrains the challenge of creating a link between the two or more representations of the same trace. In order to be forensically sound, especially the two security aspects of integrity and authenticity need to be maintained at all times. Especially the adherence to authenticity using technical means proves to be a challenge at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data for integration in the conventional documentation of collection of items of evidence (bagging and tagging process). Using the QR-code as an exemplary bar-code implementation together with a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al.1 We use the example of digital dactyloscopy as a forensic discipline, where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace to extend the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability of the code and the verification of its contents. We can read the bar code with various devices despite its limited size of 42 x 42 mm and the rather large amount of embedded data. Furthermore, the QR-code's error correction features help to recover contents of damaged codes. Subsequently, our appended digital signature allows for detecting malicious manipulations of the embedded data.
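    An illustrative sketch only, not the authors' implementation: metadata about a physical trace is serialized, signed with an HMAC key (a simple stand-in for the digital signature described above), and encoded as a QR code. The key, field names, and filename are hypothetical; the code needs the third-party `qrcode` package (pip install qrcode[pil]).

```python
import hmac, hashlib, json
import qrcode

SECRET_KEY = b"replace-with-examiner-key"   # hypothetical signing key

def make_trace_tag(meta, filename="trace_tag.png"):
    """Serialize metadata, append an HMAC, and write the result as a QR code image."""
    payload = json.dumps(meta, sort_keys=True)
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    qrcode.make(payload + "|" + sig).save(filename)
    return sig

def verify_trace_tag(decoded_text):
    """Check that decoded QR contents have not been manipulated."""
    payload, sig = decoded_text.rsplit("|", 1)
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

make_trace_tag({"case": "2024-001", "item": "latent print 03", "officer": "JD"})
```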

  2. Barriers and facilitators to patient and public engagement and recruitment to digital health interventions: protocol of a systematic review of qualitative studies

    PubMed Central

    Hanlon, Peter; O'Donnell, Catherine A; Garcia, Sonia; Glanville, Julie; Mair, Frances S

    2016-01-01

    Introduction Patients and the public are beginning to use digital health tools to assist in managing chronic illness, support independent living and self-care, and remain connected to health and care providers. However, engaging with and enrolling in digital health interventions, such as telehealth systems, mobile health applications, patient portals and personal health records, in order to use them varies considerably. Many factors affect people's ability to engage with and sign up to digital health platforms. Objectives The primary aim is to identify the barriers and facilitators patients and the public experience to engagement and recruitment to digital health interventions. The secondary aim is to identify engagement and enrolment strategies, leading if possible to a taxonomy of such approaches, and a conceptual framework of digital health engagement and recruitment processes. Methods A systematic review of qualitative studies will be conducted by searching six databases: MEDLINE, CINAHL, PubMed, EMBASE, Scopus and the ACM Digital Library for papers published between 2000 and 2015. Titles and abstracts along with full-text papers will be screened by two independent reviewers against predetermined inclusion and exclusion criteria. A data extraction form will be used to provide details of the included studies. Quality assessment will be conducted using the Consolidated Criteria for Reporting Qualitative Research checklist. Any disagreements will be resolved through discussion with an independent third reviewer. Analysis will be guided by framework synthesis and informed by normalization process theory and burden of treatment theory, to aid conceptualisation of digital health engagement and recruitment processes. Discussion This systematic review of qualitative studies will explore factors affecting engagement and enrolment in digital health interventions. It will advance our understanding of readiness for digital health by examining the complex factors that affect patients’ and the public's ability to take part. Trial registration number CRD42015029846. PMID:27591017

  3. Modes of information delivery in radiologic anatomy education: Impact on student performance.

    PubMed

    Ketelsen, Dominik; Schrödl, Falk; Knickenberg, Inés; Heckemann, Rolf A; Hothorn, Torsten; Neuhuber, Winfried; Bautz, Werner A L; Grunewald, Markus

    2007-01-01

    This study provides a systematic assessment of different methods of delivering radiologic teaching content (lecture, printed text, and digital content delivery) under standard conditions, enabling comparison of the effectiveness of these methods. A printed atlas of sectional anatomy was used as a standard. Digital content was developed on the basis of the printed atlas. Lecturers used both the printed and the digital content to prepare lectures. Standardized teaching material thus created was presented to second-term undergraduate students who had attended the school's anatomy course, but had not received any radiology teaching. Multiple choice examinations were used to assess the students' ability to recognize anatomical structures in known as well as unknown images. In a survey, the students' subjective experience of the learning process was assessed. No difference was seen between the groups regarding examination results. Students preferred a combination of digital media and lectures by enthusiastic teachers. The shortage of teachers requires a compromise concerning the delivery of radiologic anatomy content in a medical school setting. Based on our results, we recommend a combined approach of lecture and digital content delivery.

  4. Will the digital computer transform classical mathematics?

    PubMed

    Rotman, Brian

    2003-08-15

    Mathematics and machines have influenced each other for millennia. The advent of the digital computer introduced a powerfully new element that promises to transform the relation between them. This paper outlines the thesis that the effect of the digital computer on mathematics, already widespread, is likely to be radical and far-reaching. To articulate this claim, an abstract model of doing mathematics is introduced based on a triad of actors of which one, the 'agent', corresponds to the function performed by the computer. The model is used to frame two sorts of transformation. The first is pragmatic and involves the alterations and progressive colonization of the content and methods of enquiry of various mathematical fields brought about by digital methods. The second is conceptual and concerns a fundamental antagonism between the infinity enshrined in classical mathematics and physics (continuity, real numbers, asymptotic definitions) and the inherently real and material limit of processes associated with digital computation. An example which lies in the intersection of classical mathematics and computer science, the P=NP problem, is analysed in the light of this latter issue.

  5. The Relationship between Digit Span and Cognitive Processing Across Ability Groups.

    ERIC Educational Resources Information Center

    Schofield, Neville J.; Ashman, Adrian F.

    1986-01-01

    The relationship between forward and backward digit span and basic cognitive processes was examined. Subjects were administered measures of sequential processing, simultaneous processing, and planning. Correlational analyses indicated the serial processing character of forward digit span, and the relationship between backward digit span and…

  6. Computational approach to integrate 3D X-ray microtomography and NMR data

    NASA Astrophysics Data System (ADS)

    Lucas-Oliveira, Everton; Araujo-Ferreira, Arthur G.; Trevizan, Willian A.; Fortulan, Carlos A.; Bonagamba, Tito J.

    2018-07-01

    Nowadays, most of the efforts in NMR applied to porous media are dedicated to studying the molecular fluid dynamics within and among the pores. These analyses have a higher complexity due to morphology and chemical composition of rocks, besides dynamic effects as restricted diffusion, diffusional coupling, and exchange processes. Since the translational nuclear spin diffusion in a confined geometry (e.g. pores and fractures) requires specific boundary conditions, the theoretical solutions are restricted to some special problems and, in many cases, computational methods are required. The Random Walk Method is a classic way to simulate self-diffusion along a Digital Porous Medium. Bergman model considers the magnetic relaxation process of the fluid molecules by including a probability rate of magnetization survival under surface interactions. Here we propose a statistical approach to correlate surface magnetic relaxivity with the computational method applied to the NMR relaxation in order to elucidate the relationship between simulated relaxation time and pore size of the Digital Porous Medium. The proposed computational method simulates one- and two-dimensional NMR techniques reproducing, for example, longitudinal and transverse relaxation times (T1 and T2, respectively), diffusion coefficients (D), as well as their correlations. For a good approximation between the numerical and experimental results, it is necessary to preserve the complexity of translational diffusion through the microstructures in the digital rocks. Therefore, we use Digital Porous Media obtained by 3D X-ray microtomography. To validate the method, relaxation times of ideal spherical pores were obtained and compared with the previous determinations by the Brownstein-Tarr model, as well as the computational approach proposed by Bergman. Furthermore, simulated and experimental results of synthetic porous media are compared. These results make evident the potential of computational physics in the analysis of the NMR data for complex porous materials.
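    A simplified sketch of the random-walk idea described above: walkers diffuse on a 3D binary pore map and, on each encounter with the solid surface, are killed with probability `p_kill` (a Bergman-style survival rule); the surviving fraction versus time mimics a magnetization decay. The grid, walker count, and kill probability are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def relaxation_decay(pore, n_walkers=2000, n_steps=500, p_kill=0.05, rng=None):
    """Fraction of surviving walkers per step on a boolean pore map (True = pore)."""
    rng = np.random.default_rng(rng)
    pore_idx = np.argwhere(pore)
    pos = pore_idx[rng.integers(0, len(pore_idx), n_walkers)].astype(int)
    alive = np.ones(n_walkers, dtype=bool)
    decay = np.empty(n_steps)
    moves = np.array([[1,0,0],[-1,0,0],[0,1,0],[0,-1,0],[0,0,1],[0,0,-1]])
    for t in range(n_steps):
        trial = np.clip(pos + moves[rng.integers(0, 6, n_walkers)], 0, np.array(pore.shape) - 1)
        hit_wall = ~pore[trial[:, 0], trial[:, 1], trial[:, 2]]
        killed = hit_wall & alive & (rng.random(n_walkers) < p_kill)   # surface relaxation
        alive &= ~killed
        pos = np.where(hit_wall[:, None], pos, trial)                  # bounce back at walls
        decay[t] = alive.mean()
    return decay

# Example digital pore: a sphere of radius 20 voxels inside a 64^3 grid.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
pore = (x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2
print(relaxation_decay(pore)[-1])
```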

  7. 3D food printing: a new dimension in food production processes

    USDA-ARS?s Scientific Manuscript database

    3D food printing, also known as food layered manufacture (FLM), is an exciting new method of digital food production that applies the process of additive manufacturing to food fabrication. In the 3D food printing process, a food product is first scanned or designed with computer-aided design softwa...

  8. Development of programmable artificial neural networks

    NASA Technical Reports Server (NTRS)

    Meade, Andrew J.

    1993-01-01

    Conventionally programmed digital computers can process numbers with great speed and precision, but do not easily recognize patterns or imprecise or contradictory data. Instead of being programmed in the conventional sense, artificial neural networks are capable of self-learning through exposure to repeated examples. However, the training of an ANN can be a time consuming and unpredictable process. A general method is being developed to mate the adaptability of the ANN with the speed and precision of the digital computer. This method was successful in building feedforward networks that can approximate functions and their partial derivatives from examples in a single iteration. The general method also allows the formation of feedforward networks that can approximate the solution to nonlinear ordinary and partial differential equations to desired accuracy without the need of examples. It is believed that continued research will produce artificial neural networks that can be used with confidence in practical scientific computing and engineering applications.

  9. Digital staining for histopathology multispectral images by the combined application of spectral enhancement and spectral transformation.

    PubMed

    Bautista, Pinky A; Yagi, Yukako

    2011-01-01

    In this paper we introduce a digital staining method for histopathology images captured with an n-band multispectral camera. The method consists of two major processes: enhancement of the original spectral transmittance and the transformation of the enhanced transmittance to its target spectral configuration. Enhancement is accomplished by shifting the original transmittance with the scaled difference between the original transmittance and the transmittance estimated with m dominant principal component (PC) vectors; the m PC vectors were determined from the transmittance samples of the background image. Transformation of the enhanced transmittance to the target spectral configuration was done using an n×n transformation matrix, which was derived by applying a least-squares method to the enhanced and target spectral training data samples of the different tissue components. Experimental results on the digital conversion of a hematoxylin and eosin (H&E) stained multispectral image to its Masson's trichrome stained (MT) equivalent show the viability of the method.
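    A minimal sketch of the transformation step described above: an n×n matrix is fit by least squares so that it maps source transmittance spectra onto the target spectra of corresponding tissue samples. The training arrays below are synthetic placeholders.

```python
import numpy as np

def fit_spectral_transform(source, target):
    """source, target: (n_samples, n_bands) training spectra. Returns the (n_bands, n_bands) matrix."""
    M, *_ = np.linalg.lstsq(source, target, rcond=None)
    return M

def apply_transform(spectra, M):
    return spectra @ M

rng = np.random.default_rng(1)
n_bands = 16
true_M = rng.normal(size=(n_bands, n_bands))
train_src = rng.random((200, n_bands))
train_tgt = train_src @ true_M + rng.normal(0, 0.01, (200, n_bands))   # noisy "target stain" spectra
M = fit_spectral_transform(train_src, train_tgt)
print(np.allclose(M, true_M, atol=0.1))   # the fitted matrix recovers the true mapping
```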

  10. Application of ultrasonic signature analysis for fatigue detection in complex structures

    NASA Technical Reports Server (NTRS)

    Zuckerwar, A. J.

    1974-01-01

    Ultrasonic signature analysis shows promise of being a singularly well-suited method for detecting fatigue in structures as complex as aircraft. The method employs instrumentation centered about a Fourier analyzer system, which features analog-to-digital conversion, digital data processing, and digital display of cross-correlation functions and cross-spectra. These features are essential to the analysis of ultrasonic signatures according to the procedure described here. In order to establish the feasibility of the method, the initial experiments were confined to simple plates with simulated and fatigue-induced defects respectively. In the first test the signature proved sensitive to the size of a small hole drilled into the plate. In the second test, performed on a series of fatigue-loaded plates, the signature proved capable of indicating both the initial appearance and subsequent growth of a fatigue crack. In view of these encouraging results it is concluded that the method has reached a sufficiently advanced stage of development to warrant application to small-scale structures or even actual aircraft.
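    A small sketch of two of the signature-analysis building blocks mentioned above, the cross-spectrum and the cross-correlation of two digitized signals, computed with the FFT. The sampling rate, tone frequency, and lag are illustrative values.

```python
import numpy as np

def cross_spectrum(x, y):
    """One-sided cross-spectrum conj(X(f)) * Y(f) of two equal-length records."""
    X, Y = np.fft.rfft(x), np.fft.rfft(y)
    return np.conj(X) * Y

def cross_correlation(x, y):
    """Circular cross-correlation via the inverse FFT of the cross-spectrum."""
    return np.fft.irfft(cross_spectrum(x, y), n=len(x))

fs = 1.0e6                                   # 1 MHz sampling
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 50e3 * t)             # 50 kHz reference signal
y = np.roll(x, 37) + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(np.argmax(cross_correlation(x, y)))    # recovered lag, expected near 37 samples
```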

  11. Data mining for average images in a digital hand atlas

    NASA Astrophysics Data System (ADS)

    Zhang, Aifeng; Cao, Fei; Pietka, Ewa; Liu, Brent J.; Huang, H. K.

    2004-04-01

    Bone age assessment is a procedure performed in pediatric patients to quickly evaluate parameters of maturation and growth from a left hand and wrist radiograph. Pietka and Cao have developed a computer-aided diagnosis (CAD) method of bone age assessment based on a digital hand atlas. The aim of this paper is to extend their work by automatically selecting the best representative image from a group of normal children based on specific bony features that reflect skeletal maturity. The group can be of any ethnic origin and gender, from one to 18 years old, in the digital atlas. This best representative image is defined as the "average" image of the group and can be added to Pietka and Cao's method to facilitate the bone age assessment process.


  12. High-speed high-accuracy three-dimensional shape measurement using digital binary defocusing method versus sinusoidal method

    NASA Astrophysics Data System (ADS)

    Hyun, Jae-Sang; Li, Beiwen; Zhang, Song

    2017-07-01

    This paper presents our research findings on high-speed high-accuracy three-dimensional shape measurement using digital light processing (DLP) technologies. In particular, we compare two different sinusoidal fringe generation techniques using the DLP projection devices: direct projection of computer-generated 8-bit sinusoidal patterns (a.k.a., the sinusoidal method), and the creation of sinusoidal patterns by defocusing binary patterns (a.k.a., the binary defocusing method). This paper mainly examines their performance on high-accuracy measurement applications under precisely controlled settings. Two different projection systems were tested in this study: a commercially available inexpensive projector and the DLP development kit. Experimental results demonstrated that the binary defocusing method always outperforms the sinusoidal method if a sufficient number of phase-shifted fringe patterns can be used.

  13. High-speed 3D imaging using digital binary defocusing method vs sinusoidal method

    NASA Astrophysics Data System (ADS)

    Zhang, Song; Hyun, Jae-Sang; Li, Beiwen

    2017-02-01

    This paper presents our research findings on high-speed 3D imaging using digital light processing (DLP) technologies. In particular, we compare two different sinusoidal fringe generation techniques using the DLP projection devices: direct projection of 8-bit computer generated sinusoidal patterns (a.k.a, the sinusoidal method), and the creation of sinusoidal patterns by defocusing binary patterns (a.k.a., the binary defocusing method). This paper mainly examines their performance on high-accuracy measurement applications under precisely controlled settings. Two different projection systems were tested in this study: the commercially available inexpensive projector, and the DLP development kit. Experimental results demonstrated that the binary defocusing method always outperforms the sinusoidal method if a sufficient number of phase-shifted fringe patterns can be used.
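    The two fringe-generation strategies compared in the two preceding records can be illustrated with a short sketch: an 8-bit sinusoidal pattern versus a 1-bit binary pattern that approximates a sinusoid once the projector lens blurs it (defocusing is emulated here with a simple moving-average filter), followed by a generic N-step phase-shifting retrieval. Pattern sizes, periods, and the defocus kernel are arbitrary assumptions.

```python
import numpy as np

def sinusoidal_pattern(width, period, phase=0.0):
    x = np.arange(width)
    return np.round(127.5 + 127.5 * np.cos(2 * np.pi * x / period + phase)).astype(np.uint8)

def binary_pattern(width, period, phase=0.0):
    x = np.arange(width)
    return (np.cos(2 * np.pi * x / period + phase) >= 0).astype(np.uint8) * 255

def emulate_defocus(pattern, kernel=9):
    """Crude stand-in for lens defocus: a moving-average blur of the 1D pattern."""
    k = np.ones(kernel) / kernel
    return np.convolve(pattern.astype(float), k, mode="same")

# Three-step phase shifting applied to the defocused binary fringes.
period = 36
deltas = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
I = np.array([emulate_defocus(binary_pattern(720, period, d)) for d in deltas])
phase = np.arctan2(-(I * np.sin(deltas)[:, None]).sum(0),
                   (I * np.cos(deltas)[:, None]).sum(0))
print(phase[:5])   # wrapped phase values in (-pi, pi]
```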

  14. Valuing national effects of digital health investments: an applied method.

    PubMed

    Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad

    2015-01-01

    This paper describes an approach which has been applied to value national outcomes of investments by federal, provincial and territorial governments, clinicians and healthcare organizations in digital health. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer review processes. This methodology has applied in four studies since 2008.

  15. Digital imaging information technology for biospeckle activity assessment relative to bacteria and parasites.

    PubMed

    Ramírez-Miquet, Evelio E; Cabrera, Humberto; Grassi, Hilda C; de J Andrades, Efrén; Otero, Isabel; Rodríguez, Dania; Darias, Juan G

    2017-08-01

    This paper reports on the biospeckle processing of biological activity using a visualization scheme based upon digital imaging information technology. Activity relative to bacterial growth in agar plates and to parasites affected by a drug is monitored via the speckle patterns generated by a coherent source incident on the microorganisms. We present experimental results to demonstrate the potential application of this methodology for following the activity in time. The digital imaging information technology is an alternative visualization enabling the study of speckle dynamics, which is correlated with the activity of bacteria and parasites. In this method, the changes in Red-Green-Blue (RGB) color component density are considered markers of bacterial growth and of parasite motility in the presence of a drug. The RGB data were used to generate a two-dimensional surface plot allowing an analysis of color distribution on the speckle images. The proposed visualization is compared to the outcomes of the generalized differences and the temporal difference. A quantification of the activity is performed using a parameterization of the temporal difference method. The adopted digital image processing technique has been found suitable to monitor motility and morphological changes in the bacterial population over time and to detect and distinguish a short-term drug action on parasites.
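    A hedged sketch of two of the descriptors mentioned above, the generalized differences map of a speckle image stack and a simple per-frame RGB component density; the random stack stands in for real speckle frames and the function names are illustrative.

```python
import numpy as np

def generalized_differences(stack):
    """GD(x, y) = sum over all frame pairs of |I_k - I_l|; highlights active regions."""
    stack = stack.astype(float)
    gd = np.zeros(stack.shape[1:])
    n = len(stack)
    for k in range(n):
        for l in range(k + 1, n):
            gd += np.abs(stack[k] - stack[l])
    return gd

def rgb_density(frame_rgb):
    """Mean intensity of each color component of one RGB speckle frame."""
    return frame_rgb.reshape(-1, 3).mean(axis=0)

stack = np.random.default_rng(0).integers(0, 256, (10, 64, 64))
print(generalized_differences(stack).mean())
```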

  16. Development of an imaging method for quantifying a large digital PCR droplet

    NASA Astrophysics Data System (ADS)

    Huang, Jen-Yu; Lee, Shu-Sheng; Hsu, Yu-Hsiang

    2017-02-01

    Portable devices have been recognized as the future linkage between end-users and lab-on-a-chip devices. They have a user-friendly interface and provide apps to interface with headphones, cameras, communication functions, etc. In particular, cameras installed in smartphones or tablets already offer high imaging resolution with a large number of pixels. This unique feature has triggered research into integrating optical fixtures with smartphones to provide microscopic imaging capabilities. In this paper, we report our study on developing a portable diagnostic tool based on the imaging system of a smartphone and a digital PCR biochip. A computational algorithm is developed to process optical images taken from a digital PCR biochip with a smartphone in a black box. Each reaction droplet is recorded in pixels and is analyzed in an sRGB (red, green, and blue) color space. A multistep filtering algorithm and an auto-thresholding algorithm are adopted to minimize background noise contributed by CCD cameras and to rule out false positive droplets, respectively. Finally, a size-filtering method is applied to identify the number of positive droplets and quantify the target's concentration. Statistical analysis is then performed for diagnostic purposes. This process can be integrated into an app and can provide a user-friendly interface without professional training.
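    A sketch of the counting pipeline described above: smooth, threshold, label connected regions, and keep only droplet-sized ones. The simple mean-plus-three-sigma threshold stands in for the paper's auto-thresholding step, and the size limits and synthetic test image are illustrative assumptions; the code requires NumPy and SciPy.

```python
import numpy as np
from scipy import ndimage

def count_positive_droplets(channel, min_area=20, max_area=500):
    """Count bright, droplet-sized connected regions in one image channel."""
    smoothed = ndimage.gaussian_filter(channel.astype(float), sigma=2)
    mask = smoothed > smoothed.mean() + 3 * smoothed.std()
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = (sizes >= min_area) & (sizes <= max_area)   # size filter rejects debris and merged blobs
    return int(keep.sum())

# Synthetic channel with three bright droplets on a noisy background.
y, x = np.ogrid[:256, :256]
img = sum(200 * np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 50.0)
          for cy, cx in [(50, 60), (120, 200), (200, 80)])
img = img + np.random.default_rng(0).normal(0, 5, (256, 256))
print(count_positive_droplets(img))   # expected: 3
```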

  17. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (types Michelson and Mach-Zehnder) used to measure velocities during dynamical material's behaviour studies. We detail the algorithms of processing developed and optimized to improve the performance of these interferometers especially in terms of time and frequency resolutions. Three methods of analysis of interferometric signals were studied. For Michelson interferometers, the time-frequency analysis of signals by Short-Time Fourier Transform (STFT) is compared to a time-frequency analysis by Continuous Wavelet Transform (CWT). The results have shown that the CWT was more suitable than the STFT for signals with low signal-to-noise, and low velocity and high acceleration areas. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (Triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results performed on a pulsed power machine.
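    A hedged sketch of the STFT-based analysis route mentioned above: a spectrogram of the interferometric beat signal is computed and its ridge (the peak frequency per time slice) is converted to velocity via v = lambda * f / 2. The sampling rate, wavelength, and velocity profile below are illustrative, not the instrument's actual parameters.

```python
import numpy as np
from scipy.signal import stft

fs = 1.0e9                 # 1 GS/s digitizer (assumed)
lam = 1550e-9              # laser wavelength in metres (assumed)
t = np.arange(200_000) / fs
v_true = 100.0 + 1.0e6 * t                      # accelerating target, m/s
phase = 2 * np.pi * np.cumsum(2 * v_true / lam) / fs
signal = np.cos(phase) + 0.2 * np.random.default_rng(0).normal(size=t.size)

f, tau, Z = stft(signal, fs=fs, nperseg=2048, noverlap=1536)
ridge = f[np.abs(Z).argmax(axis=0)]             # peak beat frequency per time slice
velocity = lam * ridge / 2                      # v = lambda * f_beat / 2
print(velocity[5], v_true[int(tau[5] * fs)])    # estimate vs. ground truth
```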

  18. Human Factors Engineering Aspects of Modifications in Control Room Modernization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hugo, Jacques; Clefton, Gordon; Joe, Jeffrey

    This report describes the basic aspects of control room modernization projects in the U.S. nuclear industry and the need for supplementary guidance on the integration of human factors considerations into the licensing and regulatory aspects of digital upgrades. The report pays specific attention to the integration of principles described in NUREG-0711 (Human Factors Engineering Program Review Model) and how supplementary guidance can help to raise general awareness in the industry regarding the complexities of control room modernization projects created by many interdependent regulations, standards and guidelines. The report also describes how human factors engineering principles and methods provided by various resources and international standards can help in navigating through the process of licensing digital upgrades. In particular, the integration of human factors engineering guidance and requirements into the process of licensing digital upgrades can help reduce uncertainty related to development of technical bases for digital upgrades that will avoid the introduction of new failure modes.

  19. All-digital signal-processing open-loop fiber-optic gyroscope with enlarged dynamic range.

    PubMed

    Wang, Qin; Yang, Chuanchuan; Wang, Xinyue; Wang, Ziyu

    2013-12-15

    We propose and realize a new open-loop fiber-optic gyroscope (FOG) with an all-digital signal-processing (DSP) system where an all-digital phase-locked loop is employed for digital demodulation to eliminate the variation of the source intensity and suppress the bias drift. A Sagnac phase-shift tracking method is proposed to enlarge the dynamic range, and, with its aid, a new open-loop FOG, which can achieve a large dynamic range and high sensitivity at the same time, is realized. The experimental results show that, compared with the conventional open-loop FOG with the same fiber coil and optical devices, the proposed FOG reduces the bias instability from 0.259 to 0.018 deg/h and the angle random walk from 0.031 to 0.006 deg/h(1/2); moreover, it enlarges the dynamic range to ±360 deg/s, exceeding the maximum dynamic range of ±63 deg/s of the conventional open-loop FOG.

  20. Testing an Alternative Method for Estimating the Length of Fungal Hyphae Using Photomicrography and Image Processing.

    PubMed

    Shen, Qinhua; Kirschbaum, Miko U F; Hedley, Mike J; Camps Arbestain, Marta

    2016-01-01

    This study aimed to develop and test an unbiased and rapid methodology to estimate the length of external arbuscular mycorrhizal fungal (AMF) hyphae in soil. The traditional visual gridline intersection (VGI) method, which consists in a direct visual examination of the intersections of hyphae with gridlines on a microscope eyepiece after aqueous extraction, membrane-filtration, and staining (e.g., with trypan blue), was refined. For this, (i) images of the stained hyphae were taken by using a digital photomicrography technique to avoid the use of the microscope and the method was referred to as the "digital gridline intersection" (DGI) method; and (ii), the images taken in (i) were processed and the hyphal length was measured by using ImageJ software, referred to as the "photomicrography-ImageJ processing" (PIP) method. The DGI and PIP methods were tested using known grade lengths of possum fur. Then they were applied to measure the hyphal lengths in soils with contrasting phosphorus (P) fertility status. Linear regressions were obtained between the known lengths (L_known) of possum fur and the values determined by using either the DGI (L_DGI) (L_DGI = 0.37 + 0.97 × L_known, r² = 0.86) or PIP (L_PIP) methods (L_PIP = 0.33 + 1.01 × L_known, r² = 0.98). There were no significant (P > 0.05) differences between the L_DGI and L_PIP values. While both methods provided accurate estimation (slope of regression being 1.0), the PIP method was more precise, as reflected by a higher value of r² and lower coefficients of variation. The average hyphal lengths (6.5-19.4 m g⁻¹) obtained by the use of these methods were in the range of those typically reported in the literature (3-30 m g⁻¹). Roots growing in P-deficient soil developed 2.5 times as many hyphae as roots growing in P-rich soil (17.4 vs 7.2 m g⁻¹). These tests confirmed that the use of digital photomicrography in conjunction with either the grid-line intersection principle or image processing is a suitable method for the measurement of AMF hyphal lengths in soils for comparative investigations.

  1. Autofluorescence endoscopy with "real-time" digital image processing in differential diagnostics of selected benign and malignant lesions in the oesophagus.

    PubMed

    Sieroń-Stołtny, Karolina; Kwiatek, Sebastian; Latos, Wojciech; Kawczyk-Krupka, Aleksandra; Cieślar, Grzegorz; Stanek, Agata; Ziaja, Damian; Bugaj, Andrzej M; Sieroń, Aleksander

    2012-03-01

    Oesophageal papilloma and Barrett's oesophagus are benign lesions known as risk factors of carcinoma in the oesophagus. Therefore, it is important to diagnose these early changes before neoplastic transformation. Autofluorescence endoscopy is a fast and non-invasive method of imaging of tissues based on the natural fluorescence of endogenous fluorophores. The aim of this study was to prove the diagnostic utility of autofluorescence endoscopy with digital image processing in histological diagnosis of endoscopic findings in the upper digestive tract, primarily in the imaging of oesophageal papilloma. During the retrospective analysis of about 200 endoscopic procedures in the upper digestive tract, 67 cases of benign, precancerous or cancerous changes were found. White light endoscopy (WLE) image, single-channel (red or green) autofluorescence images, as well as green and red fluorescence intensities in two-modal fluorescence images and the red-to-green (R/G) ratio (Numerical Colour Value, NCV) were correlated with histopathologic results. The NCV analysis in autofluorescence imaging (AFI) showed an increased R/G ratio in cancerous changes in 96% vs. 85% in WLE. Simultaneous analysis with digital image processing allowed us to diagnose suspicious tissue as cancerous in all cases. Barrett's metaplasia was confirmed in 90% vs. 79% (AFI vs. WLE), and 98% in imaging with digital image processing. In benign lesions, WLE allowed us to exclude tissue as malignant in 85%. With autofluorescence endoscopy, the R/G ratio was increased in only 10% of benign changes, causing the picture to be interpreted as suspicious; when both methods were used together, malignancy was excluded in 97.5% of cases. Mean R/G ratios were estimated to be 2.5 in cancers, 1.25 in Barrett's metaplasia and 0.75 in benign changes, and the differences were statistically significant (p=0.04). Autofluorescence imaging is a sensitive method to diagnose precancerous and cancerous early stages of disease located in the oesophagus. Especially in two-modal imaging including white light endoscopy, autofluorescence imaging with digital image processing seems to be a useful modality for early diagnosis. In the observation of papilloma changes, it also facilitates differentiation between neoplastic and benign lesions and more accurate estimation of the risk of potential malignancy.
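    A minimal sketch of the red-to-green (R/G) ratio computation described above, applied to an RGB autofluorescence frame. The example array and the decision cut-off are illustrative only, not clinically validated values.

```python
import numpy as np

def red_green_ratio(rgb, eps=1e-6):
    """Per-pixel R/G ratio and its mean over the region of interest."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    ratio = r / (g + eps)
    return ratio, float(ratio.mean())

frame = np.random.default_rng(0).integers(30, 255, (128, 128, 3)).astype(np.uint8)
ratio_map, mean_ncv = red_green_ratio(frame)
flag = "suspicious" if mean_ncv > 1.5 else "benign-looking"   # illustrative cut-off
print(round(mean_ncv, 2), flag)
```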

  2. Generation of topographic terrain models utilizing synthetic aperture radar and surface level data

    NASA Technical Reports Server (NTRS)

    Imhoff, Marc L. (Inventor)

    1991-01-01

    Topographical terrain models are generated by digitally delineating the boundary of the region under investigation from the data obtained from an airborne synthetic aperture radar image and surface elevation data concurrently acquired either from an airborne instrument or at ground level. A set of coregistered boundary maps thus generated are then digitally combined in three dimensional space with the acquired surface elevation data by means of image processing software stored in a digital computer. The method is particularly applicable for generating terrain models of flooded regions covered entirely or in part by foliage.

  3. Experiences on developing digital down conversion algorithms using Xilinx system generator

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi

    2013-07-01

    The Digital Down Conversion (DDC) algorithm is a classical signal processing method which is widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA using the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc and MathWorks Inc. It is very convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function in System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
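    A generic DDC sketch in Python for illustration (the paper's implementation lives in System Generator/FPGA fabric and is not reproduced here): mix with a numerically controlled oscillator, low-pass filter, and decimate. The sampling rate, centre frequency, filter length, and decimation factor are illustrative assumptions.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def ddc(x, fs, f_center, decim=16, numtaps=129):
    """Digital down conversion: NCO mix -> FIR low-pass -> decimation."""
    n = np.arange(len(x))
    nco = np.exp(-2j * np.pi * f_center * n / fs)         # complex NCO
    baseband = x * nco                                    # mix to baseband
    lp = firwin(numtaps, cutoff=fs / (2 * decim), fs=fs)  # anti-alias low-pass FIR
    filtered = lfilter(lp, 1.0, baseband)
    return filtered[::decim], fs / decim                  # decimate

fs, f_c = 100e6, 20e6
t = np.arange(10_000) / fs
x = np.cos(2 * np.pi * (f_c + 50e3) * t)                  # tone 50 kHz above centre
iq, fs_out = ddc(x, fs, f_c)
print(fs_out, np.abs(iq[200:210]).round(2))               # baseband magnitude ~0.5 after settling
```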

  4. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases consisting of nonprocessed soft copy versions of the digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations (the manufacturer's default, applied and laser printed to film by each manufacturer; MUSICA; and PLAHE), which were presented in soft-copy display. There were three radiologists per presentation. Area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen film for all digital presentations. Specificity for Fischer digital calcifications cases was worse than screen film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcifications cases was worse than screen film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.

  5. Point Cloud Generation from Aerial Image Data Acquired by a Quadrocopter Type Micro Unmanned Aerial Vehicle and a Digital Still Camera

    PubMed Central

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on Bae Systems’ SOCET SET classical commercial photogrammetric software and another is built using Microsoft®’s Photosynth™ service available in the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479

  6. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera.

    PubMed

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on Bae Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft(®)'s Photosynth™ service available in the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation.

  7. Digital coincidence counting

    NASA Astrophysics Data System (ADS)

    Buckman, S. M.; Ius, D.

    1996-02-01

    This paper reports on the development of a digital coincidence-counting system which comprises a custom-built data acquisition card and associated PC software. The system has been designed to digitise the pulse-trains from two radiation detectors at a rate of 20 MSamples/s with 12-bit resolution. Through hardware compression of the data, the system can continuously record both individual pulse-shapes and the time intervals between pulses. Software-based circuits are used to process the stored pulse trains. These circuits are constructed simply by linking together icons representing various components such as coincidence mixers, time delays, single-channel analysers, deadtimes and scalers. This system enables a pair of pulse trains to be processed repeatedly using any number of different methods. Some preliminary results are presented in order to demonstrate the versatility and efficiency of this new method.
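    A sketch of the software coincidence-mixing idea described above: given two lists of event timestamps (in seconds), count the events in one channel that have a partner in the other channel within a resolving time. The data are synthetic and the resolving time is an illustrative value.

```python
import numpy as np

def count_coincidences(t_a, t_b, window=1e-6):
    """Count channel-A events with at least one channel-B event within +/- window."""
    t_b = np.sort(t_b)
    idx = np.searchsorted(t_b, t_a)                  # nearest B events around each A event
    left = np.clip(idx - 1, 0, len(t_b) - 1)
    right = np.clip(idx, 0, len(t_b) - 1)
    nearest = np.minimum(np.abs(t_a - t_b[left]), np.abs(t_a - t_b[right]))
    return int(np.sum(nearest <= window))

rng = np.random.default_rng(0)
true_events = np.sort(rng.uniform(0, 1.0, 5000))                       # shared decays
t_a = true_events + rng.normal(0, 50e-9, true_events.size)             # detector A with jitter
t_b = np.concatenate([true_events + rng.normal(0, 50e-9, true_events.size),
                      rng.uniform(0, 1.0, 1000)])                      # detector B plus background
print(count_coincidences(t_a, t_b))                                    # close to 5000
```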

  8. Visualization of permanent marks in progressive addition lenses by digital in-line holography

    NASA Astrophysics Data System (ADS)

    Perucho, Beatriz; Micó, Vicente

    2013-04-01

    A critical issue in the production of ophthalmic lenses is to guarantee the correct centering and alignment throughout the manufacturing and mounting processes. To that end, progressive addition lenses (PALs) incorporate permanent marks at standardized locations on the lens. Those marks are engraved upon the surface, provide the model identification and addition power of the PAL, and serve as locator marks for re-inking the removable marks if necessary. Although the permanent marks should be visible by simple visual inspection, those marks are often faint and weak on new lenses, providing low contrast, obscured by scratches on older lenses, and partially occluded and difficult to recognize on tinted or anti-reflection coated lenses. In this contribution, we present an extremely simple visualization system for permanent marks in PALs based on digital in-line holography. Light emitted by a superluminescent diode (SLD) is used to illuminate the PAL, which is placed just before a digital (CCD) sensor. Thus, the CCD records an in-line hologram formed by the wavefront diffracted by the PAL. As a result, it is possible to recover an in-focus image of the inspected region of the PAL by means of classical holographic tools applied in the digital domain. This numerical process involves digital recording of the in-line hologram, numerical back propagation to the PAL plane, and some digital processing to reduce noise and present a high-quality final image. Preliminary experimental results are provided showing the applicability of the proposed method.
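    A hedged sketch of the numerical back-propagation step described above, using the angular spectrum method: the recorded in-line hologram is propagated by a negative distance to refocus on the engraved marks. The wavelength, pixel pitch, propagation distance, and the random stand-in hologram are placeholders.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field sampled with pitch dx (m) by a distance z (m)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)          # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

hologram = np.random.random((512, 512))          # stand-in for the recorded in-line hologram
recon = angular_spectrum_propagate(hologram, wavelength=680e-9, dx=3.45e-6, z=-0.05)
amplitude = np.abs(recon)                        # refocused image of the inspected region
print(amplitude.mean())
```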

  9. A Preliminary Comparison of Three Dimensional Particle Tracking and Sizing using Plenoptic Imaging and Digital In-line Holography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham

    2015-12-01

    Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as develop data processing methodologies optimized for the plenoptic measurement.

  10. A Preliminary Comparison of Three Dimensional Particle Tracking and Sizing using Plenoptic Imaging and Digital In-line Holography [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham

    2015-12-01

    Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as develop data processing methodologies optimized for the plenoptic measurement.

  11. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index, associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for reproducibility tests (multi-detail phantoms, an automatic cloud software tool measuring multiple image quality indices, and statistical process control) was proven to be effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
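
    A minimal sketch of the kind of statistic reported above (coefficients of variation of weekly image quality indices, checked against baseline values) is given below; the index values, tolerance and array layout are hypothetical and do not come from the actual software tool.

    ```python
    import numpy as np

    def reproducibility_report(weekly_iqi, baseline_mean, tolerance=0.05):
        """weekly_iqi: (n_weeks, n_indices) array of image quality indices.

        Returns the coefficient of variation (COV) of each index and a flag
        indicating whether its mean stays within `tolerance` of baseline.
        """
        weekly_iqi = np.asarray(weekly_iqi, dtype=float)
        mean = weekly_iqi.mean(axis=0)
        cov = weekly_iqi.std(axis=0, ddof=1) / mean            # relative variability
        within_baseline = np.abs(mean - baseline_mean) / baseline_mean <= tolerance
        return cov, within_baseline

    # Hypothetical data: 10 weekly measurements of 3 indices.
    rng = np.random.default_rng(0)
    baseline = np.array([1.00, 2.50, 0.80])
    weekly = baseline * (1 + 0.02 * rng.standard_normal((10, 3)))
    cov, ok = reproducibility_report(weekly, baseline)
    print("COV per index:", np.round(cov, 3), "within baseline:", ok)
    ```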

  12. Cost-estimating for commercial digital printing

    NASA Astrophysics Data System (ADS)

    Keif, Malcolm G.

    2007-01-01

    The purpose of this study is to document current cost-estimating practices used in commercial digital printing. A research study was conducted to determine the use of cost-estimating in commercial digital printing companies. This study answers the questions: 1) What methods are currently being used to estimate digital printing? 2) What is the relationship between estimating and pricing digital printing? 3) To what extent, if at all, do digital printers use full-absorption, all-inclusive hourly rates for estimating? Three different digital printing models were identified: 1) Traditional print providers, who supplement their offset presswork with digital printing for short-run color and versioned commercial print; 2) "Low-touch" print providers, who leverage the power of the Internet to streamline business transactions with digital storefronts; 3) Marketing solutions providers, who see printing less as a discrete manufacturing process and more as a component of a complete marketing campaign. Each model approaches estimating differently. Understanding and predicting costs can be extremely beneficial. Establishing a reliable system to estimate those costs can be somewhat challenging though. Unquestionably, cost-estimating digital printing will increase in relevance in the years ahead, as margins tighten and cost knowledge becomes increasingly more critical.

  13. Complete denture tooth arrangement technology driven by a reconfigurable rule.

    PubMed

    Dai, Ning; Yu, Xiaoling; Fan, Qilei; Yuan, Fulai; Liu, Lele; Sun, Yuchun

    2018-01-01

    The conventional technique for the fabrication of complete dentures is complex, with a long fabrication process and difficult-to-control restoration quality. In recent years, digital complete denture design has become a research focus. Digital complete denture tooth arrangement is a challenging issue that is difficult to efficiently implement under the constraints of complex tooth arrangement rules and the patient's individualized functional aesthetics. The present study proposes a complete denture automatic tooth arrangement method driven by a reconfigurable rule; it uses four typical operators, including a position operator, a scaling operator, a posture operator, and a contact operator, to establish the constraint mapping association between the teeth and the constraint set of the individual patient. By using the process reorganization of different constraint operators, this method can flexibly implement different clinical tooth arrangement rules. When combined with a virtual occlusion algorithm based on progressive iterative Laplacian deformation, the proposed method can achieve automatic and individual tooth arrangement. Finally, the experimental results verify that the proposed method is flexible and efficient.

  14. About problematic peculiarities of Fault Tolerance digital regulation organization

    NASA Astrophysics Data System (ADS)

    Rakov, V. I.; Zakharova, O. V.

    2018-05-01

    Approaches to assessing the serviceability of regulation loops, and to preventing situations in which serviceability is lost, are offered in three directions. The first direction is the development of methods for representing the regulation loop as a combination of diffuse components, together with algorithmic tools for building serviceability-assessment predicates both for the individual components and for the regulation loop as a whole. The second direction is the creation of fault-tolerant redundancy methods within the combined assessment of the current values of control actions, closure errors and the regulated parameters. The third direction is the creation of methods for comparing the evolution of control actions, closure errors and regulated parameters with their reference models or their neighbourhoods. This direction makes it possible to develop methods and algorithmic tools aimed at preventing loss of serviceability and effectiveness not only of a separate digital regulator, but of the whole fault-tolerant regulation complex.

  15. Digital stories as a tool for health promotion and youth engagement.

    PubMed

    Fletcher, Sarah; Mullett, Jennifer

    2016-08-15

    To provide opportunities for intergenerational knowledge sharing for healthy lifestyles; to facilitate youth and Elder mentorship; and to increase the self-esteem of youth by celebrating identity, cultural practices and community connection through the creation and sharing of digital stories. A youth research team (8 youth) aged 13-25, youth participants (60 core participants and 170 workshop participants) and Elders (14) from First Nations communities. The project was conducted with participants from several communities on Vancouver Island through on-site workshops and presentations. Youth and Elders were invited to a 3-day digital story workshop consisting of knowledge-sharing sessions by Elders and digital story training by the youth research team. Workshop attendees returned to their communities to develop stories. The group re-convened at the university to create digital stories focused on community connections, family histories and healthy lifestyles. During the following year the research team delivered instructional sessions in communities on the digital story process. The youth involved reported increased pride in community as well as new or enhanced relationships with Elders. The digital stories method facilitated intergenerational interactions and engaged community members in creating a digital representation of healthy lifestyles. The process itself is an intervention, as it affords critical reflection on historical, cultural and spiritual ideas of health and what it means to be healthy in an Aboriginal community. It is a particularly relevant health promotion tool in First Nations communities with strong oral history traditions.

  16. 78 FR 14233 - Electronic Retirement Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-05

    .... Public/private key (asymmetric) cryptography is a method of creating a unique mark, known as a digital... delivering a broad array of administrative services to multiple agencies. Shared symmetric key cryptography...

  17. Digital curation and online resources: digital scanning of surgical tools at the royal college of physicians and surgeons of Glasgow for an open university learning resource.

    PubMed

    Earley, Kirsty; Livingstone, Daniel; Rea, Paul M

    2017-01-01

    Collection preservation is essential for the cultural status of any city. However, presenting a collection publicly risks damage. Recently this drawback has been overcome by digital curation. Described here is a method of digitisation using photogrammetry and virtual reality software. Items were selected from the Royal College of Physicians and Surgeons of Glasgow archives, and implemented into an online learning module for the Open University. Images were processed via Agisoft Photoscan, Autodesk Memento, and Garden Gnome Object 2VR. Although problems arose due to specularity, 2VR digital models were developed for online viewing. Future research must minimise the difficulty of digitising specular objects.

  18. The FBI compression standard for digitized fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.

    1996-10-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
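
    A toy sketch of the general wavelet/scalar-quantization idea follows; it is not the FBI WSQ specification (which fixes the filter bank, subband structure and entropy coder) and it assumes the PyWavelets package, an arbitrary wavelet and a single arbitrary quantization step.

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    def wsq_like_round_trip(image, wavelet="bior4.4", level=3, step=8.0):
        """Toy wavelet/scalar-quantization round trip (no entropy coding)."""
        coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
        # Uniform scalar quantization of every subband with a single step size.
        q = [np.round(coeffs[0] / step)]
        for detail in coeffs[1:]:
            q.append(tuple(np.round(band / step) for band in detail))
        # Dequantize and reconstruct.
        deq = [q[0] * step] + [tuple(band * step for band in d) for d in q[1:]]
        return pywt.waverec2(deq, wavelet)

    # Hypothetical 8-bit "fingerprint" image stand-in.
    img = (np.random.rand(256, 256) * 255).astype(np.uint8)
    recon = wsq_like_round_trip(img)
    print("RMS error:", np.sqrt(np.mean((recon[:256, :256] - img) ** 2)))
    ```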

  19. FBI compression standard for digitized fingerprint images

    NASA Astrophysics Data System (ADS)

    Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas

    1996-11-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.

  20. A digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay region, three sheets, 1:125,000

    USGS Publications Warehouse

    Aitken, Douglas S.

    1997-01-01

    This Open-File report is a digital topographic map database. It contains a digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay Region (3 sheets), at a scale of 1:125,000. These ARC/INFO coverages are in vector format. The vectorization process has distorted characters representing letters and numbers, as well as some road and other symbols, making them difficult to read in some instances. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The content and character of the database and methods of obtaining it are described herein.

  1. A hybrid voice/data modulation for the VHF aeronautical channels

    NASA Technical Reports Server (NTRS)

    Akos, Dennis M.

    1993-01-01

    A method of improving the spectral efficiency of the existing Very High Frequency (VHF) Amplitude Modulation (AM) voice communication channels is proposed. The technique is to phase modulate the existing voice amplitude modulated carrier with digital data. This allows the transmission of digital information over an existing AM voice channel with no change to the existing AM signal format. There is no modification to the existing AM receiver to demodulate the voice signal and an additional receiver module can be added for processing of the digital data. The existing VHF AM transmitter requires only a slight modification for the addition of the digital data signal. The past work in the area is summarized and presented together with an improved system design and the proposed implementation.
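
    A baseband toy simulation of the idea (an AM voice envelope whose carrier additionally carries small-index BPSK phase modulation, leaving the envelope essentially untouched for legacy AM receivers) is sketched below; the sample rate, tone, data rate and modulation index are illustrative assumptions, not values from the report.

    ```python
    import numpy as np

    fs = 48_000            # sample rate (Hz), illustrative
    fc = 6_000             # "carrier" frequency used in this toy simulation
    t = np.arange(0, 0.05, 1 / fs)

    # Voice stand-in: a 300 Hz tone giving 80 % AM depth (no overmodulation).
    envelope = 1.0 + 0.8 * np.sin(2 * np.pi * 300 * t)

    # Digital data: BPSK phase modulation with a small index so the AM envelope
    # (and hence a legacy AM receiver) is essentially unaffected.
    bit_rate = 400
    bits = np.random.randint(0, 2, size=int(np.ceil(t[-1] * bit_rate)) + 1)
    symbols = 2 * bits - 1                                   # map {0,1} -> {-1,+1}
    data_phase = 0.3 * symbols[(t * bit_rate).astype(int)]   # ~0.3 rad index

    # Hybrid signal: standard AM envelope plus low-index phase modulation.
    s = envelope * np.cos(2 * np.pi * fc * t + data_phase)

    # A legacy AM receiver detects only the envelope; an add-on digital receiver
    # would track the carrier phase to recover `data_phase` and hence the bits.
    recovered_envelope = np.abs(s)   # crude envelope detector, for illustration
    ```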

  2. Methods for performing fast discrete curvelet transforms of data

    DOEpatents

    Candes, Emmanuel; Donoho, David; Demanet, Laurent

    2010-11-23

    Fast digital implementations of the second generation curvelet transform for use in data processing are disclosed. One such digital transformation is based on unequally-spaced fast Fourier transforms (USFFT) while another is based on the wrapping of specially selected Fourier samples. Both digital transformations return a table of digital curvelet coefficients indexed by a scale parameter, an orientation parameter, and a spatial location parameter. Both implementations are fast in the sense that they run in about O(n² log n) flops for n-by-n Cartesian arrays or about O(N log N) flops for Cartesian arrays of size N = n³; in addition, they are also invertible, with rapid inversion algorithms of about the same complexity.

  3. Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods

    NASA Technical Reports Server (NTRS)

    Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.

    1990-01-01

    Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.

  4. Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods

    NASA Technical Reports Server (NTRS)

    Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.

    1990-01-01

    Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.
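
    As an illustration of the kind of calibration the article derives, the sketch below fits a linear function between known step-wedge thicknesses and log-transformed gray levels and then inverts it; the numbers are made up, and the simple log-linear form stands in for the article's derived relationship.

    ```python
    import numpy as np

    # Hypothetical step-wedge calibration data:
    # known aluminium-equivalent thickness (mm) vs. mean digitized gray level.
    thickness_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
    gray_level   = np.array([212., 180., 154., 132., 113., 97.])

    # Assume attenuation makes gray level roughly exponential in thickness,
    # so log(gray level) is approximately linear in mineral thickness.
    slope, intercept = np.polyfit(thickness_mm, np.log(gray_level), 1)

    def gray_to_thickness(gray):
        """Invert the fitted calibration: gray level -> equivalent thickness."""
        return (np.log(gray) - intercept) / slope

    print("Estimated thickness for gray level 140:",
          round(float(gray_to_thickness(140.0)), 2), "mm")
    ```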

  5. Comparison of 5 types of interocclusal recording materials on the accuracy of articulation of digital models.

    PubMed

    Sweeney, Sunya; Smith, Derek K; Messersmith, Marion

    2015-08-01

    One method of articulating digital models is to use a digitized interocclusal record. However, the accuracy of different interocclusal record materials to articulate digital models has yet to be evaluated. A plastic typodont was modified with reference points for interarch measurements and articulated in maximum intercuspal position on a semiadjustable hinge articulator. Twenty-five interocclusal records of each of the 5 experimental materials (Regisil Rigid, Dentsply, York, Pa; Futar Scan, Kettenbach, Huntington Beach, Calif; Byte Right, Motion View Software, Chattanooga, Tenn; Aluwax, Aluwax Dental Products, Allendale, Mich; and Beauty Pink wax, Miltex, York, Pa) were made on the mounted typodont and digitized using an Ortho Insight 3D laser surface scanner (Motion View Software). Motion View Software was used to articulate the digital models by matching points from the models to the digitized interocclusal records. The distances between corresponding interarch markers were measured and compared with the measurements taken on the physical typodont (gold standard). Polyvinyl siloxane materials were significantly more likely to lead to successful articulation than were the other interocclusal record materials. Statistical analysis showed a significant effect of the bite registration material on the probability of success of the articulation (P <0.005). Polyvinyl siloxane is a more accurate interocclusal recording material when articulating digital models according to the process described in this study. Using a bite registration to articulate digital models should be considered the first step in the articulation process, with a likely residual need to manipulate the models manually. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  6. Multiscale image processing and antiscatter grids in digital radiography.

    PubMed

    Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D

    2009-01-01

    Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally, the effects of exposure dose and of using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included: (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid, but whether this increase in quality is clinically significant is unknown.

  7. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    NASA Astrophysics Data System (ADS)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems with multiple azimuth channels show promise for high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method that first uses the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to derive the multichannel SAR processing filter. The performance of processing under a nonuniform scattering coefficient is improved by this method, and it is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
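
    The core of the processing chain described above (a Capon spatial spectrum evaluated over the ambiguous directions, followed by reconstruction of an interference-plus-noise covariance matrix and an MVDR-type weight) can be sketched as follows for a generic uniform linear array; the geometry, angles, sector limits and diagonal loading are illustrative assumptions rather than the authors' parameters.

    ```python
    import numpy as np

    def steering_vector(theta, n_elem, d_over_lambda=0.5):
        """Steering vector of a uniform linear array for direction theta (rad)."""
        n = np.arange(n_elem)
        return np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta))

    def reconstruct_covariance(R_sample, interference_angles, n_elem):
        """Integrate the Capon power estimate over an (assumed) ambiguity sector
        to reconstruct an interference-plus-noise covariance matrix."""
        R_inv = np.linalg.inv(R_sample)
        R_rec = np.zeros((n_elem, n_elem), dtype=complex)
        for theta in interference_angles:
            a = steering_vector(theta, n_elem)
            p_capon = 1.0 / np.real(a.conj() @ R_inv @ a)   # Capon power estimate
            R_rec += p_capon * np.outer(a, a.conj())
        # Small diagonal loading for numerical robustness.
        R_rec += 1e-3 * np.trace(R_rec).real / n_elem * np.eye(n_elem)
        return R_rec

    # Hypothetical scenario: 8 channels, desired signal at 0 rad, ambiguity near 0.3 rad.
    n_elem, n_snap = 8, 200
    rng = np.random.default_rng(1)
    a_sig = steering_vector(0.0, n_elem)
    a_amb = steering_vector(0.3, n_elem)
    snap = (np.outer(a_sig, rng.standard_normal(n_snap)) +
            0.5 * np.outer(a_amb, rng.standard_normal(n_snap)) +
            0.1 * (rng.standard_normal((n_elem, n_snap)) +
                   1j * rng.standard_normal((n_elem, n_snap))))
    R_sample = snap @ snap.conj().T / n_snap

    R_in = reconstruct_covariance(R_sample, np.linspace(0.2, 0.4, 21), n_elem)
    w = np.linalg.solve(R_in, a_sig)
    w /= a_sig.conj() @ w        # MVDR-style weight with unit gain on the signal
    ```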

  8. An automated and universal method for measuring mean grain size from a digital image of sediment

    USGS Publications Warehouse

    Buscombe, Daniel D.; Rubin, David M.; Warrick, Jonathan A.

    2010-01-01

    Existing methods for estimating mean grain size of sediment in an image require either complicated sequences of image processing (filtering, edge detection, segmentation, etc.) or statistical procedures involving calibration. We present a new approach which uses Fourier methods to calculate grain size directly from the image without requiring calibration. Based on analysis of over 450 images, we found the accuracy to be within approximately 16% across the full range from silt to pebbles. Accuracy is comparable to, or better than, existing digital methods. The new method, in conjunction with recent advances in technology for taking appropriate images of sediment in a range of natural environments, promises to revolutionize the logistics and speed at which grain-size data may be obtained from the field.
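
    A hedged sketch of the general idea (estimating a characteristic grain scale directly from the image via Fourier-based autocorrelation, without calibration) follows; the decay threshold used to define the length scale and the synthetic texture are assumptions, not the statistical procedure of the paper.

    ```python
    import numpy as np

    def mean_grain_scale(image, decay=0.5):
        """Estimate a characteristic grain length scale (in pixels) from the
        radially averaged spatial autocorrelation of a sediment image."""
        img = np.asarray(image, dtype=float)
        img -= img.mean()
        # Autocorrelation via the Wiener-Khinchin theorem (FFT based).
        acf = np.real(np.fft.ifft2(np.abs(np.fft.fft2(img)) ** 2))
        acf /= acf[0, 0]                       # normalize by the zero-lag value
        acf = np.fft.fftshift(acf)             # move zero lag to the image centre
        # Radial average of the normalized autocorrelation.
        cy, cx = np.array(acf.shape) // 2
        y, x = np.indices(acf.shape)
        r = np.hypot(y - cy, x - cx).astype(int)
        radial = np.bincount(r.ravel(), weights=acf.ravel()) / np.bincount(r.ravel())
        # First lag at which correlation falls below `decay` of its zero-lag value.
        below = np.where(radial < decay * radial[0])[0]
        return int(below[0]) if below.size else None

    # Hypothetical "sediment" texture: smoothed random field with few-pixel features.
    rng = np.random.default_rng(2)
    field = rng.standard_normal((256, 256))
    kernel = np.exp(-0.5 * (np.arange(-9, 10) / 3.0) ** 2)
    field = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 0, field)
    field = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 1, field)
    print("Estimated grain scale (pixels):", mean_grain_scale(field))
    ```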

  9. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...

  10. 3D Analysis of Human Embryos and Fetuses Using Digitized Datasets From the Kyoto Collection.

    PubMed

    Takakuwa, Tetsuya

    2018-06-01

    Three-dimensional (3D) analysis of the human embryonic and early-fetal period has been performed using digitized datasets obtained from the Kyoto Collection, in which the digital datasets play a primary role in research. Datasets include magnetic resonance imaging (MRI) acquired with 1.5 T, 2.35 T, and 7 T magnet systems, phase-contrast X-ray computed tomography (CT), and digitized histological serial sections. Large, high-resolution datasets covering a broad range of developmental periods obtained with various methods of acquisition are key elements for the studies. The digital data have clear merits that enabled us to develop various analyses. Digital data analysis accelerated the speed of morphological observations using precise and improved methods by providing a suitable plane for a morphometric analysis from staged human embryos. Morphometric data are useful for quantitatively evaluating and demonstrating the features of development and for screening abnormal samples, which may be suggestive in the pathogenesis of congenital malformations. Morphometric data are also valuable for comparing sonographic data in a process known as "sonoembryology." The 3D coordinates of anatomical landmarks may be useful tools for analyzing the positional change of interesting landmarks and their relationships during development. Several dynamic events could be explained by differential growth using 3D coordinates. Moreover, 3D coordinates can be utilized in mathematical analysis as well as statistical analysis. The 3D analysis in our study may serve to provide accurate morphologic data, including the dynamics of embryonic structures related to developmental stages, which is required for insights into the dynamic and complex processes occurring during organogenesis. Anat Rec, 301:960-969, 2018. © 2018 Wiley Periodicals, Inc.

  11. Verbal Working Memory in Older Adults: The Roles of Phonological Capacities and Processing Speed

    ERIC Educational Resources Information Center

    Nittrouer, Susan; Lowenstein, Joanna H.; Wucinich, Taylor; Moberly, Aaron C.

    2016-01-01

    Purpose: This study examined the potential roles of phonological sensitivity and processing speed in age-related declines of verbal working memory. Method: Twenty younger and 25 older adults with age-normal hearing participated. Two measures of verbal working memory were collected: digit span and serial recall of words. Processing speed was…

  12. Symbol processing in the left angular gyrus: evidence from passive perception of digits.

    PubMed

    Price, Gavin R; Ansari, Daniel

    2011-08-01

    Arabic digits are one of the most ubiquitous symbol sets in the world. While there have been many investigations into the neural processing of the semantic information digits represent (e.g. through numerical comparison tasks), little is known about the neural mechanisms which support the processing of digits as visual symbols. To characterise the component neurocognitive mechanisms which underlie numerical cognition, it is essential to understand the processing of digits as a visual category, independent of numerical magnitude processing. The 'Triple Code Model' (Dehaene, 1992; Dehaene and Cohen, 1995) posits an asemantic visual code for processing Arabic digits in the ventral visual stream, yet there is currently little empirical evidence in support of this code. This outstanding question was addressed in the current functional magnetic resonance imaging (fMRI) study by contrasting brain responses during the passive viewing of digits versus letters and novel symbols at short (50 ms) and long (500 ms) presentation times. The results of this study reveal increased activation for familiar symbols (digits and letters) relative to unfamiliar symbols (scrambled digits and letters) at long presentation durations in the left dorsal Angular gyrus (dAG). Furthermore, increased activation for Arabic digits was observed in the left ventral Angular gyrus (vAG) in comparison to letters, scrambled digits and scrambled letters at long presentation durations, but no digit specific activation in any region at short presentation durations. These results suggest an absence of a digit specific 'Visual Number Form Area' (VNFA) in the ventral visual cortex, and provide evidence for the role of the left ventral AG during the processing of digits in the absence of any explicit processing demands. We conclude that Arabic digit processing depends specifically on the left AG rather than a ventral visual stream VNFA. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    PubMed

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.
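
    Of the algorithms compared above, unsharp masking is the simplest to illustrate; the sketch below is a generic implementation using a Gaussian blur, with arbitrary radius and amount rather than the parameters used in the study.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter  # assumed available

    def unsharp_mask(image, sigma=5.0, amount=1.0):
        """Generic unsharp masking: emphasize detail by adding back the
        difference between the image and a Gaussian-blurred copy of it."""
        img = np.asarray(image, dtype=float)
        blurred = gaussian_filter(img, sigma=sigma)
        sharpened = img + amount * (img - blurred)
        return np.clip(sharpened, img.min(), img.max())

    # Hypothetical 12-bit mammogram stand-in.
    rng = np.random.default_rng(3)
    mammo = gaussian_filter(rng.random((512, 512)) * 4095, sigma=8)
    enhanced = unsharp_mask(mammo, sigma=10, amount=0.8)
    ```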

  14. Assessment of Photogrammetry Structure-from-Motion Compared to Terrestrial LiDAR Scanning for Generating Digital Elevation Models. Application to the Austre Lovéenbreen Polar Glacier Basin, Spitsbergen 79°N

    NASA Astrophysics Data System (ADS)

    Tolle, F.; Friedt, J. M.; Bernard, É.; Prokop, A.; Griselin, M.

    2014-12-01

    A Digital Elevation Model (DEM) is a key tool for analyzing spatially dependent processes, including snow accumulation on slopes and glacier mass balance. Acquiring DEMs at short time intervals provides new opportunities to evaluate such phenomena at daily to seasonal rates. DEMs are usually generated from satellite imagery, aerial photography, airborne and ground-based LiDAR, and GPS surveys. In addition to these classical methods, we consider another alternative for periodic DEM acquisition with lower logistics requirements: digital processing of ground-based, oblique-view digital photography. Such a dataset, acquired using commercial off-the-shelf cameras, provides the source for generating elevation models using Structure from Motion (SfM) algorithms. Sets of pictures of the same structure, taken from various points of view, are acquired. Selected features identified in the images allow the reconstruction of a three-dimensional (3D) point cloud after computing the camera positions and optical properties. This point cloud, generated in an arbitrary coordinate system, is converted to an absolute coordinate system either by adding Ground Control Point (GCP) constraints or by including the GPS positions of the cameras in the processing chain. We selected Micmac, the open-source digital signal processing library provided by the French Geographic Institute (IGN), for its fine processing granularity and the ability to assess the quality of each processing step. Although operating in snow-covered environments appears challenging due to the lack of relevant features, we observed that enough reference points could be identified for 3D reconstruction. While the harsh climatic environment of the Arctic region considered (Ny Alesund area, 79°N) is not a problem for SfM, the low-lying spring sun and the cast shadows are a limitation because of the limited color dynamics of the digital cameras we used. A detailed understanding of the processing steps is mandatory during the image acquisition phase: compliance with acquisition rules that reduce digital processing errors helps minimize the uncertainty on the absolute position of the point cloud in its coordinate system. 3D models from SfM are compared with terrestrial LiDAR acquisitions for resolution assessment.
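
    One step in the chain described above, converting the SfM point cloud from its arbitrary coordinate system to an absolute one using Ground Control Points, amounts to estimating a similarity transform (scale, rotation, translation) between matched points; the sketch below uses the well-known SVD-based solution on synthetic coordinates and is not the Micmac implementation.

    ```python
    import numpy as np

    def similarity_transform(src, dst):
        """Least-squares scale s, rotation R, translation t with dst ~ s*R@src + t.

        src, dst: (N, 3) arrays of matched points (e.g. GCPs in model and
        absolute coordinates).
        """
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        A, B = src - mu_s, dst - mu_d
        U, S, Vt = np.linalg.svd(B.T @ A / len(src))
        D = np.eye(3)
        if np.linalg.det(U @ Vt) < 0:          # guard against a reflection
            D[2, 2] = -1
        R = U @ D @ Vt
        s = np.trace(np.diag(S) @ D) / (A ** 2).sum() * len(src)
        t = mu_d - s * R @ mu_s
        return s, R, t

    # Synthetic check: recover a known transform from 5 hypothetical GCPs.
    rng = np.random.default_rng(4)
    gcp_model = rng.random((5, 3)) * 100
    angle = 0.3
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
    gcp_abs = 2.5 * gcp_model @ R_true.T + np.array([400_000., 8_760_000., 50.])
    s, R, t = similarity_transform(gcp_model, gcp_abs)
    print("recovered scale:", round(s, 3))      # expected ~2.5
    ```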

  15. Particle detection, number estimation, and feature measurement in gene transfer studies: optical fractionator stereology integrated with digital image processing and analysis.

    PubMed

    King, Michael A; Scotty, Nicole; Klein, Ronald L; Meyer, Edwin M

    2002-10-01

    Assessing the efficacy of in vivo gene transfer often requires a quantitative determination of the number, size, shape, or histological visualization characteristics of biological objects. The optical fractionator has become a choice stereological method for estimating the number of objects, such as neurons, in a structure, such as a brain subregion. Digital image processing and analytic methods can increase detection sensitivity and quantify structural and/or spectral features located in histological specimens. We describe a hardware and software system that we have developed for conducting the optical fractionator process. A microscope equipped with a video camera and motorized stage and focus controls is interfaced with a desktop computer. The computer contains a combination live video/computer graphics adapter with a video frame grabber and controls the stage, focus, and video via a commercial imaging software package. Specialized macro programs have been constructed with this software to execute command sequences requisite to the optical fractionator method: defining regions of interest, positioning specimens in a systematic uniform random manner, and stepping through known volumes of tissue for interactive object identification (optical dissectors). The system affords the flexibility to work with count regions that exceed the microscope image field size at low magnifications and to adjust the parameters of the fractionator sampling to best match the demands of particular specimens and object types. Digital image processing can be used to facilitate object detection and identification, and objects that meet criteria for counting can be analyzed for a variety of morphometric and optical properties. Copyright 2002 Elsevier Science (USA)
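
    The population estimate produced by the optical fractionator is simply the raw count multiplied by the reciprocals of the sampling fractions; the sketch below shows that arithmetic with hypothetical sampling parameters, as generic stereology rather than the macro code of the described system.

    ```python
    def optical_fractionator_estimate(counted_objects,
                                      section_sampling_fraction,
                                      area_sampling_fraction,
                                      thickness_sampling_fraction):
        """N = sum(Q-) * (1/ssf) * (1/asf) * (1/tsf)."""
        return (counted_objects
                / section_sampling_fraction
                / area_sampling_fraction
                / thickness_sampling_fraction)

    # Hypothetical sampling design (all values illustrative):
    ssf = 1 / 6                      # every 6th section analysed
    asf = (50 * 50) / (200 * 200)    # 50x50 um counting frame on a 200x200 um grid
    tsf = 10 / 25                    # 10 um disector height in 25 um mean thickness
    q_minus = 312                    # objects counted in the disectors

    print("Estimated total number:",
          round(optical_fractionator_estimate(q_minus, ssf, asf, tsf)))
    ```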

  16. Flexible circuits with integrated switches for robotic shape sensing

    NASA Astrophysics Data System (ADS)

    Harnett, C. K.

    2016-05-01

    Digital switches are commonly used for detecting surface contact and limb-position limits in robotics. The typical momentary-contact digital switch is a mechanical device made from metal springs, designed to connect with a rigid printed circuit board (PCB). However, flexible printed circuits are taking over from the rigid PCB in robotics because the circuits can bend while carrying signals and power through moving joints. This project is motivated by a previous work where an array of surface-mount momentary contact switches on a flexible circuit acted as an all-digital shape sensor compatible with the power resources of energy harvesting systems. Without a rigid segment, the smallest commercially-available surface-mount switches would detach from the flexible circuit after several bending cycles, sometimes violently. This report describes a low-cost, conductive fiber based method to integrate electromechanical switches into flexible circuits and other soft, bendable materials. Because the switches are digital (on/off), they differ from commercially-available continuous-valued bend/flex sensors. No amplification or analog-to-digital conversion is needed to read the signal, but the tradeoff is that the digital switches only give a threshold curvature value. Boundary conditions on the edges of the flexible circuit are key to setting the threshold curvature value for switching. This presentation will discuss threshold-setting, size scaling of the design, automation for inserting a digital switch into the flexible circuit fabrication process, and methods for reconstructing a shape from an array of digital switch states.

  17. Integration of image capture and processing: beyond single-chip digital camera

    NASA Astrophysics Data System (ADS)

    Lim, SukHwan; El Gamal, Abbas

    2001-05-01

    An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications, such as multiple capture for enhancing dynamic range, and to improve the performance of existing applications, such as optical flow estimation. Conventional digital cameras operate at low frame rates, and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application-specific data at the standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated not only to perform the functions of a conventional camera system but also to perform applications such as real-time optical flow estimation.

  18. Digital off-axis holographic interferometry with simulated wavefront.

    PubMed

    Belashov, A V; Petrov, N V; Semenova, I V

    2014-11-17

    The paper presents a novel algorithm that is based on digital holographic interferometry and is promising for evaluating phase variations from digital holograms that are highly noisy or modulated by speckle structures. The suggested algorithm simulates an interferogram in finite-width fringes, by analogy with classical double-exposure holographic interferometry. The interferogram thus obtained is then processed as a digital hologram. The advantages of the suggested approach are demonstrated in numerical experiments on calculating differences in the phase distributions of wavefronts modulated by a speckle structure, as well as in a physical experiment on the analysis of laser-induced heating dynamics of an aqueous solution of a photosensitizer. It is shown that, owing to the inherent capability of the approach to perform adjustable smoothing of the compared wavefronts, the resulting difference undergoes noise filtering. This capability of adjustable smoothing may be used to minimize losses in spatial resolution. Since the method allows the observation angle of the compared wave fields to be varied, it also becomes possible to compensate for misalignment of the optical axes of these wavefronts. This feature can be required, for example, when using two different setups in comparative digital holography or to compensate for displacements of the recording system during a set of exposures in studies of dynamic processes.

  19. Integrated mixed signal control IC for 500-kHz switching frequency buck regulator

    NASA Astrophysics Data System (ADS)

    Chen, Keng; Zhang, Hong

    2015-12-01

    The main purpose of this work is to study the challenges of designing a digital buck regulator using a pipelined analog-to-digital converter (ADC). Although a pipelined ADC can achieve a high sampling speed, it introduces additional phase lag into the buck circuit. Together with the latency from the processing time of the additional digital circuits and the time delay associated with the switching frequency, this can make the closed loop unstable; moreover, the raw ADC outputs have a low signal-to-noise ratio and usually need back-end calibration. In order to compensate for this phase lag and make the control loop unconditionally stable, as well as to boost the signal-to-noise ratio of the ADC block with a cost-efficient design, a finite impulse response filter followed by digital proportional-integral-derivative blocks was designed. All these digital function blocks were optimised for processing speed. In system simulation, this controller achieved output regulation within 10% of the nominal 5 V output voltage under a 1 A/µs load transient condition; moreover, with the soft-start method, there is no turn-on overshoot. The die size of this controller is kept within 3 mm2 by using 180 nm CMOS technology.
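
    A behavioural sketch of the compensation chain described above (noisy ADC samples smoothed by a short FIR stage and fed to a digital PID that sets the duty cycle of a crude first-order plant model) is given below; the taps, gains and plant constants are illustrative assumptions, not the paper's design values.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Illustrative 4-tap moving-average FIR to clean up noisy ADC samples.
    fir_taps = np.array([0.25, 0.25, 0.25, 0.25])

    def pid_step(error, state, kp=0.4, ki=0.05, kd=0.1):
        """One update of a discrete PID controller; state = (integral, prev_error)."""
        integral, prev_error = state
        integral += error
        derivative = error - prev_error
        out = kp * error + ki * integral + kd * derivative
        return out, (integral, error)

    # Crude first-order stand-in for the buck output stage (not a real power model).
    v_in, v_ref, v_out = 12.0, 5.0, 0.0
    state = (0.0, 0.0)
    adc_history = np.zeros(len(fir_taps))
    for n in range(400):
        adc_sample = v_out + 0.02 * rng.standard_normal()   # noisy ADC reading
        adc_history = np.roll(adc_history, 1)
        adc_history[0] = adc_sample
        v_filtered = float(fir_taps @ adc_history)           # FIR-filtered feedback
        command, state = pid_step(v_ref - v_filtered, state)
        duty = float(np.clip(command, 0.0, 1.0))             # duty-cycle command
        v_out += 0.2 * (duty * v_in - v_out)                 # slow averaged plant

    print("output settles near", round(v_out, 2), "V")
    ```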

  20. Evaluating Unsupervised Methods to Size and Classify Suspended Particles Using Digital Holography

    NASA Astrophysics Data System (ADS)

    Davies, E. J.; Buscombe, D.; Graham, G.; Nimmo-Smith, A.

    2013-12-01

    The use of digital holography to image suspended particles in situ using submersible systems is increasing. Such systems allow visualization of the in-focus particles without the depth-of-field issues associated with conventional imaging. The size and concentration of all particles, and of each individual particle, can be rapidly and automatically assessed. The automated methods used to extract these quantities can be readily evaluated against manual measurements. Such evaluation is not possible for instruments based on optical and acoustic (back- or forward-) scattering, the so-called 'sediment surrogate' methods, which are sensitive to the bulk quantities of all suspended particles in a sample volume and rely on mathematically inverting a measured signal to derive the property of interest. Depending on the intended application, the number of holograms required to elucidate a process could range from tens to millions, so manual particle extraction is not feasible for most data sets. This has created a pressing need among the growing community of holography users for accurate, automated processing that is comparable in output to more well-established in-situ sizing techniques such as laser diffraction. Here we discuss the computational considerations required to focus and segment individual particles from raw digital holograms, and then to size and classify these particles by type, all using unsupervised (automated) image processing. To do so, we draw upon imagery ranging from controlled laboratory conditions to near-shore coastal environments, using different holographic system designs and covering a significant variety of particle types, sizes and shapes. We evaluate the success of these techniques and suggest directions for future developments.

  1. Saliency-aware food image segmentation for personal dietary assessment using a wearable computer

    USDA-ARS?s Scientific Manuscript database

    Image-based dietary assessment has recently received much attention in the community of obesity research. In this assessment, foods in digital pictures are specified, and their portion sizes (volumes) are estimated. Although manual processing is currently the most utilized method, image processing h...

  2. GSFC specification electronic data processing magnetic recording tape

    NASA Technical Reports Server (NTRS)

    Tinari, D. F.; Perry, J. L.

    1980-01-01

    The design requirements are given for magnetic oxide coated, electronic data processing tape, wound on reels. Magnetic recording tape types covered by this specification are intended for use on digital tape transports using the Non-Return-to-Zero-change-on-ones (NRZI) recording method for recording densities up to and including 800 characters per inch (cpi) and the Phase-Encoding (PE) recording method for a recording density of 1600 cpi.
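
    For illustration, the NRZI ("change on ones") recording method can be sketched in a few lines: the write level toggles for every 1 bit and holds for every 0 bit; this is a generic description of the method, not text from the specification.

    ```python
    def nrzi_encode(bits, initial_level=0):
        """NRZI (Non-Return-to-Zero, change-on-ones): toggle the write level on
        every 1 bit, hold it on every 0 bit."""
        level, out = initial_level, []
        for b in bits:
            if b == 1:
                level ^= 1          # a transition marks a 1
            out.append(level)
        return out

    print(nrzi_encode([1, 0, 1, 1, 0, 0, 1]))   # -> [1, 1, 0, 1, 1, 1, 0]
    ```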

  3. Optical correlation identification technology applied in underwater laser imaging target identification

    NASA Astrophysics Data System (ADS)

    Yao, Guang-tao; Zhang, Xiao-hui; Ge, Wei-long

    2012-01-01

    Underwater laser imaging detection is an effective method of detecting short-range targets underwater and an important complement to sonar detection. With the development of underwater laser imaging and underwater vehicle technology, automatic underwater target identification has received increasing attention and remains a research challenge in underwater optical imaging information processing. Today, automatic underwater target identification based on optical imaging is usually realized with digital software running on electronic hardware, an approach whose algorithm implementation and control are very flexible. However, the optical imaging information consists of 2D or even 3D images, so the amount of information to process is large; purely digital electronic processing therefore requires a long identification time and can hardly meet the demands of real-time identification. Parallel computer processing can improve the identification speed, but it increases complexity, size and power consumption. This paper attempts to apply optical correlation identification technology to automatic underwater target identification. Optical correlation identification exploits the Fourier-transform property of a Fourier lens, which can perform the Fourier transform of image information on a nanosecond scale, and optical free-space interconnection offers parallelism, high speed, large capacity and high resolution; combined with the computational and control flexibility of digital circuits, this yields a hybrid optoelectronic identification mode. We derive the theoretical formulation of correlation identification, analyze the principle of optical correlation identification, and write a MATLAB simulation program. Single frames obtained by underwater range-gated laser imaging are used for identification, and by identifying and locating the target at different positions we can effectively improve the speed and localization efficiency of target identification, preliminarily validating the feasibility of this method.
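
    The digital counterpart of the optical correlation described above is a matched-filter cross-correlation computed with Fourier transforms; the sketch below is a generic FFT correlator applied to a synthetic scene and template, not the authors' MATLAB program or optical setup.

    ```python
    import numpy as np

    def correlate_fft(scene, template):
        """Cross-correlate a scene with a reference template via the FFT
        (the digital analogue of a Fourier-lens matched-filter correlator)."""
        S = np.fft.fft2(scene)
        T = np.fft.fft2(template, s=scene.shape)   # zero-pad template to scene size
        return np.real(np.fft.ifft2(S * np.conj(T)))

    # Synthetic example: hide a small bright template in a noisy "underwater" scene.
    rng = np.random.default_rng(6)
    template = np.zeros((16, 16))
    template[4:12, 4:12] = 1.0
    scene = 0.3 * rng.random((128, 128))
    scene[60:76, 40:56] += template            # target placed at (row=60, col=40)

    corr = correlate_fft(scene - scene.mean(), template - template.mean())
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    print("correlation peak at", peak)         # expected near (60, 40)
    ```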

  4. Visual investigation on the heat dissipation process of a heat sink by using digital holographic interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Bingjing; Zhao, Jianlin, E-mail: jlzhao@nwpu.edu.cn; Wang, Jun

    2013-11-21

    We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution and variation trend of the air field surrounding the heat sink during the heat dissipation process are numerically reconstructed based on double-exposure holographic interferometry. According to the phase unwrapping algorithm and the derived relationship between temperature and the phase change of the detection beam, the full-field temperature distributions are quantitatively obtained with a reasonably high measurement accuracy. The impact of the heat sink's channel width on the heat dissipation performance in the case of natural convection is then analyzed. In addition, a comparison between simulation and experiment results is given to verify the reliability of this method. The experiment results certify the feasibility and validity of the presented method in full-field, dynamical, and quantitative measurement of the air field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.

  5. Sequential or parallel decomposed processing of two-digit numbers? Evidence from eye-tracking.

    PubMed

    Moeller, Korbinian; Fischer, Martin H; Nuerk, Hans-Christoph; Willmes, Klaus

    2009-02-01

    While reaction time data have shown that decomposed processing of two-digit numbers occurs, there is little evidence about how decomposed processing functions. Poltrock and Schwartz (1984) argued that multi-digit numbers are compared in a sequential digit-by-digit fashion starting at the leftmost digit pair. In contrast, Nuerk and Willmes (2005) favoured parallel processing of the digits constituting a number. These models (i.e., sequential decomposition, parallel decomposition) make different predictions regarding the fixation pattern in a two-digit number magnitude comparison task and can therefore be differentiated by eye fixation data. We tested these models by evaluating participants' eye fixation behaviour while selecting the larger of two numbers. The stimulus set consisted of within-decade comparisons (e.g., 53_57) and between-decade comparisons (e.g., 42_57). The between-decade comparisons were further divided into compatible and incompatible trials (cf. Nuerk, Weger, & Willmes, 2001) and trials with different decade and unit distances. The observed fixation pattern implies that the comparison of two-digit numbers is not executed by sequentially comparing decade and unit digits as proposed by Poltrock and Schwartz (1984) but rather in a decomposed but parallel fashion. Moreover, the present fixation data provide first evidence that digit processing in multi-digit numbers is not a pure bottom-up effect, but is also influenced by top-down factors. Finally, implications for multi-digit number processing beyond the range of two-digit numbers are discussed.

  6. Simple algorithms for digital pulse-shape discrimination with liquid scintillation detectors

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2015-01-01

    The development of compact, battery-powered digital liquid scintillation neutron detection systems for field applications requires digital pulse processing (DPP) algorithms with minimum computational overhead. To meet this demand, two DPP algorithms for the discrimination of neutrons and γ-rays with liquid scintillation detectors were developed and examined by using a NE213 liquid scintillation detector in a mixed radiation field. The first algorithm is based on the relation between the amplitude of a current pulse at the output of a photomultiplier tube and the amount of charge contained in the pulse. A figure-of-merit (FOM) value of 0.98 with a 450 keVee (electron equivalent energy) energy threshold was achieved with this method when pulses were sampled at 250 MSample/s and with 8-bit resolution. Compared to the similar charge-comparison method, this method requires only a single integration window, thereby reducing the amount of computation by approximately 40%. The second approach is a digital version of the trailing-edge constant-fraction discrimination method. A FOM value of 0.84 with an energy threshold of 450 keVee was achieved with this method. In comparison with the similar rise-time discrimination method, this method requires a single time pick-off, thereby reducing the amount of computation by approximately 50%. The algorithms described in this work are useful for developing portable detection systems for applications such as homeland security, radiation dosimetry and environmental monitoring.
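
    The first algorithm above relies on the ratio between a pulse's peak amplitude and its integrated charge, with the neutron/gamma separation scored by a figure of merit; the sketch below illustrates that computation on synthetic exponential pulses with made-up decay constants, not on NE213 data.

    ```python
    import numpy as np

    def amplitude_to_charge(pulse):
        """Discrimination parameter: peak amplitude divided by integrated charge."""
        return pulse.max() / pulse.sum()

    def figure_of_merit(x_gamma, x_neutron):
        """FOM = |separation of the two peaks| / (sum of their FWHMs)."""
        def fwhm(x):
            return 2.355 * np.std(x)       # Gaussian approximation
        return abs(np.mean(x_gamma) - np.mean(x_neutron)) / (fwhm(x_gamma) + fwhm(x_neutron))

    # Synthetic pulses: gammas carry less slow scintillation light than neutrons
    # (decay constants and fractions are illustrative only).
    t = np.arange(0, 200, 4.0)             # ns, 250 MSample/s sampling
    rng = np.random.default_rng(7)

    def make_pulses(slow_fraction, tau_slow=80.0, n=500):
        shape = (1 - slow_fraction) * np.exp(-t / 10) + slow_fraction * np.exp(-t / tau_slow)
        return shape + 0.01 * rng.standard_normal((n, t.size))

    gammas   = make_pulses(slow_fraction=0.05)
    neutrons = make_pulses(slow_fraction=0.25)
    pg = np.array([amplitude_to_charge(p) for p in gammas])
    pn = np.array([amplitude_to_charge(p) for p in neutrons])
    print("FOM ~", round(figure_of_merit(pg, pn), 2))
    ```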

  7. Three-directional motion-compensation mask-based novel look-up table on graphics processing units for video-rate generation of digital holographic videos of three-dimensional scenes.

    PubMed

    Kwon, Min-Woo; Kim, Seung-Cheol; Kim, Eun-Soo

    2016-01-20

    A three-directional motion-compensation mask-based novel look-up table method is proposed and implemented on graphics processing units (GPUs) for video-rate generation of digital holographic videos of three-dimensional (3D) scenes. Since the proposed method is designed to be well matched with the software and memory structures of GPUs, the number of compute-unified-device-architecture kernel function calls can be significantly reduced. This results in a great increase of the computational speed of the proposed method, allowing video-rate generation of the computer-generated hologram (CGH) patterns of 3D scenes. Experimental results reveal that the proposed method can generate 39.8 frames of Fresnel CGH patterns with 1920×1080 pixels per second for the test 3D video scenario with 12,088 object points on dual GPU boards of NVIDIA GTX TITANs, and they confirm the feasibility of the proposed method in the practical application fields of electroholographic 3D displays.

  8. Dynamic Rod Worth Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Y.A.; Chapman, D.M.; Hill, D.J.

    2000-12-15

    The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.

  9. User Requirements Analysis For Digital Library Application Using Quality Function Deployment.

    NASA Astrophysics Data System (ADS)

    Wulandari, Lily; Sularto, Lana; Yusnitasari, Tristyanti; Ikasari, Diana

    2017-03-01

    This study attempts to build a Smart Digital Library that can be used by the wider community wherever they are. The system is built as a Smart Digital Library portal that uses a semantic similarity method to search for journals, articles or books by title or author name. This method is also used to automatically recommend books to visitors of the Smart Digital Library based on testimonials from previous readers. The steps taken in the development of the Smart Digital Library system are the analysis phase, the design phase, and the testing and implementation phases. In the analysis phase, WebQual is used to prepare the instruments distributed to respondents, and the data obtained from the respondents are processed using Quality Function Deployment (QFD). The purpose of the analysis phase is to identify consumer needs and technical requirements. The analysis was performed on the digital library websites of Gunadarma University, Bogor Institute of Agriculture, the University of Indonesia, and others. The questionnaire was distributed to 200 respondents. The research methodology begins with the collection of user requirements and their analysis using QFD. The application design is funded by the government through the Featured Universities Research programme of the Directorate General of Higher Education (DIKTI). The conclusions of this research identify the consumer requirements of a digital library application: the consumer requirements consist of 13 elements, and the engineering characteristics of the digital library requirements consist of 25 elements. The digital library application is therefore designed according to these findings, eliminating features that are not needed, based on the QFD House of Quality.

  10. Design and testing of a 750MHz CW-EPR digital console for small animal imaging.

    PubMed

    Sato-Akaba, Hideo; Emoto, Miho C; Hirata, Hiroshi; Fujii, Hirotada G

    2017-11-01

    This paper describes the development of a digital console for three-dimensional (3D) continuous wave electron paramagnetic resonance (CW-EPR) imaging of a small animal to improve the signal-to-noise ratio and lower the cost of the EPR imaging system. An RF generation board, an RF acquisition board and a digital signal processing (DSP) & control board were built for the digital EPR detection. Direct sampling of the reflected RF signal from a resonator (approximately 750 MHz), which contains the EPR signal, was carried out using a band-pass subsampling method. A direct automatic control system to reduce the reflection from the resonator was proposed and implemented in the digital EPR detection scheme. All DSP tasks were carried out in field programmable gate array ICs. In vivo 3D imaging of nitroxyl radicals in a mouse's head was successfully performed. Copyright © 2017 Elsevier Inc. All rights reserved.
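
    For readers unfamiliar with band-pass subsampling, the sketch below enumerates the alias-free sample-rate ranges given by the classical band-pass sampling rule; the 10 MHz band centred on 750 MHz is an assumed illustration, not the console's actual signal bandwidth or ADC rate.

        import math

        def bandpass_sampling_ranges(f_low, f_high):
            # Alias-free sample-rate ranges for band-pass subsampling of the
            # band [f_low, f_high]: 2*f_high/n <= fs <= 2*f_low/(n-1).
            bandwidth = f_high - f_low
            n_max = math.floor(f_high / bandwidth)
            ranges = []
            for n in range(1, n_max + 1):
                fs_min = 2.0 * f_high / n
                fs_max = 2.0 * f_low / (n - 1) if n > 1 else float("inf")
                if fs_min <= fs_max:
                    ranges.append((n, fs_min, fs_max))
            return ranges

        # assumed illustration: a 10 MHz band centred on 750 MHz;
        # print only the five lowest admissible sample-rate ranges
        for n, lo, hi in bandpass_sampling_ranges(745e6, 755e6)[-5:]:
            print(f"n = {n:2d}: {lo / 1e6:8.2f} MHz <= fs <= {hi / 1e6:8.2f} MHz")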

  12. [Reliability of three dimensional resin model by rapid prototyping manufacturing and digital modeling].

    PubMed

    Zeng, Fei-huang; Xu, Yuan-zhi; Fang, Li; Tang, Xiao-shan

    2012-02-01

    To describe a new technique for fabricating a 3D resin model by 3D reconstruction and rapid prototyping, and to analyze the precision of this method. An optical grating scanner was used to acquire the data of a silastic cavity block, and a digital dental cast was reconstructed from the data using Geomagic Studio image processing software. The final 3D reconstruction was saved in STL format. The 3D resin model was fabricated by fused deposition modeling and compared with the digital model and the gypsum model. The data of the three groups were statistically analyzed using the SPSS 16.0 software package. No significant difference was found among the gypsum model, digital dental cast and 3D resin model (P>0.05). Rapid prototyping manufacturing and digital modeling would be helpful for dental information acquisition, treatment design and appliance manufacturing, and can improve communication between patients and doctors.

  13. Selecting a digital camera for telemedicine.

    PubMed

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  14. Dynamical measurement of refractive index distribution using digital holographic interferometry based on total internal reflection.

    PubMed

    Zhang, Jiwei; Di, Jianglei; Li, Ying; Xi, Teli; Zhao, Jianlin

    2015-10-19

    We present a method for dynamically measuring the refractive index distribution over a large range based on the combination of digital holographic interferometry and total internal reflection. A series of holograms, carrying the index information of mixed liquids adhered to a total reflection prism surface, are recorded with a CCD during the diffusion process. Phase-shift differences of the reflected light are reconstructed by exploiting the principle of double-exposure holographic interferometry. According to the relationship between the reflection phase-shift difference and the liquid index, two-dimensional index distributions can be obtained directly, assuming that the index of the air near the prism surface is constant. The proposed method can also be applied to measure the index of solid media and to monitor index variations during chemical reaction processes.
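
    A minimal sketch of the physics behind such a measurement is given below: it evaluates the total-internal-reflection phase shift of s-polarised light and the resulting phase-shift difference as the liquid index changes. The prism index, incidence angle, liquid indices, and sign convention are assumptions for illustration, not the calibration used in the paper.

        import numpy as np

        def tir_phase_s(n1, n2, theta):
            # Phase shift (rad) of s-polarised light under total internal
            # reflection at an n1 -> n2 interface, incidence angle theta (rad);
            # requires n1*sin(theta) > n2. Sign convention is illustrative.
            num = np.sqrt((n1 * np.sin(theta)) ** 2 - n2 ** 2)
            return -2.0 * np.arctan2(num, n1 * np.cos(theta))

        # assumed values: BK7-like prism, 70 degree incidence, water-like liquids
        n_prism, theta = 1.515, np.deg2rad(70.0)
        n_ref = 1.333                       # reference state of the liquid
        for n_liq in (1.333, 1.340, 1.350, 1.360):
            d_phi = tir_phase_s(n_prism, n_liq, theta) - tir_phase_s(n_prism, n_ref, theta)
            print(f"n = {n_liq:.3f}: phase-shift difference = {d_phi:+.4f} rad")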

  15. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM) it is of great importance to reveal valuable information from the recorded SHM data that could be used to predict or indicate structural fault or damage in a building. In this work a combination of digital signal processing methods, namely the FFT along with the wavelet transform, is applied, together with a proposed algorithm for studying frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute undergoing heavy use, stress, and visits from academic staff and students. The SHM data are collected from two neighboring buildings of different ages (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, presenting a comparison of the structural behavior of both buildings in response to seismic activity, weather conditions and man-made activity. Acknowledgments: This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)» and is co-financed by the European Union (European Social Fund) and Greek National Fund.
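
    A minimal Python sketch of this kind of combined FFT and wavelet processing is given below, using NumPy and the PyWavelets package on a synthetic acceleration record; the sampling rate, mode frequencies, and energy-per-level summary are illustrative assumptions, not the authors' algorithm.

        import numpy as np
        import pywt                         # PyWavelets

        fs = 200.0                          # assumed sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        # synthetic record: two structural modes plus noise (illustrative only)
        accel = (np.sin(2 * np.pi * 2.4 * t) + 0.5 * np.sin(2 * np.pi * 7.1 * t)
                 + 0.2 * np.random.randn(t.size))

        # FFT: dominant frequency of the record
        spectrum = np.abs(np.fft.rfft(accel))
        freqs = np.fft.rfftfreq(accel.size, 1 / fs)
        print("dominant frequency:", freqs[spectrum.argmax()], "Hz")

        # wavelet decomposition: relative energy per level as a rough
        # indicator of how the signal content is spread across scales
        coeffs = pywt.wavedec(accel, "db4", level=5)
        energies = [float(np.sum(c ** 2)) for c in coeffs]
        total = sum(energies)
        print("relative energy per wavelet level:",
              [round(e / total, 3) for e in energies])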

  16. Feature selection methods for object-based classification of sub-decimeter resolution digital aerial imagery

    USDA-ARS?s Scientific Manuscript database

    Due to the availability of numerous spectral, spatial, and contextual features, the determination of optimal features and class separabilities can be a time consuming process in object-based image analysis (OBIA). While several feature selection methods have been developed to assist OBIA, a robust c...

  17. Coniferous forest classification and inventory using Landsat and digital terrain data

    NASA Technical Reports Server (NTRS)

    Franklin, J.; Logan, T. L.; Woodcock, C. E.; Strahler, A. H.

    1986-01-01

    Machine-processing techniques were used in a Forest Classification and Inventory System (FOCIS) procedure to extract and process tonal, textural, and terrain information from registered Landsat multispectral and digital terrain data. Using FOCIS as a basis for stratified sampling, the softwood timber volumes of the Klamath National Forest and Eldorado National Forest were estimated within standard errors of 4.8 and 4.0 percent, respectively. The accuracy of these large-area inventories is comparable to the accuracy yielded by use of conventional timber inventory methods, but, because of automation, the FOCIS inventories are more rapid (9-12 months compared to 2-3 years for conventional manual photointerpretation, map compilation and drafting, field sampling, and data processing) and are less costly.

  18. Digital Signal Processing and Generation for a DC Current Transformer for Particle Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zorzetti, Silvia

    2013-01-01

    The thesis topic, digital signal processing and generation for a DC current transformer, focuses on the most fundamental beam diagnostics in the field of particle accelerators, the measurement of the beam intensity, or beam current. The technology of a DC current transformer (DCCT) is well known and used in many areas, including particle accelerator beam instrumentation, as a non-invasive (shunt-free) method to monitor the DC current in a conducting wire, or in our case, the current of charged particles travelling inside an evacuated metal pipe. So far, custom and commercial DCCTs are entirely based on analog technologies and signal processing, which makes them inflexible, sensitive to component aging, and difficult to maintain and calibrate.

  19. Quantitative optical scanning tests of complex microcircuits

    NASA Technical Reports Server (NTRS)

    Erickson, J. J.

    1980-01-01

    An approach for the development of the optical scanner as a screening inspection instrument for microcircuits involves comparing the quantitative differences in photoresponse images and then correlating them with electrical parameter differences in test devices. The existing optical scanner was modified so that the photoresponse data could be recorded and subsequently digitized. A method was devised for applying digital image processing techniques to the digitized photoresponse data in order to quantitatively compare the data. Electrical tests were performed and photoresponse images were recorded before and following life test intervals on two groups of test devices. Correlations were made between differences or changes in the electrical parameters of the test devices.

  20. Characterization of a 300-GHz Transmission System for Digital Communications

    NASA Astrophysics Data System (ADS)

    Hudlička, Martin; Salhi, Mohammed; Kleine-Ostmann, Thomas; Schrader, Thorsten

    2017-08-01

    The paper presents the characterization of a 300-GHz transmission system for modern digital communications. The quality of the modulated signal at the output of the system (error vector magnitude, EVM) is measured using a vector signal analyzer. A method using a digital real-time oscilloscope and subsequent mathematical processing in a computer is shown for the analysis of signals with bandwidths exceeding those of state-of-the-art vector signal analyzers. The uncertainty of the EVM measured using the real-time oscilloscope is analyzed. The behaviour of the 300-GHz transmission system is studied with respect to various modulation schemes and different symbol rates.
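
    As background, the sketch below shows one common way to compute RMS EVM from digitized symbols against an ideal constellation (normalisation conventions vary, and this is not the instrument-specific processing described in the paper); the QPSK constellation and noise level are assumed for illustration.

        import numpy as np

        def evm_percent(rx_symbols, ref_constellation):
            # RMS error vector magnitude (%) against the nearest ideal points,
            # normalised to the RMS power of the reference constellation.
            rx = np.asarray(rx_symbols)
            ref = np.asarray(ref_constellation)
            ideal = ref[np.argmin(np.abs(rx[:, None] - ref[None, :]), axis=1)]
            error_power = np.mean(np.abs(rx - ideal) ** 2)
            ref_power = np.mean(np.abs(ref) ** 2)
            return 100.0 * np.sqrt(error_power / ref_power)

        # assumed QPSK example with additive complex Gaussian noise
        qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
        tx = qpsk[np.random.randint(0, 4, 5000)]
        rx = tx + 0.05 * (np.random.randn(5000) + 1j * np.random.randn(5000))
        print(f"EVM = {evm_percent(rx, qpsk):.2f} %")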

  1. Improved stereo matching applied to digitization of greenhouse plants

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Xu, Lihong; Li, Dawei; Gu, Xiaomeng

    2015-03-01

    The digitization of greenhouse plants is an important aspect of digital agriculture. Its ultimate aim is to reconstruct a visible and interoperable virtual plant model on the computer using state-of-the-art image processing and computer graphics technologies. The most prominent difficulties in the digitization of greenhouse plants are how to acquire the three-dimensional shape data of the plants and how to carry out a realistic stereo reconstruction. To address these issues, this paper proposes an effective method for the digitization of greenhouse plants using a binocular stereo vision system. Stereo vision is a technique that aims to infer depth information from two or more cameras; it consists of four parts: calibration of the cameras, stereo rectification, search for stereo correspondences, and triangulation. Through the final triangulation procedure, the 3D point cloud of the plant can be obtained. The proposed stereo vision system can facilitate further segmentation of plant organs such as stems and leaves; moreover, it can provide reliable digital samples for the visualization of greenhouse tomato plants.
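
    To make the final triangulation step concrete, the sketch below back-projects a matched pixel from a rectified stereo pair to a 3D point using the standard pinhole relation Z = f·B/d; the focal length, baseline, and principal point are assumed example values, not the calibration of the authors' system.

        import numpy as np

        def triangulate_rectified(u, v, disparity, f_px, baseline_m, cx, cy):
            # Back-project a matched pixel from a rectified stereo pair to 3D:
            # Z = f*B/d, then X and Y from the pinhole model. All camera
            # parameters here are assumed example values.
            d = np.asarray(disparity, dtype=float)
            Z = f_px * baseline_m / d
            X = (np.asarray(u) - cx) * Z / f_px
            Y = (np.asarray(v) - cy) * Z / f_px
            return np.stack([X, Y, Z], axis=-1)

        point = triangulate_rectified(u=400, v=260, disparity=25,
                                      f_px=1200.0, baseline_m=0.06,
                                      cx=320.0, cy=240.0)
        print("3D point (m):", point)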

  2. Hybrid Modeling Based on Scsg-Br and Orthophoto

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, Y.; Yue, T.; Li, X.; Huang, W.; He, C.; Wu, Z.

    2018-05-01

    With the development of the digital city, digital applications are becoming more and more widespread while urban buildings grow more complex. Establishing an effective data model is therefore the key to expressing urban building models accurately. In addition, combining 3D building models with remote sensing data has become a trend in building the digital city, but it involves large amounts of data and results in data redundancy. In order to overcome the limitations of single-method modelling with constructive solid geometry (CSG), this paper presents a hybrid modelling method based on SCSG-BR for the representation of urban buildings. On one hand, the improved CSG method, called the "Spatial CSG" (SCSG) representation method, is used to represent the exterior shape of urban buildings. On the other hand, the boundary representation (BR) method represents the topological relationships between the geometric elements of an urban building, in which textures are treated as attribute data of the walls and roof. Moreover, a method combining a file database and a relational database is used to manage the data of the three-dimensional building model, which reduces the complexity of texture mapping. During data processing, a constrained least-squares algorithm is used to orthogonalize the building polygons and adjust the polygon topology to ensure the accuracy of the modelling data. Finally, the urban building model is matched with the corresponding orthophoto. Data from Denver, Colorado, USA are used to establish a realistic urban building model. The results show that the SCSG-BR method can represent the topological relations of buildings more precisely. The organization and management of the urban building model data reduce data redundancy and improve modelling speed. The combination of orthophoto and urban building model further strengthens applications in view analysis and spatial query, which enhances the scope of digital city applications.

  3. High-throughput method for ear phenotyping and kernel weight estimation in maize using ear digital imaging.

    PubMed

    Makanza, R; Zaman-Allah, M; Cairns, J E; Eyre, J; Burgueño, J; Pacheco, Ángela; Diepenbrock, C; Magorokosho, C; Tarekegne, A; Olsen, M; Prasanna, B M

    2018-01-01

    Grain yield and ear and kernel attributes can help in understanding the performance of maize plants under different environmental conditions and can be used in the variety development process to address farmers' preferences. These parameters are, however, still laborious and expensive to measure. A low-cost ear digital imaging method was developed that provides estimates of ear and kernel attributes, i.e., ear number and size, kernel number and size, as well as kernel weight, from photos of ears harvested from field trial plots. The image processing method uses a script that runs in batch mode in ImageJ, an open-source software package. Kernel weight was estimated using the total kernel number, derived from the number of kernels visible on the image, and the average kernel size. The data showed good agreement in terms of accuracy and precision between ground-truth measurements and data generated through image processing. Broad-sense heritability of the estimated parameters was in the range of, or higher than, that of measured grain weight. Limitations of the method for kernel weight estimation are discussed. The method developed in this work provides an opportunity to significantly reduce the cost of selection in the breeding process, especially for resource-constrained crop improvement programs, and can be used to learn more about the genetic bases of grain yield determinants.

  4. Modeling human faces with multi-image photogrammetry

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola

    2002-03-01

    Modeling and measurement of the human face have been increasing in importance for various purposes. Laser scanning, coded-light range digitizers, image-based approaches and digital stereo photogrammetry are the methods currently employed in medical applications, computer animation, video surveillance, teleconferencing and virtual reality to produce three-dimensional computer models of the human face. The requirements differ depending on the application; ours are primarily high measurement accuracy and automation of the process. The method presented in this paper is based on multi-image photogrammetry. The equipment, the method and the results achieved with this technique are described here. The process is composed of five steps: acquisition of multi-images, calibration of the system, establishment of corresponding points in the images, computation of their 3-D coordinates and generation of a surface model. The images captured by five CCD cameras arranged in front of the subject are digitized by a frame grabber. The complete system is calibrated using a reference object with coded target points, which can be measured fully automatically. To facilitate the establishment of correspondences in the images, texture in the form of random patterns can be projected from two directions onto the face. The multi-image matching process, based on a geometrically constrained least squares matching algorithm, produces a dense set of corresponding points in the five images. Neighborhood filters are then applied to the matching results to remove errors. After filtering the data, the three-dimensional coordinates of the matched points are computed by forward intersection using the results of the calibration process; the achieved mean accuracy is about 0.2 mm in the sagittal direction and about 0.1 mm in the lateral direction. The last step of data processing is the generation of a surface model from the point cloud and the application of smoothing filters. Moreover, a color texture image can be draped over the model to achieve a photorealistic visualization. The advantage of the presented method over laser scanning and coded-light range digitizers is the acquisition of the source data in a fraction of a second, allowing the measurement of human faces with higher accuracy and the possibility of measuring dynamic events such as the speech of a person.

  5. Free-running ADC- and FPGA-based signal processing method for brain PET using GAPD arrays

    NASA Astrophysics Data System (ADS)

    Hu, Wei; Choi, Yong; Hong, Key Jo; Kang, Jihoon; Jung, Jin Ho; Huh, Youn Suk; Lim, Hyun Keong; Kim, Sang Su; Kim, Byung-Tae; Chung, Yonghyun

    2012-02-01

    Currently, for most photomultiplier tube (PMT)-based PET systems, constant fraction discriminators (CFD) and time-to-digital converters (TDC) have been employed to detect gamma ray signal arrival time, whereas Anger logic circuits and peak-detection analog-to-digital converters (ADCs) have been implemented to acquire position and energy information of detected events. Compared to PMTs, Geiger-mode avalanche photodiodes (GAPDs) have a variety of advantages, such as compactness, low bias-voltage requirements and MRI compatibility. Furthermore, the individual read-out method using a GAPD array coupled 1:1 with an array scintillator can provide better image uniformity than can be achieved using PMTs and Anger logic circuits. Recently, a brain PET using 72 GAPD arrays (4×4 array, pixel size: 3 mm×3 mm) coupled 1:1 with LYSO scintillators (4×4 array, pixel size: 3 mm×3 mm×20 mm) has been developed for simultaneous PET/MRI imaging in our laboratory. Eighteen 64:1 position decoder circuits (PDCs) were used to reduce the number of GAPD channels, and three off-the-shelf free-running ADC and field programmable gate array (FPGA) combined data acquisition (DAQ) cards were used for data acquisition and processing. In this study, a free-running ADC- and FPGA-based signal processing method was developed for the detection of gamma ray signal arrival time, energy and position information all together for each GAPD channel. For the method developed herein, three DAQ cards continuously acquired 18 channels of pre-amplified analog gamma ray signals and 108-bit digital addresses from 18 PDCs. In the FPGA, the digitized gamma ray pulses and digital addresses were processed to generate data packages containing pulse arrival time, baseline value, energy value and GAPD channel ID. Finally, these data packages were saved to a 128 Mbyte on-board synchronous dynamic random access memory (SDRAM) and then transferred to a host computer for coincidence sorting and image reconstruction. In order to evaluate the functionality of the developed signal processing method, energy and timing resolutions for the brain PET were measured by placing a 6 μCi 22Na point source at the center of the PET scanner. Furthermore, the PET image of a hot rod phantom (rod diameter: from 2.5 mm to 6.5 mm) with an activity of 1 mCi was simulated, and then an image acquisition experiment was performed using the brain PET. The measured average energy resolution for 1152 GAPD channels and the system timing resolution were 19.5% (FWHM%) and 2.7 ns (FWHM), respectively. With regard to the acquisition of the hot rod phantom image, rods could be resolved down to a diameter of 2.5 mm, similar to the simulated results. The experimental results demonstrated that the signal processing method developed herein was successfully implemented for the brain PET. This reduces the complexity, cost and development time of the PET system relative to conventional PET electronics, and it will be useful for the development of high-performance investigational PET systems.
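
    A toy, software-only sketch of the per-channel pulse processing described above (baseline estimation, energy integration, and a constant-fraction-style arrival-time estimate) is given below; the pulse shape, sampling rate, and thresholds are assumptions for illustration and do not reproduce the FPGA firmware.

        import numpy as np

        def process_pulse(samples, fs, baseline_n=20, frac=0.5):
            # Toy per-channel processing: baseline from the first samples,
            # energy as the baseline-corrected integral, and arrival time from
            # a constant-fraction-style threshold crossing with interpolation.
            s = np.asarray(samples, dtype=float)
            baseline = s[:baseline_n].mean()
            p = s - baseline
            energy = p.sum() / fs               # arbitrary units
            threshold = frac * p.max()
            idx = int(np.argmax(p >= threshold))
            if idx > 0:
                t_cross = idx - (p[idx] - threshold) / (p[idx] - p[idx - 1])
            else:
                t_cross = float(idx)
            return baseline, energy, t_cross / fs

        # synthetic scintillation-like pulse; shape and rates are assumptions
        fs = 50e6
        t = np.arange(512) / fs
        rise = np.clip(t - 2e-6, 0.0, None)     # pulse arrives at 2 microseconds
        pulse = 100.0 + 800.0 * (np.exp(-rise / 500e-9) - np.exp(-rise / 50e-9))
        pulse += 5.0 * np.random.randn(t.size)
        print(process_pulse(pulse, fs))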

  6. Proposal of digital interface for the system of the air conditioner's remote control: analysis of the system of feedback.

    PubMed

    da Silva de Queiroz Pierre, Raisa; Kawada, Tarô Arthur Tavares; Fontes, André Guimarães

    2012-01-01

    The aim is to develop a proposal for a digital interface for the air conditioner remote control system that functions as a support system during operation of the air conditioner, adjusted for users in general on the basis of ergonomic parameters, with the objective of reducing the problems faced by the user and improving the process. Twenty people were surveyed with a questionnaire covering both the qualitative and the quantitative level. The Linear Method consists of a sequence of steps in which the input of each step depends on the output of the previous one, although the steps are independent. The feedback process, when necessary, must occur within each step separately.

  7. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with digital filtering to process the vortex sensor signal, in order to reduce the effect of low-frequency disturbances from pipe vibrations and to increase the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems are developed to implement these algorithms. One is an integrative system, and the other is a separated system. A limiting amplifier is designed in the analog input conditioning circuit to accommodate the large amplitude variation of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
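
    The sketch below illustrates the general idea of combining a digital filter with spectral analysis to extract the vortex shedding frequency from a vibration-contaminated sensor signal; the sampling rate, cut-off frequency, and synthetic signal are assumptions, not the parameters of the systems described in the paper.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 5000.0                             # assumed sampling rate (Hz)
        t = np.arange(0, 2, 1 / fs)
        # synthetic sensor signal: 180 Hz vortex shedding plus a strong
        # 15 Hz pipe-vibration disturbance and noise (illustrative only)
        x = (np.sin(2 * np.pi * 180 * t) + 2.0 * np.sin(2 * np.pi * 15 * t)
             + 0.3 * np.random.randn(t.size))

        # digital high-pass filter to suppress the low-frequency disturbance
        b, a = butter(4, 40.0 / (fs / 2), btype="highpass")
        y = filtfilt(b, a, x)

        # spectral analysis: take the strongest remaining spectral line
        spectrum = np.abs(np.fft.rfft(y))
        freqs = np.fft.rfftfreq(y.size, 1 / fs)
        print(f"estimated vortex frequency: {freqs[spectrum.argmax()]:.1f} Hz")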

  8. Hybrid acousto-optic and digital equalization for microwave digital radio channels

    NASA Astrophysics Data System (ADS)

    Anderson, C. S.; Vanderlugt, A.

    1990-11-01

    Digital radio transmission systems use complex modulation schemes that require powerful signal-processing techniques to correct channel distortions and to minimize BERs. This paper proposes combining the computation power of acoustooptic processing and the accuracy of digital processing to produce a hybrid channel equalizer that exceeds the performance of digital equalization alone. Analysis shows that a hybrid equalizer for 256-level quadrature amplitude modulation (QAM) performs better than a digital equalizer for 64-level QAM.

  9. Breaking down number syntax: spared comprehension of multi-digit numbers in a patient with impaired digit-to-word conversion.

    PubMed

    Dotan, Dror; Friedmann, Naama; Dehaene, Stanislas

    2014-10-01

    Can the meaning of two-digit Arabic numbers be accessed independently of their verbal-phonological representations? To answer this question we explored the number processing of ZN, an aphasic patient with a syntactic deficit in digit-to-verbal transcoding, who could hardly read aloud two-digit numbers but could read them as single digits ("four, two"). Neuropsychological examination showed that ZN's deficit was neither in the digit input nor in the phonological output processes, as he could copy and repeat two-digit numbers. His deficit thus lay in a central process that converts digits to abstract number words and sends this information to phonological retrieval processes. Crucially, in spite of this deficit in number transcoding, ZN's two-digit comprehension was spared in several ways: (1) he could calculate two-digit additions; (2) he showed good performance in a two-digit comparison task, and a continuous distance effect; and (3) his performance in a task of mapping numbers to positions on an unmarked number line showed a logarithmic (nonlinear) factor, indicating that he represented two-digit Arabic numbers as holistic two-digit quantities. Thus, at least these aspects of number comprehension can be performed without converting the two-digit number from digits to a verbal representation.

  10. A Channelization-Based DOA Estimation Method for Wideband Signals

    PubMed Central

    Guo, Rui; Zhang, Yue; Lin, Qianqiang; Chen, Zengping

    2016-01-01

    In this paper, we propose a novel direction of arrival (DOA) estimation method for wideband signals with sensor arrays. The proposed method splits the wideband array output into multiple frequency sub-channels and estimates the signal parameters using a digital channelization receiver. Based on the output sub-channels, a channelization-based incoherent signal subspace method (Channelization-ISM) and a channelization-based test of orthogonality of projected subspaces method (Channelization-TOPS) are proposed. Channelization-ISM applies narrowband signal subspace methods on each sub-channel independently. Then the arithmetic mean or geometric mean of the estimated DOAs from each sub-channel gives the final result. Channelization-TOPS measures the orthogonality between the signal and the noise subspaces of the output sub-channels to estimate DOAs. The proposed channelization-based method isolates signals in different bandwidths reasonably and improves the output SNR. It outperforms the conventional ISM and TOPS methods on estimation accuracy and dynamic range, especially in real environments. Besides, the parallel processing architecture makes it easy to implement on hardware. A wideband digital array radar (DAR) using direct wideband radio frequency (RF) digitization is presented. Experiments carried out in a microwave anechoic chamber with the wideband DAR are presented to demonstrate the performance. The results verify the effectiveness of the proposed method. PMID:27384566

  11. Digital technology in fixed implant prosthodontics.

    PubMed

    Joda, Tim; Ferrari, Marco; Gallucci, German O; Wittneben, Julia-Gabriela; Brägger, Urs

    2017-02-01

    Digital protocols are increasingly influencing prosthodontic treatment concepts. Implant-supported single-unit and short-span reconstructions will benefit mostly from the present digital trends. In these protocols, monolithic implant crowns connected to prefabricated titanium abutments, which are created based on data obtained from an intraoral scan followed by virtual design and production, without the need of a physical master cast, have to be considered in lieu of conventional manufacturing techniques for posterior implant restorations. No space for storage is needed in the complete digital workflow, and if a remake is required a replica of the original reconstruction can be produced quickly and inexpensively using rapid prototyping. The technological process is split into subtractive methods, such as milling or laser ablation, and additive processing, such as three-dimensional printing and selective laser melting. The dimensions of the supra-implant soft-tissue architecture can be calculated in advance of implant placement, according to the morphologic copy, and consequently are individualized for each patient. All these technologies have to be considered before implementing new digital dental workflows in daily routine. The correct indication and application are prerequisite and crucial for the success of the overall therapy, and, finally, for a satisfied patient. This includes a teamwork approach and equally affects the clinician, the dental assistant and the technician as well. The digitization process has the potential to change the entire dental profession. The major benefits will be reduced production costs, improvement in time efficiency and fulfilment of patients' perceptions of a modernized treatment concept. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Detection and Evaluation of Skin Disorders by One of Photogrammetric Image Analysis Methods

    NASA Astrophysics Data System (ADS)

    Güçin, M.; Patias, P.; Altan, M. O.

    2012-08-01

    Abnormalities on skin may vary from simple acne to painful wounds that affect a person's quality of life. Detection of these kinds of disorders in early stages, followed by evaluation of the abnormalities, is of high importance. Photogrammetry offers a non-contact solution to this concern by providing geometrically highly accurate data. Photogrammetry, which was first used for topographic purposes, has by virtue of terrestrial photogrammetry also become a useful technique in non-topographic applications (Wolf et al., 2000). Moreover, as the use of photogrammetry has expanded in parallel with technological development, analogue photographs have been replaced with digital images, and digital image processing techniques allow the modification of digital images using filters, registration processes, etc. In addition, photogrammetry (using the same coordinate system through registration of images) can serve as a tool for the comparison of temporal imaging data. The aim of this study is to examine several digital image processing techniques, in particular digital filters, which might be useful for determining skin disorders. In our study we examine affordable, user-friendly software that requires neither expertise nor pre-training. Since this is preliminary work for subsequent, deeper studies, Adobe Photoshop 7.0 is used as the present software. In addition, Adobe Photoshop released a DesAcc plug-in with the CS3 version, providing full compatibility with DICOM (Digital Imaging and Communications in Medicine) and PACS (Picture Archiving and Communications System), which enables doctors to store all medical data together with relevant images and share them if necessary.

  13. Enhancing the pictorial content of digital holograms at 100 frames per second.

    PubMed

    Tsang, P W M; Poon, T-C; Cheung, K W K

    2012-06-18

    We report a low-complexity, non-iterative method for enhancing the sharpness, brightness, and contrast of the pictorial content recorded in a digital hologram, without the need to re-generate the hologram from the original object scene. In our proposed method, the hologram is first back-projected to a 2-D virtual diffraction plane (VDP) located in close proximity to the original object points. Next, the field distribution on the VDP, which shares similar optical properties with the object scene, is enhanced. Subsequently, the processed VDP is expanded into a full hologram. We demonstrate two types of enhancement: a modified histogram equalization to improve the brightness and contrast, and localized high-boost filtering (LHBF) to increase the sharpness. Experimental results demonstrate that our proposed method is capable of enhancing a 2048x2048 hologram at a rate of around 100 frames per second. To the best of our knowledge, this is the first time real-time image enhancement has been considered in the context of digital holography.
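
    As a simple illustration of high-boost sharpening (the plain, non-localized form, not the LHBF variant or the VDP back-projection described in the paper), the sketch below adds a scaled high-frequency residual back to an image; the filter parameters and toy image are assumed.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def high_boost(image, k=1.5, sigma=2.0):
            # Plain high-boost filtering: add k times the high-frequency
            # residual (original minus low-pass) back to the image.
            img = np.asarray(image, dtype=float)
            return img + k * (img - gaussian_filter(img, sigma))

        # toy image standing in for a VDP magnitude: a soft-edged square
        img = np.zeros((256, 256))
        img[96:160, 96:160] = 1.0
        img = gaussian_filter(img, 3.0)
        sharpened = high_boost(img, k=2.0)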

  14. Real-time digital signal recovery for a multi-pole low-pass transfer function system.

    PubMed

    Lee, Jhinhwan

    2017-08-01

    In order to solve the problems of waveform distortion and signal delay by many physical and electrical systems with multi-pole linear low-pass transfer characteristics, a simple digital-signal-processing (DSP)-based method of real-time recovery of the original source waveform from the distorted output waveform is proposed. A mathematical analysis on the convolution kernel representation of the single-pole low-pass transfer function shows that the original source waveform can be accurately recovered in real time using a particular moving average algorithm applied on the input stream of the distorted waveform, which can also significantly reduce the overall delay time constant. This method is generalized for multi-pole low-pass systems and has noise characteristics of the inverse of the low-pass filter characteristics. This method can be applied to most sensors and amplifiers operating close to their frequency response limits to improve the overall performance of data acquisition systems and digital feedback control systems.
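
    The sketch below shows the basic idea for the single-pole case: the discretized low-pass recursion is inverted so that each source sample is recovered from the current and previous output samples. It is a minimal illustration under an assumed discretization, not the paper's specific moving-average formulation or its multi-pole generalization.

        import numpy as np

        def recover_single_pole(y, dt, tau):
            # Invert the discretised single-pole low-pass
            #   y[n] = y[n-1] + (dt/tau) * (x[n] - y[n-1])
            # so each source sample x[n] is recovered from y[n] and y[n-1].
            y = np.asarray(y, dtype=float)
            a = dt / tau
            x = np.empty_like(y)
            x[0] = y[0]
            x[1:] = y[:-1] + (y[1:] - y[:-1]) / a
            return x

        # demo: distort a square wave with tau = 10 ms, then recover it
        dt, tau = 1e-4, 1e-2
        t = np.arange(0, 0.2, dt)
        x_true = np.where((t // 0.05) % 2 == 0, 1.0, 0.0)
        y = np.empty_like(x_true)
        y[0] = x_true[0]
        for n in range(1, t.size):
            y[n] = y[n - 1] + (dt / tau) * (x_true[n] - y[n - 1])
        print("max recovery error:",
              float(np.max(np.abs(recover_single_pole(y, dt, tau) - x_true))))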

  15. Automation of extrusion of porous cable products based on a digital controller

    NASA Astrophysics Data System (ADS)

    Chostkovskii, B. K.; Mitroshin, V. N.

    2017-07-01

    This paper presents a new approach to designing an automated system for monitoring and controlling the process of applying porous insulation material on a conductive cable core, which is based on using structurally and parametrically optimized digital controllers of an arbitrary order instead of calculating typical PID controllers using known methods. The digital controller is clocked by signals from the clock length sensor of a measuring wheel, instead of a timer signal, and this provides the robust properties of the system with respect to the changing insulation speed. Digital controller parameters are tuned to provide the operating parameters of the manufactured cable using a simulation model of stochastic extrusion and are minimized by moving a regular simplex in the parameter space of the tuned controller.

  16. Analog Design for Digital Deployment of a Serious Leadership Game

    NASA Technical Reports Server (NTRS)

    Maxwell, Nicholas; Lang, Tristan; Herman, Jeffrey L.; Phares, Richard

    2012-01-01

    This paper presents the design, development, and user testing of a leadership development simulation. The authors share lessons learned from using a design process for a board game to allow for quick and inexpensive revision cycles during the development of a serious leadership development game. The goal of this leadership simulation is to accelerate the development of leadership capacity in high-potential mid-level managers (GS-15 level) in a federal government agency. Simulation design included a mixed-method needs analysis, using both quantitative and qualitative approaches to determine organizational leadership needs. Eight design iterations were conducted, including three user testing phases. Three re-design iterations followed initial development, enabling game testing as part of comprehensive instructional events. Subsequent design, development and testing processes targeted digital application to a computer- and tablet-based environment. Recommendations include pros and cons of development and learner testing of an initial analog simulation prior to full digital simulation development.

  17. Fast digital zooming system using directionally adaptive image interpolation and restoration.

    PubMed

    Kang, Wonseok; Jeon, Jaehwan; Yu, Soohwan; Paik, Joonki

    2014-01-01

    This paper presents a fast digital zooming system for mobile consumer cameras using directionally adaptive image interpolation and restoration methods. The proposed interpolation algorithm performs edge refinement along the initially estimated edge orientation using directionally steerable filters. Either the directionally weighted linear or adaptive cubic-spline interpolation filter is then selectively used according to the refined edge orientation for removing jagged artifacts in the slanted edge region. A novel image restoration algorithm is also presented for removing blurring artifacts caused by the linear or cubic-spline interpolation using the directionally adaptive truncated constrained least squares (TCLS) filter. Both proposed steerable filter-based interpolation and the TCLS-based restoration filters have a finite impulse response (FIR) structure for real time processing in an image signal processing (ISP) chain. Experimental results show that the proposed digital zooming system provides high-quality magnified images with FIR filter-based fast computational structure.

  18. Proposals for best-quality immunohistochemical staining of paraffin-embedded brain tissue slides in forensics.

    PubMed

    Trautz, Florian; Dreßler, Jan; Stassart, Ruth; Müller, Wolf; Ondruschka, Benjamin

    2018-01-03

    Immunohistochemistry (IHC) has become an integral part in forensic histopathology over the last decades. However, the underlying methods for IHC vary greatly depending on the institution, creating a lack of comparability. The aim of this study was to assess the optimal approach for different technical aspects of IHC, in order to improve and standardize this procedure. Therefore, qualitative results from manual and automatic IHC staining of brain samples were compared, as well as potential differences in suitability of common IHC glass slides. Further, possibilities of image digitalization and connected issues were investigated. In our study, automatic staining showed more consistent staining results, compared to manual staining procedures. Digitalization and digital post-processing facilitated direct analysis and analysis for reproducibility considerably. No differences were found for different commercially available microscopic glass slides regarding suitability of IHC brain researches, but a certain rate of tissue loss should be expected during the staining process.

  19. Wideband aperture array using RF channelizers and massively parallel digital 2D IIR filterbank

    NASA Astrophysics Data System (ADS)

    Sengupta, Arindam; Madanayake, Arjuna; Gómez-García, Roberto; Engeberg, Erik D.

    2014-05-01

    Wideband receive-mode beamforming applications in wireless location, electronically-scanned antennas for radar, RF sensing, microwave imaging and wireless communications require digital aperture arrays that offer a relatively constant far-field beam over several octaves of bandwidth. Several beamforming schemes, including the well-known true time-delay and phased array beamformers, have been realized using either finite impulse response (FIR) or fast Fourier transform (FFT) digital filter-sum based techniques. These beamforming algorithms offer the desired selectivity at the cost of a high computational complexity and frequency-dependent far-field array patterns. A novel approach to receiver beamforming is the use of massively parallel 2-D infinite impulse response (IIR) fan filterbanks for the synthesis of relatively frequency independent RF beams at an order of magnitude lower multiplier complexity compared to FFT or FIR filter based conventional algorithms. The 2-D IIR filterbanks demand fast digital processing that can support several octaves of RF bandwidth, and fast analog-to-digital converters (ADCs) for RF-to-bits type direct conversion of wideband antenna element signals. Fast digital implementation platforms that can realize high-precision recursive filter structures necessary for real-time beamforming, at RF radio bandwidths, are also desired. We propose a novel technique that combines a passive RF channelizer, multichannel ADC technology, and single-phase massively parallel 2-D IIR digital fan filterbanks, realized at low complexity using FPGA and/or ASIC technology. There exists native support for a larger bandwidth than the maximum clock frequency of the digital implementation technology. We also strive to achieve More-than-Moore throughput by processing a wideband RF signal having content with N-fold (B = N Fclk/2) bandwidth compared to the maximum clock frequency Fclk Hz of the digital VLSI platform under consideration. Such an increase in bandwidth is achieved without the use of polyphase signal processing or time-interleaved ADC methods. That is, all digital processors operate at the same Fclk clock frequency without phasing, while wideband operation is achieved by sub-sampling of narrower sub-bands at the RF channelizer outputs.

  20. A projector calibration method for monocular structured light system based on digital image correlation

    NASA Astrophysics Data System (ADS)

    Feng, Zhixin

    2018-02-01

    Projector calibration is crucial for a camera-projector three-dimensional (3-D) structured light measurement system, which has one camera and one projector. In this paper, a novel projector calibration method is proposed based on digital image correlation. In the method, the projector is viewed as an inverse camera, and a plane calibration board with feature points is used to calibrate the projector. During the calibration process, a random speckle pattern is projected onto the calibration board at different orientations to establish the correspondences between projector images and camera images. Thereby, the dataset for projector calibration is generated. The projector can then be calibrated using a well-established camera calibration algorithm. The experimental results confirm that the proposed method is accurate and reliable for projector calibration.

  1. Dim target detection method based on salient graph fusion

    NASA Astrophysics Data System (ADS)

    Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun

    2018-02-01

    Dim target detection is a key problem in the digital image processing field. With the development of multi-spectral imaging sensors, it has become a trend to improve the performance of dim target detection by fusing the information from different spectral images. In this paper, a dim target detection method based on salient graph fusion is proposed. In the method, multi-direction Gabor filters and multi-scale contrast filters were combined to construct a salient graph from the digital image. Then, a maximum-salience fusion strategy was designed to fuse the salient graphs from the different spectral images. A top-hat filter was used to detect dim targets from the fused salient graph. Experimental results show that the proposed method improved the probability of target detection and reduced the probability of false alarm on cluttered background images.
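
    The sketch below illustrates only the final detection step, applying a morphological white top-hat filter to suppress slowly varying clutter and keep a small bright target; the synthetic scene, structuring-element size, and single-band input (no Gabor or contrast salient-graph fusion) are assumptions for illustration.

        import numpy as np
        from scipy.ndimage import gaussian_filter, white_tophat

        # synthetic single-band scene: smooth clutter plus one dim point target
        rng = np.random.default_rng(0)
        scene = gaussian_filter(rng.normal(0.0, 1.0, (128, 128)), 6.0)
        scene[64, 80] += 0.8                     # the dim target (assumed)

        # the white top-hat keeps small bright structures and suppresses the
        # slowly varying background clutter
        residual = white_tophat(scene, size=7)
        row, col = np.unravel_index(int(residual.argmax()), residual.shape)
        print("detected target at:", (row, col))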

  2. Development of Total Knee Replacement Digital Templating Software

    NASA Astrophysics Data System (ADS)

    Yusof, Siti Fairuz; Sulaiman, Riza; Thian Seng, Lee; Mohd. Kassim, Abdul Yazid; Abdullah, Suhail; Yusof, Shahril; Omar, Masbah; Abdul Hamid, Hamzaini

    In this study, by taking full advantage of digital X-ray and computer technology, we have developed a semi-automated procedure for templating knee implants using a digital templating method. Using this approach, a software system called OrthoKneeTM has been designed and developed. The system is to be utilized in a study in the Department of Orthopaedics and Traumatology in the medical faculty of UKM (FPUKM). The OrthoKneeTM templating process employs a technique similar to the one many surgeons use with acetate templates over X-ray films. The template technique makes it easy to template various implants from implant manufacturers with a comprehensive database of templates. The templating functionality includes knee templates and manufacturer templates (Smith & Nephew and Zimmer). From an image of a patient X-ray, OrthoKneeTM templates help to quickly and easily read the approximate template size needed. The visual templating features then allow us to quickly review multiple template sizes against the X-ray and thus obtain a nearly precise view of the implant size required. The system can assist by templating on one patient image and will generate reports that can accompany patient notes. The software system was implemented in Visual Basic 6.0 Pro using object-oriented techniques to manage the graphics and objects. Approaches for image scaling are discussed. Several measurements used in the orthopaedic diagnosis process have been studied and added to this software as measurement tool features using mathematical theorems and equations. The study compared the results of the semi-automated (digital templating) method with the conventional method to demonstrate the accuracy of the system.

  3. Measuring digit lengths with 3D digital stereophotogrammetry: A comparison across methods.

    PubMed

    Gremba, Allison; Weinberg, Seth M

    2018-05-09

    We compared digital 3D stereophotogrammetry to more traditional measurement methods (direct anthropometry and 2D scanning) to capture digit lengths and ratios. The length of the second and fourth digits was measured by each method and the second-to-fourth ratio was calculated. For each digit measurement, intraobserver agreement was calculated for each of the three collection methods. Further, measurements from the three methods were compared directly to one another. Agreement statistics included the intraclass correlation coefficient (ICC) and technical error of measurement (TEM). Intraobserver agreement statistics for the digit length measurements were high for all three methods; ICC values exceeded 0.97 and TEM values were below 1 mm. For digit ratio, intraobserver agreement was also acceptable for all methods, with direct anthropometry exhibiting lower agreement (ICC = 0.87) compared to indirect methods. For the comparison across methods, the overall agreement was high for digit length measurements (ICC values ranging from 0.93 to 0.98; TEM values below 2 mm). For digit ratios, high agreement was observed between the two indirect methods (ICC = 0.93), whereas indirect methods showed lower agreement when compared to direct anthropometry (ICC < 0.75). Digit measurements and derived ratios from 3D stereophotogrammetry showed high intraobserver agreement (similar to more traditional methods) suggesting that landmarks could be placed reliably on 3D hand surface images. While digit length measurements were found to be comparable across all three methods, ratios derived from direct anthropometry tended to be higher than those calculated indirectly from 2D or 3D images. © 2018 Wiley Periodicals, Inc.
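
    For reference, the sketch below computes the two agreement statistics named above, the technical error of measurement for two sessions and a one-way random-effects ICC(1,1), on invented digit-length data; the paper does not state which ICC model was used, so this is only a generic illustration.

        import numpy as np

        def tem(m1, m2):
            # technical error of measurement for two repeated sessions
            d = np.asarray(m1, float) - np.asarray(m2, float)
            return np.sqrt(np.sum(d ** 2) / (2 * d.size))

        def icc_1_1(data):
            # one-way random-effects ICC(1,1); data shape: (subjects, repeats)
            x = np.asarray(data, float)
            n, k = x.shape
            subj_means = x.mean(axis=1)
            msb = k * np.sum((subj_means - x.mean()) ** 2) / (n - 1)
            msw = np.sum((x - subj_means[:, None]) ** 2) / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        # invented repeated digit-length measurements (mm) for five subjects
        s1 = np.array([70.1, 68.4, 72.9, 66.2, 71.5])
        s2 = np.array([70.4, 68.1, 73.2, 66.0, 71.9])
        print("TEM (mm):", round(float(tem(s1, s2)), 3))
        print("ICC(1,1):", round(float(icc_1_1(np.column_stack([s1, s2]))), 3))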

  4. Note: A new method for directly reducing the sampling jitter noise of the digital phasemeter

    NASA Astrophysics Data System (ADS)

    Liang, Yu-Rong

    2018-03-01

    The sampling jitter noise is one non-negligible noise source of the digital phasemeter used for space gravitational wave detection missions. This note provides a new method for directly reducing the sampling jitter noise of the digital phasemeter, by adding a dedicated signal of which the frequency, amplitude, and initial phase should be pre-set. In contrast to the phase correction using the pilot-tone in the work of Burnett, Gerberding et al., Liang et al., Ales et al., Gerberding et al., and Ware et al. [M.Sc. thesis, Luleå University of Technology, 2010; Classical Quantum Gravity 30, 235029 (2013); Rev. Sci. Instrum. 86, 016106 (2015); Rev. Sci. Instrum. 86, 084502 (2015); Rev. Sci. Instrum. 86, 074501 (2015); and Proceedings of the Earth Science Technology Conference (NASA, USA, 2006)], the new method is intrinsically additive noise suppression. The experiment results validate that the new method directly reduces the sampling jitter noise without data post-processing and provides the same phase measurement noise level (10-6 rad/Hz1/2 at 0.1 Hz) as the pilot-tone correction.

  5. Review of edgematchimg procedures for digital cartographic data used in Geographic Information Systems (GIS)

    USGS Publications Warehouse

    Nebert, D.D.

    1989-01-01

    In the process of developing a continuous hydrographic data layer for water resources applications in the Pacific Northwest, map-edge discontinuities in the U.S. Geological Survey 1:100,000-scale digital data that required application of computer-assisted edgematching procedures were identified. The spatial data sets required by the project must have line features that match closely enough across map boundaries to ensure full line topology when adjacent files are joined by the computer. Automated edgematching techniques are evaluated as to their effects on positional accuracy. Interactive methods such as selective node-matching and on-screen editing are also reviewed. Interactive procedures complement automated methods by allowing supervision of edgematching in a cartographic and hydrologic context. Common edge conditions encountered in the preparation of the Northwest Rivers data base are described, as are recommended processing solutions. Suggested edgematching procedures for 1:100,000-scale hydrography data are included in an appendix to encourage consistent processing of this theme on a national scale. (USGS)

  6. Compact hybrid optoelectrical unit for image processing and recognition

    NASA Astrophysics Data System (ADS)

    Cheng, Gang; Jin, Guofan; Wu, Minxian; Liu, Haisong; He, Qingsheng; Yuan, ShiFu

    1998-07-01

    In this paper a compact hybrid opto-electrical unit (CHOEU) for digital image processing and recognition is proposed. The central part of the CHOEU is an incoherent optical correlator, realized with a SHARP QA-1200 8.4 inch active-matrix TFT liquid crystal display panel that is used as two real-time spatial light modulators, for both the input image and the reference template. The CHOEU can perform two main processing tasks: one is digital filtering; the other is object matching. Using the CHOEU, an edge-detection operator is realized to extract the edges from the input images. The preprocessed images are then sent to the object recognition unit for identifying the important targets. A novel template-matching method is proposed for gray-tone image recognition. A positive and negative cycle-encoding method is introduced to realize absolute-difference-measurement pixel matching simply on a correlator structure. The system has good fault-tolerance against rotation distortion, Gaussian noise disturbance, and information loss. The experiments are given at the end of this paper.

  7. Light Weight MP3 Watermarking Method for Mobile Terminals

    NASA Astrophysics Data System (ADS)

    Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro

    This paper proposes a novel MP3 watermarking method which is applicable to a mobile terminal with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio contents, the watermark detection process should be executed at high speed. However, when conventional methods are used with a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high speed watermark embedding/detection for MP3 audio and also proposes the manipulation method which minimizes audio quality degradation adaptively. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame information without degrading audio quality and detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.

  8. Double-pulse digital speckle pattern interferometry for vibration analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Dazhi; Xue, Jingfeng; Chen, Lu; Wen, Juying; Wang, Jingjing

    2014-12-01

    A double-pulse Digital Speckle Pattern Interferometry (DSPI) system is established in the laboratory. Two goals are achieved at the same time: uniform distribution of the laser beam energy by means of a spatial filter, and successful recording of two successive pictures by a CCD camera. A two-dimensional discrete orthogonal wavelet transform method is then used for filtering. Using the DSPI, speckle patterns of a vibrating object are obtained with intervals of (2~800) μs, and a 3D plot of the transient vibration is achieved. Moreover, good agreement of the mode shapes and displacements is obtained by comparison with a Laser Doppler Vibrometer (LDV).

  9. Digital test signal generation: An accurate SNR calibration approach for the DSN

    NASA Technical Reports Server (NTRS)

    Gutierrez-Luaces, Benito O.

    1993-01-01

    In support of the ongoing automation of the Deep Space Network (DSN), a new method of generating analog test signals with an accurate signal-to-noise ratio (SNR) is described. High accuracy is obtained by simultaneous generation of digital noise and signal spectra at the desired bandwidth (baseband or bandpass). The digital synthesis provides a test signal embedded in noise with the statistical properties of a stationary random process. Accuracy depends on the test integration time and is limited only by the system quantization noise (0.02 dB). The monitor and control programs, as well as the signal-processing programs, reside in a personal computer (PC). Commands are transmitted to properly configure the specially designed high-speed digital hardware. The prototype can generate either two data channels, modulated or not on a subcarrier, or one QPSK channel, or a residual carrier with one biphase data channel. The analog spectrum generated is in the DC to 10 MHz frequency range. These spectra may be up-converted to any desired frequency without degrading the SNR characteristics provided. Test results are presented.
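
    The basic arithmetic behind setting an accurate SNR digitally is sketched below: a unit-power test tone is generated and the noise is scaled so that the signal-to-noise power ratio hits a prescribed value in dB. The tone frequency, sample rate, and record length are assumptions; the DSN prototype's actual spectrum synthesis is more elaborate.

        import numpy as np

        def tone_in_noise(target_snr_db, fs=1e6, f0=50e3, n=2 ** 16, rng=None):
            # Unit-power sinusoid plus white Gaussian noise scaled so that the
            # signal-to-noise power ratio equals target_snr_db. All parameters
            # are illustrative, not the DSN prototype's settings.
            rng = rng or np.random.default_rng()
            t = np.arange(n) / fs
            signal = np.sqrt(2.0) * np.sin(2 * np.pi * f0 * t)   # power = 1
            noise = rng.standard_normal(n)
            noise *= np.sqrt(10 ** (-target_snr_db / 10.0))      # set noise power
            return signal + noise, signal, noise

        x, s, w = tone_in_noise(target_snr_db=3.0)
        measured = 10 * np.log10(np.mean(s ** 2) / np.mean(w ** 2))
        print(f"measured SNR: {measured:.2f} dB")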

  10. Digital case-based learning system in school.

    PubMed

    Gu, Peipei; Guo, Jiayang

    2017-01-01

    With the continuing growth of multi-media learning resources, it is important to offer methods that help learners explore and acquire relevant learning information effectively. As a service that organizes multi-media learning materials to support programming learning, a digital case-based learning system is needed. In order to create a case-oriented e-learning system, this paper concentrates on the digital case study of multi-media resources and learning processes within an integrated framework. An integration of multi-media resources, testing, and learning-strategy recommendation as the learning unit is proposed in the digital case-based learning framework. The learning mechanism of learning guidance, multi-media materials learning, and testing feedback is supported in our project. An improved personalized genetic algorithm, which incorporates preference information and usage degree into the crossover and mutation process, is proposed to assemble a personalized test sheet for each learner. A learning-strategy recommendation solution is proposed to recommend learning strategies that help learners learn. Experiments are conducted to show that the proposed approaches are capable of constructing personalized test sheets and to demonstrate the effectiveness of the framework.

  11. Digital case-based learning system in school

    PubMed Central

    Gu, Peipei

    2017-01-01

    With the continuing growth of multimedia learning resources, it is important to offer methods that help learners explore and acquire relevant learning information effectively. A digital case-based learning system is needed as a service that organizes multimedia learning materials to support programming learning. To create such a case-oriented e-learning system, this paper concentrates on the digital case study of multimedia resources and learning processes within an integrated framework. The framework proposes a learning unit that integrates multimedia resources, testing, and learning-strategy recommendation. A learning mechanism of learning guidance, multimedia materials study, and testing feedback is supported in the project. An improved personalized genetic algorithm, which incorporates preference information and usage degree into the crossover and mutation process, is proposed to assemble a personalized test sheet for each learner. A learning-strategy recommendation solution is also proposed to help learners learn. Experiments demonstrate that the proposed approaches can construct personalized test sheets and confirm the effectiveness of the framework. PMID:29107965

  12. Digital Analysis and Sorting of Fluorescence Lifetime by Flow Cytometry

    PubMed Central

    Houston, Jessica P.; Naivar, Mark A.; Freyer, James P.

    2010-01-01

    Frequency-domain flow cytometry techniques are combined with modifications to the digital signal processing capabilities of the Open Reconfigurable Cytometric Acquisition System (ORCAS) to analyze fluorescence decay lifetimes and control sorting. Real-time fluorescence lifetime analysis is accomplished by rapidly digitizing correlated, radiofrequency modulated detector signals, implementing Fourier analysis programming with ORCAS’ digital signal processor (DSP) and converting the processed data into standard cytometric list mode data. To systematically test the capabilities of the ORCAS 50 MS/sec analog-to-digital converter (ADC) and our DSP programming, an error analysis was performed using simulated light scatter and fluorescence waveforms (0.5–25 ns simulated lifetime), pulse widths ranging from 2 to 15 µs, and modulation frequencies from 2.5 to 16.667 MHz. The standard deviations of digitally acquired lifetime values ranged from 0.112 to >2 ns, corresponding to errors in actual phase shifts from 0.0142° to 1.6°. The lowest coefficients of variation (<1%) were found for 10-MHz modulated waveforms having pulse widths of 6 µs and simulated lifetimes of 4 ns. Direct comparison of the digital analysis system to a previous analog phase-sensitive flow cytometer demonstrated similar precision and accuracy on measurements of a range of fluorescent microspheres, unstained cells and cells stained with three common fluorophores. Sorting based on fluorescence lifetime was accomplished by adding analog outputs to ORCAS and interfacing with a commercial cell sorter with a radiofrequency modulated solid-state laser. Two populations of fluorescent microspheres with overlapping fluorescence intensities but different lifetimes (2 and 7 ns) were separated to ~98% purity. Overall, the digital signal acquisition and processing methods we introduce present a simple yet robust approach to phase-sensitive measurements in flow cytometry. The ability to simply and inexpensively implement this system on a commercial flow sorter will both allow better dissemination of this technology and better exploit the traditionally underutilized parameter of fluorescence lifetime. PMID:20662090
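
    The phase-based lifetime estimate at the heart of this approach can be sketched with a single-bin Fourier sum: extract the phase of the fluorescence waveform relative to the excitation/scatter reference at the modulation frequency, then use tau = tan(phase)/(2*pi*f_mod). The signal model and parameters below are assumptions for illustration, not the ORCAS processing chain.

    ```python
    import numpy as np

    fs, f_mod, tau_true = 50e6, 10e6, 4e-9           # 50 MS/s ADC, 10 MHz modulation, 4 ns lifetime
    t = np.arange(0, 6e-6, 1 / fs)                   # 6 us pulse window
    phase_true = np.arctan(2 * np.pi * f_mod * tau_true)
    reference = 1 + 0.8 * np.cos(2 * np.pi * f_mod * t)
    fluor = 1 + 0.8 * np.cos(2 * np.pi * f_mod * t - phase_true) + 0.05 * np.random.randn(t.size)

    def phase_at(x, f, fs):
        """Phase of x at frequency f via a single-bin discrete Fourier sum."""
        n = np.arange(x.size)
        return np.angle(np.sum(x * np.exp(-2j * np.pi * f * n / fs)))

    phase_shift = phase_at(reference, f_mod, fs) - phase_at(fluor, f_mod, fs)
    tau_est = np.tan(phase_shift) / (2 * np.pi * f_mod)
    print(f"estimated lifetime: {tau_est * 1e9:.2f} ns")
    ```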

  13. Comparison of digital and conventional impression techniques: evaluation of patients’ perception, treatment comfort, effectiveness and clinical outcomes

    PubMed Central

    2014-01-01

    Background: The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort. Methods: Twenty-four (12 male, 12 female) subjects who had no previous experience with either conventional or digital impression participated in this study. Conventional impressions of maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects’ attitudes, preferences and perceptions towards impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time, etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon Rank test, and p < 0.05 was considered significant. Results: There were significant differences among the groups (p < 0.05) in terms of total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques. Conclusions: Digital impressions resulted in a more time-efficient technique than conventional impressions. Patients preferred the digital impression technique rather than conventional techniques. PMID:24479892

  14. Routine Digital Pathology Workflow: The Catania Experience

    PubMed Central

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized them sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leverages the benefits of digital imaging but also creates an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914

  15. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital signal domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing the information carried by reference pixels in the digital domain, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to the processing of low-voltage differential signals and the subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. Post-processing of NASA Hubble Space Telescope data, as well as on-board processing of data from all sensor channels for future deep-space cosmology projects, would benefit from this effort.
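
    For orientation, the heritage reference-pixel idea can be sketched as a simple per-row common-mode correction. This toy illustration is not the HHT-DPS method; the frame geometry, drift model and averaging are assumptions. As the abstract notes, plain averaging of the reference pixels tracks slow drifts but suppresses exactly the fast, high-variance component one would like to remove.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    rows, cols, n_ref = 512, 512, 4                      # 4 reference columns on each side (assumed)
    scene = rng.poisson(200, size=(rows, cols)).astype(float)
    row_drift = 5.0 * np.sin(np.linspace(0, 8 * np.pi, rows))[:, None]   # shared row-wise noise
    frame = scene + row_drift
    frame[:, :n_ref] = 100 + row_drift[:, 0:1]           # reference pixels see the drift only
    frame[:, -n_ref:] = 100 + row_drift[:, 0:1]

    ref = np.hstack([frame[:, :n_ref], frame[:, -n_ref:]])
    row_estimate = ref.mean(axis=1, keepdims=True) - ref.mean()   # per-row common-mode estimate
    corrected = frame.copy()
    corrected[:, n_ref:-n_ref] -= row_estimate                    # subtract from active pixels
    ```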

  16. Analysis of digital communication signals and extraction of parameters

    NASA Astrophysics Data System (ADS)

    Al-Jowder, Anwar

    1994-12-01

    The signal classification performance of four types of electronics support measure (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods which employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, where one of these is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNR's).
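
    A minimal sketch of one of the compared detectors, FFT-based narrowband energy detection of a BPSK signal in white Gaussian noise, is given below. The carrier, bit rate, band limits and threshold are placeholder assumptions, not values from the thesis.

    ```python
    import numpy as np

    fs, fc, rb, n = 1e6, 150e3, 10e3, 2**14          # sample rate, carrier, bit rate, record length
    rng = np.random.default_rng(2)
    bits = rng.integers(0, 2, int(n * rb / fs) + 1)
    t = np.arange(n) / fs
    phase = np.pi * bits[(t * rb).astype(int)]        # 0 or pi phase per bit
    snr_db = -5.0
    sig = np.cos(2 * np.pi * fc * t + phase)
    noise = rng.normal(scale=np.sqrt(0.5 * 10**(-snr_db / 10)), size=n)
    x = sig + noise

    spectrum = np.abs(np.fft.rfft(x))**2 / n
    freqs = np.fft.rfftfreq(n, 1 / fs)
    band = (freqs > fc - 2 * rb) & (freqs < fc + 2 * rb)
    narrowband_energy = spectrum[band].sum()          # energy in the expected signal band
    broadband_energy = spectrum.sum()                 # the broadband detector's statistic
    detected = narrowband_energy / broadband_energy > 0.1   # threshold is an arbitrary placeholder
    ```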

  17. Examination of the semi-automatic calculation technique of vegetation cover rate by digital camera images.

    NASA Astrophysics Data System (ADS)

    Takemine, S.; Rikimaru, A.; Takahashi, K.

    Rice is one of the staple foods of the world. High-quality rice production requires periodically collecting rice growth data to control the growth of the crop. The height of the plants, the number of stems, and the color of the leaves are well-known parameters that indicate rice growth, and a rice growth diagnosis method based on these parameters is used operationally in Japan, although collecting these parameters by field survey requires a lot of labor and time. Recently, a labor-saving method for rice growth diagnosis has been proposed that is based on the vegetation cover rate of rice. The vegetation cover rate is calculated by discriminating rice plant areas in a digital camera image photographed in the nadir direction. Discrimination of rice plant areas in the image is done by automatic binarization processing. However, when the vegetation cover rate calculation depends only on automatic binarization, the estimated cover rate can decrease even as the rice grows. In this paper, a calculation method for the vegetation cover rate is proposed that is based on automatic binarization and also refers to growth-hysteresis information. For several images obtained by field survey during the rice growing season, the vegetation cover rate was calculated by the conventional automatic binarization processing and by the proposed method, and the cover rates from both methods were compared with reference values obtained by visual interpretation. The comparison shows that the accuracy of discriminating rice plant areas was increased by the proposed method.
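
    The automatic-binarization step (not the proposed hysteresis-based refinement) can be illustrated as follows: compute an excess-green index from an RGB nadir photograph, threshold it with Otsu's method, and report the fraction of plant pixels as the cover rate. scikit-image and an RGB array scaled to [0, 1] are assumed.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def vegetation_cover_rate(rgb):
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        exg = 2 * g - r - b                     # excess-green index highlights plant pixels
        mask = exg > threshold_otsu(exg)        # automatic binarization
        return mask.mean()                      # fraction of pixels classified as rice plant

    # Example with a synthetic image: top half "vegetation", bottom half "soil"
    img = np.zeros((100, 100, 3))
    img[:50] = [0.2, 0.6, 0.2]
    img[50:] = [0.5, 0.4, 0.3]
    print(vegetation_cover_rate(img))           # approximately 0.5
    ```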

  18. Numerical solution of differential equations by artificial neural networks

    NASA Technical Reports Server (NTRS)

    Meade, Andrew J., Jr.

    1995-01-01

    Conventionally programmed digital computers can process numbers with great speed and precision, but do not easily recognize patterns or imprecise or contradictory data. Instead of being programmed in the conventional sense, artificial neural networks (ANN's) are capable of self-learning through exposure to repeated examples. However, the training of an ANN can be a time consuming and unpredictable process. A general method is being developed by the author to mate the adaptability of the ANN with the speed and precision of the digital computer. This method has been successful in building feedforward networks that can approximate functions and their partial derivatives from examples in a single iteration. The general method also allows the formation of feedforward networks that can approximate the solution to nonlinear ordinary and partial differential equations to desired accuracy without the need of examples. It is believed that continued research will produce artificial neural networks that can be used with confidence in practical scientific computing and engineering applications.
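
    The "single iteration" idea can be hinted at with a hedged sketch: a radial-basis-function network whose weights are obtained by one linear least-squares solve rather than iterative training. This illustrates the general concept only and does not reproduce the author's specific network construction.

    ```python
    import numpy as np

    def rbf_design(x, centers, width):
        return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

    x_train = np.linspace(0, 1, 40)
    y_train = np.sin(2 * np.pi * x_train)              # samples of the target function
    centers = np.linspace(0, 1, 15)
    Phi = rbf_design(x_train, centers, width=0.1)
    weights, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)   # single-shot "training"

    x_test = np.linspace(0, 1, 200)
    y_pred = rbf_design(x_test, centers, 0.1) @ weights
    # Because the basis functions are differentiable in closed form, the same weights
    # also yield an analytic approximation to the derivative of the target function.
    ```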

  19. Generation and coherent detection of QPSK signal using a novel method of digital signal processing

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan; Hu, Bingliang; He, Zhen-An; Xie, Wenjia; Gao, Xiaohui

    2018-02-01

    We demonstrate an optical quadrature phase-shift keying (QPSK) transmitter and an optical receiver that demodulates the QPSK signal using homodyne detection and digital signal processing (DSP). DSP is applied to the homodyne detection scheme without locking the phase of the local oscillator (LO). In this paper, we present a down-sampling method that extracts a one-dimensional array of samples to reduce unwanted samples in the constellation diagram measurement. This scheme offers the following major advantages over other conventional optical QPSK detection methods. First, the homodyne detection scheme does not impose strict requirements on the LO, in contrast with linear optical sampling, which requires a flat spectral density and phase over the spectral support of the source under test. Second, LabVIEW software is used directly to recover the QPSK signal constellation without employing a complex DSP circuit. Third, with minor changes the scheme is applicable to multilevel modulation formats such as M-ary PSK and quadrature amplitude modulation (QAM), as well as to higher-speed signals.
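
    As a simplified illustration of recovering a QPSK constellation without a phase-locked LO (not the authors' LabVIEW processing chain), the classic fourth-power (Viterbi-Viterbi) estimator can be used to remove an unknown carrier phase; all parameters below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 4000)))  # QPSK alphabet
    lo_phase = 0.7                                        # unknown LO phase offset
    received = symbols * np.exp(1j * lo_phase) \
        + 0.05 * (rng.normal(size=4000) + 1j * rng.normal(size=4000))

    # Raising to the fourth power removes the data modulation, leaving 4x the carrier phase.
    phase_est = np.angle(np.mean(received**4)) / 4 - np.pi / 4
    corrected = received * np.exp(-1j * phase_est)
    # Note: the estimate carries the usual pi/2 quadrant ambiguity, conventionally
    # resolved by differential encoding or known pilot symbols.
    ```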

  20. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.
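
    The linear "summation" and non-linear "cross-multiplication" operations can be sketched for a synthetic linear array steered over trial apparent velocities. The geometry, wavelet and noise level below are arbitrary illustration choices, not the USGS data.

    ```python
    import numpy as np

    fs, n = 200.0, 2000                          # 200 samples/s, 10 s records
    offsets = np.array([0.0, 1.0, 2.0, 3.0])     # sensor offsets along the line, km
    v_true = 8.0                                 # apparent velocity of the Pn signal, km/s
    t = np.arange(n) / fs
    wavelet = np.exp(-((t - 4.0) ** 2) / 0.02) * np.sin(2 * np.pi * 2.0 * (t - 4.0))
    rng = np.random.default_rng(4)
    traces = np.array([np.roll(wavelet, int(fs * d / v_true)) + 0.3 * rng.normal(size=n)
                       for d in offsets])

    def steer(traces, v):
        """Time-shift each trace to align a plane wave of apparent velocity v."""
        return np.array([np.roll(tr, -int(fs * d / v)) for tr, d in zip(traces, offsets)])

    for v in (6.0, 8.0, 10.0):
        aligned = steer(traces, v)
        beam = aligned.sum(axis=0)               # linear operation: summation
        xmul = np.prod(aligned, axis=0)          # non-linear operation: cross-multiplication
        print(v, beam.max(), xmul.max())         # the true velocity gives the largest outputs
    ```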

  1. Computer-aided design of tooth preparations for automated development of fixed prosthodontics.

    PubMed

    Yuan, Fusong; Sun, Yuchun; Wang, Yong; Lv, Peijun

    2014-01-01

    This paper introduces a method to digitally design a virtual model of a tooth preparation of the mandibular first molar using the commercial three-dimensional (3D) computer-aided design software packages Geomagic and Imageware, and to use the model as input to an automatic tooth-preparation system. The procedure included acquisition of 3D data from dentate casts and digital modeling of the shape of the tooth preparation components, such as the margin, occlusal surface, and axial surface. The completed model data were stored as stereolithography (STL) files, which were used in a tooth-preparation system to help plan the trajectory. The mathematical models required in the design process are also introduced. The method was used to make an individualized tooth preparation of the mandibular first molar; the entire process took 15 min. Using the method presented, a straightforward 3D shape of a full crown can be obtained to meet clinical needs prior to tooth preparation. © 2013 Published by Elsevier Ltd.

  2. Bayer Demosaicking with Polynomial Interpolation.

    PubMed

    Wu, Jiaji; Anisetti, Marco; Wu, Wei; Damiani, Ernesto; Jeon, Gwanggil

    2016-08-30

    Demosaicking is a digital image process that reconstructs full-color images from the incomplete color samples produced by an image sensor. It is an unavoidable step for many devices incorporating a camera sensor (e.g., mobile phones and tablets). In this paper, we introduce a new demosaicking algorithm, polynomial interpolation-based demosaicking (PID). Our method makes three contributions: calculation of error predictors, edge classification based on color differences, and a refinement stage using a weighted-sum strategy. Our new predictors are generated on the basis of polynomial interpolation and can be used as a sound alternative to other predictors obtained by bilinear or Laplacian interpolation. We show how these predictors can be combined according to the proposed edge classifier. After populating the three color channels, a refinement stage is applied to enhance image quality and reduce demosaicking artifacts. Our experimental results show that the proposed method substantially improves over existing demosaicking methods in terms of objective performance (CPSNR, S-CIELAB ΔE, and FSIM) and visual performance.
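
    For readers unfamiliar with demosaicking, the plain bilinear baseline mentioned above (not the proposed PID predictors) can be written compactly for an RGGB Bayer mosaic; SciPy is assumed.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(bayer):
        """Reconstruct RGB from a single-channel RGGB Bayer mosaic by bilinear interpolation."""
        h, w = bayer.shape
        rows, cols = np.mgrid[0:h, 0:w]
        r_mask = (rows % 2 == 0) & (cols % 2 == 0)
        b_mask = (rows % 2 == 1) & (cols % 2 == 1)
        g_mask = ~(r_mask | b_mask)

        k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

        out = np.empty((h, w, 3))
        out[..., 0] = convolve(bayer * r_mask, k_rb, mode="mirror")
        out[..., 1] = convolve(bayer * g_mask, k_g, mode="mirror")
        out[..., 2] = convolve(bayer * b_mask, k_rb, mode="mirror")
        return out
    ```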

  3. Image processing of aerodynamic data

    NASA Technical Reports Server (NTRS)

    Faulcon, N. D.

    1985-01-01

    The use of digital image processing techniques in analyzing and evaluating aerodynamic data is discussed. An image processing system that converts images derived from digital data or from transparent film into black and white, full color, or false color pictures is described. Applications to black and white images of a model wing with a NACA 64-210 section in simulated rain and to computed flow properties for transonic flow past a NACA 0012 airfoil are presented. Image processing techniques are used to visualize the variations of water film thicknesses on the wing model and to illustrate the contours of computed Mach numbers for the flow past the NACA 0012 airfoil. Since the computed data for the NACA 0012 airfoil are available only at discrete spatial locations, an interpolation method is used to provide values of the Mach number over the entire field.
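
    The interpolation step described in the last sentence can be sketched with scattered-data interpolation onto a regular grid before contouring. The data below are hypothetical; only the general technique is illustrated.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(5)
    pts = rng.uniform(-1, 1, size=(500, 2))                 # scattered (x, y) sample locations
    mach = 0.8 + 0.3 * np.exp(-((pts[:, 0] - 0.2) ** 2 + pts[:, 1] ** 2) / 0.1)  # placeholder field

    gx, gy = np.mgrid[-1:1:200j, -1:1:200j]
    mach_grid = griddata(pts, mach, (gx, gy), method="cubic")  # full-field values for contouring
    ```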

  4. WE-A-17A-06: Evaluation of An Automatic Interstitial Catheter Digitization Algorithm That Reduces Treatment Planning Time and Provide Means for Adaptive Re-Planning in HDR Brachytherapy of Gynecologic Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dise, J; Liang, X; Lin, L

    Purpose: To evaluate an automatic interstitial catheter digitization algorithm that reduces treatment planning time and provides a means for adaptive re-planning in HDR brachytherapy of gynecologic cancers. Methods: The semi-automatic catheter digitization tool utilizes a region growing algorithm in conjunction with a spline model of the catheters. The CT images were first pre-processed to enhance the contrast between the catheters and soft tissue. Several seed locations were selected in each catheter for the region growing algorithm. The spline model of the catheters assisted in the region growing by preventing inter-catheter cross-over caused by air or metal artifacts. Source dwell positions from day one CT scans were applied to subsequent CTs and forward calculated using the automatically digitized catheter positions. This method was applied to 10 patients who had received HDR interstitial brachytherapy on an IRB approved image-guided radiation therapy protocol. The prescribed dose was 18.75 or 20 Gy delivered in 5 fractions, twice daily, over 3 consecutive days. Dosimetric comparisons were made between automatic and manual digitization on day two CTs. Results: The region growing algorithm, assisted by the spline model of the catheters, was able to digitize all catheters. The difference between automatic and manually digitized positions was 0.8±0.3 mm. The digitization time ranged from 34 minutes to 43 minutes with a mean digitization time of 37 minutes. The bulk of the time was spent on manual selection of initial seed positions and spline parameter adjustments. There was no significant difference in dosimetric parameters between the automatic and manually digitized plans. D90% to the CTV was 91.5±4.4% for the manual digitization versus 91.4±4.4% for the automatic digitization (p=0.56). Conclusion: A region growing algorithm was developed to semi-automatically digitize interstitial catheters in HDR brachytherapy using the Syed-Neblett template. This automatic digitization tool was shown to be accurate compared to manual digitization.
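
    A generic seeded region-growing routine, shown below on a 2-D slice, conveys the core idea; the clinical tool additionally constrains growth with a spline model of each catheter and operates on 3-D CT volumes, which is not reproduced here.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(image, seed, tol):
        """Collect connected pixels whose intensity stays within tol of the seed value."""
        grown = np.zeros(image.shape, dtype=bool)
        seed_val = image[seed]
        queue = deque([seed])
        grown[seed] = True
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                        and not grown[rr, cc] and abs(image[rr, cc] - seed_val) <= tol):
                    grown[rr, cc] = True
                    queue.append((rr, cc))
        return grown

    # Example: a bright "catheter" line on a dark background
    img = np.zeros((64, 64))
    img[20, 5:60] = 1000.0
    mask = region_grow(img, seed=(20, 10), tol=100.0)
    ```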

  5. Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.

    PubMed

    Easlon, Hsien Ming; Bloom, Arnold J

    2014-07-01

    Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
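
    The pixel-to-area scaling idea can be sketched in a few lines (this is not the Easy Leaf Area implementation; the color-ratio thresholds and the 4 cm² calibration square are assumptions): count green-dominant leaf pixels, count red calibration pixels, and scale by the known calibration area.

    ```python
    import numpy as np

    def leaf_area_cm2(rgb, calibration_area_cm2=4.0):
        """Estimate leaf area from an RGB image containing a red calibration square of known size."""
        r, g, b = rgb[..., 0].astype(float), rgb[..., 1].astype(float), rgb[..., 2].astype(float)
        leaf_pixels = ((g > 1.2 * r) & (g > 1.2 * b)).sum()     # green-dominant pixels
        cal_pixels = ((r > 1.5 * g) & (r > 1.5 * b)).sum()      # red calibration-square pixels
        return calibration_area_cm2 * leaf_pixels / cal_pixels  # cm^2 per pixel times leaf pixels
    ```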

  6. Rolling Shutter Effect aberration compensation in Digital Holographic Microscopy

    NASA Astrophysics Data System (ADS)

    Monaldi, Andrea C.; Romero, Gladis G.; Cabrera, Carlos M.; Blanc, Adriana V.; Alanís, Elvio E.

    2016-05-01

    Due to the sequential-readout nature of most CMOS sensors, each row of the sensor array is exposed at a different time, resulting in the so-called rolling shutter effect that induces geometric distortion in the image if the video camera or the object moves during image acquisition. Particularly in digital hologram recording, while the sensor progressively captures each row of the hologram, the interferometric fringes can oscillate due to external vibrations and/or noise even when the object under study remains motionless. The sensor records each hologram row at a different instant of these disturbances. As a final effect, the phase information is corrupted, degrading the quality of the reconstructed holograms. We present a fast and simple method for compensating this effect based on image processing tools. The method is exemplified with holograms of static microscopic biological objects. These results encourage the use of CMOS sensors over CCDs in Digital Holographic Microscopy, given their better resolution and lower cost.

  7. Frequency-feature based antistrong-disturbance signal processing method and system for vortex flowmeter with single sensor

    NASA Astrophysics Data System (ADS)

    Xu, Ke-Jun; Luo, Qing-Lin; Wang, Gang; Liu, San-Shan; Kang, Yi-Bo

    2010-07-01

    Digital signal processing methods have been applied to the vortex flowmeter to extract useful information from the noisy output of the vortex flow sensor. However, these approaches fail when the power of the mechanical vibration noise is larger than that of the vortex flow signal. To solve this problem, an antistrong-disturbance signal processing method is proposed, based on the frequency features of the vortex flow signal and the mechanical vibration noise, for the vortex flowmeter with a single sensor. The frequency bandwidth of the vortex flow signal differs from that of the mechanical vibration noise, and the autocorrelation function can represent the bandwidth features of the signal and noise. The output of the vortex flow sensor is processed by spectrum analysis, filtered by bandpass filters, and its autocorrelation function is evaluated at a fixed delay time and at τ = 0 to obtain ratios. The frequency corresponding to the minimal ratio is regarded as the vortex flow frequency. With an ultralow-power microcontroller, a digital signal processing system is developed to implement the antistrong-disturbance algorithm while ensuring low-power, two-wire operation that meets the requirements of process instrumentation. Water flow-rate calibration and vibration test experiments were conducted, and the experimental results show that both the algorithm and the system are effective.
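
    A simplified sketch of the frequency-feature idea follows: band-pass the sensor output around each candidate frequency with SciPy filters and compute the normalized autocorrelation R(tau0)/R(0) in each band, the quantity on which the abstract's selection rule (minimal ratio identifies the vortex frequency) operates. The filter order, bandwidths, delay and signal model are placeholder assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 2000.0
    t = np.arange(0, 4, 1 / fs)
    vortex = np.sin(2 * np.pi * 120 * t)                       # vortex flow signal
    vibration = 3.0 * np.sin(2 * np.pi * 50 * t)               # strong mechanical vibration
    x = vortex + vibration + 0.2 * np.random.randn(t.size)

    def band_ratio(x, f_center, bw, tau0_samples):
        """Normalized autocorrelation of the band-passed signal at a fixed delay."""
        b, a = butter(4, [(f_center - bw) / (fs / 2), (f_center + bw) / (fs / 2)], btype="band")
        y = filtfilt(b, a, x)
        r0 = np.mean(y * y)                                    # autocorrelation at tau = 0
        rtau = np.mean(y[:-tau0_samples] * y[tau0_samples:])   # autocorrelation at the fixed delay
        return rtau / r0

    candidates = np.arange(40, 200, 10.0)
    ratios = [band_ratio(x, f, bw=8.0, tau0_samples=25) for f in candidates]
    # The paper's rule then selects the candidate frequency with the minimal ratio.
    ```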

  8. Frequency-feature based antistrong-disturbance signal processing method and system for vortex flowmeter with single sensor.

    PubMed

    Xu, Ke-Jun; Luo, Qing-Lin; Wang, Gang; Liu, San-Shan; Kang, Yi-Bo

    2010-07-01

    Digital signal processing methods have been applied to the vortex flowmeter to extract useful information from the noisy output of the vortex flow sensor. However, these approaches fail when the power of the mechanical vibration noise is larger than that of the vortex flow signal. To solve this problem, an antistrong-disturbance signal processing method is proposed, based on the frequency features of the vortex flow signal and the mechanical vibration noise, for the vortex flowmeter with a single sensor. The frequency bandwidth of the vortex flow signal differs from that of the mechanical vibration noise, and the autocorrelation function can represent the bandwidth features of the signal and noise. The output of the vortex flow sensor is processed by spectrum analysis, filtered by bandpass filters, and its autocorrelation function is evaluated at a fixed delay time and at τ = 0 to obtain ratios. The frequency corresponding to the minimal ratio is regarded as the vortex flow frequency. With an ultralow-power microcontroller, a digital signal processing system is developed to implement the antistrong-disturbance algorithm while ensuring low-power, two-wire operation that meets the requirements of process instrumentation. Water flow-rate calibration and vibration test experiments were conducted, and the experimental results show that both the algorithm and the system are effective.

  9. Accuracy and precision of two indirect methods for estimating canopy fuels

    Treesearch

    Abran Steele-Feldman; Elizabeth Reinhardt; Russell A. Parsons

    2006-01-01

    We compared the accuracy and precision of digital hemispherical photography and the LI-COR LAI-2000 plant canopy analyzer as predictors of canopy fuels. We collected data on 12 plots in western Montana under a variety of lighting and sky conditions, and used a variety of processing methods to compute estimates. Repeated measurements from each method displayed...

  10. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  11. Triboelectric Nanogenerator as a Self-Powered Communication Unit for Processing and Transmitting Information.

    PubMed

    Yu, Aifang; Chen, Xiangyu; Wang, Rui; Liu, Jingyu; Luo, Jianjun; Chen, Libo; Zhang, Yang; Wu, Wei; Liu, Caihong; Yuan, Hongtao; Peng, Mingzeng; Hu, Weiguo; Zhai, Junyi; Wang, Zhong Lin

    2016-04-26

    In this paper, we demonstrate an application of a triboelectric nanogenerator (TENG) as a self-powered communication unit. An elaborately designed TENG is used to translate a series of environmental triggering signals into binary digital signals and drives an electronic-optical device to transmit binary digital data in real-time without an external power supply. The elaborately designed TENG is built in a membrane structure that can effectively drive the electronic-optical device in a bandwidth from 1.30 to 1.65 kHz. Two typical communication modes (amplitude-shift keying and frequency-shift keying) are realized through the resonant response of TENG to different frequencies, and two digital signals, i.e., "1001" and "0110", are successfully transmitted and received through this system, respectively. Hence, in this study, a simple but efficient method for directly transmitting ambient vibration to the receiver as a digital signal is established using an elaborately designed TENG and an optical communication technique. This type of the communication system, as well as the implementation method presented, exhibits great potential for applications in the smart city, smart home, password authentication, and so on.
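
    A toy sketch of the frequency-shift-keying mode described above is given below (not the TENG hardware or its receiver): each bit of the "1001" message is mapped to one of two tones chosen inside the reported 1.30-1.65 kHz response band; the sample rate and bit duration are assumptions.

    ```python
    import numpy as np

    fs, bit_duration = 48_000, 0.05
    f0, f1 = 1300.0, 1650.0                       # "0" and "1" tones within the TENG bandwidth
    bits = [1, 0, 0, 1]                           # the "1001" message from the experiment

    t_bit = np.arange(0, bit_duration, 1 / fs)
    waveform = np.concatenate([np.sin(2 * np.pi * (f1 if b else f0) * t_bit) for b in bits])

    # A receiver can recover each bit by comparing the energy at f0 and f1 in each symbol window.
    ```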

  12. Enhancement of digital radiography image quality using a convolutional neural network.

    PubMed

    Sun, Yuewen; Li, Litao; Cong, Peng; Wang, Zhentao; Guo, Xiaojing

    2017-01-01

    Digital radiography systems are widely used for noninvasive security checks and medical imaging examinations. However, these systems are limited by lower image quality in terms of spatial resolution and signal-to-noise ratio. In this study, we explored whether the image quality of a digital radiography system can be improved with a modified convolutional neural network that generates high-resolution, reduced-noise images from the original low-quality images. An experiment on a test dataset of 5 X-ray images showed that the proposed method outperformed traditional methods (bicubic interpolation and a 3D block-matching approach) by about 1.3 dB in peak signal-to-noise ratio (PSNR) while keeping the processing time within one second. The results demonstrate that a residual-to-residual (RTR) convolutional neural network markedly improves the rendering of structural details by increasing image resolution and reducing image noise. This study therefore indicates that the RTR convolutional neural network is useful for improving the image quality of digital radiography systems.
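
    The general residual-learning idea behind such networks can be sketched as a hypothetical miniature model in PyTorch: the network predicts a residual that is added back to the low-quality input. This is not the paper's RTR architecture or its trained weights; the layer counts, channel widths and input size are arbitrary.

    ```python
    import torch
    import torch.nn as nn

    class TinyResidualEnhancer(nn.Module):
        def __init__(self, channels=1, features=32, depth=4):
            super().__init__()
            layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(depth - 2):
                layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
            layers.append(nn.Conv2d(features, channels, 3, padding=1))
            self.body = nn.Sequential(*layers)

        def forward(self, x):
            return x + self.body(x)       # learn the residual, add it back to the input

    model = TinyResidualEnhancer()
    low_quality = torch.randn(1, 1, 128, 128)      # placeholder radiograph patch
    enhanced = model(low_quality)
    ```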

  13. Digital adaptive optics confocal microscopy based on iterative retrieval of optical aberration from a guidestar hologram

    PubMed Central

    Liu, Changgeng; Thapa, Damber; Yao, Xincheng

    2017-01-01

    Guidestar hologram based digital adaptive optics (DAO) is one recently emerging active imaging modality. It records each complex distorted line field reflected or scattered from the sample by an off-axis digital hologram, measures the optical aberration from a separate off-axis digital guidestar hologram, and removes the optical aberration from the distorted line fields by numerical processing. In previously demonstrated DAO systems, the optical aberration was directly retrieved from the guidestar hologram by taking its Fourier transform and extracting the phase term. For the direct retrieval method (DRM), when the sample is not coincident with the guidestar focal plane, the accuracy of the optical aberration retrieved by DRM undergoes a fast decay, leading to quality deterioration of corrected images. To tackle this problem, we explore here an image metrics-based iterative method (MIM) to retrieve the optical aberration from the guidestar hologram. Using an aberrated objective lens and scattering samples, we demonstrate that MIM can improve the accuracy of the retrieved aberrations from both focused and defocused guidestar holograms, compared to DRM, to improve the robustness of the DAO. PMID:28380937
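
    A heavily simplified 1-D illustration of the direct retrieval method (DRM) mentioned above is sketched below: form an off-axis hologram of a guidestar beam carrying an unknown aberration, isolate the +1 order in the Fourier domain, and take its phase as the aberration estimate; the same phase would then be conjugated out of the distorted object fields. All quantities (aberration polynomial, carrier frequency, crop width) are assumptions for illustration only.

    ```python
    import numpy as np

    n = 1024
    x = np.linspace(-1, 1, n)
    aberration = 3.0 * x**2 + 1.5 * x**3                 # unknown phase aberration (radians)
    guidestar = np.exp(1j * aberration)                  # unit-amplitude guidestar field
    reference = np.exp(1j * 2 * np.pi * 100 * x)         # tilted off-axis reference wave

    hologram = np.abs(guidestar + reference) ** 2        # intensity recorded by the camera

    spectrum = np.fft.fftshift(np.fft.fft(hologram))
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0]))
    order = np.where(np.abs(freqs + 100) < 30, spectrum, 0)   # isolate the G*conj(R) sideband
    field = np.fft.ifft(np.fft.ifftshift(order))
    field *= np.exp(1j * 2 * np.pi * 100 * x)            # remove the carrier tilt
    phase_est = np.unwrap(np.angle(field))               # retrieved aberration (up to a constant)
    # A distorted object field would then be corrected by multiplying with exp(-1j * phase_est).
    ```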

  14. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation a BPSK receiver using digital signal processing hardware.
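
    A minimal numpy sketch of the DSP steps outlined above is given below, assuming the carrier frequency and bit timing are already known; a full receiver would add carrier and symbol synchronization. The sample rate, carrier and bit rate are arbitrary illustration values.

    ```python
    import numpy as np

    fs, fc, rb = 48_000, 6_000, 1_000                 # sample rate, carrier, bit rate (Hz)
    bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    sps = fs // rb                                    # samples per bit
    t = np.arange(bits.size * sps) / fs
    phase = np.pi * np.repeat(bits, sps)              # 0 or 180 degree phase per bit
    bpsk = np.cos(2 * np.pi * fc * t + phase) + 0.3 * np.random.randn(t.size)

    mixed = bpsk * np.cos(2 * np.pi * fc * t)         # coherent mixing with the recovered carrier
    integrated = mixed.reshape(bits.size, sps).sum(axis=1)   # integrate-and-dump low-pass filter
    recovered = (integrated < 0).astype(int)          # cos(pi) = -1, so bit 1 gives a negative sum
    print(recovered)                                   # matches the transmitted bits
    ```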

  15. Relevance of 19th century continuous tone photomechanical printing techniques to digitally generated imagery

    NASA Astrophysics Data System (ADS)

    Hoskins, Stephen; Thirkell, Paul

    2003-01-01

    Collotype and Woodburytype are late 19th/early 20th century continuous-tone methods of reproducing photography in print which do not have an underlying dot structure. The aesthetic and tactile qualities produced by these methods at their best have never been surpassed. Woodburytype is the only photomechanical print process using a printing matrix and ink that is capable of rendering true continuous tone; it also has the characteristic of rendering a photographic image by mapping a three-dimensional surface topography. Collotype's absence of an underlying dot structure enables an image to be printed in as many colours as desired without creating any form of interference structure. Research at the Centre for Fine Print Research, UWE Bristol, aims to recreate these processes for artists and photographers and assess their potential to create a digitally generated image printed in full colour and continuous tone that will not fade or deteriorate. Through this research the Centre seeks to provide a context in which the development of current four-colour CMYK printing may be viewed as an expedient rather than a logical route for the development of colour printing within the framework of digitally generated hard copy paper output.

  16. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2017-02-01

    Traditional aggregate size measuring methods are time-consuming, taxing, and do not deliver online measurements. A new online detection system for determining aggregate size and shape, based on a digital camera with a charge-coupled device and subsequent digital image processing, has been developed to overcome these problems. The system captures images of aggregates while falling and while flat lying. Using these data, the particle size and shape distribution can be obtained in real time. Here, we calibrate this method using standard globules. Our experiments show that the maximum particle size distribution error was only 3 wt%, while the maximum particle shape distribution error was only 2 wt% for data derived from falling aggregates, which are well dispersed. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt% and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. However, they performed well for single-graded aggregates and did not require a dispersion device. Our system is low-cost and easy to install. It can successfully achieve online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.

  17. Classifying Physical Morphology of Cocoa Beans Digital Images using Multiclass Ensemble Least-Squares Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lawi, Armin; Adhitya, Yudhi

    2018-03-01

    The objective of this research is to determine the quality of cocoa beans from the morphology of their digital images. Samples of cocoa beans were scattered on bright white paper under controlled lighting conditions, and a compact digital camera was used to capture the images. The images were then processed to extract morphological parameters. The classification process begins with an analysis of the cocoa bean images based on morphological feature extraction. The extracted morphological (physical) feature parameters are Area, Perimeter, Major Axis Length, Minor Axis Length, Aspect Ratio, Circularity, Roundness, and Feret Diameter. The cocoa beans are classified into 4 groups: Normal Beans, Broken Beans, Fractured Beans, and Skin-Damaged Beans. The classification model used in this paper is the Multiclass Ensemble Least-Squares Support Vector Machine (MELS-SVM), a proposed improvement of the SVM in which the separating hyperplanes are obtained by a least-squares approach and the multiclass procedure uses the One-Against-All method within an ensemble. The results show that classification with the morphological feature input parameters achieved an accuracy of 99.705% over the four classes.
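
    The morphological feature-extraction step can be sketched with scikit-image region properties (the MELS-SVM classifier itself is not reproduced here, and the synthetic mask below is only a placeholder): label each bean in a binary mask and compute shape parameters of the kind listed above.

    ```python
    import numpy as np
    from skimage.measure import label, regionprops

    def bean_features(binary_mask):
        feats = []
        for region in regionprops(label(binary_mask)):
            area, perim = region.area, region.perimeter
            major, minor = region.major_axis_length, region.minor_axis_length
            feats.append({
                "area": area,
                "perimeter": perim,
                "aspect_ratio": major / minor if minor > 0 else np.inf,
                "circularity": 4 * np.pi * area / perim**2 if perim > 0 else 0.0,
            })
        return feats

    # Example: two elliptical "beans" on a dark background
    yy, xx = np.mgrid[0:200, 0:200]
    mask = (((xx - 60)**2 / 900 + (yy - 80)**2 / 400) < 1) | (((xx - 150)**2 / 400 + (yy - 140)**2 / 400) < 1)
    print(bean_features(mask))
    ```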

  18. A precise time synchronization method for 5G based on radio-over-fiber network with SDN controller

    NASA Astrophysics Data System (ADS)

    He, Linkuan; Wei, Baoguo; Yang, Hui; Yu, Ao; Wang, Zhengyong; Zhang, Jie

    2018-02-01

    There is an increasing demand for accurate time synchronization as the bandwidth of 5G network services grows. In a 5G network, base stations must achieve accurate time synchronization to guarantee the quality of communication. To maintain accurate time in the 5G network, we propose a time synchronization system for satellite ground stations based on a radio-over-fiber network (RoFN) with a software-defined optical network (SDON) controller. The advantage of this method is that it improves the accuracy of time synchronization at the ground station. The IEEE 1588 time synchronization protocol can address the problems of high cost and lack of precision; however, during time synchronization, distortion arises in the transmission of the digital time signal. RoF uses analog optical transmission links, so analog transmission can be used among ground stations instead of digital transmission, avoiding the distortion and bandwidth waste of digital synchronization. Additionally, the concept of SDN (software-defined networking) can optimize the RoFN through centralized control and a simplified base station. Simulations were carried out to demonstrate the superiority of the approach.

  19. Pendant-Drop Surface-Tension Measurement On Molten Metal

    NASA Technical Reports Server (NTRS)

    Man, Kin Fung; Thiessen, David

    1996-01-01

    Method of measuring surface tension of molten metal based on pendant-drop method implemented in quasi-containerless manner and augmented with digital processing of image data. Electrons bombard lower end of sample rod in vacuum, generating hanging drop of molten metal. Surface tension of drop computed from its shape. Technique minimizes effects of contamination.

  20. [Detection of lung nodules. New opportunities in chest radiography].

    PubMed

    Pötter-Lang, S; Schalekamp, S; Schaefer-Prokop, C; Uffmann, M

    2014-05-01

    Chest radiography still represents the most commonly performed X-ray examination because it is readily available, requires low radiation doses, and is relatively inexpensive. However, as previously published, many initially undetected lung nodules are retrospectively visible in chest radiographs. Great improvements in detector technology, with increasing dose efficiency and improved contrast resolution, provide better image quality and reduced dose requirements. The dual-energy acquisition technique and advanced image processing methods (e.g., digital bone subtraction and temporal subtraction) reduce anatomical background noise by suppressing overlapping structures in chest radiography. Computer-aided detection (CAD) schemes increase radiologists' awareness of suspicious areas. These advanced image processing methods show clear improvements in the detection of pulmonary nodules in chest radiography and strengthen the role of this method in comparison with 3D acquisition techniques such as computed tomography (CT). Many of these methods will probably be integrated into standard clinical practice in the near future. Digital software solutions offer advantages because they can be easily incorporated into radiology departments and are often more affordable than hardware solutions.

  1. Integrated electrofluidic circuits: pressure sensing with analog and digital operation functionalities for microfluidics.

    PubMed

    Wu, Chueh-Yu; Lu, Jau-Ching; Liu, Man-Chi; Tung, Yi-Chung

    2012-10-21

    Microfluidic technology plays an essential role in various lab on a chip devices due to its desired advantages. An automated microfluidic system integrated with actuators and sensors can further achieve better controllability. A number of microfluidic actuation schemes have been well developed. In contrast, most of the existing sensing methods still heavily rely on optical observations and external transducers, which have drawbacks including: costly instrumentation, professional operation, tedious interfacing, and difficulties of scaling up and further signal processing. This paper reports the concept of electrofluidic circuits - electrical circuits which are constructed using ionic liquid (IL)-filled fluidic channels. The developed electrofluidic circuits can be fabricated using a well-developed multi-layer soft lithography (MSL) process with polydimethylsiloxane (PDMS) microfluidic channels. Electrofluidic circuits allow seamless integration of pressure sensors with analog and digital operation functions into microfluidic systems and provide electrical readouts for further signal processing. In the experiments, the analog operation device is constructed based on electrofluidic Wheatstone bridge circuits with electrical outputs of the addition and subtraction results of the applied pressures. The digital operation (AND, OR, and XOR) devices are constructed using the electrofluidic pressure controlled switches, and output electrical signals of digital operations of the applied pressures. The experimental results demonstrate the designed functions for analog and digital operations of applied pressures are successfully achieved using the developed electrofluidic circuits, making them promising to develop integrated microfluidic systems with capabilities of precise pressure monitoring and further feedback control for advanced lab on a chip applications.

  2. Determination of Shift/Bias in Digital Aerial Triangulation of UAV Imagery Sequences

    NASA Astrophysics Data System (ADS)

    Wierzbicki, Damian

    2017-12-01

    UAV photogrammetry is currently characterized by largely automated and efficient data processing. Imaging from low altitude is gaining importance in applications such as city mapping, corridor mapping, road and pipeline inspection, and mapping of large areas, e.g., forests. Additionally, high-resolution video imagery (HD and larger) is increasingly used for low-altitude acquisition: on the one hand it delivers many details and characteristics of ground surface features, while on the other hand it presents new challenges in data processing. Therefore, determination of the elements of external orientation plays a substantial role in the detail of Digital Terrain Models and in artefact-free orthophoto generation. In parallel, research is being conducted on the quality of images acquired from UAVs and on the quality of derived products, e.g., orthophotos. Despite this rapid development, UAV photogrammetry still requires Automatic Aerial Triangulation (AAT) based on GPS/INS observations and ground control points. During a low-altitude photogrammetric flight, the approximate elements of external orientation registered by the UAV are affected by shift/bias errors. In this article, methods for determining the shift/bias error are presented. In the digital aerial triangulation process, two solutions are applied. In the first method, the shift/bias error is determined together with the drift/bias error, the elements of external orientation, and the coordinates of the ground control points. In the second method, the shift/bias error is determined together with the elements of external orientation and the coordinates of the ground control points, with the drift/bias error set to 0. When the two methods are compared, the difference in the shift/bias error is more than ±0.01 m for all terrain coordinates XYZ.

  3. Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a preferred method for investigating transition in flight: (a) it is global and non-intrusive, and (b) it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15-based system was updated with a new camera and a digital video recorder to support high-Reynolds-number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and enables greater enhancement through image processing, permitting analysis of smaller-scale phenomena.

  4. The Opinion of Mr. Justice Douglas in the Case of Robert Gottschalk, Acting Commissioner of Patents, vs. Gary R. Benson and Arthur C. Tabbot, Delivered in the Supreme Court of the United States on November 20, 1972.

    ERIC Educational Resources Information Center

    Supreme Court of the U. S., Washington, DC.

    Respondents filed in the Patent Office an application for an invention which was described as being related to the processing of data by program and more particularly to the programmed conversion of numerical information in general purpose digital computers. The patent sought is on a method of programming a general purpose digital computer to…

  5. Digital imaging of autoradiographs from paintings by Georges de La Tour (1593-1652)

    NASA Astrophysics Data System (ADS)

    Fischer, C.-O.; Gallagher, M.; Laurenze, C.; Schmidt, Ch; Slusallek, K.

    1999-11-01

    The artistic work of the painter Georges de La Tour has been studied very intensively in the last few years, mainly by French and US-American art historians and natural scientists. To support the in-depth analysis of two paintings from the Kimbell Art Museum in Fort Worth, Texas, USA, two similar paintings from the Gemäldegalerie Berlin have been investigated. The method of neutron activation autoradiography has been applied using imaging plates with digital image processing.

  6. Internet Protocol Security (IPSEC): Testing and Implications on IPv4 and IPv6 Networks

    DTIC Science & Technology

    2008-08-27

    Message Authentication Code-Message Digest 5-96). Due to the processing power consumption and slowness of public key authentication methods, RSA ... MODP) group with a 768-bit modulus, 2. a MODP group with a 1024-bit modulus, 3. an Elliptic Curve Group over GF[2^n] (EC2N) group with a 155-bit ... nonces, digital signatures using the Digital Signature Algorithm, and the Rivest-Shamir-Adleman (RSA) algorithm. For more information about the

  7. Computational approach to integrate 3D X-ray microtomography and NMR data.

    PubMed

    Lucas-Oliveira, Everton; Araujo-Ferreira, Arthur G; Trevizan, Willian A; Fortulan, Carlos A; Bonagamba, Tito J

    2018-05-04

    Nowadays, most of the efforts in NMR applied to porous media are dedicated to studying the molecular fluid dynamics within and among the pores. These analyses have a higher complexity due to the morphology and chemical composition of rocks, as well as dynamic effects such as restricted diffusion, diffusional coupling, and exchange processes. Since translational nuclear spin diffusion in a confined geometry (e.g., pores and fractures) requires specific boundary conditions, theoretical solutions are restricted to some special problems and, in many cases, computational methods are required. The Random Walk Method is a classic way to simulate self-diffusion within a Digital Porous Medium. The Bergman model considers the magnetic relaxation process of the fluid molecules by including a probability rate of magnetization survival under surface interactions. Here we propose a statistical approach to correlate surface magnetic relaxivity with the computational method applied to the NMR relaxation in order to elucidate the relationship between the simulated relaxation time and the pore size of the Digital Porous Medium. The proposed computational method simulates one- and two-dimensional NMR techniques, reproducing, for example, longitudinal and transverse relaxation times (T1 and T2, respectively), diffusion coefficients (D), as well as their correlations. For a good approximation between the numerical and experimental results, it is necessary to preserve the complexity of translational diffusion through the microstructures in the digital rocks. Therefore, we use Digital Porous Media obtained by 3D X-ray microtomography. To validate the method, relaxation times of ideal spherical pores were obtained and compared with previous determinations by the Brownstein-Tarr model, as well as the computational approach proposed by Bergman. Furthermore, simulated and experimental results for synthetic porous media are compared. These results make evident the potential of computational physics in the analysis of NMR data for complex porous materials. Copyright © 2018 Elsevier Inc. All rights reserved.
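
    A conceptual 2-D sketch of the random-walk relaxation idea is shown below: walkers diffuse inside the pore space of a binary image and are killed with a fixed probability when they touch the solid surface, so the surviving "magnetization" decays faster in smaller pores. This is a simplification of the Bergman-type scheme discussed above, not the authors' code; the pore geometry, kill probability and step counts are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    size, radius = 128, 30
    yy, xx = np.mgrid[0:size, 0:size]
    pore = (xx - size // 2)**2 + (yy - size // 2)**2 < radius**2   # one circular pore

    n_walkers, n_steps, kill_prob = 5000, 4000, 0.05
    pos = np.full((n_walkers, 2), size // 2)
    alive = np.ones(n_walkers, dtype=bool)
    magnetization = []

    steps = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    for _ in range(n_steps):
        trial = pos + steps[rng.integers(0, 4, n_walkers)]
        hit_solid = ~pore[trial[:, 0], trial[:, 1]]
        # walkers that touch the surface relax with probability kill_prob, otherwise bounce back
        alive &= ~(hit_solid & (rng.random(n_walkers) < kill_prob))
        pos = np.where(hit_solid[:, None], pos, trial)
        magnetization.append(alive.mean())
    # Fitting magnetization to exp(-t/T2) links the decay rate to pore size and surface relaxivity.
    ```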

  8. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    PubMed

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when they are applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with the state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has the advantage that it is not limited to growth measurements only but can be applied to other applications such as identifying weeds, diseases, stress, etc.

  9. Breast cancer histopathology image analysis: a review.

    PubMed

    Veta, Mitko; Pluim, Josien P W; van Diest, Paul J; Viergever, Max A

    2014-05-01

    This paper presents an overview of methods that have been proposed for the analysis of breast cancer histopathology images. This research area has become particularly relevant with the advent of whole slide imaging (WSI) scanners, which can perform cost-effective and high-throughput histopathology slide digitization, and which aim at replacing the optical microscope as the primary tool used by pathologists. Breast cancer is the most prevalent form of cancer among women, and image analysis methods that target this disease have a huge potential to reduce the workload in a typical pathology lab and to improve the quality of the interpretation. This paper is meant as an introduction for nonexperts. It starts with an overview of the tissue preparation, staining and slide digitization processes followed by a discussion of the different image processing techniques and applications, ranging from analysis of tissue staining to computer-aided diagnosis, and prognosis of breast cancer patients.

  10. Research on Splicing Method of Digital Relic Fragment Model

    NASA Astrophysics Data System (ADS)

    Yan, X.; Hu, Y.; Hou, M.

    2018-04-01

    In the course of archaeological excavation, a large number of cultural relic fragments are unearthed, and the restoration of these fragments has traditionally been done manually by arts and crafts experts. In this process, the experts often trial-fit the fragments and then use adhesive to fix those found to be in the correct position, which can cause irreversible secondary damage to the relics. To minimize such damage, surveyors combine 3D laser scanning with computer technology and build digital models of the relic fragments so that the splicing can be performed virtually. Commercially available 3D software can perform model translation and rotation; using these two functions, fragments can be spliced together manually in the virtual environment, and the specific position of each fragment is recorded once the mosaic is complete, effectively reducing the damage caused to the relics by trial splicing.
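
    The basic virtual-splicing primitive mentioned above, applying a rotation and translation to a fragment's point cloud and recording the resulting 4x4 transform so the physical position can be reproduced later, can be sketched as follows; the rotation angles, translation and random vertices are placeholders.

    ```python
    import numpy as np

    def make_transform(axis_angles_deg, translation):
        """Build a 4x4 homogeneous transform from XYZ rotation angles and a translation."""
        ax, ay, az = np.radians(axis_angles_deg)
        rx = np.array([[1, 0, 0], [0, np.cos(ax), -np.sin(ax)], [0, np.sin(ax), np.cos(ax)]])
        ry = np.array([[np.cos(ay), 0, np.sin(ay)], [0, 1, 0], [-np.sin(ay), 0, np.cos(ay)]])
        rz = np.array([[np.cos(az), -np.sin(az), 0], [np.sin(az), np.cos(az), 0], [0, 0, 1]])
        T = np.eye(4)
        T[:3, :3] = rz @ ry @ rx
        T[:3, 3] = translation
        return T

    fragment = np.random.rand(1000, 3)                   # stand-in for a scanned fragment's vertices
    T = make_transform((0, 0, 15), (0.02, 0.0, -0.01))   # rotate 15 degrees about z, then translate
    moved = (np.c_[fragment, np.ones(len(fragment))] @ T.T)[:, :3]
    ```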

  11. Objective Measurement of Erythema in Psoriasis using Digital Color Photography with Color Calibration

    PubMed Central

    Raina, Abhay; Hennessy, Ricky; Rains, Michael; Allred, James; Hirshburg, Jason M; Diven, Dayna; Markey, Mia K.

    2016-01-01

    Background: Traditional metrics for evaluating the severity of psoriasis are subjective, which complicates efforts to measure effective treatments in clinical trials. Methods: We collected images of psoriasis plaques and calibrated the coloration of the images according to an included color card. Features were extracted from the images and used to train a linear discriminant analysis classifier with cross-validation to automatically classify the degree of erythema. The results were tested against numerical scores obtained by a panel of dermatologists using a standard rating system. Results: Quantitative measures of erythema based on the digital color images showed good agreement with subjective assessment of erythema severity (κ = 0.4203). The color calibration process improved the agreement from κ = 0.2364 to κ = 0.4203. Conclusions: We propose a method for the objective measurement of the psoriasis severity parameter of erythema and show that the calibration process improved the results. PMID:26517973
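
    The cross-validated classification step described above can be sketched with scikit-learn as follows; the feature matrix, labels and fold count are placeholders, and this is a generic illustration of cross-validated linear discriminant analysis rather than the authors' exact pipeline.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import cohen_kappa_score

      # X: (n_plaques, n_features) colour features from calibrated images (placeholder data)
      # y: (n_plaques,) erythema grades assigned by the dermatologist panel
      X = np.random.rand(60, 5)
      y = np.random.randint(0, 4, size=60)

      lda = LinearDiscriminantAnalysis()
      y_pred = cross_val_predict(lda, X, y, cv=5)      # cross-validated predictions
      print("kappa vs. panel scores:", cohen_kappa_score(y, y_pred))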

  12. Integrated unaligned resonant modulator tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zortman, William A.; Lentine, Anthony L.

    Methods and systems for tuning a resonant modulator are disclosed. One method includes receiving a carrier signal modulated by the resonant modulator with a stream of data having an approximately equal number of high and low bits, determining an average power of the modulated carrier signal, comparing the average power to a predetermined threshold, and operating a tuning device coupled to the resonant modulator based on the comparison of the average power and the predetermined threshold. One system includes an input structure, a plurality of processing elements, and a digital control element. The input structure is configured to receive, from the resonant modulator, a modulated carrier signal. The plurality of processing elements are configured to determine an average power of the modulated carrier signal. The digital control element is configured to operate a tuning device coupled to the resonant modulator based on the average power of the modulated carrier signal.
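
    A minimal control-loop sketch of the scheme described above; the read_average_power and set_heater hooks are hypothetical stand-ins for the hardware interfaces, and the step size, threshold and direction of the comparison are assumptions that would depend on the modulator design.

      def tune_resonator(read_average_power, set_heater, threshold,
                         step=0.01, max_steps=1000):
          # read_average_power() and set_heater() are hypothetical hardware hooks.
          # With roughly equal high and low bits, the average power of the modulated
          # carrier indicates how well the resonance is aligned; the sign of the
          # comparison depends on the modulator design.
          drive = 0.0
          for _ in range(max_steps):
              if read_average_power() >= threshold:
                  return drive              # tuning condition satisfied
              drive += step                 # operate the tuning device
              set_heater(drive)
          raise RuntimeError("tuning did not converge")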

  13. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B.; Sturm, Benjamin W.

    2016-02-09

    According to one embodiment, a scintillator radiation detector system includes a scintillator, and a processing device for processing pulse traces corresponding to light pulses from the scintillator, where the processing device is configured to: process each pulse trace over at least two temporal windows and to use pulse digitization to improve energy resolution of the system. According to another embodiment, a scintillator radiation detector system includes a processing device configured to: fit digitized scintillation waveforms to an algorithm, perform a direct integration of fit parameters, process multiple integration windows for each digitized scintillation waveform to determine a correction factor, and apply the correction factor to each digitized scintillation waveform.

  14. Implementation of Non-Destructive Evaluation and Process Monitoring in DLP-based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Kovalenko, Iaroslav; Verron, Sylvain; Garan, Maryna; Šafka, Jiří; Moučka, Michal

    2017-04-01

    This article describes a method of in-situ process monitoring in a digital light processing (DLP) 3D printer. It is based on continuous measurement of the adhesion force between the printing surface and the bottom of the liquid resin bath. This method is suitable only for bottom-up DLP printers. The control system compares the force at the moment the printed layer detaches from the bottom of the tank, when it reaches its largest value in the printing cycle, with a theoretical value. Implementing the suggested algorithm makes it possible to detect faults during the printing process.
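
    A toy sketch of the fault-detection rule described above; the force readings, the theoretical peak force and the tolerance are hypothetical values used only to illustrate the comparison.

      def check_layer(peak_force_measured, peak_force_theoretical, tolerance=0.15):
          # Flag a printing fault when the measured peel force at layer separation
          # deviates from the theoretical value by more than the given tolerance.
          deviation = abs(peak_force_measured - peak_force_theoretical)
          return deviation > tolerance * peak_force_theoretical

      # example usage with made-up numbers (newtons)
      if check_layer(peak_force_measured=3.1, peak_force_theoretical=4.0):
          print("possible adhesion fault in this layer")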

  15. Most Courses Are Not Born Digital: An Overview of the Quality Matters Peer Review Process for Online Course Design

    ERIC Educational Resources Information Center

    Varonis, Evageline Marlos

    2014-01-01

    Purpose: The purpose of this paper is to discuss benefits of and barriers to online learning and describe utilization of the Quality Matters (QM) peer review process as a method to assure the quality of online courses. It outlines the QM higher education rubric, explains how the collaborative QM peer review process facilitates online course design…

  16. A Historical Perspective on Digital Hearing Aids: How Digital Technology Has Changed Modern Hearing Aids

    PubMed Central

    Levitt, Harry

    2007-01-01

    This article provides the author's perspective on the development of digital hearing aids and how digital signal processing approaches have led to changes in hearing aid design. Major landmarks in the evolution of digital technology are identified, and their impact on the development of digital hearing aids is discussed. Differences between analog and digital approaches to signal processing in hearing aids are identified. PMID:17301334

  17. Archive of single beam and swath bathymetry data collected nearshore of the Gulf Islands National Seashore, Mississippi, from West Ship Island, Mississippi, to Dauphin Island, Alabama: Methods and data report for USGS Cruises 08CCT01 and 08CCT02, July 2008, and 09CCT03 and 09CCT04, June 2009

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pendleton, Elizabeth A.; Hansen, Mark E.; Reynolds, B.J.; Kelso, Kyle W.; Wiese, Dana S.; Worley, Charles R.

    2012-01-01

    See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets are stored digitally at the USGS St. Petersburg Coastal and Marine Science Center and processed systematically using Novatel's GrafNav version 7.6, SANDS version 3.7, SEA SWATHplus version 3.06.04.03, CARIS HIPS AND SIPS version 3.6, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during these surveys and are archived separately.

  18. Regionally adaptive histogram equalization of the chest.

    PubMed

    Sherrier, R H; Johnson, G A

    1987-01-01

    Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique histograms are calculated locally and then modified according to both the mean pixel value of that region as well as certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.
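
    The regionally adaptive idea is related to, though not identical with, contrast-limited adaptive histogram equalization; as a readily available approximation, the scikit-image CLAHE routine below equalizes local histograms of a chest radiograph. The file name, tile size and clip limit are illustrative assumptions, not the parameters of the cited work.

      import numpy as np
      from skimage import exposure, io, img_as_float

      # load a chest radiograph (file name is a placeholder)
      img = img_as_float(io.imread("chest_radiograph.png", as_gray=True))

      # local (tile-based) histogram equalization; kernel_size and clip_limit
      # are illustrative, not the parameters used in the cited work
      equalized = exposure.equalize_adapthist(img, kernel_size=64, clip_limit=0.02)

      io.imsave("chest_equalized.png", (equalized * 255).astype(np.uint8))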

  19. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    NASA Astrophysics Data System (ADS)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process as well as minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending is vital to assess rotor blade and turbine reliability in the 20 year designed life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and defined the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images. The last part of the research quantified the uncertainty in the field conditions and the image processing methods.

  20. Comparative data compression techniques and multi-compression results

    NASA Astrophysics Data System (ADS)

    Hasan, M. R.; Ibrahimy, M. I.; Motakabber, S. M. A.; Ferdaus, M. M.; Khan, M. N. H.

    2013-12-01

    Data compression is very necessary in business data processing because of the cost savings it offers and the large volume of data manipulated in many business applications. It is a method or system for transmitting a digital image (i.e., an array of pixels) from a digital data source to a digital data receiver. The smaller the data, the better the transmission speed and the greater the time savings. In communication, data should always be transmitted efficiently and free of noise. This paper presents several techniques for lossless compression of text data and compares the results of multiple and single compression, which helps identify the better compression output and supports the development of compression algorithms.
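
    A small sketch comparing standard lossless codecs from the Python standard library on a text file, including the effect of compressing an already-compressed stream; the file name is a placeholder and the codecs are generic examples, not necessarily those evaluated in the paper.

      import bz2, lzma, zlib

      with open("sample.txt", "rb") as f:      # placeholder text file
          data = f.read()

      codecs = {"zlib": zlib.compress, "bz2": bz2.compress, "lzma": lzma.compress}
      for name, compress in codecs.items():
          once = compress(data)
          twice = compress(once)               # multi-compression of the same stream
          print(f"{name}: original {len(data)} B, "
                f"single {len(once)} B, double {len(twice)} B")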

  1. Recognition of digital characteristics based new improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Meng; Xu, Guoqiang; Lin, Zihao

    2017-08-01

    In the field of digital signal processing, estimating the parameters of signal modulation is a significant research direction. Based on an in-depth study of a new improved genetic algorithm, this paper determines a set of eigenvalues that can distinguish between digital signal modulation types. First, these eigenvalues are taken as the initial best gene pool; second, the gene pool is evolved through selection, crossover and mutual elimination; finally, a strategy of strengthened competition and punishment is adopted to further optimize the gene pool and ensure that each generation consists of high-quality genes. The simulation results show that this method offers global convergence, stability and a fast convergence speed.
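
    A generic genetic-algorithm skeleton of the kind alluded to above, shown only to make the selection/crossover/elimination loop concrete; the fitness function, population size and mutation rate are arbitrary placeholders, not the improved operators proposed by the authors.

      import numpy as np

      rng = np.random.default_rng(0)

      def evolve(fitness, n_genes=8, pop_size=40, generations=100, mutation=0.05):
          # Maximize `fitness` over binary gene strings with selection,
          # crossover and elimination of the weakest individuals.
          pop = rng.integers(0, 2, size=(pop_size, n_genes))
          for _ in range(generations):
              scores = np.array([fitness(ind) for ind in pop])
              order = np.argsort(scores)[::-1]
              parents = pop[order[: pop_size // 2]]          # keep the fitter half
              cut = rng.integers(1, n_genes, size=pop_size // 2)
              children = np.array([np.concatenate((parents[i][:c],
                                                   parents[(i + 1) % len(parents)][c:]))
                                   for i, c in enumerate(cut)])
              flip = rng.random(children.shape) < mutation    # random mutation
              children = np.where(flip, 1 - children, children)
              pop = np.vstack([parents, children])            # eliminate the weaker half
          best = max(pop, key=fitness)
          return best, fitness(best)

      # toy fitness: count of ones (placeholder for a modulation-feature criterion)
      print(evolve(lambda g: g.sum()))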

  2. Scene Analysis: Non-Linear Spatial Filtering for Automatic Target Detection.

    DTIC Science & Technology

    1982-12-01

    In this thesis, a method for two-dimensional pattern recognition was developed and tested. The method included a global search scheme for candidate… …purpose was to develop a base of image processing software for the AFIT Digital Signal Processing Laboratory NOVA-ECLIPSE minicomputer system, for…

  3. Iterative nonlinear joint transform correlation for the detection of objects in cluttered scenes

    NASA Astrophysics Data System (ADS)

    Haist, Tobias; Tiziani, Hans J.

    1999-03-01

    An iterative correlation technique with digital image processing in the feedback loop for the detection of small objects in cluttered scenes is proposed. A scanning aperture is combined with the method in order to improve the immunity against noise and clutter. Multiple reference objects or different views of one object are processed in parallel. We demonstrate the method by detecting a noisy and distorted face in a crowd with a nonlinear joint transform correlator.

  4. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted. PMID:28287448

  5. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted.

  6. Thermal imaging measurement of lateral diffusivity and non-invasive material defect detection

    DOEpatents

    Sun, Jiangang; Deemer, Chris

    2003-01-01

    A system and method for determining lateral thermal diffusivity of a material sample using a heat pulse; a sample oriented within an orthogonal coordinate system; an infrared camera; and a computer that has a digital frame grabber, and data acquisition and processing software. The mathematical model used within the data processing software is capable of determining the lateral thermal diffusivity of a sample of finite boundaries. The system and method may also be used as a nondestructive method for detecting and locating cracks within the material sample.

  7. Method of recording bioelectrical signals using a capacitive coupling

    NASA Astrophysics Data System (ADS)

    Simon, V. A.; Gerasimov, V. A.; Kostrin, D. K.; Selivanov, L. M.; Uhov, A. A.

    2017-11-01

    In this article, a technique for acquiring bioelectrical signals by means of capacitive sensors is described. A feedback loop for ultra-high-impedance biasing of the input instrumentation amplifier, which enables reception of the electrical cardiac signal (ECS) through a capacitive coupling, is proposed. The mains 50/60 Hz noise is suppressed by a narrow-band stop filter with independent tuning of the notch frequency and quality factor. The filter output is fed to a ΣΔ analog-to-digital converter (ADC), which acquires the filtered signal with 24-bit resolution. The signal processing board is connected through a universal serial bus interface to a personal computer, where the ECS is recorded and processed in digital form.
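
    As an illustration of the mains-interference suppression step, the SciPy snippet below designs a digital narrow-band 50 Hz notch filter and applies it to a sampled signal; the sampling rate, quality factor and synthetic test signal are assumptions, and the actual instrument uses an analogue filter with tunable notch frequency and Q.

      import numpy as np
      from scipy.signal import iirnotch, filtfilt

      fs = 1000.0                       # assumed sampling rate, Hz
      f0, q = 50.0, 30.0                # notch frequency and quality factor

      b, a = iirnotch(f0, q, fs=fs)     # design the narrow-band stop filter

      t = np.arange(0, 5, 1 / fs)       # synthetic ECS-like signal + mains interference
      signal = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * f0 * t)
      clean = filtfilt(b, a, signal)    # zero-phase filtering

      print("residual 50 Hz amplitude:",
            2 * abs(np.fft.rfft(clean))[int(f0 * 5)] / len(clean))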

  8. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  9. Orthoscopic real-image display of digital holograms.

    PubMed

    Makowski, P L; Kozacki, T; Zaperty, W

    2017-10-01

    We present a practical solution for the long-standing problem of depth inversion in real-image holographic display of digital holograms. It relies on a field lens inserted in front of the spatial light modulator device addressed by a properly processed hologram. The processing algorithm accounts for pixel size and wavelength mismatch between capture and display devices in a way that prevents image deformation. Complete images of large dimensions are observable from one position with a naked eye. We demonstrate the method experimentally on a 10-cm-long 3D object using a single full-HD spatial light modulator, but it can supplement most holographic displays designed to form a real image, including circular wide angle configurations.

  10. Quantitative Evaluation of Surface Color of Tomato Fruits Cultivated in Remote Farm Using Digital Camera Images

    NASA Astrophysics Data System (ADS)

    Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu

    To measure quantitative surface color information of agricultural products together with ambient information during cultivation, a color calibration method for digital camera images and a Web-based remote monitoring system for color imaging were developed. Single-lens reflex and web digital cameras were used for image acquisition. Tomato images through the post-ripening process were taken by digital camera both in the standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with a standard RGB color chart set up just behind the tomato fruit on a black matte background, and a color calibration was carried out. The influence of sunlight could be experimentally eliminated, and the calibrated color information consistently agreed with the standard values acquired in the system throughout the post-ripening process. Furthermore, the surface color change of tomatoes on the tree in a greenhouse was remotely monitored during maturation using digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time behavior of the tomato surface color change during the maturing process could be measured using a color parameter calculated from the obtained and calibrated color images along with the ambient atmospheric record. This study is an important step in developing surface color analysis for the simple and rapid evaluation of crop vigor in the field and in constructing an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.
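
    A minimal sketch of chart-based colour calibration: a 3x3 correction matrix plus offset is fitted by least squares so that the chart patches measured in the photograph map onto their reference RGB values. The patch values here are random placeholders, and the authors' actual calibration procedure may differ.

      import numpy as np

      def fit_color_correction(measured, reference):
          # Least-squares affine colour correction.
          # measured, reference: (n_patches, 3) RGB values of the colour-chart patches.
          ones = np.ones((measured.shape[0], 1))
          A = np.hstack([measured, ones])                 # affine design matrix
          M, *_ = np.linalg.lstsq(A, reference, rcond=None)
          return M                                        # (4, 3): 3x3 gain + offset row

      def apply_correction(M, pixels):
          ones = np.ones((pixels.shape[0], 1))
          return np.hstack([pixels, ones]) @ M

      # placeholder chart values (normally read from the image and the chart datasheet)
      measured = np.random.rand(24, 3)
      reference = np.random.rand(24, 3)
      M = fit_color_correction(measured, reference)
      corrected = apply_correction(M, measured)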

  11. Microcomputer system for receiving and processing of satellite TOVS/TIP data for vertical sounding of the atmosphere

    NASA Astrophysics Data System (ADS)

    Baranski, L. A.; Rozemski, K.

    TOVS/TIP digital data transmitted in the VHF-BEACON range from NOAA satellites are received and processed at the SDRPC. The receiving station is connected to an IBM-PC/AT microcomputer, which processes the TOVS/TIP data in two stages: initial data processing, and retrieval of vertical profiles of temperature, water vapour and ozone mixing ratio in the atmosphere. The receiving and processing equipment, retrieval methods, results and an error discussion are presented.

  12. Dynamic Fracture Behavior of Plastic-Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Fu, Hua; Li, Jun-Ling; Tan, Duo-Wang; Ifp, Caep Team

    2011-06-01

    Plastic-Bonded Explosives (PBX) are used as important energetic materials in nuclear and conventional weapons. During service and the ballistic phase, a warhead may experience complex loading such as long-pulse and high-level loading, compression, tension, reciprocating compression-tension, and friction with the projectile shell, which can lead to deformation and fracture of the explosive. The dynamic deformation and fracture behavior of PBX subsequently affects the reaction characteristics and initiation mechanism of the explosive, and therefore its safety. The dynamic fracture behavior of PBX is generally complex and not well studied or understood. In this paper, dynamic fracture experiments on explosives are conducted using a Kolsky bar. The Brazilian test, also known as an indirect tensile test or splitting test, is chosen as the test method. Tensile strengths under different strain rates are obtained using a quartz crystal embedded in the rod end. The dynamic deformation and fracture process is captured in real time by a high-speed digital camera, and the displacement and strain field distributions before specimen fracture are obtained by the digital image correlation method. Considering the non-uniform microstructure of explosives, the dynamic fracture behavior is also simulated by the discrete element method; the simulation results reproduce the deformation and fracture process in the Brazilian test using a maximum tensile strain criterion.

  13. Comparison of infrared and 3D digital image correlation techniques applied for mechanical testing of materials

    NASA Astrophysics Data System (ADS)

    Krstulović-Opara, Lovre; Surjak, Martin; Vesenjak, Matej; Tonković, Zdenko; Kodvanj, Janoš; Domazet, Željko

    2015-11-01

    To investigate the applicability of infrared thermography as a tool for acquiring dynamic yielding in metals, a comparison of infrared thermography with three-dimensional digital image correlation has been made. Dynamic tension tests and three-point bending tests of aluminum alloys were performed to evaluate the results obtained by IR thermography and to identify the capabilities and limits of the two methods. Both approaches detect plastification zone migrations during the yielding process. The results of the tension test and the three-point bending test proved the validity of the IR approach as a method for evaluating the dynamic yielding process when used on complex structures such as cellular porous materials. The stability of the yielding process in the three-point bending test, as opposed to the fluctuation of the plastification front in the tension test, is of great importance for the validation of numerical constitutive models. The research demonstrated the strong performance, robustness and reliability of the IR approach when used to evaluate yielding during dynamic loading processes, while the 3D DIC method proved to be superior in the low-velocity loading regimes. This research, based on two basic tests, confirmed the conclusions and suggestions presented in our previous research on porous materials, where mid-wave infrared thermography was applied.

  14. Experimental studies of braking of elastic tired wheel under variable normal load

    NASA Astrophysics Data System (ADS)

    Fedotov, A. I.; Zedgenizov, V. G.; Ovchinnikova, N. I.

    2017-10-01

    The paper analyzes the braking of a vehicle wheel subjected to disturbances in the form of normal load variations. Experimental tests were carried out using test modes with sinusoidal force disturbances applied to the normal wheel load, and measuring methods for both digital and analogue signals were used. Stabilizing wheel braking under such disturbances is a topical issue. The paper proposes a method for analyzing wheel braking processes under normal load disturbances, and a method to control the wheel braking process under these conditions was also developed.

  15. A cognitive model for multidigit number reading: Inferences from individuals with selective impairments.

    PubMed

    Dotan, Dror; Friedmann, Naama

    2018-04-01

    We propose a detailed cognitive model of multi-digit number reading. The model postulates separate processes for visual analysis of the digit string and for oral production of the verbal number. Within visual analysis, separate sub-processes encode the digit identities and the digit order, and additional sub-processes encode the number's decimal structure: its length, the positions of 0, and the way it is parsed into triplets (e.g., 314987 → 314,987). Verbal production consists of a process that generates the verbal structure of the number, and another process that retrieves the phonological forms of each number word. The verbal number structure is first encoded in a tree-like structure, similarly to syntactic trees of sentences, and then linearized to a sequence of number-word specifiers. This model is based on an investigation of the number processing abilities of seven individuals with different selective deficits in number reading. We report participants with impairment in specific sub-processes of the visual analysis of digit strings - in encoding the digit order, in encoding the number length, or in parsing the digit string to triplets. Other participants were impaired in verbal production, making errors in the number structure (shifts of digits to another decimal position, e.g., 3,040 → 30,004). Their selective deficits yielded several dissociations: first, we found a double dissociation between visual analysis deficits and verbal production deficits. Second, several dissociations were found within visual analysis: a double dissociation between errors in digit order and errors in the number length; a dissociation between order/length errors and errors in parsing the digit string into triplets; and a dissociation between the processing of different digits - impaired order encoding of the digits 2-9, without errors in the 0 position. Third, within verbal production, a dissociation was found between digit shifts and substitutions of number words. A selective deficit in any of the processes described by the model would cause difficulties in number reading, which we propose to term "dysnumeria". Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    NASA Astrophysics Data System (ADS)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited shared resource that is nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum; in return, they expect assurance that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet electromagnetic compatibility regulations. Therefore, the competent authorities are called upon to control access to the spectrum by adopting suitable management and monitoring policies, while manufacturers must periodically verify the correct operation of their apparatus. Several measurement solutions are available on the market, generally real-time spectrum analyzers and measurement receivers. Both offer good metrological accuracy but have costs, dimensions and weights that make use in the field impractical. This paper presents a first step in realizing a digital signal processing based measurement instrument able to meet the above-mentioned needs. In particular, attention has been given to the DSP-based measurement section of the instrument, and an innovative measurement method for spectrum monitoring and management is proposed. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main goals in particular are pursued: (i) measurement performance comparable to that of other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
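
    As a rough, generic illustration of spectrum occupancy monitoring (a block-based approximation, not the sample-by-sample sequential method of the paper), the sketch below computes a Welch power spectral density estimate over a buffer of samples and flags the frequency bins that exceed a threshold; the buffer length, sampling rate and threshold are assumptions.

      import numpy as np
      from scipy.signal import welch

      def monitor(samples, fs, threshold_db=-60.0, nperseg=1024):
          # Return the frequencies whose estimated power exceeds a threshold;
          # a simple block-based approximation of spectrum occupancy monitoring.
          f, pxx = welch(samples, fs=fs, nperseg=nperseg)
          pxx_db = 10 * np.log10(pxx + 1e-20)
          return f[pxx_db > threshold_db]

      # synthetic example: a licensed carrier at 2 kHz plus a weak interferer at 3.1 kHz
      fs = 10_000.0
      t = np.arange(0, 1, 1 / fs)
      x = np.sin(2 * np.pi * 2000 * t) + 0.05 * np.sin(2 * np.pi * 3100 * t)
      print("occupied frequencies (Hz):", monitor(x, fs))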

  17. Comparison of tissue equalization, and premium view post-processing methods in full field digital mammography.

    PubMed

    Chen, Baoying; Wang, Wei; Huang, Jin; Zhao, Ming; Cui, Guangbin; Xu, Jing; Guo, Wei; Du, Pang; Li, Pei; Yu, Jun

    2010-10-01

    To retrospectively evaluate the diagnostic abilities of 2 post-processing methods provided by GE Senographe DS system, tissue equalization (TE) and premium view (PV) in full field digital mammography (FFDM). In accordance with the ethical standards of the World Medical Association, this study was approved by regional ethics committee and signed informed patient consents were obtained. We retrospectively reviewed digital mammograms from 101 women (mean age, 47 years; range, 23-81 years) in the modes of TE and PV, respectively. Three radiologists, fully blinded to the post-processing methods, all patient clinical information and histologic results, read images by using objective image interpretation criteria for diagnostic information end points such as lesion border delineation, definition of disease extent, visualization of internal and surrounding morphologic features of the lesions. Also, overall diagnostic impression in terms of lesion conspicuity, detectability and diagnostic confidence was assessed. Between-group comparisons were performed with Wilcoxon signed rank test. Readers 1, 2, and 3 demonstrated significant overall better impression of PV in 29, 27, and 24 patients, compared with that for TE in 12, 13, and 11 patients, respectively (p<0.05). Significant (p<0.05) better impression of PV was also demonstrated for diagnostic information end points. Importantly, PV proved to be more sensitive than TE while detecting malignant lesions in dense breast rather than benign lesions and malignancy in non-dense breast (p<0.01). PV compared with TE provides marked better diagnostic information in FFDM, particularly for patients with malignancy in dense breast. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  18. Timing performance comparison of digital methods in positron emission tomography

    NASA Astrophysics Data System (ADS)

    Aykac, Mehmet; Hong, Inki; Cho, Sanghee

    2010-11-01

    Accurate timing information is essential in positron emission tomography (PET). Recent improvements in high-speed electronics have made digital methods more attractive as alternative solutions for creating a time mark for an event. Two new digital methods (the mean PMT pulse model, MPPM, and the median filtered zero crossing method, MFZCM) were introduced in this work and compared to traditional methods such as digital leading edge (LE) and digital constant fraction discrimination (CFD). In addition, the performance of all four digital methods was compared to analog-based LE and CFD. The time resolution values for MPPM and MFZCM were below 300 ps at sampling rates of 1.6 GS/s and above, which was similar to the analog-based coincidence timing results. In addition, the two digital methods were insensitive to changes in the threshold setting, which may give some improvement in system dead time.
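
    For readers unfamiliar with the baseline, the snippet below sketches a conventional digital constant-fraction discriminator: the sampled pulse is attenuated and subtracted from a delayed copy, and the zero crossing of the result is interpolated to give a time mark. The fraction, delay and synthetic pulse shape are arbitrary, and the MPPM and MFZCM methods of the paper are not reproduced here.

      import numpy as np

      def digital_cfd(pulse, fraction=0.3, delay=4):
          # Return the (fractional-sample) index of the CFD zero crossing.
          delayed = np.roll(pulse, delay)
          delayed[:delay] = 0.0
          bipolar = delayed - fraction * pulse       # CFD shaping
          # first negative-to-positive crossing marks the event time
          idx = np.where((bipolar[:-1] < 0) & (bipolar[1:] >= 0))[0]
          i = idx[0]
          # linear interpolation between samples i and i+1
          return i + bipolar[i] / (bipolar[i] - bipolar[i + 1])

      # synthetic detector pulse (placeholder shape)
      t = np.arange(64)
      pulse = np.exp(-(t - 20.0) ** 2 / 18.0)
      print("time mark (samples):", digital_cfd(pulse))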

  19. BEYOND THE PRINT—VIRTUAL PALEONTOLOGY IN SCIENCE PUBLISHING, OUTREACH, AND EDUCATION

    PubMed Central

    LAUTENSCHLAGER, STEPHAN; RÜCKLIN, MARTIN

    2015-01-01

    Virtual paleontology unites a variety of computational techniques and methods for the visualization and analysis of fossils. Due to their great potential and increasing availability, these methods have become immensely popular in the last decade. However, communicating the wealth of digital information and results produced by the various techniques is still hampered by traditional methods of publication. Transferring and processing three-dimensional information, such as interactive models or animations, into scientific publications still poses a challenge. Here, we present different methods and applications to communicate digital data in academia, outreach and education. Three-dimensional PDFs, QR codes, anaglyph stereo imaging, and rapid prototyping—methods routinely used in the engineering, entertainment, or medical industries—are outlined and evaluated for their potential in science publishing and public engagement. Although limitations remain, these are simple, mostly cost-effective, and powerful tools to create novel and innovative resources for education, public engagement, or outreach. PMID:26306051

  20. How to Implement an E-Learning Curriculum to Streamline Teaching Digital Image Processing

    ERIC Educational Resources Information Center

    Király, Sándor

    2016-01-01

    In the field of teaching, an interesting research question is which didactic methods are effective for teaching the current curriculum to students who vary widely in age, interest, chosen courses, previous studies and motivation. This article introduces the facilities that support the learning process: the…

  1. Atypical Brain Activation during Simple & Complex Levels of Processing in Adult ADHD: An fMRI Study

    ERIC Educational Resources Information Center

    Hale, T. Sigi; Bookheimer, Susan; McGough, James J.; Phillips, Joseph M.; McCracken, James T.

    2007-01-01

    Objective: Executive dysfunction in ADHD is well supported. However, recent studies suggest that more fundamental impairments may be contributing. We assessed brain function in adults with ADHD during simple and complex forms of processing. Method: We used functional magnetic resonance imaging with forward and backward digit spans to investigate…

  2. Introduction to computer image processing

    NASA Technical Reports Server (NTRS)

    Moik, J. G.

    1973-01-01

    Theoretical backgrounds and digital techniques for a class of image processing problems are presented. Image formation in the context of linear system theory, image evaluation, noise characteristics, mathematical operations on images and their implementation are discussed. Various techniques for image restoration and image enhancement are presented. Methods for object extraction and the problem of pictorial pattern recognition and classification are discussed.

  3. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    ERIC Educational Resources Information Center

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  4. An Automatic and Robust Algorithm of Reestablishment of Digital Dental Occlusion

    PubMed Central

    Chang, Yu-Bing; Xia, James J.; Gateno, Jaime; Xiong, Zixiang; Zhou, Xiaobo; Wong, Stephen T. C.

    2017-01-01

    In the field of craniomaxillofacial (CMF) surgery, surgical planning can be performed on composite 3-D models that are generated by merging a computerized tomography scan with digital dental models. Digital dental models can be generated by scanning the surfaces of plaster dental models or dental impressions with a high-resolution laser scanner. During the planning process, one of the essential steps is to reestablish the dental occlusion. Unfortunately, this task is time-consuming and often inaccurate. This paper presents a new approach to automatically and efficiently reestablish dental occlusion. It includes two steps. The first step is to initially position the models based on dental curves and a point matching technique. The second step is to reposition the models to the final desired occlusion based on iterative surface-based minimum distance mapping with collision constraints. With linearization of the rotation matrix, the alignment is modeled as a quadratic programming problem. The simulation was completed on 12 sets of digital dental models. Two sets of dental models were partially edentulous, and another two sets had first premolar extractions for orthodontic treatment. Two validation methods were applied to the articulated models. The results show that using our method, the dental models can be successfully articulated with small deviations from the occlusion achieved with the gold-standard method. PMID:20529735
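
    The initial point-matching step can be illustrated with the standard Kabsch/SVD rigid alignment below, which finds the rotation and translation that best superimpose corresponding landmark points of two dental models; this generic routine merely stands in for the curve-based matching and QP refinement of the paper.

      import numpy as np

      def rigid_align(source, target):
          # Least-squares rigid transform (R, t) mapping source points onto
          # corresponding target points; both arrays have shape (n_points, 3).
          src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
          H = (source - src_c).T @ (target - tgt_c)        # cross-covariance
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))           # avoid reflections
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = tgt_c - R @ src_c
          return R, t

      # toy example: recover a known rotation/translation of landmark points
      rng = np.random.default_rng(1)
      src = rng.random((10, 3))
      angle = np.pi / 7
      R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                         [np.sin(angle),  np.cos(angle), 0],
                         [0, 0, 1]])
      tgt = src @ R_true.T + np.array([0.5, -0.2, 0.1])
      R, t = rigid_align(src, tgt)
      print("alignment error:", np.abs(src @ R.T + t - tgt).max())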

  5. State of the Art and Development Trends of the Digital Radiography Systems for Cargo Inspection

    NASA Astrophysics Data System (ADS)

    Udod, V.; Van, J.; Osipov, S.; Chakhlov, S.; Temnik, A.

    2016-01-01

    Increasing requirements for the technical parameters of digital radiography inspection systems are driven by the rising incidence of terrorism and of trafficking of drugs and explosives via a variety of transport modes. These requirements have prompted research into new technical solutions that ensure the safety of passengers and cargo in real time. The main efforts in this method of testing are aimed at the creation of new digital radiography systems and the modernization of systems now in operation, both as a whole and in their main components and elements. These main components and elements include X-ray sources, systems for recording and transforming radiometric information, and the algorithms and software that implement processing, visualization and interpretation of inspection results. Recent developments in X-ray units and betatrons used for the inspection of small and large objects made from different materials deserve special attention. The most effective X-ray detectors are linear arrays and radiometric detector matrices based on various scintillators. Among the algorithms for material identification of inspected objects, dual-energy methods are the most promising. The article describes various models of digital radiography systems applied in Russia and abroad to the inspection of baggage, containers, vehicles and large trucks.

  6. Radar echo processing with partitioned de-ramp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubbert, Dale F.; Tise, Bertice L.

    2013-03-19

    The spurious-free dynamic range of a wideband radar system is increased by apportioning de-ramp processing across analog and digital processing domains. A chirp rate offset is applied between the received waveform and the reference waveform that is used for downconversion to the intermediate frequency (IF) range. The chirp rate offset results in a residual chirp in the IF signal prior to digitization. After digitization, the residual IF chirp is removed with digital signal processing.

  7. Image re-sampling detection through a novel interpolation kernel.

    PubMed

    Hilal, Alaa

    2018-06-01

    Image re-sampling, involved in re-size and rotation transformations, is an essential building block of a typical digital image alteration. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the interpolation kernels most frequently used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimization of an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The obtained results demonstrate better performance and reduced processing time when compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Detecting 2LSB steganography using extended pairs of values analysis

    NASA Astrophysics Data System (ADS)

    Khalind, Omed; Aziz, Benjamin

    2014-05-01

    In this paper, we propose an extended pairs-of-values analysis to detect and estimate the amount of secret data embedded with 2LSB replacement in digital images, based on the chi-square attack and the regularity rate in pixel values. The detection process is separated from the estimation of the hidden message length, as detection is the main requirement of any steganalysis method. Hence, the detection process acts as a discrete classifier, which classifies a given set of images into stego and clean classes. The method can accurately detect 2LSB replacement even when the message length is about 10% of the total capacity, and it reaches its best performance, with an accuracy higher than 0.96 and a true positive rate of more than 0.997, when the amount of data is 20% to 100% of the total capacity. The method makes no assumptions about either the image or the secret message, as it was tested with two sets of 3000 images, compressed and uncompressed, each embedded with a random message. This detection method could also be used as an automated tool to analyse a bulk of images for hidden content, which could be used by digital forensics analysts in their investigations.
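
    To make the pairs-of-values idea concrete, the sketch below applies a chi-square test to the grey-value quadruples that share the same upper six bits, since full 2LSB embedding with random data tends to equalize the counts within each quadruple. This is a simplified, generic adaptation of the chi-square attack, not the authors' extended analysis, and the image data here is a random placeholder.

      import numpy as np
      from scipy.stats import chi2

      def quad_chi_square(pixels):
          # Chi-square statistic over groups of four grey values that differ
          # only in their two least significant bits.
          counts = np.bincount(pixels.ravel(), minlength=256).astype(float)
          groups = counts.reshape(64, 4)                   # one row per upper-6-bit value
          expected = groups.mean(axis=1, keepdims=True)
          mask = expected[:, 0] > 0                        # skip empty groups
          stat = (((groups - expected) ** 2) / expected)[mask].sum()
          dof = 3 * mask.sum()                             # 3 degrees of freedom per group
          # 1 - cdf is near zero for typical cover images (counts far from equal)
          # and substantially higher after full 2LSB embedding
          return stat, 1.0 - chi2.cdf(stat, dof)

      # placeholder data; real use would load an 8-bit grey-level image
      pixels = np.random.randint(0, 256, size=(256, 256), dtype=np.uint8)
      stat, score = quad_chi_square(pixels)
      print(f"chi-square = {stat:.1f}, embedding score = {score:.3f}")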

  9. Correlation and agreement between eplet mismatches calculated using serological, low-intermediate and high resolution molecular human leukocyte antigen typing methods.

    PubMed

    Fidler, Samantha; D'Orsogna, Lloyd; Irish, Ashley B; Lewis, Joshua R; Wong, Germaine; Lim, Wai H

    2018-03-02

    Structural human leukocyte antigen (HLA) matching at the eplet level can be identified by HLAMatchmaker, which requires the entry of four-digit alleles. The aim of this study was to evaluate the agreement between eplet mismatches calculated by serological and two-digit typing methods compared to high-resolution four-digit typing. In a cohort of 264 donor/recipient pairs, the evaluation of measurement error was assessed using intra-class correlation to confirm the absolute agreement between the number of eplet mismatches at class I (HLA-A, -B, C) and II loci (HLA-DQ and -DR) calculated using serological or two-digit molecular typing compared to four-digit molecular typing methods. The proportion of donor/recipient pairs with a difference of >5 eplet mismatches between the HLA typing methods was also determined. Intra-class correlation coefficients between serological and four-digit molecular typing methods were 0.969 (95% confidence intervals [95% CI] 0.960-0.975) and 0.926 (95% CI 0.899-0.944), respectively; and 0.995 (95% CI 0.994-0.996) and 0.993 (95% CI 0.991-0.995), respectively between two-digit and four-digit molecular typing methods. The proportion of donor/recipient pairs with a difference of >5 eplet mismatches at class I and II loci was 4% and 16% for serological versus four-digit molecular typing methods, and 0% and 2% for two-digit versus four-digit molecular typing methods, respectively. In this small predominantly Caucasian population, compared with serology, there is a high level of agreement in the number of eplet mismatches calculated using two-compared to four-digit molecular HLA-typing methods, suggesting that two-digit typing may be sufficient in determining eplet mismatch load in kidney transplantation.

  10. Non-therapist identification of falling hazards in older adult homes using digital photography.

    PubMed

    Ritchey, Katherine C; Meyer, Deborah; Ice, Gillian H

    2015-01-01

    Evaluation and removal of home hazards is an invaluable method for preventing in-home falls and preserving independent living. Current processes for conducting home hazard assessments are impractical from a whole population standpoint given the substantial resources required for implementation. Digital photography offers an opportunity to remotely evaluate an environment for falling hazards. However, reliability of this method has only been tested under the direction of skilled therapists. Ten community dwelling adults over the age of 65 were recruited from local primary care practices between July, 2009 and February, 2010. In-home (IH) assessments were completed immediately after a photographer, blinded to the assessment form, took digital photographs (DP) of the participant home. A different non-therapist assessor then reviewed the photographs and completed a second assessment of the home. Kappa statistic was used to analyze the reliability between the two independent assessments. Home assessments completed by a non-therapist using digital photographs had a substantial agreement (Kappa = 0.61, p < 0.001) with in-home assessments completed by another non-therapist. Additionally, the DP assessments agreed with the IH assessments on the presence or absence of items 96.8% of the time. This study showed that non-therapists can reliably conduct home hazard evaluations using digital photographs.

  11. Dual function seal: visualized digital signature for electronic medical record systems.

    PubMed

    Yu, Yao-Chang; Hou, Ting-Wei; Chiang, Tzu-Chiang

    2012-10-01

    Digital signatures are an important cryptographic technology used to provide integrity and non-repudiation in electronic medical record systems (EMRS), and they are required by law. However, digital signatures normally appear in forms unrecognizable to medical staff, which may reduce trust among staff accustomed to handwritten signatures or seals. Therefore, in this paper we propose a dual function seal to extend user trust from the traditional seal to the digital signature. The proposed dual function seal is a prototype that combines the traditional seal and the digital seal. With this prototype, medical personnel can not only put a seal on paper but also generate a visualized digital signature for electronic medical records. Medical personnel can then look at the visualized digital signature and directly know which staff member generated it, just as with a traditional seal. The discrete wavelet transform (DWT) is used as the image processing method to generate the visualized digital signature, and the peak signal-to-noise ratio (PSNR) is calculated to verify that the distortions of all converted images are beyond human recognition; the results for our converted images range from 70 dB to 80 dB. Signature recoverability is also tested to ensure that the visualized digital signature is verifiable. A simulated EMRS is implemented to show how the visualized digital signature can be integrated into an EMRS.
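
    The distortion check mentioned above can be reproduced with a few lines of NumPy: the peak signal-to-noise ratio between an original image and its converted counterpart is computed as below. The 8-bit images used here are random placeholders, not the DWT-embedded record images of the paper.

      import numpy as np

      def psnr(original, converted, peak=255.0):
          # Peak signal-to-noise ratio in dB between two images of equal shape.
          mse = np.mean((original.astype(float) - converted.astype(float)) ** 2)
          if mse == 0:
              return float("inf")
          return 10.0 * np.log10(peak ** 2 / mse)

      # placeholder 8-bit images; in the paper, `converted` would be the record image
      # after the DWT-based visualized signature has been embedded
      original = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
      converted = np.clip(original + np.random.randint(-1, 2, size=(512, 512)), 0, 255)
      print(f"PSNR = {psnr(original, converted):.1f} dB")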

  12. [Digitalization of radiological imaging information and consequences for patient care in the hospital ].

    PubMed

    den Heeten, G J; Barneveld Binkhuysen, F H

    2001-08-25

    Determining the rate at which radiology must be digitalised has been a controversial issue for many years. Much radiological information is still obtained from the film-screen combination (X-rays), with all of its known inherent restrictions. The importance of imaging information in the healthcare process continues to increase for both radiologists and referring physicians, and the ongoing developments in information technology mean that it is possible to integrate imaging information and electronic patient files. The healthcare process can only become more effective and efficient when the appropriate information is in the right place at the right time, something that conventional methods, using photos that need to be physically moved, can scarcely satisfy. There is also a desire for integration with information obtained from nuclear medicine, pathology and endoscopy, and eventually of all stand-alone data systems relevant to individually oriented hospital healthcare. The transition from a conventional to a digital process is complex; it is accompanied by the transition from a data-oriented to a process-oriented system. Many years have already been invested in the integration of information systems and the development of digital systems within radiology, whose current performance is such that many hospitals are considering the digitalisation process or are already implementing parts of it.

  13. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for determining the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample-and-hold circuit, an analog-to-digital converter, a digital filter, and a Fast Fourier Transform, and is interfaced to a 16-bit microprocessor. The microprocessor is programmed to control the complete digital signal processing chain. The digital filtering and Fast Fourier Transform functions are implemented by an S2815 digital filter/utility peripheral chip and an S2814A Fast Fourier Transform chip. The S2815 initially implements a low-pass Butterworth filter, with later expansion to synthesize complete filter circuits (bandpass and highpass).
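
    For comparison with the hardware chain described above, the SciPy snippet below designs a digital low-pass Butterworth filter and applies it to a sampled echo signal before taking its spectrum; the order, cut-off, sampling rate and synthetic echo are arbitrary illustrative values, not the S2815 configuration.

      import numpy as np
      from scipy.signal import butter, lfilter

      fs = 20_000.0                          # assumed sampling rate, Hz
      cutoff, order = 2_000.0, 4             # illustrative low-pass design

      b, a = butter(order, cutoff, btype="low", fs=fs)

      t = np.arange(0, 0.05, 1 / fs)         # synthetic echo: 500 Hz tone + noise
      echo = np.sin(2 * np.pi * 500 * t) + 0.3 * np.random.randn(t.size)
      filtered = lfilter(b, a, echo)

      spectrum = np.abs(np.fft.rfft(filtered))   # spectral frequency distribution
      freqs = np.fft.rfftfreq(t.size, 1 / fs)
      print("dominant frequency (Hz):", freqs[np.argmax(spectrum)])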

  14. The place-value of a digit in multi-digit numbers is processed automatically.

    PubMed

    Kallai, Arava Y; Tzelgov, Joseph

    2012-09-01

    The automatic processing of the place-value of digits in a multi-digit number was investigated in 4 experiments. Experiment 1 and two control experiments employed a numerical comparison task in which the place-value of a non-zero digit was varied in a string composed of zeros. Experiment 2 employed a physical comparison task in which strings of digits varied in their physical sizes. In both types of tasks, the place-value of the non-zero digit in the string was irrelevant to the task performed. Interference of the place-value information was found in both tasks. When the non-zero digit occupied a lower place-value, it was recognized slower as a larger digit or as written in a larger font size. We concluded that place-value in a multi-digit number is processed automatically. These results support the notion of a decomposed representation of multi-digit numbers in memory. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  15. A review of computer aided interpretation technology for the evaluation of radiographs of aluminum welds

    NASA Technical Reports Server (NTRS)

    Lloyd, J. F., Sr.

    1987-01-01

    Industrial radiography is a well established, reliable means of providing nondestructive structural integrity information. The majority of industrial radiographs are interpreted by trained human eyes using transmitted light and various visual aids. Hundreds of miles of radiographic information are evaluated, documented and archived annually. In many instances, there are serious considerations in terms of interpreter fatigue, subjectivity and limited archival space. Quite often it is difficult to quickly retrieve radiographic information for further analysis or investigation. Methods of improving the quality and efficiency of the radiographic process are being explored, developed and incorporated whenever feasible. High resolution cameras, digital image processing, and mass digital data storage offer interesting possibilities for improving the industrial radiographic process. A review is presented of computer aided radiographic interpretation technology in terms of how it could be used to enhance the radiographic interpretation process in evaluating radiographs of aluminum welds.

  16. Fabrication of multilayered conductive polymer structures via selective visible light photopolymerization

    NASA Astrophysics Data System (ADS)

    Cullen, Andrew T.; Price, Aaron D.

    2017-04-01

    Electropolymerization of pyrrole is commonly employed to fabricate intrinsically conductive polymer films that exhibit desirable electromechanical properties. Due to their monolithic nature, electroactive polypyrrole films produced via this process are typically limited to simple linear or bending actuation modes, which has hindered their application in complex actuation tasks. This initiative aims to develop the specialized fabrication methods and polymer formulations required to realize three-dimensional conductive polymer structures capable of more elaborate actuation modes. Our group has previously reported the application of the digital light processing additive manufacturing process for the fabrication of three-dimensional conductive polymer structures using ultraviolet radiation. In this investigation, we further expand upon this initial work and present an improved polymer formulation designed for digital light processing additive manufacturing using visible light. This technology enables the design of novel electroactive polymer sensors and actuators with enhanced capabilities and brings us one step closer to realizing more advanced electroactive polymer enabled devices.

  17. The Need for (Digital) Story: First Graders Using Digital Tools to Tell Stories

    ERIC Educational Resources Information Center

    Solomon, Marva Jeanine

    2010-01-01

    The purpose of this study was to explore the process and product of African American first graders as they participated in digital storytelling. Of particular interest was the role digital tools played in the creation process. Eight children took part in 18 study sessions during which they composed, recorded, and then shared their digital texts with…

  18. Enhancing surgical safety using digital multimedia technology.

    PubMed

    Dixon, Jennifer L; Mukhopadhyay, Dhriti; Hunt, Justin; Jupiter, Daniel; Smythe, William R; Papaconstantinou, Harry T

    2016-06-01

    The purpose of this study was to examine whether incorporating digital and video multimedia components improved surgical time-out performance of a surgical safety checklist. A prospective pilot study was designed for implementation of a multimedia time-out, including a patient video. Perceptions of the staff participants were surveyed before and after intervention (Likert scale: 1, strongly disagree to 5, strongly agree). Employee satisfaction was high for both time-out procedures. However, employees appreciated improved clarity of patient identification (P < .05) and operative laterality (P < .05) with the digital method. About 87% of the respondents preferred the digital version to the standard time-out (75% anesthesia, 89% surgeons, 93% nursing). Although the duration of time-outs increased (49 and 79 seconds for standard and digital time-outs, respectively, P > .001), there was significant improvement in performance of key safety elements. The multimedia time-out allows improved participation by the surgical team and is preferred to a standard time-out process. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Mass spectroscopic apparatus and method

    DOEpatents

    Bomse, David S.; Silver, Joel A.; Stanton, Alan C.

    1991-01-01

    The disclosure is directed to a method and apparatus for ionization modulated mass spectrometric analysis. Analog or digital data acquisition and processing can be used. Ions from a time variant source are detected and quantified. The quantified ion output is analyzed using a computer to provide a two-dimensional representation of at least one component present within an analyte.

  20. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus are described for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. The steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in those nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
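
    To make the processing chain above concrete, the following sketch (not the patented implementation; the window sizes, the choice of the correlation sum as the nonlinear measure, the z-score comparison and the synthetic test signal are all illustrative assumptions) computes a simple nonlinear measure over sliding windows of digitized data and compares its trend between a baseline segment and a changed segment.

      import numpy as np

      def correlation_sum(x, dim=3, delay=2, radius=0.5):
          """Fraction of embedded state-space point pairs closer than `radius`."""
          n = len(x) - (dim - 1) * delay
          emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
          dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          iu = np.triu_indices(n, k=1)
          return float(np.mean(dists[iu] < radius))

      def windowed_measure(x, win=500, step=250, **kw):
          """Time-serial trend of the nonlinear measure over sliding windows."""
          return np.array([correlation_sum(x[s:s + win], **kw)
                           for s in range(0, len(x) - win + 1, step)])

      rng = np.random.default_rng(0)
      t = np.arange(20000) * 1e-3
      baseline = np.sin(2 * np.pi * 7 * t[:10000]) + 0.1 * rng.standard_normal(10000)
      changed = np.sin(2 * np.pi * 7 * t[10000:]) ** 3 + 0.1 * rng.standard_normal(10000)

      m_base, m_test = windowed_measure(baseline), windowed_measure(changed)
      z = (m_test.mean() - m_base.mean()) / (m_base.std() + 1e-12)
      print(f"baseline {m_base.mean():.3f}, changed {m_test.mean():.3f}, z = {z:.1f}")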

  1. Stability and performance analysis of a jump linear control system subject to digital upsets

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Sun, Hui; Ma, Zhen-Yang

    2015-04-01

    This paper focuses on the analysis methodology for the stability and the corresponding tracking performance of a closed-loop digital jump linear control system with a stochastic switching signal. The method is applied to a flight control system. A distributed recoverable platform is implemented on the flight control system and subjected to independent digital upsets. The upset processes are used to simulate electromagnetic environments. Specifically, the paper presents scenarios in which the upset process is injected directly into the distributed flight control system, modelled by independent Markov upset processes and by independent and identically distributed (IID) processes. A theoretical performance analysis and simulation modelling are both presented in detail for a more complete independent digital upset injection. Specific examples are given to verify the methodology of the tracking performance analysis, and general analyses for different configurations are also provided. Comparisons among different configurations are conducted to demonstrate the availability and the characteristics of the design. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61403395), the Natural Science Foundation of Tianjin, China (Grant No. 13JCYBJC39000), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, the Tianjin Key Laboratory of Civil Aircraft Airworthiness and Maintenance in Civil Aviation of China (Grant No. 104003020106), and the Fund for Scholars of Civil Aviation University of China (Grant No. 2012QD21x).

  2. Digitization of the human body in the present-day economy

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola

    2004-12-01

    In this paper we report on the historical development of human body digitization and on the current state of commercially available technology. Complete systems for digitizing the human body have existed for more than ten years. One of the main users of this technology has been the entertainment industry. Every new movie impresses with attractive visual effects, but few people know that the most thrilling shots are realized using virtual persons. The faces and bodies of actors are digitized, and the "virtual twin" replaces the actor in the movie. Nowadays, human body digitization is so advanced that it is no longer possible to distinguish the real actor from the virtual one. Indeed, the movie industry deserves much of the credit for this rapid technical development, as it has been one of the strong economic motors for the technology. Today, with the massive cost reductions made possible by new technologies, methods for digitizing the human body are also used in other fields of application, such as ergonomics, medical applications, computer games, biometry and anthropometrics. Over time, the technology has also become interesting for sport, fitness, fashion and beauty, and a large expansion of human body digitization is expected in the near future. To date, the technologies used commercially for measuring the human body can be divided into three distinct groups: laser scanning, projection of light patterns, and a combination of modeling and image processing. The different solutions have strengths and weaknesses that determine their suitability for specific applications. This paper gives an overview of their differences and characteristics and offers guidance for selecting the appropriate method. Practical examples of the commercial exploitation of human body digitization are also presented, and new perspectives are introduced.

  4. U.S. Climate Change Technology Program: Strategic Plan

    DTIC Science & Technology

    2006-09-01

    and Long Term, provides details on the 85 technologies in the R&D portfolio (Figure 2-1). Continuing Process: The United States, in partnership with ... locations may be centered near or in residential locations, and work processes and products may be more commonly communicated or delivered via digital ... chemical properties, along with advanced methods to simulate processes, will stem from advances in computational technology. Current Portfolio: The current

  5. Digital image modification detection using color information and its histograms.

    PubMed

    Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na

    2016-09-01

    The rapid development of many open source and commercial image editing software packages makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve the robustness of these methods against the use of JPEG compression, blurring, noise, or other types of post-processing operations. These post-processing operations are frequently used with the intention of concealing tampering and reducing tampering clues. A robust method based on color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distribution. Blocks from the tampered regions will reside within the same cluster, since both copied and moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image has been distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring. Copyright © 2016. Published by Elsevier Ireland Ltd.
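
    As a rough illustration of the block-matching idea only (not the authors' pipeline: a single color-moment feature is used here, and the clustering and neural-network matching stages are omitted), the sketch below flags pairs of overlapping blocks whose features nearly coincide while their positions are far apart.

      import numpy as np

      def block_features(img, block=16, step=8):
          """Return (positions, color-moment features) for overlapping blocks of an HxWx3 image."""
          h, w, _ = img.shape
          pos, feats = [], []
          for y in range(0, h - block + 1, step):
              for x in range(0, w - block + 1, step):
                  b = img[y:y + block, x:x + block].reshape(-1, 3).astype(float)
                  feats.append(np.concatenate([b.mean(0), b.std(0)]))
                  pos.append((y, x))
          return np.array(pos), np.array(feats)

      def find_duplicates(img, block=16, step=8, feat_tol=2.0, min_offset=32):
          """Flag block pairs with near-identical features but a large spatial offset."""
          pos, feats = block_features(img, block, step)
          order = np.lexsort(feats.T[::-1])          # sort features lexicographically
          matches = []
          for a, b in zip(order[:-1], order[1:]):    # compare neighbours in sorted order
              if (np.linalg.norm(feats[a] - feats[b]) < feat_tol
                      and np.linalg.norm(pos[a] - pos[b]) > min_offset):
                  matches.append((tuple(pos[a]), tuple(pos[b])))
          return matches

      # Toy example: copy a patch of a random image elsewhere and detect it
      rng = np.random.default_rng(1)
      img = rng.integers(0, 256, (128, 128, 3))
      img[80:112, 80:112] = img[16:48, 16:48]        # simulated copy-move
      print(find_duplicates(img)[:5])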

  6. Simplified signal processing for impedance spectroscopy with spectrally sparse sequences

    NASA Astrophysics Data System (ADS)

    Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.

    2013-04-01

    The classical method for measuring electrical bio-impedance involves excitation with a sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables a wide variety of signal processing options, the most general of them being the Fourier transform. Multiplication with two quadrature waveforms at the desired frequency can easily be accomplished both in the analogue and in the digital domain; even the simplest quadrature square waves can be considered, which reduces the signal processing task in the analogue domain to synchronous switching followed by a low-pass filter, and in the digital domain requires only additions. So-called spectrally sparse excitation sequences (SSS), which have recently been introduced into the bio-impedance measurement domain, are a very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and a good crest factor compared with similar multisinusoids. So far, the use of the discrete or fast Fourier transform has typically been considered for the signal processing step. Simplified methods would nevertheless reduce the computational burden and enable simpler, less costly and less energy-hungry signal processing platforms. The accuracy of the measurement with SSS excitation when using different waveforms for quadrature demodulation is compared in order to evaluate the feasibility of the simplified signal processing. A sigma-delta modulated sinusoid (a binary signal) is considered to be a good alternative for synchronous demodulation.
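
    The following sketch is illustrative only: the excitation frequencies, amplitudes and record length are assumptions, and it demodulates a plain multisine rather than a true SSS. It contrasts sinusoidal and square-wave quadrature demodulation of one frequency component; with square references the arithmetic reduces to sign flips and additions, but the odd harmonics of the reference also pick up the other excitation lines, which is the accuracy trade-off discussed above.

      import numpy as np

      fs, n = 100_000, 100_000                      # sample rate (Hz) and record length
      t = np.arange(n) / fs
      freqs, amps, phases = [1000, 3000, 7000], [1.0, 0.5, 0.25], [0.3, 1.1, -0.7]
      signal = sum(a * np.cos(2 * np.pi * f * t + p)
                   for f, a, p in zip(freqs, amps, phases))

      def demodulate(sig, f, reference="sine"):
          """Quadrature demodulation of the component at frequency f."""
          i_ref, q_ref = np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)
          scale = 1.0
          if reference == "square":                 # +/-1 references: additions only
              i_ref, q_ref = np.sign(i_ref), np.sign(q_ref)
              scale = np.pi / 4                     # square-wave fundamental is 4/pi
          i = 2 * scale * np.mean(sig * i_ref)
          q = 2 * scale * np.mean(sig * q_ref)
          return np.hypot(i, q), np.arctan2(-q, i)

      for ref in ("sine", "square"):
          mag, ph = demodulate(signal, 1000, ref)
          print(f"{ref:6s} reference: magnitude {mag:.3f} (true 1.000), "
                f"phase {ph:+.3f} rad (true +0.300)")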

  7. Improvement of lateral resolution of spectral domain optical coherence tomography images in out-of-focus regions with holographic data processing techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moiseev, A A; Gelikonov, G V; Terpelov, D A

    2014-08-31

    An analogy between spectral-domain optical coherence tomography (SD OCT) data and broadband digital holography data is considered. Based on this analogy, a method for processing SD OCT data, which makes it possible to construct images with a lateral resolution in the whole investigated volume equal to the resolution in the in-focus region, is developed. Several issues concerning practical application of the proposed method are discussed. (laser biophotonics)

  8. Method of steering the gain of a multiple antenna global positioning system receiver

    NASA Astrophysics Data System (ADS)

    Evans, Alan G.; Hermann, Bruce R.

    1992-06-01

    A method for steering the gain of a multiple antenna Global Positioning System (GPS) receiver toward a plurality of GPS satellites simultaneously is provided. The GPS signals of a known wavelength are processed digitally for a particular instant in time. A range difference, or propagation delay, between each antenna for GPS signals received from each satellite is first resolved. The range difference consists of a fractional wavelength difference and an integer wavelength difference. The fractional wavelength difference is determined by each antenna's tracking loop. The integer wavelength difference is based upon the known wavelength and the separation between each antenna with respect to each satellite position. The range difference is then used to digitally delay the GPS signals at each antenna with respect to a reference antenna. The signal at the reference antenna is then summed with the digitally delayed signals to generate a composite antenna gain. The method searches for the correct number of integer wavelengths to maximize the composite gain. The range differences are also used to determine the attitude of the array.
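
    A toy phased-array sketch of the delay-and-sum step follows (not the patented receiver: the wavelength, baselines, elevation angle and the representation of each antenna's carrier as a single complex phasor are simplifying assumptions, and the integer-cycle search and attitude determination are not reproduced). Phase-rotating each antenna by its range difference for a candidate steering direction and summing with the reference antenna yields the composite gain, which peaks when the steering direction matches the satellite direction.

      import numpy as np

      wavelength = 0.1903                         # GPS L1 carrier wavelength, metres
      baselines = np.array([0.0, 0.35, 0.80])     # antenna offsets along the array, metres
      true_elev = np.deg2rad(35.0)                # satellite elevation seen by the array

      # Received carrier phasors: phase advances with each antenna's extra path length
      true_ranges = baselines * np.cos(true_elev)
      signals = np.exp(2j * np.pi * true_ranges / wavelength)

      def composite_gain(steer_elev):
          """Delay (phase-rotate) each antenna for the steering direction, then sum."""
          steer_ranges = baselines * np.cos(steer_elev)
          rotated = signals * np.exp(-2j * np.pi * steer_ranges / wavelength)
          return np.abs(rotated.sum())            # reference antenna is the 0.0 baseline

      elevs = np.deg2rad(np.arange(0, 91))
      gains = np.array([composite_gain(e) for e in elevs])
      print(f"max composite gain {gains.max():.2f} (ideal {len(baselines)}) "
            f"at elevation {np.rad2deg(elevs[gains.argmax()]):.0f} deg (true 35 deg)")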

  9. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. The method uses the free software ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated, and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of several examples of fragmental impactites.
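
    A minimal sketch of the kind of measurement such a workflow produces is given below; a synthetic image, a single hand-picked gray-level threshold and scipy's connected-component labelling stand in for the ImageJ processing, and none of this reproduces the authors' protocol.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)
      img = rng.normal(100, 5, (512, 512))                 # synthetic matrix gray levels
      for _ in range(40):                                  # paint brighter circular "clasts"
          cy, cx, r = rng.integers(0, 512, 2).tolist() + [rng.integers(5, 20)]
          yy, xx = np.ogrid[:512, :512]
          img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = rng.normal(180, 5)

      mask = img > 140                                     # semi-automatic threshold choice
      labels, n = ndimage.label(mask)                      # connected-component labelling
      areas = np.bincount(labels.ravel())[1:]              # pixel area of each clast

      print(f"clasts found: {n}")
      print(f"areal fraction of clasts: {mask.mean():.3f}")
      print(f"mean clast area: {areas.mean():.1f} px, largest: {areas.max()} px")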

  10. A new OTDR based on probe frequency multiplexing

    NASA Astrophysics Data System (ADS)

    Lu, Lidong; Liang, Yun; Li, Binglin; Guo, Jinghong; Zhang, Xuping

    2013-12-01

    Two signal multiplexing methods are proposed and experimentally demonstrated in optical time domain reflectometry (OTDR) for fault location on optical fiber transmission lines with high measurement efficiency. The multiplexed probe signals are obtained by phase modulation, generating multi-frequency and time-sequential frequency probe pulses. The backscattered Rayleigh light of the multiplexed probe signals is transferred to corresponding heterodyne intermediate frequencies (IFs) by heterodyning with a single-frequency local oscillator (LO). The IFs are then acquired simultaneously with a data acquisition card (DAQ) at a sampling rate of 100 Msps, and the acquired data are processed by digital band-pass filtering (BPF), digital down-conversion (DDC) and digital low-pass filtering (LPF). For each probe frequency of the detected signals, the extraction of the time-domain reflected signal power is performed by a parallel computing method. For a comprehensive performance comparison with conventional coherent OTDR, the potential of the probe frequency multiplexing methods for enhancement of dynamic range, spatial resolution and measurement time is analyzed and discussed. Experimental results show that by using the probe frequency multiplexing method, the measurement efficiency of coherent OTDR can be enhanced by nearly 40 times.
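
    The per-frequency processing chain lends itself to a compact illustration. The sketch below uses assumed parameters throughout (the IF values, fibre decay constant, event position, and a moving-average filter in place of proper BPF/LPF designs): it synthesizes a backscatter record carrying three intermediate frequencies, then recovers each trace by digital down-conversion and low-pass filtering.

      import numpy as np

      fs = 100e6                                   # DAQ sampling rate, 100 Msps
      t = np.arange(20000) / fs                    # 200 microsecond record
      ifs = [10e6, 15e6, 20e6]                     # heterodyne intermediate frequencies

      # Synthetic backscatter: exponential decay plus a reflective event near the end
      rng = np.random.default_rng(0)
      envelope = np.exp(-t / 80e-6)
      envelope[12000:12050] += 1.0
      raw = sum(envelope * np.cos(2 * np.pi * f_if * t + rng.uniform(0, 2 * np.pi))
                for f_if in ifs) + 0.01 * rng.standard_normal(t.size)

      def extract_power(sig, f_if, taps=201):
          """Digital down-conversion followed by a moving-average low-pass filter."""
          baseband = sig * np.exp(-2j * np.pi * f_if * t)         # DDC to baseband
          lpf = np.ones(taps) / taps                              # crude FIR low-pass
          return np.abs(np.convolve(baseband, lpf, mode="same"))  # reflectance trace

      traces = {f_if: extract_power(raw, f_if) for f_if in ifs}   # one trace per probe
      for f_if, tr in traces.items():
          bump = tr[12000:12050].mean() / tr[11500:11900].mean()
          print(f"IF {f_if/1e6:4.0f} MHz: event-to-baseline ratio {bump:.2f}")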

  11. Optoelectronic scanning system upgrade by energy center localization methods

    NASA Astrophysics Data System (ADS)

    Flores-Fuentes, W.; Sergiyenko, O.; Rodriguez-Quiñonez, J. C.; Rivas-López, M.; Hernández-Balbuena, D.; Básaca-Preciado, L. C.; Lindner, L.; González-Navarro, F. F.

    2016-11-01

    The problem of upgrading an optoelectronic scanning system with digital post-processing of the signal, based on adequate methods of energy center localization, is considered. An improved dynamic triangulation analysis technique is proposed, illustrated by an example of industrial infrastructure damage detection. A modification of our previously published method for searching for the energy center of an optoelectronic signal is described. The application of an artificial-intelligence algorithm that compensates for the error in determining the angular coordinate when calculating the spatial coordinate through dynamic triangulation is demonstrated. Five energy center localization methods are developed and tested to select the best one. After implementation of these methods, digital compensation for the measurement error, and statistical data analysis, non-parametric behavior of the data is identified, and the Wilcoxon signed rank test is applied to improve the result further. For optical scanning systems, it is necessary to detect a light emitter mounted on the infrastructure under investigation in order to calculate its spatial coordinate by the energy center localization method.
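
    As a minimal illustration of what an energy center localization method computes, the baseline-clipped centroid below is only one of many possible estimators and is not taken from the paper; the synthetic pulse parameters are assumptions.

      import numpy as np

      def energy_center(signal, positions=None, baseline_pct=50):
          """Intensity-weighted centroid of a sampled optoelectronic signal."""
          signal = np.asarray(signal, dtype=float)
          if positions is None:
              positions = np.arange(signal.size)
          # Clip away the baseline so the noise floor does not bias the centroid
          weights = np.clip(signal - np.percentile(signal, baseline_pct), 0.0, None)
          return float(np.sum(positions * weights) / np.sum(weights))

      # Gaussian-like pulse centred at sample 42.3, riding on a noisy baseline
      rng = np.random.default_rng(3)
      x = np.arange(100)
      pulse = np.exp(-0.5 * ((x - 42.3) / 4.0) ** 2) + 0.05 + 0.01 * rng.standard_normal(100)
      print(f"estimated energy centre: {energy_center(pulse):.2f} (pulse centred at 42.30)")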

  12. Convolution Operations on Coding Metasurface to Reach Flexible and Continuous Controls of Terahertz Beams.

    PubMed

    Liu, Shuo; Cui, Tie Jun; Zhang, Lei; Xu, Quan; Wang, Qiu; Wan, Xiang; Gu, Jian Qiang; Tang, Wen Xuan; Qing Qi, Mei; Han, Jia Guang; Zhang, Wei Li; Zhou, Xiao Yang; Cheng, Qiang

    2016-10-01

    The concept of the coding metasurface makes a link between physical metamaterial particles and digital codes, and hence it is possible to perform digital signal processing on the coding metasurface to realize unusual physical phenomena. Here, this study performs Fourier operations on coding metasurfaces and proposes a principle called scattering-pattern shift, based on the convolution theorem, which allows steering of the scattering pattern to an arbitrarily predesigned direction. Owing to the constant reflection amplitude of the coding particles, the required coding pattern can be obtained simply from the modulus addition of two coding matrices. This study demonstrates that the scattering patterns calculated directly from the coding pattern using the Fourier transform show excellent agreement with numerical simulations based on realistic coding structures, providing an efficient method for optimizing coding patterns to achieve predesigned scattering beams. The most important advantage of this approach over previous schemes for producing anomalous single-beam scattering is its flexible and continuous control of the beam toward arbitrary directions. This work opens a new route to studying metamaterials from a fully digital perspective, pointing to the possibility of combining conventional theorems of digital signal processing with the coding metasurface to realize more powerful manipulations of electromagnetic waves.
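
    The scattering-pattern-shift principle can be reproduced numerically in one dimension under strong simplifications (2-bit coding, ideal isotropic elements, and the far field taken as the discrete Fourier transform of the aperture phase; none of the paper's geometry is modelled). Adding a gradient coding sequence modulo 4 to an arbitrary coding sequence shifts its scattering pattern by exactly the gradient's own beam offset:

      import numpy as np

      n_elements, bits = 64, 2
      levels = 2 ** bits                               # 2-bit coding: 0, 90, 180, 270 degrees
      rng = np.random.default_rng(7)

      code_a = rng.integers(0, levels, n_elements)     # arbitrary coding sequence
      code_g = np.arange(n_elements) % levels          # gradient sequence 0,1,2,3,0,1,...
      code_sum = (code_a + code_g) % levels            # modulo addition of coding matrices

      def far_field(code, nfft=1024):
          """Magnitude of the far-field pattern of a 1-D coding sequence."""
          aperture = np.exp(1j * 2 * np.pi * code / levels)     # element reflection phases
          return np.fft.fftshift(np.abs(np.fft.fft(aperture, nfft)))

      fa, fg, fsum = far_field(code_a), far_field(code_g), far_field(code_sum)

      shift = int(np.argmax(fg)) - 512                 # beam offset of the gradient alone
      print("gradient beam offset (FFT bins):", shift)
      print("pattern of (A + G) mod 4 equals pattern of A shifted by that offset:",
            np.allclose(fsum, np.roll(fa, shift)))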

  13. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  14. Digital test assembly of truck parts with the IMMA-tool--an illustrative case.

    PubMed

    Hanson, L; Högberg, D; Söderholm, M

    2012-01-01

    Several digital human modelling (DHM) tools have been developed for simulation and visualisation of human postures and motions. In 2010 the DHM tool IMMA (Intelligently Moving Manikins) was introduced as a DHM tool that uses advanced path planning techniques to generate collision free and biomechanically acceptable motions for digital human models (as well as parts) in complex assembly situations. The aim of the paper is to illustrate how the IPS/IMMA tool is used at Scania CV AB in a digital test assembly process, and to compare the tool with other DHM tools on the market. The illustrated case of using the IMMA tool, here combined with the path planner tool IPS, indicates that the tool is promising. The major strengths of the tool are its user friendly interface, the motion generation algorithms, the batch simulation of manikins and the ergonomics assessment methods that consider time.

  15. Digital lectures for learning gross anatomy: a study of their efficacy

    PubMed Central

    2017-01-01

    Purpose The current study investigates the level of students’ learning and attitudes towards the teaching and learning process when using digital lectures to teach gross anatomy to year 1 medical students. Methods The study sampled year 1 medical students of cohorts 2013 and 2014. The year 1 medical students in 2013 were taught gross anatomy of the heart by didactic classroom lectures while those in 2014 were taught with digital lectures using the same content. A review session was conducted for the 2014 cohort. A 19-item survey was distributed amongst students to investigate their attitudes and feedback. The data were analysed using SPSS software. Results The 2014 cohort had a mean score of 47.65 for short essay questions and 51.19 for multiple choice questions, while the 2013 cohort scored an average of 36.80 for short essay questions and 49.22 for multiple choice questions. The difference in scores for each type of question was found to be significant. Using a 5-point Likert scale, students gave an average of 4.11 when asked if they liked the teaching and learning process and would like it to be applied further. Conclusion The results of the study provide strong evidence that the digital teaching and learning process was well received by students and could also lead to improved performance. Digital lectures can provide a satisfactory substitute for classroom lectures to teach gross anatomy, thus providing flexibility in learning and efficient learning, whilst also freeing lecture slots to promote mastery learning. PMID:28264551

  16. Discrete Walsh Hadamard transform based visible watermarking technique for digital color images

    NASA Astrophysics Data System (ADS)

    Santhi, V.; Thangavelu, Arunkumar

    2011-10-01

    As the size of the Internet grows enormously, the illegal manipulation of digital multimedia data becomes very easy with the advancement of technology tools. In order to protect such multimedia data from unauthorized access, digital watermarking systems are used. In this paper a new Discrete Walsh Hadamard Transform based visible watermarking system is proposed. As the watermark is embedded in the transform domain, the system is robust to many signal processing attacks. Moreover, in the proposed method the watermark is embedded in a tiling manner across the full range of frequencies to make it robust to compression and cropping attacks. The robustness of the algorithm is tested against noise addition, cropping, compression, histogram equalization and resizing attacks. The experimental results show that the algorithm is robust to common signal processing attacks and that the observed peak signal-to-noise ratio (PSNR) of the watermarked image varies from 20 to 30 dB depending on the size of the watermark.
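
    A much-reduced sketch of transform-domain visible watermark embedding is shown below (single channel, one global block and one scaling factor alpha; the paper's tiled embedding and per-coefficient scaling are not reproduced, and with a single global alpha the result is equivalent to a spatial blend). The PSNR of this toy example comes out near 21 dB, within the 20-30 dB range quoted above.

      import numpy as np
      from scipy.linalg import hadamard

      def wht2(x):
          """Orthonormal 2-D Walsh-Hadamard transform of a 2^k x 2^k array."""
          h = hadamard(x.shape[0]) / np.sqrt(x.shape[0])
          return h @ x @ h                          # h is symmetric and orthonormal

      def embed_visible(host, mark, alpha=0.15):
          """Blend the watermark into the host in the transform domain."""
          c = (1 - alpha) * wht2(host) + alpha * wht2(mark)
          return wht2(c)                            # the orthonormal WHT is its own inverse

      rng = np.random.default_rng(5)
      host = rng.integers(0, 256, (256, 256)).astype(float)
      mark = np.zeros((256, 256))
      mark[96:160, 96:160] = 255.0                  # simple square "logo"

      marked = embed_visible(host, mark)
      psnr = 10 * np.log10(255.0 ** 2 / np.mean((marked - host) ** 2))
      print(f"PSNR of the watermarked image: {psnr:.1f} dB")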

  17. Imaging model for the scintillator and its application to digital radiography image enhancement.

    PubMed

    Wang, Qian; Zhu, Yining; Li, Hongwei

    2015-12-28

    Digital radiography (DR) images obtained by an OCD-based (optical coupling detector) micro-CT system usually suffer from low contrast. In this paper, a mathematical model is proposed to describe the image formation process in the scintillator. By solving the corresponding inverse problem, the quality of DR images is improved, i.e. higher contrast and spatial resolution are obtained. By analyzing the radiative transfer of visible light in the scintillator, scattering is identified as the main factor leading to low contrast. The associated blurring effect is also considered and described by a point spread function (PSF). Based on these physical processes, the scintillator imaging model is then established. To solve the inverse problem, pre-correction of the x-ray intensity, a dark-channel-prior-based haze removal technique, and an effective blind deblurring approach are employed. Experiments on a variety of DR images show that the proposed approach can improve the contrast of DR images dramatically as well as eliminate blurring effectively. Compared with traditional contrast enhancement methods, such as CLAHE, our method preserves the relative absorption values well.
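
    Of the several steps listed above, the deblurring step is the easiest to illustrate in isolation. The sketch below applies a plain Wiener filter with a known Gaussian PSF to a synthetic edge image; the scatter pre-correction, dark-channel step and blind PSF estimation of the paper are not reproduced, and all parameters are assumptions.

      import numpy as np

      def gaussian_psf(shape, sigma):
          """Normalized Gaussian PSF centred in an array of the given shape."""
          y, x = np.indices(shape)
          cy, cx = shape[0] // 2, shape[1] // 2
          psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
          return psf / psf.sum()

      def wiener_deconvolve(blurred, psf, k=1e-3):
          """Frequency-domain Wiener filter: conj(H) / (|H|^2 + k)."""
          H = np.fft.fft2(np.fft.ifftshift(psf))
          G = np.fft.fft2(blurred)
          return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k)))

      # Synthetic DR-like image: a sharp edge, blurred by the PSF, with mild noise
      rng = np.random.default_rng(2)
      truth = np.zeros((256, 256))
      truth[:, 128:] = 1.0
      psf = gaussian_psf(truth.shape, sigma=3.0)
      blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
      blurred += 0.01 * rng.standard_normal(blurred.shape)

      restored = wiener_deconvolve(blurred, psf)

      def edge_width(img):                          # pixels per row inside the 10-90% transition
          return float(np.mean(np.sum((img[:, 100:156] > 0.1) & (img[:, 100:156] < 0.9), axis=1)))

      print(f"10-90% edge width: blurred {edge_width(blurred):.1f} px, "
            f"restored {edge_width(restored):.1f} px")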

  18. Conventional and digital radiographic assessment of tooth enamel de-/remineralization processes: an experimental study.

    PubMed

    Leite-Ribeiro, Patrícia; de Oliveira, Thais Feitosa Leitão; Mathias, Paula; Campo, Elisângela de Jesus; Sarmento, Viviane Almeida

    2014-01-01

    This study aimed to compare digital techniques for evaluating dental enamel de-/remineralization. Sixty extracted molars were subjected to a process of de- and remineralization. Radiographs were taken before and after each stage. These radiographs were evaluated by the conventional method and were then scanned and analyzed either with or without the use of image enhancement. Moreover, the gray levels (GLs) of the affected areas were measured. All methods exhibited low sensitivity and identical levels of specificity (99.4%). Analysis of the grayscale levels found statistically significant differences between the initial radiographs (P < 0.05). The mean GL of the carious group was significantly lower than that of the remineralized group. The GL did not differ significantly between the initial and final radiographs of the remineralized group, although the mean of the first group was lower than that of the second, which demonstrated that the remineralization process restored the normal density of the dental enamel. Measurement of the mean GL was sufficiently sensitive to detect small alterations in the surface of the enamel.

  19. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software-based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated by a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has better linewidth tolerance than phase-locked loops and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due to cycle slips. In systems where nonlinear effects are concentrated mostly at fiber locations with small accumulated dispersion, nonlinear phase de-rotation is a low-complexity algorithm that can partially mitigate nonlinear effects. For systems with arbitrary dispersion maps, however, backpropagation is the only universal technique that can jointly compensate dispersion and fiber nonlinearity. Backpropagation requires solving the nonlinear Schrödinger equation at the receiver, and has high computational cost. Backpropagation is most effective when dispersion compensation fibers are removed, and when signal processing is performed at three times oversampling. Backpropagation can improve system performance and increase transmission distance. With anticipated advances in analog-to-digital converters and integrated circuit technology, DSP-based coherent receivers at bit rates up to 100 Gb/s should become practical in the near future.
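
    As one concrete example of the DSP blocks described above, the following sketch applies and then removes chromatic dispersion in the frequency domain for a crudely pulse-shaped QPSK signal. The symbol rate, fibre parameters and the absence of noise, PMD and nonlinearity are simplifying assumptions; a practical receiver would use the FIR or frequency-domain equalizers discussed in the text.

      import numpy as np

      baud = 28e9                            # symbol rate
      sps = 2                                # samples per symbol
      fs = baud * sps
      beta2 = -21.7e-27                      # s^2/m, standard single-mode fibre at 1550 nm
      length = 800e3                         # 800 km link

      rng = np.random.default_rng(4)
      symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], 4096)   # QPSK symbols
      tx = np.repeat(symbols, sps)                                     # crude pulse shaping

      # Fibre chromatic dispersion as an all-pass frequency-domain transfer function
      w = 2 * np.pi * np.fft.fftfreq(tx.size, d=1 / fs)
      cd = np.exp(-1j * beta2 / 2 * w ** 2 * length)
      rx = np.fft.ifft(np.fft.fft(tx) * cd)                  # dispersed signal
      eq = np.fft.ifft(np.fft.fft(rx) * np.conj(cd))         # CD compensation filter

      def evm(a, b):                         # error vector magnitude versus the clean signal
          return float(np.sqrt(np.mean(np.abs(a - b) ** 2) / np.mean(np.abs(b) ** 2)))

      print(f"EVM before CD compensation: {evm(rx, tx):.3f}, after: {evm(eq, tx):.3f}")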

  20. Video rate morphological processor based on a redundant number representation

    NASA Astrophysics Data System (ADS)

    Kuczborski, Wojciech; Attikiouzel, Yianni; Crebbin, Gregory A.

    1992-03-01

    This paper presents a video-rate morphological processor for automated visual inspection of printed circuit boards, integrated circuit masks, and other complex objects. The inspection algorithms are based on gray-scale mathematical morphology. The hardware complexity of the known methods for real-time implementation of gray-scale morphology (the umbra transform and threshold decomposition) has prompted us to propose a novel technique which applies an arithmetic system without carry propagation. After considering several arithmetic systems, a redundant number representation has been selected for implementation. Two options are analyzed here. The first is a pure signed digit number representation (SDNR) with base 4. The second is a combination of the base-2 SDNR (to represent the gray levels of images) and the conventional two's complement code (to represent the gray levels of structuring elements). The operating principle of the morphological processor is based on the concept of the digit-level systolic array: individual processing units and small memory elements form a pipeline, with the memory elements storing the current image windows (kernels). All operation primitives of the processing units apply a unified direction of digit processing: most significant digit first (MSDF). The implementation technology is based on field programmable gate arrays from Xilinx. This paper justifies the rationale for a new approach to logic design, namely the decomposition of Boolean functions instead of Boolean minimization.
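
    For reference, a plain software version of the gray-scale operations that the processor implements in hardware is sketched below (flat 3x3 structuring element and a synthetic image; the digit-serial SDNR arithmetic that is the point of the paper is not modelled).

      import numpy as np

      def grayscale_erode(img, se):
          """Gray-scale erosion with a flat structuring element `se` (boolean mask)."""
          h, w = se.shape
          ph, pw = h // 2, w // 2
          padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
          out = np.empty_like(img)
          for y in range(img.shape[0]):
              for x in range(img.shape[1]):
                  out[y, x] = padded[y:y + h, x:x + w][se].min()   # minimum over the SE
          return out

      def grayscale_dilate(img, se):
          return -grayscale_erode(-img, se[::-1, ::-1])            # dilation by duality

      # Opening (erosion then dilation) removes bright specks smaller than the SE
      rng = np.random.default_rng(6)
      img = rng.integers(80, 120, (64, 64)).astype(int)
      img[10, 10] = 255                                            # a bright defect-like speck
      se = np.ones((3, 3), dtype=bool)
      opened = grayscale_dilate(grayscale_erode(img, se), se)
      print("speck gray level before/after opening:", img[10, 10], opened[10, 10])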
