Science.gov

Sample records for digital light processing

  1. DMD: a digital light processing application to projection displays

    NASA Astrophysics Data System (ADS)

    Feather, Gary A.

    1989-01-01

    Revolutionary technologies achieve rapid product and subsequent business diffusion only when the inventors focus on technology application, maturation, and proliferation. A revolutionary technology is emerging with micro-electromechanical systems (MEMS). MEMS are being developed by leveraging mature semiconductor processing coupled with mechanical systems into complete, integrated, useful systems. The digital micromirror device (DMD), a Texas Instruments-invented MEMS, has focused on its application to projection displays. The DMD has demonstrated its application as a digital light processor, processing and producing compelling computer and video projection displays. This tutorial discusses requirements in the projection display market and the potential solutions offered by this digital light processing system. The seminar includes an evaluation of the market, system needs, design, fabrication, application, and performance results of a system using digital light processing solutions.

  2. Digital Light Processing update: status and future applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high-process-speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  3. Desolvation Induced Origami of Photocurable Polymers by Digit Light Processing.

    PubMed

    Zhao, Zeang; Wu, Jiangtao; Mu, Xiaoming; Chen, Haosen; Qi, H Jerry; Fang, Daining

    2017-07-01

    Self-folding origami is of great interest in current research on functional materials and structures, but there is still a challenge to develop a simple method to create freestanding, reversible, and complex origami structures. This communication provides a feasible solution to this challenge by developing a method based on the digit light processing technique and desolvation-induced self-folding. In this new method, flat polymer sheets can be cured by a light field from a commercial projector with varying intensity, and the self-folding process is triggered by desolvation in water. Folded origami structures can be recovered once immersed in the swelling medium. The self-folding process is investigated both experimentally and theoretically. Diverse 3D origami shapes are demonstrated. This method can be used for responsive actuators and the fabrication of 3D electronic devices. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Modular Elastomer Photoresins for Digital Light Processing Additive Manufacturing.

    PubMed

    Thrasher, Carl J; Schwartz, Johanna J; Boydston, Andrew J

    2017-11-15

    A series of photoresins suitable for the production of elastomeric objects via digital light processing additive manufacturing are reported. Notably, the printing procedure is readily accessible using only entry-level equipment under ambient conditions using visible light projection. The photoresin formulations were found to be modular in nature, and straightforward adjustments to the resin components enabled access to a range of compositions and mechanical properties. Collectively, the series includes silicones, hydrogels, and hybrids thereof. Printed test specimens displayed maximum elongations of up to 472% under tensile load, a tunable swelling behavior in water, and Shore A hardness values from 13.7 to 33.3. A combination of the resins was used to print a functional multimaterial three-armed pneumatic gripper. These photoresins could be transformative to advanced prototyping applications such as simulated human tissues, stimuli-responsive materials, wearable devices, and soft robotics.

  5. Anisotropy of Photopolymer Parts Made by Digital Light Processing

    PubMed Central

    Monzón, Mario; Ortega, Zaida; Hernández, Alba; Paz, Rubén; Ortega, Fernando

    2017-01-01

    Digital light processing (DLP) is an accurate additive manufacturing (AM) technology suitable for producing micro-parts by photopolymerization. As with most AM technologies, the anisotropy of parts made by DLP is a key issue to deal with, taking into account that several operational factors modify this characteristic. Design for this technology and its photopolymers becomes a challenge because the manufacturing process and post-processing strongly influence the mechanical properties of the part. This paper shows experimental work to demonstrate the particular behavior of parts made using DLP. Because it differs from other AM technologies, rules for design need to be adapted. The influence of build direction and the post-curing process on final mechanical properties and anisotropy is reported and justified based on experimental data and theoretical simulation of bi-material parts formed by fully-cured resin and partially-cured resin. Three photopolymers were tested under different working conditions, concluding that post-curing can, in some cases, correct the anisotropy, depending mainly on the nature of the photopolymer. PMID:28772426

  6. Highly Stretchable and UV Curable Elastomers for Digital Light Processing Based 3D Printing.

    PubMed

    Patel, Dinesh K; Sakhaei, Amir Hosein; Layani, Michael; Zhang, Biao; Ge, Qi; Magdassi, Shlomo

    2017-04-01

    Stretchable UV-curable (SUV) elastomers can be stretched by up to 1100% and are suitable for digital-light-processing (DLP)-based 3D-printing technology. DLP printing of these SUV elastomers enables the direct creation of highly deformable complex 3D hollow structures such as balloons, soft actuators, grippers, and buckyball electronic switches. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Continuous Digital Light Processing (cDLP): Highly Accurate Additive Manufacturing of Tissue Engineered Bone Scaffolds.

    PubMed

    Dean, David; Jonathan, Wallace; Siblani, Ali; Wang, Martha O; Kim, Kyobum; Mikos, Antonios G; Fisher, John P

    2012-03-01

    Highly accurate rendering of the external and internal geometry of bone tissue engineering scaffolds affects fit at the defect site, loading of internal pore spaces with cells, bioreactor-delivered nutrient and growth factor circulation, and scaffold resorption. It may be necessary to render resorbable polymer scaffolds with 50 μm or better accuracy to achieve these goals. This level of accuracy is available using continuous digital light processing (cDLP), which utilizes a DLP® (Texas Instruments, Dallas, TX) chip. One such additive manufacturing device is the envisionTEC (Ferndale, MI) Perfactory®. To use cDLP we integrate a photo-crosslinkable polymer, a photo-initiator, and a biocompatible dye. The dye attenuates light, thereby limiting the depth of polymerization. In this study we fabricated scaffolds using the well-studied resorbable polymer poly(propylene fumarate) (PPF), titanium dioxide (TiO2) as a dye, Irgacure® 819 (BASF [Ciba], Florham Park, NJ) as an initiator, and diethyl fumarate as a solvent to control viscosity.
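
    The dye-limited depth of polymerization mentioned above is commonly described by the stereolithography working curve, in which cured depth grows logarithmically with exposure. The sketch below illustrates that general relation only; it is not taken from this study, and the resin parameters used are hypothetical.

      import numpy as np

      def cure_depth(exposure, penetration_depth, critical_exposure):
          """Working-curve relation: cured depth = Dp * ln(E / Ec) once the
          critical exposure Ec is exceeded, zero otherwise."""
          exposure = np.asarray(exposure, dtype=float)
          return np.where(exposure > critical_exposure,
                          penetration_depth * np.log(exposure / critical_exposure),
                          0.0)

      # Hypothetical values: adding the dye lowers the resin's penetration depth,
      # which caps the cured layer thickness for a given exposure.
      exposures = np.array([50.0, 100.0, 200.0, 400.0])        # mJ/cm^2
      print(cure_depth(exposures, penetration_depth=70.0,      # micrometres, with dye
                       critical_exposure=30.0))                 # mJ/cm^2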

  8. Evaluation of a novel laparoscopic camera for characterization of renal ischemia in a porcine model using digital light processing (DLP) hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.

    2012-03-01

    Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.

  9. Build Angle: Does It Influence the Accuracy of 3D-Printed Dental Restorations Using Digital Light-Processing Technology?

    PubMed

    Osman, Reham B; Alharbi, Nawal; Wismeijer, Daniel

    The aim of this study was to evaluate the effect of the build orientation/build angle on the dimensional accuracy of full-coverage dental restorations manufactured using digital light-processing technology (DLP-AM). A full dental crown was digitally designed and 3D-printed using DLP-AM. Nine build angles were used: 90, 120, 135, 150, 180, 210, 225, 240, and 270 degrees. The specimens were digitally scanned using a high-resolution optical surface scanner (IScan D104i, Imetric). Dimensional accuracy was evaluated using the digital subtraction technique. The 3D digital files of the scanned printed crowns (test model) were exported in standard tessellation language (STL) format and superimposed on the STL file of the designed crown (reference model) using Geomagic Studio 2014 (3D Systems). The root mean square estimate (RMSE) values were evaluated, and the deviation patterns on the color maps were further assessed. The build angle influenced the dimensional accuracy of 3D-printed restorations. The lowest RMSE was recorded for the 135-degree and 210-degree build angles. However, the overall deviation pattern on the color map was more favorable with the 135-degree build angle in contrast with the 210-degree build angle, where the deviation was observed around the critical marginal area. Within the limitations of this study, the recommended build angle using the current DLP system was 135 degrees. Among the selected build angles, it offers the highest dimensional accuracy and the most favorable deviation pattern. It also offers a self-supporting crown geometry throughout the building process.
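
    The accuracy metric described above (superimposing the scanned crown on the design file and computing a root mean square estimate of the deviations) can be sketched as follows. This is a simplified illustration, not the study's Geomagic workflow: it assumes both surfaces are given as point clouds and pairs each scanned point with its nearest design point by brute force.

      import numpy as np

      def rmse_deviation(test_points, reference_points):
          """RMS deviation between a scanned surface and its design, pairing each
          scanned point with its nearest design point (brute-force search)."""
          d = np.linalg.norm(test_points[:, None, :] - reference_points[None, :, :], axis=2)
          nearest = d.min(axis=1)                  # deviation of each scanned point
          return np.sqrt(np.mean(nearest ** 2))

      # Toy example: a "scanned" point cloud slightly perturbed from the reference.
      rng = np.random.default_rng(0)
      reference = rng.uniform(size=(500, 3))
      scanned = reference + rng.normal(scale=0.02, size=reference.shape)
      print(f"RMSE deviation: {rmse_deviation(scanned, reference):.4f}")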

  10. Fabrication of an interim complete removable dental prosthesis with an in-office digital light processing three-dimensional printer: A proof-of-concept technique.

    PubMed

    Lin, Wei-Shao; Harris, Bryan T; Pellerito, John; Morton, Dean

    2018-04-30

    This report describes a proof of concept for fabricating an interim complete removable dental prosthesis with a digital light processing 3-dimensional (3D) printer. Although an in-office 3D printer can reduce the overall production cost for an interim complete removable dental prosthesis, the process has not been validated with clinical studies. This report provided a preliminary proof of concept in developing a digital workflow for the in-office additively manufactured interim complete removable dental prosthesis. Copyright © 2018 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  11. Aquarius Digital Processing Unit

    NASA Technical Reports Server (NTRS)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.

  12. The influence of filling technique on depth of tubule penetration by root canal sealer: a study using light microscopy and digital image processing.

    PubMed

    De Deus, Gustavo A; Gurgel-Filho, Eduardo Diogo; Maniglia-Ferreira, Cláudio; Coutinho-Filho, Tauby

    2004-04-01

    The purpose of this study was to compare the depth of sealer penetration into dentinal tubules by three root-filling techniques using light microscopy and digital image processing. Thirty-two maxillary central incisors were prepared. Two teeth were separated for the control group. The rest were divided into three equal groups and obturated as follows: G1, lateral condensation; G2, warm vertical compaction of gutta-percha; and G3, the Thermafil system. Each sample was sectioned longitudinally and prepared for microscopic analysis. A sequence of photomicrographs at magnifications of ×50, ×200 and ×500 was taken. Through digital image analysis and processing, measurements for each field were obtained. A non-parametric Kruskal-Wallis ANOVA was used to determine whether there were significant differences among the groups. Significant differences between G2 and G1 (p = 0.034) and between G3 and G1 (p = 0.021) were identified. There were no significant differences between G2 and G3 (p > 0.05). The results of this research suggest that root fillings placed with thermoplasticised gutta-percha techniques lead to deeper penetration of the root canal sealer into the dentinal tubules.

  13. Validating continuous digital light processing (cDLP) additive manufacturing accuracy and tissue engineering utility of a dye-initiator package.

    PubMed

    Wallace, Jonathan; Wang, Martha O; Thompson, Paul; Busso, Mallory; Belle, Vaijayantee; Mammoser, Nicole; Kim, Kyobum; Fisher, John P; Siblani, Ali; Xu, Yueshuo; Welter, Jean F; Lennon, Donald P; Sun, Jiayang; Caplan, Arnold I; Dean, David

    2014-03-01

    This study tested the accuracy of tissue engineering scaffold rendering via the continuous digital light processing (cDLP) light-based additive manufacturing technology. High accuracy (i.e., <50 µm) allows the designed performance of features relevant to three scale spaces: cell-scaffold, scaffold-tissue, and tissue-organ interactions. The biodegradable polymer poly(propylene fumarate) was used to render highly accurate scaffolds through the use of a dye-initiator package, TiO2 and bis(2,4,6-trimethylbenzoyl)phenylphosphine oxide. This dye-initiator package facilitates high accuracy in the Z dimension. Linear, round, and right-angle features were measured to gauge accuracy. Most features showed accuracies between 5.4% and 15% of the design. However, one feature, an 800 µm diameter circular pore, exhibited a 35.7% average reduction of patency. Light scattered in the x and y directions by the dye may have reduced this feature's accuracy. Our new fine-grained understanding of accuracy could be used to make further improvements by including corrections in the scaffold design software. Successful cell attachment occurred with both canine and human mesenchymal stem cells (MSCs). Highly accurate cDLP scaffold rendering is critical to the design of scaffolds that both guide bone regeneration and fully resorb. Scaffold resorption must occur for regenerated bone to be remodeled and, thereby, achieve optimal strength.

  14. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPG compression. We reconstruct the noisy, blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from XYZ color space to RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
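
    As a companion to the colour-processing blocks listed above, the sketch below shows the last two stages only (conversion from XYZ to RGB followed by gamma encoding), assuming standard sRGB primaries and a simple power-law gamma; the paper's own sensor-specific colour correction is not reproduced.

      import numpy as np

      # Standard XYZ (D65) -> linear sRGB matrix; a real camera pipeline would use
      # a sensor-specific colour-correction matrix instead.
      XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                              [-0.9689,  1.8758,  0.0415],
                              [ 0.0557, -0.2040,  1.0570]])

      def xyz_to_display_rgb(xyz, gamma=2.2):
          """Convert an H x W x 3 XYZ image to gamma-encoded RGB for display."""
          rgb_linear = np.clip(xyz @ XYZ_TO_SRGB.T, 0.0, 1.0)   # colour-space conversion
          return rgb_linear ** (1.0 / gamma)                    # power-law gamma encoding

      # Toy 2 x 2 XYZ patch (red and white primaries on top, green and blue below).
      xyz = np.array([[[0.4124, 0.2126, 0.0193], [0.9505, 1.0000, 1.0890]],
                      [[0.3576, 0.7152, 0.1192], [0.1805, 0.0722, 0.9505]]])
      print(xyz_to_display_rgb(xyz).round(3))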

  15. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel-coded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.

  16. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.

  17. Inelastic Light Scattering Processes

    NASA Technical Reports Server (NTRS)

    Fouche, Daniel G.; Chang, Richard K.

    1973-01-01

    Five different inelastic light scattering processes will be denoted by ordinary Raman scattering (ORS), resonance Raman scattering (RRS), off-resonance fluorescence (ORF), resonance fluorescence (RF), and broad fluorescence (BF). A distinction between fluorescence (including ORF and RF) and Raman scattering (including ORS and RRS) will be made in terms of the number of intermediate molecular states which contribute significantly to the scattered amplitude, and not in terms of excited state lifetimes or virtual versus real processes. The theory of these processes will be reviewed, including the effects of pressure, laser wavelength, and laser spectral distribution on the scattered intensity. The application of these processes to the remote sensing of atmospheric pollutants will be discussed briefly. It will be pointed out that the poor sensitivity of the ORS technique cannot be increased by going toward resonance without also compromising the advantages it has over the RF technique. Experimental results on inelastic light scattering from I2 vapor will be presented. As a single-longitudinal-mode 5145 Å argon-ion laser line was tuned away from an I2 absorption line, the scattering was observed to change from RF to ORF. The basis of the distinction is the different pressure dependence of the scattered intensity. Nearly three orders of magnitude enhancement of the scattered intensity was measured in going from ORF to RF. Forty-seven overtones were observed and their relative intensities measured. The ORF cross section of I2 compared to the ORS cross section of N2 was found to be 3 × 10^6, with I2 at its room-temperature vapor pressure.

  18. Digitizing the KSO white light images

    NASA Astrophysics Data System (ADS)

    Pötzi, W.

    From 1989 to 2007 the Sun was observed at the Kanzelhöhe Observatory in white light on photographic film material. The images are on transparent sheet films and are currently not available to the scientific community. With a photo scanner for transparent film material the films are now being scanned and then prepared for scientific use. The programs for post-processing are already finished and produce FITS and JPEG files as output. The scanning should be finished by the end of 2011, and the data will then be available via our homepage.

  19. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and related areas. The identification can be done by transforming the gray levels so that the gray-level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. Researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation with time complexity O(max(m,n)), where m and n are the numbers of gray levels of the input picture and the reference picture respectively. Using a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m^3 x n). The algorithm partition problem, an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
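
    A serial sketch of the gray-level transformation described above (remapping one picture so its histogram closely matches the reference picture's) is given below; it uses a simple CDF-based mapping rather than the packing method or the proposed VLSI architecture.

      import numpy as np

      def match_histogram(image, reference):
          """Remap the gray levels of `image` (uint8) so that its histogram
          approximates that of `reference` (uint8)."""
          bins = np.arange(257)
          cdf_img = np.cumsum(np.histogram(image, bins=bins)[0]) / image.size
          cdf_ref = np.cumsum(np.histogram(reference, bins=bins)[0]) / reference.size
          # For each input gray level, pick the reference level with the closest CDF value.
          lut = np.searchsorted(cdf_ref, cdf_img).clip(0, 255).astype(np.uint8)
          return lut[image]

      # Toy example: the same "scene" captured dark and bright.
      rng = np.random.default_rng(1)
      dark = rng.normal(80, 20, (64, 64)).clip(0, 255).astype(np.uint8)
      bright = rng.normal(160, 30, (64, 64)).clip(0, 255).astype(np.uint8)
      matched = match_histogram(dark, bright)
      print(dark.mean(), bright.mean(), matched.mean())   # matched mean approaches bright's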

  20. Controlling the digital transfer process

    NASA Astrophysics Data System (ADS)

    Brunner, Felix

    1997-02-01

    The accuracy of today's color management systems fails to satisfy the requirements of the graphic arts market. A first explanation for this is that color calibration charts on which these systems rely, because of print technical reasons, are subject to color deviations and inconsistencies. A second reason is that colorimetry describes the human visual perception of color differences and has no direct relation to the rendering technology itself of a proofing or printing device. The author explains that only firm process control of the many parameters in offset printing by means of a system as for example EUROSTANDARD System Brunner, can lead to accurate and consistent calibration of scanner, display, proof and print. The same principles hold for the quality management of digital presses.

  1. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  2. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques, along with the software documentation, are presented for the digital enhancement of radiographs. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90-deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency- and spatial-domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial-domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing the image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.

  3. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time-continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (fs). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) development of the algorithms required to demodulate the BPSK signal; (2) simulation of the system; and (3) implementation of a BPSK receiver using digital signal processing hardware.
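
    The demodulation steps outlined above can be sketched in software as follows. This is a minimal coherent demodulator assuming the carrier phase and bit timing are already known; the sampling rate, carrier frequency, and bit rate are illustrative values, not those of the project.

      import numpy as np

      fs = 48_000           # sampling rate (Hz), assumed
      fc = 6_000            # carrier frequency (Hz), assumed
      bit_rate = 1_000      # bits per second, assumed
      sps = fs // bit_rate  # samples per bit

      # Modulate: carrier phase 0 for a '1', 180 degrees for a '0'.
      rng = np.random.default_rng(2)
      bits = rng.integers(0, 2, 32)
      symbols = np.repeat(2 * bits - 1, sps)                 # map to +1 / -1
      t = np.arange(symbols.size) / fs
      received = symbols * np.cos(2 * np.pi * fc * t)
      received += rng.normal(scale=0.3, size=received.size)  # additive channel noise

      # Demodulate: mix with a local carrier, integrate over each bit, threshold.
      baseband = received * np.cos(2 * np.pi * fc * t)       # coherent mixing
      integrated = baseband.reshape(-1, sps).sum(axis=1)     # per-bit integration
      recovered = (integrated > 0).astype(int)
      print("bit errors:", np.count_nonzero(recovered != bits))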

  4. Tabletop computed lighting for practical digital photography.

    PubMed

    Mohan, Ankit; Bailey, Reynold; Waite, Jonathan; Tumblin, Jack; Grimm, Cindy; Bodenheimer, Bobby

    2007-01-01

    We apply simplified image-based lighting methods to reduce the equipment, cost, time, and specialized skills required for high-quality photographic lighting of desktop-sized static objects such as museum artifacts. We place the object and a computer-steered moving-head spotlight inside a simple foam-core enclosure and use a camera to record photos as the light scans the box interior. Optimization, guided by interactive user sketching, selects a small set of these photos whose weighted sum best matches the user-defined target sketch. Unlike previous image-based relighting efforts, our method requires only a single area light source, yet it can achieve high-resolution light positioning to avoid multiple sharp shadows. A reduced version uses only a handheld light and may be suitable for battery-powered field photography equipment that fits into a backpack.

  5. Low cost 3D scanning process using digital image processing

    NASA Astrophysics Data System (ADS)

    Aguilar, David; Romero, Carlos; Martínez, Fernando

    2017-02-01

    This paper shows the design and building of a low-cost 3D scanner, able to digitize solid objects through contactless data acquisition, using active object reflection. 3D scanners are used in different applications such as science, engineering and entertainment; they are classified into contact and contactless scanners, the latter being the most widely used although they are expensive. This low-cost prototype performs a vertical scan of the object using a fixed camera and a moving horizontal laser light, which is deformed depending on the 3-dimensional surface of the solid. Digital image processing is used to analyze the deformation detected by the camera; this allows the 3D coordinates to be determined using triangulation. The obtained information is processed by a Matlab script, which gives the user a point cloud corresponding to each horizontal scan performed. The obtained results show an acceptable quality and significant details of the digitized objects, making this prototype (built on a LEGO Mindstorms NXT kit) a versatile and cheap tool, which can be used for many applications, mainly by engineering students.
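
    The triangulation step mentioned above can be illustrated with a simplified model in which the laser stripe's lateral displacement in the image is inversely proportional to depth; the focal length and camera-laser baseline below are hypothetical, and the prototype's actual calibration is not reproduced.

      import numpy as np

      def depth_from_laser_shift(pixel_shift, focal_length_px, baseline_mm):
          """Simple triangulation model: depth = f * b / d, where d is the stripe's
          measured displacement in pixels, f the focal length in pixels, and b the
          camera-laser baseline in millimetres."""
          return focal_length_px * baseline_mm / np.asarray(pixel_shift, dtype=float)

      # Hypothetical scanner geometry: 800 px focal length, 60 mm baseline.
      shifts = np.array([40.0, 48.0, 60.0, 80.0])        # measured displacements (px)
      print(depth_from_laser_shift(shifts, focal_length_px=800.0, baseline_mm=60.0))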

  6. Unified Digital Image Display And Processing System

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Maguire, Gerald Q.; Noz, Marilyn E.; Schimpf, James H.

    1981-11-01

    Our institution, like many others, is faced with a proliferation of medical imaging techniques. Many of these methods give rise to digital images (e.g. digital radiography, computerized tomography (CT), nuclear medicine and ultrasound). We feel that a unified, digital system approach to image management (storage, transmission and retrieval), image processing and image display will help in integrating these new modalities into the present diagnostic radiology operations. Future techniques are likely to employ digital images, so such a system could readily be expanded to include other image sources. We presently have the core of such a system. We can both view and process digital nuclear medicine (conventional gamma camera) images, positron emission tomography (PET) and CT images on a single system. Images from our recently installed digital radiographic unit can be added. Our paper describes our present system, explains the rationale for its configuration, and describes the directions in which it will expand.

  7. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

    In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.

  8. Metric Aspects of Digital Images and Digital Image Processing.

    DTIC Science & Technology

    1984-09-01

    Synthesized aerial photographs were formed by processing a combined elevation and orthophoto data base.

  9. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitizing old globes, which brings with it the possibility of using the globes in digital form. The created digital models are available to the general public through modern technology on the Internet. This gives an opportunity to study old globes located in various historical collections and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models both among themselves and with current map data by increasing the transparency of individual layers. Digitization is carried out using a special device that allows digitizing globes with a diameter ranging from 5 cm to 120 cm. This device can be easily disassembled and is fully mobile, so the globes can be digitized at the place of their storage. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. The acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is as a 3D model through the JavaScript library Cesium or the Google Earth plug-in in a Web browser. The second option is as a georeferenced map using a Tile Map Service.

  10. Digital images in the map revision process

    NASA Astrophysics Data System (ADS)

    Newby, P. R. T.

    Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.

  11. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  12. Digital photography for the light microscope: results with a gated, video-rate CCD camera and NIH-image software.

    PubMed

    Shaw, S L; Salmon, E D; Quatrano, R S

    1995-12-01

    In this report, we describe a relatively inexpensive method for acquiring, storing and processing light microscope images that combines the advantages of video technology with the powerful medium now termed digital photography. Digital photography refers to the recording of images as digital files that are stored, manipulated and displayed using a computer. This report details the use of a gated video-rate charge-coupled device (CCD) camera and a frame grabber board for capturing 256 gray-level digital images from the light microscope. This camera gives high-resolution bright-field, phase contrast and differential interference contrast (DIC) images but, also, with gated on-chip integration, has the capability to record low-light level fluorescent images. The basic components of the digital photography system are described, and examples are presented of fluorescence and bright-field micrographs. Digital processing of images to remove noise, to enhance contrast and to prepare figures for printing is discussed.

  13. Digital processing of signals from femtosecond combs

    NASA Astrophysics Data System (ADS)

    Čížek, Martin; Šmíd, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Číp, Ondrej

    2012-01-01

    The presented work is focused on digital processing of beat-note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are therefore more likely to measure the frequency of the strongest spectral component rather than that of a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software-defined radio (SDR) algorithms. Our efforts concentrate on two main areas. Firstly, we are experimenting with digital signal processing of the RF beat-note spectrum produced by the f-2f technique and with fully digital servo-loop stabilization of the fs comb. Secondly, we are using digital servo-loop techniques for locking free-running continuous laser sources to single components of the fs comb spectrum. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset and repetition frequencies of the fs comb.
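
    The whole-spectrum analysis described above can be sketched as an FFT followed by a peak search, so that a weak beat note is found even when a simple counter would lock onto a stronger component. The record length, sampling rate, and signal model below are assumptions.

      import numpy as np

      def strongest_peak_hz(samples, fs):
          """Return the frequency of the strongest spectral component of a real record."""
          window = np.hanning(samples.size)                 # reduce spectral leakage
          spectrum = np.abs(np.fft.rfft(samples * window))
          freqs = np.fft.rfftfreq(samples.size, d=1.0 / fs)
          return freqs[np.argmax(spectrum[1:]) + 1]         # skip the DC bin

      # Synthetic RF record: a weak 21.4 MHz beat note buried in broadband noise.
      fs = 100e6
      t = np.arange(2 ** 16) / fs
      rng = np.random.default_rng(3)
      record = 0.05 * np.sin(2 * np.pi * 21.4e6 * t) + rng.normal(scale=0.2, size=t.size)
      print(f"detected beat note: {strongest_peak_hz(record, fs) / 1e6:.2f} MHz")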

  14. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  15. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  16. Low-Light Image Enhancement Using Adaptive Digital Pixel Binning

    PubMed Central

    Yoo, Yoonjong; Im, Jaehyun; Paik, Joonki

    2015-01-01

    This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP). Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor. PMID:26121609

  17. Moderated histogram equalization, an automatic means of enhancing the contrast in digital light micrographs reversibly.

    PubMed

    Entwistle, A

    2004-06-01

    A means for improving the contrast in the images produced from digital light micrographs is described that requires no intervention by the experimenter: zero-order, scaling, tonally independent, moderated histogram equalization. It is based upon histogram equalization, which often results in digital light micrographs that contain regions that appear to be saturated, negatively biased or very grainy. Here a non-decreasing monotonic function is introduced into the process, which moderates the changes in contrast that are generated. This method is highly effective for all three of the main types of contrast found in digital light micrography: bright objects viewed against a dark background, e.g. fluorescence and dark-ground or dark-field image data sets; bright and dark objects sets against a grey background, e.g. image data sets collected with phase or Nomarski differential interference contrast optics; and darker objects set against a light background, e.g. views of absorbing specimens. Moreover, it is demonstrated that there is a single fixed moderating function, whose actions are independent of the number of elements of image data, which works well with all types of digital light micrographs, including multimodal or multidimensional image data sets. The use of this fixed function is very robust as the appearance of the final image is not altered discernibly when it is applied repeatedly to an image data set. Consequently, moderated histogram equalization can be applied to digital light micrographs as a push-button solution, thereby eliminating biases that those undertaking the processing might have introduced during manual processing. Finally, moderated histogram equalization yields a mapping function and so, through the use of look-up tables, indexes or palettes, the information present in the original data file can be preserved while an image with the improved contrast is displayed on the monitor screen.
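
    A minimal sketch of the idea described above is given below: plain histogram equalization whose mapping is moderated by a fixed non-decreasing function before being applied. The simple convex blend with the identity mapping used here is only a stand-in; the paper's actual moderating function is not reproduced.

      import numpy as np

      def moderated_equalization(image, moderation=0.5):
          """Histogram equalization whose LUT is pulled back toward the identity by a
          fixed non-decreasing blend, limiting the contrast change. Returns the
          remapped image and the LUT so the original data can be preserved."""
          hist = np.histogram(image, bins=np.arange(257))[0]
          cdf = np.cumsum(hist) / image.size
          equalize_lut = 255.0 * cdf                      # plain histogram equalization
          identity_lut = np.arange(256, dtype=float)
          lut = (1.0 - moderation) * equalize_lut + moderation * identity_lut
          lut = np.round(lut).astype(np.uint8)
          return lut[image], lut

      rng = np.random.default_rng(4)
      micrograph = rng.normal(40, 10, (128, 128)).clip(0, 255).astype(np.uint8)
      enhanced, lut = moderated_equalization(micrograph, moderation=0.5)
      print(micrograph.std(), enhanced.std())             # contrast increases, but gently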

  18. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  19. Eliminating "Hotspots" in Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Salomon, P. M.

    1984-01-01

    Signals from defective picture elements are rejected. An image processing program for use with a charge-coupled device (CCD) or other mosaic imager is augmented with an algorithm that compensates for a common type of electronic defect. The algorithm prevents false interpretation of "hotspots". It is used for robotics, image enhancement, image analysis and digital television.

  20. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  1. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  2. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  3. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it is emphasized most, but the other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors. Indirect flat-panel detectors and a system with four high-resolution CCD cameras are also discussed. In the second part, the most important image processing methods are discussed: gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part the advantages and the drawbacks of computed thoracic radiography are emphasized. The most important are the consistently good quality of the pictures and the possibilities of image processing.

  4. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with digital filtering to process the vortex sensor signal, reducing the effect of low-frequency disturbances from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems are developed to implement these algorithms: one is an integrative system, and the other is a separated system. A limiting amplifier is designed into the input analog conditioning circuit to accommodate large variations in sensor signal amplitude. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
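
    The combination described above (digital filtering to suppress the low-frequency vibration, then spectral analysis to pick out the vortex shedding frequency) is sketched below. The sampling rate, disturbance and shedding frequencies, and the crude moving-average high-pass filter are all assumptions; the authors' DSP-chip implementation is not reproduced.

      import numpy as np

      fs = 10_000                      # sampling rate (Hz), assumed
      t = np.arange(8192) / fs
      rng = np.random.default_rng(5)

      # Synthetic sensor signal: 250 Hz vortex shedding plus a strong 15 Hz pipe
      # vibration and broadband noise.
      signal = (0.2 * np.sin(2 * np.pi * 250 * t)
                + 1.0 * np.sin(2 * np.pi * 15 * t)
                + rng.normal(scale=0.1, size=t.size))

      # Digital filtering step: subtract a moving average (a crude high-pass) to
      # suppress the low-frequency disturbance.
      kernel = np.ones(101) / 101
      filtered = signal - np.convolve(signal, kernel, mode="same")

      # Spectral analysis step: the shedding frequency is the dominant remaining peak.
      spectrum = np.abs(np.fft.rfft(filtered * np.hanning(filtered.size)))
      freqs = np.fft.rfftfreq(filtered.size, d=1.0 / fs)
      print(f"estimated shedding frequency: {freqs[np.argmax(spectrum[1:]) + 1]:.1f} Hz")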

  5. A volumetric three-dimensional digital light photoactivatable dye display

    NASA Astrophysics Data System (ADS)

    Patel, Shreya K.; Cao, Jian; Lippert, Alexander R.

    2017-07-01

    Volumetric three-dimensional displays offer spatially accurate representations of images with a 360° view, but have been difficult to implement due to complex fabrication requirements. Herein, a chemically enabled volumetric 3D digital light photoactivatable dye display (3D Light PAD) is reported. The operating principle relies on photoactivatable dyes that become reversibly fluorescent upon illumination with ultraviolet light. Proper tuning of kinetics and emission wavelengths enables the generation of a spatial pattern of fluorescent emission at the intersection of two structured light beams. A first-generation 3D Light PAD was fabricated using the photoactivatable dye N-phenyl spirolactam rhodamine B, a commercial picoprojector, an ultraviolet projector and a custom quartz imaging chamber. The system displays a minimum voxel size of 0.68 mm3, 200 μm resolution and good stability over repeated `on-off' cycles. A range of high-resolution 3D images and animations can be projected, setting the foundation for widely accessible volumetric 3D displays.

  6. A volumetric three-dimensional digital light photoactivatable dye display

    PubMed Central

    Patel, Shreya K.; Cao, Jian; Lippert, Alexander R.

    2017-01-01

    Volumetric three-dimensional displays offer spatially accurate representations of images with a 360° view, but have been difficult to implement due to complex fabrication requirements. Herein, a chemically enabled volumetric 3D digital light photoactivatable dye display (3D Light PAD) is reported. The operating principle relies on photoactivatable dyes that become reversibly fluorescent upon illumination with ultraviolet light. Proper tuning of kinetics and emission wavelengths enables the generation of a spatial pattern of fluorescent emission at the intersection of two structured light beams. A first-generation 3D Light PAD was fabricated using the photoactivatable dye N-phenyl spirolactam rhodamine B, a commercial picoprojector, an ultraviolet projector and a custom quartz imaging chamber. The system displays a minimum voxel size of 0.68 mm3, 200 μm resolution and good stability over repeated ‘on-off’ cycles. A range of high-resolution 3D images and animations can be projected, setting the foundation for widely accessible volumetric 3D displays. PMID:28695887

  7. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  8. Digital signal processing the Tevatron BPM signals

    SciTech Connect

    Cancelo, G.; James, E.; Wolbers, S.

    2005-05-01

    The Beam Position Monitor (TeV BPM) readout system at Fermilab's Tevatron has been updated and is currently being commissioned. The new BPMs use new analog and digital hardware to achieve better beam position measurement resolution. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton measurements. The signals provided by the two ends of the BPM pickups are processed by analog band-pass filters and sampled by 14-bit ADCs at 74.3 MHz. A crucial part of this work has been the design of digital filters that process the signal. This paper describes the digital processing and estimation techniques used to optimize the beam position measurement. The BPM electronics must operate in narrow-band and wide-band modes to enable measurements of closed-orbit and turn-by-turn positions. The filtering and timing conditions of the signals are tuned accordingly for the operational modes. The analysis and the optimized result for each mode are presented.

  9. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  10. Fringe image processing based on structured light series

    NASA Astrophysics Data System (ADS)

    Gai, Shaoyan; Da, Feipeng; Li, Hongyan

    2009-11-01

    Code analysis of the fringe image plays a vital role in the data acquisition of structured light systems, affecting the precision, computational speed and reliability of the measurement process. Based on a self-normalizing characteristic, a fringe image processing method using structured light is proposed. In this method, a series of projected patterns is used when detecting the fringe order of the image pixels. The structured light system geometry is presented, which consists of a white light projector and a digital camera; the former projects sinusoidal fringe patterns onto the object, and the latter acquires the fringe patterns that are deformed by the object's shape. Binary images with distinct white and black stripes can then be obtained, and the ability to resist image noise is improved greatly. The proposed method can be implemented easily and applied to profile measurement based on a special binary code in a wide field.
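
    The fringe-order detection from a series of projected patterns can be sketched as below for a plain binary code: each captured pattern is binarized against a per-pixel threshold taken from bright and dark reference images (a simple form of self-normalization), and the bits are combined into a fringe order. The paper's specific code and normalization are not reproduced.

      import numpy as np

      def decode_fringe_order(captured_stack, bright_ref, dark_ref):
          """Convert a stack of captured binary-pattern images, shape (n, H, W), into
          a per-pixel fringe order using a per-pixel threshold for binarization."""
          threshold = (bright_ref + dark_ref) / 2.0
          bits = (captured_stack > threshold).astype(np.uint32)            # binarize
          weights = 2 ** np.arange(bits.shape[0] - 1, -1, -1, dtype=np.uint32)
          return np.tensordot(weights, bits, axes=1)                       # bits -> order

      # Toy example: 3 patterns encode 8 fringe orders across a 1 x 8 "image".
      orders = np.arange(8).reshape(1, 8)
      patterns = np.array([(orders >> b) & 1 for b in (2, 1, 0)], dtype=float) * 200 + 20
      bright, dark = np.full((1, 8), 220.0), np.full((1, 8), 20.0)
      print(decode_fringe_order(patterns, bright, dark))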

  11. [Digital processing and evaluation of ultrasound images].

    PubMed

    Borchers, J; Klews, P M

    1993-10-01

    With the help of workstations and PCs, on-site image processing has become possible. If the images are not available in digital form, the video signal has to be A/D converted. In the case of colour images the colour channels R (red), G (green) and B (blue) have to be digitized separately. "Truecolour" imaging calls for 8-bit resolution per channel, leading to 24 bits per pixel. Out of a pool of 2^24 possible values, only the relevant 128 gray values, and the 64 shades each of red and blue, needed for a colour-coded ultrasound image have to be isolated. Digital images can be changed and evaluated with the help of readily available image evaluation programmes. It is mandatory that during image manipulation the gray-scale and colour pixels and their LUTs (look-up tables) are worked on separately. Using relatively simple LUT manipulations, astonishing image improvements are possible. Application of simple mathematical operations can lead to completely new clinical results. For example, by subtracting two consecutive colour flow images in time and applying special LUT operations, local acceleration of blood flow can be visualized (Colour Acceleration Imaging).
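
    Two of the manipulations mentioned above, remapping pixel values through a look-up table and subtracting consecutive colour-flow frames, can be sketched as follows; the frame sizes and the gamma-like LUT are illustrative assumptions.

      import numpy as np

      def apply_lut(gray_image, lut):
          """Remap an 8-bit gray image through a 256-entry look-up table."""
          return lut[gray_image]

      def frame_difference(frame_now, frame_prev):
          """Signed difference of two consecutive frames; large values indicate
          rapid local change between the frames."""
          return frame_now.astype(np.int16) - frame_prev.astype(np.int16)

      # Toy data: a gamma-like LUT and two synthetic 8-bit frames.
      lut = (255 * (np.arange(256) / 255.0) ** 0.7).astype(np.uint8)
      rng = np.random.default_rng(6)
      prev = rng.integers(0, 256, (64, 64), dtype=np.uint8)
      now = np.clip(prev.astype(np.int16) + rng.integers(-10, 30, prev.shape), 0, 255).astype(np.uint8)
      print(apply_lut(now, lut).dtype, frame_difference(now, prev).mean())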

  12. Digital holographic interferometry applied to the investigation of ignition process.

    PubMed

    Pérez-Huerta, J S; Saucedo-Anaya, Tonatiuh; Moreno, I; Ariza-Flores, D; Saucedo-Orozco, B

    2017-06-12

    We use the digital holographic interferometry (DHI) technique to display the early ignition process of a butane-air mixture flame. Because such an event occurs in a short time (a few milliseconds), a fast CCD camera is used to study it. As more detail is required for monitoring the temporal evolution of the process, less light coming from the combustion is captured by the CCD camera, resulting in a deficient, underexposed image. Therefore, the CCD's direct observation of the combustion process is limited (down to 1000 frames per second). To overcome this drawback, we propose the use of DHI along with a high power laser in order to supply enough light to increase the capture speed, thus improving the visualization of the phenomenon in its initial moments. An experimental optical setup based on DHI is used to obtain a large sequence of phase maps that allows us to observe two transitory stages in the ignition process: a first explosion that emits only faint visible light, and a second stage induced by variations in temperature when the flame is emerging. While the last stage can be directly monitored by the CCD camera, the first stage is hardly detected by direct observation, and DHI clearly evidences this process. Furthermore, our method can be easily adapted for visualizing other types of fast processes.

  13. Coherent imaging with incoherent light in digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Chmelik, Radim

    2012-01-01

    A digital holographic microscope (DHM) allows imaging with quantitative phase contrast. In this way it becomes an important instrument, a completely non-invasive tool for contrast intravital observation of living cells and measurement of cell dry-mass density distributions. A serious drawback of current DHMs is highly coherent illumination, which degrades the lateral resolution and impairs image quality through coherence noise and parasitic interference. An uncompromising solution to this problem can be found in the Leith concept of incoherent holography. An off-axis hologram can be formed with an arbitrary degree of light coherence in systems equipped with an achromatic interferometer, and thus the resolution and image quality typical of incoherent-light wide-field microscopy can be achieved. In addition, advanced imaging modes based on limited coherence can be utilized. A typical example is the coherence-gating effect, which provides finite axial resolution and makes the DHM image similar to that of a confocal microscope. These possibilities were described theoretically using the formalism of three-dimensional coherent transfer functions and proved experimentally with the coherence-controlled holographic microscope, a DHM based on the Leith achromatic interferometer. Quantitative-phase-contrast imaging with incoherent light is demonstrated by observation of living cancer cells and evaluation of their motility. The coherence-gating effect was proved by imaging model samples through a scattering layer and living cells inside an opalescent medium.

  14. Digital signal processing methods for biosequence comparison.

    PubMed Central

    Benson, D C

    1990-01-01

    A method is discussed for DNA or protein sequence comparison using a finite field fast Fourier transform, a digital signal processing technique; and statistical methods are discussed for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N, compared with N^2 for methods currently used. This method makes it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
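
    To make the N log N idea concrete, here is a sketch of FFT-based sequence comparison in Python. It counts character matches at every relative shift by correlating indicator sequences; note that it uses an ordinary complex FFT rather than the finite field transform described in the record.

    ```python
    import numpy as np

    def match_counts(seq_a, seq_b, alphabet="ACGT"):
        """Number of matching characters at every relative shift of two sequences,
        computed with FFT-based correlation in O(N log N) time."""
        n = len(seq_a) + len(seq_b) - 1
        total = np.zeros(n)
        for ch in alphabet:
            a = np.array([c == ch for c in seq_a], dtype=float)
            b = np.array([c == ch for c in seq_b], dtype=float)
            # correlation of indicator sequences via convolution with one reversed
            total += np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(b[::-1], n), n)
        return np.rint(total).astype(int)
    ```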

  15. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  16. Integrated High Resolution Digital Color Light Sensor in 130 nm CMOS Technology

    PubMed Central

    Strle, Drago; Nahtigal, Uroš; Batistell, Graciele; Zhang, Vincent Chi; Ofner, Erwin; Fant, Andrea; Sturm, Johannes

    2015-01-01

    This article presents a color light detection system integrated in 130 nm CMOS technology. The sensors and corresponding electronics detect light in a CIE XYZ color luminosity space using on-chip integrated sensors without any additional process steps, a high-resolution analog-to-digital converter, and a dedicated DSP algorithm. The sensor consists of a set of laterally arranged integrated photodiodes that are partly covered by metal, where color separation between the photodiodes is achieved by lateral carrier diffusion together with wavelength-dependent absorption. A high resolution, hybrid, ΣΔ ADC converts each photodiode's current into a 22-bit digital result, canceling the dark current of the photodiodes. The digital results are further processed by the DSP, which calculates normalized XYZ or RGB color and intensity parameters using linear transformations of the three photodiode responses by multiplication of the data with a transformation matrix, where the coefficients are extracted by training in combination with a pseudo-inverse operation and the least-mean-square approximation. The sensor system detects the color light parameters with 22-bit accuracy, consumes less than 60 μA on average at 10 readings per second, and occupies approx. 0.8 mm² of silicon area (including three photodiodes and the analog part of the ADC). The DSP is currently implemented on an FPGA.
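
    The following sketch illustrates the pseudo-inverse training of the 3×3 transformation matrix described above. Collapsing the training into a single batch pseudo-inverse, and the shapes of the training arrays, are assumptions for illustration rather than the chip's exact on-chip procedure.

    ```python
    import numpy as np

    def train_color_matrix(diode_responses, xyz_reference):
        """Least-squares 3x3 matrix M with xyz ≈ M @ diode, via the Moore-Penrose
        pseudo-inverse. diode_responses: (N, 3) training responses; xyz_reference:
        (N, 3) reference XYZ values for the same illuminations."""
        D = np.asarray(diode_responses, dtype=float)
        X = np.asarray(xyz_reference, dtype=float)
        return X.T @ np.linalg.pinv(D.T)             # (3, 3) transformation matrix

    def to_xyz(M, diode_reading):
        """Apply the trained transformation to a single three-diode reading."""
        return M @ np.asarray(diode_reading, dtype=float)
    ```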

  17. Integrated High Resolution Digital Color Light Sensor in 130 nm CMOS Technology.

    PubMed

    Strle, Drago; Nahtigal, Uroš; Batistell, Graciele; Zhang, Vincent Chi; Ofner, Erwin; Fant, Andrea; Sturm, Johannes

    2015-07-22

    This article presents a color light detection system integrated in 130 nm CMOS technology. The sensors and corresponding electronics detect light in a CIE XYZ color luminosity space using on-chip integrated sensors without any additional process steps, a high-resolution analog-to-digital converter, and a dedicated DSP algorithm. The sensor consists of a set of laterally arranged integrated photodiodes that are partly covered by metal, where color separation between the photodiodes is achieved by lateral carrier diffusion together with wavelength-dependent absorption. A high resolution, hybrid, ΣΔ ADC converts each photodiode's current into a 22-bit digital result, canceling the dark current of the photodiodes. The digital results are further processed by the DSP, which calculates normalized XYZ or RGB color and intensity parameters using linear transformations of the three photodiode responses by multiplication of the data with a transformation matrix, where the coefficients are extracted by training in combination with a pseudo-inverse operation and the least-mean-square approximation. The sensor system detects the color light parameters with 22-bit accuracy, consumes less than 60 μA on average at 10 readings per second, and occupies approx. 0.8 mm² of silicon area (including three photodiodes and the analog part of the ADC). The DSP is currently implemented on an FPGA.

  18. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system is described; it consists of a language translator that simplifies execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general application programs.

  19. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.
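
    As a sketch of the cross-correlation step at the heart of PIV processing (the fuzzy-logic validation and particle tracking of PIVPROC are not reproduced here), the following Python function estimates the integer pixel displacement between two interrogation windows; the use of SciPy and the whole-pixel peak are illustrative simplifications.

    ```python
    import numpy as np
    from scipy.signal import correlate2d

    def window_displacement(win_t0, win_t1):
        """Cross-correlate two interrogation windows from consecutive frames and
        return the integer displacement (dy, dx) of the correlation peak."""
        a = win_t0 - win_t0.mean()
        b = win_t1 - win_t1.mean()
        corr = correlate2d(b, a, mode="full")
        peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
        zero_lag = np.array(a.shape) - 1   # position of zero displacement in 'full' output
        return peak - zero_lag
    ```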

  20. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  1. White Light Optical Information Processing.

    DTIC Science & Technology

    1985-05-31

    together) incident on the nematic film, after passage through the optical system, was about 0.2 watts. A second beam splitter BSI was placed between ... film, a process that is like holography, indeed is often termed image-plane holography, but in fact goes back to Ives. In particular, the use of ... slit images became straight, whereupon the system was assumed to be properly adjusted. For the real time, or phase conjugation process, a thin film

  2. Digital Signal Processing Methods for Ultrasonic Echoes.

    PubMed

    Sinding, Kyle; Drapaca, Corina; Tittmann, Bernhard

    2016-04-28

    Digital signal processing has become an important component of data analysis needed in industrial applications. In particular, for ultrasonic thickness measurements the signal-to-noise ratio plays a major role in the accurate calculation of the arrival time. For this application a band-pass filter is not sufficient, since the noise level cannot be decreased enough to allow a reliable thickness measurement. This paper demonstrates the abilities of two regularization methods, total variation and Tikhonov, to filter acoustic and ultrasonic signals. Both methods are compared to frequency-based filtering for digitally produced signals as well as signals produced by ultrasonic transducers. This paper demonstrates the ability of the total variation and Tikhonov filters to accurately recover signals from noisy acoustic data faster than a band-pass filter. Furthermore, the total variation filter has been shown to reduce the noise of a signal significantly for signals with clear ultrasonic echoes. Signal-to-noise ratios have been increased by over 400% using a simple parameter optimization. While frequency-based filtering is efficient for specific applications, this paper shows that the reduction of noise in ultrasonic systems can be much more efficient with regularization methods.
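
    As a minimal illustration of the Tikhonov variant (the total variation case requires an iterative solver and is omitted), the sketch below smooths a 1-D signal by solving the regularized least-squares problem in closed form; the regularization weight lam is an arbitrary example value, not a tuned parameter from the paper.

    ```python
    import numpy as np

    def tikhonov_denoise(y, lam=10.0):
        """Tikhonov-regularized smoothing of a 1-D signal: minimize
        ||x - y||^2 + lam * ||D x||^2, with D the first-difference operator."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        D = np.diff(np.eye(n), axis=0)               # (n-1, n) first-difference matrix
        return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    ```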

  3. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  4. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  5. E-inclusion Process and Societal Digital Skill Development

    ERIC Educational Resources Information Center

    Vitolina, Ieva

    2015-01-01

    Nowadays, the focus shifts from information and communication technology access to skills and knowledge. Moreover, lack of digital skills is an obstacle in the process of learning new digital competences using technologies and e-learning. The objective of this research is to investigate how to facilitate students to use the acquired digital skills…

  6. An Interactive Graphics Program for Investigating Digital Signal Processing.

    ERIC Educational Resources Information Center

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  7. Focusing light inside dynamic scattering media with millisecond digital optical phase conjugation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Ma, Cheng; Shen, Yuecheng; Wang, Lihong V.

    2017-02-01

    Optical phase conjugation based wavefront shaping techniques are being actively developed to focus light through or inside scattering media such as biological tissue, and they promise to revolutionize optical imaging, manipulation, and therapy. The speed of digital optical phase conjugation (DOPC) has been limited by the low speeds of cameras and spatial light modulators (SLMs), preventing DOPC from being applied to thick living tissue. Recently, a fast DOPC system was developed based on a single-shot wavefront measurement method, a field programmable gate array (FPGA) for data processing, and a digital micromirror device (DMD) for fast modulation. However, this system has the following limitations. First, the reported single-shot wavefront measurement method does not work when our goal is to focus light inside, instead of through, scattering media. Second, the DMD performed binary amplitude modulation, which resulted in a lower focusing contrast compared with that of phase modulations. Third, the optical fluence threshold causing DMDs to malfunction under pulsed laser illumination is lower than that of liquid crystal based SLMs, and the system alignment is significantly complicated by the oblique reflection angle of the DMD. Here, we developed a simple but high-speed DOPC system using a ferroelectric liquid crystal based SLM (512 × 512 pixels), and focused light through three diffusers within 4.7 ms. Using focused-ultrasound-guided DOPC along with a double exposure scheme, we focused light inside a scattering medium containing two diffusers within 7.7 ms, thus achieving the fastest digital time-reversed ultrasonically encoded (TRUE) optical focusing to date.

  8. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of procedures for creating and assigning DOIs and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows the changing of any DOI metadata except the DOI name unless the name has not been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  9. Digital Radiographic Image Processing and Analysis.

    PubMed

    Yoon, Douglas C; Mol, André; Benn, Douglas K; Benavides, Erika

    2018-07-01

    This article describes digital radiographic imaging and analysis from the basics of image capture to examples of some of the most advanced digital technologies currently available. The principles underlying the imaging technologies are described to provide a better understanding of their strengths and limitations. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Digital-Difference Processing For Collision Avoidance.

    NASA Technical Reports Server (NTRS)

    Shores, Paul; Lichtenberg, Chris; Kobayashi, Herbert S.; Cunningham, Allen R.

    1988-01-01

    Digital system for automotive crash avoidance measures and displays difference in frequency between two sinusoidal input signals of slightly different frequencies. Designed for use with Doppler radars. Characterized as digital mixer coupled to frequency counter measuring difference frequency in mixer output. Technique determines target path mathematically. Used for tracking cars, missiles, bullets, baseballs, and other fast-moving objects.
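
    The sketch below mirrors the mixer-plus-counter idea in Python: it multiplies the two sampled sinusoids and reads off the dominant low-frequency component of the product as the difference frequency. The fs/4 cutoff used to exclude the sum-frequency term is an assumption for illustration, not part of the original design.

    ```python
    import numpy as np

    def beat_frequency(sig_a, sig_b, fs):
        """Multiply (mix) two sampled sinusoids and return the dominant
        low-frequency component of the product, i.e. the difference frequency."""
        product = np.asarray(sig_a) * np.asarray(sig_b)        # digital mixer
        spectrum = np.abs(np.fft.rfft(product - product.mean()))
        freqs = np.fft.rfftfreq(len(product), d=1.0 / fs)
        keep = freqs < 0.25 * fs                 # assume the sum term lies above fs/4
        return freqs[keep][np.argmax(spectrum[keep])]
    ```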

  11. X-ray light valve (XLV): a novel detectors' technology for digital mammography

    NASA Astrophysics Data System (ADS)

    Marcovici, Sorin; Sukhovatkin, Vlad; Oakham, Peter

    2014-03-01

    A novel method, based on X-ray Light Valve (XLV) technology, is proposed for making flat panel detectors for digital mammography that offer good image quality yet are inexpensive. The digital mammography markets, particularly in developing countries, demand quality machines at substantially lower prices than the ones available today. Continuous pressure is applied on x-ray detector manufacturers to reduce flat panel detector prices. XLV presents a unique opportunity to achieve the needed price-performance characteristics for direct-conversion x-ray detectors. XLV-based detectors combine the proven, superior spatial resolution of a-Se with the simplicity and low cost of liquid crystals and optical scanning. The x-ray quanta absorbed by a 200 μm a-Se layer produce electron-hole pairs that move under an electric field to the top and bottom of the a-Se layer. This 2D charge distribution creates at the interface with the liquid crystals a continuous (analog) charge image corresponding to the impinging radiation's information. Under the influence of the local electrical charges next to them, the liquid crystals twist proportionally to the charges and vary their light reflectivity. A scanning light source illuminates the liquid crystals while an associated pixelated photodetector, having a 42 μm pixel size, captures the light reflected by the liquid crystals and converts it into 16-bit words that are transmitted to the machine for image processing and display. The paper will describe a novel XLV, 25 cm x 30 cm, flat panel detector structure and its underlying physics, as well as its preliminary performance measured on several engineering prototypes. In particular, the paper will present the results of measuring the XLV detectors' DQE, MTF, dynamic range, low-contrast resolution and dynamic behavior. Finally, the paper will introduce the new, low-cost, XLV-detector-based digital mammography machine under development at XLV Diagnostics Inc.

  12. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of artistic creation, or the creation process, has been the subject of much debate and research for at least the last fifty years, even though thinking about art and beauty was already a subject of analysis for the ancient Greeks such as Plato and Aristotle. Even though intuitively it seems a simple phenomenon, creativity, or the human ability to generate innovation (new ideas, concepts, etc.), is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. The development process that supports the intellectual act of creation is usually described as a procedure in which the artist experiments with the medium, explores it with one or more techniques, changing shapes, forms and appearances, and, beyond time and space, seeks a way out to a clearing, i.e., envisages a path from intention to realization. Duchamp, in his lecture "The Creative Act", states that the artist is never alone with his/her artwork; there is always the spectator who will later react critically to the work of art. If the artist succeeds in transmitting his/her intentions in terms of a message, emotion or feeling to the spectator, then a form of aesthetic osmosis actually takes place through the inert matter (the medium) that enabled this communication or interaction phenomenon to occur. The role of the spectator may become gradually more active by interacting with the artwork itself, possibly changing it or becoming a part of it [2][4].

  13. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.
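
    To make the linear versus non-linear distinction concrete, here is a Python sketch of delay-and-sum and cross-multiplication beams for an array of traces. Integer sample delays and the wrap-around behaviour of np.roll are simplifications for illustration, not the Survey's actual processing.

    ```python
    import numpy as np

    def delay_and_sum(traces, delays_samples):
        """Linear beam: align each trace by its integer sample delay and average."""
        aligned = np.array([np.roll(tr, -d) for tr, d in zip(traces, delays_samples)])
        return aligned.mean(axis=0)

    def cross_multiply(traces, delays_samples):
        """Non-linear beam: product of the aligned traces, which suppresses
        incoherent noise more strongly than plain summation."""
        aligned = np.array([np.roll(tr, -d) for tr, d in zip(traces, delays_samples)])
        return np.prod(aligned, axis=0)
    ```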

  14. Digital image processing for photo-reconnaissance applications

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1972-01-01

    Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering, and the action of a high frequency filter in the real and Fourier domain are discussed along with color and brightness.

  15. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  16. Multiscale image processing and antiscatter grids in digital radiography.

    PubMed

    Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D

    2009-01-01

    Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally the effects of exposure dose and of a using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid but whether this increase in quality is clinically significant is unknown.

  17. Fast, optically controlled Kerr phase shifter for digital signal processing.

    PubMed

    Li, R B; Deng, L; Hagley, E W; Payne, M G; Bienfang, J C; Levine, Z H

    2013-05-01

    We demonstrate an optically controlled Kerr phase shifter using a room-temperature 85Rb vapor operating in a Raman gain scheme. Phase shifts from zero to π relative to an unshifted reference wave are observed, and gated operations are demonstrated. We further demonstrate the versatile digital manipulation of encoded signal light with an encoded phase-control light field using an unbalanced Mach-Zehnder interferometer. Generalizations of this scheme should be capable of full manipulation of a digitized signal field at high speed, opening the door to future applications.

  18. Digital pulse processing for planar TlBr detectors

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.; Hitomi, K.; Ishii, K.; Kikuchi, Y.

    2010-04-01

    We report on a digital pulse processing algorithm for correction of charge trapping in the planar TlBr detectors. The algorithm is performed on the signals digitized at the preamplifier stage. The algorithm is very simple and is implemented with little computational effort. By using a digitizer with a sampling rate of 250 MSample/s and 8 bit resolution, an energy resolution of 6.5% is achieved at 511 keV with a 0.7 mm thick detector.

  19. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.

  20. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  1. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to the audience. It combines narrative and digital media content infused with the multimedia elements. In order for the educators (i.e the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest varieties of processes to guide them; of which some are redundant. The main aim of this study is to propose a single guide process for the creation of DST. A comparative analysis is employed where ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that used the concept of DST.

  2. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consists of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
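
    For orientation, the sketch below shows a highly simplified software analogue of the carrier down-conversion and code correlation stages in Python. The sample-rate, frequency, and code-phase parameters are placeholders, and the chip/phase advancers and tracking loops of the patented processor are not represented.

    ```python
    import numpy as np

    def correlate_prn(samples, prn_chips, fs, f_carrier, f_chip, code_phase_chips=0.0):
        """Wipe off the carrier with a numerically controlled oscillator and
        correlate the result against a locally generated PRN code replica."""
        prn = np.asarray(prn_chips, dtype=float)     # +/-1 chip sequence
        n = len(samples)
        t = np.arange(n) / fs
        baseband = np.asarray(samples) * np.exp(-2j * np.pi * f_carrier * t)
        chip_idx = (np.floor(f_chip * t + code_phase_chips) % len(prn)).astype(int)
        replica = prn[chip_idx]                      # code replica at this code phase
        return np.sum(baseband * replica) / n        # complex correlation value
    ```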

  3. The place-value of a digit in multi-digit numbers is processed automatically.

    PubMed

    Kallai, Arava Y; Tzelgov, Joseph

    2012-09-01

    The automatic processing of the place-value of digits in a multi-digit number was investigated in 4 experiments. Experiment 1 and two control experiments employed a numerical comparison task in which the place-value of a non-zero digit was varied in a string composed of zeros. Experiment 2 employed a physical comparison task in which strings of digits varied in their physical sizes. In both types of tasks, the place-value of the non-zero digit in the string was irrelevant to the task performed. Interference from the place-value information was found in both tasks. When the non-zero digit occupied a lower place-value, it was recognized more slowly as a larger digit or as written in a larger font size. We concluded that place-value in a multi-digit number is processed automatically. These results support the notion of a decomposed representation of multi-digit numbers in memory. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  4. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  5. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing noiselike microwave radiometer signals.

  6. Image Processing of Digital Cartographic Data.

    DTIC Science & Technology

    1982-02-05

    quality as well as accuracy in the production of maps and charts has become one of the cartographer's major concerns. In this paper ... quality and accuracy of digital information on elevation and man-made features. The advantages and disadvantages of the ... shaded relief) and of the quality-control techniques relating to image enhancement are clearly presented. ABSTRACT FOR SCIENTIFIC PAPER TO BE PRESENTED AT: XII GENERAL

  7. Spectrally resolved digital holography using a white light LED

    NASA Astrophysics Data System (ADS)

    Claus, D.; Pedrini, G.; Buchta, D.; Osten, W.

    2017-06-01

    This paper introduces the concept of spectrally resolved digital holography. The measurement principle and the analysis of the data are discussed in detail. The usefulness of spectrally resolved digital holography is demonstrated for colour imaging and for optical metrology, with regard to the recovery of modulus information and phase information, respectively. The phase information is used to measure the shape of an object via the dual-wavelength method. Based on the large amount of data available, multiple speckle-decorrelated dual-wavelength phase maps can be obtained, which, when averaged, improve the signal-to-noise ratio.
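
    The dual-wavelength idea can be summarized in a few lines: two wrapped phase maps recorded at wavelengths lam1 and lam2 are combined into a phase map at the synthetic wavelength Lambda = lam1*lam2/|lam1 - lam2|, which extends the unambiguous height range. The sketch below assumes a reflection (double-pass) geometry and is not the authors' exact processing chain.

    ```python
    import numpy as np

    def dual_wavelength_height(phase1, phase2, lam1, lam2):
        """Combine two wrapped phase maps (radians) recorded at wavelengths lam1, lam2
        into a height map at the synthetic wavelength Lambda = lam1*lam2/|lam1-lam2|."""
        lam_synth = lam1 * lam2 / abs(lam1 - lam2)
        dphi = np.mod(phase1 - phase2, 2.0 * np.pi)   # wrapped phase at the synthetic wavelength
        return dphi * lam_synth / (4.0 * np.pi)       # factor 4*pi: double pass in reflection
    ```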

  8. Variable self-powered light detection CMOS chip with real-time adaptive tracking digital output based on a novel on-chip sensor.

    PubMed

    Wang, HongYi; Fan, Youyou; Lu, Zhijian; Luo, Tao; Fu, Houqiang; Song, Hongjiang; Zhao, Yuji; Christen, Jennifer Blain

    2017-10-02

    This paper provides a solution for self-powered light direction detection with digitized output. Light direction sensors, energy harvesting photodiodes, a real-time adaptive tracking digital output unit and other necessary circuits are integrated on a single chip based on a standard 0.18 µm CMOS process. The proposed light direction sensors have an accuracy of 1.8 degrees over a 120-degree range. In order to improve the accuracy, a compensation circuit is presented for the photodiodes' forward currents. The actual measurement precision of the output is approximately 7 ENOB. In addition, an adaptive under-voltage protection circuit is designed for the variable supply power, which may fluctuate with temperature and process.

  9. Digital pulse processing in Mössbauer spectroscopy

    NASA Astrophysics Data System (ADS)

    Veiga, A.; Grunfeld, C. M.

    2014-04-01

    In this work we present some advances towards full digitization of the detection subsystem of a Mössbauer transmission spectrometer. We show how, using adequate instrumentation, the preamplifier output of a proportional counter can be digitized with no deterioration in spectrum quality, avoiding the need for a shaping amplifier. A pipelined architecture is proposed for a digital processor, which constitutes a versatile platform for the development of pulse processing techniques. Requirements for minimization of the analog processing are considered and experimental results are presented.

  10. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
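
    Since the record lists Allan deviation of phase among the processed outputs, here is a compact sketch of the overlapping Allan deviation computed from time-error (phase) samples. The sampling interval tau0 and the averaging factor m are caller-supplied; the formula is the standard textbook one, not necessarily the analyzer's exact implementation.

    ```python
    import numpy as np

    def allan_deviation(phase, tau0, m=1):
        """Overlapping Allan deviation from time-error (phase) samples.
        phase: time error in seconds, tau0: sampling interval, m: averaging factor
        (requires len(phase) > 2*m)."""
        x = np.asarray(phase, dtype=float)
        tau = m * tau0
        d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]      # second differences at lag m
        avar = np.sum(d2 ** 2) / (2.0 * tau ** 2 * len(d2))
        return np.sqrt(avar)
    ```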

  11. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults must undergo clinical exams such as "serum bilirubin" tests, which can be traumatic for patients. Jaundice often accompanies liver diseases such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults using a painless method. By acquiring digital colour images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as the parameters to characterize patients with or without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine we distinguish between healthy and sick patients.
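
    To illustrate the classification step, the sketch below trains a support vector machine on mean-RGB features with scikit-learn. The feature choice, the RBF kernel, and the hyperparameters are illustrative assumptions rather than the authors' exact pipeline.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def mean_rgb(image):
        """Mean R, G, B of a region of interest (H x W x 3 array) as a feature vector."""
        return image.reshape(-1, 3).mean(axis=0)

    def train_classifier(features, labels):
        """Fit an RBF-kernel SVM on per-patient colour features.
        features: (N, 3) mean-RGB vectors; labels: 1 = jaundice, 0 = healthy."""
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(np.asarray(features), np.asarray(labels))
        return clf
    ```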

  12. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging for telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archive and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS and then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations found the filmless images to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images and provides the means to produce high-quality radiographic images in a digital environment.

  13. Validation of Digital Microscopy Compared With Light Microscopy for the Diagnosis of Canine Cutaneous Tumors.

    PubMed

    Bertram, Christof A; Gurtner, Corinne; Dettwiler, Martina; Kershaw, Olivia; Dietert, Kristina; Pieper, Laura; Pischon, Hannah; Gruber, Achim D; Klopfleisch, Robert

    2018-07-01

    Integration of new technologies, such as digital microscopy, into a highly standardized laboratory routine requires validation of their performance in terms of reliability, specificity, and sensitivity. However, a validation study of digital microscopy is currently lacking in veterinary pathology. The aim of the current study was to validate the usability of digital microscopy in terms of diagnostic accuracy, speed, and confidence for diagnosing and differentiating common canine cutaneous tumor types and to compare it with classical light microscopy. Therefore, 80 histologic sections including 17 different skin tumor types were examined twice as glass slides and twice as digital whole-slide images by 6 pathologists with different levels of experience at 4 time points. Comparison of both methods found digital microscopy to be noninferior for differentiating individual tumor types within the epithelial and mesenchymal tumor categories, but diagnostic concordance was slightly lower for differentiating individual round cell tumor types by digital microscopy. In addition, digital microscopy was associated with significantly shorter diagnostic time, but diagnostic confidence was lower and technical quality was considered inferior for whole-slide images compared with glass slides. Of note, diagnostic performance for whole-slide images scanned at 200× magnification was noninferior to that for slides scanned at 400×. In conclusion, digital microscopy differs only minimally from light microscopy in a few aspects of diagnostic performance and overall appears adequate for the diagnosis of individual canine cutaneous tumors, with minor limitations for differentiating individual round cell tumor types and grading mast cell tumors.

  14. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research areas associated with digital signal processing and control and estimation theory are identified. Particular attention is given to image processing, system identification problems (parameter identification, linear prediction, least squares, Kalman filtering), stability analyses (the use of the Liapunov theory, frequency domain criteria, passivity), and multiparameter systems, distributed processes, and random fields.

  15. A projector calibration method for monocular structured light system based on digital image correlation

    NASA Astrophysics Data System (ADS)

    Feng, Zhixin

    2018-02-01

    Projector calibration is crucial for a camera-projector three-dimensional (3-D) structured light measurement system, which has one camera and one projector. In this paper, a novel projector calibration method is proposed based on digital image correlation. In the method, the projector is viewed as an inverse camera, and a plane calibration board with feature points is used to calibrate the projector. During the calibration process, a random speckle pattern is projected onto the calibration board at different orientations to establish the correspondences between projector images and camera images. Thereby, a dataset for projector calibration is generated. Then the projector can be calibrated using a well-established camera calibration algorithm. The experimental results confirm that the proposed method is accurate and reliable for projector calibration.
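
    Once the digital-image-correlation step has produced board-point/projector-pixel correspondences, the "inverse camera" calibration itself can be delegated to a standard routine. The sketch below uses OpenCV's calibrateCamera for that final step; the function name and the argument layout describing the input correspondences are placeholders for whatever the preceding DIC stage produces.

    ```python
    import numpy as np
    import cv2

    def calibrate_projector(board_points_3d, projector_points_2d, projector_size):
        """Calibrate the projector as an inverse camera. board_points_3d and
        projector_points_2d are per-pose lists of matched (N, 3) board coordinates
        and (N, 2) projector-pixel coordinates (assumed to come from the DIC step)."""
        obj = [np.asarray(p, dtype=np.float32) for p in board_points_3d]
        img = [np.asarray(p, dtype=np.float32) for p in projector_points_2d]
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj, img, projector_size, None, None)
        return rms, K, dist   # reprojection error, intrinsic matrix, distortion coefficients
    ```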

  16. Hexagonal wavelet processing of digital mammography

    NASA Astrophysics Data System (ADS)

    Laine, Andrew F.; Schuler, Sergio; Huda, Walter; Honeyman-Buck, Janice C.; Steinbach, Barbara G.

    1993-09-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through overcomplete multiresolution representations. We show that efficient representations may be identified from digital mammograms and used to enhance features of importance to mammography within a continuum of scale-space. We present a method of contrast enhancement based on an overcomplete, non-separable multiscale representation: the hexagonal wavelet transform. Mammograms are reconstructed from transform coefficients modified at one or more levels by local and global non-linear operators. Multiscale edges identified within distinct levels of transform space provide local support for enhancement. We demonstrate that features extracted from multiresolution representations can provide an adaptive mechanism for accomplishing local contrast enhancement. We suggest that multiscale detection and local enhancement of singularities may be effectively employed for the visualization of breast pathology without excessive noise amplification.

  17. Noise reduction in digital lensless holographic microscopy by engineering the light from a light-emitting diode.

    PubMed

    Garcia-Sucerquia, Jorge

    2013-01-01

    By engineering the light from a light-emitting diode (LED), the noise present in digital lensless holographic microscopy (DLHM) is reduced. The partially coherent light from an LED is tailored to produce a spherical wavefront with limited coherence time and the spatial coherence needed for DLHM to work. DLHM with this engineered light source is used to image biological samples that cover areas of the order of mm². The ratio of the diameter of the area that is almost coherently illuminated to the diameter of the illumination area is used as a parameter to quantify the performance of DLHM with the engineered LED light source. Experimental results show that while the noise can be reduced effectively, the spatial resolution can be kept in the micrometer range.

  18. Image processing operations achievable with the Microchannel Spatial Light Modulator

    NASA Astrophysics Data System (ADS)

    Warde, C.; Fisher, A. D.; Thackara, J. I.; Weiss, A. M.

    1980-01-01

    The Microchannel Spatial Light Modulator (MSLM) is a versatile, optically-addressed, highly-sensitive device that is well suited for low-light-level, real-time, optical information processing. It consists of a photocathode, a microchannel plate (MCP), a planar acceleration grid, and an electro-optic plate in proximity focus. A framing rate of 20 Hz with full modulation depth, and 100 Hz with 20% modulation depth has been achieved in a vacuum-demountable LiTaO3 device. A halfwave exposure sensitivity of 2.2 mJ/sq cm and an optical information storage time of more than 2 months have been achieved in a similar gridless LiTaO3 device employing a visible photocathode. Image processing operations such as analog and digital thresholding, real-time image hard clipping, contrast reversal, contrast enhancement, image addition and subtraction, and binary-level logic operations such as AND, OR, XOR, and NOR can be achieved with this device. This collection of achievable image processing characteristics makes the MSLM potentially useful for a number of smart sensor applications.

  19. Digital processing of RF signals from optical frequency combs

    NASA Astrophysics Data System (ADS)

    Cizek, Martin; Smid, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Cip, Ondřej

    2013-01-01

    The presented work is focused on digital processing of beat note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component than that of a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software defined radio (SDR) algorithms. Our efforts concentrate on two main areas. First, using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum. Second, experimenting with digital signal processing of the RF beat note spectrum produced by the f-2f technique used for assessing the offset and repetition frequencies of the comb, resulting in digital servo-loop stabilization of the fs comb. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset frequency of the fs comb.
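
    A minimal Python sketch of the FFT-plus-peak-detection analysis mentioned above is given below; the window choice, the relative-height threshold, and the number of reported peaks are illustrative assumptions rather than the parameters of the developed software.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def strongest_beat_notes(samples, fs, n_peaks=3, min_rel_height=0.1):
        """FFT of a sampled RF record and detection of its strongest spectral peaks,
        so that a weak beat note is not mistaken for the strongest comb component."""
        window = np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(samples * window))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
        peaks, props = find_peaks(spectrum, height=min_rel_height * spectrum.max())
        order = np.argsort(props["peak_heights"])[::-1][:n_peaks]
        return freqs[peaks[order]], props["peak_heights"][order]
    ```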

  20. Wavelet processing techniques for digital mammography

    NASA Astrophysics Data System (ADS)

    Laine, Andrew F.; Song, Shuwu

    1992-09-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale space feature analysis. Similar to traditional coarse to fine matching strategies, the radiologist may first choose to look for coarse features (e.g., dominant mass) within low frequency levels of a wavelet transform and later examine finer features (e.g., microcalcifications) at higher frequency levels. In addition, features may be extracted by applying geometric constraints within each level of the transform. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency, results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet representations, enhanced by linear, exponential and constant weight functions through scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).

  1. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan [Livermore, CA; Tran, Binh-Nien [San Ramon, CA

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the required processing speed and allows the use of low cost FPGAs.
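
    The sketch below mirrors the look-up-table idea in Python: the carrier samples are pre-computed once and each (I, Q) symbol simply selects and scales them, so no per-sample multiplications against a running oscillator are needed. The symbol length, the carrier cycles per symbol, and the NumPy formulation are illustrative assumptions, not the patented FPGA architecture.

    ```python
    import numpy as np

    def build_carrier_luts(samples_per_symbol, cycles_per_symbol=1):
        """Pre-computed cosine and sine carrier tables covering one symbol period."""
        phase = 2 * np.pi * cycles_per_symbol * np.arange(samples_per_symbol) / samples_per_symbol
        return np.cos(phase), np.sin(phase)

    def qam_modulate(i_symbols, q_symbols, cos_lut, sin_lut):
        """Digital IF QAM: each (I, Q) symbol weights the stored carrier tables;
        in an FPGA the weighted tables themselves would sit in ROM LUTs indexed
        by the symbol bits, removing run-time multipliers."""
        i = np.asarray(i_symbols, dtype=float)[:, None]
        q = np.asarray(q_symbols, dtype=float)[:, None]
        return (i * cos_lut - q * sin_lut).ravel()
    ```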

  2. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  3. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
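
    The sketch below shows one plausible software analogue of such an analysis/synthesis filterbank: a cosine-modulated bank derived from a single FIR prototype, with decimation by the number of bands and a matching interpolation-and-sum synthesis stage. The prototype design here is not a perfect-reconstruction design and all parameters are illustrative.

```python
# Rough M-band analysis/synthesis filterbank sketch (imperfect reconstruction).
import numpy as np
from scipy import signal

M = 4                                   # number of subbands
taps = 128
x = np.random.randn(4096)               # fullband input signal

# Modulated versions of a prototype lowpass give M adjacent bandpass filters
proto = signal.firwin(taps, cutoff=1.0 / (2 * M))
n = np.arange(taps)
bank = [proto * 2 * np.cos(2 * np.pi * (k + 0.5) * n / (2 * M)) for k in range(M)]

# Analysis: filter, then decimate by M (lower-rate subband signals)
subbands = [signal.lfilter(h, 1.0, x)[::M] for h in bank]

# ... subband-domain processing would happen here ...

# Synthesis: zero-stuff back to the full rate, filter again, and sum
y = np.zeros(len(x))
for h, sb in zip(bank, subbands):
    up = np.zeros(len(sb) * M)
    up[::M] = sb * M
    y += signal.lfilter(h, 1.0, up)[:len(x)]
print(x.shape, y.shape)
```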

  4. Characterization of fluid flow by digital correlation of scattered light

    NASA Technical Reports Server (NTRS)

    Gilbert, John A.; Matthys, Donald R.

    1989-01-01

    The objective is to produce a physical system suitable for a space environment that can measure fluid velocities in a three-dimensional volume by the development of a particle correlation velocimetry technique. Experimental studies were conducted on a field test cell to demonstrate the suitability and accuracy of digital correlation techniques for measuring two-dimensional fluid flows. This objective was satisfied by: (1) the design of an appropriate illumination and detection system for making velocity measurements within a test cell; (2) the design and construction of a test cell; (3) the preliminary evaluations on fluid and seeding requirements; and (4) the performance of controlled tests using a multiple exposure correlation technique. The presentation consists of viewgraphs with very little text.
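
    The core digital-correlation step can be illustrated with two synthetic particle images: the displacement between exposures appears as the location of the peak of their cross-correlation, computed here with 2-D FFTs. The frame size, seeding density, and shift are made-up values.

```python
# Digital correlation step for particle velocimetry on synthetic frames.
import numpy as np

rng = np.random.default_rng(1)
frame1 = np.zeros((64, 64))
ys, xs = rng.integers(8, 48, size=(2, 30))             # random particle positions
frame1[ys, xs] = 1.0
frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))    # uniform flow: dy=3, dx=5

# Cyclic cross-correlation via FFT; the peak gives the displacement
corr = np.fft.ifft2(np.fft.fft2(frame1).conj() * np.fft.fft2(frame2)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print("estimated displacement:", dy, dx)                # expected 3, 5
```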

  5. Green light may improve diagnostic accuracy of nailfold capillaroscopy with a simple digital videomicroscope.

    PubMed

    Weekenstroo, Harm H A; Cornelissen, Bart M W; Bernelot Moens, Hein J

    2015-06-01

    Nailfold capillaroscopy is a non-invasive and safe technique for the analysis of microangiopathologies. The imaging quality of widely used simple videomicroscopes is poor. The use of green illumination instead of the commonly used white light may improve contrast. The aim of the study was to compare the effect of green illumination with white illumination regarding capillary density, the number of microangiopathologies, and sensitivity and specificity for systemic sclerosis. Five rheumatologists evaluated 80 images: 40 acquired with green light and 40 acquired with white light. A larger number of microangiopathologies were found in images acquired with green light than in images acquired with white light. This resulted in slightly higher sensitivity with green light in comparison with white light, without reducing the specificity. These findings suggest that green instead of white illumination may facilitate evaluation of capillaroscopic images obtained with a low-cost digital videomicroscope.

  6. Lighting in digital game worlds: effects on affect and play performance.

    PubMed

    Knez, Igor; Niedenthal, Simon

    2008-04-01

    As a means of extending the significance of findings in experimental psychology and nonvisual psychological lighting research to digital game research, the present study was designed to investigate the impact of warm (reddish) and cool (bluish) simulated illumination in digital game worlds on game users' affect and play performance. In line with some previous findings, we predicted that lighting in a digital game world might, as in the real world, differently influence the nonvisual psychological mechanisms of affect, which in turn might enhance or impair the players' performance. It was shown that the players performed best and fastest in a game world lit with warm (reddish) rather than cool (bluish) lighting. The warm lighting also induced the highest level of pleasantness in game users. A regression analysis indicated tentatively that it was the level of pleasantness induced by the warm lighting that accounted for the players' better performance in that digital game world. It was also shown that high-skilled players engage in game playing almost 2.5 times more per week than medium- or low-skilled players. Given their skill, they performed significantly faster and felt significantly calmer and more relaxed in doing so.

  7. Digital Art Making as a Representational Process

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld

    2013-01-01

    In this article I bring artistic production into the learning sciences conversation by using the production of representations as a bridging concept between art making and the new literacies. Through case studies with 4 youth media arts organizations across the United States I ask how organizations structure the process of producing…

  8. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  9. Investigations of high-speed digital imaging of low-light-level events using pulsed near-infrared laser light sources

    NASA Astrophysics Data System (ADS)

    Jantzen, Connie; Slagle, Rick

    1997-05-01

    The distinction between exposure time and sample rate is often the first point raised in any discussion of high speed imaging. Many high speed events require exposure times considerably shorter than those that can be achieved solely by the sample rate of the camera, where exposure time equals 1/sample rate. Gating, a method of achieving short exposure times in digital cameras, is often difficult to achieve for exposure time requirements shorter than 100 microseconds. This paper discusses the advantages and limitations of using the short duration light pulse of a near infrared laser with high speed digital imaging systems. By closely matching the output wavelength of the pulsed laser to the peak near infrared response of current sensors, high speed image capture can be accomplished at very low (visible) light levels of illumination. By virtue of the short duration light pulse, adjustable to as short as two microseconds, image capture of very high speed events can be achieved at relatively low sample rates of less than 100 pictures per second, without image blur. For our initial investigations, we chose a ballistic subject. The results of early experimentation revealed the limitations of applying traditional ballistic imaging methods when using a pulsed infrared light source with a digital imaging system. These early disappointing results clarified the need to further identify the unique system characteristics of the digital imager and pulsed infrared combination. It was also necessary to investigate how the infrared reflectance and transmittance of common materials affects the imaging process. This experimental work yielded a surprising, successful methodology which will prove useful in imaging ballistic and weapons tests, as well as forensics, flow visualizations, spray pattern analyses, and nocturnal animal behavioral studies.

  10. Increasing signal-to-noise ratio of reconstructed digital holograms by using light spatial noise portrait of camera's photosensor

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Sergey N.

    2015-01-01

    Digital holography is a technique that includes recording of an interference pattern with a digital photosensor, processing of the obtained holographic data, and reconstruction of the object wavefront. Increasing the signal-to-noise ratio (SNR) of reconstructed digital holograms is especially important in such fields as image encryption, pattern recognition, and static and dynamic display of 3D scenes. In this paper, compensation of the photosensor light spatial noise portrait (LSNP) to increase the SNR of reconstructed digital holograms is proposed. To verify the proposed method, numerical experiments with computer-generated Fresnel holograms with a resolution of 512×512 elements were performed. Registration of shots with a Canon EOS 400D digital camera was simulated. It is shown that use of the averaging-over-frames method alone increases SNR only up to 4 times, with further increase limited by spatial noise. Applying the LSNP compensation method in conjunction with frame averaging allows a 10-fold SNR increase. This value was obtained for an LSNP measured with 20% error. With a more accurately measured LSNP, SNR can be increased up to 20 times.
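
    A toy numerical illustration of the two steps compared in the abstract, frame averaging versus frame averaging plus subtraction of a pre-measured light spatial noise portrait (LSNP); the noise levels and image content are invented and the example is not a reproduction of the authors' simulation.

```python
# Frame averaging vs. frame averaging + spatial-noise (LSNP) compensation.
import numpy as np

rng = np.random.default_rng(2)
clean = rng.random((512, 512))                   # stand-in for the ideal hologram
lsnp = 0.05 * rng.standard_normal((512, 512))    # camera's light spatial noise portrait

frames = [clean + lsnp + 0.1 * rng.standard_normal(clean.shape) for _ in range(20)]

averaged = np.mean(frames, axis=0)               # suppresses temporal noise only
compensated = averaged - lsnp                    # also removes the spatial noise

def snr(estimate):
    err = estimate - clean
    return 10 * np.log10(np.var(clean) / np.var(err))

print(f"averaging only : {snr(averaged):.1f} dB")
print(f"+ LSNP removal : {snr(compensated):.1f} dB")
```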

  11. Display nonlinearity in digital image processing for visual communications

    NASA Astrophysics Data System (ADS)

    Peli, Eli

    1992-11-01

    The luminance emitted from a cathode ray tube (CRT) display is a nonlinear function (the gamma function) of the input video signal voltage. In most analog video systems, compensation for this nonlinear transfer function is implemented in the camera amplifiers. When CRT displays are used to present psychophysical stimuli in vision research, the specific display nonlinearity usually is measured and accounted for to ensure that the luminance of each pixel in the synthetic image properly represents the intended value. However, when using digital image processing, the linear analog-to-digital converters store a digital image that is nonlinearly related to the displayed or recorded image. The effect of this nonlinear transformation on a variety of image-processing applications used in visual communications is described.

  12. Display nonlinearity in digital image processing for visual communications

    NASA Astrophysics Data System (ADS)

    Peli, Eli

    1991-11-01

    The luminance emitted from a cathode ray tube, (CRT) display is a nonlinear function (the gamma function) of the input video signal voltage. In most analog video systems, compensation for this nonlinear transfer function is implemented in the camera amplifiers. When CRT displays are used to present psychophysical stimuli in vision research, the specific display nonlinearity usually is measured and accounted for to ensure that the luminance of each pixel in the synthetic image properly represents the intended value. However, when using digital image processing, the linear analog-to-digital converters store a digital image that is nonlinearly related to the displayed or recorded image. This paper describes the effect of this nonlinear transformation on a variety of image-processing applications used in visual communications.
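
    The practical consequence described in these two records can be summarized in a few lines: stored digital values that are linear in signal voltage must be converted through the measured display nonlinearity before any processing that is meant to operate on luminance. The gamma value below is an assumed, typical figure.

```python
# Linearize stored pixel values with a measured display gamma before processing.
import numpy as np

gamma = 2.2                            # measured display nonlinearity (assumed value)
digital = np.linspace(0, 255, 256)     # 8-bit stored pixel values

luminance = (digital / 255.0) ** gamma          # what the display actually emits
# ... luminance-domain processing (filtering, contrast metrics, etc.) goes here ...
back_to_digital = np.round(255.0 * luminance ** (1.0 / gamma)).astype(np.uint8)

print("max round-trip error:", np.max(np.abs(back_to_digital - digital)))
```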

  13. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, I/Q conversion, sampling and cross correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.
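
    As a small worked example of the single-point-target model, the sketch below applies FFT-based matched filtering (pulse compression) to a delayed linear-FM echo; the chirp parameters and delay are invented, and this is only one ingredient of a full SAR processor.

```python
# Pulse compression of a single point-target echo by FFT-based cross-correlation.
import numpy as np

fs = 10e6                       # sample rate, Hz (illustrative)
T = 50e-6                       # chirp duration, s
B = 2e6                         # chirp bandwidth, Hz
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t ** 2)          # linear-FM reference

# Echo: delayed, attenuated copy of the chirp in noise (point target)
delay_samples = 180
echo = np.zeros(2048, dtype=complex)
echo[delay_samples:delay_samples + len(chirp)] = 0.3 * chirp
echo += 0.05 * (np.random.randn(2048) + 1j * np.random.randn(2048))

# Matched filter via FFT: correlate the echo with the reference chirp
n = len(echo) + len(chirp) - 1
compressed = np.fft.ifft(np.fft.fft(echo, n) * np.conj(np.fft.fft(chirp, n)))
print("target at sample", np.argmax(np.abs(compressed)))   # ~ delay_samples
```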

  14. Better understanding of digital photography for skin color measurement: With a special emphasis on light characteristics.

    PubMed

    Seo, Soo Hong; Kim, Jae Hwan; Kim, Ji Woong; Kye, Young Chul; Ahn, Hyo Hyun

    2011-02-01

    Digital photography can be used to measure skin color colorimetrically when combined with proper techniques. To better understand the settings of digital photography for the evaluation and measurement of skin colors, we used a tungsten lamp with filters and the custom white balance (WB) function of a digital camera. All colored squares on a color chart were photographed with each original and filtered light, analyzed into CIELAB coordinates to produce the calibration method for each given light setting, and compared statistically with reference coordinates obtained using a reflectance spectrophotometer. The results were summarized for typical color groups, such as skin colors. We compared these results according to the fixed vs. custom WB of the digital camera. The accuracy of color measurement was improved when using light with a proper color temperature conversion filter. The skin colors from color charts could be measured more accurately using a fixed WB. In vivo measurement of skin color was easy and possible with our method and settings. The color temperature conversion filter that produced daylight-like light from the tungsten lamp was the best choice when combined with fixed WB for the measurement of colors and acceptable photographs. © 2010 John Wiley & Sons A/S.

  15. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.

  16. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  17. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  18. Digital Signal Processing in Acoustics--Part 2.

    ERIC Educational Resources Information Center

    Davies, H.; McNeill, D. J.

    1986-01-01

    Reviews the potential of a data acquisition system for illustrating the nature and significance of ideas in digital signal processing. Focuses on the fast Fourier transform and the utility of its two-channel format, emphasizing cross-correlation and its two-microphone technique of acoustic intensity measurement. Includes programming format. (ML)
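
    In the spirit of the two-channel format discussed above, the following sketch estimates the delay between two microphone channels from their FFT-based cross-correlation; the sample rate, delay, and noise level are arbitrary demonstration values.

```python
# Two-channel delay estimation via the FFT cross-spectrum.
import numpy as np

fs = 48000
n = 1 << 14
rng = np.random.default_rng(3)
source = rng.standard_normal(n)

delay = 37                                    # samples between the two channels
ch1 = source
ch2 = np.roll(source, delay) + 0.1 * rng.standard_normal(n)

# Cross-correlation computed from the cross-spectrum
cross_spectrum = np.fft.rfft(ch1).conj() * np.fft.rfft(ch2)
xcorr = np.fft.irfft(cross_spectrum, n)
lag = np.argmax(xcorr)
print(f"estimated delay: {lag} samples ({lag / fs * 1e3:.2f} ms)")
```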

  19. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  20. Effects of digital phase-conjugate light intensity on time-reversal imaging through animal tissue.

    PubMed

    Toda, Sogo; Kato, Yuji; Kudo, Nobuki; Shimizu, Koichi

    2018-04-01

    For transillumination imaging of animal tissues, we have attempted to suppress the scattering effect in a turbid medium using the time-reversal principle of phase-conjugate light. We constructed a digital phase-conjugate system to enable intensity modulation and phase modulation. Using this system, we clarified the effectiveness of the intensity information for restoration of the original light distribution through a turbid medium. By varying the scattering coefficient of the medium, we clarified the limit of time-reversal ability with intensity information of the phase-conjugate light. Experimental results demonstrated the applicability of the proposed technique to animal tissue.

  1. Confocal Retinal Imaging Using a Digital Light Projector with a Near Infrared VCSEL Source

    PubMed Central

    Muller, Matthew S.; Elsner, Ann E.

    2018-01-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1″ LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging. PMID:29899586

  2. Confocal retinal imaging using a digital light projector with a near infrared VCSEL source

    NASA Astrophysics Data System (ADS)

    Muller, Matthew S.; Elsner, Ann E.

    2018-02-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.

  3. Investigating Digital Optical Computing with Spatial Light Rebroadcasters

    DTIC Science & Technology

    1991-10-31


  4. Digital image processing for the earth resources technology satellite data.

    NASA Technical Reports Server (NTRS)

    Will, P. M.; Bakis, R.; Wesley, M. A.

    1972-01-01

    This paper discusses the problems of digital processing of the large volumes of multispectral image data that are expected to be received from the ERTS program. Correction of geometric and radiometric distortions is discussed, and a byte-oriented implementation is proposed. CPU timing estimates are given for a System/360 Model 67 and show that a processing throughput of 1000 image sets per week is feasible.

  5. Fundamentals of image acquisition and processing in the digital era.

    PubMed

    Farman, A G

    2003-01-01

    To review the historic context for digital imaging in dentistry and to outline the fundamental issues related to digital imaging modalities. Digital dental X-ray images can be achieved by scanning analog film radiographs (secondary capture), with photostimulable phosphors, or using solid-state detectors (e.g. charge-coupled device and complementary metal oxide semiconductor). There are four characteristics that are basic to all digital image detectors; namely, size of active area, signal-to-noise ratio, contrast resolution and spatial resolution. To perceive structure in a radiographic image, there needs to be sufficient difference between contrasting densities. This primarily depends on the differences in the attenuation of the X-ray beam by adjacent tissues. It also depends on the signal received; therefore, contrast tends to increase with increased exposure. Given adequate signal and sufficient differences in radiodensity, contrast will be sufficient to differentiate between adjacent structures, irrespective of the recording modality and processing used. Where contrast is not sufficient, digital images can sometimes be post-processed to disclose details that would otherwise go undetected. For example, cephalogram isodensity mapping can improve soft tissue detail. It is concluded that it could be a further decade or two before three-dimensional digital imaging systems entirely replace two-dimensional analog films. Such systems need not only to produce prettier images, but also to provide a demonstrable evidence-based higher standard of care at a cost that is not economically prohibitive for the practitioner or society, and which allows efficient and effective workflow within the business of dental practice.

  6. A 256×256 low-light-level CMOS imaging sensor with digital CDS

    NASA Astrophysics Data System (ADS)

    Zou, Mei; Chen, Nan; Zhong, Shengyou; Li, Zhengfen; Zhang, Jicun; Yao, Li-bin

    2016-10-01

    In order to achieve high sensitivity for low-light-level CMOS image sensors (CIS), a capacitive transimpedance amplifier (CTIA) pixel circuit with a small integration capacitor is used. As the pixel and the column area are highly constrained, it is difficult to implement analog correlated double sampling (CDS) to remove the noise in low-light-level CIS. A digital CDS is therefore adopted, which realizes the subtraction between the reset signal and the pixel signal off-chip. The pixel reset noise and part of the column fixed-pattern noise (FPN) can be greatly reduced. A 256×256 CIS with a CTIA array and digital CDS is implemented in 0.35 μm CMOS technology. The chip size is 7.7 mm×6.75 mm, and the pixel size is 15 μm×15 μm with a fill factor of 20.6%. The measured pixel noise with digital CDS is 24 LSB RMS under dark conditions, a 7.8× reduction compared to the image sensor without digital CDS. Running at 7 fps, this low-light-level CIS can capture recognizable images with the illumination down to 0.1 lux.
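
    The digital CDS operation itself is an off-chip subtraction of the digitized reset frame from the digitized signal frame. The toy model below, with invented noise magnitudes, shows how that subtraction cancels reset noise and column fixed-pattern noise while leaving only the uncorrelated readout noise.

```python
# Toy model of digital correlated double sampling (off-chip frame subtraction).
import numpy as np

rng = np.random.default_rng(4)
rows, cols = 256, 256
scene = rng.integers(0, 200, size=(rows, cols)).astype(float)   # photo signal, LSB

column_fpn = rng.normal(0, 30, size=cols)            # per-column offset, LSB
reset_noise = rng.normal(0, 24, size=(rows, cols))   # reset (kTC) noise, frozen per exposure

def read(frame):
    """One digitized sampling of the array, with small uncorrelated readout noise."""
    return frame + rng.normal(0, 2, size=(rows, cols))

reset_frame = read(512 + column_fpn + reset_noise)
signal_frame = read(512 + column_fpn + reset_noise + scene)

cds_frame = signal_frame - reset_frame                # digital CDS subtraction
print("RMS error vs. scene:", np.sqrt(np.mean((cds_frame - scene) ** 2)))
```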

  7. Developing daisy chain receivers for light-emitting diode illumination adopting the digital multiplex-512 protocol.

    PubMed

    Um, Keehong; Yoo, Sooyeup

    2013-10-01

    The protocol for digital multiplex with 512 pieces of information (digital multiplex-512) is increasingly adopted in the design of illumination systems. In conventional light-emitting diode systems, the receivers are connected in parallel and each receiving unit receives all the data from the master dimmer console, but each unit operates only on the data corresponding to its assigned receiver number. Because the serial numbers of illumination devices are transmitted in binary code, synchronization is too complicated to be used properly. In order to improve the protocol of illumination control systems, we propose an algorithm of protocol reception to install and manage the system in a simpler and more convenient way. We propose systems for controlling light-emitting diode illumination with simplified receiver slaves adopting the digital multiplex-512 protocol, where the master console and multiple receiver slaves are connected in a daisy-chain fashion. The digital multiplex-512 data packet is received according to the sequence order of the receivers' locations from the console, without assigning a sequence number for each channel at the receiving device. The purpose of this paper is to design a simple and small-sized controller for the control systems of lamps and lighting adopting the digital multiplex-512 network.
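
    A conceptual model of position-based reception in a daisy chain (not the authors' firmware): each receiver consumes its own channels from the front of the arriving packet and forwards the remainder, so no unit needs a pre-assigned address.

```python
# Position-based channel assignment in a daisy chain (conceptual sketch).
def daisy_chain(packet, receivers):
    """packet: list of channel values; receivers: list of channel counts per unit."""
    outputs = []
    remaining = list(packet)
    for needed in receivers:
        outputs.append(remaining[:needed])    # data for this receiver, by position only
        remaining = remaining[needed:]        # the rest is forwarded downstream
    return outputs

# A 512-channel frame and three LED fixtures taking 3, 4 and 3 channels respectively
frame = list(range(512))
print(daisy_chain(frame, [3, 4, 3]))          # [[0, 1, 2], [3, 4, 5, 6], [7, 8, 9]]
```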

  8. Light scattering and transmission measurement using digital imaging for online analysis of constituents in milk

    NASA Astrophysics Data System (ADS)

    Jain, Pranay; Sarma, Sanjay E.

    2015-05-01

    Milk is an emulsion of fat globules and casein micelles dispersed in an aqueous medium with dissolved lactose, whey proteins and minerals. Quantification of constituents in milk is important in various stages of the dairy supply chain for proper process control and quality assurance. In field-level applications, spectrophotometric analysis is an economical option due to the low cost of silicon photodetectors, which are sensitive to UV/Vis radiation with wavelengths between 300 - 1100 nm. Both absorption and scattering are witnessed as incident UV/Vis radiation interacts with dissolved and dispersed constituents in milk. These effects can in turn be used to characterize the chemical and physical composition of a milk sample. However, in order to simplify analysis, most existing instruments require dilution of samples to avoid effects of multiple scattering. The sample preparation steps are usually expensive, prone to human error and unsuitable for field-level and online analysis. This paper introduces a novel digital imaging based method of online spectrophotometric measurements on raw milk without any sample preparation. Multiple LEDs of different emission spectra are used as discrete light sources and a digital CMOS camera is used as an image sensor. The extinction characteristic of samples is derived from captured images. The dependence of multiple scattering on the power of incident radiation is exploited to quantify scattering. The method has been validated with experiments for response with varying fat concentrations and fat globule sizes. Despite the presence of multiple scattering, the method is able to unequivocally quantify extinction of incident radiation and relate it to the fat concentrations and globule sizes of samples.

  9. A digital signal processing system for coherent laser radar

    NASA Technical Reports Server (NTRS)

    Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry

    1991-01-01

    A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital signal processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor and collects 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.
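
    The central software task, estimating complete spectra in real time from blocks of digitized lidar signal, amounts to computing and averaging windowed periodograms. The sketch below uses invented sample-rate, block-size, and Doppler values; only the 2500 spectra-per-second figure is taken from the abstract.

```python
# Averaged periodogram estimation from blocks of digitized CW-lidar signal.
import numpy as np

fs = 50e6                      # digitizer sample rate, Hz (assumed)
block = 1024                   # samples per spectrum
n_blocks = 2500                # spectra accumulated per second, per the abstract

rng = np.random.default_rng(5)
signal_freq = 7.3e6            # Doppler-shifted return (synthetic)
t = np.arange(block * n_blocks) / fs
x = 0.2 * np.sin(2 * np.pi * signal_freq * t) + rng.standard_normal(t.size)

blocks = x.reshape(n_blocks, block) * np.hanning(block)
spectra = np.abs(np.fft.rfft(blocks, axis=1)) ** 2
mean_spectrum = spectra.mean(axis=0)

freqs = np.fft.rfftfreq(block, d=1 / fs)
print(f"peak near {freqs[np.argmax(mean_spectrum[1:]) + 1] / 1e6:.2f} MHz")
```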

  10. Integrating digital topology in image-processing libraries.

    PubMed

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the approach can be adapted with only minor modifications to other image-processing libraries.

  11. A novel method for detecting light source for digital images forensic

    NASA Astrophysics Data System (ADS)

    Roy, A. K.; Mitra, S. K.; Agrawal, R.

    2011-06-01

    Manipulation in image has been in practice since centuries. These manipulated images are intended to alter facts — facts of ethics, morality, politics, sex, celebrity or chaos. Image forensic science is used to detect these manipulations in a digital image. There are several standard ways to analyze an image for manipulation. Each one has some limitation. Also very rarely any method tried to capitalize on the way image was taken by the camera. We propose a new method that is based on light and its shade as light and shade are the fundamental input resources that may carry all the information of the image. The proposed method measures the direction of light source and uses the light based technique for identification of any intentional partial manipulation in the said digital image. The method is tested for known manipulated images to correctly identify the light sources. The light source of an image is measured in terms of angle. The experimental results show the robustness of the methodology.

  12. Focusing light inside dynamic scattering media with millisecond digital optical phase conjugation

    PubMed Central

    Liu, Yan; Ma, Cheng; Shen, Yuecheng; Shi, Junhui; Wang, Lihong V.

    2017-01-01

    Wavefront shaping based on digital optical phase conjugation (DOPC) focuses light through or inside scattering media, but the low speed of DOPC prevents it from being applied to thick, living biological tissue. Although a fast DOPC approach was recently developed, the reported single-shot wavefront measurement method does not work when the goal is to focus light inside, instead of through, highly scattering media. Here, using a ferroelectric liquid crystal based spatial light modulator, we develop a simpler but faster DOPC system that focuses light not only through, but also inside scattering media. By controlling 2.6 × 10^5 optical degrees of freedom, our system focused light through 3 mm thick moving chicken tissue, with a system latency of 3.0 ms. Using ultrasound-guided DOPC, along with a binary wavefront measurement method, our system focused light inside a scattering medium comprising moving tissue with a latency of 6.0 ms, which is one to two orders of magnitude shorter than those of previous digital wavefront shaping systems. Since the demonstrated speed approaches tissue decorrelation rates, this work is an important step toward in vivo deep-tissue non-invasive optical imaging, manipulation, and therapy. PMID:28815194

  13. Experimental demonstration of quantum digital signatures using phase-encoded coherent states of light

    PubMed Central

    Clarke, Patrick J.; Collins, Robert J.; Dunjko, Vedran; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2012-01-01

    Digital signatures are frequently used in data transfer to prevent impersonation, repudiation and message tampering. Currently used classical digital signature schemes rely on public key encryption techniques, where the complexity of so-called ‘one-way' mathematical functions is used to provide security over sufficiently long timescales. No mathematical proofs are known for the long-term security of such techniques. Quantum digital signatures offer a means of sending a message, which cannot be forged or repudiated, with security verified by information-theoretical limits and quantum mechanics. Here we demonstrate an experimental system, which distributes quantum signatures from one sender to two receivers and enables message sending ensured against forging and repudiation. Additionally, we analyse the security of the system in some typical scenarios. Our system is based on the interference of phase-encoded coherent states of light and our implementation utilizes polarization-maintaining optical fibre and photons with a wavelength of 850 nm. PMID:23132024

  14. Novel Digital Driving Method Using Dual Scan for Active Matrix Organic Light-Emitting Diode Displays

    NASA Astrophysics Data System (ADS)

    Jung, Myoung Hoon; Choi, Inho; Chung, Hoon-Ju; Kim, Ohyun

    2008-11-01

    A new digital driving method has been developed for low-temperature polycrystalline silicon, transistor-driven, active-matrix organic light-emitting diode (AM-OLED) displays by time-ratio gray-scale expression. This driving method effectively increases the emission ratio and the number of subfields by inserting another subfield set into nondisplay periods in the conventional digital driving method. By employing the proposed modified gravity center coding, this method can be used to effectively compensate for dynamic false contour noise. The operation and performance were verified by current measurement and image simulation. The simulation results using eight test images show that the proposed approach improves the average peak signal-to-noise ratio by 2.61 dB, and the emission ratio by 20.5%, compared with the conventional digital driving method.

  15. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  16. Viking image processing. [digital stereo imagery and computer mosaicking

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  17. Digital SAR processing using a fast polynomial transform

    NASA Technical Reports Server (NTRS)

    Butman, S.; Lipes, R.; Rubin, A.; Truong, T. K.

    1981-01-01

    A new digital processing algorithm based on the fast polynomial transform is developed for producing images from Synthetic Aperture Radar data. This algorithm enables the computation of the two dimensional cyclic correlation of the raw echo data with the impulse response of a point target, thereby reducing distortions inherent in one dimensional transforms. This SAR processing technique was evaluated on a general-purpose computer and an actual Seasat SAR image was produced. However, regular production runs will require a dedicated facility. It is expected that such a new SAR processing algorithm could provide the basis for a real-time SAR correlator implementation in the Deep Space Network.
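
    The essential operation, a two-dimensional cyclic correlation of raw echo data with a point-target response, can be sketched with ordinary 2-D FFTs standing in for the fast polynomial transform the paper actually uses; the data here are synthetic.

```python
# 2-D cyclic correlation of raw echo data with a point-target response (FFT stand-in).
import numpy as np

rng = np.random.default_rng(6)
raw = rng.standard_normal((256, 256))             # stand-in for raw SAR echo data
point_response = rng.standard_normal((256, 256))  # stand-in for the point-target response

# Embed one copy of the response at a known (range, azimuth) offset
offset = (40, 90)
raw += np.roll(point_response, shift=offset, axis=(0, 1))

# 2-D cyclic correlation: the peak indicates the target location
corr = np.fft.ifft2(np.fft.fft2(raw) * np.conj(np.fft.fft2(point_response))).real
print(np.unravel_index(np.argmax(corr), corr.shape))   # ~ (40, 90)
```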

  18. Digital image processing using parallel computing based on CUDA technology

    NASA Astrophysics Data System (ADS)

    Skirnevskiy, I. P.; Pustovit, A. V.; Abdrashitova, M. O.

    2017-01-01

    This article describes the expediency of using a graphics processing unit (GPU) for big data processing in the context of digital image processing. It provides a short description of a parallel computing technology and its usage in different areas, a definition of image noise, and a brief overview of some noise removal algorithms. It also describes some basic requirements that a noise removal algorithm should meet when applied to computed tomography projections. It provides a comparison of the performance with and without using a GPU, as well as with different percentages of CPU and GPU usage.

  19. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  20. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
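
    A minimal multi-scale retinex sketch (without the color-restoration step of MSRCR), following the generic form of averaging log(image) minus log(Gaussian-blurred image) over several scales; the scale values and normalization are common choices, not necessarily those used in the NASA algorithm.

```python
# Minimal multi-scale retinex (no color restoration); scales and stretch are generic choices.
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(image, sigmas=(15, 80, 250)):
    image = image.astype(float) + 1.0           # offset to avoid log(0)
    msr = np.zeros_like(image)
    for sigma in sigmas:
        msr += np.log(image) - np.log(gaussian_filter(image, sigma) + 1.0)
    msr /= len(sigmas)
    # Stretch the result back to a displayable 8-bit range
    msr = (msr - msr.min()) / (msr.max() - msr.min() + 1e-12)
    return (255 * msr).astype(np.uint8)

demo = np.random.rand(128, 128) * 255
print(multi_scale_retinex(demo).shape)
```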

  1. Multispectral digital lensless holographic microscopy: from femtosecond laser to white light LED

    NASA Astrophysics Data System (ADS)

    Garcia-Sucerquia, J.

    2015-04-01

    The use of femtosecond laser radiation and a super bright white LED in digital lensless holographic microscopy is presented. For the ultrafast laser radiation, two different configurations of operation of the microscope are presented and the dissimilar performance of each one is analyzed. The microscope operating with a super bright white light LED in combination with optical filters shows very competitive performance when compared with more expensive optical sources. The broadband emission of both radiation sources allows the multispectral imaging of biological samples to obtain spectral responses and/or full color images of the microscopic specimens; sections of the head of a Drosophila melanogaster fly are imaged in this contribution. The simple, solid, compact, lightweight, and reliable architecture of digital lensless holographic microscopy operating with broadband light sources to image biological specimens exhibiting micrometer-sized details is evaluated in the present contribution.

  2. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently a repository should be evaluated on whether or not they are effective in their data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  3. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis of the speech signal rather than linguistic analysis. Several digital signal processing algorithms are available for voice recognition, including Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these algorithms, LPC is the most widely used. This algorithm has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency domain algorithms with fewer assumptions, but they are not widely implemented or investigated. However, with recent advances in digital technology, namely signal processors, these two frequency domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
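
    A hedged sketch of the LPC analysis step mentioned above: the autocorrelation of a windowed frame followed by the Levinson-Durbin recursion yields the predictor coefficients. The frame length, model order, and test signal are typical but arbitrary choices.

```python
# LPC analysis of one speech frame: autocorrelation + Levinson-Durbin recursion.
import numpy as np

def lpc(frame, order=10):
    """Return inverse-filter coefficients [1, a1, ..., ap] and the residual energy."""
    frame = frame * np.hamming(len(frame))
    r = np.correlate(frame, frame, mode='full')[len(frame) - 1:]   # autocorrelation r[0..]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err          # reflection coefficient
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]             # predictor update
        err *= (1.0 - k * k)
    return a, err

fs = 8000
t = np.arange(240) / fs                             # 30 ms frame at 8 kHz
frame = (np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1900 * t)
         + 0.01 * np.random.default_rng(7).standard_normal(t.size))
coeffs, residual_energy = lpc(frame, order=10)
print(coeffs)
```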

  4. Digital signal processing based on inverse scattering transform.

    PubMed

    Turitsyna, Elena G; Turitsyn, Sergei K

    2013-10-15

    Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal.

  5. Development of a digital astronomical intensity interferometer: laboratory results with thermal light

    NASA Astrophysics Data System (ADS)

    Matthews, Nolan; Kieda, David; LeBohec, Stephan

    2018-06-01

    We present measurements of the second-order spatial coherence function of thermal light sources using Hanbury-Brown and Twiss interferometry with a digital correlator. We demonstrate that intensity fluctuations between orthogonal polarizations, or at detector separations greater than the spatial coherence length of the source, are uncorrelated but can be used to reduce systematic noise. The work performed here can readily be applied to existing and future Imaging Air-Cherenkov Telescopes used as star light collectors for stellar intensity interferometry to measure spatial properties of astronomical objects.

  6. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared.

  7. The Relationship between Digit Span and Cognitive Processing Across Ability Groups.

    ERIC Educational Resources Information Center

    Schofield, Neville J.; Ashman, Adrian F.

    1986-01-01

    The relationship between forward and backward digit span and basic cognitive processes was examined. Subjects were administered measures of sequential processing, simultaneous processing, and planning. Correlational analyses indicated the serial processing character of forward digit span, and the relationship between backward digit span and…

  8. MethyLight droplet digital PCR for detection and absolute quantification of infrequently methylated alleles.

    PubMed

    Yu, Ming; Carter, Kelly T; Makar, Karen W; Vickers, Kathy; Ulrich, Cornelia M; Schoen, Robert E; Brenner, Dean; Markowitz, Sanford D; Grady, William M

    2015-01-01

    Aberrant DNA methylation is a common epigenetic alteration found in colorectal adenomas and cancers and plays a role in cancer initiation and progression. Aberrantly methylated DNA loci can also be found infrequently present in normal colon tissue, where they seem to have potential to be used as colorectal cancer (CRC) risk biomarkers. However, detection and precise quantification of the infrequent methylation events seen in normal colon is likely beyond the capability of commonly used PCR technologies. To determine the potential for methylated DNA loci as CRC risk biomarkers, we developed MethyLight droplet digital PCR (ddPCR) assays and compared their performance to the widely used conventional MethyLight PCR. Our analyses demonstrated the capacity of MethyLight ddPCR to detect a single methylated NTRK3 allele from among more than 3125 unmethylated alleles, 25-fold more sensitive than conventional MethyLight PCR. The MethyLight ddPCR assay detected as little as 19 and 38 haploid genome equivalents of methylated EVL and methylated NTRK3, respectively, which far exceeded conventional MethyLight PCR (379 haploid genome equivalents for both genes). When assessing methylated EVL levels in CRC tissue samples, MethyLight ddPCR reduced coefficients of variation (CV) to 6-65% of CVs seen with conventional MethyLight PCR. Importantly, we showed the ability of MethyLight ddPCR to detect infrequently methylated EVL alleles in normal colon mucosa samples that could not be detected by conventional MethyLight PCR. This study suggests that the sensitivity and precision of methylation detection by MethyLight ddPCR enhances the potential of methylated alleles for use as CRC risk biomarkers.
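
    The absolute-quantification arithmetic of droplet digital PCR in general (not anything specific to the MethyLight assays above) follows from Poisson statistics on the fraction of positive droplets, as in the sketch below; the droplet volume is an assumed typical value.

```python
# Generic ddPCR absolute quantification from droplet counts (Poisson correction).
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Return estimated target copies per microliter of the analyzed reaction."""
    p = positive / total                          # fraction of positive droplets
    lam = -math.log(1.0 - p)                      # mean copies per droplet (Poisson)
    return lam / droplet_volume_ul

# e.g. 38 positive droplets out of 15000 accepted droplets
print(f"{ddpcr_concentration(38, 15000):.2f} copies/uL")
```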

  9. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    NASA Astrophysics Data System (ADS)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

    This article reports the suggestion, realization and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, especially of its combined imaging, detection, sampling and digitizing steps, which are influenced by the additive and spatially discrete photodetector, aliasing and quantization noises. The method relates to the still-camera automatic working regime and a static two-dimensional spatially continuous light-reflection random target with white-noise properties. The theoretical basis for the random-target method is also presented, exploiting the proposed simulation model of the linear optical intensity response and the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the ascertainable output and input power spectral densities. The random-target and resultant image data were obtained and processed with a PC using computation programs developed on the basis of MATLAB 6.5. The presented examples and other obtained measurement results demonstrate the sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.

  10. [Observation of oral actions using digital image processing system].

    PubMed

    Ichikawa, T; Komoda, J; Horiuchi, M; Ichiba, H; Hada, M; Matsumoto, N

    1990-04-01

    A new digital image processing system to observe oral actions is proposed. The system provides analyses of motion pictures along with other physiological signals. The major components are a video tape recorder, a digital image processor, a percept scope, a CCD camera, an A/D converter and a personal computer. Five reference points were marked on the lip and eyeglasses of 9 adult subjects. Lip movements were recorded and analyzed using the system when uttering the five vowels and [ka, sa, ta, ha, ra, ma, pa, ba]. 1. Positions of the lip when uttering the five vowels were clearly classified. 2. Active articulatory movements of the lip were not recognized when uttering the consonants [k, s, t, h, r]; lip movements seemed to depend on tongue and mandibular movements. Downward and rearward movements of the upper lip, and upward and forward movements of the lower lip, were observed when uttering the consonants [m, p, b].

  11. A Versatile Multichannel Digital Signal Processing Module for Microcalorimeter Arrays

    NASA Astrophysics Data System (ADS)

    Tan, H.; Collins, J. W.; Walby, M.; Hennig, W.; Warburton, W. K.; Grudberg, P.

    2012-06-01

    Different techniques have been developed for reading out microcalorimeter sensor arrays: individual outputs for small arrays, and time-division or frequency-division or code-division multiplexing for large arrays. Typically, raw waveform data are first read out from the arrays using one of these techniques and then stored on computer hard drives for offline optimum filtering, leading not only to requirements for large storage space but also limitations on achievable count rate. Thus, a read-out module that is capable of processing microcalorimeter signals in real time will be highly desirable. We have developed multichannel digital signal processing electronics that are capable of on-board, real time processing of microcalorimeter sensor signals from multiplexed or individual pixel arrays. It is a 3U PXI module consisting of a standardized core processor board and a set of daughter boards. Each daughter board is designed to interface a specific type of microcalorimeter array to the core processor. The combination of the standardized core plus this set of easily designed and modified daughter boards results in a versatile data acquisition module that not only can easily expand to future detector systems, but is also low cost. In this paper, we first present the core processor/daughter board architecture, and then report the performance of an 8-channel daughter board, which digitizes individual pixel outputs at 1 MSPS with 16-bit precision. We will also introduce a time-division multiplexing type daughter board, which takes in time-division multiplexing signals through fiber-optic cables and then processes the digital signals to generate energy spectra in real time.

  12. The brightness of lights on Earth at night, digitally recorded by DMSP satellite

    USGS Publications Warehouse

    Croft, Thomas A.

    1979-01-01

    The U.S. Air Force has operated its Defense Meteorological Satellite Program (DMSP) for nearly a decade, and film images from the system have been openly available since 1973. Films are well suited for the study of weather, and users of such films have derived much useful data. For many potential remote sensing applications, however, a quantitative measurement of the brightness of the imaged light patterns is needed, and it cannot be extracted with adequate accuracy from the films. Such information is contained in the telemetry from the spacecraft and is retained on digital tapes, which store the images for a few days while they await filming. For practical reasons, it has not heretofore been feasible for the Air Force to provide a remote-sensing user with these digital data, and the quantitative brightness information has been lost with the erasure of tapes for re-use. For the purpose of evaluating the tapes as a means for remote sensing, the Air Force recently provided to the author six examples containing records of nighttime DMSP imagery similar to that which has previously been evaluated by SRI International in a film format. The digital data create many new applications for these images, owing to a combination of several factors, the most important of which are the preservation of photometric information and of full spatial resolution. In this evaluation, stress has been placed upon determination of the broad potential value of the data rather than the full exploitation of any one aspect of it. The effort was guided by an objective to develop handling methods for the vast body of numbers--methods which will be practical for use in a research or engineering environment where budgets are limited, and specialized capabilities and image reproduction equipment have not already been developed. We report the degree of success obtained in this effort, pointing out the relative strengths and the relative limitations, as compared to the sophisticated, weather

  13. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  14. Spiral Light Beams and Contour Image Processing

    NASA Astrophysics Data System (ADS)

    Kishkin, Sergey A.; Kotova, Svetlana P.; Volostnikov, Vladimir G.

    Spiral beams of light are characterized by their ability to remain structurally unchanged under propagation. They may have the shape of any closed curve. In the present paper a new approach is proposed within the framework of contour analysis, based on a close interplay of modern coherent optics, the theory of functions and numerical methods. An algorithm for comparing contours is presented and theoretically justified, which allows one to determine whether two contours are similar to within a scale factor and/or rotation. The advantages and disadvantages of the proposed approach are considered; the results of numerical modeling are presented.

  15. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW v2 and Siemens OPVIEW v1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  16. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.
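    One of the tasks named above, pulse folding, can be illustrated with a short sketch. The routine below folds a detected time series at a known period in NumPy; it is a generic illustration and not DSPSR code, and the period, sample interval and synthetic signal are arbitrary assumptions.

```python
import numpy as np

def fold_profile(samples, sample_interval, period, n_bins=128):
    """Fold a detected time series at a known pulse period.

    samples         : 1-D array of detected intensities
    sample_interval : seconds per sample
    period          : pulse period in seconds
    n_bins          : number of phase bins in the folded profile
    """
    t = np.arange(len(samples)) * sample_interval
    phase_bin = ((t % period) / period * n_bins).astype(int)
    profile = np.bincount(phase_bin, weights=samples, minlength=n_bins)
    counts = np.bincount(phase_bin, minlength=n_bins)
    return profile / np.maximum(counts, 1)

# toy usage: a weak periodic pulse buried in noise
rng = np.random.default_rng(2)
dt, p = 1e-3, 0.089
t = np.arange(200000) * dt
signal = 0.2 * (np.abs((t % p) / p - 0.5) < 0.02) + rng.standard_normal(t.size)
print(fold_profile(signal, dt, p).max())   # the folded pulse stands out
```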

  17. Production Process for Strong, Light Ceramic Tiles

    NASA Technical Reports Server (NTRS)

    Holmquist, G. R.; Cordia, E. R.; Tomer, R. S.

    1985-01-01

    The proportions of ingredients and the sintering time/temperature schedule were changed. The production process for the lightweight, high-strength ceramic insulating tiles for the Space Shuttle is more than just a scaled-up version of the laboratory process for making small tiles. Boron in the aluminum borosilicate fibers allows fusion at points where fibers contact each other during sintering, thereby greatly strengthening the tile structure.

  18. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  19. Digital Image Processing Overview For Helmet Mounted Displays

    NASA Astrophysics Data System (ADS)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.

  20. Digital SAR processing using a fast polynomial transform

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Lipes, R. G.; Butman, S. A.; Reed, I. S.; Rubin, A. L.

    1984-01-01

    A new digital processing algorithm based on the fast polynomial transform is developed for producing images from Synthetic Aperture Radar data. This algorithm enables the computation of the two dimensional cyclic correlation of the raw echo data with the impulse response of a point target, thereby reducing distortions inherent in one dimensional transforms. This SAR processing technique was evaluated on a general-purpose computer and an actual Seasat SAR image was produced. However, regular production runs will require a dedicated facility. It is expected that such a new SAR processing algorithm could provide the basis for a real-time SAR correlator implementation in the Deep Space Network. Previously announced in STAR as N82-11295
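    The core operation named above, two-dimensional cyclic correlation of the raw echo data with a point-target response, can be illustrated via the ordinary 2-D FFT. The sketch below shows the operation only; the paper's fast polynomial transform computes the same result by a different route, and the array sizes are illustrative assumptions.

```python
import numpy as np

def cyclic_correlation_2d(echo, reference):
    """Two-dimensional cyclic (circular) correlation computed via the 2-D FFT."""
    E = np.fft.fft2(echo)
    R = np.fft.fft2(reference)
    return np.real(np.fft.ifft2(E * np.conj(R)))

# toy usage: correlate "echo" data against a point-target response
rng = np.random.default_rng(3)
reference = rng.standard_normal((64, 64))
echo = np.roll(np.roll(reference, 5, axis=0), 9, axis=1)   # shifted copy
corr = cyclic_correlation_2d(echo, reference)
print(np.unravel_index(np.argmax(corr), corr.shape))       # expected (5, 9)
```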

  1. Signal propagation in cortical networks: a digital signal processing approach.

    PubMed

    Rodrigues, Francisco Aparecido; da Fontoura Costa, Luciano

    2009-01-01

    This work reports a digital signal processing approach to representing and modeling transmission and combination of signals in cortical networks. The signal dynamics is modeled in terms of diffusion, which allows the information processing undergone between any pair of nodes to be fully characterized in terms of a finite impulse response (FIR) filter. Diffusion without and with time decay are investigated. All filters underlying the cat and macaque cortical organization are found to be of low-pass nature, allowing the cortical signal processing to be summarized in terms of the respective cutoff frequencies (a high cutoff frequency meaning little alteration of signals through their intermixing). Several findings are reported and discussed, including the fact that the incorporation of temporal activity decay tends to provide more diversified cutoff frequencies. Different filtering intensity is observed for each community in those networks. In addition, the brain regions involved in object recognition tend to present the highest cutoff frequencies for both the cat and macaque networks.

  2. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to visualize the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities for use as thermal and hydrodynamic tracers. In biomedical situations, e.g., skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are also presented.

  3. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  4. Incremental terrain processing for large digital elevation models

    NASA Astrophysics Data System (ADS)

    Ye, Z.

    2012-12-01

    Efficient analyses of large digital elevation models (DEM) require generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEMs to be analyzed have a large number of grid cells (usually > 1,000,000,000), the generation of these DEM derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned into smaller processing units that can be efficiently processed. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are to be derived and the order in which they are to be processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront due to the size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis, i.e., the flow accumulation operation, to each of the processing units at the full-resolution DEM. (5) As each processing unit is processed based on the processing order defined
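    Step (1) of the procedure above amounts to coarsening the grid. The sketch below shows one simple way to do this with block averaging in NumPy; the function name, the choice of averaging and the grid sizes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def block_resample(dem, factor):
    """Coarsen a DEM by averaging factor x factor blocks of cells.

    Corresponds to step (1): reduce the cell count so that the overall
    drainage pattern can be derived on the coarse grid.
    """
    rows, cols = dem.shape
    rows -= rows % factor          # trim edges that do not fill a block
    cols -= cols % factor
    trimmed = dem[:rows, :cols]
    return trimmed.reshape(rows // factor, factor,
                           cols // factor, factor).mean(axis=(1, 3))

# toy usage: coarsen a synthetic 1000 x 1000 grid by a factor of 10
dem = np.random.default_rng(4).random((1000, 1000)).astype(np.float32)
print(block_resample(dem, 10).shape)   # (100, 100)
```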

  5. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software-based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated, provided the equalizer is sufficiently long; its tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has better linewidth tolerance than phase-locked loops and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
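    A common way to compensate CD with a static filter is to apply the inverse dispersion phase in the frequency domain. The sketch below is a minimal, single-polarization illustration under stated assumptions (standard beta2 = -D*lambda^2/(2*pi*c) conversion, ideal sampling); the sign of the exponent depends on the FFT and propagation conventions used, and none of it is taken from the record above.

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def cd_compensate(samples, sample_rate, D_ps_nm_km, length_km, wavelength_nm=1550.0):
    """Static frequency-domain chromatic-dispersion equalizer (all-pass filter)."""
    lam = wavelength_nm * 1e-9
    D = D_ps_nm_km * 1e-12 / (1e-9 * 1e3)      # convert ps/(nm*km) to s/m^2
    beta2 = -D * lam**2 / (2.0 * np.pi * C)    # group-velocity dispersion, s^2/m
    L = length_km * 1e3
    omega = 2.0 * np.pi * np.fft.fftfreq(len(samples), d=1.0 / sample_rate)
    H = np.exp(-0.5j * beta2 * L * omega**2)   # flip the sign if CD doubles instead of cancelling
    return np.fft.ifft(np.fft.fft(samples) * H)

# toy usage: 1000 km of 17 ps/(nm*km) fiber, samples at 56 GSa/s (illustrative values)
rng = np.random.default_rng(5)
rx = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
print(cd_compensate(rx, 56e9, 17.0, 1000.0).shape)
```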

  6. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  7. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1982-05-03

    artifact noise. However, the deblurring spatial filter that we used was for a narrow spectral band centered at 5154 A green light. To compensate for the scaling... Keywords: white-light processing, white-light holography, image processing, optical signal processing, image subtraction, image deblurring. ...Using this optical processing technique, we have shown that the incoherent-source technique provides better image quality and very low coherent artifact noise

  8. Understanding the Physical Optics Phenomena by Using a Digital Application for Light Propagation

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, Daniel-Esteban; Ángel-Toro, Luciano

    2011-01-01

    Understanding light propagation on the basis of the Huygens-Fresnel principle is a fundamental factor for deeper comprehension of different physical-optics phenomena such as diffraction, self-imaging, image formation, Fourier analysis and spatial filtering. This constitutes the physical approach of Fourier optics, whose principles and applications have been developed since the 1950s. For both analytical and digital application purposes, light propagation can be formulated in terms of the Fresnel integral transform. In this work, a digital optics application based on the implementation of the Discrete Fresnel Transform (DFT), intended to serve as a tool for the didactics of optics, is presented. This tool allows users, at a basic and intermediate learning level, to practice identifying basic phenomena and to observe changes associated with modifications of physical parameters. This is achieved through a friendly graphic user interface (GUI). It also assists users in developing their capacity for abstracting and predicting the characteristics of more complicated phenomena. At an upper level of learning, the application can be used to foster a deeper comprehension of the underlying physics and models, and to experiment with new models and configurations. To achieve this, two characteristics of the didactic tool were taken into account when designing it. First, all physical operations, ranging from simple diffraction experiments to digital holography and interferometry, were developed on the basis of the more fundamental concept of light propagation. Second, the algorithm was conceived to be easily upgradable due to its modular architecture based on the MATLAB® software environment. Typical results are presented and briefly discussed in connection with the didactics of optics.
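    The propagation engine such a tool relies on can be sketched in a few lines. The function below is a minimal, illustrative Fresnel transfer-function propagator (a standard Fourier-optics construction, not the MATLAB code of the application described above); names and parameter values are assumptions.

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z using the Fresnel
    transfer function H = exp(j*k*z) * exp(-j*pi*lambda*z*(fx^2 + fy^2))."""
    n, m = field.shape
    fy = np.fft.fftfreq(n, d=dx)              # spatial frequencies (rows)
    fx = np.fft.fftfreq(m, d=dx)              # spatial frequencies (columns)
    FX, FY = np.meshgrid(fx, fy)
    k = 2.0 * np.pi / wavelength
    H = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# toy usage: diffraction of a square aperture under plane-wave illumination
N, dx, lam = 512, 10e-6, 633e-9
aperture = np.zeros((N, N), dtype=complex)
aperture[N//2-25:N//2+25, N//2-25:N//2+25] = 1.0
intensity = np.abs(fresnel_propagate(aperture, lam, dx, 0.2))**2
print(intensity.max())
```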

  9. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  10. Focusing light through scattering media by polarization modulation based generalized digital optical phase conjugation

    NASA Astrophysics Data System (ADS)

    Yang, Jiamiao; Shen, Yuecheng; Liu, Yan; Hemphill, Ashton S.; Wang, Lihong V.

    2017-11-01

    Optical scattering prevents light from being focused through thick biological tissue at depths greater than ˜1 mm. To break this optical diffusion limit, digital optical phase conjugation (DOPC) based wavefront shaping techniques are being actively developed. Previous DOPC systems employed spatial light modulators that modulated either the phase or the amplitude of the conjugate light field. Here, we achieve optical focusing through scattering media by using polarization modulation based generalized DOPC. First, we describe an algorithm to extract the polarization map from the measured scattered field. Then, we validate the algorithm through numerical simulations and find that the focusing contrast achieved by polarization modulation is similar to that achieved by phase modulation. Finally, we build a system using an inexpensive twisted nematic liquid crystal based spatial light modulator (SLM) and experimentally demonstrate light focusing through 3-mm thick chicken breast tissue. Since polarization modulation based SLMs are widely used in displays and offer ever-higher pixel counts with the prevalence of 4K displays, they are inexpensive and valuable devices for wavefront shaping.
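    The basic idea of focusing by playing back a conjugated field can be illustrated with a scalar transmission-matrix toy model. The sketch below shows conventional field conjugation only, not the polarization-modulation algorithm of the record above; the matrix statistics, sizes and the expected enhancement of roughly N are standard textbook assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Random complex transmission matrix linking N input pixels to M output speckles.
N, M, target = 256, 256, 0
T = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2 * N)

# Unshaped input: a plane wave produces a speckle pattern at the output.
flat = np.ones(N) / np.sqrt(N)
speckle = np.abs(T @ flat) ** 2

# Conjugation step: play back the conjugate of the field the target pixel
# "emitted" through the medium, i.e. the conjugated row of T.
conjugated = np.conj(T[target, :])
conjugated /= np.linalg.norm(conjugated)
focused = np.abs(T @ conjugated) ** 2

print(f"enhancement = {focused[target] / speckle.mean():.1f}")   # roughly N in theory
```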

  11. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1985-07-29

    this technique is that the processing system does not need to carry its own light source. It is very suitable for spaceborne and satellite applications. We... developed a technique for generating a spatial-frequency color-coded speech spectrogram with a white-light optical system. This system not only offers a low... that the annoying moire fringes can be eliminated. In short, we have once again demonstrated the versatility of the white-light processing system; a

  12. White light scanner-based repeatability of 3-dimensional digitizing of silicon rubber abutment teeth impressions

    PubMed Central

    Jeon, Jin-Hun; Lee, Kyung-Tak; Kim, Hae-Young; Kim, Ji-Hwan

    2013-01-01

    PURPOSE The aim of this study was to evaluate the repeatability of the digitizing of silicon rubber impressions of abutment teeth by using a white light scanner and to compare differences in repeatability between different abutment tooth types. MATERIALS AND METHODS Silicon rubber impressions of a canine, premolar, and molar tooth were each digitized 8 times using a white light scanner, and 3D surface models were created using the point clouds. The size of any discrepancy between each model and the corresponding reference tooth was measured, and the distribution of these values was analyzed with inspection software (PowerInspect 2012, Delcam plc, Birmingham, UK). Absolute values of discrepancies were analyzed by the Kruskal-Wallis test and multiple comparisons (α=.05). RESULTS The discrepancies of the impressions for the canine, premolar, and molar teeth were 6.3 µm (95% confidence interval [CI], 5.4-7.2), 6.4 µm (95% CI, 5.3-7.6), and 8.9 µm (95% CI, 8.2-9.5), respectively. The discrepancy of the molar tooth impression was significantly higher than that of the other tooth types. The largest variation (as mean [SD]) in discrepancies was seen in the premolar tooth impression scans: 26.7 µm (95% CI, 19.7-33.8); followed by the canine and molar teeth impressions, 16.3 µm (95% CI, 15.3-17.3) and 14.0 µm (95% CI, 12.3-15.7), respectively. CONCLUSION The repeatability of digitizing silicon rubber impressions of abutment teeth by using a white light scanner was improved compared to that with a laser scanner, showing only a low mean discrepancy of between 6.3 µm and 8.9 µm, which is within a clinically acceptable range. The premolar impression, with its long and narrow shape, showed a significantly larger discrepancy than the canine and molar impressions. Further work is needed to increase the digitizing performance of the white light scanner for deep and slender impressions. PMID:24353885

  13. NIR light propagation in a digital head model for traumatic brain injury (TBI)

    PubMed Central

    Francis, Robert; Khan, Bilal; Alexandrakis, George; Florence, James; MacFarlane, Duncan

    2015-01-01

    Near infrared spectroscopy (NIRS) is capable of detecting and monitoring acute changes in cerebral blood volume and oxygenation associated with traumatic brain injury (TBI). Wavelength selection, source-detector separation, optode density, and detector sensitivity are key design parameters that determine the imaging depth, chromophore separability, and, ultimately, clinical usefulness of a NIRS instrument. We present simulation results of NIR light propagation in a digital head model as it relates to the ability to detect intracranial hematomas and monitor the peri-hematomal tissue viability. These results inform NIRS instrument design specific to TBI diagnosis and monitoring. PMID:26417498

  14. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
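    One of the geometric corrections named above, the slant-range to ground-range conversion, can be sketched compactly. The routine below is an illustrative NumPy version using the flat-seafloor relation ground = sqrt(slant^2 - altitude^2); the function name, units and resampling choice are assumptions and this is not the GLORIA production software.

```python
import numpy as np

def slant_to_ground_range(scan_line, altitude, slant_pixel_size):
    """Resample one sonar scan line from slant-range to ground-range geometry.

    scan_line        : 1-D array of backscatter samples along slant range
    altitude         : sonar height above the seafloor (same unit as pixel size)
    slant_pixel_size : slant-range distance covered by one sample
    """
    n = len(scan_line)
    slant = np.arange(n) * slant_pixel_size
    valid = slant > altitude                 # beyond the water-column (nadir) region
    ground = np.sqrt(slant[valid] ** 2 - altitude ** 2)
    # resample onto a regular ground-range grid with the same pixel size
    grid = np.arange(ground[0], ground[-1], slant_pixel_size)
    return np.interp(grid, ground, scan_line[valid])

# toy usage
line = np.random.default_rng(7).random(1000)
print(slant_to_ground_range(line, altitude=120.0, slant_pixel_size=1.0).shape)
```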

  15. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS database management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.

  16. Deep-tissue focal fluorescence imaging with digitally time-reversed ultrasound-encoded light

    PubMed Central

    Wang, Ying Min; Judkewitz, Benjamin; DiMarzio, Charles A.; Yang, Changhuei

    2012-01-01

    Fluorescence imaging is one of the most important research tools in biomedical sciences. However, scattering of light severely impedes imaging of thick biological samples beyond the ballistic regime. Here we directly show focusing and high-resolution fluorescence imaging deep inside biological tissues by digitally time-reversing ultrasound-tagged light with high optical gain (~5×10⁵). We confirm the presence of a time-reversed optical focus along with a diffuse background—a corollary of partial phase conjugation—and develop an approach for dynamic background cancellation. To illustrate the potential of our method, we image complex fluorescent objects and tumour microtissues at an unprecedented depth of 2.5 mm in biological tissues at a lateral resolution of 36 μm×52 μm and an axial resolution of 657 μm. Our results set the stage for a range of deep-tissue imaging applications in biomedical research and medical diagnostics. PMID:22735456

  17. White Light Optical Processing and Holography.

    DTIC Science & Technology

    1982-10-01

    of the object beam. The major problem in image deblurring is noise in the deblurred image. There are two kinds of noise: (a) false images. The... reducing the noise; this work is described in Sec. 3.2. We addressed the bias buildup and SNR in incoherent optical processing, making an analysis that... system is generally better than the coherent one for SNR. Thus, if we have a sensitive, low-noise detector at the output of an incoherent system, we should

  18. Optimized light sharing for high-resolution TOF PET detector based on digital silicon photomultipliers.

    PubMed

    Marcinkowski, R; España, S; Van Holen, R; Vandenberghe, S

    2014-12-07

    The majority of current whole-body PET scanners are based on pixelated scintillator arrays with a transverse pixel size of 4 mm. However, recent studies have shown that decreasing the pixel size to 2 mm can significantly improve image spatial resolution. In this study, the performance of the Digital Photon Counter (DPC) from Philips Digital Photon Counting (PDPC) was evaluated to determine its potential for high-resolution whole-body time-of-flight (TOF) PET scanners. Two detector configurations were evaluated. First, the DPC3200-44-22 DPC array was coupled to a LYSO block of 15 × 15 pixels of 2 × 2 × 22 mm³ through a 1 mm thick light guide. Due to light sharing among the dies, the neighbour logic of the DPC was used. In a second setup the same DPC was coupled directly to a scalable 4 × 4 LYSO matrix of 1.9 × 1.9 × 22 mm³ crystals with a dedicated reflector arrangement allowing for controlled light sharing patterns inside the matrix. With the first approach an average energy resolution of 14.5% and an average CRT of 376 ps were achieved. For the second configuration an average energy resolution of 11% and an average CRT of 295 ps were achieved. Our studies show that the DPC is a suitable photosensor for a high-resolution TOF-PET detector. The dedicated reflector arrangement allows one to achieve better performance than the light guide approach. The count loss caused by dark counts is overcome by fitting the matrix size to the size of a DPC single die.

  19. Creating an EPICS Based Test Stand Development System for a BPM Digitizer of the Linac Coherent Light Source

    SciTech Connect

    Not Available

    2011-06-22

    The Linac Coherent Light Source (LCLS) is required to deliver a high quality electron beam for producing coherent X-rays. As a result, high resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use the Experimental Physics and Industrial Control System (EPICS). This paper discusses the transition to providing similar as well as enhanced functionalities, compared with those offered by Matlab, to test the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.
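    The waveform statistics and FFT mentioned above can be sketched in a few lines of NumPy. The routine below is purely illustrative; the function name, the chosen statistics and the sample rate in the usage example are assumptions, not details of the LCLS test stand.

```python
import numpy as np

def waveform_stats_and_fft(waveform, sample_rate):
    """Return basic statistics and the amplitude spectrum of a captured waveform."""
    wf = np.asarray(waveform, dtype=float)
    stats = {
        "mean": wf.mean(),
        "rms": np.sqrt(np.mean(wf**2)),
        "peak_to_peak": wf.max() - wf.min(),
    }
    spectrum = np.abs(np.fft.rfft(wf)) / len(wf)
    freqs = np.fft.rfftfreq(len(wf), d=1.0 / sample_rate)
    return stats, freqs, spectrum

# toy usage: a 3 MHz tone plus noise, sampled at an illustrative 100 MSa/s
fs, n = 100e6, 4096
t = np.arange(n) / fs
wf = np.sin(2 * np.pi * 3e6 * t) + 0.01 * np.random.default_rng(8).standard_normal(n)
stats, freqs, spec = waveform_stats_and_fft(wf, fs)
print(stats, freqs[np.argmax(spec[1:]) + 1])   # peak frequency near 3 MHz
```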

  20. Digital processing of radiographic images for print publication.

    PubMed

    Cockerill, James W

    2002-01-01

    Digital imaging of X-rays yields high-quality, evenly exposed negatives and prints. This article outlines the materials and methods of the technique and discusses the advantages of digital radiographic images.

  1. Memory for light as a quantum process.

    PubMed

    Lobino, M; Kupchak, C; Figueroa, E; Lvovsky, A I

    2009-05-22

    We report complete characterization of an optical memory based on electromagnetically induced transparency. We recover the superoperator associated with the memory, under two different working conditions, by means of a quantum process tomography technique that involves storage of coherent states and their characterization upon retrieval. In this way, we can predict the quantum state retrieved from the memory for any input, for example, the squeezed vacuum or the Fock state. We employ the acquired superoperator to verify the nonclassicality benchmark for the storage of a Gaussian distributed set of coherent states.

  2. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
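    The morphological step mentioned above can be illustrated with a simplified sketch: a white top-hat transform suppresses the slowly varying background so that small bright structures stand out, after which a simple global threshold is applied. This is a generic, assumption-level illustration, not the authors' algorithm; the spot size, threshold rule and synthetic image are hypothetical.

```python
import numpy as np
from scipy import ndimage

def detect_bright_spots(mammo, spot_size=5, k=4.0):
    """Highlight small bright structures (microcalcification-like spots).

    A white top-hat removes the slowly varying background, then a global
    threshold at mean + k*std keeps the brightest residues.
    """
    tophat = ndimage.white_tophat(mammo.astype(float), size=spot_size)
    mask = tophat > tophat.mean() + k * tophat.std()
    labels, n_spots = ndimage.label(mask)
    return labels, n_spots

# toy usage: three bright dots on a smooth synthetic background
rng = np.random.default_rng(9)
img = ndimage.gaussian_filter(rng.random((256, 256)), 20) * 100
for r, c in [(50, 60), (120, 200), (180, 90)]:
    img[r, c] += 25.0
labels, n = detect_bright_spots(img, spot_size=3)
print(n)   # expected: 3
```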

  3. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  4. Digital Signal Processing For Low Bit Rate TV Image Codecs

    NASA Astrophysics Data System (ADS)

    Rao, K. R.

    1987-06-01

    In view of the 56 KBPS digital switched network services and the ISDN, low bit rate codecs for providing real time full motion color video are under various stages of development. Some companies have already brought such codecs to the market. They are being used by industry and some Federal Agencies for video teleconferencing. In general, these codecs have various features such as multiplexing audio and data, high resolution graphics, encryption, error detection and correction, self diagnostics, freeze-frame, split video, text overlay, etc. To transmit the original color video on a 56 KBPS network requires bit rate reduction of the order of 1400:1. Such a large scale bandwidth compression can be realized only by implementing a number of sophisticated digital signal processing techniques. This paper provides an overview of such techniques and outlines the newer concepts that are being investigated. Before resorting to the data compression techniques, various preprocessing operations such as noise filtering, composite-component transformation and horizontal and vertical blanking interval removal are to be implemented. Invariably spatio-temporal subsampling is achieved by appropriate filtering. Transform and/or prediction coupled with motion estimation and strengthened by adaptive features are some of the tools in the arsenal of the data reduction methods. Other essential blocks in the system are the quantizer, bit allocation, buffer, multiplexer, channel coding, etc.

  5. Measurement of action spectra of light-activated processes

    NASA Astrophysics Data System (ADS)

    Ross, Justin; Zvyagin, Andrei V.; Heckenberg, Norman R.; Upcroft, Jacqui; Upcroft, Peter; Rubinsztein-Dunlop, Halina H.

    2006-01-01

    We report on a new experimental technique suitable for measurement of light-activated processes, such as fluorophore transport. The usefulness of this technique is derived from its capacity to decouple the imaging and activation processes, allowing fluorescent imaging of fluorophore transport at a convenient activation wavelength. We demonstrate the efficiency of this new technique in determination of the action spectrum of the light mediated transport of rhodamine 123 into the parasitic protozoan Giardia duodenalis.

  6. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  7. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  8. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  9. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  10. Implementation of real-time digital signal processing systems

    NASA Technical Reports Server (NTRS)

    Narasimha, M.; Peterson, A.; Narayan, S.

    1978-01-01

    Special purpose hardware implementation of DFT computers and digital filters is considered in the light of newly introduced algorithms and IC devices. Recent work by Winograd on high-speed convolution techniques for computing short-length DFTs has motivated the development of more efficient algorithms, compared to the FFT, for evaluating the transform of longer sequences. Among these, prime factor algorithms appear suitable for special purpose hardware implementations. Architectural considerations in designing DFT computers based on these algorithms are discussed. With the availability of monolithic multiplier-accumulators, a direct implementation of IIR and FIR filters, using random access memories in place of shift registers, appears attractive. The memory addressing scheme involved in such implementations is discussed. A simple counter set-up to address the data memory in the realization of FIR filters is also described. The combination of a set of simple filters (weighting network) and a DFT computer is shown to realize a bank of uniform bandpass filters. The usefulness of this concept in arriving at a modular design for a million channel spectrum analyzer, based on microprocessors, is discussed.
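    The idea of replacing a shift register with a random access memory and a modular address counter can be sketched in software. The class below is an illustrative Python model of a direct-form FIR filter with circular-buffer addressing; it is a conceptual sketch under stated assumptions, not a description of the hardware discussed in the record.

```python
import numpy as np

class FIRFilter:
    """Direct-form FIR filter using a circular buffer in place of a shift register."""

    def __init__(self, taps):
        self.taps = np.asarray(taps, dtype=float)
        self.buffer = np.zeros_like(self.taps)   # plays the role of the data RAM
        self.ptr = 0                             # write-address counter

    def process(self, x):
        # overwrite the oldest sample instead of shifting the whole buffer
        self.buffer[self.ptr] = x
        n = len(self.taps)
        # read samples in newest-to-oldest order via modular addressing
        idx = (self.ptr - np.arange(n)) % n
        y = float(np.dot(self.taps, self.buffer[idx]))
        self.ptr = (self.ptr + 1) % n
        return y

# toy usage: 4-tap moving average
fir = FIRFilter([0.25, 0.25, 0.25, 0.25])
print([round(fir.process(v), 3) for v in [1, 1, 1, 1, 0, 0]])
# -> [0.25, 0.5, 0.75, 1.0, 0.75, 0.5]
```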

  11. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography, a total of 201 cases consisting of nonprocessed soft-copy versions of digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations: the manufacturer's default (applied and laser printed to film by each of the manufacturers), MUSICA, and PLAHE, which were presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen film for all digital presentations. Specificity for Fischer digital calcification cases was worse than screen film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than screen film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.

  12. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  13. Edge detection - Image-plane versus digital processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
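    For reference, the basic digital Laplacian-of-Gaussian / zero-crossing operation discussed above can be sketched as follows. This is a generic textbook-style illustration, not the Wiener-Laplacian-of-Gaussian filter or the image-plane processor of the record; the sigma value and the test image are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage

def log_edges(image, sigma=2.0):
    """Laplacian-of-Gaussian edge detection via zero crossings.

    Filter with gaussian_laplace, then mark pixels where the response
    changes sign against a horizontal or vertical neighbour.
    """
    response = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    sign = response > 0
    zc_h = sign[:, 1:] != sign[:, :-1]
    zc_v = sign[1:, :] != sign[:-1, :]
    edges = np.zeros_like(sign)
    edges[:, 1:] |= zc_h
    edges[1:, :] |= zc_v
    return edges

# toy usage: a bright square on a dark background
img = np.zeros((128, 128))
img[40:90, 40:90] = 1.0
print(log_edges(img).sum())   # number of edge pixels found
```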

  14. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    NASA Astrophysics Data System (ADS)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited shared resource, nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum; consequently, they want to be assured that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make an efficient use of the spectrum and meet the electromagnetic compatibility regulations. Therefore, the competent authorities are called on to control access to the spectrum by adopting suitable management and monitoring policies, while manufacturers have to periodically verify the correct operation of their apparatuses. Several measurement solutions are present on the market. They generally refer to real-time spectrum analyzers and measurement receivers. Both are characterized by good metrological accuracy but have costs, dimensions and weights that make use in the field impractical. The paper presents a first step in realizing a digital signal processing based measurement instrument able to suitably address the above-mentioned needs. In particular, attention has been given to the DSP-based measurement section of the instrument. To these ends, an innovative measurement method for spectrum monitoring and management is proposed in this paper. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main issues are in particular pursued: (i) measurement performance comparable to that exhibited by other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
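    One possible way to perform sequential, sample-by-sample spectral analysis, offered purely as an illustration and not as the authors' algorithm, is a sliding DFT that updates only the monitored bins as each new sample arrives. The window length, monitored bin and test tone below are assumptions.

```python
import numpy as np

def sliding_dft(samples, n_window, bins):
    """Sample-by-sample sliding DFT for a few monitored frequency bins.

    Each new sample updates the selected bins in O(len(bins)) operations
    using S_k[n] = (S_k[n-1] + x[n] - x[n-N]) * exp(j*2*pi*k/N).
    """
    bins = np.asarray(bins)
    twiddle = np.exp(2j * np.pi * bins / n_window)
    state = np.zeros(len(bins), dtype=complex)
    history = np.zeros(n_window)
    out = []
    for n, x in enumerate(samples):
        oldest = history[n % n_window]       # x[n - N] (zero until the window fills)
        state = (state + (x - oldest)) * twiddle
        history[n % n_window] = x
        out.append(np.abs(state) / n_window)
    return np.array(out)

# toy usage: watch bin 8 of a 64-point window while a tone sits on that bin
fs, N = 64000.0, 64
t = np.arange(1024) / fs
x = np.cos(2 * np.pi * 8000.0 * t)           # 8 kHz maps to bin 8 for a 64-point DFT
mag = sliding_dft(x, N, bins=[8])
print(mag[-1])   # roughly 0.5 once the window is full
```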

  15. The central corneal light reflex ratio from photographs derived from a digital camera in young adults.

    PubMed

    Duangsang, Suampa; Tengtrisorn, Supaporn

    2012-05-01

    To determine the normal range of the Central Corneal Light Reflex Ratio (CCLRR) from photographs of young adults, a digital camera equipped with a telephoto lens, with a flash attachment placed directly above the lens, was used to obtain corneal light reflex photographs of 104 subjects, first with the subject fixating on the lens of the camera at a distance of 43 centimeters, and then while looking past the camera to a wall at a distance of 5.4 meters. Digital images were displayed using Adobe Photoshop at a magnification of 1200%. The CCLRR was the ratio of the sum of the distances between the inner margin of the cornea and the central corneal light reflex of each eye to the sum of the horizontal corneal diameters of each eye. Measurements were made by three technicians on all subjects, and repeated on a 16% (n=17) subsample. Mean ratios (standard deviation, SD) from near/distance measurements were 0.468 (0.012)/0.452 (0.019). Limits of the normal range, with 95% certainty, were 0.448 and 0.488 for near measurements and 0.419 and 0.484 for distance measurements. Lower and upper indeterminate zones were 0.440-0.447 and 0.489-0.497 for near measurements and 0.406-0.418 and 0.485-0.497 for distance measurements. More extreme values can be considered abnormal. The reproducibility and repeatability of the test were good. This method is easy to perform and has potential for use in strabismus screening by paramedical personnel.

  16. Symbol processing in the left angular gyrus: evidence from passive perception of digits.

    PubMed

    Price, Gavin R; Ansari, Daniel

    2011-08-01

    Arabic digits are one of the most ubiquitous symbol sets in the world. While there have been many investigations into the neural processing of the semantic information digits represent (e.g. through numerical comparison tasks), little is known about the neural mechanisms which support the processing of digits as visual symbols. To characterise the component neurocognitive mechanisms which underlie numerical cognition, it is essential to understand the processing of digits as a visual category, independent of numerical magnitude processing. The 'Triple Code Model' (Dehaene, 1992; Dehaene and Cohen, 1995) posits an asemantic visual code for processing Arabic digits in the ventral visual stream, yet there is currently little empirical evidence in support of this code. This outstanding question was addressed in the current functional Magnetic Resonance (fMRI) study by contrasting brain responses during the passive viewing of digits versus letters and novel symbols at short (50 ms) and long (500 ms) presentation times. The results of this study reveal increased activation for familiar symbols (digits and letters) relative to unfamiliar symbols (scrambled digits and letters) at long presentation durations in the left dorsal Angular gyrus (dAG). Furthermore, increased activation for Arabic digits was observed in the left ventral Angular gyrus (vAG) in comparison to letters, scrambled digits and scrambled letters at long presentation durations, but no digit specific activation in any region at short presentation durations. These results suggest an absence of a digit specific 'Visual Number Form Area' (VNFA) in the ventral visual cortex, and provide evidence for the role of the left ventral AG during the processing of digits in the absence of any explicit processing demands. We conclude that Arabic digit processing depends specifically on the left AG rather than a ventral visual stream VNFA. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. First light on a new fully digital camera based on SiPM for CTA SST-1M telescope

    NASA Astrophysics Data System (ADS)

    della Volpe, Domenico; Al Samarai, Imen; Alispach, Cyril; Bulik, Tomasz; Borkowski, Jerzy; Cadoux, Franck; Coco, Victor; Favre, Yannick; Grudzińska, Mira; Heller, Matthieu; Jamrozy, Marek; Kasperek, Jerzy; Lyard, Etienne; Mach, Emil; Mandat, Dusan; Michałowski, Jerzy; Moderski, Rafal; Montaruli, Teresa; Neronov, Andrii; Niemiec, Jacek; Njoh Ekoume, T. R. S.; Ostrowski, Michal; Paśko, Paweł; Pech, Miroslav; Rajda, Pawel; Rafalski, Jakub; Schovanek, Petr; Seweryn, Karol; Skowron, Krzysztof; Sliusar, Vitalii; Stawarz, Łukasz; Stodulska, Magdalena; Stodulski, Marek; Travnicek, Petr; Troyano Pujadas, Isaac; Walter, Roland; Zagdański, Adam; Zietara, Krzysztof

    2017-08-01

    The Cherenkov Telescope Array (CTA) will explore with unprecedented precision the Universe in the gamma-ray domain, covering an energy range from 50 GeV to more than 300 TeV. To cover such a broad range with a sensitivity ten times better than current instruments, different types of telescopes are needed: the Large Size Telescopes (LSTs), with a ~24 m diameter mirror; the Medium Size Telescopes (MSTs), with a ~12 m mirror; and the Small Size Telescopes (SSTs), with a ~4 m diameter mirror. The single mirror small size telescope (SST-1M), one of the proposed solutions to become part of the small-size telescopes of CTA, will be equipped with an innovative camera. The SST-1M has a Davies-Cotton optical design with a mirror dish of 4 m diameter and focal ratio 1.4, focussing the Cherenkov light produced in atmospheric showers onto a 90 cm wide hexagonal camera providing a FoV of 9 degrees. The camera is an innovative design based on silicon photomultipliers (SiPMs) and adopts a fully digital trigger and readout architecture. The camera features 1296 custom-designed large-area hexagonal SiPMs coupled to hollow optical concentrators to achieve a pixel size of almost 2.4 cm. The SiPM is a custom design developed with Hamamatsu and, with its active area of almost 1 cm², is one of the largest monolithic SiPMs in existence. The optical concentrators are also innovative, being light funnels made of a polycarbonate substrate coated with a custom-designed UV-enhancing coating. The analog signals coming from the SiPMs are fed into the fully digital readout electronics, where digital data are processed by high-speed FPGAs both for trigger and readout. The trigger logic, implemented in a Virtex 7 FPGA, uses the digital data to form a trigger decision by matching data against predefined patterns. This approach is extremely flexible and allows improvements and continued evolutions of the system. The prototype camera is being tested in the laboratory prior to its installation

  18. A novel digital pulse processing architecture for nuclear instrumentation

    SciTech Connect

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research, and new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method, while the literature proposes other algorithms, based on the frequency domain or wavelet theory, which show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian characteristic of the signal, composed of pulses with random arrival times, which requires current architectures to work in data flow. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to provide the performance necessary to deal with dead-time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a Digital Pulse Processing (DPP) architecture programmable in a high-level language such as C or C++, and able to reduce dead-time, would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise required to implement new algorithms. However, today's programmable solutions do not
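
    As a point of reference for the discussion above, the classic charge comparison method for neutron-gamma discrimination reduces to a ratio of integrated charges; the sketch below shows the principle (gate lengths and decay constants are illustrative assumptions, not values from the paper).

```python
import numpy as np

def charge_comparison_psd(pulse, peak_index, short_gate=20, long_gate=120):
    """Tail-to-total charge ratio of a baseline-subtracted pulse (larger -> neutron-like)."""
    total = pulse[peak_index:peak_index + long_gate].sum()
    tail = pulse[peak_index + short_gate:peak_index + long_gate].sum()
    return tail / total if total > 0 else 0.0

# Two synthetic pulses: the neutron-like one has a slower decay component.
t = np.arange(200.0)
gamma_pulse = np.exp(-t / 10.0)
neutron_pulse = 0.7 * np.exp(-t / 10.0) + 0.3 * np.exp(-t / 60.0)
print(charge_comparison_psd(gamma_pulse, 0), charge_comparison_psd(neutron_pulse, 0))
```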

  19. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For these measurements, tests are conducted in the laboratory, which can take many hours to give the final result. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and the correlation of these parameters, with regard to monitoring and prediction of activated sludge, are discussed. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
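
    A hedged sketch of the kind of processing chain reviewed here, using scikit-image: smoothing, Otsu thresholding and simple morphological descriptors of the segmented flocs. The parameter choices (sigma, minimum object size) and the synthetic test image are assumptions for illustration only.

```python
import numpy as np
from skimage import filters, measure, morphology

def floc_morphology(gray_image):
    """gray_image: 2D float array with flocs darker than the background."""
    smoothed = filters.gaussian(gray_image, sigma=2)
    mask = smoothed < filters.threshold_otsu(smoothed)         # flocs = dark regions
    mask = morphology.remove_small_objects(mask, min_size=64)  # drop pixel noise
    props = measure.regionprops(measure.label(mask))
    # Area and form factor are two descriptors often correlated with SVI/TSS.
    return [(p.area, 4 * np.pi * p.area / p.perimeter ** 2)
            for p in props if p.perimeter > 0]

# Synthetic example: two dark blobs on a bright background.
img = np.ones((200, 200))
img[40:90, 50:110] = 0.2
img[130:160, 120:170] = 0.3
print(floc_morphology(img))
```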

  20. Modeling and system design for the LOFAR station digital processing

    NASA Astrophysics Data System (ADS)

    Alliot, Sylvain; van Veelen, Martijn

    2004-09-01

    In the context of the LOFAR preliminary design phase, and in particular for the specification of the Station Digital Processing (SDP), a performance/cost model of the system was used. We present here the framework and the trajectory followed in this phase when going from requirements to specification. In the phased array antenna concepts for the next generation of radio telescopes (LOFAR, ATA, SKA), signal processing (multi-beaming and RFI mitigation) replaces the large antenna dishes. The embedded systems for these telescopes are major infrastructure cost items. Moreover, the flexibility and overall performance of the instrument depend greatly on them, therefore alternative solutions need to be investigated. In particular, the technology and the various data transport selections play a fundamental role in the optimization of the architecture. We proposed a formal method [1] of exploring these alternatives that has been followed during the SDP developments. Different scenarios were compared for the specification of the application (selection of the algorithms as well as detailed signal processing techniques) and in the specification of the system architecture (selection of high-level topologies, platforms and components). This gave us insight into the possible trade-offs in the application and architecture domains, and was successful in providing a firm basis for the design choices demanded by technical review committees.

  1. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.
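
    The magnitude and phase detection step can be illustrated with a plain software quadrature demodulator: each pixel's intermediate-frequency time series is mixed with cosine/sine references and averaged over whole cycles. The sample rate, IF and array shapes below are assumptions, not the LEI system's actual parameters.

```python
import numpy as np

def iq_demodulate(samples, f_if, f_sample):
    """samples: (n_frames, ny, nx) pixel time series at an intermediate frequency f_if."""
    n = samples.shape[0]
    t = np.arange(n) / f_sample
    ref_i = np.cos(2 * np.pi * f_if * t)[:, None, None]
    ref_q = -np.sin(2 * np.pi * f_if * t)[:, None, None]
    i = 2 * np.mean(samples * ref_i, axis=0)          # in-phase image
    q = 2 * np.mean(samples * ref_q, axis=0)          # quadrature image
    return np.hypot(i, q), np.arctan2(q, i)           # magnitude, phase images

# One "pixel" carrying a 5 kHz IF tone with 0.3 rad phase, sampled at 40 kHz.
t = np.arange(64)[:, None, None] / 40e3
cube = np.cos(2 * np.pi * 5e3 * t + 0.3)
mag, phase = iq_demodulate(cube, f_if=5e3, f_sample=40e3)
print(mag.ravel()[0], phase.ravel()[0])               # ~1.0 and ~0.3
```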

  2. Digital signal processing techniques for coherent optical communication

    NASA Astrophysics Data System (ADS)

    Goldfarb, Gilad

    Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically, and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection, which rely on the preservation of the full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. Analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise of high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented. Interaction between fiber dispersion and nonlinearity remains the last major challenge
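
    The frequency-domain view of chromatic dispersion compensation mentioned above can be sketched in a few lines: dispersion acts as an all-pass quadratic phase in frequency, so applying the conjugate phase restores the signal. The fiber parameters and the single-sample-per-symbol QPSK signal below are illustrative assumptions, not the thesis' experimental settings.

```python
import numpy as np

def dispersion_filter(field, beta2, length, f_sample):
    """Apply the all-pass quadratic phase exp(j*beta2/2*w^2*L) in the frequency domain."""
    w = 2 * np.pi * np.fft.fftfreq(field.size, d=1.0 / f_sample)   # rad/s
    return np.fft.ifft(np.fft.fft(field) * np.exp(0.5j * beta2 * length * w ** 2))

rng = np.random.default_rng(0)
qpsk = (2 * rng.integers(0, 2, 1024) - 1) + 1j * (2 * rng.integers(0, 2, 1024) - 1)
# "Propagate" over 80 km of fiber (beta2 ~ -21 ps^2/km), then compensate with the
# opposite-sign filter; attenuation and nonlinearity are ignored in this sketch.
dispersed = dispersion_filter(qpsk, beta2=-21e-27, length=80e3, f_sample=28e9)
restored = dispersion_filter(dispersed, beta2=+21e-27, length=80e3, f_sample=28e9)
print(np.max(np.abs(restored - qpsk)))    # ~1e-13: dispersion fully undone
```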

  3. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  4. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1983-05-03

    Processing, White-Light Holography, Image Subtraction, Image Deblurring, Coherence Requirement, Apparent Transfer Function, Source Encoding, Signal... In this period, the work also demonstrated several color image processing capabilities, among them broadband color image deblurring and color image subtraction. Report sections cover broadband image deblurring, color image subtraction, and rainbow holographic aberrations.

  5. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers applicable to digitized multispectral aerial photography, are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four band multispectral classification into healthy and declining categories.

  6. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as
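
    The phase-correction stage can be illustrated with a generic Mertz-style correction (a sketch under simplifying assumptions, not the GIFTS SM EDU algorithm): the phase function is estimated from the spectrum of the interferogram and then removed so that the corrected spectrum is purely real.

```python
import numpy as np

def phase_correct(interferogram):
    """Remove the instrument phase so the corrected spectrum is purely real."""
    spectrum = np.fft.rfft(interferogram)
    phase = np.unwrap(np.angle(spectrum))            # estimated phase function
    return np.real(spectrum * np.exp(-1j * phase))

# Synthetic test: a single emission band with an artificial 3-sample phase error.
n = 512
nu = np.fft.rfftfreq(n)
true_spectrum = np.exp(-((nu - 0.2) / 0.05) ** 2)
igm = np.fft.irfft(true_spectrum * np.exp(2j * np.pi * nu * 3.0), n)
print(np.allclose(phase_correct(igm), true_spectrum, atol=1e-6))   # True
```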

  7. Statistical mechanics of image processing by digital halftoning

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Norimatsu, Wataru; Saika, Yohei; Okada, Masato

    2009-03-01

    We consider the problem of digital halftoning (DH). DH is an image processing technique that represents each grayscale level in an image in terms of black and white dots, and it is achieved by making use of a threshold dither mask: each pixel is determined to be black if the grayscale value is greater than or equal to the mask value, and white otherwise. To determine the mask for a given grayscale image, we assume that human eyes might recognize the BW dots as the corresponding grayscale through linear filters. The Hamiltonian is then constructed as a distance between the original and recognized images, written in terms of the mask. Finding the ground state of the Hamiltonian via deterministic annealing, we obtain the optimal mask and the BW dots simultaneously. From the spectrum analysis, we find that the BW dots are desirable from the viewpoint of the modulation properties of human vision. We also show that the lower bound of the mean square error for the inverse process of DH is minimized on the Nishimori line, which is well known in the research field of spin glasses.
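
    The halftoning rule defined in the abstract is easy to state in code; the sketch below uses a standard Bayer ordered-dither mask as a stand-in for the optimized mask the paper obtains by annealing.

```python
import numpy as np

# Classic 4x4 Bayer ordered-dither mask, scaled to (0, 1).
bayer4 = (1 / 17.0) * np.array([[ 1,  9,  3, 11],
                                [13,  5, 15,  7],
                                [ 4, 12,  2, 10],
                                [16,  8, 14,  6]])

def halftone(gray, mask=bayer4):
    """gray: 2D array in [0, 1]; a pixel is black (1) when gray >= mask value."""
    h, w = gray.shape
    tiled = np.tile(mask, (h // mask.shape[0] + 1, w // mask.shape[1] + 1))[:h, :w]
    return (gray >= tiled).astype(np.uint8)

ramp = np.tile(np.linspace(0, 1, 64), (16, 1))   # horizontal gray ramp
print(halftone(ramp).mean())                     # dot density tracks the mean gray (~0.5)
```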

  8. Closed-loop optical stabilization and digital image registration in adaptive optics scanning light ophthalmoscopy

    PubMed Central

    Yang, Qiang; Zhang, Jie; Nozato, Koji; Saito, Kenichi; Williams, David R.; Roorda, Austin; Rossi, Ethan A.

    2014-01-01

    Eye motion is a major impediment to the efficient acquisition of high resolution retinal images with the adaptive optics (AO) scanning light ophthalmoscope (AOSLO). Here we demonstrate a solution to this problem by implementing both optical stabilization and digital image registration in an AOSLO. We replaced the slow scanning mirror with a two-axis tip/tilt mirror for the dual functions of slow scanning and optical stabilization. Closed-loop optical stabilization reduced the amplitude of eye-movement-related image motion by a factor of 10–15. The residual RMS error after optical stabilization alone was on the order of the size of foveal cones: ~1.66–2.56 μm or ~0.34–0.53 arcmin with typical fixational eye motion for normal observers. The full implementation, with real-time digital image registration, corrected the residual eye motion after optical stabilization with an accuracy of ~0.20–0.25 μm or ~0.04–0.05 arcmin RMS, which to our knowledge is more accurate than any method previously reported. PMID:25401030
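
    Digital image registration of this kind is commonly based on correlation peaks; the sketch below uses generic FFT phase correlation to recover a rigid translation between frames and is not the AOSLO real-time implementation (which registers strips at sub-pixel precision).

```python
import numpy as np

def phase_correlation_shift(reference, target):
    """Estimate the integer (dy, dx) roll that re-aligns target with reference."""
    F = np.fft.fft2(reference) * np.conj(np.fft.fft2(target))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > corr.shape[0] // 2: dy -= corr.shape[0]   # wrap to signed shifts
    if dx > corr.shape[1] // 2: dx -= corr.shape[1]
    return dy, dx

rng = np.random.default_rng(1)
ref = rng.random((128, 128))
tgt = np.roll(ref, shift=(5, -3), axis=(0, 1))        # simulated eye motion
print(phase_correlation_shift(ref, tgt))              # (-5, 3): roll tgt by this to re-align
```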

  9. Characterizing Chain Processes in Visible Light Photoredox Catalysis

    PubMed Central

    Cismesia, Megan A.

    2015-01-01

    The recognition that Ru(bpy)3(2+) and similar visible-light-absorbing transition metal complexes can be photocatalysts for a variety of synthetically useful organic reactions has resulted in a recent resurgence of interest in photoredox catalysis. However, many of the critical mechanistic aspects of this class of reactions remain poorly understood. In particular, the degree to which visible light photoredox reactions involve radical chain processes has been a point of some disagreement that has not been subjected to systematic analysis. We have now performed quantum yield measurements to demonstrate that three representative, mechanistically distinct photoredox processes involve product-forming chain reactions. Moreover, we show that the combination of quantum yield and luminescence quenching experiments provides a rapid method to estimate the length of these chains. Together, these measurements constitute a robust, operationally facile strategy for characterizing chain processes in a wide range of visible light photoredox reactions. PMID:26668708

  10. Knowledge and Processes That Predict Proficiency in Digital Literacy

    ERIC Educational Resources Information Center

    Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.

    2014-01-01

    Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…

  11. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  12. Micro-light-pipe array with an excitation attenuation filter for lensless digital enzyme-linked immunosorbent assay

    NASA Astrophysics Data System (ADS)

    Takehara, Hironari; Nagasaki, Mizuki; Sasagawa, Kiyotaka; Takehara, Hiroaki; Noda, Toshihiko; Tokuda, Takashi; Ohta, Jun

    2016-03-01

    Digital enzyme-linked immunosorbent assay (ELISA) is used for detecting various biomarkers with hypersensitivity. We have been developing compact systems by replacing the fluorescence microscope with a CMOS image sensor. Here, we propose a micro-light-pipe array structure made of metal filled with dye-doped resin, which can be used as a fabrication substrate of the micro-reaction-chamber array of digital ELISA. The possibility that this structure enhances the coupling efficiency for fluorescence was simulated using a simple model. To realize the structure, we fabricated a 30-µm-thick micropipe array by copper electroplating around a thick photoresist pattern. The typical diameter of each fabricated micropipe was 10 µm. The pipes were filled with yellow-dye-doped epoxy resin. The transmittance ratio of fluorescence and excitation light could be controlled by adjusting the doping concentration. We confirmed that an angled excitation light incidence suppressed the leakage of excitation light.

  13. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on a study of computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.
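
    The parametric-space (Hough) approach in item (2) can be illustrated for straight lines; real loop detection needs curved parameterisations, so the accumulator below only demonstrates the voting principle on a synthetic linear feature.

```python
import numpy as np

def hough_lines(binary, n_theta=180):
    """Accumulate votes in (rho, theta) space for every bright pixel."""
    ys, xs = np.nonzero(binary)
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    diag = int(np.hypot(*binary.shape)) + 1
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    return acc, thetas, diag

img = np.zeros((64, 64), dtype=bool)
img[np.arange(64), np.arange(64)] = True              # a 45-degree linear feature
acc, thetas, diag = hough_lines(img)
rho_i, th_i = np.unravel_index(np.argmax(acc), acc.shape)
print(np.degrees(thetas[th_i]), rho_i - diag)         # 135.0 deg, rho = 0
```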

  14. High-power laser phosphor light source with liquid cooling for digital cinema applications

    NASA Astrophysics Data System (ADS)

    Li, Kenneth

    2014-02-01

    Laser excitation of phosphor has been used to produce high intensity light output with smaller etendue than that of LEDs, while retaining the same long lifetime. But due to the high intensity of the laser light, phosphor with an organic binder burns at low power, which requires the phosphor to be deposited on a rotating wheel in practical applications. Phosphor with inorganic binders, commonly known as ceramic phosphor, on the other hand, does not burn, but its efficiency goes down as temperature goes up under high power excitation. This paper describes cooling schemes in sealed chambers such that phosphor materials using organic or inorganic binders can be liquid cooled for high efficiency operation. Confined air bubbles are introduced into the sealed chamber to accommodate the differential thermal expansion of the liquid and the chamber. For even higher power operation suitable for digital cinema, a suspension of phosphor in liquid is described that is suitable for screen brightness of over 30,000 lumens. The aging issues of phosphor can also be solved by using replaceable phosphor cartridges.

  15. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper addresses the detection and removal of cracks on digitized paintings. The cracks are detected by thresholding. Afterwards, thin dark brush strokes which have been misidentified as cracks are removed using a median radial basis function neural network on hue and saturation data, together with a semi-automatic procedure based on region growing. Finally, the cracks are filled using a Wiener filter. The approach is designed in such a way that most of the cracks on digitized paintings are identified and removed, with an improvement rate of 90%. The method applies not only to digitized paintings but also to medical images and BMP images, and is implemented in MATLAB.

  16. Sequential or parallel decomposed processing of two-digit numbers? Evidence from eye-tracking.

    PubMed

    Moeller, Korbinian; Fischer, Martin H; Nuerk, Hans-Christoph; Willmes, Klaus

    2009-02-01

    While reaction time data have shown that decomposed processing of two-digit numbers occurs, there is little evidence about how decomposed processing functions. Poltrock and Schwartz (1984) argued that multi-digit numbers are compared in a sequential digit-by-digit fashion starting at the leftmost digit pair. In contrast, Nuerk and Willmes (2005) favoured parallel processing of the digits constituting a number. These models (i.e., sequential decomposition, parallel decomposition) make different predictions regarding the fixation pattern in a two-digit number magnitude comparison task and can therefore be differentiated by eye fixation data. We tested these models by evaluating participants' eye fixation behaviour while selecting the larger of two numbers. The stimulus set consisted of within-decade comparisons (e.g., 53_57) and between-decade comparisons (e.g., 42_57). The between-decade comparisons were further divided into compatible and incompatible trials (cf. Nuerk, Weger, & Willmes, 2001) and trials with different decade and unit distances. The observed fixation pattern implies that the comparison of two-digit numbers is not executed by sequentially comparing decade and unit digits as proposed by Poltrock and Schwartz (1984) but rather in a decomposed but parallel fashion. Moreover, the present fixation data provide first evidence that digit processing in multi-digit numbers is not a pure bottom-up effect, but is also influenced by top-down factors. Finally, implications for multi-digit number processing beyond the range of two-digit numbers are discussed.

  17. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method of collecting data for Earth science research. A huge amount of remote sensing data, most of it in image form, has been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in the GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  18. Hue-saturation-density (HSD) model for stain recognition in digital images from transmitted light microscopy.

    PubMed

    van Der Laak, J A; Pahlplatz, M M; Hanselaar, A G; de Wilde, P C

    2000-04-01

    Transmitted light microscopy is used in pathology to examine stained tissues. Digital image analysis is gaining importance as a means to quantify alterations in tissues. A prerequisite for accurate and reproducible quantification is the possibility to recognise stains in a standardised manner, independently of variations in the staining density. The usefulness of three colour models was studied using data from computer simulations and experimental data from an immuno-doublestained tissue section. Direct use of the three intensities obtained by a colour camera results in the red-green-blue (RGB) model. By decoupling the intensity from the RGB data, the hue-saturation-intensity (HSI) model is obtained. However, the major part of the variation in perceived intensities in transmitted light microscopy is caused by variations in staining density. Therefore, the hue-saturation-density (HSD) transform was defined as the RGB to HSI transform, applied to optical density values rather than intensities for the individual RGB channels. In the RGB model, the mixture of chromatic and intensity information hampers standardisation of stain recognition. In the HSI model, mixtures of stains that could be distinguished from other stains in the RGB model could not be separated. The HSD model enabled all possible distinctions in a two-dimensional, standardised data space. In the RGB model, standardised recognition is only possible by using complex and time-consuming algorithms. The HSI model is not suitable for stain recognition in transmitted light microscopy. The newly derived HSD model was found superior to the existing models for this purpose. Copyright 2000 Wiley-Liss, Inc.
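
    The core of the HSD idea, converting RGB intensities to optical densities before computing chromatic coordinates, can be sketched as follows; the specific (cx, cy) definition is the commonly cited formulation of the HSD transform and should be treated as an assumption rather than a quotation from the paper.

```python
import numpy as np

def hsd_transform(rgb, white=255.0, eps=1e-6):
    """rgb: (..., 3) transmitted-light intensities; returns density D and chromatic (cx, cy)."""
    od = -np.log10(np.clip(rgb, 1, None) / white)     # per-channel optical density
    density = od.mean(axis=-1)                        # overall stain density D
    d = np.maximum(density, eps)
    cx = od[..., 0] / d - 1.0
    cy = (od[..., 1] - od[..., 2]) / (np.sqrt(3.0) * d)
    return density, cx, cy

pixels = np.array([[200.0, 120.0, 160.0],             # eosin-like pink pixel
                   [120.0, 100.0, 180.0]])            # hematoxylin-like blue pixel
print(hsd_transform(pixels))
```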

  19. Exploring the Developmental Changes in Automatic Two-Digit Number Processing

    ERIC Educational Resources Information Center

    Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey

    2011-01-01

    Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…

  20. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  1. Geometric processing of digital images of the planets

    NASA Technical Reports Server (NTRS)

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.
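
    The forward Sinusoidal Equal-Area mapping used for these databases has a compact closed form, x = R(λ − λ0)cos φ, y = Rφ (angles in radians); a minimal sketch with an illustrative planetary radius:

```python
import numpy as np

def sinusoidal_forward(lat_deg, lon_deg, radius, lon0_deg=0.0):
    """Forward Sinusoidal Equal-Area projection; returns (x, y) in the units of radius."""
    lat = np.radians(lat_deg)
    lon = np.radians(lon_deg - lon0_deg)
    return radius * lon * np.cos(lat), radius * lat

# Example point at 30 N, 45 E on a body with a 1737.4 km radius (illustrative).
print(sinusoidal_forward(30.0, 45.0, 1737.4))
```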

  2. Coherent optical processing using noncoherent light after source masking.

    PubMed

    Boopathi, V; Vasu, R M

    1992-01-10

    Coherent optical processing starting with spatially noncoherent illumination is described. Good spatial coherence is introduced in the far field by modulating a noncoherent source when masks with sharp autocorrelation are used. The far-field mutual coherence function of light is measured and it is seen that, for the masks and the source size used here, we get a fairly large area over which the mutual coherence function is high and flat. We demonstrate traditional coherent processing operations such as Fourier transformation and image deblurring when coherent light that is produced in the above fashion is used. A coherence-redundancy merit function is defined for this type of processing system. It is experimentally demonstrated that the processing system introduced here has superior blemish tolerance compared with a traditional processor that uses coherent illumination.

  3. Microwave Processing of Crowns from Winter Cereals for Light Microscopy.

    USDA-ARS?s Scientific Manuscript database

    Microwave processing of tissue considerably shortens the time it takes to prepare samples for light and electron microscopy. However, plant tissues from different species and different regions of the plant respond differently making it impossible to use a single protocol for all plant tissue. The ...

  4. Evaluation of mobile digital light-emitting diode fluorescence microscopy in Hanoi, Viet Nam.

    PubMed

    Chaisson, L H; Reber, C; Phan, H; Switz, N; Nilsson, L M; Myers, F; Nhung, N V; Luu, L; Pham, T; Vu, C; Nguyen, H; Nguyen, A; Dinh, T; Nahid, P; Fletcher, D A; Cattamanchi, A

    2015-09-01

    Hanoi Lung Hospital, Hanoi, Viet Nam. To compare the accuracy of CellScopeTB, a manually operated mobile digital fluorescence microscope, with conventional microscopy techniques. Patients referred for sputum smear microscopy to the Hanoi Lung Hospital from May to September 2013 were included. Ziehl-Neelsen (ZN) smear microscopy, conventional light-emitting diode (LED) fluorescence microscopy (FM), CellScopeTB-based LED FM and Xpert(®) MTB/RIF were performed on sputum samples. The sensitivity and specificity of microscopy techniques were determined in reference to Xpert results, and differences were compared using McNemar's paired test of proportions. Of 326 patients enrolled, 93 (28.5%) were Xpert-positive for TB. The sensitivity of ZN microscopy, conventional LED FM, and CellScopeTB-based LED FM was respectively 37.6% (95%CI 27.8-48.3), 41.9% (95%CI 31.8-52.6), and 35.5% (95%CI 25.8-46.1). The sensitivity of CellScopeTB was similar to that of conventional LED FM (difference -6.5%, 95%CI -18.2 to 5.3, P = 0.33) and ZN microscopy (difference -2.2%, 95%CI -9.2 to 4.9, P = 0.73). The specificity was >99% for all three techniques. CellScopeTB performed similarly to conventional microscopy techniques in the hands of experienced TB microscopists. However, the sensitivity of all sputum microscopy techniques was low. Options enabled by digital microscopy, such as automated imaging with real-time computerized analysis, should be explored to increase sensitivity.

  5. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for the determination of the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample-and-hold circuit, an analog-to-digital converter, a digital filter, and a Fast Fourier Transform, and is interfaced to a 16-bit microprocessor. The microprocessor is programmed to control the complete digital signal processing chain. The digital filtering and Fast Fourier Transform functions are implemented by an S2815 digital filter/utility peripheral chip and an S2814A Fast Fourier Transform chip. The S2815 initially simulates a low-pass Butterworth filter, with later expansion to synthesize complete filter circuits (bandpass and highpass).
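
    The low-pass Butterworth stage can be sketched in software with SciPy in place of the S2815 chip; the sample rate, cutoff and filter order below are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy import signal

fs = 10_000.0                                   # sample rate in Hz (assumed)
b, a = signal.butter(N=4, Wn=1_000.0, btype="low", fs=fs)   # 1 kHz low-pass

t = np.arange(0, 0.05, 1 / fs)
echo = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 4_000 * t)
filtered = signal.lfilter(b, a, echo)           # the 4 kHz component is strongly attenuated
print(np.std(echo), np.std(filtered))
```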

  6. Process for converting light alkanes to higher hydrocarbons

    DOEpatents

    Noceti, Richard P.; Taylor, Charles E.

    1988-01-01

    A process is disclosed for the production of aromatic-rich, gasoline boiling range hydrocarbons from the lower alkanes, particularly from methane. The process is carried out in two stages. In the first, alkane is reacted with oxygen and hydrogen chloride over an oxyhydrochlorination catalyst such as copper chloride with minor proportions of potassium chloride and rare earth chloride. This produces an intermediate gaseous mixture containing water and chlorinated alkanes. The chlorinated alkanes are contacted with a crystalline aluminosilicate catalyst in the hydrogen or metal promoted form to produce gasoline range hydrocarbons with a high proportion of aromatics and a small percentage of light hydrocarbons (C.sub.2 -C.sub.4). The light hydrocarbons can be recycled for further processing over the oxyhydrochlorination catalyst.

  7. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    Coriolis mass flowmeters (CMFs) often suffer from two-phase flow, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for CMFs are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented with a multiplying digital-to-analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
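
    The zero-crossing measurement can be sketched as follows: crossing instants are located by interpolating between the two samples straddling zero (linear, i.e. first-order Lagrange, interpolation here), the frequency follows from the spacing of rising crossings, and the phase difference from the offset between matched crossings. All numeric values are illustrative, not CMF parameters.

```python
import numpy as np

def rising_zero_crossings(x, fs):
    idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
    frac = -x[idx] / (x[idx + 1] - x[idx])          # linear interpolation between samples
    return (idx + frac) / fs                        # crossing times in seconds

def phase_difference(z_ref, z_other, freq):
    """Pair each reference crossing with the nearest other crossing; valid for |phase| < pi."""
    deltas = [z_other[np.argmin(np.abs(z_other - zc))] - zc for zc in z_ref]
    return 2 * np.pi * freq * np.mean(deltas)

fs, f0, dphi = 50_000.0, 85.0, 0.30
t = np.arange(0, 0.2, 1 / fs)
s1, s2 = np.sin(2 * np.pi * f0 * t), np.sin(2 * np.pi * f0 * t - dphi)
z1, z2 = rising_zero_crossings(s1, fs), rising_zero_crossings(s2, fs)
freq = 1.0 / np.mean(np.diff(z1))
print(freq, phase_difference(z1, z2, freq))         # ~85 Hz, ~0.30 rad
```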

  8. All-solution processed transparent organic light emitting diodes

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Höfle, Stefan; Czolk, Jens; Mertens, Adrian; Colsmann, Alexander

    2015-11-01

    In this work, we report on indium tin oxide-free, all-solution processed transparent organic light emitting diodes (OLEDs) with inverted device architecture. Conductive polymer layers are employed as both transparent cathodes and transparent anodes, with the top anodes having enhanced conductivities from a supporting stochastic silver nanowire mesh. Both electrodes exhibit transmittances of 80-90% in the visible spectral regime. Upon the incorporation of either yellow- or blue-light emitting fluorescent polymers, the OLEDs show low onset voltages, demonstrating excellent charge carrier injection from the polymer electrodes into the emission layers. Overall luminances and current efficiencies equal the performance of opaque reference OLEDs with indium tin oxide and aluminium electrodes, proving excellent charge carrier-to-light conversion within the device.

  9. Light-emitting block copolymers composition, process and use

    DOEpatents

    Ferraris, John P.; Gutierrez, Jose J.

    2006-11-14

    Generally, and in one form, the present invention is a composition of light-emitting block copolymer. In another form, the present invention is a process for producing light-emitting block copolymers that comprises polymerizing a first di(halo-methyl) aromatic monomer compound in the presence of an anionic initiator and a base to form a polymer, and contacting a second di(halo-methyl) aromatic monomer compound with the polymer to form a homopolymer or block copolymer, wherein the block copolymer is a diblock, triblock, or star polymer. In yet another form, the present invention is an electroluminescent device comprising a light-emitting block copolymer, wherein the electroluminescent device is to be used in the manufacture of optical and electrical devices.

  10. Attosecond control of electronic processes by intense light fields.

    PubMed

    Baltuska, A; Udem, Th; Uiberacker, M; Hentschel, M; Goulielmakis, E; Gohle, Ch; Holzwarth, R; Yakovlev, V S; Scrinzi, A; Hänsch, T W; Krausz, F

    2003-02-06

    The amplitude and frequency of laser light can be routinely measured and controlled on a femtosecond (10(-15) s) timescale. However, in pulses comprising just a few wave cycles, the amplitude envelope and carrier frequency are not sufficient to characterize and control laser radiation, because evolution of the light field is also influenced by a shift of the carrier wave with respect to the pulse peak. This so-called carrier-envelope phase has been predicted and observed to affect strong-field phenomena, but random shot-to-shot shifts have prevented the reproducible guiding of atomic processes using the electric field of light. Here we report the generation of intense, few-cycle laser pulses with a stable carrier envelope phase that permit the triggering and steering of microscopic motion with an ultimate precision limited only by quantum mechanical uncertainty. Using these reproducible light waveforms, we create light-induced atomic currents in ionized matter; the motion of the electronic wave packets can be controlled on timescales shorter than 250 attoseconds (250 x 10(-18) s). This enables us to control the attosecond temporal structure of coherent soft X-ray emission produced by the atomic currents--these X-ray photons provide a sensitive and intuitive tool for determining the carrier-envelope phase.

  11. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  12. Measuring the remineralization potential of different agents with quantitative light-induced fluorescence digital Biluminator.

    PubMed

    Kucukyilmaz, Ebru; Savas, Selcuk

    2017-01-26

    The aim of this study was to investigate the effectiveness of different remineralization agents by quantitative light-induced fluorescence digital BiluminatorTM (QLF-D). Artificial caries lesions were created, and the teeth were divided according to the tested materials: (i) distilled water, (ii) acidulated phosphate fluoride (APF), (iii) Curodont Repair (CR), (iv) ammonium hexafluorosilicate (SiF) and (v) ammonium hexafluorosilicate plus cetylpyridinium chloride (SiF + CPC). After treatment procedures, each of the samples was placed in artificial saliva. After demineralization and 1 and 4 weeks of remineralization procedures, fluorescence loss and lesion areas were measured with QLF-D. Data were statistically analyzed (α = 0.05). The fluorescence values of the demineralized enamel specimens treated with the various agents differed significantly compared with pretreatment values for both 1 and 4 weeks (p<0.05). At 4 weeks, the highest fluorescence gain was calculated in the CR, APF and SiF groups compared with the control (p<0.05). APF, SiF and CR groups yielded greater remineralization ability than SiF + CPC and control groups.

  13. Blue-light digital communication in underwater environments utilizing orbital angular momentum

    NASA Astrophysics Data System (ADS)

    Baghdady, Joshua; Miller, Keith; Osler, Sean; Morgan, Kaitlyn; Li, Wenzhe; Johnson, Eric; Cochenour, Brandon

    2016-05-01

    Underwater optical communication has recently become the topic of much investigation as the demands for underwater data transmission have rapidly grown in recent years. The need for reliable, high-speed, secure underwater communication has turned increasingly to blue-light optical solutions. The blue-green visible wavelength window provides an attractive solution to the problem of underwater data transmission thanks to its low attenuation, where traditional RF solutions used in free-space communications collapse. Beginning with GaN laser diodes as the optical source, this work explores the encoding and transmission of digital data across underwater environments of varying turbidities. Given the challenges present in an underwater environment, such as the mechanical and optical turbulences that make proper alignment difficult to maintain, it is desirable to achieve extremely high data rates in order to allow the time window of alignment between the transmitter and receiver to be as small as possible. In this paper, work is done to increase underwater data rates through the use of orbital angular momentum. Results are shown for a range of data rates across a variety of channel types ranging in turbidity from that of a clear ocean to a dirty harbor.

  14. Screen Capture Technology: A Digital Window into Students' Writing Processes

    ERIC Educational Resources Information Center

    Seror, Jeremie

    2013-01-01

    Technological innovations and the prevalence of the computer as a means of producing and engaging with texts have dramatically transformed how literacy is defined and developed in modern society. This rise in digital writing practices has led to a growing number of tools and methods that can be used to explore second language (L2) writing…

  15. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    USDA-ARS?s Scientific Manuscript database

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  16. Rounding Technique for High-Speed Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Wechsler, E. R.

    1983-01-01

    An arithmetic technique facilitates high-speed rounding of 2's complement binary data. Conventional rounding of 2's complement numbers presents problems in high-speed digital circuits. The proposed technique consists of truncating K + 1 bits and then attaching a bit in the least significant position. The mean output error is zero, eliminating the need to introduce a voltage offset at the input.
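
    A worked illustration of the rounding trick on integer samples (the bit count K and the test range are arbitrary choices): truncate K + 1 bits with an arithmetic shift, then attach a 1 as the new least significant bit, so no adder is needed and no DC offset appears at the output resolution.

```python
import numpy as np

def truncate_and_jam(x, k):
    """Drop k low-order bits of a 2's complement value: shift out k+1 bits, append a 1."""
    return ((np.asarray(x) >> (k + 1)) << 1) | 1

k = 3
x = np.arange(-1000, 1000)                       # 2's complement samples
err = truncate_and_jam(x, k) - x / 2.0**k        # error in units of the output LSB
print(err.mean(), err.min(), err.max())          # mean ~ half an input LSB; error within (-1, 1]
```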

  17. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  18. Light Water Reactor Sustainability Program: Digital Technology Business Case Methodology Guide

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam

    The Department of Energy’s (DOE’s) Light Water Reactor Sustainability Program aims to develop and deploy technologies that will make the existing U.S. nuclear fleet more efficient and competitive. The program has developed a standard methodology for determining the impact of new technologies in order to assist nuclear power plant (NPP) operators in building sound business cases. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies Pathway is part of the DOE’s Light Water Reactor Sustainability (LWRS) Program. It conducts targeted research and development (R&D) to address aging and reliability concerns with the legacy instrumentation and control and related information systems of the U.S. operating light water reactor (LWR) fleet. This work involves two major goals: (1) to ensure that legacy analog II&C systems are not life-limiting issues for the LWR fleet and (2) to implement digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Resolving long-term operational concerns with the II&C systems contributes to the long-term sustainability of the LWR fleet, which is vital to the nation’s energy and environmental security. The II&C Pathway is conducting a series of pilot projects that enable the development and deployment of new II&C technologies in existing nuclear plants. Through the LWRS program, individual utilities and plants are able to participate in these projects or otherwise leverage the results of projects conducted at demonstration plants. Performance advantages of the new pilot project technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually

  19. Light Extraction From Solution-Based Processable Electrophosphorescent Organic Light-Emitting Diodes

    NASA Astrophysics Data System (ADS)

    Krummacher, Benjamin C.; Mathai, Mathew; So, Franky; Choulis, Stelios; Choong, Vi-En

    2007-06-01

    Molecular dye dispersed solution processable blue emitting organic light-emitting devices have been fabricated and the resulting devices exhibit efficiency as high as 25 cd/A. With down-conversion phosphors, white emitting devices have been demonstrated with peak efficiency of 38 cd/A and luminous efficiency of 25 lm/W. The high efficiencies have been a product of proper tuning of carrier transport, optimization of the location of the carrier recombination zone and, hence, microcavity effect, efficient down-conversion from blue to white light, and scattering/isotropic remission due to phosphor particles. An optical model has been developed to investigate all these effects. In contrast to the common misunderstanding that light out-coupling efficiency is about 22% and independent of device architecture, our device data and optical modeling results clearly demonstrated that the light out-coupling efficiency is strongly dependent on the exact location of the recombination zone. Estimating the device internal quantum efficiencies based on external quantum efficiencies without considering the device architecture could lead to erroneous conclusions.

  20. Technological processes of grating light valve as diffractive spatial light modulator in laser phototypesetting system

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Geng, Yu; Hou, Changlun; Yang, Guoguang; Bai, Jian

    2008-11-01

    The Grating Light Valve (GLV) is a kind of optical device based on Micro-Opto-Electro-Mechanical System (MOEMS) technology, utilizing the diffraction principle to switch, attenuate and modulate light. In this paper, the traditional GLV device structure and its working principle are explained, and a modified GLV structure is presented, with a detailed introduction to the fabrication technology. The GLV structure includes a single crystal silicon substrate, a silicon dioxide isolating layer, an aluminum layer for the fixed ribbons and silicon nitride for the movable ribbons. In the fabrication, many techniques are adopted, such as low-pressure chemical vapor deposition (LPCVD), photolithography, etching and evaporation. Among the fabrication processes, photolithography is a fundamental and critical technology, which determines the etching result and GLV quality. Some methods, developed through repeated experiments, are proposed to greatly improve the etching result and guide practical application. This kind of GLV device can be made both small and inexpensive, and has been tested to show a proper range of actuation under DC bias, with good performance. The GLV device also has merits such as low cost, simple technology, high fill ratio and low driving voltage. It is well suited to the high optical power demands of laser phototypesetting systems, serving as a high-speed, high-resolution light modulator.

  1. Reaction of photochemical resists used in screen printing under the influence of digitally modulated ultra violet light

    NASA Astrophysics Data System (ADS)

    Gmuender, T.

    2017-02-01

    Different chemical photo-reactive emulsions are used in screen printing for stencil production. Depending on the bandwidth, optical power and depth of field of the optical system, the reaction/exposure speed varies. In this paper, the emulsions are first categorized and validated. A mathematical model is then developed and adapted on the basis of heuristic experience to estimate the exposure speed under the influence of digitally modulated ultraviolet (UV) light. The main intention is to use the technical specifications in the emulsion data sheet (intended wavelength, exposure time, distance to the stencil, electrical power, stencil configuration), originally written down with an uncertainty factor for end users operating with large projector arc lamps and photo films. These five parameters are the inputs to a mathematical formula whose output is the exposure speed for the Computer to Screen (CTS) machine, calculated for each emulsion/stencil setup. The importance of this work lies in the possibility of rating, with just a few boundary parameters, the performance and capacity of an exposure system used in screen printing, instead of running a long test series for each emulsion/stencil configuration.

  2. Low light adaptation: energy transfer processes in different types of light harvesting complexes from Rhodopseudomonas palustris.

    PubMed

    Moulisová, Vladimíra; Luer, Larry; Hoseinkhani, Sajjad; Brotosudarmo, Tatas H P; Collins, Aaron M; Lanzani, Guglielmo; Blankenship, Robert E; Cogdell, Richard J

    2009-12-02

    Energy transfer processes in photosynthetic light harvesting 2 (LH2) complexes isolated from the purple bacterium Rhodopseudomonas palustris grown at different light intensities were studied by ground state and transient absorption spectroscopy. The decomposition of ground state absorption spectra shows contributions from the B800 and B850 bacteriochlorophyll (BChl) a rings, the latter component splitting into a low energy and a high energy band in samples grown under low light (LL) conditions. A spectral analysis reveals strong inhomogeneity of the B850 excitons in the LL samples that is well reproduced by an exponential-type distribution. Transient spectra show a bleach of both the low energy and high energy bands, together with the respective blue-shifted exciton-to-biexciton transitions. The different spectral evolutions were analyzed by a global fitting procedure. Energy transfer from B800 to B850 occurs in a mono-exponential process, and the rate of this process is only slightly reduced in LL compared to high light samples. In LL samples, spectral relaxation of the B850 exciton follows strongly nonexponential kinetics that can be described by a reduction of the bleach of the high energy excitonic component and a red-shift of the low energy one. We explain these spectral changes by picosecond exciton relaxation caused by a small coupling parameter of the excitonic splitting of the BChl a molecules to the surrounding bath. The splitting of the exciton energy into two excitonic bands in the LL complex is most probably caused by the heterogeneous composition of the LH2 apoproteins, which gives some of the BChls in the B850 ring B820-like site energies and causes disorder in the LH2 structure.

  3. GEOMETRIC PROCESSING OF DIGITAL IMAGES OF THE PLANETS.

    USGS Publications Warehouse

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformations of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases.

  4. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filters, line filters, and area filters. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capability to perform high-pass, band-pass, low-pass, and wedge filtering. These filters are applied to the analysis of satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
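
    The point (notch) filter idea can be sketched as a Gaussian notch in the 2-D frequency plane applied to both conjugate peaks; the test image, notch location and width below are illustrative assumptions, not the toolbox's actual parameters.

```python
import numpy as np

def gaussian_notch(shape, u0, v0, sigma):
    """Frequency response that rejects a Gaussian neighbourhood of (u0, v0) and its conjugate."""
    u = np.fft.fftfreq(shape[0])[:, None]
    v = np.fft.fftfreq(shape[1])[None, :]
    g = lambda uu, vv: np.exp(-((u - uu) ** 2 + (v - vv) ** 2) / (2 * sigma ** 2))
    return 1.0 - g(u0, v0) - g(-u0, -v0)

def notch_filter(image, u0, v0, sigma=0.01):
    H = gaussian_notch(image.shape, u0, v0, sigma)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

y, x = np.mgrid[0:128, 0:128]
striped = np.sin(2 * np.pi * 0.25 * x)            # periodic striping "noise"
cleaned = notch_filter(striped, u0=0.0, v0=0.25)
print(striped.std(), cleaned.std())               # the striping is essentially removed
```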

  5. Applications of digital processing for noise removal from plasma diagnostics

    SciTech Connect

    Kane, R.J.; Candy, J.V.; Casper, T.A.

    1985-11-11

    The use of digital signal techniques for removal of noise components present in plasma diagnostic signals is discussed, particularly with reference to diamagnetic loop signals. These signals contain noise due to power supply ripple in addition to plasma characteristics. The application of noise canceling techniques, such as adaptive noise canceling and model-based estimation, will be discussed. The use of computer codes such as SIG is described. 19 refs., 5 figs.
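
    As a hedged, generic illustration of the adaptive noise cancelling mentioned above (not the SIG code used in the report), the sketch below runs a basic LMS canceller that uses a ripple reference to clean a diagnostic trace; the tap count and step size are placeholder values.

        import numpy as np

        def lms_noise_canceller(primary, reference, n_taps=32, mu=1e-3):
            """Basic LMS adaptive noise canceller.

            primary   : diagnostic signal plus correlated noise (e.g. supply ripple)
            reference : noise reference correlated with the interference only
            Returns the error signal, i.e. the cleaned estimate of the diagnostic signal.
            """
            w = np.zeros(n_taps)
            cleaned = np.zeros(len(primary))
            for n in range(n_taps, len(primary)):
                x = reference[n - n_taps:n][::-1]   # most recent reference samples
                y = np.dot(w, x)                    # current estimate of the interference
                e = primary[n] - y                  # error = primary minus estimated noise
                w += 2.0 * mu * e * x               # LMS weight update
                cleaned[n] = e
            return cleaned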

  6. Organic light-emitting devices using spin-dependent processes

    DOEpatents

    Vardeny, Z. Valy; Wohlgenannt, Markus

    2010-03-23

    The maximum luminous efficiency of organic light-emitting materials is increased through spin-dependent processing. The technique is applicable to all electro-luminescent processes in which light is produced by singlet exciton decay, and all devices which use such effects, including LEDs, super-radiant devices, amplified stimulated emission devices, lasers, other optical microcavity devices, electrically pumped optical amplifiers, and phosphorescence (Ph) based light emitting devices. In preferred embodiments, the emissive material is doped with an impurity, or otherwise modified, to increase the spin-lattice relaxation rate (i.e., decrease the spin-lattice time), and hence raise the efficiency of the device. The material may be a polymer, oligomer, small molecule, single crystal, molecular crystal, or fullerene. The impurity is preferably a magnetic or paramagnetic substance. The invention is applicable to IR, UV, and other electromagnetic radiation generation and is thus not limited to the visible region of the spectrum. The methods of the invention may also be combined with other techniques used to improve device performance.

  7. From CAD to Digital Modeling: the Necessary Hybridization of Processes

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Bernardi, F.; Cristofolini, A.

    2011-09-01

    The essay deals with the themes of digital representation of architecture, drawing on several years of teaching activity within the Automatic Design course of the Engineering/Architecture degree programme at the University of Trento. With the development of CAD systems, architectural representation lies less in the tracing of a simple drawing and more in a series of acts that build a complex digital model, which can be used as a database in which to record all stages of design and interpretation work, and from which to derive final drawings and documents. The advent of digital technology has made it increasingly difficult to find explicit connections between one type of operation and its outcome; this increases the need for guidelines, the need to understand changes in order to anticipate them, and the desire not to be overwhelmed by uncontrollable influences of hardware and software systems used only according to the principle of maximum productivity. Training occupies a crucial role because it has the ability to direct the profession toward a thoughtful and selective use of specific applications; teaching must build logical routes through the fluid world of info-graphics, and the only way to do so is to describe its contours through methodological indications: understanding, studying and communicating what does not change amid this mobility (procedural issues) rather than what is transitory despite its apparent fixity (manual questions).

  8. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by `Real' part geometry and consequent assembly.

  9. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...

  10. The x-ray light valve: a potentially low-cost, digital radiographic imaging system-concept and implementation considerations.

    PubMed

    Webster, Christie Ann; Koprinarov, Ivaylo; Germann, Stephen; Rowlands, J A

    2008-03-01

    New x-ray radiographic systems based on large-area flat-panel technology have revolutionized our capability to produce digital x-ray images. However, these imagers are extraordinarily expensive compared to the systems they are replacing. Hence, there is a need for a low-cost digital imaging system for general applications in radiology. A novel potentially low-cost radiographic imaging system based on established technologies is proposed-the X-Ray Light Valve (XLV). This is a potentially high-quality digital x-ray detector made of a photoconducting layer and a liquid-crystal cell, physically coupled in a sandwich structure. Upon exposure to x rays, charge is collected on the surface of the photoconductor. This causes a change in the optical properties of the liquid-crystal cell and a visible image is generated. Subsequently, it is digitized by a scanned optical imager. The image formation is based on controlled modulation of light from an external source. The operation and practical implementation of the XLV system are described. The potential performance of the complete system and issues related to sensitivity, spatial resolution, noise, and speed are discussed. The feasibility of clinical use of an XLV device based on amorphous selenium (a-Se) as the photoconductor and a reflective electrically controlled birefringence cell is analyzed. The results of our analysis indicate that the XLV can potentially be adapted to a wide variety of radiographic tasks.

  11. Digital Cover Photography for Estimating Leaf Area Index (LAI) in Apple Trees Using a Variable Light Extinction Coefficient

    PubMed Central

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAID), which was compared with the LAI estimated by the proposed digital photography method (LAIM). Results showed that LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results show that using a proxy k value, estimated from Ic, helped to increase the accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions. PMID:25635411
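
    The method rests on canopy light extinction. As a simplified, hedged sketch of that underlying relation (not the paper's modified cover-photography formula, which derives foliage cover and a variable k from the images), the fragment below inverts the classical Beer-Lambert law; the numbers are illustrative.

        import math

        def lai_from_transmission(i_transmitted, i_incident, k):
            """Invert the Beer-Lambert canopy extinction law: I = Io * exp(-k * LAI).

            i_transmitted : radiation measured below the canopy (I)
            i_incident    : radiation measured above the canopy (Io)
            k             : light extinction coefficient (the study estimates k from
                            foliage cover or intercepted light instead of fixing it)
            """
            return -math.log(i_transmitted / i_incident) / k

        # With the constant k = 0.68 used in the study and 30 % transmission:
        print(lai_from_transmission(0.30, 1.0, 0.68))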

  12. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient.

    PubMed

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-28

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAI(D)), which was compared with the LAI estimated by the proposed digital photography method (LAI(M)). Results showed that LAI(M) was able to estimate LAI(D) with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (f(f)) derived from images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results show that using a proxy k value, estimated from Ic, helped to increase the accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions.

  13. Hair treatment process providing dispersed colors by light diffraction

    DOEpatents

    Sutton, Richard Matthew Charles; Lamartine, Bruce Carvell; Orler, E. Bruce; Song, Shuangqi

    2015-12-22

    A hair treatment process for providing dispersed colors by light diffraction including (a) coating the hair with a material comprising a polymer, (b) pressing the hair with a pressing device including one or more surfaces, and (c) forming a secondary nanostructured surface pattern on the hair that is complementary to the primary nanostructured surface pattern on the one or more surfaces of the pressing device. The secondary nanostructured surface pattern diffracts light into dispersed colors that are visible on the hair. The section of the hair is pressed with the pressing device for from about 1 to 55 seconds. The polymer has a glass transition temperature from about 55.degree. C. to about 90.degree. C. The one or more surfaces include a primary nanostructured surface pattern.

  14. Quantum information processing with a travelling wave of light

    NASA Astrophysics Data System (ADS)

    Serikawa, Takahiro; Shiozawa, Yu; Ogawa, Hisashi; Takanashi, Naoto; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We exploit quantum information processing on a traveling wave of light, expecting emancipation from thermal noise, easy coupling to fiber communication, and potentially high operation speed. Although optical memories are technically challenging, we have an alternative approach to applying multi-step operations on traveling light, that is, continuous-variable one-way computation. So far our achievements include generation of a one-million-mode entangled chain in the time domain, mode engineering of nonlinear resource states, and real-time nonlinear feedforward. Although they are implemented with free space optics, we are also investigating photonic integration and have performed quantum teleportation with a passive linear waveguide chip as a demonstration of entangling, measurement, and feedforward. We also suggest a loop-based architecture as another model of continuous-variable computing.

  15. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  16. Assessing the use of Quantitative Light-induced Fluorescence-Digital as a clinical plaque assessment.

    PubMed

    Han, Sun-Young; Kim, Bo-Ra; Ko, Hae-Youn; Kwon, Ho-Keun; Kim, Baek-Il

    2016-03-01

    The aims of this study were to compare the relationship between the red fluorescent plaque (RF plaque) area obtained by Quantitative Light-induced Fluorescence-Digital (QLF-D) and the disclosed plaque area obtained by two-tone disclosure, and to assess the bacterial composition of the RF plaque by real-time PCR. Fifty healthy subjects were included and 600 facial surfaces of their anterior teeth were examined. QLF-D images were taken on two separate occasions (before and after disclosing), and the RF plaque area was calculated based on the Plaque Percent Index (PPI). After disclosing, the stained plaque area was analyzed to investigate its relationship with the RF plaque area. The relationship was evaluated using Pearson correlation and a paired t-test. Then, RF and non-red fluorescent (non-RF) plaque samples were obtained from the same subject for a real-time PCR test. A total of 10 plaque samples were compared with respect to the ratios of 6 bacterial species using the Wilcoxon signed-rank test. Regarding the paired t-test, the blue-staining plaque area (9.3±9.2) was similar to the RF plaque area (9.1±14.9, p=0.80) at ΔR20; however, the red-staining plaque area (31.6±20.9) differed from the RF plaque area (p<0.0001). In addition, the proportions of Prevotella intermedia and Streptococcus anginosus were substantially higher in the RF plaque than in the non-RF plaque (p<0.05). The plaque assessment method using QLF-D has the potential to detect mature plaque, and the RF plaque area was associated with the blue-staining area obtained using two-tone disclosure. Copyright © 2015 Elsevier B.V. All rights reserved.
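
    As a hedged illustration of the statistics named above (Pearson correlation and a paired t-test; not the study's data), the sketch below runs both tests on invented paired measurements with SciPy.

        import numpy as np
        from scipy import stats

        # Invented paired measurements per subject: red-fluorescent plaque area (QLF-D)
        # and blue-staining disclosed plaque area, both expressed as percentages
        rf_area = np.array([9.1, 4.2, 15.3, 7.8, 11.0])
        blue_area = np.array([9.3, 5.0, 14.1, 8.2, 10.4])

        r, p_corr = stats.pearsonr(rf_area, blue_area)      # strength of the relationship
        t, p_paired = stats.ttest_rel(rf_area, blue_area)   # paired comparison of the means
        print(r, p_corr, t, p_paired)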

  17. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    PubMed

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-12-01

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects in more distant pairs in both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  18. Digital Learning As Enhanced Learning Processing? Cognitive Evidence for New insight of Smart Learning.

    PubMed

    Di Giacomo, Dina; Ranieri, Jessica; Lacasa, Pilar

    2017-01-01

    The widespread use of technology has improved quality of life across the lifespan and favored the development of digital skills. Digital skills can be considered an enhancement of human cognitive activities. A new research trend concerns the impact of technology on information processing in children. We wanted to analyze the influence of technology at an early age by evaluating its impact on cognition. We investigated the performance of a sample of 191 school-age children divided into two user groups: high digital users and low digital users. We measured the verbal and visuoperceptual cognitive performance of the children using 8 standardized psychological tests and an ad hoc self-report questionnaire. The results showed the influence of digital exposure on cognitive development: high digital users performed better in naming, semantic, visual memory and logical reasoning tasks. Our findings confirm the data in the literature and suggest that technology use has a strong impact not only on the social life, education and quality of life of people, but also on cognitive functioning when exposure begins at an early age; the increased cognitive abilities of these children shape a digitally skilled generation with enhanced cognitive processing oriented toward smart learning.

  19. Experimental investigation of analog and digital dimming techniques on photometric performance of an indoor Visible Light Communication (VLC) system

    NASA Astrophysics Data System (ADS)

    Zafar, Fahad; Kalavally, Vineetha; Bakaul, Masuduzzaman; Parthiban, R.

    2015-09-01

    To make commercial implementation of light-emitting diode (LED) based visible light communication (VLC) systems feasible, they must incorporate dimming schemes that provide energy savings, mood lighting and increased aesthetic value in the spaces using this technology. Two general methods are used to dim LEDs, commonly categorized as analog and digital dimming. Incorporating fast data transmission with these techniques is a key challenge in VLC. In this paper, digital and analog dimming for a 10 Mb/s non-return-to-zero on-off keying (NRZ-OOK) based VLC system is experimentally investigated, considering both photometric and communication parameters. A spectrophotometer was used for the photometric analysis, and a line-of-sight (LOS) configuration in the presence of ambient light was used for analyzing the communication parameters. Based on the experimental results, the digital dimming scheme is preferable for indoor VLC systems requiring high dimming precision and data transmission at lower brightness levels. The analog dimming scheme, on the other hand, is a cost-effective solution for high-speed systems where dimming precision is not critical.
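
    As a hedged toy contrast of the two dimming families discussed above (not the experimental setup), the sketch below scales an NRZ-OOK symbol stream by amplitude for analog dimming and gates it with a duty cycle for digital (PWM-style) dimming; the frame length and dimming level are placeholders.

        import numpy as np

        def analog_dim(ook_symbols, dim_level):
            """Analog dimming: scale the LED drive amplitude of the NRZ-OOK waveform."""
            return dim_level * np.asarray(ook_symbols, dtype=float)

        def digital_dim(ook_symbols, dim_level, frame_len=100):
            """Digital (PWM-style) dimming: transmit at full amplitude for only a
            fraction of each frame equal to dim_level, holding the LED off otherwise."""
            symbols = np.asarray(ook_symbols, dtype=float)
            gate = (np.arange(len(symbols)) % frame_len) < dim_level * frame_len
            return symbols * gate

        bits = np.random.randint(0, 2, 1000)   # NRZ-OOK symbols, one sample per bit
        print(analog_dim(bits, 0.4).mean(), digital_dim(bits, 0.4).mean())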

  20. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, Stephanie L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the various digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections, as well as to enhance and digitally mosaic sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two different stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format, which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
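
    As a hedged sketch of the feathering idea mentioned above (not the USGS MIPS implementation, and without the interactive stenciling), the fragment below cross-fades two overlapping image strips with a linear weight ramp across the seam.

        import numpy as np

        def feather_blend(strip_a, strip_b, overlap):
            """Blend two image strips that overlap by `overlap` columns.

            strip_a, strip_b : 2-D arrays; the last `overlap` columns of strip_a cover
                               the same ground as the first `overlap` columns of strip_b.
            Returns the mosaicked image with a linear cross-fade over the seam.
            """
            alpha = np.linspace(1.0, 0.0, overlap)            # weight applied to strip_a
            blended = (alpha * strip_a[:, -overlap:] +
                       (1.0 - alpha) * strip_b[:, :overlap])
            return np.hstack([strip_a[:, :-overlap], blended, strip_b[:, overlap:]])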

  1. Performance of the SIR-B digital image processing subsystem

    NASA Technical Reports Server (NTRS)

    Curlander, J. C.

    1986-01-01

    A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve the maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image production parameters. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.

  2. Image processing of metal surface with structured light

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Feng, Chang; Wang, Congzheng

    2014-09-01

    In a structured light vision measurement system, the ideal image of the structured light stripe contains, apart from the black background, only the gray-level information marking the position of the stripe. The actual image, however, contains image noise, complex background and other content that does not belong to the stripe and interferes with the useful information. To extract the stripe center on a metal surface accurately, a new processing method is presented. Adaptive median filtering first removes a preliminary portion of the noise, and the noise introduced by the CCD camera and the measurement environment is further removed with a difference-image method. To highlight fine details and enhance the blurred regions between the stripe and the noise, a sharpening algorithm is used which combines the best features of the Laplacian and Sobel operators. Morphological opening and closing operations are then used to compensate for the loss of information. Experimental results show that this method is effective: it not only suppresses the unwanted information but also heightens contrast, which benefits subsequent processing.
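
    As a hedged OpenCV-style sketch of the pipeline described above (a plain median filter stands in for the adaptive median, and all kernel sizes and weights are placeholders), the fragment below chains the denoising, difference-image, sharpening and morphological steps on same-size 8-bit grayscale frames.

        import cv2
        import numpy as np

        def preprocess_stripe(frame, background):
            """Rough pipeline to isolate a structured-light stripe on a metal surface.

            frame, background : same-size 8-bit grayscale images; `background` is a
            stripe-free reference frame used for the difference-image step.
            """
            # 1. Median filtering to suppress impulse noise (stand-in for the adaptive median)
            den = cv2.medianBlur(frame, 5)
            # 2. Difference image against the stripe-free background
            diff = cv2.absdiff(den, background)
            # 3. Sharpening that mixes second-order (Laplacian) and first-order (Sobel) cues
            diff_f = diff.astype(np.float32)
            lap = cv2.Laplacian(diff_f, cv2.CV_32F, ksize=3)
            sob = cv2.magnitude(cv2.Sobel(diff_f, cv2.CV_32F, 1, 0, ksize=3),
                                cv2.Sobel(diff_f, cv2.CV_32F, 0, 1, ksize=3))
            sharp = cv2.convertScaleAbs(diff_f + 0.5 * lap + 0.5 * sob)
            # 4. Morphological opening then closing to remove speckle and fill small gaps
            kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
            opened = cv2.morphologyEx(sharp, cv2.MORPH_OPEN, kernel)
            return cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)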

  3. Shedding (Incoherent) Light on Quantum Effects in Light-Induced Biological Processes.

    PubMed

    Brumer, Paul

    2018-05-18

    Light-induced processes that occur in nature, such as photosynthesis and photoisomerization in the first steps in vision, are often studied in the laboratory using coherent pulsed laser sources, which induce time-dependent coherent wavepacket molecule dynamics. Nature, however, uses stationary incoherent thermal radiation, such as sunlight, leading to a totally different molecular response, the time-independent steady state. It is vital to appreciate this difference in order to assess the role of quantum coherence effects in biological systems. Developments in this area are discussed in detail.

  4. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery of laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system which is composed of a frame grabber, a sound board, a VCR control board, a LAN card and EDMS. The computer system also controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on open architecture and a set of widely available industry standards; namely, Microsoft Windows as an operating system, TCP/IP as a network protocol and a time-sequential database that handles both images and speech. For data storage, we used MOD and CD-R. The digital endoscopic system was designed to be able to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  5. Lean Six Sigma Application in Rear Combination Automotive Lighting Process

    NASA Astrophysics Data System (ADS)

    Sodkomkham, Thanwarhat; Chutima, Parames

    2016-05-01

    The case study company produces various front and rear lightings for automobiles and motorcycles. Currently, it faces two problems, i.e., a high defect rate and high inventory. Lean Six Sigma was applied as a tool to solve the first problem, whereas the second problem was managed by changing the production concept from push to pull. The results showed that after applying all new settings to the process, the defect rate was reduced from 36,361 DPPM to 3,029 DPPM. In addition, after the implementation of the Kanban system, the company achieved substantial improvement: lead time was reduced by 44%, in-process inventory by 42%, and finished goods inventory by 50%, while finished goods area increased by 16%.

  6. Digital signal processing for the ATLAS/LUCID detector

    SciTech Connect

    NONE

    2015-07-01

    Both the detector and the associated read-out electronics have been improved in order to cope with the LHC luminosity increase foreseen for RUN 2 and RUN 3. The new operating conditions require a careful tuning of the read-out electronics in order to optimize the signal-to-noise ratio. The new read-out electronics will allow the use of digital filtering of the photomultiplier tube signals. In this talk, we will present the first results that we obtained in the optimization of the signal-to-noise ratio. In addition, we will introduce the next steps to adapt this system to high performance read-out chains for low energy gamma rays. Such systems are based, for instance, on Silicon Drift Detector devices and can be used in applications at Free-Electron-Laser facilities such as the XFEL under construction at DESY. (authors)

  7. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Technical Reports Server (NTRS)

    Blackburn, John; Smith, Dennis

    1992-01-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  8. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Astrophysics Data System (ADS)

    Blackburn, John; Smith, Dennis

    1992-11-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  9. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNNs, we propose to use a digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute the wavelet down-sampling filtering and the half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, proving the proposed digital CNN processor to be a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people.
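
    The record implements half-toning with CNN templates; as a hedged, conventional stand-in that shows what the half-toning step produces (not the CNN template method itself), the sketch below applies Floyd-Steinberg error diffusion to an 8-bit grayscale array.

        import numpy as np

        def floyd_steinberg_halftone(gray):
            """Binarize an 8-bit grayscale image by error diffusion (Floyd-Steinberg)."""
            img = gray.astype(float) / 255.0
            h, w = img.shape
            out = np.zeros_like(img)
            for y in range(h):
                for x in range(w):
                    old = img[y, x]
                    new = 1.0 if old >= 0.5 else 0.0
                    out[y, x] = new
                    err = old - new
                    # Push the quantization error onto the not-yet-visited neighbours
                    if x + 1 < w:
                        img[y, x + 1] += err * 7 / 16
                    if y + 1 < h and x > 0:
                        img[y + 1, x - 1] += err * 3 / 16
                    if y + 1 < h:
                        img[y + 1, x] += err * 5 / 16
                    if y + 1 < h and x + 1 < w:
                        img[y + 1, x + 1] += err * 1 / 16
            return (out * 255).astype(np.uint8)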

  10. Modernizing bridge safety inspection with process improvement and digital assistance.

    DOT National Transportation Integrated Search

    2004-01-01

    This research effort was developed to record and analyze the Virginia Department of Transportation (VDOT) bridge/structure inspection processes as an aid to modernizing and automating these inspection processes through the use of mobile personal comp...

  11. [Digital x-ray image processing as an aid in forensic medicine].

    PubMed

    Buitrago-Tellez, C; Wenz, W; Friedrich, G

    1992-02-01

    Radiology plays an important role in the identification of unknown corpses. Positive radiographic identification by comparison with antemortem films is an established technique in this setting. Technical defects, together with poorly preserved films, sometimes make it difficult or even impossible to establish a confident comparison. Digital image processing after secondary digitization of ante- and postmortem films represents an important development and aid in forensic medicine. The application of this method is demonstrated in a single case.

  12. Development of a compact and cost effective multi-input digital signal processing system

    NASA Astrophysics Data System (ADS)

    Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun

    2018-01-01

    A prototype digital signal processing system (DSP) was developed using a microcontroller interfaced with a 12-bit sampling ADC, offering a considerably less expensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, a simple algorithm was employed for pulse height analysis in order to maximize the output counting rate. Moreover, an algorithm aiming at real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector in comparison with a traditional analogue system and a commercial digital system for a variety of count rates. The performance of the prototype system was consistently superior to the analogue and commercial digital systems up to an input count rate of 61 kcps, while at higher input rates it was slightly inferior to the commercial digital system but still superior to the analogue system. Considering overall cost, size and flexibility, this custom-made multi-input digital signal processing system (MMI-DSP) was the most reliable choice for 2D microdosimetric data collection, or for any measurement in which simultaneous collection of data from multiple inputs is required.
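
    As a hedged sketch of the basic pulse-height analysis step named above (threshold crossing, peak measurement and histogramming; the record's pile-up deconvolution is omitted and all thresholds are placeholders), the fragment below processes a digitized waveform with NumPy.

        import numpy as np

        def pulse_height_spectrum(trace, baseline, threshold, n_bins=1024, v_max=2.0):
            """Very simple pulse-height analysis of a digitized detector waveform.

            A pulse is registered at each upward crossing of `threshold`; its height is
            the local maximum minus `baseline`. Heights are histogrammed into a spectrum.
            """
            trace = np.asarray(trace, dtype=float)
            above = trace > threshold
            starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1   # upward crossings
            heights = []
            for start in starts:
                end = start
                while end < len(trace) and trace[end] > threshold:
                    end += 1
                heights.append(trace[start:end].max() - baseline)
            counts, bin_edges = np.histogram(heights, bins=n_bins, range=(0.0, v_max))
            return counts, bin_edges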

  13. System design and implementation of digital-image processing using computational grids

    NASA Astrophysics Data System (ADS)

    Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping

    2005-06-01

    As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved, and the difficulties of data processing and transfer, an important issue for current computer and geo-science experts is developing internet technology to implement rapid remotely sensed image processing. Computational grids are able to solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to solve imbalances of network resources and lopsided usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology: namely, spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation and so on. This paper focuses mainly on the application of computational grids to digital-image processing. Firstly, we describe the architecture of digital-image processing on the basis of computational grids; its implementation is then discussed in detail with respect to middleware technology. The whole network-based intelligent image-processing system is evaluated on the basis of the experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of the application of computational grids to digital-image processing.

  14. Development strategy and process models for phased automation of design and digital manufacturing electronics

    NASA Astrophysics Data System (ADS)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy for assuring the quality of electronics is presented as most important. To provide quality, the process sequence is considered and modeled by a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, intended for future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. The MatLab modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the process sequence, from individual processes to the whole life cycle.
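
    As a hedged toy illustration of modeling a process sequence with a Markov chain (the record does not publish its states or probabilities, so all names and numbers below are invented), the sketch propagates a state distribution through a row-stochastic transition matrix.

        import numpy as np

        # Invented process stages: design -> prototyping -> manufacturing -> test/release
        # Entry [i, j] is the probability of moving from stage i to stage j; rows sum to 1.
        P = np.array([
            [0.1, 0.8, 0.1, 0.0],
            [0.0, 0.2, 0.7, 0.1],
            [0.0, 0.0, 0.3, 0.7],
            [0.0, 0.0, 0.0, 1.0],
        ])

        state = np.array([1.0, 0.0, 0.0, 0.0])   # all work starts in the design stage
        for step in range(5):
            state = state @ P                     # one step of the chain
            print(step + 1, np.round(state, 3))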

  15. Analysis of Shade Matching in Natural Dentitions Using Intraoral Digital Spectrophotometer in LED and Filtered LED Light Sources.

    PubMed

    Chitrarsu, Vijai Krishnan; Chidambaranathan, Ahila Singaravel; Balasubramaniam, Muthukumar

    2017-10-31

    To evaluate the shade matching capabilities in natural dentitions using Vita Toothguide 3D-Master and an intraoral digital spectrophotometer (Vita Easyshade Advance 4.0) in various light sources. Participants between 20 and 40 years old with natural, unrestored right maxillary central incisors, no history of bleaching, orthodontic treatment, or malocclusion and no rotations were included. According to their shades, subjects were randomly selected and grouped into A1, A2, and A3. A total of 100 participants (50 male and 50 female) in each group were chosen for this study. Shade selection was made between 10 am and 2 pm for all light sources. The same examiner selected the shade of natural teeth with Vita Toothguide 3D-Master under natural light within 2 minutes. Once the Vita Toothguide 3D-Master was matched with the maxillary right central incisor, the L*, a*, and b* values, chroma, and hue were recorded with Vita Easyshade Advance 4.0 by placing it on the shade tab under the same light source. The values were statistically analyzed using one-way ANOVA and Tukey's HSD post hoc test with SPSS v22.0 software. The mean ΔE*ab values for shades A1, A2, and A3 for groups 1, 2, and 3 were statistically significantly different from each other (p < 0.001). The intraoral digital spectrophotometer showed statistically significant differences in shade matching compared to Vita Toothguide 3D-Master. Incandescent light showed more accurate shade matching than the filtered LED, LED, and daylight. © 2017 by the American College of Prosthodontists.
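
    As a hedged sketch of how a colour difference can be computed from the recorded L*, a*, b* values (the CIE76 formula is shown; the record does not state which ΔE formula the instrument applies, and the numbers are invented), see the fragment below.

        import math

        def delta_e_cie76(lab1, lab2):
            """CIE76 colour difference between two CIELAB triples (L*, a*, b*)."""
            return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

        # Example: shade tab versus natural tooth measured with an intraoral spectrophotometer
        print(delta_e_cie76((78.2, 1.5, 18.0), (76.9, 2.1, 20.3)))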

  16. Hair treatment process providing dispersed colors by light diffraction

    DOEpatents

    Lamartine, Bruce Carvell; Orler, E. Bruce; Sutton, Richard Matthew Charles; Song, Shuangqi

    2014-11-11

    Hair was coated with polymer-containing fluid and then hot pressed to form a composite of hair and a polymer film imprinted with a nanopattern. Polychromatic light incident on the nanopattern is diffracted into dispersed colored light.

  17. Hair treatment process providing dispersed colors by light diffraction

    DOEpatents

    Lamartine, Bruce Carvell; Orler, E. Bruce; Sutton, Richard Matthew Charles; Song, Shuangqi

    2013-12-17

    Hair was coated with polymer-containing fluid and then hot pressed to form a composite of hair and a polymer film imprinted with a nanopattern. Polychromatic light incident on the nanopattern is diffracted into dispersed colored light.

  18. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  19. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  20. The x-ray light valve: a low-cost, digital radiographic imaging system-spatial resolution

    NASA Astrophysics Data System (ADS)

    MacDougall, Robert D.; Koprinarov, Ivaylo; Webster, Christie A.; Rowlands, J. A.

    2007-03-01

    In recent years, new x-ray radiographic systems based on large area flat panel technology have revolutionized our capability to produce digital x-ray radiographic images. However, these active matrix flat panel imagers (AMFPIs) are extraordinarily expensive compared to the systems they are replacing. Thus there is a need for a low cost digital imaging system for general applications in radiology. Different approaches have been considered to make lower cost, integrated x-ray imaging devices for digital radiography, including: scanned projection x-ray, an integrated approach based on computed radiography technology and optically demagnified x-ray screen/CCD systems. These approaches suffer from either high cost or high mechanical complexity and do not have the image quality of AMFPIs. We have identified a new approach - the X-ray Light Valve (XLV). The XLV has the potential to achieve immediate readout in an integrated system with image quality comparable to AMFPIs. The XLV concept combines three well-established and hence low-cost technologies: an amorphous selenium (a-Se) layer to convert x-rays to image charge, a liquid crystal (LC) cell as an analog display, and an optical scanner for image digitization. Here we investigate the spatial resolution possible with XLV systems. a-Se and LC cells have both been shown separately to have inherently very high spatial resolution. Due to the close electrostatic coupling in the XLV, it can be expected that the spatial resolution of this system will also be very high. A prototype XLV was made and a typical office scanner was used for image digitization. The Modulation Transfer Function was measured and the limiting factor was seen to be the optical scanner. However, even with this limitation the XLV system is able to meet or exceed the resolution requirements for chest radiography.

  1. A Federated Digital Identity Management Approach for Business Processes

    NASA Astrophysics Data System (ADS)

    Bertino, Elisa; Ferrini, Rodolfo; Musci, Andrea; Paci, Federica; Steuer, Kevin J.

    Business processes have gained a lot of attention because of the pressing need for integrating existing resources and services to better fulfill customer needs. A key feature of business processes is that they are built from composable services, referred to as component services, that may belong to different domains. In such a context, flexible multi-domain identity management solutions are crucial for increased security and user-convenience. In particular, it is important that during the execution of a business process the component services be able to verify the identity of the client to check that it has the required permissions for accessing the services. To address the problem of multi-domain identity management, we propose a multi-factor identity attribute verification protocol for business processes that assures client privacy and handles naming heterogeneity.

  2. Detection of melanomas by digital imaging of spectrally resolved UV light-induced autofluorescence of human skin

    NASA Astrophysics Data System (ADS)

    Chwirot, B. W.; Chwirot, S.; Jedrzejczyk, W.; Redzinski, J.; Raczynska, A. M.; Telega, K.

    2001-07-01

    We studied the spectral and spatial distributions of the intensity of the ultraviolet light-excited fluorescence of human skin. Our studies, performed in situ in 162 patients with malignant and non-malignant skin lesions, resulted in a new method of detecting melanomas in situ using digital imaging of the spectrally resolved fluorescence. With our diagnostic algorithm we could successfully detect 88.5% of the cases of melanoma in the group of patients subjected to examination with the fluorescence method. A patent application for the method has been submitted to the Patent Office in Warsaw.

  3. Smectic A Filled Birefringent Elements and Fast Switching Twisted Dual Frequency Nematic Cells Used for Digital Light Deflection

    NASA Technical Reports Server (NTRS)

    Pishnyak, Oleg; Golovin, Andrii; Kreminskia, Liubov; Pouch, John J.; Miranda, Felix A.; Winker, Bruce K.; Lavrentovich, Oleg D.

    2006-01-01

    We describe the application of smectic A (SmA) liquid crystals for beam deflection. SmA materials can be used in digital beam deflectors (DBDs) as fillers for passive birefringent prisms. SmA prisms have high birefringence and can be constructed in a variety of shapes, including single prisms and prismatic blazed gratings of different angles and profiles. We address the challenges of uniform alignment of SmA, such as elimination of focal conic domains. Fast rotation of the incident light polarization in DBDs is achieved by an electrically switched 90° twisted nematic (TN) cell.

  4. Digital Detection and Processing of Multiple Quadrature Harmonics for EPR Spectroscopy

    PubMed Central

    Ahmad, R.; Som, S.; Kesselring, E.; Kuppusamy, P.; Zweier, J.L.; Potter, L.C.

    2010-01-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration. PMID:20971667

  5. Digital detection and processing of multiple quadrature harmonics for EPR spectroscopy.

    PubMed

    Ahmad, R; Som, S; Kesselring, E; Kuppusamy, P; Zweier, J L; Potter, L C

    2010-12-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration. Copyright © 2010 Elsevier Inc. All rights reserved.
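
    As a hedged, generic sketch of the digital down-conversion step described in the two records above (not the reported receiver, which also handles multiple harmonics and phase-noise estimation; the crude FFT low-pass stands in for a proper decimating filter), the fragment below mixes real ADC samples to complex baseband.

        import numpy as np

        def digital_downconvert(samples, f_ref, fs, keep_bins=50):
            """Mix a sampled band-pass signal to baseband and return complex I/Q.

            samples : real-valued ADC samples
            f_ref   : reference (carrier) frequency in Hz
            fs      : sampling frequency in Hz
            """
            samples = np.asarray(samples, dtype=float)
            n = np.arange(len(samples))
            lo = np.exp(-2j * np.pi * f_ref * n / fs)   # numeric local oscillator
            mixed = samples * lo                        # shifts the carrier to 0 Hz
            # Crude low-pass: keep only the lowest FFT bins around 0 Hz
            spectrum = np.fft.fft(mixed)
            spectrum[keep_bins:-keep_bins] = 0.0
            return np.fft.ifft(spectrum)                # real part = I, imaginary part = Q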

  6. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    ERIC Educational Resources Information Center

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  7. Digital photogrammetry and histomorphometric assessment of the effect of non-coherent light (light-emitting diode) therapy (λ640 ± 20 nm) on the repair of third-degree burns in rats.

    PubMed

    Neves, Silvana Maria Véras; Nicolau, Renata Amadei; Filho, Antônio Luiz Martins Maia; Mendes, Lianna Martha Soares; Veloso, Ana Maria

    2014-01-01

    Recent studies have demonstrated the efficacy of coherent light therapy from the red region of the electromagnetic spectrum on the tissue-healing process. This study analysed the effect of non-coherent light therapy (light-emitting diode-LED) with or without silver sulfadiazine (sulpha) on the healing process of third-degree burns. In this study, 72 rats with third-degree burns were randomly divided into six groups (n = 12): Gr1 (control), Gr2 (non-contact LED), Gr3 (contact LED), Gr4 (sulfadiazine), Gr5 (sulfadiazine + non-contact LED) and Gr6 (sulfadiazine + contact LED). The groups treated with LED therapy received treatment every 48 h (λ = 640 ± 20 nm, 110 mW, 16 J/cm(2); 41 s with contact and 680 s without contact). The digital photometric and histomorphometric analyses were conducted after the burn occurred. The combination of sulpha and LED (contact or non-contact) improved the healing of burn wounds. These results demonstrate that the combination of silver sulfadiazine with LED therapy (λ = 640 ± 20 nm, 4 J/cm(2), without contact) improves healing of third-degree burn wounds, significantly reduces the lesion area and increases the granulation tissue, increases the number of fibroblasts, promotes collagen synthesis and prevents burn infections by accelerating recovery.

  8. Angle of sky light polarization derived from digital images of the sky under various conditions.

    PubMed

    Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Yang, Yi; Ning, Yu

    2017-01-20

    Skylight polarization is used for navigation by some birds and insects. Skylight polarization also has potential for human navigation applications. Its advantages include relative immunity from interference and the absence of error accumulation over time. However, there are presently few examples of practical applications for polarization navigation technology. The main reason is its weak robustness during cloudy weather conditions. In this paper, the real-time measurement of the sky light polarization pattern across the sky has been achieved with a wide field of view camera. The images were processed under a new reference coordinate system to clearly display the symmetrical distribution of angle of polarization with respect to the solar meridian. A new algorithm for the extraction of the image axis of symmetry is proposed, in which the real-time azimuth angle between the camera and the solar meridian is accurately calculated. Our experimental results under different weather conditions show that polarization navigation has high accuracy, is strongly robust, and performs well during fog and haze, clouds, and strong sunlight.
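
    As a hedged sketch of one standard way to obtain a per-pixel angle of polarization from camera images (a three-orientation Stokes estimate; the paper's own symmetry-axis extraction algorithm is not reproduced here), see the fragment below.

        import numpy as np

        def angle_of_polarization(i0, i45, i90):
            """Angle of polarization from intensity arrays taken behind a linear
            polarizer at 0, 45 and 90 degrees, using the Stokes parameters
            S1 = I0 - I90 and S2 = 2*I45 - (I0 + I90); AoP = 0.5 * arctan2(S2, S1)."""
            i0 = np.asarray(i0, dtype=float)
            i45 = np.asarray(i45, dtype=float)
            i90 = np.asarray(i90, dtype=float)
            s1 = i0 - i90
            s2 = 2.0 * i45 - (i0 + i90)
            return 0.5 * np.arctan2(s2, s1)   # radians, in (-pi/2, pi/2]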

  9. The x-ray light valve: a potentially low-cost, digital radiographic imaging system--a liquid crystal cell design for chest radiography.

    PubMed

    Szeto, Timothy C; Webster, Christie Ann; Koprinarov, Ivaylo; Rowlands, J A

    2008-03-01

    Digital x-ray radiographic systems are desirable as they offer high quality images which can be processed, transferred, and stored without secondary steps. However, current clinical systems are extraordinarily expensive in comparison to film-based systems. Thus, there is a need for an economical digital imaging system for general radiology. The x-ray light valve (XLV) is a novel digital x-ray detector concept with the potential for high image quality and low cost. The XLV is comprised of a photoconductive detector layer and liquid crystal (LC) cell physically coupled in a sandwich structure. Upon exposure to x rays, charge is collected at the surface of the photoconductor, causing a change in the reflective properties of the LC cell. The visible image so formed can subsequently be digitized with an optical scanner. By choosing the properties of the LC cell in combination with the appropriate photoconductor thickness and bias potentials, the XLV can be optimized for various diagnostic imaging tasks. Specifically for chest radiography, we identified three potentially practical reflective cell designs by selecting from those commonly used in LC display technology. The relationship between reflectance and x-ray exposure (i.e., the characteristic curve) was determined for all three cells using a theoretical model. The results indicate that the reflective electrically controlled birefringence (r-ECB) cell is the preferred choice for chest radiography, provided that the characteristic curve can be shifted towards lower exposures. The feasibility of the shift of the characteristic curve is shown experimentally. The experimental results thus demonstrate that an XLV based on the r-ECB cell design exhibits a characteristic curve suitable for chest radiography.

  10. Digital Screening and Halftone Techniques for Raster Processing,

    DTIC Science & Technology

    1980-01-14

    [OCR-damaged report front matter; only fragments are recoverable: "Digital Screening and Halftone Techniques for Raster Processing", Richard L. Rosenthal, U.S. Army Engineer Topographic Laboratories, Fort Belvoir, VA, report AD-A081 090, January 1980; approved for public release, distribution unlimited.]

  11. Scalable Parallel Algorithms for Multidimensional Digital Signal Processing

    DTIC Science & Technology

    1991-12-31

    [No abstract available; the indexed text consists of OCR fragments from the report's reference list, including citations to A. L. Gorin, L. Auslander, and A. Silberger on balanced computation of 2D transforms on a tree, and to A. Norton and A. Silberger on parallelization and performance analysis of the Cooley-Tukey algorithm.]

  12. Digital Distraction: Shedding Light on the 21st-Century College Classroom

    ERIC Educational Resources Information Center

    Aaron, Lynn S.; Lipton, Talia

    2018-01-01

    It is not uncommon to walk into a college classroom and find all heads bowed down to a flashing screen and the room . . . silent. While digital devices can certainly support learning, what about when they are a distraction? This study explored this 21st-century phenomenon from two perspectives: Does the use of a device for nonacademic purposes…

  13. Digital optical signal processing with polarization-bistable semiconductor lasers

    SciTech Connect

    Jai-Ming Liu; Ying-Chin Chen

    1985-04-01

    The operations of a complete set of optical AND, NAND, OR, and NOR gates and clocked optical S-R, D, J-K, and T flip-flops are demonstrated, based on direct polarization switching and polarization bistability, which we have recently observed in InGaAsP/InP semiconductor lasers. By operating the laser in the direct-polarization-switchable mode, the output of the laser can be directly switched between the TM00 and TE00 modes with high extinction ratios by changing the injection-current level, and optical logic gates are constructed with two optoelectronic switches or photodetectors. In the polarization-bistable mode, the laser exhibits controllable hysteresis loops in the polarization-resolved power versus current characteristics. When the laser is biased in the middle of the hysteresis loop, the light output can be switched between the two polarization states by injection of short electrical or optical pulses, and clocked optical flip-flops are constructed with a few optoelectronic switches and/or photodetectors. The 1 and 0 states of these devices are defined through polarization changes of the laser and direct complement functions are obtainable from the TE and TM output signals from the same laser. Switching of the polarization-bistable lasers with fast-rising current pulses has an instrument-limited mode-switching time on the order of 1 ns. With fast optoelectronic switches and/or fast photodetectors, the overall switching speed of the logic gates and flip-flops is limited by the polarization-bistable laser to <1 ns. We have demonstrated the operations of these devices using optical signals generated by semiconductor lasers. The proposed schemes of our devices are compatible with monolithic integration based on current fabrication technology and are applicable to other types of bistable semiconductor lasers.

  14. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  15. Experimental study of digital image processing techniques for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Rifman, S. S. (Principal Investigator); Allendoerfer, W. B.; Caron, R. H.; Pemberton, L. J.; Mckinnon, D. M.; Polanski, G.; Simon, K. W.

    1976-01-01

    The author has identified the following significant results. Results are reported for: (1) subscene registration, (2) full scene rectification and registration, (3) resampling techniques, and (4) ground control point (GCP) extraction. Subscenes (354 pixels x 234 lines) were registered to approximately 1/4 pixel accuracy and evaluated by change detection imagery for three cases: (1) bulk data registration, (2) precision correction of a reference subscene using GCP data, and (3) independently precision processed subscenes. Full scene rectification and registration results were evaluated by using a correlation technique to measure registration errors of 0.3 pixel rms throughout the full scene. Resampling evaluations of nearest neighbor and TRW cubic convolution processed data included change detection imagery and feature classification. Resampled data were also evaluated for an MSS scene containing specular solar reflections.
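
    The record compares nearest-neighbor and TRW cubic-convolution resampling of registered subscenes. A minimal sketch of that kind of comparison follows, using scipy.ndimage spline interpolation of order 0 and order 3 as stand-ins for the nearest-neighbor and cubic-convolution resamplers (the actual TRW software and LANDSAT data are not reproduced); scene size and transform parameters are illustrative only.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    scene = rng.integers(0, 255, size=(234, 354)).astype(float)   # toy "subscene"

    # Geometric correction is stood in for by a small rotation plus a 0.3-pixel
    # shift; the real processing used full rectification models.
    angle_deg, shift = 0.5, (0.3, 0.3)

    nearest = ndimage.shift(ndimage.rotate(scene, angle_deg, reshape=False, order=0),
                            shift, order=0)       # nearest-neighbour resampling
    smooth = ndimage.shift(ndimage.rotate(scene, angle_deg, reshape=False, order=3),
                           shift, order=3)        # cubic spline (stand-in for
                                                  # cubic convolution)

    # "Change detection"-style comparison of the two resampling choices.
    diff = smooth - nearest
    print("mean abs difference:", np.abs(diff).mean())
    print("max abs difference: ", np.abs(diff).max())
    ```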

  16. Advanced Digital Signal Processing for Hybrid Lidar FY 2014

    DTIC Science & Technology

    2014-10-30

    processing steps on raw data, with a PC running LabVIEW performing the final calculations to obtain range measurements. A MATLAB-based system...regarding the object and it reduces the image contrast and resolution as well as the object ranging measurement accuracy. There have been various...frequency (>100 MHz) approach that uses high speed modulation to help suppress backscatter while also providing an unambiguous range measurement. In general

  17. Detection of proximal caries using quantitative light-induced fluorescence-digital and laser fluorescence: a comparative study.

    PubMed

    Yoon, Hyung-In; Yoo, Min-Jeong; Park, Eun-Jin

    2017-12-01

    The purpose of this study was to evaluate the in vitro validity of quantitative light-induced fluorescence-digital (QLF-D) and laser fluorescence (DIAGNOdent) for assessing proximal caries in extracted premolars, using digital radiography as the reference method. A total of 102 extracted premolars with similar lengths and shapes were used. A single operator conducted all the examinations using three different detection methods (bitewing radiography, QLF-D, and DIAGNOdent). The bitewing x-ray scale, QLF-D fluorescence loss (ΔF), and DIAGNOdent peak readings were compared and statistically analyzed. Each method showed excellent reliability. The correlation coefficients between bitewing radiography and QLF-D and DIAGNOdent were -0.644 and 0.448, respectively, while the value between QLF-D and DIAGNOdent was -0.382. The kappa statistics for bitewing radiography and QLF-D showed a higher diagnostic consensus than those for bitewing radiography and DIAGNOdent. QLF-D was moderately to highly accurate (AUC = 0.753 - 0.908), while DIAGNOdent was moderately to less accurate (AUC = 0.622 - 0.784). All detection methods showed statistically significant correlations, with a high correlation between bitewing radiography and QLF-D. QLF-D was found to be a valid and reliable alternative diagnostic method to digital bitewing radiography for the in vitro detection of proximal caries.
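
    The study reports correlation coefficients, kappa agreement, and ROC/AUC values for the three detection methods. A minimal sketch of how such statistics can be computed with scipy and scikit-learn follows; the scores, cut-offs, and the synthetic "ground truth" are invented purely for illustration and are not the study's data or thresholds.

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score, roc_auc_score

    rng = np.random.default_rng(1)
    n = 102                                   # number of proximal surfaces
    caries = rng.integers(0, 2, size=n)       # toy "ground truth" from radiography

    # Hypothetical continuous readings from the two devices.
    qlf_delta_f = -5.0 * caries - rng.normal(2.0, 1.5, n)    # ΔF is more negative with caries
    diagnodent = 20.0 * caries + rng.normal(5.0, 4.0, n)

    rho, p = spearmanr(qlf_delta_f, diagnodent)
    print(f"Spearman rho (QLF-D vs DIAGNOdent): {rho:.2f} (p={p:.3g})")

    # Dichotomise each device at an illustrative cut-off to compute agreement.
    qlf_positive = (qlf_delta_f < -5.0).astype(int)
    dd_positive = (diagnodent > 15.0).astype(int)
    print("kappa vs radiography, QLF-D:     ", cohen_kappa_score(caries, qlf_positive))
    print("kappa vs radiography, DIAGNOdent:", cohen_kappa_score(caries, dd_positive))

    # ROC analysis: AUC of each continuous reading against the reference.
    print("AUC QLF-D:      ", roc_auc_score(caries, -qlf_delta_f))
    print("AUC DIAGNOdent: ", roc_auc_score(caries, diagnodent))
    ```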

  18. Efficient Light Extraction of Organic Light-Emitting Diodes on a Fully Solution-Processed Flexible Substrate

    SciTech Connect

    Tong, Kwing; Liu, Xiaofeng; Zhao, Fangchao

    A flexible, nanocomposite substrate for maximizing light outcoupling efficiencies of organic light-emitting diodes (OLEDs) is introduced. In-depth investigation is performed on designing the integrated strategy based on considerations of surface conductivity, microcavity tuning, and internal light scattering. The resulting nanocomposite substrate consists of silver nanowires as the electrode and a high-index polymer layer and a light-scattering layer for light extraction. It is able to outcouple both the waveguide and the substrate modes, two modes accounting for significant losses in OLED device efficiency. With enhanced light outcoupling, white OLEDs subsequently fabricated on the nanocomposite substrates demonstrate performance metrics of 107 lm W⁻¹ power efficiency and 49% external quantum efficiency at 1000 cd m⁻². Thus, the nanocomposite substrate is fabricated by solution processes at low temperatures for potentially low manufacturing cost.

  19. Efficient Light Extraction of Organic Light-Emitting Diodes on a Fully Solution-Processed Flexible Substrate

    DOE PAGES

    Tong, Kwing; Liu, Xiaofeng; Zhao, Fangchao; ...

    2017-07-18

    A flexible, nanocomposite substrate for maximizing light outcoupling efficiencies of organic light-emitting diodes (OLEDs) is introduced. In-depth investigation is performed on designing the integrated strategy based on considerations of surface conductivity, microcavity tuning, and internal light scattering. The resulting nanocomposite substrate consists of silver nanowires as the electrode and a high-index polymer layer and a light-scattering layer for light extraction. It is able to outcouple both the waveguide and the substrate modes, two modes accounting for significant losses in OLED device efficiency. With enhanced light outcoupling, white OLEDs subsequently fabricated on the nanocomposite substrates demonstrate performance metrics of 107 lm W⁻¹ power efficiency and 49% external quantum efficiency at 1000 cd m⁻². Thus, the nanocomposite substrate is fabricated by solution processes at low temperatures for potentially low manufacturing cost.

  20. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
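
    The claimed method shifts the Fourier-space origin of each recorded hologram onto its spatial-heterodyne carrier, applies a digital filter around that origin, and inverse transforms. A compact numpy sketch of that sideband-demodulation idea on a synthetic fringe pattern is given below; it is a generic illustration, not the patented implementation, and the carrier and filter radius are made-up values.

    ```python
    import numpy as np

    def demodulate_hologram(holo, carrier, radius):
        """Recover a complex object field from one spatially-heterodyned hologram.

        holo    : 2-D real hologram containing spatial-heterodyne fringes
        carrier : (fy, fx) carrier frequency in cycles/pixel, set by the angle
                  between the reference beam and the object beam
        radius  : cut-off (cycles/pixel) of the digital filter applied around
                  the shifted origin
        """
        ny, nx = holo.shape
        y, x = np.mgrid[0:ny, 0:nx]
        # Multiplying by exp(-2*pi*i f_c.r) in the space domain shifts the
        # Fourier origin onto the carrier (equivalent to translating the spectrum).
        shifted = np.fft.fft2(holo * np.exp(-2j * np.pi * (carrier[0] * y + carrier[1] * x)))
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        lowpass = (fy ** 2 + fx ** 2) < radius ** 2   # digital filter around the origin
        return np.fft.ifft2(shifted * lowpass)        # inverse Fourier transform

    if __name__ == "__main__":
        ny = nx = 256
        y, x = np.mgrid[0:ny, 0:nx]
        phase = 2 * np.pi * ((x - nx / 2) ** 2 + (y - ny / 2) ** 2) / 3e4   # toy object phase
        carrier = (0.18, 0.22)                                              # made-up carrier
        holo = 1 + np.cos(2 * np.pi * (carrier[0] * y + carrier[1] * x) + phase)
        field = demodulate_hologram(holo, carrier, radius=0.1)
        residual = np.angle(field * np.exp(-1j * phase))
        print("rms residual phase (rad):", float(np.sqrt(np.mean(residual ** 2))))
    ```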

  1. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN]; Bingham, Philip R [Knoxville, TN]

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  2. Using digital inpainting to estimate incident light intensity for the calculation of red blood cell oxygen saturation from microscopy images.

    PubMed

    Sové, Richard J; Drakos, Nicole E; Fraser, Graham M; Ellis, Christopher G

    2018-05-25

    Red blood cell oxygen saturation is an important indicator of oxygen supply to tissues in the body. Oxygen saturation can be measured by taking advantage of spectroscopic properties of hemoglobin. When this technique is applied to transmission microscopy, the calculation of saturation requires determination of incident light intensity at each pixel occupied by the red blood cell; this value is often approximated from a sequence of images as the maximum intensity over time. This method often fails when the red blood cells are moving too slowly, or if hematocrit is too large since there is not a large enough gap between the cells to accurately calculate the incident intensity value. A new method of approximating incident light intensity is proposed using digital inpainting. This novel approach estimates incident light intensity with an average percent error of approximately 3%, which exceeds the accuracy of the maximum intensity based method in most cases. The error in incident light intensity corresponds to a maximum error of approximately 2% saturation. Therefore, though this new method is computationally more demanding than the traditional technique, it can be used in cases where the maximum intensity-based method fails (e.g. stationary cells), or when higher accuracy is required.
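
    The paper's key step is estimating the incident intensity under each red blood cell by digitally inpainting the cell region. A hedged OpenCV sketch of that idea follows, using cv2.inpaint on a toy transmission image with an assumed cell mask; the actual saturation calculation requires hemoglobin extinction spectra at two wavelengths and is only hinted at here through a per-pixel optical-density value.

    ```python
    import numpy as np
    import cv2

    # Toy transmission image: bright background with one absorbing "cell".
    img = np.full((120, 160), 200, dtype=np.uint8)
    cv2.circle(img, (80, 60), 12, 120, -1)          # the red blood cell

    # Mask of pixels occupied by the cell (in practice from segmentation),
    # slightly dilated so the cell edge does not bias the estimate.
    mask = np.zeros_like(img)
    cv2.circle(mask, (80, 60), 14, 255, -1)

    # Estimate the incident intensity I0 under the cell by inpainting it away.
    i0 = cv2.inpaint(img, mask, 5, cv2.INPAINT_TELEA)

    # Per-pixel optical density inside the cell, OD = log10(I0 / I); OD at two
    # wavelengths would then feed the usual oxygen-saturation calculation.
    inside = mask > 0
    od = np.log10(i0[inside].astype(float) / np.maximum(img[inside].astype(float), 1.0))
    print("mean inpainted I0 under the cell:", float(i0[inside].mean()))
    print("mean optical density            :", float(od.mean()))
    ```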

  3. Light-addressable electrodeposition of cell-encapsulated alginate hydrogels for a cellular microarray using a digital micromirror device

    PubMed Central

    Huang, Shih-Hao; Hsueh, Hui-Jung; Jiang, Yeu-Long

    2011-01-01

    This paper describes a light-addressable electrolytic system used to perform an electrodeposition of calcium alginate hydrogels using a digital micromirror device (DMD). In this system, a patterned light illumination is projected onto a photoconductive substrate serving as a photo-anode to electrolytically produce protons, which can lead to a decreased pH gradient. The low pH generated at the anode can locally release calcium ions from insoluble calcium carbonate (CaCO3) to cause gelation of calcium alginate through sol-gel transition. By controlling the illumination pattern on the DMD, a light-addressable electrodeposition of calcium alginate hydrogels with different shapes and sizes, as well as multiplexed micropatterning was performed. The effects of the concentration of the alginate and CaCO3 solutions on the dimensional resolution of alginate hydrogel formation were experimentally examined. A 3 × 3 array of cell-encapsulated alginate hydrogels was also successfully demonstrated through light-addressable electrodeposition. Our proposed method provides a programmable method for the spatiotemporally controllable assembly of cell populations into cellular microarrays and could have a wide range of biological applications in cell-based biosensing, toxicology, and drug discovery. PMID:22685500

  4. Light-addressable electrodeposition of cell-encapsulated alginate hydrogels for a cellular microarray using a digital micromirror device.

    PubMed

    Huang, Shih-Hao; Hsueh, Hui-Jung; Jiang, Yeu-Long

    2011-09-01

    This paper describes a light-addressable electrolytic system used to perform an electrodeposition of calcium alginate hydrogels using a digital micromirror device (DMD). In this system, a patterned light illumination is projected onto a photoconductive substrate serving as a photo-anode to electrolytically produce protons, which can lead to a decreased pH gradient. The low pH generated at the anode can locally release calcium ions from insoluble calcium carbonate (CaCO(3)) to cause gelation of calcium alginate through sol-gel transition. By controlling the illumination pattern on the DMD, a light-addressable electrodeposition of calcium alginate hydrogels with different shapes and sizes, as well as multiplexed micropatterning was performed. The effects of the concentration of the alginate and CaCO(3) solutions on the dimensional resolution of alginate hydrogel formation were experimentally examined. A 3 × 3 array of cell-encapsulated alginate hydrogels was also successfully demonstrated through light-addressable electrodeposition. Our proposed method provides a programmable method for the spatiotemporally controllable assembly of cell populations into cellular microarrays and could have a wide range of biological applications in cell-based biosensing, toxicology, and drug discovery.

  5. Digital phonocardiographic experiments and signal processing in multidisciplinary fields of university education

    NASA Astrophysics Data System (ADS)

    Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán

    2017-09-01

    Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well with the increasingly multidisciplinary nature of physics and engineering. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels of complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis of further modifications using personal computers, tablets, and smartphones.
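
    The record describes the sensor → signal-conditioning → ADC → software chain used for phonocardiography teaching. A short scipy sketch of a typical student exercise in that spirit follows (band-pass filtering a synthetic heart-sound signal and locating S1/S2 from its envelope); the sampling rate, filter band, and peak thresholds are illustrative assumptions, not values from the authors' open-source templates.

    ```python
    import numpy as np
    from scipy import signal

    fs = 4000                                    # sampling rate, Hz (illustrative)
    t = np.arange(0, 3.0, 1 / fs)

    # Synthetic phonocardiogram: two "lub-dub" bursts per beat plus noise.
    pcg = np.zeros_like(t)
    for beat in np.arange(0.2, 3.0, 1.0):
        for delay, f in ((0.0, 60.0), (0.30, 90.0)):        # S1 and S2 components
            idx = (t > beat + delay) & (t < beat + delay + 0.08)
            pcg[idx] += np.sin(2 * np.pi * f * t[idx]) * np.hanning(idx.sum())
    pcg += 0.05 * np.random.default_rng(0).standard_normal(t.size)

    # Band-pass 25-150 Hz, a common range for the main heart-sound energy.
    b, a = signal.butter(4, [25, 150], btype="bandpass", fs=fs)
    filtered = signal.filtfilt(b, a, pcg)

    # Envelope via the analytic signal; peaks of the envelope mark S1/S2.
    envelope = np.abs(signal.hilbert(filtered))
    peaks, _ = signal.find_peaks(envelope, height=0.3, distance=int(0.2 * fs))
    print("detected heart-sound events at t =", np.round(t[peaks], 2), "s")
    ```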

  6. Implementation of a Digital Signal Processing Subsystem for a Long Wavelength Array Station

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Navarro, Robert; D'Addario, Larry; Sigman, Elliott; Wang, Douglas

    2011-01-01

    This paper describes the implementation of a Digital Signal Processing (DSP) subsystem for a single Long Wavelength Array (LWA) station. The LWA is a radio telescope that will consist of many phased array stations. Each LWA station consists of 256 pairs of dipole-like antennas operating over the 10-88 MHz frequency range. The Digital Signal Processing subsystem digitizes up to 260 dual-polarization signals at 196 MHz from the LWA Analog Receiver, adjusts the delay and amplitude of each signal, and forms four independent beams. Coarse delay is implemented using a first-in-first-out buffer and fine delay is implemented using a finite impulse response filter. Amplitude adjustment and polarization corrections are implemented using a 2x2 matrix multiplication.
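
    The subsystem applies a coarse FIFO delay, a fine FIR fractional delay, amplitude/polarization corrections, and a coherent sum to form beams. The numpy sketch below illustrates the delay-and-sum idea for a handful of toy single-polarization signals only; the 2x2 polarization correction is therefore omitted, and the filter length, delays, and noise levels are invented rather than taken from the LWA design.

    ```python
    import numpy as np

    def fractional_delay_fir(frac, ntaps=17):
        """Windowed-sinc FIR approximating a delay of `frac` samples (0 <= frac < 1)."""
        n = np.arange(ntaps)
        h = np.sinc(n - (ntaps - 1) / 2 - frac) * np.hamming(ntaps)
        return h / h.sum()

    def delay_signal(x, delay_samples):
        """Coarse integer delay (FIFO-style shift) plus fine FIR fractional delay."""
        coarse = int(np.floor(delay_samples))
        frac = delay_samples - coarse
        x = np.concatenate([np.zeros(coarse), x[:len(x) - coarse]])
        return np.convolve(x, fractional_delay_fir(frac), mode="same")

    rng = np.random.default_rng(2)
    nsamp, nant = 4096, 8
    # Band-limited "sky" signal common to all antennas (crude low-pass of noise).
    wave = np.convolve(rng.standard_normal(nsamp), np.ones(8) / 8, mode="same")
    geom_delays = rng.uniform(0, 5, nant)            # per-antenna geometric delays

    # Each antenna sees the delayed wavefront plus its own receiver noise.
    antennas = [delay_signal(wave, d) + 0.3 * rng.standard_normal(nsamp)
                for d in geom_delays]

    # Beamforming: undo each delay and sum coherently.
    aligned = [delay_signal(a, geom_delays.max() - d)
               for a, d in zip(antennas, geom_delays)]
    beam = np.sum(aligned, axis=0)

    reference = delay_signal(wave, geom_delays.max())
    print("single-antenna correlation with wavefront:",
          round(np.corrcoef(aligned[0], reference)[0, 1], 3))
    print("beamformed correlation with wavefront:    ",
          round(np.corrcoef(beam, reference)[0, 1], 3))
    ```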

  7. Evaluation of solar angle variation over digital processing of LANDSAT imagery. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1984-01-01

    The effects of the seasonal variation of illumination on the digital processing of LANDSAT images are evaluated. Original images are transformed by means of digital filtering to enhance their spatial features. The resulting images are used to obtain an unsupervised classification of relief units. After defining relief classes, which are supposed to be spectrally different, topographic variables (declivity, altitude, relief range and slope length) are used to identify the true relief units existing on the ground. The samples are also clustered by means of an unsupervised classification option. The results obtained for each LANDSAT overpass are compared. Digital processing is highly affected by illumination geometry. There is no correspondence between relief units as defined by spectral features and those resulting from topographic features.

  8. Introduction to the Special Issue on Digital Signal Processing in Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Kocz, J.; Bailes, M.; Greenhill, L. J.

    2016-03-01

    Advances in astronomy are intimately linked to advances in digital signal processing (DSP). This special issue is focused upon advances in DSP within radio astronomy. The trend within that community is to use off-the-shelf digital hardware where possible and leverage advances in high performance computing. In particular, graphics processing units (GPUs) and field programmable gate arrays (FPGAs) are being used in place of application-specific circuits (ASICs); high-speed Ethernet and Infiniband are being used for interconnect in place of custom backplanes. Further, to lower hurdles in digital engineering, communities have designed and released general-purpose FPGA-based DSP systems, such as the CASPER ROACH board, ASTRON Uniboard, and CSIRO Redback board. In this introductory paper, we give a brief historical overview, a summary of recent trends, and provide an outlook on future directions.

  9. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  10. Oversampling of digitized images. [effects on interpolation in signal processing

    NASA Technical Reports Server (NTRS)

    Fischel, D.

    1976-01-01

    Oversampling is defined as sampling with a device whose characteristic width is greater than the interval between samples. This paper shows why oversampling should be avoided and discusses the limitations in data processing if circumstances dictate that oversampling cannot be circumvented. Principally, oversampling should not be used to provide interpolating data points. Rather, the time spent oversampling should be used to obtain more signal with less relative error, and the Sampling Theorem should be employed to provide any desired interpolated values. The concepts are applicable to single-element and multielement detectors.
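
    The note argues that interpolated values should come from the Sampling Theorem rather than from oversampling. As a hedged illustration, the sketch below applies Whittaker-Shannon (sinc) interpolation to critically sampled data; the signal frequency, rate, and query times are arbitrary.

    ```python
    import numpy as np

    def sinc_interpolate(samples, fs, t_query):
        """Whittaker-Shannon interpolation of uniformly spaced samples.

        samples : values taken at t_n = n / fs
        fs      : sampling rate (must exceed twice the signal bandwidth)
        t_query : times at which interpolated values are wanted
        """
        n = np.arange(len(samples))
        # x(t) = sum_n x[n] * sinc(fs*t - n)
        return np.array([np.sum(samples * np.sinc(fs * t - n))
                         for t in np.atleast_1d(t_query)])

    if __name__ == "__main__":
        f_sig, fs = 3.0, 10.0                        # 3 Hz tone sampled at 10 Hz
        n = np.arange(64)
        samples = np.sin(2 * np.pi * f_sig * n / fs)
        t = np.array([1.234, 2.345, 3.456])
        est = sinc_interpolate(samples, fs, t)
        true = np.sin(2 * np.pi * f_sig * t)
        print(np.round(est, 4), np.round(true, 4))   # close, except near the record edges
    ```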

  11. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  12. Encoding and decoding of digital spiral imaging based on bidirectional transformation of light's spatial eigenmodes.

    PubMed

    Zhang, Wuhong; Chen, Lixiang

    2016-06-15

    Digital spiral imaging has been demonstrated as an effective optical tool to encode optical information and retrieve topographic information of an object. Here we develop a conceptually new and concise scheme for optical image encoding and decoding toward free-space digital spiral imaging. We experimentally demonstrate that the optical lattices with ℓ=±50 orbital angular momentum superpositions and a clover image with nearly 200 Laguerre-Gaussian (LG) modes can be well encoded and successfully decoded. It is found that an image encoded/decoded with a two-index LG spectrum (considering both azimuthal and radial indices, ℓ and p) possesses much higher fidelity than that with a one-index LG spectrum (only considering the ℓ index). Our work provides an alternative tool for the image encoding/decoding scheme toward free-space optical communications.

  13. A comparison of orthogonal transformations for digital speech processing.

    NASA Technical Reports Server (NTRS)

    Campanella, S. J.; Robinson, G. S.

    1971-01-01

    Discrete forms of the Fourier, Hadamard, and Karhunen-Loeve transforms are examined for their capacity to reduce the bit rate necessary to transmit speech signals. To rate their effectiveness in accomplishing this goal the quantizing error (or noise) resulting for each transformation method at various bit rates is computed and compared with that for conventional companded PCM processing. Based on this comparison, it is found that Karhunen-Loeve provides a reduction in bit rate of 13.5 kbits/s, Fourier 10 kbits/s, and Hadamard 7.5 kbits/s as compared with the bit rate required for companded PCM. These bit-rate reductions are shown to be somewhat independent of the transmission bit rate.
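
    The paper ranks the Karhunen-Loeve, Fourier, and Hadamard transforms (against companded PCM) by the quantizing error they leave at a given bit rate. A rough, hedged sketch of that style of comparison follows, using an AR(1) surrogate for speech, a simplistic uniform quantizer in place of the paper's coder, an orthonormal DCT as a real-valued stand-in for the Fourier transform, and a data-derived KLT; the numbers it prints are not the paper's bit-rate results.

    ```python
    import numpy as np
    from scipy.linalg import hadamard
    from scipy.signal import lfilter

    def coding_mse(x, basis, bits=4, block=64):
        """Mean-squared error after uniform quantization of transform coefficients.

        `basis` must be orthonormal (rows are the basis vectors), so the
        inverse transform is simply the transpose.
        """
        blocks = x[:len(x) // block * block].reshape(-1, block)
        coeffs = blocks @ basis.T
        # One step size per coefficient index - a crude stand-in for the bit
        # allocation a real transform coder would use.
        step = (np.abs(coeffs).max(axis=0, keepdims=True) + 1e-12) / (2 ** (bits - 1))
        recon = (np.round(coeffs / step) * step) @ basis
        return float(np.mean((blocks - recon) ** 2))

    rng = np.random.default_rng(3)
    speech = lfilter([1.0], [1.0, -0.95], rng.standard_normal(64_000))  # AR(1) "speech"

    n = 64
    # Orthonormal DCT-II matrix: a real-valued stand-in for the Fourier basis.
    k, m = np.arange(n)[:, None], np.arange(n)[None, :]
    dct = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    dct[0, :] /= np.sqrt(2)
    had = hadamard(n) / np.sqrt(n)                   # orthonormal Hadamard
    blocks = speech[:len(speech) // n * n].reshape(-1, n)
    _, vecs = np.linalg.eigh(np.cov(blocks, rowvar=False))
    klt = vecs.T                                     # data-derived Karhunen-Loeve basis

    for name, basis in [("PCM (identity)", np.eye(n)), ("Hadamard", had),
                        ("DCT (for Fourier)", dct), ("Karhunen-Loeve", klt)]:
        print(f"{name:18s} quantization MSE = {coding_mse(speech, basis):.4f}")
    ```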

  14. Digital Signal Processing for the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2015-08-01

    A broad international collaboration is building the Event Horizon Telescope (EHT). The aim is to test Einstein’s theory of General Relativity in one of the very few places it could break down: the strong gravity regime right at the edge of a black hole. The EHT is an earth-size VLBI array operating at the shortest radio wavelengths, that has achieved unprecedented angular resolution of a few tens of μarcseconds. For nearby super massive black holes (SMBH) this size scale is comparable to the Schwarzschild Radius, and emission in the immediate neighborhood of the event horizon can be directly observed. We give an introduction to the science behind the CASPER-enabled EHT, and outline technical developments, with emphasis on the secret sauce of high speed signal processing.

  15. Expansion of the Eclipse Digital Signal Processing System.

    DTIC Science & Technology

    1982-12-01

    [No abstract recoverable; the indexed text is OCR-damaged front matter and figure fragments. Recoverable information: "Expansion of the Eclipse Digital Signal Processing System", Gordon R. Allen, 1st Lt, USAF, United States Air Force, December 1982.]

  16. Modular Scanning Confocal Microscope with Digital Image Processing.

    PubMed

    Ye, Xianjun; McCluskey, Matthew D

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength.
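
    The virtual pinhole is formed by selecting effective pixels of the CCD frame recorded at each scan position. A hedged numpy sketch of that idea follows: for every scan position it takes the frame's intensity centroid (an image moment) and sums only the pixels within a small radius of it. The frame sizes, the wandering spot, and the toy specimen are invented, not the authors' setup.

    ```python
    import numpy as np

    def confocal_image(frames, pinhole_radius=2):
        """Build a confocal image from per-scan-position CCD frames.

        frames : array of shape (ny, nx, ch, cw) - one (ch, cw) CCD frame for
                 every (ny, nx) scan position.  A "virtual pinhole" keeps only
                 pixels within `pinhole_radius` of the spot centroid.
        """
        ny, nx, ch, cw = frames.shape
        yy, xx = np.mgrid[0:ch, 0:cw]
        out = np.zeros((ny, nx))
        for iy in range(ny):
            for ix in range(nx):
                f = frames[iy, ix].astype(float)
                total = f.sum() + 1e-12
                cy, cx = (f * yy).sum() / total, (f * xx).sum() / total  # image moments
                mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= pinhole_radius ** 2
                out[iy, ix] = f[mask].sum()
        return out

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        ny = nx = 16
        frames = rng.poisson(2.0, size=(ny, nx, 21, 21)).astype(float)  # background light
        # A bright, slightly wandering spot whose brightness encodes a toy
        # specimen (a diagonal stripe pattern).
        specimen = (np.add.outer(np.arange(ny), np.arange(nx)) % 8 < 4) * 50.0
        for iy in range(ny):
            for ix in range(nx):
                frames[iy, ix, 10 + iy % 2, 10 + ix % 2] += specimen[iy, ix]
        img = confocal_image(frames, pinhole_radius=3)
        stripe = specimen > 0
        print("mean pinhole signal on the stripe :", round(float(img[stripe].mean()), 1))
        print("mean pinhole signal off the stripe:", round(float(img[~stripe].mean()), 1))
    ```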

  17. Modular Scanning Confocal Microscope with Digital Image Processing

    PubMed Central

    McCluskey, Matthew D.

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength. PMID:27829052

  18. Automated Coronal Loop Identification using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Lee, J. K.; Gary, G. A.; Newman, T. S.

    2003-05-01

    The results of a Master's thesis study of computer algorithms for automatic extraction and identification (i.e., collectively, "detection") of optically-thin, 3-dimensional, (solar) coronal-loop center "lines" from extreme ultraviolet and X-ray 2-dimensional images will be presented. The center lines, which can be considered to be splines, are proxies of magnetic field lines. Detecting the loops is challenging because there are no unique shapes, the loop edges are often indistinct, and because photon and detector noise heavily influence the images. Three techniques for detecting the projected magnetic field lines have been considered and will be described in the presentation. The three techniques used are (i) linear feature recognition of local patterns (related to the inertia-tensor concept), (ii) parametric space inferences via the Hough transform, and (iii) topological adaptive contours (snakes) that constrain curvature and continuity. Since coronal loop topology is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information that has also been incorporated into the detection process. Synthesized images have been generated to benchmark the suitability of the three techniques, and the performance of the three techniques on both synthesized and solar images will be presented and numerically evaluated in the presentation. The process of automatic detection of coronal loops is important in the reconstruction of the coronal magnetic field where the derived magnetic field lines provide a boundary condition for magnetic models (cf. Gary (2001, Solar Phys., 203, 71) and Wiegelmann & Neukirch (2002, Solar Phys., 208, 233)). This work was supported by NASA's Office of Space Science - Solar and Heliospheric Physics Supporting Research and Technology Program.

  19. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is on the extension of the earlier proposal of the same title, August 1994-June 1996. The report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed there were a few items that needed to be completed prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA, and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two different fields. The breast field has more power in the lower octaves and the off-breast field behaves similar to a wideband process, where more power is in the high frequency octaves. After the two fields are separated by lowpass filtering, a region labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases
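
    Task 1 separates the breast from the off-breast background by exploiting their different noise characteristics: low-pass filter, then keep the largest contiguous region. The scipy.ndimage sketch below illustrates that two-step idea on a synthetic image; the filter width, threshold, and phantom are assumptions, not the proposal's multiresolution parameters.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(5)

    # Synthetic "mammogram": a smooth, bright breast region on the left plus
    # wideband noise everywhere (the off-breast field behaves like wideband noise).
    img = rng.normal(0, 20, size=(256, 256))
    yy, xx = np.mgrid[0:256, 0:256]
    breast = (xx - 40) ** 2 / 2.0 + (yy - 128) ** 2 < 120 ** 2
    img[breast] += ndimage.gaussian_filter(rng.normal(120, 30, size=(256, 256)), 8)[breast]

    # Step 1: low-pass filter; the breast keeps its power, the noise is suppressed.
    low = ndimage.gaussian_filter(img, sigma=6)

    # Step 2: threshold and keep the largest contiguous labelled region.
    mask = low > 0.5 * low.max()
    labels, nlab = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, nlab + 1))
    largest = ndimage.binary_fill_holes(labels == (np.argmax(sizes) + 1))

    print("true breast pixels:     ", int(breast.sum()))
    print("segmented breast pixels:", int(largest.sum()))
    print("overlap fraction:       ", (largest & breast).sum() / breast.sum())
    ```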

  20. Digital Citizenship

    ERIC Educational Resources Information Center

    Isman, Aytekin; Canan Gungoren, Ozlem

    2014-01-01

    The era in which we live is known and referred to as the digital age. In this age, technology is rapidly changing and developing. In light of these technological advances in the 21st century, schools have the responsibility of training "digital citizens" as well as good citizens. Digital citizens must have extensive skills, knowledge, Internet and …

  1. Matching rendered and real world images by digital image processing

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated images (CGI) have been used in commercial and industrial photography, providing a broad scope in product advertising. Mixing real-world images with images rendered by virtual-space software reveals a more or less visible mismatch between their respective image quality. Rendered images are produced by software whose quality is limited only by the output resolution. Real-world images are taken with cameras that add some amount of image degradation from factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color-pattern demosaicing, etc. The effect of all these image-quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object by the system PSF, its characterization shows the amount of image degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real-world image qualities. The system MTF is determined by the slanted-edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different regions of the final image.
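
    The final step filters the rendered image with a Gaussian derived from the measured system PSF so that it matches the camera's image quality. A minimal sketch of that filtering step is given below; in practice the sigma would be fitted to the slanted-edge MTF measurement, whereas here it is simply an assumed value.

    ```python
    import numpy as np
    from scipy import ndimage

    def match_rendered_to_camera(rendered, psf_sigma_px):
        """Blur a rendered image with a Gaussian approximating the camera PSF.

        psf_sigma_px is the Gaussian sigma (in pixels) fitted to the system PSF
        obtained from a slanted-edge MTF measurement of the real camera + lens.
        """
        return ndimage.gaussian_filter(rendered.astype(float), sigma=psf_sigma_px)

    if __name__ == "__main__":
        # Toy rendered image: a perfectly sharp vertical edge.
        rendered = np.zeros((64, 64))
        rendered[:, 32:] = 255.0
        degraded = match_rendered_to_camera(rendered, psf_sigma_px=1.4)  # assumed sigma
        # Edge profile before/after: the blurred profile approximates what a
        # real capture of the same edge would look like.
        print("sharp edge :", rendered[32, 29:36].round(1))
        print("PSF-matched:", degraded[32, 29:36].round(1))
    ```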

  2. A Novel Optical/digital Processing System for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Boone, Bradley G.; Shukla, Oodaye B.

    1993-01-01

    This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result, but, in addition, we show simulation results that calculate angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images from which length, width, area, aspect ratio, and orientation can be derived. In addition to circumventing scale and rotation distortions, these simulations indicate that the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation test and evaluation using simple synthetic object data will be described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
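
    The Radon transform of a thresholded object also gives quick access to geometric features such as length, width, and orientation. The sketch below is a simplistic stand-in along those lines using skimage's radon function; it is not the authors' angular-correlation algorithm, and the object, threshold, and angle convention are illustrative only.

    ```python
    import numpy as np
    from skimage.transform import radon

    # Toy binary object: a tilted rectangle embedded in a frame.
    img = np.zeros((128, 128))
    yy, xx = np.mgrid[0:128, 0:128]
    ang = np.deg2rad(25)
    u = (xx - 64) * np.cos(ang) + (yy - 64) * np.sin(ang)
    v = -(xx - 64) * np.sin(ang) + (yy - 64) * np.cos(ang)
    img[(np.abs(u) < 40) & (np.abs(v) < 12)] = 1.0

    theta = np.arange(0.0, 180.0, 1.0)
    sinogram = radon(img, theta=theta)          # Radon transform (projections)

    # The support (number of non-zero bins) of each projection is a crude
    # shape descriptor: its minimum over angle approximates the object width,
    # the value 90 degrees away approximates the length, and the angle of the
    # minimum gives the orientation (modulo skimage's angle convention).
    support = (sinogram > 0.5).sum(axis=0)
    i_min = int(np.argmin(support))
    print("estimated width :", support[i_min])
    print("estimated length:", support[(i_min + 90) % 180])
    print("long-axis angle :", theta[i_min], "deg")
    ```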

  3. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  4. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  5. An Undergraduate Course and Laboratory in Digital Signal Processing with Field Programmable Gate Arrays

    ERIC Educational Resources Information Center

    Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.

    2010-01-01

    In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…

  6. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648

  7. Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.

    ERIC Educational Resources Information Center

    Smith, Cecil L., Ed.

    Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programing); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…

  8. Mathematics and Science Teachers' Perceptions about Using Drama during the Digital Story Creation Process

    ERIC Educational Resources Information Center

    Yuksekyalcin, Gozen; Tanriseven, Isil; Sancar-Tokmak, Hatice

    2016-01-01

    This case study investigated math and science teachers' perceptions about the use of creative drama during a digital story (DS) creation process for educational purposes. A total of 25 secondary science and math teachers were selected according to criterion sampling strategy to participate in the study. Data were collected through an open-ended…

  9. Digitizing dissertations for an institutional repository: a process and cost analysis.

    PubMed

    Piorun, Mary; Palmer, Lisa A

    2008-07-01

    This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions.

  10. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the joint work of specialists in medicine and in applied mathematics and computer science on developing new algorithms and methods for medical 2D and 3D imagery.

  11. Methods of Adapting Digital Content for the Learning Process via Mobile Devices

    ERIC Educational Resources Information Center

    Lopez, J. L. Gimenez; Royo, T. Magal; Laborda, Jesus Garcia; Calvo, F. Garde

    2009-01-01

    This article analyses different methods of adapting digital content for its delivery via mobile devices taking into account two aspects which are a fundamental part of the learning process; on the one hand, functionality of the contents, and on the other, the actual controlled navigation requirements that the learner needs in order to acquire high…

  12. Qualification process of CR system and quantification of digital image quality

    NASA Astrophysics Data System (ADS)

    Garnier, P.; Hun, L.; Klein, J.; Lemerle, C.

    2013-01-01

    CEA Valduc uses several X-ray generators to carry out many inspections: void search, welding expertise, gap measurements, etc. Most of these inspections are carried out on silver-based plates. For several years, CEA Valduc has been qualifying new devices such as digital plates or CCD/flat-panel detectors. On the one hand, this technological orientation anticipates the assumed, eventual disappearance of silver-based plates; on the other hand, it keeps our skills up to date. The main improvement brought by digital plates is the continuous progress in measurement accuracy, especially with image data processing. It is now common to measure defect thickness or depth position within a part. In such applications, image data processing is used to obtain complementary information compared with scanned silver-based plates. The scanning procedure is detrimental to measurements: it corrupts the resolution, adds numerical noise and is time-consuming. Digital plates make it possible to eliminate the scanning procedure and to increase resolution. It is nonetheless difficult to define a single image-quality criterion for digital images. A procedure has to be defined in order to estimate the quality of the digital data itself; the impact of the scanning device and of the configuration parameters also has to be taken into account. This presentation deals with the qualification process developed by CEA Valduc for digital plates (DUR-NDT), based on the study of quantitative criteria chosen to define a direct numerical image quality that can be compared with scanned silver-based pictures and the classical optical density. The versatility of the X-ray parameters is also discussed (X-ray voltage, intensity, exposure time). The aim is to transfer CEA Valduc's long experience with silver-based plate inspection to these new digital supports, which is an important industrial stake.

  13. Realization of guitar audio effects using methods of digital signal processing

    NASA Astrophysics Data System (ADS)

    Buś, Szymon; Jedrzejewski, Konrad

    2015-09-01

    The paper is devoted to studies of the possibilities of realizing guitar audio effects by means of digital signal processing methods. As a result of this research, selected audio effects suited to the specifics of the guitar sound were realized in a real-time system called the Digital Guitar Multi-effect. Before implementation in the system, the selected effects were investigated using a dedicated application with a graphical user interface created in the Matlab environment. In the second stage, a real-time system based on a microcontroller and an audio codec was designed and realized. The system is designed to perform the audio effects on the output signal of an electric guitar.
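
    Two effects of the kind such a multi-effect unit typically provides are soft-clipping distortion and a feedback delay. The numpy sketch below implements both on a synthetic plucked-string tone; it is a generic illustration of such effects, not the authors' Matlab application or microcontroller firmware, and all parameter values are arbitrary.

    ```python
    import numpy as np

    fs = 44100

    def soft_clip(x, drive=8.0):
        """Overdrive/distortion: tanh soft clipping with make-up normalisation."""
        return np.tanh(drive * x) / np.tanh(drive)

    def feedback_delay(x, delay_s=0.25, feedback=0.45, mix=0.5):
        """Classic delay/echo: wet[n] = x[n-D] + feedback*wet[n-D]; y = x + mix*wet."""
        d = int(delay_s * fs)
        y = np.copy(x)
        wet = np.zeros_like(x)
        for n in range(d, len(x)):
            wet[n] = x[n - d] + feedback * wet[n - d]
            y[n] = x[n] + mix * wet[n]
        return y

    if __name__ == "__main__":
        # Synthetic plucked-string-like tone: decaying fundamental plus harmonics.
        t = np.arange(0, 2.0, 1 / fs)
        tone = sum(np.exp(-3 * t) * np.sin(2 * np.pi * 110 * k * t) / k for k in (1, 2, 3))
        processed = feedback_delay(soft_clip(tone), delay_s=0.3, feedback=0.4)
        print("input peak :", round(float(np.abs(tone).max()), 3))
        print("output peak:", round(float(np.abs(processed).max()), 3))
        # `processed` could then be written to a WAV file, e.g. with scipy.io.wavfile.
    ```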

  14. The influence of the microscope lamp filament colour temperature on the process of digital images of histological slides acquisition standardization.

    PubMed

    Korzynska, Anna; Roszkowiak, Lukasz; Pijanowska, Dorota; Kozlowski, Wojciech; Markiewicz, Tomasz

    2014-01-01

    The aim of this study is to compare digital images of tissue biopsies captured with an optical microscope using the bright-field technique under various light conditions. The range of colour variation in tissue samples immunohistochemically stained with 3,3'-Diaminobenzidine and Haematoxylin is immense and comes from various sources. One of them is an inadequate setting of the camera's white balance for the microscope's light colour temperature. Although this type of error can easily be handled during image acquisition, it can also be eliminated afterwards with colour-adjustment algorithms. The dependence of colour variation on the microscope's light temperature and the camera settings is examined as introductory research for the process of automatic colour standardization. Six fields of view with empty space among the tissue samples have been selected for analysis. Each field of view has been acquired 225 times with various microscope light temperatures and camera white-balance settings. Fourteen randomly chosen images have been corrected and compared with the reference image by the following methods: Mean Square Error, Structural SIMilarity and visual assessment by a viewer. For two types of backgrounds and two types of objects, the statistical image descriptors (range, median, mean and its standard deviation of chromaticity on the a and b channels from the CIELab colour space, and luminance L) and the local colour variability for the objects' specific area have been calculated. The results have been averaged over the 6 images acquired with the same light conditions and camera settings for each sample. The analysis of the results leads to the following conclusions: (1) the images collected with the white-balance setting adjusted to the light colour temperature cluster in a certain area of the chromatic space, (2) the process of white-balance correction for images collected with white-balance camera settings not matched to the light temperature moves image descriptors into proper
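
    The descriptors used in the study (median, mean, standard deviation, and range of the L, a, b channels, plus MSE and SSIM against a reference) can be computed directly with skimage. The hedged sketch below does so on two toy acquisitions of the same field, one with an artificial warm colour cast; the images and the cast are invented, and the white-balance correction step itself is not reproduced.

    ```python
    import numpy as np
    from skimage.color import rgb2lab
    from skimage.metrics import mean_squared_error, structural_similarity

    rng = np.random.default_rng(6)

    # Two toy acquisitions of the same field of view: a reference image and one
    # taken under a warmer (more reddish) illumination.
    reference = np.clip(rng.normal(0.7, 0.05, size=(128, 128, 3)), 0, 1)
    warm = np.clip(reference * np.array([1.08, 1.0, 0.9]), 0, 1)   # colour cast

    for name, img in (("reference", reference), ("warm light", warm)):
        lab = rgb2lab(img)
        L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
        print(f"{name:10s} L median {np.median(L):6.2f}  "
              f"a mean {a.mean():6.2f} +/- {a.std():4.2f}  "
              f"b mean {b.mean():6.2f} +/- {b.std():4.2f}  "
              f"a range {np.ptp(a):5.2f}  b range {np.ptp(b):5.2f}")

    print("MSE  vs reference:", mean_squared_error(reference, warm))
    print("SSIM vs reference:", structural_similarity(reference, warm,
                                                      channel_axis=-1, data_range=1.0))
    ```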

  15. Light energy conservation processes in Halobacterium halobium cells

    NASA Technical Reports Server (NTRS)

    Bogomolni, R. A.

    1977-01-01

    Proton pumping driven by light or by respiration generates an electrochemical potential difference across the membrane in Halobacterium halobium. The pH changes induced by light or by respiration in cell suspensions are complicated by proton flows associated with the functioning of the cellular energy transducers. A proton-per-ATP ratio of about 3 is calculated from simultaneous measurements of phosphorylation and the proton inflow. This value is compatible with the chemiosmotic coupling hypothesis. The time course of the light-induced changes in membrane potential indicates that light-driven pumping increases a dark pre-existing potential of about 130 mV only by a small amount (20 to 30 mV). The complex kinetic features of the membrane potential changes do not closely follow those of the pH changes, which suggests that flows of ions other than protons are involved. A qualitative model consistent with the available data is presented.

  16. Digital processing of the Mariner 10 images of Venus and Mercury

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.

    1977-01-01

    An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.

  17. Low-cost Landsat digital processing system for state and local information systems

    NASA Technical Reports Server (NTRS)

    Hooper, N. J.; Spann, G. W.; Faust, N. L.; Paludan, C. T. N.

    1979-01-01

    The paper details a minicomputer-based system which is well within the budget of many state, regional, and local agencies that previously could not afford digital processing capability. In order to achieve this goal a workable small-scale Landsat system is examined to provide low-cost automated processing. It is anticipated that the alternative systems will be based on a single minicomputer, but that the peripherals will vary depending on the capability emphasized in a particular system.

  18. Passive auto-focus for digital still cameras and camera phones: Filter-switching and low-light techniques

    NASA Astrophysics Data System (ADS)

    Gamadia, Mark Noel

    In order to gain valuable market share in the growing consumer digital still camera and camera phone market, camera manufacturers have to continually add and improve existing features to their latest product offerings. Auto-focus (AF) is one such feature, whose aim is to enable consumers to quickly take sharply focused pictures with little or no manual intervention in adjusting the camera's focus lens. While AF has been a standard feature in digital still and cell-phone cameras, consumers often complain about their cameras' slow AF performance, which may lead to missed photographic opportunities, rendering valuable moments and events with undesired out-of-focus pictures. This dissertation addresses this critical issue to advance the state-of-the-art in the digital band-pass filter, passive AF method. This method is widely used to realize AF in the camera industry, where a focus actuator is adjusted via a search algorithm to locate the in-focus position by maximizing a sharpness measure extracted from a particular frequency band of the incoming image of the scene. There are no known systematic methods for automatically deriving the parameters such as the digital pass-bands or the search step-size increments used in existing passive AF schemes. Conventional methods require time consuming experimentation and tuning in order to arrive at a set of parameters which balance AF performance in terms of speed and accuracy ultimately causing a delay in product time-to-market. This dissertation presents a new framework for determining an optimal set of passive AF parameters, named Filter-Switching AF, providing an automatic approach to achieve superior AF performance, both in good and low lighting conditions based on the following performance measures (metrics): speed (total number of iterations), accuracy (offset from truth), power consumption (total distance moved), and user experience (in-focus position overrun). Performance results using three different prototype cameras
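
    Passive AF maximizes a sharpness value extracted from a frequency band of the image while a search algorithm steps the focus lens. The toy simulation below illustrates that loop: frames are increasingly defocused copies of a scene, the focus value is the energy passed by a simple high-pass filter, and a coarse-to-fine hill climb finds the peak. It is a generic illustration, not the dissertation's filter-switching bands or step-size schedule.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(7)
    scene = ndimage.gaussian_filter(rng.standard_normal((128, 128)), 1.0)

    def frame_at(lens_pos, in_focus_pos=37):
        """Simulated capture: blur grows with distance from the in-focus position."""
        blur = 0.3 + 0.08 * abs(lens_pos - in_focus_pos)
        return ndimage.gaussian_filter(scene, blur)

    def focus_value(img):
        """Sharpness measure: energy passed by a simple high-pass (band-pass) filter."""
        highpass = img - ndimage.gaussian_filter(img, 2.0)
        return float(np.sum(highpass ** 2))

    def hill_climb(start=0, step=8, lo=0, hi=100):
        """Coarse-to-fine search of the focus-value curve over lens positions."""
        pos, iters = start, 0
        while step >= 1:
            candidates = [max(lo, pos - step), pos, min(hi, pos + step)]
            values = [focus_value(frame_at(p)) for p in candidates]
            iters += len(candidates)
            best = candidates[int(np.argmax(values))]
            if best == pos:
                step //= 2        # peak bracketed: refine with a smaller step
            pos = best
        return pos, iters

    pos, iters = hill_climb()
    print(f"in-focus lens position found at {pos} after {iters} focus evaluations")
    ```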

  19. [Diagnosis and treatment of complicated anterior teeth esthetic defects by combination of whole-process digital esthetic rehabilitation with periodontic surgery].

    PubMed

    Li, Z; Liu, Y S; Ye, H Q; Liu, Y S; Hu, W J; Zhou, Y S

    2017-02-18

    To explore a new method of whole-process digital esthetic prosthodontic rehabilitation combined with periodontic surgery for complicated anterior teeth esthetic defects accompanied by soft tissue morphology problems, and to provide an alternative choice for solving this problem under the guidance of a three-dimensional (3D) printed digital dental model and surgical guide, thus completing periodontic surgery and digital esthetic rehabilitation of the anterior teeth. In this study, 12 patients with complicated esthetic problems accompanied by soft tissue morphology problems in their anterior teeth were included. The dentition and facial images were obtained by intra-oral scanning and 3D facial scanning and then calibrated. Two esthetic designs and prosthodontic outcome predictions were created by computer-aided design/computer-aided manufacturing (CAD/CAM) software combined with digital photography: one considering white esthetics only and one comprehensively considering pink-white esthetics. The predictive designs of the prostheses and the facial appearances of the two designs were evaluated by the patients. If a patient chose the design comprehensively considering pink-white esthetics, the patient then chose whether to receive periodontic surgery before esthetic rehabilitation. The dentition design cast of those who chose periodontic surgery was 3D printed to guide the periodontic surgery accordingly. In light of the two digital designs based on intra-oral scanning, facial scanning and digital photography, the satisfaction rate of the patients was significantly higher for the comprehensive pink-white esthetic design (P<0.05), and more patients tended to choose periodontic surgery before esthetic rehabilitation. The 3D printed digital dental model and surgical guide provided significant guidance for periodontic surgery and achieved successful transfer from digital design to clinical application. The prostheses were fabricated by CAD

  20. Evolution of digital angiography systems.

    PubMed

    Brigida, Raffaela; Misciasci, Teresa; Martarelli, Fabiola; Gangitano, Guido; Ottaviani, Pierfrancesco; Rollo, Massimo; Marano, Pasquale

    2003-01-01

    The innovations introduced by digital subtraction angiography in digital radiography are briefly illustrated with a description of its components and functioning. The pros and cons of digital subtraction angiography are analyzed in light of present and future imaging technologies. In particular, the advantages include automatic exposure, digital image subtraction, digital post-processing, a high number of images per second, and possible changes in density and contrast. The disadvantages include a small round field of view, geometric distortion at the image periphery, high sensitivity to patient movements, and not very high spatial resolution. At present, flat panel detectors represent the most suitable substitutes for digital subtraction angiography, with the introduction of novel solutions for those artifacts which for years have hindered its diagnostic validity. The concept of temporal artifact, reset light, and possible future evolutions of this technology that may afford both diagnostic and radiation-protection advantages are analyzed.

  1. Study of heat dissipation process from heat sink using lensless Fourier transform digital holographic interferometry.

    PubMed

    Kumar, Varun; Shakher, Chandra

    2015-02-20

    This paper presents the results of experimental investigations of the heat dissipation process of a plate fin heat sink using digital holographic interferometry. Visual inspection of reconstructed phase difference maps of the air field around the heat sink, with and without electric power in the load resistor, provides qualitative information about the variation of temperature and the heat dissipation process. Quantitative information about the temperature distribution is obtained from the relationship between the digitally reconstructed phase difference maps of ambient air and heated air. Experimental results are presented for different currents and voltages in the load resistor to investigate the heat dissipation process. The effect of fin spacing on the heat dissipation performance of the heat sink is also investigated in the case of natural convection. From the experimental data, heat transfer parameters such as local heat flux and convective heat transfer coefficients are also calculated.

  2. White OLED devices and processes for lighting applications

    NASA Astrophysics Data System (ADS)

    Ide, Nobuhiro; Tsuji, Hiroya; Ito, Norihiro; Matsuhisa, Yuko; Houzumi, Shingo; Nishimori, Taisuke

    2010-05-01

    The basic performance of white OLEDs has improved dramatically in recent years, and the application of OLEDs to lighting is expected to become a reality in the near future. We have developed various technologies for OLED lighting with the aid of the Japanese governmental project "High-efficiency lighting based on the organic light-emitting mechanism." In this project, a white OLED with high efficiency (37 lm/W) and high-quality emission characteristics (CRI of 95, with a small variation of chromaticity in different directions and chromaticity just on the black-body radiation curve) applicable to lighting was realized by a two-unit structure with a fluorescent deep-blue emissive unit and a phosphorescent green and red emissive unit. The half-decay lifetime of this white OLED at 1,000 cd/m2 was over 40,000 h. A heat-radiative, thin encapsulation structure (less than 1 mm) realized very stable emission at high luminance of over 3,000 cd/m2. A new deposition source with a hot wall and a rate-controllable valve was developed. Thickness uniformity within +/- 3% at a high deposition rate of over 8 nm/s, high material utilization of over 70%, and repeatable deposition rate controllability were confirmed.

  3. Optimization of digital image processing to determine quantum dots' height and density from atomic force microscopy.

    PubMed

    Ruiz, J E; Paciornik, S; Pinto, L D; Ptak, F; Pires, M P; Souza, P L

    2018-01-01

    An optimized method of digital image processing to interpret quantum dots' height measurements obtained by atomic force microscopy is presented. The method was developed by combining well-known digital image processing techniques and particle recognition algorithms. The properties of quantum dot structures strongly depend on dot height, among other features. Determination of their height is sensitive to small variations in the digital image processing parameters, which can generate misleading results. Comparing the results obtained with two image processing techniques - a conventional method and the new method proposed herein - with the data obtained by determining the height of quantum dots one by one within a fixed area showed that the optimized method leads to more accurate results. Moreover, the log-normal distribution, which is often used to represent natural processes, shows a better fit to the quantum dots' height histogram obtained with the proposed method. Finally, the quantum dots' heights obtained were used to calculate the predicted photoluminescence peak energies, which were compared with the experimental data. Again, a better match was observed when using the proposed method to evaluate the quantum dots' height. Copyright © 2017 Elsevier B.V. All rights reserved.
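
    As a rough companion to the approach described, a minimal sketch of the generic pipeline (segment the dots, take a per-dot height, fit a log-normal distribution to the histogram) is given below. It runs on a synthetic AFM height map, and the threshold choice is illustrative, not the paper's optimized parameters.

```python
import numpy as np
from scipy import ndimage, stats

rng = np.random.default_rng(1)

# Synthetic AFM height map (nm): flat substrate with Gaussian-shaped dots
# whose peak heights are drawn from a log-normal distribution.
size, n_dots = 512, 60
height_map = rng.normal(0.0, 0.05, (size, size))            # substrate roughness
yy, xx = np.mgrid[0:size, 0:size]
true_heights = rng.lognormal(mean=np.log(3.0), sigma=0.25, size=n_dots)  # ~3 nm
for h in true_heights:
    cy, cx = rng.integers(20, size - 20, 2)
    height_map += h * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 4.0 ** 2))

# 1) Segment dots: threshold above the substrate level, then label particles.
threshold = height_map.mean() + 3 * height_map.std()         # illustrative choice
labels, n_found = ndimage.label(height_map > threshold)

# 2) Per-dot height: maximum within each labeled region minus the local floor.
peak = ndimage.maximum(height_map, labels, index=range(1, n_found + 1))
floor = np.median(height_map[labels == 0])
dot_heights = np.asarray(peak) - floor

# 3) Fit a log-normal distribution to the height histogram.
shape, loc, scale = stats.lognorm.fit(dot_heights, floc=0)
print(f"dots found: {n_found}, median height: {np.median(dot_heights):.2f} nm, "
      f"log-normal sigma: {shape:.2f}")
```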

  4. Integration of image capture and processing: beyond single-chip digital camera

    NASA Astrophysics Data System (ADS)

    Lim, SukHwan; El Gamal, Abbas

    2001-05-01

    An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications, such as multiple capture for enhancing dynamic range, and to improve the performance of existing applications, such as optical flow estimation. Conventional digital cameras operate at low frame rates, and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application-specific data at the standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated not only to perform the functions of a conventional camera system but also to perform applications such as real-time optical flow estimation.
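
    The multiple-capture idea mentioned above can be sketched as follows. This is a minimal illustration under an assumed toy sensor model (12-bit full well, exposure-proportional signal), not the authors' on-chip algorithm: each pixel keeps the longest non-saturated exposure and renormalizes it to a common scale.

```python
import numpy as np

FULL_WELL = 4095          # 12-bit sensor saturation level (illustrative)

def expose(scene_irradiance, t_exposure, rng):
    """Toy sensor model: signal grows with exposure time, clips at full well."""
    electrons = scene_irradiance * t_exposure
    noisy = electrons + rng.normal(0.0, np.sqrt(np.maximum(electrons, 1.0)))
    return np.clip(noisy, 0, FULL_WELL)

def combine_captures(captures, exposure_times):
    """Merge multiple captures: for each pixel use the longest non-saturated
    exposure, normalized back to a common radiometric scale."""
    merged = np.zeros_like(captures[0], dtype=float)
    chosen_t = np.zeros_like(merged)
    for frame, t in sorted(zip(captures, exposure_times), key=lambda p: p[1]):
        usable = frame < 0.95 * FULL_WELL          # not (nearly) saturated
        take = usable & (t > chosen_t)             # prefer longer exposures
        merged[take] = frame[take] / t             # normalize by exposure time
        chosen_t[take] = t
    return merged

rng = np.random.default_rng(2)
scene = np.concatenate([np.full(4, 20.0), np.full(4, 2000.0)])  # dark and bright
times = [1.0, 4.0, 16.0]
frames = [expose(scene, t, rng) for t in times]
print(np.round(combine_captures(frames, times), 1))   # dark pixels keep the long
                                                      # exposure, bright the short
```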

  5. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  6. Newly patented process enables low-cost solution for increasing white light spectrum of LEDs

    NASA Astrophysics Data System (ADS)

    Spanard, Jan-Marie

    2017-10-01

    A newly patented process for completing the spectral light array emitted by LED bulbs provides a low-cost method for producing better human centered lighting (HCL). This process uses non-luminescent colorant filters, filling out the jagged LED spectral emission into a full, white light array. While LED bulbs have the distinct economic advantages of using less energy, producing less heat and lasting years longer than traditional incandescent bulbs, the persistent metameric failure of LED bulbs has resulted in slower, and sometimes reluctant, adoption of LED lighting by the residential, retail and architectural markets. Adding missing wavelengths to LED generated bulbs via colorant filters increases the aesthetic appeal of the light by decreasing current levels of metameric failure, reducing the "flatness," "harshness," and "dullness" of LED generated light reported by consumers. LED phosphor-converted light can be successfully tuned to "whiter" white light with selective color filtering using permanent, durable transparent pigments. These transparent pigments are selectively applied in combination with existing manufacturing technologies and utilized as a final color-tuning step in bulb design. The quantity of emitted light chosen for color filtering can be adjusted from 1% to 100% of emitted light, creating a custom balance of light quantity with light quality. This invention recognizes that "better light" is frequently chosen over "more light" in the consumer marketplace.

  7. Photosynthetic light capture and processing from cell to canopy

    SciTech Connect

    Stenberg, P.; DeLucia, E.H.; Schoettle, A.W.

    1995-07-01

    We have addressed the unique structural features of conifers, as they relate to photosynthetic production, at different levels of organization (from needle to canopy). Many concepts and measures must be defined for conifers so that they are consistent with the structural properties of needles and shoots. Consistency is needed in comparing the photosynthetic performance of conifers and broad leaves, wherein it is important to distinguish the effect of structural factors on light capture from differences in the photosynthetic response at a fixed interception. Needles differ from broad leaves both with respect to inner structure and external shape, which includes a continuum from nearly flat to cylindrical. For nonflat three-dimensional objects such as for conifer needles, total surface area is the natural measure. The meaning of the one-sided area of needles is not clear, but consistency requires that it be defined as half the total needle surface area, as concluded. Characteristic structural factors of conifers that affect their ability to harvest light are a deep canopy combined with a small needle size, which create an important penumbra effect, and the clustering of needles on shoots, which creates a discontinuous distribution of needle area. These factors imply that, at a fixed leaf area index, the intercepted PAR would be smaller in coniferous than in broad-leafed canopies, but the vertical gradient of light in conifers is less steep and light reaching the lower canopy is all penumbral (diffuse). Conifers can maintain a higher leaf area index, and this may be accomplished by a more even distribution of light between shoots at different locations in the canopy and also because shade shoots have a structure that effectively intercepts light. Broad leaves in general have higher maximum photosynthetic rates than do needles, and yet conifers are at least equally productive on a stand basis. Possible reasons are discussed.

  8. UmUTracker: A versatile MATLAB program for automated particle tracking of 2D light microscopy or 3D digital holography data

    NASA Astrophysics Data System (ADS)

    Zhang, Hanqing; Stangner, Tim; Wiklund, Krister; Rodriguez, Alvaro; Andersson, Magnus

    2017-10-01

    We present a versatile and fast MATLAB program (UmUTracker) that automatically detects and tracks particles by analyzing video sequences acquired by either light microscopy or digital in-line holographic microscopy. Our program detects the 2D lateral positions of particles with an algorithm based on the isosceles triangle transform, and reconstructs their 3D axial positions by a fast implementation of the Rayleigh-Sommerfeld model using a radial intensity profile. To validate the accuracy and performance of our program, we first track the 2D position of polystyrene particles using bright field and digital holographic microscopy. Second, we determine the 3D particle position by analyzing synthetic and experimentally acquired holograms. Finally, to highlight the full program features, we profile the microfluidic flow in a 100 μm high flow chamber. This result agrees with computational fluid dynamic simulations. On a regular desktop computer UmUTracker can detect, analyze, and track multiple particles at 5 frames per second for a template size of 201 × 201 in a 1024 × 1024 image. To enhance usability and to make it easy to implement new functions we used object-oriented programming. UmUTracker is suitable for studies related to: particle dynamics, cell localization, colloids and microfluidic flow measurement. Program files doi: http://dx.doi.org/10.17632/fkprs4s6xp.1. Licensing provisions: Creative Commons by 4.0 (CC by 4.0). Programming language: MATLAB. Nature of problem: 3D multi-particle tracking is a common technique in physics, chemistry and biology. However, in terms of accuracy, reliable particle tracking is a challenging task since results depend on sample illumination, particle overlap, motion blur and noise from recording sensors. Additionally, the computational performance is also an issue if, for example, a computationally expensive process is executed, such as axial particle position reconstruction from digital holographic microscopy data. Versatile

  9. Polymer Light-Emitting Diode (PLED) Process Development

    DTIC Science & Technology

    2003-12-01

    Conclusions and recommendations for Phase II of the Flexible Display Program. Subject terms: light emitting diodes, liquid crystal display systems. The Phase I and II space is confined by backplane complexity and substrate form. Figure 6: semi-automated I-V curve measurement setup consisting of a Keithley power supply and a computer.

  10. Deconstructing processing speed deficits in schizophrenia: application of a parametric digit symbol coding test.

    PubMed

    Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C

    2010-05-01

    Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  11. The application of image processing in the measurement for three-light-axis parallelity of laser ranger

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Wang, Qianqian

    2008-12-01

    When a laser ranger is transported or used in field operations, the transmitting axis, receiving axis and aiming axis may not remain parallel. Nonparallelism of the three light axes will degrade the range-measuring ability or prevent the laser ranger from being aimed accurately, so testing and adjusting the three-light-axis parallelity during production and maintenance is important to ensure that the laser ranger can be used reliably. Based on a comparison of some common measurement methods for three-light-axis parallelity, the paper proposes a new measurement method using digital image processing. It uses a large-aperture off-axis paraboloid reflector to obtain images of the laser spot and the white-light cross line, and then processes the images on the LabVIEW platform. The center of the white-light cross line is found by a matching algorithm in a LabVIEW DLL, and the center of the laser spot is found by grayscale transformation, binarization and area filtering in turn. The software system can configure the CCD, detect the off-axis paraboloid reflector, measure the parallelity of the transmitting axis and the aiming axis, and control the attenuation device. The hardware system uses the SAA7111A, a programmable video decoding chip, to perform A/D conversion; a FIFO (first-in first-out) is selected as the buffer, and a USB bus transmits data to the PC. The three-light-axis parallelity is obtained from the position bias between the two centers. A device based on this method is already in use, and its application shows that the method offers high precision, speed and a high degree of automation.
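
    A minimal sketch of the laser-spot location steps named in the abstract (binarization, area filtering, centroid) is shown below in Python rather than LabVIEW; the fixed threshold fraction, minimum area, and synthetic frame are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def laser_spot_center(gray, min_area=20):
    """Locate the laser spot: binarize, reject small regions (area filter),
    then take the centroid of the largest remaining region."""
    # 1) Binarization: a fixed fraction of the peak value (illustrative choice).
    binary = gray > 0.5 * gray.max()
    # 2) Area filter: label connected regions and drop those below min_area.
    labels, n = ndimage.label(binary)
    if n == 0:
        return None
    areas = ndimage.sum(binary, labels, index=range(1, n + 1))
    keep = [i + 1 for i, a in enumerate(areas) if a >= min_area]
    if not keep:
        return None
    # 3) Centroid of the largest surviving region = laser spot center.
    largest = keep[int(np.argmax([areas[i - 1] for i in keep]))]
    return ndimage.center_of_mass(gray, labels, largest)

# Synthetic frame: dim background, one isolated hot pixel, and a Gaussian spot.
rng = np.random.default_rng(3)
frame = rng.random((240, 320)) * 0.1
yy, xx = np.mgrid[0:240, 0:320]
frame += np.exp(-((yy - 120.0) ** 2 + (xx - 200.0) ** 2) / (2 * 5.0 ** 2))
frame[10, 10] = 1.0                       # hot pixel, rejected by the area filter
print(laser_spot_center(frame))           # approximately (120.0, 200.0)
```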

  12. A digital pixel cell for address event representation image convolution processing

    NASA Astrophysics Data System (ADS)

    Camunas-Mesa, Luis; Acosta-Jimenez, Antonio; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Also, neurons generate events according to their information levels. Neurons with more information (activity, derivative of activity, contrast, motion, edges, ...) generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. AER technology has been used and reported for the implementation of various types of image sensors or retinae: luminance with local AGC, contrast retinae, motion retinae, and so on. There has also been a proposal for realizing programmable-kernel image convolution chips. Such convolution chips would contain an array of pixels that perform weighted addition of events. Once a pixel has added sufficient event contributions to reach a fixed threshold, the pixel fires an event, which is then routed out of the chip for further processing. Such convolution chips have been proposed to be implemented using pulsed current-mode mixed analog and digital circuit techniques. In this paper we present a fully digital pixel implementation to perform the weighted additions and fire the events. This way, for a given technology, there is a fully digital implementation reference against which to compare the mixed-signal implementations. We have designed, implemented and tested a fully digital AER convolution pixel. This pixel will be used to implement a full AER convolution chip for programmable-kernel image convolution processing.
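
    The accumulate-and-fire behaviour of such a convolution pixel can be sketched with a small behavioural model. This is a plain software illustration of the idea, not the chip's digital circuit; the kernel, threshold, and reset-to-zero policy are assumptions.

```python
import numpy as np

class AERConvolver:
    """Behavioural model of an AER convolution array: each input event adds a
    kernel-weighted contribution around its address; pixels that reach the
    firing threshold emit an output event and reset."""

    def __init__(self, shape, kernel, threshold):
        self.acc = np.zeros(shape)
        self.kernel = np.asarray(kernel, dtype=float)
        self.threshold = threshold

    def event(self, y, x):
        """Process one input address event at (y, x); return output events."""
        kh, kw = self.kernel.shape
        oy, ox = y - kh // 2, x - kw // 2          # top-left of kernel footprint
        out = []
        for dy in range(kh):
            for dx in range(kw):
                py, px = oy + dy, ox + dx
                if 0 <= py < self.acc.shape[0] and 0 <= px < self.acc.shape[1]:
                    self.acc[py, px] += self.kernel[dy, dx]
                    if self.acc[py, px] >= self.threshold:
                        self.acc[py, px] = 0.0      # fire and reset
                        out.append((py, px))
        return out

# A 3x3 smoothing kernel; repeated events at one address eventually make the
# neighbourhood fire, the centre pixel first because it gets the largest weight.
conv = AERConvolver((16, 16), [[1, 2, 1], [2, 4, 2], [1, 2, 1]], threshold=16)
for _ in range(10):
    fired = conv.event(8, 8)
    if fired:
        print("output events:", fired)
```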

  13. Electronic polarization-division demultiplexing based on digital signal processing in intensity-modulation direct-detection optical communication systems.

    PubMed

    Kikuchi, Kazuro

    2014-01-27

    We propose a novel configuration of optical receivers for intensity-modulation direct-detection (IM · DD) systems, which can cope with dual-polarization (DP) optical signals electrically. Using a Stokes analyzer and a newly-developed digital signal-processing (DSP) algorithm, we can achieve polarization tracking and demultiplexing in the digital domain after direct detection. Simulation results show that the power penalty stemming from digital polarization manipulations is negligibly small.

  14. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to develop cooperation between world-class experts, young scientists, students and post-graduate students, and to exchange information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by: Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University), Saint Petersburg (Russia); Chernyshevsky Saratov State University, Saratov (Russia); and the National Research Nuclear University MEPhI (NRNU MEPhI), Moscow (Russia). The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  15. Recent developments at JPL in the application of digital image processing techniques to astronomical images

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.; Benton, W. D.

    1976-01-01

    Several techniques of a digital image-processing nature are illustrated which have proved useful in visual analysis of astronomical pictorial data. Processed digital scans of photographic plates of Stephan's Quintet and NGC 4151 are used as examples to show how faint nebulosity is enhanced by high-pass filtering, how foreground stars are suppressed by linear interpolation, and how relative color differences between two images recorded on plates with different spectral sensitivities can be revealed by generating ratio images. Analyses are outlined which are intended to compensate partially for the blurring effects of the atmosphere on images of Stephan's Quintet and to obtain more detailed information about Saturn's ring structure from low- and high-resolution scans of the planet and its ring system. The employment of a correlation picture to determine the tilt angle of an average spectral line in a low-quality spectrum is demonstrated for a section of the spectrum of Uranus.

  16. Ex vivo accuracy of an apex locator using digital signal processing in primary teeth.

    PubMed

    Leonardo, Mário Roberto; da Silva, Lea Assed Bezerra; Nelson-Filho, Paulo; da Silva, Raquel Assed Bezerra; Lucisano, Marília Pacífico

    2009-01-01

    The purpose of this study was to evaluate ex vivo the accuracy of an electronic apex locator during root canal length determination in primary molars. One calibrated examiner determined the root canal length in 15 primary molars (total = 34 root canals) with different stages of root resorption. Root canal length was measured both visually, with the placement of a K-file 1 mm short of the apical foramen or the apical resorption bevel, and electronically, using an electronic apex locator (Digital Signal Processing). Data were analyzed statistically using the intraclass correlation (ICC) test. Comparing the actual and electronic root canal length measurements in the primary teeth showed a high correlation (ICC = 0.95). The Digital Signal Processing apex locator is useful and accurate for apical foramen location during root canal length measurement in primary molars.

  17. Searching early bone metastasis on plain radiography by using digital imaging processing

    SciTech Connect

    Jaramillo-Nunez, A.; Perez-Meza, M.; Universidad de la Sierra Sur, C. P. 70800, Miahuatlan, Oax.

    2012-10-23

    Some authors state that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis causing discomfort in the right shoulder. The time period between the first and second radiographs was approximately one month, and between the first and the third, one year. This procedure is a first approach to determine whether, in this particular case, it was possible to detect an early bone metastasis. The results obtained suggest that, by carrying out digital processing, it is possible to detect the metastasis, since the radiograph contains the information even though it cannot be observed visually.

  18. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  19. Computational analysis of Pelton bucket tip erosion using digital image processing

    NASA Astrophysics Data System (ADS)

    Shrestha, Bim Prasad; Gautam, Bijaya; Bajracharya, Tri Ratna

    2008-03-01

    Erosion of hydro turbine components by sand-laden river water is one of the biggest problems in the Himalayas. Even with sediment trapping systems, complete removal of fine sediment from water is impossible and uneconomical; hence most turbine components in Himalayan rivers are exposed to sand-laden water and subject to erosion. Pelton buckets, which are widely used in different hydropower plants, erode in the continuous presence of sand particles in the water. This erosion causes an increase in splitter thickness, which is theoretically supposed to be zero. The increase in splitter thickness gives rise to back hitting of water, followed by a decrease in turbine efficiency. This paper describes the measurement of sharp edges such as the bucket tip using digital image processing. An image of each bucket is captured, and the bucket is then run for 72 hours; the sand concentration in the water hitting the bucket is closely controlled and monitored. The image of the test bucket is then taken under the same conditions, and the process is repeated 10 times. The digital image processing employed encompasses image enhancement in both the spatial and frequency domains, together with processes that extract attributes from images, up to and including measurement of the splitter's tip. Processing was done on the MATLAB 6.5 platform. The results show that erosion of sharp edges can be accurately quantified and the erosion profile generated using image processing techniques.
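
    A minimal sketch of one way a splitter-tip dimension could be extracted from a binarized bucket image is given below; the synthetic wedge image, scan-line rule, and pixel scale are illustrative assumptions, not the paper's MATLAB 6.5 workflow.

```python
import numpy as np

def splitter_tip_thickness(binary, n_rows=5, pixel_size_mm=0.05):
    """Estimate splitter tip thickness: in the first few rows that contain the
    splitter, take the foreground run length per row and average it."""
    widths = []
    for row in binary:
        cols = np.flatnonzero(row)
        if cols.size:                           # first rows where the tip appears
            widths.append(cols.max() - cols.min() + 1)
        if len(widths) == n_rows:
            break
    return float(np.mean(widths)) * pixel_size_mm if widths else 0.0

# Synthetic edge-on view of a splitter: a wedge that widens away from the tip.
img = np.zeros((200, 200), dtype=bool)
for r in range(20, 200):
    half = 1 + (r - 20) // 10                   # wedge half-width grows with depth
    img[r, 100 - half:100 + half + 1] = True

print(f"tip thickness ~ {splitter_tip_thickness(img):.2f} mm")
```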

  20. Digital signal processing at Bell Labs-Foundations for speech and acoustics research

    NASA Astrophysics Data System (ADS)

    Rabiner, Lawrence R.

    2004-05-01

    Digital signal processing (DSP) is a fundamental tool for much of the research that has been carried out at Bell Labs in the areas of speech and acoustics research. The fundamental bases for DSP include the sampling theorem of Nyquist, the method for digitization of analog signals by Shannon et al., methods of spectral analysis by Tukey, the cepstrum by Bogert et al., and the FFT by Tukey (and Cooley of IBM). Essentially all of these early foundations of DSP came out of the Bell Labs Research Lab in the 1930s, 1940s, 1950s, and 1960s. This fundamental research was motivated by fundamental applications (mainly in the areas of speech, sonar, and acoustics) that led to novel design methods for digital filters (Kaiser, Golden, Rabiner, Schafer), spectrum analysis methods (Rabiner, Schafer, Allen, Crochiere), fast convolution methods based on the FFT (Helms, Bergland), and advanced digital systems used to implement telephony channel banks (Jackson, McDonald, Freeny, Tewksbury). This talk summarizes the key contributions to DSP made at Bell Labs and illustrates how DSP was utilized in the areas of speech and acoustics research. It also shows the vast, worldwide impact of this DSP research on modern consumer electronics.

  1. Hygroscopic Swelling Determination of Cellulose Nanocrystal (CNC) Films by Polarized Light Microscopy Digital Image Correlation.

    PubMed

    Shrestha, Shikha; Diaz, Jairo A; Ghanbari, Siavash; Youngblood, Jeffrey P

    2017-05-08

    The coefficient of hygroscopic swelling (CHS) of self-organized and shear-oriented cellulose nanocrystal (CNC) films was determined by capturing the hygroscopic strains produced as a result of isothermal water vapor uptake at equilibrium. Contrast-enhanced microscopy digital image correlation enabled the characterization of dimensional changes induced by the hygroscopic swelling of the films. The distinct microstructure and birefringence of CNC films served in exploring the in-plane hygroscopic swelling at relative humidity values ranging from 0% to 97%. Water vapor uptake in CNC films was measured using dynamic vapor sorption (DVS) at constant temperature. The experimental moisture sorption and kinetic profiles were analyzed by fitting with the Guggenheim, Anderson, and deBoer (GAB) and Parallel Exponential Kinetics (PEK) models, respectively. Self-organized CNC films showed isotropic swelling, with CHS ∼0.040 %strain/%C. By contrast, shear-oriented CNC films exhibited anisotropic swelling, resulting in CHS ∼0.02 and ∼0.30 %strain/%C parallel and perpendicular to the CNC alignment, respectively. Finite element analysis (FEA) further predicted moisture diffusion as the predominant mechanism for swelling of CNC films.
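
    Once DIC strains and DVS moisture contents are available, the CHS is essentially the slope of strain versus moisture content. The sketch below shows that step on made-up data chosen only to reproduce the order of magnitude quoted above; none of the numbers are the paper's measurements.

```python
import numpy as np

# Hypothetical equilibrium data (not the paper's measurements):
# moisture content (% dry basis) from DVS and in-plane strain (%) from DIC.
moisture = np.array([0.0, 2.0, 4.0, 6.0, 9.0, 13.0])               # %C
strain_parallel = np.array([0.00, 0.04, 0.08, 0.12, 0.19, 0.27])   # %
strain_perpendicular = np.array([0.0, 0.6, 1.2, 1.8, 2.7, 3.9])    # %

# CHS is the slope of hygroscopic strain versus moisture content (%strain/%C).
chs_par = np.polyfit(moisture, strain_parallel, 1)[0]
chs_perp = np.polyfit(moisture, strain_perpendicular, 1)[0]
print(f"CHS parallel to CNC alignment:      {chs_par:.3f} %strain/%C")
print(f"CHS perpendicular to CNC alignment: {chs_perp:.3f} %strain/%C")
```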

  2. Ultraviolet Light Emitting Diode Use in Advanced Oxidation Processes

    DTIC Science & Technology

    2014-03-27

    or medium-pressure mercury lamps, but UV light emitting diodes (LEDs) have the capacity to be used for water disinfection also. Traditional mercury... based upon the phosphors that are selected and used to coat the inside of the glass tube from which these lamps are produced. A UV LED is... Research has demonstrated the ability to use UV LEDs in place of mercury lamps to achieve the same disinfection capacity, and limited research has

  3. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
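
    A much-simplified sketch of the pipeline described (blocking, DCT, threshold normalization, differencing, pooling) is given below. The threshold matrix and pooling exponent are placeholders, not the DVQ model's calibrated parameters, and the temporal filtering and contrast masking stages are omitted.

```python
import numpy as np
from scipy.fft import dctn

BLOCK = 8

def block_dct(frame):
    """Split a grayscale frame into 8x8 blocks and apply a 2-D DCT to each."""
    h, w = frame.shape
    blocks = frame[:h - h % BLOCK, :w - w % BLOCK].reshape(
        h // BLOCK, BLOCK, w // BLOCK, BLOCK).swapaxes(1, 2)
    return dctn(blocks, axes=(-2, -1), norm="ortho")

def dvq_like_score(reference, processed, thresholds=None, beta=4.0):
    """Simplified perceptual error: threshold-normalized DCT differences,
    Minkowski-pooled over blocks and coefficients."""
    ref, tst = block_dct(reference), block_dct(processed)
    if thresholds is None:
        # Placeholder: higher-frequency coefficients get higher thresholds.
        u, v = np.meshgrid(np.arange(BLOCK), np.arange(BLOCK), indexing="ij")
        thresholds = 1.0 + 0.5 * (u + v)
    err = (ref - tst) / thresholds              # error in threshold units
    return float(np.mean(np.abs(err) ** beta) ** (1.0 / beta))

rng = np.random.default_rng(4)
original = rng.random((64, 64))
degraded = original + rng.normal(0, 0.05, original.shape)   # simulated artifacts
print(f"perceptual error (arbitrary units): {dvq_like_score(original, degraded):.3f}")
```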

  4. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topography modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends removing from the original LANDSAT data the component resulting from target reflectance. Considering that the contribution of topographic modulation to the pixel information varies with solar incidence angle, the results of this digital processing technique will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected for digital processing on the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate for MSS data acquired under higher Sun elevation angles; applied to low Sun elevation angles, topographic modulation components lessen rather than enhance topography.
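
    The dependence on solar incidence angle can be illustrated with the standard terrain-illumination relation cos(i) = cos(θs)cos(s) + sin(θs)sin(s)cos(φs - a), where θs is the solar zenith angle, s the slope, φs the solar azimuth and a the aspect. The sketch below compares the relative spread of the multiplicative topographic term at the two solar elevations mentioned in the abstract, on hypothetical terrain with an assumed solar azimuth; it is an illustration of the effect, not the study's processing.

```python
import numpy as np

def cos_incidence(solar_elev_deg, solar_azim_deg, slope_deg, aspect_deg):
    """Cosine of the solar incidence angle on a tilted surface (standard
    terrain-illumination formula)."""
    zen = np.radians(90.0 - solar_elev_deg)
    azi = np.radians(solar_azim_deg)
    slope = np.radians(slope_deg)
    aspect = np.radians(aspect_deg)
    return (np.cos(zen) * np.cos(slope)
            + np.sin(zen) * np.sin(slope) * np.cos(azi - aspect))

# Hypothetical terrain: slopes up to 30 degrees, all aspects represented.
slopes = np.linspace(0, 30, 61)
aspects = np.linspace(0, 360, 181)
S, A = np.meshgrid(slopes, aspects)

for elev in (22, 41):        # the two solar elevation angles cited in the abstract
    ci = np.clip(cos_incidence(elev, 120.0, S, A), 0, None)
    # Relative spread of the multiplicative topographic term, DN ~ R * cos(i):
    spread = (ci.max() - ci.min()) / ci.mean()
    print(f"solar elevation {elev:2d} deg: relative topographic modulation "
          f"~ {spread:.2f}")
```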

  5. E-Learning Content Design Standards Based on Interactive Digital Concepts Maps in the Light of Meaningful and Constructivist Learning Theory

    ERIC Educational Resources Information Center

    Afify, Mohammed Kamal

    2018-01-01

    The present study aims to identify standards of interactive digital concepts maps design and their measurement indicators as a tool to develop, organize and administer e-learning content in the light of Meaningful Learning Theory and Constructivist Learning Theory. To achieve the objective of the research, the author prepared a list of E-learning…

  6. A Solution Processed Flexible Nanocomposite Electrode with Efficient Light Extraction for Organic Light Emitting Diodes

    NASA Astrophysics Data System (ADS)

    Li, Lu; Liang, Jiajie; Chou, Shu-Yu; Zhu, Xiaodan; Niu, Xiaofan; Yu, Zhibin; Pei, Qibing

    2014-03-01

    Highly efficient organic light emitting diodes (OLEDs) based on multiple layers of vapor evaporated small molecules, indium tin oxide transparent electrode, and glass substrate have been extensively investigated and are being commercialized. The light extraction from the exciton radiative decay is limited to less than 30% due to plasmonic quenching on the metallic cathode and the waveguide in the multi-layer sandwich structure. Here we report a flexible nanocomposite electrode comprising single-walled carbon nanotubes and silver nanowires stacked and embedded in the surface of a polymer substrate. Nanoparticles of barium strontium titanate are dispersed within the substrate to enhance light extraction efficiency. Green polymer OLED (PLEDs) fabricated on the nanocomposite electrode exhibit a maximum current efficiency of 118 cd/A at 10,000 cd/m2 with the calculated external quantum efficiency being 38.9%. The efficiencies of white PLEDs are 46.7 cd/A and 30.5%, respectively. The devices can be bent to 3 mm radius repeatedly without significant loss of electroluminescent performance. The nanocomposite electrode could pave the way to high-efficiency flexible OLEDs with simplified device structure and low fabrication cost.

  7. A solution processed flexible nanocomposite electrode with efficient light extraction for organic light emitting diodes.

    PubMed

    Li, Lu; Liang, Jiajie; Chou, Shu-Yu; Zhu, Xiaodan; Niu, Xiaofan; Yu, Zhibin; Pei, Qibing

    2014-03-17

    Highly efficient organic light emitting diodes (OLEDs) based on multiple layers of vapor evaporated small molecules, indium tin oxide transparent electrode, and glass substrate have been extensively investigated and are being commercialized. The light extraction from the exciton radiative decay is limited to less than 30% due to plasmonic quenching on the metallic cathode and the waveguide in the multi-layer sandwich structure. Here we report a flexible nanocomposite electrode comprising single-walled carbon nanotubes and silver nanowires stacked and embedded in the surface of a polymer substrate. Nanoparticles of barium strontium titanate are dispersed within the substrate to enhance light extraction efficiency. Green polymer OLED (PLEDs) fabricated on the nanocomposite electrode exhibit a maximum current efficiency of 118 cd/A at 10,000 cd/m(2) with the calculated external quantum efficiency being 38.9%. The efficiencies of white PLEDs are 46.7 cd/A and 30.5%, respectively. The devices can be bent to 3 mm radius repeatedly without significant loss of electroluminescent performance. The nanocomposite electrode could pave the way to high-efficiency flexible OLEDs with simplified device structure and low fabrication cost.

  8. A Solution Processed Flexible Nanocomposite Electrode with Efficient Light Extraction for Organic Light Emitting Diodes

    PubMed Central

    Li, Lu; Liang, Jiajie; Chou, Shu-Yu; Zhu, Xiaodan; Niu, Xiaofan; Yu, Zhibin; Pei, Qibing

    2014-01-01

    Highly efficient organic light emitting diodes (OLEDs) based on multiple layers of vapor evaporated small molecules, indium tin oxide transparent electrode, and glass substrate have been extensively investigated and are being commercialized. The light extraction from the exciton radiative decay is limited to less than 30% due to plasmonic quenching on the metallic cathode and the waveguide in the multi-layer sandwich structure. Here we report a flexible nanocomposite electrode comprising single-walled carbon nanotubes and silver nanowires stacked and embedded in the surface of a polymer substrate. Nanoparticles of barium strontium titanate are dispersed within the substrate to enhance light extraction efficiency. Green polymer OLED (PLEDs) fabricated on the nanocomposite electrode exhibit a maximum current efficiency of 118 cd/A at 10,000 cd/m2 with the calculated external quantum efficiency being 38.9%. The efficiencies of white PLEDs are 46.7 cd/A and 30.5%, respectively. The devices can be bent to 3 mm radius repeatedly without significant loss of electroluminescent performance. The nanocomposite electrode could pave the way to high-efficiency flexible OLEDs with simplified device structure and low fabrication cost. PMID:24632742

  9. The effect of image processing on the detection of cancers in digital mammography.

    PubMed

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  10. Laminar Soot Processes Experiment Shedding Light on Flame Radiation

    NASA Technical Reports Server (NTRS)

    Urban, David L.

    1998-01-01

    The Laminar Soot Processes (LSP) experiment investigated soot processes in nonturbulent, round gas jet diffusion flames in still air. The soot processes within these flames are relevant to practical combustion in aircraft propulsion systems, diesel engines, and furnaces. However, for the LSP experiment, the flames were slowed and spread out to allow measurements that are not tractable for practical, Earth-bound flames.

  11. Core Formation Process and Light Elements in the Planetary Core

    NASA Astrophysics Data System (ADS)

    Ohtani, E.; Sakairi, T.; Watanabe, K.; Kamada, S.; Sakamaki, T.; Hirao, N.

    2015-12-01

    Si, O, and S are major candidates for light elements in the planetary core. In the early stage of planetary formation, core formation started by percolation of the metallic liquid through the silicate matrix, because the Fe-S-O and Fe-S-Si eutectic temperatures are significantly lower than the solidus of the silicates. Therefore, in the early stage of accretion of the planets, a S-enriched eutectic liquid was formed and separated into the core by percolation; the major light element in the core at this stage will be sulfur. The internal pressure and temperature increased with the growth of the planets, and the metal component depleted in S was molten. The metallic melt contained both Si and O at high pressure in the deep magma ocean in the later stage, so the core contains S, Si, and O at this stage of core formation. Partitioning experiments between solid and liquid metals indicate that S is partitioned into the liquid metal, whereas O partitions only weakly into the liquid. Partitioning of Si changes with the metallic iron phase: fcc iron alloy coexisting with the metallic liquid below 30 GPa is depleted in Si, whereas hcp-Fe alloy coexisting with the liquid above 30 GPa favors Si. This contrast in Si partitioning produces a remarkable difference in the compositions of the solid inner core and liquid outer core among the terrestrial planets. Our melting experiments on the Fe-S-Si and Fe-O-S systems at high pressure indicate that the core adiabats in the small planets, Mercury and Mars, are steeper than the slopes of the solidus and liquidus curves of these systems. Thus, in these planets, the core crystallized at the top of the liquid core and 'snowing core' formation occurred during crystallization. The solid inner core is depleted in both Si and S, whereas the liquid outer core is relatively enriched in Si and S in these planets. On the other hand, the core adiabats in the large planets, Earth and Venus, are less steep than the slopes of the solidus and liquidus curves of the systems. The

  12. Digital Audio Signal Processing and Nde: AN Unlikely but Valuable Partnership

    NASA Astrophysics Data System (ADS)

    Gaydecki, Patrick

    2008-02-01

    In the Digital Signal Processing (DSP) group, within the School of Electrical and Electronic Engineering at The University of Manchester, research is conducted into two seemingly distinct and disparate subjects: instrumentation for nondestructive evaluation, and DSP systems & algorithms for digital audio. We have often found that many of the hardware systems and algorithms employed to recover, extract or enhance audio signals may also be applied to signals provided by ultrasonic or magnetic NDE instruments. Furthermore, modern DSP hardware is so fast (typically performing hundreds of millions of operations per second), that much of the processing and signal reconstruction may be performed in real time. Here, we describe some of the hardware systems we have developed, together with algorithms that can be implemented both in real time and offline. A next generation system has now been designed, which incorporates a processor operating at 0.55 Giga MMACS, six input and eight output analogue channels, digital input/output in the form of S/PDIF, a JTAG and a USB interface. The software allows the user, with no knowledge of filter theory or programming, to design and run standard or arbitrary FIR, IIR and adaptive filters. Using audio as a vehicle, we can demonstrate the remarkable properties of modern reconstruction algorithms when used in conjunction with such hardware; applications in NDE include signal enhancement and recovery in acoustic, ultrasonic, magnetic and eddy current modalities.
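
    As an illustration of the kind of filter a user of such a system might specify, the sketch below designs and applies a linear-phase FIR band-pass with SciPy. The sample rate, band edges, and test signal are assumptions, and this is an offline stand-in for the real-time hardware described, not the group's firmware.

```python
import numpy as np
from scipy import signal

fs = 48_000                       # sample rate (Hz), assumed, typical for audio
t = np.arange(0, 0.1, 1 / fs)

# Test signal: a 5 kHz "echo" of interest buried in low-frequency hum and noise.
rng = np.random.default_rng(5)
x = (0.2 * np.sin(2 * np.pi * 5_000 * t)
     + 1.0 * np.sin(2 * np.pi * 50 * t)
     + 0.3 * rng.normal(size=t.size))

# Linear-phase FIR band-pass around 5 kHz, designed with a Kaiser window.
taps = signal.firwin(numtaps=129, cutoff=[4_000, 6_000], pass_zero=False,
                     window=("kaiser", 8.0), fs=fs)
y = signal.lfilter(taps, 1.0, x)   # causal, streaming-style filtering

settled = slice(1000, None)        # ignore the filter's start-up transient
print(f"RMS before filtering: {np.std(x):.3f}, after band-pass: {np.std(y[settled]):.3f}")
```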

  13. Using digital flow cytometry to assess the degradation of three cyanobacteria species after oxidation processes.

    PubMed

    Wert, Eric C; Dong, Mei Mei; Rosario-Ortiz, Fernando L

    2013-07-01

    Depending on drinking water treatment conditions, oxidation processes may result in the degradation of cyanobacteria cells causing the release of toxic metabolites (microcystin), odorous metabolites (MIB, geosmin), or disinfection byproduct precursors. In this study, a digital flow cytometer (FlowCAM(®)) in combination with chlorophyll-a analysis was used to evaluate the ability of ozone, chlorine, chlorine dioxide, and chloramine to damage or lyse cyanobacteria cells added to Colorado River water. Microcystis aeruginosa (MA), Oscillatoria sp. (OSC) and Lyngbya sp. (LYN) were selected for the study due to their occurrence in surface water supplies, metabolite production, and morphology. Results showed that cell damage was observed without complete lysis or fragmentation of the cell membrane under many of the conditions tested. During ozone and chlorine experiments, the unicellular MA was more susceptible to oxidation than the filamentous OSC and LYN. Rate constants were developed based on the loss of chlorophyll-a and oxidant exposure, which showed the oxidants degraded MA, OSC, and LYN according to the order of ozone > chlorine ~ chlorine dioxide > chloramine. Digital and binary images taken by the digital flow cytometer provided qualitative insight regarding cell damage. When applying this information, drinking water utilities can better understand the risk of cell damage or lysis during oxidation processes. Copyright © 2013 Elsevier Ltd. All rights reserved.
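
    The rate-constant idea can be sketched as a first-order fit of chlorophyll-a loss against oxidant exposure (CT); the numbers below are hypothetical and are not the study's measurements.

```python
import numpy as np

# Hypothetical data: oxidant exposure CT (mg·min/L) and the fraction of
# chlorophyll-a remaining for a cyanobacteria suspension (illustrative values).
ct = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
chla_fraction = np.array([1.00, 0.78, 0.62, 0.37, 0.14, 0.05])

# First-order model: C/C0 = exp(-k * CT)  =>  ln(C/C0) = -k * CT
k = -np.polyfit(ct, np.log(chla_fraction), 1)[0]
print(f"chlorophyll-a degradation rate constant k ~ {k:.2f} L/(mg·min)")
```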

  14. Automatic rice crop height measurement using a field server and digital image processing.

    PubMed

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-07

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.
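
    A minimal sketch of the four steps listed (band selection, filtering, thresholding, height measurement) is given below for a synthetic scene in which the crop canopy hides the lower part of a dark marker bar. The band choice, threshold, and bar geometry are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def visible_marker_height(rgb, column, threshold=0.35):
    """Measure the visible (non-occluded) length of a dark marker bar, in pixels.
    Steps: band selection -> filtering -> thresholding -> height measurement."""
    red = rgb[..., 0]                                  # 1) band selection
    smooth = ndimage.median_filter(red, size=3)        # 2) noise filtering
    bar_mask = smooth[:, column] < threshold           # 3) threshold: bar is dark
    return int(np.count_nonzero(bar_mask))             # 4) visible bar length

def crop_height_cm(rgb, column, initial_bar_px, bar_length_cm=200.0):
    """Crop height = real bar length minus the still-visible part of the bar."""
    visible_px = visible_marker_height(rgb, column)
    cm_per_px = bar_length_cm / initial_bar_px
    return bar_length_cm - visible_px * cm_per_px

# Synthetic scene: green field, a dark vertical bar around column 120; the lower
# 150 rows of the bar are hidden by the crop canopy.
rng = np.random.default_rng(6)
img = np.stack([rng.uniform(0.4, 0.6, (400, 300)),     # red
                rng.uniform(0.6, 0.8, (400, 300)),     # green
                rng.uniform(0.3, 0.5, (400, 300))], axis=-1)
img[0:250, 118:123, :] = 0.1                           # visible part of the bar
initial_bar_px = 400                                   # bar length before planting
print(f"estimated crop height: {crop_height_cm(img, 120, initial_bar_px):.0f} cm")
```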

  15. Validation of quantitative light-induced fluorescence-digital (QLF-D) for the detection of approximal caries in vitro.

    PubMed

    Ko, Hae-Youn; Kang, Si-Mook; Kim, Hee Eun; Kwon, Ho-Keun; Kim, Baek-Il

    2015-05-01

    Detection of approximal caries lesions can be difficult due to their anatomical position. This study aimed to assess the ability of the quantitative light-induced fluorescence-digital (QLF-D) in detecting approximal caries, and to compare the performance with those of the International Caries Detection and Assessment System II (ICDAS II) and digital radiography (DR). Extracted permanent teeth (n=100) were selected and mounted in pairs. The simulation pairs were assessed by one calibrated dentist using each detection method. After all the examinations, the teeth (n=95) were sectioned and examined histologically as gold standard. The modalities were compared in terms of sensitivity, specificity, areas under receiver operating characteristic curves (AUROC) for enamel (D1) and dentine (D3) levels. The intra-examiner reliability was assessed for all modalities. At D1 threshold, the ICDAS II presented the highest sensitivity (0.80) while the DR showed the highest specificity (0.89); however, the methods with the greatest AUC values at D1 threshold were DR and QLF-D (0.80 and 0.80 respectively). At D3 threshold, the methods with the highest sensitivity were ICDAS II and QLF-D (0.64 and 0.64 respectively) while the method with the lowest sensitivity was DR (0.50). However, with regard to the AUC values at D3 threshold, the QLF-D presented the highest value (0.76). All modalities showed to have excellent intra-examiner reliability. The newly developed QLF-D was not only able to detect proximal caries, but also showed to have comparable performance to the visual inspection and radiography in detecting proximal caries. QLF-D has the potential to be a useful detection method for proximal caries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Interpretation of digital chest radiographs: comparison of light emitting diode versus cold cathode fluorescent lamp backlit monitors.

    PubMed

    Lim, Hyun-ju; Chung, Myung Jin; Lee, Geewon; Yie, Miyeon; Shin, Kyung Eun; Moon, Jung Won; Lee, Kyung Soo

    2013-01-01

    To compare the diagnostic performance of light emitting diode (LED) backlight monitors and cold cathode fluorescent lamp (CCFL) monitors for the interpretation of digital chest radiographs. We selected 130 chest radiographs from health screening patients. The soft copy image data were randomly sorted and displayed on a 3.5 M LED (2560 × 1440 pixels) monitor and a 3 M CCFL (2048 × 1536 pixels) monitor. Eight radiologists rated their confidence in detecting nodules and abnormal interstitial lung markings (ILD). Low dose chest CT images were used as a reference standard. The performance of the monitor systems was assessed by analyzing 2080 observations and comparing them by multi-reader, multi-case receiver operating characteristic analysis. The observers reported visual fatigue and a sense of heat. Radiant heat and brightness of the monitors were measured. Measured brightness was 291 cd/m(2) for the LED and 354 cd/m(2) for the CCFL monitor. Area under curves for nodule detection were 0.721 ± 0.072 and 0.764 ± 0.098 for LED and CCFL (p = 0.173), whereas those for ILD were 0.871 ± 0.073 and 0.844 ± 0.068 (p = 0.145), respectively. There were no significant differences in interpretation time (p = 0.446) or fatigue score (p = 0.102) between the two monitors. Sense of heat was lower for the LED monitor (p = 0.024). The temperature elevation was 6.7℃ for LED and 12.4℃ for the CCFL monitor. Although the LED monitor had lower maximum brightness compared with the CCFL monitor, soft copy reading of the digital chest radiographs on LED and CCFL showed no difference in terms of diagnostic performance. In addition, LED emitted less heat.

  17. [Research and realization of signal processing algorithms based on FPGA in digital ophthalmic ultrasonography imaging].

    PubMed

    Fang, Simin; Zhou, Sheng; Wang, Xiaochun; Ye, Qingsheng; Tian, Ling; Ji, Jianjun; Wang, Yanqun

    2015-01-01

    To design and improve FPGA-based signal processing algorithms for ophthalmic ultrasonography. Three signal processing modules were implemented in the Verilog HDL hardware language in Quartus II: a fully parallel distributed dynamic filter, digital quadrature demodulation, and logarithmic compression. Compared with the original system, the hardware cost is reduced, the image is clearer and contains more information about the deep eyeball, and the detection depth increases from 5 cm to 6 cm. The new algorithms meet the design requirements, optimize the system, and can effectively improve the image quality of existing equipment.
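    As a rough illustration of two of the modules named above, the following sketch performs digital quadrature demodulation and logarithmic compression on a synthetic RF line in floating-point NumPy; the carrier frequency, sampling rate, filter length and dynamic range are assumed values, and the FPGA fixed-point implementation details are not reproduced.

```python
# Quadrature demodulation and log compression of a toy ultrasound RF line.
# All parameters are assumed for illustration, not taken from the paper.
import numpy as np
from scipy.signal import firwin, lfilter

fs, f0 = 40e6, 10e6                      # assumed sampling and transducer freq.
t = np.arange(2000) / fs
rf = np.exp(-1e6 * t) * np.cos(2 * np.pi * f0 * t)   # decaying toy RF echo

# Quadrature demodulation: mix with cos/sin, then low-pass to obtain I/Q.
lo_i = np.cos(2 * np.pi * f0 * t)
lo_q = -np.sin(2 * np.pi * f0 * t)
lp = firwin(numtaps=63, cutoff=2e6, fs=fs)           # baseband low-pass FIR
i = lfilter(lp, 1.0, rf * lo_i)
q = lfilter(lp, 1.0, rf * lo_q)
envelope = np.sqrt(i**2 + q**2)

# Logarithmic compression to an assumed 50 dB display dynamic range.
dyn_range_db = 50.0
env_db = 20 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-12)
display = np.clip((env_db + dyn_range_db) / dyn_range_db, 0.0, 1.0)
```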

  18. Applications of digital image processing techniques to problems of data registration and correlation

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the evolution of the computer configuration at JPL's Image Processing Laboratory (IPL). The development of techniques for the geometric transformation of digital imagery is discussed and consideration is given to automated and semiautomated image registration, and the registration of imaging and nonimaging data. The increasing complexity of image processing tasks at IPL is illustrated with examples of various applications from the planetary program and earth resources activities. It is noted that the registration of existing geocoded data bases with Landsat imagery will continue to be important if the Landsat data is to be of genuine use to the user community.

  19. Software for Processing of Digitized Astronegatives from Archives and Databases of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.

    The paper discusses and illustrates the steps of basic processing of digitized images of astronegatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX/MIDAS/ROMAFOT environment. The program can automatically process a specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL, with the ability to work in a LINUX or WINDOWS environment. They were used for: identification of stars; separation and exclusion of diffraction satellites and double and triple exposures; elimination of image defects; and reduction to the equatorial coordinates and magnitudes of reference catalogs.

  20. Surface processing: existing and potential applications of ultraviolet light.

    PubMed

    Manzocco, Lara; Nicoli, Maria Cristina

    2015-01-01

    Solid foods represent optimal matrices for ultraviolet processing, with effects well beyond nonthermal surface disinfection. UV radiation favors a hormetic response in plant tissues and the degradation of toxic compounds on the product surface. Photoinduced reactions can also provide unexplored possibilities to steer the structure and functionality of food biopolymers. The possibility of extensively exploiting this technology will depend on the availability of robust information about efficacious processing conditions and adequate strategies to process the food surface completely and homogeneously.

  1. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. Kern County Water Agency is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could ultimately be extrapolated to the entire 10,400 sq km Kern County watershed. Results indicate that digital image processing of Landsat 2 data will provide the usable land cover information required by KCWA for input to the SCS runoff model.

  2. Terabit bandwidth-adaptive transmission using low-complexity format-transparent digital signal processing.

    PubMed

    Zhuge, Qunbi; Morsy-Osman, Mohamed; Chagnon, Mathieu; Xu, Xian; Qiu, Meng; Plant, David V

    2014-02-10

    In this paper, we propose a low-complexity format-transparent digital signal processing (DSP) scheme for next-generation flexible and energy-efficient transceivers. It employs QPSK symbols as the training and pilot symbols for the initialization and tracking stages of the receiver-side DSP, respectively, for various modulation formats. The performance is numerically and experimentally evaluated in a dual polarization (DP) 11 Gbaud 64QAM system. Employing the proposed DSP scheme, we conduct a system-level study of Tb/s bandwidth-adaptive superchannel transmissions with flexible modulation formats including QPSK, 8QAM and 16QAM. The spectrum bandwidth allocation is realized in the digital domain instead of turning sub-channels on/off, which improves the performance of higher-order QAM. Various transmission distances ranging from 240 km to 6240 km are demonstrated with colorless detection for hardware complexity reduction.
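    The receiver-side adaptive equalizers mentioned above include the constant-modulus algorithm; the sketch below shows a single-channel CMA tap update on toy QPSK data, with assumed step size and tap count, rather than the authors' dual-polarization, format-transparent implementation.

```python
# Minimal single-channel constant-modulus algorithm (CMA) equalizer sketch.
# Step size, tap count and the toy channel are assumed values.
import numpy as np

def cma_equalize(x, n_taps=11, mu=1e-3, radius=1.0):
    """Blindly equalize complex samples x toward constant modulus `radius`."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                      # center-spike initialization
    y = np.zeros(len(x) - n_taps, dtype=complex)
    for k in range(len(y)):
        xk = x[k:k + n_taps][::-1]            # most recent sample first
        y[k] = np.dot(w, xk)
        e = y[k] * (np.abs(y[k])**2 - radius**2)   # CMA error term
        w -= mu * e * np.conj(xk)             # stochastic-gradient tap update
    return y, w

# Toy usage: unit-modulus QPSK symbols through a mild 2-tap channel, then CMA.
rng = np.random.default_rng(0)
syms = (rng.choice([1, -1], 2000) + 1j * rng.choice([1, -1], 2000)) / np.sqrt(2)
rx = np.convolve(syms, [1.0, 0.25 + 0.1j], mode="same")
eq, taps = cma_equalize(rx)
```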

  3. Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.

    PubMed

    Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny

    2010-12-01

    A multi-enzyme biocatalytic cascade processing simultaneously five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on concerted function of 8 Boolean AND logic gates, resulting in the decision about the physiological conditions based on the logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with the built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
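    The idea of digitizing biomarker concentrations and combining them through Boolean AND gates can be sketched as follows; the biomarker names, thresholds and gate wiring are illustrative placeholders, not the enzyme logic reported by the authors.

```python
# Hedged sketch of digital (Boolean) processing of biomarker inputs.
# Thresholds and gate wiring are hypothetical, chosen only for illustration.
def to_bit(value, threshold):
    """Digitize an analog biomarker concentration into a logic 0/1."""
    return 1 if value >= threshold else 0

def injury_decision(markers):
    # Hypothetical thresholds (arbitrary units) for five biomarkers.
    a = to_bit(markers["enolase"], 5.0)
    b = to_bit(markers["glutamate"], 3.0)
    c = to_bit(markers["LDH"], 4.0)
    d = to_bit(markers["CK"], 6.0)
    e = to_bit(markers["norepinephrine"], 2.0)
    tbi = a and b and c          # AND gates combining biomarker bits
    sti = c and d and e
    return {"TBI": bool(tbi), "STI": bool(sti)}

print(injury_decision({"enolase": 6, "glutamate": 4, "LDH": 5, "CK": 1,
                       "norepinephrine": 3}))
```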

  4. Digital pulse shape discrimination.

    PubMed

    Miller, L F; Preston, J; Pozzi, S; Flaska, M; Neal, J

    2007-01-01

    Pulse-shape discrimination (PSD) has been utilised for about 40 years as a method to obtain estimates for dose in mixed neutron and photon fields. Digitizers that operate at close to 1 GHz are currently available at a reasonable cost, and they can be used to directly sample signals from photomultiplier tubes. This permits one to perform digital PSD rather than the traditional, and well-established, analogue techniques. One issue that complicates PSD for neutrons in mixed fields is that the light output characteristics of typical scintillators available for PSD, such as BC501A, vary as a function of energy deposited in the detector. This behaviour is more easily accommodated with digital processing of signals than with analogue signal processing. Results illustrate the effectiveness of digital PSD.
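    One common way to realize digital PSD from sampled photomultiplier pulses is charge comparison (tail-to-total integral ratio); the sketch below uses that approach on synthetic pulses with assumed window lengths, and is not necessarily the method used in the paper.

```python
# Charge-comparison PSD sketch on digitized pulses; window lengths and the
# synthetic pulses are assumed for illustration.
import numpy as np

def psd_ratio(pulse, baseline_samples=20, tail_start=30):
    """Return the tail/total charge ratio of a baseline-subtracted pulse."""
    baseline = pulse[:baseline_samples].mean()
    p = pulse - baseline
    peak = int(np.argmax(p))
    total = p[peak:].sum()                    # integrate from the peak onward
    tail = p[peak + tail_start:].sum()        # delayed (tail) integral
    return tail / total if total > 0 else 0.0

# Toy pulses: neutron-like events in organic scintillators carry a larger
# slow decay component than gamma-like events.
t = np.arange(200.0)
gamma_pulse = np.concatenate([np.zeros(20), np.exp(-t / 5)])
neutron_pulse = np.concatenate([np.zeros(20),
                                0.8 * np.exp(-t / 5) + 0.2 * np.exp(-t / 60)])
print(psd_ratio(gamma_pulse), psd_ratio(neutron_pulse))
```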

  5. Lighting

    SciTech Connect

    McKay, H.N.

    The lighting section of ASHRAE standard 90.1 is discussed. It applies to all new buildings except low-rise residential, while excluding specialty lighting applications such as signage, art exhibits, theatrical productions, medical and dental tasks, and others. In addition, lighting for indoor plant growth is excluded if designed to operate only between 10 p.m. and 6 a.m. Lighting allowances for the interior of a building are determined by the use of the system performance path unless the space functions are not fully known, such as during the initial stages of design or for speculative buildings. In such cases, the prescriptive path is available. Lighting allowances for the exterior of all buildings are determined by a table of unit power allowances. A new addition to the exterior lighting procedure is the inclusion of facade lighting. However, it is no longer possible to trade off power allotted for the exterior with the interior of a building or vice versa. A significant change is the new emphasis on lighting controls.

  6. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    NASA Astrophysics Data System (ADS)

    Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.

    2012-05-01

    There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example, those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific, and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high-level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from the CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Secondly, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
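    A floating-point software analogue of the tunable digital downconverter described above is sketched below: an NCO mixes the chosen band to baseband, a low-pass FIR filters it, and the result is decimated, with the center frequency and decimation factor supplied at call time; the parameters are assumed and the IBOB/FPGA dataflow implementation is not reproduced.

```python
# Software sketch of a tunable digital downconverter (TDD): NCO mixing,
# low-pass filtering, decimation. Parameters are assumed for illustration.
import numpy as np
from scipy.signal import firwin, lfilter

def digital_downconvert(x, fs, f_center, decimation, n_taps=129):
    """Shift f_center to baseband, low-pass, and decimate by `decimation`."""
    n = np.arange(len(x))
    nco = np.exp(-2j * np.pi * f_center * n / fs)     # numerically controlled osc.
    baseband = x * nco
    cutoff = 0.45 * fs / decimation                   # keep the decimated band
    taps = firwin(n_taps, cutoff, fs=fs)
    filtered = lfilter(taps, 1.0, baseband)
    return filtered[::decimation]                     # decimated complex output

# Retuning the band is just a call with new parameters -- no new hardware code.
fs = 1024e6
x = np.cos(2 * np.pi * 200e6 * np.arange(8192) / fs)
narrow = digital_downconvert(x, fs, f_center=200e6, decimation=16)
```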

  7. Evolution of quality characteristics of minimally processed asparagus during storage in different lighting conditions.

    PubMed

    Sanz, S; Olarte, C; Ayala, F; Echávarri, J F

    2009-08-01

    The effect of different types of lighting (white, green, red, and blue light) on minimally processed asparagus during storage at 4 degrees C was studied. The gas concentrations in the packages, pH, mesophilic counts, and weight loss were also determined. Lighting caused an increase in physiological activity. Asparagus stored under lighting achieved atmospheres with higher CO(2) and lower O(2) content than samples kept in the dark. This activity increase explains the greater deterioration experienced by samples stored under lighting, which clearly affected texture and especially color, accelerating the appearance of greenish hues in the tips and reddish-brown hues in the spears. Exposure to light had a negative effect on the quality parameters of the asparagus and it caused a significant reduction in shelf life. Hence, the 11 d shelf life of samples kept in the dark was reduced to only 3 d in samples kept under red and green light, and to 7 d in those kept under white and blue light. However, quality indicators such as the color of the tips and texture showed significantly better behavior under blue light than with white light, which allows us to state that it is better to use this type of light or blue-tinted packaging film for the display of minimally processed asparagus to consumers.

  8. Using Serial and Discrete Digit Naming to Unravel Word Reading Processes

    PubMed Central

    Altani, Angeliki; Protopapas, Athanassios; Georgiou, George K.

    2018-01-01

    During reading acquisition, word recognition is assumed to undergo a developmental shift from slow serial/sublexical processing of letter strings to fast parallel processing of whole word forms. This shift has been proposed to be detected by examining the size of the relationship between serial- and discrete-trial versions of word reading and rapid naming tasks. Specifically, a strong association between serial naming of symbols and single word reading suggests that words are processed serially, whereas a strong association between discrete naming of symbols and single word reading suggests that words are processed in parallel as wholes. In this study, 429 Grade 1, 3, and 5 English-speaking Canadian children were tested on serial and discrete digit naming and word reading. Across grades, single word reading was more strongly associated with discrete naming than with serial naming of digits, indicating that short high-frequency words are processed as whole units early in the development of reading ability in English. In contrast, serial naming was not a unique predictor of single word reading across grades, suggesting that within-word sequential processing was not required for the successful recognition for this set of words. Factor mixture analysis revealed that our participants could be clustered into two classes, namely beginning and more advanced readers. Serial naming uniquely predicted single word reading only among the first class of readers, indicating that novice readers rely on a serial strategy to decode words. Yet, a considerable proportion of Grade 1 students were assigned to the second class, evidently being able to process short high-frequency words as unitized symbols. We consider these findings together with those from previous studies to challenge the hypothesis of a binary distinction between serial/sublexical and parallel/lexical processing in word reading. We argue instead that sequential processing in word reading operates on a continuum

  9. Using Serial and Discrete Digit Naming to Unravel Word Reading Processes.

    PubMed

    Altani, Angeliki; Protopapas, Athanassios; Georgiou, George K

    2018-01-01

    During reading acquisition, word recognition is assumed to undergo a developmental shift from slow serial/sublexical processing of letter strings to fast parallel processing of whole word forms. This shift has been proposed to be detected by examining the size of the relationship between serial- and discrete-trial versions of word reading and rapid naming tasks. Specifically, a strong association between serial naming of symbols and single word reading suggests that words are processed serially, whereas a strong association between discrete naming of symbols and single word reading suggests that words are processed in parallel as wholes. In this study, 429 Grade 1, 3, and 5 English-speaking Canadian children were tested on serial and discrete digit naming and word reading. Across grades, single word reading was more strongly associated with discrete naming than with serial naming of digits, indicating that short high-frequency words are processed as whole units early in the development of reading ability in English. In contrast, serial naming was not a unique predictor of single word reading across grades, suggesting that within-word sequential processing was not required for the successful recognition for this set of words. Factor mixture analysis revealed that our participants could be clustered into two classes, namely beginning and more advanced readers. Serial naming uniquely predicted single word reading only among the first class of readers, indicating that novice readers rely on a serial strategy to decode words. Yet, a considerable proportion of Grade 1 students were assigned to the second class, evidently being able to process short high-frequency words as unitized symbols. We consider these findings together with those from previous studies to challenge the hypothesis of a binary distinction between serial/sublexical and parallel/lexical processing in word reading. We argue instead that sequential processing in word reading operates on a continuum

  10. Merged GLORIA sidescan and hydrosweep pseudo-sidescan: Processing and creation of digital mosaics

    USGS Publications Warehouse

    Bird, R.T.; Searle, R.C.; Paskevich, V.; Twichell, D.C.

    1996-01-01

    We have replaced the usual band of poor-quality data in the near-nadir region of our GLORIA long-range sidescan-sonar imagery with a shaded-relief image constructed from swath bathymetry data (collected simultaneously with GLORIA) which completely cover the nadir area. We have developed a technique to enhance these "pseudo-sidescan" images in order to mimic the neighbouring GLORIA backscatter intensities. As a result, the enhanced images greatly facilitate the geologic interpretation of the adjacent GLORIA data, and geologic features evident in the GLORIA data may be correlated with greater confidence across track. Features interpreted from the pseudo-sidescan may be extrapolated from the near-nadir region out into the GLORIA range where they may not have been recognized otherwise, and therefore the pseudo-sidescan can be used to ground-truth GLORIA interpretations. Creation of digital sidescan mosaics utilized an approach not previously used for GLORIA data. Pixels were correctly placed in cartographic space and the time required to complete a final mosaic was significantly reduced. Computer software for digital mapping and mosaic creation is incorporated into the newly-developed Woods Hole Image Processing System (WHIPS) which can process both low- and high-frequency sidescan, and can interchange data with the Mini Image Processing System (MIPS) most commonly used for GLORIA processing. These techniques are tested by creating digital mosaics of merged GLORIA sidescan and Hydrosweep pseudo-sidescan data from the vicinity of the Juan Fernandez microplate along the East Pacific Rise (EPR). 

  11. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  12. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    PubMed

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process made positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.

  13. Conflict resolution in two-digit number processing: evidence of an inhibitory mechanism.

    PubMed

    Macizo, Pedro

    2017-01-01

    We investigated the mechanism involved in conflict resolution when individuals processed two-digit numbers. Participants performed a comparison task in blocks of two trials. In the first trial, between-decade two-digit numbers were used in a compatible condition where the decade and the unit of one number were larger than those of the other number (i.e., 21-73) and an incompatible condition where the decade of one number was larger but the unit was smaller than those of the other number (i.e., 61-53). In the second trial, within-decade two-digit numbers were presented in a related condition where the numbers contained the units presented previously (i.e., 41-43) and an unrelated condition with units that did not appear before (i.e., 48-49). In the first trial, participants responded more slowly in incompatible trials relative to compatible trials. In the second trial, participants were slower in the related condition relative to unrelated trials only after incompatible trials. These results suggest that participants experienced conflict in the incompatible condition of first trial and that they inhibited irrelevant units to resolve conflict.

  14. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition

    PubMed Central

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    Purpose: The purpose of this study is to describe the use of a commercial digital single light reflex (DSLR) camera for vitreoretinal surgery recording and to compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133

  15. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition.

    PubMed

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    The purpose of this study is to describe the use of a commercial digital single light reflex (DSLR) camera for vitreoretinal surgery recording and to compare it to a standard 3-chip charge-coupled device (CCD) camera. Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching.

  16. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.
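    The DPCM principle behind the codec can be sketched in a few lines: predict each pixel from the previously reconstructed one, quantize the prediction error, and let the decoder mirror the reconstruction; the uniform quantizer and scan-line example below are simplifications assumed for illustration, not the broadcast-quality algorithm itself.

```python
# Minimal DPCM sketch on a 1-D scan line with a uniform error quantizer.
# The step size and predictor initialization are assumed for illustration.
import numpy as np

def dpcm_encode_decode(line, step=8):
    """Return (quantized error indices, reconstructed line) for one scan line."""
    indices = np.zeros(len(line), dtype=int)
    recon = np.zeros(len(line))
    prev = 128.0                              # assumed predictor initial value
    for i, pixel in enumerate(line):
        error = pixel - prev                  # prediction error
        q = int(round(error / step))          # uniform quantization
        indices[i] = q                        # value that would be entropy-coded
        recon[i] = prev + q * step            # decoder mirrors this reconstruction
        prev = recon[i]                       # predict from the reconstructed value
    return indices, recon

line = np.clip(128 + 40 * np.sin(np.arange(64) / 5), 0, 255)
idx, rec = dpcm_encode_decode(line)
print(np.max(np.abs(line - rec)))             # bounded by half the step size
```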

  17. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.

  18. Phase noise optimization in temporal phase-shifting digital holography with partial coherence light sources and its application in quantitative cell imaging.

    PubMed

    Remmersmann, Christian; Stürwald, Stephan; Kemper, Björn; Langehanenberg, Patrik; von Bally, Gert

    2009-03-10

    In temporal phase-shifting-based digital holographic microscopy, high-resolution phase contrast imaging requires optimized conditions for hologram recording and phase retrieval. To optimize the phase resolution, for the example of a variable three-step algorithm, a theoretical analysis on statistical errors, digitalization errors, uncorrelated errors, and errors due to a misaligned temporal phase shift is carried out. In a second step the theoretically predicted results are compared to the measured phase noise obtained from comparative experimental investigations with several coherent and partially coherent light sources. Finally, the applicability for noise reduction is demonstrated by quantitative phase contrast imaging of pancreas tumor cells.
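    For reference, a common symmetric three-step phase-shifting formula (shifts of -α, 0, +α) is sketched below; the paper's variable three-step algorithm and its error analysis are more general, so this is only an assumed baseline form.

```python
# Three-step temporal phase retrieval with symmetric shifts of +/- alpha.
# A textbook variant, assumed here for illustration.
import numpy as np

def three_step_phase(i1, i2, i3, alpha):
    """Recover the wrapped phase from three frames shifted by -a, 0, +a."""
    num = (1 - np.cos(alpha)) * (i1 - i3)
    den = np.sin(alpha) * (2 * i2 - i1 - i3)
    return np.arctan2(num, den)               # wrapped phase in (-pi, pi]

# Toy check on a synthetic fringe pattern with assumed contrast and shift.
x = np.linspace(0, 4 * np.pi, 256)
phi = x % (2 * np.pi) - np.pi
alpha, gamma = 2 * np.pi / 3, 0.7
frames = [1 + gamma * np.cos(phi + s) for s in (-alpha, 0.0, alpha)]
phi_hat = three_step_phase(*frames, alpha)
```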

  19. Advanced digital signal processing for short-haul and access network

    NASA Astrophysics Data System (ADS)

    Zhang, Junwen; Yu, Jianjun; Chi, Nan

    2016-02-01

    Digital signal processing (DSP) has recently proved to be a successful technology in high-speed, high-spectral-efficiency optical short-haul and access networks, enabling high performance through digital equalization and compensation. In this paper, we investigate advanced DSP at the transmitter and receiver sides for signal pre-equalization and post-equalization in an optical access network. A novel DSP-based digital and optical pre-equalization scheme has been proposed for bandwidth-limited high-speed short-distance communication systems, based on the feedback of receiver-side adaptive equalizers such as the least-mean-squares (LMS) algorithm and constant- or multi-modulus algorithms (CMA, MMA). Based on this scheme, we experimentally demonstrate 400GE on a single optical carrier using the highest-rate ETDM 120-GBaud PDM-PAM-4 signal, one external modulator and coherent detection. A line rate of 480-Gb/s is achieved, which accommodates a 20% forward-error correction (FEC) overhead while keeping the 400-Gb/s net information rate. The performance after fiber transmission shows a large margin for both short-range and metro/regional networks. We also extend the advanced DSP to short-haul optical access networks by using higher-order QAM. We propose and demonstrate a high-speed multi-band CAP-WDM-PON system based on intensity modulation, direct detection and digital equalization. A hybrid modified cascaded MMA post-equalization scheme is used to equalize the multi-band CAP-mQAM signals. Using this scheme, we successfully demonstrate a 550-Gb/s high-capacity WDM-PON system with 11 WDM channels, 55 sub-bands, and 10-Gb/s per user in the downstream over 40-km SMF.

  20. Experimental demonstration of a format-flexible single-carrier coherent receiver using data-aided digital signal processing.

    PubMed

    Elschner, Robert; Frey, Felix; Meuer, Christian; Fischer, Johannes Karl; Alreesh, Saleem; Schmidt-Langhorst, Carsten; Molle, Lutz; Tanimura, Takahito; Schubert, Colja

    2012-12-17

    We experimentally demonstrate the use of data-aided digital signal processing for format-flexible coherent reception of different 28-GBd PDM and 4D modulated signals in WDM transmission experiments over up to 7680 km SSMF by using the same resource-efficient digital signal processing algorithms for the equalization of all formats. Stable and regular performance in the nonlinear transmission regime is confirmed.

  1. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

    Camera imaging systems are widely used; however, the displayed image often exhibits an uneven light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text image processing, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping a high contrast in the image. For graph image processing, adaptive section control using piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the performance of light balancing is better than that of the other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost to make it applicable in real-time systems.
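    The line-based background-estimation-plus-gain idea can be sketched as follows; the window size, target level and gain limit are assumed values, and the paper's adaptive section control for graph images is omitted.

```python
# Line-based light balancing: estimate a slowly varying background per scan
# line, then apply a bounded per-pixel gain. Parameters are assumed values.
import numpy as np

def balance_line(line, window=31, target=220.0, max_gain=4.0):
    """Compensate uneven illumination on one grayscale scan line."""
    pad = window // 2
    padded = np.pad(line.astype(float), pad, mode="edge")
    # Local maximum as a crude paper-white (background) estimate per pixel.
    background = np.array([padded[i:i + window].max() for i in range(len(line))])
    gain = np.clip(target / np.maximum(background, 1.0), 0.0, max_gain)
    return np.clip(line * gain, 0, 255).astype(np.uint8)

# Toy usage: a text-like line darkened toward one side.
x = np.arange(512)
line = (200 - 0.2 * x).clip(0, 255)            # uneven illumination ramp
line[::17] = 20                                # dark "text" pixels
balanced = balance_line(line)
```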

  2. Rethinking the Systems Engineering Process in Light of Design Thinking

    DTIC Science & Technology

    2016-04-30

    systems engineering process models (Blanchard & Fabrycky, 1990) and the majority of engineering design education (Dym et al., 2005). The waterfall model ... Engineering Career Competency Model. Clifford Whitcomb, Systems Engineering Professor, NPS; Corina White, Systems Engineering Research Associate, NPS ... Postgraduate School (NPS) in Monterey, CA. He teaches and conducts research in the design of enterprise systems, systems modeling, and system

  3. An evaluation of the Intel 2920 digital signal processing integrated circuit

    NASA Technical Reports Server (NTRS)

    Heller, J.

    1981-01-01

    The circuit consists of a digital to analog converter, accumulator, read write memory and UV erasable read only memory. The circuit can convert an analog signal to a digital representation, perform mathematical operations on the digital signal and subsequently convert the digital signal to an analog output. Development software tailored for programming the 2920 is presented.

  4. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.

  5. Automated image processing of Landsat II digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention.

  6. The discrete prolate spheroidal filter as a digital signal processing tool

    NASA Technical Reports Server (NTRS)

    Mathews, J. D.; Breakall, J. K.; Karawas, G. K.

    1983-01-01

    The discrete prolate spheroidal (DPS) filter is one of the class of nonrecursive finite impulse response (FIR) filters. The DPS filter is superior to other filters in this class in that it has maximum energy concentration in the frequency passband and minimum ringing in the time domain. A mathematical development of the DPS filter properties is given, along with the information required to construct the filter. The properties of this filter were compared with those of the more commonly used filters of the same class. Use of the DPS filter allows for particularly meaningful statements of data time/frequency resolution cell values. The filter forms an especially useful tool for digital signal processing.
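    SciPy exposes discrete prolate spheroidal (Slepian) sequences directly, so a DPS-windowed FIR smoother can be sketched as below; the filter length and time-bandwidth product are assumed values chosen only for illustration.

```python
# FIR low-pass smoother built from a discrete prolate spheroidal sequence.
# Length and time-bandwidth product are assumed, not taken from the paper.
import numpy as np
from scipy.signal import lfilter
from scipy.signal.windows import dpss

n_taps, nw = 101, 3.0                       # length and time-bandwidth product
taps = dpss(n_taps, nw)                     # first Slepian sequence
taps = taps / taps.sum()                    # unit DC gain -> low-pass smoother

# Apply to a noisy sampled signal; the DPS taps concentrate energy in the
# passband with very low sidelobes, so ringing in the time domain is small.
fs = 1000.0
t = np.arange(2000) / fs
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)
y = lfilter(taps, 1.0, x)
```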

  7. LANDSAT digital data processing: A near real-time application. [Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Barker, J. L.; Bohn, C.; Stuart, L.; Hill, J.

    1975-01-01

    An application of rapid generation of classed digital images from LANDSAT-1 was demonstrated and its feasibility evaluated by NASA in conjunction with the Environmental Protection Agency (EPA), Texas A and M University (TAMU), and the Cousteau Society. The primary purpose was to show that satellite data could be processed and transmitted to the Calypso, which was used as a research vessel, in time for use in directing it to specific locations of possible plankton upwellings, sediment, or other anomalies in the coastal water areas along the Gulf of Mexico.

  8. Comparison of digital signal processing modules in gamma-ray spectrometry.

    PubMed

    Lépy, Marie-Christine; Cissé, Ousmane Ibrahima; Pierre, Sylvie

    2014-05-01

    Commercial digital signal-processing modules have been tested for their applicability to gamma-ray spectrometry. The tests were based on the same n-type high purity germanium detector. The spectrum quality was studied in terms of energy resolution and peak area versus shaping parameters, using a Eu-152 point source. The stability of a reference peak count rate versus the total count rate was also examined. The reliability of the quantitative results is discussed for their use in measurement at the metrological level. © 2013 Published by Elsevier Ltd.

  9. UCMS - A new signal parameter measurement system using digital signal processing techniques. [User Constraint Measurement System

    NASA Technical Reports Server (NTRS)

    Choi, H. J.; Su, Y. T.

    1986-01-01

    The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.

  10. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  11. Inventories of Delaware's coastal vegetation and land-use utilizing digital processing of ERTS-1 imagery

    NASA Technical Reports Server (NTRS)

    Klemas, V. (Principal Investigator); Bartlett, D.; Rogers, R.; Reed, L.

    1974-01-01

    The author has identified the following significant results. Analysis of ERTS-1 color composite images using analog processing equipment confirmed that all the major wetlands plant species were distinguishable at ERTS-1 scale. Furthermore, human alterations of the coastal zone were easily recognized, since such alterations typically involve removal of vegetative cover resulting in a change of spectral signature. The superior spectral resolution of the CCTs as compared with single band or composite imagery has indeed provided good discrimination through digital analysis of the CCTs, with the added advantage of rapid production of thematic maps and data.

  12. Processing of Digital Plates of the 1.2 m Schmidt Telescope of Baldone Observatory

    NASA Astrophysics Data System (ADS)

    Eglitis, Ilgmars; Andruk, Vitaly

    2017-04-01

    The aim of this research is to evaluate the accuracy of the plate processing method and to perform a detailed study of the Epson Expression 10000XL scanner, which was used to digitize plates from the collection of the 1.2 m Schmidt Telescope installed at the Baldone Observatory. Special software developed in the LINUX/MIDAS/ROMAFOT environment was used for processing the scans. Results for digitized files with grey gradations of 8 and 16 bits were compared, and the accuracy of the developed method for determining rectangular coordinates and performing photometry was estimated. Errors in the instrumental system are ±0.026 pixels and ±0.024m for coordinates and stellar magnitudes, respectively. To evaluate the repeatability of the scanner's astrometric and photometric errors, six consecutive scans of one plate were processed at a resolution of 1200 dpi. The following error estimates are obtained for stars brighter than U < 13.5m: σxy = ±0.021 to 0.027 pixels and σm = ±0.014m to 0.016m for rectangular coordinates and instrumental stellar magnitudes, respectively.

  13. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs by developing a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
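    The kind of binarization and stereological measurement described (Otsu thresholding followed by total crack length per unit area) can be sketched with scikit-image on a synthetic image; the pixel size and crack pattern are assumed, and this is not the authors' processing chain or SEM data.

```python
# Otsu binarization, skeleton thinning and a rough crack-length-per-area
# estimate on a synthetic "micrograph"; all parameters are assumed.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

# Synthetic grayscale image: bright background with a dark diagonal crack.
img = np.full((256, 256), 200.0)
rr = np.arange(256)
img[rr, rr] = 40
img[rr[:-1], rr[:-1] + 1] = 60
img += np.random.default_rng(2).normal(0, 5, img.shape)

# Binarize (cracks darker than the Otsu threshold), thin to a 1-pixel skeleton.
cracks = img < threshold_otsu(img)
skeleton = skeletonize(cracks)

pixel_size_um = 0.1                            # assumed pixel size in micrometres
crack_length_um = skeleton.sum() * pixel_size_um   # rough length estimate
area_um2 = img.size * pixel_size_um**2
print(f"total crack length per unit area: {crack_length_um / area_um2:.4f} 1/um")
```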

  14. Digital Storytelling as a Narrative Health Promotion Process: Evaluation of a Pilot Study.

    PubMed

    DiFulvio, Gloria T; Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah E; Del Toro-Mejias, Lizbeth Marie

    2016-04-01

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. The process of individuals telling their own stories has not been well assessed as a mechanism of health behavior change. This study looks at outcomes associated with engaging in the DST process for vulnerable youth. The project focused on the experiences of Puerto Rican Latinas between the ages of 15 to 21. A total of 30 participants enrolled in a 4-day DST workshops, with 29 completing a 1 to 3-minute digital story. Self-reported data on several scales (self-esteem, social support, empowerment, and sexual attitudes and behaviors) were collected and analyzed. Participants showed an increase in positive social interactions from baseline to 3-month post workshop. Participants also demonstrated increases in optimism and control over the future immediately after the workshop, but this change was not sustained at 3 months. Analysis of qualitative results and implications are discussed. © The Author(s) 2016.

  15. Virtual and flexible digital signal processing system based on software PnP and component works

    NASA Astrophysics Data System (ADS)

    He, Tao; Wu, Qinghua; Zhong, Fei; Li, Wei

    2005-05-01

    The concept of software PnP (Plug & Play), analogous to hardware PnP, is put forward, and based on this idea a virtual, flexible digital signal processing system (FVDSPS) is implemented. FVDSPS is composed of a main control center, many sub-function modules and other hardware I/O modules. The main control center sends commands to the sub-function modules and manages the running order, parameters and results of the sub-functions. The software kernel of FVDSPS is the DSP (Digital Signal Processing) module, which communicates with the main control center through defined protocols, accepting commands and sending requests. Data sharing and exchange between the main control center and the DSP modules are carried out and managed through the file system of the Windows operating system. FVDSPS is truly object-oriented, engineer-oriented and problem-oriented. With FVDSPS, users can freely plug and play, and quickly reconfigure a signal processing system for a given engineering problem without programming. What you see is what you get. Thus, an engineer can address engineering problems directly, pay more attention to the problem itself, and improve the flexibility, reliability and accuracy of the testing system. Because FVDSPS is based on the TCP/IP protocol, testing engineers and technology experts can be connected freely over the Internet regardless of location, so engineering problems can be resolved quickly and effectively. FVDSPS can be used in many fields such as instrumentation, fault diagnosis, device maintenance and quality control.
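    The "software plug-and-play" architecture described above can be illustrated with a minimal registry pattern in which sub-function modules are plugged into a main control center that dispatches commands to them; the module names and command interface below are invented for this sketch, not taken from FVDSPS.

```python
# Minimal, hypothetical illustration of software plug-and-play dispatch:
# modules register with a control center, which runs them by name.
from typing import Callable, Dict
import numpy as np

class MainControlCenter:
    def __init__(self) -> None:
        self.modules: Dict[str, Callable] = {}

    def plug(self, name: str, module: Callable) -> None:
        """Register (plug in) a sub-function module at run time."""
        self.modules[name] = module

    def run(self, name: str, *args, **kwargs):
        """Dispatch a command to a plugged-in module and return its result."""
        return self.modules[name](*args, **kwargs)

center = MainControlCenter()
center.plug("spectrum", lambda x: np.abs(np.fft.fft(x)))   # example DSP module
center.plug("mean", lambda x: float(np.mean(x)))
print(center.run("mean", [1.0, 2.0, 3.0]))
```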

  16. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1984-06-22

    Keywords: image deblurring, source encoding, signal sampling, coherence measurement, noise performance, pseudocolor encoding. Report section headings recoverable from the record: 2.1 Broad Spectral Band Color Image Deblurring; 2.2 Noise Performance; 2.3 Pseudocolor Encoding with Three Primary ... Abstract fragments: "... spectra. This technique is particularly suitable for linear smeared color image deblurring." "In this period, we have also ..."

  17. Adults' Arithmetic Builds on Fast and Automatic Processing of Arabic Digits: Evidence from an Audiovisual Matching Paradigm

    PubMed Central

    Sasanguie, Delphine; Reynvoet, Bert

    2014-01-01

    Several studies have shown that performance on symbolic number tasks is related to individual differences in arithmetic. However, it is not clear which process is responsible for this association, i.e. fast, automatic processing of symbols per se or access to the underlying non-symbolic representation of the symbols. To dissociate between both options, adult participants performed an audiovisual matching paradigm. Auditory presented number words needed to be matched with either Arabic digits or dot patterns. The results revealed that a distance effect was present in the dots-number word matching task and absent in the digit-number word matching task. Crucially, only performance in the digit task contributed to the variance in arithmetical abilities. This led us to conclude that adults' arithmetic builds on the ability to quickly and automatically process Arabic digits, without the underlying non-symbolic magnitude representation being activated. PMID:24505308

  18. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  19. A Study of Light Level Effect on the Accuracy of Image Processing-based Tomato Grading

    NASA Astrophysics Data System (ADS)

    Prijatna, D.; Muhaemin, M.; Wulandari, R. P.; Herwanto, T.; Saukat, M.; Sugandi, W. K.

    2018-05-01

    Image processing methods have been used in non-destructive tests of agricultural products. Compared to manual methods, image processing may produce more objective and consistent results. The image capturing box installed in the currently used tomato grading machine (TEP-4) is equipped with four fluorescent lamps to illuminate the processed tomatoes. Since the performance of any lamp decreases once its service time exceeds its lifetime, it is expected that this will affect tomato classification. The objective of this study was to determine the minimum light levels that affect classification accuracy. The study was conducted by varying the light level from minimum to maximum on tomatoes in the image capturing box and then investigating its effects on image characteristics. Research results showed that light intensity affects two variables that are important for classification, namely the area and color of the captured image. The image processing program was able to correctly determine the weight and class of tomatoes when the light level was 30 lx to 140 lx.

  20. Tunable Light-Guide Image Processing Snapshot Spectrometer (TuLIPSS) for Earth and Moon Observations

    NASA Astrophysics Data System (ADS)

    Tkaczyk, T. S.; Alexander, D.; Luvall, J. C.; Wang, Y.; Dwight, J. G.; Pawlowsk, M. E.; Howell, B.; Tatum, P. F.; Stoian, R.-I.; Cheng, S.; Daou, A.

    2018-02-01

    A tunable light-guide image processing snapshot spectrometer (TuLIPSS) for Earth science research and observation is being developed through a NASA instrument incubator project with Rice University and Marshall Space Flight Center.

  1. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single-real valued filters. This introduces increased computation burden but also introduces a higher level of parallelism, that common computing platforms fail to identify. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as, vector inner product operators that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation and they map readily onto parallel digital architectures, which imply new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as with the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures. Typically, optical systems (like the 4f correlators) are limited to phase
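    The contrast drawn above between FFT-based correlation and direct vector inner product scoring can be illustrated with a toy filter bank; the random image and filters below stand in for trained distortion-invariant filters and are not the dissertation's algorithms.

```python
# Toy comparison of two ways to score a distortion filter bank: a full
# FFT-based correlation plane per filter versus one inner product per filter.
# Image and filters are random stand-ins, assumed only for illustration.
import numpy as np

rng = np.random.default_rng(4)
img = rng.normal(size=(64, 64))
bank = [rng.normal(size=(64, 64)) for _ in range(8)]   # e.g. one per rotation

# FFT route: compute the whole circular correlation plane, then take its peak.
fft_scores = [np.abs(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(h)))).max()
              for h in bank]

# Inner-product route: one dot product per filter with the vectorized image.
x = img.ravel()
ip_scores = [float(np.dot(x, h.ravel())) for h in bank]

best = int(np.argmax(ip_scores))        # index of the best-matching filter
```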

  2. The processing and heterostructuring of silk with light.

    PubMed

    Sidhu, Mehra S; Kumar, Bhupesh; Singh, Kamal P

    2017-09-01

    Spider silk is a tough, elastic and lightweight biomaterial, although there is a lack of tools available for non-invasive processing of silk structures. Here we show that nonlinear multiphoton interactions of silk with few-cycle femtosecond pulses allow the processing and heterostructuring of the material in ambient air. Two qualitatively different responses, bulging by multiphoton absorption and plasma-assisted ablation, are observed for low- and high-peak intensities, respectively. Plasma ablation allows us to make localized nanocuts, microrods, nanotips and periodic patterns with minimal damage while preserving molecular structure. The bulging regime facilitates confined bending and microwelding of silk with materials such as metal, glass and Kevlar with strengths comparable to pristine silk. Moreover, analysis of Raman bands of microwelded joints reveals that the polypeptide backbone remains intact while perturbing its weak hydrogen bonds. Using this approach, we fabricate silk-based functional topological microstructures, such as Möbius strips, chiral helices and silk-based sensors.

  3. The processing and heterostructuring of silk with light

    NASA Astrophysics Data System (ADS)

    Sidhu, Mehra S.; Kumar, Bhupesh; Singh, Kamal P.

    2017-09-01

    Spider silk is a tough, elastic and lightweight biomaterial, although there is a lack of tools available for non-invasive processing of silk structures. Here we show that nonlinear multiphoton interactions of silk with few-cycle femtosecond pulses allow the processing and heterostructuring of the material in ambient air. Two qualitatively different responses, bulging by multiphoton absorption and plasma-assisted ablation, are observed for low- and high-peak intensities, respectively. Plasma ablation allows us to make localized nanocuts, microrods, nanotips and periodic patterns with minimal damage while preserving molecular structure. The bulging regime facilitates confined bending and microwelding of silk with materials such as metal, glass and Kevlar with strengths comparable to pristine silk. Moreover, analysis of Raman bands of microwelded joints reveals that the polypeptide backbone remains intact while perturbing its weak hydrogen bonds. Using this approach, we fabricate silk-based functional topological microstructures, such as Möbius strips, chiral helices and silk-based sensors.

  4. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data is collected. This interpretive process that results in the final geological map is often supported by recording in a field notebook, observations, ideas and alternative geological models explored with the use of sketches and evolutionary diagrams. In combination the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has changed smartphones into geological mapping tools that can be used to collect lots of geological data quickly. With GPS functionality this data is also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced infield photography. In contrast line drawing, for example for lithological boundary interpretation and sketching

  5. Shining a light on planetary processes using synchrotron techniques

    NASA Astrophysics Data System (ADS)

    Brand, H. E. A.; Kimpton, J. A.

    2017-12-01

    The Australian Synchrotron is a world-class national research facility that uses accelerator technology to produce X-rays and infrared for research. It is available for researchers from all institutions and disciplines. This contribution is intended to inform the community of the current capabilities at the facility using examples drawn from planetary research across the beamlines. Examples will include: formation of jarosite minerals with a view to Mars; studies of Micrometeorites; and large volume CT imaging of geological samples. A suite of new beamlines has been proposed for the growth of the facility and one of these, ADS, the Advanced Diffraction and Scattering beamline, is intended to be a high energy X-ray diffraction beamline capable of reaching extreme conditions and carrying out challenging in situ experiments. There is an opportunity to develop complex new sample environments which could be of relevance to shock metamorphic processes and this will form part of the discussion.

  6. 3D Reconstruction - Reverse Engineering - Digital Fabrication of the Egyptian Palermo Stone by Smartphone and Light Structured Scanner

    NASA Astrophysics Data System (ADS)

    Di Paola, F.; Inzerillo, L.

    2018-05-01

    This paper presents a pipeline that has been developed to acquire a shape with particular features both under the geometric and radiometric aspects. In fact, the challenge was to build a 3D model of the black Stone of Palermo, where the oldest Egyptian history was printed with the use of hieroglyphs. The dark colour of the material and the superficiality of the hieroglyphs' groove have made the acquisition process very complex to the point of having to experiment with a pipeline that allows the structured light scanner not to lose the homologous points in the 3D alignment phase. For the texture reconstruction we used a last generation smartphone.

  7. Processing and Memory of Color, Contour, and Pattern Found in Computer Digitized Color Pictures for Elementary Children.

    ERIC Educational Resources Information Center

    Marschalek, Douglas G.

    1988-01-01

    Describes study of children in grades one, three, and five that examined their active processing and short term memory (STM) of color, contour, and interior pattern of shapes found in computer digitized pictures. Age-related differences are examined, and the role of processing visual information in the learning process is discussed. (12…

  8. CORDIC-based digital signal processing (DSP) element for adaptive signal processing

    NASA Astrophysics Data System (ADS)

    Bolstad, Gregory D.; Neeld, Kenneth B.

    1995-04-01

    The High Performance Adaptive Weight Computation (HAWC) processing element is a CORDIC based application specific DSP element that, when connected in a linear array, can perform extremely high throughput (100s of GFLOPS) matrix arithmetic operations on linear systems of equations in real time. In particular, it very efficiently performs the numerically intense computation of optimal least squares solutions for large, over-determined linear systems. Most techniques for computing solutions to these types of problems have used either a hard-wired, non-programmable systolic array approach, or more commonly, programmable DSP or microprocessor approaches. The custom logic methods can be efficient, but are generally inflexible. Approaches using multiple programmable generic DSP devices are very flexible, but suffer from poor efficiency and high computation latencies, primarily due to the large number of DSP devices that must be utilized to achieve the necessary arithmetic throughput. The HAWC processor is implemented as a highly optimized systolic array, yet retains some of the flexibility of a programmable data-flow system, allowing efficient implementation of algorithm variations. This provides flexible matrix processing capabilities that are one to three orders of magnitude less expensive and more dense than the current state of the art, and more importantly, allows a realizable solution to matrix processing problems that were previously considered impractical to physically implement. HAWC has direct applications in RADAR, SONAR, communications, and image processing, as well as in many other types of systems.
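
    The rotation-mode CORDIC iteration that underlies such arrays (each Givens rotation reduces to shift-and-add steps) can be illustrated with a short software sketch. This is Python for readability only; the iteration depth and test vector are arbitrary choices for the example, not parameters of the HAWC design.

    ```python
    import math

    def cordic_rotate(x, y, angle, iterations=32):
        """Rotate the vector (x, y) by `angle` radians using shift-and-add CORDIC steps."""
        # Precompute the arctangent table and the accumulated CORDIC gain for this depth.
        atan_table = [math.atan(2.0 ** -i) for i in range(iterations)]
        gain = 1.0
        for i in range(iterations):
            gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))

        z = angle
        for i in range(iterations):
            d = 1.0 if z >= 0 else -1.0                       # steer toward the residual angle
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * atan_table[i]
        # Undo the gain so the output vector has the correct magnitude.
        return x / gain, y / gain

    # Example: rotate the unit vector (1, 0) by 30 degrees -> roughly (0.866, 0.5).
    print(cordic_rotate(1.0, 0.0, math.radians(30)))
    ```

    In hardware the multiplications by 2^-i become fixed wire shifts, which is what keeps each processing element compact enough to tile into a systolic array.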

  9. Irdis: A Digital Scene Storage And Processing System For Hardware-In-The-Loop Missile Testing

    NASA Astrophysics Data System (ADS)

    Sedlar, Michael F.; Griffith, Jerry A.

    1988-07-01

    This paper describes the implementation of a Seeker Evaluation and Test Simulation (SETS) Facility at Eglin Air Force Base. This facility will be used to evaluate imaging infrared (IIR) guided weapon systems by performing various types of laboratory tests. One such test is termed Hardware-in-the-Loop (HIL) simulation (Figure 1), in which the actual flight of a weapon system is simulated as closely as possible in the laboratory. As shown in the figure, there are four major elements in the HIL test environment: the weapon/sensor combination, an aerodynamic simulator, an imagery controller, and an infrared imagery system. The paper concentrates on the approaches and methodologies used in the imagery controller and infrared imaging system elements for generating scene information. For procurement purposes, these two elements have been combined into an Infrared Digital Injection System (IRDIS) which provides scene storage, processing, and an output interface to drive a radiometric display device or to directly inject digital video into the weapon system (bypassing the sensor). The paper describes in detail how standard and custom image processing functions have been combined with off-the-shelf mass storage and computing devices to produce a system which provides high sample rates (greater than 90 Hz), a large terrain database, high weapon rates of change, and multiple independent targets. A photo-based approach has been used to maximize terrain and target fidelity, thus providing a rich and complex scene for weapon/tracker evaluation.

  10. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM), it is of great importance to reveal valuable information from the recorded SHM data that could be used to predict or indicate structural fault or damage in a building. In this work a combination of digital signal processing methods, namely the FFT along with the Wavelet Transform, is applied, together with a proposed algorithm to study frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute, undergoing high use, stress, and visits from academic staff and students. The SHM data are collected from two neighboring buildings of different ages (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, presenting a comparison of the structural behavior of both buildings in response to seismic activity, weather conditions and man-made activity. Acknowledgments: This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)» and is co-financed by the European Union (European Social Fund) and Greek National Fund.
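
    As a rough illustration of combining FFT and wavelet analysis on an SHM record, the sketch below runs both on a synthetic two-mode acceleration trace. The sampling rate, modal frequencies and wavelet choice are assumptions for the example, not values from the study.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    fs = 200.0                          # assumed sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)
    # Synthetic stand-in for an SHM acceleration record: two structural modes plus noise.
    trace = np.sin(2 * np.pi * 3.2 * t) + 0.4 * np.sin(2 * np.pi * 11.7 * t)
    trace += 0.1 * np.random.randn(t.size)

    # FFT: locate the dominant structural frequency.
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, d=1 / fs)
    print("dominant frequency [Hz]:", freqs[np.argmax(spectrum)])

    # Wavelet decomposition: time-localised energy per frequency band.
    coeffs = pywt.wavedec(trace, "db4", level=5)
    for level, c in enumerate(coeffs):
        print(f"band {level}: energy {np.sum(c ** 2):.2f}")
    ```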

  11. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  12. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    PubMed

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study in 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  13. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Technical Reports Server (NTRS)

    Buel, V. E.

    1981-01-01

    A modular system composed of two sets of tasks, which interprets the flaw data and allows compensation of the data for transducer characteristics, is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job, version 2C-02 operating system, controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11, version 2C-02, operating system, stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessor task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data is then processed by the flaw classification postprocessing task. The use and operation of a flaw data postprocessor is described.

  14. Digital seismo-acoustic signal processing aboard a wireless sensor platform

    NASA Astrophysics Data System (ADS)

    Marcillo, O.; Johnson, J. B.; Lorincz, K.; Werner-Allen, G.; Welsh, M.

    2006-12-01

    We are developing a low power, low-cost wireless sensor array to conduct real-time signal processing of earthquakes at active volcanoes. The sensor array, which integrates data from both seismic and acoustic sensors, is based on Moteiv TMote Sky wireless sensor nodes (www.moteiv.com). The nodes feature a Texas Instruments MSP430 microcontroller, 48 Kbytes of program memory, 10 Kbytes of static RAM, 1 Mbyte of external flash memory, and a 2.4-GHz Chipcon CC2420 IEEE 802.15.4 radio. The TMote Sky is programmed in TinyOS. Basic signal processing occurs on an array of three peripheral sensor nodes. These nodes are tied into a dedicated GPS receiver node, which is focused on time synchronization, and a central communications node, which handles data integration and additional processing. The sensor nodes incorporate dual 12-bit digitizers sampling a seismic sensor and a pressure transducer at 100 samples per second. The wireless capabilities of the system allow flexible array geometry, with a maximum aperture of 200 m. We have already developed the digital signal processing routines on board the Moteiv TMote sensor nodes. The developed routines accomplish Real-time Seismic-Amplitude Measurement (RSAM), Seismic Spectral-Amplitude Measurement (SSAM), and a user-configured Short Term Averaging / Long Term Averaging (STA/LTA) ratio, which is used to calculate first arrivals. The processed data from individual nodes are transmitted back to a central node, where additional processing may be performed. Such processing will include back azimuth determination and other wave field analyses. Future on-board signal processing will focus on event characterization utilizing pattern recognition and spectral characterization. The processed data is intended as low bandwidth information which can be transmitted periodically and at low cost through satellite telemetry to a web server. The processing is limited by the computational capabilities (RAM, ROM) of the nodes. Nevertheless, we
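
    A minimal off-line sketch of two of the on-board routines, RSAM and an STA/LTA trigger, is given below. The window lengths and the synthetic 100 Hz trace are illustrative assumptions; the real firmware works sample-by-sample in fixed point on the microcontroller.

    ```python
    import numpy as np

    def rsam(samples, window):
        """Real-time Seismic-Amplitude Measurement: mean absolute amplitude per window."""
        n = samples.size // window
        return np.abs(samples[:n * window]).reshape(n, window).mean(axis=1)

    def sta_lta(samples, n_sta, n_lta):
        """Short-term / long-term average energy ratio (non-causal here, for offline illustration)."""
        energy = samples.astype(float) ** 2
        sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
        lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
        return sta / np.maximum(lta, 1e-12)

    # Synthetic 100 Hz trace with an event burst halfway through.
    trace = np.random.randn(6000) * 0.1
    trace[3000:3200] += np.random.randn(200)

    print("RSAM (first windows):", rsam(trace, window=100)[:5])
    trigger = sta_lta(trace, n_sta=50, n_lta=1000) > 3.0
    print("first trigger sample:", np.argmax(trigger))
    ```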

  15. An in vitro comparison of quantitative light-induced fluorescence-digital and spectrophotometer on monitoring artificial white spot lesions.

    PubMed

    Kim, Hee Eun; Kim, Baek-Il

    2015-09-01

    The aim of this study was to evaluate the efficacy of quantitative light-induced fluorescence-digital (QLF-D) compared to a spectrophotometer in monitoring progression of enamel lesions. To generate artificial caries with various severities of lesion depths, twenty bovine specimens were immersed in demineralizing solution for 40 days. During the production of the lesions, repeat measurements of fluorescence loss (ΔF) and color change (ΔE) were performed in six distinct stages after the demineralization of the specimens: after 3, 5, 10, 20, 30, and 40 days of exposure to the demineralizing solution. Changes in the ΔF values in the lesions were analyzed using the QLF-D, and changes in the ΔE values in lesions were analyzed using a spectrophotometer. A repeated-measures ANOVA of the ΔF and ΔE values was used to determine whether there were significant differences at different exposure times in the demineralizing solution. Spearman's rank correlation coefficient was analyzed between ΔF and ΔE. The ΔF values significantly decreased based on the demineralizing period (p<0.001). Relatively large changes in ΔF values were observed during the first 10 days. There were significant changes in L*, a*, b*, and ΔE values in lesions with increasing demineralizing duration (p<0.001). A strong correlation was observed between ΔF and ΔE with ρ=-0.853 (p<0.001). The results support the efficacy of QLF-D in monitoring color changes. Our findings demonstrate that QLF-D is a more efficient and stable tool for early caries detection. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.

  17. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Technical Reports Server (NTRS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    1989-01-01

    The task of differentiating very small metal aggregates supported on amorphous films from the phase contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis even for micrographs where, by naked-eye examination, the distinction between particles and erroneous substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions and it is concluded that these particles consist of 'a magic number' of 13 atoms and have cuboctahedral or icosahedral crystal structure.

  18. Real-time digital holographic microscopy using the graphic processing unit.

    PubMed

    Shimobaba, Tomoyoshi; Sato, Yoshikuni; Miura, Junya; Takenouchi, Mai; Ito, Tomoyoshi

    2008-08-04

    Digital holographic microscopy (DHM) is a well-known, powerful method allowing both the amplitude and phase of a specimen to be simultaneously observed. In order to obtain a reconstructed image from a hologram, numerous calculations of the Fresnel diffraction are required. The Fresnel diffraction can be accelerated by the FFT (Fast Fourier Transform) algorithm. However, real-time reconstruction from a hologram is difficult even if we use a recent central processing unit (CPU) to calculate the Fresnel diffraction by the FFT algorithm. In this paper, we describe a real-time DHM system using a graphic processing unit (GPU) with many stream processors, which allows its use as a highly parallel processor. The computational speed of the Fresnel diffraction using the GPU is faster than that of recent CPUs. The real-time DHM system can obtain reconstructed images from holograms of 512 x 512 grid points at 24 frames per second.
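
    The numerical core of such a reconstruction is a single-FFT Fresnel transform. A CPU-side sketch is shown below; the wavelength, pixel pitch and reconstruction distance are hypothetical, and the constant scaling and output-plane phase factors are omitted because only the intensity image is needed.

    ```python
    import numpy as np

    def fresnel_propagate(field, wavelength, pixel_pitch, distance):
        """Single-FFT Fresnel diffraction of a sampled complex field (a common DHM kernel).
        The output-plane pixel pitch becomes wavelength*distance/(n*pixel_pitch)."""
        n = field.shape[0]
        k = 2 * np.pi / wavelength
        coords = (np.arange(n) - n / 2) * pixel_pitch
        x, y = np.meshgrid(coords, coords)
        # Quadratic phase applied in the hologram plane, followed by one centred FFT.
        chirp = np.exp(1j * k / (2 * distance) * (x ** 2 + y ** 2))
        return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field * chirp)))

    # Hypothetical 512 x 512 hologram reconstructed at 0.1 m with a 633 nm source.
    hologram = np.random.rand(512, 512)
    image = fresnel_propagate(hologram, wavelength=633e-9, pixel_pitch=6.4e-6, distance=0.1)
    print(np.abs(image).max())
    ```

    On a GPU the same chirp multiplication and 2D FFT map directly onto batched kernels, which is where the reported speed-up over a CPU comes from.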

  19. Dynamic deformation image de-blurring and image processing for digital imaging correlation measurement

    NASA Astrophysics Data System (ADS)

    Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.

    2017-11-01

    This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach, which is used to estimate the Point Spread Function (PSF) during the camera exposure window. The deconvolution process, involving iterative matrix calculations over pixels, is then performed on the GPU to decrease the time cost. Compared to the Gauss method and the Lucy-Richardson method, the proposed approach gives the best image restoration results. The proposed method has been evaluated using the Hopkinson bar loading system. In comparison to the blurry image, the proposed method has successfully restored the image. It is also demonstrated from image processing applications that the de-blurring method can improve the accuracy and the stability of the digital imaging correlation measurement.
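
    For reference, the Lucy-Richardson baseline that the proposed dynamic-PSF method is compared against can be sketched in a few lines. The synthetic image and horizontal motion-blur PSF below are illustrative assumptions, not the authors' GPU implementation.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(blurred, psf, iterations=30):
        """Plain Richardson-Lucy deconvolution: iteratively re-blur the estimate,
        compare with the observation, and apply the multiplicative correction."""
        estimate = np.full_like(blurred, 0.5)
        psf_mirror = psf[::-1, ::-1]
        for _ in range(iterations):
            reblurred = fftconvolve(estimate, psf, mode="same")
            ratio = blurred / np.maximum(reblurred, 1e-12)
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate

    # Toy example: a horizontal motion-blur PSF, the kind produced during camera exposure.
    psf = np.zeros((9, 9)); psf[4, :] = 1.0 / 9.0
    sharp = np.zeros((64, 64)); sharp[28:36, 28:36] = 1.0
    blurred = np.clip(fftconvolve(sharp, psf, mode="same"), 0, None)
    restored = richardson_lucy(blurred, psf)
    print("mean absolute error:", float(np.abs(restored - sharp).mean()))
    ```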

  20. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG). PMID:24196434

  1. Digital Signal Processing and Generation for a DC Current Transformer for Particle Accelerators

    SciTech Connect

    Zorzetti, Silvia

    2013-01-01

    The thesis topic, digital signal processing and generation for a DC current transformer, focuses on the most fundamental beam diagnostics in the field of particle accelerators, the measurement of the beam intensity, or beam current. The technology of a DC current transformer (DCCT) is well known, and used in many areas, including particle accelerator beam instrumentation, as a non-invasive (shunt-free) method to monitor the DC current in a conducting wire, or in our case, the current of charged particles travelling inside an evacuated metal pipe. So far, custom and commercial DCCTs are entirely based on analog technologies and signal processing, which makes them inflexible, sensitive to component aging, and difficult to maintain and calibrate.

  2. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-11-05

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG).
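
    A minimal software analogue of one such configurable IIR stage, here a Butterworth band-pass applied to a synthetic respiratory-band signal with mains interference, is sketched below. The cut-off frequencies and sampling rate are illustrative assumptions, not the prototype's settings.

    ```python
    import numpy as np
    from scipy import signal

    def iir_bandpass(data, fs, low, high, order=4):
        """Configurable IIR band-pass, similar in spirit to the filters the
        virtual instrument applies to the MEMS magnetometer signal."""
        sos = signal.butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
        return signal.sosfiltfilt(sos, data)   # zero-phase filtering for offline analysis

    # Synthetic 1 kHz record: a 2 Hz respiratory-like component plus 50 Hz interference.
    fs = 1000.0
    t = np.arange(0, 5, 1 / fs)
    raw = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
    clean = iir_bandpass(raw, fs, low=0.5, high=10.0)
    print(clean[:5])
    ```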

  3. Current status on the application of image processing of digital intraoral radiographs amongst general dental practitioners.

    PubMed

    Tohidast, Parisa; Shi, Xie-Qi

    2016-01-01

    The objectives of this study were to present the subjective knowledge level and the use of image processing on digital intraoral radiographs amongst general dental practitioners at Distriktstandvården AB, Stockholm. A questionnaire, consisting of 12 questions, was sent to 12 dental practices in Stockholm. Additionally, 2000 radiographs were randomly selected from these clinics for evaluation of applied image processing and its effect on image quality. Descriptive and analytical statistical methods were applied to present the current status of the use of image processing alternatives for the dentists' daily clinical work. 50 out of 53 dentists participated in the survey. The survey showed that most of the dentists in this study had received education on image processing at some stage of their career. No correlations were found between application of image processing on one side and education received with regard to image processing, previous working experience, age and gender on the other. Image processing in terms of adjusting brightness and contrast was frequently used. Overall, in this study 24.5% of the 200 images were actually image processed in practice, in which 90% of the images were improved or maintained in image quality. According to our survey, image processing is experienced to be frequently used by the dentists at Distriktstandvården AB for diagnosing anatomical and pathological changes using intraoral radiographs. 24.5% of the 200 images were actually image processed in terms of adjusting brightness and/or contrast. In the present study we did not find that the dentists' age, gender, previous working experience and education in image processing influence their viewpoint towards the application of image processing.

  4. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Jorge, Jorge M.

    1998-01-01

    The early evaluation of the visual status of human infants is of critical importance. It is of utmost importance to the development of the child's visual system that she perceives clear, focused retinal images. Furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool rather convenient for application to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using new technological breakthroughs in the fields of imaging devices, image processing and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes. It is refracted by the ocular media, strikes the retina, focusing or not, reflects off and is collected by a camera. The system is formed by one CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation. An optomechanical system also allows eccentric illumination. The light source is a flash type one and is synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  5. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F.; Jorge, Jorge M.

    1997-12-01

    The early evaluation of the visual status of human infants is of critical importance. It is of utmost importance to the development of the child's visual system that she perceives clear, focused retinal images. Furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool rather convenient for application to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using new technological breakthroughs in the fields of imaging devices, image processing and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes. It is refracted by the ocular media, strikes the retina, focusing or not, reflects off and is collected by a camera. The system is formed by one CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation. An optomechanical system also allows eccentric illumination. The light source is a flash type one and is synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  6. Phenopix: a R package to process digital images of a vegetation cover

    NASA Astrophysics Data System (ADS)

    Filippa, Gianluca; Cremonese, Edoardo; Migliavacca, Mirco; Galvagno, Marta; Morra di Cella, Umberto; Richardson, Andrew

    2015-04-01

    Plant phenology is a globally recognized indicator of the effects of climate change on the terrestrial biosphere. Accordingly, new tools to automatically track the seasonal development of a vegetation cover are becoming available and increasingly deployed. Among them, near-continuous digital images are being collected in several networks in the US, Europe, Asia and Australia in a range of different ecosystems, including agricultural lands, deciduous and evergreen forests, and grasslands. The growing scientific interest in vegetation image analysis highlights the need for easy-to-use, flexible and standardized processing techniques. In this contribution we illustrate a new open-source package called "phenopix", written in the R language, that allows processing of images of a vegetation cover. The main features include: (i) definition of one or more areas of interest on an image and processing of pixel information within them, (ii) computation of vegetation indexes based on the red, green and blue channels, (iii) fitting of a curve to the seasonal trajectory of vegetation indexes and extraction of relevant dates (aka thresholds) on the seasonal trajectory; and (iv) analysis of image pixels separately to extract spatially explicit phenological information. The utilities of the package will be illustrated in detail for two subalpine sites, a grassland and a larch stand at about 2000 m in the Italian Western Alps. The phenopix package is a cost-free and easy-to-use tool that allows processing of digital images of a vegetation cover in a standardized, flexible and reproducible way. The software is available for download at the R forge web site (r-forge.r-project.org/projects/phenopix/).
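
    The package itself is written in R, but the core greenness index this kind of pipeline tracks, the green chromatic coordinate averaged over a region of interest, is easy to sketch. The Python snippet below is only an illustration of that index, not the phenopix API, and the frame and region are hypothetical.

    ```python
    import numpy as np

    def green_chromatic_coordinate(image, roi):
        """Green chromatic coordinate (GCC) averaged over a rectangular region of interest:
        GCC = G / (R + G + B), a standard greenness index for phenology cameras."""
        r0, r1, c0, c1 = roi
        patch = image[r0:r1, c0:c1].astype(float)
        r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
        return np.mean(g / np.maximum(r + g + b, 1e-9))

    # Hypothetical 8-bit RGB frame from a vegetation camera.
    frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
    print(green_chromatic_coordinate(frame, roi=(100, 300, 200, 500)))
    ```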

  7. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted. PMID:28287448

  8. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted.

  9. Facial protection conferred by cycle safety helmets: use of digitized image processing to develop a new nondestructive test.

    PubMed

    Harrison, M; Shepherd, J P

    1997-07-01

    Cycle safety helmets are designed to prevent head injury. Although most commercially available helmets conform to one of several national and international standards, individual designs differ widely, particularly in relation to face coverage. A method was developed to assess the potential for the differing designs to protect the face from injury. A nonimpact test was assessed, using digitized image-processing software (Digithurst Ltd.) to measure the shadow cast by a helmet rim under a collimated plane light source onto the face of a mannequin headform. Twelve helmet designs available internationally were tested and ranked with respect to the direct protection conferred (area of the face directly covered by the helmet) and indirect protection (area of the face shaded). The three highest-ranking helmets for direct protection (Rosebank Stackhat, Asphalt Warrior, and Lazer Voyager) also ranked the highest for indirect protection. These helmets were more inferiorly extended and were of a more bulky construction. It was concluded that the dimensions of cycle helmets in relation to face coverage are crucial in influencing the extent to which facial protection is conferred. International test standards need urgent revision to ensure that face coverage is optimized. Lower-face protection could be achieved through incorporation of a lower-face bar to cycle helmets.

  10. Modification in oxidative processes in muscle tissues exposed to laser- and light-emitting diode radiation.

    PubMed

    Monich, Victor A; Bavrina, Anna P; Malinovskaya, Svetlana L

    2018-01-01

    Exposure of living tissues to high-intensity red or near-infrared light can produce oxidative stress effects both in the target zone and adjacent ones. The protein oxidative modification (POM) products can be used as reliable and early markers of oxidative stress. The contents of modified proteins in the investigated specimens can be evaluated by the 2,4-dinitrophenylhydrazine assay (the DNPH assay). Low-intensity red light is able to decrease the activity of oxidative processes, and the DNPH assay data about the POM products in the biological tissues could show both an oxidative stress level and the efficiency of physical agent protection against the oxidative processes. Two control groups of white rats were irradiated by laser light, the first control group by red light and the second one by near-infrared radiation (NIR). Two experimental groups were consequently treated with laser and red low-level light-emitting diode radiation (LED). One of them was exposed to red laser light + LED and the other to NIR + LED. The fifth group was intact. Each group included ten animals. The effect of laser light was studied by methods of protein oxidative modifications. We measured levels of both induced and spontaneous POM products by the DNPH assay. The dramatic increase in levels of POM products in the control group samples when compared with the intact group data, as well as the sharp decrease in the POM products in the experimental groups treated with LED low-level light, were statistically significant (p ≤ 0.05). Exposure of skeletal muscles to high-intensity red and near-infrared laser light causes oxidative stress that continues for not less than 3 days. The method of measurement of POM product contents by the DNPH assay is a reliable test of an oxidative process rate. Red low-intensity LED radiation can provide rehabilitation of skeletal muscle tissues treated with high-intensity laser light.

  11. Comparison of breast percent density estimation from raw versus processed digital mammograms

    NASA Astrophysics Data System (ADS)

    Li, Diane; Gavenonis, Sara; Conant, Emily; Kontos, Despina

    2011-03-01

    We compared breast percent density (PD%) measures obtained from raw and post-processed digital mammographic (DM) images. Bilateral raw and post-processed medio-lateral oblique (MLO) images from 81 screening studies were retrospectively analyzed. Image acquisition was performed with a GE Healthcare DS full-field DM system. Image post-processing was performed using the PremiumViewTM algorithm (GE Healthcare). Area-based breast PD% was estimated by a radiologist using a semi-automated image thresholding technique (Cumulus, Univ. Toronto). Comparison of breast PD% between raw and post-processed DM images was performed using the Pearson correlation (r), linear regression, and Student's t-test. Intra-reader variability was assessed with a repeat read on the same data-set. Our results show that breast PD% measurements from raw and post-processed DM images have a high correlation (r=0.98, R2=0.95, p<0.001). Paired t-test comparison of breast PD% between the raw and the post-processed images showed a statistically significant difference equal to 1.2% (p = 0.006). Our results suggest that the relatively small magnitude of the absolute difference in PD% between raw and post-processed DM images is unlikely to be clinically significant in breast cancer risk stratification. Therefore, it may be feasible to use post-processed DM images for breast PD% estimation in clinical settings. Since most breast imaging clinics routinely use and store only the post-processed DM images, breast PD% estimation from post-processed data may accelerate the integration of breast density in breast cancer risk assessment models used in clinical practice.
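
    The statistical comparison described above (Pearson correlation, linear regression and a paired t-test between the two PD% readings) can be reproduced on synthetic data as follows; the simulated values and offsets are assumptions for illustration only, not the study's measurements.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical percent-density readings from the same 81 studies,
    # once from raw and once from post-processed images.
    rng = np.random.default_rng(0)
    pd_raw = rng.uniform(5, 60, size=81)
    pd_processed = pd_raw + rng.normal(1.2, 2.0, size=81)   # small systematic offset

    r, p_corr = stats.pearsonr(pd_raw, pd_processed)         # correlation between readings
    slope, intercept, *_ = stats.linregress(pd_raw, pd_processed)
    t_stat, p_paired = stats.ttest_rel(pd_raw, pd_processed) # paired difference test
    print(f"r={r:.3f}, fit: processed = {slope:.2f}*raw + {intercept:.2f}, paired-t p={p_paired:.4f}")
    ```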

  12. Real-time monitoring of the solution concentration variation during the crystallization process of protein-lysozyme by using digital holographic interferometry.

    PubMed

    Zhang, Yanyan; Zhao, Jianlin; Di, Jianglei; Jiang, Hongzhen; Wang, Qian; Wang, Jun; Guo, Yunzhu; Yin, Dachuan

    2012-07-30

    We report a real-time measurement method for the solution concentration variation during the growth of protein-lysozyme crystals based on digital holographic interferometry. A series of holograms containing the information on the solution concentration variation over the whole crystallization process is recorded by a CCD. Based on the principle of double-exposure holographic interferometry and the relationship between the phase difference of the reconstructed object wave and the solution concentration, the solution concentration variation with time for an arbitrary point in the solution can be obtained, and then the two-dimensional concentration distribution of the solution during the crystallization process can also be figured out, under the precondition that the refractive index is constant along the light propagation direction. The experimental results show that it is feasible to monitor the crystal growth process in situ, full-field and in real time by using this method.
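
    Under the stated precondition, the mapping from measured phase difference to concentration change reduces to Δφ = (2π/λ) L (dn/dC) ΔC. A small sketch of that conversion is given below; the wavelength, cell path length and dn/dC value are hypothetical numbers for illustration.

    ```python
    import numpy as np

    def concentration_change(phase_diff, wavelength, path_length, dn_dc):
        """Convert an interferometric phase difference (rad) to a concentration change,
        assuming the refractive index varies linearly with concentration and is
        uniform along the propagation direction."""
        delta_n = phase_diff * wavelength / (2 * np.pi * path_length)
        return delta_n / dn_dc

    # Hypothetical values: 632.8 nm laser, 5 mm cell, protein-like dn/dC of 1.9e-4 mL/mg.
    print(concentration_change(phase_diff=1.5, wavelength=632.8e-9,
                               path_length=5e-3, dn_dc=1.9e-4))   # result in mg/mL
    ```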

  13. A new approach to pre-processing digital image for wavelet-based watermark

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy, in multimedia objects, like software, image, video, audio and text. Therefore it is strategic to individualize and to develop methods and numerical algorithms, which are stable and have low computational cost, that will allow us to find a solution to these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, not blind, and wavelet-based. The use of Discrete Wavelet Transform is motivated by good time-frequency features and a good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover our algorithm can work with any image, thanks to the step of pre-processing of the image that includes resize techniques that adapt to the size of the original image for Wavelet transform. The watermark signal is calculated in correlation with the image features and statistic properties. In the detection step we apply a re-synchronization between the original and watermarked image according to the Neyman-Pearson statistic criterion. Experimentation on a large set of different images has been shown to be resistant against geometric, filtering, and StirMark attacks with a low rate of false alarm.
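
    As a minimal illustration of wavelet-domain embedding (not the authors' correlation- and HVS-driven algorithm, whose watermark strength adapts to image statistics), the sketch below adds a bit sequence to the diagonal detail sub-band of a one-level DWT.

    ```python
    import numpy as np
    import pywt

    def embed_watermark(image, watermark_bits, alpha=2.0):
        """Additive watermark in the diagonal detail sub-band of a one-level Haar DWT."""
        approx, (horiz, vert, diag) = pywt.dwt2(image.astype(float), "haar")
        flat = diag.flatten()
        flat[:watermark_bits.size] += alpha * (2 * watermark_bits - 1)  # map bits to +/- alpha
        diag = flat.reshape(diag.shape)
        return pywt.idwt2((approx, (horiz, vert, diag)), "haar")

    rng = np.random.default_rng(1)
    host = rng.integers(0, 256, size=(64, 64)).astype(float)   # toy grayscale host image
    bits = rng.integers(0, 2, size=128)
    marked = embed_watermark(host, bits)
    print("mean absolute distortion:", float(np.abs(marked - host).mean()))
    ```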

  14. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    PubMed

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques have been around for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2017-02-01

    Traditional aggregate size measuring methods are time-consuming, taxing, and do not deliver online measurements. A new online detection system for determining aggregate size and shape based on a digital camera with a charge-coupled device, and subsequent digital image processing, have been developed to overcome these problems. The system captures images of aggregates while falling and flat lying. Using these data, the particle size and shape distribution can be obtained in real time. Here, we calibrate this method using standard globules. Our experiments show that the maximum particle size distribution error was only 3 wt%, while the maximum particle shape distribution error was only 2 wt% for data derived from falling aggregates, having good dispersion. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt%, and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. However, they performed well for single-graded aggregates, and did not require a dispersion device. Our system is low-cost and easy to install. It can successfully achieve online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.
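
    A simplified stand-in for the size and shape measurements, an equivalent diameter and a bounding-box elongation computed per labelled particle in a thresholded frame, is sketched below. The calibration factor and the toy binary image are assumptions, not the system's actual processing chain.

    ```python
    import numpy as np
    from scipy import ndimage

    def particle_metrics(binary_image, mm_per_pixel):
        """Equivalent diameter (mm) and bounding-box elongation for each connected particle."""
        labels, count = ndimage.label(binary_image)
        metrics = []
        for i in range(1, count + 1):
            rows, cols = np.nonzero(labels == i)
            area = rows.size
            diameter = 2 * np.sqrt(area / np.pi) * mm_per_pixel      # circle-equivalent size
            elongation = (rows.ptp() + 1) / max(cols.ptp() + 1, 1)   # crude shape descriptor
            metrics.append((diameter, elongation))
        return metrics

    # Toy binary frame with two "aggregates".
    frame = np.zeros((100, 100), dtype=bool)
    frame[10:30, 10:25] = True
    frame[60:70, 40:80] = True
    print(particle_metrics(frame, mm_per_pixel=0.1))
    ```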

  16. The effects of digital signal processing features on children's speech recognition and loudness perception.

    PubMed

    Crukley, Jeffery; Scollie, Susan D

    2014-03-01

    The purpose of this study was to determine the effects of hearing instruments set to Desired Sensation Level version 5 (DSL v5) hearing instrument prescription algorithm targets and equipped with directional microphones and digital noise reduction (DNR) on children's sentence recognition in noise performance and loudness perception in a classroom environment. Ten children (ages 8-17 years) with stable, congenital sensorineural hearing losses participated in the study. Participants were fitted bilaterally with behind-the-ear hearing instruments set to DSL v5 prescriptive targets. Sentence recognition in noise was evaluated using the Bamford-Kowal-Bench Speech in Noise Test (Niquette et al., 2003). Loudness perception was evaluated using a modified version of the Contour Test of Loudness Perception (Cox, Alexander, Taylor, & Gray, 1997). Children's sentence recognition in noise performance was significantly better when using directional microphones alone or in combination with DNR than when using omnidirectional microphones alone or in combination with DNR. Children's loudness ratings for sounds above 72 dB SPL were lowest when fitted with the DSL v5 Noise prescription combined with directional microphones. DNR use showed no effect on loudness ratings. Use of the DSL v5 Noise prescription with a directional microphone improved sentence recognition in noise performance and reduced loudness perception ratings for loud sounds relative to a typical clinical reference fitting with the DSL v5 Quiet prescription with no digital signal processing features enabled. Potential clinical strategies are discussed.

  17. Flame colour characterization in the visible and infrared spectrum using a digital camera and image processing

    NASA Astrophysics Data System (ADS)

    Huang, Hua-Wei; Zhang, Yang

    2008-08-01

    An attempt has been made to characterize the colour spectrum of a methane flame under various burning conditions using RGB and HSV colour models instead of resolving the real physical spectrum. The results demonstrate that each type of flame has its own characteristic distribution in both the RGB and HSV space. It has also been observed that the averaged B and G values in the RGB model represent well the CH* and C2* emission of a premixed methane flame. These features may be utilized for flame measurement and monitoring. The great advantage of using a conventional camera for monitoring flame properties based on the colour spectrum is that it is readily available, easy to interface with a computer, cost-effective and has a certain spatial resolution. Furthermore, it has been demonstrated that a conventional digital camera is able to image a flame not only in the visible spectrum but also in the infrared. This feature is useful in avoiding the problem of image saturation typically encountered in capturing very bright sooty flames. As a result, further digital image processing and quantitative information extraction is possible. It has been identified that an infrared image also has its own distribution in both the RGB and HSV colour space in comparison with a flame image in the visible spectrum.

  18. An application of digital image processing techniques to the characterization of liquid petroleum gas (LPG) spray

    NASA Astrophysics Data System (ADS)

    Qi, Y. L.; Xu, B. Y.; Cai, S. L.

    2006-12-01

    To control fuel injection, optimize combustion and reduce emissions for LPG (liquefied petroleum gas) engines, it is necessary and important to understand the characteristics of LPG sprays. The present work investigates the geometry of LPG sprays, including spray tip penetration, spray angle, projected spray area and spray volume, by using schlieren photography and digital image processing techniques. Two types of single nozzle injectors were studied, with the same nozzle diameter, but one with and one without a double-hole flow-split head. A code developed to analyse the results directly from the digitized images is shown to be more accurate and efficient than manual measurement and analysis. Test results show that a higher injection pressure produces a longer spray tip penetration, a larger projected spray area and spray volume, but a smaller spray cone angle. The injector with the double-hole split-head nozzle produces better atomization and shorter tip penetration at medium and late injection times, but longer tip penetration in the early stage.

  19. Implementation of cost-effective diffuse light source mechanism to reduce specular reflection and halo effects for resistor-image processing

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Sheng; Wang, Jeng-Yau

    2015-09-01

    The light source plays a significant role in acquiring a good-quality image of an object, facilitating image processing and pattern recognition. For objects possessing a specular surface, the reflection and halo phenomena appearing in the acquired image increase the difficulty of information processing. Such a situation may be improved with the assistance of a suitable diffuse light source. Consider reading a resistor via computer vision: due to the resistor's specular reflective surface, the image will suffer from severe non-uniform luminous intensity, yielding a higher error rate in recognition without a well-controlled light source. A measurement system, consisting mainly of a digital microscope embedded in a replaceable diffuse cover, a ring-type LED embedded on a small pad carrying the resistor under evaluation, and Arduino microcontrollers connected to a PC, is presented in this paper. Several replaceable, cost-effective diffuse covers made from a paper bowl, cup and box lined inside with white paper are presented for reducing specular reflection and halo effects, and are compared with a commercial diffuse dome. The ring-type LED can be flexibly configured for full or partial lighting depending on the application. For each self-made diffuse cover, a set of resistors with 4 or 5 color bands is captured via the digital microscope for experiments. The signal-to-noise ratio of the segmented resistor image is used for performance evaluation. The detected principal axis of the resistor body is used for the partial LED configuration to further improve the lighting condition. Experimental results confirm that the proposed mechanism can not only evaluate cost-effective diffuse light sources but also be extended into an automatic recognition system for resistor reading.

  20. A method for the processing and analysis of digital terrain elevation data. [Shiprock and Gallup Quadrangles, Arizona and New Mexico

    NASA Technical Reports Server (NTRS)

    Junkin, B. G. (Principal Investigator)

    1979-01-01

    A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
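
    The slope and aspect layers described above can be derived from a gridded elevation array with finite differences. The sketch below uses one common sign convention (0 degrees = north, rows assumed to run north to south; conventions differ between GIS packages) and a toy 4 x 4 grid rather than the Shiprock or Gallup data.

    ```python
    import numpy as np

    def slope_aspect(elevation, cell_size):
        """Slope (degrees) and aspect (degrees clockwise from north) from a gridded DEM."""
        dz_dy, dz_dx = np.gradient(elevation, cell_size)          # row (N-S) and column (W-E) gradients
        slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
        aspect = (np.degrees(np.arctan2(-dz_dx, dz_dy)) + 360) % 360  # downslope direction
        return slope, aspect

    # Toy 4 x 4 elevation grid with 30 m spacing, sloping down toward the south-west.
    dem = np.array([[100, 101, 103, 106],
                    [ 99, 100, 102, 105],
                    [ 98,  99, 101, 104],
                    [ 97,  98, 100, 103]], dtype=float)
    s, a = slope_aspect(dem, cell_size=30.0)
    print(s.round(2))
    print(a.round(1))
    ```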

  1. Publisher Correction: Precisely printable and biocompatible silk fibroin bioink for digital light processing 3D printing.

    PubMed

    Kim, Soon Hee; Yeon, Yeung Kyu; Lee, Jung Min; Chao, Janet Ren; Lee, Young Jin; Seo, Ye Been; Sultan, Md Tipu; Lee, Ok Joo; Lee, Ji Seung; Yoon, Sung-Il; Hong, In-Sun; Khang, Gilson; Lee, Sang Jin; Yoo, James J; Park, Chan Hum

    2018-06-11

    The original version of this Article contained errors in Figs. 5 and 6. In Fig. 5b, the second panel on the bottom row was stretched out of proportion. In Fig. 6d, the first panel was also stretched out of proportion. In Fig. 6f, the fifth panel inadvertently repeated the fourth. This has been corrected in both the PDF and HTML versions of the Article.

  2. Reengineering the picture archiving and communication system (PACS) process for digital imaging networks PACS.

    PubMed

    Horton, M C; Lewis, T E; Kinsey, T V

    1999-05-01

    Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS into military treatment facilities through a new Department of Defense (DoD) contract vehicle called digital imaging networks (DIN)-PACS. The JITPO reengineered this process incorporating multiple organizations and politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership in PACS. The JITPO continues to provide information and services that assist multiple groups and users in rendering PACS planning and implementation decisions. Local site project managers are involved from the outset and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process saved DoD sites millions by having PACS plans developed within the government and proposed to vendors second, and then having vendors respond specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation will describe the chronology of changes, encountered obstacles, and lessons learned within the reengineering of the PACS process for DIN-PACS.

  3. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optimal configuration of optical systems with spatial light modulators

    NASA Astrophysics Data System (ADS)

    Fedorov, Yu V.

    1995-10-01

    A description is given of a novel optical system for optical information processing. An analysis is given of ways of increasing optoenergetic characteristics of optical information processing systems in which use is made of spatial light modulators with phase-relief (in thermoplastic materials) and polarisation (in crystalline structures of the DKDP type) information storage.

  4. Effect of a photoperiodic green light programme during incubation on embryo development and hatch process.

    PubMed

    Tong, Q; McGonnell, I M; Demmers, T G M; Roulston, N; Bergoug, H; Romanini, C E; Verhelst, R; Guinebretière, M; Eterradossi, N; Berckmans, D; Exadaktylos, V

    2018-04-01

    This study was conducted to evaluate the effect of a 12-h light, 12-h dark (12L : 12D) photoperiod of green light during day 1 to day 18 of incubation time, on embryo growth, hormone concentration and the hatch process. In the test group, monochromatic light was provided by a total of 204 green light-emitting diodes (522 nm) mounted in a frame which was placed above the top tray of eggs to give even spread of illumination. No light-dark cycle was used in the control group. Four batches of eggs (n=300/group per batch) from fertile Ross 308 broiler breeders were used in this experiment. The beak length and crown-rump length of embryos incubated under green light were significantly longer than that of control embryos at day 10 and day 12, respectively (P<0.01). Furthermore, green light-exposed embryos had a longer third toe length compared with control embryos at day 10, day 14 and day 17 (P=0.02). At group level (n=4 batches), light stimulation had no effect on chick weight and quality at take-off, the initiation of hatch and hatch window. However, the individual hatching time of the light exposure focal chicks (n=33) was 3.4 h earlier (P=0.49) than the control focal chicks (n=36) probably due to the change in melatonin rhythm of the light group. The results of this study indicate that green light accelerates embryo development and alters hatch-related hormones (thyroid and corticosterone), which may result in earlier hatching.

  5. Frequency mismatch in stimulated scattering processes: An important factor for the transverse distribution of scattered light

    SciTech Connect

    Gong, Tao; Research Center of Laser Fusion, China Academy of Engineering Physics, Mianyang, Sichuan 621900; Zheng, Jian, E-mail: jzheng@ustc.edu.cn

    2016-06-15

    A 2D cylindrically symmetric model with inclusion of both diffraction and self-focus effects is developed to deal with the stimulated scattering processes of a single hotspot. The calculated results show that the transverse distribution of the scattered light is sensitive to the longitudinal profiles of the plasma parameters. The analysis of the evolution of the scattered light indicates that it is the frequency mismatch of coupling due to the inhomogeneity of plasmas that determines the transverse distribution of the scattered light.

  6. A hybrid silicon membrane spatial light modulator for optical information processing

    NASA Technical Reports Server (NTRS)

    Pape, D. R.; Hornbeck, L. J.

    1984-01-01

    A new two dimensional, fast, analog, electrically addressable, silicon based membrane spatial light modulator (SLM) was developed for optical information processing applications. Coherent light reflected from the mirror elements is phase modulated producing an optical Fourier transform of an analog signal input to the device. The DMD architecture and operating parameters related to this application are presented. A model is developed that describes the optical Fourier transform properties of the DMD.
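
    One loose way to picture the optical Fourier-transform property mentioned above is to model the far field as the 2-D FFT of a unit-amplitude, phase-modulated pupil. The Python sketch below is only that textbook approximation; the deflection-to-phase mapping is hypothetical and is not the model developed in the paper.

      import numpy as np

      N = 256
      signal = np.random.rand(N, N)                    # analog input mapped onto the mirror array
      phase = 2.0 * np.pi * signal                     # hypothetical deflection-to-phase mapping
      pupil = np.exp(1j * phase)                       # unit-amplitude, phase-modulated field
      far_field = np.fft.fftshift(np.fft.fft2(pupil))  # Fraunhofer (Fourier-plane) field
      intensity = np.abs(far_field) ** 2               # pattern observed in the Fourier plane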

  7. Organic light-emitting diodes for lighting: High color quality by controlling energy transfer processes in host-guest-systems

    NASA Astrophysics Data System (ADS)

    Weichsel, Caroline; Reineke, Sebastian; Furno, Mauro; Lüssem, Björn; Leo, Karl

    2012-02-01

    Exciton generation and transfer processes in a multilayer organic light-emitting diode (OLED) are studied in order to realize OLEDs with warm white color coordinates and high color-rendering index (CRI). We investigate a host-guest-system containing four phosphorescent emitters and two matrix materials with different transport properties. We show, by time-resolved spectroscopy, that an energy back-transfer from the blue emitter to the matrix materials occurs, which can be used to transport excitons to the other emitter molecules. Furthermore, we investigate the excitonic and electronic transfer processes by designing suitable emission layer stacks. As a result, we obtain an OLED with Commission Internationale de l'Éclairage (CIE) coordinates of (0.444;0.409), a CRI of 82, and a spectrum independent of the applied current. The OLED shows an external quantum efficiency of 10% and a luminous efficacy of 17.4 lm/W at 1000 cd/m2.

  8. Digital processing with single electrons for arbitrary waveform generation of current

    NASA Astrophysics Data System (ADS)

    Okazaki, Yuma; Nakamura, Shuji; Onomitsu, Koji; Kaneko, Nobu-Hisa

    2018-03-01

    We demonstrate arbitrary waveform generation of current using a GaAs-based single-electron pump. In our experiment, a digital processing algorithm known as delta-sigma modulation is incorporated into single-electron pumping to generate a density-modulated single-electron stream, by which we demonstrate the generation of arbitrary waveforms of current including sinusoidal, square, and triangular waves with a peak-to-peak amplitude of approximately 10 pA and an output bandwidth ranging from dc to close to 1 MHz. The developed current generator can be used as the precise and calculable current reference required for measurements of current noise in low-temperature experiments.
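
    The density-modulated electron stream can be illustrated with a first-order delta-sigma loop: in each pump cycle the modulator either emits one electron or none, so that the local pulse density tracks the target waveform. This is a simplified software model, not the authors' hardware implementation; the pump frequency and the test waveform are made-up example values.

      import numpy as np

      def delta_sigma_bitstream(target, full_scale=1.0):
          """First-order delta-sigma modulation of a waveform into a 0/1 pulse-per-cycle stream."""
          acc = 0.0
          out = np.zeros(len(target), dtype=int)
          for n, x in enumerate(np.clip(np.asarray(target) / full_scale, 0.0, 1.0)):
              acc += x
              if acc >= 1.0:        # emit one electron in this pump cycle
                  out[n] = 1
                  acc -= 1.0
          return out

      f_pump = 100e6                                    # hypothetical pump clock (Hz)
      t = np.arange(200_000) / f_pump
      wave = 0.5 * (1.0 + np.sin(2 * np.pi * 1e3 * t))  # sinusoidal target in [0, 1]
      bits = delta_sigma_bitstream(wave)
      # Average current ~ e * f_pump * bits.mean(); low-pass filtering the bitstream recovers the sine.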

  9. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two-dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP1000 series E with an RTE 2 disc operating system and 32K words of memory. A Grinnell 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on an 800 BPI, 9-track tape. Four 512 x 512 images representing four spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.
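
    For readers unfamiliar with recursive (IIR) image filters, the sketch below applies a separable first-order recursive low-pass with one causal pass per axis. It is only a minimal illustration of the filter class discussed, not the original minicomputer implementation; a forward-backward pass would be needed to remove the phase shift of the causal filter.

      import numpy as np

      def recursive_lowpass_2d(img, a=0.7):
          """Separable first-order recursive (IIR) low-pass; a in [0, 1) sets the smoothing strength."""
          out = img.astype(float)
          for axis in (0, 1):
              out = np.moveaxis(out, axis, 0)
              for i in range(1, out.shape[0]):
                  out[i] = (1.0 - a) * out[i] + a * out[i - 1]   # y[n] = (1-a)*x[n] + a*y[n-1]
              out = np.moveaxis(out, 0, axis)
          return out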

  10. Low-Latency Digital Signal Processing for Feedback and Feedforward in Quantum Computing and Communication

    NASA Astrophysics Data System (ADS)

    Salathé, Yves; Kurpiers, Philipp; Karg, Thomas; Lang, Christian; Andersen, Christian Kraglund; Akin, Abdulkadir; Krinner, Sebastian; Eichler, Christopher; Wallraff, Andreas

    2018-03-01

    Quantum computing architectures rely on classical electronics for control and readout. Employing classical electronics in a feedback loop with the quantum system allows us to stabilize states, correct errors, and realize specific feedforward-based quantum computing and communication schemes such as deterministic quantum teleportation. These feedback and feedforward operations are required to be fast compared to the coherence time of the quantum system to minimize the probability of errors. We present a field-programmable-gate-array-based digital signal processing system capable of real-time quadrature demodulation, a determination of the qubit state, and a generation of state-dependent feedback trigger signals. The feedback trigger is generated with a latency of 110 ns with respect to the timing of the analog input signal. We characterize the performance of the system for an active qubit initialization protocol based on the dispersive readout of a superconducting qubit and discuss potential applications in feedback and feedforward algorithms.
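
    The real-time chain described above (quadrature demodulation, qubit-state discrimination, feedback trigger) runs on an FPGA; the Python sketch below performs the same arithmetic offline on a single readout trace. The intermediate frequency, threshold and rotation angle are placeholders, not values from the paper.

      import numpy as np

      def demodulate_and_discriminate(samples, fs, f_if, threshold, axis_angle=0.0):
          """Demodulate a readout trace at f_if, integrate, and threshold the rotated I quadrature."""
          t = np.arange(len(samples)) / fs
          iq = np.mean(samples * np.exp(-2j * np.pi * f_if * t))   # integrated complex amplitude
          projected = np.real(iq * np.exp(-1j * axis_angle))       # rotate onto the discrimination axis
          return int(projected > threshold), iq                    # (state used as feedback trigger, raw IQ)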

  11. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
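
    The second (FFT-based) method can be sketched as follows: factor the cross-spectral density matrix at each frequency, colour a complex Gaussian vector with that factor, and inverse-transform. The sketch assumes one-sided, Hermitian positive-definite spectral matrices, leaves the DC and Nyquist bins empty, and is an illustrative reconstruction rather than the thesis code.

      import numpy as np

      def simulate_correlated_series(csd, df, n_samples, seed=None):
          """FFT-based simulation of zero-mean, correlated, stationary Gaussian time series.

          csd : (n_freq, m, m) one-sided cross-spectral density matrices at f_j = j*df,
                j = 1 .. n_freq, with n_freq <= n_samples//2 - 1.
          Returns an (m, n_samples) array sampled at dt = 1/(n_samples*df).
          """
          rng = np.random.default_rng(seed)
          n_freq, m, _ = csd.shape
          X = np.zeros((m, n_samples // 2 + 1), dtype=complex)
          for j in range(n_freq):
              L = np.linalg.cholesky(csd[j])                     # S(f_j) = L L^H
              z = rng.standard_normal(m) + 1j * rng.standard_normal(m)
              X[:, j + 1] = 0.5 * n_samples * np.sqrt(df) * (L @ z)
          return np.fft.irfft(X, n=n_samples, axis=1)            # DC and Nyquist bins left at zero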

  12. Advanced digital signal processing for short haul optical fiber transmission beyond 100G

    NASA Astrophysics Data System (ADS)

    Kikuchi, Nobuhiko

    2017-01-01

    A significant increase in intra- and inter-data-center traffic is expected from the rapid spread of network applications such as SNS, IoT, mobile and cloud computing, and the need for ultra-high-speed, cost-effective short- to medium-reach optical fiber links beyond 100 Gbit/s keeps growing. Such high-speed links typically use multilevel modulation to lower the signaling speed, which in turn faces serious challenges from a limited loss budget and waveform distortion tolerance. One of the promising techniques to overcome these challenges is advanced digital signal processing (DSP), and we review various DSP techniques for short- to medium-reach applications.

  13. Digital Signal Processing Based on a Clustering Algorithm for Ir/Au TES Microcalorimeter

    NASA Astrophysics Data System (ADS)

    Zen, N.; Kunieda, Y.; Takahashi, H.; Hiramoto, K.; Nakazawa, M.; Fukuda, D.; Ukibe, M.; Ohkubo, M.

    2006-02-01

    In recent years, cryogenic microcalorimeters operating at their superconducting transition edge have been under development for possible application to astronomical X-ray observations. To improve the energy resolution of superconducting transition edge sensors (TES), several correction methods have been developed; among them, a clustering method based on digital signal processing has recently been proposed. In this paper, we applied the clustering method to an Ir/Au bilayer TES, which resulted in almost a 10% improvement in energy resolution. In addition, from the point of view of imaging X-ray spectroscopy, we applied the clustering method to pixellated Ir/Au-TES devices. We thus show how a clustering method that sorts signals by their shapes is also useful for position identification.
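
    As an illustration of sorting pulses by shape (the general idea behind the clustering correction, not the authors' specific algorithm), a basic k-means over peak-normalised waveforms might look like the sketch below; the number of clusters and the normalisation are arbitrary choices, and a per-cluster energy calibration would follow.

      import numpy as np

      def cluster_pulse_shapes(pulses, n_clusters=3, n_iter=50, seed=None):
          """Group (n_pulses, n_samples) detector pulses by peak-normalised shape with plain k-means."""
          rng = np.random.default_rng(seed)
          shapes = pulses / pulses.max(axis=1, keepdims=True)      # remove amplitude, keep shape
          centers = shapes[rng.choice(len(shapes), n_clusters, replace=False)]
          for _ in range(n_iter):
              d = np.linalg.norm(shapes[:, None, :] - centers[None, :, :], axis=2)
              labels = d.argmin(axis=1)                            # nearest cluster centre
              centers = np.array([shapes[labels == k].mean(axis=0) if np.any(labels == k)
                                  else centers[k] for k in range(n_clusters)])
          return labels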

  14. A 'user friendly' geographic information system in a color interactive digital image processing system environment

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Goldberg, M.

    1982-01-01

    NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.

  15. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the orbital flight test (OFT) ascent configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM under varying load and stress conditions. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  16. Man-machine interactive imaging and data processing using high-speed digital mass storage

    NASA Technical Reports Server (NTRS)

    Alsberg, H.; Nathan, R.

    1975-01-01

    The role of vision in teleoperation has been recognized as an important element in the man-machine control loop. In most applications of remote manipulation, direct vision cannot be used. To overcome this handicap, the human operator's control capabilities are augmented by a television system. This medium provides a practical and useful link between the workspace and the control station from which the operator performs his tasks. Human performance deteriorates when the images are degraded as a result of instrumental and transmission limitations. Image enhancement is used to bring out selected qualities in a picture to increase the perception of the observer. A general-purpose digital computer with an extensive special-purpose software system is used to perform an almost unlimited repertoire of processing operations.

  17. [A modified speech enhancement algorithm for electronic cochlear implant and its digital signal processing realization].

    PubMed

    Wang, Yulin; Tian, Xuelong

    2014-08-01

    To improve the speech quality and auditory perception of electronic cochlear implants against a strong noise background, a speech enhancement system for the cochlear implant front-end was constructed. With a digital signal processor (DSP) as the core, the system combines the DSP's multi-channel buffered serial port (McBSP) data transmission channel with the extended audio interface chip TLV320AIC10, so that high-speed speech signal acquisition and output are realized. Because the traditional speech enhancement method suffers from poor adaptability, slow convergence, and large steady-state error, the versiera function and a de-correlation principle were used to improve the existing adaptive filtering algorithm, which effectively enhanced the quality of voice communication. Test results verified the stability of the system and the de-noising performance of the algorithm, and showed that it can provide clearer speech signals for deaf or tinnitus patients.
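
    The versiera-based modification is not spelled out in the abstract, so the sketch below shows only the baseline it improves upon: a normalised LMS (NLMS) adaptive noise canceller in which a reference noise channel is filtered to match the noise in the primary channel. Filter order and step size are example values.

      import numpy as np

      def nlms_noise_canceller(primary, reference, order=32, mu=0.5, eps=1e-6):
          """NLMS adaptive noise canceller; the error signal is the enhanced speech."""
          w = np.zeros(order)
          enhanced = np.zeros(len(primary))
          for n in range(order, len(primary)):
              x = reference[n - order:n][::-1]        # most recent reference samples first
              e = primary[n] - w @ x                  # primary minus estimated noise
              w += mu * e * x / (x @ x + eps)         # normalised LMS update
              enhanced[n] = e
          return enhanced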

  18. [Digital signal processing of a novel neuron discharge model stimulation strategy for cochlear implants].

    PubMed

    Yang, Yiwei; Xu, Yuejin; Miu, Jichang; Zhou, Linghong; Xiao, Zhongju

    2012-10-01

    To apply the classic leaky integrate-and-fire model, based on the mechanism of physiological auditory stimulation, to the information-processing coding of cochlear implants in order to improve auditory outcomes. The results of algorithm simulation in a digital signal processor (DSP) were imported into Matlab for comparative analysis. Compared with CIS coding, the membrane potential integrate-and-fire (MPIF) algorithm allowed more natural, pseudo-random pulse discharge that better fits the physiological structures. The MPIF algorithm can effectively solve the problem of the dynamic structure of the auditory information sequence delivered to the auditory center, and allows integration of the stimulating pulses with time coding to ensure the coherence and relevance of stimulating-pulse timing.
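
    A leaky integrate-and-fire unit of the kind the MPIF strategy builds on can be written in a few lines: the membrane potential integrates a band envelope, and a stimulation pulse is issued at each threshold crossing. The time constant, gain and reset rule below are illustrative assumptions, not the published parameters.

      import numpy as np

      def lif_pulse_train(envelope, fs, tau=2e-3, threshold=1.0, gain=60.0):
          """Leaky integrate-and-fire coding of one band envelope into a pulse train."""
          dt, v = 1.0 / fs, 0.0
          pulses = np.zeros(len(envelope), dtype=int)
          for n, e in enumerate(envelope):
              v += dt * (-v / tau + gain * e)         # leaky integration of the envelope
              if v >= threshold:
                  pulses[n] = 1                       # stimulation pulse for this channel
                  v = 0.0                             # reset after firing
          return pulses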

  19. On-chip temperature-based digital signal processing for customized wireless microcontroller

    NASA Astrophysics Data System (ADS)

    Farhah Razanah Faezal, Siti; Isa, Mohd Nazrin Md; Harun, Azizi; Nizam Mohyar, Shaiful; Bahari Jambek, Asral

    2017-11-01

    Increases in die size and power density in system-on-chip (SoC) designs have introduced thermal issues inside the system. Uneven heating and growing on-chip temperature offsets have become major factors that can limit system performance. This paper presents the design and simulation of temperature-based digital signal processing for a modern system-on-chip design using Verilog HDL. The design provides continuous temperature monitoring and reacts to specified conditions. The system was simulated in Altera Quartus software v14. With this system, the microcontroller can maintain nominal power dissipation and operate within its temperature range thanks to the incorporated interrupt-based mechanism.

  20. A software to digital image processing to be used in the voxel phantom development.

    PubMed

    Vieira, J W; Lima, F R A

    2009-11-15

    Anthropomorphic models used in computational dosimetry, also called phantoms, are based on digital images recorded from scans of real people by Computed Tomography (CT) or Magnetic Resonance Imaging (MRI). Voxel phantom construction requires computational processing for transformations of image formats, compaction of two-dimensional (2-D) images into three-dimensional (3-D) matrices, image sampling and quantization, image enhancement, restoration and segmentation, among others. A researcher in computational dosimetry will rarely find all of these capabilities in a single piece of software, and this difficulty usually slows the pace of the research or forces the use of alternative, sometimes inadequate, tools. The need to integrate the several tasks mentioned above, in order to obtain an image that can be used in a computational exposure model, motivated the development of the Digital Image Processing (DIP) software, mainly to solve particular problems in dissertations and theses developed by members of the Grupo de Pesquisa em Dosimetria Numérica (GDN/CNPq). Because of this particular objective, the software uses Portuguese in its implementation and interfaces. This paper presents the second version of DIP, whose main changes are a more formal organization of menus and menu items and a new menu for digital image segmentation. Currently, DIP contains the menus Fundamentos, Visualizações, Domínio Espacial, Domínio de Frequências, Segmentações and Estudos. Each menu contains items and sub-items with functionalities that usually take an image as input and produce an image or an attribute as output. DIP reads, edits, and writes binary files containing the 3-D matrix corresponding to a stack of axial images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computational image and perform conversions. When the task involves only an output image

  1. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optical information processing with transformation of the spatial coherence of light

    NASA Astrophysics Data System (ADS)

    Bykovskii, Yurii A.; Markilov, A. A.; Rodin, V. G.; Starikov, S. N.

    1995-10-01

    A description is given of systems with spatially incoherent illumination, intended for spectral and correlation analysis, and for the recording of Fourier holograms. These systems make use of transformation of the degree of the spatial coherence of light. The results are given of the processing of images and signals, including those transmitted by a bundle of fibre-optic waveguides both as monochromatic light and as quasimonochromatic radiation from a cathode-ray tube. The feasibility of spatial frequency filtering and of correlation analysis of images with a bipolar impulse response is considered for systems with spatially incoherent illumination where these tasks are performed by double transformation of the spatial coherence of light. A description is given of experimental systems and the results of image processing are reported.

  2. Comparison of tissue equalization, and premium view post-processing methods in full field digital mammography.

    PubMed

    Chen, Baoying; Wang, Wei; Huang, Jin; Zhao, Ming; Cui, Guangbin; Xu, Jing; Guo, Wei; Du, Pang; Li, Pei; Yu, Jun

    2010-10-01

    To retrospectively evaluate the diagnostic abilities of two post-processing methods provided by the GE Senographe DS system, tissue equalization (TE) and premium view (PV), in full-field digital mammography (FFDM). In accordance with the ethical standards of the World Medical Association, this study was approved by the regional ethics committee and signed informed patient consents were obtained. We retrospectively reviewed digital mammograms from 101 women (mean age, 47 years; range, 23-81 years) in both the TE and PV modes. Three radiologists, fully blinded to the post-processing methods and to all patient clinical information and histologic results, read the images using objective image interpretation criteria for diagnostic information end points such as lesion border delineation, definition of disease extent, and visualization of internal and surrounding morphologic features of the lesions. Overall diagnostic impression in terms of lesion conspicuity, detectability and diagnostic confidence was also assessed. Between-group comparisons were performed with the Wilcoxon signed rank test. Readers 1, 2, and 3 reported significantly better overall impressions with PV in 29, 27, and 24 patients, compared with TE in 12, 13, and 11 patients, respectively (p<0.05). Significantly (p<0.05) better impressions of PV were also demonstrated for the diagnostic information end points. Importantly, PV proved more sensitive than TE for detecting malignant lesions in dense breasts, rather than benign lesions or malignancies in non-dense breasts (p<0.01). Compared with TE, PV provides markedly better diagnostic information in FFDM, particularly for patients with malignancy in dense breasts. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  3. Focusing light through biological tissue and tissue-mimicking phantoms up to 9.6 cm in thickness with digital optical phase conjugation

    NASA Astrophysics Data System (ADS)

    Shen, Yuecheng; Liu, Yan; Ma, Cheng; Wang, Lihong V.

    2016-08-01

    Optical phase conjugation (OPC)-based wavefront shaping techniques focus light through or within scattering media, which is critically important for deep-tissue optical imaging, manipulation, and therapy. However, to date, the sample thickness in OPC experiments has been limited to only a few millimeters. Here, by using a laser with a long coherence length and an optimized digital OPC system that can safely deliver more light power, we focused 532-nm light through tissue-mimicking phantoms up to 9.6 cm thick, as well as through ex vivo chicken breast tissue up to 2.5 cm thick. Our results demonstrate that OPC can be achieved even when photons have experienced on average 1000 scattering events. The demonstrated penetration of nearly 10 cm (˜100 transport mean free paths) has never been achieved before by any optical focusing technique, and it shows the promise of OPC for deep-tissue noninvasive optical imaging, manipulation, and therapy.

  4. [Evaluating the maturity of IT-supported clinical imaging and diagnosis using the Digital Imaging Adoption Model: Are your clinical imaging processes ready for the digital era?]

    PubMed

    Studzinski, J

    2017-06-01

    The Digital Imaging Adoption Model (DIAM) has been jointly developed by HIMSS Analytics and the European Society of Radiology (ESR). It helps evaluate the maturity of IT-supported processes in medical imaging, particularly in radiology. This eight-stage maturity model drives your organisational, strategic and tactical alignment towards imaging-IT planning. The key audience for the model comprises hospitals with imaging centers, as well as external imaging centers that collaborate with hospitals. The assessment focuses on different dimensions relevant to digital imaging, such as software infrastructure and usage, workflow security, clinical documentation and decision support, data exchange and analytical capabilities. With its standardised approach, it enables regional, national and international benchmarking. All DIAM participants receive a structured report that can be used as a basis for presenting, e.g. budget planning and investment decisions at management level.

  5. Quantitative Light-induced Fluorescence-Digital as an oral hygiene evaluation tool to assess plaque accumulation and enamel demineralization in orthodontics.

    PubMed

    Miller, Cara C; Burnside, Girvan; Higham, Susan M; Flannigan, Norah L

    2016-11-01

      To assess the use of Quantitative Light-induced Fluorescence-Digital as an oral hygiene evaluation tool during orthodontic treatment.   In this prospective, randomized clinical trial, 33 patients undergoing fixed orthodontic appliance treatment were randomly allocated to receive oral hygiene reinforcement at four consecutive appointments using either white light (WL) or Quantitative Light-induced Fluorescence-Digital (QLF) images, taken with a device, as visual aids. Oral hygiene was recorded assessing the QLF images for demineralization, by fluorescence loss (ΔF), and plaque coverage (ΔR30). A debriefing questionnaire ascertained patient perspectives.   There were no significant differences in demineralization (P  =  .56) or plaque accumulation (P  =  .82) between the WL and QLF groups from T0 to T4. There was no significant reduction in demineralization, ΔF, in the WL, or the QLF group from T0-T4 (P > .05); however, there was a significant reduction in ΔR30 plaque scores (P < .05). All the participants found being shown the images helpful, with 100% of the QLF group reflecting that it would be useful to have oral hygiene reinforcement for the full duration of treatment compared with 81% of the WL group (OR 2.3; P < .05).   Quantitative Light-induced Fluorescence-Digital can be used to detect and monitor demineralization and plaque during orthodontics. Oral hygiene reinforcement at consecutive appointments using WL or QLF images as visual aids is effective in reducing plaque coverage. In terms of clinical benefits, QLF and WL images are of similar effectiveness; however, patients preferred the QLF images.

  6. Lighting

    SciTech Connect

    Audin, L.

    1994-12-31

    EPAct covers a vast territory beyond lighting and, like all legislation, also contains numerous "favors," compromises, and even some sleight-of-hand. Tucked away under Title XIX, for example, is an increase from 20% to 28% tax on gambling winnings, effective January 1, 1993 - apparently as a way to help pay for new spending listed elsewhere in the bill. Overall, it is a landmark piece of legislation, about a decade overdue. It remains to be seen how the Federal Government will enforce upgrading of state (or even their own) energy codes. There is no mention of funding for "energy police" in EPAct. Merely creating such a national standard, however, provides a target for those who sincerely wish to create an energy-efficient future.

  7. Autofluorescence endoscopy with "real-time" digital image processing in differential diagnostics of selected benign and malignant lesions in the oesophagus.

    PubMed

    Sieroń-Stołtny, Karolina; Kwiatek, Sebastian; Latos, Wojciech; Kawczyk-Krupka, Aleksandra; Cieślar, Grzegorz; Stanek, Agata; Ziaja, Damian; Bugaj, Andrzej M; Sieroń, Aleksander

    2012-03-01

    Oesophageal papilloma and Barrett's oesophagus are benign lesions known as risk factors for carcinoma of the oesophagus. Therefore, it is important to diagnose these early changes before neoplastic transformation. Autofluorescence endoscopy is a fast and non-invasive method of tissue imaging based on the natural fluorescence of endogenous fluorophores. The aim of this study was to demonstrate the diagnostic utility of autofluorescence endoscopy with digital image processing in the histological diagnosis of endoscopic findings in the upper digestive tract, primarily in the imaging of oesophageal papilloma. During a retrospective analysis of about 200 endoscopic procedures in the upper digestive tract, 67 cases of benign, precancerous or cancerous changes were found. The white light endoscopy (WLE) image, single-channel (red or green) autofluorescence images, as well as the green and red fluorescence intensities in the two-modal fluorescence image and the red-to-green (R/G) ratio (Numerical Colour Value, NCV) were correlated with histopathologic results. NCV analysis in autofluorescence imaging (AFI) showed an increased R/G ratio in cancerous changes in 96% of cases vs. 85% in WLE. Simultaneous analysis with digital image processing allowed us to diagnose suspicious tissue as cancerous in all cases. Barrett's metaplasia was confirmed in 90% vs. 79% (AFI vs. WLE), and in 98% with digital image processing. In benign lesions, WLE allowed us to exclude malignancy in 85%. With autofluorescence endoscopy the R/G ratio was increased in only 10% of benign changes, causing the picture to be interpreted as suspicious, but when both methods were used together, 97.5% of cases were excluded as malignancies. Mean R/G ratios were estimated to be 2.5 in cancers, 1.25 in Barrett's metaplasia and 0.75 in benign changes, and the differences were statistically significant (p=0.04). Autofluorescence imaging is a sensitive method for diagnosing precancerous and cancerous early stages of disease located in the oesophagus.
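
    The numerical colour value used above is essentially a mean red-to-green intensity ratio over a region of interest; a minimal sketch, assuming an (H, W, 3) RGB array with red in channel 0, is:

      import numpy as np

      def red_green_ratio(rgb_image, roi=None):
          """Mean R/G ratio (numerical colour value) over an optional (row0, row1, col0, col1) region."""
          img = np.asarray(rgb_image, dtype=float)
          if roi is not None:
              img = img[roi[0]:roi[1], roi[2]:roi[3]]
          red, green = img[..., 0], img[..., 1]
          return red.mean() / max(green.mean(), 1e-9)   # study reports ~2.5 cancer, ~1.25 Barrett's, ~0.75 benign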

  8. All-digital multicarrier demodulators for on-board processing satellites in mobile communication systems

    NASA Astrophysics Data System (ADS)

    Yim, Wan Hung

    Economical operation of future satellite systems for mobile communications can only be achieved by using dedicated on-board processing satellites, which would allow both cheap earth terminals and lower space-segment costs. With on-board modems and codecs, the up-link and down-link can be optimized separately. An attractive scheme is to use frequency-division multiple access/single channel per carrier (FDMA/SCPC) on the up-link and time division multiplexing (TDM) on the down-link. This scheme allows mobile terminals to transmit a narrow-band, low-power signal, resulting in smaller dishes and high power amplifiers (HPAs) with lower output power. On the up-link, there are hundreds to thousands of FDM channels to be demodulated on board. The most promising approach is the use of all-digital multicarrier demodulators (MCDs), where analog and digital hardware are efficiently shared among channels, and digital signal processing (DSP) is used at an early stage to take advantage of very large scale integration (VLSI) implementation. An MCD consists of a channellizer for separation of the frequency division multiplexing (FDM) channels, followed by individual demodulators for each channel. The major research areas in MCDs are multirate DSP and optimal estimation for synchronization, which form the basis of the thesis. Complex signal theories are central to the development of structured approaches for the sampling and processing of bandpass signals, which are the foundations of both channellizer and demodulator design. In multirate DSP, polyphase theories replace many ad hoc, tedious and error-prone design procedures. For example, a polyphase-matrix discrete Fourier transform (DFT) channellizer includes all efficient filter bank techniques as special cases. Also, a polyphase-lattice filter is derived, not only for sampling rate conversion, but also capable of sampling-phase variation, which is required for symbol timing adjustment in all-digital
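
    To make the channellizer idea concrete, the sketch below separates FDM channels the direct way: mix each carrier to baseband, low-pass filter with a shared prototype, and decimate. A polyphase DFT filter bank produces the same outputs far more efficiently by sharing the prototype across branches and replacing the per-channel mixers with a single FFT; the carrier list, bandwidth and filter length here are arbitrary example parameters, not values from the thesis.

      import numpy as np
      from scipy.signal import firwin, lfilter

      def fdm_channelizer(x, fs, channel_freqs, channel_bw, decim):
          """Direct-form channelizer: frequency-shift, low-pass filter, and decimate each FDM channel."""
          n = np.arange(len(x))
          h = firwin(numtaps=129, cutoff=channel_bw / 2.0, fs=fs)   # shared low-pass prototype
          channels = []
          for fc in channel_freqs:
              baseband = x * np.exp(-2j * np.pi * fc * n / fs)      # shift the carrier to DC
              channels.append(lfilter(h, 1.0, baseband)[::decim])   # filter and decimate
          return np.array(channels)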

  9. A novel pre-processing technique for improving image quality in digital breast tomosynthesis.

    PubMed

    Kim, Hyeongseok; Lee, Taewon; Hong, Joonpyo; Sabir, Sohail; Lee, Jung-Ryun; Choi, Young Wook; Kim, Hak Hee; Chae, Eun Young; Cho, Seungryong

    2017-02-01

    Nonlinear pre-reconstruction processing of the projection data in computed tomography (CT), where accurate recovery of the CT numbers is important for diagnosis, is usually discouraged, because such processing would violate the physics of image formation in CT. However, one can devise a pre-processing step to enhance the detectability of lesions in digital breast tomosynthesis (DBT), where accurate recovery of the CT numbers is fundamentally impossible due to the incompleteness of the scanned data. Since the purpose of DBT is the detection of lesions such as micro-calcifications and masses in the breast, a technique that produces higher lesion detectability is justified. A histogram modification technique was developed in the projection data domain. The histogram of the raw projection data was first divided into two parts: one for the breast projection data and the other for the background. Background pixel values were set to a single value that represents the boundary between breast and background. After that, both histogram parts were shifted by an appropriate offset and the histogram-modified projection data were log-transformed. A filtered-backprojection (FBP) algorithm was used for DBT image reconstruction. To evaluate the performance of the proposed method, we computed the detectability index for images reconstructed from clinically acquired data. Typical breast border enhancement artifacts were greatly suppressed and the detectability of calcifications and masses was increased by use of the proposed method. Compared to a global threshold-based post-reconstruction processing technique, the proposed method produced images of higher contrast without invoking additional image artifacts. In this work, we report a novel pre-processing technique that improves the detectability of lesions in DBT and has potential advantages over the global threshold-based post-reconstruction processing technique. The proposed method not only increased the lesion detectability
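
    The described pre-processing amounts to three array operations on each raw projection; the sketch below follows that sequence, leaving the breast/background threshold and the offset as inputs because their selection rules belong to the paper and are not reproduced here.

      import numpy as np

      def histogram_modified_log(projection, background_threshold, offset):
          """Clamp background (air) pixels to the breast/background boundary, shift, then log-transform."""
          p = projection.astype(float)
          p[p > background_threshold] = background_threshold   # flatten the background part of the histogram
          p = p + offset                                        # shift both histogram parts
          return -np.log(p / p.max())                           # log transform of the normalised transmission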

  10. Speckle reduction process based on digital filtering and wavelet compounding in optical coherence tomography for dermatology

    NASA Astrophysics Data System (ADS)

    Gómez Valverde, Juan J.; Ortuño, Juan E.; Guerra, Pedro; Hermann, Boris; Zabihian, Behrooz; Rubio-Guivernau, José L.; Santos, Andrés.; Drexler, Wolfgang; Ledesma-Carbayo, Maria J.

    2015-07-01

    Optical Coherence Tomography (OCT) has shown great potential as a complementary imaging tool in the diagnosis of skin diseases. Speckle noise is the most prominent artifact present in OCT images and can limit interpretation and detection capabilities. In this work we propose a new speckle reduction process and compare it with various denoising filters with high edge-preserving potential, using several sets of dermatological OCT B-scans. To validate the performance we used a custom-designed spectral-domain OCT and two different data set groups. The first group consisted of five datasets of a single B-scan captured N times (with N<20); the second consisted of five 3D volumes of 25 B-scans. As quality metrics we used the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and equivalent number of looks (ENL). Our results show that a process based on the combination of a 2D enhanced sigma digital filter and a wavelet compounding method achieves the best results in terms of improvement of the quality metrics. In the first group of individual B-scans we achieved improvements in SNR, CNR and ENL of 16.87 dB, 2.19 and 328, respectively; for the 3D volume datasets the improvements were 15.65 dB, 3.44 and 1148. Our results suggest that the proposed enhancement process may significantly reduce speckle, increasing SNR, CNR and ENL and reducing the number of extra acquisitions of the same frame.
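
    The three quality metrics can be computed from homogeneous regions of interest as shown below; exact definitions (e.g. whether SNR is quoted in dB) vary between papers, so this is one common convention rather than the authors' exact formulas.

      import numpy as np

      def speckle_metrics(image, signal_roi, background_roi):
          """SNR (dB), CNR and ENL from boolean masks selecting a signal ROI and a background ROI."""
          s = image[signal_roi].astype(float)
          b = image[background_roi].astype(float)
          snr = 20.0 * np.log10(s.mean() / b.std())                   # signal-to-noise ratio
          cnr = (s.mean() - b.mean()) / np.sqrt(s.var() + b.var())    # contrast-to-noise ratio
          enl = s.mean() ** 2 / s.var()                               # equivalent number of looks
          return snr, cnr, enl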

  11. Effect of various digital processing algorithms on the measurement accuracy of endodontic file length.

    PubMed

    Kal, Betül Ilhan; Baksi, B Güniz; Dündar, Nesrin; Sen, Bilge Hakan

    2007-02-01

    The aim of this study was to compare the accuracy of endodontic file lengths after application of various image enhancement modalities. Endodontic files of three different ISO sizes were inserted in 20 single-rooted extracted permanent mandibular premolar teeth and standardized images were obtained. Original digital images were then enhanced using five processing algorithms. Six evaluators measured the length of each file on each image. The measurements from each processing algorithm and each file size were compared using repeated measures ANOVA and Bonferroni tests (P = 0.05). Paired t test was performed to compare the measurements with the true lengths of the files (P = 0.05). All of the processing algorithms provided significantly shorter measurements than the true length of each file size (P < 0.05). The threshold enhancement modality produced significantly higher mean error values (P < 0.05), while there was no significant difference among the other enhancement modalities (P > 0.05). Decrease in mean error value was observed with increasing file size (P < 0.05). Invert, contrast/brightness and edge enhancement algorithms may be recommended for accurate file length measurements when utilizing storage phosphor plates.

  12. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities during dynamic material behaviour studies. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods for analyzing interferometric signals were studied. For Michelson interferometers, time-frequency analysis by Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three digital signal processing methods were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine.
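
    A bare-bones version of the STFT route is sketched below: take the spectrogram of the interferometer output, pick the dominant beat frequency in each time slice, and convert it to velocity with v = lambda*f/2. The window length, overlap and homodyne conversion factor are assumptions for the example; the CWT and triature methods are not shown.

      import numpy as np
      from scipy.signal import stft

      def velocity_from_interferogram(signal, fs, wavelength):
          """Velocity history from a displacement-interferometer signal via spectrogram peak tracking."""
          f, t, Z = stft(signal, fs=fs, nperseg=1024, noverlap=768)
          beat = f[np.abs(Z).argmax(axis=0)]          # dominant beat frequency in each time slice
          return t, wavelength * beat / 2.0           # v = lambda * f_beat / 2 (homodyne convention)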

  13. Integration of digital signal processing technologies with pulsed electron paramagnetic resonance imaging

    PubMed Central

    Pursley, Randall H.; Salem, Ghadi; Devasahayam, Nallathamby; Subramanian, Sankaran; Koscielniak, Janusz; Krishna, Murali C.; Pohida, Thomas J.

    2006-01-01

    The integration of modern data acquisition and digital signal processing (DSP) technologies with Fourier transform electron paramagnetic resonance (FT-EPR) imaging at radiofrequencies (RF) is described. The FT-EPR system operates at a Larmor frequency (Lf) of 300 MHz to facilitate in vivo studies. This relatively low frequency Lf, in conjunction with our ~10 MHz signal bandwidth, enables the use of direct free induction decay time-locked subsampling (TLSS). This particular technique provides advantages by eliminating the traditional analog intermediate frequency downconversion stage along with the corresponding noise sources. TLSS also results in manageable sample rates that facilitate the design of DSP-based data acquisition and image processing platforms. More specifically, we utilize a high-speed field programmable gate array (FPGA) and a DSP processor to perform advanced real-time signal and image processing. The migration to a DSP-based configuration offers the benefits of improved EPR system performance, as well as increased adaptability to various EPR system configurations (i.e., software configurable systems instead of hardware reconfigurations). The required modifications to the FT-EPR system design are described, with focus on the addition of DSP technologies including the application-specific hardware, software, and firmware developed for the FPGA and DSP processor. The first results of using real-time DSP technologies in conjunction with direct detection bandpass sampling to implement EPR imaging at RF frequencies are presented. PMID:16243552

  14. Digital optical processing of optical communications: towards an Optical Turing Machine

    NASA Astrophysics Data System (ADS)

    Touch, Joe; Cao, Yinwen; Ziyadi, Morteza; Almaiman, Ahmed; Mohajerin-Ariaei, Amirhossein; Willner, Alan E.

    2017-01-01

    Optical computing is needed to support Tb/s in-network processing in a way that unifies communication and computation using a single data representation that supports in-transit network packet processing, security, and big data filtering. Support for optical computation of this sort requires leveraging the native properties of optical wave mixing to enable computation and switching for programmability. As a consequence, data must be encoded digitally as phase (M-PSK), semantics-preserving regeneration is the key to high-order computation, and data processing at Tb/s rates requires mixing. Experiments have demonstrated viable approaches to phase squeezing and power restoration. This work led our team to develop the first serial, optical Internet hop-count decrement, and to design and simulate optical circuits for calculating the Internet checksum and multiplexing Internet packets. The current exploration focuses on limited-lookback computational models to reduce the need for permanent storage and hybrid nanophotonic circuits that combine phase-aligned comb sources, non-linear mixing, and switching on the same substrate to avoid the macroscopic effects that hamper benchtop prototypes.

  15. Fully digital data processing during cardiovascular implantable electronic device follow-up in a high-volume tertiary center.

    PubMed

    Staudacher, Ingo; Nalpathamkalam, Asha Roy; Uhlmann, Lorenz; Illg, Claudius; Seehausen, Sebastian; Akhavanpoor, Mohammadreza; Buchauer, Anke; Geis, Nicolas; Lugenbiel, Patrick; Schweizer, Patrick A; Xynogalos, Panagiotis; Zylla, Maura M; Scholz, Eberhard; Zitron, Edgar; Katus, Hugo A; Thomas, Dierk

    2017-10-11

    Increasing numbers of patients with cardiovascular implantable electronic devices (CIEDs) and limited follow-up capacities highlight unmet challenges in clinical electrophysiology. Integrated software (MediConnect®) enabling fully digital processing of device interrogation data has been commercially developed to facilitate follow-up visits. We sought to assess the feasibility of fully digital data processing (FDDP) during ambulatory device follow-up in a high-volume tertiary hospital to provide guidance for future users of FDDP software. A total of 391 patients (mean age, 70 years) presenting to the outpatient department for routine device follow-up were analyzed (pacemaker, 44%; implantable cardioverter defibrillator, 39%; cardiac resynchronization therapy device, 16%). Quality of data transfer and follow-up duration were compared between digital (n = 265) and manual processing of device data (n = 126). Digital data import was successful, complete and correct in 82% of cases when early software versions were used. When using the most recent software version the rate of successful digital data import increased to 100%. Software-based import of interrogation data was complete and without failure in 97% of cases. The mean duration of a follow-up visit did not differ between the two groups (digital 18.7 min vs. manual data transfer 18.2 min). FDDP software was successfully implemented into the ambulatory follow-up of patients with implanted pacemakers and defibrillators. Digital data import into electronic patient management software was feasible and supported the physician's workflow. The total duration of follow-up visits comprising technical device interrogation and clinical actions was not affected in the present tertiary center outpatient cohort.

  16. Use of edible coatings to preserve quality of lightly (and slightly) processed products.

    PubMed

    Baldwin, E A; Nisperos-Carriedo, M O; Baker, R A

    1995-11-01

    Lightly processed agricultural products present a special problem to the food industry and to scientists involved in postharvest and food technology research. Light or minimal processing includes cutting, slicing, coring, peeling, trimming, or sectioning of agricultural produce. These products have an active metabolism that can result in deteriorative changes, such as increased respiration and ethylene production. If not controlled, these changes can lead to rapid senescence and general deterioration of the product. In addition, the surface water activity of cut fruits and vegetables is generally quite high, inviting microbial attack, which further reduces product stability. Methods for control of these changes are numerous and can include the use of edible coatings. Also mentioned in this review are coating of nut products, and dried, dehydrated, and freeze-dried fruits. Technically, these are not considered to be minimally processed, but many of the problems and benefits of coating these products are similar to coating lightly processed products. Generally, the potential benefits of edible coatings for processed or lightly processed produce is to stabilize the product and thereby extend product shelf life. More specifically, coatings have the potential to reduce moisture loss, restrict oxygen entrance, lower respiration, retard ethylene production, seal in flavor volatiles, and carry additives that retard discoloration and microbial growth.

  17. Using Digital Time-Lapse Videos to Teach Geomorphic Processes to Undergraduates

    NASA Astrophysics Data System (ADS)

    Clark, D. H.; Linneman, S. R.; Fuller, J.

    2004-12-01

    We demonstrate the use of relatively low-cost, computer-based digital imagery to create time-lapse videos of two distinct geomorphic processes in order to help students grasp the significance of the rates, styles, and temporal dependence of geologic phenomena. Student interviews indicate that such videos help them to understand the relationship between processes and landform development. Time-lapse videos have been used extensively in some sciences (e.g., biology - http://sbcf.iu.edu/goodpract/hangarter.html, meteorology - http://www.apple.com/education/hed/aua0101s/meteor/, chemistry - http://www.chem.yorku.ca/profs/hempsted/chemed/home.html) to demonstrate gradual processes that are difficult for many students to visualize. Most geologic processes are slower still, and are consequently even more difficult for students to grasp, yet time-lapse videos are rarely used in earth science classrooms. The advent of inexpensive web-cams and computers provides a new means to explore the temporal dimension of earth surface processes. To test the use of time-lapse videos in geoscience education, we are developing time-lapse movies that record the evolution of two landforms: a stream-table delta and a large, natural, active landslide. The former involves well-known processes in a controlled, repeatable laboratory experiment, whereas the latter tracks the developing dynamics of an otherwise poorly understood slope failure. The stream-table delta is small and grows in ca. 2 days; we capture a frame on an overhead web-cam every 3 minutes. Before seeing the video, students are asked to hypothesize how the delta will grow through time. The final time-lapse video, ca. 20-80 MB, elegantly shows channel migration, progradation rates, and formation of major geomorphic elements (topset, foreset, bottomset beds). The web-cam can also be "zoomed-in" to show smaller-scale processes, such as bedload transfer, and foreset slumping. Post-lab tests and interviews with students indicate that

  18. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well recognized and studied drawbacks of counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and a slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting-loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events for both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, are presented. The throughput formula for a series combination of two types of dead-time is also derived. The results of dead-time corrections, performed through different methods, are reported and discussed, pointing out the error in ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were obtained by using the time widths and the TIDs (with a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction that are traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270
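
    For reference, the two classical dead-time models that such corrections build on fit in a few lines of Python; this is textbook material, not the specific correction procedures evaluated in the paper.

      import numpy as np
      from scipy.optimize import brentq

      def true_rate_nonparalyzable(measured_rate, tau):
          """Non-paralyzable model: m = n / (1 + n*tau), hence n = m / (1 - m*tau)."""
          return measured_rate / (1.0 - measured_rate * tau)

      def true_rate_paralyzable(measured_rate, tau):
          """Paralyzable model: m = n * exp(-n*tau), solved numerically on the low-rate branch."""
          f = lambda n: n * np.exp(-n * tau) - measured_rate
          return brentq(f, measured_rate, 1.0 / tau)   # valid while m < 1/(e*tau); root lies in [m, 1/tau]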

  19. Estimation of reactive surface area using a combined method of laboratory analyses and digital image processing

    NASA Astrophysics Data System (ADS)

    Ma, Jin; Kong, Xiang-Zhao; Saar, Martin O.

    2017-04-01

    Fluid-rock interactions play an important role in engineering processes such as chemical stimulation of enhanced geothermal systems and carbon capture, utilization, and storage. However, these interactions depend strongly on the accessible reactive surface area of the minerals, which is generally poorly constrained for natural geologic samples. In particular, quantifying the surface area of each reacting mineral within whole rock samples is challenging due to the heterogeneous distribution of minerals and pore space. In this study, detailed laboratory analyses were performed on sandstone samples from deep geothermal sites in Lithuania. We measure the specific surface area of whole rock samples using a gas adsorption (B.E.T.) method with N2 at a temperature of 77.3 K. We also quantify their porosity and pore size distribution with a helium gas pycnometer and Hg porosimetry, respectively. Rock compositions are determined by a combination of X-ray fluorescence (XRF) and quantitative scanning electron microscopy (SEM) energy-dispersive X-ray spectroscopy (EDS), which are later geometrically mapped onto two-dimensional SEM backscattered-electron (BSE) images with a resolution of 1.2 μm and three-dimensional micro-CT images with a resolution of 10.3 μm to produce a digital mineral map for further constraining the accessibility of reactive minerals. Moreover, we attempt to link the whole-rock porosity, pore size distribution, and B.E.T. specific surface area with the digital mineral maps. We anticipate that these analyses will provide an in-depth understanding of the fluid chemistry from subsequent hydrothermal reactive flow-through experiments on whole rock samples at elevated pressure and temperature.

  20. Wavelet processing and digital interferometric contrast to improve reconstructions from X-ray Gabor holograms.

    PubMed

    Aguilar, Juan C; Misawa, Masaki; Matsuda, Kiyofumi; Suzuki, Yoshio; Takeuchi, Akihisa; Yasumoto, Masato

    2018-05-01

    In this work, the application of an undecimated wavelet transformation together with digital interferometric contrast to improve the resulting reconstructions in a digital hard X-ray Gabor holographic microscope is shown. Specifically, the starlet transform is used together with digital Zernike contrast. With this contrast, the results show that only a small set of scales from the hologram are, in effect, useful, and it is possible to enhance the details of the reconstruction.
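
    The starlet (undecimated, à trous B3-spline) transform has a compact standard implementation; the sketch below is a generic 2-D version, not the authors' code, and omits the digital Zernike-contrast step.

      import numpy as np
      from scipy.ndimage import convolve1d

      def starlet_transform(image, n_scales=4):
          """Undecimated starlet transform: returns [w_1, ..., w_n, c_n] with image = sum(w_j) + c_n."""
          h = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # B3-spline kernel
          c = image.astype(float)
          planes = []
          for j in range(n_scales):
              hj = np.zeros((len(h) - 1) * 2 ** j + 1)
              hj[:: 2 ** j] = h                             # dilate the kernel (insert 2^j - 1 zeros)
              smooth = convolve1d(convolve1d(c, hj, axis=0, mode='reflect'), hj, axis=1, mode='reflect')
              planes.append(c - smooth)                     # wavelet plane at scale j
              c = smooth
          planes.append(c)                                  # coarsest approximation
          return planes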