Science.gov

Sample records for digital processing methodology

  1. Digital Methodology to implement the ECOUTER engagement process

    PubMed Central

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2017-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research – French for ‘to listen’) is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  2. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research - French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  3. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components, be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'Real' part geometry and consequent assembly.

  4. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…
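
    The truncated record above lists oversampling and noise shaping among the concepts students must grasp. As a rough, self-contained illustration of those two ideas only (not the teaching methodology described in the paper), the following Python sketch simulates a textbook first-order sigma-delta modulator and recovers the input with a simple decimation filter; all rates and amplitudes are arbitrary.

```python
import numpy as np

def first_order_sigma_delta(x):
    """Textbook first-order sigma-delta modulator.

    The integrator accumulates the difference between the input sample
    and the previous 1-bit output; the comparator quantizes to +/-1.
    Quantization noise is pushed toward high frequencies (noise shaping).
    """
    v = 0.0          # integrator state
    y_prev = 0.0     # previous 1-bit output
    bits = np.empty_like(x)
    for n, sample in enumerate(x):
        v += sample - y_prev
        y_prev = 1.0 if v >= 0 else -1.0
        bits[n] = y_prev
    return bits

# Oversample a slow sine wave, modulate, then recover the signal with a
# crude moving-average (decimation) filter.
osr = 64                                  # oversampling ratio (illustrative)
t = np.arange(0, 1, 1 / (osr * 100))      # 1 s of samples at 6400 Hz
x = 0.5 * np.sin(2 * np.pi * 5 * t)       # 5 Hz tone, amplitude 0.5
bits = first_order_sigma_delta(x)
recovered = np.convolve(bits, np.ones(osr) / osr, mode="same")
print("worst-case reconstruction error:", np.max(np.abs(recovered - x)))
```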

  5. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…

  6. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists.

  7. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. Field techniques and methods for data acquisition and mapping in the field have dramatically advanced and simplified how we collect and analyze data while in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example for stereoplot analysis). Requiring that all maps
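
    Both Apps export their records as .csv files for downstream analysis (stereonets, GIS, spreadsheets), and that hand-off step is easy to sketch. The snippet below parses a hypothetical export with Strike/Dip columns and converts each plane to its pole for stereonet plotting; the column names and example values are assumptions, not the actual GeoFieldBook schema.

```python
import csv
import io

# Hypothetical export: the real GeoFieldBook column names may differ.
sample_export = """Latitude,Longitude,Formation,Strike,Dip
40.69,-75.21,Martinsburg,045,30
40.70,-75.20,Martinsburg,050,28
"""

def pole_to_plane(strike_deg, dip_deg):
    """Return (trend, plunge) of the pole to a plane given right-hand-rule
    strike and dip in degrees (a standard structural-geology conversion)."""
    trend = (strike_deg + 270.0) % 360.0   # opposite the dip direction
    plunge = 90.0 - dip_deg
    return trend, plunge

for row in csv.DictReader(io.StringIO(sample_export)):
    trend, plunge = pole_to_plane(float(row["Strike"]), float(row["Dip"]))
    print(f"{row['Formation']}: pole trend {trend:.0f}, plunge {plunge:.0f}")
```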

  8. Digital image processing

    NASA Technical Reports Server (NTRS)

    Bernstein, R.; Ferneyhough, D. G., Jr.

    1975-01-01

    The Federal Systems Division of IBM has developed an image processing facility to experimentally process, view, and record digital image data. This facility has been used to support LANDSAT digital image processing investigations and advanced image processing research and development. A brief description of the facility is presented, some techniques that have been developed to correct the image data are discussed, and some results obtained by users of the facility are described.

  9. Digital processing clock

    NASA Technical Reports Server (NTRS)

    Phillips, D. H.

    1982-01-01

    The digital processing clock SG 1157/U is described. It is compatible with the PTTI world where it can be driven by an external cesium source. Built-in test equipment shows synchronization with cesium through 1 pulse per second. It is built to be expandable to accommodate future time-keeping needs of the Navy as well as any other time-ordered functions. Examples of this expandability are the inclusion of an unmodulated XR3 time code and the 2137 modulated time code (XR3 with 1 kHz carrier).

  10. Aquarius Digital Processing Unit

    NASA Technical Reports Server (NTRS)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.

  11. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  12. Multi-digit number processing beyond the two-digit number range: a combination of sequential and parallel processes.

    PubMed

    Meyerhoff, Hauke S; Moeller, Korbinian; Debus, Kolja; Nuerk, Hans-Christoph

    2012-05-01

    Investigations of multi-digit number processing typically focus on two-digit numbers. Here, we aim to investigate the generality of results from two-digit numbers for four- and six-digit numbers. Previous studies on two-digit numbers mostly suggested a parallel processing of tens and units. In contrast, the few studies examining the processing of larger numbers suggest sequential processing of the individual constituting digits. In this study, we combined the methodological approaches of studies implying either parallel or sequential processing. Participants completed a number magnitude comparison task on two-, four-, and six-digit numbers including unit-decade compatible and incompatible differing digit pairs (e.g., 32_47, 3<4 and 2<7 vs. 37_52, 3<5 but 7>2, respectively) at all possible digit positions. Response latencies and fixation behavior indicated that sequential and parallel decomposition is not exclusive in multi-digit number processing. Instead, our results clearly suggested that sequential and parallel processing strategies seem to be combined when processing multi-digit numbers beyond the two-digit number range. To account for the results, we propose a chunking hypothesis claiming that multi-digit numbers are separated into chunks of shorter digit strings. While the different chunks are processed sequentially, digits within these chunks are processed in parallel. Copyright © 2012 Elsevier B.V. All rights reserved.
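
    To make the compatibility manipulation concrete, the following minimal sketch classifies a two-digit comparison pair as unit-decade compatible or incompatible in the sense defined above (decade and unit comparisons pointing in the same or in opposite directions). It only illustrates the stimulus definition and is not the authors' experimental code.

```python
def compatibility(a, b):
    """Classify a two-digit pair: 'compatible' if the decade and unit
    comparisons point the same way (e.g., 32 vs 47: 3<4 and 2<7),
    'incompatible' if they conflict (e.g., 37 vs 52: 3<5 but 7>2)."""
    dec_a, unit_a = divmod(a, 10)
    dec_b, unit_b = divmod(b, 10)
    if dec_a == dec_b or unit_a == unit_b:
        return "neutral"          # no conflicting comparison possible
    same_direction = (dec_a < dec_b) == (unit_a < unit_b)
    return "compatible" if same_direction else "incompatible"

print(compatibility(32, 47))  # compatible
print(compatibility(37, 52))  # incompatible
```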

  13. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Summary Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  14. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector, into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel encoded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
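
    The uplink compressor detects frame-to-frame motion in the Hadamard domain. The record gives no implementation detail, so the sketch below only illustrates the transform step: a 2-D Walsh-Hadamard transform of an 8x8 block and a crude energy-based change test between two frames. The block size and threshold are illustrative assumptions, not values from the system described.

```python
import numpy as np
from scipy.linalg import hadamard

def wht2(block):
    """Orthonormal 2-D Walsh-Hadamard transform of a square block
    whose side length is a power of two."""
    n = block.shape[0]
    h = hadamard(n) / np.sqrt(n)
    return h @ block @ h.T

rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, (8, 8)).astype(float)
frame_b = frame_a.copy()
frame_b[2:5, 2:5] += 40            # simulate a small moving object

diff_energy = np.sum((wht2(frame_b) - wht2(frame_a)) ** 2)
threshold = 1e3                     # illustrative, not from the paper
print("block changed" if diff_energy > threshold else "block static")
```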

  15. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to create both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We will present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  16. Methodologies for digital 3D acquisition and representation of mosaics

    NASA Astrophysics Data System (ADS)

    Manferdini, Anna Maria; Cipriani, Luca; Kniffitz, Linda

    2011-07-01

    Despite recent improvements in and the widespread adoption of digital technologies and their applications in the field of Cultural Heritage, museums and institutions are still not encouraged to adopt digital procedures as a standard practice for collecting data on the heritage they are called to preserve and promote. One of the main reasons for this is the high cost of these procedures, which increases further because of the difficulties of digitally surveying artifacts and artworks that present evident intrinsic complexities and peculiarities that cannot be reduced to recurring patterns. The aim of this paper is to show the results of research conducted in order to find the most suitable digital methodology and procedure for collecting geometric and radiometric data on mosaics, one that supports both the preservation of the consistency of information about their geometry and the management of huge amounts of data. One of the most immediate applications of digital 3D survey of mosaics is the substitution of the plaster casts that are usually built to add the third dimension to pictorial or photographic surveys before restoration interventions, in order to document their conservation conditions and ease reconstruction procedures. Moreover, digital 3D surveys of mosaics allow restoration interventions to be reproduced in a digital environment where reliable preliminary evaluations can be performed; in addition, 3D reality-based models of mosaics can be used within digital catalogues or for digital exhibitions and reconstruction aims.

  17. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Oppenheim, A. V.; Baggeroer, A. B.; Lim, J. S.; Musicus, B. R.; Mook, D. R.; Duckworth, G. L.; Bordley, T. E.; Curtis, S. R.; Deadrick, D. S.; Dove, W. P.

    1984-01-01

    Signal and image processing research projects are described. Topics include: (1) modeling underwater acoustic propagation; (2) image restoration; (3) signal reconstruction; (4) speech enhancement; (5) pitch detection; (6) spectral analysis; (7) speech synthesis; (8) speech enhancement; (9) autoregressive spectral estimation; (10) knowledge based array processing; (11) speech analysis; (12) estimating the degree of coronary stenosis with image processing; (13) automatic target detection; and (14) video conferencing.

  18. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  19. Digital Video as Research Practice: Methodology for the Millennium

    ERIC Educational Resources Information Center

    Shrum, Wesley; Duque, Ricardo; Brown, Timothy

    2005-01-01

    This essay has its origin in a project on the globalization of science that rediscovered the wisdom of past research practices through the technology of the future. The main argument of this essay is that a convergence of digital video technologies with practices of social surveillance portends a methodological shift towards a new variety of…

  20. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  1. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
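
    The ADSP casts both range and azimuth compression as FFT convolutions (matched filtering). As a scaled-down illustration of that idea only, the sketch below compresses a linear-FM chirp by frequency-domain correlation with its reference replica; the sample rate, chirp parameters, and noise level are arbitrary, and the code is a generic range-compression demo rather than the ADSP algorithm.

```python
import numpy as np

fs = 1e6                      # complex sample rate (Hz), illustrative
t = np.arange(0, 1e-3, 1 / fs)
k = 5e8                       # chirp rate (Hz/s), illustrative
chirp = np.exp(1j * np.pi * k * t**2)

# Echo: the chirp delayed by 200 samples in a longer, noisy record.
echo = np.zeros(4096, dtype=complex)
echo[200:200 + chirp.size] = chirp
echo += 0.1 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))

# FFT convolver: multiply by the conjugate reference spectrum, inverse FFT.
n = echo.size
compressed = np.fft.ifft(np.fft.fft(echo, n) * np.conj(np.fft.fft(chirp, n)))
print("compressed peak at sample", int(np.argmax(np.abs(compressed))))  # ~200
```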

  2. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  3. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  4. Digital image processing in cephalometric analysis.

    PubMed

    Jäger, A; Döler, W; Schormann, T

    1989-01-01

    Digital image processing methods were applied to improve the practicability of cephalometric analysis. The individual X-ray film was digitized by the aid of a high resolution microscope-photometer. Digital processing was done using a VAX 8600 computer system. An improvement of the image quality was achieved by means of various digital enhancement and filtering techniques.

  5. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.
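
    A process map of this kind is simply a region in the RPM-IPM plane, and checking a candidate weld schedule against it is straightforward to sketch. The polygon coordinates below are made-up placeholders, not actual friction stir welding limits; only the mechanics of the envelope test are illustrated.

```python
from matplotlib.path import Path

# Hypothetical process envelope in (RPM, IPM); real limits depend on the
# joint configuration and tool setup and are not given in the record.
envelope = Path([(300, 2), (600, 2), (800, 6), (500, 10), (300, 6)])

def acceptable(rpm, ipm):
    """True if the candidate weld schedule lies inside the envelope."""
    return envelope.contains_point((rpm, ipm))

print(acceptable(500, 5))   # inside the envelope -> True
print(acceptable(900, 12))  # outside the envelope -> False
```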

  6. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  7. Effectiveness of Digital Pulse Processing Using a Slow Waveform Digitizer

    NASA Astrophysics Data System (ADS)

    Anthony, Adam; Ahmed, Mohammad; Sikora, Mark

    2016-09-01

    Using a waveform digitizer, one can replace nearly all of the analog electronics typically involved in processing pulses from a detector by directly digitizing the signal and processing it using digital algorithms. Algorithms for timing filter amplification, constant fraction discrimination, trapezoidal pulse shaping, peak sensing with pileup rejection, and charge integration were developed and implemented. The algorithms and a digitizer with a sampling rate of 62.5 MS/sec were used to calculate the energy and timing resolution of various scintillation and solid-state detectors. These resolutions are compared against both a traditional charge-to-digital converter (QDC) and the analog-to-digital converter (ADC) data acquisition setup in use at the High Intensity Gamma Source at Duke University. Preliminary results are presented.
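
    Of the algorithms listed, trapezoidal pulse shaping is the most self-contained to demonstrate. The sketch below applies a standard recursive trapezoidal (Jordanov-style) shaper to a synthetic exponentially decaying pulse; the rise length, flat-top length, and decay constant are illustrative values, not the settings used with the 62.5 MS/s digitizer described above.

```python
import numpy as np

def trapezoidal_filter(v, k, l, tau):
    """Recursive trapezoidal shaper (Jordanov-style) for an exponentially
    decaying detector pulse.  k = rise length, l = rise + flat-top length,
    tau = pulse decay constant, all in samples."""
    v = np.asarray(v, dtype=float)
    m = 1.0 / np.expm1(1.0 / tau)          # pole-zero correction factor
    d = np.zeros_like(v)
    for n in range(len(v)):
        d[n] = (v[n]
                - (v[n - k] if n >= k else 0.0)
                - (v[n - l] if n >= l else 0.0)
                + (v[n - k - l] if n >= k + l else 0.0))
    s = np.cumsum(np.cumsum(d) + m * d)
    return s / (k * (m + 1.0))             # flat top ~ original pulse amplitude

# Synthetic pulse: amplitude 1.0 step at sample 100, decay tau = 20 samples.
tau, n = 20.0, 1000
t = np.arange(n)
pulse = np.where(t >= 100, np.exp(-(t - 100) / tau), 0.0)
shaped = trapezoidal_filter(pulse, k=100, l=160, tau=tau)
print("flat-top amplitude ~", round(shaped.max(), 3))   # ~ 1.0
```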

  8. Metric Aspects of Digital Images and Digital Image Processing.

    DTIC Science & Technology

    1984-09-01

    image files were synthesized aerial images, produced using the program SIM. This program makes use of a digital terrain model containing gray shade...the Arizona test data. This test data was derived from a digitized stereo model formed by two nearly vertical images taken in October 1966 near...digital image processing operations will be investigated in a manner similar to compression. 7) It is hoped that the ability to quantitatively assess

  9. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  10. Methodology for digital radiography simulation using the Monte Carlo code MCNPX for industrial applications.

    PubMed

    Souza, E M; Correa, S C A; Silva, A X; Lopes, R T; Oliveira, D F

    2008-05-01

    This work presents a methodology for digital radiography simulation for industrial applications using the MCNPX radiography tally. In order to perform the simulation, the energy-dependent response of a BaFBr imaging plate detector was modeled and introduced in the MCNPX radiography tally input. In addition, a post-processing program was used to convert the MCNPX radiography tally output into 16-bit digital images. Simulated and experimental images of a steel pipe containing corrosion alveoli and stress corrosion cracking were compared, and the results showed good agreement between both images.
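
    The post-processing step, converting a continuous tally map into a 16-bit digital image, is simple to illustrate generically. The sketch below linearly rescales a floating-point array into the full uint16 range; the MCNPX tally file format and detector response model are not reproduced here, and the output path and toy data are assumptions.

```python
import numpy as np

def to_uint16_image(tally, out_path="radiograph.npy"):
    """Linearly rescale a floating-point radiography tally map into the
    full 16-bit range (0..65535), mimicking the conversion step described
    in the abstract.  Detector response modeling is not included."""
    tally = np.asarray(tally, dtype=float)
    lo, hi = tally.min(), tally.max()
    scaled = np.zeros_like(tally) if hi == lo else (tally - lo) / (hi - lo)
    image = np.round(scaled * 65535).astype(np.uint16)
    np.save(out_path, image)              # stand-in for a real image writer
    return image

# Toy "tally": a bright field with an absorbing pipe-wall defect.
tally = np.ones((64, 64))
tally[20:30, 20:40] *= 0.6                # simulated corrosion alveolus
img = to_uint16_image(tally)
print(img.dtype, img.min(), img.max())    # uint16 0 65535
```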

  11. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
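
    The three operations named in this record map directly onto a few lines of array arithmetic. The sketch below implements bare-bones versions of level slicing, gray-level mapping, and band ratioing on a toy two-band image; it shows what the operations are, not the LIGMALS package itself.

```python
import numpy as np

rng = np.random.default_rng(1)
band1 = rng.integers(0, 256, (4, 4)).astype(float)   # toy image band
band2 = rng.integers(1, 256, (4, 4)).astype(float)   # second spectral band

# Level slicing: keep pixels inside a brightness window, zero the rest.
sliced = np.where((band1 >= 80) & (band1 <= 180), band1, 0)

# Gray mapping: push every pixel through a lookup table (here, a simple
# contrast stretch implemented as a 256-entry LUT).
lut = np.clip((np.arange(256) - 50) * 255 / 150, 0, 255)
mapped = lut[band1.astype(int)]

# Ratio processing: pixel-by-pixel ratio of two bands.
ratio = band1 / band2

print(sliced.shape, mapped.max(), round(ratio.mean(), 2))
```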

  12. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  13. Topics in digital signal processing

    NASA Astrophysics Data System (ADS)

    Narayan, S. S. R.

    Three discrete Fourier transform (DFT) algorithms, namely, the fast Fourier transform algorithm (FFT), the prime factor algorithm (PFA) and the Winograd Fourier transform algorithm (WFTA) are analyzed and compared. A new set of short-length DFT algorithms well-suited for special purpose hardware implementations, employing monolithic multiplier-accumulators and microprocessors, are presented. Architectural considerations in designing DFT processors based on these algorithms are discussed. Efficient hardware structures for implementing the FFT and the PFA are presented. A digital implementation for performing linear-FM (LFM) pulse compression by using bandpass filter banks is presented. The concept of transform domain adaptive filtering is introduced. The DFT and the discrete cosine transform (DCT) domain adaptive filtering algorithms are considered. Applications of these in the areas of speech processing and adaptive line enhancers are discussed. A simple waveform coding algorithm capable of providing good quality speech at about 1.5 bits per sample is presented.

  14. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques for the digital enhancement of radiographs are presented, along with the corresponding software documentation. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of format of data from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operation. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.

  15. Seamless lesion insertion in digital mammography: methodology and reader study

    NASA Astrophysics Data System (ADS)

    Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman

    2016-03-01

    Collection of large repositories of clinical images containing verified cancer locations is costly and time consuming due to difficulties associated with both the accumulation of data and establishment of the ground truth. This problem poses a significant challenge to the development of machine learning algorithms that require large amounts of data to properly train and avoid overfitting. In this paper we expand the methods in our previous publications by making several modifications that significantly increase the speed of our insertion algorithms, thereby allowing them to be used for inserting lesions that are much larger in size. These algorithms have been incorporated into an image composition tool that we have made publicly available. This tool allows users to modify or supplement existing datasets by seamlessly inserting a real breast mass or micro-calcification cluster extracted from a source digital mammogram into a different location on another mammogram. We demonstrate examples of the performance of this tool on clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM). Finally, we report the results of a reader study evaluating the realism of inserted lesions compared to clinical lesions. Analysis of the radiologist scores in the study using receiver operating characteristic (ROC) methodology indicates that inserted lesions cannot be reliably distinguished from clinical lesions.

  16. [Generation and processing of digital images in radiodiagnosis].

    PubMed

    Bajla, I; Belan, V

    1993-05-01

    The paper describes universal principles of diagnostic imaging. The attention is focused particularly on digital image generation in medicine. The methodology of display visualization of measured data is discussed. The problems of spatial relation representation and visual perception of image brightness are mentioned. The methodological issues of digital image processing (DIP) are discussed, particularly the relation of DIP to the other related disciplines, fundamental tasks in DIP and classification of DIP operations from the computational viewpoint. The following examples of applying DIP operations in diagnostic radiology are overviewed: local contrast enhancement in digital image, spatial filtering, quantitative texture analysis, synthesis of the 3D pseudospatial image based on the 2D tomogram set, multimodal processing of medical images. New trends of application of DIP methods in diagnostic radiology are outlined: evaluation of the diagnostic efficiency of DIP operations by means of ROC analysis, construction of knowledge-based systems of DIP in medicine. (Fig. 12, Ref. 26.)

  17. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
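
    As a rough companion to step (1), the sketch below demodulates a synthetic BPSK signal entirely in software: it mixes the samples with a locally generated carrier, integrates over each bit period, and slices the result back to bits. Carrier phase and bit timing are assumed known here, which a real receiver must recover; all rates are arbitrary.

```python
import numpy as np

fs, fc, bit_rate = 48_000, 6_000, 1_200      # illustrative rates (Hz)
sps = fs // bit_rate                          # samples per bit
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

# Modulator: carrier phase 0 or 180 degrees selected by each bit.
t = np.arange(bits.size * sps) / fs
symbols = np.repeat(2 * bits - 1, sps)        # map {0,1} -> {-1,+1}
bpsk = symbols * np.cos(2 * np.pi * fc * t)
bpsk += 0.2 * np.random.randn(bpsk.size)      # additive noise

# Demodulator: coherent mixing, integrate-and-dump over each bit, slice.
mixed = bpsk * np.cos(2 * np.pi * fc * t)     # assumes perfect carrier sync
integrated = mixed.reshape(bits.size, sps).sum(axis=1)
recovered = (integrated > 0).astype(int)
print("bit errors:", int(np.sum(recovered != bits)))
```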

  18. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-09-30

    Advanced Digital Signal Processing for Hybrid Lidar. William D. Jemison, Clarkson University. Technical Objectives: The technical objective of this project is the development and evaluation of various digital signal processing (DSP) algorithms that will enhance hybrid lidar...algorithm as shown in Figure 1 (hardware platform for algorithm implementation, underwater channel characteristics, lidar DSP algorithm).

  19. A Methodology and Implementation for Annotating Digital Images for Context-appropriate Use in an Academic Health Care Environment

    PubMed Central

    Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.

    2004-01-01

    Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971

  20. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton- and alpha-emitting nuclei. Recent developments extend the sensitivity and breadth of the application. The digital signal processing capabilities, used predominantly by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for alpha and beta-delayed neutron emitter measurements as well as the next generation of superheavy experiments are discussed.

  1. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

    In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.

  2. Digital data acquisition and processing.

    PubMed

    Naivar, Mark A; Galbraith, David W

    2015-01-05

    A flow cytometer is made up of many different subsystems that work together to measure the optical properties of individual cells within a sample. The data acquisition system (also called the data system) is one of these subsystems, and it is responsible for converting the electrical signals from the optical detectors into list-mode data. This unit describes the inner workings of the data system, and provides insight into how the instrument functions as a whole. Some of the information provided in this unit is applicable to everyday use of these instruments, and, at minimum, should make it easier for the reader to assemble a specific data system. With the considerable advancement of electronics technology, it becomes possible to build an entirely functional data system using inexpensive hobbyist-level electronics. This unit covers both analog and digital data systems, but the primary focus is on the more prevalent digital data systems of modern flow cytometric instrumentation.

  3. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitization of old globes, which brings with it the possibility of using the globes in digital form. The created digital models are available to the general public through modern technology over the Internet. This provides an opportunity to study old globes held in various historical collections and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models both among themselves and with current map data by increasing the transparency of individual layers. Digitization is carried out using a special device that allows globes with a diameter ranging from 5 cm to 120 cm to be digitized. This device can be easily disassembled and is fully mobile, so globes can be digitized at the place where they are stored. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. The acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is a 3D model viewed through the Cesium JavaScript library or the Google Earth plug-in in a Web browser. The second option is a georeferenced map using a Tile Map Service.

  4. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by thermal noise level at given temperature in analog domain, we must find new ways of further compensating for the noise in the signal digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies of processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the
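
    The heritage approach summarized in this record, using statistics of non-illuminated reference pixels to correct the active pixels, can be illustrated generically. The toy example below subtracts a per-row reference mean from a synthetic frame; it reflects the general idea only, not the HST/JWST pipelines or the HHT-DPS method the abstract goes on to describe.

```python
import numpy as np

rng = np.random.default_rng(2)
rows, cols, n_ref = 16, 16, 4                # 4 reference columns per row

signal = np.zeros((rows, cols))
signal[:, n_ref:] = 100.0                    # active pixels see flux
row_drift = rng.normal(0, 5, size=(rows, 1)) # correlated readout noise
frame = signal + row_drift + rng.normal(0, 1, size=(rows, cols))

# Reference-pixel correction: estimate the drift from the dark reference
# columns and subtract it from every pixel in the same row.
ref_estimate = frame[:, :n_ref].mean(axis=1, keepdims=True)
corrected = frame - ref_estimate

print("row-to-row std before:", round(frame[:, n_ref:].mean(axis=1).std(), 2))
print("row-to-row std after: ", round(corrected[:, n_ref:].mean(axis=1).std(), 2))
```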

  5. RSFQ Baseband Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Herr, Anna Yurievna

    The ultra-fast switching speed of superconducting digital circuits enables the realization of digital signal processors with performance unattainable by any other technology. Based on rapid single flux quantum (RSFQ) logic, these integrated circuits are capable of delivering high computation capacity, up to 30 GOPS on a single processor, with very short latency of 0.1 ns. There are two main applications of such hardware in practical telecommunication systems: filters for superconducting ADCs operating on digital RF data, and recursive filters at baseband. The latter of these allows functions such as multiuser detection for 3G WCDMA, equalization and channel precoding for 4G OFDM MIMO, and general blind detection. The performance gain is an increase in cell capacity, quality of service, and transmitted data rate. The current status of the development of the RSFQ baseband DSP is discussed. Major components with operating speeds of 30 GHz have been developed. Designs, test results, and future development of the complete systems, including cryopackaging and the CMOS interface, are reviewed.

  6. Checking Fits With Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Davis, R. M.; Geaslen, W. D.

    1988-01-01

    Computer-aided video inspection of mechanical and electrical connectors feasible. Report discusses work done on digital image processing for computer-aided interface verification (CAIV). Two kinds of components examined: mechanical mating flange and electrical plug.

  7. CT Image Processing Using Public Digital Networks

    PubMed Central

    Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

    1984-01-01

    Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

  8. Process independent automated sizing methodology for current steering DAC

    NASA Astrophysics Data System (ADS)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.

    2015-10-01

    This study introduces a process-independent automated sizing methodology based on a general regression neural network (GRNN) for current-steering complementary metal-oxide semiconductor (CMOS) digital-to-analog converter (DAC) circuits. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling studies that consider a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. Eventually, a high-dimensional database was developed consisting of transistor sizes of DAC designs and corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set including the simulation results of ON-SEMI 1.5 µm and 0.5 µm technology parameters, while the test data consisted of only the simulation results of TSMC 0.35 µm technology parameters that had not been applied to the GRNN during training. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of the DAC static output specifications (Differential Non-linearity error, Integral Non-linearity error, monotonicity and gain error) as the inputs of the network.
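
    For readers unfamiliar with the GRNN itself, the sketch below shows its core: a kernel-weighted average of training targets (Specht's general regression network). The toy data, smoothing factor, and single output are placeholders; the actual DAC sizing database and specification inputs from the abstract are not reproduced.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    """General regression neural network: each training sample contributes
    to the prediction with a Gaussian weight on its distance to the query
    (a kernel-weighted average of the stored targets)."""
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, y_train) / np.sum(w)

# Toy stand-in for the sizing database: inputs are (spec1, spec2) pairs,
# the output is a single "transistor width"; none of this is the real data.
rng = np.random.default_rng(3)
x_train = rng.uniform(0, 1, size=(200, 2))
y_train = 10 + 5 * x_train[:, 0] - 3 * x_train[:, 1]   # known toy mapping

query = np.array([0.4, 0.7])
print("predicted width:", round(grnn_predict(x_train, y_train, query), 2))
# expected value is roughly 10 + 5*0.4 - 3*0.7 = 9.9
```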

  9. Overview of Digital Signal Processing Theory

    DTIC Science & Technology

    1975-05-20

    of digital integrated-circuit hardware elements along with their extremely high reliability, maintainability, and repeatability of performance have...limited by large-signal-performance and power limitations of circuit components. In the implementation of digital signal processing systems there...E. Polak and E. Wong, Notes For A First Course On Linear Systems, Van Nostrand Reinhold Company, New York, 1970. 2. C.A. Desoer, Notes For A

  10. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.
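
    The final step mentioned, deriving heart rate from the recovered ECG, can be sketched independently of the telemetry hardware: detect R-peaks with a threshold and a refractory period, then convert the mean peak spacing to beats per minute. This is a generic illustration on a synthetic waveform, not the analysis program referenced in the abstract.

```python
import numpy as np

fs = 250                                     # samples per second, illustrative
t = np.arange(0, 10, 1 / fs)                 # 10 s of synthetic "ECG"
hr_true = 72                                 # beats per minute
ecg = np.zeros_like(t)
beat_samples = (np.arange(0, 10, 60 / hr_true) * fs).astype(int)
ecg[beat_samples] = 1.0                      # idealized R-peaks
ecg += 0.05 * np.random.randn(ecg.size)      # noise

# Threshold detection with a refractory period of 0.3 s between peaks.
threshold, refractory = 0.5, int(0.3 * fs)
peaks, last = [], -refractory
for i, v in enumerate(ecg):
    if v > threshold and i - last >= refractory:
        peaks.append(i)
        last = i

rr = np.diff(peaks) / fs                     # R-R intervals in seconds
print("estimated heart rate:", round(60 / rr.mean(), 1), "bpm")
```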

  11. Digital Photon Correlation Data Processing Techniques

    DTIC Science & Technology

    1976-07-01

    AEDC-TR-76-81, Digital Photon Correlation Data Processing Techniques, Science Applications, Inc., Atlanta, Georgia 30339. Summary: Presently available laser velocimeter (LV) electronic signal processing techniques are often inadequate for detec

  12. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.
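
    To give a feel for the "demodulate against a slowly varying frequency estimate" step, the sketch below removes a smooth phase trend from a synthetic complex beacon record and keeps the residual, scintillation-like phase. The polynomial detrend is a stand-in assumption; the paper's actual frequency-tracking procedure is not specified here.

```python
import numpy as np

fs = 100.0                                    # complex sample rate (Hz)
t = np.arange(0, 60, 1 / fs)                  # one minute of data

# Synthetic baseband signal: slow Doppler/TEC phase ramp plus a smaller,
# faster "scintillation" phase component and noise.
slow_phase = 2 * np.pi * (0.5 * t + 0.002 * t ** 2)
scint_phase = 0.3 * np.sin(2 * np.pi * 0.8 * t)
sig = np.exp(1j * (slow_phase + scint_phase))
sig += 0.05 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

# Low-frequency reference: fit a smooth polynomial to the unwrapped phase.
phase = np.unwrap(np.angle(sig))
coeffs = np.polyfit(t, phase, deg=3)          # deg=3 is an assumption
reference = np.polyval(coeffs, t)

# Dynamic demodulation: remove the reference to expose the scintillation.
residual = phase - reference
print("residual phase std (rad):", round(residual.std(), 3))   # ~ 0.3/sqrt(2)
```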

  13. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  15. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  16. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  17. Digitally Controlled Analog Signal Processing

    DTIC Science & Technology

    1988-04-01

    A discussion of alternative state-of-the-art approaches to monolithic continuous-time signal processing can be... vi(t) are the input and output samples which are simultaneously measured at time t; there are three unknowns in this expression, the maximum input and... Unfortunately the current OpAmp bandwidth of 30 MHz is near state-of-the-art limits... The finite voltage-dependent on-resistance of S distorted...

  18. Eliminating "Hotspots" in Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Salomon, P. M.

    1984-01-01

    Signals from defective picture elements rejected. Image processing program for use with charge-coupled device (CCD) or other mosaic imager augmented with algorithm that compensates for common type of electronic defect. Algorithm prevents false interpretation of "hotspots". Used for robotics, image enhancement, image analysis and digital television.
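
    The abstract states that signals from defective picture elements are rejected so that hotspots are not misinterpreted, but it does not give the algorithm. The sketch below is a hedged illustration of one simple scheme, flagging pixels far above their local neighborhood and replacing them with the local median; the threshold and window size are assumptions.

      # Hedged sketch: flag "hotspot" pixels that sit far above their local
      # neighborhood and replace them with the local median.  This illustrates
      # the idea only; it is not the algorithm from the NASA program.
      import numpy as np
      from scipy.ndimage import median_filter

      def suppress_hotspots(image, threshold=5.0):
          local_median = median_filter(image.astype(float), size=3)
          residual = image - local_median
          hot = residual > threshold * residual.std()   # unusually bright pixels
          cleaned = image.astype(float).copy()
          cleaned[hot] = local_median[hot]
          return cleaned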

  19. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  20. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  4. Research Methodologies and the Doctoral Process.

    ERIC Educational Resources Information Center

    Creswell, John W.; Miller, Gary A.

    1997-01-01

    Doctoral students often select one of four common research methodologies that are popular in the social sciences and education today: positivist; interpretive; ideological; and pragmatic. But choice of methodology also influences the student's choice of course work, membership of dissertation committee, and the form and structure of the…

  5. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to survey the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept for defining the system boundaries is the most used approach in practice, rather than the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is expressed linearly through the characterization factors; synergistic effects of the contaminants are therefore neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most of the analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing the costs of environmental externalities. Finally, a trend toward dealing with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results as data availability allows.

  6. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it receives the most emphasis, but other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors, as well as indirect flat-panel detectors and a system with four high-resolution CCD cameras. In a second part, the most important image processing operations are discussed: gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. The last part weighs the advantages and drawbacks of computed thoracic radiography; the most important advantages are the consistently good image quality and the possibilities for image processing.
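
    Of the image processing operations listed above, unsharp masking has a particularly compact formulation: a blurred copy of the image is subtracted from the original and a fraction of the difference is added back to boost fine detail. A hedged sketch follows, assuming a grayscale image held in a NumPy array; the blur radius and gain are illustrative choices.

      # Hedged sketch of unsharp mask processing as mentioned above: boost the
      # detail (original minus blurred) by a chosen amount.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def unsharp_mask(image, sigma=5.0, amount=0.7):
          blurred = gaussian_filter(image.astype(float), sigma=sigma)
          return image + amount * (image - blurred)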

  7. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of a digital-image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  8. A brief review of digital image processing

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1975-01-01

    The review is presented with particular reference to Skylab S-192 and Landsat MSS imagery. Attention is given to rectification (calibration) processing with emphasis on geometric correction of image distortions. Image enhancement techniques (e.g., the use of high pass digital filters to eliminate gross shading to allow emphasis of the fine detail) are described along with data analysis and system considerations (software philosophy).

  9. Digital processing of ionospheric electron content data

    NASA Technical Reports Server (NTRS)

    Bernhardt, P. A.

    1979-01-01

    Ionospheric electron content data contain periodicities that are produced by a diversity of sources including hydromagnetic waves, gravity waves, and lunar tides. Often these periodicities are masked by the strong daily variation in the data. Digital filtering can be used to isolate the weaker components. The filtered data can then be further processed to provide estimates of the source properties. In addition, homomorphic filtering may be used to identify nonlinear interactions in the ionosphere.
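
    The key step described above is a digital filter that suppresses the strong daily variation so the weaker periodicities can be examined. A hedged sketch of a band-pass filter applied to a regularly sampled total electron content series is given below; the sampling interval and the pass band (periods of roughly 10 minutes to 2 hours) are illustrative assumptions.

      # Hedged sketch: isolate a band of periodicities in an electron-content
      # time series by band-pass filtering away the strong diurnal trend.
      import numpy as np
      from scipy.signal import butter, filtfilt

      def isolate_periodicities(tec, sample_interval_s=60.0,
                                period_band_s=(600.0, 7200.0)):
          fs = 1.0 / sample_interval_s
          low = 1.0 / period_band_s[1]     # longest period -> lowest frequency
          high = 1.0 / period_band_s[0]    # shortest period -> highest frequency
          b, a = butter(3, [low / (fs / 2), high / (fs / 2)], btype="band")
          return filtfilt(b, a, tec)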

  10. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  11. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal to noise ratio, dynamic range, analog to digital conversion and JPG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from XYZ color space to RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.

  12. A methodology for use of digital image correlation for hot mix asphalt testing

    NASA Astrophysics Data System (ADS)

    Ramos, Estefany

    Digital Image Correlation (DIC) is a relatively new technology which aids in the measurement of material properties without the need for installation of sensors. DIC is a noncontact measuring technique that requires the specimen to be marked with a random speckled pattern and to be photographed during the test. The photographs are then post-processed based on the location of the pattern throughout the test. DIC can aid in calculating properties that would otherwise be too difficult to obtain even with other measuring instruments. The objective of this thesis is to discuss the methodology and validate the use of DIC in different hot mix asphalt (HMA) tests, such as the Overlay Tester (OT) Test, the Indirect Tensile (IDT) Test, and the Semicircular Bending (SCB) Test. The DIC system provides displacements and strains on any visible surface. Properly calibrated 2-D or 3-D DIC data can be used to understand the complex stress and strain distributions and the modes of initiation and propagation of cracks. The use of this observational method will lead to further understanding of the complex boundary conditions of the different tests, therefore allowing it to be applied in the analysis of other materials. The use of digital image correlation will bring insight and knowledge into what is happening during a test.

  13. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled the development of high-efficiency gamma-ray spectrometers which have the ability to track the path of gamma-rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by 100 MHz, 14-bit flash ADCs. Filters that provide timing and high-resolution energies are implemented on local FPGAs acting on the ADC data streams, while interaction point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we will give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look towards soon developing larger spectrometers such as GRETA with full 4π solid-angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.

  14. Image processing of digital chest ionograms.

    PubMed

    Yarwood, J R; Moores, B M

    1988-10-01

    A number of image-processing techniques have been applied to a digital ionographic chest image in order to evaluate their possible effects on this type of image. In order to quantify any effect, a simulated lesion was superimposed on the image at a variety of locations representing different types of structural detail. Visualization of these lesions was evaluated by a number of observers both pre- and post-processing operations. The operations employed included grey-scale transformations, histogram operations, edge-enhancement and smoothing functions. The resulting effects of these operations on the visualization of the simulated lesions are discussed.

  15. C language algorithms for digital signal processing

    SciTech Connect

    Embree, P.M.; Kimble, B.

    1991-01-01

    The use of the C programming language to construct digital signal-processing (DSP) algorithms for operation on high-performance personal computers is described in a textbook for engineering students. Chapters are devoted to the fundamental principles of DSP, basic C programming techniques, user-interface and disk-storage routines, filtering routines, discrete Fourier transforms, matrix and vector routines, and image-processing routines. Also included is a floppy disk containing a library of standard C mathematics, character-string, memory-allocation, and I/O functions; a library of DSP functions; and several sample DSP programs. 83 refs.

  16. Applications of Digital Image Processing 11

    NASA Technical Reports Server (NTRS)

    Cho, Y. -C.

    1988-01-01

    A new technique, digital image velocimetry, is proposed for the measurement of instantaneous velocity fields of time dependent flows. A time sequence of single-exposure images of seed particles is captured with a high-speed camera, and a finite number of the single-exposure images are sampled within a prescribed period in time. The sampled images are then digitized on an image processor, enhanced, and superimposed to construct an image which is equivalent to a multiple exposure image used in both laser speckle velocimetry and particle image velocimetry. The superimposed image and a single-exposure image are digitally Fourier transformed for extraction of information on the velocity field. A great enhancement of the dynamic range of the velocity measurement is accomplished through the new technique by manipulating the Fourier transform of both the single-exposure image and the superimposed image. Also the direction of the velocity vector is unequivocally determined. With the use of a high-speed video camera, the whole process from image acquisition to velocity determination can be carried out electronically; thus this technique can be developed into a real-time capability.

  17. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  18. Unraveling Change in Therapy: Three Different Process Research Methodologies.

    ERIC Educational Resources Information Center

    Woolley, Scott R.; Butler, Mark H.; Wampler, Karen S.

    2000-01-01

    In response to repeated calls for process research on couple and family therapy, three different process research methodologies - grounded theory, change events analysis, and experimental manipulation - are presented and evaluated. The strengths and weaknesses of each methodology are discussed, along with their role in generating and testing…

  19. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-03-31

    ...on a multimeter to ensure that the PMT remained within its linear operating regime. The AC-coupled signal was demodulated and digitized in the SDR receiver. The I and Q samples obtained by the SDR are transferred over an Ethernet cable to a PC, where the data are processed in a custom LabVIEW... Q samples are generated by the SDR receiver and used to compute range on a PC. Ranging results from the FDR experiments and RangeFinder simulations...

  20. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-03-31

    project "Advanced Digital Signal Processing for Hybrid Lidar " covering the period of 1/1/2013-3/31/2013. 9LO\\SO^O’IH^’?’ William D. Jemison...Chaotic LIDAR for Naval Applications This document contains a Progress Summary for FY13 Q2 and a Short Work Statement for FY13 Progress Summary for...This technique has the potential to increase the unambiguous range of hybrid lidar -radar while maintaining reasonable range resolution. Proof-of

  1. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and the related areas. The identification can be done by transforming the gray levels so that the gray level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. Researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation with time complexity O(max(m,n)), where m and n are the numbers of gray levels of the input picture and the reference picture respectively. If using a uniprocessor and a dynamic programming algorithm, the time complexity will be O(m^3 x n). The algorithm partition problem, as an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
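
    The transformation described above maps the gray levels of one picture so that its histogram closely matches that of the other. A hedged serial sketch of histogram matching via cumulative distributions is given below to illustrate the operation that the proposed VLSI architecture accelerates; it is not the packing method itself, and integer gray levels in 0..255 are assumed.

      # Hedged sketch of gray-level histogram matching: map each input gray
      # level to the reference level with the nearest cumulative frequency.
      # This serial version only illustrates the transformation; it is not the
      # packing method or the VLSI implementation from the paper.
      import numpy as np

      def match_histogram(input_img, reference_img, levels=256):
          """Both images are integer arrays with gray levels in 0..levels-1."""
          in_hist, _ = np.histogram(input_img, bins=levels, range=(0, levels))
          ref_hist, _ = np.histogram(reference_img, bins=levels, range=(0, levels))
          in_cdf = np.cumsum(in_hist) / in_hist.sum()
          ref_cdf = np.cumsum(ref_hist) / ref_hist.sum()
          mapping = np.searchsorted(ref_cdf, in_cdf).clip(0, levels - 1)
          return mapping[input_img]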

  2. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  3. Seismic Rayleigh Wave Digital Processing Technology

    NASA Astrophysics Data System (ADS)

    Jie, Li

    2013-04-01

    In Rayleigh wave exploration, the digital processing of data plays a very important role, since it directly affects the interpretation of the ground response. Therefore, the use of accurate processing software and effective methods in Rayleigh wave exploration has important theoretical and practical significance. Previously, the Rayleigh wave dispersion curve was obtained by one-dimensional phase analysis. This method requires the channel spacing to be less than the effective wavelength, and a minimal phase error can cause large changes in the Rayleigh wave phase velocity. The damped least squares method is a local linear model, so its inversion objective function can easily fail to find the global optimal solution. Therefore, the methods and technology used in the past have difficulty meeting the requirements of current Rayleigh wave exploration. This study focused on the techniques and algorithms of F-K domain dispersion curve extraction and GA global non-linear inversion, combined with the influence of Rayleigh wave data acquisition parameters and data characteristics. The design of Rayleigh wave exploration data processing software and the associated processing technology research were completed. Firstly, the article describes the theoretical basis of the Rayleigh wave method, which also underlies the subsequent processing, including the theoretical proof of the existence of Rayleigh wave dispersion in layered strata. Secondly, F-K domain dispersion curve extraction tests showed that the method can overcome the deficiencies of one-dimensional processing and make full use of the information in multi-channel Rayleigh wave records. GA global non-linear inversion indicated that the inversion does not easily become trapped in a local optimal solution. Thirdly, examples illustrate the dispersion characteristics of each Rayleigh wave mode in the X-T domain, and tests demonstrate their impact on the extraction of dispersion curves. Parameter change examples (including the X...

  4. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview of the basic techniques used to process Landsat images with a digital computer, and the VICAR image processing software developed at JPL and available to users through the NASA sponsored COSMIC computer program distribution center is presented. Examples of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections are given. Examples are illustrated by Landsat scenes of the Andes mountains and Altyn-Tagh fault zone in China before and after contrast enhancement and classification of land use in Portland, Oregon. The VICAR image processing software system, which consists of a language translator that simplifies execution of image processing programs and provides a general purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs, is described.
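
    Two of the subjective operations named above, contrast enhancement and band ratioing, have very small formulations. A hedged sketch follows, assuming each band is supplied as a floating-point NumPy array; the percentile limits are an illustrative choice.

      # Hedged sketch of two operations mentioned above: a linear contrast
      # stretch between chosen percentiles, and a simple band ratio.
      import numpy as np

      def contrast_stretch(band, low_pct=2, high_pct=98):
          lo, hi = np.percentile(band, [low_pct, high_pct])
          return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

      def band_ratio(band_a, band_b, eps=1e-6):
          return band_a / (band_b + eps)    # eps guards against division by zero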

  5. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analyses and preliminary test results comparing the digital and analog processing approaches to radiometer design are presented.

  6. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  7. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.

  8. Textural identification of carbonate rocks by image processing and neural network: Methodology proposal and examples

    NASA Astrophysics Data System (ADS)

    Marmo, Roberto; Amodio, Sabrina; Tagliaferri, Roberto; Ferreri, Vittoria; Longo, Giuseppe

    2005-06-01

    Using more than 1000 thin section photos of ancient (Phanerozoic) carbonates from different marine environments (pelagic to shallow-water), a new numerical methodology, based on digitized images of thin sections, is proposed here. In accordance with the Dunham classification, it allows the user to automatically identify carbonate textures unaffected by post-depositional modifications (recrystallization, dolomitization, meteoric dissolution and so on). The methodology takes as input a 256 grey-tone digital image and, by image processing, gives as output a set of 23 numerical features measured on the whole image, including the "white areas" (calcite cement). A multi-layer perceptron neural network takes these features as input and gives, as output, the estimated class. We used 532 images of thin sections to train the neural network, whereas to test the methodology we used 268 images taken from the same photo collection and 215 images from the San Lorenzello carbonate sequence (Matese Mountains, southern Italy), Early Cretaceous in age. This technique has shown 93.3% and 93.5% accuracy in automatically classifying the textures of carbonate rocks from digitized images on the 268 and 215 image test sets, respectively. Therefore, the proposed methodology is a further promising application to the geosciences, allowing the carbonate textures of many thin sections to be identified in a rapid and accurate way. A MATLAB-based computer code has been developed for the processing and display of images.
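
    The pipeline above reduces each 256 grey-tone image to 23 numerical features and feeds them to a multi-layer perceptron. The abstract does not enumerate the features, so the sketch below uses a few generic grey-level statistics as stand-ins and scikit-learn's MLPClassifier in place of the authors' MATLAB network; the feature choice, network size and training settings are all assumptions.

      # Hedged sketch of the image-features-plus-MLP pipeline described above.
      # The three statistics are generic stand-ins for the authors' 23 features
      # and the network size is illustrative.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      def grey_level_features(image):
          """image: 2-D array of grey levels in 0..255."""
          hist, _ = np.histogram(image, bins=256, range=(0, 256), density=True)
          p = hist[hist > 0]
          entropy = -np.sum(p * np.log2(p))
          return np.array([image.mean(), image.std(), entropy])

      def train_texture_classifier(images, labels):
          X = np.array([grey_level_features(img) for img in images])
          clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
          clf.fit(X, labels)
          return clf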

  9. SYDDARTA: new methodology for digitization of deterioration estimation in paintings

    NASA Astrophysics Data System (ADS)

    Granero-Montagud, Luís.; Portalés, Cristina; Pastor-Carbonell, Begoña.; Ribes-Gómez, Emilio; Gutiérrez-Lucas, Antonio; Tornari, Vivi; Papadakis, Vassilis; Groves, Roger M.; Sirmacek, Beril; Bonazza, Alessandra; Ozga, Izabela; Vermeiren, Jan; van der Zanden, Koen; Föster, Matthias; Aswendt, Petra; Borreman, Albert; Ward, Jon D.; Cardoso, António; Aguiar, Luís.; Alves, Filipa; Ropret, Polonca; Luzón-Nogué, José María.; Dietz, Christian

    2013-05-01

    The SYDDARTA project is an on-going European Commission funded initiative under the 7th Framework Programme. Its main objective is the development of a pre-industrial prototype for diagnosing the deterioration of movable art assets. The device combines two different optical techniques for the acquisition of data. On one hand, hyperspectral imaging is implemented by means of electronically tunable filters. On the other, 3D scanning, using structured light projection and capturing is developed. These techniques are integrated in a single piece of equipment, allowing the recording of two optical information streams. Together with multi-sensor data merging and information processing, estimates of artwork deterioration and degradation can be made. In particular, the resulting system will implement two optical channels (3D scanning and short wave infrared (SWIR) hyperspectral imaging) featuring a structured light projector and electronically tunable spectral separators. The system will work in the VIS-NIR range (400-1000nm), and SWIR range (900-2500nm). It will be also portable and user-friendly. Among all possible art work under consideration, Baroque paintings on canvas and wooden panels were selected as the project case studies.

  10. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
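
    The central step above is the loss function that maps a process deviation in a given scenario to an economic loss, with the individual losses then integrated into a total. A hedged sketch of an inverted-normal style loss curve and a simple summation follows; the functional form, parameter names and numbers are illustrative assumptions, not the revised Taguchi or modified inverted normal functions specified in the paper.

      # Hedged sketch: an inverted-normal style loss rises from zero at the
      # target toward a maximum loss as the process variable deviates, and the
      # individual losses are then summed.  Parameters are illustrative only.
      import math

      def inverted_normal_loss(value, target, max_loss, spread):
          return max_loss * (1.0 - math.exp(-((value - target) ** 2) / (2.0 * spread ** 2)))

      def total_loss(deviations):
          """deviations: iterable of (value, target, max_loss, spread) tuples."""
          return sum(inverted_normal_loss(*d) for d in deviations)

      # Example: pressure 2 bar above target, temperature exactly on target.
      print(total_loss([(12.0, 10.0, 50000.0, 1.5), (350.0, 350.0, 20000.0, 5.0)]))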

  11. Digital Light Processing and MEMS: reflecting the digital display needs of the networked society

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1996-08-01

    Digital video technology is becoming increasingly important to the networked society. The natural interface to digital video is a digital display, one that accepts electrical bits at its input and converts them into optical bits at the output. The digital-to-analog processing function is performed in the mind of the observer. Texas Instruments has developed such a display with its recent market introduction of the Digital Light ProcessingTM (DLPTM) projection display. DLP technology is based on the Digital Micromirror DeviceTM (DMDTM), a microelectromechanical systems (MEMS) array of semiconductor-based digital light switches. The DMD switching array precisely controls a light source for projection display and digital printing applications. This paper presents an overview of DLP technology along with the architecture, projection operation, manufacture, and reliability of the DMD. Features of DMD technology that distinguish it from conventional MEMS technology are explored. Finally, the paper provides a view of DLP business opportunities.

  12. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…

  13. Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes

    ERIC Educational Resources Information Center

    Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.

    2016-01-01

    The purpose of this study was to research the methodological approaches to the study of interethnic relations and ethno-social processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing the research of inter-ethnic relations, allowing to highlight the current…

  14. A rapid prototyping methodology to implement and optimize image processing algorithms for FPGAs

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed; Niang, Pierre; Grandpierre, Thierry

    2006-02-01

    In this article we present local operations in image processing based upon spatial 2D discrete convolution. We study different implementations of such local operations. We also present the principles and the design flow of the AAA methodology and its associated CAD software tool for integrated circuits (SynDEx-IC). In this methodology, the algorithm is modeled by a Conditioned (if-then-else) and Factorized (loop) Data Dependence Graph, and the optimized implementation is obtained by graph transformations. AAA/SynDEx-IC is used to specify and optimize some digital image filters on an FPGA XC2100 board.
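
    The local operations referred to above reduce to a spatial 2D discrete convolution of the image with a small kernel. The sketch below shows the direct formulation that such FPGA implementations accelerate; the edge padding and the 3x3 averaging kernel in the example are illustrative choices.

      # Hedged sketch of the spatial 2D discrete convolution underlying the
      # local operations discussed above; an FPGA implementation computes the
      # same weighted sums in hardware.
      import numpy as np

      def convolve2d_direct(image, kernel):
          kh, kw = kernel.shape
          ph, pw = kh // 2, kw // 2
          padded = np.pad(image.astype(float), ((ph, ph), (pw, pw)), mode="edge")
          flipped = kernel[::-1, ::-1]          # true convolution flips the kernel
          out = np.zeros(image.shape, dtype=float)
          for i in range(image.shape[0]):
              for j in range(image.shape[1]):
                  out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
          return out

      smoothing_kernel = np.ones((3, 3)) / 9.0  # example 3x3 averaging kernel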

  15. Low cost 3D scanning process using digital image processing

    NASA Astrophysics Data System (ADS)

    Aguilar, David; Romero, Carlos; Martínez, Fernando

    2017-02-01

    This paper shows the design and building of a low cost 3D scanner, able to digitize solid objects through contactless data acquisition, using active object reflection. 3D scanners are used in different applications such as science, engineering and entertainment; they are classified into contact scanners and contactless ones, with the latter being the most used although they are expensive. This low-cost prototype performs a vertical scan of the object using a fixed camera and a mobile horizontal laser light, which is deformed depending on the 3-dimensional surface of the solid. Using digital image processing, the deformation detected by the camera is analyzed, allowing the 3D coordinates to be determined by triangulation. The obtained information is processed by a Matlab script, which gives the user a point cloud corresponding to each horizontal scan performed. The obtained results show acceptable quality and significant detail of the digitized objects, making this prototype (built on a LEGO Mindstorms NXT kit) a versatile and cheap tool, which can be used for many applications, mainly by engineering students.
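
    The prototype above recovers 3D coordinates by triangulating the displacement of the laser stripe seen by the fixed camera. A hedged sketch of the range calculation for a simple camera-laser geometry is given below; the geometry, parameter names and the numerical example are assumptions for illustration, not the prototype's calibration.

      # Hedged sketch of laser-line triangulation: a camera and a laser plane
      # separated by a known baseline form a triangle with the illuminated
      # surface point, so the stripe's displacement in the image maps to range.
      import math

      def range_from_stripe(pixel_offset, focal_length_px, baseline_m, laser_angle_rad):
          """Range from the camera to the lit point, via the law of sines."""
          camera_angle = math.atan2(pixel_offset, focal_length_px)  # ray angle of the stripe pixel
          return baseline_m * math.sin(laser_angle_rad) / math.sin(laser_angle_rad + camera_angle)

      # Example: 40 px offset, 800 px focal length, 0.10 m baseline, 60 degree laser plane.
      print(range_from_stripe(40.0, 800.0, 0.10, math.radians(60.0)))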

  16. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  17. An Interactive Graphics Program for Investigating Digital Signal Processing.

    ERIC Educational Resources Information Center

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  18. E-inclusion Process and Societal Digital Skill Development

    ERIC Educational Resources Information Center

    Vitolina, Ieva

    2015-01-01

    Nowadays, the focus shifts from information and communication technology access to skills and knowledge. Moreover, lack of digital skills is an obstacle in the process of learning new digital competences using technologies and e-learning. The objective of this research is to investigate how to facilitate students to use the acquired digital skills…

  20. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  1. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    A Controller Performance Evaluation (CPE) methodology for multi-input/multi-output digital control systems was developed and tested on an aeroelastic wind-tunnel model. Modern signal processing methods were used to implement control laws and to acquire time domain data of the whole system (controller and plant) from which appropriate transfer matrices of the control system could be generated. Matrix computational procedures were used to calculate singular values of return-difference matrices at the plant input and output points to evaluate the performance of the control system. The CPE procedures effectively identified potentially destabilizing controllers and confirmed the satisfactory performance of stabilizing ones.

  2. On digital image processing technology and application in geometric measure

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Xing, Ruonan; Liao, Na

    2014-04-01

    Digital image processing is an emerging technology that has developed together with semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the digital image processing technique and its principles in measurement, compared with the traditional optical measurement method. Taking geometric measurement as an example, it discusses the development trend of digital image processing technology from the perspective of technology application.

  3. Digital speech processing for cochlear implants.

    PubMed

    Dillier, N; Bögli, H; Spillmann, T

    1992-01-01

    A rather general basic working hypothesis for cochlear implant research might be formulated as follows. Signal processing for cochlear implants should carefully select a subset of the total information contained in the sound signal and transform these elements into those physical stimulation parameters which can generate distinctive perceptions for the listener. Several new digital processing strategies have thus been implemented on a laboratory cochlear implant speech processor for the Nucleus 22-electrode system. One of the approaches (PES, pitch excited sampler) is based on the maximum peak channel vocoder concept, whereby the spectral energy of a number of frequency bands is transformed into appropriate electrical stimulation parameters for up to 22 electrodes using a voice pitch synchronous pulse rate at any electrode. Another approach (CIS, continuous interleaved sampler) uses a maximally high pitch-independent stimulation pulse rate on a selected number of electrodes. As only one electrode can be stimulated at any instant of time, the rate of stimulation is limited by the required stimulus pulse widths (as determined individually for each subject) and some additional constraints and parameters which have to be optimized and fine tuned by psychophysical measurements. Evaluation experiments with 5 cochlear implant users resulted in significantly improved performance in consonant identification tests with the new processing strategies as compared with the subjects' own wearable speech processors, whereas improvements in vowel identification tasks were rarely observed. The pitch-synchronous coding (PES) resulted in worse performance compared to the coding without explicit pitch extraction (CIS). (ABSTRACT TRUNCATED AT 250 WORDS)
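
    The CIS approach described above delivers pitch-independent pulses so that only one electrode is stimulated at any instant. A hedged sketch of a CIS-style front end is given below: a band-pass filter bank, envelope extraction, and round-robin (interleaved) sampling of the envelopes. The band edges, channel count and pulse spacing are illustrative assumptions, not the parameters of the Nucleus processor.

      # Hedged sketch of a continuous-interleaved-sampler style front end:
      # split the signal into bands, extract each band's envelope, then sample
      # the envelopes in round-robin order so no two channels are stimulated at
      # the same instant.  Band edges and channel count are illustrative.
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def cis_envelopes(speech, fs, band_edges=(300, 700, 1400, 2800, 5000)):
          envelopes = []
          for lo, hi in zip(band_edges[:-1], band_edges[1:]):
              b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
              band = filtfilt(b, a, speech)
              envelopes.append(np.abs(hilbert(band)))       # band envelope
          return np.array(envelopes)                        # (channels, samples)

      def interleaved_pulses(envelopes, samples_per_pulse):
          """One pulse per channel in turn: (channel, sample_index, amplitude)."""
          n_ch, n = envelopes.shape
          return [(i % n_ch, t, envelopes[i % n_ch, t])
                  for i, t in enumerate(range(0, n, samples_per_pulse))]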

  4. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim Farah, Fahim Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  5. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process by involving the data provider community in the development of processes for creating and assigning DOIs, and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows any DOI metadata to be changed, except that the DOI name can only be changed if it has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and...

  6. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  7. Digital-Difference Processing For Collision Avoidance.

    NASA Technical Reports Server (NTRS)

    Shores, Paul; Lichtenberg, Chris; Kobayashi, Herbert S.; Cunningham, Allen R.

    1988-01-01

    Digital system for automotive crash avoidance measures and displays difference in frequency between two sinusoidal input signals of slightly different frequencies. Designed for use with Doppler radars. Characterized as digital mixer coupled to frequency counter measuring difference frequency in mixer output. Technique determines target path mathematically. Used for tracking cars, missiles, bullets, baseballs, and other fast-moving objects.
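
    The system above is characterized as a digital mixer coupled to a frequency counter that measures the difference frequency of two slightly offset sinusoids. A hedged sketch of that signal path follows: multiply the digitized signals, low-pass filter to keep the difference term, and count zero crossings to estimate its frequency. The filter order, cutoff and the example tones are illustrative assumptions.

      # Hedged sketch of the "digital mixer + frequency counter" idea described
      # above: the product of two sinusoids contains sum- and difference-
      # frequency terms; low-pass filtering keeps the difference term, whose
      # frequency is estimated from zero crossings.
      import numpy as np
      from scipy.signal import butter, filtfilt

      def difference_frequency(sig_a, sig_b, fs, cutoff_hz):
          mixed = sig_a * sig_b                              # digital mixer
          b, a = butter(4, cutoff_hz / (fs / 2))             # keep difference term
          diff = filtfilt(b, a, mixed)
          crossings = np.count_nonzero(np.diff(np.signbit(diff)))
          return crossings / (2.0 * (len(diff) / fs))        # two crossings per cycle

      # Example: 10.000 kHz and 10.250 kHz tones give a 250 Hz difference.
      fs = 100_000.0
      t = np.arange(0, 0.5, 1 / fs)
      print(difference_frequency(np.sin(2 * np.pi * 10_000 * t),
                                 np.sin(2 * np.pi * 10_250 * t), fs, cutoff_hz=1_000.0))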

  8. Digital image processing of cephalometric radiographs: a preliminary report.

    PubMed

    Jackson, P H; Dickson, G C; Birnie, D J

    1985-07-01

    The principles of image capture, image storage and image processing in digital radiology are described. The enhancement of radiographic images using digital image processing techniques and its application to cephalometry is discussed. The results of a pilot study which compared some common cephalometric measurements made from manual point identification with those made by direct digitization of digital radiographic images from video monitors are presented. Although in an early stage of development, the results from the image processing system were comparable with those obtained by traditional methods.

  9. A review of some digital image processing in cometary research

    NASA Astrophysics Data System (ADS)

    Larson, S. M.

    The development of electronic digitizers, digital detector arrays and modern high speed computer processing has led to more efficient, quantitative methods of studying the spatial, temporal and photometric properties of cometary phenomena. Digital image processing techniques are being used and further developed to reduce two dimensional data, to enhance the visibility of cometary features, and to quantify spatial and temporal changes. Some of these methods are reviewed, and their merits and limitations are discussed.

  10. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of art creation, or the creation process, has been the subject of much debate and research for at least the last fifty years, even though thinking about art and beauty was already a subject of analysis for the ancient Greeks such as Plato and Aristotle. Even though intuitively it is a simple phenomenon, creativity, or the human ability to generate innovation (new ideas, concepts, etc.), is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. The development process that supports the intellectual act of creation is usually described as a procedure in which the artist experiments with the medium, explores it with one or more techniques, changing shapes, forms and appearances, and, beyond time and space, seeks a way out to a clearing, i.e., envisages a path from intention to realization. Duchamp, in his lecture "The Creative Act", states that the artist is never alone with his/her artwork; there is always the spectator who will later react critically to the work of art. If the artist succeeds in transmitting his/her intentions, in terms of a message, emotion or feeling, to the spectator, then a form of aesthetic osmosis actually takes place through the inert matter (the medium) that enabled this communication or interaction phenomenon to occur. The role of the spectator may become gradually more active by interacting with the artwork itself, possibly changing or becoming a part of it [2][4].

  12. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.
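
    A toy Python sketch (not the USGS code) of two of the operations named above: delay-and-sum of array traces for an assumed slowness, and cross-multiplication of the aligned traces as a simple non-linear detector. The synthetic traces, sensor spacing, velocity and noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n_sensors, n_samples = 100.0, 4, 500        # Hz, sensor count, samples
spacing, velocity = 2.0, 8.0                    # km, km/s (assumed values)

# Synthetic record: a short pulse arriving with a moveout across the array.
signal = np.zeros(n_samples)
signal[200:210] = np.hanning(10)
traces = np.empty((n_sensors, n_samples))
for k in range(n_sensors):
    delay = int(round(k * spacing / velocity * fs))   # samples of moveout
    traces[k] = np.roll(signal, delay) + 0.3 * rng.standard_normal(n_samples)

# Velocity filtering / summation: undo the assumed moveout, then stack.
aligned = np.array([np.roll(traces[k], -int(round(k * spacing / velocity * fs)))
                    for k in range(n_sensors)])
beam_sum = aligned.mean(axis=0)            # linear stack
beam_prod = np.prod(aligned, axis=0)       # non-linear cross-multiplication

print("stack peak sample:", int(np.argmax(np.abs(beam_sum))))
print("product peak sample:", int(np.argmax(np.abs(beam_prod))))
```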

  13. Tolerance-based process proximity correction (PPC) verification methodology

    NASA Astrophysics Data System (ADS)

    Hashimoto, Kohji; Fujise, Hiroharu; Nojima, Shigeki; Ito, Takeshi; Ikeda, Takahiro

    2004-08-01

    Tolerance-based process proximity correction (PPC) verification methodology is proposed for "hot spot management" in LSI fabrication process flow. This methodology verifies the PPC accuracy with the features of actual processed wafers/masks and target features in CAD data including CD tolerance around hot spots. The CD tolerance in CAD data is decided according to device characteristics, process integration, CD budget, and so on, and is used for the judgment criteria of the PPC accuracy. After the verifications, the actions in the manufacturing are decided. This methodology is demonstrated for the 65nm-node CMOS local metal at three representative hot spots extracted by lithography simulation, and the results yielded useful information for the manufacturing.
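
    The judgment step described above amounts to comparing a measured critical dimension (CD) at each hot spot against the CAD target plus or minus its tolerance. The sketch below, with invented hot-spot names and numbers, shows one way such a pass/fail check could look.

```python
# Hypothetical hot-spot records: (name, target CD in nm, tolerance in nm, measured CD in nm).
hot_spots = [
    ("HS1", 90.0, 8.0, 94.5),
    ("HS2", 90.0, 8.0, 99.3),
    ("HS3", 120.0, 10.0, 113.2),
]

for name, target, tol, measured in hot_spots:
    deviation = measured - target
    status = "OK" if abs(deviation) <= tol else "OUT OF TOLERANCE"
    print(f"{name}: measured {measured} nm, target {target}+/-{tol} nm -> {status}")
```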

  14. Neutron coincidence counting with digital signal processing

    NASA Astrophysics Data System (ADS)

    Bagi, Janos; Dechamp, Luc; Dransart, Pascal; Dzbikowicz, Zdzislaw; Dufour, Jean-Luc; Holzleitner, Ludwig; Huszti, Joseph; Looman, Marc; Marin Ferrer, Montserrat; Lambert, Thierry; Peerani, Paolo; Rackham, Jamie; Swinhoe, Martyn; Tobin, Steve; Weber, Anne-Laure; Wilson, Mark

    2009-09-01

    Neutron coincidence counting is a widely adopted nondestructive assay (NDA) technique used in nuclear safeguards to measure the mass of nuclear material in samples. Nowadays, most neutron-counting systems are based on the original shift-register technology, like the (ordinary or multiplicity) Shift-Register Analyser. The analogue signal from the He-3 tubes is processed by an amplifier/single channel analyser (SCA) producing a train of TTL pulses that are fed into an electronic unit that performs the time-correlation analysis. Following the suggestion of the main inspection authorities (IAEA, Euratom and the French Ministry of Industry), several research laboratories have started to study and develop prototypes of neutron-counting systems with PC-based processing. Collaboration in this field among JRC, IRSN and LANL has been established within the framework of the ESARDA-NDA working group. Joint testing campaigns have been performed in the JRC PERLA laboratory, using different equipment provided by the three partners. One area of development is the use of high-speed PCs and pulse acquisition electronics that provide a time stamp (LIST-Mode Acquisition) for every digital pulse. The time stamp data can be processed directly during acquisition or saved on a hard disk. The latter method has the advantage that measurement data can be analysed with different values for parameters like predelay and gate width, without repeating the acquisition. Other useful diagnostic information, such as die-away time and dead time, can also be extracted from this stored data. A second area is the development of "virtual instruments." These devices, in which the pulse-processing system can be embedded in the neutron counter itself and sends counting data to a PC, can give increased data-acquisition speeds. Either or both of these developments could give rise to the next generation of instrumentation for improved practical neutron-correlation measurements. The paper will describe the…
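
    To illustrate the list-mode advantage mentioned above, the sketch below re-analyses one set of stored pulse time stamps with different predelay and gate-width settings, counting, for each trigger pulse, the pulses that fall inside the (predelay, predelay + gate) window. The time stamps are simulated and the gate logic is a simplified stand-in for a real shift-register analyser.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated list-mode data: Poisson-like background plus a few correlated bursts (seconds).
singles = np.sort(rng.uniform(0.0, 1.0, 5000))
bursts = np.concatenate([t + rng.exponential(50e-6, 3)
                         for t in rng.uniform(0.0, 1.0, 300)])
stamps = np.sort(np.concatenate([singles, bursts]))

def gated_counts(stamps, predelay, gate):
    """Total number of pulses falling in (t + predelay, t + predelay + gate]
    summed over every trigger pulse t (simplified coincidence gate)."""
    lo = np.searchsorted(stamps, stamps + predelay, side="right")
    hi = np.searchsorted(stamps, stamps + predelay + gate, side="right")
    return int(np.sum(hi - lo))

# The same stored data analysed with different parameters, no re-acquisition needed.
for predelay, gate in [(4.5e-6, 64e-6), (4.5e-6, 128e-6), (16e-6, 64e-6)]:
    print(f"predelay={predelay*1e6:.1f} us, gate={gate*1e6:.0f} us ->",
          gated_counts(stamps, predelay, gate))
```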

  15. Digital image processing for photo-reconnaissance applications

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1972-01-01

    Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering, and the action of a high frequency filter in the real and Fourier domain are discussed along with color and brightness.
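
    A minimal Python sketch of the kind of frequency-domain operation mentioned above: a high-frequency-emphasis filter applied to an image via the 2-D Fourier transform. The test image, cutoff and boost values are arbitrary and not taken from the paper.

```python
import numpy as np

def high_frequency_emphasis(image, cutoff=0.1, boost=2.0):
    """Leave spatial frequencies below `cutoff` (cycles/pixel) unchanged but
    multiply higher frequencies by `boost`, working in the Fourier domain."""
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    radius = np.hypot(fy, fx)
    transfer = np.where(radius > cutoff, boost, 1.0)   # simple ideal filter
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * transfer))

# Example on a synthetic image: smooth ramp plus fine-grained texture.
y, x = np.mgrid[0:128, 0:128]
img = x / 128.0 + 0.05 * np.sin(2 * np.pi * x / 4.0)
sharpened = high_frequency_emphasis(img)
print(img.std(), sharpened.std())   # the fine texture is amplified
```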

  16. Digital Handling and Processing of Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Algazi, R.

    1972-01-01

    Details of plans for developing a digital image processing facility to handle remote sensing data are reported. Also given are programs for data acquisition and handling, systems for use in programming various digital image processing tasks, and algorithms for feature enhancement.

  17. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  18. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  19. Multiscale image processing and antiscatter grids in digital radiography.

    PubMed

    Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D

    2009-01-01

    Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally, the effects of exposure dose and of using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included: (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid, but whether this increase in quality is clinically significant is unknown.

  20. Multilingual subjective methodology and evaluation of low-rate digital voice processors

    NASA Astrophysics Data System (ADS)

    Dimolitsas, Spiros; Corcoran, Franklin L.; Baraniecki, Marion R.; Phipps, John G., Jr.

    The methodology and results for a multilingual evaluation of source encoding algorithms operating at 16 kbit/s are presented. The evaluation was conducted in three languages (English, French, and Mandarin), using listener opinion subjective assessments to determine whether 'toll-quality' performance is possible at 16 kbit/s. The study demonstrated that toll-quality voice is indeed possible at 16 kbit/s, and that several of the methods evaluated are more robust under high bit error conditions than either 32- or 64-kbit/s encoding. Thus, 16-kbit/s voice coding technology is currently suitable for many applications within the public switched telephone network, including the next generation of digital circuit multiplication equipment and integrated services digital network videotelephony.

  1. Digital radiography image quality: image processing and display.

    PubMed

    Krupinski, Elizabeth A; Williams, Mark B; Andriole, Katherine; Strauss, Keith J; Applegate, Kimberly; Wyatt, Margaret; Bjork, Sandra; Seibert, J Anthony

    2007-06-01

    This article on digital radiography image processing and display is the second of two articles written as part of an intersociety effort to establish image quality standards for digital and computed radiography. The topic of the other paper is digital radiography image acquisition. The articles were developed collaboratively by the ACR, the American Association of Physicists in Medicine, and the Society for Imaging Informatics in Medicine. Increasingly, medical imaging and patient information are being managed using digital data during acquisition, transmission, storage, display, interpretation, and consultation. The management of data during each of these operations may have an impact on the quality of patient care. These articles describe what is known to improve image quality for digital and computed radiography and to make recommendations on optimal acquisition, processing, and display. The practice of digital radiography is a rapidly evolving technology that will require timely revision of any guidelines and standards.

  2. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  3. Lean methodology: supporting battlefield medical fitness by cutting process waste.

    PubMed

    Huggins, Elaine J

    2010-01-01

    Healthcare has long looked at decreasing risk in communication and patient care processes. Increasing the simplicity in communication and patient care process is a newer concept contained in Lean methodology. Lean is a strategy for achieving improvement in performance through the elimination of steps that use resources without contributing to customer value. This is known as cutting waste or nonvalue added steps. This article outlines how the use of Lean improved a key process that supports battlefield medical fitness.

  4. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.
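
    As a rough sketch of the all-digital modulation idea (the modem's actual design is not given in this abstract), the Python fragment below maps bit pairs to QPSK symbols and multiplies them onto digitally generated quadrature carriers; the sample rate, carrier frequency, Gray mapping and bit values are invented for illustration.

```python
import numpy as np

def qpsk_modulate(bits, fs=1_000_000, fc=100_000, sps=100):
    """Map bit pairs to QPSK symbols and modulate them onto a digitally
    generated carrier (all processing done on sampled data)."""
    bits = np.asarray(bits).reshape(-1, 2)
    # Gray-style mapping: first bit selects the I sign, second bit the Q sign.
    i = np.where(bits[:, 0] == 0, 1.0, -1.0) / np.sqrt(2)
    q = np.where(bits[:, 1] == 0, 1.0, -1.0) / np.sqrt(2)
    i_wave = np.repeat(i, sps)                 # rectangular pulse shaping
    q_wave = np.repeat(q, sps)
    t = np.arange(i_wave.size) / fs
    return i_wave * np.cos(2 * np.pi * fc * t) - q_wave * np.sin(2 * np.pi * fc * t)

tx = qpsk_modulate([0, 0, 0, 1, 1, 0, 1, 1])
print(tx.shape)   # 4 symbols x 100 samples per symbol
```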

  5. A methodology for high resolution digital image correlation in high temperature experiments

    NASA Astrophysics Data System (ADS)

    Blaber, Justin; Adair, Benjamin S.; Antoniou, Antonia

    2015-03-01

    We propose a methodology for performing high resolution Digital Image Correlation (DIC) analysis during high-temperature mechanical tests. Specifically, we describe a technique for producing a stable, high-quality pattern on metal surfaces along with a simple optical system that uses a visible-range camera and a long-range microscope. The results are analyzed with a high-quality open-source DIC software developed by us. Using the proposed technique, we successfully acquired high-resolution strain maps of the crack tip field in a nickel superalloy sample at 1000 °C.
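
    The core of most DIC codes is subset matching by normalized cross-correlation. The short Python sketch below, which is independent of the authors' open-source software, finds the integer-pixel displacement of one subset between a reference and a deformed image using a synthetic speckle pattern; subset size and search range are arbitrary choices.

```python
import numpy as np

def match_subset(ref, deformed, center, half=15, search=10):
    """Locate a reference subset in the deformed image by maximising the
    zero-normalised cross-correlation over integer-pixel shifts."""
    cy, cx = center
    f = ref[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    f = (f - f.mean()) / f.std()
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            g = deformed[cy + dy - half:cy + dy + half + 1,
                         cx + dx - half:cx + dx + half + 1].astype(float)
            g = (g - g.mean()) / g.std()
            score = np.mean(f * g)
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

# Synthetic speckle pattern shifted by a known amount.
rng = np.random.default_rng(2)
speckle = rng.random((200, 200))
shifted = np.roll(speckle, shift=(3, -5), axis=(0, 1))
print(match_subset(speckle, shifted, center=(100, 100)))   # expect (3, -5)
```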

  6. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e., the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest a variety of processes to guide them, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that use the concept of DST.

  7. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consists of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
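
    A highly simplified Python sketch of two of the operations named in the patent abstract, digital carrier down-conversion and code correlation, applied to a synthetic signal. The code length, chip rate and carrier values are toy numbers, not GPS parameters, and the search over delays stands in for the patent's feedback tracking loops.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, fc = 200_000.0, 25_000.0          # sample rate and carrier (toy values)
chip_rate, n_chips = 1_000.0, 64

# Spreading code and transmitted signal: code chips BPSK-modulated on a carrier.
code = rng.choice([-1.0, 1.0], n_chips)
t = np.arange(int(fs * n_chips / chip_rate)) / fs
chips = code[(t * chip_rate).astype(int)]
true_delay = 37                        # samples of unknown code delay
rx = np.roll(chips, true_delay) * np.cos(2 * np.pi * fc * t)
rx += 0.5 * rng.standard_normal(rx.size)

# Digital carrier down-conversion to baseband (I/Q mixing).
baseband = rx * np.exp(-2j * np.pi * fc * t)

# Code correlation: slide a local code replica and pick the peak.
power = [np.abs(np.sum(baseband * np.roll(chips, d))) for d in range(100)]
print("estimated code delay:", int(np.argmax(power)))   # expect ~37
```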

  8. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  9. Digital computer processing of X-ray photos

    NASA Technical Reports Server (NTRS)

    Nathan, R.; Selzer, R. H.

    1967-01-01

    Digital computers correct various distortions in medical and biological photographs. One of the principal methods of computer enhancement involves the use of a two-dimensional digital filter to modify the frequency spectrum of the picture. Another computer processing method is image subtraction.

  10. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  11. A Phenomenological Study of an Emergent National Digital Library, Part I: Theory and Methodological Framework

    ERIC Educational Resources Information Center

    Dalbello, Marija

    2005-01-01

    The activities surrounding the National Digital Library Program (NDLP) at the Library of Congress (1995-2000) are used to study institutional processes associated with technological innovation in the library context. The study identified modalities of successful innovation and the characteristics of creative decision making. Theories of social…

  13. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...

  15. The AP Course Audit Syllabus Review Process: Methodological Explanation

    ERIC Educational Resources Information Center

    Conley, David T.

    2007-01-01

    The AP Course Audit utilizes a criterion-based professional judgment method of analysis within a nested multi-step review process. The overall goal of the methodology is to yield a final judgment on each syllabus that is ultimately valid. While reviewer consistency is an important consideration, the most important goal is to reach a final judgment…

  16. The Technology Transfer Process: Concepts, Framework and Methodology.

    ERIC Educational Resources Information Center

    Jolly, James A.

    This paper discusses the conceptual framework and methodology of the technology transfer process and develops a model of the transfer mechanism. This model is then transformed into a predictive model of technology transfer incorporating nine factors that contribute to the movement of knowledge from source to user. Each of these factors is examined…

  17. Application of concept selection methodology in IC process design

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Kul

    1993-01-01

    A search for an effective methodology practical for IC manufacturing process development led to a trial of the quantitative 'concept selection' methodology in selecting the 'best' alternative for interlevel dielectric (ILD) processes. A cross-functional team selected multiple criteria with scoring guidelines to be used in the definition of the 'best'. The project targeted the 3-level-metal backend process for a sub-micron gate array product. The outcome of the project showed that the maturity of the alternatives has a strong influence on the scores, because scores on the adopted criteria such as yield, reliability and maturity will depend on the maturity of a particular process. At the same time, the project took longer than expected since it required data for the multiple criteria. These observations suggest that adopting a simpler procedure that can analyze the total inherent controllability of a process would be more effective. The methodology of the DFS (design for simplicity) tools used in analyzing the manufacturability of electronics products such as computers, phones and other consumer electronics could be used as an 'analogy' in constructing an evaluation method for IC processes that produce the devices used in those electronics products. This could be done by focusing on the basic process operation elements rather than the layers that are being built.
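
    The quantitative concept-selection step described above is essentially a weighted scoring matrix. The sketch below, with invented criteria, weights and scores, shows how alternative ILD process options might be ranked.

```python
# Hypothetical criteria weights and 0-10 scores for three ILD process options.
criteria = {"yield": 0.4, "reliability": 0.3, "maturity": 0.2, "cost": 0.1}
scores = {
    "ILD option A": {"yield": 7, "reliability": 8, "maturity": 9, "cost": 5},
    "ILD option B": {"yield": 8, "reliability": 7, "maturity": 5, "cost": 7},
    "ILD option C": {"yield": 6, "reliability": 6, "maturity": 8, "cost": 9},
}

totals = {name: sum(criteria[c] * s[c] for c in criteria) for name, s in scores.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: weighted score {total:.2f}")
```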

  18. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  19. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop the required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digitally processing noiselike microwave radiometer signals.

  20. The teaching of computer programming and digital image processing in radiography.

    PubMed

    Allan, G L; Zylinski, J

    1998-06-01

    The increased use of digital processing techniques in Medical Radiations imaging modalities, along with the rapid advance in information technology, has resulted in a significant change in the delivery of radiographic teaching programs. This paper details a methodology used to concurrently educate radiographers in both computer programming and image processing. The students learn to program in Visual Basic for Applications (VBA), and the programming skills are contextualised by requiring the students to write a digital subtraction angiography (DSA) package. Program code generation and the image presentation interface are handled within the Microsoft Excel spreadsheet. The user-friendly nature of this common interface enables all students to readily begin program creation. The teaching of programming and image processing skills by this method may be readily generalised to other vocational fields where digital image manipulation is a professional requirement.

  1. Working memory and two-digit number processing.

    PubMed

    Macizo, Pedro; Herrera, Amparo

    2011-11-01

    The processing of two-digit numbers in comparison tasks involves the activation and manipulation of magnitude information to decide which number is larger. The present study explored the role of different working memory (WM) components and skills in the processing of two-digit numbers by examining the unit-decade compatibility effect with Arabic digits and number words. In the study, the unit-decade compatibility effect and different WM components were evaluated. The results indicated that the unit-decade compatibility effect was associated with specific WM skills depending on the number format (Arabic digits and number words). We discuss the implications of these results for the decomposed view of two-digit numbers.

  2. The New Digital Engineering Design and Graphics Process.

    ERIC Educational Resources Information Center

    Barr, R. E.; Krueger, T. J.; Aanstoos, T. A.

    2002-01-01

    Summarizes the digital engineering design process using software widely available for the educational setting. Points out that newer technology used in the field is not used in engineering graphics education. (DDR)

  3. Application of digital image processing techniques to astronomical imagery 1977

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.

    1978-01-01

    Nine specific techniques or combinations of techniques developed for applying digital image processing technology to existing astronomical imagery are described. Photoproducts are included to illustrate the results of each of these investigations.

  4. Effective DQE (eDQE) and speed of digital radiographic systems: An experimental methodology

    PubMed Central

    Samei, Ehsan; Ranger, Nicole T.; MacKenzie, Alistair; Honey, Ian D.; Dobbins, James T.; Ravin, Carl E.

    2009-01-01

    Prior studies on performance evaluation of digital radiographic systems have primarily focused on the assessment of the detector performance alone. However, the clinical performance of such systems is also substantially impacted by magnification, focal spot blur, the presence of scattered radiation, and the presence of an antiscatter grid. The purpose of this study is to evaluate an experimental methodology to assess the performance of a digital radiographic system, including those attributes, and to propose a new metric, effective detective quantum efficiency (eDQE), a candidate for defining the efficiency or speed of digital radiographic imaging systems. The study employed a geometric phantom simulating the attenuation and scatter properties of the adult human thorax and a representative indirect flat-panel-based clinical digital radiographic imaging system. The noise power spectrum (NPS) was derived from images of the phantom acquired at three exposure levels spanning the operating range of the clinical system. The modulation transfer function (MTF) was measured using an edge device positioned at the surface of the phantom, facing the x-ray source. Scatter measurements were made using a beam stop technique. The eDQE was then computed from these measurements, along with measures of phantom attenuation and x-ray flux. The MTF results showed notable impact from the focal spot blur, while the NPS depicted a large component of structured noise resulting from use of an antiscatter grid. The eDQE was found to be an order of magnitude lower than the conventional DQE. At 120 kVp, eDQE(0) was in the 8%–9% range, fivefold lower than DQE(0) at the same technique. The eDQE method yielded reproducible estimates of the system performance in a clinically relevant context by quantifying the inherent speed of the system, that is, the actual signal to noise ratio that would be measured under clinical operating conditions. PMID:19746814

  5. Cell-based top-down design methodology for RSFQ digital circuits

    NASA Astrophysics Data System (ADS)

    Yoshikawa, N.; Koshiyama, J.; Motoori, K.; Matsuzaki, F.; Yoda, K.

    2001-08-01

    We propose a cell-based top-down design methodology for rapid single flux quantum (RSFQ) digital circuits. Our design methodology employs a binary decision diagram (BDD), which is currently used for the design of CMOS pass-transistor logic circuits. The main features of the BDD RSFQ circuits are the limited primitive number, dual rail nature, non-clocking architecture, and small gate count. We have made a standard BDD RSFQ cell library and prepared a top-down design CAD environment, by which we can perform logic synthesis, logic simulation, circuit simulation and layout view extraction. In order to clarify problems expected in large-scale RSFQ circuits design, we have designed a small RSFQ microprocessor based on simple architecture using our top-down design methodology. We have estimated its system performance and compared it with that of the CMOS microprocessor with the same architecture. It was found that the RSFQ system is superior in terms of the operating speed though it requires extremely large chip area.

  6. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
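
    One of the outputs listed above, the Allan deviation, has a standard definition based on second differences of phase samples. The Python sketch below computes an overlapping Allan deviation on synthetic phase data; the noise level, sample spacing and averaging times are invented, and the analyzer's own implementation is not reproduced here.

```python
import numpy as np

def allan_deviation(phase, tau0, m=1):
    """Overlapping Allan deviation at averaging time m*tau0, computed from
    phase samples (in seconds) spaced tau0 seconds apart."""
    x = np.asarray(phase, dtype=float)
    tau = m * tau0
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]          # second differences
    avar = np.mean(d2 ** 2) / (2.0 * tau ** 2)
    return np.sqrt(avar)

# Synthetic phase record: white frequency noise integrated into phase.
rng = np.random.default_rng(4)
tau0 = 1.0                                   # seconds between samples
freq_noise = 1e-11 * rng.standard_normal(10_000)
phase = np.cumsum(freq_noise) * tau0
for m in (1, 10, 100):
    print(f"tau = {m * tau0:5.0f} s  ADEV = {allan_deviation(phase, tau0, m):.2e}")
```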

  7. Agricultural inventory capabilities of machine processed LANDSAT digital data

    NASA Technical Reports Server (NTRS)

    Dietrick, D. L.; Fries, R. E.; Egbert, D. D.

    1975-01-01

    Agricultural crop identification and acreage determination analysis of LANDSAT digital data was performed for two study areas. A multispectral image processing and analysis system was utilized to perform the manmachine interactive analysis. The developed techniques yielded crop acreage estimate results with accuracy greater than 90% and as high as 99%. These results are encouraging evidence of agricultural inventory capabilities of machine processed LANDSAT digital data.

  8. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
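
    The suggested use of Newton-Raphson to run a subroutine's equations "in reverse" can be illustrated generically: given a routine that computes y = f(x), the sketch below solves for the input x that produces a desired output. The function used here is an arbitrary placeholder, not one of the flight-control subroutines.

```python
def newton_raphson_inverse(f, y_target, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Solve f(x) = y_target for x with Newton-Raphson, using a numerical
    derivative so that f can be any scalar 'subroutine'."""
    x = x0
    for _ in range(max_iter):
        residual = f(x) - y_target
        if abs(residual) < tol:
            return x
        slope = (f(x + h) - f(x - h)) / (2.0 * h)    # central difference
        x -= residual / slope
    return x

# Placeholder function standing in for a control-law computation.
f = lambda x: x ** 3 + 2.0 * x - 5.0
x = newton_raphson_inverse(f, y_target=10.0, x0=1.0)
print(x, f(x))   # converges in a few iterations, as the paper notes
```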

  9. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Astrophysics Data System (ADS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-06-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.

  10. Categorical digital soil database for the Carpathian-basin using the modified e-SOTER methodology

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Vadnai, Péter; Micheli, Erika; Pasztor, Laszlo

    2015-04-01

    Harmonized, spatially and thematically consistent, high resolution soil data covering larger regions and several countries is needed to model different environmental and socio-economic scenarios for several applications. The only way to produce such data with large spatial coverage and high resolution is to make use of the available high resolution digital data sources and digital soil mapping tools in the development process. Digital soil mapping has become a very efficient tool in soil science, and several applications have been published on this topic recently. Many of these applications use environmental covariates such as remotely sensed images and digital elevation models, which are raster based data sources with block support. The majority of soil data users require data in raster format with values of certain properties, like pH, clay content or soil organic matter content. However, the use of these soil properties is often limited; an adequate interpretation of these numbers requires knowledge of the soil system and its major processes and process associations. This soil system description can best be done using the existing knowledge of soil science expressed in soil classification: the diagnostic features, materials and horizons as important descriptive information, and the classification categories. The most commonly used and internationally accepted classification system is the World Reference Base for Soil Resources, the so-called WRB. Each soil classification category represents a complex association of processes and properties, which is difficult to use, understand and map because of the complex information behind the category names. The major advantage of diagnostics based classification systems, like WRB, is that the complex soil categories and classes can be interpreted as unique combinations of the diagnostic features. Therefore each class can be disaggregated into several diagnostics, where each carries independent useful information…

  11. A pollution reduction methodology for chemical process simulators

    SciTech Connect

    Mallick, S.K.; Cabezas, H.; Bare, J.C.; Sikdar, S.K.

    1996-11-01

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has been modified by weighing the mass flowrate of each pollutant by its potential environmental impact score. This converts the mass balance into an environmental impact balance. This balance defines an impact index with units of environmental impact per mass of products. The impact index measures the potential environmental effects of process wastes. Three different schemes for chemical ranking were considered: (1) no ranking, (2) simple ranking from 0 to 3, and (3) ranking by a scientifically derived measure of human health and environmental effects. Use of the methodology is illustrated with two examples from the production of (1) methyl ethyl ketone and (2) synthetic ammonia.
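
    A small numerical sketch of the indices described above, using invented stream data: each pollutant's mass flow is weighted by a relative impact score, summed, and divided by the product mass flow to give an impact per unit of product. The flows, scores and product rate are placeholders, not values from the paper.

```python
# Hypothetical pollutant flows (kg/h) and relative impact scores (scheme 2: 0-3 ranking).
pollutant_flows = {"toluene": 12.0, "acetone": 5.0, "ammonia": 2.5}
impact_scores = {"toluene": 2, "acetone": 1, "ammonia": 3}
product_flow = 850.0                     # kg/h of product leaving the process

mass_index = sum(pollutant_flows.values()) / product_flow
impact_index = sum(pollutant_flows[p] * impact_scores[p]
                   for p in pollutant_flows) / product_flow

print(f"pollution index: {mass_index:.4f} kg pollutant / kg product")
print(f"impact index:    {impact_index:.4f} impact units / kg product")
```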

  12. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2012-01-31

    …a sensitivity of 1 gives an attenuation of -12 dB and a sensitivity of 2 gives an attenuation of 0 dB. The signal is digitized using a sigma-delta ADC… Matlab code that called the API C functions was used. The mean of the I data and the mean of the Q data were computed, and the amplitude was calculated… software-defined radio (SDR) modules. Clarkson has investigated several other SDR approaches, which include two open source SDR platforms and one…

  13. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults may be subjected to clinical exams such as a serum bilirubin test, which can be traumatic for patients. Jaundice is often present in liver disease such as hepatitis or liver cancer. In order to avoid additional trauma we propose to detect jaundice (icterus) in newborns or adults by using a pain-free method. By acquiring digital color images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as the parameters to characterize patients with or without jaundice, and we correlate these parameters with the level of bilirubin. By applying a support vector machine we distinguish between healthy and sick patients.
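
    A minimal sketch of the classification step, assuming mean RGB values extracted from skin-region images and scikit-learn's SVM. The feature vectors and labels below are synthetic, and the real study also used diffuse reflectance spectra and bilirubin correlation, which are not modeled here.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)

# Synthetic mean-RGB features for skin patches: jaundiced skin shifted toward yellow
# (relatively higher R and G, lower B). Labels: 1 = jaundice, 0 = healthy.
healthy = rng.normal(loc=[180, 140, 120], scale=10, size=(40, 3))
icteric = rng.normal(loc=[190, 160, 95], scale=10, size=(40, 3))
X = np.vstack([healthy, icteric])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Classify a new patch from its mean RGB values.
print(clf.predict([[188, 158, 100]]))   # likely flagged as jaundice
```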

  14. Applications of digital image processing IX

    SciTech Connect

    Tescher, A.G.

    1986-01-01

    This book contains the proceedings of SPIE - The International Society for Optical Engineering. The first session covers image compression and includes papers such as ''Knowledge-based image bandwidth compression.'' Session two is about instrumentation such as ''Real-time inspection of currency'' and ''Experimental digital image processor.'' Session three discusses theoretical concepts such as ''Study of texture segmentation.'' Session four is about algorithms. One such topic is ''Dynamic ordered dither algorithm.'' Session five covers registration and modeling. For example, one paper is ''3D-motion estimation from projections.'' Session six is about restoration and enhancement. Papers include ''Wobble error correction for laser scanners'' and ''Robotics with computer Vision.''

  15. Why optics students should take digital signal processing courses and why digital signal processing students should take optics courses

    NASA Astrophysics Data System (ADS)

    Cathey, W. Thomas, Jr.

    2000-06-01

    This paper is based on the claim that future major contributions in the field of imaging systems will be made by those who have a background in both optics and digital signal processing. As the introduction of Fourier transforms and linear systems theory to optics had a major impact on the design of hybrid optical/digital imaging systems, the introduction of digital signal processing into optics programs will have a major impact. Examples are given of new hybrid imaging systems that have unique performance. By jointly designing the optics and the signal processing in a digital camera, a new paradigm arises where aberration balancing takes into consideration not only the number of surfaces and indices of refraction, but also the processing capability.

  16. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    PubMed

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  17. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging on telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archiving and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 different methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations of the filmless images found them to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images as well as providing the means to produce high-quality radiographic images in a digital environment.

  18. Digital Historic Urban Landscape Methodology for Heritage Impact Assessment of Singapore

    NASA Astrophysics Data System (ADS)

    Widodo, J.; Wong, Y. C.; Ismail, F.

    2017-08-01

    Using the case study of Singapore's existing heritage websites, this research probes the circumstances of the emerging technology and practice of consuming heritage architecture on a digital platform. Despite the diverse objectives, technology is assumed to help deliver greater interpretation through the use of new and high technology emphasising experience and providing visual fidelity. However, its success is limited, as technology alone is insufficient to present the past from multiple perspectives. Currently, existing projects provide linear narratives developed through a top-down approach that treats end-users as individual entities and limits heritage to a consumable product. Through this research, we hope to uncover a better experience of digital heritage architecture in which interpretation is an evolving 'process' that is participatory and contributory, allowing public participation, together with effective presentation, cultural learning and embodiment, to enhance the end-users' interpretation of digital heritage architecture. Additionally, this research seeks to establish an inventory in the form of a digital platform that adapts the Historic Urban Landscape (HUL) approach to the Singapore context, in order to deepen the public's understanding of architectural as well as cultural heritage through an intercultural and intergenerational dialogue. Through HUL, this research hopes to better shape conservation strategies and urban planning.

  19. Process Waste Assessment for the Plotting and Digitizing Support Laboratory

    SciTech Connect

    Phillips, N.M.

    1994-04-01

    This Process Waste Assessment was conducted to evaluate the Plotting and Digitizing Support Laboratory, located in Building 913, Room 157. It documents the processes, identifies the hazardous chemical waste streams generated by these processes, recommends possible ways to minimize waste, and serves as a reference for future assessments of this facility.

  20. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. Building on a brief description of the theoretical basic concepts of the digital factory, the communicative features within the digital factory are illustrated. Practical interrelations of interpersonal communication from a human oriented view were analyzed at Volkswagen AG in Wolfsburg in a pilot project. A modeling method was developed within the process analysis. This method makes it possible to visualize interpersonal communication and its human oriented attributes in a technically focused workflow. Based on the results of an inquiry developed for communication analysis and on the process models of the modeling methods, it was possible to structure the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  1. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for the measurement of exponentially-decaying signals, such as those found in fluorescence, lifetime-based, fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive-detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decay (>500 μs) and 0.1% for fast decay.
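
    The quantity being tracked is the time constant of an exponentially decaying signal. As a simple stand-in for the phase-sensitive scheme described above (whose exact algorithm is not reproduced here), the Python sketch below estimates the lifetime by a least-squares line fit to the logarithm of synthetic decay samples; the lifetime, sample rate and noise level are invented.

```python
import numpy as np

def estimate_lifetime(signal, fs):
    """Estimate the time constant of an exponentially decaying signal
    by a least-squares line fit to log(signal)."""
    t = np.arange(signal.size) / fs
    slope, _ = np.polyfit(t, np.log(signal), 1)
    return -1.0 / slope

# Synthetic fluorescence decay: tau = 600 microseconds, sampled at 1 MHz.
fs, tau = 1e6, 600e-6
t = np.arange(0, 5 * tau, 1 / fs)
decay = np.exp(-t / tau) * (1.0 + 0.001 * np.random.default_rng(6).standard_normal(t.size))
print(f"estimated lifetime: {estimate_lifetime(decay, fs) * 1e6:.1f} us")
```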

  2. Digital image processing of earth observation sensor data

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1976-01-01

    This paper describes digital image processing techniques that were developed to precisely correct Landsat multispectral earth observation data and gives illustrations of the results achieved, e.g., geometric corrections with an error of less than one picture element, a relative error of one-fourth picture element, and no radiometric error effect. Techniques for enhancing the sensor data, digitally mosaicking multiple scenes, and extracting information are also illustrated.

  3. A color image processing pipeline for digital microscope

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Liu, Peng; Zhuang, Zhefeng; Chen, Enguo; Yu, Feihong

    2012-10-01

    Digital microscopes have found wide application in the fields of biology, medicine et al. A digital microscope differs from a traditional optical microscope in that there is no need to observe the sample through an eyepiece directly, because the optical image is projected directly onto the CCD/CMOS camera. However, because of the imaging difference between the human eye and the sensor, a color image processing pipeline is needed for the digital microscope electronic eyepiece to obtain a fine image. The color image pipeline for the digital microscope, including the procedures that convert the RAW image data captured by the sensor into a real color image, is of great concern to the quality of the microscopic image. The color pipeline for a digital microscope differs from that of digital still cameras and video cameras because of the specific requirements of microscopic images, which should have a high dynamic range, keep the same color as the objects observed, and support a variety of image post-processing. In this paper, a new color image processing pipeline is proposed to satisfy the requirements of digital microscope images. The algorithm of each step in the color image processing pipeline is designed and optimized with the purpose of getting a high quality image and accommodating diverse user preferences. With the proposed pipeline implemented on the digital microscope platform, the output color images meet the various analysis requirements of images in the medicine and biology fields very well. The major steps of the proposed color imaging pipeline include: black level adjustment, defect pixel removal, noise reduction, linearization, white balance, RGB color correction, tone scale correction and gamma correction.
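
    A compressed Python sketch of several of the steps listed at the end of the abstract (black level, white balance, color correction and gamma), applied to a fake RAW frame. The bit depth, gains and color-correction matrix are placeholder values, not the paper's tuned pipeline.

```python
import numpy as np

def simple_pipeline(raw, black_level=64, wb_gains=(1.8, 1.0, 1.4),
                    ccm=None, gamma=2.2):
    """Toy RGB pipeline: black-level subtraction, white balance,
    3x3 color correction, then gamma encoding. Input raw is HxWx3, 10-bit."""
    if ccm is None:
        ccm = np.array([[1.5, -0.3, -0.2],
                        [-0.2, 1.4, -0.2],
                        [-0.1, -0.4, 1.5]])
    img = np.clip(raw.astype(float) - black_level, 0, None) / (1023 - black_level)
    img *= np.asarray(wb_gains)                 # per-channel white balance
    img = np.clip(img @ ccm.T, 0.0, 1.0)        # color correction matrix
    return img ** (1.0 / gamma)                 # gamma encoding for display

raw = np.random.default_rng(7).integers(64, 1024, size=(4, 4, 3))
print(simple_pipeline(raw).shape)
```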

  4. Defect reduction methodologies for damascene interconnect process development

    NASA Astrophysics Data System (ADS)

    Skumanich, Andrew; Cai, Man-Ping

    1999-08-01

    A critical aspect of interconnect process development is identifying and eliminating yield-impacting defects. A methodology is described which has been implemented at Applied Materials to utilize wafer metrology tools to drive process development for advanced interconnect fabrication. The methodology is based on a patterned wafer inspection tool, the WF736Duo, combined with a high throughput defect-review SEM with automatic defect classification, the SEMVision. This combination of tools facilitates defect sourcing and elimination. The requirements for defect reduction are increased since defects can result both from individual levels and from the interaction between levels. A full-flow Cu damascene interconnect process is examined from oxide deposition to final electrical test to establish inspection strategies for defect reduction. The inspection points for optimal defect reduction are identified based on e-test determination of yield limiting defects. The WF736 was utilized to capture a wide range of defects at the various processing steps. The progression of the defects is tracked to the final e-test point. This tracking both establishes the key defect types and facilitates defect sourcing. Further, the unique ability of the WF736 to segregate defects during the inspection with no loss in throughput, along with the SEMVision ADC analysis, allowed for faster defect sourcing.

  5. Process optimization of mechano-electrospinning by response surface methodology.

    PubMed

    Bu, Ningbin; Huang, YongAn; Duan, Yongqing; Yin, Zhouping

    2014-05-01

    In this paper, mechano-electrospinning (MES) is presented to write the polyvinylidene fluoride (PVDF) solution into fibers directly, and the effects of the process parameters on the fiber are investigated experimentally based on response surface methodology. Different fiber widths are obtained by adjusting the individual process parameters (velocity of the substrate, applied voltage and nozzle-to-substrate distance). Considering the continuous jet and stable Taylor cone, the operating field is selected for investigating the complicated relationship between the process parameters and the width of the fiber using response surface methodology. The experimental results show that the predicted width of the fiber is in good agreement with the actual width of the fiber. Based on the analysis of the importance of the terms in the equation, a simple model can be used to predict the width of the fiber. With this model, a large number of calibration experiments can be avoided. Additionally, a principle for the selection of the process parameters is presented through parameter optimization, which gives a guideline for obtaining the desired fiber in experiments.
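
    Response surface methodology typically fits a second-order polynomial relating the process parameters to the response (here, fiber width). The sketch below fits such a model by least squares on made-up design points, not the paper's measurements; the parameter ranges and widths are invented.

```python
import numpy as np

# Made-up design points: substrate velocity (mm/s), voltage (kV), distance (mm) -> width (um).
X = np.array([[10, 1.0, 2.0], [20, 1.0, 2.0], [10, 2.0, 2.0], [20, 2.0, 2.0],
              [10, 1.0, 4.0], [20, 1.0, 4.0], [10, 2.0, 4.0], [20, 2.0, 4.0],
              [15, 1.5, 3.0], [15, 1.5, 3.0], [12, 1.8, 3.5]])
width = np.array([12.0, 8.5, 15.0, 10.2, 11.1, 7.9, 14.2, 9.6, 10.8, 11.0, 13.0])

def quadratic_design(X):
    """Model columns: 1, x_i, x_i^2, and pairwise interactions x_i*x_j."""
    v, u, d = X.T
    return np.column_stack([np.ones(len(X)), v, u, d,
                            v**2, u**2, d**2, v*u, v*d, u*d])

coef, *_ = np.linalg.lstsq(quadratic_design(X), width, rcond=None)

# Predict the fiber width for a new parameter combination.
new = np.array([[18, 1.2, 2.5]])
print(quadratic_design(new) @ coef)
```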

  6. [Fundamental bases of digital information processing in nuclear cardiology (III)].

    PubMed

    Cuarón, A; González, C; García Moreira, C

    1984-01-01

    This article describes the transformation of gamma-camera images into digital form. The incidence of a gamma photon on the detector produces two voltage pulses, which are proportional to the coordinates of the point of incidence, and a digital pulse indicating the occurrence of the event. The coordinate pulses pass through an analog-to-digital converter that is activated by this pulse. The result is the appearance of a digital number at the output of the converter, which is proportional to the voltage at its input. This number is stored in the acquisition memory of the system, either in list mode or in matrix mode. Static images can be stored in a single matrix. Dynamic data can be stored in a series of matrices, each representing a different acquisition period. It is also possible to capture information in a series of matrices synchronized with the electrocardiogram of the patient. In this instance, each matrix represents a distinct period of the cardiac cycle. Data stored in the memory can be used to process and display images and quantitative histograms on a video screen. In order to do that, it is necessary to translate the digital data in memory to voltage levels, and to transform these into light levels on the screen. This is achieved through a digital-to-analog converter. The reading of the digital memory must be synchronous with the electronic scanning of the video screen.
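
    The gated matrix-mode acquisition described above can be sketched in a few lines: list-mode events (x, y, time) are binned into one counts matrix per cardiac-cycle phase, using R-wave timestamps as gates. The array shapes, event format, and number of phases are illustrative assumptions.

```python
import numpy as np

def gated_frames(events, r_peaks, n_phases=8, size=64):
    """Bin list-mode events (x, y, t) into ECG-gated count matrices.

    events  : (N, 3) array of x index, y index, time stamp (s)
    r_peaks : sorted array of R-wave times (s) bounding each cardiac cycle
    Returns an array of shape (n_phases, size, size).
    """
    frames = np.zeros((n_phases, size, size), dtype=np.int64)
    for x, y, t in events:
        # Find the cardiac cycle this event falls in.
        i = np.searchsorted(r_peaks, t) - 1
        if i < 0 or i + 1 >= len(r_peaks):
            continue  # event outside any complete cycle
        cycle_len = r_peaks[i + 1] - r_peaks[i]
        phase = int(n_phases * (t - r_peaks[i]) / cycle_len)
        frames[min(phase, n_phases - 1), int(y), int(x)] += 1
    return frames
```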

  7. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan (Livermore, CA); Tran, Binh-Nien (San Ramon, CA)

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low-cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up tables (LUTs). The high-speed input data stream is processed in parallel using the corresponding LUTs, which reduces the required processing clock rate and allows the use of low-cost FPGAs.
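
    A software sketch of the LUT idea is shown below: the modulated carrier samples for every possible symbol and sample offset are pre-computed once, so the per-sample work at run time is a table lookup instead of multiplications. The symbol rate, IF frequency, constellation, and table sizes are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Illustrative parameters (not from the patent).
fs = 40e6          # sample rate (Hz)
f_if = 10e6        # intermediate frequency (Hz)
sps = 4            # samples per symbol; fs / sps is the symbol rate

# 16-QAM constellation: I and Q levels in {-3, -1, 1, 3}, normalized.
levels = np.array([-3, -1, 1, 3], dtype=float)
const = np.array([complex(i, q) for i in levels for q in levels]) / 3.0

# Pre-compute the LUT: for each symbol and each sample offset within the
# symbol, store the already-modulated IF sample  I*cos - Q*sin.
# Because f_if * sps / fs is an integer, the carrier phase repeats every
# symbol, so one symbol period of carrier is enough.
n = np.arange(sps)
phase = 2 * np.pi * f_if * n / fs
lut = (const.real[:, None] * np.cos(phase)[None, :]
       - const.imag[:, None] * np.sin(phase)[None, :])   # shape (16, sps)

def modulate(symbols):
    """Map a stream of 4-bit symbols to IF samples by pure table lookup."""
    return lut[symbols].ravel()

# Example: modulate a short random symbol stream.
rng = np.random.default_rng(0)
iq_if = modulate(rng.integers(0, 16, size=8))
print(iq_if[:8])
```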

  8. Dry Machining Process of Milling Machine using Axiomatic Green Methodology

    NASA Astrophysics Data System (ADS)

    Puspita Andriani, Gita; Akbar, Muhammad; Irianto, Dradjad

    2016-02-01

    Most companies know that there are strategies for becoming a green industry, and they realize that green efforts have impacts on product quality and cost. The Axiomatic Green Methodology models the relationship between greenness, quality, and cost. This methodology starts with determining the green improvement objective and then continues with mapping the functional, economic, and green requirements. From the mapping, variables which affect the requirements are identified. Afterwards, the effect of each variable is determined by performing experiments and regression modelling. In this research, the axiomatic green methodology was applied to dry machining on a milling machine in order to reduce the amount of coolant. Dry machining is feasible if it does not fall below the minimum required quality. As a result, dry machining was found to be feasible without producing any defect. The proposed machining parameters are to reduce the coolant flow rate from 6.882 ml/minute to 0 ml/minute, set the depth of cut at 1.2 mm, the spindle rotation speed at 500 rpm, and the feed rate at 128 mm/minute. This solution also results in a cost reduction of 200.48 rupiahs for each process.

  9. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  10. Restoration Of Faded Color Photographs By Digital Image Processing

    NASA Astrophysics Data System (ADS)

    Gschwind, Rudolf

    1989-10-01

    Color photographs possess poor stability towards light, chemicals, heat, and humidity. As a consequence, the colors of photographs deteriorate with time. Because of the complexity of the processes that cause the dyes to fade, it is impossible to restore the images by chemical means. Restoration of faded color films is therefore attempted by means of digital image processing.
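
    One common digital starting point for fade correction is to re-balance and re-stretch each color channel separately; the sketch below shows such a per-channel rescaling with percentile clipping. It is a generic illustration of this class of correction, not the specific restoration method of the paper.

```python
import numpy as np

def rebalance_faded(img, low_pct=1.0, high_pct=99.0):
    """Per-channel percentile stretch for a faded RGB image in [0, 1].

    Each channel is independently remapped so that its low/high percentiles
    land on 0 and 1, compensating for unequal dye fading between channels.
    """
    out = np.empty_like(img, dtype=np.float64)
    for c in range(img.shape[2]):
        lo, hi = np.percentile(img[..., c], [low_pct, high_pct])
        out[..., c] = np.clip((img[..., c] - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return out
```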

  11. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high-bandwidth signal into multiple lower-bandwidth (lower-rate) signals with an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
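
    The analysis/synthesis structure can be prototyped in a few lines before committing it to hardware; the sketch below builds a simple two-channel analysis bank (low-pass/high-pass filtering followed by decimation) and a matching synthesis bank (interpolation, filtering, summation). The filter lengths and cutoffs are illustrative, and the reconstruction is only approximate rather than a perfect-reconstruction design.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def analysis(x, h_lp, h_hp):
    """Split x into two half-rate subbands: filter, then decimate by 2."""
    low = lfilter(h_lp, 1.0, x)[::2]
    high = lfilter(h_hp, 1.0, x)[::2]
    return low, high

def synthesis(low, high, h_lp, h_hp):
    """Recombine subbands: upsample by 2, filter, and sum."""
    up_low = np.zeros(2 * len(low))
    up_high = np.zeros(2 * len(high))
    up_low[::2] = low
    up_high[::2] = high
    return 2 * (lfilter(h_lp, 1.0, up_low) + lfilter(h_hp, 1.0, up_high))

# Illustrative half-band FIR pair (not a perfect-reconstruction design).
h_lp = firwin(31, 0.5)                       # low-pass, cutoff at half Nyquist
h_hp = h_lp * np.cos(np.pi * np.arange(31))  # modulate to obtain the high-pass

x = np.sin(2 * np.pi * 0.05 * np.arange(256))
lo, hi = analysis(x, h_lp, h_hp)
y = synthesis(lo, hi, h_lp, h_hp)            # delayed, approximate copy of x
```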

  12. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; the process capability and quality dashboard (PCQd); and an enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011, encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. The elements of the Stage 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
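
    Process capability, one of the Stage 3A elements listed above, reduces to a simple calculation once batch data and specification limits are available; the sketch below computes Cp and Cpk for a hypothetical critical quality attribute. The data and specification limits are invented for illustration.

```python
import numpy as np

def capability(values, lsl, usl):
    """Return (Cp, Cpk) for a measured quality attribute.

    Cp  compares the specification width to the process spread;
    Cpk additionally penalizes off-center processes.
    """
    mu, sigma = np.mean(values), np.std(values, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical assay results (% label claim) from Stage 3A batches.
assay = np.array([99.1, 100.4, 98.7, 99.8, 100.9, 99.5, 100.2, 99.9])
print(capability(assay, lsl=95.0, usl=105.0))
```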

  13. An image-processing methodology for extracting bloodstain pattern features.

    PubMed

    Arthur, Ravishka M; Humburg, Philomena J; Hoogenboom, Jerry; Baiker, Martin; Taylor, Michael C; de Bruin, Karla G

    2017-08-01

    There is a growing trend in forensic science to develop methods to make forensic pattern comparison tasks more objective. This has generally involved the application of suitable image-processing methods to provide numerical data for identification or comparison. This paper outlines a unique image-processing methodology that can be utilised by analysts to generate reliable pattern data that will assist them in forming objective conclusions about a pattern. A range of features were defined and extracted from a laboratory-generated impact spatter pattern. These features were based in part on bloodstain properties commonly used in the analysis of spatter bloodstain patterns. The values of these features were consistent with properties reported qualitatively for such patterns. The image-processing method developed shows considerable promise as a way to establish measurable discriminating pattern criteria that are lacking in current bloodstain pattern taxonomies. Copyright © 2017 Elsevier B.V. All rights reserved.
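
    One widely used stain-level feature, not necessarily among those defined in the paper, is the ellipse fitted to each stain, whose axis ratio gives the classical impact-angle estimate alpha = arcsin(width/length). The sketch below derives the axes from second-order image moments of a binary stain mask; it is a generic illustration of this kind of feature extraction.

```python
import numpy as np

def stain_ellipse_features(mask):
    """Fit an ellipse to a binary stain mask via second-order moments.

    Returns (major_axis, minor_axis, impact_angle_deg). The impact angle
    uses the classical bloodstain relation alpha = arcsin(width / length).
    """
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    # Central second-order moments (covariance of the pixel coordinates).
    cov = np.cov(np.vstack([xs - x0, ys - y0]))
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    major, minor = 4.0 * np.sqrt(evals)          # full axis lengths
    angle = np.degrees(np.arcsin(np.clip(minor / major, 0.0, 1.0)))
    return major, minor, angle

# Tiny synthetic elliptical stain for demonstration.
yy, xx = np.mgrid[:80, :80]
mask = ((xx - 40) / 25.0) ** 2 + ((yy - 40) / 10.0) ** 2 <= 1.0
print(stain_ellipse_features(mask))
```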

  14. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box–Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM. PMID:24936289

  15. Modeling of electrohydrodynamic drying process using response surface methodology.

    PubMed

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-05-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM.

  16. A Methodology To Generate A Digital Elevation Model By Combining Topographic And Bathymetric Data In Fluvial Environments

    NASA Astrophysics Data System (ADS)

    Matias, Magda Paraiso; Falcano, Ana Paula; Goncalves, Alexandre B.; Alvares, Teressa; Pestana, Rita; Van Zeller, Emilia; Rodrigues, Victor; Heleno, Sandra

    2013-12-01

    In hydrodynamic simulations, a digital elevation model valid for the whole study area is a requirement. The construction of this model usually implies the use of topographic and bathymetric data collected with distinct equipment and methods, at different times, and acquired with a variety of spatial resolutions and accuracies. Several methodologies have been tested in order to combine both datasets, involving the use of diverse spatial interpolators. In this paper we present the first results of a new methodology that combines a digital elevation model acquired by radar for floodplain areas with cross-sections in the river bed, in order to provide the most accurate and reliable digital elevation model. Since the data were collected in distinct epochs, differences in elevation might exist in the overlapping areas, due to the morphological dynamics of the river. In order to analyse and validate those differences, a dataset of SAR imagery, provided by ESA, was used.

  17. Digital pulse processing: new possibilities in nuclear spectroscopy

    PubMed

    Warburton; Momayezi; Hubbard-Nelson; Skulski

    2000-10-01

    Digital pulse processing is a signal processing technique in which detector (preamplifier output) signals are directly digitized and processed to extract quantities of interest. This approach has several significant advantages compared to traditional analog signal shaping. First, analyses can be developed which take pulse-by-pulse differences into account, as in making ballistic deficit compensations. Second, transient induced charge signals, which deposit no net charge on an electrode, can be analyzed to give, for example, information on the position of interaction within the detector. Third, deadtimes from transient overload signals are greatly reduced, from tens of microseconds to hundreds of nanoseconds. Fourth, signals are easily captured, so that more complex analyses can be postponed until the source event has been deemed "interesting". Fifth, signal capture and processing may easily be based on coincidence criteria between different detectors or different parts of the same detector. XIA's recently introduced CAMAC module, the DGF-4C, provides many of these features for four input channels, including two levels of digital processing and a FIFO for signal capture for each signal channel. The first level of digital processing is "immediate", taking place in a gate array at the 40 MHz digitization rate, and implements pulse detection, pileup inspection, trapezoidal energy filtering, and control of an external 25.6-microsecond-long FIFO. The second level of digital processing is provided by a digital signal processor (DSP), where more complex algorithms can be implemented. To illustrate digital pulse processing's possibilities, we describe the application of the DGF-4C to a series of experiments. The first, for which the DGF was originally developed, involves locating gamma-ray interaction sites within large segmented Ge detectors. The goal of this work is to attain spatial resolutions of order 2 mm sigma within 70 mm x 90 mm detectors. We show how pulse shape analysis allows
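
    The trapezoidal energy filtering mentioned above can be prototyped offline by convolving the digitized, pole-zero-corrected pulse with an FIR kernel whose step response is a trapezoid; a minimal sketch follows. The rise time, flat-top length, and synthetic pulse are illustrative, and a real implementation would first correct for the preamplifier's exponential decay.

```python
import numpy as np

def trapezoidal_filter(x, rise=40, flat=20):
    """FIR trapezoidal shaper for a step-like (pole-zero corrected) pulse.

    The impulse response is +1/rise for `rise` samples, zero for `flat`
    samples, then -1/rise for `rise` samples; a unit step at the input
    produces a trapezoid of height 1 with a flat top of `flat` samples.
    """
    kernel = np.concatenate([np.ones(rise), np.zeros(flat), -np.ones(rise)]) / rise
    return np.convolve(x, kernel, mode="full")[: len(x)]

# Synthetic noisy step pulse (preamplifier decay correction assumed done).
rng = np.random.default_rng(1)
x = np.zeros(600)
x[200:] = 1.0                                   # unit-height step at sample 200
x += 0.05 * rng.standard_normal(x.size)

shaped = trapezoidal_filter(x)
print("estimated amplitude:", shaped.max())      # ~1.0 on the flat top
```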

  18. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648

  19. Application of Six Sigma methodology to a diagnostic imaging process.

    PubMed

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. The define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts were employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is the impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis. This enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of being exposed to more radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.

  20. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-10-30

    A microcontroller board is used to control a laser scanner, and a MATLAB program collects data at each pixel and constructs both amplitude and range images. The report covers the development of signal processing algorithms for hybrid lidar-radar designed to improve detection performance, along with hardware implementation and underwater channel characteristics. Subject terms: hybrid lidar-radar. A significant challenge in hybrid lidar-radar is optical

  1. Digital Art Making as a Representational Process

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld

    2013-01-01

    In this article I bring artistic production into the learning sciences conversation by using the production of representations as a bridging concept between art making and the new literacies. Through case studies with 4 youth media arts organizations across the United States I ask how organizations structure the process of producing…

  2. Methodological framework for evaluating clinical processes: A cognitive informatics perspective.

    PubMed

    Kannampallil, Thomas G; Abraham, Joanna; Patel, Vimla L

    2016-12-01

    We propose a methodological framework for evaluating clinical cognitive activities in complex real-world environments that provides a guiding framework for characterizing the patterns of activities. This approach, which we refer to as a process-based approach, is particularly relevant to cognitive informatics (CI) research, an interdisciplinary domain utilizing cognitive approaches in the study of computing systems and applications, as it provides new ways for understanding human information processing, interactions, and behaviors. Using this approach involves the identification of a process of interest (e.g., a clinical workflow), and the contributing sequences of activities in that process (e.g., medication ordering). A variety of analytical approaches can then be used to characterize the inherent dependencies and relations within the contributing activities within the considered process. Using examples drawn from our own research and the extant research literature, we describe the theoretical foundations of the process-based approach, relevant practical and pragmatic considerations for using such an approach, and a generic framework for applying this approach for evaluation studies in clinical settings. We also discuss the potential for this approach in future evaluations of interactive clinical systems, given the need for new approaches for evaluation, and significant opportunities for automated, unobtrusive data collection. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Reservoir continuous process improvement six sigma methodology implementation

    SciTech Connect

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  4. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    The research and development of real-time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized for detection of the heart tone event, and additional processing derives the heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
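
    A linear predictor trained with the least-mean-squares (LMS) rule, as described above, can be sketched in a few lines: the filter learns to predict the next sample of the signal class it is trained on, and the prediction-error energy can then be used as a detection statistic. The filter order, step size, and stand-in data below are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def lms_predictor(x, order=16, mu=0.01):
    """Adaptive one-step linear predictor trained with the LMS rule.

    Returns the per-sample prediction error; segments the filter models
    well (small error energy) resemble the signal class it was trained on.
    """
    w = np.zeros(order)
    err = np.zeros(len(x))
    for n in range(order, len(x)):
        past = x[n - order:n][::-1]           # most recent samples first
        pred = w @ past
        err[n] = x[n] - pred
        w += mu * err[n] * past               # LMS weight update
    return err

# Illustrative use: train on a reference tone recording, then score a
# monitored signal by short-time prediction-error energy (stand-in data).
rng = np.random.default_rng(0)
reference = np.sin(2 * np.pi * 0.02 * np.arange(2000))
err = lms_predictor(reference + 0.1 * rng.standard_normal(2000))
print("final prediction-error RMS:", np.sqrt(np.mean(err[-500:] ** 2)))
```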

  5. [Improving the inpatient pharmacotherapeutic process by Lean Six Sigma methodology].

    PubMed

    Font Noguera, I; Fernández Megía, M J; Ferrer Riquelme, A J; Balasch I Parisi, S; Edo Solsona, M D; Poveda Andres, J L

    2013-01-01

    Lean Six Sigma methodology has been used to improve care processes, eliminate waste, reduce costs, and increase patient satisfaction. The aim was to analyse the results obtained with the Lean Six Sigma methodology in the diagnosis and improvement of the inpatient pharmacotherapy process during structural and organisational changes in a tertiary hospital. Setting: a 1,000-bed tertiary hospital. Design: prospective observational study. The define, measure, analyse, improve and control (DMAIC) phases were deployed from March to September 2011. An initial project charter was updated as results were obtained. The sample comprised 131 patients with treatments prescribed within 24 h after admission and with 4 drugs. Measures included safety indicators (medication errors) and efficiency indicators (complaints and time delays). The proportion of patients with a medication error was reduced from 61.0% (25/41 patients) to 55.7% (39/70 patients) in four months. The percentage of errors (relative to the opportunities for error) decreased in the different phases of the process: prescription, from 5.1% (19/372 opportunities) to 3.3% (19/572 opportunities); preparation, from 2.7% (14/525 opportunities) to 1.3% (11/847 opportunities); and administration, from 4.9% (16/329 opportunities) to 3.0% (13/433 opportunities). Nursing complaints decreased from 10.0% (2119/21038 patients) to 5.7% (1779/31097 patients). The estimated economic impact was 76,800 euros saved. An improvement in the pharmacotherapeutic process and a positive economic impact were observed, as well as enhanced patient safety and efficiency of the organization. Standardisation and professional training are future Lean Six Sigma candidate projects. Copyright © 2012 SECA. Published by Elsevier España. All rights reserved.

  6. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  7. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, I/Q conversion, sampling, and cross-correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.

  8. Processing classical holographic interferograms by algorithms of digital hologram reconstruction

    NASA Astrophysics Data System (ADS)

    Belashov, A. V.; Petrov, N. V.; Semenova, I. V.

    2015-07-01

    The capability of digital hologram reconstruction algorithms to process holographic interferograms with finite-width fringes that were recorded and reconstructed by the classical optical method is validated. Application of these algorithms significantly simplifies the processing procedure. Results of processing the holographic interferogram of a bulk strain soliton by the two methods are demonstrated to be in good agreement.

  9. Results of precision processing (scene correction) of ERTS-1 images using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1973-01-01

    ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to the NDPF electro-optical processor.

  10. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2012-03-30

    This document provides a progress report on the project "Advanced Digital Signal Processing" covering the period 1/1/2012-3/30/2012 (William D. Jemison, Professor and Chair, Clarkson University, Potsdam, NY 13699-5720). The report includes Figure 1, a single delay-line canceller. When applying this filtering approach to a turbid underwater

  11. Digital Light Processing update: status and future applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue projection. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high-process-speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  12. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in the seasonal variations of the intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species at a Mediterranean site. The major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large-scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during the years 2012-2014. To retrieve phenological information from the digital images, a routine of commands to process the digital image files was created in MATLAB (R2013b, The MathWorks, Natick, Mass.). The images of the dataset were re-classified and the files renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the acquired panoramas, defined by the presence of the most representative species of
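
    A common way to turn such RGB time series into a phenological signal, broadly in the spirit of the channel-intensity analysis described above, is the green chromatic coordinate GCC = G / (R + G + B) averaged over each region of interest; the sketch below computes it for one image array. The image values and ROI are synthetic placeholders, not data from the APOS archive.

```python
import numpy as np

def mean_gcc(img_rgb, roi):
    """Mean green chromatic coordinate G/(R+G+B) inside a rectangular ROI.

    img_rgb : (H, W, 3) array of pixel values; roi = (r0, r1, c0, c1).
    """
    r0, r1, c0, c1 = roi
    patch = img_rgb[r0:r1, c0:c1, :].astype(np.float64)
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    total = np.clip(r + g + b, 1e-9, None)
    return float(np.mean(g / total))

# Synthetic "greening" stand-in; in practice one value per daily photo is
# collected and inflection points of the resulting curve mark phenophases.
img = np.zeros((100, 100, 3))
img[..., 1] = 0.6                      # green-dominated canopy patch
img[..., 0] = img[..., 2] = 0.2
print(mean_gcc(img, roi=(20, 80, 20, 80)))   # ~0.6
```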

  13. Optical hybrid analog-digital signal processing based on spike processing in neurons

    NASA Astrophysics Data System (ADS)

    Fok, Mable P.; Tian, Yue; Rosenbluth, David; Deng, Yanhua; Prucnal, Paul R.

    2011-09-01

    Spike processing is one kind of hybrid analog-digital signal processing, which has the efficiency of analog processing and the robustness to noise of digital processing. When instantiated with optics, a hybrid analog-digital processing primitive has the potential to be scalable, computationally powerful, and have high operation bandwidth. These devices open up a range of processing applications for which electronic processing is too slow. Our approach is based on a hybrid analog/digital computational primitive that elegantly implements the functionality of an integrate-and-fire neuron using a Ge-doped non-linear optical fiber and off-the-shelf semiconductor devices. In this paper, we introduce our photonic neuron architecture and demonstrate the feasibility of implementing simple photonic neuromorphic circuits, including the auditory localization algorithm of the barn owl, which is useful for LIDAR localization, and the crayfish tail-flip escape response.
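
    The integrate-and-fire behavior that the photonic primitive emulates can be described compactly in software; the sketch below is a standard leaky integrate-and-fire model driven by an input signal, not a model of the optical hardware itself, and all constants are illustrative.

```python
import numpy as np

def leaky_integrate_and_fire(inputs, dt=1e-3, tau=20e-3,
                             threshold=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: integrate input, fire on threshold.

    Returns (membrane_trace, spike_indices). All constants are illustrative;
    the photonic device realizes analogous dynamics optically.
    """
    v = v_reset
    trace, spikes = [], []
    for n, i_in in enumerate(inputs):
        # Leaky integration of the input drive.
        v += dt * (-v / tau + i_in)
        if v >= threshold:            # threshold crossing -> output spike
            spikes.append(n)
            v = v_reset               # reset after firing
        trace.append(v)
    return np.array(trace), spikes

# A constant drive produces a regular spike train.
trace, spikes = leaky_integrate_and_fire(np.full(2000, 80.0))
print("number of spikes:", len(spikes))
```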

  14. Processing of digital holograms: segmentation and inpainting

    NASA Astrophysics Data System (ADS)

    Jiao, Shuming; Zou, Wenbin

    2016-10-01

    In this paper, two novel hologram image processing issues, hologram decomposition (segmentation) and hologram inpainting, are briefly reviewed and discussed. By hologram decomposition, one hologram can be decomposed into several sub-holograms, each representing one individual item in the 3D object scene. A Virtual Diffraction Plane based hologram decomposition scheme is proposed, built on Otsu thresholding segmentation, morphological dilation, and sequential scan labelling. Hologram decomposition can be employed for focus distance detection in blind hologram reconstruction. By hologram inpainting, a damaged hologram can be restored by filling in the missing pixels. An exemplar- and search-based technique is applied for hologram inpainting, with the computing speed enhanced by an Artificial Bee Colony algorithm. Potential applications of hologram inpainting are discussed.
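
    The segmentation chain named above (Otsu thresholding, morphological dilation, connected-component labelling) can be prototyped on a reconstructed intensity image as shown below; the dilation size is an illustrative choice, and the sketch operates on an ordinary 2-D intensity array rather than on the complex hologram itself.

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(img, bins=256):
    """Otsu's threshold: maximize between-class variance of the histogram."""
    hist, edges = np.histogram(img.ravel(), bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                     # class probabilities
    m = np.cumsum(p * centers)            # cumulative means
    m_total = m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (m_total * w0 - m) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(between)]

def decompose(intensity, dilate_iter=3):
    """Threshold, dilate, and label the reconstructed intensity image."""
    mask = intensity > otsu_threshold(intensity)
    mask = ndimage.binary_dilation(mask, iterations=dilate_iter)
    labels, n_items = ndimage.label(mask)     # sequential scan labelling
    return labels, n_items

# Each label then selects one scene item; masking the reconstruction with it
# yields one "sub-hologram" region per object.
```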

  15. Introduction and comparison of new EBSD post-processing methodologies.

    PubMed

    Wright, Stuart I; Nowell, Matthew M; Lindeman, Scott P; Camus, Patrick P; De Graef, Marc; Jackson, Michael A

    2015-12-01

    Electron Backscatter Diffraction (EBSD) provides a useful means for characterizing microstructure. However, it can be difficult to obtain indexable diffraction patterns from some samples. This can lead to noisy maps reconstructed from the scan data. Various post-processing methodologies have been developed to improve the scan data, generally based on correlating non-indexed or mis-indexed points with the orientations obtained at neighboring points in the scan grid. Two new approaches are introduced: (1) a re-scanning approach using local pattern averaging, and (2) using the multiple solutions obtained by the triplet indexing method. These methodologies are applied to samples with noise introduced into the patterns artificially and by the operational settings of the EBSD camera. They are also applied to a heavily deformed and a fine-grained sample. In all cases, both techniques provide an improvement in the resulting scan data, with local pattern averaging providing the greater improvement of the two. However, local pattern averaging is most helpful when the noise in the patterns is due to the camera operating conditions as opposed to inherent challenges in the sample itself. A byproduct of this study was insight into the validity of various indexing success rate metrics. A metric given by the fraction of points with CI values greater than some tolerance value (0.1 in this case) was confirmed to provide an accurate assessment of the indexing success rate. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  17. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  18. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  19. Digital Signal Processing in Acoustics--Part 2.

    ERIC Educational Resources Information Center

    Davies, H.; McNeill, D. J.

    1986-01-01

    Reviews the potential of a data acquisition system for illustrating the nature and significance of ideas in digital signal processing. Focuses on the fast Fourier transform and the utility of its two-channel format, emphasizing cross-correlation and its two-microphone technique of acoustic intensity measurement. Includes programming format. (ML)
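
    Cross-correlation of the two microphone channels, the core of the two-microphone technique mentioned above, is conveniently computed through the FFT; the sketch below estimates the inter-channel time delay this way. The sampling rate and signals are illustrative.

```python
import numpy as np

def xcorr_delay(a, b, fs):
    """Estimate the delay of a relative to b (in seconds) via FFT cross-correlation."""
    n = len(a) + len(b) - 1
    nfft = 1 << (n - 1).bit_length()             # next power of two
    # Correlation theorem: corr(a, b) = IFFT( FFT(a) * conj(FFT(b)) )
    corr = np.fft.irfft(np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft)), nfft)
    corr = np.concatenate((corr[-(len(b) - 1):], corr[:len(a)]))  # reorder lags
    lag = np.argmax(corr) - (len(b) - 1)
    return lag / fs

# Two-microphone example: first channel is the second delayed by 25 samples.
fs = 48000
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
y = np.concatenate((np.zeros(25), x))[:4096]
print(xcorr_delay(y, x, fs) * 1000, "ms")        # ~25/48000 s, i.e. about 0.52 ms
```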

  20. Optimization of Control Processes of Digital Electrical Drive Systems

    NASA Astrophysics Data System (ADS)

    Dochviri, J.

    2010-01-01

    The aim of this work is to solve the problems associated with the synthesis of digital speed regulators for both DC and AC thyristor electrical drives. The investigation is carried out on the basis of the parameters of continuous technological equipment (e.g., a paper-making machine), taking into account the elastic transmission links of the drive systems. The corresponding frequency characteristics and transient processes are described.

  1. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  2. Sliding mean edge estimation. [in digital image processing

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1978-01-01

    A method for determining the locations of the major edges of objects in digital images is presented. The method is based on an algorithm utilizing maximum likelihood concepts. An image line-scan interval is processed to determine whether an edge exists within the interval and, if so, its location. The proposed algorithm has demonstrated good results even in noisy images.

  3. Digital Image Processing application to spray and flammability studies

    NASA Technical Reports Server (NTRS)

    Hernan, M. A.; Parikh, P.; Sarohia, V.

    1985-01-01

    Digital Image Processing has been integrated into a new technique for measurement of fuel spray characteristics. The advantages of this technique are a wide dynamic range of droplet sizes and the ability to account for nonspherical droplet shapes, which is not possible with other spray assessment techniques. Finally, the technique has been applied to the study of turbojet engine fuel nozzle atomization performance with Jet A and antimisting fuel.

  4. Reflexivity: a methodological tool in the knowledge translation process?

    PubMed

    Alley, Sarah; Jackson, Suzanne F; Shakya, Yogendra B

    2015-05-01

    Knowledge translation is a dynamic and iterative process that includes the synthesis, dissemination, exchange, and application of knowledge. It is considered the bridge that closes the gap between research and practice. Yet it appears that in all areas of practice, a significant gap remains in translating research knowledge into practical application. Recently, researchers and practitioners in the field of health care have begun to recognize reflection and reflexive exercises as a fundamental component to the knowledge translation process. As a practical tool, reflexivity can go beyond simply looking at what practitioners are doing; when approached in a systematic manner, it has the potential to enable practitioners from a wide variety of backgrounds to identify, understand, and act in relation to the personal, professional, and political challenges they face in practice. This article focuses on how reflexive practice as a methodological tool can provide researchers and practitioners with new insights and increased self-awareness, as they are able to critically examine the nature of their work and acknowledge biases, which may affect the knowledge translation process. Through the use of structured journal entries, the nature of the relationship between reflexivity and knowledge translation was examined, specifically exploring if reflexivity can improve the knowledge translation process, leading to increased utilization and application of research findings into everyday practice.

  5. Lean methodology for performance improvement in the trauma discharge process.

    PubMed

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention-defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the off-unit patients; (2) patients with length of stay more than 15 days contribute disproportionately to length of stay; and (3) miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  6. Light Water Reactor Sustainability Program: Digital Technology Business Case Methodology Guide

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam; Vlahoplus, Chris

    2014-09-01

    The Department of Energy’s (DOE’s) Light Water Reactor Sustainability Program aims to develop and deploy technologies that will make the existing U.S. nuclear fleet more efficient and competitive. The program has developed a standard methodology for determining the impact of new technologies in order to assist nuclear power plant (NPP) operators in building sound business cases. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies Pathway is part of the DOE’s Light Water Reactor Sustainability (LWRS) Program. It conducts targeted research and development (R&D) to address aging and reliability concerns with the legacy instrumentation and control and related information systems of the U.S. operating light water reactor (LWR) fleet. This work involves two major goals: (1) to ensure that legacy analog II&C systems are not life-limiting issues for the LWR fleet and (2) to implement digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Resolving long-term operational concerns with the II&C systems contributes to the long-term sustainability of the LWR fleet, which is vital to the nation’s energy and environmental security. The II&C Pathway is conducting a series of pilot projects that enable the development and deployment of new II&C technologies in existing nuclear plants. Through the LWRS program, individual utilities and plants are able to participate in these projects or otherwise leverage the results of projects conducted at demonstration plants. Performance advantages of the new pilot project technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on

  7. Adaptive digital signal processing for X-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Lakatos, T.

    1990-05-01

    A real-time fully digital signal processing and analyzing system based on a new concept has been developed for high count rate high resolution spectrometry. The principle has been realized with digital filtering of the preamplifier output signals. The system's unique features are the maximum theoretically possible throughput rate with high resolution, and the adaptive noise filtering for nearly loss-free measurements. In adaptive mode the maximum output rate is about 20 times higher than in the case of the semi-Gaussian shaping, with low degradation of energy resolution. All parameters of the signal processor are software controllable.

  8. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
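
    Operational availability is often summarized as uptime over total time; a small Monte Carlo sketch in the spirit of the simulation techniques mentioned above is given below, with exponential failure and repair times whose parameters are purely illustrative.

```python
import numpy as np

def simulate_availability(mtbf, mttr, mission_time, n_runs=2000, seed=0):
    """Monte Carlo estimate of operational availability A = uptime / total time.

    Failure and repair durations are drawn from exponential distributions
    with means `mtbf` and `mttr`; all parameter values are illustrative.
    """
    rng = np.random.default_rng(seed)
    uptimes = np.zeros(n_runs)
    for k in range(n_runs):
        t = up = 0.0
        while t < mission_time:
            ttf = rng.exponential(mtbf)              # time to failure
            up += min(ttf, mission_time - t)
            t += ttf + rng.exponential(mttr)         # add repair time
        uptimes[k] = up
    return uptimes.mean() / mission_time

# Example: 500 h MTBF, 20 h MTTR over a 10,000 h mission.
print(simulate_availability(500.0, 20.0, 10000.0))   # ~ 500/(500+20) = 0.962
```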

  9. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  10. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  11. Methodology to reduce chronic defect mechanisms in semiconductor processing

    NASA Astrophysics Data System (ADS)

    Ecton, Timothy W.; Frazee, Kenneth G.

    1990-06-01

    This paper documents a structured approach to defect elimination in semiconductor processing. Classical problem-solving techniques were used to logically guide the defect reduction effort. Defect information was gathered using an automated wafer inspection system, and defects were classified by production workers on a remote review station. This approach distinguished actual causes from several probable causes. A process change has reduced the defect mechanism. This methodology was applied to reduce perfluoroalkoxy (PFA) particles in a one-micron semiconductor process. Electrical test structures identified a critical layer where yield loss was occurring. An audit procedure was established at this layer and defects were classified into broad categories. Further breakout of defect types by appearance was necessary to construct a meaningful Pareto chart and identify the most frequently occurring fatal defect. The critical process zone was segmented using automated wafer inspection to isolate the step causing the defect. An Ishikawa or cause-effect diagram was constructed with input from process engineers to outline all possible causes of the defect. A most probable branch was selected for investigation and pursued until it became clear that this branch was not related to the cause. At this point, new ideas were sought from a sister production facility. During the visit, a breakthrough indicated a different path and ultimately led to identifying the source of the defect. A process change was implemented. An evaluation of the change showed a substantial decrease in defect level. Further efforts to eliminate the defect source are in progress.

  12. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  13. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  14. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. Emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent processes for the EWOD technique have been optimized in this work.

  15. Advanced Digital Signal Processing for Hybrid Lidar FY 2013

    DTIC Science & Technology

    2013-01-01

    Title: Advanced Digital Signal Processing for Hybrid Lidar. Author: William D. Jemison. The report covers the development of signal processing algorithms for hybrid lidar-radar designed to improve detection performance. Subject terms: hybrid lidar-radar.

  16. Digital image processing for the earth resources technology satellite data.

    NASA Technical Reports Server (NTRS)

    Will, P. M.; Bakis, R.; Wesley, M. A.

    1972-01-01

    This paper discusses the problems of digital processing of the large volumes of multispectral image data that are expected to be received from the ERTS program. Correction of geometric and radiometric distortions is discussed, and a byte-oriented implementation is proposed. CPU timing estimates are given for a System/360 Model 67 and show that a processing throughput of 1000 image sets per week is feasible.

  17. Digital processing of histopathological aspects in renal transplantation

    NASA Astrophysics Data System (ADS)

    de Albuquerque Araujo, Arnaldo; de Andrade, Marcos C.; Bambirra, Eduardo A.; dos Santos, A. M. M.

    1993-07-01

    We describe here our initial experience with the digital image processing of histopathological aspects from multiple renal biopsies of a transplanted kidney in a patient treated with Cyclosporine (CsA), a powerful immunosuppressant drug whose use has improved the chances of successful vascularized organ transplantation (Tx). Unfortunately, CsA promotes morphological alterations to the glomerular structure of the kidneys. To characterize this process, the distributions of glomerulus, tuft, and lumen areas are measured. The results are presented in the form of graphs.

  18. The quantified process approach: an emerging methodology to neuropsychological assessment.

    PubMed

    Poreh, A M

    2000-05-01

    An important development in the field of neuropsychological assessment is the quantification of the process by which individuals solve common neuropsychological tasks. The present article outlines the history leading to this development, the Quantified Process Approach, and suggests that this line of applied research bridges the gap between the clinical and statistical approaches to neuropsychological assessment. It is argued that the enterprise of quantifying the process approach proceeds via three major methodologies: (1) the "Satellite" Testing Paradigm: an approach by which new tasks are developed to complement existing tests so as to clarify a given test performance; (2) the Composition Paradigm: an approach by which data on a given test that have been largely overlooked are compiled and subsequently analyzed, resulting in new indices that are believed to reflect underlying constructs accounting for test performance; and (3) the Decomposition Paradigm: an approach which investigates the relationship between test items of a given measure according to underlying facets, resulting in the development of new subscores. The article illustrates each of the above paradigms, offers a critique of this new field according to prevailing professional standards for psychological measures, and provides suggestions for future research.

  19. Blood flow determination using recursive processing: a digital radiographic method

    SciTech Connect

    Kruger, R.A.; Bateman, W.; Liu, P.Y.; Nelson, J.A.

    1983-10-01

    Temporal filtration of fluoroscopic video sequences is being used as an alternative to pulsed digital subtraction angiography. Using the same image processing architecture and a slight modification in processing logic, a parametric image can be synthesized from such a temporally filtered image sequence in virtual real time, i.e., an image sequence that spans T seconds takes exactly T seconds to process. Off-line computer processing is not required. Initial phantom studies imply that the time to maximum opacification (t_max) can be used to determine absolute and relative blood flow with a high confidence level (r > 0.989). Phantom and animal examples are presented.
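    A minimal offline sketch of the parametric-image idea follows: for each pixel, the frame index at which opacification peaks is converted to a time, yielding a t_max map. This is only a NumPy illustration of the quantity described above, not the recursive video-rate hardware implementation the authors report; the frame rate used is hypothetical.

    ```python
    import numpy as np

    def time_to_max_opacification(frames: np.ndarray, frame_interval_s: float) -> np.ndarray:
        """Given a temporally filtered sequence with shape (T, H, W), return a
        parametric image holding, per pixel, the time (s) at which opacification peaks."""
        peak_index = np.argmax(frames, axis=0)   # frame index of maximum opacification
        return peak_index * frame_interval_s     # convert frame index to seconds

    # Illustrative use with synthetic data (30 frames at an assumed 25 frames/s).
    frames = np.random.rand(30, 256, 256)
    t_max_image = time_to_max_opacification(frames, frame_interval_s=1 / 25)
    ```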

  20. A digital signal processing system for coherent laser radar

    NASA Technical Reports Server (NTRS)

    Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry

    1991-01-01

    A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital-signal-processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor and collects 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.

  1. Integrating digital topology in image-processing libraries.

    PubMed

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information into image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but can be adapted with only minor modifications to other image-processing libraries.

  2. Flow manipulation and control methodologies for vacuum infusion processes

    NASA Astrophysics Data System (ADS)

    Alms, Justin B.

    experienced. First, the effect on permeability is characterized, so the process can be simulated and the flow front patterns can be predicted. It was found that using the VIPR process in combination with tool side injection gates is a very effective method to control resin flow. Based on this understanding several control algorithms were developed to use the process in an automated manufacturing environment which were tested and validated in a virtual environment. To implement and demonstrate the approach, an experimental workstation was built and various infusion examples were performed in the automated environment to validate the capability of the VIPR process with the control methodologies. The VIPR process with control consistently performed better than the process without control. This contribution should prove useful in making VIPs more reliable in the production of large scale composite structures.

  3. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemicals, as well as US infrastructure, including bridges, pipelines and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial-and-error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models employing detailed scientific methodologies. However, these models are cumbersome and not easy to use. Therefore, these scientific models have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development. The scopes of these simple models are limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models for relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed to industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their welding process parameter and consumable composition selection. It is hypothesized that by expanding these tools throughout welding industry

  4. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  5. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. Kern County Water Agency is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could ultimately be extrapolated to the entire 10,400 sq km Kern County watershed. Results indicate that digital image processing of Landsat 2 data will provide the usable land cover required by KCWA for input to the SCS runoff model.

  7. On-Board Spaceborne Real-time Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Gao, G.; Long, F.; Liu, L.

    This paper reports a preliminary study of an on-board digital signal processing system. It covers the on-board processing requirement analysis, functional specifications, and implementation with radiation-tolerant field-programmable gate array (FPGA) technology. The FPGA program is designed in the VHDL hardware description language and implemented on a high-density FPGA chip. The design takes full advantage of the massively parallel architecture of the Virtex-II FPGA logic slices to achieve real-time processing at a high data rate. Furthermore, an FFT algorithm implementation with the system is provided as an illustration.

  8. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  9. Digital image processing using parallel computing based on CUDA technology

    NASA Astrophysics Data System (ADS)

    Skirnevskiy, I. P.; Pustovit, A. V.; Abdrashitova, M. O.

    2017-01-01

    This article describes the expediency of using a graphics processing unit (GPU) for big data processing in the context of digital image processing. It provides a short description of parallel computing technology and its usage in different areas, a definition of image noise, and a brief overview of some noise removal algorithms. It also describes some basic requirements that a noise removal algorithm should meet for application to computed tomography projections. It compares performance with and without the GPU, as well as with different distributions of the workload between CPU and GPU.
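    The appeal of the GPU here is that simple noise-removal filters are embarrassingly parallel: every output pixel depends only on a small, fixed neighbourhood. The sketch below is not from the article; it writes a 3x3 mean filter with plain array slicing so that the identical code can run on the CPU with NumPy or, assuming CuPy is installed, on the GPU.

    ```python
    import numpy as np

    try:
        import cupy as xp   # GPU path (assumed available); CuPy mirrors the NumPy API
    except ImportError:
        xp = np             # CPU fallback

    def mean_filter3(img):
        """3x3 mean filter written with pure array slicing, so the same code runs
        on CPU (NumPy) or GPU (CuPy). Each output pixel is independent of the
        others, which is what makes the operation embarrassingly parallel."""
        padded = xp.pad(img, 1, mode='edge')
        acc = xp.zeros_like(img, dtype=xp.float64)
        for dy in (0, 1, 2):
            for dx in (0, 1, 2):
                acc += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return acc / 9.0

    projection = xp.asarray(np.random.rand(1024, 1024).astype(np.float32))  # stand-in CT projection
    denoised = mean_filter3(projection)
    ```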

  11. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency, including the tunability of low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks correspond to the well-known optical principles of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature.
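    The optical structures map directly onto familiar discrete-time prototypes. As a hedged illustration (a digital analogue, not the photonic implementation described in the paper), the sketch below evaluates the magnitude responses of a first-order all-zero and a first-order all-pole filter with SciPy; the pole/zero radius is an illustrative parameter.

    ```python
    import numpy as np
    from scipy.signal import freqz

    r = 0.8  # pole/zero radius; controls bandwidth (illustrative value)

    # First-order all-zero filter: H(z) = 1 - r*z^-1  (FIR, interference-like notch)
    w, h_zero = freqz(b=[1.0, -r], a=[1.0])

    # First-order all-pole filter: H(z) = (1 - r) / (1 - r*z^-1)  (IIR, resonance-like peak)
    w, h_pole = freqz(b=[1.0 - r], a=[1.0, -r])

    # Magnitude responses in dB over normalized frequency 0..pi
    mag_zero_db = 20 * np.log10(np.abs(h_zero))
    mag_pole_db = 20 * np.log10(np.abs(h_pole))
    ```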

  12. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
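    As a rough illustration of the retinex principle only (a single-scale sketch, not NASA's full MSRCR algorithm, which combines several surround scales and a colour-restoration step), the following Python code subtracts the log of a large-scale Gaussian surround from the log of the image.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def single_scale_retinex(image: np.ndarray, sigma: float = 80.0) -> np.ndarray:
        """Simplified single-scale retinex: log(image) minus log of its large-scale
        Gaussian surround. The output is rescaled to 0..1 for display."""
        img = image.astype(np.float64) + 1.0            # avoid log(0)
        surround = gaussian_filter(img, sigma=sigma)    # crude estimate of scene illumination
        out = np.log(img) - np.log(surround)
        return (out - out.min()) / (out.max() - out.min() + 1e-12)

    # Illustrative use on a synthetic single-channel image.
    frame = np.random.rand(480, 640) * 255
    enhanced = single_scale_retinex(frame)
    ```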

  13. Digital image processing of crystalline specimens examined by electron microscopy.

    PubMed

    Kanaya, K

    1988-12-01

    Crystalline specimens imaged in the electron microscope are analysed using digital processing. Some principles of structural analysis using the method of Fourier decomposition are discussed. Complementary techniques, such as enhancement by gradient and Laplacian operators, have been found useful in analysing electron micrographs. The application of these techniques to some problems in Materials Science and Biology is reviewed. By selecting and phase-correcting spots in the computed diffraction pattern, it was possible to localize atoms, molecules, and their defective arrangement in evaporated gold, sputter-deposited tungsten films, and single crystals of cadmium selenide. Digital processing based on the theory of helical diffraction was used to explore the three-dimensional arrangement of molecules in cellular components of alveolar soft part sarcoma, Hirano bodies, and neurofibrillar tangles in the human brain.
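    The Fourier-decomposition step can be illustrated with a simple spot-filtering sketch: transform the micrograph, keep only small regions around selected reflections, and transform back. The NumPy code below is a minimal illustration under that assumption; it omits the phase-correction step the author describes, and the spot positions are hypothetical.

    ```python
    import numpy as np

    def fourier_spot_filter(image: np.ndarray, spots, radius: int = 3) -> np.ndarray:
        """Keep only selected reflections in the computed diffraction pattern
        (2-D FFT) and transform back, reinforcing the periodic lattice signal.
        `spots` is a list of (row, col) positions in the shifted FFT."""
        fft = np.fft.fftshift(np.fft.fft2(image))
        rows, cols = np.indices(image.shape)
        mask = np.zeros(image.shape, dtype=bool)
        for r0, c0 in spots:
            mask |= (rows - r0) ** 2 + (cols - c0) ** 2 <= radius ** 2
        filtered = np.fft.ifft2(np.fft.ifftshift(fft * mask))
        return filtered.real

    # Illustrative use: keep the central spot and two hypothetical lattice reflections.
    micrograph = np.random.rand(256, 256)
    reconstruction = fourier_spot_filter(micrograph, [(128, 128), (128, 148), (128, 108)])
    ```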

  14. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently a repository should be evaluated on whether or not they are effective in their data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  15. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    The current digital signal analysis algorithms that are implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis rather than linguistic analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition. Some of these algorithms are: Linear Predictive Coding (LPC), Short-time Fourier Analysis, and Cepstrum Analysis. Among these algorithms, the LPC is the most widely used. This algorithm has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms with few assumptions, but they are not widely implemented or investigated. However, with the recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.

  16. Predicting protein subcellular location using digital signal processing.

    PubMed

    Pan, Yu-Xi; Li, Da-Wei; Duan, Yun; Zhang, Zhi-Zhou; Xu, Ming-Qing; Feng, Guo-Yin; He, Lin

    2005-02-01

    The biological functions of a protein are closely related to its attributes in a cell. With the rapid accumulation of newly found protein sequence data in databanks, it is highly desirable to develop an automated method for predicting the subcellular location of proteins. The establishment of such a predictor will expedite the functional determination of newly found proteins and the process of prioritizing genes and proteins identified by genomic efforts as potential molecular targets for drug design. The traditional algorithms for predicting these attributes were based solely on amino acid composition in which no sequence order effect was taken into account. To improve the prediction quality, it is necessary to incorporate such an effect. However, the number of possible patterns in protein sequences is extremely large, posing a formidable difficulty for realizing this goal. To deal with such difficulty, a well-developed tool in digital signal processing named digital Fourier transform (DFT) [1] was introduced. After being translated to a digital signal according to the hydrophobicity of each amino acid, a protein was analyzed by DFT within the frequency domain. A set of frequency spectrum parameters, thus obtained, were regarded as the factors to represent the sequence order effect. A significant improvement in prediction quality was observed by incorporating the frequency spectrum parameters with the conventional amino acid composition. One of the crucial merits of this approach is that many existing tools in mathematics and engineering can be easily applied in the predicting process. It is anticipated that digital signal processing may serve as a useful vehicle for many other protein science areas.
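    A minimal sketch of the signal-processing step is shown below, assuming the standard Kyte-Doolittle hydropathy scale as the amino-acid-to-number mapping and using the magnitudes of the first few DFT components as sequence-order features; the exact scale and feature set used in the paper may differ.

    ```python
    import numpy as np

    # Kyte-Doolittle hydropathy values (standard published scale).
    KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
          'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
          'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
          'Y': -1.3, 'V': 4.2}

    def spectrum_features(sequence: str, n_features: int = 8) -> np.ndarray:
        """Translate a protein sequence into a hydrophobicity signal and return the
        magnitudes of its first few DFT components as sequence-order features."""
        signal = np.array([KD[aa] for aa in sequence if aa in KD], dtype=np.float64)
        spectrum = np.abs(np.fft.rfft(signal))
        return spectrum[1:n_features + 1] / len(signal)   # skip the DC term, normalise by length

    # Illustrative use on a made-up sequence fragment.
    features = spectrum_features("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
    ```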

  17. Optical Digital Algebraic Processing for Multi-Sensor-Array Data.

    DTIC Science & Technology

    1986-02-01

    Report documentation page (recoverable fields only): Optical Digital Algebraic Processing for Multi-Sensor-Array Data; performing organization: Georgia Institute of Technology, School of Electrical Engineering, Atlanta, Georgia 30332; controlling office: AFOSR, Building 410; report date: February 1986.

  18. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long-term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  19. [Effect of digital radiography processing parameters on digital chest radiograph for occupational exposed workers].

    PubMed

    Wang, Xiao-hua; Liu, Dong-sheng; Xuan, Xiao; Kang, Han; Yuan, Hui-shu

    2013-05-01

    The aim was to investigate the effect of different digital radiography (DR) processing parameters on the image quality of digital chest radiographs in dust-exposed workers. One hundred and five dust-exposed workers underwent both high-kV radiography and DR to obtain chest radiographs; the image processing parameters were set by the conventional processing method for digital chest radiographs (method A) and by a processing method based on the special requirements of occupational diseases (method B). With the high-kV chest radiograph as the reference, the image quality at 10 anatomic sites of the DR image was graded. The images acquired by DR and high-kV radiography were compared, and the DR images acquired by methods A and B were also compared. For method A, the scores at the 10 anatomic sites of the DR image were mostly 0 and +1, accounting for over 88%, and the mean score was 0.23-0.65; there was a significant difference between the mean score of the DR image and the score of the high-kV image (P < 0.001). For method B, the scores at the 10 anatomic sites of the DR image were mostly 0, accounting for over 65%, and the mean score was -0.01 to +0.02 except at the pleura and chest wall; there was no significant difference between the mean score of the DR image and the score of the high-kV image (P > 0.05). There were significant differences in the scores at the 10 anatomic sites between the DR images acquired by methods A and B (P < 0.01). DR images acquired with different processing parameters differ. The quality of the DR image acquired by the processing method based on the special requirements of occupational diseases is similar to that of the high-kV image at the anatomic sites.

  20. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree and of the terminal end buds as well. It allows accurate and objective measurement of both growth parameters and fine morphological glandular structures. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus largely used by scientists studying rodent mammary gland morphology.
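    Once a one-pixel-wide skeleton of the ductal tree is available, branch end points and branch points can be counted from simple neighbour counts, as in the hedged NumPy/SciPy sketch below; this illustrates only the end-point and branching measurements, not the authors' full pipeline (gland segmentation, TEB detection, branch-length distributions).

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def skeleton_points(skeleton: np.ndarray):
        """Count end points (exactly one neighbour) and branch points (three or
        more neighbours) on a binary, one-pixel-wide skeleton. Adjacent junction
        pixels may each be counted, so counts are indicative rather than exact."""
        skel = skeleton.astype(np.uint8)
        kernel = np.array([[1, 1, 1],
                           [1, 0, 1],
                           [1, 1, 1]], dtype=np.uint8)
        neighbours = convolve(skel, kernel, mode='constant', cval=0)
        end_points = (skel == 1) & (neighbours == 1)
        branch_points = (skel == 1) & (neighbours >= 3)
        return end_points.sum(), branch_points.sum()

    # Illustrative use on a tiny hand-made skeleton: one duct with one side branch.
    skeleton = np.zeros((10, 10), dtype=bool)
    skeleton[5, 1:9] = True
    skeleton[2:5, 4] = True
    ends, branches = skeleton_points(skeleton)
    ```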

  1. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    PubMed

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images into voxel-based anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of 60Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for the axial, coronal and sagittal projections, respectively.
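    A hedged sketch of the image-compression idea is given below: CT slices are stacked into a volume, converted to Hounsfield units, and quantised into a small number of material indices suitable for a voxel phantom. This is not the Scan2MCNP workflow itself; pydicom is assumed to be available, and the thresholds are illustrative, not clinically validated.

    ```python
    import numpy as np
    import pydicom  # assumed available; reads DICOM CT slices

    def build_voxel_phantom(slice_paths, thresholds=(-400.0, 300.0)):
        """Stack CT slices into a 3-D array and quantise Hounsfield units into a
        few material indices (0 = air, 1 = soft tissue, 2 = bone)."""
        slices = [pydicom.dcmread(p) for p in slice_paths]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array * float(s.RescaleSlope) + float(s.RescaleIntercept)
                           for s in slices]).astype(np.float32)
        phantom = np.digitize(volume, bins=np.array(thresholds)).astype(np.uint8)
        return phantom  # compact lattice ready to be written into an MCNP input deck
    ```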

  2. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques, using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  3. Digital image processing for the early localization of cancer

    NASA Astrophysics Data System (ADS)

    Kelmar, Cheryl M.

    1991-06-01

    The prognosis for cancer patients becomes much better if a tumor is diagnosed, localized and treated early, in a precancerous stage. The difficulty lies in the localization of cancerous tumors. Carcinoma in situ (CIS) refers to a tumor which is approximately 100 microns thick and one which has not penetrated through the epithelium wall or become invasive (2). A tumor of this size cannot be detected by existing techniques such as x-ray, computed tomography, magnetic resonance imaging, nuclear medicine or conventional endoscopy under white-light illumination. However, these tumors can be localized and destroyed by photodynamic diagnosis and therapy. This research shows that digital image processing and the technique of digital image ratioing contribute to photodynamic diagnosis and the early localization of cancer. A software package has been developed as a result of this research. The software package quantifies the usefulness of digital image processing for tumor localization and detectability. System parameters such as the endoscope distance and angle variations, tumor size and tumor concentration, and the sensitivity and specificity of the system have been tested and quantified.
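    The image-ratioing step itself is straightforward: divide the fluorescence frame by a reference frame so that illumination and distance effects largely cancel. The following NumPy sketch shows only that step, with synthetic data; the distance, angle, sensitivity and specificity analyses described above are outside its scope.

    ```python
    import numpy as np

    def ratio_image(fluorescence: np.ndarray, background: np.ndarray, eps: float = 1e-6) -> np.ndarray:
        """Pixel-wise ratio of a fluorescence image to a reference image. Ratioing
        suppresses illumination and distance effects so that regions of elevated
        photosensitizer fluorescence stand out."""
        f = fluorescence.astype(np.float64)
        b = background.astype(np.float64)
        ratio = f / (b + eps)                 # eps avoids division by zero in dark pixels
        return ratio / (ratio.max() + eps)    # normalise to 0..1 for display

    # Illustrative use with synthetic frames standing in for endoscopic images.
    fluo = np.random.rand(480, 640)
    white_light = np.random.rand(480, 640) + 0.5
    highlighted = ratio_image(fluo, white_light)
    ```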

  4. Kinematic analysis of human walking gait using digital image processing.

    PubMed

    O'Malley, M; de Paor, D L

    1993-07-01

    A system using digital image processing techniques for kinematic analysis of human gait has been developed. The system is cheap, easy to use, automated and provides useful detailed quantitative information to the medical profession. Passive markers comprising black annuli on white card are placed on the anatomical landmarks of the subject. Digital images at the standard television rate of 25 per second are acquired of the subject walking past a white background. The images are obtained, stored and processed using standard commercially available hardware, i.e. video camera, video recorder, digital framestore and an IBM PC. Using a single-threshold grey level, all the images are thresholded to produce binary images. An automatic routine then uses a set of pattern recognition algorithms to locate accurately and consistently the markers in each image. The positions of the markers are analysed to determine to which anatomical landmark they correspond, and thus a stick diagram for each image is obtained. There is also a facility where the positions of the markers may be entered manually and errors corrected. The results may be presented in a variety of ways: stick diagram animation, sagittal displacement graphs, flexion diagrams and gait parameters.
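    A minimal sketch of the marker-location step, assuming dark markers on a light background, is given below using thresholding and connected-component labelling with SciPy; the pattern-recognition rules the authors use to reject spurious blobs and assign markers to anatomical landmarks are not reproduced.

    ```python
    import numpy as np
    from scipy.ndimage import label, center_of_mass

    def locate_markers(frame: np.ndarray, threshold: float, min_area: int = 20):
        """Threshold a grey-level frame and return the centroid of each connected
        blob large enough to be a gait marker."""
        binary = frame < threshold                  # dark annuli on a white background
        labels, n = label(binary)
        centroids = []
        for idx in range(1, n + 1):
            if (labels == idx).sum() >= min_area:   # reject small noise blobs
                centroids.append(center_of_mass(binary, labels, idx))
        return centroids

    # Illustrative use on one synthetic digitised video frame.
    frame = np.full((576, 720), 220, dtype=np.uint8)
    frame[100:110, 200:210] = 30                    # a single dark marker
    markers = locate_markers(frame, threshold=128)
    ```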

  5. Digital Processing of Weak Signals Buried in Noise

    NASA Astrophysics Data System (ADS)

    Emerson, Darrel

    This article describes the use of digital signal processing to pull the AMSAT AO-13 ZRO test signal out of the noise. In the ZRO tests, a signal is transmitted from the Oscar 13 satellite at progressively lower power levels, in 3 dB steps. The challenge is to decode successfully the weakest possible signal. The signal from the receiver audio was digitized using a Sound Blaster card, then filtered with a modified FFT routine. The modification was to allow the pre-detection filter to follow the slowly drifting signal. After using the matched, sliding filter before detection, the post-detection signal was passed through another matched filter. Finally, a cross-correlation technique comparing the detected, filtered signal with every possible combination of ZRO signal was applied, taking also into account a gradual drift of CW sending speed. The final, statistically most probable, solution turned out to be correct. This gave the only successful detection of level A transmissions from Oscar 13 so far (Aug 1996.) The extensive digital processing partly made up for the relatively poor receiving antenna; a 10-element 146 MHz Yagi, part of the Cushcraft AOP-1 combination.

  6. DYMAC digital electronic balance. [LASL Plutonium Processing Facility

    SciTech Connect

    Stephens, M.M.

    1980-06-01

    The Dynamic Materials Accountability (DYMAC) System at LASL integrates nondestructive assay (NDA) instruments with interactive data-processing equipment to provide near-real-time accountability of the nuclear material in the LASL Plutonium Processing Facility. The most widely used NDA instrument in the system is the DYMAC digital electronic balance. The DYMAC balance is a commercial instrument that has been modified at LASL for weighing material in gloveboxes and for transmitting the weight data directly to a central computer. This manual describes the balance components, details the LASL modifications, reviews a DYMAC measurement control program that monitors balance performance, and provides instructions for balance operation and maintenance.

  7. Multiplexed interferometric fiber-optic sensors with digital signal processing.

    PubMed

    Sadkowski, R; Lee, C E; Taylor, H F

    1995-09-01

    A microcontroller-based digital signal processing system developed for use with fiber-optic sensors for measuring pressure in internal combustion engines is described. A single distributed feedback laser source provides optical power for four interferometric sensors. The laser current is repetitively modulated so that its optical frequency is nearly a linear function of time over most of a cycle. The interferometer phase shift is proportional to the elapsed time from the initiation of a sawtooth until the sensor output signal level crosses a threshold value proportional to the laser output power. This elapsed time, assumed to vary linearly with the combustion chamber pressure, is determined by the use of a digital timer-counter. The system has been used with fiber Fabry-Perot interferometer transducers for in-cylinder pressure measurement on a four-cylinder gasoline-powered engine.

  8. Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization

    NASA Astrophysics Data System (ADS)

    Repola, L.; Memmolo, R.; Signoretti, D.

    2015-04-01

    In the research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at surveying submerged archaeological sites, integrable with standard systems for geomorphological detection of the coast, has been developed. The project involves the construction of hardware consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras, and software for the production of point clouds and the initial processing of data. The software has features for stereoscopic vision system calibration, for reduction of noise and of the distortion of underwater captured images, for searching for corresponding points of stereoscopic images using stereo-matching algorithms (dense and sparse), and for point cloud generation and filtering. Mastery of the methods for efficient data acquisition was achieved only after various calibration and survey tests carried out during the excavations envisaged in the project. The current development of the system has allowed the generation of portions of digital models of real submerged scenes. A semi-automatic procedure for global registration of partial models is under development as a useful aid for the study and musealization of sites.

  9. Methodological accuracy of digital and manual model analysis in orthodontics - A retrospective clinical study.

    PubMed

    Lippold, Carsten; Kirschneck, Christian; Schreiber, Kristina; Abukiress, Saleh; Tahvildari, Amir; Moiseenko, Tatjana; Danesh, Gholamreza

    2015-07-01

    Computer-based digital orthodontic models are available for clinicians, supplemented by dedicated software for performing required diagnostic measurements. The purpose of this study was to evaluate the accuracy of measurements made on three-dimensional digital models obtained with a CBCT scanner (DigiModel™, OrthoProof®, Nieuwegein, The Netherlands). 66 orthodontic dental casts of primary and early mixed dentitions were selected. Three-dimensional images were obtained on this CBCT scanner and analyzed by means of the DigiModel™ software. Measurements were made with a digital caliper directly on the conventional casts and also digitally on the virtual models. 6 anatomic dental points were identified, and a total of 11 measurements were taken from each cast, including midline deviation, overjet, overbite and arch widths. Conformity of digital and manual measurements as well as intra-, inter- and repeated-measurement reliability were evaluated by Lin's concordance correlation coefficient, ICC and a Bland-Altman analysis. The agreement and conformity of digital and manual measurements was substantial for all parameters evaluated. Intra-, inter- and repeated-measurement reliability was excellent. Measurements on digital models obtained by a CBCT scan of conventional casts (DigiModel™, OrthoProof®) are suited for reliable diagnostic measurements, which compare well to those obtained from plaster casts, the current gold standard.
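    The Bland-Altman part of the analysis reduces to the mean difference (bias) and the 95% limits of agreement between paired measurements, as in the short NumPy sketch below; the numbers shown are made-up illustrative values, not data from the study.

    ```python
    import numpy as np

    def bland_altman(manual: np.ndarray, digital: np.ndarray):
        """Bland-Altman statistics for paired manual (caliper) and digital
        measurements: mean bias and 95% limits of agreement."""
        manual = np.asarray(manual, dtype=np.float64)
        digital = np.asarray(digital, dtype=np.float64)
        diff = digital - manual
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Illustrative overjet measurements in millimetres (values are made up).
    caliper = np.array([2.1, 3.4, 2.8, 4.0, 3.1])
    digimodel = np.array([2.0, 3.5, 2.9, 4.1, 3.0])
    bias, limits = bland_altman(caliper, digimodel)
    ```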

  10. Enhancement Of Optical Registration Signals Through Digital Signal Processing Techniques

    NASA Astrophysics Data System (ADS)

    Cote, Daniel R.; Lazo-Wasem, Jeanne

    1988-01-01

    Alignment and setup of lithography processes has largely been conducted on special test wafers. Actual product-level optimization has been limited to manual techniques such as optical verniers. This is especially time consuming and prone to inconsistencies when the registration characteristics of lithographic systems are being measured. One key factor obstructing the use of automated metrology equipment on product-level wafers is the inability to reliably discern metrology features from the background noise and variations in optical registration signals. This is often the case for metal levels such as aluminum and tungsten. This paper discusses methods for enhancement of typical registration signals obtained from difficult semiconductor process levels. Brightfield and darkfield registration signals are obtained using a microscope and a 1024-element linear photodiode array. These signals are then digitized and stored on the hard disk of a computer. The techniques utilized include amplitude-selective filtering and both adaptive and non-adaptive frequency-domain filtering. The effect of each of these techniques upon calculated registration values is analyzed by determining the positional variation of the center location of a two-line registration feature. Plots of raw and processed signals obtained are presented, as are plots of the power spectral density of ideal metrology feature signal and noise patterns. It is concluded that the proper application of digital signal processing (DSP) techniques to problematic optical registration signals greatly enhances the applicability of automated optical registration measurement techniques to difficult semiconductor process levels.

  11. Digital metamaterials.

    PubMed

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, '0' and '1', in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call 'metamaterial bits', with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental 'metamaterial bytes' with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology.
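    As a first-order illustration only (the paper derives effective parameters from the specific spatial arrangement of the bits, e.g. layered or core-shell geometries, not from a simple average), the sketch below estimates the permittivity of a two-bit mixture with a volume-weighted average and scans the fill fraction for an epsilon-near-zero response; all material values are illustrative.

    ```python
    import numpy as np

    def mixture_permittivity(eps_bit0: complex, eps_bit1: complex, fill_fraction_bit1: float) -> complex:
        """Crude volume-weighted estimate of the effective permittivity of a mixture
        of two 'metamaterial bits'. A first-order illustration only; the actual
        effective parameters depend on how the bits are spatially arranged."""
        f = fill_fraction_bit1
        return (1.0 - f) * eps_bit0 + f * eps_bit1

    # Example: combine a plasmonic bit (negative permittivity) with a dielectric bit
    # and look for an epsilon-near-zero response (values are illustrative).
    eps_metal = -4.0 + 0.2j
    eps_dielectric = 2.1 + 0.0j
    fractions = np.linspace(0, 1, 11)
    eps_eff = [mixture_permittivity(eps_metal, eps_dielectric, f) for f in fractions]
    near_zero = min(eps_eff, key=lambda e: abs(e.real))
    ```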

  12. Rapid Process Optimization: A Novel Process Improvement Methodology to Innovate Health Care Delivery.

    PubMed

    Wiler, Jennifer L; Bookman, Kelly; Birznieks, Derek B; Leeret, Robert; Koehler, April; Planck, Shauna; Zane, Richard

    2016-03-26

    Health care systems have utilized various process redesign methodologies to improve care delivery. This article describes the creation of a novel process improvement methodology, Rapid Process Optimization (RPO). This system was used to redesign emergency care delivery within a large academic health care system, resulting in decreases in (1) door-to-physician time (Department A: 54 minutes pre vs 12 minutes 1 year post; Department B: 20 minutes pre vs 8 minutes 3 months post), (2) overall length of stay (Department A: 228 vs 184; Department B: 202 vs 192), (3) discharge length of stay (Department A: 216 vs 140; Department B: 179 vs 169), and (4) left-without-being-seen rates (Department A: 5.5% vs 0.0%; Department B: 4.1% vs 0.5%), despite a 47% increase in census at Department A (34 391 vs 50 691) and a 4% increase at Department B (8404 vs 8753). The novel RPO process improvement methodology can inform and guide successful care redesign.

  13. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining

    NASA Astrophysics Data System (ADS)

    van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-02-01

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
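    A small sketch of the stain-separation step is given below using scikit-image's fixed Ruifrok-Johnston colour deconvolution (rgb2hed); note that this differs from the blind colour deconvolution the authors adapt, in which the stain vectors are estimated from the data, so it is only an illustration of the general idea.

    ```python
    import numpy as np
    from skimage.color import rgb2hed  # fixed Ruifrok-Johnston colour deconvolution

    def dab_density(rgb_tile: np.ndarray) -> float:
        """Separate an IHC tile into haematoxylin / eosin / DAB channels and return
        the mean DAB optical density, a simple per-image staining statistic that
        could feed an inter-batch comparison."""
        hed = rgb2hed(rgb_tile)        # stain-space image, channels: H, E, DAB
        return float(hed[..., 2].mean())

    # Illustrative use on a synthetic RGB tile with values in [0, 1].
    tile = np.random.rand(256, 256, 3)
    print(dab_density(tile))
    ```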

  16. Thermal Modeling of Direct Digital Melt-Deposition Processes

    NASA Astrophysics Data System (ADS)

    Cooper, K. P.; Lambrakos, S. G.

    2011-02-01

    Additive manufacturing involves creating three-dimensional (3D) objects by depositing materials layer-by-layer. The freeform nature of the method permits the production of components with complex geometry. Deposition processes provide one more capability, which is the addition of multiple materials in a discrete manner to create "heterogeneous" objects with locally controlled composition and microstructure. The result is direct digital manufacturing (DDM), by which dissimilar materials are added voxel-by-voxel (a voxel is a volumetric pixel) following a predetermined tool-path. A typical example is a functionally graded material such as a gear with a tough core and a wear-resistant surface. The inherent complexity of DDM processes is such that process modeling based on direct physics-based theory is difficult, especially due to a lack of temperature-dependent thermophysical properties and particularly when dealing with melt-deposition processes. In order to overcome this difficulty, an inverse problem approach is proposed for the development of thermal models that can represent multi-material, direct digital melt deposition. This approach is based on the construction of a numerical-algorithmic framework for modeling anisotropic diffusivity such as that which would occur during energy deposition within a heterogeneous workpiece. This framework consists of path-weighted integral formulations of heat diffusion according to spatial variations in material composition and requires consideration of parameter sensitivity issues.

  17. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.
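    As background to the modulator portion, the hedged sketch below maps a bit stream onto Gray-coded QPSK symbols and generates a sampled passband waveform in NumPy; it is a generic textbook illustration, not the study's modem design, and it omits the transmit data filtering (pulse shaping) treated in the report.

    ```python
    import numpy as np

    def qpsk_modulate(bits: np.ndarray, samples_per_symbol: int,
                      carrier_cycles_per_symbol: float = 2.0) -> np.ndarray:
        """Map a bit stream onto Gray-coded QPSK symbols and generate a sampled
        passband waveform. Pulse shaping (transmit data filtering) is omitted."""
        bits = bits.reshape(-1, 2)
        # Gray mapping: 00 -> (+1,+1), 01 -> (+1,-1), 11 -> (-1,-1), 10 -> (-1,+1)
        i = 1 - 2 * bits[:, 0]
        q = 1 - 2 * bits[:, 1]
        symbols = (i + 1j * q) / np.sqrt(2)
        baseband = np.repeat(symbols, samples_per_symbol)
        n = np.arange(baseband.size)
        carrier = np.exp(1j * 2 * np.pi * carrier_cycles_per_symbol * n / samples_per_symbol)
        return np.real(baseband * carrier)

    # Illustrative use with a random bit stream (even number of bits required).
    waveform = qpsk_modulate(np.random.randint(0, 2, 128), samples_per_symbol=16)
    ```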

  18. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processings have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processings (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  19. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  20. Moire technique by means of digital image processing.

    PubMed

    Gasvik, K J

    1983-11-15

    Moiré technique by means of projected fringes is a suitable method for full-field measurements of out-of-plane deformations and object contouring. One disadvantage in industrial applications has been the photographic process, with its time-consuming development of the photographic film. This paper presents a new method using a TV camera and a digital image processor whereby real-time measurements of deformations and comparison of object contours are possible. The principles and limitations of the projected moiré method are also described.

  1. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  2. Digital signal processor and programming system for parallel signal processing

    SciTech Connect

    Van den Bout, D.E.

    1987-01-01

    This thesis describes an integrated assault upon the problem of designing high-throughput, low-cost digital signal-processing systems. The dual prongs of this assault consist of: (1) the design of a digital signal processor (DSP) which efficiently executes signal-processing algorithms in either a uniprocessor or multiprocessor configuration, (2) the PaLS programming system which accepts an arbitrary algorithm, partitions it across a group of DSPs, synthesizes an optimal communication link topology for the DSPs, and schedules the partitioned algorithm upon the DSPs. The results of applying a new quasi-dynamic analysis technique to a set of high-level signal-processing algorithms were used to determine the uniprocessor features of the DSP design. For multiprocessing applications, the DSP contains an interprocessor communications port (IPC) which supports simple, flexible, dataflow communications while allowing the total communication bandwidth to be incrementally allocated to achieve the best link utilization. The net result is a DSP with a simple architecture that is easy to program for both uniprocessor and multi-processor modes of operation. The PaLS programming system simplifies the task of parallelizing an algorithm for execution upon a multiprocessor built with the DSP.

  3. Perspectives on Learning: Methodologies for Exploring Learning Processes and Outcomes

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    2014-01-01

    The papers in this Special Issue were initially prepared for an EARLI 2013 Symposium that was designed to examine methodologies in use by researchers from two sister communities, Learning and Instruction and Learning Sciences. The four papers reflect a common ground in advances in conceptions of learning since the early days of the "cognitive…

  4. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  5. Desolvation Induced Origami of Photocurable Polymers by Digit Light Processing.

    PubMed

    Zhao, Zeang; Wu, Jiangtao; Mu, Xiaoming; Chen, Haosen; Qi, H Jerry; Fang, Daining

    2016-12-22

    Self-folding origami is of great interest in current research on functional materials and structures, but it remains a challenge to develop a simple method to create freestanding, reversible, and complex origami structures. This communication provides a feasible solution to this challenge by developing a method based on the digital light processing technique and desolvation-induced self-folding. In this new method, flat polymer sheets can be cured by a light field from a commercial projector with varying intensity, and the self-folding process is triggered by desolvation in water. Folded origami structures can be recovered once immersed in the swelling medium. The self-folding process is investigated both experimentally and theoretically. Diverse 3D origami shapes are demonstrated. This method can be used for responsive actuators and the fabrication of 3D electronic devices.

  6. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to visualize the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities, for use as thermal and hydrodynamic tracers. In biomedical situations, e.g., skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described, and some examples are presented.

  7. Applying of digital signal processing to optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to a system with an optical equisignal zone, as a new method of evaluating the position of the optical equisignal zone. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the position of the optical equisignal zone is applied to processing on the array detector. This algorithm makes it possible to evaluate both the lateral displacement and the turning angles of the receiver relative to the projector. The interrelation of the parameters of the projector and the receiver is considered. An experimental setup was built according to the described principles and then characterized. The accuracy of evaluating the position of the equisignal zone is shown to depend on the size of the equivalent entrance pupil used in processing.

  8. Fundamentals of in Situ Digital Camera Methodology for Water Quality Monitoring of Coast and Ocean

    PubMed Central

    Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave

    2009-01-01

    Conventional digital cameras, the Nikon Coolpix885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water leaving light. Different systems were deployed to capture upwelling light from below the surface, while eliminating direct surface reflection. Relationships between RGB ratios of water surface images, and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This current paper focuses on the method that was used to acquire digital images, derive RGB values and relate measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method. PMID:22346729
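
    A minimal sketch of the core measurement described above, assuming the input is already an RGB array of the upwelling light; the regression relating band ratios to yellow-substance or chlorophyll concentration is site-specific and not reproduced here:

```python
import numpy as np

def band_ratios(image):
    """Mean R, G, B of a water-surface image and the band ratios used as
    proxies for ratios of water-leaving irradiance.

    image : H x W x 3 array (red, green, blue channels), any numeric dtype
    """
    img = np.asarray(image, dtype=float)
    r, g, b = img[..., 0].mean(), img[..., 1].mean(), img[..., 2].mean()
    return {"green/red": g / r, "blue/green": b / g}
```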

  9. Naturalistic Observation of Health-Relevant Social Processes: The Electronically Activated Recorder (EAR) Methodology in Psychosomatics

    PubMed Central

    Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große

    2012-01-01

    This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  10. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  11. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated for a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has superior linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
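
    As an illustration of the frequency-domain equalization described above, here is a sketch of a single-polarization chromatic dispersion compensator; sign and normalization conventions vary between texts, and the parameter names are hypothetical:

```python
import numpy as np

def cd_compensate(samples, fs, beta2, fiber_length):
    """Undo chromatic dispersion on complex baseband samples.

    samples      : complex samples after coherent downconversion
    fs           : sampling rate in Hz
    beta2        : group-velocity dispersion parameter in s^2/m
    fiber_length : link length in m

    In one common convention the fiber acts as the all-pass filter
    H(w) = exp(-1j * beta2/2 * w**2 * L); the equalizer applies its inverse.
    """
    w = 2 * np.pi * np.fft.fftfreq(len(samples), d=1.0 / fs)
    kernel = np.exp(1j * (beta2 / 2.0) * w**2 * fiber_length)
    return np.fft.ifft(np.fft.fft(samples) * kernel)
```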

  12. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs handle steps that require access to a limited amount of data but have more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  13. Adaptive optical signal processing architecture using signed-digit number system

    NASA Astrophysics Data System (ADS)

    Ramamoorthy, P. A.; Govind, G.

    1988-01-01

    Signed-digit arithmetic techniques are evaluated for applicability in adaptive signal processing architectures. It is shown that signed-digit arithmetic offers the advantage of parallelism in computation without the accompanying conversion problems of the residue arithmetic representation.
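
    For readers unfamiliar with signed-digit arithmetic, a small illustration (not the architecture of the paper): the non-adjacent form is a canonical signed-digit representation with digits in {-1, 0, 1}, and redundant representations of this kind are what make carry-free, parallel addition possible.

```python
def to_naf(n):
    """Non-adjacent form of a non-negative integer, least-significant digit
    first; digits are in {-1, 0, 1} and no two adjacent digits are non-zero."""
    digits = []
    while n > 0:
        if n & 1:
            d = 2 - (n % 4)   # +1 if n % 4 == 1, -1 if n % 4 == 3
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

# Example: to_naf(7) -> [-1, 0, 0, 1], i.e. -1 + 8 = 7
```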

  14. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out for evaluating the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) standard digital TV in the VHF-highband. Measurements were performed in urban and suburban areas in a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of Gaussian additive noise was employed to achieve the signal to noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal to noise ratio thresholds for correct signal decoding.

  15. Phase resolved digital signal processing in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    de Boer, Johannes F.; Tripathi, Renu; Park, Boris H.; Nassif, Nader

    2002-06-01

    We present phase resolved digital signal processing techniques for Optical Coherence Tomography to correct for the non Gaussian shape of source spectra and for Group Delay Dispersion (GDD). A broadband source centered at 820 nm was synthesized by combining the spectra of two superluminescent diodes to improve axial image resolution in an optical coherence tomography (OCT) system. Spectral shaping was used to reduce the side lobes (ringing) in the axial point spread function due to the non-Gaussian shape of the spectra. Images of onion cells taken with each individual source and the combined sources, respectively, show the improved resolution and quality enhancement in a turbid biological sample. An OCT system operating at 1310 nm was used to demonstrate that the broadening effect of group delay dispersion (GDD) on the coherence function could be eliminated completely by introducing a quadratic phase shift in the Fourier domain of the interferometric signal. The technique is demonstrated by images of human skin grafts with group delay dispersion mismatch between sample and reference arm before and after digital processing.
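
    The GDD correction amounts to multiplying the Fourier transform of the interferometric fringe by a quadratic phase of opposite sign. A generic sketch follows; the coefficient's value and units depend on the axis convention and are assumed known:

```python
import numpy as np

def correct_gdd(fringe, gdd_coeff):
    """Remove group-delay-dispersion broadening from an OCT fringe signal
    by applying a compensating quadratic phase in the Fourier domain."""
    spectrum = np.fft.fft(fringe)
    f = np.fft.fftfreq(len(fringe))          # normalized frequency axis
    return np.fft.ifft(spectrum * np.exp(-1j * gdd_coeff * f**2))
```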

  16. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, affecting the process itself and generating inaccurate data due to human errors during the stressful workday of health staff in the operating theater. Indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. Process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  17. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, affecting the process itself and generating inaccurate data due to human errors during the stressful workday of health staff in the operating theater. Indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. Process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  18. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  19. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  20. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  1. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  2. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
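
    The slant-range and water-column corrections mentioned above reduce, for a flat seafloor, to a simple geometric resampling. The sketch below assumes flat bathymetry and uses hypothetical names; it is not the GLORIA software itself:

```python
import numpy as np

def slant_to_ground(slant_ranges, water_depth):
    """Convert across-track slant ranges to horizontal ground ranges for a
    towfish flying water_depth metres above a flat seafloor.  Samples with
    slant range shorter than the water depth lie in the water column and
    map to zero ground range."""
    sr = np.asarray(slant_ranges, dtype=float)
    return np.sqrt(np.clip(sr**2 - water_depth**2, 0.0, None))
```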

  3. Graphics processing unit accelerated computation of digital holograms.

    PubMed

    Kang, Hoonjong; Yaraş, Fahri; Onural, Levent

    2009-12-01

    An approximation for fast digital hologram generation is implemented on a central processing unit (CPU), a graphics processing unit (GPU), and a multi-GPU computational platform. The computational performance of the method on each platform is measured and compared. The computational speed on the GPU platform is much faster than on a CPU, and the algorithm could be further accelerated on a multi-GPU platform. In addition, the accuracy of the algorithm for single- and double-precision arithmetic is evaluated. The quality of the reconstruction from the algorithm using single-precision arithmetic is comparable with the quality from the double-precision arithmetic, and thus the implementation using single-precision arithmetic on a multi-GPU platform can be used for holographic video displays.

  4. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data base management and analysis system is presented. It was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing an effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  5. The Digital Fields Board for the FIELDS instrument suite on the Solar Probe Plus mission: Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Malaspina, David M.; Ergun, Robert E.; Bolton, Mary; Kien, Mark; Summers, David; Stevens, Ken; Yehle, Alan; Karlsson, Magnus; Hoxie, Vaughn C.; Bale, Stuart D.; Goetz, Keith

    2016-06-01

    The first in situ measurements of electric and magnetic fields in the near-Sun environment (< 0.25 AU from the Sun) will be made by the FIELDS instrument suite on the Solar Probe Plus mission. The Digital Fields Board (DFB) is an electronics board within FIELDS that performs analog and digital signal processing, as well as digitization, for signals between DC and 60 kHz from five voltage sensors and four search coil magnetometer channels. These nine input signals are processed on the DFB into 26 analog data streams. A specialized application-specific integrated circuit performs analog to digital conversion on all 26 analog channels simultaneously. The DFB then processes the digital data using a field programmable gate array (FPGA), generating a variety of data products, including digitally filtered continuous waveforms, high-rate burst capture waveforms, power spectra, cross spectra, band-pass filter data, and several ancillary products. While the data products are optimized for encounter-based mission operations, they are also highly configurable, a key design aspect for a mission of exploration. This paper describes the analog and digital signal processing used to ensure that the DFB produces high-quality science data, using minimal resources, in the challenging near-Sun environment.

  6. Performance evaluation of image processing algorithms in digital mammography

    NASA Astrophysics Data System (ADS)

    Zanca, Federica; Van Ongeval, Chantal; Jacobs, Jurgen; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2008-03-01

    The purpose of the study is to evaluate the performance of different image processing algorithms in terms of representation of microcalcification clusters in digital mammograms. Clusters were simulated in clinical raw ("for processing") images. The entire dataset of images consisted of 200 normal mammograms, selected out of our clinical routine cases and acquired with a Siemens Novation DR system. In 100 of the normal images a total of 142 clusters were simulated; the remaining 100 normal mammograms served as true negative input cases. Both abnormal and normal images were processed with 5 commercially available processing algorithms: Siemens OpView1 and Siemens OpView2, Agfa Musica1, Sectra Mamea AB Sigmoid and IMS Raffaello Mammo 1.2. Five observers were asked to locate and score the cluster(s) in each image, by means of a dedicated software tool. Observer performance was assessed using the JAFROC Figure of Merit. FROC curves, fitted using the IDCA method, have also been calculated. JAFROC analysis revealed significant differences among the image processing algorithms in the detection of microcalcification clusters (p=0.0000369). Calculated average Figures of Merit are: 0.758 for Siemens OpView2, 0.747 for IMS Processing 1.2, 0.736 for Agfa Musica1 processing, 0.706 for Sectra Mamea AB Sigmoid processing and 0.703 for Siemens OpView1. This study is a first step towards a quantitative assessment of image processing in terms of cluster detection in clinical mammograms. Although we showed a significant difference among the image processing algorithms, this method does not on its own allow for a global performance ranking of the investigated algorithms.

  7. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  8. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  9. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  10. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be absolutely suitable to distinguish masses and microcalcifications from the background tissue using morphological operators and then extract them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
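
    As a rough illustration of the kind of morphological, intensity-based segmentation mentioned in the abstract (a generic SciPy sketch, not the authors' algorithm; the threshold and structuring-element size are assumptions):

```python
import numpy as np
from scipy import ndimage

def candidate_regions(mammogram, threshold, opening_size=3):
    """Threshold a grayscale mammogram, clean the mask with a morphological
    opening, and label connected regions as candidate masses or
    microcalcification clusters for later classification."""
    mask = mammogram > threshold
    mask = ndimage.binary_opening(
        mask, structure=np.ones((opening_size, opening_size)))
    labels, n_regions = ndimage.label(mask)
    return labels, n_regions
```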

  11. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also, selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  12. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  13. Automation of contact lens fitting evaluation by digital image processing

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Barros, Rui; Franco, Sandra B.

    1997-08-01

    Contact lens fitting evaluation is of critical importance in the contact lens prescription process. For the correction of the eye's refraction problems the use of contact lenses is very appealing to the user. However, their prescription is far more demanding than that of eyeglasses. The fitting of a contact lens to a particular cornea must be carefully assessed in order to reduce possible physical discomfort for the user or even medical complications. The traditional way of easily checking the fitting of a contact lens is to perform a fluorescein test. The simple visual evaluation of the 'smoothness' of the color/brightness distribution of the fluorescence at the contact lens' location gives the optometrist an idea of the fitting's quality. We suggest automating the process simply by substituting a CCD camera for the optometrist's eye and using appropriate, simple image processing techniques. The setup and the digitization and processing routines are described in this communication. The processed images may then be directly analyzed by the optometrist in a faster, easier and more efficient way. However, it is also possible to perform an automated fitting evaluation by working out the information given by the image's intensity histograms for the green and blue RGB channels.

  14. Automation of contact lens' fitting evaluation by digital image processing

    NASA Astrophysics Data System (ADS)

    da Cunha Martins Costa, M.; Barros, Rui; Franco, Sandra B.

    1997-10-01

    Contact lens fitting evaluation is of critical importance in the contact lens prescription process. For the correction of the eye's refraction problems the use of contact lenses is very appealing to the user. However, their prescription is far more demanding than that of eyeglasses. The fitting of a contact lens to a particular cornea must be carefully assessed in order to reduce possible physical discomfort for the user or even medical complications. The traditional way of easily checking the fitting of a contact lens is to perform a fluorescein test. The simple visual evaluation of the 'smoothness' of the color/brightness distribution of the fluorescence at the contact lens' location gives the optometrist an idea of the fitting's quality. We suggest automating the process simply by substituting a CCD camera for the optometrist's eye and using appropriate, simple image processing techniques. The setup and the digitization and processing routines are described in this communication. The processed images may then be directly analyzed by the optometrist in a faster, easier and more efficient way. However, it is also possible to perform an automated fitting evaluation by working out the information given by the image's intensity histograms for the green and blue RGB channels.

  15. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases consisting of nonprocessed soft copy versions of the digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations: the manufacturer's default (applied and laser printed to film by each of the manufacturers), and MUSICA and PLAHE, which were presented in soft-copy display. There were three radiologists per presentation. Area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen film for all digital presentations. Specificity for Fischer digital calcifications cases was worse than screen film for images processed in default and PLAHE algorithms. Specificity for Trex digital calcifications cases was worse than screen film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.

  16. Methodology for measurement of fault latency in a digital avionic miniprocessor

    NASA Technical Reports Server (NTRS)

    Mcgough, J. G.; Swern, F.; Bavuso, S. J.

    1981-01-01

    Investigations regarding the synthesis of a reliability assessment capability for fault-tolerant computer-based systems have been conducted for several years. In 1978 a pilot study was conducted to test the feasibility of measuring detection coverage and investigating the dynamics of fault propagation in a digital computer. A description is presented of an investigation concerned with the applicability of previous results to a real avionics processor. The obtained results show that emulation is a practicable approach to failure modes and effects analysis of a digital processor. The emulated processor runs only 20,000 to 25,000 times slower on a PDP-10 host computer than the actual processor. As a consequence, large numbers of faults can be studied at relatively little cost and in a timely manner.

  17. Solvent Substitution Methodology Using Multiattribute Utility Theory and the Analytical Hierarchical Process

    DTIC Science & Technology

    1994-09-01

    AFIT thesis AFIT/GEE/ENS/94S-3, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Only title-page and table-of-contents fragments are available for this record; the listed topics include depot-level, field-level, and contractor solvent substitution processes and multiattribute utility theory (MAUT).

  18. Intranets and Digital Organizational Information Resources: Towards a Portable Methodology for Design and Development.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1997-01-01

    Discusses the concept of the intranet, comparing and contrasting it with groupware, and presents an argument for its value based on technical and information management considerations. Presents an intranet development project for an academic organization and describes a portable, user-centered and team-based methodology for the design and…

  19. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  20. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  1. [Photodensitometry: microdensitometry (MD): digital image processing method (DIP)].

    PubMed

    Ohama, K; Sanada, M; Nakagawa, H

    1994-09-01

    The principles of microdensitometry (MD) and the digital image processing method (DIP), as well as the application of these methods to measuring bone mineral density in clinical practice, are described in this report. MD and DIP assess bone mineral content and bone mineral density by analyzing the relative contrast of the second metacarpal (metacarpus II) on an X-ray image. However, the parameters obtained by these methods have been reported to be closely related to lumbar vertebral bone mineral density and whole-body bone mineral content as measured by dual-energy X-ray absorptiometry (DXA). Being easy to use, MD and DIP are adequate for the screening of osteoporosis. Once any reduction in bone mineral content or bone mineral density is shown by MD or DIP, it is recommended to measure the bone mineral density of the vertebrae and femoral neck by DXA.

  2. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  3. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  4. Anisotropy of Photopolymer Parts Made by Digital Light Processing.

    PubMed

    Monzón, Mario; Ortega, Zaida; Hernández, Alba; Paz, Rubén; Ortega, Fernando

    2017-01-13

    Digital light processing (DLP) is an accurate additive manufacturing (AM) technology suitable for producing micro-parts by photopolymerization. As with most AM technologies, anisotropy of parts made by DLP is a key issue, since several operational factors modify this characteristic. Design for this technology and its photopolymers becomes a challenge because the manufacturing process and post-processing strongly influence the mechanical properties of the part. This paper presents experimental work demonstrating the particular behavior of parts made using DLP. Because DLP differs from other AM technologies, design rules need to be adapted. The influence of build direction and the post-curing process on final mechanical properties and anisotropy is reported and justified based on experimental data and theoretical simulation of bi-material parts formed by fully-cured resin and partially-cured resin. Three photopolymers were tested under different working conditions, leading to the conclusion that post-curing can, in some cases, correct the anisotropy, depending mainly on the nature of the photopolymer.

  5. Edge detection - Image-plane versus digital processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
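
    A compact sketch of the Laplacian-of-Gaussian / zero-crossing approach discussed above (a generic SciPy implementation, not the Wiener-Laplacian-of-Gaussian filter or the image-plane processor of the paper; the sigma value is an assumption):

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_edges(image, sigma=2.0):
    """Mark zero crossings of the Laplacian-of-Gaussian response: a pixel is
    an edge candidate if the filtered value changes sign against its right
    or lower neighbour."""
    response = gaussian_laplace(image.astype(float), sigma)
    edges = np.zeros(response.shape, dtype=bool)
    edges[:, :-1] |= np.signbit(response[:, :-1]) != np.signbit(response[:, 1:])
    edges[:-1, :] |= np.signbit(response[:-1, :]) != np.signbit(response[1:, :])
    return edges
```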

  6. Anisotropy of Photopolymer Parts Made by Digital Light Processing

    PubMed Central

    Monzón, Mario; Ortega, Zaida; Hernández, Alba; Paz, Rubén; Ortega, Fernando

    2017-01-01

    Digital light processing (DLP) is an accurate additive manufacturing (AM) technology suitable for producing micro-parts by photopolymerization. As with most AM technologies, anisotropy of parts made by DLP is a key issue, since several operational factors modify this characteristic. Design for this technology and its photopolymers becomes a challenge because the manufacturing process and post-processing strongly influence the mechanical properties of the part. This paper presents experimental work demonstrating the particular behavior of parts made using DLP. Because DLP differs from other AM technologies, design rules need to be adapted. The influence of build direction and the post-curing process on final mechanical properties and anisotropy is reported and justified based on experimental data and theoretical simulation of bi-material parts formed by fully-cured resin and partially-cured resin. Three photopolymers were tested under different working conditions, leading to the conclusion that post-curing can, in some cases, correct the anisotropy, depending mainly on the nature of the photopolymer. PMID:28772426

  7. Principles of image processing in digital chest radiography.

    PubMed

    Prokop, Mathias; Neitzel, Ulrich; Schaefer-Prokop, Cornelia

    2003-07-01

    Image processing has a major impact on image quality and diagnostic performance of digital chest radiographs. Goals of processing are to reduce the dynamic range of the image data to capture the full range of attenuation differences between lungs and mediastinum, to improve the modulation transfer function to optimize spatial resolution, to enhance structural contrast, and to suppress image noise. Image processing comprises look-up table operations and spatial filtering. Look-up table operations allow for automated signal normalization and arbitrary choice of image gradation. The most simple and still widely applied spatial filtering algorithms are based on unsharp masking. Various modifications were introduced for dynamic range reduction and MTF restoration. More elaborate and more effective are multi-scale frequency processing algorithms. They are based on the subdivision of an image in multiple frequency bands according to its structural composition. This allows for a wide range of image manipulations including a size-independent enhancement of low-contrast structures. Principles of the various algorithms will be explained and their impact on image appearance will be illustrated by clinical examples. Optimum and sub-optimum parameter settings are discussed and pitfalls will be explained.
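
    The unsharp-masking operation referred to above has a one-line core. The sketch below uses illustrative parameter values only; clinical multi-scale algorithms add band-dependent gains and noise containment, and the dynamic-range-reduction variants mentioned in the abstract instead subtract a weighted low-pass copy of the image:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=25.0, gain=0.7):
    """Classic unsharp masking: enhanced = original + gain * (original - blurred).
    Enhances the contrast of structures around the spatial scale set by sigma."""
    img = image.astype(float)
    return img + gain * (img - gaussian_filter(img, sigma))
```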

  8. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    NASA Astrophysics Data System (ADS)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited, shared resource, used today by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum; consequently, they want assurance that the licensed radio spectrum is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet electromagnetic compatibility regulations. The competent authorities are therefore called on to control access to the spectrum by adopting suitable management and monitoring policies, and manufacturers have to periodically verify the correct operation of their equipment. Several measurement solutions are available on the market, generally real-time spectrum analyzers and measurement receivers. Both offer good metrological accuracy but have costs, dimensions and weights that make use in the field impractical. This paper presents a first step toward a digital-signal-processing-based measurement instrument able to meet the above needs. In particular, attention has been given to the DSP-based measurement section of the instrument. To this end, an innovative measurement method for spectrum monitoring and management is proposed, performing an efficient sequential analysis based on sample-by-sample digital processing. Three main goals are pursued: (i) measurement performance comparable to that of other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
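
    A standard building block for this kind of sample-by-sample processing is the Goertzel recursion, which evaluates the power in a single DFT bin without a full FFT. The sketch below is a generic illustration, not the method proposed in the paper:

```python
import numpy as np

def goertzel_power(samples, k, n):
    """Power at DFT bin k of an n-point block, accumulated one sample at a
    time with the Goertzel recursion."""
    coeff = 2.0 * np.cos(2.0 * np.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples[:n]:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
```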

  9. Digital Image Processing for Noise Reduction in Medical Ultrasonics

    NASA Astrophysics Data System (ADS)

    Loupas, Thanasis

    The purpose of this project was to investigate the application of digital image processing techniques as a means of reducing noise in medical ultrasonic imaging. Ultrasonic images suffer primarily from a type of acoustic noise, known as speckle, which is generally regarded as a major source of image quality degradation. The origin of speckle, its statistical properties as well as methods suggested to eliminate this artifact were reviewed. A simple model which can characterize the statistics of speckle on displays was also developed. A large number of digital noise reduction techniques was investigated. These include frame averaging techniques performed by commercially available devices and spatial filters implemented in software. Among the latter, some filters have been proposed in the scientific literature for ultrasonic, laser and microwave speckle or general noise suppression and the rest are original, developed specifically to suppress ultrasonic speckle. Particular emphasis was placed on adaptive techniques which adjust the processing performed at each point according to the local image content. In this way, they manage to suppress speckle with negligible loss of genuine image detail. Apart from preserving the diagnostically significant features of a scan another requirement a technique must satisfy before it is accepted in routine clinical practice is real-time operation. A spatial filter capable of satisfying both these requirements was designed and built in hardware using low-cost and readily available components. The possibility of incorporating all the necessary filter circuitry into a single VLSI chip was also investigated. In order to establish the effectiveness and usefulness of speckle suppression, a representative sample from the techniques examined here was applied to a large number of abdominal scans and their effect on image quality was evaluated. Finally, further
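
    A widely used example of such an adaptive speckle filter is the Lee filter, which smooths strongly where the local variance is low and preserves detail where it is high. This is a generic sketch, not the hardware filter described in the thesis; the window size and noise-variance estimate are assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(image, size=7, noise_var=0.05):
    """Adaptive (Lee-type) speckle filter: blend the local mean and the
    original pixel value according to the ratio of local signal variance
    to an assumed speckle-noise variance."""
    img = image.astype(float)
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img**2, size)
    var = np.maximum(sq_mean - mean**2, 0.0)
    gain = var / (var + noise_var)
    return mean + gain * (img - mean)
```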

  10. Symbol processing in the left angular gyrus: evidence from passive perception of digits.

    PubMed

    Price, Gavin R; Ansari, Daniel

    2011-08-01

    Arabic digits are one of the most ubiquitous symbol sets in the world. While there have been many investigations into the neural processing of the semantic information digits represent (e.g. through numerical comparison tasks), little is known about the neural mechanisms which support the processing of digits as visual symbols. To characterise the component neurocognitive mechanisms which underlie numerical cognition, it is essential to understand the processing of digits as a visual category, independent of numerical magnitude processing. The 'Triple Code Model' (Dehaene, 1992; Dehaene and Cohen, 1995) posits an asemantic visual code for processing Arabic digits in the ventral visual stream, yet there is currently little empirical evidence in support of this code. This outstanding question was addressed in the current functional Magnetic Resonance (fMRI) study by contrasting brain responses during the passive viewing of digits versus letters and novel symbols at short (50 ms) and long (500 ms) presentation times. The results of this study reveal increased activation for familiar symbols (digits and letters) relative to unfamiliar symbols (scrambled digits and letters) at long presentation durations in the left dorsal Angular gyrus (dAG). Furthermore, increased activation for Arabic digits was observed in the left ventral Angular gyrus (vAG) in comparison to letters, scrambled digits and scrambled letters at long presentation durations, but no digit specific activation in any region at short presentation durations. These results suggest an absence of a digit specific 'Visual Number Form Area' (VNFA) in the ventral visual cortex, and provide evidence for the role of the left ventral AG during the processing of digits in the absence of any explicit processing demands. We conclude that Arabic digit processing depends specifically on the left AG rather than a ventral visual stream VNFA. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  12. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For these measurements, tests are conducted in the laboratory, which take many hours to give the final result. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
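
    As an illustration of the segmentation-and-analysis step described above, the sketch below thresholds a grayscale micrograph and extracts simple morphological parameters per floc using scikit-image. It is a minimal example of the general approach, not the chapter's actual pipeline; the bright-flocs-on-dark-background assumption and the choice of Otsu thresholding are illustrative.

    ```python
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    def floc_morphology(gray_image):
        """Segment flocs by Otsu thresholding and return per-object
        morphological parameters (area, eccentricity, solidity)."""
        binary = gray_image > threshold_otsu(gray_image)  # assumes bright flocs on a dark background
        labelled = label(binary)
        return [(r.area, r.eccentricity, r.solidity) for r in regionprops(labelled)]
    ```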

  13. A novel digital pulse processing architecture for nuclear instrumentation

    SciTech Connect

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole; Paindavoine, Michel

    2015-07-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research, and new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method, while the literature proposes other algorithms based on frequency-domain or wavelet theory which show better performance. Another example is pileup events, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian character of the signal, composed of pulses with random arrival times, which requires current architectures to work in data-flow mode. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures freeze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to provide the performance necessary to deal with dead-time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a Digital Pulse Processing (DPP) architecture programmable in a high-level language such as C or C++, and able to reduce dead-time, would be worthwhile for nuclear instrumentation. This would shorten prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet the
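
    For reference, the traditional charge comparison method for neutron-gamma discrimination mentioned above amounts to comparing the delayed (tail) charge of a digitized pulse with its total charge. The sketch below illustrates the general idea only; the window lengths are assumptions, not values from the paper.

    ```python
    import numpy as np

    def charge_comparison(pulse, peak_index, tail_start=20, total_len=200):
        """Classic charge-comparison pulse shape discrimination: the ratio of the
        tail integral to the total integral.  Neutron pulses carry relatively more
        charge in the tail than gamma pulses, so the ratio separates the two."""
        window = np.asarray(pulse[peak_index:peak_index + total_len], dtype=float)
        total = window.sum()
        tail = window[tail_start:].sum()
        return tail / total if total > 0 else 0.0
    ```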

  14. Digital signal processing techniques for coherent optical communication

    NASA Astrophysics Data System (ADS)

    Goldfarb, Gilad

    Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically, and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection, which rely on the preservation of full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. An analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise of high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented. Interaction between fiber dispersion and nonlinearity remains the last major challenge

  15. Array Signal Processing for Source Localization and Digital Communication.

    NASA Astrophysics Data System (ADS)

    Song, Bong-Gee

    Array antennas are used in several areas such as sonar and digital communication. Although array patterns may differ depending on the application, they are used with a view to collecting more data and obtaining better results. We first consider a passive sonar system in random environments where the index of refraction is random. While source localization problems for deterministic environments are well studied, they require accurate propagation models which are not available in random environments. We extend the localization problems to random environments. It has been shown that methods developed for deterministic environments fail in random environments because of the stochastic nature of acoustic propagation. Therefore, we model the observations as random and use a statistical signal processing technique combined with physics. The statistical signal model is provided by physics, either empirically or theoretically. The performance of the technique relies on the accuracy of the statistical models. We have applied the maximum likelihood method to angle-of-arrival estimation and range estimation problems. The Cramer-Rao lower bounds have also been derived to predict the estimation performance. Next, we use array antennas for diversity combining equalization in digital communications. Spatial diversity equalization is used in two ways: to improve the bit error rate or to improve the transmission rate. This is feasible by using more antennas at the receiver end. We apply Helstrom's saddle point integration method to multi-input multi-output communication systems and show that a factor of 3-4 of channel reuse is possible. It is also shown that the advantage comes from the diversity itself, not from additional taps. We further improve the equalization performance by joint pre- and postfilter design. Two different methods have been proposed according to the prefilter type. Although the mean square error is not easy to minimize, appropriate methods have been adopted and show

  16. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers applicable to digitized multispectral aerial photography are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.

  17. Social Information Processing, Emotions, and Aggression: Conceptual and Methodological Contributions of the Special Section Articles

    ERIC Educational Resources Information Center

    Arsenio, William F.

    2010-01-01

    This discussion summarizes some of the key conceptual and methodological contributions of the four articles in this special section on social information processing (SIP) and aggression. One major contribution involves the new methodological tools these studies provide for future researchers. Eye-tracking and mood induction techniques will make it…

  19. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as
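
    As an illustration of the filtering and decimation stage described above, the sketch below designs a linear-phase equiripple (Parks-McClellan) FIR low-pass filter and decimates the filtered interferogram. The band edges, tap count and decimation factor are illustrative assumptions; the GIFTS filter is a complex filter designed to the instrument's own specification.

    ```python
    from scipy import signal

    # Even-length, linear-phase equiripple FIR low-pass design (normalized frequencies).
    numtaps = 64                                            # illustrative tap count
    taps = signal.remez(numtaps, [0.0, 0.2, 0.25, 0.5], [1, 0], fs=1.0)

    def filter_and_decimate(interferogram, taps, factor=4):
        """Low-pass filter a raw interferogram, then keep every 'factor'-th sample."""
        filtered = signal.lfilter(taps, [1.0], interferogram)
        return filtered[::factor]
    ```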

  20. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Astrophysics Data System (ADS)

    Tian, J.; Reisse, R.; Gazarik, M.

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as

  2. Methodologies for automating the collection and processing of GPS-GIS information for transportation systems

    NASA Astrophysics Data System (ADS)

    Zhao, Bingyan

    All transportation departments have large amounts of data and information that are needed for planning and operation of their systems. This information can be textual, graphical or spatial in nature. Spatial information is generally in the form of maps and these maps are increasingly being stored and processed as digital GIS files that can be linked to other types of information generally referred to as attribute information. In the NYSDOT database, there are many kinds of features for which information must be maintained. For example, there are about 22,500 bridges within the New York State road systems. The current spatial location for these bridges may not have the level of accuracy that would be desired by today's standards and that can be achieved with new spatial measuring techniques. Although the updating of bridge locations and the location of other features can be done using new techniques such as GPS, if this is done manually it presents a forbidding task. The main objective of this study is to find a way to automatically collect feature location data using GPS equipment and to automate the transfer of this information into archival databases. Among the objectives of this dissertation are: how to automatically download information from the DOT database; how to collect field data following a uniform procedure; how to convert the surveying results into Arc/View shape files and how to update the DOT feature location map information using field data. The end goal is to develop feasible methodologies to automate updating of mapping information using GPS by creating a systems design for the process and to create the scripts and programming needed to make this system work. This has been accomplished and is demonstrated in a sample program. Details of the Automated Acquisition System are described in this dissertation.

  3. Influence of Digital Camera Errors on the Photogrammetric Image Processing

    NASA Astrophysics Data System (ADS)

    Sužiedelytė-Visockienė, Jūratė; Bručas, Domantas

    2009-01-01

    The paper deals with the calibration of the digital camera Canon EOS 350D, often used for the photogrammetric 3D digitalisation and measurement of industrial and construction site objects. During the calibration, data on the optical and electronic parameters influencing the distortion of images were obtained, such as the correction of the principal point, the focal length of the objective, and the radial symmetric and asymmetric distortions. The calibration was performed by means of the Tcc software, which implements Chebyshev polynomials, using a special test field with marks whose coordinates are precisely known. The main task of the research was to determine how the camera calibration parameters influence the processing of images, i.e. the creation of the geometric model, the results of triangulation calculations and stereo-digitalisation. Two photogrammetric projects were created for this task: the first used uncorrected images, while the second used images corrected for the optical errors of the camera obtained during the calibration. The results of the image-processing analysis are shown in figures and tables, and conclusions are given.

  4. Research and development for Digital Voice Processing (DVP)

    NASA Astrophysics Data System (ADS)

    Tardelli, J. D.; Lafollette, P. A.; Walter, C. M.; Leblanc, J.; Gatewood, P. D.

    1991-07-01

    This technical report covers a variety of topics and research areas. An update of the Canonical Coordinate (CC) Transformation process for digital speech compression based on non-Euclidean error minimization criteria is given. A method of transforming empirical filter sets to unitary error metrics is introduced and a sample error metric is developed. Research into frequency-shift-invariant transformations is presented and its utility for developing CC error metrics is discussed. A study of the relationship between CC analysis and Linear Predictive Coding is presented. An improved AP120-B implementation of the CC algorithm is given along with a fully parallel CC implementation on a systolic array processor. The latest developments in processor hardware, operating systems, and program development software at RADC/EEV are documented. Custom additions to the Interactive Laboratory System (ILS) analysis and display package are covered, along with its relationship to the Speech Data Base Library used at RADC/EEV. Database input/output programs, analysis tools, and search/sort programs are all presented. The status of a communicability testbed system at the Speech Processing Facility is presented.

  5. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  6. Guidelines for the structure of methodological processes in visual anthropology.

    PubMed

    Svilicić, Niksa

    2011-12-01

    The vast majority of visual anthropologists of the 20th century were focused on the general phenomenology of visual anthropology, i.e. the content aspect of their works and their impact on scientific knowledge, leaving aside the style of directing and the practical principles and processes of creating an anthropological film. So far, judging by the available literature, there are no strict guidelines for directorial procedures, nor a precise definition of the methodological processes in the production of an anthropological film. Consequently, the goal of this study is to determine the structure and forms of these methodological processes as well as to define the advantages and disadvantages of each of them. By using adequate guidelines, the researcher, i.e. the author of the anthropological film, can optimize the production and post-production processes as early as the preparation (preproduction) period of working on the film, through the choice of the approach to production (proactive/reactive/subjective/objective...) and by defining the style of directing. In other words, this ultimately means a more relevant scientific research result obtained with less time and fewer resources.

  7. Process improvements using the NCMS electrical testing methodology

    SciTech Connect

    Goldammer, S.E.; Tucker, D.R.

    1997-06-01

    The conductor analysis electrical testing method uses the artwork patterns and equipment developed by the National Center for Manufacturing Sciences (NCMS) Printed Wiring Board Imaging Team. These patterns and electrical test methods are used to evaluate new or improve existing printed wiring board processes.

  8. Radiometric calibration of digital cameras using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Schall, Martin; Grunwald, Michael; Umlauf, Georg; Franz, Matthias O.

    2015-05-01

    Digital cameras are subject to physical, electronic and optical effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting and different sensitivities of individual pixels. The task of a radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes, a regression method widely used in machine learning that is particularly useful in our context. Gaussian process regression is used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and are thus unsuitable for many practical applications. In contrast, a single regression model for an entire image with high spatial resolution leads to a low-quality radiometric calibration, which also limits its practical use. The proposed algorithm is predicated on a partitioning of the pixels such that each pixel partition can be represented by one single regression model without quality loss. Partitioning is done by extracting features from the characteristic of each pixel and using them for lexicographic sorting. Splitting the sorted data into partitions of equal size yields the final partitions, each of which is represented by the partition centers. An individual Gaussian process regression and model selection is done for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. The experimental comparison of the proposed approach to classical flat-field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
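
    A minimal sketch of the per-partition regression step, using scikit-learn's Gaussian process regressor, is shown below; the feature extraction, lexicographic sorting and model selection described above are omitted, and the kernel choice is an assumption.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def fit_partition(gray_values, temperatures, exposure_times, true_intensities):
        """Fit one Gaussian process for a pixel partition, mapping
        (observed gray value, temperature, exposure time) -> true light intensity."""
        X = np.column_stack([gray_values, temperatures, exposure_times])
        kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel()   # assumed kernel
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(X, true_intensities)
        return gp

    # Calibration: each pixel's gray value is interpolated with its partition's model,
    #   corrected = gp.predict(np.column_stack([gray, temp, exposure]))
    ```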

  9. Development of next generation digital flat panel catheterization system: design principles and validation methodology

    NASA Astrophysics Data System (ADS)

    Belanger, B.; Betraoui, F.; Dhawale, P.; Gopinath, P.; Tegzes, Pal; Vagvolgyi, B.

    2006-03-01

    The design principles that drove the development of a new cardiovascular x-ray digital flat panel (DFP) detector system are presented, followed by assessments of imaging and dose performance achieved relative to other state-of-the-art FPD systems. The new system (GE Innova 2100 IQ™) incorporates a new detector with substantially improved DQE at fluoroscopic (73%@1μR) and record (79%@114μR) doses, an x-ray tube with higher continuous fluoro power (3.2kW), a collimator with a wide range of copper spectral filtration (up to 0.9mm), and an improved automatic x-ray exposure management system. The performance of this new system was compared to that of the previous-generation GE product (Innova 2000) and to state-of-the-art cardiac digital x-ray flat panel systems from two other major manufacturers. Performance was assessed with the industry-standard cardiac x-ray NEMA/SCA&I phantom, and a new moving coronary artery stent (MCAS) phantom designed to simulate cardiac clinical imaging conditions, composed of an anthropomorphic chest section with stents moving in a manner simulating normal coronary arteries. The NEMA/SCA&I phantom results showed the Innova 2100 IQ to exceed or equal the Innova 2000 in all of the performance categories, while operating at 28% lower dose on average, and to exceed the other DFP systems in most of the performance categories. The MCAS phantom tests showed the Innova 2100 IQ to be significantly better (p << 0.05) than the Innova 2000, and significantly better than the other DFP systems in most cases at comparable or lower doses, thereby verifying excellent performance against the design goals.

  10. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  11. Knowledge and Processes That Predict Proficiency in Digital Literacy

    ERIC Educational Resources Information Center

    Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.

    2014-01-01

    Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…

  12. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    SciTech Connect

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  13. Monitoring Pharmacy Expert System Performance Using Statistical Process Control Methodology

    PubMed Central

    Doherty, Joshua A.; Reichley, Richard M.; Noirot, Laura A.; Resetar, Ervina; Hodge, Michael R.; Sutter, Robert D.; Dunagan, Wm Claiborne; Bailey, Thomas C.

    2003-01-01

    Automated expert systems provide a reliable and effective way to improve patient safety in a hospital environment. Their ability to analyze large amounts of data without fatigue is a decided advantage over clinicians who perform the same tasks. As dependence on expert systems increases and the systems become more complex, it is important to closely monitor their performance. Failure to generate alerts can jeopardize the health and safety of patients, while generating excessive false positives can cause valid alerts to be dismissed as noise. In this study, statistical process control charts were used to monitor an expert system, and the strengths and weaknesses of this technology are presented.
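
    One common control-chart calculation for this kind of monitoring is a p-chart on the daily alert rate, sketched below; the chart type and limits actually used in the study are not specified here, so the example is purely illustrative.

    ```python
    import numpy as np

    def p_chart_limits(alerts_per_day, orders_per_day):
        """Center line and 3-sigma control limits for the daily proportion of
        screened orders that trigger an alert.  Days whose observed rate falls
        outside the limits signal a possible change in expert-system behavior."""
        alerts = np.asarray(alerts_per_day, dtype=float)
        orders = np.asarray(orders_per_day, dtype=float)
        p_bar = alerts.sum() / orders.sum()              # overall alert rate
        sigma = np.sqrt(p_bar * (1.0 - p_bar) / orders)  # per-day standard error
        upper = p_bar + 3.0 * sigma
        lower = np.clip(p_bar - 3.0 * sigma, 0.0, None)
        return p_bar, lower, upper
    ```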

  14. How processing digital elevation models can affect simulated water budgets.

    PubMed

    Kuniansky, Eve L; Lowery, Mark A; Campbell, Bruce G

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
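
    The difference between the two DEM-processing choices described above can be made concrete with a few lines of NumPy: each model cell takes either the mean of all DEM values falling inside it or the single DEM value nearest its centroid. The grid and cell sizes are arbitrary illustrations.

    ```python
    import numpy as np

    def cell_land_surface(dem, cell):
        """Aggregate a fine DEM onto a coarser model grid in two ways:
        mean of all DEM pixels per cell vs. the value at each cell centroid."""
        ny, nx = dem.shape[0] // cell, dem.shape[1] // cell
        blocks = dem[:ny * cell, :nx * cell].reshape(ny, cell, nx, cell)
        mean_value = blocks.mean(axis=(1, 3))
        centroid_value = dem[cell // 2::cell, cell // 2::cell][:ny, :nx]
        return mean_value, centroid_value

    # Steeper local gradients in 'centroid_value' lead to larger simulated flows
    # when the water table is estimated from these land-surface elevations.
    ```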

  15. Pollution balance. A new methodology for minimizing waste production in manufacturing processes

    SciTech Connect

    Hilaly, A.K.; Sikdar, S.K.

    1994-11-01

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A 'pollution index', defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative measure of waste generation in a process. A waste reduction algorithm also has been developed from the pollution balance equation. This paper explains this methodology and demonstrates the applicability of the method by a case study. 8 refs., 7 figs.
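
    The pollution index defined above reduces to a one-line calculation; the masses in the example are made up for illustration.

    ```python
    def pollution_index(waste_mass_kg, product_mass_kg):
        """Pollution index: mass of waste produced per unit mass of product."""
        return waste_mass_kg / product_mass_kg

    # Illustrative numbers only: 250 kg of waste generated per 1000 kg of product.
    print(pollution_index(250.0, 1000.0))   # -> 0.25
    ```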

  16. Process integration methodology for natural gas-fueled heat pumps and cogeneration systems

    NASA Astrophysics Data System (ADS)

    Rossiter, Alan P.

    1988-11-01

    A process integration methodology was developed for analyzing industrial processes, identifying those that will benefit from natural gas-fueled heat pumps and cogeneration systems, as well as novel, process-specific opportunities for further equipment improvements, including performance targets. The development included the writing of software to assist in implementing the methodology and the application of the procedures in studies using both literature data and plant operating data. These highlighted potential applications for gas-fueled heat pumps in ethylene processes and liquor distilling plants, and slightly less attractive opportunities in a number of other plants. Many of the processes studied showed excellent potential for cogeneration applications.

  17. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  18. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.

  19. Delicate visual artifacts of advanced digital video processing algorithms

    NASA Astrophysics Data System (ADS)

    Nicolas, Marina M.; Lebowsky, Fritz

    2005-03-01

    With the advent of digital TV, sophisticated video processing algorithms have been developed to improve the rendering of motion or colors. However, the perceived subjective quality of these new systems sometimes happens to be in conflict with the objective, measurable improvement we expect to get. In this presentation, we show examples where algorithms should visually improve the skin tone rendering of decoded pictures under normal conditions, but surprisingly fail when the quality of MPEG encoding drops below a just-noticeable threshold. In particular, we demonstrate that simple objective criteria used for the optimization, such as SAD, PSNR or histograms, sometimes fail, partly because they are defined on a global scale, ignoring local characteristics of the picture content. We then integrate a simple human visual model to measure potential artifacts with regard to spatial and temporal variations of the objects' characteristics. Tuning some of the model's parameters allows correlating the perceived quality with compression metrics of various encoders. We show the evolution of our reference parameters with respect to the compression ratios. Finally, using the output of the model, we can control the parameters of the skin tone algorithm to reach an improvement in overall system quality.
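
    The global objective criteria mentioned above (SAD and PSNR) are easy to state in code, which also makes clear why they can miss local artifacts: each reduces an entire frame pair to a single number.

    ```python
    import numpy as np

    def sad(reference, test):
        """Sum of absolute differences between two frames."""
        return np.abs(reference.astype(float) - test.astype(float)).sum()

    def psnr(reference, test, peak=255.0):
        """Peak signal-to-noise ratio in dB, assuming 8-bit frames."""
        mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    ```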

  20. Digital Signal Processing for SiPM Timing Resolution

    NASA Astrophysics Data System (ADS)

    Philippov, D. E.; Popova, E. V.; Belyaev, V. N.; Buzhan, P. Z.; Stifutkin, A. A.; Vinogradov, S. L.

    2017-01-01

    Digital signal processing (DSP) is an emerging trend in experimental studies and applications of various detectors, including SiPMs. In particular, DSP is recognized as a promising approach to improve the coincidence timing resolution (CTR) of fast SiPM-based scintillation detectors. Single photon timing resolution (SPTR) is one of the key parameters affecting CTR, and is especially important when CTR approaches its ultimate limits, as demanded, for example, in Time-of-Flight PET. To study SiPM timing resolution, we developed special DSP software and applied it to both SPTR and CTR measurements. These measurements were carried out using 3x3 mm2 KETEK SiPM samples of timing-optimized and standard designs, with a 405 nm picosecond laser for SPTR and with 3x3x5 mm3 LYSO crystals and a 511 keV Na-22 source for CTR. The results of the study are useful for further improvements of DSP algorithms and SiPM designs for fast timing.

  2. Fully Digital: Policy and Process Implications for the AAS

    NASA Astrophysics Data System (ADS)

    Biemesderfer, Chris

    Over the past two decades, every scholarly publisher has migrated at least the mechanical aspects of their journal publishing so that they utilize digital means. The academy was comfortable with that for a while, but publishers are under increasing pressure to adapt further. At the American Astronomical Society (AAS), we think that means bringing our publishing program to the point of being fully digital, by establishing procedures and policies that regard the digital objects of publication primarily. We have always thought about our electronic journals as databases of digital articles, from which we can publish and syndicate articles one at a time, and we must now put flesh on those bones by developing practices that are consistent with the realities of article at a time publication online. As a learned society that holds the long-term rights to the literature, we have actively taken responsibility for the preservation of the digital assets that constitute our journals, and in so doing we have not forsaken the legacy pre-digital assets. All of us who serve as the long-term stewards of scholarship must begin to evolve into fully digital publishers.

  3. Optical Digital Parallel Truth-Table Look-Up Processing

    NASA Astrophysics Data System (ADS)

    Mirsalehi, Mir Mojtaba

    During the last decade, a number of optical digital processors have been proposed that combine the parallelism and speed of optics with the accuracy and flexibility of a digital representation. In this thesis, two types of such processors (an EXCLUSIVE-OR-based processor and a NAND-based processor) that function as content-addressable memories (CAMs) are analyzed. The main factors that affect the performance of the EXCLUSIVE-OR-based processor are found to be the Gaussian nature of the reference beam and the finite square aperture of the crystal. A quasi-one-dimensional model is developed to analyze the effect of the Gaussian reference beam, and a circular aperture is used to increase the dynamic range in the output power. The main factors that affect the performance of the NAND-based processor are found to be the variations in the amplitudes and the relative phase of the laser beams during the recording process. A mathematical model is developed for analyzing the probability of error in the output of the processor. Using this model, the performance of the processor for some practical cases is analyzed. Techniques that have been previously used to reduce the number of reference patterns in a CAM include using the residue number system and applying logical minimization methods. In the present work, these and additional techniques are investigated. A systematic procedure is developed for selecting the optimum set of moduli. The effect of coding is investigated and it is shown that multi-level coding, when used in conjunction with logical minimization techniques, significantly reduces the number of reference patterns. The Quine-McCluskey method is extended to multiple-valued logic and a computer program based on this extension is used for logical minimization. The results show that for moduli expressible as p^n, where p is a prime number and n is an integer greater than one, p-level coding provides significant reduction. The NAND-based processor is modified for
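
    As a small illustration of the residue number system referred to above, the sketch below represents an integer by its residues modulo a set of pairwise-coprime moduli and reconstructs it with the Chinese Remainder Theorem. The moduli are an arbitrary illustrative choice, not the optimum set derived in the thesis.

    ```python
    from math import prod

    MODULI = (7, 8, 9)   # pairwise coprime, illustrative only

    def to_residues(x, moduli=MODULI):
        """Residue representation of x: one residue per modulus."""
        return tuple(x % m for m in moduli)

    def from_residues(residues, moduli=MODULI):
        """Reconstruct x from its residues via the Chinese Remainder Theorem."""
        M = prod(moduli)
        x = 0
        for r, m in zip(residues, moduli):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
        return x % M

    assert from_residues(to_residues(123)) == 123
    ```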

  4. On the development of Arabic three-digit number processing in primary school children.

    PubMed

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-12-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children. Therefore, we evaluated the separate influences of hundreds, tens, and units on three-digit number processing by means of the hundred distance effect, the decade-hundred compatibility effect, and the unit-hundred compatibility effect in a longitudinal design from Grade 2 to Grade 4. In a number magnitude comparison task, a strong hundred distance effect indicated that the magnitudes of the hundreds digits were predominantly processed. We also observed indexes of decomposed parallel processing of hundreds and units digits but not of hundreds and tens digits. Regarding the developmental trajectories, the hundred distance effect and the unit-hundred compatibility effect showed a reliable trend to increase with grade level. However, both the significance and the increase with grade level of decomposed parallel processing were observed to be less consistent than expected. The latter is discussed in terms of different processing strategies as well as specificities differentiating between two- and three-digit numbers. Taken together, these are the first data showing decomposed processing of three-digit numbers in children. Yet, it must be noted that the results also indicate that findings from two-digit number processing cannot simply be generalized to the three-digit number range. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper addresses the detection and removal of cracks on digitized paintings. The cracks are detected by thresholding. Afterwards, thin dark brush strokes which have been misidentified as cracks are removed using either a median radial basis function neural network operating on hue and saturation data, or a semi-automatic procedure based on region growing. Finally, the cracks are filled using a Wiener filter. The method identifies and removes most of the cracks on digitized paintings, with a reported improvement of 90%. It applies not only to digitized paintings but also to medical images and BMP images, and is implemented in MATLAB.
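
    A minimal sketch of the thresholding and filling stages described above is given below using SciPy; the radial basis function network and region-growing steps are omitted, and the dark-pixel threshold is an assumption.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter
    from scipy.signal import wiener

    def detect_and_fill_cracks(gray, dark_threshold=60):
        """Detect dark cracks by thresholding, then replace the detected pixels
        with a Wiener-filtered estimate of the surrounding paint."""
        crack_mask = gray < dark_threshold            # cracks appear as dark, thin structures
        despeckled = median_filter(gray, size=3)      # suppress isolated dark pixels
        smoothed = wiener(despeckled.astype(float), mysize=5)
        restored = gray.astype(float).copy()
        restored[crack_mask] = smoothed[crack_mask]
        return crack_mask, restored.astype(gray.dtype)
    ```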

  6. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

    This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the description of conservation processes. In particular, the specific steps undertaken in developing and applying the CIDOC-CRM extensions for documenting the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece, are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the NDT&E methods, their output data, and the project's approach to standardizing conservation documentation through CIDOC-CRM extensions are further reported in Kouis et al. (2).

  7. Exploring the Developmental Changes in Automatic Two-Digit Number Processing

    ERIC Educational Resources Information Center

    Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey

    2011-01-01

    Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…

  8. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  9. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method of collecting data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  10. Optical processing architecture and its potential application for digital and analog radiography.

    PubMed

    Liu, H; Xu, J; Fajardo, L L

    1999-04-01

    In this report we introduce the fundamental architectures and the potential applications of optical processing techniques in medical imaging. Three basic optical processing architectures were investigated for digital and analog radiography. The processors consist of a module that converts either the analog or the digital radiograph into a coherent light distribution; a coherent optical processing architecture that performs various mathematical operations; a programmable digital-optical interface and other accessories. Optical frequency filters were implemented for mammographic and other clinical feature enhancement. In medical image processing, digital computers offer the advantages of programmability and flexibility. In contrast, optical processors perform parallel image processing with high speed. Optical processors also offer analog nature, compact size, and cost effectiveness. With technical advances of digital-optical interface devices, the medical image processor, in the foreseeable future, may be a hybrid device, namely, a programmable optical architecture.

  11. Social work practice in the digital age: therapeutic e-mail as a direct practice methodology.

    PubMed

    Mattison, Marian

    2012-07-01

    The author addresses the risks and benefits of incorporating therapeutic e-mail communication into clinical social work practice. Consumer demand for online clinical services is growing faster than the professional response. E-mail, when used as an adjunct to traditional meetings with clients, offers distinct advantages and risks. Benefits include the potential to reach clients in geographically remote and underserved communities, enhancing and extending the therapeutic relationship and improving treatment outcomes. Risks include threats to client confidentiality and privacy, liability coverage for practitioners, licensing jurisdiction, and the lack of competency standards for delivering e-mail interventions. Currently, the social work profession does not have adequate instructive guidelines and best-practice standards for using e-mail as a direct practice methodology. Practitioners need (formal) academic training in the techniques connected to e-mail exchanges with clients. The author describes the ethical and legal risks for practitioners using therapeutic e-mail with clients and identifies recommendations for establishing best-practice standards.

  12. ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION

    EPA Science Inventory

    The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...

  14. Modeling the simulation execution process with digital objects

    NASA Astrophysics Data System (ADS)

    Cubert, Robert M.; Fishwick, Paul A.

    1999-06-01

    Object Oriented Physical Modeling (OOPM), formerly known as MOOSE, and its implementation of behavior multimodels provide an ability to manage arbitrarily complex patterns of behavioral abstraction in web-friendly simulation modeling. In an OOPM model, one object stands as a surrogate for another object, and these surrogates map cognitively to the real world. This 'physical object' principle mitigates the impact of incomplete knowledge and ambiguity because its real-world metaphors enable model authors to draw on intuition, facilitating reuse and integration, as well as consistency in collaborative efforts. A 3D interface for modeling and simulation visualization, under construction to augment the existing 2D GUI, obeys the physical object principle, providing a means to create, change, reuse, and integrate digital worlds made of digital objects. The implementation includes a Distributed Simulation Executive, a Digital Object MultiModel Language, a Digital Object Warehouse, and a multimodel Translator. This approach is powerful and its capabilities have steadily grown; however, it has lacked a formal basis, which we now provide: we define multimodels, represent digital objects as multimodels, transform multimodels to simulations, demonstrate the correctness of the execution sequence of the simulations, and show closure under coupling of digital objects. These theoretical results complement and enhance the practical aspects of physical multimodeling.

  15. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for determining the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample-and-hold circuit, an analog-to-digital converter, a digital filter, and a Fast Fourier Transform, and is interfaced to a 16-bit microprocessor. The microprocessor is programmed to control the complete digital signal processing chain. The digital filtering and Fast Fourier Transform functions are implemented by an S2815 digital filter/utility peripheral chip and an S2814A Fast Fourier Transform chip. The S2815 initially simulates a low-pass Butterworth filter, with later expansion to synthesizing complete filter circuits (bandpass and highpass).
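
    The processing chain summarized above (sample, digitize, low-pass filter, then an FFT to obtain the spectral distribution of the echo) can be illustrated in software. A minimal sketch on a synthetic echo signal; the sample rate, filter order and cutoff, and signal frequencies are illustrative assumptions, not parameters of the S2815/S2814A hardware:

    ```python
    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 20_000            # assumed sample rate in Hz (illustrative)
    t = np.arange(0, 0.1, 1 / fs)
    echo = np.sin(2 * np.pi * 1_500 * t) + 0.3 * np.random.randn(t.size)  # synthetic echo

    # Low-pass Butterworth filter, analogous in role to the digital filter stage
    b, a = butter(4, 4_000, btype="low", fs=fs)
    filtered = lfilter(b, a, echo)

    # FFT stage: spectral frequency distribution of the filtered echo
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(filtered.size, d=1 / fs)
    print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")
    ```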

  16. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  17. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    A Coriolis mass flowmeter (CMF) often suffers under two-phase flow conditions, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for CMF are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital-to-analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions.
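
    Zero-crossing detection with interpolation, as used above to estimate the frequency and phase difference of the sensor signals, can be sketched briefly. This illustration uses linear interpolation between samples (the paper uses Lagrange interpolation), and the sample rate, frequency, and lag values are assumptions:

    ```python
    import numpy as np

    def rising_zero_crossings(signal, fs):
        """Interpolated times (s) where the signal crosses zero going upward."""
        idx = np.where((signal[:-1] < 0) & (signal[1:] >= 0))[0]
        frac = -signal[idx] / (signal[idx + 1] - signal[idx])   # interpolate between samples
        return (idx + frac) / fs

    fs = 50_000                                   # assumed sample rate
    t = np.arange(0, 0.05, 1 / fs)
    f_true, lag = 110.0, 0.3                      # synthetic flowtube sensor signals
    sensor_a = np.sin(2 * np.pi * f_true * t)
    sensor_b = np.sin(2 * np.pi * f_true * t - lag)

    za = rising_zero_crossings(sensor_a, fs)
    zb = rising_zero_crossings(sensor_b, fs)
    freq = 1.0 / np.mean(np.diff(za))             # oscillation frequency from crossing spacing
    n = min(len(za), len(zb))
    dphi = 2 * np.pi * freq * np.mean(zb[:n] - za[:n])
    dphi = (dphi + np.pi) % (2 * np.pi) - np.pi   # wrap phase difference to (-pi, pi]
    print(f"frequency ~ {freq:.2f} Hz, phase difference ~ {dphi:.3f} rad")
    ```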

  18. Digital Subtraction Fluoroscopic System With Tandem Video Processing Units

    NASA Astrophysics Data System (ADS)

    Gould, Robert G.; Lipton, Martin J.; Mengers, Paul; Dahlberg, Roger

    1981-07-01

    A real-time digital fluoroscopic system utilizing two video processing units (Quantex) in tandem to produce continuous subtraction images of peripheral and internal vessels following intravenous contrast media injection has been investigated. The first processor subtracts a mask image, consisting of an exponentially weighted moving average of N1 frames (N1 = 2^k, where k = 0-7), from each incoming video frame, divides by N1, and outputs the resulting difference image to the second processor. The second unit continuously averages N2 incoming frames (N2 = 2^k) and outputs to a video monitor and an analog disc recorder. The contrast of the subtracted images can be manipulated by changing gain or by a non-linear output transform. After initial equipment adjustments, a subtraction sequence can be produced without operator interaction with the processors. Alternatively, the operator can freeze the mask and/or the subtracted output image at any time during the sequence. Raw data is preserved on a wide-band video tape recorder, permitting retrospective viewing of an injection sequence with different processor settings. The advantage of the tandem arrangement is its great flexibility in varying the duration and timing of both the mask and injection images, thereby minimizing registration problems between them. In addition, image noise is reduced by compiling video frames rather than by using a large radiation dose for a single frame, which would require a wide-dynamic-range video camera not commonly available in diagnostic x-ray equipment. High-quality subtraction images of arteries have been obtained in 15 anesthetized dogs using relatively low exposure rates (10-12 μR/video frame), modest volumes of contrast medium (0.5-1 ml/kg), and low injection flow rates (6-10 ml/sec). The results achieved so far suggest that this system has direct clinical applications.
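
    The tandem processing described above (an exponentially weighted moving-average mask subtracted from each incoming frame, followed by block averaging of the difference images) can be sketched on synthetic frames. The function name, weights, and frame counts below are illustrative assumptions:

    ```python
    import numpy as np

    def tandem_subtraction(frames, n1=16, n2=4):
        """EWMA mask subtraction followed by averaging of the difference images."""
        alpha = 1.0 / n1
        mask = frames[0].astype(float)
        diffs = []
        for frame in frames:
            diffs.append(frame.astype(float) - mask)       # stage 1: subtract running mask
            mask = (1 - alpha) * mask + alpha * frame      # update exponentially weighted mask
        diffs = np.array(diffs)
        # stage 2: average blocks of n2 difference frames to reduce noise
        usable = (len(diffs) // n2) * n2
        return diffs[:usable].reshape(-1, n2, *frames[0].shape).mean(axis=1)

    frames = np.random.poisson(100, size=(64, 256, 256))   # synthetic video frames
    out = tandem_subtraction(frames)
    print(out.shape)        # (16, 256, 256)
    ```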

  19. Digital mapping of side-scan sonar data with the Woods Hole Image Processing System software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing, and digitally mosaicking high- and low-resolution sidescan sonar data. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for pre-processing sidescan sonar data. To extend the capabilities of the UNIX-based programs, digital mapping techniques have been developed. This report describes the initial development of an automated digital mapping procedure. Included is a description of the programs and steps required to complete the digital mosaicking on a UNIX-based computer system, and a comparison of techniques that the user may wish to select.

  20. How to Find Exculpatory and Inculpatory Evidence Using a Circular Digital Forensics Process Model

    NASA Astrophysics Data System (ADS)

    Khatir, Marjan; Hejazi, Seyed Mahmood

    With the rising number of cyber crimes, the need for a proper digital forensic process also increases. Although digital forensics has been practiced in recent years, there is still a big gap between previously suggested digital forensics processes and what really needs to be done in real cases. Some problems with current processes are a lack of flexible transition between phases, the absence of a clear method or complete scenario for addressing reliable evidence, and insufficient attention to management aspects and team roles. This paper provides a process model that pays special attention to team roles and management aspects as well as to both exculpatory and inculpatory evidence.

  1. Digital process for an implant-supported fixed dental prosthesis: A clinical report.

    PubMed

    Brandt, Jan; Lauer, Hans-Christoph; Peter, Thorsten; Brandt, Silvia

    2015-10-01

    A digital process is presented for an implant-supported single-tooth and a 3-unit fixed dental prosthesis (FDP) with customized abutments and monolithic prosthetic zirconia restorations. The digital impression on the implant level was made with a TRIOS intraoral scanner (3Shape). This process included the fabrication of an implant cast with the fused deposition modeling technique and a 3-dimensional printing process with integrated implant analogs. The process enabled the FDPs to be designed with CAD/CAM on the cast before patient contact. Designing a printed implant cast expands the use of the digital workflow in the dental field.

  2. Processing multi-digit numbers: a translingual eye-tracking study.

    PubMed

    Bahnmueller, Julia; Huber, Stefan; Nuerk, Hans-Christoph; Göbel, Silke M; Moeller, Korbinian

    2016-05-01

    The present study aimed at investigating the underlying cognitive processes and language specificities of three-digit number processing. More specifically, it was intended to clarify whether the single digits of three-digit numbers are processed in parallel and/or sequentially and whether processing strategies are influenced by the inversion of number words with respect to the Arabic digits [e.g., 43: dreiundvierzig ("three and forty")] and/or by differences in reading behavior of the respective first language. Therefore, English- and German-speaking adults had to complete a three-digit number comparison task while their eye-fixation behavior was recorded. Replicating previous results, reliable hundred-decade-compatibility effects (e.g., 742_896: hundred-decade compatible because 7 < 8 and 4 < 9; 362_517: hundred-decade incompatible because 3 < 5 but 6 > 1) for English- as well as hundred-unit-compatibility effects for English- and German-speaking participants were observed, indicating parallel processing strategies. While no indices of partial sequential processing were found for the English-speaking group, about half of the German-speaking participants showed an inverse hundred-decade-compatibility effect accompanied by longer inspection time on the hundred digit indicating additional sequential processes. Thereby, the present data revealed that in transition from two- to higher multi-digit numbers, the homogeneity of underlying processing strategies varies between language groups. The regular German orthography (allowing for letter-by-letter reading) and its associated more sequential reading behavior may have promoted sequential processing strategies in multi-digit number processing. Furthermore, these results indicated that the inversion of number words alone is not sufficient to explain all observed language differences in three-digit number processing.
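
    The hundred-decade compatibility manipulation described above can be made concrete with a small classification helper. A minimal sketch; the function name is hypothetical and the example pairs are taken from the abstract:

    ```python
    def hundred_decade_compatibility(a, b):
        """Classify a three-digit comparison pair as compatible or incompatible.

        Compatible: the hundreds comparison and the decades comparison point the
        same way (e.g., 742 vs 896: 7 < 8 and 4 < 9). Incompatible: they conflict
        (e.g., 362 vs 517: 3 < 5 but 6 > 1).
        """
        (ha, da), (hb, db) = divmod(a // 10, 10), divmod(b // 10, 10)
        # divmod(a // 10, 10) -> (hundreds digit, decade digit) of a three-digit number
        if da == db:
            return "neutral"
        return "compatible" if (ha < hb) == (da < db) else "incompatible"

    print(hundred_decade_compatibility(742, 896))   # compatible
    print(hundred_decade_compatibility(362, 517))   # incompatible
    ```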

  3. [Methodological study on digitalization of tongue image in traditional Chinese medical diagnosis].

    PubMed

    Zhou, Yue; Yang, Jie; Shen, Li

    2004-12-01

    This research aims at proposing a computerized tongue analysis method based on image processing for quantifying tongue properties in traditional Chinese medical diagnosis. A chromatic algorithm and the 2-D Gabor wavelet transformation are applied to segment the tongue from the original image. A statistical method is adopted to identify the color of each pixel, attributed to the tongue substance and coating respectively. The thickness of the tongue coating is determined by the energy of the 2-D Gabor wavelet coefficients (GWTE). The distribution of the GWTE and an invariant moment algorithm are used to judge the tongue texture. The experimental results show that all methods proposed in this paper are effective.
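
    Gabor-energy features of the kind used above for coating-thickness and texture judgments can be computed with standard tools. A minimal sketch using scikit-image on a synthetic grayscale patch; the filter frequency and orientations are assumptions, not the paper's parameters:

    ```python
    import numpy as np
    from skimage.filters import gabor

    image = np.random.rand(128, 128)                 # stand-in for a segmented tongue region

    energies = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, imag = gabor(image, frequency=0.2, theta=theta)
        energies.append(np.mean(real ** 2 + imag ** 2))   # Gabor wavelet energy per orientation

    print(["%.4f" % e for e in energies])
    ```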

  4. Using Dual-Task Methodology to Dissociate Automatic from Nonautomatic Processes Involved in Artificial Grammar Learning

    ERIC Educational Resources Information Center

    Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.

    2013-01-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…

  6. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  8. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for automated assessment of the reliability of the sheet metal forming process in order to reduce defects in the manufacture of complex components. The article identifies the range of allowable stamping parameters for obtaining defect-free punching of truck spars.

  9. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  10. The Effects of Digital Portfolio Assessment Process on Students' Writing and Drawing Performances

    ERIC Educational Resources Information Center

    Tezci, Erdogan; Dikici, Ayhan

    2006-01-01

    In this paper, the effect of a digital portfolio assessment process on the drawing and story-writing performances of students aged 14-15 was investigated. For this purpose, a digital portfolio assessment rubric was prepared in order to evaluate students' drawing and story-writing work. For validity and reliability, analysis was applied to…

  11. Digital signal and image processing in echocardiography. The American Society of Echocardiography.

    PubMed

    Skorton, D J; Collins, S M; Garcia, E; Geiser, E A; Hillard, W; Koppes, W; Linker, D; Schwartz, G

    1985-12-01

    Digital signal and image processing techniques are acquiring an increasingly important role in the generation and analysis of cardiac images. This is particularly true of 2D echocardiography, in which image acquisition, manipulation, and storage within the echocardiograph, as well as quantitative analysis of echocardiographic data by means of "off-line" systems, depend upon digital techniques. The increasing role of computers in echocardiography makes it essential that echocardiographers and technologists understand the basic principles of digital techniques applied to echocardiographic instrumentation and data analysis. In this article, we have discussed digital techniques as applied to image generation (digital scan conversion, preprocessing, and postprocessing) as well as to the analysis of image data (computer-assisted border detection, 3D reconstruction, tissue characterization, and contrast echocardiography); a general introduction to off-line analysis systems was also given. Experience with other cardiac imaging methods indicates that digital techniques will likely play a dominant role in the future of echocardiographic imaging.

  12. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    USDA-ARS?s Scientific Manuscript database

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  13. Screen Capture Technology: A Digital Window into Students' Writing Processes

    ERIC Educational Resources Information Center

    Seror, Jeremie

    2013-01-01

    Technological innovations and the prevalence of the computer as a means of producing and engaging with texts have dramatically transformed how literacy is defined and developed in modern society. This rise in digital writing practices has led to a growing number of tools and methods that can be used to explore second language (L2) writing…

  14. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  15. Process reengineering: the role of a planning methodology and picture archiving and communications system team building.

    PubMed

    Carrino, J A; Unkel, P J; Shelton, P; Johnson, T G

    1999-05-01

    The acquisition of a picture archiving and communications system (PACS) is an opportunity to reengineer business practices and should optimally consider the entire process from image acquisition to communication of results. The purpose of this presentation is to describe the PACS planning methodology used by the Department of Defense (DOD) Joint Imaging Technology Project Office (JITPO), outline the critical procedures for each phase, and review the military experience using this model. The methodology is segmented into four phases: strategic planning, clinical scenario planning, installation planning, and implementation planning. Each is further subdivided based on the specific tasks that need to be accomplished within that phase. By using this method, an institution will have clearly defined program goals, objectives, and PACS requirements before vendors are contacted. The development of an institution-specific PACS requirement should direct the process of proposal comparisons to be based on functionality and exclude unnecessary equipment. This PACS planning methodology is being used at more than eight DOD medical treatment facilities. When properly executed, this methodology facilitates a seamless transition to the electronic environment and contributes to the successful integration of the healthcare enterprise. A crucial component of this methodology is the development of a local PACS planning team to manage all aspects of the process. A plan formulated by the local team is based on input from each department that will be integrating with the PACS. Involving all users in the planning process is paramount for successful implementation.

  16. Sensor Signal Processing for Ultrasonic Sensors Using Delta-Sigma Modulated Single-Bit Digital Signal

    NASA Astrophysics Data System (ADS)

    Hirata, S.; Kurosawa, M. K.; Katagiri, T.

    Ultrasonic distance measurement is based on determining the time of flight of an ultrasonic wave. The pulse compression technique, which obtains the cross-correlation function between the received signal and the reference signal, is used to improve the resolution of distance measurement. The cross-correlation method requires high-cost digital signal processing. This paper presents a cross-correlation method using a delta-sigma modulated single-bit digital signal. Sensor signal processing composed of the cross-correlation between two single-bit signals and a post moving-average filter is proposed, which enables the cost of digital signal processing to be reduced.
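
    A minimal sketch of the idea: delta-sigma modulation is approximated here by simple one-bit quantization of synthetic reference and received signals, followed by cross-correlation and a post moving-average filter. The sample rate, burst parameters, and window length are assumptions:

    ```python
    import numpy as np

    fs = 1_000_000                                   # assumed sample rate, 1 MHz
    t = np.arange(0, 2e-3, 1 / fs)
    n_burst = int(0.25e-3 * fs)                      # 0.25 ms reference burst at 40 kHz
    reference = np.sin(2 * np.pi * 40_000 * t[:n_burst])
    delay = 0.8e-3                                   # true time of flight
    received = np.zeros_like(t)
    received[int(delay * fs):int(delay * fs) + n_burst] = reference
    received += 0.5 * np.random.randn(t.size)

    # single-bit representations (crude stand-in for delta-sigma modulated bitstreams)
    ref_bits = np.where(reference >= 0, 1.0, -1.0)
    rx_bits = np.where(received >= 0, 1.0, -1.0)

    # cross-correlation of the two single-bit signals, then a post moving-average filter
    corr = np.correlate(rx_bits, ref_bits, mode="valid")
    smoothed = np.convolve(corr, np.ones(32) / 32, mode="same")
    print(f"estimated time of flight: {np.argmax(smoothed) / fs * 1e3:.3f} ms")
    ```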

  17. Digital Versus Optical Techniques In Synthetic Aperture Radar (SAR) Data Processing

    NASA Astrophysics Data System (ADS)

    Ausherman, Dale A.

    1980-04-01

    The principle of synthetic aperture radar (SAR) image formation is reviewed in preparation for a discussion of both optical and digital processing techniques. The tilted-plane optical processing approach is presented as being representative of optical techniques. Since the newer digital approaches can take several forms, three classes of digital processors are examined: direct convolution, frequency multiplexing, and frequency analysis of dechirped data. A subjective listing of the relative merits for both processing media is presented. Both are found to be technically viable. The final choice will depend primarily upon the application requirements.

  18. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment

    PubMed Central

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd.

    2016-01-01

    Advances in communication network technology have facilitated the transmission of digital medical images to healthcare professionals via internal networks or public networks (e.g., the Internet), but they also expose the transmitted images to security threats, such as image tampering or the insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnostic purposes; thus digital watermarking of medical images has been introduced. So far, most watermarking research has been done on single-frame medical images, which is impractical in the real environment. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermarking processing of medical images utilizing multicore technology is introduced. Experimental results show that the elapsed time for parallel watermarking processing is much shorter than for sequential watermarking processing. PMID:26981111
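
    Frame-level parallelism of the kind described above can be sketched with a process pool. The watermark_frame function below is a hypothetical placeholder (a least-significant-bit pattern), not the paper's watermarking scheme:

    ```python
    import numpy as np
    from multiprocessing import Pool

    def watermark_frame(frame):
        """Hypothetical per-frame watermark: embed a bit pattern in the least significant bit."""
        mark = np.indices(frame.shape).sum(axis=0) % 2          # illustrative watermark pattern
        return (frame & 0xFE) | mark.astype(frame.dtype)

    if __name__ == "__main__":
        frames = [np.random.randint(0, 256, (512, 512), dtype=np.uint8) for _ in range(64)]
        with Pool() as pool:                                     # one worker per available core
            watermarked = pool.map(watermark_frame, frames)      # frames processed in parallel
        print(len(watermarked), watermarked[0].shape)
    ```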

  19. A New Methodology for Studying Dynamics of Aerosol Particles in Sneeze and Cough Using a Digital High-Vision, High-Speed Video System and Vector Analyses

    PubMed Central

    Nishimura, Hidekazu; Sakata, Soichiro; Kaga, Akikazu

    2013-01-01

    Microbial pathogens of respiratory infectious diseases are often transmitted through particles in sneeze and cough. Therefore, understanding the particle movement is important for infection control. Images of a sneeze induced by nasal cavity stimulation in healthy adult volunteers were taken by a digital high-vision, high-speed video system equipped with a computer system and treated as a research model. The obtained images were enhanced electronically, converted to digital images every 1/300 s, and subjected to vector analysis of the bioparticles contained in the whole sneeze cloud using automatic image processing software. The initial velocity of the particles or their clusters in the sneeze was greater than 6 m/s, but decreased as the particles moved forward; the particles appeared to lose their momentum by 0.15–0.20 s and to start a diffusive movement. An approximate equation for velocity as a function of elapsed time was obtained from the vector analysis to represent the dynamics of the front-line particles. This methodology was also applied to a cough. Microclouds contained in smoke exhaled with a voluntary cough by a volunteer, after smoking one breath of a cigarette, were traced as visible aerodynamic surrogates for the invisible bioparticles of a cough. The smoke cough microclouds had an initial velocity greater than 5 m/s. The fastest microclouds were located at the forefront of the cloud mass moving forward; however, their velocity clearly decreased after 0.05 s and they began to diffuse in the environmental airflow. The maximum direct reaches of the particles and microclouds driven by sneezing and coughing, unaffected by environmental airflows, were estimated by calculations using the obtained equations to be about 84 cm and 30 cm from the mouth, respectively, both achieved in about 0.2 s, suggesting that data relating to the dynamics of sneeze and cough become available by calculation. PMID:24312206

  1. Characterization of digital signal processing in the DiDAC data acquisition system

    SciTech Connect

    Parson, J.D.; Olivier, T.L.; Habbersett, R.C.; Martin, J.C.; Wilder, M.E.; Jett, J.H.

    1993-01-01

    A new-generation data acquisition system for flow cytometers has been constructed. This Digital Data Acquisition and Control (DiDAC) system is based on the VME architecture and uses both the standard VME bus and a private bus for system communication and data transfer. At the front end of the system is a free-running 20 MHz ADC. The output of a detector preamp provides the signal for digitization. The digitized waveform is passed to a custom-built digital signal processing circuit that extracts the height, width, and integral of the waveform. Calculation of these parameters is started (and stopped) when the waveform exceeds (and falls below) a preset threshold value. The free-running ADC is specified to have 10-bit accuracy at 25 MHz. The authors have characterized this approach and compared it with the results obtained with conventional analog signal processing followed by digitization. Comparisons are made between the two approaches in terms of measurement CV, linearity, and other aspects.
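
    Pulse-parameter extraction of the kind performed by the DiDAC signal processing circuit (height, width, and integral of each above-threshold waveform segment) can be illustrated in software. A minimal sketch on a synthetic digitized pulse; the threshold and sampling period are assumptions:

    ```python
    import numpy as np

    def extract_pulse_features(samples, threshold, dt):
        """Height, width, and integral of each region where the waveform exceeds threshold."""
        above = samples > threshold
        # start/stop indices of contiguous above-threshold regions
        edges = np.flatnonzero(np.diff(above.astype(int)))
        starts = edges[::2] + 1
        stops = edges[1::2] + 1
        features = []
        for lo, hi in zip(starts, stops):
            pulse = samples[lo:hi]
            features.append({"height": pulse.max(),
                             "width": (hi - lo) * dt,
                             "integral": pulse.sum() * dt})
        return features

    dt = 50e-9                                   # 20 MHz ADC -> 50 ns sampling period
    t = np.arange(0, 20e-6, dt)
    waveform = 100 * np.exp(-((t - 5e-6) / 0.8e-6) ** 2) + 0.5 * np.random.randn(t.size)
    print(extract_pulse_features(waveform, threshold=10.0, dt=dt))
    ```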

  2. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes capabilities to perform high-pass, band-pass, low-pass, and wedge filtering. These filters are applied to the analysis of satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
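
    The point (notch) filtering provided by the toolbox can be sketched with a 2-D FFT and Gaussian notches. A minimal illustration on a synthetic image with periodic stripe noise; the notch center and width are arbitrary choices, not the toolbox's interface:

    ```python
    import numpy as np

    def gaussian_notch_filter(image, u0, v0, sigma):
        """Suppress the frequency component at (u0, v0) and its conjugate with Gaussian notches."""
        rows, cols = image.shape
        u = np.fft.fftfreq(rows)[:, None]
        v = np.fft.fftfreq(cols)[None, :]
        notch = 1 - np.exp(-((u - u0) ** 2 + (v - v0) ** 2) / (2 * sigma ** 2))
        notch *= 1 - np.exp(-((u + u0) ** 2 + (v + v0) ** 2) / (2 * sigma ** 2))
        return np.real(np.fft.ifft2(np.fft.fft2(image) * notch))

    x, y = np.meshgrid(np.arange(256), np.arange(256))
    image = np.random.rand(256, 256) + 0.5 * np.sin(2 * np.pi * 0.1 * x)   # periodic stripe noise
    cleaned = gaussian_notch_filter(image, u0=0.0, v0=0.1, sigma=0.02)
    print(image.std(), cleaned.std())
    ```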

  3. GEOMETRIC PROCESSING OF DIGITAL IMAGES OF THE PLANETS.

    USGS Publications Warehouse

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformations of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases.
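
    The Sinusoidal Equal-Area projection used as the target of these transformations has a simple closed form. A minimal sketch of the forward mapping for a spherical body; the function name, radius value, and central meridian are illustrative assumptions, not the USGS software interface:

    ```python
    import numpy as np

    def sinusoidal_forward(lat_deg, lon_deg, radius, lon0_deg=0.0):
        """Forward Sinusoidal Equal-Area projection: (lat, lon) in degrees -> (x, y) map coordinates."""
        lat = np.radians(lat_deg)
        dlon = np.radians(lon_deg - lon0_deg)
        x = radius * dlon * np.cos(lat)      # meridians converge toward the poles
        y = radius * lat
        return x, y

    # Example: project a point on a body of mean radius ~3389.5 km (an illustrative value)
    x, y = sinusoidal_forward(lat_deg=10.0, lon_deg=45.0, radius=3389.5, lon0_deg=0.0)
    print(f"x = {x:.1f} km, y = {y:.1f} km")
    ```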

  4. Centralized Digital Picture Processing System For Cardiac Imaging

    NASA Astrophysics Data System (ADS)

    LeFree, M. T.; Vogel, R. A.

    1982-01-01

    We have designed and implemented a system for the centralized acquisition, display, analysis, and archiving of diagnostic cardiac medical images from x-ray fluoroscopy, two-dimensional ultrasonography, and nuclear scintigraphy. Centered around a DEC PDP 11/34 minicomputer with an existing gamma camera interface, we have added a closed-circuit television system with a 256x512x8-bit video digitizer and image display controller to interface the video output of the fluoroscope and ultrasonograph. A video disc recorder (under computer control) is used as an input and playback buffer, allowing for data transfer to and from digital disc drives. Thus, real-time video digitization is possible for up to ten seconds of incoming RS-170-compatible video. The digitizer separates video fields in real time into two 256x256x8-bit refresh memories, providing 60 Hz temporal resolution. Generally, however, we choose to record at non-real-time rates to encompass more than ten seconds. In addition to the I/O software controlling data acquisition and playback, we have developed a versatile data analysis package (offering such capabilities as image algebra, Fourier analysis, and convolutional filtering), as well as interactive data reduction subroutines (such as region-of-interest definition, profile plotting, and regional extraction of statistical and probabilistic information). We have found the system useful for standard cardiac image analysis, for simultaneous display of images from the three modalities, for picture storage and retrieval, and as a research tool. Future plans include the addition of intelligent terminals at each modality and progression to a 32-bit machine for the central processor.

  5. Digital microfluidic processing of mammalian embryos for vitrification.

    PubMed

    Pyne, Derek G; Liu, Jun; Abdelgawad, Mohamed; Sun, Yu

    2014-01-01

    Cryopreservation is a key technology in biology and clinical practice. This paper presents a digital microfluidic device that automates sample preparation for mammalian embryo vitrification. Individual micro droplets manipulated on the microfluidic device were used as micro-vessels to transport a single mouse embryo through a complete vitrification procedure. Advantages of this approach, compared to manual operation and channel-based microfluidic vitrification, include automated operation, cryoprotectant concentration gradient generation, and feasibility of loading and retrieval of embryos.

  6. Applications of digital processing for noise removal from plasma diagnostics

    SciTech Connect

    Kane, R.J.; Candy, J.V.; Casper, T.A.

    1985-11-11

    The use of digital signal techniques for removal of noise components present in plasma diagnostic signals is discussed, particularly with reference to diamagnetic loop signals. These signals contain noise due to power supply ripple in addition to plasma characteristics. The application of noise canceling techniques, such as adaptive noise canceling and model-based estimation, will be discussed. The use of computer codes such as SIG is described. 19 refs., 5 figs.
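
    Adaptive noise cancellation of the kind mentioned above can be illustrated with a basic least-mean-squares (LMS) filter that uses a ripple reference channel to subtract the correlated noise. A minimal sketch on synthetic signals; the filter length, step size, and signal frequencies are arbitrary choices, not the SIG code:

    ```python
    import numpy as np

    def lms_noise_canceller(primary, reference, n_taps=32, mu=0.01):
        """Subtract the component of `primary` that is predictable from `reference` (LMS)."""
        w = np.zeros(n_taps)
        cleaned = np.zeros_like(primary)
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]        # most recent reference samples
            e = primary[n] - w @ x                   # error = signal estimate after cancellation
            w += 2 * mu * e * x                      # LMS weight update
            cleaned[n] = e
        return cleaned

    fs = 10_000
    t = np.arange(0, 1.0, 1 / fs)
    plasma = np.exp(-t) * np.sin(2 * np.pi * 3 * t)             # slow diamagnetic-loop-like signal
    ripple = 0.5 * np.sin(2 * np.pi * 360 * t + 0.7)            # power supply ripple pickup
    reference = np.sin(2 * np.pi * 360 * t)                     # ripple reference channel
    cleaned = lms_noise_canceller(plasma + ripple, reference)
    print(np.std(ripple), np.std(cleaned[2000:] - plasma[2000:]))
    ```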

  7. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  8. Digital holographic inspection for drying processes of paint films and ink dots

    NASA Astrophysics Data System (ADS)

    Yokota, M.; Aoyama, F.

    2017-06-01

    Digital holographic techniques to investigate the drying processes of both paint films and ink dots are presented. The proposed technique, based on digital holographic interferometry, can achieve both visualization of variations and analysis of the dryness of paint films in the drying process by using the phase changes between two subsequent reconstructed complex amplitudes of the light reflected from the film. To follow the drying processes, holograms are recorded at a constant time interval. Phase-shifting digital holography has been applied to analyze the dryness of commercial paints applied to a metal plate. For analysis of an ink dot having a diameter of a few hundred micrometers, digital holographic microscopy is applied to evaluate the time history of the dryness of the ink dot in the drying process. This paper describes these holographic techniques applied to commercially available paint and ink and presents some experimental results.
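
    The core operation, extracting the phase change between two reconstructed complex amplitudes recorded at successive times, reduces to an element-wise computation. A minimal sketch on synthetic complex fields (the holographic reconstruction step itself is not shown, and the field names and drying-induced phase profile are assumptions):

    ```python
    import numpy as np

    def phase_change(field_t0, field_t1):
        """Wrapped phase difference between two reconstructed complex amplitudes."""
        return np.angle(field_t1 * np.conj(field_t0))    # values in (-pi, pi]

    # synthetic stand-ins for reconstructed object waves at two recording times
    shape = (256, 256)
    amplitude = np.random.rand(*shape) + 0.5
    drying_phase = 0.8 * np.exp(-np.linspace(0, 4, shape[1]))[None, :]   # assumed drying-induced shift
    field_t0 = amplitude * np.exp(1j * np.random.rand(*shape) * 0.1)
    field_t1 = field_t0 * np.exp(1j * drying_phase)

    dphi = phase_change(field_t0, field_t1)
    print(f"mean phase change: {dphi.mean():.3f} rad")
    ```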

  10. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    PubMed

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-01-30

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects in more distant pairs in both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  11. Digital Learning As Enhanced Learning Processing? Cognitive Evidence for New insight of Smart Learning

    PubMed Central

    Di Giacomo, Dina; Ranieri, Jessica; Lacasa, Pilar

    2017-01-01

    Widespread use of technology has improved quality of life across aging and favored the development of digital skills. Digital skills can be considered an enhancement of human cognitive activities. A new research trend concerns the impact of technology on children's information processing. We wanted to analyze the influence of technology at an early age by evaluating its impact on cognition. We investigated the performance of a sample of 191 school-age children distributed into two user groups: high digital users and low digital users. We measured the verbal and visuoperceptual cognitive performance of the children with eight standardized psychological tests and an ad hoc self-report questionnaire. Results showed the influence of digital exposure on cognitive development: cognitive performance appeared enhanced and better developed, with high digital users performing better in naming, semantic, visual memory, and logical reasoning tasks. Our findings confirm the data present in the literature and suggest the strong impact of technology use not only on the social, educational, and quality-of-life outcomes of people, but also outline the functionality and effect of digital exposure at an early age; the increased cognitive abilities of the children shape a digitally skilled generation with enhanced cognitive processing toward smart learning. PMID:28824508

  13. A new and practical method to obtain grain size measurements in sandy shores based on digital image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Baptista, P.; Cunha, T. R.; Gama, C.; Bernardes, C.

    2012-12-01

    Modern methods for the automated evaluation of sediment size on sandy shores rely on digital image processing algorithms as an alternative to time-consuming traditional sieving methodologies. However, the requirements necessary to guarantee that the considered image processing algorithm has a good grain identification success rate impose the need for dedicated hardware setups to capture the sand surface images. Examples are specially designed camera housings that maintain a constant distance between the camera lens and the sand surface, tripods to fix and maintain the camera angle orthogonal to the sand surface, external illumination systems that guarantee the light level necessary for the image processing algorithms, and special lenses and focusing systems for close-proximity image capturing. In some cases, controlled image-capturing conditions can make the fieldwork more laborious, which incurs significant costs for monitoring campaigns covering large areas. To circumvent this problem, a new automated image-processing algorithm is proposed that identifies sand grains in digital images acquired with a standard digital camera without any extra hardware attached to it. The accuracy and robustness of the proposed algorithm are evaluated in this work by means of a laboratory test on previously controlled grain samples, field tests where 64 samples (spread over a beach stretch of 65 km and with grain size ranging from 0.5 mm to 1.9 mm) were processed by both the proposed method and by sieving, and finally by manual point counts on all acquired images. The calculated root-mean-square (RMS) error between mean grain sizes obtained from the proposed image processing method and the sieve method (for the 64 samples) was 0.33 mm, and for the image processing method versus manual point counts, with the same images, it was 0.12 mm. The achieved correlation coefficients (r) were 0.91 and 0.96, respectively.

  14. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, S.L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the various digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections, as well as enhance and digitally mosaic, sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into the two different stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.

  15. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  16. Relationship of Automatic Data Processing Training Curriculum and Methodology in the Federal Government.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    A conference, held in Washington, D. C., in 1967 by the Association for Educational Data Systems and the U.S. Office of Education, attempted to lay the groundwork for an efficient automatic data processing training program for the Federal Government utilizing new instructional methodologies. The rapid growth of computer applications and computer…

  18. Understanding Teachers' Cognitive Processes during Online Professional Learning: A Methodological Comparison

    ERIC Educational Resources Information Center

    Beach, Pamela; Willows, Dale

    2017-01-01

    This study examined the effectiveness of three types of think aloud methods for understanding elementary teachers' cognitive processes as they used a professional development website. A methodology combining a retrospective think aloud procedure with screen capture technology (referred to as the virtual revisit) was compared with concurrent and…

  19. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model developed within the commercial finite element package ABAQUS™ (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
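
    The optimization loop described above (a process model evaluated for each candidate cooling schedule, constraints enforcing directional solidification, and an objective based on cooling rates) can be sketched with SciPy. In this minimal illustration the run_casting_model function is a placeholder for the ABAQUS simulation and results extraction; all names and numbers are assumptions:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def run_casting_model(cooling_times):
        """Placeholder for the ABAQUS casting simulation plus results extraction.

        Returns per-zone solidification cooling rates and a directional-solidification
        margin (positive when solidification proceeds in the required direction).
        """
        rates = 5.0 + 2.0 * np.tanh(cooling_times)             # synthetic response surface
        directional_margin = cooling_times[0] - cooling_times[-1]
        return rates, directional_margin

    def objective(cooling_times):
        rates, _ = run_casting_model(cooling_times)
        return -np.mean(rates)                                 # maximize the mean cooling rate

    constraints = [{"type": "ineq",
                    "fun": lambda x: run_casting_model(x)[1]}]  # directional solidification >= 0
    bounds = [(1.0, 60.0)] * 4                                  # cooling-channel timing in seconds

    result = minimize(objective, x0=np.full(4, 20.0), bounds=bounds,
                      constraints=constraints, method="SLSQP")
    print(result.x, -result.fun)
    ```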

  20. Digitizing rocks standardizing the geological description process using workstations

    SciTech Connect

    Saunders, M.R. (Windsor, Berkshire); Shields, J.A.; Taylor, M.R.

    1993-09-01

    The preservation of geological knowledge in a standardized digital form presents a challenge. Data sources, inherently fuzzy, range in scale from the macroscopic (e.g., outcrop) through the mesoscopic (e.g., hand specimen, core, and sidewall core), to the microscopic (e.g., drill cuttings, thin sections, and microfossils). Each scale change results in increased heterogeneity and potentially contradictory data, and the providers of such data may vary in experience level. To address these issues with respect to cores and drill cuttings, a geological description workstation has been developed and is undergoing field trials. Over 1000 carefully defined geological attributes are currently available within a depth-indexed, relational database. Attributes are stored in digital form, allowing multiple users to select familiar usage (e.g., diabase vs. dolerite). Data can be entered in one language and retrieved in other languages. The database structure allows groupings of similar elements (e.g., rhyolites in acidic, igneous, or volcanics subgroups or the igneous rock group), permitting different users to analyze details appropriate to the scale of the usage. Data entry uses a graphical user interface, allowing the geologist to make quick, logical selections in a standardized or custom-built format with extensive menus, on-screen graphics, and help screens available. Description ranges are permissible. Entries for lithology, petrology, structures (sedimentary, organic, and deformational), reservoir characteristics (porosity and hydrocarbon shows), and macrofossils are available. Sampling points for thin sections, core analysis, geochemistry, or micropaleontology studies are also recorded. Using digital data storage, geological logs using graphical, alphanumeric, and symbolic depictions are possible. Data can be integrated with drilling and mud gas data, MWD and wireline data, and off-wellsite analyses to produce composite formation evaluation logs and interpretational crossplots.

  1. An image processing system for digital chest X-ray images.

    PubMed

    Cocklin, M; Gourlay, A; Jackson, P; Kaye, G; Miessler, M; Kerr, I; Lams, P

    1984-01-01

    This paper investigates the requirements for image processing of digital chest X-ray images. These images are conventionally recorded on film and are characterised by large size, wide dynamic range and high resolution. X-ray detection systems are now becoming available for capturing these images directly in photoelectronic-digital form. In this report, the hardware and software facilities required for handling these images are described. These facilities include high resolution digital image displays, programmable video look up tables, image stores for image capture and processing and a full range of software tools for image manipulation. Examples are given of the application of digital image processing techniques to this class of image.

  2. Performance of the SIR-B digital image processing subsystem

    NASA Technical Reports Server (NTRS)

    Curlander, J. C.

    1986-01-01

    A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image parameters. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.

  4. Experimental Methodology for Determining Optimum Process Parameters for Production of Hydrous Metal Oxides by Internal Gelation

    SciTech Connect

    Collins, J.L.

    2005-10-28

    The objective of this report is to describe a simple but very useful experimental methodology that was used to determine optimum process parameters for preparing several hydrous metal-oxide gel spheres by the internal gelation process. The method is inexpensive and very effective in collecting the key gel-forming data needed to prepare hydrous metal-oxide microspheres of the best quality for a number of elements.

  5. On-line process failure diagnosis: The necessity and a comparative review of the methodologies

    SciTech Connect

    Kim, I.S.

    1991-01-01

    Three basic approaches to process failure management are defined and discussed to elucidate the role of diagnosis in the operation of nuclear power plants. The rationale for the necessity of diagnosis is given from various perspectives. A comparative review of some representative diagnostic methodologies is presented and their shortcomings are discussed. Based on the insights from the review, the desirable characteristics of advanced diagnostic methodologies are derived from the viewpoints of failure detection, diagnosis, and correction. 11 refs.

  6. Reaction Wheel Friction Telemetry Data Processing Methodology and On-Orbit Experience

    NASA Astrophysics Data System (ADS)

    Hacker, Johannes M.; Ying, Jiongyu; Lai, Peter C.

    2015-09-01

    A Globalstar 2nd generation satellite experienced a reaction wheel mechanical failure, and in response Globalstar has been closely monitoring reaction wheel bearing friction. To prevent another reaction wheel hardware failure and subsequent shortened satellite mission life, a friction data processing methodology was developed as an on-orbit monitoring tool for the ground to issue early warning and take appropriate action on any hardware degradation or potential failure. The methodology, reaction wheel friction behavior, and its application to an on-orbit anomaly experience will be presented.

  7. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery for laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system composed of a frame grabber, a sound board, a VCR control board, a LAN card, and an endoscopic data management system (EDMS). The computer system also controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on an open architecture and a set of widely available industry standards, namely Microsoft Windows as the operating system, TCP/IP as the network protocol, and a time-sequential database that handles both images and speech. For data storage, we used MOD and CD-R. The digital endoscopic system was designed to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  8. A methodology for evaluation and selection of nanoparticle manufacturing processes based on sustainability metrics.

    PubMed

    Naidu, Sasikumar; Sawhney, Rapinder; Li, Xueping

    2008-09-01

    A set of sustainability metrics, covering the economic, environmental and sociological dimensions of sustainability for evaluation of nanomanufacturing processes is developed. The metrics are divided into two categories namely industrial engineering metrics (process and safety metrics) and green chemistry metrics (environmental impact). The waste reduction algorithm (WAR) is used to determine the environmental impact of the processes and NAIADE (Novel Approach to Imprecise Assessment and Decision Environments) software is used for evaluation and decision analysis. The methodology is applied to three processes used for silica nanoparticle synthesis based on sol-gel and flame methods.

  9. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in the industry. In the field of rock engineering, only limited attention has been paid to the design process because design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to characterize rock masses sufficiently for engineering purposes and identify the rock mechanics issues involved but are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  10. Digital processing of signals arising from organic liquid scintillators for applications in the mixed-field assessment of nuclear threats

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Mackin, R. O.; Jarrah, Z.; Peyton, A. J.

    2008-10-01

    The nuclear aspect of the CBRN threat is often divided amongst radiological substances posing no criticality risk, often referred to as 'dirty bomb' scenarios, and fissile threats. The latter have the theoretical potential for criticality excursion, resulting in elevated neutron fluxes in addition to the γ-ray component that is common to dirty bombs. Even in isolation of the highly unlikely criticality scenario, fissile substances often exhibit radiation fields comprising a significant neutron component which can require considerably different counterterrorism measures and clean-up methodologies. The contrast between these threats can indicate important differences in the relative sophistication of the perpetrators and their organizations. Consequently, the detection and discrimination of nuclear perils in terms of mixed-field content is an important assay in combating terrorist threats. In this paper we report on the design and implementation of a fast digitizer and embedded processor for on-the-fly signal processing of events from organic liquid scintillators. A digital technique, known as Pulse Gradient Analysis (PGA), has been developed at Lancaster University for the digital discrimination of neutrons and γ rays. PGA has been deployed on bespoke hardware and demonstrates remarkable improvement over analogue methods for the assay of mixed fields and the real-time discrimination of neutrons and γ rays. In this regard the technology constitutes an attractive and affordable means for the discrimination of the radiation fields arising from fissile threats and those from dirty bombs. Data are presented demonstrating this capability with sealed radioactive sources.
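
    The record above names Pulse Gradient Analysis (PGA) without detailing it. As a hedged illustration of the general idea usually reported for PGA (comparing each pulse's peak amplitude with its amplitude a fixed interval after the peak), the Python sketch below shows one plausible implementation; it is not the Lancaster University code, and the sample offset and discrimination-line parameters are purely illustrative.

        import numpy as np

        def pga_features(pulse, peak_offset_samples=20):
            """Return (peak amplitude, amplitude a fixed interval after the peak)
            for one digitized scintillator pulse (a minimal PGA-style sketch)."""
            peak_idx = int(np.argmax(pulse))
            peak_amp = float(pulse[peak_idx])
            tail_idx = min(peak_idx + peak_offset_samples, len(pulse) - 1)
            tail_amp = float(pulse[tail_idx])
            return peak_amp, tail_amp

        def classify(pulse, slope=0.18, intercept=0.0):
            """Label a pulse 'neutron' or 'gamma' using a straight discrimination
            line in (peak, tail) space; slope and intercept are illustrative only."""
            peak, tail = pga_features(pulse)
            return "neutron" if tail > slope * peak + intercept else "gamma"

        # Illustrative use with a synthetic pulse (fast rise, exponential decay).
        t = np.arange(200)
        pulse = np.exp(-t / 30.0) - np.exp(-t / 3.0)
        print(classify(pulse))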

  11. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  12. A comparison of letter and digit processing in letter-by-letter reading.

    PubMed

    Ingles, Janet L; Eskes, Gail A

    2008-01-01

    The extent to which letter-by-letter reading results from a specific orthographic deficit, as compared with a nonspecific disturbance in basic visuoperceptual mechanisms, is unclear. The current study directly compared processing of letters and digits in a letter-by-letter reader, G.M., using a rapid serial visual presentation (RSVP) task and a speeded matching task. Comparisons were made to a group of six brain-damaged individuals without reading deficits. In the RSVP task, G.M. had increased difficulty reporting the target identities when they were letters, as compared with digits. Although this general pattern was also evident in the control group, the magnitude of the letter-digit accuracy difference was greater in G.M. Similarly, in the matching task, G.M. was slower to match letters than digits, relative to the control group, although his response times to both item types were increased. These data suggest that letter-by-letter reading, at least in this case, results from a visuoperceptual encoding deficit that particularly affects letters, but also extends to processing of digits to a lesser extent. Results are consistent with the notion that a left occipitotemporal area is specialized for letter processing with greater bilaterality in the visual processing of digits.

  13. Digital pulse processing and optimization of the front-end electronics for nuclear instrumentation.

    PubMed

    Bobin, C; Bouchard, J; Thiam, C; Ménesguen, Y

    2014-05-01

    This article describes an algorithm developed for the digital processing of signals provided by a high-efficiency well-type NaI(Tl) detector used to apply the 4πγ technique. In order to achieve a low-energy threshold, a new front-end electronics has been specifically designed to optimize the coupling to an analog-to-digital converter (14 bit, 125 MHz) connected to a digital development kit produced by Altera®. The digital pulse processing is based on an IIR (Infinite Impulse Response) approximation of the Gaussian filter (and its derivatives) that can be applied to the real-time processing of digitized signals. Based on measurements obtained with the photon emissions generated by an ²⁴¹Am source, the energy threshold is estimated to be equal to ~2 keV corresponding to the physical threshold of the NaI(Tl) detector. An algorithm developed for a Silicon Drift Detector used for low-energy x-ray spectrometry is also described. In that case, the digital pulse processing is specifically designed for signals provided by a reset-type preamplifier (⁵⁵Fe source). © 2013 Published by Elsevier Ltd.
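
    A minimal sketch of the kind of recursive (IIR) Gaussian smoothing referred to above is given below, assuming a generic cascade of first-order exponential passes rather than the authors' actual filter; all coefficients and the test signal are illustrative only.

        import numpy as np

        def smooth_once(x, alpha):
            """One forward+backward pass of a first-order IIR filter
            y[n] = alpha*x[n] + (1-alpha)*y[n-1]; zero-phase after both passes."""
            y = np.empty_like(x, dtype=float)
            acc = x[0]
            for i, v in enumerate(x):            # forward pass
                acc = alpha * v + (1.0 - alpha) * acc
                y[i] = acc
            acc = y[-1]
            for i in range(len(y) - 1, -1, -1):  # backward pass
                acc = alpha * y[i] + (1.0 - alpha) * acc
                y[i] = acc
            return y

        def iir_gaussian(x, alpha=0.3, passes=3):
            """Cascading several passes approaches a Gaussian impulse response."""
            y = np.asarray(x, dtype=float)
            for _ in range(passes):
                y = smooth_once(y, alpha)
            return y

        # Example: smooth a noisy step-like detector signal.
        signal = np.concatenate([np.zeros(50), np.ones(100), np.zeros(50)])
        noisy = signal + 0.1 * np.random.default_rng(0).normal(size=signal.size)
        smoothed = iir_gaussian(noisy)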

  14. Digital Processing of Medical Images Obtained by a Si Microstrips Detector

    SciTech Connect

    Diaz, Claudia C.; Montano, Luis M.; Fontaine, Marcos; Leyva, Antonio

    2006-09-08

    We studied the capability of Matlab in the digital processing of breast tissue images with microcalcifications. We obtained digital images of different biopsies through a Bede X-ray tube, fixed at 20 kV and 1 mA. Radiation exposure time was varied. The biopsies were placed between a 120 μm collimator and a 128-strip detector, which was used to measure the absorption of X rays in the tissue. Matlab allowed the manipulation of the digital images, and this software was intended to improve the identification of microcalcifications in breast tissues.

  15. Digital signal processing for the ATLAS/LUCID detector

    SciTech Connect

    2015-07-01

    Both the detector and the associated read-out electronics have been improved in order to cope with the LHC luminosity increase foreseen for RUN 2 and RUN 3. The new operating conditions require a careful tuning of the read-out electronics in order to optimize the signal-to-noise ratio. The new read-out electronics will allow the use of digital filtering of the photo multiplier tube signals. In this talk, we will present the first results that we obtained in the optimization of the signal-to-noise ratio. In addition, we will introduce the next steps to adapt this system to high performance read-out chains for low energy gamma rays. Such systems are based, for instance, on Silicon Drift Detector devices and can be used in applications at Free-Electron-Laser facilities such as the XFEL under construction at DESY. (authors)

  16. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNN, we propose to use a digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute the wavelet down-sampling filtering and the half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, which prove the proposed digital CNN processor to be a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people.
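
    The CNN templates themselves are not reproduced in the record, so the sketch below illustrates the two embedded functions (down-sampling and half-toning) with conventional equivalents, block averaging and Floyd-Steinberg error diffusion; it is a stand-in for, not a description of, the digital CNN implementation.

        import numpy as np

        def downsample(img, factor=2):
            """Average-pool the image by an integer factor (stand-in for the
            wavelet down-sampling stage)."""
            h, w = img.shape
            h, w = h - h % factor, w - w % factor
            blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
            return blocks.mean(axis=(1, 3))

        def halftone(img):
            """Floyd-Steinberg error diffusion to a binary (tactile-friendly) image."""
            out = img.astype(float).copy()
            h, w = out.shape
            for y in range(h):
                for x in range(w):
                    old = out[y, x]
                    new = 1.0 if old >= 0.5 else 0.0
                    err = old - new
                    out[y, x] = new
                    if x + 1 < w:
                        out[y, x + 1] += err * 7 / 16
                    if y + 1 < h and x > 0:
                        out[y + 1, x - 1] += err * 3 / 16
                    if y + 1 < h:
                        out[y + 1, x] += err * 5 / 16
                    if y + 1 < h and x + 1 < w:
                        out[y + 1, x + 1] += err * 1 / 16
            return out.astype(np.uint8)

        # Example: 0..1 grayscale image -> reduced, binarized pattern.
        rng = np.random.default_rng(1)
        gray = rng.random((64, 64))
        pattern = halftone(downsample(gray, 2))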

  17. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Technical Reports Server (NTRS)

    Blackburn, John; Smith, Dennis

    1992-01-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  18. Optimizing Digital Mammographic Image Quality for Full-Field Digital Detectors: Artifacts Encountered during the QC Process.

    PubMed

    Jayadevan, Rashmi; Armada, M Julie; Shaheen, Rola; Mulcahy, Constance; Slanetz, Priscilla J

    2015-01-01

    Early detection of breast cancer through routine mammographic screening has been shown to reduce mortality from breast cancer by up to 30% in multiple studies. However, this reduction of mortality is possible only with careful attention to image quality by the medical physicist, radiologic technologist, and interpreting radiologist. The accepted quality control (QC) processes for analog mammography are well established. However, now that use of digital units is widespread in both the United States and internationally, information regarding the necessary steps and the inherent challenges that might be encountered at each step needs to be elucidated. In this review, the essential steps of the QC process for digital mammography are reviewed, with special attention to the possible problems that can occur during the QC process, many of which can lead to image artifacts. For each of the daily, weekly, monthly, and semiannual QC tests, we review the steps and expected performance and provide examples of some of the common artifacts that may be encountered. Understanding the components of the QC process and recognizing problems that may result in a suboptimal image is critical to ensure optimal image quality in an effort to maximize early detection of breast cancer. (©)RSNA, 2015.

  19. Digitized Chaos: Is Our Military Decision Making Process Ready for the Information Age?

    DTIC Science & Technology

    2007-11-02

    Digitized Chaos: Is Our Military Decision Making Process Ready for the Information Age? A monograph by Major John W. Charlton, USA, 70 pages. The integration of new technologies has always been important to the military. The ... must accompany the new technology in order to exploit its full capabilities. Today the Army is looking at ways to integrate information age, or digital

  20. Considerations in developing geographic information systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  2. A Model-Based Methodology for Spray-Drying Process Development.

    PubMed

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  3. Erosion processes by water in agricultural landscapes: a low-cost methodology for post-event analyses

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo

    2015-04-01

    Throughout the world, agricultural landscapes are of great importance, especially for supplying food and a livelihood. Among land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger the people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and in this way represent a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To quantitatively estimate soil losses due to bank erosion and rill processes, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented with fine resolution, and their monitoring may entail too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique, which may even rely on smart-phones, for the post-event analyses of i) bank erosion affecting agricultural channels, and ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smart-phone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for quantitative estimates of erosion volumes and, in the bank erosion case, of deposited materials as well. SfM applied to pictures taken by smartphones is useful for the analysis of the topography
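
    A minimal sketch of the volume computation implied above (differencing two co-registered SfM-derived DTMs and integrating the elevation change over the cell area) is shown below; the grid resolution, level-of-detection threshold and test surfaces are hypothetical.

        import numpy as np

        def erosion_deposition_volumes(dtm_before, dtm_after, cell_size, lod=0.02):
            """Difference two co-registered DTM grids (metres) and integrate
            elevation loss/gain into volumes; 'lod' is a level-of-detection
            threshold below which change is treated as noise."""
            dod = np.asarray(dtm_after, float) - np.asarray(dtm_before, float)
            dod[np.abs(dod) < lod] = 0.0
            cell_area = cell_size ** 2
            eroded = -dod[dod < 0].sum() * cell_area      # volume lost (m^3)
            deposited = dod[dod > 0].sum() * cell_area    # volume gained (m^3)
            return eroded, deposited

        # Illustrative 0.05 m resolution grids of a small channel bank.
        before = np.zeros((100, 100))
        after = before.copy()
        after[40:60, 10:30] -= 0.15     # a hypothetical eroded patch
        print(erosion_deposition_volumes(before, after, cell_size=0.05))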

  4. Integrating the human element into the systems engineering process and MBSE methodology.

    SciTech Connect

    Tadros, Michael Samir.

    2013-12-01

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources is organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and the Model-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  5. The process of transitioning to digital operations in a clinic setting.

    PubMed

    Freeh, M; McFall, J; Nieves, A

    2001-06-01

    Transitioning to digital imaging operations in a department of radiology is often difficult for many radiologists, but it is a change that many have made effectively. Transitioning to digital operations in a clinic setting is even more difficult for the referring physician operating a business in the clinic. This paper will discuss our experience with transitioning several off-site clinics to digital imaging operations. We will discuss the process followed to identify the physical equipment required to support clinic operations in a digital imaging environment, the process followed to help the physicians adjust their work patterns to allow them to practice in a digital imaging environment, and the benefits and pitfalls of implementing digital imaging in an off-site clinic. Four off-site clinic locations will be evaluated: 1. a cancer clinic located immediately adjacent to the main hospital that relies heavily on CT and MRI images in its practice, 2. a small clinic located about 60 miles from the main hospital that acquires X-ray images on site, 3. a larger clinic located about 20 miles from the main hospital that acquires X-ray, MRI and CT images on site, 4. a sports medicine clinic located about 2 miles from the main hospital that acquires X-ray images on site. Each of these clinics has a very different patient clientele and therefore operates differently in nearly all aspects of its daily operations. The physicians' need for and use of film and digital images varies significantly between the sites, and therefore each site has presented different challenges to our implementation process. As we explain the decisions that were made for each of these sites and reveal the methods that were used to help the physicians make the transition, the readers should be able to draw information that will be helpful to them as they make their own transition to a digital operation.

  6. The Importance of Rapid Auditory Processing Abilities to Early Language Development: Evidence from Converging Methodologies

    PubMed Central

    Thomas, Jennifer J.; Choudhury, Naseem; Leppänen, Paavo H. T.

    2006-01-01

    The ability to process two or more rapidly presented, successive, auditory stimuli is believed to underlie successful language acquisition. Likewise, deficits in rapid auditory processing of both verbal and nonverbal stimuli are characteristic of individuals with developmental language disorders such as Specific Language Impairment. Auditory processing abilities are well developed in infancy, and thus such deficits should be detectable in infants. In the studies presented here, converging methodologies are used to examine such abilities in infants with and without a family history of language disorder. Behavioral measures, including assessments of infant information processing, and an EEG/event-related potential (ERP) paradigm are used concurrently. Results suggest that rapid auditory processing skills differ as a function of family history and are predictive of later language outcome. Further, these paradigms may prove to be sensitive tools for identifying children with poor processing skills in infancy and thus at a higher risk for developing a language disorder. PMID:11891639

  7. Using process improvement methodology to address the complex issue of falls in the inpatient setting.

    PubMed

    Christopher, Deborah A; Trotta, Rebecca L; Yoho, Margaret A; Strong, Jocelyn; Dubendorf, Phyllis

    2014-01-01

    Falls in the acute care hospital are a significant patient safety issue. The purpose of this article was to describe the use of process improvement methodology to address inpatient falls on 5 units. This initiative focused on a proactive approach to falls, identification of high-risk patients, and a complete assessment of patients at risk. During the project timeframe, the mean total fall rate decreased from 3.7 to 2.8 total falls per 1000 patient days.

  8. Studying the relationship between dreaming and sleep-dependent memory processes: methodological challenges.

    PubMed

    Schredl, Michael

    2013-12-01

    The hypothesis that dreaming is involved in off-line memory processing is difficult to test because major methodological issues have to be addressed, such as dream recall and the effect of remembered dreams on memory. It would be fruitful--in addition to studying the ancient art of memory (AAOM) in a scanner--to study the dreams of persons who use AAOM regularly.

  9. Implementation of real-time digital signal processing systems

    NASA Technical Reports Server (NTRS)

    Narasimha, M.; Peterson, A.; Narayan, S.

    1978-01-01

    Special-purpose hardware implementation of DFT computers and digital filters is considered in the light of newly introduced algorithms and IC devices. Recent work by Winograd on high-speed convolution techniques for computing short-length DFTs has motivated the development of more efficient algorithms, compared to the FFT, for evaluating the transform of longer sequences. Among these, prime factor algorithms appear suitable for special-purpose hardware implementations. Architectural considerations in designing DFT computers based on these algorithms are discussed. With the availability of monolithic multiplier-accumulators, a direct implementation of IIR and FIR filters, using random access memories in place of shift registers, appears attractive. The memory addressing scheme involved in such implementations is discussed. A simple counter set-up to address the data memory in the realization of FIR filters is also described. The combination of a set of simple filters (weighting network) and a DFT computer is shown to realize a bank of uniform bandpass filters. The usefulness of this concept in arriving at a modular design for a million-channel spectrum analyzer, based on microprocessors, is discussed.
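
    The counter-addressed data memory described above for FIR realization can be mimicked in software with a circular buffer; the sketch below is a generic illustration under that assumption, not the hardware design from the report.

        import numpy as np

        class CircularFIR:
            """FIR filter whose delay line is a RAM-like array addressed by a
            wrapping counter, mirroring the memory-addressing idea in the record."""
            def __init__(self, taps):
                self.taps = np.asarray(taps, dtype=float)
                self.buf = np.zeros(len(taps))
                self.ptr = 0                      # write-address counter

            def step(self, x):
                self.buf[self.ptr] = x            # overwrite the oldest sample
                # Read samples newest-to-oldest relative to the counter.
                idx = (self.ptr - np.arange(len(self.buf))) % len(self.buf)
                y = float(np.dot(self.taps, self.buf[idx]))
                self.ptr = (self.ptr + 1) % len(self.buf)
                return y

        # 5-tap moving-average example.
        fir = CircularFIR(np.ones(5) / 5.0)
        out = [fir.step(v) for v in [1, 2, 3, 4, 5, 6]]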

  10. Digital Libraries.

    ERIC Educational Resources Information Center

    Fox, Edward A.; Urs, Shalini R.

    2002-01-01

    Provides an overview of digital libraries research, practice, and literature. Highlights include new technologies; redefining roles; historical background; trends; creating digital content, including conversion; metadata; organizing digital resources; services; access; information retrieval; searching; natural language processing; visualization;…

  12. A prototype software methodology for the rapid evaluation of biomanufacturing process options.

    PubMed

    Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli

    2007-10-01

    A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
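
    The record states that production level, cost and time are combined into a single value with multi-attribute decision-making, but it does not give the weighting scheme; the sketch below shows one plausible weighted-sum screening of options, with hypothetical option names, attribute values and weights.

        def screen_options(options, weights, keep_fraction=0.5):
            """Rank process options by a weighted sum of normalised attributes and
            keep only the most promising ones for the next (more detailed) layer.
            Attributes where higher is worse (cost, time) enter with negative weight."""
            names = list(options)
            attrs = list(weights)
            # Min-max normalise each attribute across the candidate set.
            lo = {a: min(options[n][a] for n in names) for a in attrs}
            hi = {a: max(options[n][a] for n in names) for a in attrs}

            def score(n):
                total = 0.0
                for a, w in weights.items():
                    span = (hi[a] - lo[a]) or 1.0
                    total += w * (options[n][a] - lo[a]) / span
                return total

            ranked = sorted(names, key=score, reverse=True)
            return ranked[:max(1, int(len(ranked) * keep_fraction))]

        # Hypothetical antivenom process options (units arbitrary).
        options = {
            "baseline":        {"yield": 1.0, "cost": 1.0, "time": 1.0},
            "microfiltration": {"yield": 1.1, "cost": 0.9, "time": 1.0},
            "protein_G":       {"yield": 1.3, "cost": 1.2, "time": 0.7},
        }
        print(screen_options(options, {"yield": 1.0, "cost": -0.5, "time": -0.5}))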

  13. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  15. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  16. Comparison of processing and sectioning methodologies for arteries containing metallic stents.

    PubMed

    Rippstein, Peter; Black, Melanie K; Boivin, Marie; Veinot, John P; Ma, Xiaoli; Chen, Yong-Xiang; Human, Paul; Zilla, Peter; O'Brien, Edward R

    2006-06-01

    The histological study of arteries with implanted metallic scaffolding devices, known as stents, remains a technical challenge. Given that the arterial response to stent implantation can sometimes lead to adverse outcomes, including the re-accumulation of tissue mass within the stent (or in-stent restenosis), overcoming these technical challenges is a priority for the advancement of research and development in this important clinical field. Essentially, the task is to section the stent-tissue interface with the least amount of disruption of tissue and cellular morphology. Although many methacrylate resin methodologies are successfully applied toward the study of endovascular stents by a variety of research laboratories, the exact formulations, as well as subsequent processing and sectioning methodology, remain largely coveted. In this paper, we describe in detail a methyl methacrylate resin-embedding methodology that can successfully be applied to tungsten carbide blade, as well as saw and grinding sectioning methods and transmission electron microscopy. In addition, we present a comparison of the two sectioning methodologies in terms of their effectiveness with regard to morphological, histochemical, and immunohistochemical analyses. This manuscript contains online supplemental material at http://www.jhc.org. Please visit this article online to view these materials.

  17. Digital image processing software system using an array processor

    SciTech Connect

    Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.

    1981-03-10

    A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table.

  18. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    ERIC Educational Resources Information Center

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  20. Digital Detection and Processing of Multiple Quadrature Harmonics for EPR Spectroscopy

    PubMed Central

    Ahmad, R.; Som, S.; Kesselring, E.; Kuppusamy, P.; Zweier, J.L.; Potter, L.C.

    2010-01-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration. PMID:20971667

  1. Digital detection and processing of multiple quadrature harmonics for EPR spectroscopy.

    PubMed

    Ahmad, R; Som, S; Kesselring, E; Kuppusamy, P; Zweier, J L; Potter, L C

    2010-12-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration.
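
    As a rough illustration of the digital down-conversion step described in these two records, the sketch below mixes a sampled channel with quadrature copies of the reference and low-pass filters the products; it assumes the reference frequency is already known, uses a simple moving-average filter in place of the authors' processing, and all frequencies are scaled-down illustrative values.

        import numpy as np

        def down_convert(signal, f_ref, fs, lp_len=64):
            """Digitally mix a sampled channel with quadrature copies of the
            reference frequency and low-pass (moving average) to get I/Q baseband."""
            t = np.arange(len(signal)) / fs
            i_mix = signal * np.cos(2 * np.pi * f_ref * t)
            q_mix = -signal * np.sin(2 * np.pi * f_ref * t)
            kernel = np.ones(lp_len) / lp_len
            i_bb = np.convolve(i_mix, kernel, mode="same")
            q_bb = np.convolve(q_mix, kernel, mode="same")
            return i_bb + 1j * q_bb

        # Example with scaled-down sample and carrier rates (not L-band values).
        fs, f_ref = 1.0e6, 1.0e5
        t = np.arange(4096) / fs
        sig = np.cos(2 * np.pi * f_ref * t + 0.4) * (1 + 0.2 * np.cos(2 * np.pi * 500 * t))
        iq = down_convert(sig, f_ref, fs)
        spectrum = np.abs(np.fft.rfft(np.real(iq)))   # inspect recovered harmonics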

  2. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high-performance liquid chromatography (HPLC). Fructus Arctii should be processed under heating in a pot at 311 °C, with the medicine at 119 °C for 123 s with frequent flipping. The experimental values under the optimized processing technology were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM).

  3. Integrating the dynamics of personality and close relationship processes: methodological and data analytic implications.

    PubMed

    Graber, Elana C; Laurenceau, Jean-Philippe; Carver, Charles S

    2011-12-01

    A common theme that has emerged from classic and contemporary theoretical work in both the fields of personality and relationship science is a focus on process. Current process-focused theories bearing on personality invoke a view of the individual in ongoing action and interaction with the environment, reflecting a flow of experience rather than a static depiction. To understand the processes by which personality interacts with the social environment (particularly dyads), investigations must capture individuals interacting in multiple interpersonal situations, which likely necessitates complex study designs and corresponding data analytic strategies. Using an illustrative simulated data set, we focus on diary methods and corresponding individual and dyadic multilevel models to capture person-situation interaction within the context of processes in daily close relationship life. Finally, we consider future directions that conceptualize personality and close relationship processes from a dynamical systems theoretical and methodological perspective.

  4. BPMN, Toolsets, and Methodology: A Case Study of Business Process Management in Higher Education

    NASA Astrophysics Data System (ADS)

    Barn, Balbir S.; Oussena, Samia

    This chapter describes ongoing action research which is exploring the use of BPMN and a specific toolset - Intalio Designer to capture the “as is” essential process model of part of an overarching large business process within higher education. The chapter contends that understanding the efficacy of the BPMN notation and the notational elements to use is not enough. Instead, the effectiveness of a notation is determined by the notation, the toolset that is being used, and methodological consideration. The chapter presents some of the challenges that are faced in attempting to develop computation independent models in BPMN using toolsets such as Intalio Designer™.

  5. Age differences in decision making: a process methodology for examining strategic information processing.

    PubMed

    Johnson, M M

    1990-03-01

    This study explored the use of process tracing techniques in examining the decision-making processes of older and younger adults. Thirty-six college-age and thirty-six retirement-age participants decided which one of six cars they would purchase on the basis of computer-accessed data. They provided information search protocols. Results indicate that total time to reach a decision did not differ according to age. However, retirement-age participants used less information, spent more time viewing, and re-viewed fewer bits of information than college-age participants. Information search patterns differed markedly between age groups. Patterns of retirement-age adults indicated their use of noncompensatory decision rules which, according to decision-making literature (Payne, 1976), reduce cognitive processing demands. The patterns of the college-age adults indicated their use of compensatory decision rules, which have higher processing demands.

  6. The design, fabrication, and test of a new VLSI hybrid analog-digital neural processing element

    NASA Technical Reports Server (NTRS)

    Deyong, Mark R.; Findley, Randall L.; Fields, Chris

    1992-01-01

    A hybrid analog-digital neural processing element with the time-dependent behavior of biological neurons has been developed. The hybrid processing element is designed for VLSI implementation and offers the best attributes of both analog and digital computation. Custom VLSI layout reduces the layout area of the processing element, which in turn increases the expected network density. The hybrid processing element operates at the nanosecond time scale, which enables it to produce real-time solutions to complex spatiotemporal problems found in high-speed signal processing applications. VLSI prototype chips have been designed, fabricated, and tested with encouraging results. Systems utilizing the time-dependent behavior of the hybrid processing element have been simulated and are currently in the fabrication process. Future applications are also discussed.

  7. The design, fabrication, and test of a new VLSI hybrid analog-digital neural processing element

    NASA Technical Reports Server (NTRS)

    Deyong, Mark R.; Findley, Randall L.; Fields, Chris

    1992-01-01

    A hybrid analog-digital neural processing element with the time-dependent behavior of biological neurons has been developed. The hybrid processing element is designed for VLSI implementation and offers the best attributes of both analog and digital computation. Custom VLSI layout reduces the layout area of the processing element, which in turn increases the expected network density. The hybrid processing element operates at the nanosecond time scale, which enables it to produce real-time solutions to complex spatiotemporal problems found in high-speed signal processing applications. VLSI prototype chips have been designed, fabricated, and tested with encouraging results. Systems utilizing the time-dependent behavior of the hybrid processing element have been simulated and are currently in the fabrication process. Future applications are also discussed.

  8. Use of a qualitative methodological scaffolding process to design robust interprofessional studies.

    PubMed

    Wener, Pamela; Woodgate, Roberta L

    2013-07-01

    Increasingly, researchers are using qualitative methodology to study interprofessional collaboration (IPC). With this increase in use, there seems to be an appreciation for how qualitative studies allow us to understand the unique individual or group experience in more detail and form a basis for policy change and innovative interventions. Furthermore, there is an increased understanding of the potential of studying new or emerging phenomena qualitatively to inform further large-scale studies. Although there is a current trend toward greater acceptance of the value of qualitative studies describing the experiences of IPC, these studies are mostly descriptive in nature. Applying a process suggested by Crotty (1998) may encourage researchers to consider the value in situating research questions within a broader theoretical framework that will inform the overall research approach, including methodology and methods. This paper describes the application of this process to a research project and then illustrates how the process encouraged iterative cycles of thinking and doing. The authors describe each step of the process, share decision-making points, and suggest an additional step to the process. Applying this approach to selecting data collection methods may serve to guide and support the qualitative researcher in creating a well-designed study approach.

  9. Evaluation of a Change Detection Methodology by Means of Binary Thresholding Algorithms and Informational Fusion Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects on human and natural activities. Maintaining an updated spatial database with the changes that have occurred allows better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery and aerial photographs, have proven to be suitable and secure means from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then, different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a multisource change detection fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The obtained results are then evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution.
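
    A hedged sketch of the index-threshold-fuse pipeline described above is given below, using a plain image-difference index, Otsu thresholding and majority-vote fusion; these are generic stand-ins, not the specific indices, thresholding algorithms or fusion rule evaluated in the paper.

        import numpy as np

        def otsu_threshold(values, nbins=256):
            """Classic Otsu threshold on a 1-D array of change-index values."""
            hist, edges = np.histogram(values, bins=nbins)
            p = hist.astype(float) / hist.sum()
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0 = np.cumsum(p)
            w1 = 1.0 - w0
            mu0 = np.cumsum(p * centers) / np.where(w0 > 0, w0, 1)
            mu_total = (p * centers).sum()
            mu1 = (mu_total - np.cumsum(p * centers)) / np.where(w1 > 0, w1, 1)
            between = w0 * w1 * (mu0 - mu1) ** 2        # between-class variance
            return centers[int(np.argmax(between))]

        def change_map(img_t1, img_t2):
            """Absolute-difference change index thresholded into change/no_change."""
            index = np.abs(img_t2.astype(float) - img_t1.astype(float))
            return index > otsu_threshold(index.ravel())

        def fuse(maps):
            """Majority-vote fusion of several binary change maps."""
            stack = np.stack(maps).astype(int)
            return stack.sum(axis=0) > (len(maps) / 2)

        # Example with two synthetic acquisitions of the same scene.
        rng = np.random.default_rng(2)
        t1 = rng.random((50, 50))
        t2 = t1.copy()
        t2[10:20, 10:20] += 0.8                         # a changed patch
        cd = fuse([change_map(t1, t2), change_map(t1, t2 + 0.01)])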

  10. [Development history of methodology of Chinese medicines' authentication].

    PubMed

    Chen, Ke-Li; Huang, Lin-Fang; Liu, Yi-Mei

    2014-04-01

    This paper reviewed the emergence of the subject and methodology of Chinese Medicines' Authentication. Based on the research progress and major achievements in each methodology, including identification by origin, description, and microscopic, physical, chemical and biological characteristics of Chinese medicines, it is expounded that the development of each methodology combined modern digital technology, information science and its own characteristics. The development direction of the methodology of Chinese Medicines' Authentication, towards systematization and informationization, is further described.

  11. From Algorithms to Processing Chains: A Review of Land Cover and Land Use Change Detection Methodologies

    NASA Astrophysics Data System (ADS)

    Thonfeld, Frank; Hecheltjen, Antje; Braun, Matthias; Menz, Gunter

    2010-12-01

    Rapidly growing data archives and new data sources cause considerable heterogeneity in change detection methodologies. Here, we give an exemplary survey of change detection algorithms and their evolution over time. We divide them into five categories (algebra, transformation, classification, GIS and visualization) to simplify their heterogeneity. The survey is extended to the complete process chains needed to generate change products. The focus is on major innovations with regard to automation. A quantitative analysis of the literature was conducted by applying a keyword search to find out which algorithms are favored for which application. SAR and object-based approaches are considered as well. It can be shown that rather old methodologies are favored for the vast majority of applications.

  12. Digital Screening and Halftone Techniques for Raster Processing,

    DTIC Science & Technology

    1980-01-14

    Digital Screening and Halftone Techniques for Raster Processing, by Richard L. Rosenthal, January 1980. U.S. Army Engineer Topographic Laboratories, Fort Belvoir, VA. Approved for public release; distribution unlimited.

  13. A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing

    DTIC Science & Technology

    2014-03-01

    ... and requirements. This approach is time-consuming and error-prone, and there is little tracking to ensure that changes are correctly implemented [6

  14. BM/C3 Force Model VLSI/VHSIC Digital Processing: A Cost Methodology

    DTIC Science & Technology

    1988-10-01

    ... number of production lots is increased. This is similar to a learning curve effect. Quantity demanded largely explains the reason why memory chips are ...

  15. Application of digital image processing techniques to astronomical imagery 1978

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1978-01-01

    Techniques for using image processing in astronomy are identified and developed for the following: (1) geometric and radiometric decalibration of vidicon-acquired spectra, (2) automatic identification and segregation of stars from galaxies; and (3) display of multiband radio maps in compact and meaningful formats. Examples are presented of these techniques applied to a variety of objects.

  16. Process optimization of rolling for zincked sheet technology using response surface methodology and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ji, Liang-Bo; Chen, Fang

    2017-07-01

    Numerical simulation and intelligent optimization technology were adopted for the rolling and extrusion of zincked sheet. Using response surface methodology (RSM), a genetic algorithm (GA) and data processing technology, an efficient optimization of process parameters for the rolling of zincked sheet was investigated. The influence trends of roller gap, rolling speed and friction factor on reduction rate and plate shortening rate were analyzed first. Then a predictive response surface model for the comprehensive quality index of the part was created using RSM, and simulated and predicted values were compared. The optimal process parameters for rolling were then solved for with the genetic algorithm and verified. The approach is feasible and effective.
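
    A minimal sketch of the RSM-plus-GA idea described above is shown below: a second-order response surface is fitted to a handful of runs and then searched with a tiny evolutionary loop; the factor names, run data, bounds and GA settings are hypothetical and not taken from the paper.

        import numpy as np

        def quadratic_features(x):
            """Second-order RSM terms for 3 factors: 1, xi, xi*xj, xi^2."""
            g, v, f = x
            return np.array([1, g, v, f, g*v, g*f, v*f, g*g, v*v, f*f])

        def fit_response_surface(X, y):
            """Least-squares fit of the quadratic model to observed runs."""
            A = np.array([quadratic_features(x) for x in X])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef

        def ga_maximise(coef, bounds, pop=40, gens=60, seed=0):
            """Tiny evolutionary search of the fitted surface within factor bounds."""
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds).T
            P = rng.uniform(lo, hi, size=(pop, len(lo)))
            for _ in range(gens):
                fit = np.array([quadratic_features(x) @ coef for x in P])
                parents = P[np.argsort(fit)[-pop // 2:]]     # keep the best half
                children = parents + rng.normal(0, 0.05, parents.shape) * (hi - lo)
                P = np.clip(np.vstack([parents, children]), lo, hi)
            return P[np.argmax([quadratic_features(x) @ coef for x in P])]

        # Hypothetical runs: (roller gap, rolling speed, friction factor) -> quality index.
        X = np.array([[0.20, 1.00, 0.10], [0.30, 1.20, 0.15], [0.25, 0.80, 0.12],
                      [0.35, 1.10, 0.20], [0.22, 0.90, 0.18], [0.28, 1.30, 0.11],
                      [0.33, 1.00, 0.14], [0.24, 1.15, 0.16], [0.30, 0.95, 0.19],
                      [0.27, 1.05, 0.13]])
        y = np.array([0.62, 0.70, 0.64, 0.66, 0.61, 0.72, 0.69, 0.68, 0.63, 0.71])
        coef = fit_response_surface(X, y)
        print(ga_maximise(coef, bounds=[(0.2, 0.35), (0.8, 1.3), (0.1, 0.2)]))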

  17. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
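
    As one concrete instance of the value-improvement techniques mentioned above, the sketch below runs value iteration on a discounted finite MDP; the transition and reward arrays are toy placeholders, not the mallard harvest model.

        import numpy as np

        def value_iteration(P, R, gamma=0.95, tol=1e-8):
            """P[a, s, s'] transition probabilities, R[a, s] expected rewards.
            Returns the optimal value function and a greedy policy."""
            n_actions, n_states, _ = P.shape
            V = np.zeros(n_states)
            while True:
                Q = R + gamma * np.einsum("ast,t->as", P, V)   # action values
                V_new = Q.max(axis=0)
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new, Q.argmax(axis=0)
                V = V_new

        # Toy 2-state, 2-action example (placeholder numbers).
        P = np.array([[[0.9, 0.1], [0.2, 0.8]],
                      [[0.5, 0.5], [0.6, 0.4]]])
        R = np.array([[1.0, 0.0],
                      [0.5, 2.0]])
        V, policy = value_iteration(P, R)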

  18. Rapid processing of letters, digits and symbols: what purely visual-attentional deficit in developmental dyslexia?

    PubMed

    Ziegler, Johannes C; Pech-Georgel, Catherine; Dufau, Stéphane; Grainger, Jonathan

    2010-07-01

    Visual-attentional theories of dyslexia predict deficits for dyslexic children not only for the perception of letter strings but also for non-alphanumeric symbol strings. This prediction was tested in a two-alternative forced-choice paradigm with letters, digits, and symbols. Children with dyslexia showed significant deficits for letter and digit strings but not for symbol strings. This finding is difficult to explain for visual-attentional theories of dyslexia which postulate identical deficits for letters, digits and symbols. Moreover, dyslexics showed normal W-shaped serial position functions for letter and digit strings, which suggests that their deficit is not due to an abnormally small attentional window. Finally, the size of the deficit was identical for letters and digits, which suggests that poor letter perception is not just a consequence of the lack of reading. Together then, our results show that symbols that map onto phonological codes are impaired (i.e. letters and digits), whereas symbols that do not map onto phonological codes are not impaired. This dissociation suggests that impaired symbol-sound mapping rather than impaired visual-attentional processing is the key to understanding dyslexia.

  19. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

    A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within the RD53 collaboration, is being used to guide vital choices for the design and optimization of the next-generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation to the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove reliability at the system level.

  20. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
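
    The Fourier-space steps named in the claim (shift the spectrum so the heterodyne carrier sits at the origin, filter around the shifted origin, inverse transform) can be illustrated with a short sketch. The carrier frequency, filter radius, and synthetic hologram below are assumptions for the demonstration, not the patent's implementation.

```python
# Illustrative spatial-heterodyne demodulation: FFT, shift carrier to origin,
# low-pass filter, inverse FFT to recover a complex (amplitude and phase) image.
import numpy as np

def demodulate_hologram(holo, carrier, filter_radius):
    """holo: 2-D real hologram; carrier: (fy, fx) in cycles/pixel."""
    ny, nx = holo.shape
    F = np.fft.fftshift(np.fft.fft2(holo))
    # Shift the spectrum so the spatial-heterodyne carrier moves to the origin.
    shift_y = int(round(carrier[0] * ny))
    shift_x = int(round(carrier[1] * nx))
    F = np.roll(F, (-shift_y, -shift_x), axis=(0, 1))
    # Digital filter cutting off signals away from the (shifted) origin.
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
    F *= (fy**2 + fx**2) < filter_radius**2
    return np.fft.ifft2(np.fft.ifftshift(F))

# Synthetic example: an object phase modulated onto a 0.2 cycles/pixel carrier.
y, x = np.mgrid[0:256, 0:256]
phase = 2 * np.pi * 0.2 * x + 0.5 * np.exp(-((x - 128)**2 + (y - 128)**2) / 800)
holo = 1 + np.cos(phase)
complex_image = demodulate_hologram(holo, carrier=(0.0, 0.2), filter_radius=0.1)
print(np.angle(complex_image).shape)
```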

  1. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  2. New methods, new methodology: Advanced CFD in the Snecma turbomachinery design process

    NASA Astrophysics Data System (ADS)

    Vuillez, Christophe; Petot, Bertrand

    1994-05-01

    CFD tools represent a significant source of improvements in the design process of turbomachinery components, leading to higher performances, cost and cycle savings as well as lower associated risks. Such methods are the backbone of compressor and turbine design methodologies at Snecma. In the 80's, the use of 3D Euler solvers was a key factor in designing fan blades with very high performance level. Counter rotating high speed propellers designed with this methodology reached measured performances very close to their ambitious objective from the first test series. In the late 80's and the beginning of the 90's, new, more powerful methods were rapidly developed and are now commonly used in the design process: a quasi-3D, compressible, transonic inverse method; quasi-3D and 3D Navier-Stokes solvers; 3D unsteady Euler solvers. As an example, several hundred 3D Navier-Stokes computations are run yearly for the design of low and high pressure compressor and turbine blades. In addition to their modelling capabilities, the efficient use of such methods in the design process comes from their close integration in the global methodology and from an adequate exploitation environment. Their validation, their calibration, and the correlations between different levels of modelling are of critical importance to an actual improvement in design know-how. The integration of different methods in the design process is described. Several examples of application illustrate their practical utilization. Comparisons between computational results and test results show their capabilities as well as their present limitations. The prospects linked to new developments currently under way are discussed.

  3. Methodology for the Elimination of Reflection and System Vibration Effects in Particle Image Velocimetry Data Processing

    NASA Technical Reports Server (NTRS)

    Bremmer, David M.; Hutcheson, Florence V.; Stead, Daniel J.

    2005-01-01

    A methodology to eliminate model reflection and system vibration effects from post processed particle image velocimetry data is presented. Reflection and vibration lead to loss of data, and biased velocity calculations in PIV processing. A series of algorithms were developed to alleviate these problems. Reflections emanating from the model surface caused by the laser light sheet are removed from the PIV images by subtracting an image in which only the reflections are visible from all of the images within a data acquisition set. The result is a set of PIV images where only the seeded particles are apparent. Fiduciary marks painted on the surface of the test model were used as reference points in the images. By locating the centroids of these marks it was possible to shift all of the images to a common reference frame. This image alignment procedure as well as the subtraction of model reflection are performed in a first algorithm. Once the images have been shifted, they are compared with a background image that was recorded under no flow conditions. The second and third algorithms find the coordinates of fiduciary marks in the acquisition set images and the background image and calculate the displacement between these images. The final algorithm shifts all of the images so that fiduciary mark centroids lie in the same location as the background image centroids. This methodology effectively eliminated the effects of vibration so that unbiased data could be used for PIV processing. The PIV data used for this work was generated at the NASA Langley Research Center Quiet Flow Facility. The experiment entailed flow visualization near the flap side edge region of an airfoil model. Commercial PIV software was used for data acquisition and processing. In this paper, the experiment and the PIV acquisition of the data are described. The methodology used to develop the algorithms for reflection and system vibration removal is stated, and the implementation, testing and
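
    The two operations described above lend themselves to a short sketch: subtracting a reflection-only image from each PIV frame, and shifting each frame so a fiduciary-mark centroid coincides with its location in a no-flow background image. The thresholds, synthetic mark, and integer-pixel shift below are assumptions for illustration; they are not the algorithms used with the commercial PIV software mentioned in the record.

```python
# Illustrative reflection subtraction and fiduciary-mark alignment for PIV frames.
import numpy as np

def remove_reflections(frames, reflection_image):
    """Subtract the reflection-only image, clipping at zero."""
    return np.clip(frames - reflection_image, 0, None)

def mark_centroid(image, threshold):
    """Centroid of pixels above threshold (stand-in for locating a fiduciary mark)."""
    ys, xs = np.nonzero(image > threshold)
    return ys.mean(), xs.mean()

def align_to_background(frame, background, threshold=0.5):
    """Integer-pixel shift of `frame` so its mark centroid matches the background's."""
    cy_f, cx_f = mark_centroid(frame, threshold)
    cy_b, cx_b = mark_centroid(background, threshold)
    dy, dx = int(round(cy_b - cy_f)), int(round(cx_b - cx_f))
    return np.roll(frame, (dy, dx), axis=(0, 1))

# Synthetic usage with a single bright "mark" displaced by vibration.
background = np.zeros((64, 64)); background[30:33, 30:33] = 1.0
frame = np.zeros((64, 64));      frame[33:36, 28:31] = 1.0
aligned = align_to_background(frame, background)
print(mark_centroid(aligned, 0.5))   # ~ (31, 31), matching the background image
```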

  4. Experimental study of digital image processing techniques for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Rifman, S. S. (Principal Investigator); Allendoerfer, W. B.; Caron, R. H.; Pemberton, L. J.; Mckinnon, D. M.; Polanski, G.; Simon, K. W.

    1976-01-01

    The author has identified the following significant results. Results are reported for: (1) subscene registration, (2) full scene rectification and registration, (3) resampling techniques, (4) and ground control point (GCP) extraction. Subscenes (354 pixels x 234 lines) were registered to approximately 1/4 pixel accuracy and evaluated by change detection imagery for three cases: (1) bulk data registration, (2) precision correction of a reference subscene using GCP data, and (3) independently precision processed subscenes. Full scene rectification and registration results were evaluated by using a correlation technique to measure registration errors of 0.3 pixel rms throughout the full scene. Resampling evaluations of nearest neighbor and TRW cubic convolution processed data included change detection imagery and feature classification. Resampled data were also evaluated for an MSS scene containing specular solar reflections.

  5. Digital image processing on a small computer system

    NASA Technical Reports Server (NTRS)

    Danielson, R.

    1981-01-01

    A minicomputer-based image processing facility provides a relatively low-cost entry point for education about image analysis applications in remote sensing. While a minicomputer has sufficient processing power to produce results quite rapidly for low volumes of small images, it does not have sufficient power to perform CPU- or I/O-bound tasks on large images. A system equipped with a display terminal is ideally suited for interactive tasks. Software procurement is a limiting factor for most end users, and software availability may well be the overriding consideration in selecting a particular hardware configuration. The hardware chosen should be selected to be compatible with the software and with concern for future expansion.

  6. Pulsed digital holography system recording ultrafast process of the femtosecond order

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Zhai, Hongchen; Mu, Guoguang

    2006-06-01

    We report, for the first time to our knowledge, a pulsed digital microholographic system with spatial angular multiplexing for recording the ultrafast process of the femtosecond order. The optimized design of the two sets of subpulse-train generators in this system makes it possible to implement a digital holographic recording with spatial angular multiplexing of a frame interval of the femtosecond order, while keeping the incident angle of the object beams unchanged. Three pairs of amplitude and phase images from the same view angle digitally reconstructed by the system demonstrated the ultrafast dynamic process of laser-induced ionization of ambient air at a wavelength of 800 nm, with a time resolution of 50 fs and a frame interval of 300 fs.

  7. Digital phonocardiographic experiments and signal processing in multidisciplinary fields of university education

    NASA Astrophysics Data System (ADS)

    Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán

    2017-09-01

    Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well to the increasingly multidisciplinary nature of physics and engineering too. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels and complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis of further modifications using personal computers, tablets, and smartphones.

  8. Implementation of a Digital Signal Processing Subsystem for a Long Wavelength Array Station

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Navarro, Robert; D'Addario, Larry; Sigman, Elliott; Wang, Douglas

    2011-01-01

    This paper describes the implementation of a Digital Signal Processing (DSP) subsystem for a single Long Wavelength Array (LWA) station. The LWA is a radio telescope that will consist of many phased array stations. Each LWA station consists of 256 pairs of dipole-like antennas operating over the 10-88 MHz frequency range. The Digital Signal Processing subsystem digitizes up to 260 dual-polarization signals at 196 MHz from the LWA Analog Receiver, adjusts the delay and amplitude of each signal, and forms four independent beams. Coarse delay is implemented using a first-in-first-out buffer and fine delay is implemented using a finite impulse response filter. Amplitude adjustment and polarization corrections are implemented using a 2x2 matrix multiplication.
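
    The delay scheme summarized above (integer-sample FIFO coarse delay, FIR fine delay, 2x2 polarization correction) can be sketched in a few lines. The tap count, window, test tone, and Jones matrix below are arbitrary assumptions for illustration, not the LWA subsystem's actual coefficients.

```python
# Hedged sketch: coarse (integer) delay + windowed-sinc FIR fractional delay,
# followed by a 2x2 matrix applied across the two polarizations.
import numpy as np

def fractional_delay_fir(frac, n_taps=16):
    """FIR taps approximating a delay of `frac` samples (0 <= frac < 1)."""
    n = np.arange(n_taps) - (n_taps - 1) / 2
    taps = np.sinc(n - frac) * np.hamming(n_taps)
    return taps / taps.sum()

def delay_signal(x, delay_samples):
    coarse = int(np.floor(delay_samples))
    fine = delay_samples - coarse
    x = np.concatenate([np.zeros(coarse), x[:len(x) - coarse]])   # FIFO-style coarse delay
    return np.convolve(x, fractional_delay_fir(fine), mode="same")

def correct_polarization(x_pol, y_pol, jones):
    """Apply a 2x2 gain/cross-coupling correction to the two polarizations."""
    return jones @ np.vstack([x_pol, y_pol])

fs = 196e6                                        # station sample rate quoted in the record
t = np.arange(4096) / fs
sig = np.sin(2 * np.pi * 40e6 * t)                # assumed 40 MHz test tone
delayed = delay_signal(sig, delay_samples=3.37)   # 3 coarse samples + 0.37-sample fine delay
corrected = correct_polarization(delayed, delayed, np.array([[1.0, 0.02], [0.02, 1.0]]))
print(corrected.shape)
```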

  9. Introduction to the Special Issue on Digital Signal Processing in Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Kocz, J.; Bailes, M.; Greenhill, L. J.

    Advances in astronomy are intimately linked to advances in digital signal processing (DSP). This special issue is focused upon advances in DSP within radio astronomy. The trend within that community is to use off-the-shelf digital hardware where possible and leverage advances in high performance computing. In particular, graphics processing units (GPUs) and field programmable gate arrays (FPGAs) are being used in place of application-specific circuits (ASICs); high-speed Ethernet and Infiniband are being used for interconnect in place of custom backplanes. Further, to lower hurdles in digital engineering, communities have designed and released general-purpose FPGA-based DSP systems, such as the CASPER ROACH board, ASTRON Uniboard, and CSIRO Redback board. In this introductory paper, we give a brief historical overview, a summary of recent trends, and provide an outlook on future directions.

  10. Evaluation of solar angle variation over digital processing of LANDSAT imagery. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1984-01-01

    The effects of the seasonal variation of illumination over digital processing of LANDSAT images are evaluated. Original images are transformed by means of digital filtering to enhance their spatial features. The resulting images are used to obtain an unsupervised classification of relief units. After defining relief classes, which are supposed to be spectrally different, topographic variables (declivity, altitude, relief range and slope length) are used to identify the true relief units existing on the ground. The samples are also clustered by means of an unsupervised classification option. The results obtained for each LANDSAT overpass are compared. Digital processing is highly affected by illumination geometry. There is no correspondence between relief units as defined by spectral features and those resulting from topographic features.

  11. Advanced Digital Signal Processing for Hybrid Lidar FY 2014

    DTIC Science & Technology

    2014-10-30

    The experimental setup is shown in Figure 11 of the report: a PC running MATLAB configures an RF signal generator to modulate a laser diode, which illuminates a target. The report describes the technical progress towards the development of signal processing algorithms for hybrid lidar-radar designed to improve…

  12. Comparison between digital Doppler filtering processes applied to radar signals

    NASA Astrophysics Data System (ADS)

    Desodt, G.

    1983-10-01

    Two families of Doppler processes based on FFT and FIR filters, respectively, are compared in terms of hardware complexity and performance. It is shown that FIR filter banks are characterized by better performance than FFT filter banks. For the same number of pulses, the FIR processor permits a better clutter rejection and greater bandwidth than the FFT one. Also, an FIR-based bank has a much simpler and more adaptable architecture than an FFT-based bank.
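
    To illustrate the two filter-bank structures being compared, the sketch below forms Doppler spectra for a synthetic slow-time pulse train with an FFT bank (uniform DFT) and with an FIR bank whose coefficients are windowed complex exponentials, which trades extra multipliers for lower sidelobes away from the clutter. The pulse count, Doppler positions, and window are assumptions; this is not the paper's processor.

```python
# FFT filter bank vs. a simple windowed FIR Doppler filter bank on synthetic data.
import numpy as np

n_pulses = 16
n = np.arange(n_pulses)

# Target at Doppler bin 5.3 plus strong clutter slightly off zero Doppler (assumed).
slow_time = np.exp(2j * np.pi * 5.3 * n / n_pulses)
slow_time += 30 * np.exp(2j * np.pi * 0.2 * n / n_pulses)

# FFT bank: one uniform DFT across the pulse train.
fft_bank = np.abs(np.fft.fft(slow_time))

# FIR bank: filter k applies taps w[n] * exp(-j*2*pi*k*n/N) to the same pulses.
window = np.hamming(n_pulses)
fir_bank = np.abs(np.array([
    np.sum(slow_time * window * np.exp(-2j * np.pi * k * n / n_pulses))
    for k in range(n_pulses)
]))

# In a bin away from both target and clutter, leakage is much lower for the FIR bank.
print("FFT bank, bin 10:", round(fft_bank[10], 2), " FIR bank, bin 10:", round(fir_bank[10], 2))
```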

  13. Characterization of Periodically Poled Nonlinear Materials Using Digital Image Processing

    DTIC Science & Technology

    2008-04-01

    Due to the nonlinear nature of the response, a nonlinear polarization at new frequencies is generated which can radiate at frequencies not present in the incident radiation field. This coupling allows energy to be transferred between different wavelengths and forms the basis of the physical mechanism behind these processes. An isolated atom would radiate in the typical dipole radiation pattern, but in a material, a large number of…

  14. Automated microstructural analysis of titanium alloys using digital image processing

    NASA Astrophysics Data System (ADS)

    Campbell, A.; Murray, P.; Yakushina, E.; Marshall, S.; Ion, W.

    2017-02-01

    Titanium is a material that exhibits many desirable properties including a very high strength to weight ratio and corrosive resistance. However, the specific properties of any components depend upon the microstructure of the material, which varies by the manufacturing process. This means it is often necessary to analyse the microstructure when designing new processes or performing quality assurance on manufactured parts. For Ti6Al4V, grain size analysis is typically performed manually by expert material scientists as the complicated microstructure of the material means that, to the authors' knowledge, no existing software reliably identifies the grain boundaries. This manual process is time consuming and offers low repeatability due to human error and subjectivity. In this paper, we propose a new, automated method to segment microstructural images of a Ti6Al4V alloy into its constituent grains and produce measurements. The results of applying this technique are evaluated by comparing the measurements obtained by different analysis methods. By using measurements from a complete manual segmentation as a benchmark we explore the reliability of the current manual estimations of grain size and contrast this with improvements offered by our approach.

  15. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  16. High-speed optical processing using digital micromirror device

    NASA Astrophysics Data System (ADS)

    Chao, Tien-Hsin; Lu, Thomas; Walker, Brian; Reyes, George

    2014-04-01

    We have designed an optical processing architecture and algorithms utilizing the DMD as the input and filter Spatial Light Modulators (SLMs). A detailed system analysis is presented. An experimental demonstration, for the first time, showing that a complex-valued spatial filter can be successfully written on the DMD SLM using a Computer Generated Hologram (CGH) [1] encoding technique is also provided. The high resolution and high bandwidth provided by the DMD, and its potential low cost due to mass production, will enable a wide range of defense and civil applications.

  17. Advancing Nursing Research in the Visual Era: Reenvisioning the Photovoice Process Across Phenomenological, Grounded Theory, and Critical Theory Methodologies.

    PubMed

    Evans-Agnew, Robin A; Boutain, Doris M; Rosemberg, Marie-Anne S

    Photovoice is a powerful research method that employs participant photography for advancing voice, knowledge, and transformative change among groups historically or currently marginalized. Paradoxically, this research method risks exploitation of participant voice because of weak methodology to method congruence. The purposes of this retrospective article are to revisit current interdisciplinary research using photovoice and to suggest how to advance photovoice by improving methodology-method congruence. Novel templates are provided for improving the photovoice process across phenomenological, grounded theory, and critical theory methodologies.

  18. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    PubMed

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative, and streamlined workflow in order to provide high-quality patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of the treatment workflow for improving delivery systems. Project management techniques, a responsibility matrix and a swim-lane activity diagram representing the sequence of activities can be combined for data collection, presentation, and evaluation of patient care. This paper presents such an integrated methodology using multidisciplinary meetings and a walking-the-route approach for data collection, an integrated responsibility matrix and swim-lane activity diagram with activity times for data representation, and a 5-why and gap analysis approach for data analysis. This enables collection of the right level of detail in a shorter time frame by identifying process flaws and deficiencies while remaining independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate the effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal staff training and rapid implementation. © 2011 National Association for Healthcare Quality.

  19. On the selection of tuning methodology of FOPID controllers for the control of higher order processes.

    PubMed

    Das, Saptarshi; Saha, Suman; Das, Shantanu; Gupta, Amitava

    2011-07-01

    In this paper, a comparative study is done on the time and frequency domain tuning strategies for fractional order (FO) PID controllers to handle higher order processes. A new fractional order template for reduced parameter modelling of stable minimum/non-minimum phase higher order processes is introduced and its advantage in frequency domain tuning of FOPID controllers is also presented. The time domain optimal tuning of FOPID controllers has also been carried out to handle these higher order processes by performing optimization with various integral performance indices. The paper highlights practical control system implementation issues such as flexibility of online autotuning, reduced control signal and actuator size, capability of measurement noise filtration, load disturbance suppression, and robustness against parameter uncertainties, in light of the above tuning methodologies.
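
    The tuning strategies compared in this record target the standard fractional-order PID (PI^λD^μ) structure; the snippet below states that controller transfer function in its commonly used form. The paper's specific parameterization and its reduced-order process template are not reproduced here.

```latex
% Fractional-order PID (PI^{\lambda}D^{\mu}) controller in the Laplace domain:
% K_p, K_i, K_d are the proportional, integral and derivative gains, and
% \lambda, \mu > 0 are the fractional orders of integration and differentiation
% (\lambda = \mu = 1 recovers the classical integer-order PID controller).
C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d \, s^{\mu}
```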

  20. Processing optimization of probiotic yogurt containing glucose oxidase using response surface methodology.

    PubMed

    Cruz, A G; Faria, J A F; Walter, E H M; Andrade, R R; Cavalcanti, R N; Oliveira, C A F; Granato, D

    2010-11-01

    Exposure to oxygen may induce a lack of functionality in probiotic dairy foods because, given the anaerobic metabolism of probiotic bacteria, it compromises the maintenance of their viability during storage and hence the benefits they provide to consumer health. Glucose oxidase can constitute a potential alternative to increase the survival of probiotic bacteria in yogurt because it consumes the oxygen permeating to the inside of the pot during storage, thus making it possible to avoid the use of chemical additives. This research aimed to optimize the processing of probiotic yogurt supplemented with glucose oxidase using response surface methodology and to determine the levels of glucose and glucose oxidase that minimize the concentration of dissolved oxygen and maximize the Bifidobacterium longum count by the desirability function. Response surface methodology mathematical models adequately described the process, with adjusted determination coefficients of 83% for the oxygen and 94% for the B. longum. Linear and quadratic effects of the glucose oxidase were reported for the oxygen model, whereas for the B. longum count model an influence of the glucose oxidase at the linear level was observed, followed by the quadratic influence of glucose and the quadratic effect of glucose oxidase. The desirability function indicated that 62.32 ppm of glucose oxidase and 4.35 ppm of glucose was the best combination of these components for optimization of probiotic yogurt processing. An additional validation experiment was performed and results showed acceptable error between the predicted and experimental results.
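
    The desirability-function step mentioned above can be illustrated with a small sketch: each response is mapped to a [0, 1] desirability and the geometric mean is maximized over the two factors. The response models, limits, and factor ranges below are hypothetical placeholders, not the paper's fitted RSM equations.

```python
# Derringer-style desirability sketch with invented response models.
import numpy as np

def desirability_minimize(y, target, upper):
    """1 at or below `target`, 0 at or above `upper` (for dissolved oxygen)."""
    return np.clip((upper - y) / (upper - target), 0.0, 1.0)

def desirability_maximize(y, lower, target):
    """0 at or below `lower`, 1 at or above `target` (for the B. longum count)."""
    return np.clip((y - lower) / (target - lower), 0.0, 1.0)

def overall_desirability(glucose_oxidase, glucose):
    # Hypothetical second-order response models (placeholders only).
    oxygen = 6.0 - 0.05 * glucose_oxidase + 0.0003 * glucose_oxidase**2
    count = 7.0 + 0.02 * glucose_oxidase + 0.05 * glucose - 0.004 * glucose**2
    d1 = desirability_minimize(oxygen, target=1.0, upper=6.0)
    d2 = desirability_maximize(count, lower=6.5, target=8.5)
    return np.sqrt(d1 * d2)          # geometric mean of the two desirabilities

# Grid search over the two factors for the most desirable combination.
gox, glc = np.meshgrid(np.linspace(0, 100, 201), np.linspace(0, 10, 101))
D = overall_desirability(gox, glc)
i, j = np.unravel_index(np.argmax(D), D.shape)
print("best (glucose oxidase ppm, glucose ppm):", gox[i, j], glc[i, j], "D =", round(D[i, j], 3))
```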

  1. SNMG: a social-level norm-based methodology for macro-governing service collaboration processes

    NASA Astrophysics Data System (ADS)

    Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping

    2017-08-01

    In order to adapt to the increasingly open nature of collaborations between enterprises, this paper proposes a social-level, norm-based methodology for macro-governing service collaboration processes, called SNMG, to regulate and control the socially visible macro-behaviors of the individuals participating in collaborations. SNMG not only effectively removes the uncontrollability that hinders open social activities, but also enables cross-management-domain collaborations to be implemented by uniting the centralized controls that social individuals exercise over their respective activities. This paper therefore provides a new system construction mode to promote the development and large-scale deployment of service collaborations.

  2. Digital Image Processing Applied To Problems In Art And Archaeology

    NASA Astrophysics Data System (ADS)

    Asmus, John F.; Katz, Norman P.

    1988-12-01

    Many of the images encountered during scholarly studies in the fields of art and archaeology have deteriorated through the effects of time. The Ice-Age rock art of the now-closed caves near Lascaux are prime examples of this fate. However, faint and subtle details of these can be exceedingly important as some theories suggest that the designs are computers or calendars pertaining to astronomical cycles as well as seasons for hunting, gathering, and planting. Consequently, we have applied a range of standard image processing algorithms (viz., edge detection, spatial filtering, spectral differencing, and contrast enhancement) as well as specialized techniques (e.g., matched filters) to the clarification of these drawings. Also, we report the results of computer enhancement studies pertaining to authenticity, faint details, sitter identity, and age of portraits by da Vinci, Rembrandt, Rotari, and Titian.

  3. Digital Signal Processing for the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2015-08-01

    A broad international collaboration is building the Event Horizon Telescope (EHT). The aim is to test Einstein’s theory of General Relativity in one of the very few places it could break down: the strong gravity regime right at the edge of a black hole. The EHT is an earth-size VLBI array operating at the shortest radio wavelengths, that has achieved unprecedented angular resolution of a few tens of μarcseconds. For nearby super massive black holes (SMBH) this size scale is comparable to the Schwarzschild Radius, and emission in the immediate neighborhood of the event horizon can be directly observed. We give an introduction to the science behind the CASPER-enabled EHT, and outline technical developments, with emphasis on the secret sauce of high speed signal processing.

  4. On modeling the digital gate delay under process variation

    NASA Astrophysics Data System (ADS)

    Mingzhi, Gao; Zuochang, Ye; Yan, Wang; Zhiping, Yu

    2011-07-01

    To obtain a characterization method for the gate delay library used in block-based statistical static timing analysis (SSTA) that avoids both unacceptably poor accuracy and forbiddingly high cost, we found that general-purpose gate delay models are useful as intermediaries between circuit simulation data and gate delay models in the required forms. In this work, two gate delay models for process variation, considering different driving and loading conditions, are proposed. Testing results show that these two models, especially the one that combines effective dimension reduction (EDR) from the statistics community with comprehensive gate delay models, offer good accuracy with low characterization cost, and they are thus suitable for use in SSTA. In addition, these two models have their own value in other SSTA techniques.

  5. Modular Scanning Confocal Microscope with Digital Image Processing

    PubMed Central

    McCluskey, Matthew D.

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength. PMID:27829052
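
    The "virtual pinhole" idea described here can be sketched directly: for each scan position, only CCD pixels within a small radius of the detected spot centroid are summed, emulating a physical confocal pinhole. The radius, frame size, and synthetic frames below are assumptions for illustration, not the instrument's parameters.

```python
# Conceptual virtual-pinhole sketch: sum CCD intensity inside a disc around the spot centroid.
import numpy as np

def virtual_pinhole_signal(ccd_frame, radius=3.0):
    ys, xs = np.indices(ccd_frame.shape)
    total = ccd_frame.sum()
    cy = (ys * ccd_frame).sum() / total      # intensity-weighted spot centroid
    cx = (xs * ccd_frame).sum() / total
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
    return ccd_frame[mask].sum()

# One confocal image value per scan position: apply the virtual pinhole frame by frame.
rng = np.random.default_rng(2)
frames = rng.poisson(5.0, size=(100, 32, 32)).astype(float)   # stand-in CCD frames
confocal_line = np.array([virtual_pinhole_signal(f) for f in frames])
print(confocal_line.shape)
```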

  6. Modular Scanning Confocal Microscope with Digital Image Processing.

    PubMed

    Ye, Xianjun; McCluskey, Matthew D

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength.

  7. Automated Coronal Loop Identification using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Lee, J. K.; Gary, G. A.; Newman, T. S.

    2003-05-01

    The results of a Master's thesis study of computer algorithms for automatic extraction and identification (i.e., collectively, "detection") of optically-thin, 3-dimensional (solar) coronal-loop center "lines" from extreme ultraviolet and X-ray 2-dimensional images will be presented. The center lines, which can be considered to be splines, are proxies of magnetic field lines. Detecting the loops is challenging because there are no unique shapes, the loop edges are often indistinct, and because photon and detector noise heavily influence the images. Three techniques for detecting the projected magnetic field lines have been considered and will be described in the presentation. The three techniques used are (i) linear feature recognition of local patterns (related to the inertia-tensor concept), (ii) parametric space inferences via the Hough transform, and (iii) topological adaptive contours (snakes) that constrain curvature and continuity. Since coronal loop topology is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information that has also been incorporated into the detection process. Synthesized images have been generated to benchmark the suitability of the three techniques, and the performance of the three techniques on both synthesized and solar images will be presented and numerically evaluated in the presentation. The process of automatic detection of coronal loops is important in the reconstruction of the coronal magnetic field where the derived magnetic field lines provide a boundary condition for magnetic models (cf. Gary (2001, Solar Phys., 203, 71) and Wiegelmann & Neukirch (2002, Solar Phys., 208, 233)). This work was supported by NASA's Office of Space Science - Solar and Heliospheric Physics Supporting Research and Technology Program.
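
    Of the three techniques listed, the Hough transform is the simplest to sketch: bright pixels vote in (rho, theta) parameter space and peaks correspond to line-like features. The accumulator below is a minimal NumPy illustration on a synthetic straight segment; the thesis's loop detection (splines, dipole-field priors, snakes) is considerably more elaborate and is not reproduced here.

```python
# Minimal Hough-transform accumulator for straight-line detection in a binary image.
import numpy as np

def hough_line_accumulator(edges, n_theta=180):
    """Accumulate votes in (rho, theta) space for bright pixels in `edges`."""
    rows, cols = edges.shape
    diag = int(np.ceil(np.hypot(rows, cols)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int64)
    ys, xs = np.nonzero(edges)
    for i, theta in enumerate(thetas):
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + diag
        np.add.at(acc, (rhos, i), 1)
    return acc, thetas, np.arange(-diag, diag + 1)

# Synthetic test: a single straight "loop segment" along the image diagonal.
img = np.zeros((64, 64), dtype=bool)
idx = np.arange(10, 50)
img[idx, idx] = True
acc, thetas, rhos = hough_line_accumulator(img)
r, t = np.unravel_index(np.argmax(acc), acc.shape)
print("strongest line: rho =", rhos[r], "theta (deg) =", np.degrees(thetas[t]))
```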

  8. Mathematics and Science Teachers' Perceptions about Using Drama during the Digital Story Creation Process

    ERIC Educational Resources Information Center

    Yuksekyalcin, Gozen; Tanriseven, Isil; Sancar-Tokmak, Hatice

    2016-01-01

    This case study investigated math and science teachers' perceptions about the use of creative drama during a digital story (DS) creation process for educational purposes. A total of 25 secondary science and math teachers were selected according to criterion sampling strategy to participate in the study. Data were collected through an open-ended…

  9. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  10. Wavelet image processing applied to optical and digital holography: past achievements and future challenges

    NASA Astrophysics Data System (ADS)

    Jones, Katharine J.

    2005-08-01

    The link between wavelets and optics goes back to the work of Dennis Gabor, who both invented holography and developed Gabor decompositions. Holography involves 3-D images. Gabor decompositions involve 1-D signals. Gabor decompositions are the predecessors of wavelets. Wavelet image processing of holography, both optical holography and digital holography, will be examined with respect to past achievements and future challenges.

  11. Development of digital processing method of microfocus X-ray images

    NASA Astrophysics Data System (ADS)

    Staroverov, N. E.; Kholopova, E. D.; Gryaznov, A. Yu; Zhamova, K. K.

    2017-02-01

    The article describes the basic methods of digital processing of X-ray images. It also proposes a method for background image alignment based on modeling the distorting function and subtracting it from the image. Finally, an improved algorithm for locally adaptive median filtering is proposed, and its effectiveness is verified experimentally.
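
    A locally adaptive median filter of the general kind referred to above can be sketched with the standard adaptive scheme: the window grows when the local median itself looks like an impulse, and the centre pixel is replaced only when it is an outlier. The window sizes, test image, and noise model below are assumptions for illustration; the paper's exact variant and its background model are not reproduced.

```python
# Standard adaptive median filter sketch applied to salt-and-pepper noise.
import numpy as np

def adaptive_median(img, max_window=7):
    out = img.copy().astype(float)
    pad = max_window // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            w = 3
            while True:
                half = w // 2
                win = padded[r + pad - half:r + pad + half + 1,
                             c + pad - half:c + pad + half + 1]
                med, lo, hi = np.median(win), win.min(), win.max()
                if lo < med < hi:                       # local median is not an impulse
                    if not (lo < img[r, c] < hi):       # centre pixel is an impulse
                        out[r, c] = med
                    break
                w += 2
                if w > max_window:                      # give up: use the median anyway
                    out[r, c] = med
                    break
    return out

# Example: salt-and-pepper noise on a smooth ramp image.
rng = np.random.default_rng(3)
img = np.tile(np.linspace(0, 1, 64), (64, 1))
noisy = img.copy()
mask = rng.random(img.shape) < 0.05
noisy[mask] = rng.choice([0.0, 1.0], size=mask.sum())
print("mean abs error after filtering:", np.abs(adaptive_median(noisy) - img).mean())
```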

  12. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the collaborative work of specialists in medicine, applied mathematics and computer science on developing new algorithms and methods for medical 2D and 3D imagery.

  13. Digital image processing applications in the ignition and combustion of char/coal particles

    SciTech Connect

    Annamalai, K.; Kharbat, E.; Goplakrishnan, C.

    1992-12-01

    Digital image processing is employed in this research study in order to visually investigate the ignition and combustion characteristics of isolated char/coal particles as well as the effect of interactive combustion in two-particle char/coal arrays. Preliminary experiments are conducted on miniature isolated candles as well as two-candle arrays.

  14. An Undergraduate Course and Laboratory in Digital Signal Processing with Field Programmable Gate Arrays

    ERIC Educational Resources Information Center

    Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.

    2010-01-01

    In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…

  15. Highly Stretchable and UV Curable Elastomers for Digital Light Processing Based 3D Printing.

    PubMed

    Patel, Dinesh K; Sakhaei, Amir Hosein; Layani, Michael; Zhang, Biao; Ge, Qi; Magdassi, Shlomo

    2017-04-01

    Stretchable UV-curable (SUV) elastomers can be stretched by up to 1100% and are suitable for digital-light-processing (DLP)-based 3D-printing technology. DLP printing of these SUV elastomers enables the direct creation of highly deformable complex 3D hollow structures such as balloons, soft actuators, grippers, and buckyball electrical switches.

  16. Digitizing dissertations for an institutional repository: a process and cost analysis.

    PubMed

    Piorun, Mary; Palmer, Lisa A

    2008-07-01

    This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions.

  17. Hybrid-integrated optical acceleration seismometer and its digital processing system

    NASA Astrophysics Data System (ADS)

    En, De; Chen, Caihe; Cui, Yuming; Tang, Donglin; Liang, Zhengxi; Gao, Hongyu

    2005-02-01

    A hybrid-integrated optical acceleration seismometer and its digital signal processing system are researched and developed. A simple system diagram of the seismometer is given and its principle is explained. The seismometer is composed of a seismic mass, integrated optical chips and a Michelson interferometer light path; the Michelson integrated optical chips are the critical sensor elements. A simple diagram of the digital signal processing system is also given. Because an advanced digital signal processing (DSP) chip equipped with the necessary circuits is used in the processing system, highly accurate detection of the acceleration signal is achieved and environmental interference is effectively compensated. Test results indicate that the accelerometer has good frequency response well above the resonant frequency, with the output signal corresponding to the input signal, and also has good frequency response below the resonant frequency. Finally, the seismometer frequency response curve is given.

  18. Language influences on numerical development-Inversion effects on multi-digit number processing.

    PubMed

    Klein, E; Bahnmueller, J; Mann, A; Pixner, S; Kaufmann, L; Nuerk, H-C; Moeller, K

    2013-01-01

    In early numerical development, children have to become familiar with the Arabic number system and its place-value structure. The present review summarizes and discusses evidence for language influences on the acquisition of the highly transparent structuring principles of digital-Arabic digits by means of its moderation through the transparency of the respective language's number word system. In particular, the so-called inversion property (i.e., 24 named as "four and twenty" instead of "twenty four") was found to influence number processing in children not only in verbal but also in non-verbal numerical tasks. Additionally, there is first evidence suggesting that inversion-related difficulties may influence numerical processing longitudinally. Generally, language-specific influences in children's numerical development are most pronounced for multi-digit numbers. Yet, there is currently only one study on three-digit number processing for German-speaking children. A direct comparison of additional new data from Italian-speaking children further corroborates the assumption that language impacts on cognitive (number) processing as inversion-related interference was found most pronounced for German-speaking children. In sum, we conclude that numerical development may not be language-specific but seems to be moderated by language.

  19. Electronic post-compensation of WDM transmission impairments using coherent detection and digital signal processing.

    PubMed

    Li, Xiaoxu; Chen, Xin; Goldfarb, Gilad; Mateo, Eduardo; Kim, Inwoong; Yaman, Fatih; Li, Guifang

    2008-01-21

    A universal post-compensation scheme for fiber impairments in wavelength-division multiplexing (WDM) systems is proposed based on coherent detection and digital signal processing (DSP). Transmission of 10 x 10 Gbit/s binary-phase-shift-keying (BPSK) signals at a channel spacing of 20 GHz over 800 km dispersion shifted fiber (DSF) has been demonstrated numerically.
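
    The linear part of such electronic post-compensation can be sketched as undoing accumulated chromatic dispersion on the coherently detected field by applying the inverse all-pass dispersion response in the frequency domain. The sample rate, GVD value, and BPSK test sequence below are assumptions for illustration, and the full scheme reported in the record also addresses nonlinear impairments, which this sketch does not model.

```python
# Frequency-domain chromatic dispersion and its digital post-compensation (linear part only).
import numpy as np

def dispersion_operator(field, fs, beta2, length, sign):
    """Apply exp(sign * j * beta2/2 * omega^2 * L) to complex baseband samples."""
    omega = 2 * np.pi * np.fft.fftfreq(len(field), d=1 / fs)
    return np.fft.ifft(np.fft.fft(field) * np.exp(sign * 1j * beta2 / 2 * omega**2 * length))

fs, beta2, length = 20e9, -2.1e-26, 800e3      # assumed sample rate, GVD [s^2/m], 800 km span
symbols = np.random.default_rng(4).choice([-1.0, 1.0], size=2048)        # BPSK samples
dispersed = dispersion_operator(symbols, fs, beta2, length, sign=-1)      # fiber (linear part)
recovered = dispersion_operator(dispersed, fs, beta2, length, sign=+1)    # DSP post-compensation
print("max residual error:", np.max(np.abs(recovered - symbols)))          # numerical error only
```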

  20. Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.

    ERIC Educational Resources Information Center

    Smith, Cecil L., Ed.

    Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programing); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…