Science.gov

Sample records for digital processing methodology

  1. Digital Methodology to implement the ECOUTER engagement process

    PubMed Central

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2017-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  2. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  3. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…

  4. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists.

  5. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These digital techniques have dramatically advanced and simplified how we collect and analyze data in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example for stereoplot analysis). Requiring that all maps
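
    A minimal sketch of how such exported records might be consumed downstream is shown below; the column names and values are hypothetical stand-ins, not the apps' actual export schema.

    ```python
    import csv
    import io

    # Hypothetical example of an exported structural-data .csv; the actual
    # column names and formats of GeoFieldBook are assumptions here.
    sample = """latitude,longitude,date,formation,strike,dip,observations
    40.6934,-75.2096,2015-06-12,Martinsburg Fm,245,38,thin-bedded shale with cleavage
    40.6951,-75.2110,2015-06-12,Jacksonburg Fm,230,42,nodular limestone
    """

    records = list(csv.DictReader(io.StringIO(sample)))
    for r in records:
        print(r["formation"], r["strike"], r["dip"])
    ```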

  6. Digital processing clock

    NASA Technical Reports Server (NTRS)

    Phillips, D. H.

    1982-01-01

    The digital processing clock SG 1157/U is described. It is compatible with the PTTI world, where it can be driven by an external cesium source. Built-in test equipment shows synchronization with cesium through 1 pulse per second. It is built to be expandable to accommodate future time-keeping needs of the Navy as well as any other time-ordered functions. Examples of this expandability are the inclusion of an unmodulated XR3 time code and the 2137 modulated time code (XR3 with 1 kHz carrier).

  7. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  8. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  9. Methodologies for digital 3D acquisition and representation of mosaics

    NASA Astrophysics Data System (ADS)

    Manferdini, Anna Maria; Cipriani, Luca; Kniffitz, Linda

    2011-07-01

    Despite the recent improvements in digital technologies and their widespread application in the field of Cultural Heritage, museums and institutions are still not encouraged to adopt digital procedures as a standard practice for collecting data on the heritage they are called to preserve and promote. One of the main reasons is the high cost of these procedures, which rises further for artifacts and artworks whose evident intrinsic complexities and peculiarities cannot be reduced to recurring cases. The aim of this paper is to show the results of research conducted to find the most suitable digital methodology and procedure for collecting geometric and radiometric data on mosaics, one that can streamline both the preservation of the consistency of the geometric information and the management of huge amounts of data. One of the most immediate applications of digital 3D survey of mosaics is the substitution of the plaster casts that are usually built to add the third dimension to pictorial or photographic surveys before restoration interventions, in order to document their conservation conditions and ease reconstruction procedures. Moreover, digital 3D surveys of mosaics allow restoration interventions to be rehearsed in a digital environment where reliable preliminary evaluations can be performed; in addition, 3D reality-based models of mosaics can be used within digital catalogues or for digital exhibitions and reconstruction purposes.

  10. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel-coded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
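
    The report's specific Hadamard-domain motion detection scheme is not reproduced here, but a minimal sketch of the underlying idea, transforming co-located blocks with a 2-D Hadamard transform and flagging blocks whose coefficients change, might look as follows (the threshold and normalization are arbitrary choices):

    ```python
    import numpy as np

    def hadamard(n):
        # Sylvester construction; n must be a power of two.
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def hadamard2d(block):
        # Separable 2-D Hadamard transform of a square block.
        H = hadamard(block.shape[0])
        return H @ block @ H.T / block.size

    def block_changed(prev_block, curr_block, threshold=5.0):
        # Crude motion test: compare transform coefficients of co-located
        # blocks in two successive frames.
        diff = hadamard2d(curr_block) - hadamard2d(prev_block)
        return np.sum(np.abs(diff)) > threshold
    ```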

  11. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  12. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Oppenheim, A. V.; Baggeroer, A. B.; Lim, J. S.; Musicus, B. R.; Mook, D. R.; Duckworth, G. L.; Bordley, T. E.; Curtis, S. R.; Deadrick, D. S.; Dove, W. P.

    1984-01-01

    Signal and image processing research projects are described. Topics include: (1) modeling underwater acoustic propagation; (2) image restoration; (3) signal reconstruction; (4) speech enhancement; (5) pitch detection; (6) spectral analysis; (7) speech synthesis; (8) speech enhancement; (9) autoregressive spectral estimation; (10) knowledge based array processing; (11) speech analysis; (12) estimating the degree of coronary stenosis with image processing; (13) automatic target detection; and (14) video conferencing.

  13. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
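
    The ADSP's FFT convolver is only named, not detailed, in the abstract; as a rough illustration of the general technique, a generic FFT-based matched filter (fast correlation against a reference chirp), which is the core of range compression, could be sketched as:

    ```python
    import numpy as np

    def fft_convolver(signal, reference):
        """Fast correlation of a received signal with a reference chirp
        (matched filtering) via the FFT."""
        n = len(signal) + len(reference) - 1
        nfft = 1 << (n - 1).bit_length()          # next power of two
        S = np.fft.fft(signal, nfft)
        R = np.fft.fft(reference, nfft)
        out = np.fft.ifft(S * np.conj(R))         # correlation in one pass
        return out[:n]

    # Toy example: a linear-FM chirp buried in noise compresses to a sharp peak.
    t = np.linspace(0, 1, 1024)
    chirp = np.exp(1j * np.pi * 200 * t**2)
    echo = np.concatenate([np.zeros(300), chirp, np.zeros(300)])
    echo += 0.1 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))
    compressed = np.abs(fft_convolver(echo, chirp))
    print("peak at sample", int(np.argmax(compressed)))   # ~300
    ```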

  14. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  15. Digital image processing in cephalometric analysis.

    PubMed

    Jäger, A; Döler, W; Schormann, T

    1989-01-01

    Digital image processing methods were applied to improve the practicability of cephalometric analysis. The individual X-ray film was digitized with the aid of a high-resolution microscope-photometer. Digital processing was done using a VAX 8600 computer system. An improvement in image quality was achieved by means of various digital enhancement and filtering techniques.

  16. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  17. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  18. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state of the art in analytical techniques for the development and verification of digital flight control software is studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgments are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  19. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  20. Effectiveness of Digital Pulse Processing Using a Slow Waveform Digitizer

    NASA Astrophysics Data System (ADS)

    Anthony, Adam; Ahmed, Mohammad; Sikora, Mark

    2016-09-01

    Using a waveform digitizer, one can replace nearly all of the analog electronics typically involved in processing pulses from a detector by directly digitizing the signal and processing it with digital algorithms. Algorithms for timing filter amplification, constant fraction discrimination, trapezoidal pulse shaping, peak sensing with pileup rejection, and charge integration were developed and implemented. The algorithms and a digitizer with a sampling rate of 62.5 MS/s were used to determine the energy and timing resolution of various scintillation and solid-state detectors. These resolutions are compared against both the traditional charge-to-digital converter (QDC) and analog-to-digital converter (ADC) data acquisition setups in use at the High Intensity Gamma-Ray Source at Duke University. Preliminary results are presented.
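
    The paper's exact filter implementations are not given in the abstract; as an illustration of one of the named algorithms, a simplified trapezoidal shaper (a difference of delayed samples followed by accumulation, with no pole-zero correction for the detector decay constant) might look like this:

    ```python
    import numpy as np

    def trapezoidal_filter(v, rise, flat):
        """Simple trapezoidal shaper: d(n) = v(n) - v(n-k) - v(n-l) + v(n-k-l),
        then a running sum. Pole-zero correction for an exponential decay is
        deliberately omitted to keep the sketch minimal."""
        k, l = rise, rise + flat
        v = np.asarray(v, dtype=float)
        d = v.copy()
        d[k:] -= v[:-k]
        d[l:] -= v[:-l]
        d[k + l:] += v[:-(k + l)]
        return np.cumsum(d) / k   # normalize so the flat top equals the step height

    # Toy pulse: a unit step (an idealized preamplifier edge) plus noise
    pulse = np.concatenate([np.zeros(100), np.ones(400)])
    pulse += 0.02 * np.random.randn(pulse.size)
    shaped = trapezoidal_filter(pulse, rise=40, flat=20)
    print("estimated amplitude:", shaped[150])   # sample on the flat top, ~1.0
    ```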

  1. Methodology for digital radiography simulation using the Monte Carlo code MCNPX for industrial applications.

    PubMed

    Souza, E M; Correa, S C A; Silva, A X; Lopes, R T; Oliveira, D F

    2008-05-01

    This work presents a methodology for digital radiography simulation for industrial applications using the MCNPX radiography tally. In order to perform the simulation, the energy-dependent response of a BaFBr imaging plate detector was modeled and introduced in the MCNPX radiography tally input. In addition, a post-processing program was used to convert the MCNPX radiography tally output into 16-bit digital images. Simulated and experimental images of a steel pipe containing corrosion alveoli and stress corrosion cracking were compared, and the results showed good agreement between both images.
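
    The post-processing program itself is not described in detail in the abstract; one plausible, simplified way to convert a floating-point tally map into a 16-bit image is a linear rescale to the full uint16 range (the scaling choice here is an assumption):

    ```python
    import numpy as np

    def tally_to_uint16(tally):
        """Linearly rescale a 2-D array of tally values to the full 16-bit range,
        one simple way to produce a 16-bit digital radiograph."""
        t = np.asarray(tally, dtype=float)
        lo, hi = t.min(), t.max()
        scaled = (t - lo) / (hi - lo) if hi > lo else np.zeros_like(t)
        return np.round(scaled * 65535).astype(np.uint16)

    # Example on a synthetic 4x4 "radiograph" of arbitrary tally values
    img16 = tally_to_uint16(np.random.rand(4, 4))
    print(img16.dtype, img16.min(), img16.max())
    ```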

  2. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
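
    LIGMALS itself is not available here; the basic operations it provides, level slicing and ratio processing, can be illustrated with a short sketch (array sizes and thresholds are arbitrary):

    ```python
    import numpy as np

    def level_slice(band, low, high):
        """Binary mask of pixels whose value falls inside [low, high]."""
        return ((band >= low) & (band <= high)).astype(np.uint8)

    def ratio_image(band_a, band_b, eps=1e-6):
        """Pixel-by-pixel band ratio, a common multispectral enhancement."""
        return band_a.astype(float) / (band_b.astype(float) + eps)

    # Example on two synthetic 8-bit "bands"
    rng = np.random.default_rng(0)
    b1 = rng.integers(0, 256, (4, 4))
    b2 = rng.integers(1, 256, (4, 4))
    print(level_slice(b1, 100, 180))
    print(ratio_image(b1, b2))
    ```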

  3. Digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.; Stanley, W. D.; Harrington, R. F.

    1980-01-01

    A microprocessor based digital signal processing unit has been proposed to replace analog sections of a microwave radiometer. A brief introduction to the radiometer system involved and a description of problems encountered in the use of digital techniques in radiometer design are discussed. An analysis of the digital signal processor as part of the radiometer is then presented.

  4. Topics in digital signal processing

    NASA Astrophysics Data System (ADS)

    Narayan, S. S. R.

    Three discrete Fourier transform (DFT) algorithms, namely the fast Fourier transform algorithm (FFT), the prime factor algorithm (PFA) and the Winograd Fourier transform algorithm (WFTA), are analyzed and compared. A new set of short-length DFT algorithms well suited for special-purpose hardware implementations, employing monolithic multiplier-accumulators and microprocessors, is presented. Architectural considerations in designing DFT processors based on these algorithms are discussed. Efficient hardware structures for implementing the FFT and the PFA are presented. A digital implementation for performing linear-FM (LFM) pulse compression by using bandpass filter banks is presented. The concept of transform domain adaptive filtering is introduced. The DFT and discrete cosine transform (DCT) domain adaptive filtering algorithms are considered. Applications of these in the areas of speech processing and adaptive line enhancers are discussed. A simple waveform coding algorithm capable of providing good quality speech at about 1.5 bits per sample is presented.
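
    As a rough illustration of the transform-domain adaptive filtering concept introduced above (not the report's own implementation), a DFT-domain LMS filter with per-bin power normalization, used here in an adaptive-line-enhancer style configuration, might be sketched as:

    ```python
    import numpy as np

    def dft_domain_lms(x, d, n_taps=16, mu=0.05, eps=1e-6):
        """Transform-domain LMS: the input delay line is transformed with a DFT
        each sample and each bin's weight is adapted with a power-normalized
        LMS update."""
        w = np.zeros(n_taps, dtype=complex)
        power = np.full(n_taps, eps)
        y = np.zeros(len(x))
        buf = np.zeros(n_taps)
        for n in range(len(x)):
            buf = np.roll(buf, 1)
            buf[0] = x[n]
            X = np.fft.fft(buf)                       # transform of the delay line
            y_c = np.vdot(w, X)                       # filter output, w^H X
            e = d[n] - y_c                            # error against desired signal
            power = 0.9 * power + 0.1 * np.abs(X) ** 2
            w += mu * X * np.conj(e) / power          # normalized per-bin update
            y[n] = y_c.real
        return y

    # Adaptive line enhancer style setup: predict a noisy sinusoid from its past.
    t = np.arange(2000)
    d = np.sin(2 * np.pi * 0.05 * t) + 0.3 * np.random.randn(t.size)
    x = np.concatenate([[0.0], d[:-1]])               # one-sample-delayed input
    y = dft_domain_lms(x, d)
    ```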

  5. Seamless lesion insertion in digital mammography: methodology and reader study

    NASA Astrophysics Data System (ADS)

    Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman

    2016-03-01

    Collection of large repositories of clinical images containing verified cancer locations is costly and time consuming due to difficulties associated with both the accumulation of data and establishment of the ground truth. This problem poses a significant challenge to the development of machine learning algorithms that require large amounts of data to properly train and avoid overfitting. In this paper we expand the methods in our previous publications by making several modifications that significantly increase the speed of our insertion algorithms, thereby allowing them to be used for inserting lesions that are much larger in size. These algorithms have been incorporated into an image composition tool that we have made publicly available. This tool allows users to modify or supplement existing datasets by seamlessly inserting a real breast mass or micro-calcification cluster extracted from a source digital mammogram into a different location on another mammogram. We demonstrate examples of the performance of this tool on clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM). Finally, we report the results of a reader study evaluating the realism of inserted lesions compared to clinical lesions. Analysis of the radiologist scores in the study using receiver operating characteristic (ROC) methodology indicates that inserted lesions cannot be reliably distinguished from clinical lesions.

  6. [Generation and processing of digital images in radiodiagnosis].

    PubMed

    Bajla, I; Belan, V

    1993-05-01

    The paper describes universal principles of diagnostic imaging. The attention is focused particularly on digital image generation in medicine. The methodology of display visualization of measured data is discussed. The problems of spatial relation representation and visual perception of image brightness are mentioned. The methodological issues of digital image processing (DIP) are discussed, particularly the relation of DIP to the other related disciplines, fundamental tasks in DIP and classification of DIP operations from the computational viewpoint. The following examples of applying DIP operations in diagnostic radiology are overviewed: local contrast enhancement in digital image, spatial filtering, quantitative texture analysis, synthesis of the 3D pseudospatial image based on the 2D tomogram set, multimodal processing of medical images. New trends of application of DIP methods in diagnostic radiology are outlined: evaluation of the diagnostic efficiency of DIP operations by means of ROC analysis, construction of knowledge-based systems of DIP in medicine. (Fig. 12, Ref. 26.)

  7. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques and the accompanying software documentation for the digital enhancement of radiographs are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operation. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
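
    As a small illustration of why spatial-domain recursive filtering can be so cheap, the sketch below applies a generic first-order recursive smoother separably along rows and columns; this is not the matched filter class described in the report, only a stand-in for the implementation style.

    ```python
    import numpy as np

    def recursive_smooth_1d(x, alpha=0.2):
        """First-order recursive (IIR) low-pass: y[n] = y[n-1] + alpha*(x[n]-y[n-1]).
        Each output sample costs only a couple of operations, which is why
        recursive implementations can beat FFT-based filtering for simple kernels."""
        y = np.empty_like(x, dtype=float)
        acc = x[0]
        for n, v in enumerate(x):
            acc += alpha * (v - acc)
            y[n] = acc
        return y

    def recursive_smooth_2d(img, alpha=0.2):
        # Separable: filter rows, then columns (one forward pass each).
        rows = np.apply_along_axis(recursive_smooth_1d, 1, img, alpha)
        return np.apply_along_axis(recursive_smooth_1d, 0, rows, alpha)

    smoothed = recursive_smooth_2d(np.random.rand(64, 64))
    ```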

  8. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    Jemison, William D. (Clarkson University)

    2013-09-30

    The technical objective of this project is the development and evaluation of various digital signal processing (DSP) algorithms that will enhance hybrid lidar ... algorithm as shown in Figure 1. [Figure 1: hardware platform for algorithm implementation, underwater channel characteristics, and the lidar DSP algorithm.]

  9. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton- and alpha-emitting nuclei. Recent developments extend the sensitivity and breadth of the applications. The digital signal processing capabilities, used predominantly by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for measurements of alpha and beta-delayed neutron emitters, as well as the next generation of superheavy experiments, are discussed.

  10. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

    In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.

  11. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to the sensors. There are a few proposed methodologies for processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to the processing of low-voltage differential signals and the subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the
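
    As a simplified stand-in for the heritage reference-pixel corrections mentioned above (not the HHT-DPS method itself), a row-wise common-mode subtraction using the non-illuminated boundary columns could look like this:

    ```python
    import numpy as np

    def reference_pixel_correct(frame, n_ref=4):
        """Subtract, row by row, the mean of the reference columns at both edges
        of the array from all pixels. Only a simplified illustration of a
        statistical reference-pixel correction."""
        ref = np.concatenate([frame[:, :n_ref], frame[:, -n_ref:]], axis=1)
        row_offset = ref.mean(axis=1, keepdims=True)   # per-row common-mode estimate
        return frame - row_offset

    # Example: a synthetic frame with row-correlated readout noise
    rng = np.random.default_rng(1)
    frame = rng.normal(1000, 2, (32, 32)) + rng.normal(0, 5, (32, 1))
    corrected = reference_pixel_correct(frame)
    print(frame.std(), corrected.std())
    ```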

  12. Digital data acquisition and processing.

    PubMed

    Naivar, Mark A; Galbraith, David W

    2015-01-05

    A flow cytometer is made up of many different subsystems that work together to measure the optical properties of individual cells within a sample. The data acquisition system (also called the data system) is one of these subsystems, and it is responsible for converting the electrical signals from the optical detectors into list-mode data. This unit describes the inner workings of the data system, and provides insight into how the instrument functions as a whole. Some of the information provided in this unit is applicable to everyday use of these instruments, and, at minimum, should make it easier for the reader to assemble a specific data system. With the considerable advancement of electronics technology, it becomes possible to build an entirely functional data system using inexpensive hobbyist-level electronics. This unit covers both analog and digital data systems, but the primary focus is on the more prevalent digital data systems of modern flow cytometric instrumentation.

  13. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitizing old globes, which brings with it the possibility of using globes in their digital form. The digital models created are available to the general public through modern technology over the Internet. This gives an opportunity to study old globes located in various historical collections and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models, both among themselves and with current map data, by increasing the transparency of individual layers. Digitization is carried out using a special device that allows globes with a diameter ranging from 5 cm to 120 cm to be digitized. This device can be easily disassembled and is fully mobile; the globes can therefore be digitized in the place of their storage. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. The acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is in the form of a 3D model through the JavaScript library Cesium or the Google Earth plug-in in the Web browser. The second option is as a georeferenced map using a Tile Map Service.

  14. RSFQ Baseband Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Herr, Anna Yurievna

    The ultra-fast switching speed of superconducting digital circuits enables the realization of digital signal processors with performance unattainable by any other technology. Based on rapid single-flux-quantum (RSFQ) logic, these integrated circuits are capable of delivering high computation capacity, up to 30 GOPS on a single processor, and very short latency of 0.1 ns. There are two main applications of such hardware for practical telecommunication systems: filters for superconducting ADCs operating on digital RF data, and recursive filters at baseband. The latter allows functions such as multiuser detection for 3G WCDMA, equalization and channel precoding for 4G OFDM MIMO, and general blind detection. The performance gain is an increase in cell capacity, quality of service, and transmitted data rate. The current status of the development of the RSFQ baseband DSP is discussed. Major components with an operating speed of 30 GHz have been developed. Designs, test results, and future development of the complete systems, including cryopackaging and a CMOS interface, are reviewed.

  15. CT Image Processing Using Public Digital Networks

    PubMed Central

    Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

    1984-01-01

    Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

  16. Process independent automated sizing methodology for current steering DAC

    NASA Astrophysics Data System (ADS)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.

    2015-10-01

    This study introduces a process-independent automated sizing methodology based on a general regression neural network (GRNN) for a current-steering complementary metal-oxide-semiconductor (CMOS) digital-to-analog converter (DAC) circuit. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling research that considers a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. Eventually, a high-dimensional database was developed consisting of transistor sizes of DAC designs and the corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set comprising the simulation results for the ON SEMI 1.5 µm and 0.5 µm technology parameters, while the test data consisted only of the simulation results for the TSMC 0.35 µm technology parameters, which had not been applied to the GRNN for training beforehand. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of the DAC static output specifications, namely differential non-linearity error, integral non-linearity error, monotonicity and gain error, as the inputs of the network.
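
    A GRNN is essentially Nadaraya-Watson kernel regression; a minimal sketch of the predictor, using made-up numbers rather than the paper's DAC data, is shown below.

    ```python
    import numpy as np

    def grnn_predict(X_train, y_train, X_query, sigma=0.5):
        """General regression neural network: each prediction is a
        Gaussian-weighted average of the training targets."""
        X_train = np.asarray(X_train, float)
        y_train = np.asarray(y_train, float)
        preds = []
        for x in np.atleast_2d(X_query):
            d2 = np.sum((X_train - x) ** 2, axis=1)
            w = np.exp(-d2 / (2 * sigma ** 2))
            preds.append(np.dot(w, y_train) / (np.sum(w) + 1e-12))
        return np.array(preds)

    # Toy example: map hypothetical normalized spec errors (DNL, INL, gain error)
    # to a single transistor width; all numbers are illustrative only.
    X = np.array([[0.1, 0.2, 0.05], [0.3, 0.1, 0.10], [0.2, 0.4, 0.02]])
    y = np.array([4.0, 6.5, 5.0])          # hypothetical widths in micrometres
    print(grnn_predict(X, y, [[0.15, 0.25, 0.04]], sigma=0.3))
    ```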

  17. Overview of Digital Signal Processing Theory

    DTIC Science & Technology

    1975-05-20

    The availability of digital integrated-circuit hardware elements, along with their extremely high reliability, maintainability, and repeatability of performance, have ... limited by large-signal performance and power limitations of circuit components. In the implementation of digital signal processing systems there ...

  18. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either the Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding data such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  19. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.
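
    The paper's processing chain is not reproduced here, but the standard first-order dual-frequency relation behind TEC estimation can be illustrated directly (the frequencies and the 10 m differential range below are example values only):

    ```python
    # The first-order ionospheric group delay at frequency f adds 40.3*TEC/f**2
    # metres of excess range, so the difference of two excess ranges isolates TEC.
    K = 40.3  # m^3 s^-2, first-order ionospheric constant

    def tec_from_dual_frequency(delta_range_m, f1_hz, f2_hz):
        """TEC (electrons/m^2) from the differential excess range (metres)
        between the lower frequency f2 and the higher frequency f1."""
        return delta_range_m / (K * (1.0 / f2_hz**2 - 1.0 / f1_hz**2))

    # Example: a 150/400 MHz beacon pair with 10 m of differential excess range
    tec = tec_from_dual_frequency(10.0, 400e6, 150e6)
    print(tec / 1e16, "TECU")   # 1 TECU = 1e16 electrons/m^2
    ```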

  20. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  1. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  2. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  3. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  4. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  5. Eliminating "Hotspots" in Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Salomon, P. M.

    1984-01-01

    Signals from defective picture elements are rejected. An image-processing program for use with a charge-coupled device (CCD) or other mosaic imager is augmented with an algorithm that compensates for a common type of electronic defect. The algorithm prevents false interpretation of "hotspots". It is used for robotics, image enhancement, image analysis, and digital television.
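
    The NASA algorithm itself is not spelled out in the abstract; a generic defective-pixel compensation scheme in the same spirit, replacing flagged pixels with the median of their valid neighbours, might look like:

    ```python
    import numpy as np

    def suppress_hotspots(img, defect_mask):
        """Replace each pixel flagged in defect_mask with the median of its
        valid 3x3 neighbours, so known-bad elements are not misread as
        bright image features."""
        out = img.astype(float).copy()
        rows, cols = np.nonzero(defect_mask)
        for r, c in zip(rows, cols):
            r0, r1 = max(r - 1, 0), min(r + 2, img.shape[0])
            c0, c1 = max(c - 1, 0), min(c + 2, img.shape[1])
            patch = img[r0:r1, c0:c1]
            good = patch[~defect_mask[r0:r1, c0:c1]]
            if good.size:
                out[r, c] = np.median(good)
        return out

    img = np.random.rand(8, 8)
    mask = np.zeros_like(img, dtype=bool)
    mask[3, 4] = True                     # one known hot pixel
    cleaned = suppress_hotspots(img, mask)
    ```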

  6. Research Methodologies and the Doctoral Process.

    ERIC Educational Resources Information Center

    Creswell, John W.; Miller, Gary A.

    1997-01-01

    Doctoral students often select one of four common research methodologies that are popular in the social sciences and education today: positivist; interpretive; ideological; and pragmatic. But choice of methodology also influences the student's choice of course work, membership of dissertation committee, and the form and structure of the…

  7. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to survey the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept for defining the system boundaries is the most used approach in practice, rather than the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is expressed linearly through the characterization factors; synergistic effects of the contaminants are therefore neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most of the cases analyzed. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, and the ε-constraint method is the most applied technique for generating the Pareto set. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing the costs of environmental externalities. Finally, a trend toward dealing with multi-period scenarios in integrated LCA-optimization frameworks, providing more accurate results when data are available, can be distinguished.

  8. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it receives the most emphasis. The other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors, as well as indirect flat-panel detectors and a system with four high-resolution CCD cameras. In a second step the most important image processing operations are discussed: gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part the advantages and drawbacks of computed thoracic radiography are emphasized. The most important are the almost consistently good quality of the pictures and the possibilities for image processing.
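
    Commercial processing such as MUSICA is proprietary, but the unsharp-mask operation mentioned above can be illustrated generically (the sigma and amount values here are arbitrary):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(img, sigma=5.0, amount=1.0):
        """Unsharp-mask enhancement: add back a scaled difference between the
        image and a blurred copy, boosting edges and fine detail."""
        blurred = gaussian_filter(img.astype(float), sigma)
        return img + amount * (img - blurred)

    enhanced = unsharp_mask(np.random.rand(256, 256))
    ```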

  9. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  10. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  11. A methodology for use of digital image correlation for hot mix asphalt testing

    NASA Astrophysics Data System (ADS)

    Ramos, Estefany

    Digital Image Correlation (DIC) is a relatively new technology which aids in the measurement of material properties without the need to install sensors. DIC is a non-contact measuring technique that requires the specimen to be marked with a random speckle pattern and photographed during the test. The photographs are then post-processed based on the location of the pattern throughout the test. DIC can aid in calculating properties that would otherwise be too difficult to measure even with other instruments. The objective of this thesis is to discuss the methodology and validate the use of DIC in different hot mix asphalt (HMA) tests, such as the Overlay Tester (OT) Test, the Indirect Tensile (IDT) Test, and the Semicircular Bending (SCB) Test. The DIC system provides displacements and strains on any visible surface. Properly calibrated 2-D or 3-D DIC data can be used to understand the complex stress and strain distributions and the modes of initiation and propagation of cracks. The use of this observational method will lead to further understanding of the complex boundary conditions of the different tests, therefore allowing it to be implemented in the analysis of other materials. The use of digital image correlation will bring insight into what is happening during a test.
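
    A bare-bones sketch of the DIC matching step, integer-pixel subset tracking by zero-normalized cross-correlation, is shown below; real DIC software adds subpixel interpolation and strain computation, and all parameters here are illustrative.

    ```python
    import numpy as np

    def track_subset(ref_img, def_img, top, left, size, search=10):
        """Find the integer-pixel displacement of one speckle subset by maximizing
        zero-normalized cross-correlation (ZNCC) over a small search window."""
        f = ref_img[top:top + size, left:left + size].astype(float)
        f = (f - f.mean()) / (f.std() + 1e-12)
        best, best_uv = -2.0, (0, 0)
        for dv in range(-search, search + 1):
            for du in range(-search, search + 1):
                r0, c0 = top + dv, left + du
                if r0 < 0 or c0 < 0:
                    continue
                g = def_img[r0:r0 + size, c0:c0 + size].astype(float)
                if g.shape != f.shape:
                    continue
                g = (g - g.mean()) / (g.std() + 1e-12)
                zncc = np.mean(f * g)           # similarity of the two subsets
                if zncc > best:
                    best, best_uv = zncc, (du, dv)
        return best_uv, best
    ```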

  12. Digital processing of ionospheric electron content data

    NASA Technical Reports Server (NTRS)

    Bernhardt, P. A.

    1979-01-01

    Ionospheric electron content data contain periodicities that are produced by a diversity of sources including hydromagnetic waves, gravity waves, and lunar tides. Often these periodicities are masked by the strong daily variation in the data. Digital filtering can be used to isolate the weaker components. The filtered data can then be further processed to provide estimates of the source properties. In addition, homomorphic filtering may be used to identify nonlinear interactions in the ionosphere.
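
    One simple filter choice for suppressing the strong daily variation (not necessarily the filters used in the paper) is to subtract a one-day running mean and keep the residual:

    ```python
    import numpy as np

    def remove_daily_variation(tec, samples_per_day):
        """Suppress the daily variation in an electron-content series by
        subtracting a running mean one day long, leaving the weaker,
        shorter-period fluctuations."""
        kernel = np.ones(samples_per_day) / samples_per_day
        daily = np.convolve(tec, kernel, mode="same")
        return tec - daily
    ```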

  13. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift-invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPG compression. We reconstruct the noisy, blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful in assisting students and engineers with image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
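
    Two of the color-processing blocks mentioned, white balancing and gamma correction, can be sketched minimally as follows (the gains and gamma value are placeholders, not the paper's calibrated values):

    ```python
    import numpy as np

    def white_balance(rgb, gains=(2.0, 1.0, 1.5)):
        """Per-channel gain; in a real pipeline the gains would be estimated
        from the scene or the illuminant rather than fixed."""
        return rgb * np.asarray(gains)

    def gamma_correct(rgb, gamma=2.2):
        """Display gamma encoding applied to linear RGB values in [0, 1]."""
        return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

    linear = np.random.rand(4, 4, 3) * 0.4       # stand-in for a demosaicked image
    display_ready = gamma_correct(white_balance(linear))
    ```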

  14. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled high-efficiency gamma-ray spectrometers that can track the path of gamma rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by a 100 MHz, 14-bit flash ADC. Filters that provide timing and high-resolution energies are implemented on local FPGAs acting on the ADC data streams, while interaction point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we will give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look towards soon developing larger spectrometers such as GRETA with full 4π solid angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.
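    The abstract does not specify the filter algorithms used on the GRETINA FPGAs, so the sketch below shows only a generic pulse-height filter of the kind commonly applied to digitized detector waveforms: a simple trapezoidal (dual moving-average) shaper. The peaking time, gap, decay constant and synthetic pulse are assumptions for illustration.

```python
# Generic illustration of energy filtering on a digitized detector pulse using a
# simple trapezoidal filter (difference of two moving averages separated by a gap).
import numpy as np

def trapezoidal_filter(samples, rise=100, gap=40):
    """Return the shaped output; the flat-top height estimates pulse amplitude."""
    c = np.cumsum(np.concatenate(([0.0], samples)))
    out = np.zeros(len(samples))
    for n in range(2 * rise + gap, len(samples)):
        late = c[n] - c[n - rise]                      # sum of recent samples
        early = c[n - rise - gap] - c[n - 2 * rise - gap]
        out[n] = (late - early) / rise
    return out

# Synthetic preamplifier-like pulse: unit step at t0 with slow exponential decay
n = np.arange(4000)
tau, t0 = 5000.0, 1000
pulse = np.where(n >= t0, np.exp(-(n - t0) / tau), 0.0)
pulse += np.random.default_rng(2).normal(0, 0.01, n.size)   # electronic noise

shaped = trapezoidal_filter(pulse)
print(shaped.max())   # approximates the pulse amplitude (~1.0)
```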

  15. Image processing of digital chest ionograms.

    PubMed

    Yarwood, J R; Moores, B M

    1988-10-01

    A number of image-processing techniques have been applied to a digital ionographic chest image in order to evaluate their possible effects on this type of image. In order to quantify any effect, a simulated lesion was superimposed on the image at a variety of locations representing different types of structural detail. Visualization of these lesions was evaluated by a number of observers both pre- and post-processing operations. The operations employed included grey-scale transformations, histogram operations, edge-enhancement and smoothing functions. The resulting effects of these operations on the visualization of the simulated lesions are discussed.
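    For concreteness, the sketch below implements two of the operations listed above (a global histogram equalization and a 3x3 mean smoothing filter) on a synthetic grey-scale image; the image and window size are assumptions, and the study's actual software is not reproduced here.

```python
# Sketch of two grey-scale operations named above: histogram equalization and
# a 3x3 mean smoothing filter.
import numpy as np

def equalize(img):
    """Histogram-equalize an 8-bit grey-scale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())        # normalize to [0, 1]
    return (cdf[img] * 255).astype(np.uint8)

def smooth3x3(img):
    """Mean filter with a 3x3 window (edges handled by padding)."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
    return (out / 9).astype(np.uint8)

img = np.random.default_rng(3).integers(0, 256, (128, 128), dtype=np.uint8)
print(equalize(img).dtype, smooth3x3(img).shape)
```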

  16. C language algorithms for digital signal processing

    SciTech Connect

    Embree, P.M.; Kimble, B.

    1991-01-01

    The use of the C programming language to construct digital signal-processing (DSP) algorithms for operation on high-performance personal computers is described in a textbook for engineering students. Chapters are devoted to the fundamental principles of DSP, basic C programming techniques, user-interface and disk-storage routines, filtering routines, discrete Fourier transforms, matrix and vector routines, and image-processing routines. Also included is a floppy disk containing a library of standard C mathematics, character-string, memory-allocation, and I/O functions; a library of DSP functions; and several sample DSP programs. 83 refs.

  17. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  18. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2013-03-31

    project "Advanced Digital Signal Processing for Hybrid Lidar " covering the period of 1/1/2013-3/31/2013. 9LO\\SO^O’IH^’?’ William D. Jemison...Chaotic LIDAR for Naval Applications This document contains a Progress Summary for FY13 Q2 and a Short Work Statement for FY13 Progress Summary for...This technique has the potential to increase the unambiguous range of hybrid lidar -radar while maintaining reasonable range resolution. Proof-of

  19. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-03-31

    on a multimeter to ensure that the PMT remained within its linear operating regime. The AC-coupled signal was demodulated and digitized in the SDR ...receiver. The I and Q samples obtained by the SDR are transferred over an Ethernet cable to a PC, where the data are processed in a custom LabVIEW...Q samples are generated by the SDR receiver and used to compute range on a PC. Ranging results from the FDR experiments and RangeFinder simulations

  20. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  1. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and related areas. The identification can be done by transforming the gray levels so that the gray level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. Researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation, with time complexity O(max(m, n)), where m and n are the numbers of gray levels of the input picture and the reference picture, respectively. With a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m³ × n). The algorithm partition problem, as an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
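    As a plain uniprocessor illustration of the gray-level transformation described above, the sketch below remaps an input picture so that its histogram approximately matches that of a reference picture using both cumulative distributions; the images are synthetic stand-ins and the VLSI packing method itself is not reproduced.

```python
# Histogram matching: remap input gray levels so the resulting histogram
# approximates that of a reference picture (cumulative-distribution mapping).
import numpy as np

def match_histograms(inp, ref, levels=256):
    """Return `inp` remapped so its gray-level histogram approximates `ref`'s."""
    cdf_in = np.cumsum(np.bincount(inp.ravel(), minlength=levels)) / inp.size
    cdf_ref = np.cumsum(np.bincount(ref.ravel(), minlength=levels)) / ref.size
    # For each input level, find the reference level with the closest CDF value
    mapping = np.searchsorted(cdf_ref, cdf_in).clip(0, levels - 1).astype(np.uint8)
    return mapping[inp]

rng = np.random.default_rng(4)
inp = rng.integers(0, 128, (64, 64), dtype=np.uint8)    # darker picture
ref = rng.integers(64, 256, (64, 64), dtype=np.uint8)   # brighter reference
out = match_histograms(inp, ref)
print(inp.mean(), ref.mean(), out.mean())                # out's statistics move toward ref's
```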

  2. Textural identification of carbonate rocks by image processing and neural network: Methodology proposal and examples

    NASA Astrophysics Data System (ADS)

    Marmo, Roberto; Amodio, Sabrina; Tagliaferri, Roberto; Ferreri, Vittoria; Longo, Giuseppe

    2005-06-01

    Using more than 1000 thin section photos of ancient (Phanerozoic) carbonates from different marine environments (pelagic to shallow-water), a new numerical methodology, based on digitized images of thin sections, is proposed here. In accordance with the Dunham classification, it allows the user to automatically identify carbonate textures unaffected by post-depositional modifications (recrystallization, dolomitization, meteoric dissolution and so on). The methodology uses, as input, a 256 grey-tone digital image and, by image processing, gives, as output, a set of 23 numerical features measured on the whole image, including the "white areas" (calcite cement). A multi-layer perceptron neural network takes these features as input and gives, as output, the estimated class. We used 532 images of thin sections to train the neural network, whereas to test the methodology we used 268 images taken from the same photo collection and 215 images from the San Lorenzello carbonate sequence (Matese Mountains, southern Italy), Early Cretaceous in age. This technique achieved 93.3% and 93.5% accuracy in automatically classifying carbonate rock textures from digitized images on the 268- and 215-image test sets, respectively. Therefore, the proposed methodology is a further promising application to the geosciences, allowing carbonate textures of many thin sections to be identified in a rapid and accurate way. A MATLAB-based computer code has been developed for the processing and display of images.
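    The classification stage can be sketched as below: a multi-layer perceptron mapping a 23-value feature vector to a texture class, using the training and test set sizes quoted in the abstract. The feature values, class labels, network size and library choice (scikit-learn) are assumptions; the feature extraction itself is not shown.

```python
# Sketch of the classification stage: an MLP mapping 23 image features to a
# texture class. Features and labels here are random stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
n_train, n_features = 532, 23                       # sizes quoted in the abstract
X_train = rng.random((n_train, n_features))         # stand-in image features
y_train = rng.integers(0, 4, n_train)               # assumed texture class labels

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                  random_state=0))
clf.fit(X_train, y_train)

X_test = rng.random((268, n_features))              # test-set size from the abstract
print(clf.predict(X_test)[:10])                     # predicted texture classes
```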

  3. SYDDARTA: new methodology for digitization of deterioration estimation in paintings

    NASA Astrophysics Data System (ADS)

    Granero-Montagud, Luís.; Portalés, Cristina; Pastor-Carbonell, Begoña.; Ribes-Gómez, Emilio; Gutiérrez-Lucas, Antonio; Tornari, Vivi; Papadakis, Vassilis; Groves, Roger M.; Sirmacek, Beril; Bonazza, Alessandra; Ozga, Izabela; Vermeiren, Jan; van der Zanden, Koen; Föster, Matthias; Aswendt, Petra; Borreman, Albert; Ward, Jon D.; Cardoso, António; Aguiar, Luís.; Alves, Filipa; Ropret, Polonca; Luzón-Nogué, José María.; Dietz, Christian

    2013-05-01

    The SYDDARTA project is an on-going European Commission funded initiative under the 7th Framework Programme. Its main objective is the development of a pre-industrial prototype for diagnosing the deterioration of movable art assets. The device combines two different optical techniques for the acquisition of data. On one hand, hyperspectral imaging is implemented by means of electronically tunable filters. On the other, 3D scanning, using structured light projection and capturing is developed. These techniques are integrated in a single piece of equipment, allowing the recording of two optical information streams. Together with multi-sensor data merging and information processing, estimates of artwork deterioration and degradation can be made. In particular, the resulting system will implement two optical channels (3D scanning and short wave infrared (SWIR) hyperspectral imaging) featuring a structured light projector and electronically tunable spectral separators. The system will work in the VIS-NIR range (400-1000nm), and SWIR range (900-2500nm). It will be also portable and user-friendly. Among all possible art work under consideration, Baroque paintings on canvas and wooden panels were selected as the project case studies.

  4. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  5. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.

  6. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  7. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.

  8. Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes

    ERIC Educational Resources Information Center

    Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.

    2016-01-01

    The purpose of this study was to research the methodological approaches to the study of interethnic relations and ethno-social processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing the research of inter-ethnic relations, allowing to highlight the current…

  9. Digital Light Processing and MEMS: reflecting the digital display needs of the networked society

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1996-08-01

    Digital video technology is becoming increasingly important to the networked society. The natural interface to digital video is a digital display, one that accepts electrical bits at its input and converts them into optical bits at the output. The digital-to-analog processing function is performed in the mind of the observer. Texas Instruments has developed such a display with its recent market introduction of the Digital Light Processing™ (DLP™) projection display. DLP technology is based on the Digital Micromirror Device™ (DMD™), a microelectromechanical systems (MEMS) array of semiconductor-based digital light switches. The DMD switching array precisely controls a light source for projection display and digital printing applications. This paper presents an overview of DLP technology along with the architecture, projection operation, manufacture, and reliability of the DMD. Features of DMD technology that distinguish it from conventional MEMS technology are explored. Finally, the paper provides a view of DLP business opportunities.

  10. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    A Controller Performance Evaluation (CPE) methodology for multi-input/multi-output digital control systems was developed and tested on an aeroelastic wind-tunnel model. Modern signal processing methods were used to implement control laws and to acquire time domain data of the whole system (controller and plant) from which appropriate transfer matrices of the control system could be generated. Matrix computational procedures were used to calculate singular values of return-difference matrices at the plant input and output points to evaluate the performance of the control system. The CPE procedures effectively identified potentially destabilizing controllers and confirmed the satisfactory performance of stabilizing ones.
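    The core CPE computation named above (singular values of a return-difference matrix evaluated over frequency) can be sketched as follows. The 2x2 loop transfer matrix and controller used here are invented examples, not the wind-tunnel model's control laws.

```python
# Sketch: minimum singular value of the return-difference matrix I + L(jw) over
# frequency; small values flag potentially destabilizing controllers.
import numpy as np

def return_difference_sv(L_of_s, omegas):
    """Minimum singular value of I + L(jw) at each frequency."""
    n = L_of_s(1j * omegas[0]).shape[0]
    return np.array([np.linalg.svd(np.eye(n) + L_of_s(1j * w),
                                   compute_uv=False).min()
                     for w in omegas])

def L_of_s(s):
    """Made-up 2x2 loop transfer matrix L(s) = K * G(s)."""
    g = np.array([[1 / (s + 1), 0.5 / (s + 2)],
                  [0.2 / (s + 1), 1 / (s + 3)]])
    k = np.diag([2.0, 1.5])                           # simple diagonal controller
    return k @ g

omegas = np.logspace(-2, 2, 50)
sv_min = return_difference_sv(L_of_s, omegas)
print(sv_min.min())       # worst-case margin indicator across frequency
```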

  11. E-inclusion Process and Societal Digital Skill Development

    ERIC Educational Resources Information Center

    Vitolina, Ieva

    2015-01-01

    Nowadays, the focus shifts from information and communication technology access to skills and knowledge. Moreover, lack of digital skills is an obstacle in the process of learning new digital competences using technologies and e-learning. The objective of this research is to investigate how to facilitate students to use the acquired digital skills…

  12. An Interactive Graphics Program for Investigating Digital Signal Processing.

    ERIC Educational Resources Information Center

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  13. Low cost 3D scanning process using digital image processing

    NASA Astrophysics Data System (ADS)

    Aguilar, David; Romero, Carlos; Martínez, Fernando

    2017-02-01

    This paper shows the design and building of a low cost 3D scanner, able to digitize solid objects through contactless data acquisition, using active object reflection. 3D scanners are used in different applications such as science, engineering and entertainment; they are classified as contact and contactless scanners, the latter being the most widely used although typically expensive. This low-cost prototype performs a vertical scan of the object using a fixed camera and a mobile horizontal laser light, which is deformed depending on the 3-dimensional surface of the solid. Using digital image processing, an analysis of the deformation detected by the camera was done; it allows determining the 3D coordinates using triangulation. The obtained information is processed by a Matlab script, which gives the user a point cloud corresponding to each horizontal scan done. The obtained results show acceptable quality and significant detail of the digitized objects, making this prototype (built on a LEGO Mindstorms NXT kit) a versatile and cheap tool, which can be used for many applications, mainly by engineering students.
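    A simplified sketch of the triangulation step is given below: the horizontal pixel offset of the detected laser line is converted to depth using an assumed camera geometry (focal length, baseline and pixel size are illustrative values, not the prototype's calibration).

```python
# Simplified laser-camera triangulation: recover depth from the pixel offset of
# the laser line relative to its reference position. Geometry values are assumed.
import numpy as np

def depth_from_offset(pixel_offset, focal_mm=4.0, baseline_mm=80.0,
                      pixel_size_mm=0.006):
    """Classical triangulation relation: Z = f * b / disparity."""
    disparity_mm = pixel_offset * pixel_size_mm
    return np.where(disparity_mm > 0, focal_mm * baseline_mm / disparity_mm, np.inf)

# Offsets (in pixels) of the deformed laser line detected along one scan column
offsets = np.array([12.0, 11.5, 10.8, 10.2, 9.7])
print(depth_from_offset(offsets))    # depths in mm for each detected point
```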

  14. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  15. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  16. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  17. On digital image processing technology and application in geometric measure

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Xing, Ruonan; Liao, Na

    2014-04-01

    Digital image processing is an emerging technique that has developed alongside semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the principles of digital image processing in measurement and compares it with the traditional optical measurement method. Taking geometric measurement as an example, it outlines the development trends of digital image processing technology from the perspective of its applications.

  18. Digital speech processing for cochlear implants.

    PubMed

    Dillier, N; Bögli, H; Spillmann, T

    1992-01-01

    A rather general basic working hypothesis for cochlear implant research might be formulated as follows. Signal processing for cochlear implants should carefully select a subset of the total information contained in the sound signal and transform these elements into those physical stimulation parameters which can generate distinctive perceptions for the listener. Several new digital processing strategies have thus been implemented on a laboratory cochlear implant speech processor for the Nucleus 22-electrode system. One of the approaches (PES, pitch excited sampler) is based on the maximum peak channel vocoder concept whereby the spectral energy of a number of frequency bands is transformed into appropriate electrical stimulation parameters for up to 22 electrodes using a voice pitch synchronous pulse rate at any electrode. Another approach (CIS, continuous interleaved sampler) uses a maximally high pitch-independent stimulation pulse rate on a selected number of electrodes. As only one electrode can be stimulated at any instance of time, the rate of stimulation is limited by the required stimulus pulse widths (as determined individually for each subject) and some additional constraints and parameters which have to be optimized and fine tuned by psychophysical measurements. Evaluation experiments with 5 cochlear implant users resulted in significantly improved performance in consonant identification tests with the new processing strategies as compared with the subjects own wearable speech processors whereas improvements in vowel identification tasks were rarely observed. The pitch-synchronous coding (PES) resulted in worse performance compared to the coding without explicit pitch extraction (CIS).(ABSTRACT TRUNCATED AT 250 WORDS)
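    A generic illustration of a CIS-style chain (band-pass filter bank, per-band envelope extraction, and round-robin interleaved sampling of the envelopes) is sketched below. The band edges, channel count and pulse rate are assumptions, not the parameters of the Nucleus laboratory processor described above.

```python
# Generic CIS-style sketch: filter bank, band envelopes, interleaved sampling.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 16000
t = np.arange(0, 0.5, 1 / fs)
speech = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 1800 * t)  # toy signal

edges = [(200, 600), (600, 1400), (1400, 3000), (3000, 6000)]   # assumed bands
envelopes = []
for lo, hi in edges:
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="bandpass")
    band = filtfilt(b, a, speech)
    envelopes.append(np.abs(hilbert(band)))          # band envelope
envelopes = np.array(envelopes)

# Interleaved sampling: at each stimulation instant only one channel is read out
pulse_rate = 800                                     # pulses/s per channel (assumed)
step = int(fs / (pulse_rate * len(edges)))
frames = envelopes[:, ::step]                        # (channel, time) envelope samples
order = np.arange(frames.shape[1]) % len(edges)      # round-robin channel order
stimulus_amplitudes = frames[order, np.arange(frames.shape[1])]
print(stimulus_amplitudes[:8])
```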

  19. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of processes for creating and assigning DOIs and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: (a) reviewing the DOI metadata, (b) assigning an opaque DOI name if the data provider chooses, and (c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows changing any DOI metadata except the DOI name unless the name has not been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  20. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  1. Digital image processing of cephalometric radiographs: a preliminary report.

    PubMed

    Jackson, P H; Dickson, G C; Birnie, D J

    1985-07-01

    The principles of image capture, image storage and image processing in digital radiology are described. The enhancement of radiographic images using digital image processing techniques and its application to cephalometry is discussed. The results of a pilot study which compared some common cephalometric measurements made from manual point identification with those made by direct digitization of digital radiographic images from video monitors are presented. Although in an early stage of development, the results from the image processing system were comparable with those obtained by traditional methods.

  2. A review of some digital image processing in cometary research

    NASA Astrophysics Data System (ADS)

    Larson, S. M.

    The development of electronic digitizers, digital detector arrays and modern high speed computer processing has led to more efficient, quantitative methods of studying the spatial, temporal and photometric properties of cometary phenomena. Digital image processing techniques are being used and further developed to reduce two dimensional data, to enhance the visibility of cometary features, and to quantify spatial and temporal changes. Some of these methods are reviewed, and their merits and limitations are discussed.

  3. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of art creation has been the subject of much debate and research for at least the last fifty years, even though art and beauty were already subjects of analysis for the ancient Greeks such as Plato and Aristotle. Even though intuitively it is a simple phenomenon, creativity, or the human ability to generate innovation (new ideas, concepts, etc.), is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research, there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. The development process that supports the intellectual act of creation is usually described as a procedure in which the artist experiments with the medium, explores it with one or more techniques, changing shapes, forms and appearances, and, beyond time and space, seeks his/her way out to a clearing, i.e., envisages a path from intention to realization. Duchamp, in his lecture "The Creative Act", states that the artist is never alone with his/her artwork; there is always the spectator who will later react critically to the work of art. If the artist succeeds in transmitting his/her intentions, in terms of a message, emotion or feeling, to the spectator, then a form of aesthetic osmosis takes place through the inert matter (the medium) that enabled this communication or interaction to occur. The role of the spectator may become gradually more active by interacting with the artwork itself, possibly changing it or becoming a part of it [2][4].

  4. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.

  5. Neutron coincidence counting with digital signal processing

    NASA Astrophysics Data System (ADS)

    Bagi, Janos; Dechamp, Luc; Dransart, Pascal; Dzbikowicz, Zdzislaw; Dufour, Jean-Luc; Holzleitner, Ludwig; Huszti, Joseph; Looman, Marc; Marin Ferrer, Montserrat; Lambert, Thierry; Peerani, Paolo; Rackham, Jamie; Swinhoe, Martyn; Tobin, Steve; Weber, Anne-Laure; Wilson, Mark

    2009-09-01

    Neutron coincidence counting is a widely adopted nondestructive assay (NDA) technique used in nuclear safeguards to measure the mass of nuclear material in samples. Nowadays, most neutron-counting systems are based on the original-shift-register technology, like the (ordinary or multiplicity) Shift-Register Analyser. The analogue signal from the He-3 tubes is processed by an amplifier/single channel analyser (SCA) producing a train of TTL pulses that are fed into an electronic unit that performs the time- correlation analysis. Following the suggestion of the main inspection authorities (IAEA, Euratom and the French Ministry of Industry), several research laboratories have started to study and develop prototypes of neutron-counting systems with PC-based processing. Collaboration in this field among JRC, IRSN and LANL has been established within the framework of the ESARDA-NDA working group. Joint testing campaigns have been performed in the JRC PERLA laboratory, using different equipment provided by the three partners. One area of development is the use of high-speed PCs and pulse acquisition electronics that provide a time stamp (LIST-Mode Acquisition) for every digital pulse. The time stamp data can be processed directly during acquisition or saved on a hard disk. The latter method has the advantage that measurement data can be analysed with different values for parameters like predelay and gate width, without repeating the acquisition. Other useful diagnostic information, such as die-away time and dead time, can also be extracted from this stored data. A second area is the development of "virtual instruments." These devices, in which the pulse-processing system can be embedded in the neutron counter itself and sends counting data to a PC, can give increased data-acquisition speeds. Either or both of these developments could give rise to the next generation of instrumentation for improved practical neutron-correlation measurements. The paper will describe the
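    As a rough illustration of LIST-mode processing, the sketch below scans an array of time stamps and counts events falling in a prompt gate (after a predelay) and in a long-delayed gate, the quantities used to form Reals+Accidentals and Accidentals rates in shift-register style analysis. The gate settings and synthetic (purely random) time stamps are assumptions.

```python
# Simplified LIST-mode coincidence sketch: prompt-gate and delayed-gate counts
# from an array of neutron pulse time stamps. Gate values are illustrative only.
import numpy as np

def gate_counts(stamps, predelay=4.5e-6, gate=64e-6, long_delay=1e-3):
    """Return total (R+A) and (A) gate counts for a sorted array of time stamps."""
    ra = a = 0
    for t in stamps:
        lo, hi = t + predelay, t + predelay + gate
        ra += np.searchsorted(stamps, hi) - np.searchsorted(stamps, lo)
        lo, hi = t + long_delay, t + long_delay + gate
        a += np.searchsorted(stamps, hi) - np.searchsorted(stamps, lo)
    return ra, a

# Synthetic time stamps: purely random (accidental) events, 10 s at ~5 kHz
rng = np.random.default_rng(6)
stamps = np.sort(rng.uniform(0, 10.0, 50000))
ra, a = gate_counts(stamps)
print("R+A:", ra, " A:", a, " Reals estimate:", ra - a)   # near zero for random data
```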

  6. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  7. Multilingual subjective methodology and evaluation of low-rate digital voice processors

    NASA Astrophysics Data System (ADS)

    Dimolitsas, Spiros; Corcoran, Franklin L.; Baraniecki, Marion R.; Phipps, John G., Jr.

    The methodology and results for a multilingual evaluation of source encoding algorithms operating at 16 kbit/s are presented. The evaluation was conducted in three languages (English, French, and Mandarin), using listener opinion subjective assessments to determine whether 'toll-quality' performance is possible at 16 kbit/s. The study demonstrated that toll-quality voice is indeed possible at 16 kbit/s, and that several of the methods evaluated are more robust under high bit error conditions than either 32- or 64-kbit/s encoding. Thus, 16-kbit/s voice coding technology is currently suitable for many applications within the public-switched telephone network, including the next generation of digital circuit multiplication equipment and integrated services digital network videotelephony.

  8. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  9. A methodology for high resolution digital image correlation in high temperature experiments

    NASA Astrophysics Data System (ADS)

    Blaber, Justin; Adair, Benjamin S.; Antoniou, Antonia

    2015-03-01

    We propose a methodology for performing high resolution Digital Image Correlation (DIC) analysis during high-temperature mechanical tests. Specifically, we describe a technique for producing a stable, high-quality pattern on metal surfaces along with a simple optical system that uses a visible-range camera and a long-range microscope. The results are analyzed with a high-quality open-source DIC software developed by us. Using the proposed technique, we successfully acquired high-resolution strain maps of the crack tip field in a nickel superalloy sample at 1000 °C.

  10. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  11. Lean methodology: supporting battlefield medical fitness by cutting process waste.

    PubMed

    Huggins, Elaine J

    2010-01-01

    Healthcare has long looked at decreasing risk in communication and patient care processes. Increasing the simplicity in communication and patient care process is a newer concept contained in Lean methodology. Lean is a strategy for achieving improvement in performance through the elimination of steps that use resources without contributing to customer value. This is known as cutting waste or nonvalue added steps. This article outlines how the use of Lean improved a key process that supports battlefield medical fitness.

  12. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.

  13. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to the audience. It combines narrative and digital media content infused with multimedia elements. To help educators (i.e. the designers) create a compelling digital story, experts have introduced various sets of processes; however, the processes they suggest vary, and some are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The resulting process can also be applied to other multimedia materials that use the concept of DST.

  14. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consists of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P.sub.1, and P.sub.2 channels on the L.sub.1 C/A carrier phase thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
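    Two of the operations named above, digital carrier down-conversion and code correlation, can be sketched as follows. The toy PRN code, rates and zero code delay are assumptions chosen for brevity and do not reproduce the patented processor's C/A or P channel parameters.

```python
# Sketch: digital carrier down-conversion of sampled IF data with an NCO, then
# correlation against a locally generated PRN code replica (toy parameters).
import numpy as np

rng = np.random.default_rng(7)
fs, f_if = 4.0e6, 1.0e6                     # sample rate and IF (assumed)
n = 4000
t = np.arange(n) / fs

code = rng.choice([-1.0, 1.0], size=100)    # toy PRN code, 100 chips
chip_rate = 0.1e6
replica = code[(np.floor(t * chip_rate) % len(code)).astype(int)]

# Received signal: code-modulated carrier plus noise
received = replica * np.cos(2 * np.pi * f_if * t) + 0.5 * rng.normal(size=n)

# Carrier down-conversion with a digital NCO (in-phase and quadrature mixing)
i = received * np.cos(2 * np.pi * f_if * t)
q = -received * np.sin(2 * np.pi * f_if * t)

# Code correlation at zero code delay (a real tracker searches/advances the delay)
prompt = np.sum((i + 1j * q) * replica)
print(abs(prompt) / n)                      # large magnitude => code/carrier aligned
```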

  15. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  16. A Phenomenological Study of an Emergent National Digital Library, Part I: Theory and Methodological Framework

    ERIC Educational Resources Information Center

    Dalbello, Marija

    2005-01-01

    The activities surrounding the National Digital Library Program (NDLP) at the Library of Congress (1995-2000) are used to study institutional processes associated with technological innovation in the library context. The study identified modalities of successful innovation and the characteristics of creative decision making. Theories of social…

  17. The Technology Transfer Process: Concepts, Framework and Methodology.

    ERIC Educational Resources Information Center

    Jolly, James A.

    This paper discusses the conceptual framework and methodology of the technology transfer process and develops a model of the transfer mechanism. This model is then transformed into a predictive model of technology transfer incorporating nine factors that contribute to the movement of knowledge from source to user. Each of these factors is examined…

  18. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...

  19. Digital computer processing of X-ray photos

    NASA Technical Reports Server (NTRS)

    Nathan, R.; Selzer, R. H.

    1967-01-01

    Digital computers correct various distortions in medical and biological photographs. One of the principal methods of computer enhancement involves the use of a two-dimensional digital filter to modify the frequency spectrum of the picture. Another computer processing method is image subtraction.

  20. The teaching of computer programming and digital image processing in radiography.

    PubMed

    Allan, G L; Zylinski, J

    1998-06-01

    The increased use of digital processing techniques in Medical Radiations imaging modalities, along with the rapid advance in information technology, has resulted in a significant change in the delivery of radiographic teaching programs. This paper details a methodology used to concurrently educate radiographers in both computer programming and image processing. The students learn to program in Visual Basic for Applications (VBA), and the programming skills are contextualised by requiring the students to write a digital subtraction angiography (DSA) package. Program code generation and the image presentation interface are handled by the Microsoft Excel spreadsheet. The user-friendly nature of this common interface enables all students to readily begin program creation. The teaching of programming and image processing skills by this method may be readily generalised to other vocational fields where digital image manipulation is a professional requirement.

  1. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  2. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing noiselike microwave radiometer signals.

  3. Working memory and two-digit number processing.

    PubMed

    Macizo, Pedro; Herrera, Amparo

    2011-11-01

    The processing of two-digit numbers in comparison tasks involves the activation and manipulation of magnitude information to decide which number is larger. The present study explored the role of different working memory (WM) components and skills in the processing of two-digit numbers by examining the unit-decade compatibility effect with Arabic digits and number words. In the study, the unit-decade compatibility effect and different WM components were evaluated. The results indicated that the unit-decade compatibility effect was associated to specific WM skills depending on the number format (Arabic digits and number words). We discussed the implications of these results for the decomposed view of two-digit numbers.

  4. Cell-based top-down design methodology for RSFQ digital circuits

    NASA Astrophysics Data System (ADS)

    Yoshikawa, N.; Koshiyama, J.; Motoori, K.; Matsuzaki, F.; Yoda, K.

    2001-08-01

    We propose a cell-based top-down design methodology for rapid single flux quantum (RSFQ) digital circuits. Our design methodology employs a binary decision diagram (BDD), which is currently used for the design of CMOS pass-transistor logic circuits. The main features of the BDD RSFQ circuits are the limited primitive number, dual rail nature, non-clocking architecture, and small gate count. We have made a standard BDD RSFQ cell library and prepared a top-down design CAD environment, by which we can perform logic synthesis, logic simulation, circuit simulation and layout view extraction. In order to clarify problems expected in large-scale RSFQ circuits design, we have designed a small RSFQ microprocessor based on simple architecture using our top-down design methodology. We have estimated its system performance and compared it with that of the CMOS microprocessor with the same architecture. It was found that the RSFQ system is superior in terms of the operating speed though it requires extremely large chip area.

  5. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Astrophysics Data System (ADS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-06-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.

  6. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
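    The Newton-Raphson suggestion mentioned above can be illustrated generically: given a subroutine y = f(x), iterate to find the input x that yields a required output, i.e., run the subroutine "in reverse". The subroutine below is a made-up example, not one of the flight control routines.

```python
# Generic Newton-Raphson sketch: solve f(x) = y_target for x using a numerical
# derivative, i.e., invert a subroutine's input-output relation.
def newton_inverse(f, y_target, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Return x such that f(x) is (approximately) y_target."""
    x = x0
    for _ in range(max_iter):
        r = f(x) - y_target
        if abs(r) < tol:
            break
        dfdx = (f(x + h) - f(x - h)) / (2 * h)     # central-difference derivative
        x -= r / dfdx
    return x

# Example subroutine: a simple nonlinear input-output relation
f = lambda x: x ** 3 + 2 * x - 5
x = newton_inverse(f, y_target=10.0, x0=1.0)
print(x, f(x))          # converges to the x giving f(x) = 10 in a few iterations
```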

  7. Application of digital image processing techniques to astronomical imagery 1977

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.

    1978-01-01

    Nine specific techniques, or combinations of techniques, developed for applying digital image processing technology to existing astronomical imagery are described. Photoproducts are included to illustrate the results of each of these investigations.

  8. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
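    One of the listed outputs, the Allan deviation of phase, can be sketched as below using a simple non-overlapping estimator on synthetic phase samples; the sampling interval and noise model are assumptions, not the analyzer's actual parameters.

```python
# Sketch: (non-overlapping) Allan deviation computed from phase samples.
import numpy as np

def allan_deviation(phase, tau0, m_list):
    """Allan deviation from phase data x(t) sampled every tau0 seconds, for the
    averaging factors in m_list; returns (tau, sigma_y(tau)) pairs."""
    out = []
    for m in m_list:
        x = phase[::m]                               # decimate to averaging time m*tau0
        if len(x) < 3:
            break
        d2 = np.diff(x, n=2)                         # second differences of phase
        avar = np.sum(d2 ** 2) / (2 * (m * tau0) ** 2 * len(d2))
        out.append((m * tau0, np.sqrt(avar)))
    return out

rng = np.random.default_rng(8)
tau0 = 0.02                                          # 50 Hz phase samples (assumed)
phase = np.cumsum(rng.normal(0, 1e-12, 100000))      # synthetic random-walk phase, in s
for tau, adev in allan_deviation(phase, tau0, [1, 2, 4, 8, 16, 32]):
    print(f"tau = {tau:6.2f} s   sigma_y(tau) = {adev:.3e}")
```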

  9. Agricultural inventory capabilities of machine processed LANDSAT digital data

    NASA Technical Reports Server (NTRS)

    Dietrick, D. L.; Fries, R. E.; Egbert, D. D.

    1975-01-01

    Agricultural crop identification and acreage determination analysis of LANDSAT digital data was performed for two study areas. A multispectral image processing and analysis system was utilized to perform the man-machine interactive analysis. The developed techniques yielded crop acreage estimates with accuracy greater than 90% and as high as 99%. These results are encouraging evidence of the agricultural inventory capabilities of machine-processed LANDSAT digital data.

  10. A pollution reduction methodology for chemical process simulators

    SciTech Connect

    Mallick, S.K.; Cabezas, H.; Bare, J.C.; Sikdar, S.K.

    1996-11-01

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has been modified by weighting the mass flow rate of each pollutant by its potential environmental impact score. This converts the mass balance into an environmental impact balance. This balance defines an impact index with units of environmental impact per mass of products. The impact index measures the potential environmental effects of process wastes. Three different schemes for chemical ranking were considered: (1) no ranking, (2) simple ranking from 0 to 3, and (3) ranking by a scientifically derived measure of human health and environmental effects. Use of the methodology is illustrated with two examples from the production of (1) methyl ethyl ketone and (2) synthetic ammonia.
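
    A toy numerical example of the two indices defined above is sketched below; the stream names, flow rates, and impact scores are hypothetical and serve only to show the units involved.

        # Illustrative computation of the mass-based pollution index and the
        # impact-weighted index; all values are made up for demonstration.
        pollutant_flow = {"toluene": 12.0, "ammonia": 3.5}      # kg/h leaving as waste
        impact_score = {"toluene": 2.0, "ammonia": 1.0}         # relative impact ranking
        product_flow = 850.0                                    # kg/h of products

        pollution_index = sum(pollutant_flow.values()) / product_flow        # kg waste / kg product
        impact_index = sum(pollutant_flow[p] * impact_score[p]
                           for p in pollutant_flow) / product_flow           # impact / kg product

        print(f"pollution index: {pollution_index:.4f}  impact index: {impact_index:.4f}")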

  11. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, infants or adults are usually subjected to clinical tests such as serum bilirubin measurement, which can be traumatic for patients. Jaundice often accompanies liver diseases such as hepatitis or liver cancer. In order to avoid additional trauma, we propose a painless method to detect jaundice (icterus) in newborns and adults. By acquiring digital color images of the palms, soles, and forehead, we analyze RGB attributes and diffuse reflectance spectra as parameters to characterize patients with and without jaundice, and we correlate these parameters with the bilirubin level. By applying a support vector machine, we distinguish between healthy and sick patients.
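
    The classification step can be illustrated with a small scikit-learn sketch in which mean RGB attributes of a skin region are fed to a support vector machine; the feature values and labels below are synthetic placeholders, not data from the study.

        import numpy as np
        from sklearn.svm import SVC

        # Mean R, G, B of a skin region per patient (illustrative values only)
        X = np.array([[200, 180, 120],
                      [210, 190, 130],
                      [180, 170, 160],
                      [175, 172, 165]])
        y = np.array([1, 1, 0, 0])       # 1 = jaundice, 0 = healthy (synthetic labels)

        clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
        print(clf.predict([[205, 185, 125]]))   # classify a new patient's RGB features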

  12. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-10-30

    was moved in 10 cm increments from a range of 1.35 m to 3.05 m. The photomultiplier tube (PMT) collected light scattered from the submerged target...through the window. A bias-tee at the output of the PMT separated the DC and AC components of the photocurrent. The DC-coupled signal was monitored on a...multimeter to ensure that the PMT remained within its linear operating region. The AC-coupled signal was demodulated and digitized in the software

  13. Advanced Digital Signal Processing for Hybrid Lidar

    DTIC Science & Technology

    2014-09-30

    The target was moved in 10 cm increments from a range of 1.35 m to 3.05 m. The photomultiplier tube (PMT) collected light scattered from the...submerged target through the window. A bias-tee at the output of the PMT separated the DC and AC components of the photocurrent. The DC-coupled signal was...monitored on a multimeter to ensure that the PMT remained within its linear operating region. The AC-coupled signal was demodulated and digitized in

  14. Why optics students should take digital signal processing courses and why digital signal processing students should take optics courses

    NASA Astrophysics Data System (ADS)

    Cathey, W. Thomas, Jr.

    2000-06-01

    This paper is based on the claim that future major contributions in the field of imaging systems will be made by those who have a background in both optics and digital signal processing. As the introduction of Fourier transforms and linear systems theory to optics had a major impact on the design of hybrid optical/digital imaging systems, the introduction of digital signal processing into optics programs will have a major impact. Examples are given of new hybrid imaging systems that have unique performance. By jointly designing the optics and the signal processing in a digital camera, a new paradigm arises where aberration balancing takes into consideration not only the number of surfaces and indices of refraction, but also the processing capability.

  15. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging on telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archive and communications systems (PACS) to that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS and then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 different methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations of the filmless images found them to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images as well as providing the means to produce high-quality radiographic images in a digital environment.

  16. Process Waste Assessment for the Plotting and Digitizing Support Laboratory

    SciTech Connect

    Phillips, N.M.

    1994-04-01

    This Process Waste Assessment was conducted to evaluate the Plotting and Digitizing Support Laboratory, located in Building 913, Room 157. It documents the processes, identifies the hazardous chemical waste streams generated by these processes, recommends possible ways to minimize waste, and serves as a reference for future assessments of this facility.

  17. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. Following a brief description of the basic theoretical concepts of the digital factory, the communicative features within the digital factory are illustrated. The practical aspects of interpersonal communication were analyzed from a human-oriented perspective in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed within the process analysis; this method makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of a survey on communication analysis and on the process models of existing modeling methods, it was possible to structure the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  18. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for the measurement of exponentially decaying signals, such as those found in fluorescence lifetime-based fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decays (>500 µs) and 0.1% for fast decays.
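
    For orientation, the quantity being measured is the decay lifetime of a digitized exponential. The plain log-space least-squares fit below illustrates that estimate; it is not the phase-locked detection scheme used in the instrument, and all signal parameters are invented.

        import numpy as np

        fs = 100_000.0                        # sample rate, Hz (illustrative)
        t = np.arange(0, 0.005, 1.0 / fs)     # 5 ms record
        tau_true = 600e-6                     # 600 us decay constant
        rng = np.random.default_rng(1)
        signal = 2.0 * np.exp(-t / tau_true) + 1e-3 * rng.normal(size=t.size)

        mask = signal > 0.05                  # fit only the well-above-noise portion
        slope, _ = np.polyfit(t[mask], np.log(signal[mask]), 1)
        print(f"estimated lifetime: {-1.0 / slope * 1e6:.1f} us")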

  19. A color image processing pipeline for digital microscope

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Liu, Peng; Zhuang, Zhefeng; Chen, Enguo; Yu, Feihong

    2012-10-01

    Digital microscopes have found wide application in fields such as biology and medicine. A digital microscope differs from a traditional optical microscope in that there is no need to observe the sample directly through an eyepiece, because the optical image is projected directly onto the CCD/CMOS camera. However, because of the imaging differences between the human eye and the sensor, a color image processing pipeline is needed for the digital microscope electronic eyepiece to obtain a fine image. The color image pipeline for a digital microscope, comprising the procedures that convert the RAW image data captured by the sensor into a real color image, is of great importance to the quality of the microscopic image. The color pipeline for a digital microscope differs from that of digital still cameras and video cameras because of the specific requirements of microscopic images, which should have high dynamic range, maintain color fidelity with the objects observed, and support a variety of image post-processing operations. In this paper, a new color image processing pipeline is proposed to satisfy the requirements of digital microscope images. The algorithm for each step in the color image processing pipeline is designed and optimized with the purpose of obtaining high-quality images and accommodating diverse user preferences. With the proposed pipeline implemented on the digital microscope platform, the output color images meet the various image-analysis requirements of the medicine and biology fields very well. The major steps of the proposed color imaging pipeline are: black level adjustment, defective pixel removal, noise reduction, linearization, white balance, RGB color correction, tone scale correction, and gamma correction.
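
    A skeleton of the RAW-to-RGB stages listed above is sketched below in order (defective pixel removal and noise reduction are omitted for brevity); the black level, white-balance gains, and 3x3 color matrix are illustrative values, not those of the actual pipeline.

        import numpy as np

        def microscope_pipeline(raw, black_level=64, wb_gains=(1.8, 1.0, 1.5), gamma=2.2):
            img = raw.astype(np.float32) - black_level            # black level adjustment
            img = np.clip(img, 0, None) / img.max()               # linearization / normalization
            img *= np.array(wb_gains)                             # white balance (per-channel gains)
            ccm = np.array([[1.6, -0.4, -0.2],                    # RGB color correction matrix
                            [-0.3, 1.5, -0.2],
                            [-0.1, -0.4, 1.5]])
            img = np.clip(img @ ccm.T, 0.0, 1.0)                  # color/tone scale correction
            return img ** (1.0 / gamma)                           # gamma correction

        demo = np.random.default_rng(0).integers(64, 4095, size=(8, 8, 3)).astype(np.float32)
        print(microscope_pipeline(demo).shape)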

  20. Process optimization of mechano-electrospinning by response surface methodology.

    PubMed

    Bu, Ningbin; Huang, YongAn; Duan, Yongqing; Yin, Zhouping

    2014-05-01

    In this paper, mechano-electrospinning (MES) is presented to write polyvinylidene fluoride (PVDF) solution into fibers directly, and the effects of the process parameters on the fiber are investigated experimentally based on response surface methodology. Different fiber widths are obtained by adjusting the individual process parameters (velocity of the substrate, applied voltage, and nozzle-to-substrate distance). Considering the requirements of a continuous jet and a stable Taylor cone, an operating region is selected for investigating the complicated relationship between the process parameters and the width of the fiber using response surface methodology. The experimental results show that the predicted fiber width is in good agreement with the actual fiber width. Based on an analysis of the importance of the terms in the equation, a simple model can be used to predict the width of the fiber. With this model, a large number of calibration experiments can be avoided. Additionally, a principle for the selection of the process parameters is presented by optimizing the parameters, which can give a guideline for obtaining the desired fiber in experiments.

  1. [Fundamental bases of digital information processing in nuclear cardiology (III)].

    PubMed

    Cuarón, A; González, C; García Moreira, C

    1984-01-01

    This article describes the transformation of gamma-camera images into digital form. The incidence of a gamma photon on the detector produces two voltage pulses, which are proportional to the coordinates of the incidence point, and a digital pulse, indicative of the occurrence of the event. The coordinate pulses pass through an analog-to-digital converter that is activated by the digital pulse. The result is the appearance of a digital number at the output of the converter, which is proportional to the voltage at its input. This number is stored in the accumulation memory of the system, either in list mode or in matrix mode. Static images can be stored in a single matrix. Dynamic data can be stored in a series of matrices, each representing a different period of acquisition. It is also possible to capture information in a series of matrices synchronized with the electrocardiogram of the patient; in this case, each matrix represents a distinct period of the cardiac cycle. Data stored in the memory can be used to process and display images and quantitative histograms on a video screen. To do so, it is necessary to translate the digital data in the memory to voltage levels and to transform these into light levels on the screen. This is achieved through a digital-to-analog converter. The reading of the digital memory must be synchronous with the electronic scanning of the video screen.
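
    The matrix-mode accumulation described above can be illustrated with a toy sketch in which each detected event's digitized (x, y) coordinates increment one cell of an image matrix; the matrix size and event coordinates are placeholders.

        import numpy as np

        MATRIX = 64                                    # 64 x 64 acquisition matrix (illustrative)
        events_x = np.random.default_rng(2).integers(0, MATRIX, size=100_000)
        events_y = np.random.default_rng(3).integers(0, MATRIX, size=100_000)

        image = np.zeros((MATRIX, MATRIX), dtype=np.int32)
        np.add.at(image, (events_y, events_x), 1)      # accumulate one count per event
        print(image.sum(), image.max())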

  2. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan; Tran, Binh-Nien

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the main processing speed, allowing the use of low cost FPGAs.
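
    A software sketch of the look-up-table idea follows: carrier samples are pre-computed for each possible symbol so that modulation reduces to table indexing at run time. The constellation, IF frequency, and samples-per-symbol are illustrative and chosen so the carrier phase stays continuous across symbols.

        import numpy as np

        fs, f_if, sps = 40e6, 10e6, 4                         # sample rate, IF, samples/symbol
        symbols = {0: (1+1j), 1: (-1+1j), 2: (-1-1j), 3: (1-1j)}   # 4-QAM constellation

        n = np.arange(sps)
        lut = {s: (symbols[s].real * np.cos(2*np.pi*f_if*n/fs)
                   - symbols[s].imag * np.sin(2*np.pi*f_if*n/fs))  # pre-computed IF samples
               for s in symbols}

        data = [0, 3, 1, 2, 0, 1]                             # symbol indices from the bit stream
        if_waveform = np.concatenate([lut[s] for s in data])  # run time: table look-ups only
        print(if_waveform[:8])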

  3. Dry Machining Process of Milling Machine using Axiomatic Green Methodology

    NASA Astrophysics Data System (ADS)

    Puspita Andriani, Gita; Akbar, Muhammad; Irianto, Dradjad

    2016-02-01

    Most of companies know that there are strategies to become green industry, and they realize that green efforts have impacts on product quality and cost. Axiomatic Green Methodology models the relationship between green, quality, and cost. This methodology starts with determining the green improvement objective and then continues with mapping the functional, economic, and green requirements. From the mapping, variables which affect the requirements are identified. Afterwards, the effect of each variable is determined by performing experiments and regression modelling. In this research, axiomatic green methodology was implemented to dry machining of milling machine in order to reduce the amount of coolant. Dry machining will be feasible if it is not worse than the minimum required quality. As a result, dry machining is feasible without producing any defect. The proposed machining parameter is to reduce the coolant flow rate from 6.882 ml/minute to 0 ml/minute, set the depth of cut at 1.2 mm, spindle rotation speed at 500 rpm, and feed rate at 128 mm/minute. This solution is also resulted in reduction of cost for 200.48 rupiahs for each process.

  4. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2016-10-06

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: number of 3A batch determination; evaluation of critical material attributes, critical process parameters, critical quality attributes; in vivo in vitro correlation; estimation of inherent process variability (IPV) and PaCS index; process capability and quality dashboard (PCQd); and enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011 encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.

  5. Modeling of electrohydrodynamic drying process using response surface methodology.

    PubMed

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-05-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM.

  6. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM. PMID:24936289
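
    The response-surface step itself amounts to fitting a second-order polynomial in the coded factors of the design. The sketch below does this for a three-factor Box-Behnken design with synthetic responses (the study itself used four factors); all numbers are placeholders.

        import numpy as np
        from itertools import combinations

        # Coded Box-Behnken design for 3 factors (12 edge points + 3 center points)
        X_coded = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                            [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                            [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                            [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([0.62, 0.55, 0.58, 0.49, 0.64, 0.57, 0.60, 0.52,
                      0.63, 0.54, 0.59, 0.51, 0.56, 0.57, 0.56])   # synthetic responses

        def quadratic_terms(X):
            cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
            cols += [X[:, i] ** 2 for i in range(X.shape[1])]
            return np.column_stack(cols)

        beta, *_ = np.linalg.lstsq(quadratic_terms(X_coded), y, rcond=None)
        print(np.round(beta, 4))   # intercept, linear, interaction, and squared coefficients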

  7. Restoration Of Faded Color Photographs By Digital Image Processing

    NASA Astrophysics Data System (ADS)

    Gschwind, Rudolf

    1989-10-01

    Color photographs have poor stability toward light, chemicals, heat, and humidity. As a consequence, the colors of photographs deteriorate with time. Because of the complexity of the processes that cause the dyes to fade, it is impossible to restore the images by chemical means. It is therefore attempted to restore faded color films by means of digital image processing.

  8. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using (Field Programmable Gate Arrays) FPGAs.

  9. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  10. A Methodology To Generate A Digital Elevation Model By Combining Topographic And Bathymetric Data In Fluvial Environments

    NASA Astrophysics Data System (ADS)

    Matias, Magda Paraiso; Falcano, Ana Paula; Goncalves, Alexandre B.; Alvares, Teressa; Pestana, Rita; Van Zeller, Emilia; Rodrigues, Victor; Heleno, Sandra

    2013-12-01

    In hydrodynamic simulations, a digital elevation model valid for the whole study area is a requirement. The construction of this model usually implies the use of topographic and bathymetric data collected by distinct equipment and methods, at different times, and acquired with a variety of spatial resolutions and accuracies. Several methodologies have been tested in order to combine both datasets, involving the use of diverse spatial interpolators. In this paper we present the first results of a new methodology that combines a digital elevation model acquired by radar for floodplain areas with cross-sections in the river bed, in order to provide the most accurate and reliable digital elevation model. Since the data were collected in distinct epochs, differences in elevation might exist in the overlapping areas due to the morphological dynamics of the river. In order to analyse and validate those differences, a dataset of SAR imagery, provided by ESA, was used.

  11. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648
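
    A quick back-of-the-envelope check of the reported figures (the total page count below is inferred from the cost data rather than stated in the abstract):

        titles, total_cost, cost_per_page = 320, 23_562, 0.28
        minutes_per_title = 170

        pages = total_cost / cost_per_page          # roughly 84,000 pages implied
        hours = titles * minutes_per_title / 60     # roughly 907 hours of processing time
        print(round(pages), round(hours, 1), round(total_cost / titles, 2))  # cost per title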

  12. Digital pulse processing: new possibilities in nuclear spectroscopy

    PubMed

    Warburton; Momayezi; Hubbard-Nelson; Skulski

    2000-10-01

    Digital pulse processing is a signal processing technique in which detector (preamplifier output) signals are directly digitized and processed to extract quantities of interest. This approach has several significant advantages compared to traditional analog signal shaping. First, analyses can be developed which take pulse-by-pulse differences into account, as in making ballistic deficit compensations. Second, transient induced charge signals, which deposit no net charge on an electrode, can be analyzed to give, for example, information on the position of interaction within the detector. Third, deadtimes from transient overload signals are greatly reduced, from tens of µs to hundreds of ns. Fourth, signals are easily captured, so that more complex analyses can be postponed until the source event has been deemed "interesting". Fifth, signal capture and processing may easily be based on coincidence criteria between different detectors or different parts of the same detector. XIA's recently introduced CAMAC module, the DGF-4C, provides many of these features for four input channels, including two levels of digital processing and a FIFO for signal capture for each signal channel. The first level of digital processing is "immediate", taking place in a gate array at the 40 MHz digitization rate, and implements pulse detection, pileup inspection, trapezoidal energy filtering, and control of an external 25.6 µs long FIFO. The second level of digital processing is provided by a digital signal processor (DSP), where more complex algorithms can be implemented. To illustrate digital pulse processing's possibilities, we describe the application of the DGF-4C to a series of experiments. The first, for which the DGF was originally developed, involves locating gamma-ray interaction sites within large segmented Ge detectors. The goal of this work is to attain spatial resolutions of order 2 mm sigma within 70 mm x 90 mm detectors. We show how pulse shape analysis allows
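
    The trapezoidal energy filtering mentioned above can be sketched as the difference of two moving averages separated by a gap, which turns a step-like preamplifier pulse into a trapezoid whose flat top estimates the deposited energy. The filter lengths and synthetic pulse below are illustrative, and pole-zero and ballistic-deficit corrections are omitted.

        import numpy as np

        def trapezoidal_filter(x, rise=40, flat=20):
            box = np.ones(rise) / rise
            avg = np.convolve(x, box, mode="full")[:len(x)]                 # moving average
            delayed = np.concatenate([np.zeros(rise + flat), avg[:-(rise + flat)]])
            return avg - delayed                                            # trapezoid output

        # Synthetic step pulse (fast rise, slow exponential decay), noise-free for clarity
        n = np.arange(2000)
        pulse = np.where(n > 500, np.exp(-(n - 500) / 5000.0), 0.0)
        trap = trapezoidal_filter(pulse)
        print(round(trap.max(), 3))    # flat-top height approximates the pulse amplitude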

  13. Methodological framework for evaluating clinical processes: A cognitive informatics perspective.

    PubMed

    Kannampallil, Thomas G; Abraham, Joanna; Patel, Vimla L

    2016-12-01

    We propose a methodological framework for evaluating clinical cognitive activities in complex real-world environments that provides a guiding framework for characterizing the patterns of activities. This approach, which we refer to as a process-based approach, is particularly relevant to cognitive informatics (CI) research (an interdisciplinary domain utilizing cognitive approaches in the study of computing systems and applications), as it provides new ways for understanding human information processing, interactions, and behaviors. Using this approach involves the identification of a process of interest (e.g., a clinical workflow), and the contributing sequences of activities in that process (e.g., medication ordering). A variety of analytical approaches can then be used to characterize the inherent dependencies and relations within the contributing activities within the considered process. Using examples drawn from our own research and the extant research literature, we describe the theoretical foundations of the process-based approach, relevant practical and pragmatic considerations for using such an approach, and a generic framework for applying this approach for evaluation studies in clinical settings. We also discuss the potential for this approach in future evaluations of interactive clinical systems, given the need for new approaches for evaluation, and significant opportunities for automated, unobtrusive data collection.

  14. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development is presented of real time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provides favorable indications of the feasibility of the acoustic monitor for clinical use.
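
    A minimal LMS-trained linear predictor of the kind described is sketched below: the predictor adapts to the signal, and the prediction error can be thresholded to flag heart-tone events. The filter order, step size, and test signal are illustrative, not the monitor's actual parameters.

        import numpy as np

        def lms_predictor(x, order=8, mu=0.01):
            w = np.zeros(order)
            errors = np.zeros(len(x))
            for n in range(order, len(x)):
                past = x[n - order:n][::-1]        # most recent samples first
                pred = w @ past                    # linear prediction of x[n]
                errors[n] = x[n] - pred
                w += 2 * mu * errors[n] * past     # LMS weight update
            return errors

        t = np.arange(4000) / 4000.0
        signal = np.sin(2 * np.pi * 30 * t) + 0.05 * np.random.default_rng(4).normal(size=t.size)
        err = lms_predictor(signal)
        print(round(np.mean(err[2000:] ** 2), 6))  # residual error after adaptation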

  15. Reservoir continuous process improvement six sigma methodology implementation

    SciTech Connect

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  16. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, Q conversion, sampling and cross correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.

  17. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events is of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in the seasonal variations of the intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species at a Mediterranean site. The major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large-scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during 2012-2014. To retrieve phenological information from the digital images, a routine of commands to process the digital image files was specifically created using MATLAB (R2013b, The MathWorks, Natick, Mass.). The images of the dataset were re-classified and the files renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the acquired panoramas, defined by the presence of the most representative species of
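
    As a hedged illustration of the color-channel analysis (not the APOS/MATLAB routine itself), per-image mean RGB values inside a region of interest can be reduced to a green chromatic coordinate, G/(R+G+B), whose seasonal inflection points mark phenological events; the image array and ROI below are placeholders.

        import numpy as np

        def green_chromatic_coordinate(image, roi):
            r0, r1, c0, c1 = roi
            patch = image[r0:r1, c0:c1].astype(float)
            r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
            return g / (r + g + b)

        img = np.random.default_rng(5).integers(0, 256, size=(480, 640, 3))
        print(round(green_chromatic_coordinate(img, roi=(100, 200, 150, 300)), 3))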

  18. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  19. Results of precision processing (scene correction) of ERTS-1 images using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1973-01-01

    ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to those of the NDPF electro-optical processor.

  20. Light Water Reactor Sustainability Program: Digital Technology Business Case Methodology Guide

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam; Vlahoplus, Chris

    2014-09-01

    The Department of Energy’s (DOE’s) Light Water Reactor Sustainability Program aims to develop and deploy technologies that will make the existing U.S. nuclear fleet more efficient and competitive. The program has developed a standard methodology for determining the impact of new technologies in order to assist nuclear power plant (NPP) operators in building sound business cases. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies Pathway is part of the DOE’s Light Water Reactor Sustainability (LWRS) Program. It conducts targeted research and development (R&D) to address aging and reliability concerns with the legacy instrumentation and control and related information systems of the U.S. operating light water reactor (LWR) fleet. This work involves two major goals: (1) to ensure that legacy analog II&C systems are not life-limiting issues for the LWR fleet and (2) to implement digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Resolving long-term operational concerns with the II&C systems contributes to the long-term sustainability of the LWR fleet, which is vital to the nation’s energy and environmental security. The II&C Pathway is conducting a series of pilot projects that enable the development and deployment of new II&C technologies in existing nuclear plants. Through the LWRS program, individual utilities and plants are able to participate in these projects or otherwise leverage the results of projects conducted at demonstration plants. Performance advantages of the new pilot project technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on

  1. Optical hybrid analog-digital signal processing based on spike processing in neurons

    NASA Astrophysics Data System (ADS)

    Fok, Mable P.; Tian, Yue; Rosenbluth, David; Deng, Yanhua; Prucnal, Paul R.

    2011-09-01

    Spike processing is one kind of hybrid analog-digital signal processing, which has the efficiency of analog processing and the robustness to noise of digital processing. When instantiated with optics, a hybrid analog-digital processing primitive has the potential to be scalable, computationally powerful, and have high operation bandwidth. These devices open up a range of processing applications for which electronic processing is too slow. Our approach is based on a hybrid analog/digital computational primitive that elegantly implements the functionality of an integrate-and-fire neuron using a Ge-doped non-linear optical fiber and off-the-shelf semiconductor devices. In this paper, we introduce our photonic neuron architecture and demonstrate the feasibility of implementing simple photonic neuromorphic circuits, including the auditory localization algorithm of the barn owl, which is useful for LIDAR localization, and the crayfish tail-flip escape response.
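
    The integrate-and-fire behavior of the processing primitive can be summarized with a leaky integrate-and-fire model; all constants below are illustrative and are not device parameters of the Ge-doped fiber implementation.

        import numpy as np

        def integrate_and_fire(inputs, dt=1e-3, tau=20e-3, threshold=1.0):
            v, spikes = 0.0, []
            for x in inputs:
                v += dt * (-v / tau + x)        # leaky integration of the input
                if v >= threshold:              # threshold crossing: emit a spike
                    spikes.append(1)
                    v = 0.0                     # reset after firing
                else:
                    spikes.append(0)
            return np.array(spikes)

        drive = np.full(200, 60.0)              # constant drive (arbitrary units)
        print(integrate_and_fire(drive).sum(), "spikes in 200 ms")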

  2. Reflexivity: a methodological tool in the knowledge translation process?

    PubMed

    Alley, Sarah; Jackson, Suzanne F; Shakya, Yogendra B

    2015-05-01

    Knowledge translation is a dynamic and iterative process that includes the synthesis, dissemination, exchange, and application of knowledge. It is considered the bridge that closes the gap between research and practice. Yet it appears that in all areas of practice, a significant gap remains in translating research knowledge into practical application. Recently, researchers and practitioners in the field of health care have begun to recognize reflection and reflexive exercises as a fundamental component to the knowledge translation process. As a practical tool, reflexivity can go beyond simply looking at what practitioners are doing; when approached in a systematic manner, it has the potential to enable practitioners from a wide variety of backgrounds to identify, understand, and act in relation to the personal, professional, and political challenges they face in practice. This article focuses on how reflexive practice as a methodological tool can provide researchers and practitioners with new insights and increased self-awareness, as they are able to critically examine the nature of their work and acknowledge biases, which may affect the knowledge translation process. Through the use of structured journal entries, the nature of the relationship between reflexivity and knowledge translation was examined, specifically exploring if reflexivity can improve the knowledge translation process, leading to increased utilization and application of research findings into everyday practice.

  3. Optimization of Control Processes of Digital Electrical Drive Systems

    NASA Astrophysics Data System (ADS)

    Dochviri, J.

    2010-01-01

    The aim of the work is solution of the problems associated with synthesis of the digital speed regulators both for DC and AC thyristor electrical drives. The investigation is realized based on the parameters of continuous technological equipment (e.g. paper-making machine) by taking into account elastic transmission links of the drive systems. Appropriate frequency characteristics and transient processes are described.

  4. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  5. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  6. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  7. Processing of digital holograms: segmentation and inpainting

    NASA Astrophysics Data System (ADS)

    Jiao, Shuming; Zou, Wenbin

    2016-10-01

    In this paper, two novel hologram image processing issues, hologram decomposition and hologram inpainting, are briefly reviewed and discussed. By hologram decomposition, one hologram can be decomposed into several sub-holograms, each representing one individual item in the 3D object scene. A Virtual Diffraction Plane based hologram decomposition scheme is proposed based on Otsu thresholding segmentation, morphological dilation, and sequential scan labelling. Hologram decomposition can be employed for focus distance detection in blind hologram reconstruction. By hologram inpainting, a damaged hologram can be restored by filling in the missing pixels. An exemplar- and search-based technique is applied for hologram inpainting, with computing speed enhanced by an Artificial Bee Colony algorithm. Potential applications of hologram inpainting are discussed.
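
    The segmentation chain named above (thresholding, dilation, connected-component labelling) can be sketched with generic scipy/scikit-image operations; the input array is a synthetic placeholder rather than an actual hologram reconstruction, and the Virtual Diffraction Plane step is not modeled.

        import numpy as np
        from scipy import ndimage
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(6)
        intensity = rng.random((128, 128))
        intensity[20:50, 20:50] += 2.0          # two bright "objects" in the scene
        intensity[80:110, 70:120] += 2.0

        mask = intensity > threshold_otsu(intensity)         # Otsu thresholding
        mask = ndimage.binary_dilation(mask, iterations=2)   # morphological dilation
        labels, n_objects = ndimage.label(mask)              # sequential scan labelling
        print(n_objects, "sub-regions found")                # one region per scene item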

  8. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.

  9. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  10. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  11. The quantified process approach: an emerging methodology to neuropsychological assessment.

    PubMed

    Poreh, A M

    2000-05-01

    An important development in the field of neuropsychological assessment is the quantification of the process by which individuals solve common neuropsychological tasks. The present article outlines the history leading to this development, the Quantified Process Approach, and suggests that this line of applied research bridges the gap between the clinical and statistical approaches to neuropsychological assessment. It is argued that the enterprise of quantifying the process approach proceeds via three major methodologies: (1) the "Satellite" Testing Paradigm: an approach by which new tasks are developed to complement existing tests so as to clarify a given test performance; (2) the Composition Paradigm: an approach by which data on a given test that have been largely overlooked are compiled and subsequently analyzed, resulting in new indices that are believed to reflect underlying constructs accounting for test performance; and (3) the Decomposition Paradigm: an approach which investigates the relationship between test items of a given measure according to underlying facets, resulting in the development of new subscores. The article illustrates each of the above paradigms, offers a critique of this new field according to prevailing professional standards for psychological measures, and provides suggestions for future research.

  12. Digital processing of histopathological aspects in renal transplantation

    NASA Astrophysics Data System (ADS)

    de Albuquerque Araujo, Arnaldo; de Andrade, Marcos C.; Bambirra, Eduardo A.; dos Santos, A. M. M.

    1993-07-01

    We describe here our initial experience with the digital image processing of histopathological aspects from multiple renal biopsies of a transplanted kidney in a patient treated with Cyclosporine (CsA), a powerful immunosuppressant drug whose use has improved the chances of a successful vascularized organ transplantation (Tx). Unfortunately, CsA promotes morphological alterations to the glomerular structure of the kidneys. To characterize this process, the distributions of glomerular, tuft, and lumen areas are measured. The results are presented in the form of graphs.

  13. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. Emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost as compared to traditional devices. This article presents the experimental details of process sequence optimization for digital microfluidics (DMF) using the "electrowetting-on-dielectric" (EWOD) technique. Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent process steps for the EWOD technique have been optimized in this work.

  14. Advanced Digital Signal Processing for Hybrid Lidar FY 2013

    DTIC Science & Technology

    2013-01-01

    Report documentation page excerpt: development of signal processing algorithms for hybrid lidar-radar designed to improve detection performance. Author: William D. Jemison. Subject terms: hybrid lidar-radar.

  15. Flow manipulation and control methodologies for vacuum infusion processes

    NASA Astrophysics Data System (ADS)

    Alms, Justin B.

    experienced. First, the effect on permeability is characterized, so the process can be simulated and the flow front patterns can be predicted. It was found that using the VIPR process in combination with tool side injection gates is a very effective method to control resin flow. Based on this understanding several control algorithms were developed to use the process in an automated manufacturing environment which were tested and validated in a virtual environment. To implement and demonstrate the approach, an experimental workstation was built and various infusion examples were performed in the automated environment to validate the capability of the VIPR process with the control methodologies. The VIPR process with control consistently performed better than the process without control. This contribution should prove useful in making VIPs more reliable in the production of large scale composite structures.

  16. A digital signal processing system for coherent laser radar

    NASA Technical Reports Server (NTRS)

    Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry

    1991-01-01

    A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital signal processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor and collects 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.

  17. Integrating digital topology in image-processing libraries.

    PubMed

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the approach can be adapted with only minor modifications to other image-processing libraries.

  18. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemicals, as well as US infrastructure, including bridges, pipelines, and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial-and-error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models employing detailed scientific methodologies. However, these models are cumbersome and not easy to use; therefore, they have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development. The scope of these simple models is limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models for relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed to industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their selection of welding process parameters and consumable composition. It is hypothesized that by expanding these tools throughout welding industry

  19. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  20. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. The Kern County Water Agency is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could ultimately be extrapolated to the entire 10,400 sq km Kern County watershed. Results indicate that digital image processing of Landsat 2 data will provide the usable land cover information required by KCWA as input to the SCS runoff model.
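
    For context, the SCS model that the land-cover and soil data feed into reduces, for a single storm, to the standard curve-number runoff relation; the curve number and rainfall depth below are illustrative only.

        def scs_runoff(precip_in, curve_number):
            s = 1000.0 / curve_number - 10.0        # potential maximum retention (inches)
            ia = 0.2 * s                            # initial abstraction
            if precip_in <= ia:
                return 0.0
            return (precip_in - ia) ** 2 / (precip_in - ia + s)

        print(round(scs_runoff(precip_in=3.0, curve_number=75), 2), "inches of runoff")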

  1. On-Board Spaceborne Real-time Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Gao, G.; Long, F.; Liu, L.

    This paper reports a preliminary study of an on-board digital signal processing system. It covers the on-board processing requirement analysis, functional specifications, and implementation with radiation-tolerant field-programmable gate array (FPGA) technology. The FPGA program is designed in the VHDL hardware description language and implemented on a high-density FPGA chip. The design takes full advantage of the massively parallel architecture of the Virtex-II FPGA logic slices to achieve real-time processing at a high data rate. Furthermore, an FFT algorithm's implementation with the system is provided as an illustration.

  2. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  3. Digital image processing of crystalline specimens examined by electron microscopy.

    PubMed

    Kanaya, K

    1988-12-01

    Crystalline specimens imaged in the electron microscope are analysed using digital processing. Some principles of structural analysis using the method of Fourier decomposition are discussed. Complementary techniques, such as enhancement by gradient and Laplacian operators, have been found useful in analysing electron micrographs. The application of these techniques to some problems in materials science and biology is reviewed. By selecting and phase-correcting spots in the computed diffraction pattern, it was possible to localize atoms, molecules, and their defective arrangement in evaporated gold, sputter-deposited tungsten films, and single crystals of cadmium selenide. Digital processing based on the theory of helical diffraction was used to explore the three-dimensional arrangement of molecules in cellular components of alveolar soft part sarcoma, Hirano bodies, and neurofibrillar tangles in the human brain.

  4. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
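    A compact sketch of the multi-scale retinex with color restoration idea is shown below: each channel is compared, in the log domain, against Gaussian-blurred surrounds at several scales, and a color-restoration weight is then applied. The scales, gain constants, and final contrast stretch here are illustrative defaults, not the values used in the NASA implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def msrcr(img, sigmas=(5, 20, 60), alpha=125.0, beta=46.0, eps=1.0):
    """Simplified multi-scale retinex with color restoration (MSRCR).
    img: RGB array with values in [0, 255]."""
    img = img.astype(np.float64) + eps
    msr = np.zeros_like(img)
    for sigma in sigmas:                                  # multi-scale retinex
        for c in range(3):
            surround = gaussian_filter(img[..., c], sigma)
            msr[..., c] += np.log(img[..., c]) - np.log(surround)
    msr /= len(sigmas)
    # colour restoration: weight each channel by its share of the total signal
    crf = beta * (np.log(alpha * img) - np.log(img.sum(axis=2, keepdims=True)))
    out = msr * crf
    out = (out - out.min()) / (out.max() - out.min() + 1e-12)  # simple stretch
    return (out * 255).astype(np.uint8)

demo = np.random.rand(64, 64, 3) * 60 + 10   # synthetic dark image
print(msrcr(demo).mean())
```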

  5. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency including the tunability of the low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely, the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks are equivalent with well known principles of optics of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature.
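    In DSP terms, the first-order all-pole and all-zero structures described above behave like a single-pole IIR section and a single-zero FIR section whose coefficients set the bandwidth. The sketch below is a purely electronic (not photonic) illustration with assumed coefficient values; it only computes the two magnitude responses.

```python
import numpy as np
from scipy.signal import freqz

def first_order_responses(r=0.95, n=512):
    """Magnitude responses of a first-order all-pole (IIR) section and a
    first-order all-zero (FIR) section; the radius r tunes the bandwidth
    of the all-pole response."""
    w, h_pole = freqz([1.0 - r], [1.0, -r], worN=n)   # all-pole low-pass
    _, h_zero = freqz([0.5, 0.5], [1.0], worN=n)      # all-zero low-pass
    return w, np.abs(h_pole), np.abs(h_zero)

w, ap, az = first_order_responses()
cutoff = w[np.argmax(ap < ap[0] / np.sqrt(2))]
print(f"all-pole -3 dB point: {cutoff:.3f} rad/sample")
```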

  6. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long-term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  7. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition refers to the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis rather than linguistic analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition, including Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these algorithms, LPC is the most widely used. It has a short execution time and does not require large memory storage; however, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms that rely on fewer assumptions, but they are not widely implemented or investigated. With recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated with a view to implementing them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
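    Of the algorithms named above, LPC is the most compact to illustrate. The sketch below estimates LPC coefficients for one speech frame using the autocorrelation method and the Levinson-Durbin recursion; this is the textbook formulation, not necessarily the exact implementation studied in the report, and the example frame is synthetic.

```python
import numpy as np

def lpc(frame, order=10):
    """Autocorrelation-method LPC via the Levinson-Durbin recursion.
    Returns the prediction polynomial a (a[0] = 1) and the residual energy."""
    x = frame * np.hamming(len(frame))
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err                       # reflection coefficient
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]  # update previous coefficients
        a[i] = k
        err *= (1.0 - k * k)
    return a, err

# Example: a synthetic vowel-like frame (8 kHz sampling assumed)
t = np.arange(240) / 8000.0
frame = (np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 900 * t)
         + 0.01 * np.random.randn(240))
coeffs, residual = lpc(frame)
print(coeffs.round(3))
```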

  8. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archival Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  9. Predicting protein subcellular location using digital signal processing.

    PubMed

    Pan, Yu-Xi; Li, Da-Wei; Duan, Yun; Zhang, Zhi-Zhou; Xu, Ming-Qing; Feng, Guo-Yin; He, Lin

    2005-02-01

    The biological functions of a protein are closely related to its attributes in a cell. With the rapid accumulation of newly found protein sequence data in databanks, it is highly desirable to develop an automated method for predicting the subcellular location of proteins. The establishment of such a predictor will expedite the functional determination of newly found proteins and the process of prioritizing genes and proteins identified by genomic efforts as potential molecular targets for drug design. The traditional algorithms for predicting these attributes were based solely on amino acid composition in which no sequence order effect was taken into account. To improve the prediction quality, it is necessary to incorporate such an effect. However, the number of possible patterns in protein sequences is extremely large, posing a formidable difficulty for realizing this goal. To deal with such difficulty, a well-developed tool in digital signal processing named digital Fourier transform (DFT) [1] was introduced. After being translated to a digital signal according to the hydrophobicity of each amino acid, a protein was analyzed by DFT within the frequency domain. A set of frequency spectrum parameters, thus obtained, were regarded as the factors to represent the sequence order effect. A significant improvement in prediction quality was observed by incorporating the frequency spectrum parameters with the conventional amino acid composition. One of the crucial merits of this approach is that many existing tools in mathematics and engineering can be easily applied in the predicting process. It is anticipated that digital signal processing may serve as a useful vehicle for many other protein science areas.
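    The frequency-spectrum features described above can be sketched in a few lines: map each residue to a hydrophobicity value, transform the resulting signal, and keep the low-frequency magnitudes as sequence-order features. The Kyte-Doolittle scale and the feature count used here are assumptions for illustration; the paper encodes hydrophobicity but may use a different scale and feature set.

```python
import numpy as np

# Kyte-Doolittle hydrophobicity values (assumed encoding for illustration)
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def spectrum_features(sequence, n_features=8):
    """Translate a protein sequence into a hydrophobicity signal, take its
    discrete Fourier transform, and return low-frequency magnitudes
    (excluding the DC term) as sequence-order features."""
    signal = np.array([KD[aa] for aa in sequence if aa in KD], dtype=float)
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    return spectrum[1:n_features + 1] / len(signal)

print(spectrum_features("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ").round(3))
```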

  10. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus largely used by scientists studying rodent mammary gland morphology.

  11. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    PubMed

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images into voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for the axial, coronal and sagittal projections, respectively.
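    One step in the pipeline described above, reducing the CT grey-level range to a small set of material indices before voxelisation, can be sketched as follows. The Hounsfield-unit thresholds and material labels are illustrative placeholders, not values used by the authors or by Scan2MCNP, and the volume is a random stand-in.

```python
import numpy as np

# Illustrative Hounsfield-unit bins -> material index (placeholders only)
HU_BINS = [(-1000, -400, 1),   # air / lung
           (-400, 100, 2),     # soft tissue
           (100, 3000, 3)]     # bone

def ct_to_materials(hu_volume):
    """Map a CT volume (Hounsfield units) to a compact voxel array of
    material indices, ready to be written out as a simulation lattice."""
    materials = np.zeros(hu_volume.shape, dtype=np.uint8)
    for lo, hi, idx in HU_BINS:
        materials[(hu_volume >= lo) & (hu_volume < hi)] = idx
    return materials

volume = np.random.randint(-1000, 2000, size=(4, 8, 8))  # stand-in CT volume
print(np.unique(ct_to_materials(volume)))
```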

  12. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-related applications for historic building documentation purposes has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  13. Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization

    NASA Astrophysics Data System (ADS)

    Repola, L.; Memmolo, R.; Signoretti, D.

    2015-04-01

    In the research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at surveying submerged archaeological sites, and integrable with standard systems for geomorphological surveying of the coast, has been developed. The project involves the construction of hardware consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras, and software for the production of point clouds and the initial processing of data. The software has features for stereoscopic vision system calibration, for reduction of noise and of the distortion of underwater captured images, for searching for corresponding points in stereoscopic image pairs using dense and sparse stereo-matching algorithms, and for point cloud generation and filtering. Only after various calibration and survey tests, carried out during the excavations envisaged in the project, was mastery of the methods for efficient data acquisition achieved. The current development of the system has allowed the generation of portions of digital models of real submerged scenes. A semi-automatic procedure for global registration of partial models is under development as a useful aid for the study and musealization of sites.

  14. Kinematic analysis of human walking gait using digital image processing.

    PubMed

    O'Malley, M; de Paor, D L

    1993-07-01

    A system using digital image processing techniques for kinematic analysis of human gait has been developed. The system is cheap, easy to use, automated and provides useful detailed quantitative information to the medical profession. Passive markers comprising black annuli on white card are placed on the anatomical landmarks of the subject. Digital images at the standard television rate of 25 per second are acquired of the subject walking past a white background. The images are obtained, stored and processed using standard commercially available hardware, i.e. video camera, video recorder, digital framestore and an IBM PC. Using a single-threshold grey level, all the images are thresholded to produce binary images. An automatic routine then uses a set of pattern recognition algorithms to locate accurately and consistently the markers in each image. The positions of the markers are analysed to determine to which anatomical landmark they correspond, and thus a stick diagram for each image is obtained. There is also a facility where the positions of the markers may be entered manually and errors corrected. The results may be presented in a variety of ways: stick diagram animation, sagittal displacement graphs, flexion diagrams and gait parameters.
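    The core of the automated routine described above, single-threshold binarisation followed by marker localisation, can be sketched as shown below. The connected-component centroiding used here is a simplified stand-in for the paper's pattern-recognition algorithms, and the grey-level threshold and synthetic frame are assumed values.

```python
import numpy as np
from scipy import ndimage

def marker_centroids(gray_frame, threshold=128):
    """Locate dark passive markers in a greyscale frame by thresholding and
    connected-component labelling, returning (row, col) centroids."""
    binary = gray_frame < threshold            # black annuli on white card
    labels, n = ndimage.label(binary)
    return ndimage.center_of_mass(binary, labels, range(1, n + 1))

frame = np.full((120, 160), 255, dtype=np.uint8)
frame[40:50, 60:70] = 0                        # synthetic marker
print(marker_centroids(frame))
```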

  15. Digital image processing for the early localization of cancer

    NASA Astrophysics Data System (ADS)

    Kelmar, Cheryl M.

    1991-06-01

    The prognosis for cancer patients becomes much better if a tumor is diagnosed, localized and treated early, in a precancerous stage. The difficulty lies in the localization of cancerous tumors. Carcinoma in situ (CIS) refers to a tumor which is approximately 100 microns thick and one which has not penetrated through the epithelium wall or become invasive (2). A tumor of this size cannot be detected by existing techniques such as x-ray, computer tomography, magnetic resonance imaging, nuclear medicine or conventional endoscopy under white-light illumination. However, these tumors can be localized and destroyed by photodynamic diagnosis and therapy. This research shows that digital image processing and the technique of digital image ratioing contribute to photodynamic diagnosis and the early localization of cancer. A software package has been developed as a result of this research. The software package quantifies the usefulness of digital image processing for tumor localization and detectability. System parameters such as the endoscope distance and angle variations, tumor size and tumor concentration, sensitivity and specificity of the system have been tested and quantified.

  16. Digital Processing of Weak Signals Buried in Noise

    NASA Astrophysics Data System (ADS)

    Emerson, Darrel

    This article describes the use of digital signal processing to pull the AMSAT AO-13 ZRO test signal out of the noise. In the ZRO tests, a signal is transmitted from the Oscar 13 satellite at progressively lower power levels, in 3 dB steps. The challenge is to decode successfully the weakest possible signal. The signal from the receiver audio was digitized using a Sound Blaster card, then filtered with a modified FFT routine. The modification was to allow the pre-detection filter to follow the slowly drifting signal. After using the matched, sliding filter before detection, the post-detection signal was passed through another matched filter. Finally, a cross-correlation technique comparing the detected, filtered signal with every possible combination of ZRO signal was applied, also taking into account a gradual drift of CW sending speed. The final, statistically most probable, solution turned out to be correct. This gave the only successful detection of level A transmissions from Oscar 13 so far (Aug 1996). The extensive digital processing partly made up for the relatively poor receiving antenna: a 10-element 146 MHz Yagi, part of the Cushcraft AOP-1 combination.

  17. Digital metamaterials.

    PubMed

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, '0' and '1', in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call 'metamaterial bits', with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental 'metamaterial bytes' with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology.
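    The "metamaterial bits" idea, mixing two constituent permittivities spatially to reach a target effective permittivity, can be hinted at with the classical series/parallel (Wiener) mixing bounds for a layered composite. These closed-form bounds are a textbook stand-in, not the synthesis procedure of the paper, and the permittivity values below are arbitrary examples.

```python
def wiener_bounds(eps1, eps2, fill_fraction):
    """Effective permittivity bounds for a two-constituent layered mixture:
    'parallel' layering (fields along the layers) and 'series' layering
    (fields across the layers)."""
    parallel = fill_fraction * eps1 + (1 - fill_fraction) * eps2
    series = 1.0 / (fill_fraction / eps1 + (1 - fill_fraction) / eps2)
    return series, parallel

# Example: a plasmonic-like bit (negative permittivity) mixed with a dielectric bit
print(wiener_bounds(eps1=-2.0 + 0.1j, eps2=4.0, fill_fraction=0.4))
```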

  18. DYMAC digital electronic balance. [LASL Plutonium Processing Facility

    SciTech Connect

    Stephens, M.M.

    1980-06-01

    The Dynamic Materials Accountability (DYMAC) System at LASL integrates nondestructive assay (NDA) instruments with interactive data-processing equipment to provide near-real-time accountability of the nuclear material in the LASL Plutonium Processing Facility. The most widely used NDA instrument in the system is the DYMAC digital electronic balance. The DYMAC balance is a commercial instrument that has been modified at LASL for weighing material in gloveboxes and for transmitting the weight data directly to a central computer. This manual describes the balance components, details the LASL modifications, reviews a DYMAC measurement control program that monitors balance performance, and provides instructions for balance operation and maintenance.

  19. Enhancement Of Optical Registration Signals Through Digital Signal Processing Techniques

    NASA Astrophysics Data System (ADS)

    Cote, Daniel R.; Lazo-Wasem, Jeanne

    1988-01-01

    Alignment and setup of lithography processes has largely been conducted on special test wafers. Actual product-level optimization has been limited to manual techniques such as optical verniers. This is especially time consuming and prone to inconsistencies when the registration characteristics of lithographic systems are being measured. One key factor obstructing the use of automated metrology equipment on product-level wafers is the inability to reliably discern metrology features from the background noise and variations in optical registration signals. This is often the case for metal levels such as aluminum and tungsten. This paper discusses methods for enhancement of typical registration signals obtained from difficult semiconductor process levels. Brightfield and darkfield registration signals are obtained using a microscope and a 1024-element linear photodiode array. These signals are then digitized and stored on the hard disk of a computer. The techniques utilized include amplitude-selective as well as adaptive and non-adaptive frequency-domain filtering techniques. The effect of each of these techniques upon calculated registration values is analyzed by determining the positional variation of the center location of a two-line registration feature. Plots of raw and processed signals obtained are presented, as are plots of the power spectral density of ideal metrology feature signal and noise patterns. It is concluded that the proper application of digital signal processing (DSP) techniques to problematic optical registration signals greatly enhances the applicability of automated optical registration measurement techniques to difficult semiconductor process levels.
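    A non-adaptive version of the frequency-domain filtering described above can be sketched in a few lines: transform the 1024-element line-scan signal, zero the high-frequency bins, and transform back before locating the feature. The cutoff and the synthetic two-line feature are assumptions for illustration.

```python
import numpy as np

def lowpass_fft(signal, cutoff_bins=40):
    """Non-adaptive frequency-domain smoothing of a 1-D registration signal:
    zero all spectral bins at or above the cutoff and inverse-transform."""
    spectrum = np.fft.rfft(signal)
    spectrum[cutoff_bins:] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic two-line registration feature buried in noise
x = np.arange(1024)
clean = np.exp(-(x - 480) ** 2 / 50.0) + np.exp(-(x - 544) ** 2 / 50.0)
noisy = clean + 0.3 * np.random.randn(1024)
print(int(np.argmax(lowpass_fft(noisy))))
```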

  20. Rapid Process Optimization: A Novel Process Improvement Methodology to Innovate Health Care Delivery.

    PubMed

    Wiler, Jennifer L; Bookman, Kelly; Birznieks, Derek B; Leeret, Robert; Koehler, April; Planck, Shauna; Zane, Richard

    2016-03-26

    Health care systems have utilized various process redesign methodologies to improve care delivery. This article describes the creation of a novel process improvement methodology, Rapid Process Optimization (RPO). This system was used to redesign emergency care delivery within a large academic health care system, which resulted in decreases in: (1) door-to-physician time (Department A: 54 minutes pre vs 12 minutes 1 year post; Department B: 20 minutes pre vs 8 minutes 3 months post), (2) overall length of stay (Department A: 228 vs 184; Department B: 202 vs 192), (3) discharge length of stay (Department A: 216 vs 140; Department B: 179 vs 169), and (4) left-without-being-seen rates (Department A: 5.5% vs 0.0%; Department B: 4.1% vs 0.5%), despite a 47% increase in census at Department A (34,391 vs 50,691) and a 4% increase at Department B (8,404 vs 8,753). The novel RPO process improvement methodology can inform and guide successful care redesign.

  1. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining.

    PubMed

    Van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-02-21

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
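    As a point of reference for the colour-deconvolution step discussed above, the sketch below separates an IHC image into haematoxylin/eosin/DAB channels with scikit-image's fixed stain matrix and compares mean DAB signal between two batches. The paper's method is blind colour deconvolution adapted to IHC, so this fixed-matrix version is only an illustrative stand-in, and the image arrays are placeholders for real TMA tiles.

```python
import numpy as np
from skimage.color import rgb2hed

def mean_dab(rgb_image):
    """Mean DAB optical density after fixed-matrix colour deconvolution."""
    hed = rgb2hed(rgb_image)      # channels: haematoxylin, eosin, DAB
    return float(hed[..., 2].mean())

# Stand-ins for TMA core tiles stained in two different batches
batch_a = np.random.rand(64, 64, 3)
batch_b = np.random.rand(64, 64, 3)
print(mean_dab(batch_a), mean_dab(batch_b))
```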

  2. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining

    PubMed Central

    Van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-01-01

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns. PMID:28220842

  3. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining

    NASA Astrophysics Data System (ADS)

    van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-02-01

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.

  4. Thermal Modeling of Direct Digital Melt-Deposition Processes

    NASA Astrophysics Data System (ADS)

    Cooper, K. P.; Lambrakos, S. G.

    2011-02-01

    Additive manufacturing involves creating three-dimensional (3D) objects by depositing materials layer-by-layer. The freeform nature of the method permits the production of components with complex geometry. Deposition processes provide one more capability, which is the addition of multiple materials in a discrete manner to create "heterogeneous" objects with locally controlled composition and microstructure. The result is direct digital manufacturing (DDM), by which dissimilar materials are added voxel-by-voxel (a voxel is a volumetric pixel) following a predetermined tool-path. A typical example is functionally gradient material such as a gear with a tough core and a wear-resistant surface. The inherent complexity of DDM processes is such that process modeling based on direct physics-based theory is difficult, especially due to a lack of temperature-dependent thermophysical properties and particularly when dealing with melt-deposition processes. In order to overcome this difficulty, an inverse problem approach is proposed for the development of thermal models that can represent multi-material, direct digital melt deposition. This approach is based on the construction of a numerical-algorithmic framework for modeling anisotropic diffusivity such as that which would occur during energy deposition within a heterogeneous workpiece. This framework consists of path-weighted integral formulations of heat diffusion according to spatial variations in material composition and requires consideration of parameter sensitivity issues.
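    The kind of spatially varying diffusion that such a framework has to represent can be illustrated with a one-dimensional explicit finite-difference step in which the thermal diffusivity changes across a two-material deposit. The geometry, diffusivities and temperatures below are assumed values chosen only to show the mechanics, not parameters from the paper.

```python
import numpy as np

def diffuse_1d(temp, alpha, dx, dt, steps):
    """Explicit finite-difference update for 1-D heat diffusion with a
    spatially varying diffusivity alpha(x). Stability requires
    dt <= dx**2 / (2 * alpha.max())."""
    t = temp.copy()
    for _ in range(steps):
        lap = np.zeros_like(t)
        lap[1:-1] = (t[2:] - 2 * t[1:-1] + t[:-2]) / dx ** 2
        t[1:-1] += dt * alpha[1:-1] * lap[1:-1]
    return t

# Two dissimilar deposited materials side by side, with a hot spot in the middle
alpha = np.where(np.arange(100) < 50, 1.0e-5, 4.0e-6)   # m^2/s (illustrative)
temp = np.full(100, 300.0)
temp[48:52] = 1800.0
print(round(diffuse_1d(temp, alpha, dx=1e-4, dt=2e-4, steps=500).max(), 1))
```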

  5. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  6. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processings have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processings (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  7. Perspectives on Learning: Methodologies for Exploring Learning Processes and Outcomes

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    2014-01-01

    The papers in this Special Issue were initially prepared for an EARLI 2013 Symposium that was designed to examine methodologies in use by researchers from two sister communities, Learning and Instruction and Learning Sciences. The four papers reflect a common ground in advances in conceptions of learning since the early days of the "cognitive…

  8. Fundamentals of in Situ Digital Camera Methodology for Water Quality Monitoring of Coast and Ocean

    PubMed Central

    Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave

    2009-01-01

    Conventional digital cameras, the Nikon Coolpix885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water leaving light. Different systems were deployed to capture upwelling light from below the surface, while eliminating direct surface reflection. Relationships between RGB ratios of water surface images, and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This current paper focuses on the method that was used to acquire digital images, derive RGB values and relate measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method. PMID:22346729
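    Treating the camera as a three-band radiometer amounts to averaging the channel responses of a water-surface image and forming band ratios, as sketched below. The regression coefficients that relate such ratios to yellow substance or chlorophyll are site-specific and are not reproduced here; the image array is a placeholder for a real surface photograph.

```python
import numpy as np

def rgb_band_ratios(image):
    """Mean R, G, B responses of a water-surface image and their band ratios."""
    r, g, b = (image[..., i].astype(float).mean() for i in range(3))
    return {"G/R": g / r, "B/G": b / g, "B/R": b / r}

surface_patch = np.random.randint(1, 256, size=(100, 100, 3), dtype=np.uint8)
print(rgb_band_ratios(surface_patch))
```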

  9. Moire technique by means of digital image processing.

    PubMed

    Gasvik, K J

    1983-11-15

    Moiré technique by means of projected fringes is a suitable method for full field measurements of out-of-plane deformations and object contouring. One disadvantage in industrial applications has been the photographic process with the involved time-consuming development of the photographic film. This paper presents a new method using a TV camera and a digital image processor whereby real-time measurements of deformations and comparison of object contours are possible. Also the principles and limitations of the projected moiré method are described.

  10. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  11. Naturalistic Observation of Health-Relevant Social Processes: The Electronically Activated Recorder (EAR) Methodology in Psychosomatics

    PubMed Central

    Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große

    2012-01-01

    This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  12. Digital signal processor and programming system for parallel signal processing

    SciTech Connect

    Van den Bout, D.E.

    1987-01-01

    This thesis describes an integrated assault upon the problem of designing high-throughput, low-cost digital signal-processing systems. The dual prongs of this assault consist of: (1) the design of a digital signal processor (DSP) which efficiently executes signal-processing algorithms in either a uniprocessor or multiprocessor configuration, (2) the PaLS programming system which accepts an arbitrary algorithm, partitions it across a group of DSPs, synthesizes an optimal communication link topology for the DSPs, and schedules the partitioned algorithm upon the DSPs. The results of applying a new quasi-dynamic analysis technique to a set of high-level signal-processing algorithms were used to determine the uniprocessor features of the DSP design. For multiprocessing applications, the DSP contains an interprocessor communications port (IPC) which supports simple, flexible, dataflow communications while allowing the total communication bandwidth to be incrementally allocated to achieve the best link utilization. The net result is a DSP with a simple architecture that is easy to program for both uniprocessor and multi-processor modes of operation. The PaLS programming system simplifies the task of parallelizing an algorithm for execution upon a multiprocessor built with the DSP.

  13. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  14. Applying of digital signal processing to optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to the system with the optical equisignal zone, as a new method of evaluating the optical equisignal zone position. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the optical equisignal zone position is applied to processing on the array detector. This algorithm enables evaluation of both the lateral displacement and the turning angles of the receiver relative to the projector. The interrelation of parameters of the projector and the receiver is considered. According to the described principles, an experimental setup was built and then characterized. The accuracy of position evaluation of the equisignal zone is shown to depend on the size of the equivalent entrance pupil used in processing.

  15. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade, thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities, to use as thermal and hydrodynamic tracers. In biomedical situations, e.g. skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are also presented.

  16. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  17. Desolvation Induced Origami of Photocurable Polymers by Digit Light Processing.

    PubMed

    Zhao, Zeang; Wu, Jiangtao; Mu, Xiaoming; Chen, Haosen; Qi, H Jerry; Fang, Daining

    2016-12-22

    Self-folding origami is of great interest in current research on functional materials and structures, but there is still a challenge to develop a simple method to create freestanding, reversible, and complex origami structures. This communication provides a feasible solution to this challenge by developing a method based on the digit light processing technique and desolvation-induced self-folding. In this new method, flat polymer sheets can be cured by a light field from a commercial projector with varying intensity, and the self-folding process is triggered by desolvation in water. Folded origami structures can be recovered once immersed in the swelling medium. The self-folding process is investigated both experimentally and theoretically. Diverse 3D origami shapes are demonstrated. This method can be used for responsive actuators and the fabrication of 3D electronic devices.

  18. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out for evaluating the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) standard digital TV in the VHF-highband. Measurements were performed in urban and suburban areas in a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of Gaussian additive noise was employed to achieve the signal to noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal to noise ratio thresholds for correct signal decoding.

  19. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated for a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has superior linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due

  20. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  1. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to record information and time data, affecting the process itself and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but also to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  2. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to record information and time data, affecting the process itself and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but also to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  3. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  4. Phase resolved digital signal processing in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    de Boer, Johannes F.; Tripathi, Renu; Park, Boris H.; Nassif, Nader

    2002-06-01

    We present phase resolved digital signal processing techniques for Optical Coherence Tomography to correct for the non Gaussian shape of source spectra and for Group Delay Dispersion (GDD). A broadband source centered at 820 nm was synthesized by combining the spectra of two superluminescent diodes to improve axial image resolution in an optical coherence tomography (OCT) system. Spectral shaping was used to reduce the side lobes (ringing) in the axial point spread function due to the non-Gaussian shape of the spectra. Images of onion cells taken with each individual source and the combined sources, respectively, show the improved resolution and quality enhancement in a turbid biological sample. An OCT system operating at 1310 nm was used to demonstrate that the broadening effect of group delay dispersion (GDD) on the coherence function could be eliminated completely by introducing a quadratic phase shift in the Fourier domain of the interferometric signal. The technique is demonstrated by images of human skin grafts with group delay dispersion mismatch between sample and reference arm before and after digital processing.
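
    A minimal sketch of the quadratic-phase correction described above, under assumed inputs: the sampling rate fs, carrier frequency f0, and GDD coefficient phi2 are illustrative placeholders, not values from the paper.

        import numpy as np

        def correct_gdd(interferogram, fs, f0, phi2):
            """Cancel group-delay-dispersion broadening by applying a quadratic
            phase shift in the Fourier domain of the interferometric signal."""
            n = len(interferogram)
            spectrum = np.fft.fft(interferogram)
            freq = np.fft.fftfreq(n, d=1.0 / fs)            # Hz
            omega = 2 * np.pi * (freq - f0)                 # angular offset from the carrier
            # GDD contributes a spectral phase of (phi2 / 2) * omega**2;
            # multiplying by the conjugate phase removes the broadening.
            spectrum *= np.exp(-1j * 0.5 * phi2 * omega**2)
            return np.fft.ifft(spectrum)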

  5. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  6. The Digital Fields Board for the FIELDS instrument suite on the Solar Probe Plus mission: Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Malaspina, David M.; Ergun, Robert E.; Bolton, Mary; Kien, Mark; Summers, David; Stevens, Ken; Yehle, Alan; Karlsson, Magnus; Hoxie, Vaughn C.; Bale, Stuart D.; Goetz, Keith

    2016-06-01

    The first in situ measurements of electric and magnetic fields in the near-Sun environment (< 0.25 AU from the Sun) will be made by the FIELDS instrument suite on the Solar Probe Plus mission. The Digital Fields Board (DFB) is an electronics board within FIELDS that performs analog and digital signal processing, as well as digitization, for signals between DC and 60 kHz from five voltage sensors and four search coil magnetometer channels. These nine input signals are processed on the DFB into 26 analog data streams. A specialized application-specific integrated circuit performs analog to digital conversion on all 26 analog channels simultaneously. The DFB then processes the digital data using a field programmable gate array (FPGA), generating a variety of data products, including digitally filtered continuous waveforms, high-rate burst capture waveforms, power spectra, cross spectra, band-pass filter data, and several ancillary products. While the data products are optimized for encounter-based mission operations, they are also highly configurable, a key design aspect for a mission of exploration. This paper describes the analog and digital signal processing used to ensure that the DFB produces high-quality science data, using minimal resources, in the challenging near-Sun environment.

  7. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS database management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management, and display capabilities of the research computer system are described, showing how it provides an effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  8. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author

  9. Applying Systems Engineering Methodologies to the Creative Process

    DTIC Science & Technology

    2014-09-01

    "...the structure and nature of the creative process in terms of stages, which can be sequential or recursive, or underlying componential cognitive processes" (30). Sawyer's eight-stage creative theory was proposed and verified as the stage and componential process...

  10. Performance evaluation of image processing algorithms in digital mammography

    NASA Astrophysics Data System (ADS)

    Zanca, Federica; Van Ongeval, Chantal; Jacobs, Jurgen; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2008-03-01

    The purpose of the study is to evaluate the performance of different image processing algorithms in terms of representation of microcalcification clusters in digital mammograms. Clusters were simulated in clinical raw ("for processing") images. The entire dataset of images consisted of 200 normal mammograms, selected out of our clinical routine cases and acquired with a Siemens Novation DR system. In 100 of the normal images a total of 142 clusters were simulated; the remaining 100 normal mammograms served as true negative input cases. Both abnormal and normal images were processed with 5 commercially available processing algorithms: Siemens OpView1 and Siemens OpView2, Agfa Musica1, Sectra Mamea AB Sigmoid and IMS Raffaello Mammo 1.2. Five observers were asked to locate and score the cluster(s) in each image by means of a dedicated software tool. Observer performance was assessed using the JAFROC Figure of Merit. FROC curves, fitted using the IDCA method, have also been calculated. JAFROC analysis revealed significant differences among the image processing algorithms in the detection of microcalcification clusters (p=0.0000369). Calculated average Figures of Merit are: 0.758 for Siemens OpView2, 0.747 for IMS Processing 1.2, 0.736 for Agfa Musica1 processing, 0.706 for Sectra Mamea AB Sigmoid processing and 0.703 for Siemens OpView1. This study is a first step towards a quantitative assessment of image processing in terms of cluster detection in clinical mammograms. Although we showed a significant difference among the image processing algorithms, this method does not on its own allow for a global performance ranking of the investigated algorithms.

  11. Methodology for measurement of fault latency in a digital avionic miniprocessor

    NASA Technical Reports Server (NTRS)

    Mcgough, J. G.; Swern, F.; Bavuso, S. J.

    1981-01-01

    Investigations regarding the synthesis of a reliability assessment capability for fault-tolerant computer-based systems have been conducted for several years. In 1978 a pilot study was conducted to test the feasibility of measuring detection coverage and investigating the dynamics of fault propagation in a digital computer. A description is presented of an investigation concerned with the applicability of previous results to a real avionics processor. The obtained results show that emulation is a practicable approach to failure modes and effects analysis of a digital processor. The emulated processor running on a PDP-10 host computer is only 20,000 to 25,000 times slower than the actual processor. As a consequence, large numbers of faults can be studied at relatively little cost and in a timely manner.

  12. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  13. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  14. Digital Signal Processing and Control for the Study of Gene Networks

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  15. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  16. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested on several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
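
    A hedged sketch of intensity-based clustering segmentation in the spirit of the record above; the choice of k-means, the number of clusters, and the 3x3 opening element are illustrative assumptions, not the authors' published pipeline.

        import numpy as np
        from scipy import ndimage
        from sklearn.cluster import KMeans

        def segment_bright_structures(mammogram, n_clusters=3):
            """Cluster pixel intensities, keep the brightest cluster as a candidate
            mask for masses/microcalcifications, and clean it with a morphological opening."""
            pixels = mammogram.reshape(-1, 1).astype(float)
            labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)
            labels = labels.reshape(mammogram.shape)
            means = [mammogram[labels == k].mean() for k in range(n_clusters)]
            mask = labels == int(np.argmax(means))          # brightest cluster
            return ndimage.binary_opening(mask, structure=np.ones((3, 3)))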

  17. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  18. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  19. Solvent Substitution Methodology Using Multiattribute Utility Theory and the Analytical Hierarchical Process

    DTIC Science & Technology

    1994-09-01

    AFIT/GEE/ENS/94S-3, Wright-Patterson Air Force Base, Ohio. Thesis title: Solvent Substitution Methodology Using Multiattribute Utility Theory and the Analytical Hierarchical Process. Contents include the depot-level, field-level, and contractor substitution processes, multiattribute utility theory (MAUT), and independence...

  20. Automation of contact lens fitting evaluation by digital image processing

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Barros, Rui; Franco, Sandra B.

    1997-08-01

    Contact lens fitting evaluation is of critical importance in the contact lens prescription process. For the correction of the eye's refraction problems, the use of contact lenses is very appealing to the user. However, their prescription is far more demanding than that of eyeglasses. The fitting of a contact lens to a particular cornea must be carefully assessed in order to reduce possible physical discomfort for the user or even medical complications. The traditional way of easily checking the fitting of a contact lens is to perform a fluorescein test. The simple visual evaluation of the 'smoothness' of the color/brightness distribution of the fluorescence at the contact lens location gives the optometrist an idea of the fitting's quality. We suggest automating the process simply by substituting a CCD camera for the optometrist's eye and using appropriate, simple image processing techniques. The setup and the digitization and processing routines are described in this communication. The processed images may then be directly analyzed by the optometrist in a faster, easier and more efficient way. However, it is also possible to perform an automated fitting evaluation by working out the information given by the image's intensity histograms for the green and blue RGB channels.
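
    A minimal sketch of the histogram step mentioned above: computing intensity histograms of the green and blue channels of a fluorescein image. The bin count is an arbitrary assumption; the authors' smoothness criterion is not reproduced here.

        import numpy as np

        def channel_histograms(rgb_image, bins=64):
            """Return (histogram, bin_edges) pairs for the green and blue channels."""
            green = rgb_image[..., 1].ravel()
            blue = rgb_image[..., 2].ravel()
            return np.histogram(green, bins=bins), np.histogram(blue, bins=bins)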

  1. Automation of contact lens' fitting evaluation by digital image processing

    NASA Astrophysics Data System (ADS)

    da Cunha Martins Costa, M.; Barros, Rui; Franco, Sandra B.

    1997-10-01

    Contact lens fitting evaluation is of critical importance in the contact lens prescription process. For the correction of the eye's refraction problems, the use of contact lenses is very appealing to the user. However, their prescription is far more demanding than that of eyeglasses. The fitting of a contact lens to a particular cornea must be carefully assessed in order to reduce possible physical discomfort for the user or even medical complications. The traditional way of easily checking the fitting of a contact lens is to perform a fluorescein test. The simple visual evaluation of the 'smoothness' of the color/brightness distribution of the fluorescence at the contact lens location gives the optometrist an idea of the fitting's quality. We suggest automating the process simply by substituting a CCD camera for the optometrist's eye and using appropriate, simple image processing techniques. The setup and the digitization and processing routines are described in this communication. The processed images may then be directly analyzed by the optometrist in a faster, easier and more efficient way. However, it is also possible to perform an automated fitting evaluation by working out the information given by the image's intensity histograms for the green and blue RGB channels.

  2. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
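
    A hedged sketch of the generic workflow (the report notes that any image processing and analysis software can be used): threshold a grayscale thin-section image, label the grains, and tabulate simple per-grain statistics with scikit-image. The thresholding choice and minimum grain size are assumptions.

        from skimage import filters, measure, morphology

        def grain_statistics(gray_image, min_area=50):
            """Return (area, major_axis_length, minor_axis_length) for each labeled grain."""
            mask = gray_image > filters.threshold_otsu(gray_image)
            mask = morphology.remove_small_objects(mask, min_size=min_area)
            labels = measure.label(mask)
            return [(r.area, r.major_axis_length, r.minor_axis_length)
                    for r in measure.regionprops(labels)]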

  3. [Photodensitometry: microdensitometry (MD): digital image processing method (DIP)].

    PubMed

    Ohama, K; Sanada, M; Nakagawa, H

    1994-09-01

    The principles of microdensitometry (MD) and the digital image processing method (DIP), as well as the application of these methods to measure bone mineral density in clinical practice, are described in this report. MD and DIP assess bone mineral content and bone mineral density by analyzing the relative contrast of the second metacarpal on an X-ray image. However, the parameters obtained by these methods have been reported to be closely related to lumbar vertebral bone mineral density and whole-body bone mineral content as measured by dual energy X-ray absorptiometry (DXA). Being easy to use, MD and DIP are adequate for the screening of osteoporosis. Once any reduction in bone mineral content or bone mineral density is shown by MD or DIP, it is advisable to measure the bone mineral density of the vertebrae and femoral neck by DXA.

  4. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.
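
    A minimal sketch of a heart/lung rate computed as a ratio of mean SPECT counts in two regions of interest; the ROI masks are assumed inputs, and the paper's exact ROI definition is not reproduced here.

        def heart_lung_rate(spect_volume, heart_mask, lung_mask):
            """Ratio of mean counts inside the heart ROI to mean counts inside the lung ROI."""
            return spect_volume[heart_mask].mean() / spect_volume[lung_mask].mean()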

  5. Principles of image processing in digital chest radiography.

    PubMed

    Prokop, Mathias; Neitzel, Ulrich; Schaefer-Prokop, Cornelia

    2003-07-01

    Image processing has a major impact on image quality and diagnostic performance of digital chest radiographs. Goals of processing are to reduce the dynamic range of the image data to capture the full range of attenuation differences between lungs and mediastinum, to improve the modulation transfer function to optimize spatial resolution, to enhance structural contrast, and to suppress image noise. Image processing comprises look-up table operations and spatial filtering. Look-up table operations allow for automated signal normalization and arbitrary choice of image gradation. The simplest and still widely applied spatial filtering algorithms are based on unsharp masking. Various modifications were introduced for dynamic range reduction and MTF restoration. More elaborate and more effective are multi-scale frequency processing algorithms. They are based on the subdivision of an image into multiple frequency bands according to its structural composition. This allows for a wide range of image manipulations including a size-independent enhancement of low-contrast structures. Principles of the various algorithms are explained and their impact on image appearance is illustrated by clinical examples. Optimum and sub-optimum parameter settings are discussed and pitfalls are explained.
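
    A minimal sketch of basic unsharp masking, the simplest of the spatial filtering algorithms mentioned above; the Gaussian width and gain are illustrative choices, not values from the article.

        from scipy import ndimage

        def unsharp_mask(image, sigma=15.0, gain=1.5):
            """Enhance structural contrast by adding back a scaled high-pass residual."""
            low_pass = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
            return low_pass + gain * (image.astype(float) - low_pass)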

  6. Edge detection - Image-plane versus digital processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
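
    A hedged sketch of the digital baseline discussed above: Laplacian-of-Gaussian filtering followed by a simple zero-crossing test. The sigma value and the neighbour test are illustrative, not the article's implementation.

        import numpy as np
        from scipy import ndimage

        def log_zero_crossings(image, sigma=2.0):
            """Boolean edge map of zero crossings of the Laplacian-of-Gaussian response."""
            response = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
            s = np.sign(response)
            zc = np.zeros(s.shape, dtype=bool)
            zc[:-1, :] |= (s[:-1, :] * s[1:, :]) < 0    # vertical sign change
            zc[:, :-1] |= (s[:, :-1] * s[:, 1:]) < 0    # horizontal sign change
            return zc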

  7. Digital Image Processing for Noise Reduction in Medical Ultrasonics

    NASA Astrophysics Data System (ADS)

    Loupas, Thanasis

    Available from UMI in association with The British Library. Requires signed TDF. The purpose of this project was to investigate the application of digital image processing techniques as a means of reducing noise in medical ultrasonic imaging. Ultrasonic images suffer primarily from a type of acoustic noise, known as speckle, which is generally regarded as a major source of image quality degradation. The origin of speckle, its statistical properties as well as methods suggested to eliminate this artifact were reviewed. A simple model which can characterize the statistics of speckle on displays was also developed. A large number of digital noise reduction techniques was investigated. These include frame averaging techniques performed by commercially available devices and spatial filters implemented in software. Among the latter, some filters have been proposed in the scientific literature for ultrasonic, laser and microwave speckle or general noise suppression and the rest are original, developed specifically to suppress ultrasonic speckle. Particular emphasis was placed on adaptive techniques which adjust the processing performed at each point according to the local image content. In this way, they manage to suppress speckle with negligible loss of genuine image detail. Apart from preserving the diagnostically significant features of a scan another requirement a technique must satisfy before it is accepted in routine clinical practice is real-time operation. A spatial filter capable of satisfying both these requirements was designed and built in hardware using low-cost and readily available components. The possibility of incorporating all the necessary filter circuitry into a single VLSI chip was also investigated. In order to establish the effectiveness and usefulness of speckle suppression, a representative sample from the techniques examined here was applied to a large number of abdominal scans and their effect on image quality was evaluated. Finally, further
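
    A hedged sketch of one classic adaptive speckle filter (Lee's local-statistics filter), shown only to illustrate adjusting the smoothing to local image content; it is not claimed to be the filter developed in the thesis, and the window size and noise variance are assumptions.

        from scipy import ndimage

        def lee_filter(image, window=7, noise_var=0.05):
            """Smooth homogeneous regions strongly while preserving detail where local variance is high."""
            img = image.astype(float)
            local_mean = ndimage.uniform_filter(img, size=window)
            local_sq_mean = ndimage.uniform_filter(img ** 2, size=window)
            local_var = local_sq_mean - local_mean ** 2
            weight = local_var / (local_var + noise_var)   # ~1 near edges, ~0 in flat areas
            return local_mean + weight * (img - local_mean)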

  8. A New Digital Signal Processing Method for Spectrum Interference Monitoring

    NASA Astrophysics Data System (ADS)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.

    2011-01-01

    The frequency spectrum is a limited shared resource, nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay governments for the right to use a limited portion of the spectrum, and consequently they expect assurance that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet electromagnetic compatibility regulations. Therefore, the competent authorities are called on to control access to the spectrum by adopting suitable management and monitoring policies, and manufacturers have to periodically verify the correct working of their apparatuses. Several measurement solutions are present on the market. They generally refer to real-time spectrum analyzers and measurement receivers. Both are characterized by good metrological accuracy but have costs, dimensions and weights that make use in the field impractical. The paper presents a first step in realizing a digital signal processing based measurement instrument able to suitably address the above-mentioned needs. In particular, attention has been given to the DSP-based measurement section of the instrument. To these aims, an innovative measurement method for spectrum monitoring and management is proposed in this paper. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main issues are in particular pursued: (i) measurement performance comparable to that exhibited by other methods proposed in the literature; (ii) fast measurement time; (iii) easy implementation on cost-effective measurement hardware.
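
    A hedged sketch of a standard sample-by-sample building block (the Goertzel recursion) for estimating power at a single frequency bin; it illustrates the kind of streaming processing described above but is not the authors' proposed method.

        import numpy as np

        def goertzel_power(samples, bin_freq, fs):
            """Signal power at bin_freq (Hz), accumulated one sample at a time."""
            w = 2.0 * np.pi * bin_freq / fs
            coeff = 2.0 * np.cos(w)
            s_prev, s_prev2 = 0.0, 0.0
            for x in samples:                              # sample-by-sample update
                s = x + coeff * s_prev - s_prev2
                s_prev2, s_prev = s_prev, s
            return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2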

  9. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  10. A novel digital pulse processing architecture for nuclear instrumentation

    SciTech Connect

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole; Paindavoine, Michel

    2015-07-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research, and new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method while the literature proposes other algorithms based on frequency domain or wavelet theory which show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian characteristic of the signal, composed of randomly arriving pulses, which requires current architectures to work in data flow. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to provide the performance necessary to deal with dead time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable Digital Pulse Processing (DPP) architecture in a high-level language such as C or C++ which can reduce dead time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet the

  11. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For these measurements, tests are conducted in the laboratory, which take many hours to give the final result. Digital image processing and analysis offers a better alternative not only to monitor and characterize the current state of activated sludge but also to predict the future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image analysis based morphological parameters and the correlation of these parameters with regard to monitoring and prediction of activated sludge are discussed. Hence it is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.

  12. Social Information Processing, Emotions, and Aggression: Conceptual and Methodological Contributions of the Special Section Articles

    ERIC Educational Resources Information Center

    Arsenio, William F.

    2010-01-01

    This discussion summarizes some of the key conceptual and methodological contributions of the four articles in this special section on social information processing (SIP) and aggression. One major contribution involves the new methodological tools these studies provide for future researchers. Eye-tracking and mood induction techniques will make it…

  13. Digital signal processing techniques for coherent optical communication

    NASA Astrophysics Data System (ADS)

    Goldfarb, Gilad

    Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection which rely on the preservation of full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral-efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. Analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise for high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented. Interaction between fiber dispersion and nonlinearity remains the last major challenge

  14. Array Signal Processing for Source Localization and Digital Communication.

    NASA Astrophysics Data System (ADS)

    Song, Bong-Gee

    Array antennas are used in several areas such as sonar and digital communication. Although array patterns may be different depending on applications, they are used with a view to collecting more data and obtaining better results. We first consider a passive sonar system in random environments where the index of refraction is random. While source localization problems for deterministic environments are well studied, they require accurate propagation models which are not available in random environments. We extend the localization problems to random environments. It has been shown that methods developed for deterministic environments fail in random environments because of the stochastic nature of acoustic propagation. Therefore, we model observations as random, and use a statistical signal processing technique combined with physics. The statistical signal model is provided by physics either empirically or theoretically. The performance of the technique relies on the accuracy of the statistical models. We have applied the maximum likelihood method to angle of arrival estimation and range estimation problems. The Cramer-Rao lower bounds have also been derived to predict the estimation performance. Next, we use the array antennas for diversity combining equalization in digital communications. Spatial diversity equalization is used in two ways: to improve bit error rate or to improve the transmission rate. This is feasible by using more antennas at the receiver end. We apply Helstrom's saddle point integration method to multi-input multi-output communication systems and show that a factor of 3-4 of channel reuse is possible. It is also shown that the advantage is because of the diversity itself, not because of more taps. We further improve the equalization performance by joint pre- and postfilter design. Two different methods have been proposed according to the prefilter type. Although the mean square error is not easy to minimize, appropriate methods have been adopted and show

  15. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers, applicable to digitized multispectral aerial photography, are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.
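
    A hedged sketch in the spirit of item (3) above: flagging stressed foliage from near-infrared and red reflectance. The use of a simple NIR/red ratio and the threshold value are assumptions, not the authors' exact criterion.

        import numpy as np

        def flag_declining_foliage(nir_band, red_band, ratio_threshold=1.5):
            """Boolean mask where a low NIR/red ratio suggests unhealthy foliage."""
            ratio = nir_band.astype(float) / np.maximum(red_band.astype(float), 1e-6)
            return ratio < ratio_threshold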

  16. Methodologies for automating the collection and processing of GPS-GIS information for transportation systems

    NASA Astrophysics Data System (ADS)

    Zhao, Bingyan

    All transportation departments have large amounts of data and information that are needed for planning and operation of their systems. This information can be textual, graphical or spatial in nature. Spatial information is generally in the form of maps and these maps are increasingly being stored and processed as digital GIS files that can be linked to other types of information generally referred to as attribute information. In the NYSDOT database, there are many kinds of features for which information must be maintained. For example, there are about 22,500 bridges within the New York State road systems. The current spatial location for these bridges may not have the level of accuracy that would be desired by today's standards and that can be achieved with new spatial measuring techniques. Although the updating of bridge locations and the location of other features can be done using new techniques such as GPS, if this is done manually it presents a forbidding task. The main objective of this study is to find a way to automatically collect feature location data using GPS equipment and to automate the transfer of this information into archival databases. Among the objectives of this dissertation are: how to automatically download information from the DOT database; how to collect field data following a uniform procedure; how to convert the surveying results into Arc/View shape files and how to update the DOT feature location map information using field data. The end goal is to develop feasible methodologies to automate updating of mapping information using GPS by creating a systems design for the process and to create the scripts and programming needed to make this system work. This has been accomplished and is demonstrated in a sample program. Details of the Automated Acquisition System are described in this dissertation.

  17. Development of next generation digital flat panel catheterization system: design principles and validation methodology

    NASA Astrophysics Data System (ADS)

    Belanger, B.; Betraoui, F.; Dhawale, P.; Gopinath, P.; Tegzes, Pal; Vagvolgyi, B.

    2006-03-01

    The design principles that drove the development of a new cardiovascular x-ray digital flat panel (DFP) detector system are presented, followed by assessments of imaging and dose performance achieved relative to other state-of-the-art FPD systems. The new system (GE Innova 2100 IQ™) incorporates a new detector with substantially improved DQE at fluoroscopic (73%@1μR) and record (79%@114μR) doses, an x-ray tube with higher continuous fluoro power (3.2kW), a collimator with a wide range of copper spectral filtration (up to 0.9mm), and an improved automatic x-ray exposure management system. The performance of this new system was compared to that of the previous generation GE product (Innova 2000) and to state-of-the-art cardiac digital x-ray flat panel systems from two other major manufacturers. Performance was assessed with the industry-standard cardiac x-ray NEMA/SCA&I phantom, and a new moving coronary artery stent (MCAS) phantom, designed to simulate cardiac clinical imaging conditions, composed of an anthropomorphic chest section with stents moving in a manner simulating normal coronary arteries. The NEMA/SCA&I phantom results showed the Innova 2100 IQ to exceed or equal the Innova 2000 in all of the performance categories, while operating at 28% lower dose on average, and to exceed the other DFP systems in most of the performance categories. The MCAS phantom tests showed the Innova 2100 IQ to be significantly better (p << 0.05) than the Innova 2000, and significantly better than the other DFP systems in most cases at comparable or lower doses, thereby verifying excellent performance against design goals.

  18. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Astrophysics Data System (ADS)

    Tian, J.; Reisse, R.; Gazarik, M.

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as
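
    A hedged sketch of the filtering and decimation stage: an equiripple (Parks-McClellan) linear-phase FIR followed by downsampling, via scipy.signal.remez. The instrument uses a complex filter; this real-coefficient low-pass, its band edges, and the tap count are simplified stand-ins.

        from scipy import signal

        def filter_and_decimate(interferogram, decimation=4, numtaps=64):
            """Low-pass filter with a linear-phase equiripple FIR, then keep every
            decimation-th sample."""
            cutoff = 0.5 / decimation                      # normalized (Nyquist = 0.5 cycles/sample)
            taps = signal.remez(numtaps, [0, 0.8 * cutoff, 1.2 * cutoff, 0.5], [1, 0])
            filtered = signal.lfilter(taps, 1.0, interferogram)
            return filtered[::decimation]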

  19. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as

  20. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  1. Guidelines for the structure of methodological processes in visual anthropology.

    PubMed

    Svilicić, Niksa

    2011-12-01

    The vast majority of visual anthropologists of the 20th century were more focused on the general phenomenology of visual anthropology, i.e. the content aspect of their works and their impact on scientific knowledge, leaving aside the style of directing and the practical principles and processes of creating an anthropological film. So far, judging by the available literature, there are no strict guidelines for directorial procedures, nor a precise definition of the methodical processes in the production of an anthropological film. Consequently, the goal of this study is to determine the structure and forms of methodical processes as well as to define the advantages and disadvantages of each of them. By using adequate guidelines, the researcher, i.e. the author of the anthropological film, can optimize the production and post-production processes as early as the preparation (preproduction) period of working on the film, through the technical choice of the approach to the production (proactive/reactive/subjective/objective...) and by defining the style of directing. In other words, this ultimately means more relevant scientific research results with less time and resources.

  2. Process improvements using the NCMS electrical testing methodology

    SciTech Connect

    Goldammer, S.E.; Tucker, D.R.

    1997-06-01

    The conductor analysis electrical testing method uses the artwork patterns and equipment developed by the National Center for Manufacturing Sciences (NCMS) Printed Wiring Board Imaging Team. These patterns and electrical test methods are used to evaluate new or improve existing printed wiring board processes.

  3. Influence of Digital Camera Errors on the Photogrammetric Image Processing

    NASA Astrophysics Data System (ADS)

    Sužiedelytė-Visockienė, Jūratė; Bručas, Domantas

    2009-01-01

    The paper deals with the calibration of the digital camera Canon EOS 350D, often used for the photogrammetric 3D digitalisation and measurement of industrial and construction site objects. During the calibration, data on the optical and electronic parameters influencing the distortion of images, such as the correction of the principal point, the focal length of the objective, and radial symmetric and asymmetric distortions, were obtained. The calibration was performed by means of the Tcc software implementing the Chebyshev polynomial and using a special test field with marks whose coordinates are precisely known. The main task of the research was to determine how the camera calibration parameters influence the processing of images, i.e. the creation of the geometric model, the results of the triangulation calculations and the stereo-digitalisation. Two photogrammetric projects were created for this task. In the first project uncorrected images were used; in the second, images corrected for the optical errors of the camera obtained during the calibration were used. The results of the image processing analysis are shown in figures and tables. The conclusions are given.
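
    A hedged sketch of correcting radially symmetric distortion about the principal point with a Brown-type polynomial; the coefficients k1 and k2 stand in for whatever values a calibration such as the one described above would yield, and this forward-polynomial correction is a common approximation, not the Tcc implementation.

        def undistort_points(x, y, cx, cy, k1, k2):
            """Apply a radial correction factor about the principal point (cx, cy)."""
            dx, dy = x - cx, y - cy
            r2 = dx ** 2 + dy ** 2
            factor = 1.0 + k1 * r2 + k2 * r2 ** 2
            return cx + dx * factor, cy + dy * factor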

  4. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  5. Pollution balance. A new methodology for minimizing waste production in manufacturing processes

    SciTech Connect

    Hilaly, A.K.; Sikdar, S.K.

    1994-11-01

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A 'pollution index,' defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative measure of waste generation in a process. A waste reduction algorithm also has been developed from the pollution balance equation. This paper explains this methodology and demonstrates the applicability of the method by a case study. 8 refs., 7 figs.
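
    A trivial sketch of the index as defined above (mass of waste produced per unit mass of product):

        def pollution_index(waste_mass, product_mass):
            """Pollution index: mass of waste produced per unit mass of product."""
            return waste_mass / product_mass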

  6. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  7. How processing digital elevation models can affect simulated water budgets.

    PubMed

    Kuniansky, Eve L; Lowery, Mark A; Campbell, Bruce G

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
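
    A minimal sketch of the two DEM-processing choices contrasted above, assuming dem_block holds the DEM samples that fall inside one model cell.

        def cell_elevation(dem_block, method="mean"):
            """Return either the mean DEM value in the cell or the sample nearest the centroid."""
            if method == "mean":
                return dem_block.mean()
            center = tuple(s // 2 for s in dem_block.shape)   # sample nearest the cell centroid
            return dem_block[center]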

  8. Digital Signal Processing for SiPM Timing Resolution

    NASA Astrophysics Data System (ADS)

    Philippov, D. E.; Popova, E. V.; Belyaev, V. N.; Buzhan, P. Z.; Stifutkin, A. A.; Vinogradov, S. L.

    2017-01-01

    Digital signal processing (DSP) is an emerging trend in experimental studies and applications of various detectors including SiPMs. In particular, DSP is recognized as a promising approach to improve the coincidence timing resolution (CTR) of fast SiPM-based scintillation detectors. Single photon timing resolution (SPTR) is one of the key parameters affecting CTR, especially important in cases where CTR is approaching its ultimate limits as, for example, highly demanded in Time-of-Flight PET. To study SiPM timing resolution, we developed special DSP software and applied it to both SPTR and CTR measurements. These measurements were carried out using 3x3 mm2 KETEK SiPM samples of timing-optimized and standard designs, with a 405 nm picosecond laser for SPTR and with 3x3x5 mm3 LYSO crystals and a 511 keV Na-22 source for CTR. Results of the study are useful for further improvements of DSP algorithms and SiPM designs for fast timing.

  9. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses a pattern recognition problem in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.

  10. Fully Digital: Policy and Process Implications for the AAS

    NASA Astrophysics Data System (ADS)

    Biemesderfer, Chris

    Over the past two decades, every scholarly publisher has migrated at least the mechanical aspects of their journal publishing so that they utilize digital means. The academy was comfortable with that for a while, but publishers are under increasing pressure to adapt further. At the American Astronomical Society (AAS), we think that means bringing our publishing program to the point of being fully digital, by establishing procedures and policies that treat the digital objects of publication as primary. We have always thought about our electronic journals as databases of digital articles, from which we can publish and syndicate articles one at a time, and we must now put flesh on those bones by developing practices that are consistent with the realities of article-at-a-time publication online. As a learned society that holds the long-term rights to the literature, we have actively taken responsibility for the preservation of the digital assets that constitute our journals, and in so doing we have not forsaken the legacy pre-digital assets. All of us who serve as the long-term stewards of scholarship must begin to evolve into fully digital publishers.

  11. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

    This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the depiction of conservation processes. In particular, the specific steps undertaken for developing and applying the CIDOC-CRM extensions for defining the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the use of the project's CIDOC-CRM extensions to standardize the documentation of the NDT&E methods, their output data, and the conservation work itself is further reported in Kouis et al. (2).

  12. Social work practice in the digital age: therapeutic e-mail as a direct practice methodology.

    PubMed

    Mattison, Marian

    2012-07-01

    The author addresses the risks and benefits of incorporating therapeutic e-mail communication into clinical social work practice. Consumer demand for online clinical services is growing faster than the professional response. E-mail, when used as an adjunct to traditional meetings with clients, offers distinct advantages and risks. Benefits include the potential to reach clients in geographically remote and underserved communities, enhancing and extending the therapeutic relationship and improving treatment outcomes. Risks include threats to client confidentiality and privacy, liability coverage for practitioners, licensing jurisdiction, and the lack of competency standards for delivering e-mail interventions. Currently, the social work profession does not have adequate instructive guidelines and best-practice standards for using e-mail as a direct practice methodology. Practitioners need (formal) academic training in the techniques connected to e-mail exchanges with clients. The author describes the ethical and legal risks for practitioners using therapeutic e-mail with clients and identifies recommendations for establishing best-practice standards.

  13. Optical Digital Parallel Truth-Table Look-Up Processing

    NASA Astrophysics Data System (ADS)

    Mirsalehi, Mir Mojtaba

    During the last decade, a number of optical digital processors have been proposed that combine the parallelism and speed of optics with the accuracy and flexibility of a digital representation. In this thesis, two types of such processors (an EXCLUSIVE OR-based processor and a NAND-based processor) that function as content-addressable memories (CAMs) are analyzed. The main factors that affect the performance of the EXCLUSIVE OR-based processor are found to be the Gaussian nature of the reference beam and the finite square aperture of the crystal. A quasi-one-dimensional model is developed to analyze the effect of the Gaussian reference beam, and a circular aperture is used to increase the dynamic range in the output power. The main factors that affect the performance of the NAND-based processor are found to be the variations in the amplitudes and the relative phase of the laser beams during the recording process. A mathematical model is developed for analyzing the probability of error in the output of the processor. Using this model, the performance of the processor for some practical cases is analyzed. Techniques that have been previously used to reduce the number of reference patterns in a CAM include: using the residue number system and applying logical minimization methods. In the present work, these and additional techniques are investigated. A systematic procedure is developed for selecting the optimum set of moduli. The effect of coding is investigated and it is shown that multi-level coding, when used in conjunction with logical minimization techniques, significantly reduces the number of reference patterns. The Quine-McCluskey method is extended to multiple-valued logic and a computer program based on this extension is used for logical minimization. The results show that for moduli expressible as p^n, where p is a prime number and n is an integer greater than one, p-level coding provides significant reduction. The NAND-based processor is modified for
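
    A brief sketch of the residue number system encoding mentioned above (the moduli set is an arbitrary example, not one selected by the thesis procedure):

```python
# Illustrative sketch of residue number system (RNS) encoding, one technique
# used above to reduce the number of reference patterns in a CAM.
from math import prod

MODULI = (7, 8, 9, 11)              # pairwise coprime; dynamic range = 5544

def to_rns(x, moduli=MODULI):
    return tuple(x % m for m in moduli)

def from_rns(residues, moduli=MODULI):
    # Chinese Remainder Theorem reconstruction
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

assert from_rns(to_rns(1234)) == 1234
```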

  14. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  15. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper addresses the detection and removal of cracks on digitized paintings. Cracks are detected by thresholding. Afterwards, thin dark brush strokes that have been misidentified as cracks are removed using a median radial basis function neural network operating on hue and saturation data, together with a semi-automatic procedure based on region growing. Finally, the cracks are filled using a Wiener filter. The method identifies and removes most of the cracks on digitized paintings, with a reported improvement of about 90%. The approach can be applied not only to digitized paintings but also to medical and bitmap images, and it is implemented in MATLAB.
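
    A minimal sketch of the thresholding idea (assuming a grayscale input in [0, 1]; this is not the paper's MATLAB implementation, and the structure size and threshold are illustrative): thin dark cracks are emphasized with a morphological black top-hat, thresholded, and then cleaned with a median filter.

```python
# Minimal crack-detection sketch: a black top-hat (closing minus original)
# highlights thin dark features, which are then thresholded and cleaned.
import numpy as np
from scipy import ndimage

def detect_cracks(gray, structure_size=5, threshold=0.1):
    """gray: 2-D float array in [0, 1]. Returns a boolean crack mask."""
    closed = ndimage.grey_closing(gray, size=structure_size)
    tophat = closed - gray                      # bright where thin dark cracks were
    mask = tophat > threshold
    # median filter removes isolated false detections
    return ndimage.median_filter(mask.astype(np.uint8), size=3).astype(bool)
```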

  16. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method to collect data for Earth science research. Huge amount of remote sensing data, most of them in the image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amount of multi-source images for solving real world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles for effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent development in geospatial Web processing service systems, which make massive data, computing powers, and processing capabilities to average Internet users anywhere in the world, promises the removal of the obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high performance cluster server with large disk storage and fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  17. ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION

    EPA Science Inventory

    The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...

  18. Optical processing architecture and its potential application for digital and analog radiography.

    PubMed

    Liu, H; Xu, J; Fajardo, L L

    1999-04-01

    In this report we introduce the fundamental architectures and the potential applications of optical processing techniques in medical imaging. Three basic optical processing architectures were investigated for digital and analog radiography. The processors consist of a module that converts either the analog or the digital radiograph into a coherent light distribution; a coherent optical processing architecture that performs various mathematical operations; a programmable digital-optical interface and other accessories. Optical frequency filters were implemented for mammographic and other clinical feature enhancement. In medical image processing, digital computers offer the advantages of programmability and flexibility. In contrast, optical processors perform parallel image processing with high speed. Optical processors also offer analog nature, compact size, and cost effectiveness. With technical advances of digital-optical interface devices, the medical image processor, in the foreseeable future, may be a hybrid device, namely, a programmable optical architecture.

  19. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  20. Modeling the simulation execution process with digital objects

    NASA Astrophysics Data System (ADS)

    Cubert, Robert M.; Fishwick, Paul A.

    1999-06-01

    Object Oriented Physical Modeling (OOPM), formerly known as MOOSE, and its implementation of behavior multimodels provide an ability to manage arbitrarily complex patterns of behavioral abstraction in web-friendly simulation modeling. In an OOPM model, one object stands as surrogate for another object, and these surrogates cognitively map to the real world. This "physical object" principle mitigates the impact of incomplete knowledge and ambiguity because its real-world metaphors enable model authors to draw on intuition, facilitating reuse and integration, as well as consistency in collaborative efforts. A 3D interface for modeling and simulation visualization, under construction to augment the existing 2D GUI, obeys the physical object principle, providing a means to create, change, reuse, and integrate digital worlds made of digital objects. The implementation includes a Distributed Simulation Executive, a Digital object MultiModel Language, a Digital Object Warehouse, and a multimodel Translator. This approach is powerful and its capabilities have steadily grown; however, it has lacked a formal basis which we now provide: we define multimodels, represent digital objects as multimodels, transform multimodels to simulations, demonstrate the correctness of the execution sequence of the simulations, and show closure under coupling of digital objects. These theoretical results complement and enhance the practical aspects of physical multimodeling.

  1. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for the determination of the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample and hold circuit, an analog to digital converter, a digital filter, and a Fast Fourier Transform. The system is interfaced to a 16 bit microprocessor. The microprocessor is programmed to control the complete digital signal processing. The digital filtering and Fast Fourier Transform functions are implemented by an S2815 digital filter/utility peripheral chip and an S2814A Fast Fourier Transform chip. The S2815 initially implements a low-pass Butterworth filter, with later expansion to synthesize complete filter circuits (bandpass and highpass).
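
    A software sketch of the two stages described above, a low-pass Butterworth filter followed by an FFT of the echo signal (the sampling rate, cutoff frequency, filter order, and test signal are assumptions; the hardware chips themselves are not modeled):

```python
# Low-pass Butterworth filtering followed by an FFT to obtain the spectral
# distribution of a (synthetic) echo signal.
import numpy as np
from scipy import signal

fs = 10_000.0                                            # sampling rate, Hz (assumed)
b, a = signal.butter(4, 1_000.0, btype="low", fs=fs)     # 4th-order low-pass at 1 kHz

t = np.arange(0, 0.1, 1 / fs)
echo = np.sin(2 * np.pi * 400 * t) + 0.5 * np.sin(2 * np.pi * 3_000 * t)

filtered = signal.lfilter(b, a, echo)
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(filtered.size, d=1 / fs)
print(freqs[np.argmax(spectrum)])                        # dominant frequency (~400 Hz)
```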

  2. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    Coriolis mass flowmeters (CMF) often suffer from two-phase flow, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for CMF are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital to analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions.
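
    The sketch below illustrates the zero-crossing idea for estimating the drive frequency and the phase difference between the two sensor signals; it uses simple linear interpolation between samples rather than the Lagrange interpolation of the paper, and the drive frequency and phase lag are assumed test values.

```python
# Zero-crossing based estimation of signal frequency and of the phase
# difference between two sensor outputs (simplified: linear interpolation).
import numpy as np

def zero_crossings(x, fs):
    """Return interpolated times of upward zero crossings of x sampled at fs."""
    idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
    frac = -x[idx] / (x[idx + 1] - x[idx])     # interpolate between samples
    return (idx + frac) / fs

fs = 50_000.0
t = np.arange(0, 0.2, 1 / fs)
f0, dphi = 100.0, 0.02                          # assumed drive frequency and phase lag
s1 = np.sin(2 * np.pi * f0 * t)
s2 = np.sin(2 * np.pi * f0 * t - dphi)

z1, z2 = zero_crossings(s1, fs), zero_crossings(s2, fs)
freq = 1.0 / np.mean(np.diff(z1))               # estimated frequency
# pair each crossing of s1 with the nearest crossing of s2
dt = np.array([z2[np.argmin(np.abs(z2 - z))] - z for z in z1])
phase_diff = 2 * np.pi * freq * np.mean(dt)
print(freq, phase_diff)                         # ~100 Hz, ~0.02 rad
```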

  3. Digital mapping of side-scan sonar data with the Woods Hole Image Processing System software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high and low resolution sidescan sonar data. In the past, processing and digital mosaicking have been accomplished with a dedicated, shore-based computer system. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for pre-processing sidescan sonar data. To extend the capabilities of the UNIX-based programs, digital mapping techniques have been developed. This report describes the initial development of an automated digital mapping procedure. Included is a description of the programs and steps required to complete the digital mosaicking on a UNIX-based computer system, and a comparison of techniques that the user may wish to select.

  4. Digital Subtraction Fluoroscopic System With Tandem Video Processing Units

    NASA Astrophysics Data System (ADS)

    Gould, Robert G.; Lipton, Martin J.; Mengers, Paul; Dahlberg, Roger

    1981-07-01

    A real-time digital fluoroscopic system utilizing two video processing units (Quantex) in tandem to produce continuous subtraction images of peripheral and internal vessels following intravenous contrast media injection has been investigated. The first processor subtracts a mask image, consisting of an exponentially weighted moving average of N1 frames (N1 = 2^k, where k = 0-7), from each incoming video frame, divides by N1, and outputs the resulting difference image to the second processor. The second unit continuously averages N2 incoming frames (N2 = 2^k) and outputs to a video monitor and analog disc recorder. The contrast of the subtracted images can be manipulated by changing gain or by a non-linear output transform. After initial equipment adjustments, a subtraction sequence can be produced without operator interaction with the processors. Alternatively, the operator can freeze the mask and/or the subtracted output image at any time during the sequence. Raw data are preserved on a wide band video tape recorder, permitting retrospective viewing of an injection sequence with different processor settings. The advantage of the tandem arrangement is that it has great flexibility in varying the duration and the time of both the mask and injection images, thereby minimizing problems of registration between them. In addition, image noise is reduced by compiling video frames rather than by using a large radiation dose for a single frame, which would require a wide dynamic range video camera not commonly available in diagnostic x-ray equipment. High quality subtraction images of arteries have been obtained in 15 anesthetized dogs using relatively low exposure rates (10-12 μR/video frame), modest volumes of contrast medium (0.5-1 ml/kg), and low injection flow rates (6-10 ml/sec). The results achieved so far suggest that this system has direct clinical applications.
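
    A sketch of the tandem processing chain in software (the frame counts N1 and N2 are illustrative; the original system performs these operations in dedicated video hardware): the first stage maintains a recursive exponentially weighted moving-average mask and subtracts it from each frame, and the second stage averages groups of difference frames.

```python
# Tandem subtraction sketch: EWMA mask subtraction followed by frame averaging.
import numpy as np

def tandem_subtraction(frames, n1=8, n2=4):
    """frames: iterable of 2-D arrays. Yields averaged subtraction images."""
    mask = None
    buffer = []
    for frame in frames:
        frame = frame.astype(np.float64)
        if mask is None:
            mask = frame.copy()            # initialize the mask with the first frame
        mask += (frame - mask) / n1        # stage 1: recursive EWMA mask
        diff = frame - mask                # subtraction image
        buffer.append(diff)
        if len(buffer) == n2:              # stage 2: average n2 difference frames
            yield np.mean(buffer, axis=0)
            buffer = []

# Example: 64 synthetic 128x128 frames
rng = np.random.default_rng(0)
frames = (rng.normal(100, 5, (128, 128)) for _ in range(64))
for image in tandem_subtraction(frames):
    pass  # each `image` is one averaged subtraction frame
```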

  5. Digital process for an implant-supported fixed dental prosthesis: A clinical report.

    PubMed

    Brandt, Jan; Lauer, Hans-Christoph; Peter, Thorsten; Brandt, Silvia

    2015-10-01

    A digital process is presented for an implant-supported single-tooth and a 3-unit fixed dental prosthesis (FDP) with customized abutments and monolithic prosthetic zirconia restorations. The digital impression on the implant level was made with a TRIOS intraoral scanner (3Shape). This process included the fabrication of an implant cast with the fused deposition modeling technique and a 3-dimensional printing process with integrated implant analogs. The process enabled the FDPs to be designed with CAD/CAM on the cast before patient contact. Designing a printed implant cast expands the use of the digital workflow in the dental field.

  6. How to Find Exculpatory and Inculpatory Evidence Using a Circular Digital Forensics Process Model

    NASA Astrophysics Data System (ADS)

    Khatir, Marjan; Hejazi, Seyed Mahmood

    With the rising number of cyber crimes, the need for a proper digital forensic process also increases. Although digital forensics has been practiced in recent years, there is still a big gap between previously suggested digital forensics processes and what really needs to be done in real cases. Some problems with current processes are the lack of a flexible transition between phases, the absence of a clear method or complete scenario for addressing reliable evidence, and insufficient attention to management aspects and team roles. This paper provides a process model that pays special attention to team roles and management aspects as well as to both exculpatory and inculpatory evidence.

  7. [Methodological study on digitalization of tongue image in traditional Chinese medical diagnosis].

    PubMed

    Zhou, Yue; Yang, Jie; Shen, Li

    2004-12-01

    This research proposes a computerized tongue analysis method based on image processing for quantifying the tongue properties used in traditional Chinese medical diagnosis. A chromatic algorithm and the 2-D Gabor wavelet transformation are applied to segment the tongue from the original image. A statistical method is adopted to identify the color of each pixel, attributed to the tongue substance and coating respectively. The thickness of the tongue coating is determined by the energy of the 2-D Gabor wavelet coefficients (GWTE). The distribution of the GWTE and an invariant moment algorithm are used to judge the tongue texture. The experimental results show that all the methods proposed in this paper are effective.

  8. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  9. Using Dual-Task Methodology to Dissociate Automatic from Nonautomatic Processes Involved in Artificial Grammar Learning

    ERIC Educational Resources Information Center

    Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.

    2013-01-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…

  10. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for the automated assessment of the reliability of the sheet metal forming process in order to reduce defects in the manufacture of complex components. The article identifies the range of allowable values of the stamping parameters needed to obtain defect-free punching of truck spars.

  11. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  12. Process reengineering: the role of a planning methodology and picture archiving and communications system team building.

    PubMed

    Carrino, J A; Unkel, P J; Shelton, P; Johnson, T G

    1999-05-01

    The acquisition of a picture archiving and communications system (PACS) is an opportunity to reengineer business practices and should optimally consider the entire process from image acquisition to communication of results. The purpose of this presentation is to describe the PACS planning methodology used by the Department of Defense (DOD) Joint Imaging Technology Project Office (JITPO), outline the critical procedures for each phase, and review the military experience using this model. The methodology is segmented into four phases: strategic planning, clinical scenario planning, installation planning, and implementation planning. Each is further subdivided based on the specific tasks that need to be accomplished within that phase. By using this method, an institution will have clearly defined program goals, objectives, and PACS requirements before vendors are contacted. The development of an institution-specific PACS requirement should direct the process of proposal comparisons to be based on functionality and exclude unnecessary equipment. This PACS planning methodology is being used at more than eight DOD medical treatment facilities. When properly executed, this methodology facilitates a seamless transition to the electronic environment and contributes to the successful integration of the healthcare enterprise. A crucial component of this methodology is the development of a local PACS planning team to manage all aspects of the process. A plan formulated by the local team is based on input from each department that will be integrating with the PACS. Involving all users in the planning process is paramount for successful implementation.

  13. The Effects of Digital Portfolio Assessment Process on Students' Writing and Drawing Performances

    ERIC Educational Resources Information Center

    Tezci, Erdogan; Dikici, Ayhan

    2006-01-01

    This paper investigates the effect of the digital portfolio assessment process on the drawing and story-writing performances of students aged 14-15. For this purpose, a digital portfolio assessment rubric was prepared in order to evaluate students' drawing and story-writing works. Validity and reliability analyses were applied to…

  14. Digital signal and image processing in echocardiography. The American Society of Echocardiography.

    PubMed

    Skorton, D J; Collins, S M; Garcia, E; Geiser, E A; Hillard, W; Koppes, W; Linker, D; Schwartz, G

    1985-12-01

    Digital signal and image processing techniques are acquiring an increasingly important role in the generation and analysis of cardiac images. This is particularly true of 2D echocardiography, in which image acquisition, manipulation, and storage within the echocardiograph, as well as quantitative analysis of echocardiographic data by means of "off-line" systems, depend upon digital techniques. The increasing role of computers in echocardiography makes it essential that echocardiographers and technologists understand the basic principles of digital techniques applied to echocardiographic instrumentation and data analysis. In this article, we have discussed digital techniques as applied to image generation (digital scan conversion, preprocessing, and postprocessing) as well as to the analysis of image data (computer-assisted border detection, 3D reconstruction, tissue characterization, and contrast echocardiography); a general introduction to off-line analysis systems was also given. Experience with other cardiac imaging methods indicates that digital techniques will likely play a dominant role in the future of echocardiographic imaging.

  15. A new methodology for studying dynamics of aerosol particles in sneeze and cough using a digital high-vision, high-speed video system and vector analyses.

    PubMed

    Nishimura, Hidekazu; Sakata, Soichiro; Kaga, Akikazu

    2013-01-01

    Microbial pathogens of respiratory infectious diseases are often transmitted through particles in sneezes and coughs. Therefore, understanding particle movement is important for infection control. Images of a sneeze induced by nasal cavity stimulation in healthy adult volunteers were taken by a digital high-vision, high-speed video system equipped with a computer system and treated as a research model. The obtained images were enhanced electronically, converted to digital images every 1/300 s, and subjected to vector analysis of the bioparticles contained in the whole sneeze cloud using automatic image processing software. The initial velocity of the particles or their clusters in the sneeze was greater than 6 m/s, but decreased as the particles moved forward; the momentum of the particles seemed to be lost by 0.15-0.20 s, after which they started a diffusion movement. An approximate equation expressing velocity as a function of elapsed time was obtained from the vector analysis to represent the dynamics of the front-line particles. This methodology was also applied to a cough. Microclouds contained in smoke exhaled during a voluntary cough by a volunteer, after smoking one breath of a cigarette, were traced as visible aerodynamic surrogates for the invisible bioparticles of a cough. The smoke cough microclouds had an initial velocity greater than 5 m/s. The fastest microclouds were located at the forefront of the cloud mass moving forward; however, their velocity clearly decreased after 0.05 s and they began to diffuse in the environmental airflow. The maximum direct reaches of the particles and microclouds driven by sneezing and coughing, unaffected by environmental airflows, were estimated by calculations using the obtained equations to be about 84 cm and 30 cm from the mouth, respectively, both achieved in about 0.2 s, suggesting that data on the dynamics of sneezes and coughs can be obtained by such calculations.
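
    The sketch below shows one way such a velocity-versus-time relation can be fitted and integrated to estimate the direct reach; the exponential form and the sample measurements are illustrative assumptions, not the equation or data reported in the paper.

```python
# Fit front-line particle velocity vs. elapsed time and integrate it to
# estimate the distance travelled within the observation window.
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import quad

def velocity_model(t, v0, tau):
    return v0 * np.exp(-t / tau)             # assumed exponential decay

# illustrative measurements: time (s) and front-line particle velocity (m/s)
t_obs = np.array([0.00, 0.03, 0.06, 0.10, 0.15, 0.20])
v_obs = np.array([6.0, 4.1, 2.9, 1.7, 0.9, 0.5])

(v0, tau), _ = curve_fit(velocity_model, t_obs, v_obs, p0=(6.0, 0.1))
reach, _ = quad(velocity_model, 0.0, 0.2, args=(v0, tau))
print(v0, tau, reach)                         # fitted parameters and distance in 0.2 s
```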

  16. Screen Capture Technology: A Digital Window into Students' Writing Processes

    ERIC Educational Resources Information Center

    Seror, Jeremie

    2013-01-01

    Technological innovations and the prevalence of the computer as a means of producing and engaging with texts have dramatically transformed how literacy is defined and developed in modern society. This rise in digital writing practices has led to a growing number of tools and methods that can be used to explore second language (L2) writing…

  17. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  18. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  19. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment

    PubMed Central

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd.

    2016-01-01

    Advances in communication network technology have facilitated the transmission of digital medical images to healthcare professionals via internal networks or public networks (e.g., the Internet), but they also expose the transmitted images to security threats, such as tampering or the insertion of false data, which may cause an inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnostic purposes; thus digital watermarking of medical images has been introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in the real environment. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermarking of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time for parallel watermarking processing is much shorter than for sequential watermarking processing. PMID:26981111
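
    A sketch of the parallelization idea, distributing a per-frame watermarking step over a process pool (the embedding function is a trivial least-significant-bit stand-in, not the watermarking scheme used in the paper, and the frame sizes are arbitrary):

```python
# Parallelize a per-frame watermarking step over a multiframe image
# with a process pool.
import numpy as np
from multiprocessing import Pool

def embed_watermark(frame):
    """Embed a fixed bit pattern in the least significant bit of each pixel."""
    bits = np.resize([0, 1], frame.shape).astype(frame.dtype)
    return (frame & ~np.uint8(1)) | bits

def watermark_frames(frames, processes=4):
    with Pool(processes) as pool:
        return pool.map(embed_watermark, frames)

if __name__ == "__main__":
    frames = [np.random.randint(0, 256, (512, 512), dtype=np.uint8)
              for _ in range(32)]
    watermarked = watermark_frames(frames)
    print(len(watermarked), watermarked[0].dtype)
```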

  20. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given dataset and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes capabilities to perform high-pass, band-pass, low-pass, and wedge filtering. These filters have been applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
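
    A sketch of the two basic operations the toolbox provides, computing a 2-D power spectrum and applying a Gaussian-type notch filter (the notch location and width are arbitrary; this is not the USGS code):

```python
# 2-D power spectrum and a Gaussian-type notch (reject) filter.
import numpy as np

def power_spectrum(grid):
    return np.abs(np.fft.fftshift(np.fft.fft2(grid))) ** 2

def gaussian_notch(shape, u0, v0, sigma):
    """Filter that attenuates frequencies near (u0, v0) and its mirror point."""
    v, u = np.indices(shape)
    cu, cv = shape[1] // 2, shape[0] // 2
    d1 = (u - cu - u0) ** 2 + (v - cv - v0) ** 2
    d2 = (u - cu + u0) ** 2 + (v - cv + v0) ** 2
    return (1.0 - np.exp(-d1 / (2 * sigma ** 2))) * \
           (1.0 - np.exp(-d2 / (2 * sigma ** 2)))

grid = np.random.default_rng(1).normal(size=(256, 256))
H = gaussian_notch(grid.shape, u0=20, v0=0, sigma=4)
filtered = np.real(np.fft.ifft2(np.fft.ifftshift(
    np.fft.fftshift(np.fft.fft2(grid)) * H)))
```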

  1. GEOMETRIC PROCESSING OF DIGITAL IMAGES OF THE PLANETS.

    USGS Publications Warehouse

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformations of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases.
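
    For reference, the forward Sinusoidal Equal-Area projection used for the transformed images maps latitude and longitude to planar coordinates as sketched below (a spherical body of radius R is assumed; the actual pipeline also applies the camera model and ground-control corrections):

```python
# Forward Sinusoidal Equal-Area map projection for a spherical body.
import numpy as np

def sinusoidal_forward(lat_deg, lon_deg, lon0_deg=0.0, radius=1.0):
    """Latitude/longitude (degrees) -> projected x, y (same units as radius)."""
    lat = np.radians(lat_deg)
    lon = np.radians(lon_deg - lon0_deg)
    x = radius * lon * np.cos(lat)
    y = radius * lat
    return x, y

print(sinusoidal_forward(30.0, 45.0, radius=3396.19))  # e.g., Mars mean radius, km
```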

  2. Centralized Digital Picture Processing System For Cardiac Imaging

    NASA Astrophysics Data System (ADS)

    LeFree, M. T.; Vogel, R. A.

    1982-01-01

    We have designed and implemented a system for the centralized acquisition, display, analysis and archiving of diagnostic cardiac medical images from x-ray fluoroscopy, two-dimensional ultrasonography and nuclear scintigraphy. Centered around a DEC PDP-11/34 minicomputer with an existing gamma camera interface, we have added a closed-circuit television system with a 256x512x8-bit video digitizer and image display controller to interface the video output of the fluoroscope and ultrasonograph. A video disc recorder (under computer control) is used as an input and playback buffer, allowing for data transfer to and from digital disc drives. Thus, real-time video digitization is possible for up to ten seconds of incoming RS-170-compatible video. The digitizer separates video fields in real time into two 256x256x8-bit refresh memories, providing 60 Hz temporal resolution. Generally, however, we choose to record at non-real-time rates to encompass more than ten seconds. In addition to I/O software controlling data acquisition and playback, we have developed a versatile data analysis package (offering such capabilities as image algebra, Fourier analysis and convolutional filtering), as well as interactive data reduction subroutines (such as region-of-interest definition, profile plotting and regional extraction of statistical and probabilistic information). We have found the system useful for standard cardiac image analysis, for simultaneous display of images from the three modalities, for picture storage and retrieval, and as a research tool. Future plans include the addition of intelligent terminals at each modality and progression to a 32-bit machine for the central processor.

  3. Digital microfluidic processing of mammalian embryos for vitrification.

    PubMed

    Pyne, Derek G; Liu, Jun; Abdelgawad, Mohamed; Sun, Yu

    2014-01-01

    Cryopreservation is a key technology in biology and clinical practice. This paper presents a digital microfluidic device that automates sample preparation for mammalian embryo vitrification. Individual micro droplets manipulated on the microfluidic device were used as micro-vessels to transport a single mouse embryo through a complete vitrification procedure. Advantages of this approach, compared to manual operation and channel-based microfluidic vitrification, include automated operation, cryoprotectant concentration gradient generation, and feasibility of loading and retrieval of embryos.

  4. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  5. A new and practical method to obtain grain size measurements in sandy shores based on digital image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Baptista, P.; Cunha, T. R.; Gama, C.; Bernardes, C.

    2012-12-01

    Modern methods for the automated evaluation of sediment size on sandy shores rely on digital image processing algorithms as an alternative to time-consuming traditional sieving methodologies. However, the requirements necessary to guarantee that the considered image processing algorithm has a good grain identification success rate impose the need for dedicated hardware setups to capture the sand surface images. Examples are specially designed camera housings that maintain a constant distance between the camera lens and the sand surface, tripods to fix and maintain the camera angle orthogonal to the sand surface, external illumination systems that guarantee the light level necessary for the image processing algorithms, and special lenses and focusing systems for close-proximity image capturing. In some cases, controlled image-capturing conditions can make the fieldwork more laborious, which incurs significant costs for monitoring campaigns covering large areas. To circumvent this problem, a new automated image-processing algorithm is proposed that identifies sand grains in digital images acquired with a standard digital camera without any extra hardware attached to it. The accuracy and robustness of the proposed algorithm are evaluated in this work by means of a laboratory test on previously controlled grain samples, field tests where 64 samples (spread over a beach stretch of 65 km and with grain size ranging from 0.5 mm to 1.9 mm) were processed by both the proposed method and by sieving, and finally by manual point counts on all acquired images. The calculated root-mean-square (RMS) error between mean grain sizes obtained from the proposed image processing method and the sieve method (for the 64 samples) was 0.33 mm, and for the image processing method versus manual point counts, with the same images, it was 0.12 mm. The achieved correlation coefficients (r) were 0.91 and 0.96, respectively.
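
    The two comparison statistics reported above can be computed as in the following sketch (illustrative only; the array inputs are the per-sample mean grain sizes from the two methods being compared):

```python
# Root-mean-square error and correlation coefficient between paired
# grain-size estimates from two methods.
import numpy as np

def rms_error(a, b):
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

def correlation(a, b):
    return float(np.corrcoef(a, b)[0, 1])
```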

  6. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    PubMed

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-01-30

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects for more distant pairs in both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  7. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  8. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library, Scipy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
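
    A minimal sketch of the optimization loop (the objective, constraint, and stand-in process model below are illustrative assumptions, not the ABAQUS-coupled model from the paper): Scipy's minimize adjusts cooling timing parameters subject to a directional-solidification constraint while targeting a cooling rate.

```python
# Constrained optimization of cooling timing parameters with Scipy.
import numpy as np
from scipy.optimize import minimize

TARGET_RATE = 2.0        # target solidification cooling rate, K/s (assumed)

def simulated_cooling_rates(timing):
    """Stand-in for the casting process model: maps cooling start times
    (rim, hub) to cooling rates at two locations."""
    rim, hub = timing
    return np.array([3.0 / (1.0 + rim), 4.0 / (1.0 + hub)])

def objective(timing):
    # penalize deviation from the target cooling rate
    return np.sum((simulated_cooling_rates(timing) - TARGET_RATE) ** 2)

# directional solidification: the rim must start cooling no later than the hub
constraints = [{"type": "ineq", "fun": lambda timing: timing[1] - timing[0]}]
result = minimize(objective, x0=[1.0, 2.0], bounds=[(0, 60), (0, 60)],
                  constraints=constraints, method="SLSQP")
print(result.x, result.fun)
```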

  9. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, S.L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the various digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections, as well as enhance and digitally mosaic, sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into the two different stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.

  10. Reaction Wheel Friction Telemetry Data Processing Methodology and On-Orbit Experience

    NASA Astrophysics Data System (ADS)

    Hacker, Johannes M.; Ying, Jiongyu; Lai, Peter C.

    2015-09-01

    A Globalstar 2nd generation satellite experienced a reaction wheel mechanical failure, and in response Globalstar has been closely monitoring reaction wheel bearing friction. To prevent another reaction wheel hardware failure and subsequent shortened satellite mission life, a friction data processing methodology was developed as an on-orbit monitoring tool for the ground to issue early warning and take appropriate action on any hardware degradation or potential failure. The methodology, reaction wheel friction behavior, and its application to an on-orbit anomaly experience will be presented.

  11. Experimental Methodology for Determining Optimum Process Parameters for Production of Hydrous Metal Oxides by Internal Gelation

    SciTech Connect

    Collins, J.L.

    2005-10-28

    The objective of this report is to describe a simple but very useful experimental methodology that was used to determine optimum process parameters for preparing several hydrous metal-oxide gel spheres by the internal gelation process. The method is inexpensive and very effective in collection of key gel-forming data that are needed to prepare the hydrous metal-oxide microspheres of the best quality for a number of elements.

  12. An image processing system for digital chest X-ray images.

    PubMed

    Cocklin, M; Gourlay, A; Jackson, P; Kaye, G; Miessler, M; Kerr, I; Lams, P

    1984-01-01

    This paper investigates the requirements for image processing of digital chest X-ray images. These images are conventionally recorded on film and are characterised by large size, wide dynamic range and high resolution. X-ray detection systems are now becoming available for capturing these images directly in photoelectronic-digital form. In this report, the hardware and software facilities required for handling these images are described. These facilities include high resolution digital image displays, programmable video look up tables, image stores for image capture and processing and a full range of software tools for image manipulation. Examples are given of the application of digital image processing techniques to this class of image.

  13. Digitizing rocks standardizing the geological description process using workstations

    SciTech Connect

    Saunders, M.R. (Windsor, Berkshire); Shields, J.A.; Taylor, M.R.

    1993-09-01

    The preservation of geological knowledge in a standardized digital form presents a challenge. Data sources, inherently fuzzy, range in scale from the macroscopic (e.g., outcrop) through the mesoscopic (e.g., hand specimen, core, and sidewall core) to the microscopic (e.g., drill cuttings, thin sections, and microfossils). Each scale change results in increased heterogeneity and potentially contradictory data, and the providers of such data may vary in experience level. To address these issues with respect to cores and drill cuttings, a geological description workstation has been developed and is undergoing field trials. Over 1000 carefully defined geological attributes are currently available within a depth-indexed, relational database. Attributes are stored in digital form, allowing multiple users to select familiar usage (e.g., diabase vs. dolerite). Data can be entered in one language and retrieved in other languages. The database structure allows groupings of similar elements (e.g., rhyolites in acidic, igneous or volcanics subgroups, or the igneous rock group), permitting different users to analyze details appropriate to the scale of use. Data entry uses a graphical user interface, allowing the geologist to make quick, logical selections in a standardized or custom-built format with extensive menus, on-screen graphics and help screens available. Description ranges are permissible. Entries for lithology, petrology, structures (sedimentary, organic and deformational), reservoir characteristics (porosity and hydrocarbon shows), and macrofossils are available. Sampling points for thin sections, core analysis, geochemistry, or micropaleontology studies are also recorded. Using digital data storage, geological logs with graphical, alphanumeric and symbolic depictions are possible. Data can be integrated with drilling and mud gas data, MWD and wireline data, and off-wellsite analyses to produce composite formation evaluation logs and interpretational crossplots.

  14. Performance of the SIR-B digital image processing subsystem

    NASA Technical Reports Server (NTRS)

    Curlander, J. C.

    1986-01-01

    A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve the maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image production records. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.

  15. Advanced Digital Signal Processing for Hybrid Lidar FY 2014

    DTIC Science & Technology

    2014-10-30

    was moved in 10 cm increments from a range of 1.35 m to 3.05 m. The photomuhiplier tube ( PMT ) collected light scattered from the submerged target...through the window. A bias-tee at the output of the PMT separated the DC and AC components of the photocurrent. The DC-coupled signal was monitored on...a multimeter to ensure that the PMT remained within its linear operating region. The AC-coupled signal was demodulated and digitized in the software

  16. A methodology for evaluation and selection of nanoparticle manufacturing processes based on sustainability metrics.

    PubMed

    Naidu, Sasikumar; Sawhney, Rapinder; Li, Xueping

    2008-09-01

    A set of sustainability metrics, covering the economic, environmental and sociological dimensions of sustainability for evaluation of nanomanufacturing processes is developed. The metrics are divided into two categories namely industrial engineering metrics (process and safety metrics) and green chemistry metrics (environmental impact). The waste reduction algorithm (WAR) is used to determine the environmental impact of the processes and NAIADE (Novel Approach to Imprecise Assessment and Decision Environments) software is used for evaluation and decision analysis. The methodology is applied to three processes used for silica nanoparticle synthesis based on sol-gel and flame methods.

  17. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we still lack engineering design principles and methodology to maximize design performance. This paper discusses the principles and methodology of the engineering design process directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  18. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery of laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system, which is composed of a frame grabber, a sound board, a VCR control board, a LAN card and EDMS. Also, the computer system controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on open architecture and a set of widely available industry standards; namely Microsoft Windows as an operating system, TCP/IP as a network protocol and a time sequential database that handles both images and speech. For the purpose of data storage, we used MOD and CD-R. The digital endoscopic system was designed to be able to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  19. Digital processing of signals arising from organic liquid scintillators for applications in the mixed-field assessment of nuclear threats

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Mackin, R. O.; Jarrah, Z.; Peyton, A. J.

    2008-10-01

    The nuclear aspect of the CBRN* threat is often divided amongst radiological substances posing no criticality risk, often referred to as 'dirty bomb' scenarios, and fissile threats. The latter have the theoretical potential for criticality excursion, resulting in elevated neutron fluxes in addition to the γ-ray component that is common to dirty bombs. Even in isolation of the highly-unlikely criticality scenario, fissile substances often exhibit radiation fields comprising a significant neutron component which can require considerably different counterterrorism measures and clean-up methodologies. The contrast between these threats can indicate important differences in the relative sophistication of the perpetrators and their organizations. Consequently, the detection and discrimination of nuclear perils in terms of mixed-field content is an important assay in combating terrorist threats. In this paper we report on the design and implementation of a fast digitizer and embedded-processor for on-the-fly signal processing of events from organic liquid scintillators. A digital technique, known as Pulse Gradient Analysis (PGA), has been developed at Lancaster University for the digital discrimination of neutrons and γ rays. PGA has been deployed on bespoke hardware and demonstrates remarkable improvement over analogue methods for the assay of mixed fields and the real-time discrimination of neutrons and γ rays. In this regard the technology constitutes an attractive and affordable means for the discrimination of the radiation fields arising from fissile threats and those from dirty bombs. Data are presented demonstrating this capability with sealed radioactive sources.

  20. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  1. Digital pulse processing and optimization of the front-end electronics for nuclear instrumentation.

    PubMed

    Bobin, C; Bouchard, J; Thiam, C; Ménesguen, Y

    2014-05-01

    This article describes an algorithm developed for the digital processing of signals provided by a high-efficiency well-type NaI(Tl) detector used to apply the 4πγ technique. In order to achieve a low-energy threshold, a new front-end electronics has been specifically designed to optimize the coupling to an analog-to-digital converter (14 bit, 125 MHz) connected to a digital development kit produced by Altera(®). The digital pulse processing is based on an IIR (Infinite Impulse Response) approximation of the Gaussian filter (and its derivatives) that can be applied to the real-time processing of digitized signals. Based on measurements obtained with the photon emissions generated by an (241)Am source, the energy threshold is estimated to be equal to ~2 keV corresponding to the physical threshold of the NaI(Tl) detector. An algorithm developed for a Silicon Drift Detector used for low-energy x-ray spectrometry is also described. In that case, the digital pulse processing is specifically designed for signals provided by a reset-type preamplifier ((55)Fe source).
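
    A minimal sketch of the general idea in Python, assuming a cascade of first-order IIR sections as the Gaussian approximation (the paper's own filter design and parameters are not reproduced here):

    # Approximating a Gaussian pulse-shaping filter with a cascade of identical
    # first-order IIR low-pass sections: by the central limit theorem, repeated
    # first-order smoothing tends toward a Gaussian impulse response, which is
    # why such cascades are attractive for real-time processing of digitized
    # detector signals.
    import numpy as np
    from scipy.signal import lfilter

    def iir_gaussian_smooth(x, alpha=0.1, stages=4):
        """Apply `stages` passes of y[n] = alpha*x[n] + (1 - alpha)*y[n-1]."""
        y = np.asarray(x, dtype=float)
        for _ in range(stages):
            y = lfilter([alpha], [1.0, -(1.0 - alpha)], y)
        return y

    # Example: smooth a noisy detector pulse sampled at 125 MHz (illustrative values).
    fs = 125e6
    t = np.arange(2048) / fs
    pulse = (t > 4e-6) * np.exp(-(t - 4e-6) / 2e-6)
    noisy = pulse + 0.05 * np.random.randn(t.size)
    shaped = iir_gaussian_smooth(noisy, alpha=0.05, stages=4)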

  2. A comparison of letter and digit processing in letter-by-letter reading.

    PubMed

    Ingles, Janet L; Eskes, Gail A

    2008-01-01

    The extent to which letter-by-letter reading results from a specific orthographic deficit, as compared with a nonspecific disturbance in basic visuoperceptual mechanisms, is unclear. The current study directly compared processing of letters and digits in a letter-by-letter reader, G.M., using a rapid serial visual presentation (RSVP) task and a speeded matching task. Comparisons were made to a group of six brain-damaged individuals without reading deficits. In the RSVP task, G.M. had increased difficulty reporting the target identities when they were letters, as compared with digits. Although this general pattern was also evident in the control group, the magnitude of the letter-digit accuracy difference was greater in G.M. Similarly, in the matching task, G.M. was slower to match letters than digits, relative to the control group, although his response times to both item types were increased. These data suggest that letter-by-letter reading, at least in this case, results from a visuoperceptual encoding deficit that particularly affects letters, but also extends to processing of digits to a lesser extent. Results are consistent with the notion that a left occipitotemporal area is specialized for letter processing with greater bilaterality in the visual processing of digits.

  3. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNN, we propose to use a digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute the wavelet down-sampling filtering and the half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, which prove that the proposed digital CNN processor is a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people.
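
    A minimal discrete-time sketch of the standard cellular neural network (CNN) state update that such a processor implements in hardware; the A/B templates and parameters below are placeholders, not the wavelet down-sampling or half-toning templates from the paper:

    # One Euler step of the CNN state equation
    #   dx/dt = -x + A*y + B*u + z,   y = 0.5*(|x+1| - |x-1|)
    # where * denotes 3x3 template convolution over the cell grid.
    import numpy as np
    from scipy.ndimage import convolve

    def cnn_step(x, u, A, B, z, dt=0.1):
        y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))   # piecewise-linear output
        dx = -x + convolve(y, A, mode='nearest') + convolve(u, B, mode='nearest') + z
        return x + dt * dx

    # Edge-detection-style placeholder templates (hypothetical values).
    A = np.zeros((3, 3))
    B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
    z = -0.5
    u = np.random.rand(64, 64) * 2 - 1      # input image scaled to [-1, 1]
    x = np.zeros_like(u)
    for _ in range(50):
        x = cnn_step(x, u, A, B, z)
    output = 0.5 * (np.abs(x + 1) - np.abs(x - 1))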

  4. Digital signal processing for the ATLAS/LUCID detector

    SciTech Connect

    2015-07-01

    Both the detector and the associated read-out electronics have been improved in order to cope with the LHC luminosity increase foreseen for RUN 2 and RUN 3. The new operating conditions require a careful tuning of the read-out electronics in order to optimize the signal-to-noise ratio. The new read-out electronics will allow the use of digital filtering of the photo multiplier tube signals. In this talk, we will present the first results that we obtained in the optimization of the signal-to-noise ratio. In addition, we will introduce the next steps to adapt this system to high performance read-out chains for low energy gamma rays. Such systems are based, for instance, on Silicon Drift Detector devices and can be used in applications at Free-Electron-Laser facilities such as the XFEL under construction at DESY. (authors)

  5. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Technical Reports Server (NTRS)

    Blackburn, John; Smith, Dennis

    1992-01-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  6. A Model-Based Methodology for Spray-Drying Process Development.

    PubMed

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  7. Digitized Chaos: Is Our Military Decision Making Process Ready for the Information Age?

    DTIC Science & Technology

    2007-11-02

    DIGITIZED CHAOS: IS OUR MILITARY DECISION MAKING PROCESS READY FOR THE INFORMATION AGE? A MONOGRAPH BY Major John W. Charlton Infantry SETCLAVIS COR...FOR THE INFORMATION AGE? By Major John W. Charlton, USA, 70 Pages. The integration of new technologies has always been important to the military. The...must accompany the new technology in order to exploit its full capabilities. Today the Army is looking at ways to integrate information age, or digital

  8. Considerations in developing geographic informations systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  9. Erosion processes by water in agricultural landscapes: a low-cost methodology for post-event analyses

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo

    2015-04-01

    Throughout the world, agricultural landscapes assume a great importance, especially for supplying food and a livelihood. Among the land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To estimate quantitatively the soil losses due to bank erosion and rill processes, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented with fine resolution, and their monitoring may be prohibitively expensive and time consuming. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique that may even rely on smartphones, for the post-event analyses of i) bank erosion affecting agricultural channels, and ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smartphone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for quantitative estimation of erosion volumes and, in the case of the eroded bank, of the deposited materials as well. SfM applied to pictures taken by smartphones is useful for the analysis of the topography

  10. Integrating the human element into the systems engineering process and MBSE methodology.

    SciTech Connect

    Tadros, Michael Samir.

    2013-12-01

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources are organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and Model-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  11. Studying the relationship between dreaming and sleep-dependent memory processes: methodological challenges.

    PubMed

    Schredl, Michael

    2013-12-01

    The hypothesis that dreaming is involved in off-line memory processing is difficult to test because major methodological issues have to be addressed, such as dream recall and the effect of remembered dreams on memory. It would be fruitful--in addition to studying the ancient art of memory (AAOM) in a scanner--to study the dreams of persons who use AAOM regularly.

  12. The Importance of Rapid Auditory Processing Abilities to Early Language Development: Evidence from Converging Methodologies

    PubMed Central

    Thomas, Jennifer J.; Choudhury, Naseem; Leppänen, Paavo H. T.

    2006-01-01

    The ability to process two or more rapidly presented, successive, auditory stimuli is believed to underlie successful language acquisition. Likewise, deficits in rapid auditory processing of both verbal and nonverbal stimuli are characteristic of individuals with developmental language disorders such as Specific Language Impairment. Auditory processing abilities are well developed in infancy, and thus such deficits should be detectable in infants. In the studies presented here, converging methodologies are used to examine such abilities in infants with and without a family history of language disorder. Behavioral measures, including assessments of infant information processing, and an EEG/event-related potential (ERP) paradigm are used concurrently. Results suggest that rapid auditory processing skills differ as a function of family history and are predictive of later language outcome. Further, these paradigms may prove to be sensitive tools for identifying children with poor processing skills in infancy and thus at a higher risk for developing a language disorder. PMID:11891639

  13. Digital Libraries.

    ERIC Educational Resources Information Center

    Fox, Edward A.; Urs, Shalini R.

    2002-01-01

    Provides an overview of digital libraries research, practice, and literature. Highlights include new technologies; redefining roles; historical background; trends; creating digital content, including conversion; metadata; organizing digital resources; services; access; information retrieval; searching; natural language processing; visualization;…

  14. The process of transitioning to digital operations in a clinic setting.

    PubMed

    Freeh, M; McFall, J; Nieves, A

    2001-06-01

    Transitioning to digital imaging operations in a department of radiology is often difficult for many radiologists, but it is a change that many have made effectively. Transitioning to digital operations in a clinic setting is even more difficult for the referring physician operating a business in the clinic. This paper will discuss our experience with transitioning several off-site clinics to digital imaging operations. We will discuss the process followed to identify the physical equipment required to support clinic operations in a digital imaging environment, the process followed to help the physicians adjust their work patterns to allow them to practice in a digital imaging environment, and the benefits and pitfalls of implementing digital imaging in an off-site clinic. Four off-site clinic locations will be evaluated: 1. cancer clinic located immediately adjacent to the main hospital that relies heavily on CT and MRI images in their practice, 2. small clinic located about 60 miles from the main hospital that acquires x-ray images on site, 3. larger clinic located about 20 miles from the main hospital that acquires x-ray, MRI and CT images on site, 4. sports medicine clinic located about 2 miles from the main hospital that acquires x-ray images on site. Each of these clinics has a very different patient clientele and therefore operates differently in nearly all aspects of their daily operations. The physician's need for and use of film and digital images varies significantly between the sites and therefore each site has presented different challenges to our implementation process. As we explain the decisions that were made for each of these sites and reveal the methods that were used to help the physicians make the transition, the readers should be able to draw information that will be helpful to them as they make their own transition to a digital operation.

  15. A prototype software methodology for the rapid evaluation of biomanufacturing process options.

    PubMed

    Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli

    2007-10-01

    A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
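
    A minimal sketch of the screening idea of combining production level, cost and time into a single value; the normalisation, weights and option data below are illustrative assumptions, not the paper's multi-attribute model:

    # Weighted-sum multi-attribute score: higher production is better, lower
    # cost and time are better; each attribute is min-max normalised first.
    def composite_score(options, weights):
        """options: dict name -> dict with 'yield', 'cost', 'time'."""
        def norm(values, higher_is_better):
            lo, hi = min(values), max(values)
            span = (hi - lo) or 1.0
            return [((v - lo) / span) if higher_is_better else ((hi - v) / span) for v in values]

        names = list(options)
        yields = norm([options[n]['yield'] for n in names], True)
        costs  = norm([options[n]['cost']  for n in names], False)
        times  = norm([options[n]['time']  for n in names], False)
        return {n: weights[0]*y + weights[1]*c + weights[2]*t
                for n, y, c, t in zip(names, yields, costs, times)}

    # Hypothetical process options, for illustration only.
    options = {
        'baseline':        {'yield': 1.0, 'cost': 100.0, 'time': 10.0},
        'microfiltration': {'yield': 1.1, 'cost':  90.0, 'time': 12.0},
        'protein_g':       {'yield': 1.3, 'cost': 120.0, 'time':  8.0},
    }
    print(composite_score(options, weights=(0.5, 0.3, 0.2)))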

  16. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  17. Digital image processing software system using an array processor

    SciTech Connect

    Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.

    1981-03-10

    A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table.

  18. Integrating the dynamics of personality and close relationship processes: methodological and data analytic implications.

    PubMed

    Graber, Elana C; Laurenceau, Jean-Philippe; Carver, Charles S

    2011-12-01

    A common theme that has emerged from classic and contemporary theoretical work in both the fields of personality and relationship science is a focus on process. Current process-focused theories bearing on personality invoke a view of the individual in ongoing action and interaction with the environment, reflecting a flow of experience rather than a static depiction. To understand the processes by which personality interacts with the social environment (particularly dyads), investigations must capture individuals interacting in multiple interpersonal situations, which likely necessitates complex study designs and corresponding data analytic strategies. Using an illustrative simulated data set, we focus on diary methods and corresponding individual and dyadic multilevel models to capture person-situation interaction within the context of processes in daily close relationship life. Finally, we consider future directions that conceptualize personality and close relationship processes from a dynamical systems theoretical and methodological perspective.

  19. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating in a pot at 311 °C, with the medicine at 119 °C for 123 s and flipped frequently. The experimental values under the optimized processing technology were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM).
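
    A minimal sketch of the response-surface step, assuming a two-factor second-order model fitted by least squares; the design points and responses below are placeholders, not the paper's data:

    # Fit response = b0 + b1*T + b2*t + b11*T^2 + b22*t^2 + b12*T*t to
    # (temperature, time) design points by ordinary least squares.
    import numpy as np

    def fit_quadratic_surface(T, t, y):
        X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T*t])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    T = np.array([100., 100., 140., 140., 120., 120., 120.])   # processing temperature
    t = np.array([ 60., 180.,  60., 180., 120., 120., 120.])   # processing time (s)
    y = np.array([0.42, 0.55, 0.51, 0.47, 0.60, 0.61, 0.59])   # e.g. total lignan content
    print(fit_quadratic_surface(T, t, y))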

  20. BPMN, Toolsets, and Methodology: A Case Study of Business Process Management in Higher Education

    NASA Astrophysics Data System (ADS)

    Barn, Balbir S.; Oussena, Samia

    This chapter describes ongoing action research which is exploring the use of BPMN and a specific toolset - Intalio Designer to capture the “as is” essential process model of part of an overarching large business process within higher education. The chapter contends that understanding the efficacy of the BPMN notation and the notational elements to use is not enough. Instead, the effectiveness of a notation is determined by the notation, the toolset that is being used, and methodological consideration. The chapter presents some of the challenges that are faced in attempting to develop computation independent models in BPMN using toolsets such as Intalio Designer™.

  1. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    ERIC Educational Resources Information Center

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  2. Digital Detection and Processing of Multiple Quadrature Harmonics for EPR Spectroscopy

    PubMed Central

    Ahmad, R.; Som, S.; Kesselring, E.; Kuppusamy, P.; Zweier, J.L.; Potter, L.C.

    2010-01-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration. PMID:20971667
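
    A minimal sketch of digital down-conversion with a digitally captured reference, as described above; the sample rate, reference frequency and filter parameters are illustrative assumptions, not the spectrometer's actual values:

    # Multiply the sampled signal by a complex exponential at the reference
    # frequency to shift the band of interest to DC (heterodyne), then low-pass
    # filter to obtain complex I/Q samples for harmonic estimation.
    import numpy as np
    from scipy.signal import butter, lfilter

    fs = 250e6                                    # ADC sample rate (illustrative)
    f_ref = 30e6                                  # captured reference frequency
    n = np.arange(8192)
    signal = np.cos(2*np.pi*f_ref*n/fs + 0.3) + 0.01*np.random.randn(n.size)
    ref = np.exp(-1j*2*np.pi*f_ref*n/fs)          # digital reference for heterodyning

    baseband = signal * ref                       # band of interest now centred at DC
    b, a = butter(4, 1e6/(fs/2))                  # low-pass at 1 MHz rejects the 2*f_ref image
    iq = lfilter(b, a, baseband)                  # complex I/Q samples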

  3. Digital detection and processing of multiple quadrature harmonics for EPR spectroscopy.

    PubMed

    Ahmad, R; Som, S; Kesselring, E; Kuppusamy, P; Zweier, J L; Potter, L C

    2010-12-01

    A quadrature digital receiver and associated signal estimation procedure are reported for L-band electron paramagnetic resonance (EPR) spectroscopy. The approach provides simultaneous acquisition and joint processing of multiple harmonics in both in-phase and out-of-phase channels. The digital receiver, based on a high-speed dual-channel analog-to-digital converter, allows direct digital down-conversion with heterodyne processing using digital capture of the microwave reference signal. Thus, the receiver avoids noise and nonlinearity associated with analog mixers. Also, the architecture allows for low-Q anti-alias filtering and does not require the sampling frequency to be time-locked to the microwave reference. A noise model applicable for arbitrary contributions of oscillator phase noise is presented, and a corresponding maximum-likelihood estimator of unknown parameters is also reported. The signal processing is applicable for Lorentzian lineshape under nonsaturating conditions. The estimation is carried out using a convergent iterative algorithm capable of jointly processing the in-phase and out-of-phase data in the presence of phase noise and unknown microwave phase. Cramér-Rao bound analysis and simulation results demonstrate a significant reduction in linewidth estimation error using quadrature detection, for both low and high values of phase noise. EPR spectroscopic data are also reported for illustration.

  4. Evaluation of a Change Detection Methodology by Means of Binary Thresholding Algorithms and Informational Fusion Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then, different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows generating a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023
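
    A minimal sketch of the chain described (change index, thresholding, fusion), assuming simple image differencing, Otsu thresholding and majority-vote fusion as one possible combination; the paper evaluates several index and threshold algorithms:

    import numpy as np

    def otsu_threshold(img, bins=256):
        # Threshold that maximizes the between-class variance of the histogram.
        hist, edges = np.histogram(img, bins=bins)
        hist = hist.astype(float)
        centers = 0.5 * (edges[:-1] + edges[1:])
        w0 = np.cumsum(hist)
        w1 = hist.sum() - w0
        m0 = np.cumsum(hist * centers) / np.maximum(w0, 1e-12)
        m1 = (np.sum(hist * centers) - np.cumsum(hist * centers)) / np.maximum(w1, 1e-12)
        between = w0 * w1 * (m0 - m1) ** 2
        return centers[np.argmax(between[:-1])]

    def change_map(img_t1, img_t2):
        index = np.abs(img_t2.astype(float) - img_t1.astype(float))   # simple change index
        return index > otsu_threshold(index)

    def fuse(maps):
        stack = np.stack(maps).astype(int)
        return stack.sum(axis=0) > (len(maps) / 2)                    # majority-vote fusion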

  5. Use of a qualitative methodological scaffolding process to design robust interprofessional studies.

    PubMed

    Wener, Pamela; Woodgate, Roberta L

    2013-07-01

    Increasingly, researchers are using qualitative methodology to study interprofessional collaboration (IPC). With this increase in use, there seems to be an appreciation for how qualitative studies allow us to understand the unique individual or group experience in more detail and form a basis for policy change and innovative interventions. Furthermore, there is an increased understanding of the potential of studying new or emerging phenomena qualitatively to inform further large-scale studies. Although there is a current trend toward greater acceptance of the value of qualitative studies describing the experiences of IPC, these studies are mostly descriptive in nature. Applying a process suggested by Crotty (1998) may encourage researchers to consider the value in situating research questions within a broader theoretical framework that will inform the overall research approach including methodology and methods. This paper describes the application of this process to a research project and then illustrates how the process encouraged iterative cycles of thinking and doing. The authors describe each step of the process, share decision-making points, and suggest an additional step to the process. Applying this approach to selecting data collection methods may serve to guide and support the qualitative researcher in creating a well-designed study approach.

  6. The design, fabrication, and test of a new VLSI hybrid analog-digital neural processing element

    NASA Technical Reports Server (NTRS)

    Deyong, Mark R.; Findley, Randall L.; Fields, Chris

    1992-01-01

    A hybrid analog-digital neural processing element with the time-dependent behavior of biological neurons has been developed. The hybrid processing element is designed for VLSI implementation and offers the best attributes of both analog and digital computation. Custom VLSI layout reduces the layout area of the processing element, which in turn increases the expected network density. The hybrid processing element operates at the nanosecond time scale, which enables it to produce real-time solutions to complex spatiotemporal problems found in high-speed signal processing applications. VLSI prototype chips have been designed, fabricated, and tested with encouraging results. Systems utilizing the time-dependent behavior of the hybrid processing element have been simulated and are currently in the fabrication process. Future applications are also discussed.

  7. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

    A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation with the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove the reliability at the system level.

  8. A Systematic Software, Firmware, and Hardware Codesign Methodology for Digital Signal Processing

    DTIC Science & Technology

    2014-03-01

    [Table fragment: candidate interface and communications options, including Streaming Data, Shared Bus, Message Passing, Token Ring, and Remote-Procedure Call]... technologies for the feasibility check. These design concepts and implementation technologies are in the form of models, and they can also be used for...hardware partitioning involves a diversity of applications, design styles and implementation technologies; ultimately it depends on human expert

  9. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
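
    A minimal sketch of the value-improvement (value iteration) technique for a discounted, finite-state, finite-action process; the two-state transition matrices and rewards below are hypothetical, not drawn from the mallard example:

    # P[a][s, s'] is the transition matrix under action a, R[a][s] the expected
    # reward; iterate V <- max_a (R[a] + discount * P[a] V) until convergence.
    import numpy as np

    def value_iteration(P, R, discount=0.95, tol=1e-8):
        n_states = R[0].shape[0]
        V = np.zeros(n_states)
        while True:
            Q = np.array([R[a] + discount * P[a] @ V for a in range(len(P))])
            V_new = Q.max(axis=0)
            if np.max(np.abs(V_new - V)) < tol:
                return V_new, Q.argmax(axis=0)      # optimal values and policy
            V = V_new

    # Two-state, two-action example (e.g. harvest vs. no-harvest in a population model).
    P = [np.array([[0.7, 0.3], [0.4, 0.6]]),
         np.array([[0.9, 0.1], [0.2, 0.8]])]
    R = [np.array([1.0, 0.0]), np.array([0.5, 0.2])]
    values, policy = value_iteration(P, R)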

  10. Digital Screening and Halftone Techniques for Raster Processing,

    DTIC Science & Technology

    1980-01-14

    [OCR of the report cover page; recoverable information:] Digital Screening and Halftone Techniques for Raster Processing, by Richard L. Rosenthal, Army Engineer Topographic Laboratories, Fort Belvoir, VA, January 1980. Approved for public release; distribution unlimited.

  11. New methods, new methodology: Advanced CFD in the Snecma turbomachinery design process

    NASA Astrophysics Data System (ADS)

    Vuillez, Christophe; Petot, Bertrand

    1994-05-01

    CFD tools represent a significant source of improvements in the design process of turbomachinery components, leading to higher performances, cost and cycle savings as well as lower associated risks. Such methods are the backbone of compressor and turbine design methodologies at Snecma. In the 80's, the use of 3D Euler solvers was a key factor in designing fan blades with very high performance level. Counter rotating high speed propellers designed with this methodology reached measured performances very close to their ambitious objective from the first test series. In the late 80's and the beginning of the 90's, new, more powerful methods were rapidly developed and are now commonly used in the design process: a quasi-3D, compressible, transonic inverse method; quasi-3D and 3D Navier-Stokes solvers; 3D unsteady Euler solvers. As an example, several hundred 3D Navier-Stokes computations are run yearly for the design of low and high pressure compressor and turbine blades. In addition to their modelling capabilities, the efficient use of such methods in the design process comes from their close integration in the global methodology and from an adequate exploitation environment. Their validation, their calibration, and the correlations between different levels of modelling are of critical importance to an actual improvement in design know-how. The integration of different methods in the design process is described. Several examples of application illustrate their practical utilization. Comparisons between computational results and test results show their capabilities as well as their present limitations. The prospects linked to new developments currently under way are discussed.

  12. Methodology for the Elimination of Reflection and System Vibration Effects in Particle Image Velocimetry Data Processing

    NASA Technical Reports Server (NTRS)

    Bremmer, David M.; Hutcheson, Florence V.; Stead, Daniel J.

    2005-01-01

    A methodology to eliminate model reflection and system vibration effects from post processed particle image velocimetry data is presented. Reflection and vibration lead to loss of data, and biased velocity calculations in PIV processing. A series of algorithms were developed to alleviate these problems. Reflections emanating from the model surface caused by the laser light sheet are removed from the PIV images by subtracting an image in which only the reflections are visible from all of the images within a data acquisition set. The result is a set of PIV images where only the seeded particles are apparent. Fiduciary marks painted on the surface of the test model were used as reference points in the images. By locating the centroids of these marks it was possible to shift all of the images to a common reference frame. This image alignment procedure as well as the subtraction of model reflection are performed in a first algorithm. Once the images have been shifted, they are compared with a background image that was recorded under no flow conditions. The second and third algorithms find the coordinates of fiduciary marks in the acquisition set images and the background image and calculate the displacement between these images. The final algorithm shifts all of the images so that fiduciary mark centroids lie in the same location as the background image centroids. This methodology effectively eliminated the effects of vibration so that unbiased data could be used for PIV processing. The PIV data used for this work was generated at the NASA Langley Research Center Quiet Flow Facility. The experiment entailed flow visualization near the flap side edge region of an airfoil model. Commercial PIV software was used for data acquisition and processing. In this paper, the experiment and the PIV acquisition of the data are described. The methodology used to develop the algorithms for reflection and system vibration removal is stated, and the implementation, testing and
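
    A minimal sketch (not the authors' code) of the two core operations described: subtracting a reflection-only image from each PIV frame and shifting frames so that fiduciary-mark centroids coincide with those of the background image; the thresholding scheme for locating the marks is an illustrative assumption:

    import numpy as np
    from scipy.ndimage import center_of_mass, shift, label

    def remove_reflections(frame, reflection_image):
        # Subtract the image in which only reflections are visible, clip at zero.
        return np.clip(frame.astype(float) - reflection_image.astype(float), 0, None)

    def mark_centroid(image, threshold):
        """Centroid of the brightest connected blob, taken as the fiduciary mark."""
        mask = image > threshold
        labels, n = label(mask)
        sizes = np.bincount(labels.ravel())[1:]
        return np.array(center_of_mass(mask, labels, int(np.argmax(sizes)) + 1))

    def align_to_background(frame, background, threshold):
        # Shift the frame so its mark centroid lies on the background's centroid.
        displacement = mark_centroid(background, threshold) - mark_centroid(frame, threshold)
        return shift(frame, displacement, order=1, mode='nearest')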

  13. Rapid processing of letters, digits and symbols: what purely visual-attentional deficit in developmental dyslexia?

    PubMed

    Ziegler, Johannes C; Pech-Georgel, Catherine; Dufau, Stéphane; Grainger, Jonathan

    2010-07-01

    Visual-attentional theories of dyslexia predict deficits for dyslexic children not only for the perception of letter strings but also for non-alphanumeric symbol strings. This prediction was tested in a two-alternative forced-choice paradigm with letters, digits, and symbols. Children with dyslexia showed significant deficits for letter and digit strings but not for symbol strings. This finding is difficult to explain for visual-attentional theories of dyslexia which postulate identical deficits for letters, digits and symbols. Moreover, dyslexics showed normal W-shaped serial position functions for letter and digit strings, which suggests that their deficit is not due to an abnormally small attentional window. Finally, the size of the deficit was identical for letters and digits, which suggests that poor letter perception is not just a consequence of the lack of reading. Together then, our results show that symbols that map onto phonological codes are impaired (i.e. letters and digits), whereas symbols that do not map onto phonological codes are not impaired. This dissociation suggests that impaired symbol-sound mapping rather than impaired visual-attentional processing is the key to understanding dyslexia.

  14. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  15. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
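
    A minimal sketch of the Fourier-space steps common to both patent records above (transform, re-centre on the spatial-heterodyne carrier, filter around the new origin, inverse transform to recover the complex object wave); the carrier coordinates and filter radius are illustrative assumptions:

    import numpy as np

    def reconstruct(hologram, carrier_row, carrier_col, filter_radius):
        spectrum = np.fft.fftshift(np.fft.fft2(hologram))
        rows, cols = hologram.shape
        # Shift the spectrum so the carrier frequency sits at the Fourier-space origin.
        spectrum = np.roll(spectrum,
                           (rows // 2 - carrier_row, cols // 2 - carrier_col),
                           axis=(0, 1))
        r = np.hypot(*np.meshgrid(np.arange(rows) - rows // 2,
                                  np.arange(cols) - cols // 2, indexing='ij'))
        spectrum *= (r <= filter_radius)          # digital filter around the new origin
        # Complex image: abs() gives amplitude, angle() gives phase.
        return np.fft.ifft2(np.fft.ifftshift(spectrum))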

  16. Advancing Nursing Research in the Visual Era: Reenvisioning the Photovoice Process Across Phenomenological, Grounded Theory, and Critical Theory Methodologies.

    PubMed

    Evans-Agnew, Robin A; Boutain, Doris M; Rosemberg, Marie-Anne S

    Photovoice is a powerful research method that employs participant photography for advancing voice, knowledge, and transformative change among groups historically or currently marginalized. Paradoxically, this research method risks exploitation of participant voice because of weak methodology to method congruence. The purposes of this retrospective article are to revisit current interdisciplinary research using photovoice and to suggest how to advance photovoice by improving methodology-method congruence. Novel templates are provided for improving the photovoice process across phenomenological, grounded theory, and critical theory methodologies.

  17. Implementation of a Digital Signal Processing Subsystem for a Long Wavelength Array Station

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Navarro, Robert; D'Addario, Larry; Sigman, Elliott; Wang, Douglas

    2011-01-01

    This paper describes the implementation of a Digital Signal Processing (DSP) subsystem for a single Long Wavelength Array (LWA) station. The LWA is a radio telescope that will consist of many phased array stations. Each LWA station consists of 256 pairs of dipole-like antennas operating over the 10-88 MHz frequency range. The Digital Signal Processing subsystem digitizes up to 260 dual-polarization signals at 196 MHz from the LWA Analog Receiver, adjusts the delay and amplitude of each signal, and forms four independent beams. Coarse delay is implemented using a first-in-first-out buffer and fine delay is implemented using a finite impulse response filter. Amplitude adjustment and polarization corrections are implemented using a 2x2 matrix multiplication.
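
    A minimal sketch of the per-signal delay chain and beam sum described above, assuming a windowed-sinc FIR for the fractional-sample delay; filter length, delays and weights are illustrative, not the LWA station's actual parameters:

    import numpy as np
    from scipy.signal import lfilter

    def fractional_delay_fir(frac, taps=16):
        """Windowed-sinc FIR that delays a signal by `frac` samples (0 <= frac < 1)."""
        n = np.arange(taps) - (taps - 1) / 2
        h = np.sinc(n - frac) * np.hamming(taps)
        return h / h.sum()

    def delay_signal(x, total_delay_samples, taps=16):
        coarse = int(np.floor(total_delay_samples))
        fine = total_delay_samples - coarse
        x = np.concatenate([np.zeros(coarse), x[:len(x) - coarse]])   # FIFO-style coarse delay
        return lfilter(fractional_delay_fir(fine, taps), [1.0], x)    # FIR fine delay

    def form_beam(signals, delays, weights):
        # Weighted sum over antennas after per-signal delay alignment.
        return sum(w * delay_signal(x, d) for x, d, w in zip(signals, delays, weights))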

  18. Pulsed digital holography system recording ultrafast process of the femtosecond order

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Zhai, Hongchen; Mu, Guoguang

    2006-06-01

    We report, for the first time to our knowledge, a pulsed digital microholographic system with spatial angular multiplexing for recording the ultrafast process of the femtosecond order. The optimized design of the two sets of subpulse-train generators in this system makes it possible to implement a digital holographic recording with spatial angular multiplexing of a frame interval of the femtosecond order, while keeping the incident angle of the object beams unchanged. Three pairs of amplitude and phase images from the same view angle digitally reconstructed by the system demonstrated the ultrafast dynamic process of laser-induced ionization of ambient air at a wavelength of 800 nm, with a time resolution of 50 fs and a frame interval of 300 fs.

  19. Introduction to the Special Issue on Digital Signal Processing in Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Kocz, J.; Bailes, M.; Greenhill, L. J.

    Advances in astronomy are intimately linked to advances in digital signal processing (DSP). This special issue is focused upon advances in DSP within radio astronomy. The trend within that community is to use off-the-shelf digital hardware where possible and leverage advances in high performance computing. In particular, graphics processing units (GPUs) and field programmable gate arrays (FPGAs) are being used in place of application-specific circuits (ASICs); high-speed Ethernet and Infiniband are being used for interconnect in place of custom backplanes. Further, to lower hurdles in digital engineering, communities have designed and released general-purpose FPGA-based DSP systems, such as the CASPER ROACH board, ASTRON Uniboard, and CSIRO Redback board. In this introductory paper, we give a brief historical overview, a summary of recent trends, and provide an outlook on future directions.

  20. Evaluation of solar angle variation over digital processing of LANDSAT imagery. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1984-01-01

    The effects of the seasonal variation of illumination over digital processing of LANDSAT images are evaluated. Original images are transformed by means of digital filtering to enhance their spatial features. The resulting images are used to obtain an unsupervised classification of relief units. After defining relief classes, which are supposed to be spectrally different, topographic variables (declivity, altitude, relief range and slope length) are used to identify the true relief units existing on the ground. The samples are also clustered by means of an unsupervised classification option. The results obtained for each LANDSAT overpass are compared. Digital processing is highly affected by illumination geometry. There is no correspondence between relief units as defined by spectral features and those resulting from topographic features.

  1. Experimental study of digital image processing techniques for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Rifman, S. S. (Principal Investigator); Allendoerfer, W. B.; Caron, R. H.; Pemberton, L. J.; Mckinnon, D. M.; Polanski, G.; Simon, K. W.

    1976-01-01

    The author has identified the following significant results. Results are reported for: (1) subscene registration, (2) full scene rectification and registration, (3) resampling techniques, and (4) ground control point (GCP) extraction. Subscenes (354 pixels x 234 lines) were registered to approximately 1/4 pixel accuracy and evaluated by change detection imagery for three cases: (1) bulk data registration, (2) precision correction of a reference subscene using GCP data, and (3) independently precision processed subscenes. Full scene rectification and registration results were evaluated by using a correlation technique to measure registration errors of 0.3 pixel rms throughout the full scene. Resampling evaluations of nearest neighbor and TRW cubic convolution processed data included change detection imagery and feature classification. Resampled data were also evaluated for an MSS scene containing specular solar reflections.

  2. Characterization of Periodically Poled Nonlinear Materials Using Digital Image Processing

    DTIC Science & Technology

    2008-04-01

    Interactions Due to the nonlinear nature of the response, a nonlinear polarization at new frequencies is generated which can radiate at frequencies not...present in the incident radiation field. This coupling allows energy to be transferred between different wavelengths and forms the basis of the...physical mechanism behind these processes. An isolated atom would radiate in the typical dipole radiation pattern, but in a material, a large number of

  3. Comparison between digital Doppler filtering processes applied to radar signals

    NASA Astrophysics Data System (ADS)

    Desodt, G.

    1983-10-01

    Two families of Doppler processes based on FFT and FIR filters, respectively, are compared in terms of hardware complexity and performance. It is shown that FIR filter banks are characterized by better performance than FFT filter banks. For the same number of pulses, the FIR processor permits a better clutter rejection and greater bandwidth than the FFT one. Also, an FIR-based bank has a much simpler and more adaptable architecture than an FFT-based bank.
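
    A minimal sketch contrasting the two filter-bank structures on one burst of pulses; the FIR coefficient sets below are simple Hamming-windowed steering vectors chosen for illustration, not an optimized clutter-rejecting design:

    # An FFT across the pulse (slow-time) dimension forms N uniformly spaced
    # Doppler bins, whereas an FIR bank applies an independently designed
    # coefficient set per Doppler channel.
    import numpy as np

    def fft_doppler_bank(pulses):
        """pulses: array of N slow-time samples for one range gate."""
        return np.fft.fft(pulses)                       # one output per Doppler bin

    def fir_doppler_bank(pulses, coefficient_sets):
        """coefficient_sets: list of length-N FIR coefficient vectors, one per channel."""
        return np.array([np.dot(h, pulses) for h in coefficient_sets])

    N = 8
    pulses = np.exp(1j * 2 * np.pi * 0.2 * np.arange(N))    # target at 0.2 PRF Doppler
    window = np.hamming(N)
    coeffs = [window * np.exp(-1j * 2 * np.pi * k * np.arange(N) / N) for k in range(N)]
    print(np.abs(fft_doppler_bank(pulses)), np.abs(fir_doppler_bank(pulses, coeffs)))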

  4. Systems, Devices, and Materials for Digital Optical Processing.

    NASA Astrophysics Data System (ADS)

    Title, Mark Alan

    The massive parallelism and flexibility of three -dimensional optical communication may allow the development of new parallel computers free from the constraints of planar electronic technology. To bring the optical computer from possibility to reality, however, requires technological and scientific development in new optical systems, devices, and materials. We present here research results in each of these areas. First described is a prototype optical information processing system using CdS/liquid crystal spatial light modulators for optical logic and memory. This system has been developed as the first step in the implementation of a fine-grained, globally-interconnected optical processing element array. Notable system features include the implementation of programmable electronic control and the analysis of the optical power distribution within the processor, both directly applicable to the design of new and more advanced optical information processing systems. Next presented is the design and initial performance data for a new spatial light modulator combining an array of silicon phototransistors with the electro-optic material (Pb,La)(Zr,Ti)O _3, opening new possibilities for "intelligent" optical logic, memory, and switching devices. Important to the optimal performance of this Si/PLZT device is the fabrication of embedded electrodes in the electro-optic material, reducing the device operating voltage and switching energy while improving the uniformity of the optical modulation. An extensive computer model of embedded electrode performance and details of the electrode fabrication by reactive ion beam etching and electroless Ni deposition are presented. Finally, in the area of optical materials development we present initial results in the RF magnetron deposition of electro -optic PLZT on r-plane sapphire. This work is important to the fabrication of a monolithic, Si/PLZT-on-sapphire spatial light modulator, promising superior performance to devices using

  5. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    PubMed

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative, and streamlined workflow in order to provide high-quality patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of treatment workflow for improving the delivery systems. Project management techniques, a responsibility matrix, and a swim-lane activity diagram representing the sequence of activities can be combined for data collection, presentation, and evaluation of patient care. This paper presents such an integrated methodology, using multidisciplinary meetings and a walking-the-route approach for data collection, an integrated responsibility matrix and a swim-lane activity diagram with activity times for data representation, and a 5-why and gap analysis approach for data analysis. This enables collection of the right level of detail in a shorter time frame by identifying process flaws and deficiencies, while being independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate the effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal staff training and rapid implementation.

  6. Automated microstructural analysis of titanium alloys using digital image processing

    NASA Astrophysics Data System (ADS)

    Campbell, A.; Murray, P.; Yakushina, E.; Marshall, S.; Ion, W.

    2017-02-01

    Titanium is a material that exhibits many desirable properties, including a very high strength-to-weight ratio and corrosion resistance. However, the specific properties of any component depend upon the microstructure of the material, which varies with the manufacturing process. This means it is often necessary to analyse the microstructure when designing new processes or performing quality assurance on manufactured parts. For Ti6Al4V, grain size analysis is typically performed manually by expert material scientists as the complicated microstructure of the material means that, to the authors' knowledge, no existing software reliably identifies the grain boundaries. This manual process is time-consuming and offers low repeatability due to human error and subjectivity. In this paper, we propose a new, automated method to segment microstructural images of a Ti6Al4V alloy into its constituent grains and produce measurements. The results of applying this technique are evaluated by comparing the measurements obtained by different analysis methods. By using measurements from a complete manual segmentation as a benchmark, we explore the reliability of the current manual estimations of grain size and contrast this with improvements offered by our approach.
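
    The published pipeline is not reproduced in this record; the following is only an illustrative sketch of one common way to segment touching grains in a micrograph and report an equivalent grain size, using scikit-image and SciPy. The threshold choice, minimum object size, h-maxima height, and pixel size are all assumptions.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import filters, measure, morphology, segmentation

      def grain_size_estimate(gray_image, pixel_size_um=1.0):
          # Threshold to separate grains from boundaries (assumes bright grains).
          mask = gray_image > filters.threshold_otsu(gray_image)
          mask = morphology.remove_small_objects(mask, min_size=50)
          # Watershed on the distance transform splits touching grains.
          distance = ndi.distance_transform_edt(mask)
          markers, _ = ndi.label(morphology.h_maxima(distance, 2))
          labels = segmentation.watershed(-distance, markers, mask=mask)
          areas = [r.area * pixel_size_um ** 2 for r in measure.regionprops(labels)]
          return np.mean(areas), len(areas)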

  7. On the selection of tuning methodology of FOPID controllers for the control of higher order processes.

    PubMed

    Das, Saptarshi; Saha, Suman; Das, Shantanu; Gupta, Amitava

    2011-07-01

    In this paper, a comparative study is done on the time and frequency domain tuning strategies for fractional order (FO) PID controllers to handle higher order processes. A new fractional order template for reduced parameter modelling of stable minimum/non-minimum phase higher order processes is introduced and its advantage in frequency domain tuning of FOPID controllers is also presented. The time domain optimal tuning of FOPID controllers has also been carried out to handle these higher order processes by performing optimization with various integral performance indices. The paper highlights practical control system implementation issues, such as flexibility of online autotuning, reduced control signal and actuator size, capability of measurement noise filtration, load disturbance suppression, and robustness against parameter uncertainties, in light of the above tuning methodologies.
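
    For readers unfamiliar with the controller structure, the standard FOPID transfer function is C(s) = Kp + Ki/s^lambda + Kd*s^mu. The short Python sketch below simply evaluates this frequency response on the j-omega axis; the gains and fractional orders are illustrative values, not results from the paper.

      import numpy as np

      def fopid_freq_response(Kp, Ki, Kd, lam, mu, omega):
          """Frequency response of C(s) = Kp + Ki/s**lam + Kd*s**mu at s = j*omega."""
          s = 1j * omega
          return Kp + Ki / s**lam + Kd * s**mu

      omega = np.logspace(-2, 2, 400)
      C = fopid_freq_response(Kp=1.0, Ki=0.5, Kd=0.2, lam=0.8, mu=0.7, omega=omega)
      gain_db = 20 * np.log10(np.abs(C))
      phase_deg = np.degrees(np.angle(C))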

  8. Processing optimization of probiotic yogurt containing glucose oxidase using response surface methodology.

    PubMed

    Cruz, A G; Faria, J A F; Walter, E H M; Andrade, R R; Cavalcanti, R N; Oliveira, C A F; Granato, D

    2010-11-01

    Exposure to oxygen may induce a lack of functionality in probiotic dairy foods because, owing to the anaerobic metabolism of probiotic bacteria, it compromises during storage the maintenance of the viability needed to provide benefits to consumer health. Glucose oxidase constitutes a potential alternative for increasing the survival of probiotic bacteria in yogurt because it consumes the oxygen permeating into the pot during storage, making it possible to avoid the use of chemical additives. This research aimed to optimize the processing of probiotic yogurt supplemented with glucose oxidase using response surface methodology and to determine the levels of glucose and glucose oxidase that minimize the concentration of dissolved oxygen and maximize the Bifidobacterium longum count by the desirability function. Response surface methodology models adequately described the process, with adjusted determination coefficients of 83% for the oxygen model and 94% for the B. longum count model. Linear and quadratic effects of glucose oxidase were reported for the oxygen model, whereas for the B. longum count model a linear influence of glucose oxidase was observed, followed by quadratic influences of glucose and of glucose oxidase. The desirability function indicated that 62.32 ppm of glucose oxidase and 4.35 ppm of glucose were the best combination of these components for optimization of probiotic yogurt processing. An additional validation experiment was performed, and the results showed acceptable error between the predicted and experimental values.
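
    The core computation behind response surface methodology of this kind is fitting a second-order polynomial in the design factors by least squares. The sketch below shows that step for two coded factors; the design points and responses are placeholders, not the study's data, and the function name is hypothetical.

      import numpy as np

      def quadratic_design_matrix(x1, x2):
          # Columns: intercept, x1, x2, interaction, and the two quadratic terms.
          return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

      # Placeholder central-composite-style design (coded units) and responses.
      x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0])
      x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41])
      y = np.array([5.2, 4.8, 3.9, 3.5, 3.0, 3.1, 2.9, 5.5, 3.6, 4.4, 4.0])

      coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), y, rcond=None)
      # Predicted response at a candidate factor setting:
      y_hat = quadratic_design_matrix(np.array([0.5]), np.array([-0.2])) @ coeffs

    In practice the fitted surfaces for each response would then be combined through a desirability function and optimized over the factor space.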

  9. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of application of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.
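
    Of the algorithms listed, the conventional Wiener approach is easy to illustrate. The following is a minimal sketch of Wiener deconvolution with a constant noise-to-signal ratio; the PSF and the nsr value are assumptions, not quantities from the report.

      import numpy as np

      def wiener_deconvolve(image, psf, nsr=0.01):
          """Deconvolve `image` by `psf` with a constant noise-to-signal ratio."""
          H = np.fft.fft2(psf, s=image.shape)
          G = np.fft.fft2(image)
          W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener filter in the frequency domain
          return np.real(np.fft.ifft2(W * G))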

  10. High-speed optical processing using digital micromirror device

    NASA Astrophysics Data System (ADS)

    Chao, Tien-Hsin; Lu, Thomas; Walker, Brian; Reyes, George

    2014-04-01

    We have designed an optical processing architecture and algorithms utilizing the DMD as both the input and filter spatial light modulators (SLMs). A detailed system analysis will be presented. An experimental demonstration, for the first time, showing that a complex-valued spatial filter can be successfully written on the DMD SLM using a Computer Generated Hologram (CGH) [1] encoding technique will also be provided. The high resolution and high bandwidth provided by the DMD, and its potentially low cost due to mass production, will enable a wide range of defense and civil applications.

  11. Digital Methods of the Optimum Processing of Radar Signals,

    DTIC Science & Technology

    1985-02-07

    Table of contents (from the scanned document): Transliteration System, p. ii; Preface, p. 3; Chapter 1, Command of Troops and the Tasks of Processing Radar Signals, p. 7; Chapter 2, Arithmetic Operations with Binary Numbers, p. 16; Chapter 3 ... (the remainder of the scanned excerpt, a Cyrillic transliteration table, is not legible).

  12. User Evaluation: Summary of the Methodologies and Results for the Alexandria Digital Library, University of California at Santa Barbara.

    ERIC Educational Resources Information Center

    Hill, Linda L.; Dolin, Ron; Frew, James; Kemp, Randall B.; Larsgaard, Mary; Montello, Daniel R.; Rae, Mary-Anna; Simpson, Jason

    1997-01-01

    The collection and services of the Alexandria Digital Library (ADL) at the University of California Santa Barbara focus on geospatial information sources with links to geographic locations. Evaluation studies conducted within the ADL project are described as well as what was learned about user characteristics and reactions to ADL. (Author/AEF)

  13. Detecting Buried Archaeological Remains by the Use of Geophysical Data Processing with 'Diffusion Maps' Methodology

    NASA Astrophysics Data System (ADS)

    Eppelbaum, Lev

    2015-04-01

    observe that as a result of the above operations we embedded the original data into a 3-dimensional space where data related to the AT subsurface are well separated from the N data. This 3D set of data representatives can be used as a reference set for the classification of newly arriving data. Geophysically, it means a reliable division of the studied areas into those containing the AT objects and those not containing them (N). Testing this methodology for the delineation of archaeological cavities by magnetic and gravity data analysis demonstrated the effectiveness of the approach. References: Alperovich, L., Eppelbaum, L., Zheludev, V., Dumoulin, J., Soldovieri, F., Proto, M., Bavusi, M. and Loperte, A., 2013. A new combined wavelet methodology applied to GPR and ERT data in the Montagnole experiment (French Alps). Journal of Geophysics and Engineering, 10, No. 2, 025017, 1-17. Averbuch, A., Hochman, K., Rabin, N., Schclar, A. and Zheludev, V., 2010. A diffusion framework for detection of moving vehicles. Digital Signal Processing, 20, No. 1, 111-122. Averbuch, A.Z., Neittaanmäki, P., and Zheludev, V.A., 2014. Spline and Spline Wavelet Methods with Applications to Signal and Image Processing. Volume I: Periodic Splines. Springer. Coifman, R.R. and Lafon, S., 2006. Diffusion maps. Applied and Computational Harmonic Analysis, Special issue on Diffusion Maps and Wavelets, 21, No. 7, 5-30. Eppelbaum, L.V., 2011. Study of magnetic anomalies over archaeological targets in urban conditions. Physics and Chemistry of the Earth, 36, No. 16, 1318-1330. Eppelbaum, L.V., 2014a. Geophysical observations at archaeological sites: Estimating informational content. Archaeological Prospection, 21, No. 2, 25-38. Eppelbaum, L.V., 2014b. Four Color Theorem and Applied Geophysics. Applied Mathematics, 5, 358-366. Eppelbaum, L.V., Alperovich, L., Zheludev, V. and Pechersky, A., 2011. Application of informational and wavelet approaches for integrated processing of geophysical data in complex environments. Proceed
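
    A bare-bones diffusion-map embedding in the spirit of Coifman and Lafon (cited above) is sketched below: geophysical feature vectors are embedded into a low-dimensional space in which anomalous-target (AT) and background (N) measurements may separate. The kernel scale epsilon and the number of retained components are assumptions, and this is not the paper's exact implementation.

      import numpy as np

      def diffusion_map(X, epsilon=1.0, n_components=3):
          """Embed rows of X (samples x features) with a simple diffusion map."""
          d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
          K = np.exp(-d2 / epsilon)                    # Gaussian affinity kernel
          P = K / K.sum(axis=1, keepdims=True)         # row-normalised Markov matrix
          vals, vecs = np.linalg.eig(P)
          order = np.argsort(-vals.real)[1:n_components + 1]  # skip the trivial eigenvector
          return vecs.real[:, order] * vals.real[order]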

  14. Hybrid-integrated optical acceleration seismometer and its digital processing system

    NASA Astrophysics Data System (ADS)

    En, De; Chen, Caihe; Cui, Yuming; Tang, Donglin; Liang, Zhengxi; Gao, Hongyu

    2005-02-01

    A hybrid-integrated optical acceleration seismometer and its digital signal processing system have been researched and developed. A simplified system diagram of the seismometer is given and its principle is explained. The seismometer is composed of a seismic mass, integrated optical chips, and a Michelson interferometer light path; the Michelson integrated optical chips are the critical parts among the sensor elements. A simplified diagram of the digital signal processing system is also given. As a high-quality digital signal processing (DSP) chip equipped with the necessary circuits has been used in the digital signal processing system, highly accurate detection of the acceleration signal has been achieved and the environmental interference signal has been effectively compensated. Test results indicate that the accelerometer has a good frequency response well above the resonant frequency and that the output signal corresponds to the input signal; it also has a good frequency response below the resonant frequency. Finally, the seismometer frequency response curve is given.

  15. Digital image processing applications in the ignition and combustion of char/coal particles

    SciTech Connect

    Annamalai, K.; Kharbat, E.; Goplakrishnan, C.

    1992-12-01

    Digital image processing is employed in this research study in order to visually investigate the ignition and combustion characteristics of isolated char/coal particles as well as the effect of interactive combustion in two-particle char/coal arrays. Preliminary experiments are conducted on miniature isolated candles as well as two-candle arrays.

  16. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Requirements for systems used to process digitally signed orders. 1311.55 Section 1311.55 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF... system that is within five minutes of the official National Institute of Standards and Technology...

  17. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Requirements for systems used to process digitally signed orders. 1311.55 Section 1311.55 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF... system that is within five minutes of the official National Institute of Standards and Technology...

  18. Highly Stretchable and UV Curable Elastomers for Digital Light Processing Based 3D Printing.

    PubMed

    Patel, Dinesh K; Sakhaei, Amir Hosein; Layani, Michael; Zhang, Biao; Ge, Qi; Magdassi, Shlomo

    2017-04-01

    Stretchable UV-curable (SUV) elastomers can be stretched by up to 1100% and are suitable for digital-light-processing (DLP)-based 3D-printing technology. DLP printing of these SUV elastomers enables the direct creation of highly deformable complex 3D hollow structures such as balloons, soft actuators, grippers, and buckyball electronic switches.

  19. Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.

    ERIC Educational Resources Information Center

    Smith, Cecil L., Ed.

    Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programing); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…

  20. An Undergraduate Course and Laboratory in Digital Signal Processing with Field Programmable Gate Arrays

    ERIC Educational Resources Information Center

    Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.

    2010-01-01

    In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…

  1. Development of digital processing method of microfocus X-ray images

    NASA Astrophysics Data System (ADS)

    Staroverov, N. E.; Kholopova, E. D.; Gryaznov, A. Yu; Zhamova, K. K.

    2017-02-01

    The article describes the basic methods of digital processing of X-ray images. It also proposes a method for background image alignment based on modeling the distorting function and subtracting it from the image. In addition, an improved algorithm for locally adaptive median filtering is proposed, and its effectiveness has been verified experimentally.
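
    A rough Python sketch of the two steps described is given below. For simplicity the smooth background is estimated here by heavy Gaussian blurring rather than by the explicit model of the distorting function used in the article, and a plain (not locally adaptive) median filter is applied; all parameter values are assumptions.

      import numpy as np
      from scipy import ndimage

      def flatten_and_denoise(xray_image, background_sigma=50, median_size=3):
          # Estimate and subtract the slowly varying background field.
          background = ndimage.gaussian_filter(xray_image.astype(float), background_sigma)
          flattened = xray_image - background
          # Suppress impulsive noise with a median filter.
          return ndimage.median_filter(flattened, size=median_size)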

  2. Wavelet image processing applied to optical and digital holography: past achievements and future challenges

    NASA Astrophysics Data System (ADS)

    Jones, Katharine J.

    2005-08-01

    The link between wavelets and optics goes back to the work of Dennis Gabor, who both invented holography and developed Gabor decompositions. Holography involves 3-D images. Gabor decompositions involve 1-D signals. Gabor decompositions are the predecessors of wavelets. Wavelet image processing of holography, both optical holography and digital holography, will be examined with respect to past achievements and future challenges.

  3. Mathematics and Science Teachers' Perceptions about Using Drama during the Digital Story Creation Process

    ERIC Educational Resources Information Center

    Yuksekyalcin, Gozen; Tanriseven, Isil; Sancar-Tokmak, Hatice

    2016-01-01

    This case study investigated math and science teachers' perceptions about the use of creative drama during a digital story (DS) creation process for educational purposes. A total of 25 secondary science and math teachers were selected according to a criterion sampling strategy to participate in the study. Data were collected through an open-ended…

  4. Electronic post-compensation of WDM transmission impairments using coherent detection and digital signal processing.

    PubMed

    Li, Xiaoxu; Chen, Xin; Goldfarb, Gilad; Mateo, Eduardo; Kim, Inwoong; Yaman, Fatih; Li, Guifang

    2008-01-21

    A universal post-compensation scheme for fiber impairments in wavelength-division multiplexing (WDM) systems is proposed based on coherent detection and digital signal processing (DSP). Transmission of 10 x 10 Gbit/s binary-phase-shift-keying (BPSK) signals at a channel spacing of 20 GHz over 800 km dispersion shifted fiber (DSF) has been demonstrated numerically.

  5. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper deals with the collective work of specialists in medicine, applied mathematics, and computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  6. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  7. Language influences on numerical development—Inversion effects on multi-digit number processing

    PubMed Central

    Klein, E.; Bahnmueller, J.; Mann, A.; Pixner, S.; Kaufmann, L.; Nuerk, H.-C.; Moeller, K.

    2013-01-01

    In early numerical development, children have to become familiar with the Arabic number system and its place-value structure. The present review summarizes and discusses evidence for language influences on the acquisition of the highly transparent structuring principles of Arabic digits, as moderated by the transparency of the respective language's number word system. In particular, the so-called inversion property (i.e., 24 named as "four and twenty" instead of "twenty four") was found to influence number processing in children not only in verbal but also in non-verbal numerical tasks. Additionally, there is initial evidence suggesting that inversion-related difficulties may influence numerical processing longitudinally. Generally, language-specific influences in children's numerical development are most pronounced for multi-digit numbers. Yet, there is currently only one study on three-digit number processing for German-speaking children. A direct comparison of additional new data from Italian-speaking children further corroborates the assumption that language impacts cognitive (number) processing, as inversion-related interference was found to be most pronounced for German-speaking children. In sum, we conclude that numerical development may not be language-specific but seems to be moderated by language. PMID:23935585

  8. Modular Scanning Confocal Microscope with Digital Image Processing.

    PubMed

    Ye, Xianjun; McCluskey, Matthew D

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength.
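
    The "virtual pinhole" idea can be illustrated with a few lines of Python: at each scan position, only the CCD pixels inside a small radius around the spot centroid are summed, mimicking a physical pinhole. The function name and the radius are assumptions, not details from the paper.

      import numpy as np

      def virtual_pinhole_signal(ccd_frame, radius=3):
          """Sum the CCD counts inside a circular virtual pinhole around the spot centroid."""
          ys, xs = np.indices(ccd_frame.shape)
          total = ccd_frame.sum() + 1e-12
          cy = (ys * ccd_frame).sum() / total          # intensity-weighted centroid
          cx = (xs * ccd_frame).sum() / total
          mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
          return ccd_frame[mask].sum()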

  9. Modular Scanning Confocal Microscope with Digital Image Processing

    PubMed Central

    McCluskey, Matthew D.

    2016-01-01

    In conventional confocal microscopy, a physical pinhole is placed at the image plane prior to the detector to limit the observation volume. In this work, we present a modular design of a scanning confocal microscope which uses a CCD camera to replace the physical pinhole for materials science applications. Experimental scans were performed on a microscope resolution target, a semiconductor chip carrier, and a piece of etched silicon wafer. The data collected by the CCD were processed to yield images of the specimen. By selecting effective pixels in the recorded CCD images, a virtual pinhole is created. By analyzing the image moments of the imaging data, a lateral resolution enhancement is achieved by using a 20 × / NA = 0.4 microscope objective at 532 nm laser wavelength. PMID:27829052

  10. Digital Signal Processing for the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2015-08-01

    A broad international collaboration is building the Event Horizon Telescope (EHT). The aim is to test Einstein’s theory of General Relativity in one of the very few places it could break down: the strong gravity regime right at the edge of a black hole. The EHT is an earth-size VLBI array operating at the shortest radio wavelengths, that has achieved unprecedented angular resolution of a few tens of μarcseconds. For nearby super massive black holes (SMBH) this size scale is comparable to the Schwarzschild Radius, and emission in the immediate neighborhood of the event horizon can be directly observed. We give an introduction to the science behind the CASPER-enabled EHT, and outline technical developments, with emphasis on the secret sauce of high speed signal processing.

  11. A non-delta-chrome OPC methodology for process models with three-dimensional mask effects

    NASA Astrophysics Data System (ADS)

    Ng, Philip C. W.; Tsai, Kuen-Yu; Tang, Chih-Hsien; Melvin, Lawrence

    2010-04-01

    Delta-chrome optical proximity correction (OPC) has been widely adopted in lithographic patterning for semiconductor manufacturing. During the delta-chrome OPC iteration, a predetermined amount of chrome is added or subtracted from the mask pattern. With this chrome change, the change of exposure intensity error (IE) or the change of edge placement error (EPE) between the printed contour and the target pattern is then calculated based on the standard Kirchhoff approximation. Linear approximation is used to predict the amount of the proper chrome change to remove the correction error. This approximation can be very fast and effective, but must be performed iteratively to capture interactions between chrome changes. As integrated circuit (IC) design shrinks to the deep sub-wavelength regime, previously ignored nonlinear process effects, such as three-dimensional (3D) mask effects and resist development effects, become significant for accurate prediction and correction of proximity effects. These nonlinearities challenge the delta-chrome OPC methodology. The model response to mask pattern perturbation by linear approximation can be readily computed but is inaccurate. In fact, computation of the mask perturbation response becomes complex and expensive. A non-delta-chrome OPC methodology with IE-based feedback compensation is proposed. It determines the amount of the proper chrome change based on IE without intensive computation of the mask perturbation response. Its effectiveness in improving patterning fidelity and runtime is examined on a 50-nm practical circuit layout. In both the presence and the absence of nonlinear 3D mask effects, our results show the proposed non-delta-chrome OPC outperforms the delta-chrome one in terms of patterning fidelity and runtime. The results also demonstrate that process models with 3D mask effects limit the use of the delta-chrome OPC methodology.

  12. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is on the extension of the earlier proposal of the same title, August 1994-June 1996. The report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed there were a few items that needed to be completed prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA, and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large-scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two different fields. The breast field has more power in the lower octaves and the off-breast field behaves similarly to a wideband process, where more power is in the high frequency octaves. After the two fields are separated by lowpass filtering, a region labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases.
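
    A simplified sketch of the Task 1 idea (separating breast from background and keeping the largest contiguous region) is given below. It substitutes a Gaussian lowpass and a mean-based threshold for the report's noise-characteristic separation, so the function name, filter width, and threshold are assumptions.

      import numpy as np
      from scipy import ndimage

      def breast_mask(mammogram, sigma=10):
          # Lowpass filter, then a crude foreground threshold.
          smoothed = ndimage.gaussian_filter(mammogram.astype(float), sigma)
          mask = smoothed > smoothed.mean()
          # Region labeling: keep only the largest contiguous region (the breast).
          labels, n = ndimage.label(mask)
          if n == 0:
              return mask
          sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
          largest = int(np.argmax(sizes)) + 1
          return labels == largest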

  13. Realization of guitar audio effects using methods of digital signal processing

    NASA Astrophysics Data System (ADS)

    Buś, Szymon; Jedrzejewski, Konrad

    2015-09-01

    The paper is devoted to studies of the possibilities of realizing guitar audio effects by means of digital signal processing methods. As a result of this research, selected audio effects corresponding to the specifics of guitar sound were realized in a real-time system called the Digital Guitar Multi-effect. Before implementation in the system, the selected effects were investigated using a dedicated application with a graphical user interface created in the Matlab environment. In the second stage, the real-time system, based on a microcontroller and an audio codec, was designed and realized. The system is designed to perform audio effects on the output signal of an electric guitar.
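
    The specific effects implemented in the paper are not listed in this record; the following Python sketch shows two generic guitar effects of the kind typically realized in such systems, a soft-clipping distortion and a feedback delay, with arbitrary illustrative parameters.

      import numpy as np

      def soft_clip(signal, drive=5.0):
          """Soft-clipping distortion via a scaled tanh waveshaper."""
          return np.tanh(drive * np.asarray(signal, dtype=float)) / np.tanh(drive)

      def feedback_delay(signal, sample_rate=44100, delay_s=0.25, feedback=0.4, mix=0.5):
          """Simple feedback delay (echo) mixed with the dry signal."""
          signal = np.asarray(signal, dtype=float)
          delay_samples = int(delay_s * sample_rate)
          out = np.copy(signal)
          for n in range(delay_samples, len(signal)):
              out[n] += feedback * out[n - delay_samples]
          return (1 - mix) * signal + mix * out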

  14. Digital radar-gram processing for water pipelines leak detection

    NASA Astrophysics Data System (ADS)

    García-Márquez, Jorge; Flores, Ricardo; Valdivia, Ricardo; Carreón, Dora; Malacara, Zacarías; Camposeco, Arturo

    2006-02-01

    Ground penetrating radars (GPR) are useful underground exploration devices. Applications are found in archaeology, mine detection, and pavement evaluation, among others. Here we use a GPR to detect, in an indirect way, the anomalies caused by the presence of water in the neighborhood of an underground water pipeline. By Fourier transforming a GPR profile map we interpret the signal in terms of the spatial frequencies, instead of the temporal frequencies, that compose the profile map. This allows differentiating signals returning from a standard subsoil feature from those coming back from anomalous zones. Facilities in Mexican cities are commonly buried up to 2.5 m. Their constituent materials are PVC, concrete or metal, typically steel. GPRs are ultra-wide band devices; leak detection must be an indirect process since echoes due to the presence of underground zones with high moisture levels are masked by dense reflections (clutter). In radargrams the presence of water is visualized as anomalies in the neighborhood of the facility. Enhancement of these anomalies gives us the information required to detect leaks.

  15. A Novel Optical/digital Processing System for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Boone, Bradley G.; Shukla, Oodaye B.

    1993-01-01

    This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result, but, in addition, we show simulation results that calculate angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images from which length, width, area, aspect ratio, and orientation can be derived. In addition to circumventing scale and rotation distortions, these simulations indicate that the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation test and evaluation using simple synthetic object data will be described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
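
    A digital counterpart of the two operations can be sketched as follows: the Radon transform is taken via scikit-image, and a simple angular boundary histogram about the object centroid stands in for the paper's angular correlation algorithm. The threshold and angle sampling are assumptions.

      import numpy as np
      from skimage.transform import radon

      def radon_and_angular_signature(image, threshold=0.5):
          angles = np.arange(0.0, 180.0)
          sinogram = radon(image, theta=angles, circle=False)   # Radon transform
          # Angular signature: boundary-pixel count per angular bin about the centroid.
          ys, xs = np.nonzero(image > threshold)
          cy, cx = ys.mean(), xs.mean()
          theta = np.degrees(np.arctan2(ys - cy, xs - cx)) % 360
          signature, _ = np.histogram(theta, bins=360, range=(0, 360))
          return sinogram, signature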

  16. Matching rendered and real world images by digital image processing

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated images (CGI) have been used in commercial and industrial photography, providing a broad scope in product advertising. Mixing real world images with those rendered from virtual space software shows a more or less visible mismatch between the corresponding image quality performance. Rendered images are produced by software whose quality performance is limited only by the output resolution. Real world images are taken with cameras that introduce some amount of image degradation from factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color pattern demosaicing, etc. The effect of all those image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object with the system PSF, its characterization shows the amount of image degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real world image qualities. The system MTF is determined by the slanted edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different final image regions.
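
    The final filtering step can be sketched in a few lines: the rendered image is blurred with a Gaussian approximation of the capture system's PSF so that it visually matches the real photograph. In practice the sigma would be derived from the slanted-edge MTF measurement; here it is simply an assumed value, and the function name is hypothetical.

      from scipy import ndimage

      def match_rendered_to_real(rendered_rgb, psf_sigma_px=1.2):
          """Blur a rendered (H, W, 3) image with a Gaussian PSF approximation."""
          # Filter each colour channel with the same Gaussian; leave channels uncoupled.
          return ndimage.gaussian_filter(rendered_rgb, sigma=(psf_sigma_px, psf_sigma_px, 0))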

  17. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards for overthrust exploration. However, the seismic data recorded in many overthrust areas is disappointingly poor. Challenges such as rough topography, complex subsurface structure, presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  18. Engineering aesthetics and aesthetic ergonomics: theoretical foundations and a dual-process research methodology.

    PubMed

    Liu, Yili

    Although industrial and product designers are keenly aware of the importance of design aesthetics, they make aesthetic design decisions largely on the basis of their intuitive judgments and "educated guesses". Whilst ergonomics and human factors researchers have made great contributions to the safety, productivity, ease-of-use, and comfort of human-machine-environment systems, aesthetics is largely ignored as a topic of systematic scientific research in human factors and ergonomics. This article discusses the need for incorporating the aesthetics dimension in ergonomics and proposes the establishment of a new scientific and engineering discipline that we can call "engineering aesthetics". This discipline addresses two major questions: How do we use engineering and scientific methods to study aesthetics concepts in general and design aesthetics in particular? How do we incorporate engineering and scientific methods in the aesthetic design and evaluation process? This article identifies two special features that distinguish aesthetic appraisal of products and system designs from aesthetic appreciation of art, and lays out a theoretical foundation as well as a dual-process research methodology for "engineering aesthetics". Sample applications of this methodology are also described.

  19. Image digitalization and processing of contact lens fitting to astigmatic eyes

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.

    1998-01-01

    The use of standard CCD cameras and of image digitization and processing in medical diagnosis is more and more frequent. The correction of the human eye's refraction problems by the use of contact lenses is widespread. In spite of their advantages in terms of user comfort, special care must be taken in their prescription and adaptation. Astigmatic eyes often pose the greatest problems. A careful assessment of the quality of the lens-to-cornea adaptation must be performed. The basic and more traditional way to check the contact lens fitting is to perform a fluorescein test. We intend to make the process more convenient for both patient and optometrist. The fluorescence images are acquired by a CCD camera and then digitized and processed in order to produce a semi-automated process.

  20. Digital camera workflow for high dynamic range images using a model of retinal processing

    NASA Astrophysics Data System (ADS)

    Tamburrino, Daniel; Alleysson, David; Meylan, Laurence; Süsstrunk, Sabine

    2008-02-01

    We propose a complete digital camera workflow to capture and render high dynamic range (HDR) static scenes, from RAW sensor data to an output-referred encoded image. In traditional digital camera processing, demosaicing is one of the first operations done after scene analysis. It is followed by rendering operations, such as color correction and tone mapping. In our workflow, which is based on a model of retinal processing, most of the rendering steps are performed before demosaicing. This reduces the complexity of the computation, as only one third of the pixels are processed. This is especially important as our tone mapping operator applies local and global tone corrections, which is usually needed to render high dynamic range scenes well. Our algorithms efficiently process HDR images with different keys and different content.

  1. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from multi-channel plate (MCP) based time-of-flight detectors and ionization based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse processing algorithms to suit this application are described in detail.

  2. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  3. The association between children's numerical magnitude processing and mental multi-digit subtraction.

    PubMed

    Linsen, Sarah; Verschaffel, Lieven; Reynvoet, Bert; De Smedt, Bert

    2014-01-01

    Children apply various strategies to mentally solve multi-digit subtraction problems, and the efficient use of some of them may depend more or less on numerical magnitude processing. For example, the indirect addition strategy (solving 72-67 as "how much do I have to add up to 67 to get 72?"), which is particularly efficient when the two given numbers are close to each other, requires determining the proximity of these two numbers, a process that may depend on numerical magnitude processing. In the present study, children completed a numerical magnitude comparison task and a number line estimation task, in both a symbolic and a nonsymbolic format, to measure their numerical magnitude processing. We administered a multi-digit subtraction task, in which half of the items were specifically designed to elicit indirect addition. Partial correlational analyses, controlling for intellectual ability and motor speed, revealed significant associations between numerical magnitude processing and mental multi-digit subtraction. Additional analyses indicated that numerical magnitude processing was particularly important for those items for which the use of indirect addition is expected to be most efficient. Although this association was observed for both symbolic and nonsymbolic tasks, the strongest associations were found for the symbolic format, and they seemed to be more prominent on numerical magnitude comparison than on number line estimation.

  4. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Specifications for Digital Computer Software and Complex Electronics used in Safety Systems of Nuclear Power... COMMISSION Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of... comment draft regulatory guide (DG), DG-1210, ``Developing Software Life Cycle Processes for...

  5. Digital Storytelling as a Narrative Health Promotion Process: Evaluation of a Pilot Study.

    PubMed

    DiFulvio, Gloria T; Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah E; Del Toro-Mejias, Lizbeth Marie

    2016-04-01

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. The process of individuals telling their own stories has not been well assessed as a mechanism of health behavior change. This study looks at outcomes associated with engaging in the DST process for vulnerable youth. The project focused on the experiences of Puerto Rican Latinas between the ages of 15 and 21. A total of 30 participants enrolled in 4-day DST workshops, with 29 completing a 1 to 3-minute digital story. Self-reported data on several scales (self-esteem, social support, empowerment, and sexual attitudes and behaviors) were collected and analyzed. Participants showed an increase in positive social interactions from baseline to 3 months post-workshop. Participants also demonstrated increases in optimism and control over the future immediately after the workshop, but this change was not sustained at 3 months. Analysis of qualitative results and implications are discussed.

  6. On Process, Progress, Success and Methodology or the Unfolding of the Bologna Process as It Appears to Two Reasonably Benign Observers

    ERIC Educational Resources Information Center

    Neave, Guy; Amaral, Alberto

    2008-01-01

    This article examines the Bologna Process from two main perspectives: as a dynamic strategy as well as the unfolding of the methodology employed. It argues that the latter was largely determined by the former. Three phases of development are identified, the first two of which show that the methodology was largely determined by the need to bestow…

  7. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  8. Optimization of permeabilization process of yeast cells for catalase activity using response surface methodology

    PubMed Central

    Trawczyńska, Ilona; Wójcik, Marek

    2015-01-01

    Biotransformation processes using whole yeast cells as the biocatalyst are a promising area of the food industry. Among the chemical sanitizers currently used in food technology, hydrogen peroxide is a very effective microbicidal and bleaching agent. In this paper, permeabilization has been applied to Saccharomyces cerevisiae yeast cells, aiming at increased intracellular catalase activity for decomposing H2O2. Ethanol, which is non-toxic, biodegradable and easily available, has been used as the permeabilization agent. Response surface methodology (RSM) has been applied to determine the influence of different parameters on the permeabilization process. The aim of the study was to find values of the process parameters that would yield maximum catalase activity during decomposition of hydrogen peroxide. The optimum operating conditions for the permeabilization process obtained by RSM were as follows: ethanol concentration of 53% (v/v), temperature of 14.8 °C and treatment time of 40 min. After permeabilization, the activity of catalase increased ca. 40 times and its maximum value equalled 4711 U/g. PMID:26019618

  9. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    The distillery industry is recognized as one of the most polluting industries in India, with a large amount of annual effluent production. In the present study, the optimization of electrochemical treatment process variables is reported for treating the color and COD of distillery spent wash using Ti/Pt as an anode in batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operating variables, and chemical oxygen demand (COD) and color removal efficiency were considered as response variables for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm(2), electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L.

  10. Process optimization for high-pressure processing of black tiger shrimp (Penaeus monodon) using response surface methodology.

    PubMed

    Kaur, Barjinder Pal; Rao, P Srinivasa

    2016-10-06

    This study aims to investigate the effect of high-pressure processing on the quality of black tiger shrimp using response surface methodology. A central composite rotatable design was applied to evaluate the effects of three processing parameters, namely pressure (300-600 MPa), temperature (30-50 ℃), and time (0-15 min), on the inactivation rate of Staphylococcus aureus and physical properties (color and texture) of shrimp and to optimize the process conditions to achieve maximum bacterial inactivation with minimal changes in quality attributes. The results revealed that the processing conditions significantly affected the studied responses and the experimental data have been adequately fitted into a second-order polynomial model with multiple regression coefficients (R(2)) of 0.92, 0.92, and 0.94 for the inactivation rate of S. aureus, hardness, and color changes, respectively. The optimized conditions targeting minimum six log cycle reductions of S. aureus with moderate changes in quality attributes were obtained as: pressure, 361 MPa; time, 12 min and temperature, 46 ℃. The adequacy of the model equation for predicting the optimum response values was verified effectively by the validation data.

  11. Optimization of Extraction Process for Polysaccharide in Salvia Miltiorrhiza Bunge Using Response Surface Methodology.

    PubMed

    Yanhua, Wang; Fuhua, Wu; Zhaohan, Guo; Mingxing, Peng; Yanan, Zhang; Ling, Pang Zhen; Minhua, Du; Caiying, Zhang; Zian, Liang

    2014-01-01

    This study aimed to optimize the extraction process for Salvia miltiorrhiza Bunge polysaccharide using response surface methodology. The results showed that the operating parameters, including microwave power, microwave time and particle size, had notable effects on the polysaccharide extraction of Salvia miltiorrhiza Bunge. The effects could be ranked in decreasing order of importance as follows: microwave power > microwave time > comminution degree. The optimal extraction parameters were determined as a microwave power of 573.83 W, a microwave time of 8.4 min and a comminution degree of 67.51 mesh, resulting in a Salvia miltiorrhiza Bunge polysaccharide yield of 101.161 mg/g. The established regression model describing polysaccharide extraction as a function of the three extraction parameters was highly significant (R2 = 0.9953). The predicted and experimental results were found to be in good agreement. Thus, the model is applicable for the prediction of polysaccharide extraction from Salvia miltiorrhiza Bunge.

  12. Optimization of Extraction Process for Polysaccharide in Salvia Miltiorrhiza Bunge Using Response Surface Methodology.

    PubMed

    Yanhua, Wang; Fuhua, Wu; Zhaohan, Guo; Mingxing, Peng; Yanan, Zhang; Ling, Pang Zhen; Minhua, Du; Caiying, Zhang; Zian, Liang

    2015-01-01

    This study aimed to optimize the extraction process for Salvia miltiorrhiza Bunge polysaccharide using response surface methodology. The results showed that the operating parameters, including microwave power, microwave time and particle size, had notable effects on the polysaccharide extraction of Salvia miltiorrhiza Bunge. The effects could be ranked in decreasing order of importance as follows: microwave power > microwave time > comminution degree. The optimal extraction parameters were determined as a microwave power of 573.83 W, a microwave time of 8.4 min and a comminution degree of 67.51 mesh, resulting in a Salvia miltiorrhiza Bunge polysaccharide yield of 101.161 mg/g. The established regression model describing polysaccharide extraction as a function of the three extraction parameters was highly significant (R2 = 0.9953). The predicted and experimental results were found to be in good agreement. Thus, the model is applicable for the prediction of polysaccharide extraction from Salvia miltiorrhiza Bunge.

  13. Process and methodology of developing Cassini G and C Telemetry Dictionary

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    While the Cassini spacecraft telemetry design had taken on the new approach of 'packetized telemetry', the AACS (Attitude and Articulation Subsystem) had further extended into the design of 'mini-packets' in its telemetry system. Such telemetry packet and mini-packet design produced the AACS Telemetry Dictionary; iterations of the latter in turn provided changes to the former. The ultimate goals were to achieve maximum telemetry packing density, optimize the 'freshness' of more time-critical data, and to effect flexibility, i.e., multiple AACS data collection schemes, without needing to change the overall spacecraft telemetry mode. This paper describes such a systematic process and methodology, evidenced by various design products related to, or as part of, the AACS Telemetry Dictionary.

  14. Electronic polarization-division demultiplexing based on digital signal processing in intensity-modulation direct-detection optical communication systems.

    PubMed

    Kikuchi, Kazuro

    2014-01-27

    We propose a novel configuration of optical receivers for intensity-modulation direct-detection (IM · DD) systems, which can cope with dual-polarization (DP) optical signals electrically. Using a Stokes analyzer and a newly-developed digital signal-processing (DSP) algorithm, we can achieve polarization tracking and demultiplexing in the digital domain after direct detection. Simulation results show that the power penalty stemming from digital polarization manipulations is negligibly small.

  15. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  16. Two-dimensional quantification of the corrosion process in metal surfaces using digital speckle pattern interferometry

    SciTech Connect

    Andres, N.; Lobera, J.; Arroyo, M. P.; Angurel, L. A.

    2011-04-01

    The applicability of digital speckle pattern interferometry (DSPI) to the analysis of surface corrosion processes has been evaluated by studying the evolution of an Fe surface immersed in sulfuric acid. This work describes the analysis process required to obtain quantitative information about the corrosion process. It has been possible to evaluate the corrosion rate, and the results agree with those derived from the weight loss method. In addition, a two-dimensional analysis has been applied, showing that DSPI measurements can be used to extract information about the corrosion rate at any region of the surface.

  17. Searching early bone metastasis on plain radiography by using digital imaging processing

    SciTech Connect

    Jaramillo-Nunez, A.; Perez-Meza, M.

    2012-10-23

    Some authors mention that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis presenting discomfort in the right shoulder. The time period between the first and second radiographs was approximately one month, and between the first and third approximately one year. This procedure is a first approach to determining whether, in this particular case, it was possible to detect an early bone metastasis. The obtained results suggest that, by carrying out digital processing, it is possible to detect the metastasis, since the radiograph contains the information even though it cannot be observed visually.

  18. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to developing cooperation between world-class experts, young scientists, students and post-graduate students, and to exchanging information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University), Saint Petersburg (Russia); Chernyshevsky Saratov State University, Saratov (Russia); and the National Research Nuclear University "MEPhI" (NRNU MEPhI), Moscow (Russia). The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  19. Human blood group typing based on digital photographs of RBC agglutination process

    NASA Astrophysics Data System (ADS)

    Doubrovski, V. A.; Dolmashkin, A. A.

    2010-08-01

    A method for the monitoring of the human erythrocyte agglutination reaction in vitro, which is the basis for determining the human blood type (group), is proposed. The method is based on a statistical analysis of digital photographs of the agglutination process. It is experimentally shown that this analysis of photographs makes it possible to determine the probability that the agglutination reaction of erythrocytes of the studied specimen of blood with corresponding isohemagglutinating sera does occur. To increase the rate of the agglutination reaction of erythrocytes and to improve the sensitivity of the method of monitoring, the bioobject under study is subjected to the action of ultrasonic waves, as was previously proposed by the authors, and the result of the erythrocyte agglutination reaction is read optically. It is shown that, in principle, the method of statistical processing of digital photographs can be used to develop devices for automatic human blood typing in the AB0 and Rh systems.

  20. Searching early bone metastasis on plain radiography by using digital imaging processing

    NASA Astrophysics Data System (ADS)

    Jaramillo-Núñez, A.; Pérez-Meza, M.

    2012-10-01

    Some authors state that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis discomfort in the right shoulder. The time elapsed between the first and second radiographs was approximately one month, and between the first and the third approximately one year. This procedure is a first approach to determining whether, in this particular case, it was possible to detect an early bone metastasis. The results suggest that digital processing makes it possible to detect the metastasis, since the radiograph contains the information even though it cannot be observed visually.

  1. Recent developments at JPL in the application of digital image processing techniques to astronomical images

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.; Benton, W. D.

    1976-01-01

    Several techniques of a digital image-processing nature are illustrated which have proved useful in visual analysis of astronomical pictorial data. Processed digital scans of photographic plates of Stephans Quintet and NGC 4151 are used as examples to show how faint nebulosity is enhanced by high-pass filtering, how foreground stars are suppressed by linear interpolation, and how relative color differences between two images recorded on plates with different spectral sensitivities can be revealed by generating ratio images. Analyses are outlined which are intended to compensate partially for the blurring effects of the atmosphere on images of Stephans Quintet and to obtain more detailed information about Saturn's ring structure from low- and high-resolution scans of the planet and its ring system. The employment of a correlation picture to determine the tilt angle of an average spectral line in a low-quality spectrum is demonstrated for a section of the spectrum of Uranus.
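
    Two of the operations mentioned above (high-pass filtering to bring out faint nebulosity and ratio imaging to reveal relative colour differences between plates) are easy to illustrate on synthetic data. The sigma values and the synthetic "plates" below are assumptions chosen only for demonstration, not the JPL processing parameters.

```python
# Minimal sketch of two enhancement steps described above (high-pass filtering
# and ratio imaging) applied to synthetic plate scans; parameters are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)

# Synthetic "plate scans": smooth faint nebulosity plus noise, recorded with
# two different (simulated) spectral sensitivities.
y, x = np.mgrid[0:256, 0:256]
nebula = 0.05 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 60.0 ** 2))
plate_a = 1.0 + nebula + 0.01 * rng.standard_normal((256, 256))
plate_b = 1.0 + 0.8 * nebula + 0.01 * rng.standard_normal((256, 256))

# High-pass filter: subtract a heavily smoothed background so faint,
# small-scale nebulosity stands out against the large-scale sky level.
background = gaussian_filter(plate_a, sigma=30)
high_pass = plate_a - background

# Ratio image: relative differences between the two plates reveal colour
# variations across the field.
ratio = plate_a / np.clip(plate_b, 1e-6, None)

print("high-pass rms:", high_pass.std(), "ratio range:", ratio.min(), ratio.max())
```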

  2. Processing of post-consumer HDPE reinforced with wood flour: Effect of functionalization methodology

    NASA Astrophysics Data System (ADS)

    Catto, A. L.; Montagna, L. S.; Rossini, K.; Santana, R. M. C.

    2014-05-01

    A very interesting route for reusing waste cellulose derivatives such as wood flour is their incorporation into a thermoplastic matrix. Because olefinic polymers have no interaction with cellulose derivatives, chemical treatments have been used to modify vegetable fibers and increase the interfacial adhesion between the cellulosic reinforcement and the polymeric matrix. In this sense, the objective of this study was to evaluate the influence of the methodology used to incorporate the compatibilizer agent (CA) into the polyolefin matrix and to evaluate the mechanical and morphological properties of the composites. HDPE, wood flour of the Eucalyptus grandis species (EU) and polyethylene grafted with maleic anhydride (CA) were used in the composites, which were extruded and then injection molded. The mixtures were processed in a single-screw extruder (L/D: 22) with a temperature profile from 170 °C to 190 °C. In a first step, the materials were processed together in the extruder, and the samples were then injection molded at a temperature of 185 °C and a pressure of 600 bar. In a second step, the HDPE and the compatibilizer agent were first processed in the extruder in order to functionalize the polyolefin, and the sieved wood flour (EU) was added afterwards (30% w/w). Results showed that composites with CA had higher mechanical performance than the non-compatibilized ones, and that composites whose matrix was previously compatibilized with CA in the extruder performed better than those where the polymer matrix was not previously compatibilized.

  3. Process optimization via response surface methodology in the treatment of metal working industry wastewater with electrocoagulation.

    PubMed

    Guvenc, Senem Yazici; Okut, Yusuf; Ozak, Mert; Haktanir, Birsu; Bilgili, Mehmet Sinan

    2017-02-01

    In this study, process parameters in chemical oxygen demand (COD) and turbidity removal from metal working industry (MWI) wastewater were optimized by electrocoagulation (EC) using aluminum, iron and steel electrodes. The effects of process variables on COD and turbidity were investigated by developing a mathematical model using the central composite design method, which is one of the response surface methodologies. Variance analysis was conducted to identify the interaction between process variables and model responses and the optimum conditions for COD and turbidity removal. Second-order regression models were developed via the Statgraphics Centurion XVI.I software program to predict COD and turbidity removal efficiencies. Under the optimum conditions, removal efficiencies obtained from aluminum electrodes were found to be 76.72% for COD and 99.97% for turbidity, while the removal efficiencies obtained from iron electrodes were found to be 76.55% for COD and 99.9% for turbidity and the removal efficiencies obtained from steel electrodes were found to be 65.75% for COD and 99.25% for turbidity. Operational costs at optimum conditions were found to be 4.83, 1.91 and 2.91 €/m³ for aluminum, iron and steel electrodes, respectively. The iron electrode was found to be more suitable for MWI wastewater treatment in terms of operational cost and treatment efficiency.
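
    The central-composite / second-order regression modelling referred to above can be sketched in a few lines. The design points and response values below are placeholders, not data from the study, and the fit is ordinary least squares rather than the Statgraphics workflow the authors used.

```python
# Illustrative sketch of fitting a second-order response-surface model of the
# kind used in the study; the data below are placeholders, not study values.
import numpy as np

# Coded process variables (e.g. current density, time) and a measured
# response (e.g. COD removal, %) for a handful of design points.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [0, 0], [0, 0], [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41]])
y = np.array([52.0, 61.0, 58.0, 70.0, 75.0, 74.0, 55.0, 68.0, 57.0, 66.0])

# Design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Locate the optimum of the fitted surface on a coded-variable grid.
g1, g2 = np.meshgrid(np.linspace(-1.5, 1.5, 61), np.linspace(-1.5, 1.5, 61))
pred = (coef[0] + coef[1]*g1 + coef[2]*g2 +
        coef[3]*g1**2 + coef[4]*g2**2 + coef[5]*g1*g2)
k = np.unravel_index(np.argmax(pred), pred.shape)
print("fitted optimum (coded):", g1[k], g2[k], "predicted removal:", pred[k])
```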

  4. Computational analysis of Pelton bucket tip erosion using digital image processing

    NASA Astrophysics Data System (ADS)

    Shrestha, Bim Prasad; Gautam, Bijaya; Bajracharya, Tri Ratna

    2008-03-01

    Erosion of hydro turbine components by sand-laden river water is one of the biggest problems in the Himalayas. Even with sediment trapping systems, complete removal of fine sediment from water is impossible and uneconomical; hence most turbine components in Himalayan rivers are exposed to sand-laden water and subject to erosion. Pelton buckets, which are widely used in different hydropower plants, erode under the continuous presence of sand particles in the water. This erosion causes an increase in splitter thickness, which theoretically should be zero. The increase in splitter thickness gives rise to back-hitting of water and a consequent decrease in turbine efficiency. This paper describes the process of measuring sharp edges such as the bucket tip using digital image processing. An image of each bucket is captured, the bucket is then run for 72 hours with the sand concentration in the water hitting the bucket closely controlled and monitored, and the image of the test bucket is then taken under the same conditions. The process is repeated 10 times. The digital image processing applied here encompasses image enhancement in both the spatial and frequency domains, as well as processes that extract attributes from the images, up to and including the measurement of the splitter tip. Image processing was carried out on the MATLAB 6.5 platform. The results show that edge erosion of sharp edges can be detected and quantified accurately, and that the erosion profile can be generated, using image processing techniques.
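
    As a rough illustration of the measurement idea (not the authors' MATLAB 6.5 implementation), the sketch below thresholds a synthetic bucket image and counts the splitter width in pixels before and after a simulated test run; the threshold, geometry and pixel values are assumed.

```python
# Minimal sketch (assumed, synthetic data): measuring splitter-tip thickness
# in pixels from a thresholded bucket image, before and after an erosion test.
import numpy as np

def splitter_thickness(image, tip_row, threshold):
    """Count foreground pixels across the splitter at a given image row."""
    mask = image > threshold          # simple global threshold (simplification)
    return int(mask[tip_row].sum())

rng = np.random.default_rng(2)
h, w = 200, 200
before = 0.1 * rng.random((h, w))
after = 0.1 * rng.random((h, w))
before[:, 99:101] = 1.0               # 2-pixel-wide splitter before the test
after[:, 97:103] = 1.0                # thicker (blunted) splitter after the test

t0 = splitter_thickness(before, tip_row=10, threshold=0.5)
t1 = splitter_thickness(after, tip_row=10, threshold=0.5)
print(f"tip thickness: {t0} px before, {t1} px after, growth {t1 - t0} px")
```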

  5. Design of film-micro-cavity interleaver filters based on digital signal processing

    NASA Astrophysics Data System (ADS)

    Hua, Dong; Zhang, Juan

    2016-01-01

    Based on digital signal processing theory, a novel and simple method is proposed to design asymmetrical film-microcavity interleaver filters (FMCIF) with arbitrary duty cycles. The transmission spectra obtained have the characteristics of high isolation, a high degree of rectangularity and a wide, flat passband. Design examples with different duty cycles are given and the influence of some key parameters on the spectral performance is discussed. The proposed method is simpler and more efficient than existing methods.

  6. Equalization-enhanced phase noise for coherent-detection systems using electronic digital signal processing.

    PubMed

    Shieh, William; Ho, Keang-Po

    2008-09-29

    In coherent optical systems employing electronic digital signal processing, fiber chromatic dispersion can be gracefully compensated in the electronic domain without resorting to optical techniques. Unlike an optical dispersion compensator, however, the electronic equalizer enhances the impairments from laser phase noise. This equalization-enhanced phase noise (EEPN) imposes a tighter constraint on the receive-laser phase noise for transmission systems with high symbol rates and large electronically compensated chromatic dispersion.

  7. Optical mapping of Kamchatka's volcanic deposits using digitally processed band-selective photographs

    NASA Astrophysics Data System (ADS)

    Ivanov, S. I.; Novikov, V. V.; Popov, A. P.; Tadzhidinov, Kh. G.

    1989-08-01

    A procedure is described for the digital processing of band-selective aerial photographs of volcano-bearing surfaces. The brightness and color parameters of samples of volcanic rocks and soils in their natural bedding are examined, and the results of two-parameter (albedo-color) mapping for an area around the Tolbachin Volcano are discussed. It is shown that the information obtained with this procedure yields accurate predictions of geochemical properties of volcanic deposits from optical data.

  8. Universal Michelson Gires-Tournois interferometer optical interleaver based on digital signal processing.

    PubMed

    Zhang, Juan; Yang, Xiaowei

    2010-03-01

    Optical interleavers based on Michelson Gires-Tournois interferometer (MGTI) with arbitrary cascaded reflectors for symmetrical or asymmetrical periodic frequency response with arbitrary duty cycles are defined as universal MGTI optical interleaver (UMGTIOI). It can significantly enhance flexibility and applicability of optical networks. A novel and simple method based on digital signal processing is proposed for the design of UMGTIOI. Different kinds of design examples are given to confirm effectiveness of the method.

  9. Digital Image Manipulation, Analysis and Processing Systems (DIMAPS) A research-oriented, experimental image-processing system

    NASA Astrophysics Data System (ADS)

    Dave, J. V.

    1985-04-01

    The acronym DIMAPS stands for the group of experimental Digital Image Manipulation, Analysis and Processing Systems developed at the IBM Scientific Center in Palo Alto, California. These are FORTRAN-based, dialog-driven, fully interactive programs for the IBM 4341 (or equivalent) computer running under VM/CMS or MVS/TSO. The work station consists of three screens (alphanumeric, high-resolution vector graphics, and high-resolution color display), plus a digitizing graphics tablet, cursor controllers, keyboards, and hard copy devices. The DIMAPS software is 98% FORTRAN, thus facilitating maintenance, system growth, and transportability. The original DIMAPS and its modified versions contain functions for the generation, display and comparison of multiband images, and for the quantitative as well as graphic display of data in a selected section of the image under study. Several functions for performing data modification and/or analysis tasks are also included. Some high-level image processing and analysis functions such as the generation of shaded-relief images, unsupervised multispectral classification, scene-to-scene or map-to-scene registration of multiband digital data, extraction of texture information using a two-dimensional Fourier transform of the band data, and reduction of random noise from multiband data using phase agreement among their Fourier coefficients, were developed as adjuncts to DIMAPS.

  10. Digital image manipulation, analysis and processing systems (DIMAPS) - A research-oriented, experimental image-processing system

    NASA Astrophysics Data System (ADS)

    Dave, J. V.

    1985-12-01

    Digital Image Manipulation, Analysis and Processing Systems DIMAPS are FORTRAN-based, dialog-driven, fully interactive programs for the IBM 4341 (or equivalent) computer running under VM/CMS or MVS/TSO. The work station consists of three screens (alphanumeric, high-resolution vector graphics, and high-resolution color display), together with a digitizing graphics tablet, cursor controllers, keyboards, and hard copy devices. The DIMAPS software is 98-percent FORTRAN, thus facilitating maintenance, system growth, and transportability. The original DIMAPS and its modified versions contain functions for the generation, display and comparison of multiband images, and for the quantitative as well as graphic display of data in a selected section of the image under study. Several functions for performing data modification and/or analysis tasks are also included. Some high-level image processing and analysis functions such as the generation of shaded-relief images, unsupervised multispectral classification, scene-to-scene or map-to-scene registration of multiband digital data, extraction of texture information using a two-dimensional Fourier transform of the band data, and reduction of random noise from multiband data using phase agreement among their Fourier coefficients, were developed as adjuncts to DIMAPS.

  11. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT).

  12. Multichannel parallel free-space VCSEL optoelectronic interconnects for digital data transmission and processing

    NASA Astrophysics Data System (ADS)

    Liu, J. Jiang; Lawler, William B.; Riely, Brian P.; Chang, Wayne H.; Shen, Paul H.; Newman, Peter G.; Taysing-Lara, Monica A.; Olver, Kimberly; Koley, Bikash; Dagenais, Mario; Simonis, George J.

    2000-07-01

    A free-space integrated optoelectronic interconnect was built to explore parallel data transmission and processing. This interconnect comprises an 8 X 8 substrate-emitting 980-nm InGaAs/GaAs quantum-well vertical-cavity surface-emitting laser (VCSEL) array and an 8 X 8 InGaAs/InP P-I-N photodetector array. Both VCSEL and detector arrays were flip-chip bonded onto the complementary metal-oxide-semiconductor (CMOS) circuitry, packaged in pin-grid array packages, and mounted on customized printed circuit boards. Individual data rates as high as 1.2 Gb/s on the VCSEL/CMOS transmitter array were measured. After the optical alignment, we carried out serial and parallel transmissions of digital data and live video scenes through this interconnect between two computers. Images captured by a CCD camera were digitized to 8-bit data signals and transferred in serial bit-streams through multiple channels in this parallel VCSEL-detector optical interconnect. A data processing algorithm of edge detection was attempted during the data transfer. Final images were reconstructed from the optically transmitted and processed digital data. Although the transmitter and detector offered much higher data rates, we found that the overall image transfer rate was limited by the CMOS receiver circuits. A new design for the receiver circuitry was accomplished and submitted for fabrication.

  13. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
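
    A drastically simplified sketch of the pipeline described above is given below: blockwise DCT, conversion to threshold units, differencing of the reference and processed frames, and Minkowski pooling. The threshold matrix, pooling exponent and test frames are placeholders, not the patented DVQ parameters, and the temporal filtering and contrast-masking stages are omitted.

```python
# Highly simplified sketch of DVQ-style error pooling: blockwise DCT,
# normalisation by (assumed) visual thresholds, differencing, Minkowski pooling.
import numpy as np
from scipy.fft import dctn

def block_dct(frame, block=8):
    """Split a grayscale frame into block x block tiles and DCT each tile."""
    h, w = frame.shape
    tiles = frame[:h - h % block, :w - w % block]
    tiles = tiles.reshape(h // block, block, w // block, block).swapaxes(1, 2)
    return dctn(tiles, axes=(-2, -1), norm="ortho")

rng = np.random.default_rng(3)
reference = rng.random((64, 64))
processed = reference + 0.02 * rng.standard_normal((64, 64))  # "compressed" frame

# Placeholder per-frequency visual thresholds (higher for higher frequencies).
thresholds = 0.05 * (1.0 + np.arange(8)[:, None] + np.arange(8)[None, :])

err = (block_dct(processed) - block_dct(reference)) / thresholds  # threshold units
quality_error = np.mean(np.abs(err) ** 4) ** 0.25                 # Minkowski pool
print("pooled perceptual error (threshold units):", quality_error)
```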

  14. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topographic modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends extracting from the original LANDSAT data the component resulting from target reflectance. Considering that the role of topographic modulation in the pixel information varies with the solar incidence angle, the results of this digital processing technique will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected for digital processing on the Image-100 System. A secondary watershed (Rio Bocaina) draining into the Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate for MSS data acquired under higher Sun elevation angles. Applied to low Sun elevation angles, the topographic modulation technique lessens rather than enhances topography.

  15. The effect of image processing on the detection of cancers in digital mammography.

    PubMed

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  16. An integrated approach to safety-driven and ICT-enabled process reengineering: methodological advice and a case study.

    PubMed

    Langer, M; Castellari, R; Locatelli, P; Sini, E; Torresani, M; Facchini, R; Moser, R

    2014-01-01

    Patient safety is a central concern inside any healthcare environment. With the progress of Information and Communication Technologies (ICTs), new solutions have become available to support care and management processes. Analyzing process risks helps identifying areas of improvement and provides ICT-solutions design with indications on what portions of the process need primary interventions. Understanding the link between process reengineering, technology assessment of enabling technologies and risk management allows user acceptance and patient safety improvements. Fondazione IRCCS Istituto Nazionale dei Tumori (INT), offers a good example of process reengineering driven by the purpose of increasing patient safety, enabled by new technologies. A pillar of the evolution of ICT process support at INT is based on Radio Frequency Identification technologies, implemented to identify and trace items and people across processes. This paper will present an integrated approach, based on process reengineering methodologies and risk assessment studies, and methodological advice applied to a case of surgical kits management procedures.

  17. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    PubMed

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources.

  18. Defining the Medical Library Association research agenda: methodology and final results from a consensus process

    PubMed Central

    Eldredge, Jonathan D.; Harris, Martha R.; Ascher, Marie T.

    2009-01-01

    Objective: Using a group consensus methodology, the research sought to generate a list of the twelve to fifteen most important and answerable research questions in health sciences librarianship as part of a broader effort to implement the new Medical Library Association (MLA) research policy. Methods: The Delphi method was used. The committee distributed a brief survey to an estimated 827 MLA leaders and 237 MLA Research Section members, requesting they submit what they considered to be the most important and answerable research questions facing the profession. The submitted questions were then subjected to 2 rounds of voting to produce a short list of top-ranked questions. Results: The survey produced 62 questions from 54 MLA leaders and MLA Research Section members, who responded from an estimated potential population of 1,064 targeted colleagues. These questions were considered by the process participants to be the most important and answerable research questions facing the profession. Through 2 rounds of voting, these 62 questions were reduced to the final 12 highest priority questions. Conclusion: The modified Delphi method accomplished its desired survey and consensus goals. Future survey and consensus processes will be revised to generate more initial questions and to distill a larger number of ranked prioritized research questions. PMID:19626143

  19. Optimization of spray drying process for developing seabuckthorn fruit juice powder using response surface methodology.

    PubMed

    Selvamuthukumaran, Meenakshisundaram; Khanum, Farhath

    2014-12-01

    The response surface methodology was used to optimize the spray drying process for development of seabuckthorn fruit juice powder. The independent variables were different levels of inlet air temperature and maltodextrin concentration. The responses were moisture, solubility, dispersibility, vitamin C and overall color difference value. Statistical analysis revealed that independent variables significantly affected all the responses. The Inlet air temperature showed maximum influence on moisture and vitamin C content, while the maltodextrin concentration showed similar influence on solubility, dispersibility and overall color difference value. Contour plots for each response were used to generate an optimum area by superimposition. The seabuckthorn fruit juice powder was developed using the derived optimum processing conditions to check the validity of the second order polynomial model. The experimental values were found to be in close agreement to the predicted values and were within the acceptable limits indicating the suitability of the model in predicting quality attributes of seabuckthorn fruit juice powder. The recommended optimum spray drying conditions for drying 100 g fruit juice slurry were inlet air temperature and maltodextrin concentration of 162.5 °C and 25 g, respectively. The spray dried juice powder contains higher amounts of antioxidants viz., vitamin C, vitamin E, total carotenoids, total anthocyanins and total phenols when compared to commercial fruit juice powders and they are also found to be free flowing without any physical alterations such as caking, stickiness, collapse and crystallization by exhibiting greater glass transition temperature.

  20. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and decomposed into its essential components. First, three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating from cybernetic theory, is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
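
    The final SMART aggregation step can be pictured as a weighted additive score over the three dimensions. The process names, dimension scores and weights in the sketch below are placeholders, not values from the thesis.

```python
# Illustrative sketch (assumed weights and scores) of SMART-style aggregation
# of the three static-complexity dimensions into one score per process option.
import numpy as np

processes = ["hydrogen reduction", "molten electrolysis",
             "carbothermal reduction", "vapor pyrolysis"]   # notional options
# Normalised scores (0-1) per dimension: interconnective complexity,
# strength of connections, complexity in variety (placeholder values).
scores = np.array([[0.4, 0.5, 0.3],
                   [0.7, 0.6, 0.5],
                   [0.6, 0.4, 0.6],
                   [0.8, 0.7, 0.4]])
weights = np.array([0.5, 0.3, 0.2])          # elicited attribute weights (sum to 1)

static_complexity = scores @ weights          # weighted additive aggregation
for name, value in sorted(zip(processes, static_complexity), key=lambda p: p[1]):
    print(f"{name:25s} {value:.2f}")
```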

  1. Cointegration methodology for psychological researchers: An introduction to the analysis of dynamic process systems.

    PubMed

    Stroe-Kunold, Esther; Gruber, Antje; Stadnytska, Tetiana; Werner, Joachim; Brosig, Burkhard

    2012-11-01

    Longitudinal data analysis focused on internal characteristics of a single time series has attracted increasing interest among psychologists. The systemic psychological perspective suggests, however, that many long-term phenomena are mutually interconnected, forming a dynamic system. Hence, only multivariate methods can handle such human dynamics appropriately. Unlike the majority of time series methodologies, the cointegration approach allows interdependencies of integrated (i.e., extremely unstable) processes to be modelled. This advantage results from the fact that cointegrated series are connected by stationary long-run equilibrium relationships. Vector error-correction models are frequently used representations of cointegrated systems. They capture both this equilibrium and compensation mechanisms in the case of short-term deviations due to developmental changes. Thus, the past disequilibrium serves as explanatory variable in the dynamic behaviour of current variables. Employing empirical data from cognitive psychology, psychosomatics, and marital interaction research, this paper describes how to apply cointegration methods to dynamic process systems and how to interpret the parameters under investigation from a psychological perspective.
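
    For readers who want to try this kind of analysis on their own data, the sketch below shows a minimal cointegration workflow with the statsmodels library: a Johansen test for cointegration rank followed by a vector error-correction model fit. The simulated random-walk series merely stand in for empirical psychological time series, and the lag order is an assumption.

```python
# Minimal sketch of the analysis described above: Johansen cointegration test
# and a vector error-correction model (VECM) fit with statsmodels.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(4)
n = 300
common_trend = np.cumsum(rng.standard_normal(n))          # shared random walk
y1 = common_trend + rng.standard_normal(n)                # two integrated series
y2 = 0.8 * common_trend + rng.standard_normal(n)          # tied to the same trend
data = np.column_stack([y1, y2])

# Johansen test: the trace statistic indicates the cointegration rank.
joh = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", joh.lr1, "95% critical values:", joh.cvt[:, 1])

# Vector error-correction model with one cointegrating relationship.
res = VECM(data, k_ar_diff=1, coint_rank=1).fit()
print("loading (adjustment) coefficients alpha:\n", res.alpha)
print("cointegrating vector beta:\n", res.beta)
```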

  2. Methodology to assess the environmental impact of a product and its processes

    NASA Astrophysics Data System (ADS)

    Kumar, K. R.; Lee, Dongwon; Malhotra, Arvind

    2001-02-01

    This study presents a methodology for capturing the environmental impact of a product and its processes throughout the life cycle in discrete part manufacturing. The objectives of the study are to identify opportunities to enhance the environmental friendliness of a product in its design stage, and to assess whether the environmental impact has actually been reduced or has simply been shifted elsewhere in the life cycle of the product. Using the bill of materials and the process route sheet, we build the environmental status of its operations as a vector of measurable attributes, categorized under the taxonomy of social, ecological, and economic impact, which can be aggregated and evaluated at the business unit level. The vector of social impact deals with the effects of the materials used and wastes produced on people throughout the life cycle. The vector of ecological impact consists of the effects of recycling, reuse, and remanufacturing of a product based on the notion of materials balance. Finally, the vector of economic impact represents the conversion of the previous two vectors into managerially relevant costs to the firm, expressed in dollar amounts, so that managers at any level can readily appraise their operations and communicate with each other in the same language.

  3. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by vacuum drying process under different drying conditions. Box-Behnken experimental design with response surface methodology was used for optimization of drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of simultaneously optimized responses were a temperature of 70.2 °C, a pressure of 39 mbar and a drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample.

  4. Perceptual and category processing of the Uncanny Valley hypothesis' dimension of human likeness: some methodological issues.

    PubMed

    Cheetham, Marcus; Jancke, Lutz

    2013-06-03

    Mori's Uncanny Valley Hypothesis(1,2) proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings (3, 4, 5, 6). One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) (7). Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.

  5. Elongated styloid process evaluation on digital panoramic radiograph in a North Italian population

    PubMed Central

    Gracco, Antonio; De Stefani, Alberto; Balasso, Paolo; Alessandri-Bonetti, Giulio; Stellini, Edoardo

    2017-01-01

    Background The aim of this study is to evaluate the prevalence of elongated styloid process on digital panoramic radiographs in a North Italian population in relation to age, gender and side. Material and Methods This study was performed as a retrospective analysis of digital panoramic radiographs of 600 (271 males and 329 females) Italian patients between 6 and 87 years old. The styloid process length was measured using the measuring tool of the Sidexis software, from the point where it left the temporal bone plate to its tip. Styloid processes measuring more than 30 mm were considered elongated. Chi-squared and Fisher tests were used, and a test was considered significant if the p-value was less than or equal to 0.05. Results Thirty-three per cent of the patients showed an elongated styloid process. Seventeen per cent were elongated on both the right and left sides, and fifteen point nine per cent were elongated on one side only. Conclusions The prevalence of elongated styloid process was high and a progressive increase in elongation prevalence was found in older groups. Key words: Elongated styloid process, Eagle’s syndrome, panoramic radiograph. PMID:28298982

  6. Automatic rice crop height measurement using a field server and digital image processing.

    PubMed

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-07

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.
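
    The four steps can be sketched on a synthetic image as follows; the band choice, threshold, marker geometry and calibration constants are assumptions for illustration, not the authors' field-server settings.

```python
# Illustrative sketch of the four processing steps described above (band
# selection, filtering, thresholding, height measurement) on synthetic data.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(5)
h, w = 240, 320
img = rng.random((h, w, 3)) * 0.2                      # synthetic RGB paddy scene
img[60:, 150:156, 2] = 1.0                             # visible part of marker bar

# 1. Band selection: keep the band that best separates the marker bar.
band = img[:, :, 2]

# 2. Filtering: median filter to suppress speckle while keeping edges.
filtered = median_filter(band, size=3)

# 3. Thresholding: separate the marker bar from the background.
mask = filtered > 0.5

# 4. Height measurement: the visible marker length shrinks as the crop grows.
cols = np.where(mask.any(axis=0))[0]
visible_px = mask[:, cols].any(axis=1).sum()           # visible marker height (px)
full_marker_px, cm_per_px = 220, 0.5                   # initial calibration (assumed)
crop_height_cm = (full_marker_px - visible_px) * cm_per_px
print(f"visible marker: {visible_px} px -> crop height approx. {crop_height_cm:.1f} cm")
```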

  7. Using digital flow cytometry to assess the degradation of three cyanobacteria species after oxidation processes.

    PubMed

    Wert, Eric C; Dong, Mei Mei; Rosario-Ortiz, Fernando L

    2013-07-01

    Depending on drinking water treatment conditions, oxidation processes may result in the degradation of cyanobacteria cells causing the release of toxic metabolites (microcystin), odorous metabolites (MIB, geosmin), or disinfection byproduct precursors. In this study, a digital flow cytometer (FlowCAM®) in combination with chlorophyll-a analysis was used to evaluate the ability of ozone, chlorine, chlorine dioxide, and chloramine to damage or lyse cyanobacteria cells added to Colorado River water. Microcystis aeruginosa (MA), Oscillatoria sp. (OSC) and Lyngbya sp. (LYN) were selected for the study due to their occurrence in surface water supplies, metabolite production, and morphology. Results showed that cell damage was observed without complete lysis or fragmentation of the cell membrane under many of the conditions tested. During ozone and chlorine experiments, the unicellular MA was more susceptible to oxidation than the filamentous OSC and LYN. Rate constants were developed based on the loss of chlorophyll-a and oxidant exposure, which showed the oxidants degraded MA, OSC, and LYN according to the order of ozone > chlorine ~ chlorine dioxide > chloramine. Digital and binary images taken by the digital flow cytometer provided qualitative insight regarding cell damage. When applying this information, drinking water utilities can better understand the risk of cell damage or lysis during oxidation processes.

  8. Digital Audio Signal Processing and NDE: An Unlikely but Valuable Partnership

    NASA Astrophysics Data System (ADS)

    Gaydecki, Patrick

    2008-02-01

    In the Digital Signal Processing (DSP) group, within the School of Electrical and Electronic Engineering at The University of Manchester, research is conducted into two seemingly distinct and disparate subjects: instrumentation for nondestructive evaluation, and DSP systems & algorithms for digital audio. We have often found that many of the hardware systems and algorithms employed to recover, extract or enhance audio signals may also be applied to signals provided by ultrasonic or magnetic NDE instruments. Furthermore, modern DSP hardware is so fast (typically performing hundreds of millions of operations per second), that much of the processing and signal reconstruction may be performed in real time. Here, we describe some of the hardware systems we have developed, together with algorithms that can be implemented both in real time and offline. A next generation system has now been designed, which incorporates a processor operating at 0.55 Giga MMACS, six input and eight output analogue channels, digital input/output in the form of S/PDIF, a JTAG and a USB interface. The software allows the user, with no knowledge of filter theory or programming, to design and run standard or arbitrary FIR, IIR and adaptive filters. Using audio as a vehicle, we can demonstrate the remarkable properties of modern reconstruction algorithms when used in conjunction with such hardware; applications in NDE include signal enhancement and recovery in acoustic, ultrasonic, magnetic and eddy current modalities.
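
    As a small example of the kind of filtering such hardware runs, the sketch below designs a linear-phase FIR low-pass with scipy and applies it to a noisy synthetic ultrasonic-style echo; the sampling rate, cutoff and tap count are arbitrary illustration values, not the group's hardware parameters.

```python
# Small sketch: FIR low-pass design and application to a noisy synthetic echo.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 10e6                                   # 10 MHz sampling rate (assumed)
t = np.arange(0, 200e-6, 1 / fs)
echo = np.exp(-((t - 100e-6) ** 2) / (2 * (5e-6) ** 2)) * np.sin(2 * np.pi * 1e6 * t)
noisy = echo + 0.3 * np.random.default_rng(6).standard_normal(t.size)

taps = firwin(numtaps=101, cutoff=1.5e6, fs=fs)   # linear-phase FIR low-pass
clean = lfilter(taps, [1.0], noisy)               # the kind of job done in real time

delay = (taps.size - 1) // 2                      # group delay of a linear-phase FIR
aligned = clean[delay:]
print("residual RMS after filtering:", np.std(aligned - echo[:aligned.size]).round(4))
```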

  9. Spatial analysis of ambient gamma dose equivalent rate data by means of digital image processing techniques.

    PubMed

    Szabó, Katalin Zsuzsanna; Jordan, Gyozo; Petrik, Attila; Horváth, Ákos; Szabó, Csaba

    2017-01-01

    A detailed ambient gamma dose equivalent rate mapping based on field measurements at ground level and at 1 m height was carried out at 142 sites in an 80 × 90 km area in Pest County, Hungary. Detailed digital image processing analysis was carried out to identify and characterise spatial features such as outlying points, anomalous zones and linear edges in a smoothed TIN-interpolated surface. The applied method proceeds from the simple shaded relief model and digital cross-sections to the more complex gradient magnitude and gradient direction maps, the 2nd-derivative profile curvature map, the relief map and the lineament density map. Each map is analysed for statistical characteristics, and histogram-based image segmentation is used to delineate areas homogeneous with respect to the parameter values in these maps. Assessment of spatial anisotropy is implemented by 2D autocorrelogram and directional variogram analyses. The identified spatial features are related to underlying geological and tectonic conditions using GIS technology. Results show that detailed digital image processing is efficient in revealing the pattern present in field-measured ambient gamma dose equivalent rates, and that the identified features are related to regional-scale tectonic zones and surface sedimentary lithological conditions in the study area.
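
    The gradient-based maps and histogram segmentation mentioned above can be illustrated on a synthetic surface; the synthetic edge, noise level and percentile threshold below are assumptions, not the survey data or the authors' parameters.

```python
# Minimal sketch of gradient-magnitude / gradient-direction mapping and a
# simple histogram-based edge segmentation on a synthetic dose-rate surface.
import numpy as np

rng = np.random.default_rng(7)
y, x = np.mgrid[0:100, 0:100]
# Synthetic smoothed dose-rate surface with one linear ("tectonic") edge.
surface = 80 + 20 * np.tanh((x - 0.7 * y - 30) / 5.0) + rng.standard_normal((100, 100))

gy, gx = np.gradient(surface)                     # numerical derivatives
grad_mag = np.hypot(gx, gy)                       # gradient magnitude map
grad_dir = np.degrees(np.arctan2(gy, gx))         # gradient direction map

# Histogram-based segmentation: flag pixels whose gradient magnitude falls in
# the upper tail as candidate linear edges (lineaments).
threshold = np.percentile(grad_mag, 95)
edges = grad_mag > threshold
print("edge pixels:", int(edges.sum()), "dominant direction (deg):",
      round(float(np.median(grad_dir[edges])), 1))
```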

  10. Evaluation of superconducting quantum interference devices interfaced with digital signal processing electronics for biomagnetic applications

    SciTech Connect

    Kung, Pang-Jen; Flynn, E. R.; Bracht, R. R.; Lewis, P. S.

    1994-08-01

    The performance of a dc-SQUID magnetometer driven by both analog electronics and digital signal processors is investigated and compared for biomagnetic applications. Low-noise (< 5 μΦ₀/√Hz at 1 Hz) dc-SQUIDs were fabricated by Conductus, Inc. using the all-refractory Nb/Al/Al₂O₃/Nb process on silicon substrates with on-chip modulation coils and integral washer damping resistors. A second-order gradiometer was magnetically coupled to the input coil of the SQUID to maximize the detected signal strength. The readout of this SQUID gradiometer was achieved using a conventional flux-locked loop (FLL) circuit to provide a linearized voltage output proportional to the flux applied to the SQUID. A shielded cylinder was constructed to house the magnetometer to reduce ambient field noise. To realize the digital feedback loop, the analog FLL, except for the preamplifier, is replaced by a digital signal processing board with dual 16-bit A/D and D/A converters. This approach shows several advantages over the analog scheme, including operational flexibility, cost reduction and, possibly, the enhancement of dynamic range and slew rate.

  11. Fully automated digital holographic processing for monitoring the dynamics of a vesicle suspension under shear flow

    PubMed Central

    Minetti, Christophe; Podgorski, Thomas; Coupier, Gwennou; Dubois, Frank

    2014-01-01

    We investigate the dynamics of a vesicle suspension under shear flow between plates using DHM with a spatially reduced coherent source. Holograms are grabbed at a frequency of 24 frames/sec. The distribution of the vesicle suspension is obtained after numerical processing of the digital holograms sequence resulting in a 4D distribution. Obtaining this distribution is not straightforward and requires special processing to automate the analysis. We present an original method that fully automates the analysis and provides distributions that are further analyzed to extract physical properties of the fluid. Details of the numerical implementation, as well as sample experimental results are presented. PMID:24877015

  12. Software for Processing of Digitized Astronegatives from Archives and Databases of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.

    The paper discusses and illustrates the steps of the basic processing of digitized images of astronegatives. Software for obtaining the rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX / MIDAS / ROMAFOT environment. The program can automatically process a specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL with the ability to work in a LINUX or WINDOWS environment. They were used for: identification of stars; separation and exclusion of diffraction satellites and double and triple exposures; elimination of image defects; and reduction to the equatorial coordinates and magnitudes of reference catalogs.

  13. Application of digital image processing techniques to faint solar flare phenomena

    NASA Technical Reports Server (NTRS)

    Glackin, D. L.; Martin, S. F.

    1980-01-01

    Digital image processing of eight solar flare events was performed using the Video Information Communication and Retrieval language in order to study moving emission fronts, flare halos, and Moreton waves. The techniques used include contrast enhancement, isointensity contouring, the differencing of images, spatial filtering, and geometrical registration. The spatial extent and temporal behavior of the faint phenomena is examined along with the relation of the three types of phenomena to one another. The image processing techniques make possible the detailed study of the history of the phenomena and provide clues to their physical nature.

  14. Digital image processing applied to analysis of geophysical and geochemical data for southern Missouri

    NASA Technical Reports Server (NTRS)

    Guinness, E. A.; Arvidson, R. E.; Leff, C. E.; Edwards, M. H.; Bindschadler, D. L.

    1983-01-01

    Digital image-processing techniques have been used to analyze a variety of geophysical and geochemical map data covering southern Missouri, a region with important basement and strata-bound mineral deposits. Gravity and magnetic anomaly patterns, which have been reformatted to image displays, indicate a deep crustal structure cutting northwest-southeast through southern Missouri. In addition, geologic map data, topography, and Landsat multispectral scanner images have been used as base maps for the digital overlay of aerial gamma-ray and stream sediment chemical data for the 1 x 2-deg Rolla quadrangle. Results indicate enrichment of a variety of elements within the clay-rich alluvium covering many of the interfluvial plains, as well as a complicated pattern of enrichment for the sedimentary units close to the Precambrian rhyolites and granites of the St. Francois Mountains.

  15. Investigation of super-resolution processing algorithm by target light-intensity search in digital holography

    NASA Astrophysics Data System (ADS)

    Neo, Atsushi; Kakue, Takashi; Shimobaba, Tomoyoshi; Masuda, Nobuyuki; Ito, Tomoyoshi

    2017-04-01

    Digital holography is expected to be useful in the analysis of moving three-dimensional (3D) image measurement. In this technique, a two-dimensional interference fringe pattern produced by a 3D image is captured with an image sensor, and the 3D image is reproduced on a computer. To obtain reproduced 3D images with high spatial resolution, a high-performance image sensor is required, which increases the system cost. We propose an algorithm for super-resolution processing in digital holography that does not require a high-performance image sensor. The proposed algorithm, in which 3D images are treated as aggregations of object points, improves spatial resolution by performing a light-intensity search over the reproduced image and the object points.

  16. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely.
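
    As a back-of-the-envelope check, a performance index near 1.10 is consistent with a defect rate of the reported order if the process output is roughly normal and the specification is effectively one-sided; the short calculation below makes those assumptions explicit.

```python
# Rough consistency check (assumes normality and a one-sided specification
# limit) of how a performance index around 1.10 relates to a defect fraction.
from scipy.stats import norm

ppk = 1.10
tail_fraction = norm.cdf(-3 * ppk)          # area beyond the nearer spec limit
print(f"Ppk = {ppk} -> defect fraction approx. {tail_fraction:.2%}")   # about 0.05%
```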

  17. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    NASA Astrophysics Data System (ADS)

    Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.

    2012-05-01

    There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example, those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific, and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high-level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from the CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Secondly, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
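
    The core of a tunable digital downconverter (mix with a numerically controlled oscillator, low-pass filter, decimate) can be sketched in software as below; the sample rate, centre frequency, decimation factor and filter length are illustrative assumptions, and this is not the CASPER/DIF or IBOB implementation itself.

```python
# Conceptual sketch of a tunable digital downconverter: NCO mix, anti-alias
# low-pass filter, then decimation; all parameters are illustrative.
import numpy as np
from scipy.signal import firwin, lfilter

fs = 200e6                                   # ADC sample rate (assumed)
f_center = 37.5e6                            # runtime-tunable centre frequency
decimation = 16                              # runtime-tunable decimation factor

t = np.arange(65536) / fs
x = np.cos(2 * np.pi * (f_center + 0.3e6) * t)           # narrowband test tone

nco = np.exp(-2j * np.pi * f_center * t)                  # numerically controlled oscillator
baseband = x * nco                                        # mix band of interest to 0 Hz
taps = firwin(129, cutoff=fs / (2 * decimation), fs=fs)   # anti-alias low-pass
narrow = lfilter(taps, [1.0], baseband)[::decimation]     # filter, then decimate

print("output rate:", fs / decimation / 1e6, "MHz, samples out:", narrow.size)
```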

  18. Choosing between Methodologies: An Inquiry into English Learning Processes in a Taiwanese Indigenous School

    ERIC Educational Resources Information Center

    Lin, Wen-Chuan

    2012-01-01

    Traditional, cognitive-oriented theories of English language acquisition tend to employ experimental modes of inquiry and neglect social, cultural and historical contexts. In this paper, I review the theoretical debate over methodology by examining ontological, epistemological and methodological controversies around cognitive-oriented theories. I…

  19. Methodology used to produce an encoded 1:100,000-scale digital hydrographic data layer for the Pacific Northwest

    USGS Publications Warehouse

    Fisher, B.J.

    1996-01-01

    The U.S. Geological Survey (USGS) has produced a River Reach File data layer for the Pacific Northwest for use in water-resource management applications. The Pacific Northwest (PNW) River Reach Files, a geo-referenced river reach data layer at 1:100,000-scale, are encoded with the U.S. Environmental Protection Agency's (EPA) reach numbers. The encoding was a primary task of the River Reach project, because EPA's reach identifiers are also an integral hydrologic component in a regional Northwest Environmental Data Base, an ongoing effort by Federal and State agencies to compile information on reach-specific resources on rivers in Oregon, Idaho, Washington, and western Montana. A unique conflation algorithm was developed by the USGS to transfer the EPA reach codes and other meaningful attributes from the 1:250,000-scale EPA TRACE graphic files to the PNW Reach Files. The PNW Reach Files also were designed so that reach-specific information upstream or downstream from a point in the stream network could be extracted from feature attribute tables or from a Geographic Information System. This report documents the methodology used to create this 1:100,000-scale hydrologic data layer.

  20. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques have been applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, previously published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters that can be used towards optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are

  1. Photothermal heating as a methodology for post processing of polymeric nanofibers

    NASA Astrophysics Data System (ADS)

    Gorga, Russell; Clarke, Laura; Bochinski, Jason; Viswanath, Vidya; Maity, Somsubhra; Dong, Ju; Firestone, Gabriel

    2015-03-01

    Metal nanoparticles embedded within polymeric systems can be made to act as localized heat sources thereby aiding in-situ polymer processing. This is made possible by the surface plasmon resonance (SPR) mediated photothermal effect of metal (in this case gold) nanoparticles, wherein incident light absorbed by the nanoparticle generates a non-equilibrium electron distribution which subsequently transfers this energy into the surrounding medium, resulting in a temperature increase in the immediate region around the particle. Here we demonstrate this effect in polymer nanocomposite systems, specifically electrospun polyethylene oxide nanofibrous mats, which have been annealed at temperatures above the glass transition. A non-contact temperature measurement technique utilizing embedded fluorophores (perylene) has been used to monitor the average temperature within samples. The effect of annealing methods (conventional and photothermal) and annealing conditions (temperature and time) on the fiber morphology, overall crystallinity, and mechanical properties is discussed. This methodology is further utilized in core-sheath nanofibers to crosslink the core material, which is a pre-cured epoxy thermoset. NSF Grant CMMI-1069108.

  2. Optimization of Extraction Process for Polysaccharide in Salvia Miltiorrhiza Bunge Using Response Surface Methodology

    PubMed Central

    Yanhua, Wang; Fuhua, Wu; Zhaohan, Guo; Mingxing, Peng; Yanan, Zhang; Ling, Pang Zhen; Minhua, Du; Caiying, Zhang; Zian, Liang

    2014-01-01

    This study aimed to optimize the extraction process for Salvia miltiorrhiza Bunge polysaccharide using response surface methodology. The results showed that three operating parameters, namely microwave power, microwave time and particle size (comminution degree), had notable effects on the polysaccharide extraction of Salvia miltiorrhiza Bunge. The effects could be ranked in decreasing order of importance as follows: microwave power > microwave time > comminution degree. The optimal extraction parameters were determined as a microwave power of 573.83 W, a microwave time of 8.4 min and a comminution degree of 67.51 mesh, resulting in a Salvia miltiorrhiza Bunge polysaccharide yield of 101.161 mg/g. The established regression model describing polysaccharide extraction as a function of the three extraction parameters was highly significant (R² = 0.9953). The predicted and experimental results were found to be in good agreement. Thus, the model can be applied for the prediction of polysaccharide extraction from Salvia miltiorrhiza Bunge. PMID:26312073
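
    Response surface methodology of this kind typically fits a second-order polynomial to designed-experiment data and locates the stationary point of the fitted surface. The sketch below illustrates that generic fitting step on synthetic data; the factor ranges and response are hypothetical and this is not the authors' model.

```python
# Sketch: fitting a second-order response surface for two factors (e.g. microwave
# power and time) and locating the predicted optimum. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
power = rng.uniform(400, 700, 30)     # W   (hypothetical design points)
time_ = rng.uniform(4, 12, 30)        # min
yield_ = 100 - 0.001*(power - 570)**2 - 0.5*(time_ - 8.4)**2 + rng.normal(0, 0.5, 30)

# Design matrix for a full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(power), power, time_,
                     power**2, time_**2, power*time_])
coef, *_ = np.linalg.lstsq(X, yield_, rcond=None)

# Stationary point of the fitted quadratic surface (solve grad = 0).
A = np.array([[2*coef[3], coef[5]], [coef[5], 2*coef[4]]])
b = -np.array([coef[1], coef[2]])
opt_power, opt_time = np.linalg.solve(A, b)
print(f"Predicted optimum: {opt_power:.1f} W, {opt_time:.1f} min")
```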

  3. Optimization of Extraction Process for Polysaccharide in Salvia Miltiorrhiza Bunge Using Response Surface Methodology

    PubMed Central

    Yanhua, Wang; Fuhua, Wu; Zhaohan, Guo; Mingxing, Peng; Yanan, Zhang; Ling, Pang Zhen; Minhua, Du; Caiying, Zhang; Zian, Liang

    2015-01-01

    This study aimed to optimize the extraction process for Salvia miltiorrhiza Bunge polysaccharide using response surface methodology. The results showed that three operating parameters, namely microwave power, microwave time and particle size (comminution degree), had notable effects on the polysaccharide extraction of Salvia miltiorrhiza Bunge. The effects could be ranked in decreasing order of importance as follows: microwave power > microwave time > comminution degree. The optimal extraction parameters were determined as a microwave power of 573.83 W, a microwave time of 8.4 min and a comminution degree of 67.51 mesh, resulting in a Salvia miltiorrhiza Bunge polysaccharide yield of 101.161 mg/g. The established regression model describing polysaccharide extraction as a function of the three extraction parameters was highly significant (R² = 0.9953). The predicted and experimental results were found to be in good agreement. Thus, the model can be applied for the prediction of polysaccharide extraction from Salvia miltiorrhiza Bunge. PMID:26998185

  4. Microprocessor instruments for measuring nonlinear distortions; algorithms for digital processing of the measurement signal and an estimate of the errors

    SciTech Connect

    Mints, M.Ya.; Chinkov, V.N.

    1995-09-01

    Rational algorithms are described for measuring the harmonic coefficient in microprocessor instruments for measuring nonlinear distortions, based on digital processing of the codes of the instantaneous values of the signal under investigation, and the errors of such instruments are obtained.
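
    The harmonic coefficient measured by such instruments is essentially a total-harmonic-distortion figure computed from the digitized samples. The following sketch shows one generic DFT-based way to compute it; it is not the algorithm of the cited paper, and the sampling parameters are hypothetical.

```python
# Sketch: computing a harmonic (nonlinear distortion) coefficient from the codes of
# the instantaneous signal values, via the DFT. Illustrative, coherent-sampling case.
import numpy as np

def harmonic_coefficient(samples, fs, f0, n_harmonics=5):
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1/fs)
    def amp(f):  # amplitude of the bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = amp(f0)
    harmonics = [amp(k * f0) for k in range(2, n_harmonics + 1)]
    return np.sqrt(sum(h**2 for h in harmonics)) / fundamental

fs, f0 = 48000, 1000
t = np.arange(4800) / fs
signal = np.sin(2*np.pi*f0*t) + 0.02*np.sin(2*np.pi*2*f0*t)  # 2% second harmonic
print(f"THD ~ {harmonic_coefficient(signal, fs, f0):.3%}")
```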

  5. Merged GLORIA sidescan and hydrosweep pseudo-sidescan: Processing and creation of digital mosaics

    USGS Publications Warehouse

    Bird, R.T.; Searle, R.C.; Paskevich, V.; Twichell, D.C.

    1996-01-01

    We have replaced the usual band of poor-quality data in the near-nadir region of our GLORIA long-range sidescan-sonar imagery with a shaded-relief image constructed from swath bathymetry data (collected simultaneously with GLORIA) which completely cover the nadir area. We have developed a technique to enhance these "pseudo-sidescan" images in order to mimic the neighbouring GLORIA backscatter intensities. As a result, the enhanced images greatly facilitate the geologic interpretation of the adjacent GLORIA data, and geologic features evident in the GLORIA data may be correlated with greater confidence across track. Features interpreted from the pseudo-sidescan may be extrapolated from the near-nadir region out into the GLORIA range where they may not have been recognized otherwise, and therefore the pseudo-sidescan can be used to ground-truth GLORIA interpretations. Creation of digital sidescan mosaics utilized an approach not previously used for GLORIA data. Pixels were correctly placed in cartographic space and the time required to complete a final mosaic was significantly reduced. Computer software for digital mapping and mosaic creation is incorporated into the newly-developed Woods Hole Image Processing System (WHIPS) which can process both low- and high-frequency sidescan, and can interchange data with the Mini Image Processing System (MIPS) most commonly used for GLORIA processing. These techniques are tested by creating digital mosaics of merged GLORIA sidescan and Hydrosweep pseudo-sidescan data from the vicinity of the Juan Fernandez microplate along the East Pacific Rise (EPR). © 1996 Kluwer Academic Publishers.

  6. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    PubMed

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that would automatically generate a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) the isolated point in time, "Inference Engine," a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, "MAGIC," the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results.

  7. Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues

    PubMed Central

    Cheetham, Marcus; Jancke, Lutz

    2013-01-01

    Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated. PMID:23770728

  8. Methodology for processing backscattered electron images. Application to Aguada archaeological paints.

    PubMed

    Galván Josa, V; Bertolino, S R; Riveros, J A; Castellano, G

    2009-12-01

    Scanning electron microscopy is a powerful technique in several fields of science and technology. In particular, it is an important complement in the characterization of materials for which X-ray analysis is not possible. Such is the case of thin paint layers on ceramic pots, in which, even for low incident energies, the electron interaction volume can be greater than the paint thickness, in addition to the problem arising from similar compositions. With the aim of complementing other common techniques used in compositional materials characterization, in this work, image-processing software has been developed, which implements a new methodology for the treatment of backscattered electron (BSE) images in order to bring to evidence small mean atomic number contrasts, usually imperceptible to the human eye. The program was used to study black and white pigments of ceramic pieces belonging to the Ambato style of "Aguada" culture (Catamarca province, Argentina, IV-XII centuries AD). Although the BSE images acquired for these samples showed no apparent contrast between sherd and black and white pigments, through image-processing algorithms using different space filters, chemical contrast between regions has been brought to evidence with minor detail loss. This has been accomplished by applying a smoothing filter, after which the main routine for contrast enhancement reveals details in the grey-level region of interest; finally, a filter for edge enhancement permits the recovery of some details lost in the previous steps, achieving satisfactory results for the painted sherd samples analyzed. In order to validate the mean atomic number differences found between each pigment and the ceramic body, X-ray diffraction diagrams have been refined with the Rietveld method using the software DIFFRACplus Topas, arriving at mineralogical differences that agree with the results obtained. As a consequence of this study, the program developed has proven to be a suitable tool for routine
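
    The processing chain described (smoothing, contrast enhancement within a grey-level region of interest, edge enhancement) can be sketched with standard array operations. The code below is a generic illustration assuming NumPy/SciPy, not the program developed by the authors; the grey-level window and filter settings are hypothetical.

```python
# Sketch of the described BSE image pipeline: smooth, stretch contrast within a
# grey-level window of interest, then sharpen edges. Generic illustration only.
import numpy as np
from scipy import ndimage

def enhance_bse(image, grey_lo, grey_hi, smooth_sigma=1.0, sharpen_amount=0.5):
    img = image.astype(float)
    # 1) Smoothing filter to suppress noise before contrast stretching.
    smoothed = ndimage.gaussian_filter(img, sigma=smooth_sigma)
    # 2) Contrast enhancement restricted to the grey-level region of interest.
    stretched = np.clip((smoothed - grey_lo) / (grey_hi - grey_lo), 0, 1) * 255
    # 3) Edge enhancement (unsharp masking) to recover detail lost by smoothing.
    blurred = ndimage.gaussian_filter(stretched, sigma=smooth_sigma)
    return np.clip(stretched + sharpen_amount * (stretched - blurred), 0, 255)

# Usage with a synthetic low-contrast image:
demo = np.full((64, 64), 120.0)
demo[16:48, 16:48] = 124.0                     # faint mean-atomic-number contrast
enhanced = enhance_bse(demo, grey_lo=118, grey_hi=126)
print(enhanced.min(), enhanced.max())
```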

  9. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  10. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.

  11. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.
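
    The DPCM principle behind such codecs is to predict each sample from previously reconstructed samples and transmit only a coarsely quantized prediction error. The sketch below is a one-dimensional toy encoder/decoder illustrating that principle; it is not the NASA codec, and the quantization step is hypothetical.

```python
# Sketch of the DPCM idea behind such codecs: predict each pixel from its neighbor,
# quantize only the prediction error. Generic 1-D illustration, not the NASA codec.
import numpy as np

def dpcm_encode(line, step=8):
    codes, prediction = [], 0
    for pixel in line:
        error = int(pixel) - prediction
        code = int(np.round(error / step))        # coarse quantization of the error
        codes.append(code)
        prediction += code * step                 # decoder-tracked reconstruction
    return codes

def dpcm_decode(codes, step=8):
    out, prediction = [], 0
    for code in codes:
        prediction += code * step
        out.append(prediction)
    return out

line = [100, 102, 105, 110, 130, 131, 129, 128]
codes = dpcm_encode(line)
print(codes)                 # small integers -> few bits per pixel on average
print(dpcm_decode(codes))    # close to the original line
```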

  12. On-line digital holographic measurement of size and shape of microparticles for crystallization processes

    NASA Astrophysics Data System (ADS)

    Khanam, Taslima; Darakis, Emmanouil; Rajendran, Arvind; Kariwala, Vinay; Asundi, Anand K.; Naughton, Thomas J.

    2008-09-01

    Crystallization is a widely used chemical process that finds applications in pharmaceutical industries. In an industrial crystallization process, it is not only important to produce pure crystals but also to control the shape and size of the crystals, as they affect the efficiency of downstream processes and the dissolution property of the drug. The effectiveness of control algorithms depends on the availability of on-line, real-time information about these critical properties. In this paper, we investigate the use of lens-less in-line digital holographic microscopy for size and shape measurements for crystallization processes. For this purpose, we use non-crystalline spherical microparticles and carbon fibers with known sizes present in a liquid suspension as test systems. We propose an algorithm to extract size and shape information for a population of microparticles from the experimentally recorded digital holograms. The measurements obtained from the proposed method show good agreement with the corresponding known size and shape of the particles.

  13. Digital signal processing of cylinder pressure data for combustion diagnostics of HCCI engine

    NASA Astrophysics Data System (ADS)

    Kumar Maurya, Rakesh; Pal, Dev Datt; Kumar Agarwal, Avinash

    2013-03-01

    Diagnosis of combustion is necessary for the estimation of combustion quality and control of combustion timing in advanced combustion concepts like HCCI. Combustion diagnostics is often performed using digital processing of pressure signals measured using a piezoelectric sensor installed in the combustion chamber of the engine. Four-step pressure signal processing consisting of (i) absolute pressure correction, (ii) phasing w.r.t. crank angle, (iii) cycle averaging and (iv) smoothing is used to get cylinder pressure data from the engine experiments, which is further analyzed to get information about combustion characteristics. This study focuses on various aspects of signal processing (cycle averaging and smoothing) of the in-cylinder pressure signal from an HCCI engine acquired using a piezoelectric pressure sensor. Experimental investigations are conducted on an HCCI combustion engine operating at different engine speed/load/air-fuel ratio conditions. The cylinder pressure history of 3000 consecutive engine cycles is acquired for analysis using a piezoelectric pressure sensor. This study determines the optimum number of engine cycles to be acquired for reasonably good pressure signals based on the standard deviation of in-cylinder pressure, rate of pressure rise and rate of heat release signals. Different signal smoothing methods (using various digital filters) are also analyzed and their results are compared. This study also presents the effect of signal processing methods on pressure, pressure rise rate and rate of heat release curves at different engine operating conditions.
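
    Two of the four steps named above, cycle averaging and smoothing, can be sketched directly on crank-angle-resolved pressure arrays. The code below uses synthetic data and a Savitzky-Golay filter purely as an illustration; it is not the authors' processing chain, and the cycle count, crank-angle resolution and filter settings are hypothetical.

```python
# Sketch of two of the described steps - cycle averaging and smoothing - applied to
# crank-angle-resolved cylinder pressure traces. Synthetic data; not the authors' code.
import numpy as np
from scipy.signal import savgol_filter

n_cycles, n_samples = 300, 720          # 300 cycles, one sample per 1 deg crank angle
rng = np.random.default_rng(1)
theta = np.linspace(-360, 360, n_samples, endpoint=False)
clean = 20 * np.exp(-((theta - 5) / 40) ** 2) + 1.0              # bar, toy pressure pulse
cycles = clean + rng.normal(0, 0.3, size=(n_cycles, n_samples))  # cycle-to-cycle noise

ensemble_mean = cycles.mean(axis=0)                              # cycle averaging
smoothed = savgol_filter(ensemble_mean, window_length=21, polyorder=3)  # smoothing

# The spread across cycles indicates how many cycles are "enough" for a stable average.
print(cycles.std(axis=0).mean(), smoothed.max())
```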

  14. Obtaining Positions of Asteroids from Digitized Processing of Photographic Observations in Baldone Observatory (Code 069)

    NASA Astrophysics Data System (ADS)

    Eglitis, I.; Eglite, M.; Shatokhina, S. V.; Andruk, V. M.

    Digital processing of photographic plates of star fields allows the coordinates and stellar magnitudes of all registered objects on these plates to be determined with high accuracy. The processing results can be used for a broad search for images of small bodies of the Solar system and determination of their coordinates. From observations of earlier epochs, we can extract information about the locations of these bodies well before their discovery. A modern approach to processing early photographic observations with new technologies can be an effective instrument for the rediscovery of asteroids and the correction of their orbits. We analyzed the results of digital processing of observations of clusters in UBVR bands made with the 1.2-m Schmidt telescope of the Observatory of the Institute of Astronomy of the University of Latvia in Baldone (code 069). As a result, 87 images of minor planets from 9.8 to 17.1 stellar magnitude and 2 images of comets were identified on 152 plates from 1967-1996. A catalogue of positions and stellar magnitudes of the identified asteroids was compiled. Among them, 12 observations are the earliest known observations of these asteroids worldwide. All asteroid positions were compared with the JPL DE431 ephemeris and analyzed.

  15. Computer-aided feature extraction, classification, and acceptance processing of digital NDE data

    NASA Astrophysics Data System (ADS)

    Hildreth, Joseph H.

    1996-11-01

    As part of the Advanced Launch System technology development effort begun in 1989, the Air Force initiated a program to automate, to the extent possible, the processing of NDE data from the inspection of solid rocket motors during fabrication. The computerized system, called the Automated NDE Data Evaluation System or ANDES, was developed under contract to Martin Marietta, now Lockheed Martin. The ANDES system is generic in structure and is highly tailorable. The system can be configured to process digital or digitized data from any source, to process data from a single or from multiple acquisition systems, and to function as a single stand-alone system or in a multiple workstation distributed network. The system can maintain multiple configurations from which the user can select. In large measure, a configuration is defined through the system's user interface and is stored in the system's data base to be recalled by the user at any time. Three operational systems are currently in use. These systems are located at Hill AFB in Ogden, Utah; Kelly AFB in San Antonio, TX; and the Phillips Laboratory at Edwards AFB in California. Each of these systems is configured to process x-ray computed tomography (CT) images. The Hill AFB installation supports the aging surveillance effort on Minuteman third stage rocket motors. The Kelly AFB system supports the acceptance inspection of airframe and engine components and torpedo housing components. The installation at Edwards AFB provides technical support to the other two locations.

  16. Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.

    PubMed

    Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth

    2016-05-15

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion.

  17. Morphometrics of aeolian blowouts from high-resolution digital elevation data: methodological considerations, shape metrics, and scaling

    NASA Astrophysics Data System (ADS)

    Hamilton, T. K.; Duke, G.; Brown, O.; Koenig, D.; Barchyn, T. E.; Hugenholtz, C.

    2011-12-01

    Aeolian blowouts are wind erosion hollows that form in vegetated aeolian landscapes. They are especially pervasive in dunefields of the northern Great Plains, yielding highly pitted or hummocky terrain, and adding to the spatial variability of microenvironments. Their development is thought to be linked to feedbacks between morphology and airflow; however, few measurements are available to test this hypothesis. Currently, a dearth of morphology data is limiting modeling progress. From a systematic program of blowout mapping with high-resolution airborne LiDAR data, we used a GIS to calculate morphometrics for 1373 blowouts in Great Sand Hills, Saskatchewan, Canada. All of the blowouts selected for this investigation were covered by grassland vegetation and inactive; their morphology represents the final stage of evolution. We first outline methodological considerations for delineating blowouts and measuring their volume. In particular, we present an objective method to enhance edge delineation and reduce operator error and bias. We show that blowouts are slightly elongate and 49% of the sample blowouts are oriented parallel to the prevailing westerly winds. We also show that their size distribution is heavy-tailed, meaning that most blowouts are relatively small and rarely increase in size beyond 400 m³. Given that blowout growth is dominated by a positive feedback between sediment transport and vegetation erosion, these results suggest several possible mechanisms: i) blowouts simultaneously evolved and stabilized as a result of external climate forcing, ii) blowouts are slaved to exogenous biogenic disturbance patterns (e.g., bison wallows), or iii) a morphodynamic limiting mechanism restricts blowout size. Overall, these data will serve as a foundation for future study, providing insight into an understudied landform that is common in many dunefields.

  18. Advanced digital signal processing for short-haul and access network

    NASA Astrophysics Data System (ADS)

    Zhang, Junwen; Yu, Jianjun; Chi, Nan

    2016-02-01

    Digital signal processing (DSP) has recently proved to be a successful technology in high-speed, high-spectral-efficiency optical short-haul and access networks, enabling high performance based on digital equalization and compensation. In this paper, we investigate advanced DSP at the transmitter and receiver side for signal pre-equalization and post-equalization in an optical access network. A novel DSP-based digital and optical pre-equalization scheme has been proposed for bandwidth-limited high-speed short-distance communication systems, based on feedback from receiver-side adaptive equalizers such as the least-mean-squares (LMS) algorithm and the constant- or multi-modulus algorithms (CMA, MMA). Based on this scheme, we experimentally demonstrate 400GE on a single optical carrier based on the highest ETDM 120-GBaud PDM-PAM-4 signal, using one external modulator and coherent detection. A line rate of 480-Gb/s is achieved, which enables 20% forward-error correction (FEC) overhead to keep the 400-Gb/s net information rate. The performance after fiber transmission shows a large margin for both short range and metro/regional networks. We also extend the advanced DSP to short-haul optical access networks by using high-order QAMs. We propose and demonstrate a high-speed multi-band CAP-WDM-PON system based on intensity modulation, direct detection and digital equalization. A hybrid modified cascaded MMA post-equalization scheme is used to equalize the multi-band CAP-mQAM signals. Using this scheme, we successfully demonstrate a 550-Gb/s high-capacity WDM-PON system with 11 WDM channels, 55 sub-bands, and 10 Gb/s per user in the downstream over 40-km SMF.
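
    The receiver-side adaptive equalization mentioned above can be illustrated with a minimal real-valued least-mean-squares (LMS) equalizer. The sketch below is a toy baseband example with a hypothetical channel, not the 400GE/WDM-PON DSP chain described in the paper.

```python
# Sketch of a least-mean-squares (LMS) adaptive equalizer of the kind used for
# receiver-side post-equalization. Real-valued toy example, not the paper's DSP chain.
import numpy as np

def lms_equalize(received, training, n_taps=11, mu=0.01):
    weights = np.zeros(n_taps)
    padded = np.concatenate([np.zeros(n_taps - 1), received])
    out = np.zeros(len(received))
    for k in range(len(received)):
        x = padded[k:k + n_taps][::-1]        # most recent sample first
        out[k] = weights @ x
        error = training[k] - out[k]          # training-aided error signal
        weights += mu * error * x             # LMS weight update
    return out, weights

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=2000)               # BPSK-like training data
channel = np.array([0.9, 0.4, 0.2])                        # bandwidth-limited channel
received = np.convolve(symbols, channel, mode="full")[:len(symbols)]
equalized, _ = lms_equalize(received, symbols)
print(np.mean(np.sign(equalized[500:]) == symbols[500:]))  # should approach 1.0
```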

  19. Digital image capture, processing, and recording system upgrade for the APS-94F SLAR

    NASA Astrophysics Data System (ADS)

    Ferraris, Guillermo L.

    2000-11-01

    The Argentine Army has been operating the APS-94F SLAR systems, on board the venerable OV-1D MOHAWK aircraft, since 1996. These systems were received from the U.S. Government through the FMS program. One major handicap of the system is due to the now obsolete imagery recording subsystem, which includes complex, obsolete optical, thermal and electro-mechanical processes and components that account for most of the degradations and distortions in the images obtained (not to mention the fact that images are recorded on a 9.5-inch silver halide film medium, which has to be kept at -17 °C and has to be brought to thermal equilibrium with the environment eight hours before the mission). An integral digital capture, processing and recording subsystem was developed at CITEFA (Instituto de Investigaciones Cientificas y Tecnicas de las Fuerzas Armadas) to replace the old analog RO-495/U recorder, as an upgrade to this very robust and proven imaging radar system. The subsystem developed includes three custom-designed ISA boards: (1) Radar video and aircraft attitude signal conditioning board, (2) Microprocessor-controlled two-channel high-speed digitizing board and (3) Integrated 12-channel GPS OEM board. The operator's software interface is a PC-based GUI C++ application, including radar imagery forming and processing algorithms, slant range to ground range conversion, digitally controlled image gain, bias and contrast adjustments, image registration (GPS), image file disk recording and retrieval functions, real time mensuration and MTI/FTI (moving target indication/fixed target indication) image correlation. The system also provides for the added capability to send compressed still radar images in NRT (near real time) to a ground receiving station through a secure data link. Due to serious space limitations inside the OV-1D two-seat cockpit, a military-grade ruggedized laptop computer and docking station hardware implementation was selected.

  20. UCMS - A new signal parameter measurement system using digital signal processing techniques. [User Constraint Measurement System

    NASA Technical Reports Server (NTRS)

    Choi, H. J.; Su, Y. T.

    1986-01-01

    The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.

  1. Operation and performance of a longitudinal feedback system using digital signal processing

    SciTech Connect

    Teytelman, D.; Fox, J.; Hindi, H.

    1994-11-22

    A programmable longitudinal feedback system using a parallel array of AT&T 1610 digital signal processors has been developed as a component of the PEP-II R&D program. This system has been installed at the Advanced Light Source (LBL) and implements full speed bunch by bunch signal processing for storage rings with bunch spacing of 4ns. Open and closed loop results showing the action of the feedback system are presented, and the system is shown to damp coupled-bunch instabilities in the ALS. A unified PC-based software environment for the feedback system operation is also described.

  2. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  3. IBIS - A geographic information system based on digital image processing and image raster datatype

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1976-01-01

    IBIS (Image Based Information System) is a geographic information system which makes use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remotely sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set. The first applications (St. Tammany Parish, Louisiana, and Los Angeles County) have been restricted to the design of a land resource inventory and analysis system. It is thought that the algorithms and the hardware interfaces developed will be readily applicable to other Landsat imagery.

  4. Performance Evaluation Method of Chemical Mechanical Polishing Pad Conditioner Using Digital Image Correlation Processing

    NASA Astrophysics Data System (ADS)

    Uneda, Michio; Omote, Tatsunori; Ishikawa, Ken-ichi; Ichikawa, Koichiro; Doi, Toshiro; Kurokawa, Syuhei; Ohnishi, Osamu

    2012-05-01

    In chemical mechanical polishing (CMP), conditioning is generally used for the regeneration of the pad surface texture. Currently, the performance evaluation of conditioners depends on the user's experience, so it is important to develop a novel quantitative evaluation method for conditioner performance. In this paper, we propose a novel evaluation method for conditioner performance using digital image correlation (DIC) processing. The proposed method can measure the in-plane micro-deformation distribution of the pad surface texture produced by conditioning. It is found that a pad surface deforms by over 40 µm during conditioning and that the in-plane deformation value increases with a decrease in the mesh size of conditioner grains.
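
    At the core of digital image correlation is the estimation of the displacement of small image subsets between a reference and a deformed image. The sketch below shows a generic FFT-based phase-correlation estimate of an integer-pixel shift; it is an illustration only, not the authors' DIC implementation.

```python
# Sketch of the cross-correlation step at the heart of digital image correlation (DIC):
# estimating the in-plane shift of a subset between two images. Generic illustration.
import numpy as np

def subset_shift(ref, deformed):
    # Phase correlation via FFT; the peak location gives the integer-pixel displacement.
    spec = np.fft.fft2(deformed) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(spec / (np.abs(spec) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak positions in the upper half of the range to negative displacements.
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(2)
ref = rng.random((64, 64))
deformed = np.roll(ref, shift=(3, -2), axis=(0, 1))   # synthetic rigid shift
print(subset_shift(ref, deformed))                    # expected: [3, -2]
```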

  5. Combined flatland ST radar and digital-barometer network observations of mesoscale processes

    NASA Technical Reports Server (NTRS)

    Clark, W. L.; Vanzandt, T. E.; Gage, K. S.; Einaudi, F. E.; Rottman, J. W.; Hollinger, S. E.

    1991-01-01

    The paper describes a six-station digital-barometer network centered on the Flatland ST radar to support observational studies of gravity waves and other mesoscale features at the Flatland Atmospheric Observatory in central Illinois. The network's current mode of operation is examined, and a preliminary example of an apparent group of waves evident throughout the network as well as throughout the troposphere is presented. Preliminary results demonstrate the capabilities of the current operational system to study wave convection, wave-front, and other coherent mesoscale interactions and processes throughout the troposphere. Unfiltered traces for the pressure and horizontal zonal wind, for days 351 to 353 UT, 1990, are illustrated.

  6. LANDSAT digital data processing: A near real-time application. [Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Barker, J. L.; Bohn, C.; Stuart, L.; Hill, J.

    1975-01-01

    An application of rapid generation of classed digital images from LANDSAT-1 was demonstrated and its feasibility evaluated by NASA in conjunction with the Environmental Protection Agency (EPA), Texas A and M University (TAMU), and the Cousteau Society. The primary purpose was to show that satellite data could be processed and transmitted to the Calypso, which was used as a research vessel, in time for use in directing it to specific locations of possible plankton upwellings, sediment, or other anomalies in the coastal water areas along the Gulf of Mexico.

  7. Rocket engine plume diagnostics using video digitization and image processing - Analysis of start-up

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Shoe, B.; Dhawan, A. P.

    1991-01-01

    Video digitization techniques have been developed to analyze the exhaust plume of the Space Shuttle Main Engine. Temporal averaging and a frame-by-frame analysis provide data used to evaluate the capabilities of image processing techniques for use as measurement tools. Capabilities include the determination of the time required for the Mach disk to reach a fully-developed state. Other results show that the Mach disk tracks the nozzle for short time intervals, and that dominant frequencies exist for the nozzle and Mach disk movement.

  8. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs. Therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected image processing techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with that of existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations.
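
    Two of the operations named above, Otsu thresholding and binarization, can be sketched compactly together with one simple stereological measure (crack area fraction). The code below is a generic NumPy illustration on synthetic data, not the program developed in the study.

```python
# Sketch of Otsu thresholding and binarization for crack analysis, plus a simple
# stereological measure (crack area fraction). Generic illustration on synthetic data.
import numpy as np

def otsu_threshold(image, n_bins=256):
    hist, edges = np.histogram(image, bins=n_bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                                  # class-0 probability
    w1 = 1 - w0                                           # class-1 probability
    mu0 = np.cumsum(hist * centers) / np.where(w0 > 0, w0, 1)
    mu_total = (hist * centers).sum()
    mu1 = (mu_total - np.cumsum(hist * centers)) / np.where(w1 > 0, w1, 1)
    between_var = w0 * w1 * (mu0 - mu1) ** 2              # between-class variance
    return centers[np.argmax(between_var)]

rng = np.random.default_rng(3)
micrograph = rng.normal(180, 10, (128, 128))
micrograph[60:68, :] = rng.normal(60, 10, (8, 128))       # dark synthetic "crack"
t = otsu_threshold(micrograph)
crack_mask = micrograph < t                               # binarization
print(f"threshold={t:.1f}, crack area fraction={crack_mask.mean():.3f}")
```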

  9. Auditory versus visual processing of three sets of simultaneous digit pairs.

    PubMed

    Rollins, H A; Schurman, D L; Evans, M J; Knoph, K

    1975-03-01

    Two experiments were conducted to determine whether the auditory and visual systems process simultaneously presented pairs of alphanumeric information differently. In Experiment 1, different groups of subjects were given extensive practice recalling pairs of superimposed visual or auditory digits in simultaneous order (the order of arrival) or successive order (one member of each digit pair in turn, followed by the other pair member). For auditory input, successive order of recall was more accurate, particularly for the last two of three pairs presented, whereas for visual input, simultaneous order of recall was more accurate. In Experiment 2, subjects were cued to recall in one or the other order either immediately before or after stimulus input. Recall order results were the same as for Experiment 1, and precuing did not facilitate recall in either order for both modalities. These results suggest that processing in the auditory system can only occur successively across time, whereas in the visual system processing can only occur simultaneously in space.

  10. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    Within the framework of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to ethylene oxide storage.

  11. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  12. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    PubMed

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  13. A Methodological Reflection on the Process of Narrative Analysis: Alienation and Identity in the Life Histories of English Language Teachers

    ERIC Educational Resources Information Center

    Menard-Warwick, Julia

    2011-01-01

    This article uses data from life-history interviews with English language teachers in Chile and California to illustrate methodological processes in teacher identity research through narrative analysis. To this end, the author describes the steps she took in identifying an issue to be examined, selecting particular narratives as representative of…

  14. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single-real valued filters. This introduces increased computation burden but also introduces a higher level of parallelism, that common computing platforms fail to identify. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as, vector inner product operators that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation and they map readily onto parallel digital architectures, which imply new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as with the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures. Typically, optical systems (like the 4f correlators) are limited to phase

  15. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data are collected. This interpretive process that results in the final geological map is often supported by recording observations, ideas and alternative geological models in a field notebook, explored with the use of sketches and evolutionary diagrams. In combination, the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has turned smartphones into geological mapping tools that can be used to collect large amounts of geological data quickly. With GPS functionality these data are also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced infield photography. In contrast, line drawing, for example for lithological boundary interpretation and sketching

  16. A Comparison of the Safety Analysis Process and the Generation IV Proliferation Resistance/Physical Protection Assessment Methodology

    SciTech Connect

    T. A. Bjornard; M. D. Zentner

    2006-05-01

    The Generation IV International Forum (GIF) is a vehicle for the cooperative international development of future nuclear energy systems. The Generation IV program has established primary objectives in the areas of sustainability, economics, safety and reliability, and Proliferation Resistance and Physical Protection (PR&PP). In order to help meet the latter objective a program was launched in December 2002 to develop a rigorous means to assess nuclear energy systems with respect to PR&PP. The study of Physical Protection of a facility is a relatively well established methodology, but an approach to evaluate the Proliferation Resistance of a nuclear fuel cycle is not. This paper will examine the Proliferation Resistance (PR) evaluation methodology being developed by the PR group, which is largely a new approach and compare it to generally accepted nuclear facility safety evaluation methodologies. Safety evaluation methods have been the subjects of decades of development and use. Further, safety design and analysis is fairly broadly understood, as well as being the subject of federally mandated procedures and requirements. It is therefore extremely instructive to compare and contrast the proposed new PR evaluation methodology process with that used in safety analysis. By so doing, instructive and useful conclusions can be derived from the comparison that will help to strengthen the PR methodological approach as it is developed further. From the comparison made in this paper it is evident that there are very strong parallels between the two processes. Most importantly, it is clear that the proliferation resistance aspects of nuclear energy systems are best considered beginning at the very outset of the design process. Only in this way can the designer identify and cost effectively incorporate intrinsic features that might be difficult to implement at some later stage. Also, just like safety, the process to implement proliferation resistance should be a dynamic

  17. Enhanced electrokinetic manipulation and impedance sensing using FPGA digital signal processing

    NASA Astrophysics Data System (ADS)

    Higginbotham, Steven N.; Sweatman, Denis R.

    2006-01-01

    Electrokinetic manipulation of microscopic biological particles, such as bacteria and other cells, is useful in the technology of lab-on-a-chip devices and micro-total-analysis systems (μTAS). In electrokinetic manipulation, non-uniform electric fields are used to exploit the dielectric properties of suspended biological microparticles, to induce forces and torques on the particles. The electric fields are produced by planar electrode arrays patterned on electrically-insulating substrates. Biological microparticles are dielectrically-heterogeneous structures. Each different type of biological cell has a distinct dielectric frequency response signature. This dielectric distinction allows specificity when manipulating biological microparticles using electrokinetics. Electrokinetic microbiological particle manipulation has numerous potential applications in biotechnology, such as the separation and study of cancerous cells, determining the viability of cells, as well as enabling more automation and parallelization in microbiological research and pathology. This paper presents microfabricated devices for the manipulation of biological microparticles using electrokinetics. Methods of impedance sensing for determining microparticle concentration and type are also discussed. This paper also presents methods of using digital signal processing systems to enhance the manipulation and sensing of the microbiological particles. A Field-Programmable Gate Array (FPGA) based system is demonstrated which is used to digitally synthesize signals for electrokinetic actuation, and to process signals for impedance sensing.

  18. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM) it is of great importance to reveal valuable information from the recorded SHM data that could be used to predict or indicate structural fault or damage in a building. In this work a combination of digital signal processing methods, namely the FFT along with the Wavelet Transform, is applied, together with a proposed algorithm for studying frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute and undergo heavy use, stress and visits from academic staff and students. The SHM data are collected from two neighboring buildings of different ages (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, and a comparison is presented of the structural behavior of both buildings in response to seismic activity, weather conditions and man-made activity. Acknowledgments: This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)», and is co-financed by the European Union (European Social Fund) and Greek national funds.
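
    To indicate the kind of combined spectral and wavelet analysis referred to above, the hedged sketch below applies an FFT and a continuous wavelet transform to a synthetic acceleration record. It assumes the PyWavelets package (pywt) is available, uses invented modal frequencies, and does not reproduce the authors' frequency-dispersion algorithm.

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    fs = 200.0                                   # sampling rate (Hz), assumed
    t = np.arange(0, 60, 1 / fs)                 # 60 s acceleration record
    # Synthetic building response: two modal frequencies plus noise
    accel = (np.sin(2 * np.pi * 2.1 * t)
             + 0.4 * np.sin(2 * np.pi * 7.3 * t)
             + 0.2 * np.random.randn(t.size))

    # FFT: amplitude spectrum to locate dominant structural frequencies
    spectrum = np.abs(np.fft.rfft(accel)) / t.size
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]
    print(f"dominant frequency = {dominant:.2f} Hz")

    # Continuous wavelet transform: a time-frequency picture that can reveal
    # non-stationary behaviour missed by the FFT alone
    scales = np.arange(1, 128)
    coeffs, cwt_freqs = pywt.cwt(accel, scales, 'morl', sampling_period=1 / fs)
    print("CWT coefficient matrix shape:", coeffs.shape)
    ```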

  19. [Measuring the contrast resolution limits of human vision based on the modern digital image processing].

    PubMed

    Wang, Zhifang; Liu, Yuhong; Wang, Ying; Li, Hong; Li, Zhangyong; Zhao, Zhiqiang; Xie, Zhengxiang

    2008-10-01

    In the literature on human vision physiology and physics, there are reports of a spatial resolution limit of 1' of visual angle, a frequency (wavelength) resolution limit of 5 nm and a temporal resolution limit of 0.1 s for human vision. However, there has been no report of the contrast resolution limit of human vision, and in particular no report of a measuring method and results for that limit based on modern digital image processing. Here we report a method for measuring the contrast resolution limit of human vision based on computer digital image processing technology, and we present the measured results and their mathematical models. The functional relationships of the contrast resolution limit varying with background gray level under photopic and scotopic conditions, respectively, are described. It can be expected that such investigations of human vision will establish a physiological foundation for the theories and techniques used in hiding bodies and figures (stealth), in detecting hidden bodies and figures, in night vision systems that do not depend on infrared, as well as in their related industries.
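
    The measurement principle, presenting a test patch whose gray level differs from a uniform background by a progressively smaller step until the observer can no longer distinguish it, can be sketched as follows. This is an illustrative stimulus generator written for this summary, not the authors' software; the gray levels, patch size and step sequence are assumptions.

    ```python
    import numpy as np

    def contrast_stimulus(background, delta, size=512, patch=64):
        """Return an 8-bit test image: a uniform background with a central
        square patch whose gray level differs from it by `delta`."""
        img = np.full((size, size), background, dtype=np.uint8)
        lo, hi = size // 2 - patch // 2, size // 2 + patch // 2
        img[lo:hi, lo:hi] = np.clip(background + delta, 0, 255)
        return img

    # Descending staircase of gray-level differences shown to an observer; the
    # smallest detectable delta estimates the contrast resolution limit at this
    # background gray level.
    background = 128
    for delta in (32, 16, 8, 4, 2, 1):
        stim = contrast_stimulus(background, delta)
        # In an experiment the image would be displayed and the observer's
        # response recorded; here we just report the nominal Weber contrast.
        print(f"delta={delta:2d}  Weber contrast = {delta / background:.4f}")
    ```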

  20. Processing of multi-digit additions in high math-anxious individuals: psychophysiological evidence

    PubMed Central

    Núñez-Peña, María Isabel; Suárez-Pellicioni, Macarena

    2015-01-01

    We investigated the time course of neural processing of multi-digit additions in high- (HMA) and low-math anxious (LMA) individuals. Seventeen HMA and 17 LMA individuals were presented with two-digit additions and were asked to perform a verification task. Behavioral data showed that HMA individuals were slower and more error prone than their LMA peers, and that incorrect solutions were solved more slowly and less accurately than correct ones. Moreover, HMA individuals tended to need more time and commit more errors when having to verify incorrect solutions than correct ones. ERPs time-locked to the presentation of the addends (calculation phase) and to the presentation of the proposed solution (verification phase) were also analyzed. In both phases, a P2 component of larger amplitude was found for HMA individuals than for their LMA peers. Because the P2 component is considered to be a biomarker of the mobilization of attentional resources toward emotionally negative stimuli, these results suggest that HMA individuals may have invested more attentional resources both when processing the addends (calculation phase) and when they had to report whether the proposed solution was correct or not (verification phase), as compared to their LMA peers. Moreover, in the verification phase, LMA individuals showed a larger late positive component (LPC) for incorrect solutions at parietal electrodes than their HMA counterparts. The smaller LPC shown by HMA individuals when verifying incorrect solutions suggests that these solutions may have appeared more plausible to them than to their LMA counterparts. PMID:26347705

  1. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    PubMed

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was judged useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in risk levels, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.

  2. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    NASA Astrophysics Data System (ADS)

    Frigeri, Alessandro; Hare, Trent; Neteler, Markus; Coradini, Angioletta; Federico, Costanzo; Orosei, Roberto

    2011-09-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems in which the mapping process can be carried out entirely in the digital domain. The spread of networked computers on a global scale has allowed the rapid propagation of software and digital data, such that every researcher can now access digital mapping facilities on their desktop. The efforts to keep planetary mission data accessible to the scientific community have led to the creation of standardized digital archives that facilitate access to different datasets by software capable of processing these data from the raw level to the map-projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, analysis, and retrieval of spatially referenced Earth-based environmental geodata; over the last decade these computer programs have become popular among the planetary science community, and recent mission data have started to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  3. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems in which the mapping process can be carried out entirely in the digital domain. The spread of networked computers on a global scale has allowed the rapid propagation of software and digital data, such that every researcher can now access digital mapping facilities on their desktop. The efforts to keep planetary mission data accessible to the scientific community have led to the creation of standardized digital archives that facilitate access to different datasets by software capable of processing these data from the raw level to the map-projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, analysis, and retrieval of spatially referenced Earth-based environmental geodata; over the last decade these computer programs have become popular among the planetary science community, and recent mission data have started to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey
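
    As a rough indication of the workflow such an environment supports, the hedged sketch below uses GRASS GIS's Python scripting interface to import a map-projected ISIS cube and query it. It assumes a running GRASS session and a GDAL build able to read ISIS3 cubes; the file name, map name and color table are placeholders, not values from the paper.

    ```python
    # Minimal sketch of moving a map-projected ISIS cube into GRASS GIS for analysis.
    # Assumes it is run inside an existing GRASS session (e.g., via `grass --exec`)
    # and that GDAL can read the ISIS3 cube; names are placeholders.
    import grass.script as gs

    gs.run_command('r.in.gdal', input='mola_dem.cub', output='mola_dem', flags='o')
    gs.run_command('g.region', raster='mola_dem')            # match region to the raster
    gs.run_command('r.colors', map='mola_dem', color='elevation')

    stats = gs.parse_command('r.univar', map='mola_dem', flags='g')
    print("elevation range:", stats['min'], "to", stats['max'])
    ```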

  4. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analyzing structural relationships in a system of stationary random processes. We introduce the area of random processes, describe the process of structural analysis, and select suitable data mining methods applicable to structural analysis. We then propose, on this theoretical basis, a methodology for structural analysis in a system of stationary stochastic processes using data mining methods within an active experimental approach.
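
    The abstract does not state which data mining methods are ultimately selected, so the sketch below is only a simple stand-in: it screens pairs of synthetic stationary series for structural relationships with a lagged cross-correlation measure, one elementary data-driven way of detecting such links.

    ```python
    import numpy as np

    def max_lagged_corr(x, y, max_lag=20):
        """Largest absolute Pearson correlation between x and y over lags
        -max_lag..max_lag; a crude screen for a structural relationship."""
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            if lag >= 0:
                a, b = x[lag:], y[:len(y) - lag]
            else:
                a, b = x[:lag], y[-lag:]
            best = max(best, abs(np.corrcoef(a, b)[0, 1]))
        return best

    # Three synthetic stationary processes; z depends on x with a 5-sample delay
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)
    y = rng.standard_normal(2000)
    z = np.roll(x, 5) + 0.3 * rng.standard_normal(2000)

    series = {'x': x, 'y': y, 'z': z}
    for a in series:
        for b in series:
            if a < b:
                print(a, b, round(max_lagged_corr(series[a], series[b]), 2))
    ```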

  5. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Technical Reports Server (NTRS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    1989-01-01

    The task of differentiating very small metal aggregates supported on amorphous films from the phase-contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis even for micrographs in which, by naked-eye examination, the distinction between particles and spurious substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions, and it is concluded that these particles consist of a 'magic number' of 13 atoms and have a cuboctahedral or icosahedral crystal structure.
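
    A present-day analogue of the particle-detection step, thresholding a digitized micrograph and measuring the detected features, can be sketched with scikit-image. This is an illustrative modern equivalent rather than the 1989 processing pipeline; the synthetic image and the nanometer-per-pixel calibration are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage import filters, measure

    # Synthetic stand-in for a bright-particle micrograph: square "particles" on a
    # noisy amorphous background (a real micrograph would be read from file instead)
    rng = np.random.default_rng(0)
    image = rng.normal(0.1, 0.02, (256, 256))
    for _ in range(30):
        y, x = rng.integers(10, 234, 2)
        image[y:y + 12, x:x + 12] += 0.5
    image = ndimage.gaussian_filter(image, sigma=2)

    nm_per_pixel = 0.05   # assumed calibration of the digitized micrograph

    # Global threshold separates candidate particles from the support film
    mask = image > filters.threshold_otsu(image)
    labels = measure.label(mask)
    diameters_nm = [region.equivalent_diameter * nm_per_pixel
                    for region in measure.regionprops(labels)]

    print(f"{len(diameters_nm)} particles detected, "
          f"median diameter = {np.median(diameters_nm):.2f} nm")
    ```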

  6. Research on pairing method of chromosome of endangered amphioxus based on digital image processing technology

    NASA Astrophysics Data System (ADS)

    Li, Jinping; Jia, Hongwei; Mu, Hongshan; Tan, Hai

    2010-07-01

    This paper studies a new method for chromosome pairing of amphioxus using digital image processing technology. Chromosome banding images of the amphioxus Branchiostoma belcheri tsingtauense were selected as experimental material; objective data such as the perimeter, long axis and short axis of each chromosome were compared and analyzed, and chromosome pairing of amphioxus was achieved, with band features from Laplace and Sobel edge detection used as a supplementary pairing criterion. Based on the objective data obtained by the program, this method improves on the subjective pairing method in which image processing software is used to measure the long arm and short arm of the chromosome. The results show that the pairing method is effective, accurate and practical. This study has important theoretical and practical significance for further study of chromosome gene mapping and cell genetics of amphioxus.
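
    The pairing idea, measuring objective shape features for each chromosome and matching the most similar ones, can be indicated with the simplified sketch below. The feature values are synthetic and the greedy nearest-neighbour matching rule is a stand-in for the paper's procedure, not a reproduction of it.

    ```python
    import numpy as np
    from itertools import combinations

    # Synthetic feature vectors (perimeter, long axis, short axis) for 8 chromosomes;
    # in the paper these come from measurements on the banding image.
    rng = np.random.default_rng(1)
    base = rng.uniform(20, 60, size=(4, 3))
    features = np.vstack([base, base + rng.normal(0, 0.5, base.shape)])  # 4 homologous pairs

    def pair_chromosomes(feats):
        """Greedy pairing: repeatedly join the two unpaired chromosomes whose
        normalized feature vectors are closest."""
        scaled = (feats - feats.mean(0)) / feats.std(0)
        unpaired = set(range(len(feats)))
        pairs = []
        while len(unpaired) > 1:
            i, j = min(combinations(unpaired, 2),
                       key=lambda ij: np.linalg.norm(scaled[ij[0]] - scaled[ij[1]]))
            pairs.append((i, j))
            unpaired -= {i, j}
        return pairs

    print(pair_chromosomes(features))
    ```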

  7. Policy through procurement - the introduction of digital signal process (DSP) hearing aids into the English NHS.

    PubMed

    Phillips, Wendy; Knight, Louise; Caldwell, Nigel; Warrington, John

    2007-01-01

    Despite being a major user of many technologies and innovations, the healthcare sector's role and influence as a procurer of technologies has been poorly represented in the literature and consequently is not fully understood. Providing a practical example of the introduction of digital signal processing (DSP) hearing aids into the English NHS, this paper discusses the role of public sector procurement agencies in the uptake of technologies from the private sector and their adoption by the public sector. Employing a system of innovation (SI) approach, the paper highlights the need for policy-makers to adopt a dynamic as well as systemic perspective that recognises the shifting roles, responsibilities and interactions of key stakeholders throughout the innovation process.

  8. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-11-05

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG).
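
    The IIR filtering stage performed by the virtual instrument can be illustrated with SciPy. The sampling rate, cut-off frequency, filter order and interference frequency below are assumptions rather than the values used in the paper.

    ```python
    import numpy as np
    from scipy import signal

    fs = 1000.0      # sampling rate (Hz), assumed
    fc = 40.0        # low-pass cut-off (Hz), assumed

    # 4th-order Butterworth IIR low-pass, applied forward-backward to avoid phase lag
    b, a = signal.butter(4, fc, btype='low', fs=fs)

    t = np.arange(0, 2, 1 / fs)
    magnetogram = np.sin(2 * np.pi * 1.5 * t)          # slow respiratory component
    interference = 0.3 * np.sin(2 * np.pi * 60 * t)    # power-line EMI, assumed
    raw = magnetogram + interference + 0.05 * np.random.randn(t.size)

    filtered = signal.filtfilt(b, a, raw)
    print("residual RMS error:", np.sqrt(np.mean((filtered - magnetogram) ** 2)))
    ```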

  9. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.
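
    The per-pixel magnitude and phase detection at the intermediate frequency can be sketched as a digital quadrature demodulation over a stack of frames. The frame rate, intermediate frequency and array size are placeholders, and this is not the DSP code of the LEI system itself.

    ```python
    import numpy as np

    frame_rate = 5000.0          # sensor frame rate (Hz), assumed
    f_if = 1000.0                # intermediate frequency (Hz), assumed
    n_frames = 200
    t = np.arange(n_frames) / frame_rate

    # Synthetic stack of frames: every pixel carries the IF with its own magnitude
    # and phase, standing in for the downconverted RF field
    mag_true = np.random.rand(64, 64)
    phase_true = np.random.uniform(-np.pi, np.pi, (64, 64))
    frames = mag_true[None] * np.cos(2 * np.pi * f_if * t[:, None, None] + phase_true[None])

    # Per-pixel quadrature demodulation over the frame stack
    ref_c = np.cos(2 * np.pi * f_if * t)[:, None, None]
    ref_s = np.sin(2 * np.pi * f_if * t)[:, None, None]
    i_img = 2 * np.mean(frames * ref_c, axis=0)
    q_img = -2 * np.mean(frames * ref_s, axis=0)

    mag_img = np.hypot(i_img, q_img)
    phase_img = np.arctan2(q_img, i_img)
    print("max magnitude error:", np.max(np.abs(mag_img - mag_true)))
    ```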

  10. Application of three-dimensional digital image processing for reconstruction of microstructural volume from serial sections

    SciTech Connect

    Tewari, A.; Gokhale, A.M.

    2000-03-01

    Three-dimensional digital image processing is useful for the reconstruction of a microstructural volume from a stack of serial sections. Application of this technique is demonstrated via reconstruction of a volume segment of the liquid-phase sintered microstructure of a tungsten heavy alloy processed in the microgravity environment of NASA's space shuttle, Columbia. Ninety serial sections (approximately one micrometer apart) were used for reconstruction of the three-dimensional microstructure. The three-dimensional reconstruction clearly revealed that the tungsten grains are almost completely connected in three-dimensional space. Both the matrix and the grains are topologically co-continuous, although the alloy was liquid-phase sintered in microgravity. Therefore, the absence of gravity did not produce a microstructure consisting of discrete isolated W grains uniformly dispersed in the liquid Ni-Fe alloy matrix at the sintering temperature.
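
    The reconstruction step, stacking aligned serial sections into a voxel volume and checking the three-dimensional connectivity of the grain phase, can be sketched with NumPy and SciPy. The volume here is synthetic and the 50 percent grain fraction is an assumption; real data would be the registered, binarized micrographs.

    ```python
    import numpy as np
    from scipy import ndimage

    # Synthetic stand-in for 90 aligned serial sections: a (z, y, x) voxel volume
    # in which the "grain" phase occupies about half the volume
    rng = np.random.default_rng(0)
    volume = ndimage.gaussian_filter(rng.standard_normal((90, 256, 256)), sigma=4)
    grains = volume > np.percentile(volume, 50)     # ~50 vol% grain phase, assumed

    # 3-D connected-component labelling shows how interconnected the grain phase is
    labels, n_components = ndimage.label(grains)
    sizes = np.bincount(labels.ravel())[1:]
    print(f"{n_components} connected grain clusters; largest holds "
          f"{sizes.max() / grains.sum():.1%} of all grain voxels")
    ```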

  11. Real-time digital holographic microscopy using the graphic processing unit.

    PubMed

    Shimobaba, Tomoyoshi; Sato, Yoshikuni; Miura, Junya; Takenouchi, Mai; Ito, Tomoyoshi

    2008-08-04

    Digital holographic microscopy (DHM) is a well-known powerful method allowing both the amplitude and phase of a specimen to be simultaneously observed. In order to obtain a reconstructed image from a hologram, numerous calculations for the Fresnel diffraction are required. The Fresnel diffraction can be accelerated by the FFT (Fast Fourier Transform) algorithm. However, real-time reconstruction from a hologram is difficult even if we use a recent central processing unit (CPU) to calculate the Fresnel diffraction by the FFT algorithm. In this paper, we describe a real-time DHM system using a graphic processing unit (GPU) with many stream processors, which allows use as a highly parallel processor. The computational speed of the Fresnel diffraction using the GPU is faster than that of recent CPUs. The real-time DHM system can obtain reconstructed images from holograms whose size is 512 x 512 grids in 24 frames per second.
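
    The numerical core of the reconstruction, propagating the recorded hologram by Fresnel diffraction evaluated with FFTs, can be sketched in NumPy using the standard transfer-function formulation. This is written for illustration rather than taken from the paper's GPU code; the wavelength, pixel pitch and reconstruction distance are assumptions.

    ```python
    import numpy as np

    def fresnel_propagate(field, wavelength, pixel_pitch, distance):
        """Propagate a complex field by `distance` using the Fresnel transfer
        function evaluated on the FFT frequency grid (global phase omitted)."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pixel_pitch)
        fy = np.fft.fftfreq(ny, d=pixel_pitch)
        fxx, fyy = np.meshgrid(fx, fy)
        h = np.exp(-1j * np.pi * wavelength * distance * (fxx**2 + fyy**2))
        return np.fft.ifft2(np.fft.fft2(field) * h)

    # 512 x 512 hologram (random values as a stand-in for recorded data),
    # reconstructed at an assumed distance with assumed optical parameters
    hologram = np.random.rand(512, 512).astype(np.complex128)
    recon = fresnel_propagate(hologram, wavelength=532e-9,
                              pixel_pitch=10e-6, distance=0.05)
    amplitude, phase = np.abs(recon), np.angle(recon)
    print(amplitude.shape, phase.mean())
    ```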

  12. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system for time-lapse reproduction of images through interactive graphics (TRIIG). The system produces quality hard copy of processed images quickly and inexpensively. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high-speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer-coupled electron microscopes for high-magnification studies, or computer-coupled X-ray devices for medical research.

  13. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG). PMID:24196434

  14. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Technical Reports Server (NTRS)

    Buel, V. E.

    1981-01-01

    A modular system composed of two sets of tasks which interpret the flaw data and allow compensation of the data for transducer characteristics is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job operating system (version 2C-02), controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11 operating system (version 2C-02), stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessor task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data are then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  15. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Astrophysics Data System (ADS)

    Buel, V. E.

    1981-09-01

    A modular system composed of two sets of tasks which interpret the flaw data and allow compensation of the data for transducer characteristics is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job operating system (version 2C-02), controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11 operating system (version 2C-02), stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessor task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data are then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  16. Phenopix: a R package to process digital images of a vegetation cover

    NASA Astrophysics Data System (ADS)

    Filippa, Gianluca; Cremonese, Edoardo; Migliavacca, Mirco; Galvagno, Marta; Morra di Cella, Umberto; Richardson, Andrew

    2015-04-01

    Plant phenology is a globally recognized indicator of the effects of climate change on the terrestrial biosphere. Accordingly, new tools to automatically track the seasonal development of a vegetation cover are becoming available and increasingly deployed. Among them, near-continuous digital images are being collected in several networks in the US, Europe, Asia and Australia, in a range of different ecosystems including agricultural lands, deciduous and evergreen forests, and grasslands. The growing scientific interest in vegetation image analysis highlights the need for easy-to-use, flexible and standardized processing techniques. In this contribution we illustrate a new open source package called "phenopix", written in the R language, that allows users to process images of a vegetation cover. The main features include: (i) definition of one or more areas of interest on an image and processing of the pixel information within them; (ii) computation of vegetation indexes based on the red, green and blue channels; (iii) fitting of a curve to the seasonal trajectory of the vegetation indexes and extraction of relevant dates (thresholds) along that trajectory; (iv) pixel-by-pixel analysis to extract spatially explicit phenological information. The utilities of the package are illustrated in detail for two subalpine sites, a grassland and a larch stand at about 2000 m in the Italian Western Alps. The phenopix package is a cost-free and easy-to-use tool that allows digital images of a vegetation cover to be processed in a standardized, flexible and reproducible way. The software is available for download at the R-Forge web site (r-forge.r-project.org/projects/phenopix/).
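
    phenopix itself is an R package; purely to illustrate the kind of index computation it performs over a region of interest, the Python sketch below derives the green chromatic coordinate from the red, green and blue channels within a rectangular ROI. The synthetic frame and the ROI coordinates are placeholders.

    ```python
    import numpy as np

    # Synthetic stand-in for a phenocam frame (RGB, values in 0..255); in practice
    # this would be one image from the camera time series.
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(480, 640, 3)).astype(float)

    # Region of interest covering the vegetated part of the scene (placeholder box)
    roi = frame[200:400, 100:500, :]

    r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
    total = r + g + b
    gcc = np.mean(g / np.where(total == 0, 1, total))   # green chromatic coordinate
    print(f"ROI green chromatic coordinate = {gcc:.3f}")
    ```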

  17. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English, or a block level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.

  18. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  19. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.
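
    The classification stage can be indicated with a deliberately simplified stand-in: the paper implements a tensorial kernel on the FPGA, whereas the sketch below trains a standard RBF-kernel SVM on flattened synthetic tactile frames with scikit-learn, so it illustrates only the general workflow of classifying touch modalities from tactile data.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    # Synthetic tactile frames: 4x4 taxel maps for two touch modalities
    # ("press" = centred blob, "slide" = lateral gradient); real data would
    # come from the electronic-skin sensor array.
    rng = np.random.default_rng(0)

    def make_frame(kind):
        yy, xx = np.mgrid[0:4, 0:4]
        if kind == 'press':
            frame = np.exp(-((yy - 1.5) ** 2 + (xx - 1.5) ** 2))
        else:  # 'slide'
            frame = xx / 3.0
        return (frame + 0.1 * rng.standard_normal((4, 4))).ravel()

    X = np.array([make_frame(k) for k in ['press', 'slide'] * 200])
    y = np.array([0, 1] * 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = SVC(kernel='rbf', gamma='scale').fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```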

  20. The Process of Creation: A Novel Methodology for Analyzing Multimodal Data

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld; Bass, Michelle; Woods, David

    2012-01-01

    In the 21st century, meaning making is a multimodal act; we communicate what we know and how we know it using much more than printed text on a blank page. As a result, qualitative researchers need new methodologies, methods, and tools for working with the complex artifacts that our research subjects produce. In this article we describe the…