Science.gov

Sample records for digital processing methodology

  1. Digital methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  2. Digital methodology to implement the ECOUTER engagement process

    PubMed Central

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  4. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regard to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology, as part of a larger strategy of product system modeling, to improve disciplinary concurrency and realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'real' part geometry and consequent assembly.

  5. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…

  6. Digital signal processing: Handbook

    NASA Astrophysics Data System (ADS)

    Goldenberg, L. M.; Matiushkin, B. D.; Poliak, M. N.

    The fundamentals of the theory and design of systems and devices for the digital processing of signals are presented. Particular attention is given to algorithmic methods of synthesis and digital processing equipment in communication systems (e.g., selective digital filtering, spectral analysis, and variation of the signal discretization frequency). Programs for the computer-aided analysis of digital filters are described. Computational examples are presented, along with tables of transfer function coefficients for recursive and nonrecursive digital filters.

  7. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These digital techniques have dramatically advanced and simplified how we collect and analyze data in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example, for stereoplot analysis). Requiring that all maps

  8. Processing Digital Imagery Data

    NASA Technical Reports Server (NTRS)

    Conner, P. K.; Junkin, B. G.; Graham, M. H.; Kalcic, M. T.; Seyfarth, B. R.

    1985-01-01

    Earth Resources Laboratory Applications Software (ELAS) is a geobased information system designed for analyzing and processing digital imagery data. ELAS offers users of remotely sensed data a wide range of easy-to-use capabilities in areas of land cover analysis. The ELAS system is written in FORTRAN and Assembler for batch or interactive processing.

  9. Digital image processing.

    PubMed

    Lo, Winnie Y; Puchalski, Sarah M

    2008-01-01

    Image processing or digital image manipulation is one of the greatest advantages of digital radiography (DR). Preprocessing depends on the modality and corrects for system irregularities such as differential light detection efficiency, dead pixels, or dark noise. Processing is manipulation of the raw data just after acquisition. It is generally proprietary and specific to the DR vendor but encompasses manipulations such as unsharp mask filtering within two or more spatial frequency bands, histogram sliding and stretching, and gray scale rendition or lookup table application. These processing steps have a profound effect on the final appearance of the radiograph, but they can also lead to artifacts unique to digital systems. Postprocessing refers to manipulation of the final appearance of the radiograph by the end-user and does not involve alteration of the raw data.
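
    As a rough illustration of the unsharp-mask and histogram-windowing steps mentioned above, the sketch below applies a generic two-band unsharp mask followed by a window/level rescale in Python (NumPy/SciPy). The kernel widths, weights, and window limits are illustrative placeholders, not any vendor's proprietary processing parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=3.0, amount=1.0):
    """Sharpen one spatial-frequency band: image + amount * (image - blurred)."""
    blurred = gaussian_filter(image.astype(float), sigma)
    return image + amount * (image - blurred)

def window(image, low, high):
    """Histogram 'sliding and stretching': clip to [low, high] and rescale to 0..1."""
    img = np.clip(image.astype(float), low, high)
    return (img - low) / (high - low)

# Toy raw detector image (placeholder for real DR data).
rng = np.random.default_rng(0)
raw = rng.normal(1000.0, 50.0, size=(256, 256))

# Two-band enhancement followed by a lookup-table style grayscale rendition.
enhanced = unsharp_mask(unsharp_mask(raw, sigma=1.0, amount=0.5), sigma=8.0, amount=0.3)
display = window(enhanced, low=850.0, high=1150.0)
```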

  10. Digital processing clock

    NASA Technical Reports Server (NTRS)

    Phillips, D. H.

    1982-01-01

    The digital processing clock SG 1157/U is described. It is compatible with the PTTI world, where it can be driven by an external cesium source. Built-in test equipment shows synchronization with cesium through 1 pulse per second. It is built to be expandable to accommodate future timekeeping needs of the Navy as well as any other time-ordered functions. Examples of this expandability are the inclusion of an unmodulated XR3 time code and the 2137 modulated time code (XR3 with 1 kHz carrier).

  11. Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Baher, H.

    The techniques of signal processing in both the analog and digital domains are addressed in a fashion suitable for undergraduate courses in modern electrical engineering. The topics considered include: spectral analysis of continuous and discrete signals, analysis of continuous and discrete systems and networks using transform methods, design of analog and digital filters, digitization of analog signals, power spectrum estimation of stochastic signals, FFT algorithms, finite word-length effects in digital signal processes, linear estimation, and adaptive filtering.

  12. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Meyer, G.

    The theory, realization techniques, and applications of digital filtering are surveyed, with an emphasis on the development of software, in a handbook for advanced students of electrical and electronic engineering and practicing development engineers. The foundations of the theory of discrete signals and systems are introduced. The design of one-dimensional linear systems is discussed, and the techniques are expanded to the treatment of two-dimensional discrete and multidimensional analog systems. Numerical systems, quantification and limitation, and the characteristics of particular signal-processing devices are considered in a section on design realization. An appendix contains definitions of the basic mathematical concepts, derivations and proofs, and tables of integration and differentiation formulas.

  13. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques transmitted over a channel-encoded Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
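
    A minimal sketch of the Hadamard-domain motion-detection idea follows, assuming 8x8 blocks, a Sylvester-ordered Hadamard matrix, and an arbitrary change threshold; it is an illustration of the general technique, not the Space Shuttle compressor design.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def block_transform(frame, H):
    """2-D Hadamard transform of non-overlapping blocks of size H.shape[0]."""
    b = H.shape[0]
    h, w = frame.shape
    out = np.empty_like(frame, dtype=float)
    for i in range(0, h, b):
        for j in range(0, w, b):
            out[i:i+b, j:j+b] = H @ frame[i:i+b, j:j+b] @ H.T / b
    return out

def moving_blocks(prev, curr, H, threshold=50.0):
    """Flag blocks whose Hadamard coefficients changed by more than `threshold` (RMS)."""
    b = H.shape[0]
    d = block_transform(curr, H) - block_transform(prev, H)
    flags = []
    for i in range(0, prev.shape[0], b):
        for j in range(0, prev.shape[1], b):
            if np.sqrt(np.mean(d[i:i+b, j:j+b] ** 2)) > threshold:
                flags.append((i, j))
    return flags  # only these blocks would be re-coded; static blocks are skipped

H8 = hadamard(8)
rng = np.random.default_rng(1)
frame0 = rng.integers(0, 256, size=(64, 64)).astype(float)
frame1 = frame0.copy()
frame1[16:24, 16:24] += 120.0          # simulated motion in one block
print(moving_blocks(frame0, frame1, H8))
```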

  14. Digital Video as Research Practice: Methodology for the Millennium

    ERIC Educational Resources Information Center

    Shrum, Wesley; Duque, Ricardo; Brown, Timothy

    2005-01-01

    This essay has its origin in a project on the globalization of science that rediscovered the wisdom of past research practices through the technology of the future. The main argument of this essay is that a convergence of digital video technologies with practices of social surveillance portends a methodological shift towards a new variety of…

  15. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  16. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs, and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.

  17. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  18. Multidimensional digital signal processing

    NASA Astrophysics Data System (ADS)

    Lanfear, T. A.; Constantinides, A. G.

    1984-06-01

    The computer program SIMUL is intended to simulate the ALPS system architecture at a high level so as to answer such questions as: Is a signal processing application feasible with a particular hardware configuration? How fast can the processing be performed? Will the system degrade gracefully if some of the resources fail? What is the effect on system performance of changes to details such as the number of resources available or the execution time of a resource? This document should be read in conjunction with previous documentation for ALPS. The program takes as input data the following information: the number of nodes in the signal flow graph, the number of types of resources, the number of data busses, the time to transfer a block of data from one resource to another, the signal flow graph connectivity and edge prioritization in the form of an adjacency matrix, the number of each type of resource, the execution time of each resource and the type of resource associated with each graph node.

  19. Image processing in digital radiography.

    PubMed

    Freedman, M T; Artz, D S

    1997-01-01

    Image processing is a critical part of obtaining high-quality digital radiographs. Fortunately, the user of these systems does not need to understand image processing in detail, because the manufacturers provide good starting values. Because radiologists may have different preferences in image appearance, it is helpful to know that many aspects of image appearance can be changed by image processing, and a new preferred setting can be loaded into the computer and saved so that it can become the new standard processing method. Image processing allows one to change the overall optical density of an image and to change its contrast. Spatial frequency processing allows an image to be sharpened, improving its appearance. It also allows noise to be blurred so that it is less visible. Care is necessary to avoid the introduction of artifacts or the hiding of mediastinal tubes.

  20. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state of the art in analytical techniques for the development and verification of digital flight control software is studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  1. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  2. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.
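
    The map-and-envelope selection described in these two records can be illustrated with a toy example: given a grid of weld trials over RPM and IPM, keep the acceptable region and pick the interior point farthest from any failure as the nominal schedule. The grid values and acceptance flags below are invented for illustration, not real friction-stir data.

```python
import numpy as np

# Toy weld-trial grid: rows = spindle speed (RPM), cols = travel speed (IPM).
rpm = np.array([200, 250, 300, 350, 400])
ipm = np.array([4, 6, 8, 10])

# acceptable[i, j] = True if the weld at (rpm[i], ipm[j]) passed inspection.
acceptable = np.array([
    [False, False, False, False],
    [False, True,  True,  False],
    [True,  True,  True,  True ],
    [False, True,  True,  False],
    [False, False, False, False],
])

# "Sweet spot": the acceptable schedule farthest (in grid steps) from any failed weld,
# a simple stand-in for picking the most robust point inside the process envelope.
best, best_margin = None, -1
for i, r in enumerate(rpm):
    for j, s in enumerate(ipm):
        if not acceptable[i, j]:
            continue
        margin = min(abs(i - k) + abs(j - l)
                     for k in range(len(rpm)) for l in range(len(ipm))
                     if not acceptable[k, l])
        if margin > best_margin:
            best, best_margin = (int(r), int(s)), margin

print("nominal weld schedule (RPM, IPM):", best, "margin:", best_margin)
```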

  3. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.

  4. Seamless lesion insertion in digital mammography: methodology and reader study

    NASA Astrophysics Data System (ADS)

    Pezeshk, Aria; Petrick, Nicholas; Sahiner, Berkman

    2016-03-01

    Collection of large repositories of clinical images containing verified cancer locations is costly and time consuming due to difficulties associated with both the accumulation of data and establishment of the ground truth. This problem poses a significant challenge to the development of machine learning algorithms that require large amounts of data to properly train and avoid overfitting. In this paper we expand the methods in our previous publications by making several modifications that significantly increase the speed of our insertion algorithms, thereby allowing them to be used for inserting lesions that are much larger in size. These algorithms have been incorporated into an image composition tool that we have made publicly available. This tool allows users to modify or supplement existing datasets by seamlessly inserting a real breast mass or micro-calcification cluster extracted from a source digital mammogram into a different location on another mammogram. We demonstrate examples of the performance of this tool on clinical cases taken from the University of South Florida Digital Database for Screening Mammography (DDSM). Finally, we report the results of a reader study evaluating the realism of inserted lesions compared to clinical lesions. Analysis of the radiologist scores in the study using receiver operating characteristic (ROC) methodology indicates that inserted lesions cannot be reliably distinguished from clinical lesions.

  5. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques for the digital enhancement of radiographs are presented, along with the software documentation. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of format of data from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operation. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.

  6. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
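
    Step (1), demodulating the BPSK signal digitally, can be sketched as follows. This is a generic coherent demodulator (multiply by a synchronized carrier, integrate over each bit, slice the sign); the sample rate, carrier frequency, and bit rate are illustrative, and carrier/bit synchronization is assumed rather than recovered.

```python
import numpy as np

fs = 48_000.0             # sampling rate (Hz), illustrative
fc = 6_000.0              # carrier frequency (Hz), illustrative
bit_rate = 1_000.0        # bits per second
sps = int(fs / bit_rate)  # samples per bit

# Generate a BPSK signal: carrier phase 0 or 180 degrees selected by the data bits.
rng = np.random.default_rng(2)
bits = rng.integers(0, 2, size=32)
symbols = np.repeat(2 * bits - 1, sps)                # +1 / -1 held for one bit period
t = np.arange(symbols.size) / fs
bpsk = symbols * np.cos(2 * np.pi * fc * t)
bpsk += 0.1 * rng.standard_normal(bpsk.size)          # a little channel noise

# Coherent demodulation: multiply by a synchronized carrier, low-pass by averaging
# over each bit period (integrate-and-dump), then slice the sign to recover the data.
mixed = bpsk * np.cos(2 * np.pi * fc * t)             # assumes carrier sync is known
baseband = mixed.reshape(-1, sps).mean(axis=1)
recovered = (baseband > 0).astype(int)

print("bit errors:", int(np.sum(recovered != bits)))
```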

  7. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.

  8. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton- and alpha-emitting nuclei. Recent developments extend the sensitivity and breadth of the application. The digital signal processing capabilities, used predominately by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for measurements of alpha and beta-delayed neutron emitters, as well as the next generation of superheavy experiments, are discussed.

  9. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing, in the digital domain, the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the
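
    The heritage reference-pixel correction referred to above (statistical use of the boundary reference pixels, not the HHT-DPS method itself) can be sketched as a per-row mean subtraction. The array layout and number of reference columns below are assumptions for illustration, not the HST/JWST detector geometry.

```python
import numpy as np

def reference_pixel_correct(frame, n_ref=4):
    """Subtract, row by row, the mean of the non-illuminated reference columns.

    Assumes `n_ref` reference columns on each side of the array (a common layout,
    not necessarily any particular flight detector's geometry).
    """
    ref = np.concatenate([frame[:, :n_ref], frame[:, -n_ref:]], axis=1)
    row_bias = ref.mean(axis=1, keepdims=True)   # per-row estimate of common-mode drift
    return frame - row_bias

# Toy frame: uniform illumination plus a slowly drifting row-correlated bias.
rng = np.random.default_rng(3)
rows, cols = 128, 128
drift = np.linspace(0.0, 20.0, rows)[:, None]
frame = 100.0 + drift + rng.normal(0.0, 2.0, (rows, cols))

corrected = reference_pixel_correct(frame)
print("std before:", frame[:, 4:-4].std().round(2),
      "after:", corrected[:, 4:-4].std().round(2))
```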

  10. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitization of old globes, which brings with it the possibility of using globes in their digital form. The digital models created are available to the general public through modern technology on the Internet. This gives an opportunity to study old globes located in various historical collections and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models both among themselves and with current map data by increasing the transparency of individual layers. Digitization is carried out using a special device that allows digitizing globes with a diameter ranging from 5 cm to 120 cm. This device can be easily disassembled and is fully mobile, so the globes can be digitized in the place of their storage. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. The acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is in the form of a 3D model through the JavaScript library Cesium or the Google Earth plug-in in the Web browser. The second option is as a georeferenced map using a Tile Map Service.

  11. CT Image Processing Using Public Digital Networks

    PubMed Central

    Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

    1984-01-01

    Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

  12. Process independent automated sizing methodology for current steering DAC

    NASA Astrophysics Data System (ADS)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.

    2015-10-01

    This study introduces a process-independent automated sizing methodology based on a general regression neural network (GRNN) for a current-steering complementary metal-oxide semiconductor (CMOS) digital-to-analog converter (DAC) circuit. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling research that considers a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. Eventually, a high-dimensional database was developed consisting of transistor sizes of DAC designs and the corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set including the simulation results of ON SEMI 1.5 µm and 0.5 µm technology parameters, while the test data were constituted of only the simulation results of TSMC 0.35 µm technology parameters that had not been applied to the GRNN for training beforehand. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of the DAC static output specifications (differential non-linearity error, integral non-linearity error, monotonicity and gain error) as the inputs of the network.
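
    A GRNN is essentially a Gaussian-kernel (Nadaraya-Watson) regression over the training database, which can be sketched briefly. The synthetic features, targets, and spread parameter below are placeholders, not the paper's DAC design database.

```python
import numpy as np

class GRNN:
    """General regression neural network: Gaussian-kernel weighted average of
    training targets (Nadaraya-Watson regression with spread sigma)."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y, float)
        return self

    def predict(self, Xq):
        Xq = np.atleast_2d(np.asarray(Xq, float))
        d2 = ((Xq[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return (w @ self.y) / w.sum(axis=1, keepdims=True)

# Placeholder training set: inputs could be normalized DAC specs (DNL, INL, gain
# error, ...), targets the corresponding transistor widths/lengths.
rng = np.random.default_rng(4)
X_train = rng.uniform(0.0, 1.0, size=(200, 4))      # 4 specification inputs
W_true = np.array([[2.0, 1.0], [0.5, 3.0], [1.5, 0.5], [0.2, 2.0]])
y_train = X_train @ W_true + 0.05 * rng.standard_normal((200, 2))

model = GRNN(sigma=0.2).fit(X_train, y_train)
print(model.predict([[0.3, 0.7, 0.5, 0.1]]))        # predicted sizes for a new spec
```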

  13. Digital processing of signals from femtosecond combs

    NASA Astrophysics Data System (ADS)

    Čížek, Martin; Šmíd, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Číp, Ondrej

    2012-01-01

    The presented work is focused on digital processing of beat-note signals from a femtosecond optical frequency comb. The levels of mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are more likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software-defined radio (SDR) algorithms. Our efforts concentrate in two main areas: Firstly, we are experimenting with digital signal processing of the RF beat-note spectrum produced by the f-to-2f technique and with fully digital servo-loop stabilization of the fs comb. Secondly, we are using digital servo-loop techniques for locking free-running continuous laser sources on single components of the fs comb spectrum. Software capable of computing and analyzing the beat-note RF spectra using FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-to-2f signal is used as a regulation error signal source for a digital phase-locked loop stabilizing the offset and repetition frequencies of the fs comb.
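
    The FFT-plus-peak-detection step can be sketched as follows: digitize a record, compute its windowed spectrum, and search a window around the expected beat frequency rather than taking the globally strongest line (which is what an RF counter effectively does). The sampling rate, tone frequencies, and amplitudes are illustrative.

```python
import numpy as np

fs = 1.0e6          # digitizer sampling rate (Hz), illustrative
n = 1 << 16         # record length
t = np.arange(n) / fs

# Simulated RF record: a weak beat note at 212 kHz buried among stronger products.
rng = np.random.default_rng(5)
signal = (0.05 * np.cos(2 * np.pi * 212e3 * t)      # beat note of interest
          + 1.0 * np.cos(2 * np.pi * 40e3 * t)      # strong low-frequency product
          + 0.02 * rng.standard_normal(n))

spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

# Search only a window around the expected beat frequency instead of taking the
# globally strongest line.
lo, hi = 180e3, 250e3
mask = (freqs >= lo) & (freqs <= hi)
beat = freqs[mask][np.argmax(spectrum[mask])]
print(f"measured beat note: {beat / 1e3:.2f} kHz")
```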

  14. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  15. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  16. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.
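
    One ingredient of TEC extraction, the dual-frequency (geometry-free) combination of carrier phases, can be written down directly. The sketch below uses typical coherent beacon frequencies and synthetic phases; it is only the textbook relation, not the paper's full frequency-tracking and demodulation chain.

```python
# Standard dual-frequency (geometry-free) relative TEC estimate from carrier phase.
K = 40.3                      # m^3/s^2, ionospheric refraction constant
f1, f2 = 150.0e6, 400.0e6     # Hz, typical coherent beacon frequencies (illustrative)
c = 299_792_458.0             # m/s

def relative_tec(phi1_cycles, phi2_cycles):
    """Relative TEC (electrons/m^2) from carrier phases expressed in cycles.

    L_i = c * phi_i / f_i is the carrier-phase range in meters; the geometric and
    clock terms cancel in L1 - L2, leaving only the dispersive ionospheric term.
    Unknown cycle ambiguities mean the result is relative, not absolute.
    """
    L1 = c * phi1_cycles / f1
    L2 = c * phi2_cycles / f2
    return (L1 - L2) / (K * (1.0 / f2**2 - 1.0 / f1**2))

# Example: synthetic phases whose differential term corresponds to ~20 TEC units.
tec_true = 20.0e16                                   # electrons/m^2 (20 TECU)
phi1 = -K * tec_true / f1**2 * f1 / c                # ionospheric phase term only
phi2 = -K * tec_true / f2**2 * f2 / c
print(f"recovered: {relative_tec(phi1, phi2) / 1e16:.1f} TECU")
```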

  17. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  18. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  19. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  20. A Virtual Laboratory for Digital Signal Processing

    ERIC Educational Resources Information Center

    Dow, Chyi-Ren; Li, Yi-Hsung; Bai, Jin-Yu

    2006-01-01

    This work designs and implements a virtual digital signal processing laboratory, VDSPL. VDSPL consists of four parts: mobile agent execution environments, mobile agents, DSP development software, and DSP experimental platforms. The network capability of VDSPL is created by using mobile agent and wrapper techniques without modifying the source code…

  1. Computer Aided Teaching of Digital Signal Processing.

    ERIC Educational Resources Information Center

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  3. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to survey the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept for defining the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is expressed linearly by the characterization factors; synergic effects of the contaminants are therefore neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, and the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing the costs of environmental externalities. Finally, a trend toward handling multi-period scenarios within integrated LCA-optimization frameworks can be distinguished, providing more accurate results as data availability improves. PMID:22208397
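
    The ε-constraint generation of a Pareto set mentioned above can be sketched on a toy two-variable problem with one economic and one aggregated environmental objective; the process model and coefficients are invented purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy process model: decision variable x = [conversion, recycle ratio] in [0, 1].
def cost(x):
    return 10.0 - 6.0 * x[0] + 8.0 * x[1] ** 2          # economic objective (minimize)

def impact(x):
    return 5.0 + 7.0 * x[0] - 4.0 * x[1]                # aggregated LCIA score (minimize)

bounds = [(0.0, 1.0), (0.0, 1.0)]
pareto = []
# epsilon-constraint method: minimize cost subject to impact(x) <= eps, sweeping eps.
for eps in np.linspace(impact([0.0, 1.0]), impact([1.0, 0.0]), 11):
    res = minimize(cost, x0=[0.5, 0.5], bounds=bounds,
                   constraints=[{"type": "ineq", "fun": lambda x, e=eps: e - impact(x)}])
    if res.success:
        pareto.append((round(float(impact(res.x)), 2), round(float(cost(res.x)), 2)))

print(pareto)   # (impact, cost) pairs tracing the Pareto front
```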

  5. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it receives the most emphasis. The other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors, and in particular indirect flat-panel detectors and the system with four high-resolution CCD cameras. In the second part the most important image processing operations are discussed: the gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part the advantages and drawbacks of computed thoracic radiography are emphasized. The most important are the consistently good image quality and the possibilities of image processing. PMID:11567193

  6. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ, developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
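
    The areal-fraction step can be sketched without ImageJ itself: segment the scan (here a crude global threshold on a synthetic image, standing in for the semi-automatic ImageJ workflow) and report each component's pixel count as a fraction of the total. All values below are illustrative.

```python
import numpy as np

def areal_fractions(labels):
    """Modal abundance of each labelled component as a fraction of total area."""
    counts = np.bincount(labels.ravel())
    return counts / labels.size

# Synthetic "scan": background matrix with two brighter clasts (illustrative values).
rng = np.random.default_rng(6)
image = rng.normal(80.0, 5.0, size=(200, 200))         # groundmass
image[40:90, 30:110] += 60.0                           # a large clast
image[140:170, 120:180] += 60.0                        # a smaller clast

# Simple global threshold stands in for the semi-automatic ImageJ segmentation.
labels = (image > 110.0).astype(int)                   # 0 = matrix, 1 = clast
frac = areal_fractions(labels)
print(f"matrix: {frac[0]:.2%}, clasts: {frac[1]:.2%}")
```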

  7. Fundamental concepts of digital image processing

    SciTech Connect

    Twogood, R.E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has its own unique requirements, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  8. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has its own unique requirements, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  9. Analysis of the ignition process using a digital image and colour processing technique

    NASA Astrophysics Data System (ADS)

    Huang, Hua Wei; Zhang, Yang

    2011-07-01

    An experimental investigation of flame emission properties in the ignition-to-flame-propagation process has been conducted. In particular, the phenomenon of ignition delay was analysed through digital image processing and colour analysis. This processing methodology makes use of the observed correlation between a digital colour signal and physical flame emission characteristics in the visible spectrum. Aspects of red-green-blue (RGB) and hue-saturation-value (HSV) colour modelling principles were combined to turn a high-speed digital colour camera into an abstract multi-spectral system. Experiments were carried out on both a laboratory-based atmospheric burner and an industrial gas-turbine combustor. In both cases, results have shown that the commonly observed flame colour feature from soot radiation does not signify the start of the combustion reaction but rather a later stage of flame development. Additional weak colour quantities were identified via digital colour image processing in the ignition delay time interval, where there were no previous definitive signals to designate the presence of combustion. This colour entity was found to match the typical digital colour signal output from the stimulation of CH* and C2* radical chemiluminescence emissions.
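
    The RGB/HSV combination can be sketched with the standard-library colorsys conversion: convert each pixel to HSV and flag pixels that are dim but distinctly blue-hued, a crude stand-in for detecting the weak pre-soot colour quantities. The frame and thresholds are synthetic and illustrative.

```python
import colorsys
import numpy as np

def hsv_from_rgb(frame_rgb):
    """Convert an RGB frame (floats in 0..1) to separate H, S, V channel arrays."""
    h = np.empty(frame_rgb.shape[:2])
    s = np.empty_like(h)
    v = np.empty_like(h)
    for i in range(frame_rgb.shape[0]):
        for j in range(frame_rgb.shape[1]):
            h[i, j], s[i, j], v[i, j] = colorsys.rgb_to_hsv(*frame_rgb[i, j])
    return h, s, v

# Toy frame: near-gray dark background with one faint bluish patch standing in
# for the weak chemiluminescence signal present before sooty luminosity appears.
rng = np.random.default_rng(7)
base = rng.uniform(0.0, 0.02, size=(64, 64, 1))
frame = np.repeat(base, 3, axis=2)                  # gray background (zero saturation)
frame[20:30, 20:30] = [0.02, 0.05, 0.12]            # weak blue patch (illustrative)

h, s, v = hsv_from_rgb(frame)
# Flag pixels that are dim overall but distinctly blue-hued.
weak_blue = (v < 0.2) & (h > 0.5) & (h < 0.75) & (s > 0.5)
print("weak-signal pixels:", int(weak_blue.sum()))
```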

  10. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled the development of high-efficiency gamma-ray spectrometers which have the ability to track the path of gamma rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by 100 MHz, 14-bit flash ADCs. Filters that provide timing and high-resolution energies are implemented on local FPGAs acting on the ADC data streams, while interaction point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we will give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look towards soon developing larger spectrometers such as GRETA with full 4π solid angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.
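
    The abstract does not spell out which energy filter runs on the FPGAs; a trapezoidal (moving-window) filter is the textbook digital energy filter for HPGe pulses, so a simplified offline version is sketched here, with pole-zero (decay) correction omitted for brevity. All parameters are illustrative.

```python
import numpy as np

def trapezoidal_filter(pulse, rise=40, gap=20):
    """Difference of two `rise`-sample moving averages separated by `gap` samples.

    For a step-like preamplifier pulse the output is a trapezoid whose flat-top
    height estimates the pulse amplitude (energy). Decay (pole-zero) correction,
    which a real HPGe chain needs, is omitted to keep the sketch short.
    """
    csum = np.cumsum(np.concatenate(([0.0], pulse)))
    avg = (csum[rise:] - csum[:-rise]) / rise          # moving average of each window
    late = avg[rise + gap:]                            # window ending at sample n
    early = avg[:late.size]                            # window ending at n - rise - gap
    return late - early

# Toy digitized pulse: baseline noise plus a step of amplitude 250 ADC counts.
rng = np.random.default_rng(8)
pulse = rng.normal(0.0, 3.0, 1000)
pulse[400:] += 250.0

trap = trapezoidal_filter(pulse)
print("energy estimate (flat-top maximum):", round(float(trap.max()), 1))
```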

  11. Parallel digital signal processing architectures for image processing

    NASA Astrophysics Data System (ADS)

    Kshirsagar, Shirish P.; Hartley, David A.; Harvey, David M.; Hobson, Clifford A.

    1994-10-01

    This paper describes research into a high-speed image processing system using parallel digital signal processors for the processing of electro-optic images. The objective of the system is to reduce the processing time of non-contact inspection problems, including industrial and medical applications. A single processor cannot deliver the processing power required for such applications; hence, a MIMD system was designed and constructed to enable fast processing of electro-optic images. The Texas Instruments TMS320C40 digital signal processor is used due to its high-speed floating-point CPU and its support for the parallel processing environment. A custom-designed VISION bus is provided to transfer images between processors. The system is being applied to solder joint inspection of high-technology printed circuit boards.

  12. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  13. Digital signal processing methods for biosequence comparison.

    PubMed Central

    Benson, D C

    1990-01-01

    A method is discussed for DNA or protein sequence comparison using a finite-field fast Fourier transform, a digital signal processing technique, and statistical methods are presented for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N, compared with N² for methods currently in use, making it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
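
    The N log N idea can be illustrated with an ordinary complex FFT in place of the paper's finite-field FFT: cross-correlating per-base indicator vectors counts matching characters at every relative shift. This is a simplified stand-in, not the published algorithm.

```python
import numpy as np

def match_counts_fft(seq_a, seq_b, alphabet="ACGT"):
    """Number of matching characters for every shift of seq_b relative to seq_a,
    computed in O(N log N) with an ordinary complex FFT (the paper uses a
    finite-field FFT; this is an illustrative stand-in)."""
    n = len(seq_a) + len(seq_b) - 1
    total = np.zeros(n)
    for base in alphabet:
        a = np.array([c == base for c in seq_a], dtype=float)
        b = np.array([c == base for c in seq_b], dtype=float)
        # cross-correlation of indicator vectors via FFT
        corr = np.fft.irfft(np.fft.rfft(a, n) * np.conj(np.fft.rfft(b, n)), n)
        total += corr
    return np.rint(total).astype(int)   # match counts per relative shift

counts = match_counts_fft("ACGTACGTAA", "CGTACG")
print(counts.max())   # best alignment score (number of matching bases)
```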

  14. Applicability of ACR breast dosimetry methodology to a digital mammography system

    SciTech Connect

    Tomon, John J.; Johnson, Thomas E.; Swenson, Kristin N.; Schauer, David A.

    2006-03-15

    Determination of mean glandular dose (MGD) to breast tissue is an essential aspect of mammography equipment evaluations and exposure controls. The American College of Radiology (ACR) Quality Control Manual outlines the procedure for MGD determination in screen-film mammography based upon conversions of entrance skin exposures (ESEs) measured with an ionization chamber (IC). The development of digital mammography has increased with the demand for improved object resolution and tissue contrast. This change in image receptor from screen-film to a solid-state detector has led to questions about the applicability of the ACR MGD methodology to digital mammography. This research has validated the applicability of the ACR MGD methodology to digital mammography in the GE digital mammography system Senographe 2000D®. MGD was determined using light output measurements from thermoluminescent dosimeters (MGD(TL)), exposure measurements from an IC (MGD(IC)) and conversion factors from the ACR Mammography Quality Control Manual. MGD(TL) and MGD(IC) data indicate that there is a statistically significant difference between the two measurements with the Senographe 2000D®. However, the applicability of the ACR's methodology was validated by calculating MGD at various depths in a 50/50 breast phantom. Additionally, the results of backscatter measurements from the image receptors of both mammography modalities indicate there is a difference (all P values <0.001) in the radiation backscattered from each image receptor.

  15. SYDDARTA: new methodology for digitization of deterioration estimation in paintings

    NASA Astrophysics Data System (ADS)

    Granero-Montagud, Luís.; Portalés, Cristina; Pastor-Carbonell, Begoña.; Ribes-Gómez, Emilio; Gutiérrez-Lucas, Antonio; Tornari, Vivi; Papadakis, Vassilis; Groves, Roger M.; Sirmacek, Beril; Bonazza, Alessandra; Ozga, Izabela; Vermeiren, Jan; van der Zanden, Koen; Föster, Matthias; Aswendt, Petra; Borreman, Albert; Ward, Jon D.; Cardoso, António; Aguiar, Luís.; Alves, Filipa; Ropret, Polonca; Luzón-Nogué, José María.; Dietz, Christian

    2013-05-01

    The SYDDARTA project is an ongoing European Commission funded initiative under the 7th Framework Programme. Its main objective is the development of a pre-industrial prototype for diagnosing the deterioration of movable art assets. The device combines two different optical techniques for the acquisition of data. On one hand, hyperspectral imaging is implemented by means of electronically tunable filters. On the other, 3D scanning, using structured light projection and capture, is developed. These techniques are integrated in a single piece of equipment, allowing the recording of two optical information streams. Together with multi-sensor data merging and information processing, estimates of artwork deterioration and degradation can be made. In particular, the resulting system will implement two optical channels (3D scanning and short wave infrared (SWIR) hyperspectral imaging) featuring a structured light projector and electronically tunable spectral separators. The system will work in the VIS-NIR range (400-1000 nm) and the SWIR range (900-2500 nm). It will also be portable and user-friendly. Among all possible artwork under consideration, Baroque paintings on canvas and wooden panels were selected as the project case studies.

  16. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  17. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and related areas. The identification can be done by transforming the gray levels so that the gray level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. Researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation with time complexity O(max(m, n)), where m and n are the numbers of gray levels of the input picture and the reference picture, respectively. Using a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m³ x n). The algorithm partition problem, as an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
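
    A minimal software sketch of the gray-level transformation the entry describes, using a CDF-based histogram-matching lookup table in NumPy; the paper's packing method and the proposed VLSI architecture are not reproduced here.

```python
import numpy as np

def match_histogram(image, reference, levels=256):
    """Map the gray levels of `image` so its histogram approximates that of
    `reference` (CDF-based matching; an illustrative stand-in for the packing method)."""
    img = np.asarray(image).ravel()
    ref = np.asarray(reference).ravel()
    img_cdf = np.cumsum(np.bincount(img, minlength=levels)) / img.size
    ref_cdf = np.cumsum(np.bincount(ref, minlength=levels)) / ref.size
    # for each input level, find the reference level with the closest CDF value
    lut = np.searchsorted(ref_cdf, img_cdf).clip(0, levels - 1).astype(np.uint8)
    return lut[np.asarray(image)]

rng = np.random.default_rng(1)
dark = rng.integers(0, 128, (64, 64), dtype=np.uint8)     # scene under dim lighting
bright = rng.integers(64, 256, (64, 64), dtype=np.uint8)  # same scene, brighter lighting
matched = match_histogram(dark, bright)
print(dark.mean(), matched.mean(), bright.mean())
```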

  18. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, together with the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system, which consists of a language translator that simplifies execution of image processing programs and provides a general purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs, is also described.

  19. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  20. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  1. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.
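
    The heart of a DPIV reduction code is cross-correlation of interrogation windows; the sketch below estimates the integer-pixel displacement of a window pair via FFT cross-correlation. PIVPROC's fuzzy-logic validation and particle tracking are not reproduced, and the function name and test data are illustrative.

```python
import numpy as np

def window_displacement(frame1, frame2):
    """Integer-pixel displacement of the particle pattern from frame1 to frame2,
    estimated with FFT cross-correlation of two interrogation windows."""
    a = frame2 - frame2.mean()
    b = frame1 - frame1.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map the circular-correlation peak to a signed displacement
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(2)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))   # particles moved by (3, -2)
print(window_displacement(frame_a, frame_b))              # -> (3, -2)
```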

  2. Digital Light Processing and MEMS: reflecting the digital display needs of the networked society

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1996-08-01

    Digital video technology is becoming increasingly important to the networked society. The natural interface to digital video is a digital display, one that accepts electrical bits at its input and converts them into optical bits at the output. The digital-to-analog processing function is performed in the mind of the observer. Texas Instruments has developed such a display with its recent market introduction of the Digital Light Processing™ (DLP™) projection display. DLP technology is based on the Digital Micromirror Device™ (DMD™), a microelectromechanical systems (MEMS) array of semiconductor-based digital light switches. The DMD switching array precisely controls a light source for projection display and digital printing applications. This paper presents an overview of DLP technology along with the architecture, projection operation, manufacture, and reliability of the DMD. Features of DMD technology that distinguish it from conventional MEMS technology are explored. Finally, the paper provides a view of DLP business opportunities.

  3. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.

  4. Methodology of Diagnostics of Interethnic Relations and Ethnosocial Processes

    ERIC Educational Resources Information Center

    Maximova, Svetlana G.; Noyanzina, Oksana Ye.; Omelchenko, Daria A.; Maximov, Maxim B.; Avdeeva, Galina C.

    2016-01-01

    The purpose of this study was to research the methodological approaches to the study of interethnic relations and ethno-social processes. The analysis of the literature was conducted in three main areas: 1) the theoretical and methodological issues of organizing the research of inter-ethnic relations, allowing to highlight the current…

  5. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…
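
    A small sketch of how X-bar and R chart control limits are computed from rational subgroups, the core calculation behind the charts named above. The constants A2, D3 and D4 are the standard tabulated values for subgroups of size 5; the simulated test-score data are an assumption for illustration.

```python
import numpy as np

# Standard control-chart constants for subgroups of size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Control limits for X-bar (mean) and R (range) charts from rational
    subgroups; `subgroups` is shaped (k subgroups, 5 observations)."""
    data = np.asarray(subgroups, dtype=float)
    xbar = data.mean(axis=1)                     # subgroup means
    r = data.max(axis=1) - data.min(axis=1)      # subgroup ranges
    xbarbar, rbar = xbar.mean(), r.mean()
    return {
        "xbar_chart": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),  # LCL, CL, UCL
        "r_chart": (D3 * rbar, rbar, D4 * rbar),
    }

# Example: 20 subgroups of 5 simulated test scores
rng = np.random.default_rng(3)
scores = rng.normal(75, 8, size=(20, 5))
print(xbar_r_limits(scores))
```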

  6. An Interactive Graphics Program for Investigating Digital Signal Processing.

    ERIC Educational Resources Information Center

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  7. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  8. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  9. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, three-tier 3D detector with one sensor layer and two ASIC layers, containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full-custom analog pixel is 65 μm x 65 μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248 cm x 1.248 cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog-tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full-custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact, parasitically aware layout. The methodology uses the Cadence design platform; however, it is not limited to this tool.

  10. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first three years, ESDIS evolved the process, involving the data provider community in the development of procedures for creating and assigning DOIs and guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering with EZID. The DOI update process allows the changing of any DOI metadata except the DOI name unless the name has not been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  11. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  12. Digital-Difference Processing For Collision Avoidance.

    NASA Technical Reports Server (NTRS)

    Shores, Paul; Lichtenberg, Chris; Kobayashi, Herbert S.; Cunningham, Allen R.

    1988-01-01

    Digital system for automotive crash avoidance measures and displays difference in frequency between two sinusoidal input signals of slightly different frequencies. Designed for use with Doppler radars. Characterized as digital mixer coupled to frequency counter measuring difference frequency in mixer output. Technique determines target path mathematically. Used for tracking cars, missiles, bullets, baseballs, and other fast-moving objects.
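
    A hedged sketch of the digital-mixer idea described above: multiply two sampled Doppler-like tones and read the difference frequency from the low-frequency part of the product's spectrum. Sample rate, tone frequencies and the FFT-based "counter" are illustrative assumptions, not the NASA system's implementation.

```python
import numpy as np

fs = 100_000.0                        # sample rate (Hz), chosen for illustration
t = np.arange(0, 0.05, 1 / fs)
f1, f2 = 10_000.0, 10_240.0           # two Doppler-like tones 240 Hz apart
s1 = np.cos(2 * np.pi * f1 * t)
s2 = np.cos(2 * np.pi * f2 * t)

mixed = s1 * s2                       # digital mixer: product contains f1-f2 and f1+f2
spectrum = np.abs(np.fft.rfft(mixed * np.hanning(mixed.size)))
freqs = np.fft.rfftfreq(mixed.size, 1 / fs)
low_band = freqs < 5_000.0            # keep only the difference-frequency region
beat = freqs[low_band][np.argmax(spectrum[low_band])]
print(f"estimated difference frequency: {beat:.0f} Hz")   # ~240 Hz
```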

  13. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of the art creation or the creation process has been the subject of much debate and research during the last fifty years at least, even thinking art and beauty has been a subject of analysis already by the ancient Greeks such were Plato or Aristotle. Even though intuitively it is a simple phenomenon, creativity or the human ability to generate innovation (new ideas, concepts, etc.) is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. Regarding the development process that supports the intellectual act of creation it is usually described as a procedure where the artist experiments the medium, explores it with one or more techniques, changing shapes, forms, appearances, where beyond time and space, he/she seeks his/her way out to a clearing, i.e., envisages a path from intention to realization. Duchamp in his lecture "The Creative Act" states the artist is never alone with his/her artwork; there is always the spectator that later on will react critically to the work of art. If the artist succeeds in transmitting his/her intentions in terms of a message, emotion or feeling to the spectator then a form of aesthetic osmosis actually takes place through the inert matter (the medium) that enabled this communication or interaction phenomenon to occur. The role of the spectator may become gradually more active by interacting with the artwork itself possibly changing or becoming a part of it [2][4].

  14. Multilingual subjective methodology and evaluation of low-rate digital voice processors

    NASA Astrophysics Data System (ADS)

    Dimolitsas, Spiros; Corcoran, Franklin L.; Baraniecki, Marion R.; Phipps, John G., Jr.

    The methodology and results for a multilingual evaluation of source encoding algorithms operating at 16 kbit/s are presented. The evaluation was conducted in three languages (English, French, and Mandarin), using listener opinion subjective assessments to determine whether 'toll-quality' performance is possible at 16 kbit/s. The study demonstrated that toll-quality voice is indeed possible at 16 kbit/s, and that several of the methods evaluated are more robust under high bit error conditions than either 32- or 64-kbit/s encoding. Thus, 16-kbit/s voice coding technology is currently suitable for many applications within the public-switched telephone network, including the next generation of digital circuit multiplication equipment and integrated services digital network videotelephony.

  15. Digital Handling and Processing of Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Algazi, R.; Sakrison, D.

    1971-01-01

    Progress is reported on the development of a computing facility that provides automatic processing of remote sensing data on earth resources. Preliminary work on digital signal processing algorithms and the writing of corresponding programs for the design of digital filters is outlined.

  16. A methodology for high resolution digital image correlation in high temperature experiments.

    PubMed

    Blaber, Justin; Adair, Benjamin S; Antoniou, Antonia

    2015-03-01

    We propose a methodology for performing high resolution Digital Image Correlation (DIC) analysis during high-temperature mechanical tests. Specifically, we describe a technique for producing a stable, high-quality pattern on metal surfaces along with a simple optical system that uses a visible-range camera and a long-range microscope. The results are analyzed with a high-quality open-source DIC software developed by us. Using the proposed technique, we successfully acquired high-resolution strain maps of the crack tip field in a nickel superalloy sample at 1000 °C. PMID:25832279

  17. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  18. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  19. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.
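
    A minimal sketch of the all-digital modulation stage such a modem performs: Gray-mapped QPSK symbols mixed onto a digital IF carrier. Pulse shaping, BPSK mode and the burst/continuous framing of the actual modem are omitted; parameter values are assumptions.

```python
import numpy as np

def qpsk_modulate(bits, samples_per_symbol=8, carrier_cycles_per_symbol=2):
    """Minimal digital QPSK modulator: Gray-mapped I/Q symbols mixed onto a
    digital IF carrier (no pulse shaping, no burst framing)."""
    bits = np.asarray(bits).reshape(-1, 2)
    # Gray mapping: 00 -> (+1,+1), 01 -> (+1,-1), 10 -> (-1,+1), 11 -> (-1,-1)
    i = 1 - 2 * bits[:, 0]
    q = 1 - 2 * bits[:, 1]
    i_up = np.repeat(i, samples_per_symbol).astype(float)
    q_up = np.repeat(q, samples_per_symbol).astype(float)
    n = np.arange(i_up.size)
    w = 2 * np.pi * carrier_cycles_per_symbol / samples_per_symbol
    return i_up * np.cos(w * n) - q_up * np.sin(w * n)   # real IF waveform

waveform = qpsk_modulate([0, 0, 1, 1, 0, 1, 1, 0])
print(waveform.shape)   # 4 symbols x 8 samples/symbol = (32,)
```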

  20. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  1. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. To help educators (i.e., the designers) create a compelling digital story, several sets of processes have been introduced by experts. Nevertheless, the experts suggest a variety of processes, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be applied to other multimedia materials that use the concept of DST.

  2. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consist of an all-digital, minimum-bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
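
    A heavily simplified sketch of the two core operations named above: digital carrier down-conversion (multiplying IF samples by a local carrier replica) and code correlation over trial delays. The ±1 spreading code, signal parameters and single in-phase arm are assumptions for illustration; the patent's chip/phase advancers, feedback loops and bit synchronization are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)
fs, f_if = 4_000_000.0, 1_000_000.0         # sample rate and IF, illustrative values
code = rng.choice([-1.0, 1.0], size=1023)   # stand-in +/-1 spreading code (not a real C/A Gold code)
chip_rate = 1_023_000.0

n = np.arange(4000)                          # ~1 ms of samples
t = n / fs
code_phase_true = 350                        # true code delay in samples (assumed)
chips = (np.floor((n - code_phase_true) * chip_rate / fs) % 1023).astype(int)
received = code[chips] * np.cos(2 * np.pi * f_if * t) + 0.5 * rng.normal(size=n.size)

# carrier down-conversion with a local replica, then code correlation over trial delays
baseband = received * np.cos(2 * np.pi * f_if * t)   # in-phase arm only, for brevity
best, best_power = 0, -1.0
for trial in range(0, 1000, 10):
    trial_chips = (np.floor((n - trial) * chip_rate / fs) % 1023).astype(int)
    power = np.dot(baseband, code[trial_chips]) ** 2
    if power > best_power:
        best, best_power = trial, power
print("estimated code delay (samples):", best)   # near 350
```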

  3. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  4. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  5. Digital computer processing of X-ray photos

    NASA Technical Reports Server (NTRS)

    Nathan, R.; Selzer, R. H.

    1967-01-01

    Digital computers correct various distortions in medical and biological photographs. One of the principal methods of computer enhancement involves the use of a two-dimensional digital filter to modify the frequency spectrum of the picture. Another computer processing method is image subtraction.
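
    A hedged sketch of the two operations mentioned: modifying an image's 2-D frequency spectrum with a digital filter (a Gaussian high-boost is used here as an assumed example, not the original filter) and simple image subtraction.

```python
import numpy as np

def highboost_filter(image, sigma=10.0, boost=1.5):
    """Modify the image's 2-D frequency spectrum: keep low frequencies and
    amplify high frequencies (an assumed example filter)."""
    img = np.asarray(image, dtype=float)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    lowpass = np.exp(-(fx**2 + fy**2) * (2 * np.pi * sigma) ** 2 / 2)
    transfer = lowpass + boost * (1 - lowpass)           # high-boost transfer function
    return np.fft.ifft2(np.fft.fft2(img) * transfer).real

def subtract_images(image, mask):
    """Image subtraction, e.g. removing background structure from a radiograph."""
    return np.asarray(image, dtype=float) - np.asarray(mask, dtype=float)

rng = np.random.default_rng(5)
xray = rng.random((128, 128))
print(highboost_filter(xray).shape, subtract_images(xray, xray * 0.5).mean())
```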

  6. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  7. Effective DQE (eDQE) and speed of digital radiographic systems: An experimental methodology

    PubMed Central

    Samei, Ehsan; Ranger, Nicole T.; MacKenzie, Alistair; Honey, Ian D.; Dobbins, James T.; Ravin, Carl E.

    2009-01-01

    Prior studies on performance evaluation of digital radiographic systems have primarily focused on the assessment of the detector performance alone. However, the clinical performance of such systems is also substantially impacted by magnification, focal spot blur, the presence of scattered radiation, and the presence of an antiscatter grid. The purpose of this study is to evaluate an experimental methodology to assess the performance of a digital radiographic system, including those attributes, and to propose a new metric, effective detective quantum efficiency (eDQE), a candidate for defining the efficiency or speed of digital radiographic imaging systems. The study employed a geometric phantom simulating the attenuation and scatter properties of the adult human thorax and a representative indirect flat-panel-based clinical digital radiographic imaging system. The noise power spectrum (NPS) was derived from images of the phantom acquired at three exposure levels spanning the operating range of the clinical system. The modulation transfer function (MTF) was measured using an edge device positioned at the surface of the phantom, facing the x-ray source. Scatter measurements were made using a beam stop technique. The eDQE was then computed from these measurements, along with measures of phantom attenuation and x-ray flux. The MTF results showed notable impact from the focal spot blur, while the NPS depicted a large component of structured noise resulting from use of an antiscatter grid. The eDQE was found to be an order of magnitude lower than the conventional DQE. At 120 kVp, eDQE(0) was in the 8%–9% range, fivefold lower than DQE(0) at the same technique. The eDQE method yielded reproducible estimates of the system performance in a clinically relevant context by quantifying the inherent speed of the system, that is, the actual signal to noise ratio that would be measured under clinical operating conditions. PMID:19746814

  8. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...

  9. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing noiselike microwave radiometer signals.

  10. The teaching of computer programming and digital image processing in radiography.

    PubMed

    Allan, G L; Zylinski, J

    1998-06-01

    The increased use of digital processing techniques in Medical Radiations imaging modalities, along with the rapid advance in information technology, has resulted in a significant change in the delivery of radiographic teaching programs. This paper details a methodology used to concurrently educate radiographers in both computer programming and image processing. The students learn to program in Visual Basic for Applications (VBA), and the programming skills are contextualised by requiring the students to write a digital subtraction angiography (DSA) package. Program code generation and the image presentation interface are handled within the Microsoft Excel spreadsheet. The user-friendly nature of this common interface enables all students to readily begin program creation. The teaching of programming and image processing skills by this method may be readily generalised to other vocational fields where digital image manipulation is a professional requirement. PMID:9726504

  11. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
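
    The suggested fix, running a subroutine "in reverse" with Newton-Raphson, can be sketched generically as below; the example subroutine and the numerical central-difference derivative are stand-ins, not the Titan 2 flight-control equations.

```python
def solve_inverse(f, target, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Find x such that f(x) = target via Newton-Raphson with a numerical
    derivative -- the kind of 'solving a subroutine in reverse' suggested above."""
    x = x0
    for _ in range(max_iter):
        fx = f(x) - target
        if abs(fx) < tol:
            return x
        dfdx = (f(x + h) - f(x - h)) / (2 * h)    # central-difference derivative
        x -= fx / dfdx                            # Newton-Raphson update
    return x

# Example: a control-law-like subroutine y = x**3 + 2*x; find the input giving y = 10
subroutine = lambda x: x**3 + 2 * x
x = solve_inverse(subroutine, target=10.0, x0=1.0)
print(round(x, 6), round(subroutine(x), 6))
```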

  12. Categorical digital soil database for the Carpathian-basin using the modified e-SOTER methodology

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Vadnai, Péter; Micheli, Erika; Pasztor, Laszlo

    2015-04-01

    Harmonized, spatially and thematically consistent, high-resolution soil data covering larger regions and several countries is needed for several applications, including the modelling of different environmental and socio-economic scenarios. The only way to obtain such data with large spatial coverage and high resolution is to make use of the available high-resolution digital data sources and digital soil mapping tools in the development process. Digital soil mapping has become a very efficient tool in soil science, and several applications have been published on this topic recently. Many of these applications use environmental covariates such as remotely sensed images and digital elevation models, which are raster-based data sources with block support. The majority of soil data users require data in raster format with values of certain properties, such as pH, clay content or soil organic matter content. However, the use of these soil properties alone is often limited; an adequate interpretation of the numbers requires knowledge of the soil system and its major processes and process associations. This soil system description can best be done using the existing knowledge of soil science expressed in soil classification: the diagnostics (features, materials, horizons) as important descriptive information, and the classification categories. The most commonly used and internationally accepted classification system is the World Reference Base for Soil Resources, the so-called WRB. Each soil classification category represents a complex association of processes and properties, which is difficult to use, understand and map because of the complex information behind the category names. The major advantage of diagnostics-based classification systems such as the WRB is that the complex soil categories (classes) can be interpreted as unique combinations of diagnostic features. Therefore each class can be disaggregated into several diagnostics, where each has independent useful information.

  13. The New Digital Engineering Design and Graphics Process.

    ERIC Educational Resources Information Center

    Barr, R. E.; Krueger, T. J.; Aanstoos, T. A.

    2002-01-01

    Summarizes the digital engineering design process using software widely available for the educational setting. Points out that newer technology used in the field is not used in engineering graphics education. (DDR)

  14. Quantization effects in radiation spectroscopy based on digital pulse processing

    SciTech Connect

    Jordanov, V. T.; Jordanova, K. V.

    2011-07-01

    Radiation spectra are inherently quantized data in the form of stacked channels of equal width. The spectrum is an experimental measurement of the discrete probability density function (PDF) of the detector pulse heights. The quantization granularity of the spectrum depends on the total number of channels covering the full range of pulse heights. In analog pulse processing the total number of channels is equal to the total number of digital values produced by a spectroscopy analog-to-digital converter (ADC). In digital pulse processing each detector pulse is sampled and quantized by a fast ADC producing a certain number of quantized numerical values. These digital values are linearly processed to obtain a digital quantity representing the peak of the digitally shaped pulse. Using digital pulse processing it is possible to acquire a spectrum with a total number of channels greater than the number of ADC values. Noise and sample averaging are important in the transformation of ADC-quantized data into spectral quantized data. Analysis of this transformation is performed using an area sampling model of quantization. Spectrum differential nonlinearity (DNL) is shown to be related to the quantization at low noise levels and small numbers of averaged samples. Theoretical analysis and experimental measurements are used to obtain the condition that minimizes the DNL due to quantization. (authors)
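
    As a rough illustration of the DNL quantity discussed above (not the paper's area-sampling analysis): for a spectrum acquired with a nominally flat input, each channel's fractional deviation from the mean content estimates its differential nonlinearity.

```python
import numpy as np

def spectrum_dnl(channel_counts):
    """Per-channel differential nonlinearity for a spectrum acquired with a
    nominally flat (uniform) input: fractional deviation from the mean content."""
    counts = np.asarray(channel_counts, dtype=float)
    return counts / counts.mean() - 1.0

# Example: a flat spectrum in which one channel is 10% wider than nominal
rng = np.random.default_rng(6)
counts = rng.poisson(10_000, size=1024).astype(float)
counts[512] *= 1.10
dnl = spectrum_dnl(counts)
print(f"worst-case |DNL| = {np.abs(dnl).max():.3f}")   # ~0.10 at channel 512
```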

  15. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
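
    One of the processed outputs listed above is the Allan deviation of phase; the sketch below shows the standard second-difference estimator computed from phase samples. It is a generic, non-overlapping implementation with synthetic data, not the analyzer's own code.

```python
import numpy as np

def allan_deviation(phase, tau0, m):
    """Allan deviation at averaging time tau = m * tau0, from phase samples x
    (seconds): sigma_y^2(tau) = < (x[i+2m] - 2 x[i+m] + x[i])^2 > / (2 tau^2)."""
    x = np.asarray(phase, dtype=float)
    tau = m * tau0
    d2 = x[2 * m:] - 2 * x[m:-m] + x[:-2 * m]   # second differences of phase
    return np.sqrt(np.mean(d2 ** 2) / (2 * tau ** 2))

# Example: white frequency noise gives sigma_y(tau) ~ tau**-0.5
rng = np.random.default_rng(7)
tau0 = 1e-3
freq_noise = rng.normal(0, 1e-12, size=200_000)   # fractional frequency samples
phase = np.cumsum(freq_noise) * tau0              # integrate to phase (seconds)
for m in (1, 10, 100):
    print(m * tau0, allan_deviation(phase, tau0, m))
```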

  16. Agricultural inventory capabilities of machine processed LANDSAT digital data

    NASA Technical Reports Server (NTRS)

    Dietrick, D. L.; Fries, R. E.; Egbert, D. D.

    1975-01-01

    Agricultural crop identification and acreage determination analysis of LANDSAT digital data was performed for two study areas. A multispectral image processing and analysis system was utilized to perform the man-machine interactive analysis. The developed techniques yielded crop acreage estimates with accuracy greater than 90% and as high as 99%. These results are encouraging evidence of the agricultural inventory capabilities of machine-processed LANDSAT digital data.

  17. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults must be subjected to clinical tests such as serum bilirubin measurement, which can be traumatic for patients. Jaundice often accompanies liver disease such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults by using a painless method. By acquiring digital colour images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as parameters to characterize patients with and without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine we distinguish between healthy and sick patients.
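
    A hedged sketch of the classification step: an SVM trained on mean-RGB features of skin-region patches. The data here are synthetic stand-ins (jaundiced skin assumed to shift towards yellow), not the study's measurements, and scikit-learn's SVC is used as a generic SVM implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)

def mean_rgb(patch):
    """Feature vector: average R, G, B of a skin-region patch."""
    return patch.reshape(-1, 3).mean(axis=0)

# Synthetic stand-in data: icteric skin shifted towards yellow (higher R and G vs B)
healthy = [rng.normal((180, 140, 120), 10, (32, 32, 3)) for _ in range(100)]
icteric = [rng.normal((200, 170, 100), 10, (32, 32, 3)) for _ in range(100)]
X = np.array([mean_rgb(p) for p in healthy + icteric])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```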

  18. The processing of two-digit numbers in bilinguals.

    PubMed

    Macizo, Pedro; Herrera, Amparo; Román, Patricia; Martín, María Cruz

    2011-08-01

    We explored possible between-language influences when bilinguals processed two-digit numbers. Spanish/English bilinguals and German/English bilinguals performed a number comparison task with Arabic digits and verbal numbers in their first language (L1) and second language (L2) while the unit-decade compatibility was manipulated. The two bilingual groups showed regular compatibility effect with Arabic digits. In L1, Spanish/English bilinguals showed reverse compatibility effect, while German/English bilinguals showed regular compatibility effect. However, both groups of bilinguals presented reverse compatibility effect in English (L2), which suggested that the bilinguals' L1 did not determine the processing of number words in their L2. The results indicated that bilinguals processed two-digit number words selectively in their L1 and L2 and that they did not transcode L2 numbers into Arabic format. PMID:21752000

  20. Analysis of the structural behavior of a membrane using digital image processing

    NASA Astrophysics Data System (ADS)

    Jurjo, Daniel Leonardo Braga Rodriguez; Magluta, Carlos; Roitman, Ney; Batista Gonçalves, Paulo

    2015-03-01

    This article presents a methodology for the experimental analysis of thin membranes using digital image processing techniques. The methodology is particularly suitable for structures that cannot be monitored using conventional systems, especially those that require contact with the structure. The methodology consists of a computer vision system that integrates digital image acquisition and processing techniques on-line using special programming routines. Because the membrane analyzed is very thin and displays large displacements/strains, the use of any conventional sensor based on contact with the structure would not be possible. The methodology permits the measurement of large displacements at several points of the membrane simultaneously, thus enabling the acquisition of the global displacement field. The accuracy of the acquired displacement field is a function of the number of cameras and measured points. The second step is to estimate the strains and stresses on the membrane from the measured displacements. The basic idea of the methodology is to generate global two-dimensional functions that describe the strains and stresses at any point of the membrane. Two constitutive models are considered in the analysis: the Hookean and the neo-Hookean models. Favorable comparisons between the experimental and numerical results attest to the accuracy of the proposed experimental procedure, which can be used for both artificial and natural membranes.

  1. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    PubMed

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization.

  3. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research areas associated with digital signal processing and control and estimation theory are identified. Particular attention is given to image processing, system identification problems (parameter identification, linear prediction, least squares, Kalman filtering), stability analyses (the use of the Liapunov theory, frequency domain criteria, passivity), and multiparameter systems, distributed processes, and random fields.

  4. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. After a brief account of the theoretical concepts of the digital factory, the communicative features of the digital factory are illustrated. Practical aspects of interpersonal communication were analyzed from a human-oriented perspective in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed within the process analysis. This method makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of an inquiry on communication analysis and the process models of existing modeling methods, it was possible to structure the processes in a way suitable for humans and to achieve a positive effect on the communication processes.

  5. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for the measurement of exponentially decaying signals, such as those found in fluorescence lifetime-based fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive-detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decays (>500 μs) and 0.1% for fast decays.
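
    The measured quantity is the decay lifetime of an exponential signal; as a simple illustration, the sketch below estimates it with a log-linear least-squares fit on synthetic data. This is not the instrument's phase-sensitive-detection scheme, and all signal parameters are assumptions.

```python
import numpy as np

def decay_lifetime(t, signal):
    """Estimate the lifetime tau of s(t) = A * exp(-t / tau) by a linear
    least-squares fit to log(s)."""
    mask = signal > 0                      # logarithm needs positive samples
    slope, _ = np.polyfit(t[mask], np.log(signal[mask]), 1)
    return -1.0 / slope

# Synthetic fluorescence decay: tau = 600 microseconds, sampled at 1 MHz
rng = np.random.default_rng(9)
t = np.arange(0, 2e-3, 1e-6)
signal = np.exp(-t / 600e-6) + rng.normal(0, 1e-3, t.size)
print(f"estimated lifetime: {decay_lifetime(t, signal) * 1e6:.1f} us")
```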

  6. A color image processing pipeline for digital microscope

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Liu, Peng; Zhuang, Zhefeng; Chen, Enguo; Yu, Feihong

    2012-10-01

    Digital microscopes have found wide application in the fields of biology, medicine, and others. A digital microscope differs from a traditional optical microscope in that there is no need to observe the sample directly through an eyepiece, because the optical image is projected directly onto the CCD/CMOS camera. However, because of the imaging differences between the human eye and the sensor, a color image processing pipeline is needed for the digital microscope's electronic eyepiece to obtain a fine image. The color image pipeline for a digital microscope, comprising the procedures that convert the RAW image data captured by the sensor into a true-color image, largely determines the quality of the microscopic image. This pipeline differs from those of digital still cameras and video cameras because of the specific requirements of microscopic images, which should have high dynamic range, preserve the color of the observed objects, and support a variety of post-processing operations. In this paper, a new color image processing pipeline is proposed to satisfy the requirements of digital microscope images. The algorithm for each step in the pipeline is designed and optimized with the aim of producing high-quality images and accommodating diverse user preferences. With the proposed pipeline implemented on the digital microscope platform, the output color images meet the various image-analysis requirements in the fields of medicine and biology. The major steps of the proposed color imaging pipeline are: black level adjustment, defective pixel removal, noise reduction, linearization, white balance, RGB color correction, tone scale correction, and gamma correction.
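
    A minimal NumPy sketch of such a pipeline is given below. It is not the authors' algorithm: the black level, white level, white-balance gains, color-correction matrix, and gamma value are invented constants, and the input is assumed to be an already-demosaiced (H, W, 3) RAW array; the sketch only illustrates how several of the listed stages chain together.

        import numpy as np

        def toy_color_pipeline(raw, black_level=64, white_level=1023,
                               wb_gains=(1.9, 1.0, 1.6), gamma=2.2):
            """Illustrative RAW(RGB) -> display-RGB pipeline: black level,
            white balance, color correction and gamma (constants are made up)."""
            ccm = np.array([[ 1.6, -0.4, -0.2],      # hypothetical camera-RGB ->
                            [-0.3,  1.5, -0.2],      # display-RGB correction matrix
                            [-0.1, -0.5,  1.6]])
            img = raw.astype(np.float64)
            img = np.clip((img - black_level) / (white_level - black_level), 0.0, 1.0)
            img *= np.asarray(wb_gains)              # per-channel white balance
            img = np.clip(img @ ccm.T, 0.0, 1.0)     # RGB color correction
            img = img ** (1.0 / gamma)               # gamma correction for display
            return (img * 255.0 + 0.5).astype(np.uint8)

        # Example on synthetic 10-bit sensor data
        raw = np.random.randint(64, 1024, size=(480, 640, 3))
        out = toy_color_pipeline(raw)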

  7. Digital processing of RF signals from optical frequency combs

    NASA Astrophysics Data System (ADS)

    Cizek, Martin; Smid, Radek; Buchta, Zdeněk; Mikel, Břetislav; Lazar, Josef; Cip, Ondřej

    2013-01-01

    The presented work focuses on the digital processing of beat-note signals from a femtosecond optical frequency comb. The levels of the mixing products of single spectral components of the comb with CW laser sources are usually very low compared to the products of mixing all the comb components together. RF counters are therefore likely to measure the frequency of the strongest spectral component rather than a weak beat note. The proposed experimental digital signal processing system solves this problem by analyzing the whole spectrum of the output RF signal and using software-defined radio (SDR) algorithms. Our efforts concentrate on two main areas: first, using digital servo-loop techniques for locking free-running continuous-wave laser sources to single components of the fs comb spectrum; second, experimenting with digital signal processing of the RF beat-note spectrum produced by the f-2f technique used for assessing the offset and repetition frequencies of the comb, resulting in digital servo-loop stabilization of the fs comb. Software capable of computing and analyzing the beat-note RF spectra using the FFT and peak detection was developed. An SDR algorithm performing phase demodulation on the f-2f signal is used as the error-signal source for a digital phase-locked loop stabilizing the offset frequency of the fs comb.
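
    The spectrum-analysis step can be pictured with a short NumPy sketch: compute a windowed FFT of the digitized RF record and pick the strongest bin above a noise-floor threshold. The sample rate, beat-note frequency, amplitudes, and threshold factor below are illustrative only and do not come from the paper.

        import numpy as np

        fs = 100e6                           # assumed digitizer sample rate
        n = 1 << 16
        t = np.arange(n) / fs

        # Synthetic record: a weak 21.4 MHz beat note buried in broadband noise
        x = 0.2 * np.sin(2 * np.pi * 21.4e6 * t) + np.random.normal(0.0, 1.0, n)

        # Windowed FFT of the record
        window = np.hanning(n)
        spectrum = np.abs(np.fft.rfft(x * window))
        freqs = np.fft.rfftfreq(n, d=1 / fs)

        # Peak detection: strongest bin well above the median noise floor
        noise_floor = np.median(spectrum)
        candidates = np.where(spectrum > 8 * noise_floor)[0]
        peak_bin = candidates[np.argmax(spectrum[candidates])]
        print(f"strongest beat note near {freqs[peak_bin] / 1e6:.3f} MHz")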

  8. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan; Tran, Binh-Nien

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the main processing speed, allowing the use of low cost FPGAs.
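
    The central idea, pre-computing fully modulated carrier segments so the run-time path is a pure table lookup, can be sketched in a few lines of NumPy. The constellation size, samples per symbol, and carrier frequency below are illustrative and are not taken from the patent, which targets FPGA hardware rather than software.

        import numpy as np

        sps = 32            # samples per symbol at the IF rate (illustrative)
        carrier_cycles = 4  # IF carrier cycles per symbol period (illustrative)

        n = np.arange(sps)
        cos_lut = np.cos(2 * np.pi * carrier_cycles * n / sps)
        sin_lut = np.sin(2 * np.pi * carrier_cycles * n / sps)

        # 16-QAM levels for the 2-bit I and Q fields of each 4-bit symbol
        levels = np.array([-3.0, -1.0, 1.0, 3.0])

        # One ROM entry per constellation point: the pre-computed, fully
        # modulated carrier segment, so no multipliers are needed at run time
        lut = np.empty((16, sps))
        for s in range(16):
            i_amp = levels[(s >> 2) & 0b11]
            q_amp = levels[s & 0b11]
            lut[s] = i_amp * cos_lut - q_amp * sin_lut

        # Modulation of a symbol stream is then a pure table lookup
        symbols = np.random.randint(0, 16, size=256)
        if_signal = lut[symbols].ravel()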

  9. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
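
    A software stand-in for the two-channel case is sketched below with SciPy: a half-band lowpass/highpass pair splits the signal, each subband is decimated by two, and the synthesis side upsamples, filters, and recombines. The filter length and signal are arbitrary choices, and the reconstruction is only approximate (a classic QMF demo), not the FPGA design described in the paper.

        import numpy as np
        from scipy import signal

        # Half-band lowpass prototype; the highpass is its (-1)^n modulated mirror
        h0 = signal.firwin(64, 0.5)                       # lowpass analysis filter
        h1 = h0 * np.cos(np.pi * np.arange(len(h0)))      # highpass analysis filter

        x = np.random.randn(4096)                         # fullband test signal

        # Analysis bank: filter, then decimate each subband by 2
        low = signal.lfilter(h0, 1.0, x)[::2]
        high = signal.lfilter(h1, 1.0, x)[::2]

        # ... per-subband processing would happen here ...

        def upsample2(v):
            u = np.zeros(2 * len(v))
            u[::2] = v
            return u

        # Synthesis bank: upsample, interpolate, recombine (aliasing cancels;
        # the output approximates x delayed by len(h0) - 1 samples)
        y = 2.0 * (signal.lfilter(h0, 1.0, upsample2(low))
                   - signal.lfilter(h1, 1.0, upsample2(high)))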

  10. Improved assembly processes for the Quartz Digital Accelerometer cantilever

    SciTech Connect

    Romero, A.M.; Gebert, C.T.

    1990-07-01

    This report covers the development of improved assembly processes for the Quartz Digital Accelerometer cantilever. In this report we discuss improved single-assembly tooling, the development of tooling and processes for precision application of polyimide adhesive, the development of the wafer scale assembly procedure, and the application of eutectic bonding to cantilever assembly. 2 refs., 17 figs.

  11. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  12. Dry Machining Process of Milling Machine using Axiomatic Green Methodology

    NASA Astrophysics Data System (ADS)

    Puspita Andriani, Gita; Akbar, Muhammad; Irianto, Dradjad

    2016-02-01

    Most companies know that there are strategies to become a green industry, and they realize that green efforts have impacts on product quality and cost. The Axiomatic Green Methodology models the relationship between green, quality, and cost. The methodology starts with determining the green improvement objective and then continues with mapping the functional, economic, and green requirements. From the mapping, variables that affect the requirements are identified. Afterwards, the effect of each variable is determined by performing experiments and regression modelling. In this research, the axiomatic green methodology was applied to dry machining on a milling machine in order to reduce the amount of coolant. Dry machining is feasible if it is not worse than the minimum required quality. As a result, dry machining was found to be feasible without producing any defects. The proposed machining parameters are to reduce the coolant flow rate from 6.882 ml/minute to 0 ml/minute, set the depth of cut at 1.2 mm, the spindle rotation speed at 500 rpm, and the feed rate at 128 mm/minute. This solution also results in a cost reduction of 200.48 rupiahs per process.

  13. Intelligent detection and diagnosis of lightning arrester faults using digital thermovision image processing techniques

    NASA Astrophysics Data System (ADS)

    Laurentys Almeida, Carlos A.; Caminhas, Walmir M.; Braga, Antonio P.; Paiva, Vinicius; Martins, Helvio; Torres, Rodolfo

    2005-03-01

    This paper describes a methodology that aims to detect and diagnose faults in lightning arresters using the thermovision technique. Thermovision is a non-destructive technique used in diverse maintenance services, with the advantage of not requiring the disconnection of the equipment under inspection. The methodology uses a set of neuro-fuzzy networks to classify lightning arrester faults. It also uses a digital image processing algorithm based on the watershed transform to obtain the segmentation of the lightning arresters. This procedure enables the automatic search for the maximum and minimum temperatures on the lightning arresters, variables that are necessary to generate the diagnosis. By applying the methodology, it is possible to classify the operative condition of lightning arresters as normal, light, suspicious, or faulty. The computational system generated by the proposed methodology trains its neuro-fuzzy networks using historical thermovision data. For the training phase, a heuristic is proposed to set the number of networks in the diagnosis system. The system was validated using a database provided by the Electric Energy Research Center containing hundreds of different fault scenarios. The validation error of the combined neuro-fuzzy networks and automatic digital thermovision image processing was about 10 percent. The diagnosis system described has been successfully used by the Electric Energy Research Center as an auxiliary tool for lightning arrester fault diagnosis.

  14. Methodology and Process for Condition Assessment at Existing Hydropower Plants

    SciTech Connect

    Zhang, Qin Fen; Smith, Brennan T; Cones, Marvin; March, Patrick; Dham, Rajesh; Spray, Michael

    2012-01-01

    The Hydropower Advancement Project (HAP) was initiated by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy to develop and implement a systematic process with a standard methodology to identify opportunities for performance improvement at existing hydropower facilities and to predict and trend the overall condition and improvement opportunity within the U.S. hydropower fleet. The concept of performance for the HAP focuses on water use efficiency: how well a plant or individual unit converts potential energy to electrical energy over a long-term averaging period of a year or more. Performance improvement involves not only optimization of plant dispatch and scheduling but also enhancement of efficiency and availability through advanced technology and asset upgrades, and thus requires inspection and condition assessment of equipment, control systems, and other generating assets. This paper discusses the standard methodology and process for condition assessment of approximately 50 nationwide facilities, including sampling techniques to ensure valid expansion of the 50 assessment results to the entire hydropower fleet. The application and refining process and the results from three demonstration assessments are also presented in this paper.

  15. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required for the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build a model that can map the entire response surface. The interior relationships between parameters were well described by RSM. PMID:24936289
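
    The regression step of RSM amounts to fitting a full quadratic model to the designed experiments by least squares. The sketch below uses a small, invented two-factor data set (coded levels and responses are made up, and only two of the paper's four factors are shown) purely to illustrate the model form.

        import numpy as np

        # Invented coded design points (x1 = applied voltage, x2 = air velocity)
        X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                      [-1, 0], [1, 0], [0, -1], [0, 1],
                      [0, 0], [0, 0], [0, 0]], dtype=float)
        y = np.array([5.2, 4.1, 4.8, 3.9, 4.9, 4.0, 4.6, 4.3, 4.2, 4.25, 4.15])

        # Full quadratic response-surface model:
        # y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        x1, x2 = X[:, 0], X[:, 1]
        design = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
        coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
        print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], coeffs.round(3))))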

  16. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  17. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations, and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in the seasonal variations of the intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species at a Mediterranean site. The major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during the years 2012-2014. To retrieve phenological information from the digital images, a routine of commands to process the image files using the program MATLAB (R2013b, The MathWorks, Natick, Mass.) was specifically created. The images in the dataset were re-classified and the files renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the panoramas acquired, defined by the presence of the most representative species of
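
    The color-channel analysis described here can be illustrated with a short NumPy routine that tracks the mean green chromatic coordinate (G / (R + G + B)) inside a region of interest across an image sequence; inflection points of the resulting curve would then mark candidate phenological transitions. The ROI coordinates and synthetic frames below are placeholders, and this particular index is a common choice in camera-based phenology rather than necessarily the exact quantity used by the authors.

        import numpy as np

        def mean_gcc(image, roi):
            """Mean green chromatic coordinate G / (R + G + B) inside a ROI.
            `image` is an (H, W, 3) RGB array, `roi` is (row0, row1, col0, col1)."""
            r0, r1, c0, c1 = roi
            patch = image[r0:r1, c0:c1, :].astype(np.float64)
            total = patch.sum(axis=2)
            total[total == 0] = 1.0            # guard against all-black pixels
            return float(np.mean(patch[:, :, 1] / total))

        # Seasonal time series over an image sequence (synthetic frames here)
        frames = [np.random.randint(0, 256, (768, 1024, 3), dtype=np.uint8)
                  for _ in range(30)]
        gcc_series = [mean_gcc(f, roi=(100, 300, 200, 500)) for f in frames]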

  18. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648

  19. Orthogonal rotation-invariant moments for digital image processing.

    PubMed

    Lin, Huibao; Si, Jennie; Abousleman, Glen P

    2008-03-01

    Orthogonal rotation-invariant moments (ORIMs), such as Zernike moments, are introduced and defined on a continuous unit disk and have been proven powerful tools in optics applications. These moments have also been digitized for applications in digital image processing. Unfortunately, digitization compromises the orthogonality of the moments and, therefore, digital ORIMs are incapable of representing subtle details in images and cannot accurately reconstruct images. Typical approaches to alleviate the digitization artifact can be divided into two categories: 1) careful selection of a set of pixels as close approximation to the unit disk and using numerical integration to determine the ORIM values, and 2) representing pixels using circular shapes such that they resemble that of the unit disk and then calculating ORIMs in polar space. These improvements still fall short of preserving the orthogonality of the ORIMs. In this paper, in contrast to the previous methods, we propose a different approach of using numerical optimization techniques to improve the orthogonality. We prove that with the improved orthogonality, image reconstruction becomes more accurate. Our simulation results also show that the optimized digital ORIMs can accurately reconstruct images and can represent subtle image details. PMID:18270118

  20. Digital Art Making as a Representational Process

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld

    2013-01-01

    In this article I bring artistic production into the learning sciences conversation by using the production of representations as a bridging concept between art making and the new literacies. Through case studies with 4 youth media arts organizations across the United States I ask how organizations structure the process of producing…

  1. Digital Signal Processing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Li, Yuanqing; Ang, Kai Keng; Guan, Cuntai

    Any brain-computer interface (BCI) system must translate signals from the user's brain into messages or commands (see Fig. 1). Many signal processing and machine learning techniques have been developed for this signal translation, and this chapter reviews the most common ones. Although these techniques are often illustrated using electroencephalography (EEG) signals in this chapter, they are also suitable for other brain signals.

  2. Light Water Reactor Sustainability Program: Digital Technology Business Case Methodology Guide

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam; Vlahoplus, Chris

    2014-09-01

    The Department of Energy’s (DOE’s) Light Water Reactor Sustainability Program aims to develop and deploy technologies that will make the existing U.S. nuclear fleet more efficient and competitive. The program has developed a standard methodology for determining the impact of new technologies in order to assist nuclear power plant (NPP) operators in building sound business cases. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies Pathway is part of the DOE’s Light Water Reactor Sustainability (LWRS) Program. It conducts targeted research and development (R&D) to address aging and reliability concerns with the legacy instrumentation and control and related information systems of the U.S. operating light water reactor (LWR) fleet. This work involves two major goals: (1) to ensure that legacy analog II&C systems are not life-limiting issues for the LWR fleet and (2) to implement digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Resolving long-term operational concerns with the II&C systems contributes to the long-term sustainability of the LWR fleet, which is vital to the nation’s energy and environmental security. The II&C Pathway is conducting a series of pilot projects that enable the development and deployment of new II&C technologies in existing nuclear plants. Through the LWRS program, individual utilities and plants are able to participate in these projects or otherwise leverage the results of projects conducted at demonstration plants. Performance advantages of the new pilot project technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on

  3. Reservoir continuous process improvement six sigma methodology implementation

    SciTech Connect

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  4. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development is presented of real time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provides favorable indications of the feasibility of the acoustic monitor for clinical use.
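
    The detection front end can be pictured as an adaptive linear predictor whose prediction error is thresholded to flag events. The sketch below is a generic LMS predictor run on a synthetic burst signal (filter order, step size, sampling rate, and threshold are invented); it is not the trained predictor or the clinical data from the paper.

        import numpy as np

        def lms_prediction_error(x, order=8, mu=0.01):
            """One-step LMS linear predictor; returns the prediction-error signal,
            whose bursts can be thresholded as candidate heart-tone events."""
            w = np.zeros(order)
            err = np.zeros(len(x))
            for n in range(order, len(x)):
                past = x[n - order:n][::-1]      # most recent sample first
                err[n] = x[n] - w @ past
                w += 2 * mu * err[n] * past      # LMS weight update
            return err

        # Synthetic signal: background noise with one short tone burst
        fs = 2000
        t = np.arange(2 * fs) / fs
        x = 0.1 * np.random.randn(len(t))
        burst = (t > 1.0) & (t < 1.05)
        x[burst] += 0.8 * np.sin(2 * np.pi * 60 * t[burst])

        error = lms_prediction_error(x)
        events = np.abs(error) > 5 * np.std(error[:fs])   # crude event detector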

  5. Statistical process control for hospitals: methodology, user education, and challenges.

    PubMed

    Matthes, Nikolas; Ogunbo, Samuel; Pennington, Gaither; Wood, Nell; Hart, Marilyn K; Hart, Robert F

    2007-01-01

    The health care industry is slowly embracing the use of statistical process control (SPC) to monitor and study causes of variation in health care processes. While the statistics and principles underlying the use of SPC are relatively straightforward, there is a need to be cognizant of the perils that await the user who is not well versed in the key concepts of SPC. This article introduces the theory behind SPC methodology, describes successful tactics for educating users, and discusses the challenges associated with encouraging adoption of SPC among health care professionals. To illustrate these benefits and challenges, this article references the National Hospital Quality Measures, presents critical elements of SPC curricula, and draws examples from hospitals that have successfully embedded SPC into their overall approach to performance assessment and improvement. PMID:17627215

  6. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  7. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, I/Q conversion, sampling, and cross correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.
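
    The fate of the single point target can be followed numerically: the received range line is the delayed transmitted chirp plus noise, and range compression is cross-correlation with the chirp replica. The chirp parameters, delay, and noise level below are invented for illustration.

        import numpy as np

        # Transmitted linear-FM chirp (illustrative parameters)
        fs = 100e6                 # sample rate, Hz
        T = 10e-6                  # pulse length, s
        B = 30e6                   # swept bandwidth, Hz
        t = np.arange(int(T * fs)) / fs
        chirp = np.exp(1j * np.pi * (B / T) * t**2)

        # Received range line: one bright point target at a known delay, plus noise
        n_range = 4096
        delay = 1234
        rx = np.zeros(n_range, dtype=complex)
        rx[delay:delay + len(chirp)] += 0.5 * chirp
        rx += 0.1 * (np.random.randn(n_range) + 1j * np.random.randn(n_range))

        # Range compression = cross-correlation with the replica, done via FFTs
        n_fft = n_range + len(chirp) - 1
        compressed = np.fft.ifft(np.fft.fft(rx, n_fft) * np.conj(np.fft.fft(chirp, n_fft)))
        print("point target detected at range bin", int(np.argmax(np.abs(compressed))))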

  8. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.

  9. Novel digital signal processing and detection techniques

    NASA Astrophysics Data System (ADS)

    Liu, B.

    1981-09-01

    In the area of narrowband signal processing, design rules are developed for optimum decimators and interpolators, a new efficient scheme using recursive filters for decimation/interpolation is proposed, and a novel approach to the computation of narrowband spectra is shown to yield substantial savings over conventional approaches. Results on the implementation of recursive filters with poles near the unit circle that produce significantly reduced roundoff error include a transformation technique, a scheme to modify the quantizer error spectrum, and a new computationally efficient low-noise filter structure. In the area of nonclassical signal detection, several results were derived on nonparametric sequential procedures and on the quantization of signals for detection. In addition, a programmable charge transfer device filter is developed, several problems concerning ADPCM are investigated, results are obtained on FFT roundoff error including the prime factor algorithm, and an effective method of generating random sequences is studied.
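
    The decimation/interpolation idea for narrowband processing can be shown with SciPy's standard routines: an anti-alias filter plus downsampling moves the signal to a low rate where narrowband operations are cheap, and polyphase resampling brings the result back. Rates and the test signal are arbitrary; this is generic SciPy usage, not the filter structures proposed in the report.

        import numpy as np
        from scipy import signal

        fs = 48_000
        t = np.arange(fs) / fs
        # Narrowband content (a 900 Hz tone) plus wideband noise
        x = np.sin(2 * np.pi * 900 * t) + 0.1 * np.random.randn(len(t))

        # Decimate by 8: FIR anti-alias lowpass filter followed by downsampling
        x_low = signal.decimate(x, 8, ftype="fir", zero_phase=True)

        # ... narrowband processing at fs/8 would go here ...

        # Interpolate back to the original rate with a polyphase filter
        x_back = signal.resample_poly(x_low, up=8, down=1)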

  10. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  11. Digital Signal Processing in Acoustics--Part 2.

    ERIC Educational Resources Information Center

    Davies, H.; McNeill, D. J.

    1986-01-01

    Reviews the potential of a data acquisition system for illustrating the nature and significance of ideas in digital signal processing. Focuses on the fast Fourier transform and the utility of its two-channel format, emphasizing cross-correlation and its two-microphone technique of acoustic intensity measurement. Includes programing format. (ML)

  12. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  13. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  14. Optical hybrid analog-digital signal processing based on spike processing in neurons

    NASA Astrophysics Data System (ADS)

    Fok, Mable P.; Tian, Yue; Rosenbluth, David; Deng, Yanhua; Prucnal, Paul R.

    2011-09-01

    Spike processing is one kind of hybrid analog-digital signal processing, which has the efficiency of analog processing and the robustness to noise of digital processing. When instantiated with optics, a hybrid analog-digital processing primitive has the potential to be scalable, computationally powerful, and have high operation bandwidth. These devices open up a range of processing applications for which electronic processing is too slow. Our approach is based on a hybrid analog/digital computational primitive that elegantly implements the functionality of an integrate-and-fire neuron using a Ge-doped non-linear optical fiber and off-the-shelf semiconductor devices. In this paper, we introduce our photonic neuron architecture and demonstrate the feasibility of implementing simple photonic neuromorphic circuits, including the auditory localization algorithm of the barn owl, which is useful for LIDAR localization, and the crayfish tail-flip escape response.
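
    The computational primitive being mimicked optically is the leaky integrate-and-fire neuron, which can be stated in a few lines of NumPy; the time constant, threshold, and drive below are invented, and the sketch says nothing about the fiber-optic implementation itself.

        import numpy as np

        def integrate_and_fire(drive, dt=1e-3, tau=20e-3, threshold=1.0, v_reset=0.0):
            """Leaky integrate-and-fire neuron: integrate the input and emit a
            spike (1) followed by a reset whenever the state crosses threshold."""
            v = v_reset
            spikes = np.zeros_like(drive)
            for n, i_in in enumerate(drive):
                v += dt * (-v / tau + i_in)      # leaky integration
                if v >= threshold:
                    spikes[n] = 1.0
                    v = v_reset                  # fire and reset
            return spikes

        # Noisy step input: sub-threshold first, supra-threshold afterwards
        drive = np.concatenate([np.full(500, 20.0), np.full(500, 80.0)])
        drive += 5.0 * np.random.randn(drive.size)
        spike_train = integrate_and_fire(drive)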

  15. Reflexivity: a methodological tool in the knowledge translation process?

    PubMed

    Alley, Sarah; Jackson, Suzanne F; Shakya, Yogendra B

    2015-05-01

    Knowledge translation is a dynamic and iterative process that includes the synthesis, dissemination, exchange, and application of knowledge. It is considered the bridge that closes the gap between research and practice. Yet it appears that in all areas of practice, a significant gap remains in translating research knowledge into practical application. Recently, researchers and practitioners in the field of health care have begun to recognize reflection and reflexive exercises as a fundamental component to the knowledge translation process. As a practical tool, reflexivity can go beyond simply looking at what practitioners are doing; when approached in a systematic manner, it has the potential to enable practitioners from a wide variety of backgrounds to identify, understand, and act in relation to the personal, professional, and political challenges they face in practice. This article focuses on how reflexive practice as a methodological tool can provide researchers and practitioners with new insights and increased self-awareness, as they are able to critically examine the nature of their work and acknowledge biases, which may affect the knowledge translation process. Through the use of structured journal entries, the nature of the relationship between reflexivity and knowledge translation was examined, specifically exploring if reflexivity can improve the knowledge translation process, leading to increased utilization and application of research findings into everyday practice.

  16. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.

  17. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  18. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  19. Gallium arsenide enhances digital signal processing in electronic warfare

    NASA Astrophysics Data System (ADS)

    Hoffman, B.; Apte, D.

    1985-07-01

    The higher electron mobility and velocity of GaAs digital signal processing IC devices for electronic warfare (EW) allow operation times that are several times faster than those of ICs based on silicon. Particular benefits are foreseen for the response time and broadband capability of ECM systems. Many data manipulation methods can be implemented in emitter-coupled logic (ECL) GaAs devices, and digital GaAs RF memories are noted to show great promise for improved ECM system performance while encompassing microwave frequency and chirp signal synthesis, repeater jamming, and multiple false target generation. EW digital frequency synthesizers are especially in need of GaAs IC technology, since bandwidth and resolution have been limited by ECL technology to about 250 MHz.

  20. Digital processing of histopathological aspects in renal transplantation

    NASA Astrophysics Data System (ADS)

    de Albuquerque Araujo, Arnaldo; de Andrade, Marcos C.; Bambirra, Eduardo A.; dos Santos, A. M. M.

    1993-07-01

    We describe here our initial experience with the digital image processing of histopathological aspects of multiple renal biopsies of a transplanted kidney in a patient treated with Cyclosporine (CsA), a powerful immunosuppressant drug whose use has improved the chances of successful vascularized organ transplantation (Tx). Unfortunately, CsA promotes morphological alterations of the glomerular structure of the kidneys. To characterize this process, the distributions of glomerulus, tuft, and lumen areas are measured. The results are presented in the form of graphs.

  1. Digital image processing for the earth resources technology satellite data.

    NASA Technical Reports Server (NTRS)

    Will, P. M.; Bakis, R.; Wesley, M. A.

    1972-01-01

    This paper discusses the problems of digital processing of the large volumes of multispectral image data that are expected to be received from the ERTS program. Correction of geometric and radiometric distortions is discussed and a byte-oriented implementation is proposed. CPU timing estimates are given for a System/360 Model 67, and show that a processing throughput of 1000 image sets per week is feasible.

  2. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. Emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response times and reduced device cost compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent processes for the EWOD technique have been optimized in this work.

  3. Integrating digital topology in image-processing libraries.

    PubMed

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the approach can be adapted with only minor modifications to other image-processing libraries.

  4. Flow manipulation and control methodologies for vacuum infusion processes

    NASA Astrophysics Data System (ADS)

    Alms, Justin B.

    experienced. First, the effect on permeability is characterized, so the process can be simulated and the flow front patterns can be predicted. It was found that using the VIPR process in combination with tool side injection gates is a very effective method to control resin flow. Based on this understanding several control algorithms were developed to use the process in an automated manufacturing environment which were tested and validated in a virtual environment. To implement and demonstrate the approach, an experimental workstation was built and various infusion examples were performed in the automated environment to validate the capability of the VIPR process with the control methodologies. The VIPR process with control consistently performed better than the process without control. This contribution should prove useful in making VIPs more reliable in the production of large scale composite structures.

  5. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemicals, as well as US infrastructure, including bridges, pipelines, and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial-and-error experimentation. The reason for this is the lack of useful predictive tools that would enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models employing detailed scientific methodologies. However, these models are cumbersome and not easy to use, and therefore have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development; the scope of these simple models is limited. In this research, attempts were made to bridge this gap and provide industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools that facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed to industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their selection of welding process parameters and consumable composition. It is hypothesized that by expanding these tools throughout the welding industry

  6. A symbolic methodology to improve disassembly process design.

    PubMed

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

    Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  7. Digital pulse processing for NaI(Tl) detectors

    NASA Astrophysics Data System (ADS)

    Di Fulvio, A.; Shin, T. H.; Hamel, M. C.; Pozzi, S. A.

    2016-01-01

    We apply two different post-processing techniques to digital pulses induced by photons in a NaI(Tl) detector and compare the obtained energy resolution to the standard analog approach. Our digital acquisition approach is performed using a single-stage acquisition with a fast digitizer. Both the post-processing techniques we propose rely on signal integration. In the first, the pulse integral is calculated by directly numerically integrating the pulse digital samples, while in the second the pulse integral is estimated by a model-based fitting of the pulse. Our study used a 7.62 cm×7.62 cm cylindrical NaI(Tl) detector that gave a 7.60% energy resolution (at 662 keV), using the standard analog acquisition approach, based on a pulse shaping amplifier. The new direct numerical integration yielded a 6.52% energy resolution. The fitting approach yielded a 6.55% energy resolution, and, although computationally heavier than numerical integration, is preferable when only the early samples of the pulse are available. We also evaluated the timing performance of a fast-slow detection system, encompassing an EJ-309 and a NaI(Tl) scintillator. We use two techniques to determine the pulse start time: constant fraction discrimination (CFD) and adaptive noise threshold timing (ANT), for both the analog and digital acquisition approach. With the analog acquisition approach, we found a system time resolution of 5.8 ns and 7.3 ns, using the constant fraction discrimination and adaptive noise threshold timing, respectively. With the digital acquisition approach, a time resolution of 1.2 ns was achieved using the ANT method and 3.3 ns using CFD at 50% of the maximum, to select the pulse start time. The proposed direct digital readout and post-processing techniques can improve the application of NaI(Tl) detectors, traditionally considered 'slow', for fast counting and correlation measurements, while maintaining a good measurement of the energy resolution.
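
    The two post-processing routes compared in the paper can be mimicked on a synthetic digitized pulse: integrate the samples directly, or fit an exponential pulse model and integrate the model analytically (useful when only the early samples are kept). The sample rate, decay constant, noise level, and fit window below are illustrative, not the authors' acquisition settings.

        import numpy as np
        from scipy.optimize import curve_fit

        fs = 250e6                         # assumed digitizer sample rate
        tau_true = 230e-9                  # NaI(Tl)-like scintillation decay time
        t = np.arange(0.0, 2e-6, 1 / fs)
        pulse = 0.8 * np.exp(-t / tau_true) + np.random.normal(0.0, 0.01, t.size)

        # 1) Direct numerical integration of the digitized samples
        integral_numeric = pulse.sum() / fs

        # 2) Model-based fit of the early samples, then analytic integration
        def model(t, a, tau):
            return a * np.exp(-t / tau)

        early = t < 400e-9                 # pretend only the first 400 ns were kept
        (a_fit, tau_fit), _ = curve_fit(model, t[early], pulse[early], p0=(1.0, 200e-9))
        integral_fit = a_fit * tau_fit     # integral of a*exp(-t/tau) over [0, inf)

        print(integral_numeric, integral_fit)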

  8. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. Kern County Water Agency is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could be ultimately extrapolated to the entire 10,400 sq km Kern County watershed. Digital results indicate that digital image processing of Landsat 2 data will provide the usable land cover required by KCWA for input to the SCS runoff model.
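
    For a single storm, the SCS model referred to here reduces to the familiar curve-number runoff equation, and the curve number is what the Landsat-derived land cover and soil data feed into. The sketch below uses the standard textbook form of the equation with invented curve numbers and rainfall depth.

        def scs_runoff(rainfall_in, curve_number):
            """SCS curve-number method: direct runoff Q (inches) from storm
            rainfall P (inches) and a curve number CN derived from soil/land cover."""
            s = 1000.0 / curve_number - 10.0      # potential maximum retention
            ia = 0.2 * s                          # initial abstraction
            if rainfall_in <= ia:
                return 0.0
            return (rainfall_in - ia) ** 2 / (rainfall_in + 0.8 * s)

        # Illustrative curve numbers for land-cover classes mapped from Landsat
        curve_numbers = {"forest": 60, "rangeland": 74, "bare soil": 86, "urban": 92}
        storm = 2.5   # inches of rainfall
        print({cover: round(scs_runoff(storm, cn), 2)
               for cover, cn in curve_numbers.items()})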

  9. Post-digital image processing based on microlens array

    NASA Astrophysics Data System (ADS)

    Shi, Chaiyuan; Xu, Feng

    2014-10-01

    Benefiting from attractive features such as compact volume, thinness, and light weight, imaging systems based on microlens arrays have become an active area of research. However, current imaging systems based on microlens arrays have insufficient imaging quality, so they cannot meet the practical requirements of most applications. As a result, post-digital image processing for image reconstruction from the low-resolution sub-image sequence becomes particularly important. In general, post-digital image processing mainly includes two parts: the accurate estimation of the motion parameters between the sub-images of the sequence and the reconstruction of a high-resolution image. In this paper, given the fact that preprocessing of the unit images can make the edges of the reconstructed high-resolution image clearer, the low-resolution images are preprocessed before the post-digital image processing. Then, after processing with the pixel rearrangement method, a high-resolution image is obtained. From the result, we find that the edges of the reconstructed high-resolution image are clearer than without preprocessing.

  10. Digital image processing as a tool for pavement distress evaluation

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A.; Loizos, A.; Flouda, A.

    The information obtained through accurate condition assessment of pavement surface distress data is needed as an essential input to any decision making process concerning pavement management policy. At the same time, technological advances in automated inspection systems provide the opportunity to automate the collection and evaluation of the pavement surface condition. In this paper a method developed jointly by the Laboratories of Highway Engineering and Photogrammetry of the National Technical University of Athens is described and proposed. The method involves digital image processing techniques to provide suitable digital imagery as input to specialised software developed especially for this project. This software determines objectively and fully automatically the type, the extent, and the severity of surface cracking for flexible road pavements. The proposed method presented substantial agreement when compared with systematic visual ratings of existing pavement cracking carried out according to the internationally accepted requirements for airfield and road pavements of the Federal Aviation Administration (FAA).

  11. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
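
    The core of the retinex family of algorithms is the difference between the log image and the log of a Gaussian-blurred surround, averaged over several scales. The sketch below shows that multi-scale core only (the color restoration step and NASA's tuned constants are omitted; the scales and rescaling are arbitrary), so it approximates the idea rather than reproducing MSRCR itself.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def multi_scale_retinex(image, sigmas=(10, 40, 120)):
            """Basic multi-scale retinex: mean of log(image) - log(surround)
            over several Gaussian scales, applied per channel."""
            img = image.astype(np.float64) + 1.0          # avoid log(0)
            out = np.zeros_like(img)
            for sigma in sigmas:
                surround = gaussian_filter(img, sigma=(sigma, sigma, 0))
                out += np.log(img) - np.log(surround)
            out /= len(sigmas)
            # Stretch each channel to 0..255 for display
            out -= out.min(axis=(0, 1), keepdims=True)
            out /= out.max(axis=(0, 1), keepdims=True) + 1e-12
            return (255.0 * out).astype(np.uint8)

        rgb = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
        enhanced = multi_scale_retinex(rgb)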

  12. Viking image processing. [digital stereo imagery and computer mosaicking

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  13. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  14. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms implemented in automatic voice recognition systems are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on the digital signal, rather than the linguistic, analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition, including Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these algorithms, LPC is the most widely used. This algorithm has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms with fewer assumptions, but they are not widely implemented or investigated. However, with recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
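
    The LPC front end referred to here is usually computed per frame with the autocorrelation method and the Levinson-Durbin recursion. The sketch below shows that computation on a synthetic frame (frame length, order, and test signal are arbitrary); the predictor convention is x_hat[n] = -sum(a[j] * x[n-j]).

        import numpy as np

        def lpc_coefficients(frame, order=10):
            """Autocorrelation-method LPC via the Levinson-Durbin recursion.
            Returns [1, a1, ..., ap]; the predictor is x_hat[n] = -sum(a[j]*x[n-j])."""
            frame = frame * np.hamming(len(frame))
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:len(frame) + order]
            a = np.zeros(order + 1)
            a[0] = 1.0
            err = r[0]
            for i in range(1, order + 1):
                acc = r[i]
                for j in range(1, i):
                    acc += a[j] * r[i - j]
                k = -acc / err                   # reflection coefficient
                a_prev = a.copy()
                for j in range(1, i):
                    a[j] = a_prev[j] + k * a_prev[i - j]
                a[i] = k
                err *= (1.0 - k * k)
            return a

        # Example: a voiced-like 240-sample frame at 8 kHz
        fs = 8000
        n = np.arange(240)
        frame = (np.sin(2 * np.pi * 150 * n / fs)
                 + 0.3 * np.sin(2 * np.pi * 450 * n / fs)
                 + 0.01 * np.random.randn(240))
        a = lpc_coefficients(frame, order=10)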

  15. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently a repository should be evaluated on whether or not they are effective in their data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  16. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long-term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for the DWPF. The usual practice in the preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  17. Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization

    NASA Astrophysics Data System (ADS)

    Repola, L.; Memmolo, R.; Signoretti, D.

    2015-04-01

    In the research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at the survey of submerged archaeological sites, and integrable with standard systems for geomorphological detection of the coast, has been developed. The project involves the construction of hardware consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras and software for the production of point clouds and the initial processing of data. The software provides features for calibration of the stereoscopic vision system, for reduction of noise and distortion in images captured underwater, for searching for corresponding points in the stereoscopic images using dense and sparse stereo-matching algorithms, and for point cloud generation and filtering. Mastery of the methods for efficient data acquisition was achieved only after various calibration and survey tests carried out during the excavations envisaged in the project. The current development of the system has allowed generation of portions of digital models of real submerged scenes. A semi-automatic procedure for global registration of partial models is under development as a useful aid for the study and musealization of sites.

  18. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus largely used by scientists studying rodent mammary gland morphology. PMID:26910307
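    The branch end-point and branching densities described above can be illustrated with a short sketch that skeletonizes a binary segmentation of the ductal tree and classifies skeleton pixels by their number of neighbours. This is only a schematic reconstruction of the idea using scikit-image and SciPy; the function name, the normalization by epithelial area, and the neighbour-count thresholds are assumptions, not the authors' published pipeline.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def ductal_tree_features(binary_mask, pixel_area_mm2=1.0):
    """Count branch end-points and branch (bifurcation) points on the
    skeleton of a binary segmentation of the ductal tree."""
    skeleton = skeletonize(binary_mask.astype(bool))
    # For every skeleton pixel, count its 8-connected skeleton neighbours
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
    end_points = np.logical_and(skeleton, neighbours == 1)      # tips of ducts
    branch_points = np.logical_and(skeleton, neighbours >= 3)   # bifurcations
    epithelial_area = binary_mask.sum() * pixel_area_mm2        # assumed normalization
    return {
        "end_point_density": end_points.sum() / epithelial_area,
        "branching_density": branch_points.sum() / epithelial_area,
        "skeleton_length_px": int(skeleton.sum()),
    }
```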

  19. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not for the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared. PMID:14535661

  20. Digital metamaterials.

    PubMed

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, '0' and '1', in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call 'metamaterial bits', with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental 'metamaterial bytes' with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology.
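    As a toy numerical illustration of the binary-mixing idea described above, the sketch below uses the textbook parallel (arithmetic) and series (harmonic) mixing bounds to relate a target permittivity to the volume fraction of two 'metamaterial bits'. The actual synthesis in the paper depends on geometry and field orientation, so this is only a simplified, hedged stand-in; the permittivity values and function names are invented for the example.

```python
import numpy as np

def bit_fill_fraction(eps_target, eps_bit0, eps_bit1):
    """Volume fraction of 'bit 1' needed to reach eps_target under a simple
    parallel (arithmetic) mixing rule: eps_eff = f*eps_bit1 + (1-f)*eps_bit0."""
    return (eps_target - eps_bit0) / (eps_bit1 - eps_bit0)

def series_mixture(eps_bit0, eps_bit1, f):
    """Effective permittivity when the two bits are stacked normal to the
    field (series / harmonic mixing rule)."""
    return 1.0 / (f / eps_bit1 + (1.0 - f) / eps_bit0)

# Example: mix a plasmonic-like bit (eps = -3) with a dielectric bit (eps = 4)
f = bit_fill_fraction(eps_target=1.8, eps_bit0=4.0, eps_bit1=-3.0)
print(f, 4.0 * (1 - f) + (-3.0) * f)   # the parallel rule reproduces the target
print(series_mixture(4.0, -3.0, f))    # the series rule gives a different value
```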

  1. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM relevant applications for historic building documentation purposes has become an active area of research; however, the development of fully automated systems for cultural heritage documentation remains an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  2. American College of Radiology Imaging Network digital mammographic imaging screening trial: objectives and methodology.

    PubMed

    Pisano, Etta D; Gatsonis, Constantine A; Yaffe, Martin J; Hendrick, R Edward; Tosteson, Anna N A; Fryback, Dennis G; Bassett, Lawrence W; Baum, Janet K; Conant, Emily F; Jong, Roberta A; Rebner, Murray; D'Orsi, Carl J

    2005-08-01

    This study was approved by the Institutional Review Board (IRB) of the American College of Radiology Imaging Network (ACRIN) and each participating site and by the IRB and the Cancer Therapy Evaluation Program at the National Cancer Institute. The study was monitored by an independent Data Safety and Monitoring Board, which received interim analyses of data to ensure that the study would be terminated early if indicated by trends in the outcomes. The ACRIN, which is funded by the National Cancer Institute, conducted the Digital Mammographic Imaging Screening Trial (DMIST) primarily to compare the diagnostic accuracy of digital and screen-film mammography in asymptomatic women presenting for screening for breast cancer. Over the 25.5 months of enrollment, a total of 49 528 women were included at the 33 participating sites, which used five different types of digital mammography equipment. All participants underwent both screen-film and digital mammography. The digital and screen-film mammograms of each subject were independently interpreted by two radiologists. If findings of either examination were interpreted as abnormal, subsequent work-up occurred according to the recommendations of the interpreting radiologist. Breast cancer status was determined at biopsy or follow-up mammography 11-15 months after study entry. In addition to the measurement of diagnostic accuracy by using the interpretations of mammograms at the study sites, DMIST included evaluations of the relative cost-effectiveness and quality-of-life effects of digital versus screen-film mammography. Six separate reader studies using the de-identified archived DMIST mammograms will also assess the diagnostic accuracy of each of the individual digital mammography machines versus screen-film mammography machines, the effect of breast density on diagnostic accuracy of digital and screen-film mammography, and the effect of different rates of breast cancer on the diagnostic accuracy in a reader study. PMID

  3. Multiplexed interferometric fiber-optic sensors with digital signal processing.

    PubMed

    Sadkowski, R; Lee, C E; Taylor, H F

    1995-09-01

    A microcontroller-based digital signal processing system developed for use with fiber-optic sensors for measuring pressure in internal combustion engines is described. A single distributed feedback laser source provides optical power for four interferometric sensors. The laser current is repetitively modulated so that its optical frequency is nearly a linear function of time over most of a cycle. The interferometer phase shift is proportional to the elapsed time from the initiation of a sawtooth until the sensor output signal level crosses a threshold value proportional to the laser output power. This elapsed time, assumed to vary linearly with the combustion chamber pressure, is determined by the use of a digital timer-counter. The system has been used with fiber Fabry-Perot interferometer transducers for in-cylinder pressure measurement on a four-cylinder gasoline-powered engine.
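    A minimal sketch of the timing measurement described above: find the first sample after the start of the laser-current sawtooth at which the sensor output crosses the threshold, and convert that sample index to an elapsed time. The function and its arguments are illustrative assumptions; the actual system performs this measurement with a hardware digital timer-counter rather than in software.

```python
import numpy as np

def crossing_time(sensor_signal, threshold, fs, ramp_start_index=0):
    """Elapsed time from the start of the laser-current sawtooth to the first
    sample where the interferometric sensor output crosses the threshold.
    That elapsed time is taken as approximately proportional to the
    interferometer phase shift, and hence to the cylinder pressure."""
    segment = np.asarray(sensor_signal)[ramp_start_index:]
    above = np.nonzero(segment >= threshold)[0]
    if above.size == 0:
        return None                     # no crossing within this ramp
    return above[0] / fs                # seconds after the ramp start
```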

  4. Fundamentals of in situ digital camera methodology for water quality monitoring of coast and ocean.

    PubMed

    Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave

    2009-01-01

    Conventional digital cameras, the Nikon Coolpix 885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water leaving light. Different systems were deployed to capture upwelling light from below the surface, while eliminating direct surface reflection. Relationships between RGB ratios of water surface images, and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This current paper focuses on the method that was used to acquire digital images, derive RGB values and relate measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method.
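    The core of the camera-as-radiometer idea can be sketched as follows: average the R, G and B channels over a glint-free patch of the water-surface image and form band ratios, which would then be regressed against water quality parameters. This is a hedged illustration using Pillow and NumPy; the crop coordinates, file name and the particular choice of ratios are assumptions rather than the authors' exact protocol.

```python
import numpy as np
from PIL import Image

def rgb_band_ratios(image_path, crop=None):
    """Treat a consumer camera as a three-band radiometer: average the R, G, B
    channels over a patch of the water surface and return simple band ratios."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    if crop is not None:            # (row0, row1, col0, col1) patch free of glint
        r0, r1, c0, c1 = crop
        img = img[r0:r1, c0:c1]
    r, g, b = img[..., 0].mean(), img[..., 1].mean(), img[..., 2].mean()
    return {"R/G": r / g, "B/G": b / g, "B/R": b / r}

# Hypothetical usage: the ratios would then be regressed against yellow-substance
# or chlorophyll concentrations from co-located water samples.
# print(rgb_band_ratios("water_surface.jpg", crop=(100, 400, 200, 500)))
```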

  5. Programmable rate modem utilizing digital signal processing techniques

    NASA Astrophysics Data System (ADS)

    Bunya, George K.; Wallace, Robert L.

    1989-07-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  6. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  7. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out for evaluating the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) standard digital TV in the VHF high band. Measurements were performed in urban and suburban areas in a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of Gaussian additive noise was employed to achieve the signal to noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal to noise ratio thresholds for correct signal decoding.

  8. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processings have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processings (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  9. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processings have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processings (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  10. Fast layout processing methodologies for scalable distributed computing applications

    NASA Astrophysics Data System (ADS)

    Kang, Chang-woo; Shin, Jae-pil; Durvasula, Bhardwaj; Seo, Sang-won; Jung, Dae-hyun; Lee, Jong-bae; Park, Young-kwan

    2012-06-01

    As the feature size shrinks to sub-20nm, more advanced OPC technologies such as ILT and the new lithographic resolution by EUV become the key solutions for device fabrication. These technologies lead to a file size explosion of up to hundreds of gigabytes of GDSII and OASIS files, mainly due to the addition of complicated scattering bars and flattening of the design to compensate for long range effects. Splitting and merging layout files have been done sequentially in typical distributed computing layout applications. This portion becomes the bottleneck, causing the scalability to become poor. According to Amdahl's law, minimizing the sequential portion is the key to getting the maximum speedup. In this paper, we present scalable layout dividing and merging methodologies: skeleton file based querying and direct OASIS file merging. These methods not only use a very minimal memory footprint but also achieve remarkable speed improvement. The skeleton file concept is very novel for a distributed application requiring geometrical processing, as it allows almost pseudo-random access into the input GDSII or OASIS file. Client machines can make use of the random access and perform fast query operations. The skeleton concept also works very well for flat input layouts, which is often the case for post-OPC data. Also, our OASIS file merging scheme is a smart approach which is equivalent to a binary file concatenation scheme. The merging method for OASIS files concatenates shape information in binary format with basic interpretation of bits and very low memory usage. We have observed that the skeleton file concept achieved a 13.5 times speed improvement and used only 3.78% of the memory on the master, compared with the conventional approach of converting into an internal format. Also, the merging speed is very fast, 28MB/sec, and it is 44.5 times faster than the conventional method. On top of the fast merging speed, it is very scalable since the merging time grows in linear fashion

  11. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  12. Graphics processing, video digitizing, and presentation of geologic information

    SciTech Connect

    Sanchez, J.D.

    1990-02-01

    Computer users have unparalleled opportunities to use powerful desktop computers to generate, manipulate, analyze and use graphic information for better communication. Processing graphic geologic information on a personal computer like the Amiga used for the projects discussed here enables geoscientists to create and manipulate ideas in ways once available only to those with access to large budgets and large mainframe computers. Desktop video applications such as video digitizing and powerful graphic processing application programs add a new dimension to the creation and manipulation of geologic information. Videotape slide shows and animated geology give geoscientists new tools to examine and present information. Telecommunication programs such as ATalk III, which can be used as an all-purpose telecommunications program or can emulate a Tektronix 4014 terminal, allow the user to access Sun and Prime minicomputers and manipulate graphic geologic information stored there. Graphics information displayed on the monitor screen can be captured and saved in the standard Amiga IFF graphic format. These IFF files can be processed using image processing programs such as Butcher. Butcher offers edge mapping, resolution conversion, color separation, false colors, toning, positive-negative reversals, etc. Multitasking and easy expansion that includes IBM-XT and AT co-processing offer unique capabilities for graphic processing and file transfer between Amiga-DOS and MS-DOS. Digital images produced by satellites and airborne scanners can be analyzed on the Amiga using the A-Image processing system developed by the CSIRO Division of Mathematics and Statistics and the School of Mathematics and Computing at Curtin University, Australia.

  13. Perspectives on Learning: Methodologies for Exploring Learning Processes and Outcomes

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    2014-01-01

    The papers in this Special Issue were initially prepared for an EARLI 2013 Symposium that was designed to examine methodologies in use by researchers from two sister communities, Learning and Instruction and Learning Sciences. The four papers reflect a common ground in advances in conceptions of learning since the early days of the "cognitive…

  14. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  15. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  16. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our possibilities to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  17. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  18. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  19. Naturalistic observation of health-relevant social processes: the electronically activated recorder methodology in psychosomatics.

    PubMed

    Mehl, Matthias R; Robbins, Megan L; Deters, Fenne Große

    2012-05-01

    This article introduces a novel observational ambulatory monitoring method called the electronically activated recorder (EAR). The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants' momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people's days as they naturally unfold. In sampling only a fraction of the time, it protects participants' privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer's account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, subtle emotional expressions). This article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior; (b) provide ecological observational measures of health-related social processes that are independent of self-report; and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional self-report-based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential aspects (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health.

  20. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated for a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has superior linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
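    A common way to realize the chromatic dispersion compensation discussed above is a frequency-domain equalizer that applies the inverse of the fiber's quadratic phase response. The sketch below is a generic single-polarization illustration in NumPy; the sign convention for beta2 and the parameter names are assumptions, and PMD compensation, adaptive equalization and carrier recovery are deliberately omitted.

```python
import numpy as np

def cd_compensate(signal, fs, beta2, length):
    """Frequency-domain chromatic-dispersion equalizer: multiply the received
    field by the inverse of the fiber's quadratic phase response.
    signal : complex baseband samples of one polarization
    fs     : sampling rate [Hz]
    beta2  : group-velocity dispersion parameter [s^2/m]
    length : fiber length [m]
    """
    n = len(signal)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)   # angular frequency grid
    # The fiber is assumed here to apply exp(+j*beta2/2*omega^2*L);
    # the equalizer applies the opposite quadratic phase.
    h_inv = np.exp(-1j * 0.5 * beta2 * omega**2 * length)
    return np.fft.ifft(np.fft.fft(signal) * h_inv)
```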

  1. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  2. Phase resolved digital signal processing in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    de Boer, Johannes F.; Tripathi, Renu; Park, Boris H.; Nassif, Nader

    2002-06-01

    We present phase resolved digital signal processing techniques for Optical Coherence Tomography to correct for the non Gaussian shape of source spectra and for Group Delay Dispersion (GDD). A broadband source centered at 820 nm was synthesized by combining the spectra of two superluminescent diodes to improve axial image resolution in an optical coherence tomography (OCT) system. Spectral shaping was used to reduce the side lobes (ringing) in the axial point spread function due to the non-Gaussian shape of the spectra. Images of onion cells taken with each individual source and the combined sources, respectively, show the improved resolution and quality enhancement in a turbid biological sample. An OCT system operating at 1310 nm was used to demonstrate that the broadening effect of group delay dispersion (GDD) on the coherence function could be eliminated completely by introducing a quadratic phase shift in the Fourier domain of the interferometric signal. The technique is demonstrated by images of human skin grafts with group delay dispersion mismatch between sample and reference arm before and after digital processing.
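    The GDD correction described above, i.e., introducing a quadratic phase shift in the Fourier domain of the interferometric signal, can be sketched as follows. The snippet assumes a complex (analytic) fringe signal and a known residual GDD value; the function name and that simplification are assumptions made for illustration only.

```python
import numpy as np

def remove_gdd(interferogram, fs, gdd):
    """Cancel group-delay dispersion by applying a quadratic phase shift in the
    Fourier domain of the interferometric signal (analytic-signal form assumed).
    interferogram : complex (analytic) fringe signal sampled at fs [Hz]
    gdd           : residual group-delay dispersion to cancel [s^2]
    """
    n = len(interferogram)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)
    spectrum = np.fft.fft(interferogram)
    corrected = spectrum * np.exp(-1j * 0.5 * gdd * omega**2)   # quadratic phase
    return np.fft.ifft(corrected)
```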

  3. The use of digital signal processing in satellite communication

    NASA Astrophysics Data System (ADS)

    Bramwell, Jonathan Richard

    1988-06-01

    The recent emphasis on information technology has increased the need for methods of data communications with a greater interest in the areas of satellite communications. Data communications over a satellite can be easily achieved by the use of excessive power and bandwidth but efficient management of the satellite resource requires more elegant means of transmission. The optimum modulator and demodulator can be described by mathematical expressions to represent the physical processes that are required to transmit and receive a signal. Digital signal processing circuits can be used to implement these mathematical functions and once correctly designed are not susceptible to variations in accuracy and hence can maintain an accurate representation of the mathematical model. This thesis documents an investigation into the algorithms and techniques that can be used in the digital implementation of a satellite data modem. The technique used for carrier phase recovery and data decoding is a major variation on a method proposed by Viterbi and Viterbi and relies on phase estimation instead of the more common carrier regeneration techniques. A computer simulation of this algorithm and its performance is described and the overall performance of the simulation is compared to theoretical analysis and experimental performance of a multi-data rate satellite modem covering data rates in the range of 16 Ksymbol/sec to 256 Ksymbol/sec in both the BPSK and QPSK data formats.
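    The phase-estimation idea attributed to Viterbi and Viterbi can be illustrated with the standard fourth-power estimator for QPSK, sketched below. The thesis describes a major variation on this method, so the snippet is only the textbook baseline; it assumes symbols on the axes (1, j, -1, -j), and the division by four leaves the usual pi/2 ambiguity, typically handled with differential encoding or known symbols.

```python
import numpy as np

def viterbi_viterbi_phase(symbols, block=64):
    """Blind QPSK carrier-phase estimate per block: raise the received symbols
    to the 4th power to strip the modulation (constellation assumed on the
    axes, so s**4 == 1), average over the block, and divide the angle by 4."""
    symbols = np.asarray(symbols)
    n = len(symbols) - len(symbols) % block
    estimates = []
    for i in range(0, n, block):
        blk = symbols[i:i + block]
        estimates.append(np.angle(np.sum(blk ** 4)) / 4.0)
    # Note: the /4 leaves a pi/2 phase ambiguity, which must be resolved
    # separately (e.g., with differential encoding).
    return np.repeat(estimates, block)
```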

  4. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  5. The Digital Fields Board for the FIELDS instrument suite on the Solar Probe Plus mission: Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Malaspina, David M.; Ergun, Robert E.; Bolton, Mary; Kien, Mark; Summers, David; Stevens, Ken; Yehle, Alan; Karlsson, Magnus; Hoxie, Vaughn C.; Bale, Stuart D.; Goetz, Keith

    2016-06-01

    The first in situ measurements of electric and magnetic fields in the near-Sun environment (< 0.25 AU from the Sun) will be made by the FIELDS instrument suite on the Solar Probe Plus mission. The Digital Fields Board (DFB) is an electronics board within FIELDS that performs analog and digital signal processing, as well as digitization, for signals between DC and 60 kHz from five voltage sensors and four search coil magnetometer channels. These nine input signals are processed on the DFB into 26 analog data streams. A specialized application-specific integrated circuit performs analog to digital conversion on all 26 analog channels simultaneously. The DFB then processes the digital data using a field programmable gate array (FPGA), generating a variety of data products, including digitally filtered continuous waveforms, high-rate burst capture waveforms, power spectra, cross spectra, band-pass filter data, and several ancillary products. While the data products are optimized for encounter-based mission operations, they are also highly configurable, a key design aspect for a mission of exploration. This paper describes the analog and digital signal processing used to ensure that the DFB produces high-quality science data, using minimal resources, in the challenging near-Sun environment.

  6. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
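    One of the GLORIA-specific corrections listed above, the slant-range geometry correction, can be sketched with simple flat-seafloor trigonometry: the horizontal (ground) range is recovered from the measured slant range and the towfish altitude. This is a generic illustration, not the USGS software itself; a real correction would also resample each ping onto a uniform ground-range grid.

```python
import numpy as np

def slant_to_ground_range(slant_range, fish_altitude):
    """Convert slant range to horizontal (ground) range for a side-scan sonar
    towfish, assuming a flat seafloor and straight-line sound propagation."""
    slant_range = np.asarray(slant_range, dtype=float)
    # Samples with slant_range < fish_altitude lie in the water column (no seafloor return)
    return np.sqrt(np.maximum(slant_range**2 - fish_altitude**2, 0.0))

# A raw ping is sampled uniformly in slant range; interpolating each ping onto a
# uniform ground-range grid removes the slant-range geometry and the water-column
# offset near nadir.
```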

  7. Graphics processing unit accelerated computation of digital holograms.

    PubMed

    Kang, Hoonjong; Yaraş, Fahri; Onural, Levent

    2009-12-01

    An approximation for fast digital hologram generation is implemented on a central processing unit (CPU), a graphics processing unit (GPU), and a multi-GPU computational platform. The computational performance of the method on each platform is measured and compared. The computational speed on the GPU platform is much faster than on a CPU, and the algorithm could be further accelerated on a multi-GPU platform. In addition, the accuracy of the algorithm for single- and double-precision arithmetic is evaluated. The quality of the reconstruction from the algorithm using single-precision arithmetic is comparable with the quality from the double-precision arithmetic, and thus the implementation using single-precision arithmetic on a multi-GPU platform can be used for holographic video displays.

  8. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The overall tasks performed concerning the software, data base management and display capabilities of the research computer system, in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.

  9. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  10. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  11. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  12. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  13. Dynamic application of digital image and colour processing in characterizing flame radiation features

    NASA Astrophysics Data System (ADS)

    Huang, Hua Wei; Zhang, Yang

    2010-08-01

    In this work, the dynamic flame properties of flame flickering and equivalence ratio sensing of a combustion process were investigated experimentally. In particular, the time-varying flame properties were examined using a novel digital image and colour processing methodology. This technique makes use of the observed correlation between a digital image colour signal and physical flame radiation characteristics in the visible wavelength domain. Aspects of RGB and HSV colour modelling principles were applied to show that the addition of colour identification in the image processing of high-speed flame image data could yield three useful parameters which are related to the dynamic behaviour of different flame emanating components. First, the validity of the colour identities for tracking the yellowish-red diffusion and greenish-blue premixed flame colourations was examined by comparing their respective flickering frequency profiles. Then, the usefulness of the extracted Rdiffusion, Gpremixed and Bpremixed colour signals to abstractly represent the behaviour of soot, C2* and CH* emission characteristics in a dynamic flame transition from diffusion to stoichiometric premixed condition was demonstrated. In particular, the colour signal ratio Bpremixed/Gpremixed was correlated to exemplify the approximate time-varying state of the equivalence ratio from the imaged combustion phenomenon.
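    A rough sketch of the colour-signal extraction described above: average the R, G and B channels of each high-speed video frame, form a B/G ratio in the spirit of Bpremixed/Gpremixed, and estimate the flicker frequency from the FFT of a colour signal. The snippet works on whole frames and therefore ignores the paper's separation of diffusion and premixed regions via colour identification; the function names and the simple DC-removed FFT peak search are assumptions.

```python
import numpy as np

def colour_signals(frames):
    """frames: (N, H, W, 3) uint8 RGB array from a high-speed flame video.
    Returns per-frame mean R, G, B signals and a whole-frame B/G ratio used
    here as a rough surrogate for the premixed-flame colour state."""
    frames = frames.astype(float)
    r = frames[..., 0].mean(axis=(1, 2))
    g = frames[..., 1].mean(axis=(1, 2))
    b = frames[..., 2].mean(axis=(1, 2))
    return r, g, b, b / np.maximum(g, 1e-9)

def flicker_frequency(signal, fps):
    """Dominant flicker frequency of a colour signal via its FFT magnitude."""
    signal = signal - signal.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
```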

  14. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-04-22

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.

  15. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  16. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also, selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  17. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be absolutely suitable to distinguish masses and microcalcifications from the background tissue using morphological operators and then extract them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  18. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  19. Application Of Digital Image Processing To Acoustic Ambiguity Functions

    NASA Astrophysics Data System (ADS)

    Sharkey, J. Brian

    1983-03-01

    The passive acoustic ambiguity function is a measure of the cross-spectrum in a Doppler-shift and time-delay space that arises when two or more passive receivers are used to monitor a moving acoustic source. Detection of a signal source in the presence of noise has been treated in the past from a communications-theory point of view, with considerable effort devoted to establishing a threshold to which the maximum value of the function is compared. That approach disregards ambiguity function topography information which in practice is manually used to interpret source characteristics and source kinematics. Because of the two-dimensional representation of the ambiguity function, digital image processing techniques can be easily applied for the purposes of topography enhancement and characterization. This work presents an overview of techniques previously reported as well as more current research being conducted to improve detection performance and automate topography characterization.

  20. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  1. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  2. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  3. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers, applicable to digitized multispectral aerial photography, are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.

  4. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as
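
    A hedged sketch of the equiripple FIR design and decimation step described above, using scipy.signal.remez; the sample rate, band edges, tap count, and decimation factor are illustrative values rather than the instrument's actual parameters, and a real low-pass design stands in for the complex filter used on GIFTS.

```python
import numpy as np
from scipy import signal

fs = 100.0       # sample rate (arbitrary units); illustrative only
numtaps = 64     # even-length, linear-phase FIR, as described in the paper

# Equiripple (Parks-McClellan) low-pass: pass below 10, stop above 15
taps = signal.remez(numtaps, [0, 10, 15, 0.5 * fs], [1, 0], fs=fs)

# Apply to a synthetic interferogram-like signal, then decimate by 4
n = np.arange(2048)
x = np.cos(2 * np.pi * 5 * n / fs) + 0.2 * np.random.randn(n.size)
filtered = signal.lfilter(taps, [1.0], x)
decimated = filtered[::4]
print(decimated.shape)
```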

  5. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Astrophysics Data System (ADS)

    Tian, J.; Reisse, R.; Gazarik, M.

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as

  6. Social Information Processing, Emotions, and Aggression: Conceptual and Methodological Contributions of the Special Section Articles

    ERIC Educational Resources Information Center

    Arsenio, William F.

    2010-01-01

    This discussion summarizes some of the key conceptual and methodological contributions of the four articles in this special section on social information processing (SIP) and aggression. One major contribution involves the new methodological tools these studies provide for future researchers. Eye-tracking and mood induction techniques will make it…

  7. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  8. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  9. Knowledge and Processes That Predict Proficiency in Digital Literacy

    ERIC Educational Resources Information Center

    Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.

    2014-01-01

    Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…

  10. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  11. Methodology for comparison of laser digitizing versus contact systems in dimensional control

    NASA Astrophysics Data System (ADS)

    Martínez, S.; Cuesta, E.; Barreiro, J.; Álvarez, B.

    2010-12-01

    The massive acquisition of points in a very short inspection time has made scanning with laser triangulation sensors an adequate technology for reverse engineering activities. However, industry is demanding complementary uses for this technology, such as applications in the scope of dimensional measurement control of mechanical parts. A drawback arises in the scope of metrological applications: the validity of laser scanning has not been tested in terms of accuracy for geometric and dimensional tolerance control. This paper studies the accuracy that can be achieved with these systems in the field of dimensional control. The methodology includes the comparison of two scanning technologies, laser scanning and contact scanning, considering the latter as the reference system. The scope of this study is the measurement of surfaces based on canonical features: planes, spheres, cylinders (both outer and inner holes) and conical surfaces (countersink and counterboring holes). For the research, several devices have been designed, which include diverse form features. Reconstruction of surfaces and subsequent comparison with nominal geometry has been carried out using different CAD systems in order to analyze the convergence of results among them. Additionally, this research analyzes several issues that arise when making the comparisons, such as the setting up of a common reference system for the alignment of surfaces or the scanning strategies.

  12. Social work practice in the digital age: therapeutic e-mail as a direct practice methodology.

    PubMed

    Mattison, Marian

    2012-07-01

    The author addresses the risks and benefits of incorporating therapeutic e-mail communication into clinical social work practice. Consumer demand for online clinical services is growing faster than the professional response. E-mail, when used as an adjunct to traditional meetings with clients, offers distinct advantages and risks. Benefits include the potential to reach clients in geographically remote and underserved communities, enhancing and extending the therapeutic relationship and improving treatment outcomes. Risks include threats to client confidentiality and privacy, liability coverage for practitioners, licensing jurisdiction, and the lack of competency standards for delivering e-mail interventions. Currently, the social work profession does not have adequate instructive guidelines and best-practice standards for using e-mail as a direct practice methodology. Practitioners need (formal) academic training in the techniques connected to e-mail exchanges with clients. The author describes the ethical and legal risks for practitioners using therapeutic e-mail with clients and identifies recommendations for establishing best-practice standards.

  13. How processing digital elevation models can affect simulated water budgets

    USGS Publications Warehouse

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
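
    A minimal sketch of the two DEM aggregation choices discussed above (value nearest the cell centroid versus mean value in the cell), followed by a hypothetical regression to estimate the water table elevation; the DEM, cell size, and regression coefficients are synthetic.

```python
import numpy as np

def cell_land_surface(dem, row_slice, col_slice):
    """Return (centroid value, mean value) of the DEM block covering one
    model cell, illustrating the two aggregation methods in the article."""
    block = dem[row_slice, col_slice]
    centroid = block[block.shape[0] // 2, block.shape[1] // 2]
    return centroid, block.mean()

# Synthetic DEM aggregated to a 10 x 10-pixel model cell
dem = np.random.rand(100, 100) * 50.0 + 100.0
centroid, mean = cell_land_surface(dem, slice(0, 10), slice(0, 10))

# Water table estimated from land surface via a (hypothetical) regression
a, b = 0.95, -2.0          # illustrative regression coefficients
wt_centroid, wt_mean = a * centroid + b, a * mean + b
print(wt_centroid, wt_mean)
```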

  14. Fully Digital: Policy and Process Implications for the AAS

    NASA Astrophysics Data System (ADS)

    Biemesderfer, Chris

    Over the past two decades, every scholarly publisher has migrated at least the mechanical aspects of their journal publishing so that they utilize digital means. The academy was comfortable with that for a while, but publishers are under increasing pressure to adapt further. At the American Astronomical Society (AAS), we think that means bringing our publishing program to the point of being fully digital, by establishing procedures and policies that regard the digital objects of publication primarily. We have always thought about our electronic journals as databases of digital articles, from which we can publish and syndicate articles one at a time, and we must now put flesh on those bones by developing practices that are consistent with the realities of article at a time publication online. As a learned society that holds the long-term rights to the literature, we have actively taken responsibility for the preservation of the digital assets that constitute our journals, and in so doing we have not forsaken the legacy pre-digital assets. All of us who serve as the long-term stewards of scholarship must begin to evolve into fully digital publishers.

  15. Delicate visual artifacts of advanced digital video processing algorithms

    NASA Astrophysics Data System (ADS)

    Nicolas, Marina M.; Lebowsky, Fritz

    2005-03-01

    With the advent of digital TV, sophisticated video processing algorithms have been developed to improve the rendering of motion or colors. However, the perceived subjective quality of these new systems sometimes happens to be in conflict with the objective, measurable improvement we expect to get. In this presentation, we show examples where algorithms should visually improve the skin tone rendering of decoded pictures under normal conditions, but surprisingly fail when the quality of MPEG encoding drops below a just-noticeable threshold. In particular, we demonstrate that simple objective criteria used for the optimization, such as SAD, PSNR or histograms, sometimes fail, partly because they are defined on a global scale, ignoring local characteristics of the picture content. We then integrate a simple human visual model to measure potential artifacts with regard to spatial and temporal variations of the objects' characteristics. Tuning some of the model's parameters allows correlating the perceived objective quality with compression metrics of various encoders. We show the evolution of our reference parameters with respect to the compression ratios. Finally, using the output of the model, we can control the parameters of the skin tone algorithm to reach an improvement in overall system quality.
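
    For reference, the global objective criteria the authors argue can mislead (SAD and PSNR) can be computed as in the sketch below; the frames are synthetic.

```python
import numpy as np

def sad(ref, test):
    """Sum of absolute differences between two frames."""
    return np.abs(ref.astype(np.float64) - test.astype(np.float64)).sum()

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
noise = np.random.randint(-5, 6, ref.shape)
test = np.clip(ref.astype(int) + noise, 0, 255).astype(np.uint8)
print(sad(ref, test), psnr(ref, test))
```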

  16. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master thesis project on a study of computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity as possible candidates for digital loop detection schemes. We have developed synthesized images for the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.

  17. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper addresses the detection and removal of cracks on digitized paintings. Cracks are detected by thresholding. Afterwards, the thin dark brush strokes that have been misidentified as cracks are removed using a median radial basis function neural network on hue and saturation data, together with a semi-automatic procedure based on region growing. Finally, cracks are filled using a Wiener filter. Most of the cracks on digitized paintings are identified and removed in this way, with an improvement rate of about 90%. The approach applies not only to digitized paintings but also to medical and BMP images, and is implemented in MATLAB.
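
    A minimal sketch of threshold-based detection of thin dark cracks is given below, using a morphological black-hat followed by a global threshold; it is not the paper's full pipeline (no MRBF network, region growing, or Wiener filling), and the structuring-element size and threshold factor are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi

def detect_cracks(gray, struct_size=5, k=2.5):
    """Highlight thin dark structures with a morphological black-hat
    (grey-level closing minus original), then threshold the result."""
    img = gray.astype(np.float64)
    closed = ndi.grey_closing(img, size=(struct_size, struct_size))
    blackhat = closed - img               # large where thin dark cracks lie
    mask = blackhat > blackhat.mean() + k * blackhat.std()
    return mask

painting = np.full((100, 100), 180.0)
painting[50, 10:90] = 60.0                # synthetic dark crack
print(detect_cracks(painting).sum())      # number of pixels flagged as crack
```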

  18. Exploring the Developmental Changes in Automatic Two-Digit Number Processing

    ERIC Educational Resources Information Center

    Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey

    2011-01-01

    Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…

  19. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Certification Authority certificate and invalidate any order that fails these validity checks. (8) The system... Obtaining and Using Digital Certificates for Electronic Orders § 1311.55 Requirements for systems used to process digitally signed...

  20. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

    This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the depiction of conservation processes. In particular, the specific steps undertaken for developing and applying the CIDOC-CRM extensions for defining the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the NDT&E methods and their output data, handled through the CIDOC-CRM extension approach of the DOC-CULTURE project to standardize the documentation of conservation, were further reported in Kouis et al. (2).

  1. Emissions involved in acidic deposition processes: Methodology and results

    SciTech Connect

    Placet, M.

    1990-01-01

    Data on the emissions involved in atmospheric acid-base chemistry are crucial to the assessment of acidic deposition and its effects. Sulfur dioxide (SO2), nitrogen oxides (NOx), and volatile organic compounds (VOCs) are the primary chemical compounds involved in acidic deposition processes. In addition, other emission species -- e.g., ammonia, alkaline dust particles, hydrogen chloride, and hydrogen fluoride -- are involved in atmospheric acid-base chemistry, either by contributing acidic constituents or by neutralizing acidic species. Several emissions data bases have been developed under the auspices of the National Acid Precipitation Program (NAPAP). In addition to those developed by NAPAP, emissions data bases and emissions trends estimates also have been developed by organizations such as the Electric Power Research Institute (EPRI) and the U.S. Environmental Protection Agency (EPA). This paper briefly describes and compares the methods used in developing these emissions data bases and presents an overview of their emissions estimates. A more detailed discussion of these topics can be found in the State-of-Science Report on emissions recently released by NAPAP and in the references cited in that report. 14 refs., 4 figs., 1 tab.

  2. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method to collect data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles for effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high performance cluster server with large disk storage and fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  3. Geometric processing of digital images of the planets

    NASA Technical Reports Server (NTRS)

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.
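
    For reference, the forward Sinusoidal Equal-Area mapping used as the working projection can be sketched as follows; the radius and center longitude are illustrative and are not tied to any particular body in the abstract.

```python
import numpy as np

def sinusoidal_forward(lat_deg, lon_deg, radius=1737.4, lon0_deg=0.0):
    """Forward Sinusoidal Equal-Area projection:
       x = R * (lon - lon0) * cos(lat),  y = R * lat   (angles in radians).
    The default radius is the Moon's mean radius in km, chosen only as an example."""
    lat, lon, lon0 = np.radians([lat_deg, lon_deg, lon0_deg])
    x = radius * (lon - lon0) * np.cos(lat)
    y = radius * lat
    return x, y

print(sinusoidal_forward(30.0, 45.0))
```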

  4. Geometric processing of digital images of the planets

    NASA Astrophysics Data System (ADS)

    Edwards, K.

    1987-09-01

    New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.

  5. ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION

    EPA Science Inventory

    The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...

  6. Optical processing architecture and its potential application for digital and analog radiography.

    PubMed

    Liu, H; Xu, J; Fajardo, L L

    1999-04-01

    In this report we introduce the fundamental architectures and the potential applications of optical processing techniques in medical imaging. Three basic optical processing architectures were investigated for digital and analog radiography. The processors consist of a module that converts either the analog or the digital radiograph into a coherent light distribution; a coherent optical processing architecture that performs various mathematical operations; a programmable digital-optical interface and other accessories. Optical frequency filters were implemented for mammographic and other clinical feature enhancement. In medical image processing, digital computers offer the advantages of programmability and flexibility. In contrast, optical processors perform parallel image processing with high speed. Optical processors also offer analog nature, compact size, and cost effectiveness. With technical advances of digital-optical interface devices, the medical image processor, in the foreseeable future, may be a hybrid device, namely, a programmable optical architecture.

  7. A digital signal processing module for gamma-ray tracking detectors

    NASA Astrophysics Data System (ADS)

    Cromaz, M.; Riot, V. J.; Fallon, P.; Gros, S.; Holmes, B.; Lee, I. Y.; Macchiavelli, A. O.; Vu, C.; Yaver, H.; Zimmermann, S.

    2008-12-01

    We have designed and constructed an 8-channel digital signal processing board for the GRETINA spectrometer. The digitizer samples each of 8 inputs at 100 MHz with 12-bit resolution. Employing a large on-board FPGA, the board derives an energy, leading-edge time, and constant-fraction time from the input signal providing the functionality of a conventional analog electronics system. Readout and control of the digitizer is done over a VME bus. The digitizer's performance met all requirements for processing signals from the GRETINA spectrometer.
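
    A hedged sketch of the two timing algorithms named above (leading-edge and digital constant-fraction) applied to a synthetic sampled pulse; the threshold, fraction, and delay are illustrative and are not the board's firmware values.

```python
import numpy as np

def leading_edge_time(samples, threshold):
    """First threshold crossing, refined by linear interpolation."""
    idx = int(np.argmax(samples > threshold))
    if idx == 0:
        return None
    s0, s1 = samples[idx - 1], samples[idx]
    return (idx - 1) + (threshold - s0) / (s1 - s0)

def cfd_time(samples, fraction=0.3, delay=4):
    """Digital constant-fraction discriminator: form f*x[n] - x[n-delay]
    and locate its zero crossing by linear interpolation."""
    shaped = fraction * samples[delay:] - samples[:-delay]
    sign_change = np.where(np.diff(np.signbit(shaped)))[0]
    if sign_change.size == 0:
        return None
    i = int(sign_change[0])
    s0, s1 = shaped[i], shaped[i + 1]
    return delay + i + s0 / (s0 - s1)

t = np.arange(64)
pulse = 1000.0 * np.exp(-(t - 20.0) ** 2 / 18.0)   # synthetic detector pulse
print(leading_edge_time(pulse, 200.0), cfd_time(pulse))
```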

  8. Application of computer analysis of mammography phantom images (CAMPI) methodology to the comparison of two digital biopsy machines

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Fatouros, Panos P.

    1998-07-01

    The objective of this research was to compare a Fischer MammoVision/MammoTest and a LoRad DSM digital biopsy machine using the Computer Analysis of Mammography Phantom Images (CAMPI) methodology. This study reports on analysis of the 4 largest microcalcification groups (M1, M2, M3 and M4) and the largest nodule (N1) in a mammography accreditation phantom on images acquired at 26 kVp and different mAs values on the two machines. Both machines were linear in response but the MammoTest was more sensitive (i.e., it yielded a larger gray-scale value for a given x-ray technique). However, even after correcting for this difference, the CAMPI noise measure was substantially smaller for the LoRad than the MammoTest over the range of mAs values studied. Similarly, the CAMPI signal-to-noise-ratio and correlation measures were higher for the LoRad than the MammoTest over the same range of mAs, especially for the larger objects (M1/M2 and N1). For the smaller specks in M3/M4 somewhat closer performance was observed. The overall differences are attributed to better contrast/noise performance of the LoRad, which appears to outweigh its lesser resolution capability. Our results are in agreement with earlier physical and psychophysical measurements using different methodologies. This work also describes better predictive models (i.e., fits) for the variation of all CAMPI measures with mAs at constant kVp. For example, the noise measure was fitted to a function that included physically reasonable sources of noise, e.g., dark noise and detector gain fluctuations, in addition to the usual quantum noise. These fits can be used to summarize machine performance and to predict dependencies on other variables (e.g., exposure or dose) that are related to the mAs.
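
    The paper's exact fit function is not reproduced here; the sketch below assumes a simple variance model in which quantum noise scales as 1/mAs while dark noise and gain fluctuations are lumped into an exposure-independent term, fitted with scipy.optimize.curve_fit to synthetic data.

```python
import numpy as np
from scipy.optimize import curve_fit

def noise_model(mas, a, b):
    """Assumed model: noise = sqrt(a/mAs + b), where a/mAs is the quantum-noise
    variance and b lumps dark noise and detector gain fluctuations together."""
    return np.sqrt(a / mas + b)

mas = np.array([10.0, 20.0, 40.0, 80.0, 160.0, 320.0])
measured = np.sqrt(50.0 / mas + 2.0) + np.random.normal(0.0, 0.02, mas.size)
params, _ = curve_fit(noise_model, mas, measured, p0=[10.0, 1.0])
print("fitted (a, b):", params)
```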

  9. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    Coriolis mass flowmeters (CMFs) often suffer from two-phase flow, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for CMFs are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital to analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions.
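
    A minimal sketch of zero-crossing detection with local Lagrange (quadratic) interpolation and of the phase-difference calculation between the two sensor signals; the sample rate, drive frequency, and phase lag are synthetic, and the code is illustrative rather than the transmitter's DSP implementation.

```python
import numpy as np

def zero_crossings(signal, fs):
    """Estimate rising zero-crossing times using a three-point Lagrange
    (quadratic) interpolation around each sign change."""
    times = []
    for n in range(1, len(signal) - 1):
        if signal[n] < 0.0 <= signal[n + 1]:
            coeffs = np.polyfit([0.0, 1.0, 2.0], signal[n - 1:n + 2], 2)
            roots = np.roots(coeffs)
            real = roots[np.isreal(roots)].real
            cand = real[(real >= 1.0) & (real <= 2.0)]
            if cand.size:
                times.append((n - 1 + cand[0]) / fs)
    return np.array(times)

fs = 2000.0                                  # sample rate, Hz (illustrative)
f0 = 85.0                                    # flowtube frequency, Hz (illustrative)
t = np.arange(0.0, 0.1, 1.0 / fs)
s1 = np.sin(2 * np.pi * f0 * t)              # inlet sensor
s2 = np.sin(2 * np.pi * f0 * t - 0.02)       # outlet sensor with a phase lag
dt = zero_crossings(s2, fs)[0] - zero_crossings(s1, fs)[0]
phase_diff = 2 * np.pi * f0 * dt             # radians; proportional to mass flow
print(phase_diff)
```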

  10. Digital mapping of side-scan sonar data with the Woods Hole Image Processing System software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high and low resolution sidescan sonar data. In the past, processing and digital mosaicking has been accomplished with a dedicated, shore-based computer system. Recent development of a UNIX-based image-processing software system includes a series of task specific programs for pre-processing sidescan sonar data. To extend the capabilities of the UNIX-based programs, digital mapping techniques have been developed. This report describes the initial development of an automated digital mapping procedure. Included is a description of the programs and steps required to complete the digital mosaicking on a UNIX-based computer system, and a comparison of techniques that the user may wish to select.

  11. Terahertz digital holography image processing based on MAP algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Guang-Hao; Li, Qi

    2015-04-01

    Terahertz digital holography combines terahertz technology and digital holography, fully exploiting the advantages of both. Unfortunately, the quality of terahertz digital holography reconstruction images is gravely harmed by speckle noise, which hinders the popularization of this technology. In this paper, a maximum a posteriori (MAP) estimation filter is harnessed for the restoration of the digital reconstruction images. The filtering results are compared with images filtered by a Wiener filter and conventional frequency-domain filters from both subjective and objective perspectives. For objective assessment, we adopted the speckle index (SPKI) and the edge preserving index (EPI) to quantify image quality. A Canny edge detector is also used to outline the target in original and reconstruction images, which then plays an important role in the evaluation of filter performance. All the analysis indicates that the maximum a posteriori estimation filtering algorithm performs best among the three filters compared in this paper and has enhanced the terahertz digital holography reconstruction images to a certain degree, allowing for more accurate boundary identification.
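
    A hedged sketch of the two objective measures named above, using common definitions (speckle index as the mean ratio of local standard deviation to local mean; EPI as the correlation of Laplacian-filtered original and filtered images); the paper's exact formulas may differ.

```python
import numpy as np
from scipy import ndimage as ndi

def speckle_index(img, win=5):
    """Mean ratio of local standard deviation to local mean (common definition)."""
    img = img.astype(np.float64)
    mean = ndi.uniform_filter(img, win)
    sq_mean = ndi.uniform_filter(img ** 2, win)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return float(np.mean(std / (mean + 1e-12)))

def edge_preserving_index(original, filtered):
    """Correlation between high-pass (Laplacian) versions of the original and
    filtered images; values near 1 indicate that edges are preserved."""
    hp_o = ndi.laplace(original.astype(np.float64))
    hp_f = ndi.laplace(filtered.astype(np.float64))
    hp_o -= hp_o.mean()
    hp_f -= hp_f.mean()
    denom = np.sqrt(np.sum(hp_o ** 2) * np.sum(hp_f ** 2)) + 1e-12
    return float(np.sum(hp_o * hp_f) / denom)

noisy = np.random.gamma(4.0, 25.0, (64, 64))   # speckle-like test image
smoothed = ndi.uniform_filter(noisy, 3)
print(speckle_index(noisy), speckle_index(smoothed))
print(edge_preserving_index(noisy, smoothed))
```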

  12. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  13. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for automated assessment of the reliability of the sheet metal forming process, in order to reduce defects in the manufacture of complex components. The article identifies the range of allowable values of the stamp parameters needed to obtain defect-free punching of truck spars.

  14. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  15. Using Dual-Task Methodology to Dissociate Automatic from Nonautomatic Processes Involved in Artificial Grammar Learning

    ERIC Educational Resources Information Center

    Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.

    2013-01-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…

  16. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... JUSTICE REQUIREMENTS FOR ELECTRONIC ORDERS AND PRESCRIPTIONS Obtaining and Using Digital Certificates for Electronic Orders § 1311.55 Requirements for systems used to process digitally signed orders. (a) A CSOS certificate holder and recipient of an electronic order may use any system to write, track, or maintain...

  17. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  18. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  19. Process reengineering: the role of a planning methodology and picture archiving and communications system team building.

    PubMed

    Carrino, J A; Unkel, P J; Shelton, P; Johnson, T G

    1999-05-01

    The acquisition of a picture archiving and communications system (PACS) is an opportunity to reengineer business practices and should optimally consider the entire process from image acquisition to communication of results. The purpose of this presentation is to describe the PACS planning methodology used by the Department of Defense (DOD) Joint Imaging Technology Project Office (JITPO), outline the critical procedures for each phase, and review the military experience using this model. The methodology is segmented into four phases: strategic planning, clinical scenario planning, installation planning, and implementation planning. Each is further subdivided based on the specific tasks that need to be accomplished within that phase. By using this method, an institution will have clearly defined program goals, objectives, and PACS requirements before vendors are contacted. The development of an institution-specific PACS requirement should direct the process of proposal comparisons to be based on functionality and exclude unnecessary equipment. This PACS planning methodology is being used at more than eight DOD medical treatment facilities. When properly executed, this methodology facilitates a seamless transition to the electronic environment and contributes to the successful integration of the healthcare enterprise. A crucial component of this methodology is the development of a local PACS planning team to manage all aspects of the process. A plan formulated by the local team is based on input from each department that will be integrating with the PACS. Involving all users in the planning process is paramount for successful implementation. PMID:10342159

  20. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment

    PubMed Central

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd.

    2016-01-01

    With the advancement of communication network technology, digital medical images can be transmitted to healthcare professionals via internal networks or public networks (e.g., the Internet), but this also exposes the transmitted images to security threats, such as tampering or insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnosis purposes; thus digital watermarking of medical images is introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in a real environment. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermark processing of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time of parallel watermark processing is much shorter than that of sequential processing. PMID:26981111
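
    A minimal sketch of the parallelization idea (frames distributed over worker processes) using Python's multiprocessing module; the LSB embedding shown is only a placeholder watermark, not the scheme used in the paper, and the worker count and frame sizes are illustrative.

```python
import numpy as np
from multiprocessing import Pool

def embed_lsb(frame_and_bits):
    """Placeholder watermark: overwrite the least-significant bit of each pixel
    with a repeating watermark bit pattern (not the paper's scheme)."""
    frame, bits = frame_and_bits
    wm = np.resize(bits, frame.shape).astype(np.uint8)
    return (frame & 0xFE) | wm

def watermark_frames_parallel(frames, bits, workers=4):
    """Distribute the per-frame watermarking over a pool of worker processes."""
    with Pool(workers) as pool:
        return pool.map(embed_lsb, [(f, bits) for f in frames])

if __name__ == "__main__":
    frames = [np.random.randint(0, 256, (256, 256), dtype=np.uint8) for _ in range(32)]
    bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
    watermarked = watermark_frames_parallel(frames, bits)
    print(len(watermarked), watermarked[0].dtype)
```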

  1. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment.

    PubMed

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd

    2016-01-01

    With the advancement of communication network technology, digital medical images can be transmitted to healthcare professionals via internal networks or public networks (e.g., the Internet), but this also exposes the transmitted images to security threats, such as tampering or insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnosis purposes; thus digital watermarking of medical images is introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in a real environment. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermark processing of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time of parallel watermark processing is much shorter than that of sequential processing. PMID:26981111

  2. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment.

    PubMed

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd

    2016-01-01

    With the advancement of communication network technology, digital medical images can be transmitted to healthcare professionals via internal networks or public networks (e.g., the Internet), but this also exposes the transmitted images to security threats, such as tampering or insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnosis purposes; thus digital watermarking of medical images is introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in a real environment. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermark processing of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time of parallel watermark processing is much shorter than that of sequential processing.

  3. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... COMMISSION Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of... comment draft regulatory guide (DG), DG-1210, ``Developing Software Life Cycle Processes for Digital... practices for developing software life-cycle processes for digital computers used in safety systems...

  4. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters are applied for analyzing satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
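
    A hedged sketch of a Gaussian notch (point) filter applied in the two-dimensional frequency domain, in the spirit of the toolbox's point filter; the notch location, width, and test image are illustrative.

```python
import numpy as np

def gaussian_notch_filter(shape, u0, v0, sigma):
    """Frequency-domain mask that suppresses energy near (+u0, +v0) and its
    conjugate (-u0, -v0); frequencies are in cycles/pixel, DC at the center."""
    rows, cols = shape
    u = np.fft.fftshift(np.fft.fftfreq(rows))[:, None]
    v = np.fft.fftshift(np.fft.fftfreq(cols))[None, :]
    mask = 1.0 - np.exp(-((u - u0) ** 2 + (v - v0) ** 2) / (2 * sigma ** 2))
    mask *= 1.0 - np.exp(-((u + u0) ** 2 + (v + v0) ** 2) / (2 * sigma ** 2))
    return mask

def apply_frequency_filter(image, mask):
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

# Remove a periodic striping artifact from a synthetic image
y, x = np.mgrid[0:128, 0:128]
img = 20.0 * np.sin(2 * np.pi * 0.15 * x) + np.random.rand(128, 128)
mask = gaussian_notch_filter(img.shape, 0.0, 0.15, 0.02)
cleaned = apply_frequency_filter(img, mask)
print(img.std(), cleaned.std())
```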

  5. GEOMETRIC PROCESSING OF DIGITAL IMAGES OF THE PLANETS.

    USGS Publications Warehouse

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformations of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases.

  6. Dynamic range control of audio signals by digital signal processing

    NASA Astrophysics Data System (ADS)

    Gilchrist, N. H. C.

    It is often necessary to reduce the dynamic range of musical programs, particularly those comprising orchestral and choral music, for them to be received satisfactorily by listeners to conventional FM and AM broadcasts. With the arrival of DAB (Digital Audio Broadcasting) a much wider dynamic range will become available for radio broadcasting, although some listeners may prefer to have a signal with a reduced dynamic range. This report describes a digital processor developed by the BBC to control the dynamic range of musical programs in a manner similar to that of a trained Studio Manager. It may be used prior to transmission in conventional broadcasting, replacing limiters or other compression equipment. In DAB, it offers the possibility of providing a dynamic range control signal to be sent to the receiver via an ancillary data channel, simultaneously with the uncompressed audio, giving the listener the option of the full dynamic range or a reduced dynamic range.
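
    A minimal sketch of a feed-forward dynamic range compressor with attack/release smoothing, illustrating the kind of gain control such a processor applies; the threshold, ratio, and time constants are illustrative and are not the BBC processor's parameters.

```python
import numpy as np

def compress(audio, fs, threshold_db=-20.0, ratio=4.0,
             attack_ms=10.0, release_ms=200.0):
    """Feed-forward compressor: compute the per-sample gain reduction above the
    threshold, smooth it with attack/release time constants, apply the gain."""
    eps = 1e-12
    level_db = 20.0 * np.log10(np.abs(audio) + eps)
    over = np.maximum(level_db - threshold_db, 0.0)
    target_gain_db = -over * (1.0 - 1.0 / ratio)

    a_att = np.exp(-1.0 / (fs * attack_ms / 1000.0))
    a_rel = np.exp(-1.0 / (fs * release_ms / 1000.0))
    gain_db = np.empty_like(target_gain_db)
    g = 0.0
    for i, tgt in enumerate(target_gain_db):
        coeff = a_att if tgt < g else a_rel      # attack when reducing gain
        g = coeff * g + (1.0 - coeff) * tgt
        gain_db[i] = g
    return audio * 10.0 ** (gain_db / 20.0)

fs = 48000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 440.0 * t) * np.where(t < 0.5, 0.1, 0.9)
out = compress(sig, fs)
print(np.abs(sig).max(), np.abs(out).max())
```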

  7. Digital hardware and software design for infrared sensor image processing

    NASA Astrophysics Data System (ADS)

    Bekhtin, Yuri; Barantsev, Alexander; Solyakov, Vladimir; Medvedev, Alexander

    2005-06-01

    An example of a digital hardware-and-software complex is described, consisting of a multi-element matrix sensor and a personal computer with an installed special AMBPCI card. The problem of eliminating so-called fixed pattern noise (FPN) is considered. To improve imaging, the residual FPN is represented as multiplicative noise. A wavelet-based de-noising algorithm using sets of noisy and noise-free image data is applied.
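
    The abstract does not specify the algorithms in detail; the sketch below therefore shows a generic two-point (flat-field) correction of multiplicative FPN plus a wavelet soft-threshold denoising step using PyWavelets, which is an assumed tool choice rather than the authors' implementation.

```python
import numpy as np
import pywt  # PyWavelets; an assumed tool choice, not named in the abstract

def flat_field_correct(raw, dark, flat):
    """Two-point correction of multiplicative fixed pattern noise using a dark
    frame and a uniform (flat) reference frame; result is scaled to the mean gain."""
    gain = flat - dark
    gain = gain / gain.mean()
    return (raw - dark) / np.where(gain == 0, 1.0, gain)

def wavelet_denoise(img, wavelet="db2", level=2, thresh=5.0):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thresh, mode="soft") for d in details)
        for details in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

dark = 2.0 * np.random.rand(64, 64)
gain_pattern = 1.0 + 0.1 * np.random.rand(64, 64)    # simulated multiplicative FPN
scene = 100.0 * np.random.rand(64, 64)
raw = scene * gain_pattern + dark
flat = 100.0 * gain_pattern + dark
corrected = flat_field_correct(raw, dark, flat)
print(np.corrcoef(corrected.ravel(), scene.ravel())[0, 1])   # close to 1
print(wavelet_denoise(corrected).shape)
```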

  8. Digital Microfluidic Processing of Mammalian Embryos for Vitrification

    PubMed Central

    Abdelgawad, Mohamed; Sun, Yu

    2014-01-01

    Cryopreservation is a key technology in biology and clinical practice. This paper presents a digital microfluidic device that automates sample preparation for mammalian embryo vitrification. Individual micro droplets manipulated on the microfluidic device were used as micro-vessels to transport a single mouse embryo through a complete vitrification procedure. Advantages of this approach, compared to manual operation and channel-based microfluidic vitrification, include automated operation, cryoprotectant concentration gradient generation, and feasibility of loading and retrieval of embryos. PMID:25250666

  9. From CAD to Digital Modeling: the Necessary Hybridization of Processes

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Bernardi, F.; Cristofolini, A.

    2011-09-01

    The essay deals with themes of digital representation of architecture, drawing on several years of teaching activity within the Automatic Design course of the Engineering/Architecture degree program at the University of Trento. With the development of CAD systems, architectural representation consists less in the drafting of a simple drawing and more in a series of acts of building a complex digital model, which can be used as a database on which to record all the stages of design and interpretation work, and from which to derive final drawings and documents. The advent of digital technology has made it increasingly difficult to find explicit connections between one type of operation and the subsequent outcome, thereby increasing the need for guidelines, the need to understand changes before they occur, and the desire not to be overwhelmed by uncontrollable influences brought by hardware and software systems used only in accordance with the principle of maximum productivity. Education occupies a crucial role because it has the ability to direct the profession toward a thoughtful and selective use of specific applications; teaching must build logical routes in the fluid world of info-graphics, and the only way to do so is to describe its contours through methodological indications: these consist in understanding, studying and divulging what in its mobility does not change (procedural issues), rather than what is transitory in its fixity (manual questions).

  10. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  11. Proper restorative material selection, digital processes allow highly esthetic shade match combined with layered porcelain.

    PubMed

    Kahng, Luke S

    2014-03-01

    Today's digital technologies are affording dentists and laboratory technicians more control over material choices for creating restorations and fabricating dental prostheses. Digital processes can potentially enable technicians to create ideal marginal areas and account for the thickness and support of layering porcelain over substructures in the design process. In this case report of a restoration of a single central incisor, a number of issues are addressed that are central to using the newest digital technology. As demonstrated, shade selection is a crucial early step in any restorative case preparation.

  12. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, S.L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections to, as well as to enhance and digitally mosaic, sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be viewed easily and interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format, which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
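
    To illustrate the general idea behind the feathering step (a generic sketch, not the USGS MIPS code), the fragment below cross-fades two adjacent image strips over their shared overlap with a linear ramp; the array layout, overlap width, and function name are assumptions.

        import numpy as np

        def feather_overlap(strip_a, strip_b, overlap):
            """Blend two adjacent image strips along a shared vertical overlap.

            strip_a and strip_b are 2-D arrays whose last/first `overlap` columns
            cover the same ground; a linear ramp cross-fades them so tone
            differences do not produce a visible seam.
            """
            ramp = np.linspace(1.0, 0.0, overlap)          # weight applied to strip_a
            blend = strip_a[:, -overlap:] * ramp + strip_b[:, :overlap] * (1.0 - ramp)
            return np.hstack([strip_a[:, :-overlap], blend, strip_b[:, overlap:]])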

  13. A Case Methodology for the Study of the Drawing Process and the Drawing Series. Final Report.

    ERIC Educational Resources Information Center

    Beittel, Kenneth R.

    This study asks: What methodologies permit one to describe and analyze the drawing process and drawing series of an artist in a manner cognitively adequate and close to the artist's imaginative consciousness? It assumes that: 1) arting is an ultimate realm of man's experience; 2) it includes artistic causality, idiosyncratic meaning, and intentional…

  14. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  15. Analysis of the Process and Methodology of a School Finance Study in Kentucky.

    ERIC Educational Resources Information Center

    Guess, Arnold

    This paper describes the processes and methodology used in a 3-year study of the Kentucky Foundation Program--a body of related Kentucky revised statutes which directs the amount and flow of State aid for public education in the State. A technical committee, a research committee, a citizens advisory council, and local study committees examined…

  16. Digitizing rocks: standardizing the geological description process using workstations

    SciTech Connect

    Saunders, M.R. (Windsor, Berkshire); Shields, J.A.; Taylor, M.R.

    1993-09-01

    The preservation of geological knowledge in a standardized digital form presents a challenge. Data sources, inherently fuzzy, range in scale from the macroscopic (e.g., outcrop) through the mesoscopic (e.g., hand-specimen) core and sidewall core, to the microscopic (e.g., drill cuttings, thin sections, and microfossils). Each scale change results in increased heterogeneity and potentially contradictory data, and the providers of such data may vary in experience level. To address these issues with respect to cores and drill cuttings, a geological description workstation has been developed and is undergoing field trials. Over 1000 carefully defined geological attributes are currently available within a depth-indexed, relational database. Attributes are stored in digital form, allowing multiple users to select familiar usage (e.g., diabase vs. dolerite). Data can be entered in one language and retrieved in other languages. The database structure allows groupings of similar elements (e.g., rhyolites in acidic, igneous or volcanics subgroups or the igneous rock group), permitting different users to analyze details appropriate to the scale of the usage. Data entry uses a graphical user interface, allowing the geologist to make quick, logical selections in a standardized or custom-built format with extensive menus, on-screen graphics and help screens available. Description ranges are permissible. Entries for lithology, petrology, structures (sedimentary, organic and deformational), reservoir characteristics (porosity and hydrocarbon shows), and macrofossils are available. Sampling points for thin sections, core analysis, geochemistry, or micropaleontology studies are also recorded. Using digital data storage, geological logs using graphical, alphanumeric and symbolic depictions are possible. Data can be integrated with drilling and mud gas data, MWD and wireline data and off well-site analyses to produce composite formation evaluation logs and interpretational crossplots.

  17. Digital image processing for the rectification of television camera distortions.

    NASA Technical Reports Server (NTRS)

    Rindfleisch, T. C.

    1971-01-01

    All television systems introduce distortions into the imagery they record which influence the results of quantitative photometric and geometric measurements. Digital computer techniques provide a powerful approach to the calibration and rectification of these systematic effects. Nonlinear as well as linear problems can be attacked with flexibility and precision. Methods which have been developed and applied for the removal of structured system noises and the correction of photometric, geometric, and resolution distortions in vidicon systems are briefly described. Examples are given of results derived primarily from the Mariner Mars 1969 television experiment.
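
    As a generic sketch of the photometric side of such rectification (not the Mariner 1969 pipeline), the fragment below applies a standard dark-frame and flat-field correction; the frame names and the scaling convention are assumptions for illustration.

        import numpy as np

        def flat_field_correct(raw, dark, flat):
            """Remove additive (dark) and multiplicative (shading) distortions.

            raw, dark and flat are frames of equal shape: dark is recorded with the
            shutter closed, flat while viewing a uniform target.  The result is
            scaled so that the mean response of the flat field is preserved.
            """
            gain = flat.astype(float) - dark
            gain /= gain.mean()
            return (raw.astype(float) - dark) / np.maximum(gain, 1e-6)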

  18. Automation of Axisymmetric Drop Shape Analysis Using Digital Image Processing

    NASA Astrophysics Data System (ADS)

    Cheng, Philip Wing Ping

    The Axisymmetric Drop Shape Analysis - Profile (ADSA-P) technique, as initiated by Rotenberg, is a user-oriented scheme to determine liquid-fluid interfacial tensions and contact angles from the shape of axisymmetric menisci, i.e., from sessile as well as pendant drops. The ADSA-P program requires as input several coordinate points along the drop profile, the value of the density difference between the bulk phases, and gravity. The solution yields interfacial tension and contact angle. Although the ADSA-P technique was in principle complete, it was found that it was of very limited practical use. The major difficulty with the method is the need for very precise coordinate points along the drop profile, which, up to now, could not be obtained readily. In the past, the coordinate points along the drop profile were obtained by manual digitization of photographs or negatives. From manual digitization data, the surface tension values obtained had an average error of +/-5% when compared with literature values. Another problem with the ADSA-P technique was that the computer program failed to converge for the case of very elongated pendant drops. To acquire the drop profile coordinates automatically, a technique which utilizes recent developments in digital image acquisition and analysis was developed. In order to determine the drop profile coordinates as precisely as possible, the errors due to optical distortions were eliminated. In addition, determination of drop profile coordinates to pixel and sub-pixel resolution was developed. It was found that high precision could be obtained through the use of sub-pixel resolution and a spline fitting method. The results obtained using the automatic digitization technique in conjunction with ADSA-P not only compared well with the conventional methods, but also outstripped the precision of conventional methods considerably. To solve the convergence problem of very elongated pendant drops, it was found that the reason for the

  19. Performance of the SIR-B digital image processing subsystem

    NASA Technical Reports Server (NTRS)

    Curlander, J. C.

    1986-01-01

    A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve the maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image records. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.

  20. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery of laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system, which is composed of a frame grabber, a sound board, a VCR control board, a LAN card and EDMS. The computer system also controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on open architecture and a set of widely available industry standards; namely, Microsoft Windows as an operating system, TCP/IP as a network protocol and a time-sequential database that handles both images and speech. For the purpose of data storage, we used MOD and CD-R. The digital endoscopic system was designed to be able to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  1. Digital processing of signals arising from organic liquid scintillators for applications in the mixed-field assessment of nuclear threats

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Mackin, R. O.; Jarrah, Z.; Peyton, A. J.

    2008-10-01

    The nuclear aspect of the CBRN threat is often divided amongst radiological substances posing no criticality risk, often referred to as 'dirty bomb' scenarios, and fissile threats. The latter have the theoretical potential for criticality excursion, resulting in elevated neutron fluxes in addition to the γ-ray component that is common to dirty bombs. Even in isolation of the highly-unlikely criticality scenario, fissile substances often exhibit radiation fields comprising a significant neutron component which can require considerably different counterterrorism measures and clean-up methodologies. The contrast between these threats can indicate important differences in the relative sophistication of the perpetrators and their organizations. Consequently, the detection and discrimination of nuclear perils in terms of mixed-field content is an important assay in combating terrorist threats. In this paper we report on the design and implementation of a fast digitizer and embedded processor for on-the-fly signal processing of events from organic liquid scintillators. A digital technique, known as Pulse Gradient Analysis (PGA), has been developed at Lancaster University for the digital discrimination of neutrons and γ rays. PGA has been deployed on bespoke hardware and demonstrates remarkable improvement over analogue methods for the assay of mixed fields and the real-time discrimination of neutrons and γ rays. In this regard the technology constitutes an attractive and affordable means for the discrimination of the radiation fields arising from fissile threats and those from dirty bombs. Data are presented demonstrating this capability with sealed radioactive sources.
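
    As a loose sketch of how a gradient-style discrimination figure can be computed from a digitized pulse (one common formulation, not necessarily the published PGA parameters), the function below compares the pulse peak with a sample taken a fixed time after the peak; the sample offset and any decision boundary drawn in the peak-versus-gradient plane are assumptions.

        import numpy as np

        def pulse_gradient(pulse, peak_offset_samples=20):
            """Return (peak amplitude, discrimination gradient) for one digitized pulse.

            The gradient is taken between the pulse peak and a sample a fixed number
            of samples after the peak.  Neutron pulses from organic scintillators
            decay more slowly than gamma-ray pulses, so for the same peak amplitude
            they yield a shallower (less negative) gradient.
            """
            pulse = np.asarray(pulse, dtype=float)
            peak_idx = int(np.argmax(pulse))
            tail_idx = min(peak_idx + peak_offset_samples, len(pulse) - 1)
            gradient = (pulse[tail_idx] - pulse[peak_idx]) / (tail_idx - peak_idx)
            return pulse[peak_idx], gradient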

  2. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because design of structures in rock masses presents unique challenges to the designers as a result of the uncertainties inherent in characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  3. A comparison of letter and digit processing in letter-by-letter reading.

    PubMed

    Ingles, Janet L; Eskes, Gail A

    2008-01-01

    The extent to which letter-by-letter reading results from a specific orthographic deficit, as compared with a nonspecific disturbance in basic visuoperceptual mechanisms, is unclear. The current study directly compared processing of letters and digits in a letter-by-letter reader, G.M., using a rapid serial visual presentation (RSVP) task and a speeded matching task. Comparisons were made to a group of six brain-damaged individuals without reading deficits. In the RSVP task, G.M. had increased difficulty reporting the target identities when they were letters, as compared with digits. Although this general pattern was also evident in the control group, the magnitude of the letter-digit accuracy difference was greater in G.M. Similarly, in the matching task, G.M. was slower to match letters than digits, relative to the control group, although his response times to both item types were increased. These data suggest that letter-by-letter reading, at least in this case, results from a visuoperceptual encoding deficit that particularly affects letters, but also extends to processing of digits to a lesser extent. Results are consistent with the notion that a left occipitotemporal area is specialized for letter processing with greater bilaterality in the visual processing of digits.

  4. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  5. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Technical Reports Server (NTRS)

    Blackburn, John; Smith, Dennis

    1992-01-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  6. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNN, we propose to use a digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute the wavelet down-sampling filtering and the half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, which prove the proposed digital CNN processor to be a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people.

  7. Optical fiber diameter measurement by the diffraction method with digital processing of the light scattering indicatrix

    NASA Astrophysics Data System (ADS)

    Kokodii, N. G.; Natarova, A. O.

    2016-07-01

    Relations between the position of the first diffraction minima and the fiber diameter are derived based on the solution of the problem of electromagnetic wave diffraction on a transparent fiber with a circular cross section. The obtained formulas are used to measure the fiber diameter. The diffraction pattern is recorded with a digital camera. The obtained image is digitally processed to determine the positions of the first two scattering indicatrix minima.
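
    For orientation, the sketch below uses the standard slit/Babinet approximation relating the position of the m-th diffraction minimum to the fiber diameter; the paper derives its formulas from scattering by a transparent circular fiber, so the exact expressions differ, and the function name and example numbers here are illustrative assumptions only.

        import math

        def fiber_diameter(wavelength_m, screen_distance_m, x_min_m, order=1):
            """Estimate fiber diameter from the position of a diffraction minimum.

            By Babinet's principle a thin opaque fiber produces the same far-field
            minima as a slit of equal width: d * sin(theta_m) = m * lambda, where
            x_min_m is the distance of the m-th minimum from the optical axis on a
            screen screen_distance_m away.
            """
            sin_theta = x_min_m / math.hypot(x_min_m, screen_distance_m)
            return order * wavelength_m / sin_theta

        # Example: 633 nm laser, screen at 1 m, first minimum 12.7 mm off axis
        # -> diameter of roughly 50 micrometres.
        print(fiber_diameter(633e-9, 1.0, 12.7e-3))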

  8. Considerations in developing geographic informations systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  9. Digital Libraries.

    ERIC Educational Resources Information Center

    Fox, Edward A.; Urs, Shalini R.

    2002-01-01

    Provides an overview of digital libraries research, practice, and literature. Highlights include new technologies; redefining roles; historical background; trends; creating digital content, including conversion; metadata; organizing digital resources; services; access; information retrieval; searching; natural language processing; visualization;…

  10. Erosion processes by water in agricultural landscapes: a low-cost methodology for post-event analyses

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo

    2015-04-01

    Throughout the world, agricultural landscapes assume a great importance, especially for supplying food and a livelihood. Among the land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger the people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To estimate soil losses due to bank erosion and rill processes quantitatively, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented with fine resolution, and their monitoring may entail too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique, which may even rely on smart-phones, for the post-event analyses of i) bank erosion affecting agricultural channels, and ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smart-phone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved to be effective for quantitative estimation of erosion volumes and, in the case of the eroded bank, of deposited materials as well. SfM applied to pictures taken by smartphones is useful for the analysis of the topography
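
    A minimal sketch of the volume estimation step, assuming two co-registered SfM-derived DTMs of the same plot, is given below; it simply differences the grids and integrates over the cell area, and the noise threshold and function names are assumptions rather than the authors' workflow.

        import numpy as np

        def erosion_deposition_volumes(dtm_before, dtm_after, cell_size_m, min_change_m=0.01):
            """Volumes eroded and deposited between two co-registered DTMs.

            dtm_before and dtm_after are elevation grids (metres) on the same grid;
            changes smaller than min_change_m are treated as noise.
            Returns (eroded_m3, deposited_m3).
            """
            dz = (dtm_after - dtm_before).astype(float)
            dz[np.abs(dz) < min_change_m] = 0.0
            cell_area = cell_size_m ** 2
            eroded = -dz[dz < 0].sum() * cell_area
            deposited = dz[dz > 0].sum() * cell_area
            return eroded, deposited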

  11. Medical ultrasound digital beamforming on a massively parallel processing array platform

    NASA Astrophysics Data System (ADS)

    Chen, Paul; Butts, Mike; Budlong, Brad

    2008-03-01

    Digital beamforming has been widely used in modern medical ultrasound instruments. Flexibility is the key advantage of a digital beamformer over the traditional analog approach. Unlike analog delay lines, digital delay can be programmed to implement new ways of beam shaping and beam steering without hardware modification. Digital beamformers can also be focused dynamically by tracking the depth and focusing the receive beam as the depth increases. By constantly updating an element weight table, a digital beamformer can dynamically increase aperture size with depth to maintain constant lateral resolution and reduce sidelobe noise. Because ultrasound digital beamformers have high I/O bandwidth and processing requirements, traditionally they have been implemented using ASICs or FPGAs that are costly both in time and in money. This paper introduces a sample implementation of a digital beamformer that is programmed in software on a Massively Parallel Processor Array (MPPA). The system consists of a host PC and a PCI Express-based beamformer accelerator with an Ambric Am2045 MPPA chip and 512 Mbytes of external memory. The Am2045 has 336 asynchronous RISC-DSP processors that communicate through a configurable structure of channels, using a self-synchronizing communication protocol.
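
    To make the dynamic-focusing idea concrete, a highly simplified delay-and-sum sketch for one image line is shown below; it uses nearest-sample delays, no apodization, and an on-axis transmit assumption, and all parameter values are illustrative, not the Am2045 implementation.

        import numpy as np

        def delay_and_sum(rf, element_x, focus_depths, c=1540.0, fs=40e6):
            """Dynamic-receive delay-and-sum beamforming for one image line.

            rf           : (n_elements, n_samples) array of received echoes
            element_x    : lateral element positions (m), beam axis at x = 0
            focus_depths : depths (m) at which to form the output samples
            The per-element receive delay is the path from the focal point back to
            the element; the transmit path is taken along the beam axis.
            """
            n_elem, n_samp = rf.shape
            element_x = np.asarray(element_x, dtype=float)
            out = np.zeros(len(focus_depths))
            for i, z in enumerate(focus_depths):
                t = z / c + np.sqrt(z ** 2 + element_x ** 2) / c   # two-way time per element
                idx = np.clip((t * fs).astype(int), 0, n_samp - 1)
                out[i] = rf[np.arange(n_elem), idx].sum()
            return out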

  12. Studying the relationship between dreaming and sleep-dependent memory processes: methodological challenges.

    PubMed

    Schredl, Michael

    2013-12-01

    The hypothesis that dreaming is involved in off-line memory processing is difficult to test because major methodological issues have to be addressed, such as dream recall and the effect of remembered dreams on memory. It would be fruitful--in addition to studying the ancient art of memory (AAOM) in a scanner--to study the dreams of persons who use AAOM regularly.

  13. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  14. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  15. An efficient forward-secure group certificate digital signature scheme to enhance EMR authentication process.

    PubMed

    Yu, Yao-Chang; Hou, Ting-Wei

    2014-05-01

    The frequently used digital signature algorithms, such as RSA and the Digital Signature Algorithm (DSA), lack forward-secure function. The result is that, when private keys are renewed, trustworthiness is lost. In other words, electronic medical records (EMRs) signed by revoked private keys are no longer trusted. This significant security threat stands in the way of EMR adoption. This paper proposes an efficient forward-secure group certificate digital signature scheme that is based on Shamir's (t,n) threshold scheme and Schnorr's digital signature scheme to ensure trustworthiness is maintained when private keys are renewed and to increase the efficiency of EMRs' authentication processes in terms of number of certificates, number of keys, forward-secure ability and searching time.
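
    For orientation only, the toy sketch below shows the plain Schnorr signature primitive that the proposed scheme builds on; it uses deliberately tiny, insecure parameters, omits the Shamir (t,n) threshold sharing and the forward-secure key update that are the paper's actual contributions, and every constant and name in it is an illustrative assumption.

        import hashlib
        import secrets

        # Toy Schnorr signature over a small subgroup (p = 23, q = 11, g = 2).
        # Real deployments use >= 2048-bit moduli or elliptic-curve groups.
        P, Q, G = 23, 11, 2

        def h(r: int, msg: bytes) -> int:
            return int.from_bytes(hashlib.sha256(str(r).encode() + msg).digest(), "big") % Q

        def keygen():
            x = secrets.randbelow(Q - 1) + 1            # private key
            return x, pow(G, x, P)                      # (private, public)

        def sign(x: int, msg: bytes):
            k = secrets.randbelow(Q - 1) + 1            # per-signature nonce
            r = pow(G, k, P)
            e = h(r, msg)
            s = (k + x * e) % Q
            return e, s

        def verify(y: int, msg: bytes, sig) -> bool:
            e, s = sig
            r = (pow(G, s, P) * pow(y, (Q - e) % Q, P)) % P   # g^s * y^(-e) = g^k
            return h(r, msg) == e

        priv, pub = keygen()
        assert verify(pub, b"EMR record", sign(priv, b"EMR record"))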

  16. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    ERIC Educational Resources Information Center

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  17. A Federated Digital Identity Management Approach for Business Processes

    NASA Astrophysics Data System (ADS)

    Bertino, Elisa; Ferrini, Rodolfo; Musci, Andrea; Paci, Federica; Steuer, Kevin J.

    Business processes have gained a lot of attention because of the pressing need for integrating existing resources and services to better fulfill customer needs. A key feature of business processes is that they are built from composable services, referred to as component services, that may belong to different domains. In such a context, flexible multi-domain identity management solutions are crucial for increased security and user-convenience. In particular, it is important that during the execution of a business process the component services be able to verify the identity of the client to check that it has the required permissions for accessing the services. To address the problem of multi-domain identity management, we propose a multi-factor identity attribute verification protocol for business processes that assures clients privacy and handles naming heterogeneity.

  18. Hybrid digital signal processing and neural networks applications in PWRs

    SciTech Connect

    Eryurek, E.; Upadhyaya, B.R.; Kavaklioglu, K.

    1991-12-31

    Signal validation and plant subsystem tracking in power and process industries require the prediction of one or more state variables. Both heteroassociative and autoassociative neural networks were applied for characterizing relationships among sets of signals. A multi-layer neural network paradigm was applied for sensor and process monitoring in a Pressurized Water Reactor (PWR). This nonlinear interpolation technique was found to be very effective for these applications.

  19. Hybrid digital signal processing and neural networks applications in PWRs

    SciTech Connect

    Eryurek, E.; Upadhyaya, B.R.; Kavaklioglu, K.

    1991-01-01

    Signal validation and plant subsystem tracking in power and process industries require the prediction of one or more state variables. Both heteroassociative and autoassociative neural networks were applied for characterizing relationships among sets of signals. A multi-layer neural network paradigm was applied for sensor and process monitoring in a Pressurized Water Reactor (PWR). This nonlinear interpolation technique was found to be very effective for these applications.

  20. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of the independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating in a pot at 311 °C, with the medicine at 119 °C, for 123 s with frequent flipping. The experimental values under the optimized processing technology were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM).

  1. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of the independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating in a pot at 311 °C, with the medicine at 119 °C, for 123 s with frequent flipping. The experimental values under the optimized processing technology were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM). PMID:25835367

  2. The design, fabrication, and test of a new VLSI hybrid analog-digital neural processing element

    NASA Technical Reports Server (NTRS)

    Deyong, Mark R.; Findley, Randall L.; Fields, Chris

    1992-01-01

    A hybrid analog-digital neural processing element with the time-dependent behavior of biological neurons has been developed. The hybrid processing element is designed for VLSI implementation and offers the best attributes of both analog and digital computation. Custom VLSI layout reduces the layout area of the processing element, which in turn increases the expected network density. The hybrid processing element operates at the nanosecond time scale, which enables it to produce real-time solutions to complex spatiotemporal problems found in high-speed signal processing applications. VLSI prototype chips have been designed, fabricated, and tested with encouraging results. Systems utilizing the time-dependent behavior of the hybrid processing element have been simulated and are currently in the fabrication process. Future applications are also discussed.

  3. Evaluation of a change detection methodology by means of binary thresholding algorithms and informational fusion processes.

    PubMed

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution.
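
    As a minimal sketch of the index-threshold-fuse pattern described here (not the authors' indices, thresholding algorithms, or fusion rule), the fragment below computes two simple change indices, thresholds each with Otsu's method, and combines the binary maps; the use of scikit-image and the agreement rule are assumptions.

        import numpy as np
        from skimage.filters import threshold_otsu

        def change_map(before, after):
            """Fuse two simple change indices into a single change/no_change map.

            The absolute difference and the log-ratio of the two co-registered
            images are each thresholded with Otsu's method; a pixel is flagged as
            'change' only where both binary maps agree.
            """
            before = before.astype(float) + 1.0     # avoid log(0)
            after = after.astype(float) + 1.0
            indices = [np.abs(after - before), np.abs(np.log(after / before))]
            masks = [idx > threshold_otsu(idx) for idx in indices]
            return np.logical_and.reduce(masks)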

  4. Evaluation of a Change Detection Methodology by Means of Binary Thresholding Algorithms and Informational Fusion Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a change detection multisource fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023

  5. Airy-Kaup-Kupershmidt filters applied to digital image processing

    NASA Astrophysics Data System (ADS)

    Hoyos Yepes, Laura Cristina

    2015-09-01

    The Kaup-Kupershmidt operator is applied to the two-dimensional solution of the Airy-diffusion equation and the resulting filter is applied via convolution to image processing. The full procedure is implemented using Maple code with the package ImageTools. Some experiments were performed using a wide category of images including biomedical images generated by magnetic resonance, computerized axial tomography, positron emission tomography, infrared and photon diffusion. The Airy-Kaup-Kupershmidt filter can be used as a powerful edge detector and as a powerful enhancement tool in image processing. It is expected that the Airy-Kaup-Kupershmidt filter could be incorporated in standard programs for image processing such as ImageJ.

  6. Digital signal processing utilizing a generic instruction set

    NASA Astrophysics Data System (ADS)

    Mosley, V. V. W.; Bronder, J.; Wenk, A.

    In order to maintain a degree of technological equivalence between software and hardware in advanced VLSI development efforts, a set of generic instructions has been defined in the form of Ada-callable procedures which invoke a complex sequence of events for the execution of vector instructions in signal processing modules. Attention is presently given to real time signal processing functions in the cases of fighter aircraft fire control radar, passive sonar surveillance, communications systems' FSK demodulation and bit regeneration, and electronic warfare support measures and countermeasures. Generalized examples of each application are given as data flow graphs.

  7. Rapid processing of letters, digits and symbols: what purely visual-attentional deficit in developmental dyslexia?

    PubMed

    Ziegler, Johannes C; Pech-Georgel, Catherine; Dufau, Stéphane; Grainger, Jonathan

    2010-07-01

    Visual-attentional theories of dyslexia predict deficits for dyslexic children not only for the perception of letter strings but also for non-alphanumeric symbol strings. This prediction was tested in a two-alternative forced-choice paradigm with letters, digits, and symbols. Children with dyslexia showed significant deficits for letter and digit strings but not for symbol strings. This finding is difficult to explain for visual-attentional theories of dyslexia which postulate identical deficits for letters, digits and symbols. Moreover, dyslexics showed normal W-shaped serial position functions for letter and digit strings, which suggests that their deficit is not due to an abnormally small attentional window. Finally, the size of the deficit was identical for letters and digits, which suggests that poor letter perception is not just a consequence of the lack of reading. Together then, our results show that symbols that map onto phonological codes are impaired (i.e. letters and digits), whereas symbols that do not map onto phonological codes are not impaired. This dissociation suggests that impaired symbol-sound mapping rather than impaired visual-attentional processing is the key to understanding dyslexia. PMID:20590718

  8. Digital Image Processing Applied To Quality Assurance In Mineral Industry

    NASA Astrophysics Data System (ADS)

    Hamrouni, Zouheir; Ayache, Alain; Krey, Charlie J.

    1989-03-01

    In this paper, we bring forward an application of vision in the domain of quality assurance in the mineral industry of talc. By using image processing and computer vision means, the proposed real-time whiteness captor system intends: - to inspect the whiteness of the ground product, - to manage the mixing of primary talcs before grinding, in order to obtain a final product with a predetermined whiteness. The system uses the robotic CCD microcamera MICAM (designed by our laboratory and presently manufactured), a microcomputer system based on the Motorola 68020 and real-time image processing boards. It has the following industrial specifications: - High reliability - Whiteness is determined with a 0.3% precision on a scale of 25 levels. Because of the expected precision, we had to study carefully the lighting system, the type of image captor and associated electronics. The first developed software is able to process the whiteness of talcum powder; we then conceived original algorithms to assess the whiteness of rough talc taking into account texture and shadows. The processing times of these algorithms are completely compatible with industrial rates. This system can be applied to other domains where a high precision reflectance captor is needed: industry of paper, paints, ...

  9. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
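
    A simplified sketch of the shift-filter-invert sequence described in the claim is given below for a single hologram; the carrier offsets, filter radius, and the use of NumPy FFTs are assumptions for illustration, not the patented implementation.

        import numpy as np

        def reconstruct_sideband(hologram, carrier_row, carrier_col, radius):
            """Recover complex amplitude from one spatially-heterodyned hologram.

            The fringe pattern is Fourier transformed, the spectrum is rolled so the
            heterodyne carrier (offset carrier_row/carrier_col pixels from the
            centre) sits at the origin, a circular low-pass filter removes the other
            terms, and an inverse transform returns the complex image.
            """
            spec = np.fft.fftshift(np.fft.fft2(hologram))
            centred = np.roll(spec, (-carrier_row, -carrier_col), axis=(0, 1))
            rows, cols = hologram.shape
            y, x = np.ogrid[:rows, :cols]
            mask = (y - rows // 2) ** 2 + (x - cols // 2) ** 2 <= radius ** 2
            return np.fft.ifft2(np.fft.ifftshift(centred * mask))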

  10. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method of obtaining multiple spatially-heterodyned holograms includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  11. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
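
    As a compact illustration of the value-improvement idea for the discounted, stochastic case only (the described methodology also covers undiscounted and deterministic variants not handled here), the sketch below iterates the Bellman backup to convergence; the array layout, discount factor, and tolerance are assumptions.

        import numpy as np

        def value_iteration(P, R, discount=0.95, tol=1e-8):
            """Solve a finite MDP by repeated value improvement.

            P[a, s, t] : probability of moving from state s to t under action a
            R[a, s]    : expected immediate reward for taking action a in state s
            Returns (optimal state values, optimal policy).
            """
            n_actions, n_states, _ = P.shape
            v = np.zeros(n_states)
            while True:
                q = R + discount * np.einsum("ast,t->as", P, v)   # action values
                v_new = q.max(axis=0)
                if np.max(np.abs(v_new - v)) < tol:
                    return v_new, q.argmax(axis=0)
                v = v_new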

  12. Methodology for the Elimination of Reflection and System Vibration Effects in Particle Image Velocimetry Data Processing

    NASA Technical Reports Server (NTRS)

    Bremmer, David M.; Hutcheson, Florence V.; Stead, Daniel J.

    2005-01-01

    A methodology to eliminate model reflection and system vibration effects from post processed particle image velocimetry data is presented. Reflection and vibration lead to loss of data, and biased velocity calculations in PIV processing. A series of algorithms were developed to alleviate these problems. Reflections emanating from the model surface caused by the laser light sheet are removed from the PIV images by subtracting an image in which only the reflections are visible from all of the images within a data acquisition set. The result is a set of PIV images where only the seeded particles are apparent. Fiduciary marks painted on the surface of the test model were used as reference points in the images. By locating the centroids of these marks it was possible to shift all of the images to a common reference frame. This image alignment procedure as well as the subtraction of model reflection are performed in a first algorithm. Once the images have been shifted, they are compared with a background image that was recorded under no flow conditions. The second and third algorithms find the coordinates of fiduciary marks in the acquisition set images and the background image and calculate the displacement between these images. The final algorithm shifts all of the images so that fiduciary mark centroids lie in the same location as the background image centroids. This methodology effectively eliminated the effects of vibration so that unbiased data could be used for PIV processing. The PIV data used for this work was generated at the NASA Langley Research Center Quiet Flow Facility. The experiment entailed flow visualization near the flap side edge region of an airfoil model. Commercial PIV software was used for data acquisition and processing. In this paper, the experiment and the PIV acquisition of the data are described. The methodology used to develop the algorithms for reflection and system vibration removal is stated, and the implementation, testing and
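
    A highly simplified sketch of the two operations described above, reflection subtraction and shifting images to a common reference frame, is shown below; it uses a single fiduciary-mark centroid and whole-pixel shifts, whereas the actual algorithms use several marks and separate processing steps, so every name and parameter here is an illustrative assumption.

        import numpy as np

        def remove_reflection_and_align(images, reflection_only, mark_centroids, ref_centroid):
            """Subtract a reflection-only frame and shift each image to a common frame.

            images          : list of 2-D PIV frames from one acquisition set
            reflection_only : frame in which only the laser-sheet reflection is visible
            mark_centroids  : (row, col) centroid of one fiduciary mark in each frame
            ref_centroid    : centroid of the same mark in the background image
            """
            aligned = []
            for img, (r, c) in zip(images, mark_centroids):
                cleaned = np.clip(img.astype(float) - reflection_only, 0, None)
                dr = int(round(ref_centroid[0] - r))
                dc = int(round(ref_centroid[1] - c))
                aligned.append(np.roll(cleaned, (dr, dc), axis=(0, 1)))
            return aligned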

  13. Experimental study of digital image processing techniques for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Rifman, S. S. (Principal Investigator); Allendoerfer, W. B.; Caron, R. H.; Pemberton, L. J.; Mckinnon, D. M.; Polanski, G.; Simon, K. W.

    1976-01-01

    The author has identified the following significant results. Results are reported for: (1) subscene registration, (2) full scene rectification and registration, (3) resampling techniques, and (4) ground control point (GCP) extraction. Subscenes (354 pixels x 234 lines) were registered to approximately 1/4 pixel accuracy and evaluated by change detection imagery for three cases: (1) bulk data registration, (2) precision correction of a reference subscene using GCP data, and (3) independently precision processed subscenes. Full scene rectification and registration results were evaluated by using a correlation technique to measure registration errors of 0.3 pixel rms throughout the full scene. Resampling evaluations of nearest neighbor and TRW cubic convolution processed data included change detection imagery and feature classification. Resampled data were also evaluated for an MSS scene containing specular solar reflections.

  14. Implementation of a Digital Signal Processing Subsystem for a Long Wavelength Array Station

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Navarro, Robert; D'Addario, Larry; Sigman, Elliott; Wang, Douglas

    2011-01-01

    This paper describes the implementation of a Digital Signal Processing (DSP) subsystem for a single Long Wavelength Array (LWA) station. The LWA is a radio telescope that will consist of many phased array stations. Each LWA station consists of 256 pairs of dipole-like antennas operating over the 10-88 MHz frequency range. The Digital Signal Processing subsystem digitizes up to 260 dual-polarization signals at 196 MHz from the LWA Analog Receiver, adjusts the delay and amplitude of each signal, and forms four independent beams. Coarse delay is implemented using a first-in-first-out buffer and fine delay is implemented using a finite impulse response filter. Amplitude adjustment and polarization corrections are implemented using a 2x2 matrix multiplication.
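
    To illustrate the fine-delay step in general terms (a common windowed-sinc approach, not the LWA firmware), the sketch below builds a fractional-delay FIR whose taps shift a signal by a sub-sample amount; the tap count, window, and normalization are assumptions.

        import numpy as np

        def fractional_delay_fir(delay_samples, n_taps=16):
            """Windowed-sinc FIR taps that delay a signal by a fractional sample.

            Coarse (integer) delay is assumed to be handled separately, e.g. by a
            FIFO; this filter supplies the remaining sub-sample delay.  Note that
            the filter also adds a fixed group delay of (n_taps - 1) / 2 samples.
            """
            n = np.arange(n_taps)
            taps = np.sinc(n - (n_taps - 1) / 2.0 - delay_samples)
            taps *= np.hamming(n_taps)
            return taps / taps.sum()

        # Usage: delayed = np.convolve(signal, fractional_delay_fir(0.3), mode="same")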

  15. Pulsed digital holography system recording ultrafast process of the femtosecond order

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Zhai, Hongchen; Mu, Guoguang

    2006-06-01

    We report, for the first time to our knowledge, a pulsed digital microholographic system with spatial angular multiplexing for recording the ultrafast process of the femtosecond order. The optimized design of the two sets of subpulse-train generators in this system makes it possible to implement a digital holographic recording with spatial angular multiplexing of a frame interval of the femtosecond order, while keeping the incident angle of the object beams unchanged. Three pairs of amplitude and phase images from the same view angle digitally reconstructed by the system demonstrated the ultrafast dynamic process of laser-induced ionization of ambient air at a wavelength of 800 nm, with a time resolution of 50 fs and a frame interval of 300 fs.

  16. Time-resolved digital holographic microscopy of laser-induced forward transfer process

    PubMed Central

    Ma, H.; Venugopalan, V.

    2014-01-01

    We develop a method for time-resolved digital holographic microscopy to obtain time-resolved 3-D deformation measurements of laser induced forward transfer (LIFT) processes. We demonstrate nanometer axial resolution and nanosecond temporal resolution of our method which is suitable for measuring dynamic morphological changes in LIFT target materials. Such measurements provide insight into the early dynamics of the LIFT process and a means to examine the effect of laser and material parameters on LIFT process dynamics. PMID:24748724

  17. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  18. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    PubMed

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative, and streamlined workflow in order to provide high quality of patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of treatment workflow for improving the delivery systems. Project management techniques, a responsibility matrix and a swim-lane activity diagram representing the sequence of activities can be combined for data collection, presentation, and evaluation of patient care. This paper presents this integrated methodology using multidisciplinary meetings and a walking-the-route approach for data collection, an integrated responsibility matrix and swim-lane activity diagram with activity time for data representation, and a 5-why and gap analysis approach for data analysis. This enables collection of the right level of detail in a shorter time frame by identifying process flaws and deficiencies while being independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate the effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal training of staff and rapid implementation.

  19. A comparison of orthogonal transformations for digital speech processing.

    NASA Technical Reports Server (NTRS)

    Campanella, S. J.; Robinson, G. S.

    1971-01-01

    Discrete forms of the Fourier, Hadamard, and Karhunen-Loeve transforms are examined for their capacity to reduce the bit rate necessary to transmit speech signals. To rate their effectiveness in accomplishing this goal the quantizing error (or noise) resulting for each transformation method at various bit rates is computed and compared with that for conventional companded PCM processing. Based on this comparison, it is found that Karhunen-Loeve provides a reduction in bit rate of 13.5 kbits/s, Fourier 10 kbits/s, and Hadamard 7.5 kbits/s as compared with the bit rate required for companded PCM. These bit-rate reductions are shown to be somewhat independent of the transmission bit rate.
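    The underlying reason the Karhunen-Loeve and Fourier transforms allow larger bit-rate reductions than the Hadamard transform is their better energy compaction on highly correlated signals such as speech. A minimal sketch of that effect, using an AR(1) surrogate signal rather than real speech, is shown below.

```python
import numpy as np

def energy_compaction(x, transform):
    """Fraction of block energy captured by the largest 25% of coefficients."""
    c = transform(x)
    e = np.sort(np.abs(c) ** 2)[::-1]
    return e[: len(e) // 4].sum() / e.sum()

rng = np.random.default_rng(0)
n = 64
# AR(1) surrogate with strong sample-to-sample correlation, speech-like.
x = np.zeros(n)
for i in range(1, n):
    x[i] = 0.95 * x[i - 1] + rng.standard_normal()

# Discrete Fourier transform (unitary scaling).
dft = lambda v: np.fft.fft(v) / np.sqrt(len(v))
# Hadamard transform built by Sylvester recursion (n must be a power of 2).
H = np.array([[1.0]])
while H.shape[0] < n:
    H = np.block([[H, H], [H, -H]])
hadamard = lambda v: H @ v / np.sqrt(len(v))

print("DFT compaction:     ", energy_compaction(x, dft))
print("Hadamard compaction:", energy_compaction(x, hadamard))
```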

  20. Digital Image Processing Applied To Problems In Art And Archaeology

    NASA Astrophysics Data System (ADS)

    Asmus, John F.; Katz, Norman P.

    1988-12-01

    Many of the images encountered during scholarly studies in the fields of art and archaeology have deteriorated through the effects of time. The Ice-Age rock art of the now-closed caves near Lascaux is a prime example of this fate. However, faint and subtle details of these can be exceedingly important, as some theories suggest that the designs are computers or calendars pertaining to astronomical cycles as well as seasons for hunting, gathering, and planting. Consequently, we have applied a range of standard image processing algorithms (viz., edge detection, spatial filtering, spectral differencing, and contrast enhancement) as well as specialized techniques (e.g., matched filters) to the clarification of these drawings. Also, we report the results of computer enhancement studies pertaining to authenticity, faint details, sitter identity, and age of portraits by da Vinci, Rembrandt, Rotari, and Titian.

  1. Digital Signal Processing for the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2015-08-01

    A broad international collaboration is building the Event Horizon Telescope (EHT). The aim is to test Einstein’s theory of General Relativity in one of the very few places it could break down: the strong gravity regime right at the edge of a black hole. The EHT is an earth-size VLBI array operating at the shortest radio wavelengths, that has achieved unprecedented angular resolution of a few tens of μarcseconds. For nearby super massive black holes (SMBH) this size scale is comparable to the Schwarzschild Radius, and emission in the immediate neighborhood of the event horizon can be directly observed. We give an introduction to the science behind the CASPER-enabled EHT, and outline technical developments, with emphasis on the secret sauce of high speed signal processing.

  2. Optimization of laser-assisted glass frit bonding process by response surface methodology

    NASA Astrophysics Data System (ADS)

    Wang, Wen; Xiao, Yanyi; Wu, Xingyang; Zhang, Jianhua

    2016-03-01

    In this work, a systematic study on laser-assisted glass frit bonding process was carried out by response surface methodology (RSM). Laser power, sealing speed and spot diameter were considered as key bonding parameters. Combined with a central rotatable experimental design, RSM was employed to establish mathematical model to predict the relationship between the shear force after bonding and the bonding process parameters. The model was validated experimentally. Based on the model, the interaction effects of the process parameters on the shear force were analyzed and the optimum bonding parameters were achieved. The results indicate that the model can be used to illustrate the relationship between the shear force and the bonding parameters. The predicted results obtained under the optimized parameters by the models are consistent with the experimental results.
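    The core of the RSM step, fitting a second-order polynomial in the coded factors and using it to predict the response, can be sketched as follows. The design points and shear-force values are synthetic placeholders, not the paper's data.

```python
import numpy as np

# Coded levels (-1, 0, +1) for laser power, sealing speed and spot diameter,
# with an illustrative measured response (shear force); values are synthetic.
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)
y = np.array([42, 55, 48, 60, 40, 52, 45, 58, 44, 50, 47, 53, 62, 61, 63], float)

def design_matrix(X):
    """Full second-order model: intercept, linear, interaction and squared terms."""
    x1, x2, x3 = X.T
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,
        x1 * x2, x1 * x3, x2 * x3,
        x1**2, x2**2, x3**2,
    ])

# Least-squares fit of the quadratic response surface.
beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print("fitted coefficients:", np.round(beta, 3))

# Predict the response at a candidate operating point (coded units).
candidate = np.array([[0.3, -0.2, 0.1]])
print("predicted shear force:", design_matrix(candidate) @ beta)
```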

  3. Application of reverse micelle extraction process for amylase recovery using response surface methodology.

    PubMed

    Bera, Manav B; Panesar, Parmjit S; Panesar, Reeba; Singh, Bahadur

    2008-06-01

    The effect of different process variables of reverse micelle extraction process like pH, addition of surfactant (AOT) concentration and potassium chloride (KCl) concentration on amylase recovery has been studied and analysed. Solid-state fermentation was used for the production of amylase enzyme. Response surface methodology (RSM) using central composite rotatable design (CCRD) was employed to analyse and optimize the enzyme extraction process. The regression analysis indicates that the effect of AOT concentration, and KCl concentration were significant, whereas the effect of pH was non-significant on enzyme recovery. For the maximum recovery of enzyme, the optimum operating condition for pH, AOT concentration (M) and KCl concentration were 10.43, 0.05 and 1.00, respectively. Under these optimal conditions, the enzyme recovery was 83.16%.

  4. Wavelet image processing applied to optical and digital holography: past achievements and future challenges

    NASA Astrophysics Data System (ADS)

    Jones, Katharine J.

    2005-08-01

    The link between wavelets and optics goes back to the work of Dennis Gabor, who both invented holography and developed Gabor decompositions. Holography involves 3-D images. Gabor decompositions involve 1-D signals. Gabor decompositions are the predecessors of wavelets. Wavelet image processing of holography, both optical holography and digital holography, will be examined with respect to past achievements and future challenges.

  5. Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.

    ERIC Educational Resources Information Center

    Smith, Cecil L., Ed.

    Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programming); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…

  6. Synchronous multi-microprocessor system for implementing digital signal processing algorithms

    SciTech Connect

    Barnwell, T.P. III; Hodges, C.J.M.

    1982-01-01

    This paper discusses the details of a multi-microprocessor system design as a research facility for studying multiprocessor implementation of digital signal processing algorithms. The overall system, which consists of a control microprocessor, eight satellite microprocessors, a control minicomputer, and extensive distributed software, has proven to be an effective tool in the study of multiprocessor implementations. 5 references.

  7. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  8. Language influences on numerical development—Inversion effects on multi-digit number processing

    PubMed Central

    Klein, E.; Bahnmueller, J.; Mann, A.; Pixner, S.; Kaufmann, L.; Nuerk, H.-C.; Moeller, K.

    2013-01-01

    In early numerical development, children have to become familiar with the Arabic number system and its place-value structure. The present review summarizes and discusses evidence for language influences on the acquisition of the highly transparent structuring principles of digital-Arabic digits by means of its moderation through the transparency of the respective language's number word system. In particular, the so-called inversion property (i.e., 24 named as “four and twenty” instead of “twenty four”) was found to influence number processing in children not only in verbal but also in non-verbal numerical tasks. Additionally, there is first evidence suggesting that inversion-related difficulties may influence numerical processing longitudinally. Generally, language-specific influences in children's numerical development are most pronounced for multi-digit numbers. Yet, there is currently only one study on three-digit number processing for German-speaking children. A direct comparison of additional new data from Italian-speaking children further corroborates the assumption that language impacts on cognitive (number) processing as inversion-related interference was found most pronounced for German-speaking children. In sum, we conclude that numerical development may not be language-specific but seems to be moderated by language. PMID:23935585

  9. Digital image processing applications in the ignition and combustion of char/coal particles

    SciTech Connect

    Annamalai, K.; Kharbat, E.; Goplakrishnan, C.

    1992-12-01

    Digital image processing is employed in this research study in order to visually investigate the ignition and combustion characteristics of isolated char/coal particles as well as the effect of interactive combustion in two-particle char/coal arrays. Preliminary experiments are conducted on miniature isolated candles as well as two-candle arrays.

  10. Methods of Adapting Digital Content for the Learning Process via Mobile Devices

    ERIC Educational Resources Information Center

    Lopez, J. L. Gimenez; Royo, T. Magal; Laborda, Jesus Garcia; Calvo, F. Garde

    2009-01-01

    This article analyses different methods of adapting digital content for its delivery via mobile devices taking into account two aspects which are a fundamental part of the learning process; on the one hand, functionality of the contents, and on the other, the actual controlled navigation requirements that the learner needs in order to acquire high…

  11. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  12. An Undergraduate Course and Laboratory in Digital Signal Processing with Field Programmable Gate Arrays

    ERIC Educational Resources Information Center

    Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.

    2010-01-01

    In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…

  13. Detecting Buried Archaeological Remains by the Use of Geophysical Data Processing with 'Diffusion Maps' Methodology

    NASA Astrophysics Data System (ADS)

    Eppelbaum, Lev

    2015-04-01

    observe that as a result of the above operations we embedded the original data into 3-dimensional space where data related to the AT subsurface are well separated from the N data. This 3D set of the data representatives can be used as a reference set for the classification of newly arriving data. Geophysically it means a reliable division of the studied areas for the AT-containing and not containing (N) these objects. Testing this methodology for delineation of archaeological cavities by magnetic and gravity data analysis displayed an effectiveness of this approach. References Alperovich, L., Eppelbaum, L., Zheludev, V., Dumoulin, J., Soldovieri, F., Proto, M., Bavusi, M. and Loperte, A., 2013. A new combined wavelet methodology applied to GPR and ERT data in the Montagnole experiment (French Alps). Journal of Geophysics and Engineering, 10, No. 2, 025017, 1-17. Averbuch, A., Hochman, K., Rabin, N., Schclar, A. and Zheludev, V., 2010. A diffusion frame-work for detection of moving vehicles. Digital Signal Processing, 20, No.1, 111-122. Averbuch A.Z., Neittaanmäki, P., and Zheludev, V.A., 2014. Spline and Spline Wavelet Methods with Applications to Signal and Image Processing. Volume I: Periodic Splines. Springer. Coifman, R.R. and Lafon, S., 2006. Diffusion maps, Applied and Computational Harmonic Analysis. Special issue on Diffusion Maps and Wavelets, 21, No. 7, 5-30. Eppelbaum, L.V., 2011. Study of magnetic anomalies over archaeological targets in urban conditions. Physics and Chemistry of the Earth, 36, No. 16, 1318-1330. Eppelbaum, L.V., 2014a. Geophysical observations at archaeological sites: Estimating informational content. Archaeological Prospection, 21, No. 2, 25-38. Eppelbaum, L.V. 2014b. Four Color Theorem and Applied Geophysics. Applied Mathematics, 5, 358-366. Eppelbaum, L.V., Alperovich, L., Zheludev, V. and Pechersky, A., 2011. Application of informational and wavelet approaches for integrated processing of geophysical data in complex environments. Proceed
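    For readers unfamiliar with the diffusion-maps construction referred to above, a minimal sketch is given below: build a Gaussian affinity matrix over feature vectors, row-normalize it into a Markov transition matrix, and embed the data with its leading non-trivial eigenvectors. The synthetic "target" and "background" feature vectors are assumptions, not the geophysical data of the study.

```python
import numpy as np

def diffusion_map(features, epsilon, n_components=3):
    """Embed feature vectors with a basic diffusion-maps construction."""
    # Pairwise squared distances and Gaussian affinity kernel.
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / epsilon)
    # Row-normalize into a Markov transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Leading non-trivial eigenvectors give the diffusion coordinates.
    eigval, eigvec = np.linalg.eig(P)
    order = np.argsort(-eigval.real)
    coords = eigvec[:, order[1:n_components + 1]].real
    return coords * eigval[order[1:n_components + 1]].real

rng = np.random.default_rng(1)
# Synthetic "anomalous target" vs. "background" geophysical feature vectors.
targets = rng.normal(1.0, 0.2, size=(30, 5))
background = rng.normal(0.0, 0.2, size=(70, 5))
embedding = diffusion_map(np.vstack([targets, background]), epsilon=1.0)
```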

  14. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is on the extension of the earlier proposal of the same title, August 1994-June 1996. The report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed there were a few items that needed to be completed prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA, and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two different fields. The breast field has more power in the lower octaves and the off-breast field behaves similar to a wideband process, where more power is in the high frequency octaves. After the two fields are separated by lowpass filtering, a region labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases.
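    Task 1's separation of the breast field from the off-breast background (low-pass filtering followed by labelling the largest contiguous region) can be sketched roughly as below; the Gaussian filter width and the global-mean threshold are simplifying assumptions rather than the multiresolution scheme actually developed.

```python
import numpy as np
from scipy import ndimage

def segment_breast(image, sigma=8.0):
    """Separate breast from background: smooth, threshold at the global mean,
    then keep the largest contiguous region as the breast field."""
    smooth = ndimage.gaussian_filter(image.astype(float), sigma)
    mask = smooth > smooth.mean()          # assumed simple global threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)

# Illustrative use on a placeholder mammogram array.
mammogram = np.random.rand(512, 512)
breast_mask = segment_breast(mammogram)
```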

  15. A Novel Optical/digital Processing System for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Boone, Bradley G.; Shukla, Oodaye B.

    1993-01-01

    This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result, but, in addition, we show simulation results that calculate angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images, from which length, width, area, aspect ratio, and orientation can be derived. In addition to circumventing scale and rotation distortions, these simulations indicate that the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation test and evaluation using simple synthetic object data will be described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
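    The Radon-transform half of the processor can be reproduced with standard tools. The sketch below computes the Radon transform of a synthetic rectangular object with scikit-image and extracts a crude orientation cue from the projection peaks; it is only loosely related to the angular-correlation algorithm and is not the optical implementation described above.

```python
import numpy as np
from skimage.transform import radon

# Synthetic binary object: a filled rectangle embedded in a dark frame.
image = np.zeros((128, 128))
image[40:90, 55:75] = 1.0

# Radon transform: line integrals of the image over a range of angles.
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(image, theta=angles)

# Peak projection value versus angle: the tallest peak occurs when the
# integration direction runs along the object's long axis.
signature = sinogram.max(axis=0)
print("projection peak is largest at", angles[np.argmax(signature)], "degrees")
```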

  16. Digital radar-gram processing for water pipelines leak detection

    NASA Astrophysics Data System (ADS)

    García-Márquez, Jorge; Flores, Ricardo; Valdivia, Ricardo; Carreón, Dora; Malacara, Zacarías; Camposeco, Arturo

    2006-02-01

    Ground penetrating radars (GPR) are useful underground exploration devices. Applications are found in archaeology, mine detection, and pavement evaluation, among others. Here we use a GPR to detect, in an indirect way, the anomalies caused by the presence of water in the neighborhood of an underground water pipeline. By Fourier transforming a GPR profile map we interpret the signal in terms of the spatial frequencies, instead of the temporal frequencies, that compose the profile map. This allows differentiating between signals returning from a standard subsoil feature and those coming back from anomalous zones. Facilities in Mexican cities are commonly buried up to 2.5 m. Their constituent materials are PVC, concrete or metal, typically steel. GPRs are ultra-wide band devices; leak detection must be an indirect process since echoes due to the presence of underground zones with high moisture levels are masked by dense reflections (clutter). In radargrams the presence of water is visualized as anomalies in the neighborhood of the facility. Enhancement of these anomalies gives us the information required to detect leaks.
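    A minimal sketch of the spatial-frequency view of a radargram is given below: a synthetic profile map is Fourier-transformed in two dimensions, and energy away from the zero spatial-frequency column is taken as a crude indicator of lateral anomalies. The synthetic reflector and anomaly are assumptions, not field data.

```python
import numpy as np

# Synthetic radargram: two-way travel-time samples (rows) vs. traces (columns).
n_samples, n_traces = 256, 128
radargram = np.random.normal(0, 0.1, size=(n_samples, n_traces))
radargram[120, :] += 1.0          # flat reflector: energy at zero spatial frequency
radargram[80:200, 60] += 0.8      # localized anomaly: energy spread across spatial frequencies

# 2-D Fourier transform of the profile map: axis 0 ~ temporal frequency,
# axis 1 ~ spatial frequency along the survey line.
spectrum = np.fft.fftshift(np.fft.fft2(radargram))
magnitude = np.log1p(np.abs(spectrum))

# Energy away from the zero spatial-frequency column hints at lateral anomalies.
center_col = n_traces // 2
off_axis_cols = [c for c in range(n_traces) if abs(c - center_col) > 5]
print("off-axis spectral energy:", magnitude[:, off_axis_cols].sum())
```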

  17. Matching rendered and real world images by digital image processing

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated images (CGI) have been used in commercial and industrial photography, providing a broad scope in product advertising. Mixing real world images with those rendered from virtual space software shows a more or less visible mismatch between the corresponding image quality performance. Rendered images are produced by software whose quality performance is limited only by the output resolution. Real world images are taken with cameras that introduce some amount of image degradation through factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color pattern demosaicing, etc. The effect of all these image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object by the system PSF, its characterization shows the amount of image degradation added to any taken picture. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match both virtual and real world image qualities. The system MTF is determined by the slanted edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different final image regions.
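    Once the system PSF has been characterized, degrading the rendered image reduces to a convolution. A minimal sketch assuming a Gaussian PSF approximation (standing in for the slanted-edge MTF measurement) is shown below; the sigma value is a hypothetical placeholder.

```python
import numpy as np
from scipy import ndimage

def degrade_render(rendered, psf_sigma_px):
    """Blur a rendered image with a Gaussian approximation of the camera PSF
    so its sharpness matches the real photograph it will be composited with."""
    return ndimage.gaussian_filter(rendered.astype(float), sigma=psf_sigma_px)

# Illustrative sigma, hypothetically derived from a slanted-edge MTF measurement.
rendered = np.random.rand(480, 640)   # placeholder for a rendered frame
matched = degrade_render(rendered, psf_sigma_px=1.2)
```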

  18. Realization of guitar audio effects using methods of digital signal processing

    NASA Astrophysics Data System (ADS)

    Buś, Szymon; Jedrzejewski, Konrad

    2015-09-01

    The paper is devoted to studies on the possibilities of realizing guitar audio effects by means of digital signal processing methods. As a result of the research, selected audio effects corresponding to the specifics of guitar sound were realized as the real-time system called Digital Guitar Multi-effect. Before implementation in the system, the selected effects were investigated using a dedicated application with a graphical user interface created in the Matlab environment. In the second stage, the real-time system based on a microcontroller and an audio codec was designed and realized. The system is designed to perform audio effects on the output signal of an electric guitar.
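    Two of the classic effects such a multi-effect unit provides, soft-clipping distortion and a feedback delay, are easy to sketch offline as below; the parameter values and the synthetic decaying tone are illustrative assumptions, and the real-time codec path of the cited system is not modelled.

```python
import numpy as np

def soft_clip(x, drive=5.0):
    """Distortion: nonlinear soft clipping of the input waveform."""
    return np.tanh(drive * x) / np.tanh(drive)

def feedback_delay(x, fs, delay_s=0.25, feedback=0.4, mix=0.5):
    """Echo effect: delayed copies of the signal fed back onto themselves."""
    d = int(delay_s * fs)
    y = np.copy(x)
    for n in range(d, len(x)):
        y[n] += feedback * y[n - d]
    return (1 - mix) * x + mix * y

fs = 44100
t = np.arange(fs) / fs
guitar = 0.5 * np.sin(2 * np.pi * 196.0 * t) * np.exp(-3 * t)  # decaying G3 tone
processed = feedback_delay(soft_clip(guitar), fs)
```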

  19. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from multi channel plate (MCP) based time-of-flight detectors and ionization based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms a part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse processing algorithms suited to this application are described in detail.

  20. Digital Processing and Segmentation of Breast Microcalcifications Images Obtained by a Si Microstrips Detector: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Díaz, Claudia. C.; Angulo, Abril A.

    2007-02-01

    We present the preliminary results of digital processing and segmentation of breast microcalcification images. They were obtained using a Bede X-ray tube with Cu anode, which was fixed at 20 kV and 1 mA. Different biopsies were scanned using a 128-strip Si microstrip detector. Each total scan resulted in a data matrix, which corresponded to the image of each biopsy. We manipulated the contrast of the images using histograms and filters in the frequency domain in Matlab. We then investigated different contour models for the segmentation of microcalcification boundaries, based on the contrast and shape of the image. These algorithms could be applied to mammographic images, which may be obtained by digital mammography or by digitizing conventional mammograms.
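    The contrast manipulation and frequency-domain filtering mentioned above can be sketched generically: a linear histogram stretch combined with a simple high-pass emphasis. The cut-off frequency, gain and placeholder scan matrix below are assumptions rather than the parameters used in the study.

```python
import numpy as np

def highpass_sharpen(image, cutoff=0.05, gain=1.5):
    """Emphasize small bright structures (e.g. microcalcifications) by adding
    back a high-pass filtered copy of the image in the frequency domain."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    highpass = (np.sqrt(fx**2 + fy**2) > cutoff).astype(float)
    detail = np.real(np.fft.ifft2(np.fft.fft2(image) * highpass))
    return image + gain * detail

def stretch_histogram(image):
    """Linear contrast stretch between the 1st and 99th percentiles."""
    lo, hi = np.percentile(image, [1, 99])
    return np.clip((image - lo) / (hi - lo), 0, 1)

scan = np.random.rand(128, 128)           # placeholder for a detector scan matrix
enhanced = stretch_histogram(highpass_sharpen(scan))
```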

  1. Pre-Processing of Point-Data from Contact and Optical 3D Digitization Sensors

    PubMed Central

    Budak, Igor; Vukelić, Djordje; Bračun, Drago; Hodolič, Janko; Soković, Mirko

    2012-01-01

    Contemporary 3D digitization systems employed by reverse engineering (RE) feature ever-growing scanning speeds with the ability to generate a large quantity of points in a unit of time. Although advantageous for the quality and efficiency of RE modelling, the huge number of data points can turn into a serious practical problem later on, when the CAD model is generated. In addition, 3D digitization processes are very often plagued by measuring errors, which can be attributed to the very nature of measuring systems, various characteristics of the digitized objects and subjective errors by the operator, which also contribute to problems in the CAD model generation process. This paper presents an integral system for the pre-processing of point data, i.e., filtering, smoothing and reduction, based on a cross-sectional RE approach. In the course of the proposed system development, major emphasis was placed on the module for point data reduction, which was designed according to a novel approach with integrated deviation analysis and fuzzy logic reasoning. The developed system was verified through its application on three case studies, on point data from objects of versatile geometries obtained by contact and laser 3D digitization systems. The obtained results demonstrate the effectiveness of the system. PMID:22368513
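    The deviation-based reduction idea can be illustrated on a single 2-D cross-section: a point is kept only if it deviates from the local chord by more than a tolerance. The sketch below is a simplified heuristic under an assumed tolerance and synthetic profile, and does not reproduce the paper's integrated fuzzy-logic reasoning.

```python
import numpy as np

def reduce_section(points, tol):
    """Keep a point only if it deviates from the chord between the last
    retained point and the next point by more than the tolerance."""
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        a, b, p = kept[-1], points[i + 1], points[i]
        chord = b - a
        # Perpendicular distance of p from the chord a-b (2-D cross product).
        cross = chord[0] * (p - a)[1] - chord[1] * (p - a)[0]
        dist = abs(cross) / (np.linalg.norm(chord) + 1e-12)
        if dist > tol:
            kept.append(p)
    kept.append(points[-1])
    return np.array(kept)

# Synthetic noisy cross-section of a digitized surface profile.
x = np.linspace(0, 10, 500)
profile = np.column_stack([x, np.sin(x) + np.random.normal(0, 0.002, x.size)])
reduced = reduce_section(profile, tol=0.01)
print(f"{len(profile)} points reduced to {len(reduced)}")
```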

  2. Knowledge-based system for computer-aided process planning of laser sensor 3D digitizing

    NASA Astrophysics Data System (ADS)

    Bernard, Alain; Davillerd, Stephane; Sidot, Benoit

    1999-11-01

    This paper introduces some results of a research work carried out on the automation of the digitizing process for complex parts using a precision 3D laser sensor. Indeed, most of the digitizing operations are generally still performed manually. As a result, redundancies, gaps or omissions in point acquisition are possible. Moreover, the digitization time of a part, i.e. the immobilization time of the machine, is thus not optimized overall. So it is important, for time compression during product development, to minimize the time consumed by the reverse engineering step. A new way to scan a complex 3D part automatically is presented in order to measure and compare the acquired data with the reference CAD model. After introducing digitization, the environment used for the experiments is presented, based on a CMM machine and a plane laser sensor. Then the proposed strategy for adapting this environment to robotic CAD software is introduced, in order to be able to simulate and validate 3D laser-scanning paths. The CAPP (Computer Aided Process Planning) system used for the automatic generation of the laser scanning process is also presented.

  3. Digital processing of the Mariner 10 images of Venus and Mercury

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.

    1977-01-01

    An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.

  4. Engineering aesthetics and aesthetic ergonomics: theoretical foundations and a dual-process research methodology.

    PubMed

    Liu, Yili

    Although industrial and product designers are keenly aware of the importance of design aesthetics, they make aesthetic design decisions largely on the basis of their intuitive judgments and "educated guesses". Whilst ergonomics and human factors researchers have made great contributions to the safety, productivity, ease-of-use, and comfort of human-machine-environment systems, aesthetics is largely ignored as a topic of systematic scientific research in human factors and ergonomics. This article discusses the need for incorporating the aesthetics dimension in ergonomics and proposes the establishment of a new scientific and engineering discipline that we can call "engineering aesthetics". This discipline addresses two major questions: How do we use engineering and scientific methods to study aesthetics concepts in general and design aesthetics in particular? How do we incorporate engineering and scientific methods in the aesthetic design and evaluation process? This article identifies two special features that distinguish aesthetic appraisal of products and system designs from aesthetic appreciation of art, and lays out a theoretical foundation as well as a dual-process research methodology for "engineering aesthetics". Sample applications of this methodology are also described.

  5. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards for overthrust exploration. However the seismic data recorded in many overthrust areas is disappointingly poor. Challenges such as rough topography, complex subsurface structure, presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  6. Digital Storytelling in a Science Curriculum: The Process of Digital Storytelling to Help the Needs of Fourth Grade Students Understand the Concepts of Food Chains

    NASA Astrophysics Data System (ADS)

    Titus, Una-Bellelinda

    In this study I investigate whether the digital storytelling process will help fourth grade students in an elementary school setting learn science concepts, specifically food chains. I focused on three students who varied in social and academic skills/behaviors to investigate their process in working on a digital story. My findings proved that digital storytelling scripts, storyboards, and graphic organizers helped students create a story about what happened in their food chain, but students could not retain the information on food chains to help them on their post-test. The graphic organizers were able to scaffold and help organize students' thinking. The digital scripts allowed students to comprehend science concepts and explain them to peers.

  7. Parallel distributed processing and neural networks: origins, methodology and cognitive functions.

    PubMed

    Parks, R W; Long, D L; Levine, D S; Crockett, D J; McGeer, E G; McGeer, P L; Dalton, I E; Zec, R F; Becker, R E; Coburn, K L

    1991-10-01

    Parallel Distributed Processing (PDP), a computational methodology with origins in Associationism, is used to provide empirical information regarding neurobiological systems. Recently, supercomputers have enabled neuroscientists to model brain behavior-relationships. An overview of supercomputer architecture demonstrates the advantages of parallel over serial processing. Histological data provide physical evidence of the parallel distributed nature of certain aspects of the human brain, as do corresponding computer simulations. Whereas sensory networks follow more sequential neural network pathways, in vivo brain imaging studies of attention and rudimentary language tasks appear to involve multiple cortical and subcortical areas. Controversy remains as to whether associative models or Artificial Intelligence symbolic models better reflect neural networks of cognitive functions; however, considerable interest has shifted towards associative models.

  8. Optimization of photo-Fenton process of RO concentrated coking wastewater using response surface methodology.

    PubMed

    Huiqing, Zhang; Chunsong, Ye; Xian, Zhang; Fan, Yang; Jun, Yang; Wei, Zhou

    2012-01-01

    The objective of this study was to investigate the removal of chemical oxygen demand (COD) from reverse osmosis (RO) concentrated coking wastewater by the photo-Fenton process. The optimum conditions for the photo-Fenton process were determined using a Box-Behnken design (BBD) and response surface methodology (RSM) to establish a predictive quadratic polynomial model, based on a single-factor test. The optimized parameters, validated by analysis of variance (ANOVA), were found to be an H2O2 concentration of 345.2 mg/L, a pH value of 4.1 and a reaction time of 103.5 minutes under ultraviolet irradiation. The experimental results for COD removal under the optimized conditions were in good agreement with the predicted values, with a deviation error of 3.2%. The results confirmed that RSM based on BBD was a suitable method to optimize the operating conditions for RO concentrated coking wastewater treatment.

  9. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  10. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology.

    PubMed

    Miles, Elizabeth N

    2006-03-17

    In 1996, Health & Safety introduced an incident investigation process called Learning to Look to Johnson & Johnson. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process analyzed involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication for work-related serious injuries and illness as well as lost workday cases to Corporate Headquarters within 72 h of the incident, with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all due to be filed electronically. It is incumbent on the principal investigator and his or her investigative teams to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented, and a full report submitted to Corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances, all causes were not identified. Process excellence was the approach used to study the issue. The team used six sigma DMAI2C methodologies to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that lead to system errors. This report will discuss the study findings, recommended improvements, and methods used to monitor the new improved process. PMID:16225990

  11. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology.

    PubMed

    Miles, Elizabeth N

    2006-03-17

    In 1996, Health & Safety introduced an incident investigation process called Learning to Look to Johnson & Johnson. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process analyzed involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication for work-related serious injuries and illness as well as lost workday cases to Corporate Headquarters within 72 h of the incident, with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all due to be filed electronically. It is incumbent on the principal investigator and his or her investigative teams to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented, and a full report submitted to Corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances, all causes were not identified. Process excellence was the approach used to study the issue. The team used six sigma DMAI2C methodologies to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that lead to system errors. This report will discuss the study findings, recommended improvements, and methods used to monitor the new improved process.

  12. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  13. On Process, Progress, Success and Methodology or the Unfolding of the Bologna Process as It Appears to Two Reasonably Benign Observers

    ERIC Educational Resources Information Center

    Neave, Guy; Amaral, Alberto

    2008-01-01

    This article examines the Bologna Process from two main perspectives: as a dynamic strategy as well as the unfolding of the methodology employed. It argues that the latter was largely determined by the former. Three phases of development are identified, the first two of which show that the methodology was largely determined by the need to bestow…

  14. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively.

  15. Optimization of permeabilization process of yeast cells for catalase activity using response surface methodology

    PubMed Central

    Trawczyńska, Ilona; Wójcik, Marek

    2015-01-01

    Biotransformation processes accompanied by whole yeast cells as biocatalyst are a promising area of the food industry. Among the chemical sanitizers currently used in food technology, hydrogen peroxide is a very effective microbicidal and bleaching agent. In this paper, permeabilization has been applied to Saccharomyces cerevisiae yeast cells aiming at increased intracellular catalase activity for the decomposition of H2O2. Ethanol, which is non-toxic, biodegradable and easily available, has been used as the permeabilization factor. Response surface methodology (RSM) has been applied in determining the influence of different parameters on the permeabilization process. The aim of the study was to find such values of the process parameters that would yield maximum activity of catalase during decomposition of hydrogen peroxide. The optimum operating conditions for the permeabilization process obtained by RSM were as follows: 53% (v/v) ethanol concentration, temperature of 14.8 °C and treatment time of 40 min. After permeabilization, the activity of catalase increased ca. 40 times and its maximum value equalled 4711 U/g. PMID:26019618

  16. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    PubMed Central

    Arulmathi, P.; Elangovan, G.; Begum, A. Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  17. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology.

    PubMed

    Arulmathi, P; Elangovan, G; Begum, A Farjana

    2015-01-01

    Distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables and chemical oxygen demand (COD) and color removal efficiency were considered as response variable for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using Box-Behnken response surface design (BBD). The results showed that electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  18. Integrated circuit layout design methodology for deep sub-wavelength processes

    NASA Astrophysics Data System (ADS)

    Torres Robles, Juan Andres

    One of the critical aspects of semiconductor fabrication is the patterning of multiple design layers onto silicon wafers. Since 180nm processes came online, the semiconductor industry has operated under conditions in which the critical features are smaller than the wavelength of light used during the patterning process. Such sub-wavelength conditions present many challenges because topology, rather than feature width and space, defines the yield characteristics of the devices. Pattern variability can contribute as much as 80% of the total timing margins defined by traditional SPICE corner models. Because feature variability is undesirable from electrical considerations, this work proposes a physical design verification methodology that emphasizes pattern robustness to process variations. This new method is based on a framework composed of manufacturability objects, operators and guidelines, which permits the definition of a scoring system ranking the manufacturing process and the manufacturability of the designs. This framework is intended to alleviate circuit design and verification challenges and is based on three new concepts: the first relates to compact process model requirements. The second involves the definition of a new design object, called pv-Band, which reflects layout sensitivity to process variations. The third is the specification of two manufacturability metrics that, when optimized, can improve yield by accounting for layout sensitivities across multiple design levels (e.g., Active, polysilicon, contact, metal 1, etc.). By integrating these new concepts (process models, pv-Bands and manufacturability metrics) with existing knowledge, this work moves forward the state-of-the-art of physical design and verification of integrated circuits subject to sub-wavelength effects.

  19. Electronic polarization-division demultiplexing based on digital signal processing in intensity-modulation direct-detection optical communication systems.

    PubMed

    Kikuchi, Kazuro

    2014-01-27

    We propose a novel configuration of optical receivers for intensity-modulation direct-detection (IM · DD) systems, which can cope with dual-polarization (DP) optical signals electrically. Using a Stokes analyzer and a newly-developed digital signal-processing (DSP) algorithm, we can achieve polarization tracking and demultiplexing in the digital domain after direct detection. Simulation results show that the power penalty stemming from digital polarization manipulations is negligibly small.

  20. A Subjective Assessment of Perceived Clarity of Indirect Digital Images and Processed Digital Images with Conventional Intra-oral Periapical Radiographs

    PubMed Central

    Malleshi, Suchetha N.; V.G., Mahima; Raina, Anudeepa; Patil, Karthikeya

    2013-01-01

    Objectives: To compare and analyze the perceived clarity and diagnostic value of Conventional periapical Radiographs (CRs) with those of their Digitized Periapical Images (DIs) and Processed Digitized Periapical Images (PDIs) counterparts. Material and Methods: Forty two intraoral periapical radiographs of patients with clinically suspected periapical pathosis were made to constitute the group of CRs. These were photographed by using a Canon Power Shot SD500 (7.1 Megapixel) digital camera and the unaltered images were transferred to a computer laptop, to form the group of DIs. Subsequently, the contrast and brightness of these images were modified to represent the group of PDIs. Two experienced oral radiologists independently evaluated 5 specific apical and periapical region parameters of all the 42 CRs, DIs and PDIs for perceived image quality and diagnostic value and graded them on a three point grading scale. Conventional radiographs served as the control. Data were analyzed by using paired t-test and Kappa analysis. Results: The clarity and diagnostic quality of the PDIs were statistically significant as compared to those of their conventional counterparts. In comparison, the DIs group fared badly, with deterioration of the image quality. The interobserver agreement was good and all the results were statistically significant. Conclusion: Indirectly digitizing the radiographs by employing a digital camera and further digitally processing the images resulted in an improvement in their perceived clarity and they enhanced their diagnostic properties. PMID:24086916

  1. Optimization of enzymatic process for vanillin extraction using response surface methodology.

    PubMed

    Gu, Fenglin; Xu, Fei; Tan, Lehe; Wu, Huasong; Chu, Zhong; Wang, Qinghuang

    2012-01-01

    Vanillin was extracted from vanilla beans using pretreatment with cellulase to produce enzymatic hydrolysis, and response surface methodology (RSM) was applied to optimize the processing parameters of this extraction. The effects of heating time, enzyme quantity and temperature on enzymatic extraction of vanillin were evaluated. Extraction yield (mg/g) was used as the response value. The results revealed that the increase in heating time and the increase in enzyme quantity (within certain ranges) were associated with an enhancement of extraction yield, and that the optimal conditions for vanillin extraction were: Heating time 6 h, temperature 60 °C and enzyme quantity 33.5 mL. Calculated from the final polynomial functions, the optimal response of vanillin extraction yield was 7.62 mg/g. The predicted results for optimal reaction conditions were in good agreement with experimental values.

  2. Process and methodology of developing Cassini G and C Telemetry Dictionary

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    While the Cassini spacecraft telemetry design had taken on the new approach of 'packetized telemetry', the AACS (Attitude and Articulation Subsystem) had further extended into the design of 'mini-packets' in its telemetry system. Such telemetry packet and mini-packet design produced the AACS Telemetry Dictionary; iterations of the latter in turn provided changes to the former. The ultimate goals were to achieve maximum telemetry packing density, optimize the 'freshness' of more time-critical data, and to effect flexibility, i.e., multiple AACS data collection schemes, without needing to change the overall spacecraft telemetry mode. This paper describes such a systematic process and methodology, evidenced by various design products related to, or as part of, the AACS Telemetry Dictionary.

  3. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  4. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to develop cooperation between world-class experts, young scientists, students and post-graduate students, and to exchange information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by: Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University), Saint Petersburg (Russia); Chernyshevsky Saratov State University, Saratov (Russia); and the National Research Nuclear University "MEPhI" (NRNU MEPhI), Moscow (Russia). The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  5. Searching early bone metastasis on plain radiography by using digital imaging processing

    NASA Astrophysics Data System (ADS)

    Jaramillo-Núñez, A.; Pérez-Meza, M.

    2012-10-01

    Some authors mention that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis discomfort in the right shoulder. The time period between the first and second radiographs was approximately one month, and between the first and the third, one year. This procedure is a first approach in order to determine whether, in this particular case, it was possible to detect an early bone metastasis. The obtained results suggest that by carrying out digital processing it is possible to detect the metastasis, since the radiograph contains the information even though it is not possible to observe it visually.

  6. Searching early bone metastasis on plain radiography by using digital imaging processing

    SciTech Connect

    Jaramillo-Nunez, A.; Perez-Meza, M.

    2012-10-23

    Some authors state that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis discomfort in the right shoulder. The time elapsed between the first and second radiographs was approximately one month, and between the first and the third, one year. This procedure is a first approach intended to determine whether, in this particular case, it was possible to detect an early bone metastasis. The results suggest that digital processing makes it possible to detect the metastasis, since the radiograph contains the information even though it cannot be observed visually.

  7. Recent developments at JPL in the application of digital image processing techniques to astronomical images

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.; Benton, W. D.

    1976-01-01

    Several techniques of a digital image-processing nature are illustrated which have proved useful in visual analysis of astronomical pictorial data. Processed digital scans of photographic plates of Stephan's Quintet and NGC 4151 are used as examples to show how faint nebulosity is enhanced by high-pass filtering, how foreground stars are suppressed by linear interpolation, and how relative color differences between two images recorded on plates with different spectral sensitivities can be revealed by generating ratio images. Analyses are outlined which are intended to compensate partially for the blurring effects of the atmosphere on images of Stephan's Quintet and to obtain more detailed information about Saturn's ring structure from low- and high-resolution scans of the planet and its ring system. The employment of a correlation picture to determine the tilt angle of an average spectral line in a low-quality spectrum is demonstrated for a section of the spectrum of Uranus.
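
    The high-pass filtering and ratio-image operations mentioned above can be sketched in a few lines of array code. The following is an illustrative sketch only, with synthetic arrays standing in for the digitized plates and a simple box blur standing in for whatever smoothing kernel the original JPL processing used.

      # Sketch of two operations from the abstract: high-pass filtering to bring out faint
      # extended structure, and a ratio image between two registered plates with different
      # spectral sensitivities. Inputs are synthetic placeholders; the box blur is an assumption.
      import numpy as np

      def box_blur(img: np.ndarray, size: int = 15) -> np.ndarray:
          kernel = np.ones(size) / size
          tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
          return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, tmp)

      def high_pass(img: np.ndarray) -> np.ndarray:
          return img - box_blur(img)          # removes the smooth background, keeps faint detail

      def ratio_image(plate_a: np.ndarray, plate_b: np.ndarray, eps: float = 1e-6) -> np.ndarray:
          return plate_a / (plate_b + eps)    # relative color differences between two plates

      rng = np.random.default_rng(3)
      plate_blue = rng.uniform(0.1, 1.0, (128, 128))
      plate_red = plate_blue * 0.8 + rng.normal(0, 0.02, plate_blue.shape)
      print(high_pass(plate_blue).std().round(3), ratio_image(plate_red, plate_blue).mean().round(3))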

  8. Two-dimensional quantification of the corrosion process in metal surfaces using digital speckle pattern interferometry

    SciTech Connect

    Andres, N.; Lobera, J.; Arroyo, M. P.; Angurel, L. A.

    2011-04-01

    The applicability of digital speckle pattern interferometry (DSPI) to the analysis of surface corrosion processes has been evaluated by studying the evolution of an Fe surface immersed in sulfuric acid. This work describes the analysis process required to obtain quantitative information about the corrosion process. It has been possible to evaluate the corrosion rate, and the results agree with those derived from the weight loss method. In addition, a two-dimensional analysis has been applied, showing that DSPI measurements can be used to extract information about the corrosion rate at any region of the surface.

  9. Accuracy, security, and processing time comparisons of biometric fingerprint recognition system using digital and optical enhancements

    NASA Astrophysics Data System (ADS)

    Alsharif, Salim; El-Saba, Aed; Jagapathi, Rajendarreddy

    2011-06-01

    Fingerprint recognition is one of the most commonly used forms of biometrics and has been widely used in daily life due to its feasibility, distinctiveness, permanence, accuracy, reliability, and acceptability. Besides cost, issues related to accuracy, security, and processing time in practical biometric recognition systems represent the most critical factors that make these systems widely acceptable. Accurate and secure biometric systems often require sophisticated enhancement and encoding techniques that burden the overall processing time of the system. In this paper we present a comparison between common digital and optical enhancement/encoding techniques with respect to their accuracy, security and processing time, when applied to biometric fingerprint systems.

  10. The Development of a Digital Processing System for Accurate Range Determinations. [for Teleoperator Maneuvering Systems

    NASA Technical Reports Server (NTRS)

    Pujol, A., Jr.

    1983-01-01

    The development of an accurate close range (from 0.0 meters to 30.0 meters) radar system for Teleoperator Maneuvering Systems (TMS) is discussed. The system under investigation is a digital processor that converts incoming signals from the radar system into their related frequency spectra. Identification will be attempted by correlating spectral characteristics with accurate range determinations. The system will utilize an analog to digital converter for sampling and converting the signal from the radar system into 16-bit digital words (two bytes) for RAM storage, data manipulations, and computations. To remove unwanted frequency components the data will be retrieved from RAM and digitally filtered using large scale integration (LSI) circuits. Filtering will be performed by a biquadratic routine within the chip which carries out the required filter algorithm. For conversion to a frequency spectrum the filtered data will be processed by a Fast Fourier Transform chip. Analysis and identification of spectral characteristics for accurate range determinations will be made by microcomputer computations.
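
    As a rough illustration of the spectral approach described above, the sketch below estimates range from the dominant frequency of a block of sampled radar data, assuming an FMCW-style linear mapping between beat frequency and range. The sampling rate and frequency-to-range scale factor are hypothetical, and the digital filtering and LSI hardware stages of the actual processor are not modeled.

      # Minimal sketch (not the paper's processor): estimate range from the dominant frequency
      # of one block of radar samples. All parameter values are illustrative assumptions.
      import numpy as np

      FS = 50_000.0          # sampling rate, Hz (assumed)
      HZ_PER_METER = 200.0   # beat-frequency-to-range scale factor (hypothetical)

      def estimate_range(samples: np.ndarray) -> float:
          """Return an estimated range in meters from one block of radar samples."""
          x = samples - samples.mean()        # remove DC
          x *= np.hanning(len(x))             # window before the FFT
          spectrum = np.abs(np.fft.rfft(x))
          freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
          peak_hz = freqs[np.argmax(spectrum)]
          return peak_hz / HZ_PER_METER

      # Synthetic test: a 2.0 kHz tone maps to 10 m with the assumed scale factor.
      t = np.arange(4096) / FS
      print(round(estimate_range(np.sin(2 * np.pi * 2000.0 * t)), 2))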

  11. Equalization-enhanced phase noise for coherent-detection systems using electronic digital signal processing.

    PubMed

    Shieh, William; Ho, Keang-Po

    2008-09-29

    In coherent optical systems employing electronic digital signal processing, fiber chromatic dispersion can be gracefully compensated in the electronic domain without resorting to optical techniques. Unlike an optical dispersion compensator, however, the electronic equalizer enhances the impairments from laser phase noise. This equalization-enhanced phase noise (EEPN) imposes a tighter constraint on the receive laser phase noise for transmission systems with a high symbol rate and large electronically compensated chromatic dispersion.

  12. Processing of post-consumer HDPE reinforced with wood flour: Effect of functionalization methodology

    NASA Astrophysics Data System (ADS)

    Catto, A. L.; Montagna, L. S.; Rossini, K.; Santana, R. M. C.

    2014-05-01

    A very interesting route for reusing waste cellulose derivatives such as wood flour is their incorporation into a thermoplastic matrix. Since olefinic polymers have no interaction with cellulose derivatives, chemical treatments have been used to modify vegetable fibers and increase the interfacial adhesion between the cellulosic reinforcement and the polymeric matrix. In this sense, the objective of this study was to evaluate the influence of the methodology used to incorporate the compatibilizer agent (CA) into the polyolefin matrix and to evaluate the mechanical and morphological properties of the composites. HDPE, wood flour from the Eucalyptus grandis species (EU) and polyethylene grafted with maleic anhydride (CA) were used in the composites, which were extruded and then injection molded. The mixtures were processed in a single-screw extruder (L/D: 22) with a temperature profile from 170°C to 190°C. In a first step, the materials were processed together in the extruder, and the samples were then injected at a temperature of 185°C and a pressure of 600 bar. In a second step, the HDPE and the compatibilizer agent were first processed in the extruder in order to functionalize the polyolefin, and the sieved wood flour (EU) was added afterwards (30% w/w). Results showed that composites with CA had higher mechanical performance than the non-compatibilized ones. They also showed that composites previously compatibilized in the extruder with CA performed better than those in which the polymer matrix was not previously compatibilized.

  13. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
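
    A greatly simplified sketch of the threshold-normalized DCT error idea is given below: both sequences are transformed blockwise, divided by per-coefficient visual thresholds, differenced, and pooled with a Minkowski norm. The threshold matrix and pooling exponent are placeholders, and the color transformation, temporal filtering and contrast-masking stages of the patented method are omitted.

      # Simplified sketch of a DVQ-like score; not the patented algorithm.
      import numpy as np
      from scipy.fft import dctn

      BLOCK = 8
      THRESHOLDS = np.full((BLOCK, BLOCK), 4.0)   # hypothetical per-coefficient visual thresholds
      POOL_EXPONENT = 4.0                         # hypothetical Minkowski pooling exponent

      def blockwise_dct(frame: np.ndarray) -> np.ndarray:
          h, w = frame.shape
          h -= h % BLOCK
          w -= w % BLOCK
          blocks = frame[:h, :w].reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK).swapaxes(1, 2)
          return dctn(blocks, axes=(-2, -1), norm="ortho")

      def dvq_like_score(reference: np.ndarray, processed: np.ndarray) -> float:
          err = (blockwise_dct(reference) - blockwise_dct(processed)) / THRESHOLDS
          return float(np.mean(np.abs(err) ** POOL_EXPONENT) ** (1.0 / POOL_EXPONENT))

      rng = np.random.default_rng(0)
      ref = rng.uniform(0, 255, (64, 64))
      print(round(dvq_like_score(ref, ref + rng.normal(0, 2, ref.shape)), 3))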

  14. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topographic modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends extracting from the original LANDSAT data the component resulting from target reflectance. Considering that the contribution of topographic modulation to the pixel information will vary with solar incidence angle, the results of this digital processing technique will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing on the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique was more appropriate for MSS data acquired under higher Sun elevation angles. Applied to data with low Sun elevation angles, the topographic modulation components lessen rather than enhance topography.

  15. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT).

  16. Interplay of formulation and process methodology on the extent of nifedipine molecular dispersion in polymers.

    PubMed

    Huang, Jingjun; Li, Ying; Wigent, Rodney J; Malick, Waseem A; Sandhu, Harpreet K; Singhal, Dharmendra; Shah, Navnit H

    2011-11-25

    The aim of this study is to evaluate effects of formulation and process technology on drug molecular dispersibility in solid dispersions (SDs). Nifedipine solid dispersions with ethylcellulose (EC) and/or Eudragit RL (RL) prepared by co-precipitation, co-evaporation, and fusion methods were characterized with FTIR, DSC, and XRPD for the content of nifedipine as molecular dispersion, amorphous and/or crystalline suspensions. A method was developed based on regular solution and Flory-Huggins theories to calculate drug-polymer interaction parameter in solid dispersion systems. A synergic effect of RL and EC on nifedipine molecular dispersibility in solid dispersions was observed. Increasing RL/EC ratio resulted in a higher degree of drug-polymer interaction that thermodynamically favored molecular dispersion, which, however, was counteracted by a corresponding decrease in the matrix glass transition point that kinetically favored phase-separation. Process methodology was found to play an important role in the formation of amorphous SD. The ranking of technologies with respect to the extent of molecular dispersion from high to low is fusion>co-evaporation>co-precipitation, wherein the solidification rate of polymeric solution and non-solvent effects were linked to kinetic entrapment of drug molecules in polymeric networks. Since nifedipine molecular dispersibility in EC/RL polymer(s) is a result of interplay between thermodynamic and kinetic factors, nifedipine molecular dispersions prepared for this study are thermodynamically metastable systems. To explore those supersaturation systems for use in drug delivery of poorly water soluble drugs, it is critical to balance drug-polymer interactions and matrix glass transition point and to consider a process technology with a fast solidification rate during formulation and process development of amorphous SD.

  17. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    PubMed

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources.

  18. Process optimization of electrospun polycaprolactone and nanohydroxyapatite composite nanofibers using response surface methodology.

    PubMed

    Doustgani, A; Vasheghani-Farahani, E; Soleimani, M; Hashemi-Najafabadi, S

    2013-07-01

    Electrospinning is a process that produces continuous polymer fibers in the sub-micron range through the action of an external electric field imposed on a polymer solution or melt. In this study, the effects of process parameters on the mean diameter of electrospun polycaprolactone and nanohydroxyapatite (nHA) composite nanofibers were investigated. The fiber morphology and mean fiber diameter of the prepared nanofibers were examined by scanning electron microscopy. Response surface methodology (RSM) was utilized to design the experiments at the settings of nHA concentration, applied voltage, spinning distance and the flow rate of the polymer solution. It was also used to find and evaluate a quantitative relationship between electrospinning parameters and average fiber diameters. Mean fiber diameter was correlated to these variables using a third-order polynomial function. The R-square value for the model was 0.96, which indicates that 96% of the variability in the dependent variable could be explained and only 4% of the total variation could not be explained by the model. It was found that nHA concentration, applied voltage and spinning distance were the most effective parameters and the sole effect of flow rate was not important. The predicted fiber diameters were in good agreement with the experimental results.
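
    The response-surface step can be illustrated with a short fitting sketch: build a cubic polynomial design matrix over the four process variables, fit it by least squares, and report R-squared. The data below are synthetic placeholders, not the electrospinning measurements of the study.

      # Illustrative sketch of a third-order response-surface fit; synthetic data only.
      import numpy as np
      from itertools import combinations_with_replacement

      def design_matrix(X: np.ndarray, degree: int) -> np.ndarray:
          """Polynomial design matrix: intercept, linear, cross and power terms."""
          cols = [np.ones(len(X))]
          for d in range(1, degree + 1):
              for idx in combinations_with_replacement(range(X.shape[1]), d):
                  cols.append(np.prod(X[:, idx], axis=1))
          return np.column_stack(cols)

      rng = np.random.default_rng(1)
      # Columns: nHA concentration (%), voltage (kV), spinning distance (cm), flow rate (mL/h)
      X = rng.uniform([0, 10, 10, 0.2], [10, 25, 25, 1.0], size=(60, 4))
      y = 200 + 8 * X[:, 0] - 3 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 5, 60)  # synthetic diameters, nm

      A = design_matrix(X, degree=3)
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      y_hat = A @ coef
      r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)
      print(f"R-squared of the cubic response surface: {r2:.3f}")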

  19. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by vacuum drying process under different drying conditions. Box-Behnken experimental design with response surface methodology was used for optimization of drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of simultaneously optimized responses were temperature of 70.2 °C, pressure of 39 mbar and drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to lyophilized sample and better than conventionally dried sample.

  20. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by vacuum drying process under different drying conditions. Box-Behnken experimental design with response surface methodology was used for optimization of drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of simultaneously optimized responses were temperature of 70.2 °C, pressure of 39 mbar and drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to lyophilized sample and better than conventionally dried sample. PMID:26948639
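
    Once a second-order model has been fitted, the optimization step amounts to searching the experimental region for the settings that maximize the predicted response. The sketch below does this with a generic optimizer over an invented quadratic surface; the coefficients are illustrative and are not the fitted model of this study.

      # Sketch of the optimization step in an RSM study; the quadratic model is invented.
      import numpy as np
      from scipy.optimize import minimize

      def quality(x: np.ndarray) -> float:
          t, p, h = x
          # Hypothetical second-order response surface (higher is better).
          return (-0.01 * (t - 70) ** 2 - 0.0001 * (p - 50) ** 2 - 0.2 * (h - 8) ** 2
                  + 0.0005 * t * p + 10.0)

      bounds = [(48, 78), (30, 330), (8, 16)]    # experimental ranges from the design
      x0 = np.array([63.0, 180.0, 12.0])         # centre of the design region
      res = minimize(lambda x: -quality(x), x0, bounds=bounds)
      print("optimal (T, P, t):", np.round(res.x, 1), "predicted quality:", round(quality(res.x), 2))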

  1. Methodology to assess the environmental impact of a product and its processes

    NASA Astrophysics Data System (ADS)

    Kumar, K. R.; Lee, Dongwon; Malhotra, Arvind

    2001-02-01

    This study presents a methodology for capturing the environmental impact of a product and its processes throughout the life cycle in discrete part manufacturing. The objectives of the study are to identify opportunities to enhance environmental friendliness of a product in its design stage, and assess whether the environmental impact has actually been reduced or has simply been shifted elsewhere in the life cycle of the product. Using the bill of materials and the process route sheet, we build the environmental status of its operations as a vector of measurable attributes, categorized under the taxonomy of social, ecological, and economic impact that can be aggregated and evaluated at a business unit level. The vector of social impact deals with the effects of materials used and wastes produced on people through the life cycle. The vector of ecological impact consists of effects of recycling, reuse, and remanufacturing of a product based on the notion of materials balance. Finally, the vector of economic impact represents the conversion of the previous two vectors into managerially relevant costs to the firm, expressed in dollar amounts, so that managers in any position can readily appraise their operations and communicate with each other in the same language.

  2. Cointegration methodology for psychological researchers: An introduction to the analysis of dynamic process systems.

    PubMed

    Stroe-Kunold, Esther; Gruber, Antje; Stadnytska, Tetiana; Werner, Joachim; Brosig, Burkhard

    2012-11-01

    Longitudinal data analysis focused on internal characteristics of a single time series has attracted increasing interest among psychologists. The systemic psychological perspective suggests, however, that many long-term phenomena are mutually interconnected, forming a dynamic system. Hence, only multivariate methods can handle such human dynamics appropriately. Unlike the majority of time series methodologies, the cointegration approach allows interdependencies of integrated (i.e., extremely unstable) processes to be modelled. This advantage results from the fact that cointegrated series are connected by stationary long-run equilibrium relationships. Vector error-correction models are frequently used representations of cointegrated systems. They capture both this equilibrium and compensation mechanisms in the case of short-term deviations due to developmental changes. Thus, the past disequilibrium serves as explanatory variable in the dynamic behaviour of current variables. Employing empirical data from cognitive psychology, psychosomatics, and marital interaction research, this paper describes how to apply cointegration methods to dynamic process systems and how to interpret the parameters under investigation from a psychological perspective.

  3. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined into its essential components. First, three essential dimensions of static complexity are investigated, including interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement for static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.
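
    The final aggregation step can be illustrated with a short sketch of a SMART-style weighted sum over normalized dimension scores; the scores and weights below are assumed values for illustration only.

      # Minimal sketch of SMART-style weighted-sum aggregation; all values are placeholders.
      def smart_aggregate(scores: dict[str, float], weights: dict[str, float]) -> float:
          """Weighted-sum aggregation of normalized (0-1) dimension scores."""
          total_weight = sum(weights.values())
          return sum(scores[k] * weights[k] for k in scores) / total_weight

      scores = {"interconnective": 0.62, "connection_strength": 0.40, "variety": 0.75}   # assumed
      weights = {"interconnective": 4.0, "connection_strength": 3.0, "variety": 2.0}     # assumed
      print(round(smart_aggregate(scores, weights), 3))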

  4. Digital Audio Signal Processing and Nde: AN Unlikely but Valuable Partnership

    NASA Astrophysics Data System (ADS)

    Gaydecki, Patrick

    2008-02-01

    In the Digital Signal Processing (DSP) group, within the School of Electrical and Electronic Engineering at The University of Manchester, research is conducted into two seemingly distinct and disparate subjects: instrumentation for nondestructive evaluation, and DSP systems & algorithms for digital audio. We have often found that many of the hardware systems and algorithms employed to recover, extract or enhance audio signals may also be applied to signals provided by ultrasonic or magnetic NDE instruments. Furthermore, modern DSP hardware is so fast (typically performing hundreds of millions of operations per second), that much of the processing and signal reconstruction may be performed in real time. Here, we describe some of the hardware systems we have developed, together with algorithms that can be implemented both in real time and offline. A next generation system has now been designed, which incorporates a processor operating at 0.55 Giga MMACS, six input and eight output analogue channels, digital input/output in the form of S/PDIF, a JTAG and a USB interface. The software allows the user, with no knowledge of filter theory or programming, to design and run standard or arbitrary FIR, IIR and adaptive filters. Using audio as a vehicle, we can demonstrate the remarkable properties of modern reconstruction algorithms when used in conjunction with such hardware; applications in NDE include signal enhancement and recovery in acoustic, ultrasonic, magnetic and eddy current modalities.
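
    As an offline illustration of the kind of filter design and filtering the described hardware performs in real time, the sketch below builds a low-pass FIR filter, checks its gain in and out of band, and applies it to a noisy test signal. The sampling rate, cut-off and tap count are illustrative choices, not parameters of the system described.

      # Offline FIR filtering sketch; parameter values are assumptions.
      import numpy as np
      from scipy.signal import firwin, freqz, lfilter

      FS = 48_000.0                                        # sampling rate, Hz (assumed)
      taps = firwin(numtaps=101, cutoff=5_000.0, fs=FS)    # 101-tap low-pass FIR, 5 kHz cut-off

      # Check the response at an in-band and an out-of-band frequency.
      _, h = freqz(taps, worN=[1_000.0, 15_000.0], fs=FS)
      print("gain at 1 kHz / 15 kHz:", np.round(np.abs(h), 4))

      # Apply the filter to a noisy test signal (offline; the hardware does this in real time).
      t = np.arange(0, 0.05, 1.0 / FS)
      noisy = np.sin(2 * np.pi * 1_000.0 * t) + 0.5 * np.sin(2 * np.pi * 15_000.0 * t)
      filtered = lfilter(taps, 1.0, noisy)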

  5. Automatic Rice Crop Height Measurement Using a Field Server and Digital Image Processing

    PubMed Central

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-01

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required. PMID:24451465
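
    The thresholding and height-measurement steps can be illustrated with a toy sketch: threshold the image to isolate the bright marker bar, count the rows in which it is visible, and convert the loss of visible bar height into crop height. The images and the pixel-to-centimetre calibration below are synthetic stand-ins for the field-server photographs.

      # Toy sketch of the thresholding and height-measurement steps; inputs are synthetic.
      import numpy as np

      CM_PER_PIXEL = 0.5   # assumed calibration from the known marker-bar length

      def visible_marker_height(image: np.ndarray, threshold: float = 0.5) -> int:
          mask = image > threshold                    # thresholding step
          rows_with_marker = np.flatnonzero(mask.any(axis=1))
          return int(rows_with_marker.size)           # visible bar height in pixel rows

      def crop_height_cm(initial: np.ndarray, current: np.ndarray) -> float:
          hidden_rows = visible_marker_height(initial) - visible_marker_height(current)
          return max(hidden_rows, 0) * CM_PER_PIXEL

      day0 = np.zeros((200, 100)); day0[20:180, 48:52] = 1.0     # full bar visible
      day30 = np.zeros((200, 100)); day30[20:120, 48:52] = 1.0   # lower part hidden by the crop
      print("estimated crop height:", crop_height_cm(day0, day30), "cm")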

  6. Automatic Analysis for the Chemical Testing of Urine Examination Using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Vilardy, Juan M.; Peña, Jose C.; Daza, Miller F.; Torres, Cesar O.; Mattos, Lorenzo

    2008-04-01

    To perform the chemical testing of a urine examination, a dipstick is used, which contains pads incorporating the reagents for chemical reactions for the detection of a number of substances in the urine. Urine is added to the pads for reaction by dipping the dipstick into the urine and then slowly withdrawing it. The subsequent colorimetric reactions are timed to an endpoint; the extent of color formation is directly related to the level of the urine constituent. The colors can be read manually by comparison with color charts or with the use of automated reflectance meters. The aim of the system described in this paper is to analyze and determine automatically the changes of the colors in the dipstick when it is withdrawn from the urine sample, and to compare the results with color charts for the diagnosis of many common diseases such as diabetes. The system consists of: (a) a USB camera; (b) a computer; (c) Matlab v7.4 software. Image analysis begins with a digital capture of the image as data. Once the image is acquired in digital format, the data can be manipulated through digital image processing. Our objective was to develop a computerised image processing system and an interactive software package for the support of clinicians, medical research and medical students.
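
    A much-simplified sketch of the color-reading step is shown below: average the RGB values over a reagent-pad region and report the nearest entry of a reference color chart. The chart values and pad coordinates are invented placeholders, not clinical reference values, and the original system was implemented in Matlab rather than Python.

      # Simplified color-matching sketch; chart values and pad region are hypothetical.
      import numpy as np

      GLUCOSE_CHART = {            # hypothetical chart: mean pad color -> reported level
          "negative":  (120, 180, 200),
          "trace":     (140, 170, 150),
          "moderate":  (150, 140, 100),
          "high":      (160, 100,  60),
      }

      def read_pad(image: np.ndarray, region: tuple[slice, slice]) -> str:
          mean_rgb = image[region].reshape(-1, 3).mean(axis=0)
          distances = {label: np.linalg.norm(mean_rgb - np.array(ref))
                       for label, ref in GLUCOSE_CHART.items()}
          return min(distances, key=distances.get)

      frame = np.full((240, 320, 3), 255, dtype=float)
      frame[100:120, 40:60] = (152, 138, 102)                     # simulated reacted pad
      print(read_pad(frame, (slice(100, 120), slice(40, 60))))    # -> "moderate"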

  7. An integrated approach to safety-driven and ICT-enabled process reengineering: methodological advice and a case study.

    PubMed

    Langer, M; Castellari, R; Locatelli, P; Sini, E; Torresani, M; Facchini, R; Moser, R

    2014-01-01

    Patient safety is a central concern inside any healthcare environment. With the progress of Information and Communication Technologies (ICTs), new solutions have become available to support care and management processes. Analyzing process risks helps identify areas of improvement and provides ICT-solution design with indications of which portions of the process need primary interventions. Understanding the link between process reengineering, technology assessment of enabling technologies and risk management allows user acceptance and patient safety improvements. Fondazione IRCCS Istituto Nazionale dei Tumori (INT) offers a good example of process reengineering driven by the purpose of increasing patient safety, enabled by new technologies. A pillar of the evolution of ICT process support at INT is based on Radio Frequency Identification technologies, implemented to identify and trace items and people across processes. This paper will present an integrated approach, based on process reengineering methodologies and risk assessment studies, and methodological advice applied to a case of surgical kits management procedures.

  8. An integrated approach to safety-driven and ICT-enabled process reengineering: methodological advice and a case study.

    PubMed

    Langer, M; Castellari, R; Locatelli, P; Sini, E; Torresani, M; Facchini, R; Moser, R

    2014-01-01

    Patient safety is a central concern inside any healthcare environment. With the progress of Information and Communication Technologies (ICTs), new solutions have become available to support care and management processes. Analyzing process risks helps identify areas of improvement and provides ICT-solution design with indications of which portions of the process need primary interventions. Understanding the link between process reengineering, technology assessment of enabling technologies and risk management allows user acceptance and patient safety improvements. Fondazione IRCCS Istituto Nazionale dei Tumori (INT) offers a good example of process reengineering driven by the purpose of increasing patient safety, enabled by new technologies. A pillar of the evolution of ICT process support at INT is based on Radio Frequency Identification technologies, implemented to identify and trace items and people across processes. This paper will present an integrated approach, based on process reengineering methodologies and risk assessment studies, and methodological advice applied to a case of surgical kits management procedures. PMID:24943545

  9. Methodology used to produce an encoded 1:100,000-scale digital hydrographic data layer for the Pacific Northwest

    USGS Publications Warehouse

    Fisher, B.J.

    1996-01-01

    The U.S. Geological Survey (USGS) has produced a River Reach File data layer for the Pacific Northwest for use in water-resource management applications. The Pacific Northwest (PNW) River Reach Files, a geo-referenced river reach data layer at 1:100,000-scale, are encoded with the U.S. Environmental Protection Agency's (EPA) reach numbers. The encoding was a primary task of the River Reach project, because EPA's reach identifiers are also an integral hydrologic component in a regional Northwest Environmental Data Base, an ongoing effort by Federal and State agencies to compile information on reach-specific resources on rivers in Oregon, Idaho, Washington, and western Montana. A unique conflation algorithm was developed by the USGS to transfer the EPA reach codes and other meaningful attributes from the 1:250,000-scale EPA TRACE graphic files to the PNW Reach Files. The PNW Reach Files also were designed so that reach-specific information upstream or downstream from a point in the stream network could be extracted from feature attribute tables or from a Geographic Information System. This report documents the methodology used to create this 1:100,000-scale hydrologic data layer.

  10. Applications of digital image processing techniques to problems of data registration and correlation

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the evolution of the computer configuration at JPL's Image Processing Laboratory (IPL). The development of techniques for the geometric transformation of digital imagery is discussed and consideration is given to automated and semiautomated image registration, and the registration of imaging and nonimaging data. The increasing complexity of image processing tasks at IPL is illustrated with examples of various applications from the planetary program and earth resources activities. It is noted that the registration of existing geocoded data bases with Landsat imagery will continue to be important if the Landsat data is to be of genuine use to the user community.

  11. Software for Processing of Digitized Astronegatives from Archives and Databases of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.

    The paper discusses and illustrates the steps of basic processing of digitized images of astronegatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX / MIDAS / ROMAFOT environment. The program can automatically process the specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were made in FORTRAN and PASCAL with the ability to work in an environment of LINUX or WINDOWS. They were used for: identification of stars, separation and exclusion of diffraction satellites and double and triple exposures, elimination of image defects, and reduction to the equatorial coordinates and magnitudes of reference catalogs.

  12. Signal processing circuit for a mass flow rate digital meter design

    NASA Astrophysics Data System (ADS)

    Abdul-Hameed, Kamal Hilal; Abdul-Karim, Majid A. H.

    1987-07-01

    This paper introduces the design of a signal processing circuit which can be used, in conjunction with an additional arrangement, for measuring gas mass flow rate. Measurement is based on Bernoulli's equation for subsonic flow. Voltages representing gas static pressure, static temperature, and the differential pressure generated across an orifice plate are assumed to be taken from appropriate transducers. These voltages are processed by the circuit in such a way as to produce a digital number representing the mass flow rate value. The circuit shows good accuracy, with an uncertainty of about ±0.1%.
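
    The computation performed by such a circuit can be sketched as the standard orifice-plate mass-flow equation with an ideal-gas density derived from the measured static pressure and temperature. The geometry and discharge coefficient below are illustrative values, not those of the instrument.

      # Sketch of orifice-plate mass-flow arithmetic; geometry and coefficients are assumptions.
      import math

      C_D = 0.62          # discharge coefficient (assumed)
      D_ORIFICE = 0.02    # orifice diameter, m (assumed)
      BETA = 0.5          # orifice-to-pipe diameter ratio (assumed)
      R_AIR = 287.05      # specific gas constant for air, J/(kg*K)

      def mass_flow_kg_s(dp_pa: float, p_pa: float, t_kelvin: float) -> float:
          rho = p_pa / (R_AIR * t_kelvin)                       # ideal-gas density
          area = math.pi * D_ORIFICE ** 2 / 4.0
          return (C_D / math.sqrt(1.0 - BETA ** 4)) * area * math.sqrt(2.0 * dp_pa * rho)

      # Example: 2.5 kPa differential pressure, 101.3 kPa static pressure, 293 K.
      print(round(mass_flow_kg_s(2_500.0, 101_325.0, 293.0), 4), "kg/s")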

  13. Fast extended focused imaging in digital holography using a graphics processing unit.

    PubMed

    Wang, Le; Zhao, Jianlin; Di, Jianglei; Jiang, Hongzhen

    2011-05-01

    We present a simple and effective method for reconstructing extended focused images in digital holography using a graphics processing unit (GPU). The Fresnel transform method is simplified by an algorithm named fast Fourier transform pruning with frequency shift. Then the pixel size consistency problem is solved by coordinate transformation and combining the subpixel resampling and the fast Fourier transform pruning with frequency shift. With the assistance of the GPU, we implemented an improved parallel version of this method, which obtained about a 300-500-fold speedup compared with central processing unit codes.
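
    For orientation, a plain CPU sketch of numerical refocusing by the Fresnel transfer-function method is shown below; the paper's FFT pruning with frequency shift, subpixel resampling and GPU acceleration are not reproduced, and the wavelength, pixel pitch and distance are illustrative.

      # CPU-only Fresnel transfer-function propagation sketch; parameter values are assumptions.
      import numpy as np

      def fresnel_propagate(field: np.ndarray, wavelength: float, pitch: float, z: float) -> np.ndarray:
          """Propagate a complex field by distance z using the Fresnel transfer function."""
          ny, nx = field.shape
          fx = np.fft.fftfreq(nx, d=pitch)
          fy = np.fft.fftfreq(ny, d=pitch)
          FX, FY = np.meshgrid(fx, fy)
          H = np.exp(1j * 2 * np.pi * z / wavelength) * \
              np.exp(-1j * np.pi * wavelength * z * (FX ** 2 + FY ** 2))
          return np.fft.ifft2(np.fft.fft2(field) * H)

      hologram = np.random.default_rng(2).random((512, 512))      # stand-in for a recorded hologram
      refocused = fresnel_propagate(hologram.astype(complex), 632.8e-9, 6.7e-6, 0.05)
      print(refocused.shape, np.abs(refocused).mean().round(4))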

  14. Digital image processing applied to analysis of geophysical and geochemical data for southern Missouri

    NASA Technical Reports Server (NTRS)

    Guinness, E. A.; Arvidson, R. E.; Leff, C. E.; Edwards, M. H.; Bindschadler, D. L.

    1983-01-01

    Digital image-processing techniques have been used to analyze a variety of geophysical and geochemical map data covering southern Missouri, a region with important basement and strata-bound mineral deposits. Gravity and magnetic anomaly patterns, which have been reformatted to image displays, indicate a deep crustal structure cutting northwest-southeast through southern Missouri. In addition, geologic map data, topography, and Landsat multispectral scanner images have been used as base maps for the digital overlay of aerial gamma-ray and stream sediment chemical data for the 1 x 2-deg Rolla quadrangle. Results indicate enrichment of a variety of elements within the clay-rich alluvium covering many of the interfluvial plains, as well as a complicated pattern of enrichment for the sedimentary units close to the Precambrian rhyolites and granites of the St. Francois Mountains.

  15. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    NASA Astrophysics Data System (ADS)

    Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.

    2012-05-01

    There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example, those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific, and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high-level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from the CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Secondly, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
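
    The behavior of a tunable digital downconverter can be sketched at a high level as mixing with a numerically controlled oscillator, low-pass filtering, and decimation, with the center frequency and decimation factor supplied at run time. The sketch below is only a software model of that behavior, not the dataflow or FPGA implementation discussed in the paper.

      # Behavioral sketch of a tunable digital downconverter; all parameters are illustrative.
      import numpy as np
      from scipy.signal import firwin, lfilter

      def tdd(samples: np.ndarray, fs: float, f_center: float, decim: int) -> np.ndarray:
          n = np.arange(len(samples))
          mixed = samples * np.exp(-2j * np.pi * f_center * n / fs)     # NCO mix to baseband
          taps = firwin(129, cutoff=0.8 * (fs / decim) / 2.0, fs=fs)    # anti-alias low-pass
          return lfilter(taps, 1.0, mixed)[::decim]                     # filter, then decimate

      fs = 1.0e6
      t = np.arange(int(1e4)) / fs
      x = np.cos(2 * np.pi * 123_000.0 * t)                             # narrow-band test tone
      baseband = tdd(x, fs, f_center=123_000.0, decim=16)
      print(len(baseband), round(float(np.abs(baseband).mean()), 3))    # tone appears near DC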

  16. Morphometrics of aeolian blowouts from high-resolution digital elevation data: methodological considerations, shape metrics, and scaling

    NASA Astrophysics Data System (ADS)

    Hamilton, T. K.; Duke, G.; Brown, O.; Koenig, D.; Barchyn, T. E.; Hugenholtz, C.

    2011-12-01

    Aeolian blowouts are wind erosion hollows that form in vegetated aeolian landscapes. They are especially pervasive in dunefields of the northern Great Plains, yielding highly pitted or hummocky terrain, and adding to the spatial variability of microenvironments. Their development is thought to be linked to feedbacks between morphology and airflow; however, few measurements are available to test this hypothesis. Currently, a dearth of morphology data is limiting modeling progress. From a systematic program of blowout mapping with high-resolution airborne LiDAR data, we used a GIS to calculate morphometrics for 1373 blowouts in Great Sand Hills, Saskatchewan, Canada. All of the blowouts selected for this investigation were covered by grassland vegetation and inactive; their morphology represents the final stage of evolution. We first outline methodological considerations for delineating blowouts and measuring their volume. In particular, we present an objective method to enhance edge and reduce operator error and bias. We show that blowouts are slightly elongate and 49% of the sample blowouts are oriented parallel to the prevailing westerly winds. We also show that their size distribution is heavy-tailed, meaning that most blowouts are relatively small and rarely increase in size beyond 400 m3. Given that blowout growth is dominated by a positive feedback between sediment transport and vegetation erosion, these results suggest several possible mechanisms: i) blowouts simultaneously evolved and stabilized as a result of external climate forcing, ii) blowouts are slaved to exogenous biogenic disturbance patterns (e.g., bison wallows), or iii) a morphodynamic limiting mechanism restricts blowout size. Overall, these data will serve as a foundation for future study, providing insight into an understudied landform that is common in many dunefields.

  17. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely.
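
    The capability figures quoted above follow the standard Cp/Cpk arithmetic, sketched below with invented specification limits and sample statistics; they are shown only to illustrate the calculation, not to reproduce the project's data.

      # Process capability sketch; limits and statistics are invented for illustration.
      def capability(mean: float, sigma: float, lsl: float, usl: float) -> tuple[float, float]:
          cp = (usl - lsl) / (6.0 * sigma)
          cpk = min(usl - mean, mean - lsl) / (3.0 * sigma)
          return cp, cpk

      cp, cpk = capability(mean=11.8, sigma=1.4, lsl=7.0, usl=17.0)   # hypothetical lead times, minutes
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")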

  18. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. PMID:27244070

  19. Choosing between Methodologies: An Inquiry into English Learning Processes in a Taiwanese Indigenous School

    ERIC Educational Resources Information Center

    Lin, Wen-Chuan

    2012-01-01

    Traditional, cognitive-oriented theories of English language acquisition tend to employ experimental modes of inquiry and neglect social, cultural and historical contexts. In this paper, I review the theoretical debate over methodology by examining ontological, epistemological and methodological controversies around cognitive-oriented theories. I…

  20. Microprocessor instruments for measuring nonlinear distortions; algorithms for digital processing of the measurement signal and an estimate of the errors

    SciTech Connect

    Mints, M.Ya.; Chinkov, V.N.

    1995-09-01

    Rational algorithms for measuring the harmonic coefficient in microprocessor instruments for measuring nonlinear distortions, based on digital processing of the codes of the instantaneous values of the signal being investigated, are described, and the errors of such instruments are obtained.
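
    The core of such an algorithm can be sketched as an FFT of the digitized instantaneous values followed by a ratio of harmonic to fundamental amplitudes. The window length and test signal below are illustrative, and the error analysis of the cited work is not reproduced.

      # Sketch of a harmonic-coefficient estimate from digitized samples; values are illustrative.
      import numpy as np

      def harmonic_coefficient(samples: np.ndarray, fs: float, f0: float, n_harmonics: int = 5) -> float:
          spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
          freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
          amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
          fundamental = amp(f0)
          harmonics = np.sqrt(sum(amp(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
          return harmonics / fundamental

      fs, f0 = 65_536.0, 1_000.0
      t = np.arange(8192) / fs
      signal = np.sin(2 * np.pi * f0 * t) + 0.03 * np.sin(2 * np.pi * 3 * f0 * t)   # 3% third harmonic
      print(round(harmonic_coefficient(signal, fs, f0), 3))   # close to 0.03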

  1. Merged GLORIA sidescan and hydrosweep pseudo-sidescan: Processing and creation of digital mosaics

    USGS Publications Warehouse

    Bird, R.T.; Searle, R.C.; Paskevich, V.; Twichell, D.C.

    1996-01-01

    We have replaced the usual band of poor-quality data in the near-nadir region of our GLORIA long-range sidescan-sonar imagery with a shaded-relief image constructed from swath bathymetry data (collected simultaneously with GLORIA) which completely cover the nadir area. We have developed a technique to enhance these "pseudo-sidescan" images in order to mimic the neighbouring GLORIA backscatter intensities. As a result, the enhanced images greatly facilitate the geologic interpretation of the adjacent GLORIA data, and geologic features evident in the GLORIA data may be correlated with greater confidence across track. Features interpreted from the pseudo-sidescan may be extrapolated from the near-nadir region out into the GLORIA range where they may not have been recognized otherwise, and therefore the pseudo-sidescan can be used to ground-truth GLORIA interpretations. Creation of digital sidescan mosaics utilized an approach not previously used for GLORIA data. Pixels were correctly placed in cartographic space and the time required to complete a final mosaic was significantly reduced. Computer software for digital mapping and mosaic creation is incorporated into the newly-developed Woods Hole Image Processing System (WHIPS) which can process both low- and high-frequency sidescan, and can interchange data with the Mini Image Processing System (MIPS) most commonly used for GLORIA processing. These techniques are tested by creating digital mosaics of merged GLORIA sidescan and Hydrosweep pseudo-sidescan data from the vicinity of the Juan Fernandez microplate along the East Pacific Rise (EPR). © 1996 Kluwer Academic Publishers.

  2. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are

  3. Optimization of Extraction Process for Polysaccharide in Salvia Miltiorrhiza Bunge Using Response Surface Methodology

    PubMed Central

    Yanhua, Wang; Fuhua, Wu; Zhaohan, Guo; Mingxing, Peng; Yanan, Zhang; Ling, Pang Zhen; Minhua, Du; Caiying, Zhang; Zian, Liang

    2015-01-01

    This study aimed to optimize the extraction process for Salvia miltiorrhiza Bunge polysaccharide using response surface methodology. The results showed that operating parameters including microwave power, microwave time and particle size had notable effects on the polysaccharide extraction of Salvia miltiorrhiza Bunge. The effects could be ranked in decreasing order of importance as follows: microwave power > microwave time > the comminution degree. The optimal extraction parameters were determined as a microwave power of 573.83 W, a microwave time of 8.4 min and a comminution degree of 67.51 mesh, resulting in a Salvia miltiorrhiza Bunge polysaccharide yield of 101.161 mg/g. The established regression model describing polysaccharide extraction as a function of the three extraction parameters was highly significant (R2 = 0.9953). The predicted and experimental results were found to be in good agreement. Thus, the model can be applicable for the prediction of polysaccharide extraction from Salvia miltiorrhiza Bunge. PMID:26998185

  4. Photothermal heating as a methodology for post processing of polymeric nanofibers

    NASA Astrophysics Data System (ADS)

    Gorga, Russell; Clarke, Laura; Bochinski, Jason; Viswanath, Vidya; Maity, Somsubhra; Dong, Ju; Firestone, Gabriel

    2015-03-01

    Metal nanoparticles embedded within polymeric systems can be made to act as localized heat sources thereby aiding in-situ polymer processing. This is made possible by the surface plasmon resonance (SPR) mediated photothermal effect of metal (in this case gold) nanoparticles, wherein incident light absorbed by the nanoparticle generates a non-equilibrium electron distribution which subsequently transfers this energy into the surrounding medium, resulting in a temperature increase in the immediate region around the particle. Here we demonstrate this effect in polymer nanocomposite systems, specifically electrospun polyethylene oxide nanofibrous mats, which have been annealed at temperatures above the glass transition. A non-contact temperature measurement technique utilizing embedded fluorophores (perylene) has been used to monitor the average temperature within samples. The effect of annealing methods (conventional and photothermal) and annealing conditions (temperature and time) on the fiber morphology, overall crystallinity, and mechanical properties is discussed. This methodology is further utilized in core-sheath nanofibers to crosslink the core material, which is a pre-cured epoxy thermoset. NSF Grant CMMI-1069108.

  5. A Novel Methodology for Processing of Plutonium-Bearing Waste as Ammonium Plutonium(III)-Oxalate

    SciTech Connect

    Sali, Sanjay Krishnarao; Noronha, Donal Marshal; Mhatre, Hemakant Ramkrishna; Mahajan, Murlidhar Anna; Chander, Keshav; Aggarwal, Suresh Kumar; Venugopal, Venkatarama

    2005-09-15

    A novel methodology has been developed for the recovery of Pu from different types of waste solutions generated during various operations involved in the chemical quality control/assurance of nuclear fuels. The method is based on the precipitation of Pu as ammonium plutonium(III)-oxalate and involves the adjustment of the acidity of the Pu solution to 1 N, the addition of ascorbic acid (0.05 M) to reduce Pu to Pu(III), followed by the addition of (NH4)2SO4 (0.5 M) and a stoichiometric amount of saturated oxalic acid maintaining a 0.2 M excess of oxalic acid concentration in the supernatant. The precipitate was characterized by X-ray powder diffraction and thermal and chemical analysis and was found to have the composition NH4Pu(C2O4)2·3H2O. This compound can be easily decomposed to PuO2 on heating in air at 823 K. The decontamination factors determined for U, Fe, and Cr showed quantitative removal of these ions during the precipitation of Pu as ammonium plutonium(III)-oxalate. A semiautomatic assembly based on the transfer of solutions by a suction arrangement was designed and fabricated for processing large volumes of Pu solution. This assembly reduced the corrosion of the glove-box material and offered the advantage of lower radiation exposure to the working personnel.

  6. Computer-aided feature extraction, classification, and acceptance processing of digital NDE data

    NASA Astrophysics Data System (ADS)

    Hildreth, Joseph H.

    1996-11-01

    As part of the Advanced Launch System technology development effort begun in 1989, the Air Force initiated a program to automate, to the extent possible, the processing of NDE data from the inspection of solid rocket motors during fabrication. The computerized system, called the Automated NDE Data Evaluation System or ANDES, was developed under contract to Martin Marietta, now Lockheed Martin. The ANDES system is generic in structure and is highly tailorable. The system can be configured to process digital or digitized data from any source, to process data from a single or from multiple acquisition systems, and to function as a single stand-alone system or in a multiple workstation distributed network. The system can maintain multiple configurations from which the user can select. In large measure, a configuration is defined through the system's user interface and is stored in the system's data base to be recalled by the user at any time. Three operational systems are currently in use. These systems are located at Hill AFB in Ogden, Utah, Kelly AFB in San Antonio, TX, and the Phillips Laboratory at Edwards AFB in California. Each of these systems is configured to process x-ray computed tomography (CT) images. The Hill AFB installation supports the aging surveillance effort on Minuteman third stage rocket motors. The Kelly AFB system supports the acceptance inspection of airframe and engine components and torpedo housing components. The installation at Edwards AFB provides technical support to the other two locations.

  7. A digital archiving system and distributed server-side processing of large datasets

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Aylward, Stephen R.; Marion, Charles; Lee, Joowhi; Styner, Martin

    2009-02-01

    In this paper, we present MIDAS, a web-based digital archiving system that processes large collections of data. Medical imaging research often involves interdisciplinary teams, each performing a separate task, from acquiring datasets to analyzing the processing results. Moreover, the number and size of the datasets continue to increase every year due to recent advancements in acquisition technology. As a result, many research laboratories centralize their data and rely on distributed computing power. We created a web-based digital archiving repository based on open standards. The MIDAS repository is specifically tuned for medical and scientific datasets and provides a flexible data management facility, a search engine, and an online image viewer. MIDAS enables users to run a set of extensible image processing algorithms from the web on selected datasets and to add new algorithms to the MIDAS system, facilitating the dissemination of users' work to different research partners. The MIDAS system is currently running in several research laboratories and has demonstrated its ability to streamline the full image processing workflow from data acquisition to image analysis and reports.

  8. Digital processing considerations for extraction of ocean wave image spectra from raw synthetic aperture radar data

    NASA Technical Reports Server (NTRS)

    Lahaie, I. J.; Dias, A. R.; Darling, G. D.

    1984-01-01

    The digital processing requirements of several algorithms for extracting the spectrum of a detected synthetic aperture radar (SAR) image from the raw SAR data are described and compared. The most efficient algorithms for image spectrum extraction from raw SAR data appear to be those containing an intermediate image formation step. It is shown that a recently developed compact formulation of the image spectrum in terms of the raw data is computationally inefficient when evaluated directly, in comparison with the classical method where matched-filter image formation is an intermediate result. It is also shown that a proposed indirect procedure for digitally implementing the same compact formulation is somewhat more efficient than the classical matched-filtering approach. However, this indirect procedure includes the image formation process as part of the total algorithm. Indeed, the computational savings afforded by the indirect implementation are identical to those obtained in SAR image formation processing when the matched-filtering algorithm is replaced by the well-known 'dechirp-Fourier transform' technique. Furthermore, corrections to account for slant-to-ground range conversion, spherical earth, etc., are often best implemented in the image domain, making intermediate image formation a valuable processing feature.
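
    As an illustration of the intermediate-image-formation route discussed above, the following sketch (synthetic raw data, a notional point-target reference, and arbitrary array sizes; none of these come from the record) forms an image by frequency-domain matched filtering and then estimates the image spectrum from the detected image:

        import numpy as np

        def image_spectrum_via_matched_filter(raw, reference):
            # Matched-filter image formation in the frequency domain, followed by
            # the power spectrum of the detected (intensity) image: the classical
            # route with an intermediate image formation step.
            RAW = np.fft.fft2(raw)
            REF = np.fft.fft2(reference, s=raw.shape)
            image = np.fft.ifft2(RAW * np.conj(REF))
            detected = np.abs(image) ** 2
            return np.abs(np.fft.fftshift(np.fft.fft2(detected))) ** 2

        # Synthetic demonstration data (shapes chosen arbitrarily).
        rng = np.random.default_rng(0)
        spectrum = image_spectrum_via_matched_filter(
            rng.standard_normal((256, 256)), rng.standard_normal((32, 32)))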

  9. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
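
    To make the class of algorithm concrete, a minimal previous-sample DPCM encoder/decoder with a uniform quantizer is sketched below; the predictor, quantizer design, and bit allocation of the actual CODEC hardware are not reproduced here, and the step size and test signal are assumptions for illustration:

        import numpy as np

        def dpcm_encode(samples, step):
            # Quantize the error between each sample and a prediction formed from
            # previously reconstructed samples (previous-sample predictor).
            codes, prediction = [], 0.0
            for x in samples:
                q = int(round((x - prediction) / step))
                codes.append(q)
                prediction += q * step      # track the decoder's reconstruction
            return codes

        def dpcm_decode(codes, step):
            out, prediction = [], 0.0
            for q in codes:
                prediction += q * step
                out.append(prediction)
            return out

        line = 100 * np.sin(np.linspace(0, np.pi, 32))   # one synthetic scan line
        recon = dpcm_decode(dpcm_encode(line, step=4.0), step=4.0)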

  10. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.

  11. Real-time digital design for an optical coherence tomography acquisition and processing system

    NASA Astrophysics Data System (ADS)

    Ralston, Tyler S.; Mayen, Jose A.; Marks, Dan L.; Boppart, Stephen A.

    2004-07-01

    We present a real-time, multi-dimensional, digital, optical coherence tomography (OCT) acquisition and imaging system. The system consists of conventional OCT optics, a rapid scanning optical delay (RSOD) line to support fast data acquisition rates, and a high-speed A/D converter for sampling the interference waveforms. A 1M-gate Virtex-II field programmable gate array (FPGA) is designed to perform digital down conversion. This is analogous to demodulating and low-pass filtering the continuous time signal. The system creates in-phase and quadrature-phase components using a tunable quadrature mixer. Multistage polyphase finite impulse response (FIR) filtering and down sampling are used to remove unneeded high frequencies. A floating-point digital signal processor (DSP) computes the magnitude and phase shifts. The data is read by a host machine and displayed on screen at real-time rates commensurate with the data acquisition rate. This system offers flexible acquisition and processing parameters for a wide range of multi-dimensional optical microscopy techniques.
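
    The digital down-conversion chain described above can be sketched in a few lines; the sample rate, intermediate frequency, filter length, and decimation factor below are assumptions for illustration, not the FPGA design values:

        import numpy as np
        from scipy.signal import firwin, lfilter

        fs, f_if = 100e6, 10e6                      # assumed sample rate and IF
        t = np.arange(4096) / fs
        interferogram = np.cos(2 * np.pi * f_if * t + 0.3)   # synthetic fringe signal

        # Quadrature mixing to baseband (in-phase and quadrature components).
        i = interferogram * np.cos(2 * np.pi * f_if * t)
        q = -interferogram * np.sin(2 * np.pi * f_if * t)

        # Low-pass FIR filtering and down-sampling to remove unneeded frequencies.
        taps, decim = firwin(64, 2e6, fs=fs), 8
        i_bb = lfilter(taps, 1.0, i)[::decim]
        q_bb = lfilter(taps, 1.0, q)[::decim]

        magnitude = np.hypot(i_bb, q_bb)            # envelope (depth profile)
        phase = np.unwrap(np.arctan2(q_bb, i_bb))   # phase shifts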

  12. Experimental demonstration of a format-flexible single-carrier coherent receiver using data-aided digital signal processing.

    PubMed

    Elschner, Robert; Frey, Felix; Meuer, Christian; Fischer, Johannes Karl; Alreesh, Saleem; Schmidt-Langhorst, Carsten; Molle, Lutz; Tanimura, Takahito; Schubert, Colja

    2012-12-17

    We experimentally demonstrate the use of data-aided digital signal processing for format-flexible coherent reception of different 28-GBd PDM and 4D modulated signals in WDM transmission experiments over up to 7680 km SSMF by using the same resource-efficient digital signal processing algorithms for the equalization of all formats. Stable and regular performance in the nonlinear transmission regime is confirmed. PMID:23263118

  13. Experimental demonstration of a format-flexible single-carrier coherent receiver using data-aided digital signal processing.

    PubMed

    Elschner, Robert; Frey, Felix; Meuer, Christian; Fischer, Johannes Karl; Alreesh, Saleem; Schmidt-Langhorst, Carsten; Molle, Lutz; Tanimura, Takahito; Schubert, Colja

    2012-12-17

    We experimentally demonstrate the use of data-aided digital signal processing for format-flexible coherent reception of different 28-GBd PDM and 4D modulated signals in WDM transmission experiments over up to 7680 km SSMF by using the same resource-efficient digital signal processing algorithms for the equalization of all formats. Stable and regular performance in the nonlinear transmission regime is confirmed.

  14. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  15. Advanced digital signal processing for short-haul and access network

    NASA Astrophysics Data System (ADS)

    Zhang, Junwen; Yu, Jianjun; Chi, Nan

    2016-02-01

    Digital signal processing (DSP) has recently proved to be a successful technology for high-speed, high-spectral-efficiency optical short-haul and access networks, enabling high performance based on digital equalization and compensation. In this paper, we investigate advanced DSP at the transmitter and receiver side for signal pre-equalization and post-equalization in an optical access network. A novel DSP-based digital and optical pre-equalization scheme has been proposed for bandwidth-limited high-speed short-distance communication systems, which is based on the feedback of receiver-side adaptive equalizers, such as the least-mean-squares (LMS) algorithm and constant- or multi-modulus algorithms (CMA, MMA). Based on this scheme, we experimentally demonstrate 400GE on a single optical carrier based on the highest ETDM 120-GBaud PDM-PAM-4 signal, using one external modulator and coherent detection. A line rate of 480-Gb/s is achieved, which enables 20% forward-error correction (FEC) overhead to keep the 400-Gb/s net information rate. The performance after fiber transmission shows a large margin for both short range and metro/regional networks. We also extend the advanced DSP to short-haul optical access networks by using high-order QAM formats. We propose and demonstrate a high-speed multi-band CAP-WDM-PON system based on intensity modulation, direct detection and digital equalization. A hybrid modified cascaded MMA post-equalization scheme is used to equalize the multi-band CAP-mQAM signals. Using this scheme, we successfully demonstrate a 550-Gb/s high-capacity WDM-PON system with 11 WDM channels, 55 sub-bands, and 10-Gb/s per user in the downstream over 40-km SMF.
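
    As a sketch of the kind of receiver-side adaptive equalizer whose feedback could drive such a pre-equalization scheme, the following shows a basic data-aided complex LMS filter over a toy two-tap channel; the tap count, step size, channel, and QPSK training data are assumptions, not the experiment's parameters:

        import numpy as np

        def lms_equalizer(received, training, n_taps=11, mu=1e-3):
            # Data-aided LMS: a known training sequence drives the tap updates.
            w = np.zeros(n_taps, dtype=complex)
            out = np.zeros(len(received), dtype=complex)
            buf = np.zeros(n_taps, dtype=complex)
            for n, x in enumerate(received):
                buf = np.roll(buf, 1)
                buf[0] = x                          # newest sample first
                y = np.dot(np.conj(w), buf)         # equalizer output
                e = training[n] - y                 # error against known symbol
                w += mu * np.conj(e) * buf          # LMS tap update
                out[n] = y
            return out, w

        # Synthetic QPSK training symbols through an assumed 2-tap channel.
        rng = np.random.default_rng(1)
        symbols = (rng.choice([1, -1], 2000) + 1j * rng.choice([1, -1], 2000)) / np.sqrt(2)
        rx = np.convolve(symbols, np.array([1.0, 0.3 + 0.2j]))[:len(symbols)]
        equalized, taps = lms_equalizer(rx, symbols)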

  16. Perceptual and category processing of the Uncanny Valley hypothesis' dimension of human likeness: some methodological issues.

    PubMed

    Cheetham, Marcus; Jancke, Lutz

    2013-01-01

    Mori's Uncanny Valley Hypothesis(1,2) proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings (3, 4, 5, 6). One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) (7). Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated. PMID:23770728

  17. Process evaluation of community monitoring under national health mission at Chandigarh, union territory: Methodology and challenges

    PubMed Central

    Tripathy, Jaya Prasad; Aggarwal, Arun Kumar; Patro, Binod Kumar; Verma, Himbala

    2015-01-01

    Background: Community monitoring was introduced on a pilot basis in 36 selected districts of India in a phased manner. In Chandigarh, it was introduced in the year 2009–2010. A preliminary evaluation of the program was undertaken with special emphasis on the inputs and the processes. Methodology: Quantitative methods included verification against checklists and record reviews. Nonparticipant observation was used to evaluate the conduct of trainings, interviews, and group discussions. The health system had trained functionaries (nursing students and Village Health Sanitation Committee [VHSC] members) to generate village-based scorecards for assessing community needs. Community needs were assessed independently for two villages under the study area to validate the scores generated by the health system. Results: VHSCs were formed in all 22 villages but without a chairperson or convener. The involvement of VHSC members in the community monitoring process was minimal. The conduct of group discussions was below par due to poor moderation and unequal responses from the group. The community monitoring committees at the state level had limited representation from the non-health sector, lower committees, and the nongovernmental organizations/civil societies. Agreement between the report cards generated by the investigator and the health system in the selected villages was found to be fair (kappa = 0.369), whereas the weighted kappa (0.504) indicated moderate agreement. Conclusion: In spite of all these limitations and challenges, the government has taken a valiant step by trying to involve the community in the monitoring of health services. The dynamic nature of the community warrants incorporation of an evaluation framework into the planning of such programs. PMID:26985413
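
    For illustration only, agreement statistics of the kind quoted above can be computed from paired ordinal scorecards as follows; the scores and the use of scikit-learn are assumptions, not the study's data or software:

        from sklearn.metrics import cohen_kappa_score

        investigator = [3, 2, 4, 1, 2, 3, 4, 2, 1, 3]    # hypothetical village scores
        health_system = [3, 3, 4, 2, 2, 2, 4, 3, 1, 4]   # hypothetical paired scores

        unweighted = cohen_kappa_score(investigator, health_system)
        weighted = cohen_kappa_score(investigator, health_system, weights="linear")
        print(f"kappa={unweighted:.3f}, weighted kappa={weighted:.3f}")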

  18. Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues

    PubMed Central

    Cheetham, Marcus; Jancke, Lutz

    2013-01-01

    Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated. PMID:23770728

  19. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    PubMed

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors arising from communication failures are a major problem during the perioperative period for cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) the continuum of care, multimedia abstract generation of intensive care data (MAGIC)-an expert system that would automatically generate a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) the isolated point in time, "Inference Engine"-a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing ("MAGIC"), the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610

  20. Digital image capture, processing, and recording system upgrade for the APS-94F SLAR

    NASA Astrophysics Data System (ADS)

    Ferraris, Guillermo L.

    2000-11-01

    The Argentine Army has been operating the APS-94F SLAR systems, on board the venerable OV-1D MOHAWK aircraft, since 1996. These systems were received from the U.S. Government through the FMS program. One major handicap of the system is the now-obsolete imagery recording subsystem, which includes complex optical, thermal and electro-mechanical processes and components that account for most of the degradations and distortions in the images obtained (not to mention the fact that images are recorded on a 9.5-inch silver halide film medium, which has to be kept at -17 °C and has to be brought to thermal equilibrium with the environment eight hours before the mission). An integral digital capture, processing and recording subsystem was developed at CITEFA (Instituto de Investigaciones Cientificas y Tecnicas de las Fuerzas Armadas) to replace the old analog RO-495/U recorder, as an upgrade to this very robust and proven imaging radar system. The subsystem developed includes three custom designed ISA boards: (1) a radar video and aircraft attitude signal conditioning board, (2) a microprocessor-controlled two-channel high speed digitizing board and (3) an integrated 12-channel GPS OEM board. The operator's software interface is a PC-based GUI C++ application, including radar imagery forming and processing algorithms, slant range to ground range conversion, digitally controlled image gain, bias and contrast adjustments, image registration (GPS), image file disk recording and retrieval functions, real time mensuration and MTI/FTI (moving target indication/fixed target indication) image correlation. The system also provides for the added capability to send compressed still radar images in NRT (near real time) to a ground receiving station through a secure data link. Due to serious space limitations inside the OV-1D two-seat cockpit, a military-grade ruggedized laptop computer and docking station hardware implementation was selected.

  1. Rocket engine plume diagnostics using video digitization and image processing - Analysis of start-up

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Shoe, B.; Dhawan, A. P.

    1991-01-01

    Video digitization techniques have been developed to analyze the exhaust plume of the Space Shuttle Main Engine. Temporal averaging and a frame-by-frame analysis provide data used to evaluate the capabilities of image processing techniques for use as measurement tools. Capabilities include determining the time required for the Mach disk to reach a fully developed state. Other results show that the Mach disk tracks the nozzle over short time intervals, and that dominant frequencies exist for the nozzle and Mach disk movement.

  2. Automated image processing of Landsat II digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention.

  3. LANDSAT digital data processing: A near real-time application. [Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Barker, J. L.; Bohn, C.; Stuart, L.; Hill, J.

    1975-01-01

    An application of rapid generation of classed digital images from LANDSAT-1 was demonstrated and its feasibility evaluated by NASA in conjunction with the Environmental Protection Agency (EPA), Texas A and M University (TAMU), and the Cousteau Society. The primary purpose was to show that satellite data could be processed and transmitted to the Calypso, which was used as a research vessel, in time for use in directing it to specific locations of possible plankton upwellings, sediment, or other anomalies in the coastal water areas along the Gulf of Mexico.

  4. IBIS - A geographic information system based on digital image processing and image raster datatype

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1976-01-01

    IBIS (Image Based Information System) is a geographic information system which makes use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remotely sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set. The first applications (St. Tammany Parish, Louisiana, and Los Angeles County) have been restricted to the design of a land resource inventory and analysis system. It is thought that the algorithms and the hardware interfaces developed will be readily applicable to other Landsat imagery.

  5. Combined flatland ST radar and digital-barometer network observations of mesoscale processes

    NASA Technical Reports Server (NTRS)

    Clark, W. L.; Vanzandt, T. E.; Gage, K. S.; Einaudi, F. E.; Rottman, J. W.; Hollinger, S. E.

    1991-01-01

    The paper describes a six-station digital-barometer network centered on the Flatland ST radar to support observational studies of gravity waves and other mesoscale features at the Flatland Atmospheric Observatory in central Illinois. The network's current mode of operation is examined, and a preliminary example of an apparent group of waves evident throughout the network as well as throughout the troposphere is presented. Preliminary results demonstrate the capabilities of the current operational system to study wave convection, wave-front, and other coherent mesoscale interactions and processes throughout the troposphere. Unfiltered traces for the pressure and horizontal zonal wind, for days 351 to 353 UT, 1990, are illustrated.

  6. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.

  7. Comparison of digital signal processing modules in gamma-ray spectrometry.

    PubMed

    Lépy, Marie-Christine; Cissé, Ousmane Ibrahima; Pierre, Sylvie

    2014-05-01

    Commercial digital signal-processing modules have been tested for their applicability to gamma-ray spectrometry. The tests were based on the same n-type high purity germanium detector. The spectrum quality was studied in terms of energy resolution and peak area versus shaping parameters, using a Eu-152 point source. The stability of a reference peak count rate versus the total count rate was also examined. The reliability of the quantitative results is discussed for their use in measurement at the metrological level.

  8. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  9. Operation and performance of a longitudinal feedback system using digital signal processing

    SciTech Connect

    Teytelman, D.; Fox, J.; Hindi, H.

    1994-11-22

    A programmable longitudinal feedback system using a parallel array of AT&T 1610 digital signal processors has been developed as a component of the PEP-II R&D program. This system has been installed at the Advanced Light Source (LBL) and implements full speed bunch by bunch signal processing for storage rings with bunch spacing of 4ns. Open and closed loop results showing the action of the feedback system are presented, and the system is shown to damp coupled-bunch instabilities in the ALS. A unified PC-based software environment for the feedback system operation is also described.

  10. Digital Image Processing: Effects Of Metz Filters And Matched Filters On Detection Of Simple Radiographic Objects

    NASA Astrophysics Data System (ADS)

    Chan, Heang-Ping; Doi, Kunio; Metz, Charles E.

    1984-06-01

    We studied the effect of image processing with Metz filters and matched filters on the detection of simulated low-contrast square objects superimposed on radiographic mottle. The signal-to-noise ratios (SNRs) of original and processed images were calculated based on the perceived statistical decision theory model by taking into account the internal noise of a human observer's eye-brain system. Threshold contrasts for objects of various sizes were predicted by assuming a threshold SNR of 3.8 which was determined previously for a 50% correct detection in 18 alternative forced-choice experiments. The relative performance of various image processing techniques was also evaluated experimentally with a contrast-detail diagram method. The simulated images were generated by a high-quality digital image processing and simulation system. The digitized images were Fourier-transformed, filtered, inversely Fourier-transformed, and/or contrast-enhanced to produce the processed images. The contrast-detail curves of the original or processed images were obtained by averaging the results of four image samples and twelve observers. Both the theoretical prediction and the C-D experiment demonstrated an improvement in detectabilities of the simple test objects over those of the original images. However, the observers seemed to under-read the filtered images in the sense that the improvement in observer performance was slightly less than the prediction. This is probably caused by the changes in appearance of the object and the noise texture in the filtered images. The usefulness and limitations of the Metz filters and matched filters in comparison with other image processing techniques are discussed.
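
    The described pipeline (Fourier transform, filter, inverse transform) amounts to multiplying the image spectrum by a filter transfer function. A minimal sketch follows, with a generic Gaussian low-pass transfer function standing in for a Metz or matched filter; the filter shape, grid size, and noise image are placeholders, not the study's filters or data:

        import numpy as np

        def filter_image_in_frequency_domain(image, transfer):
            # Forward FFT, multiply by the filter transfer function sampled on the
            # same frequency grid, inverse FFT back to the spatial domain.
            return np.fft.ifft2(np.fft.fft2(image) * transfer).real

        n = 256
        fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
        gaussian_lowpass = np.exp(-(fx**2 + fy**2) / (2 * 0.05**2))   # illustrative transfer function
        noisy = np.random.default_rng(2).normal(size=(n, n))          # stand-in for radiographic mottle
        processed = filter_image_in_frequency_domain(noisy, gaussian_lowpass)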

  11. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs by developing a computer program which enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs. Therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding, and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected image processing techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations.
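
    A minimal sketch of one qualitative-plus-quantitative step of this kind is shown below: binarizing a micrograph with Otsu's threshold and estimating total crack length per unit area from the skeletonized crack network. The use of scikit-image, the assumption that cracks are darker than the matrix, and the deliberately crude length estimate are all assumptions; this is not the authors' program:

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.morphology import skeletonize

        def total_crack_length_per_unit_area(micrograph, pixel_size_um):
            # Otsu binarization (cracks assumed darker than the surrounding matrix),
            # skeletonization, then skeleton length divided by imaged area.
            cracks = micrograph < threshold_otsu(micrograph)
            skeleton = skeletonize(cracks)
            length_um = skeleton.sum() * pixel_size_um      # ~1 pixel of skeleton per pixel length
            area_um2 = micrograph.size * pixel_size_um ** 2
            return length_um / area_um2

        demo = np.random.default_rng(3).random((128, 128))  # placeholder image
        density = total_crack_length_per_unit_area(demo, pixel_size_um=0.1)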

  12. Digital Storytelling as a Narrative Health Promotion Process: Evaluation of a Pilot Study.

    PubMed

    DiFulvio, Gloria T; Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah E; Del Toro-Mejias, Lizbeth Marie

    2016-04-01

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. The process of individuals telling their own stories has not been well assessed as a mechanism of health behavior change. This study looks at outcomes associated with engaging in the DST process for vulnerable youth. The project focused on the experiences of Puerto Rican Latinas between the ages of 15 and 21. A total of 30 participants enrolled in 4-day DST workshops, with 29 completing a 1- to 3-minute digital story. Self-reported data on several scales (self-esteem, social support, empowerment, and sexual attitudes and behaviors) were collected and analyzed. Participants showed an increase in positive social interactions from baseline to 3 months post-workshop. Participants also demonstrated increases in optimism and control over the future immediately after the workshop, but this change was not sustained at 3 months. Analysis of qualitative results and implications are discussed. PMID:27166356

  13. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  14. Construction of an integrated biomodule composed of microfluidics and digitally controlled microelectrodes for processing biomolecules.

    NASA Astrophysics Data System (ADS)

    Wagler, Patrick F.; Tangen, Uwe; Maeke, Thomas; Mathis, Harald P.; McCaskill, John S.

    2003-01-01

    This work focuses on the development of an online programmable microfluidic bioprocessing unit (BioModule) using digital logic microelectrodes for rapid pipelined selection and transfer of DNA molecules and other charged biopolymers. The design and construction technique for this hybrid programmable biopolymer processing device is presented along with the first proof of principle functionality. The electronically controlled collection, separation and channel transfer of the biomolecules is monitored by a sensitive fluorescence setup. This hybrid reconfigurable architecture couples electronic and biomolecular information processing via a single module combination of fluidics and electronics and opens new fields of applications not only in DNA computing and molecular diagnostics but also in applications of combinatorial chemistry and lab-on-a-chip biotechnology to the drug discovery process. Fundamentals of the design and silicon-PDMS-based construction of these electronic microfluidic devices and their functions are described as well as the experimental results.

  15. Microfabrication of a BioModule composed of microfluidics and digitally controlled microelectrodes for processing biomolecules

    NASA Astrophysics Data System (ADS)

    Wagler, Patrick F.; Tangen, Uwe; Maeke, Thomas; Mathis, Harald P.; McCaskill, John S.

    2003-10-01

    This work focuses on the development of an online programmable microfluidic bioprocessing unit (BioModule) using digital logic microelectrodes for rapid pipelined selection and transfer of deoxyribonucleic acid (DNA) molecules and other charged biopolymers. The design and construction technique for this hybrid programmable biopolymer processing device is presented along with the first proof of principle functionality. The electronically controlled collection, separation and channel transfer of the biomolecules is monitored by a sensitive fluorescence set-up. This hybrid reconfigurable architecture couples electronic and biomolecular information processing via a single module combination of fluidics and electronics and opens new fields of applications not only in DNA computing and molecular diagnostics but also in applications of combinatorial chemistry and lab-on-a-chip biotechnology to the drug discovery process. Fundamentals of the design and silicon-polydimethylsiloxane (PDMS)-based construction of these electronic microfluidic devices and their functions are described as well as the experimental results.

  16. Systemic Inflammation: Methodological Approaches to Identification of the Common Pathological Process

    PubMed Central

    Zotova, N. V.; Chereshnev, V. A.; Gusev, E. Yu.

    2016-01-01

    We defined Systemic inflammation (SI) as a “typical, multi-syndrome, phase-specific pathological process, developing from systemic damage and characterized by the total inflammatory reactivity of endotheliocytes, plasma and blood cell factors, connective tissue and, at the final stage, by microcirculatory disorders in vital organs and tissues.” The goal of the work: to determine methodological approaches and particular methodical solutions for the problem of identification of SI as a common pathological process. SI can be defined by the presence in plasma of systemic proinflammatory cell stress products—cytokines and other inflammatory mediators, and also by a combination of signs of other processes. We have developed 2 scales: 1) The Reactivity Level scale (RL)–from 0 to 5 points: 0-normal level; RL-5 confirms the systemic nature of inflammatory mediator release, and RL-2–4 defines different degrees of event probability. 2) The SI scale, considering additional criteria along with RL, addresses more integral criteria of SI: the presence of ≥ 5 points according to the SI scale indicates a high probability of SI developing. To calculate the RL scale, concentrations of 4 cytokines (IL-6, IL-8, IL-10, TNF-α) and C-reactive protein in plasma were examined. Additional criteria of the SI scale were the following: D-dimers>500ng/ml, cortisol>1380 or <100nmol/l, troponin I≥0.2ng/ml and/or myoglobin≥800ng/ml. 422 patients were included in the study with different septic (n = 207) and aseptic (n = 215) pathologies. In 190 cases (of 422) there were signs of SI (lethality 38.4%, n = 73). In only 5 of 78 cases, lethality was not confirmed by the presence of SI. SI was registered in 100% of cases with septic shock (n = 31). There were no significant differences between the AU-ROC of CR, the SI scale and SOFA in predicting death in patients with sepsis and trauma. PMID:27153324
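
    For illustration, the additional SI-scale criteria listed above can be checked programmatically; the thresholds are those quoted in the abstract, but the flag-counting logic below is a simplification, not the published scale:

        def additional_si_criteria(d_dimer_ng_ml, cortisol_nmol_l,
                                   troponin_i_ng_ml, myoglobin_ng_ml):
            # Thresholds taken from the abstract; scoring logic is illustrative only.
            flags = {
                "d_dimer": d_dimer_ng_ml > 500,
                "cortisol": cortisol_nmol_l > 1380 or cortisol_nmol_l < 100,
                "troponin_or_myoglobin": troponin_i_ng_ml >= 0.2 or myoglobin_ng_ml >= 800,
            }
            return flags, sum(flags.values())

        flags, n_met = additional_si_criteria(650, 90, 0.1, 950)   # hypothetical patient values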

  17. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single real-valued filters. This introduces an increased computational burden but also a higher level of parallelism that common computing platforms fail to identify. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as vector inner product operators, that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation and they map readily onto parallel digital architectures, which imply new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as with the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures. Typically, optical systems (like the 4f correlators) are limited to phase

  18. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data is collected. This interpretive process that results in the final geological map is often supported by recording in a field notebook, observations, ideas and alternative geological models explored with the use of sketches and evolutionary diagrams. In combination the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has changed smartphones into geological mapping tools that can be used to collect lots of geological data quickly. With GPS functionality this data is also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced infield photography. In contrast line drawing, for example for lithological boundary interpretation and sketching

  19. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    PubMed

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance. PMID:25383483

  20. [Measuring the contrast resolution limits of human vision based on the modern digital image processing].

    PubMed

    Wang, Zhifang; Liu, Yuhong; Wang, Ying; Li, Hong; Li, Zhangyong; Zhao, Zhiqiang; Xie, Zhengxiang

    2008-10-01

    In the literature on human vision physiology and physics, there have been reports of a spatial resolution limit of 1' of visual angle, a frequency resolution limit of 5 nm and a time resolution limit of 0.1" for human vision. However, there has been no report on the contrast resolution limit of human vision, and in particular no report of a measuring method and results for this limit based on modern digital image processing. Here we report a modern method for measuring the contrast resolution limit of human vision based on computer digital image processing technology, and we present the measured results and their mathematical models. The functional relationships of the contrast resolution limit varying with background gray level under photopic and scotopic vision, respectively, are illustrated. It can be expected that such investigations of human vision will establish the physiological foundation for theories and techniques of hiding bodies and figures (stealth), of anti-hiding bodies and figures, of night vision systems independent of infrared, as well as of their related industries.

  1. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM) it is of great importance to reveal valuable information from the recorded SHM data that could be used to predict or indicate structural faults or damage in a building. In this work a combination of digital signal processing methods, namely the FFT along with the wavelet transform, is applied, together with a proposed algorithm to study frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute and undergo heavy use, stress and visits from academic staff and students. The SHM data are collected from two neighboring buildings that have different ages (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, presenting a comparison of the structural behavior of both buildings in response to seismic activity, weather conditions and man-made activity. Acknowledgments This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)» and is co-financed by the European Union (European Social Fund) and Greek National Fund.
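
    A minimal sketch of the combined FFT and wavelet analysis, applied to a synthetic building-response record, is shown below; the sampling rate, wavelet choice, and PyWavelets dependency are assumptions, and the proposed frequency-dispersion algorithm itself is not reproduced:

        import numpy as np
        import pywt

        fs = 200.0                                    # assumed sampling rate, Hz
        t = np.arange(0, 60, 1 / fs)
        record = np.sin(2 * np.pi * 2.4 * t) \
            + 0.3 * np.random.default_rng(4).standard_normal(t.size)

        # FFT amplitude spectrum: dominant structural frequencies.
        freqs = np.fft.rfftfreq(t.size, d=1 / fs)
        spectrum = np.abs(np.fft.rfft(record)) / t.size

        # Continuous wavelet transform: how those frequencies evolve in time.
        scales = np.arange(1, 128)
        coeffs, cwt_freqs = pywt.cwt(record, scales, "morl", sampling_period=1 / fs)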

  2. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    NASA Astrophysics Data System (ADS)

    Frigeri, Alessandro; Hare, Trent; Neteler, Markus; Coradini, Angioletta; Federico, Costanzo; Orosei, Roberto

    2011-09-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  3. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  4. A Comparison of the Safety Analysis Process and the Generation IV Proliferation Resistance/Physical Protection Assessment Methodology

    SciTech Connect

    T. A. Bjornard; M. D. Zentner

    2006-05-01

    The Generation IV International Forum (GIF) is a vehicle for the cooperative international development of future nuclear energy systems. The Generation IV program has established primary objectives in the areas of sustainability, economics, safety and reliability, and Proliferation Resistance and Physical Protection (PR&PP). In order to help meet the latter objective a program was launched in December 2002 to develop a rigorous means to assess nuclear energy systems with respect to PR&PP. The study of Physical Protection of a facility is a relatively well established methodology, but an approach to evaluate the Proliferation Resistance of a nuclear fuel cycle is not. This paper will examine the Proliferation Resistance (PR) evaluation methodology being developed by the PR group, which is largely a new approach and compare it to generally accepted nuclear facility safety evaluation methodologies. Safety evaluation methods have been the subjects of decades of development and use. Further, safety design and analysis is fairly broadly understood, as well as being the subject of federally mandated procedures and requirements. It is therefore extremely instructive to compare and contrast the proposed new PR evaluation methodology process with that used in safety analysis. By so doing, instructive and useful conclusions can be derived from the comparison that will help to strengthen the PR methodological approach as it is developed further. From the comparison made in this paper it is evident that there are very strong parallels between the two processes. Most importantly, it is clear that the proliferation resistance aspects of nuclear energy systems are best considered beginning at the very outset of the design process. Only in this way can the designer identify and cost effectively incorporate intrinsic features that might be difficult to implement at some later stage. Also, just like safety, the process to implement proliferation resistance should be a dynamic

  5. Digital seismo-acoustic signal processing aboard a wireless sensor platform

    NASA Astrophysics Data System (ADS)

    Marcillo, O.; Johnson, J. B.; Lorincz, K.; Werner-Allen, G.; Welsh, M.

    2006-12-01

    We are developing a low-power, low-cost wireless sensor array to conduct real-time signal processing of earthquakes at active volcanoes. The sensor array, which integrates data from both seismic and acoustic sensors, is based on Moteiv TMote Sky wireless sensor nodes (www.moteiv.com). The nodes feature a Texas Instruments MSP430 microcontroller, 48 Kbytes of program memory, 10 Kbytes of static RAM, 1 Mbyte of external flash memory, and a 2.4-GHz Chipcon CC2420 IEEE 802.15.4 radio. The TMote Sky is programmed in TinyOS. Basic signal processing occurs on an array of three peripheral sensor nodes. These nodes are tied into a dedicated GPS receiver node, which is focused on time synchronization, and a central communications node, which handles data integration and additional processing. The sensor nodes incorporate dual 12-bit digitizers sampling a seismic sensor and a pressure transducer at 100 samples per second. The wireless capabilities of the system allow flexible array geometry, with a maximum aperture of 200 m. We have already developed the digital signal processing routines on board the Moteiv TMote sensor nodes. The developed routines accomplish Real-time Seismic-Amplitude Measurement (RSAM), Seismic Spectral-Amplitude Measurement (SSAM), and a user-configured Short-Term Average / Long-Term Average (STA/LTA) ratio, which is used to detect first arrivals. The processed data from individual nodes are transmitted back to a central node, where additional processing may be performed. Such processing will include back azimuth determination and other wave field analyses. Future on-board signal processing will focus on event characterization utilizing pattern recognition and spectral characterization. The processed data is intended as low bandwidth information which can be transmitted periodically and at low cost through satellite telemetry to a web server. The processing is limited by the computational capabilities (RAM, ROM) of the nodes. Nevertheless, we
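
    A compact version of the STA/LTA detector the nodes run on board might look as follows; the window lengths and trigger threshold are illustrative (on the real system they are user-configured), and only the 100 samples-per-second rate is taken from the record:

        import numpy as np

        def sta_lta(samples, fs, sta_win=1.0, lta_win=30.0):
            # Short-term-average / long-term-average ratio of the rectified trace,
            # computed with moving-average kernels of the two window lengths.
            x = np.abs(np.asarray(samples, dtype=float))
            n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
            sta = np.convolve(x, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(x, np.ones(n_lta) / n_lta, mode="same")
            return sta / np.maximum(lta, 1e-12)

        fs = 100.0                                       # matches the digitizer rate above
        trace = np.random.default_rng(5).standard_normal(int(120 * fs))
        trace[6000:6200] += 8.0                          # synthetic event
        trigger = sta_lta(trace, fs) > 3.0               # first-arrival detection mask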

  6. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with an operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays behavior similar to that of the rat electromyogram (EMG). PMID:24196434
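
    As an illustration of the configurable IIR filtering performed by the virtual instrument, the sketch below chains a notch filter and a low-pass Butterworth filter with SciPy; the sampling rate, notch frequency and cut-off are assumptions rather than the paper's parameters.

    ```python
    import numpy as np
    from scipy import signal

    fs = 1000.0                                                   # assumed DAQ sampling rate, Hz
    b_notch, a_notch = signal.iirnotch(w0=60.0, Q=30.0, fs=fs)    # reject power-line interference (assumed)
    b_lp, a_lp = signal.butter(4, 20.0, btype="low", fs=fs)       # keep the slow respiratory band (assumed)

    def filter_magnetogram(x):
        """Apply the configured IIR cascade to one acquired channel."""
        x = signal.lfilter(b_notch, a_notch, x)
        return signal.lfilter(b_lp, a_lp, x)

    raw = np.random.randn(5000)                                   # placeholder for the acquired sensor signal
    clean = filter_magnetogram(raw)
    ```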

  7. Vibration-insensitive temperature sensing system based on fluorescence decay and using a digital processing approach

    NASA Astrophysics Data System (ADS)

    Dong, H.; Zhao, W.; Sun, T.; Grattan, K. T. V.; Al-Shamma'a, A. I.; Wei, C.; Mulrooney, J.; Clifford, J.; Fitzpatrick, C.; Lewis, E.; Degner, M.; Ewald, H.; Lochmann, S. I.; Bramann, G.; Merlone Borla, E.; Faraldi, P.; Pidria, M.

    2006-07-01

    A fluorescence-based temperature sensor system using a digital signal processing approach has been developed and evaluated in operation on a working automotive engine. The signal processing approach, using the least-squares method, makes the system relatively insensitive to intensity variations in the probe and thus provides more precise measurements when compared to a previous system designed using analogue phase-locked detection. Experiments carried out to determine the emission temperatures of a running car engine have demonstrated the effectiveness of the sensor system in monitoring exhaust temperatures up to 250 °C, and potentially higher. This paper was presented at the 13th International Conference on Sensors and Their Applications, held in Chatham, Kent, on 6-7 September 2005.
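
    A hedged illustration of the least-squares idea behind the measurement: the fluorescence lifetime is recovered from sampled decay data by a linear fit to the log-intensity, and temperature then follows from a lifetime calibration. The single-exponential model and the numerical values are assumptions.

    ```python
    import numpy as np

    def decay_lifetime(t, y):
        """Fit y = A*exp(-t/tau) by linear least squares on log(y); returns tau."""
        coeffs = np.polyfit(t, np.log(y), 1)       # slope = -1/tau
        return -1.0 / coeffs[0]

    t = np.linspace(0, 5e-3, 200)                  # seconds (assumed)
    y = 2.0 * np.exp(-t / 1.2e-3) + 1e-3 * np.random.rand(t.size)   # simulated decay with noise
    tau = decay_lifetime(t, y)                     # lifetime maps to temperature via a calibration curve
    ```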

  8. Improvement of FBG peak wavelength demodulation using digital signal processing algorithms

    NASA Astrophysics Data System (ADS)

    Harasim, Damian; Gulbahar, Yussupova

    2015-09-01

    The spectrum reflected or transmitted by a fiber Bragg grating (FBG) in a laboratory environment usually has a smooth shape with a high signal-to-noise ratio, similar to a Gaussian curve. However, in some applications the reflected spectrum can include strong noise, especially where the sensing array contains a large number of FBGs or where a broadband, low-power source is used. This paper examines the possibility of extracting the fiber Bragg grating peak wavelength from spectra with a weak signal-to-noise ratio using the most frequently applied digital signal processing algorithms. The accuracy of the function minimum, centroid and Gaussian fitting methods for peak wavelength detection is compared. The linearity of the processing characteristics of an extended FBG, measured for a reference high-power source and a second, low-power source, is shown and compared.
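
    A small sketch of two of the compared estimators (centroid and Gaussian fitting) applied to a simulated FBG reflection spectrum; the wavelength grid, noise level and threshold are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def centroid_peak(wl, p, threshold=0.2):
        """Power-weighted centroid over samples near the peak."""
        m = p > threshold * p.max()
        return np.sum(wl[m] * p[m]) / np.sum(p[m])

    def gaussian_peak(wl, p):
        """Centre wavelength from a Gaussian least-squares fit."""
        g = lambda x, a, mu, s, c: a * np.exp(-((x - mu) ** 2) / (2 * s ** 2)) + c
        p0 = [p.max(), wl[np.argmax(p)], 0.1, p.min()]
        popt, _ = curve_fit(g, wl, p, p0=p0)
        return popt[1]

    wl = np.linspace(1549.0, 1551.0, 400)          # nm (assumed)
    spectrum = np.exp(-((wl - 1550.05) ** 2) / (2 * 0.05 ** 2)) + 0.05 * np.random.rand(wl.size)
    print(centroid_peak(wl, spectrum), gaussian_peak(wl, spectrum))
    ```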

  9. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Technical Reports Server (NTRS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    1989-01-01

    The task of differentiating very small metal aggregates supported on amorphous films from the phase-contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis even for micrographs where, by naked-eye examination, the distinction between particles and spurious substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions, and it is concluded that these particles consist of a 'magic number' of 13 atoms and have cuboctahedral or icosahedral crystal structure.

  10. Real-time digital holographic microscopy using the graphic processing unit.

    PubMed

    Shimobaba, Tomoyoshi; Sato, Yoshikuni; Miura, Junya; Takenouchi, Mai; Ito, Tomoyoshi

    2008-08-01

    Digital holographic microscopy (DHM) is a well-known, powerful method allowing both the amplitude and phase of a specimen to be observed simultaneously. In order to obtain a reconstructed image from a hologram, numerous calculations of the Fresnel diffraction are required. The Fresnel diffraction can be accelerated by the FFT (Fast Fourier Transform) algorithm. However, real-time reconstruction from a hologram is difficult even if a recent central processing unit (CPU) is used to calculate the Fresnel diffraction by the FFT algorithm. In this paper, we describe a real-time DHM system using a graphic processing unit (GPU) with many stream processors, which allows use as a highly parallel processor. The computational speed of the Fresnel diffraction using the GPU is faster than that of recent CPUs. The real-time DHM system can obtain reconstructed images from holograms of 512 x 512 grid points at 24 frames per second.
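
    A minimal NumPy sketch of a single-FFT Fresnel reconstruction step of the kind the GPU accelerates in the paper; the wavelength, pixel pitch and reconstruction distance are assumptions, and on a GPU the same arithmetic would be expressed with a CUDA-capable array library.

    ```python
    import numpy as np

    def fresnel_reconstruct(hologram, wavelength, pitch, z):
        """Reconstruct intensity from a square hologram via one FFT (constant prefactor omitted)."""
        n = hologram.shape[0]
        k = 2 * np.pi / wavelength
        x = (np.arange(n) - n / 2) * pitch
        X, Y = np.meshgrid(x, x)
        chirp = np.exp(1j * k * (X ** 2 + Y ** 2) / (2 * z))       # quadratic phase factor
        field = np.fft.fftshift(np.fft.fft2(hologram * chirp))
        return np.abs(field)

    holo = np.random.rand(512, 512)                                 # placeholder 512 x 512 hologram
    img = fresnel_reconstruct(holo, wavelength=633e-9, pitch=10e-6, z=0.1)   # assumed optics parameters
    ```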

  11. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.

  12. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.

  13. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-11-05

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with an operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays behavior similar to that of the rat electromyogram (EMG).

  14. Current Tracking Control of Voltage Source PWM Inverters Using Adaptive Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Fukuda, Shoji; Furukawa, Yuya

    An active filter (AF) is required to have a high capability of tracking a time-varying current reference. However, a steady-state current error always exists if a conventional proportional and integral (PI) regulator is used, because the current reference varies in time. This paper proposes the application of adaptive digital signal processing (ADSP) to the current control of voltage source PWM inverters. ADSP does not require any additional hardware and can automatically minimize the mean-square error. Since the processing time available to the computer is limited, ADSP cannot eliminate higher-order harmonics but can eliminate lower-order harmonics such as the 5th through 17th. Experimental results demonstrate that ADSP is useful for improving the reference tracking performance of voltage source inverters.
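
    A hedged sketch of the adaptive mean-square-error minimisation idea: an LMS update adjusts the weights of sine/cosine references at selected low-order harmonics so that the residual tracking error is driven down. The step size, harmonic set and synthetic error signal are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    fs, f0 = 10000.0, 50.0                      # sampling rate and fundamental frequency (assumed)
    harmonics = [5, 7, 11, 13, 17]
    mu = 0.01                                   # LMS step size (assumed)
    w = np.zeros(2 * len(harmonics))            # adaptive weights

    t = np.arange(20000) / fs
    # synthetic steady-state tracking error containing 5th and 7th harmonic content (placeholder)
    raw_error = 0.5 * np.sin(2 * np.pi * 5 * f0 * t) + 0.2 * np.sin(2 * np.pi * 7 * f0 * t + 0.3)

    residual = np.empty_like(raw_error)
    for n, tn in enumerate(t):
        refs = np.concatenate([[np.sin(2 * np.pi * h * f0 * tn),
                                np.cos(2 * np.pi * h * f0 * tn)] for h in harmonics])
        e = raw_error[n] - np.dot(w, refs)      # residual after adaptive compensation
        w += mu * e * refs                      # LMS update minimising the mean-square error
        residual[n] = e
    ```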

  15. Application of three-dimensional digital image processing for reconstruction of microstructural volume from serial sections

    SciTech Connect

    Tewari, A.; Gokhale, A.M.

    2000-03-01

    Three-dimensional digital image processing is useful for reconstruction of a microstructural volume from a stack of serial sections. Application of this technique is demonstrated via reconstruction of a volume segment of the liquid-phase sintered microstructure of a tungsten heavy alloy processed in the microgravity environment of NASA's space shuttle, Columbia. Ninety serial sections (approximately one micrometer apart) were used for reconstruction of the three-dimensional microstructure. The three-dimensional microstructural reconstruction clearly revealed that the tungsten grains are almost completely connected in three-dimensional space. Both the matrix and the grains are topologically co-continuous, although the alloy was liquid-phase sintered in microgravity. Therefore, the absence of gravity did not produce a microstructure consisting of discrete isolated W grains uniformly dispersed in the liquid Ni-Fe alloy matrix at the sintering temperature.

  16. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology

    PubMed Central

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on intervention process evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  17. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    PubMed

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on intervention process evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  18. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    PubMed

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on intervention process evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in its implementation by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.

  19. Phenopix: a R package to process digital images of a vegetation cover

    NASA Astrophysics Data System (ADS)

    Filippa, Gianluca; Cremonese, Edoardo; Migliavacca, Mirco; Galvagno, Marta; Morra di Cella, Umberto; Richardson, Andrew

    2015-04-01

    Plant phenology is a globally recognized indicator of the effects of climate change on the terrestrial biosphere. Accordingly, new tools to automatically track the seasonal development of a vegetation cover are becoming available and increasingly deployed. Among them, near-continuous digital images are being collected in several networks in the US, Europe, Asia and Australia in a range of different ecosystems, including agricultural lands, deciduous and evergreen forests, and grasslands. The growing scientific interest in vegetation image analysis highlights the need for easy-to-use, flexible and standardized processing techniques. In this contribution we illustrate a new open source package called "phenopix", written in the R language, that allows processing of images of a vegetation cover. The main features include: (i) definition of one or more areas of interest on an image and processing of the pixel information within them; (ii) computation of vegetation indices based on the red, green and blue channels; (iii) fitting of a curve to the seasonal trajectory of the vegetation indices and extraction of relevant dates (thresholds) along the seasonal trajectory; and (iv) pixel-by-pixel analysis to extract spatially explicit phenological information. The utilities of the package are illustrated in detail for two subalpine sites, a grassland and a larch stand at about 2000 m in the Italian Western Alps. The phenopix package is a free and easy-to-use tool that allows digital images of a vegetation cover to be processed in a standardized, flexible and reproducible way. The software is available for download at the R-Forge web site (r-forge.r-project.org/projects/phenopix/).
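
    The phenopix package itself is written in R; the Python snippet below merely illustrates the kind of per-ROI vegetation-index computation it performs (here the green chromatic coordinate). The image path and ROI coordinates are placeholders.

    ```python
    import numpy as np
    from PIL import Image

    def gcc_for_roi(image_path, roi):
        """Mean green chromatic coordinate, G/(R+G+B), over a rectangular region of interest."""
        r0, r1, c0, c1 = roi
        rgb = np.asarray(Image.open(image_path), dtype=float)[r0:r1, c0:c1, :3]
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        return np.mean(g / np.maximum(r + g + b, 1.0))

    # Example call with a hypothetical camera image and ROI:
    # gcc = gcc_for_roi("site_2015-06-21_1200.jpg", roi=(100, 400, 200, 600))
    ```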

  20. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Astrophysics Data System (ADS)

    Buel, V. E.

    1981-09-01

    A modular system composed of two sets of tasks which interprets the flaw data and allows compensation of the data for transducer characteristics is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job, version 2C-02 operating system, controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11, version 2C-02, operating system, stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessing task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data is then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  1. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Technical Reports Server (NTRS)

    Buel, V. E.

    1981-01-01

    A modular system composed of two sets of tasks which interprets the flaw data and allows compensation of the data for transducer characteristics is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job, version 2C-02 operating system, controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11, version 2C-02, operating system, stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessing task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data is then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  2. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analysing structural relationships in a system of stationary random processes. We first introduce the area of random processes and present the process of structural analysis, and then select suitable data mining methods applicable to structural analysis. Finally, we propose a methodology for structural analysis in a system of stationary stochastic processes using data mining methods within an active experimental approach, built on this theoretical basis.

  3. Electrocoagulation and nanofiltration integrated process application in purification of bilge water using response surface methodology.

    PubMed

    Akarsu, Ceyhun; Ozay, Yasin; Dizge, Nadir; Elif Gulsen, H; Ates, Hasan; Gozmen, Belgin; Turabik, Meral

    2016-01-01

    Marine pollution is a growing concern because of the steady increase in sea transportation. A large volume of bilge water, which contains petroleum, oil and hydrocarbons in high concentrations, is therefore generated by all types of ships. In this study, treatment of bilge water by an integrated electrocoagulation/electroflotation and nanofiltration process is investigated as a function of voltage, time, and initial pH, with aluminum electrodes as both anode and cathode. Moreover, a commercial NF270 flat-sheet membrane was also used for further purification. A Box-Behnken design combined with response surface methodology was used to study the response pattern and determine the optimum conditions for maximum chemical oxygen demand (COD) removal and minimum metal ion content of the bilge water. Three independent variables, namely voltage (5-15 V), initial pH (4.5-8.0) and time (30-90 min), were transformed to coded values. The COD removal percentage, UV absorbance at 254 nm, pH value (after treatment), and concentrations of metal ions (Ti, As, Cu, Cr, Zn, Sr, Mo) were obtained as responses. Analysis of variance results showed that all the models were significant except for Zn (P > 0.05), because the calculated F values for these models were less than the critical F value for the considered probability (P = 0.05). The obtained R(2) and Radj(2) values signified the correlation between the experimental data and predicted responses: except for the model of Zn concentration after treatment, the high R(2) values showed the goodness of fit of the model. While the increase in the applied voltage showed negative effects, the increases in time and pH showed a positive effect on COD removal efficiency; the most effective linear term was found to be time. A positive sign of the interactive coefficients of the voltage-time and pH-time systems indicated a synergistic effect on COD removal efficiency, whereas the interaction between voltage and pH showed an antagonistic
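
    For illustration, the snippet below fits the second-order polynomial model that response surface methodology pairs with a Box-Behnken design, y = b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj, using ordinary least squares; the coded design points and responses are synthetic placeholders, not the study's data.

    ```python
    import numpy as np

    def quadratic_design_matrix(X):
        """Full second-order model terms for three coded factors (voltage, pH, time)."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2,
                                x1*x2, x1*x3, x2*x3])

    X = np.random.choice([-1.0, 0.0, 1.0], size=(15, 3))          # mock coded design points
    y = 60 + 5 * X[:, 2] - 3 * X[:, 0] + np.random.randn(15)      # mock COD removal (%)
    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)   # fitted coefficients
    ```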

  4. Electrocoagulation and nanofiltration integrated process application in purification of bilge water using response surface methodology.

    PubMed

    Akarsu, Ceyhun; Ozay, Yasin; Dizge, Nadir; Elif Gulsen, H; Ates, Hasan; Gozmen, Belgin; Turabik, Meral

    2016-01-01

    Marine pollution is a growing concern because of the steady increase in sea transportation. A large volume of bilge water, which contains petroleum, oil and hydrocarbons in high concentrations, is therefore generated by all types of ships. In this study, treatment of bilge water by an integrated electrocoagulation/electroflotation and nanofiltration process is investigated as a function of voltage, time, and initial pH, with aluminum electrodes as both anode and cathode. Moreover, a commercial NF270 flat-sheet membrane was also used for further purification. A Box-Behnken design combined with response surface methodology was used to study the response pattern and determine the optimum conditions for maximum chemical oxygen demand (COD) removal and minimum metal ion content of the bilge water. Three independent variables, namely voltage (5-15 V), initial pH (4.5-8.0) and time (30-90 min), were transformed to coded values. The COD removal percentage, UV absorbance at 254 nm, pH value (after treatment), and concentrations of metal ions (Ti, As, Cu, Cr, Zn, Sr, Mo) were obtained as responses. Analysis of variance results showed that all the models were significant except for Zn (P > 0.05), because the calculated F values for these models were less than the critical F value for the considered probability (P = 0.05). The obtained R(2) and Radj(2) values signified the correlation between the experimental data and predicted responses: except for the model of Zn concentration after treatment, the high R(2) values showed the goodness of fit of the model. While the increase in the applied voltage showed negative effects, the increases in time and pH showed a positive effect on COD removal efficiency; the most effective linear term was found to be time. A positive sign of the interactive coefficients of the voltage-time and pH-time systems indicated a synergistic effect on COD removal efficiency, whereas the interaction between voltage and pH showed an antagonistic

  5. Comparison of breast percent density estimation from raw versus processed digital mammograms

    NASA Astrophysics Data System (ADS)

    Li, Diane; Gavenonis, Sara; Conant, Emily; Kontos, Despina

    2011-03-01

    We compared breast percent density (PD%) measures obtained from raw and post-processed digital mammographic (DM) images. Bilateral raw and post-processed medio-lateral oblique (MLO) images from 81 screening studies were retrospectively analyzed. Image acquisition was performed with a GE Healthcare DS full-field DM system. Image post-processing was performed using the PremiumViewTM algorithm (GE Healthcare). Area-based breast PD% was estimated by a radiologist using a semi-automated image thresholding technique (Cumulus, Univ. Toronto). Comparison of breast PD% between raw and post-processed DM images was performed using the Pearson correlation (r), linear regression, and Student's t-test. Intra-reader variability was assessed with a repeat read on the same data-set. Our results show that breast PD% measurements from raw and post-processed DM images have a high correlation (r=0.98, R2=0.95, p<0.001). Paired t-test comparison of breast PD% between the raw and the post-processed images showed a statistically significant difference equal to 1.2% (p = 0.006). Our results suggest that the relatively small magnitude of the absolute difference in PD% between raw and post-processed DM images is unlikely to be clinically significant in breast cancer risk stratification. Therefore, it may be feasible to use post-processed DM images for breast PD% estimation in clinical settings. Since most breast imaging clinics routinely use and store only the post-processed DM images, breast PD% estimation from post-processed data may accelerate the integration of breast density in breast cancer risk assessment models used in clinical practice.
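
    A brief sketch of the statistical comparison described above (Pearson correlation, linear regression and a paired t-test between raw and post-processed PD% values) using SciPy; the arrays are synthetic stand-ins for the 81-study measurements.

    ```python
    import numpy as np
    from scipy import stats

    raw_pd = np.random.uniform(5, 60, 81)                        # placeholder PD% from raw images
    processed_pd = raw_pd + np.random.normal(1.2, 2.0, 81)       # placeholder PD% from post-processed images

    r, p_corr = stats.pearsonr(raw_pd, processed_pd)             # correlation between the two measures
    slope, intercept, r_lin, p_lin, stderr = stats.linregress(raw_pd, processed_pd)
    t_stat, p_paired = stats.ttest_rel(raw_pd, processed_pd)     # paired comparison of the means
    print(r, slope, p_paired)
    ```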

  6. A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process

    ERIC Educational Resources Information Center

    Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.

    2010-01-01

    Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…

  7. The Process of Creation: A Novel Methodology for Analyzing Multimodal Data

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld; Bass, Michelle; Woods, David

    2012-01-01

    In the 21st century, meaning making is a multimodal act; we communicate what we know and how we know it using much more than printed text on a blank page. As a result, qualitative researchers need new methodologies, methods, and tools for working with the complex artifacts that our research subjects produce. In this article we describe the…

  8. Identities, Roles and Iterative Processes: Methodological Reflections from Research on Literacy among Gypsies and Travellers

    ERIC Educational Resources Information Center

    McCaffery, Juliet

    2014-01-01

    In this article the author reflects on some of the methodological issues of conducting research in a local marginalised community in the UK. Her research was on attitudes to literacy in the Gypsy and Traveller community in southern England. This article describes some of the challenges and how she, as an outsider and not a member of their…

  9. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    ERIC Educational Resources Information Center

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  10. A new approach to pre-processing digital image for wavelet-based watermark

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. It is therefore strategic to identify and develop methods and numerical algorithms, stable and of low computational cost, that provide a solution to these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, non-blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and its good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to a pre-processing step that includes resizing techniques which adapt the original image size to the wavelet transform. The watermark signal is calculated in correlation with the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked images according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the method to be resistant against geometric, filtering, and StirMark attacks, with a low false-alarm rate.
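
    A hedged sketch of additive watermark embedding in the wavelet domain with PyWavelets; the 'haar' wavelet, the embedding strength and the choice of the diagonal detail band are illustrative assumptions and do not reproduce the paper's algorithm.

    ```python
    import numpy as np
    import pywt

    def embed_watermark(image, key=0, alpha=0.05):
        """Embed a key-dependent watermark into the diagonal detail coefficients."""
        cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
        rng = np.random.default_rng(key)
        mark = rng.standard_normal(cD.shape)             # key-dependent watermark signal
        cD_marked = cD + alpha * np.abs(cD) * mark       # strength adapted to local coefficient magnitude
        return pywt.idwt2((cA, (cH, cV, cD_marked)), "haar"), mark

    img = np.random.rand(256, 256)                       # placeholder grayscale image
    watermarked, mark = embed_watermark(img)
    ```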

  11. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2016-07-01

    Traditional aggregate size measuring methods are time-consuming, taxing, and do not deliver online measurements. A new online detection system for determining aggregate size and shape based on a digital camera with a charge-coupled device, and subsequent digital image processing, have been developed to overcome these problems. The system captures images of aggregates while falling and flat lying. Using these data, the particle size and shape distribution can be obtained in real time. Here, we calibrate this method using standard globules. Our experiments show that the maximum particle size distribution error was only 3 wt%, while the maximum particle shape distribution error was only 2 wt% for data derived from falling aggregates, having good dispersion. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt%, and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. However, they performed well for single-graded aggregates, and did not require a dispersion device. Our system is low-cost and easy to install. It can successfully achieve online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.
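
    A minimal sketch of the kind of per-particle size and shape measurement such a system performs, using OpenCV contour analysis on a binary image; the synthetic test image and the chosen descriptors are assumptions.

    ```python
    import cv2
    import numpy as np

    img = np.zeros((480, 640), np.uint8)
    cv2.circle(img, (200, 240), 40, 255, -1)             # synthetic round "aggregate"
    cv2.ellipse(img, (450, 240), (60, 30), 20, 0, 360, 255, -1)   # synthetic elongated "aggregate"

    contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
    for c in contours:
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, True)
        equiv_diameter = np.sqrt(4 * area / np.pi)        # size descriptor
        circularity = 4 * np.pi * area / perimeter ** 2   # shape descriptor (1.0 = circle)
        print(equiv_diameter, circularity)
    ```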

  12. 77 FR 21538 - Announcing DRAFT Revisions to Federal Information Processing Standard (FIPS) 186-3, Digital...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-10

    ... algorithm (DSA) to generate and verify digital signatures. Later revisions (FIPS 186-1, FIPS 186-2, and FIPS 186-3, adopted in 1998, 1999 and 2009, respectively) adopted two additional algorithms: The Elliptic Curve Digital Signature Algorithm (ECDSA) and the RSA digital signature algorithm. NIST is...

  13. Digital Archiving and Preservation: Technologies and Processes for a Trusted Repository

    ERIC Educational Resources Information Center

    Jantz, Ronald; Giarlo, Michael

    2006-01-01

    This article examines what is implied by the term "trusted" in the phrase "trusted digital repositories." Digital repositories should be able to preserve electronic materials for periods at least comparable to existing preservation methods. Our collective lack of experience with preserving digital objects and consensus about the reliability of our…

  14. A method for the processing and analysis of digital terrain elevation data. [Shiprock and Gallup Quadrangles, Arizona and New Mexico

    NASA Technical Reports Server (NTRS)

    Junkin, B. G. (Principal Investigator)

    1979-01-01

    A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
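
    A small NumPy sketch of deriving slope and aspect grids from elevation data, analogous to the parameters entered into the data base; the cell size, aspect convention and synthetic DEM are assumptions, not the Shiprock or Gallup Quadrangle data.

    ```python
    import numpy as np

    def slope_aspect(dem, cell_size):
        """Slope (degrees) and aspect (degrees, assumed clockwise from north) from an elevation grid."""
        dz_dy, dz_dx = np.gradient(dem, cell_size)                  # elevation gradients along rows, columns
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
        aspect_deg = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
        return slope_deg, aspect_deg

    dem = np.add.outer(np.linspace(1500, 1600, 100), np.linspace(0, 50, 100))   # synthetic ramp, metres
    slope, aspect = slope_aspect(dem, cell_size=30.0)                           # assumed 30 m cells
    ```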

  15. Low Power and Robust Domino Circuit with Process Variations Tolerance for High Speed Digital Signal Processing

    NASA Astrophysics Data System (ADS)

    Wang, Jinhui; Peng, Xiaohong; Li, Xinxin; Hou, Ligang; Wu, Wuchen

    Utilizing the sleep switch transistor technique and the dual threshold voltage technique, a source following evaluation gate (SEFG) based domino circuit is presented in this paper for simultaneously suppressing the leakage current and enhancing noise immunity. Simulation results show that the leakage current of the proposed design can be reduced by 43%, 62%, and 67% while improving the noise margin by 19.7%, 3.4%, and 12.5% as compared to the standard low threshold voltage circuit, the standard dual threshold voltage circuit, and the SEFG structure, respectively. Also, the static-state leakage current characteristic dependent on the combination of input and clock signals is analyzed, and the minimum leakage states of different domino AND gates are obtained. Finally, the leakage power characteristic under process variations is discussed.

  16. The Application of Six Sigma Methodologies to University Processes: The Use of Student Teams

    ERIC Educational Resources Information Center

    Pryor, Mildred Golden; Alexander, Christine; Taneja, Sonia; Tirumalasetty, Sowmya; Chadalavada, Deepthi

    2012-01-01

    The first student Six Sigma team (activated under a QEP Process Sub-team) evaluated the course and curriculum approval process. The goal was to streamline the process and thereby shorten process cycle time and reduce confusion about how the process works. Members of this team developed flowcharts on how the process is supposed to work (by…

  17. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the ascent test (OFT) configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation modeling of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  18. The application of digital computers to near-real-time processing of flutter test data

    NASA Technical Reports Server (NTRS)

    Hurley, S. R.

    1976-01-01

    Procedures used in monitoring, analyzing, and displaying flight and ground flutter test data are presented. These procedures include three digital computer programs developed to process structural response data in near real time. Qualitative and quantitative modal stability data are derived from time-history response data resulting from rapid sinusoidal frequency sweep forcing functions, tuned-mode quick stops, and pilot-induced control pulses. The techniques have been applied to both fixed- and rotary-wing aircraft during flight, whirl tower rotor system tests, and wind tunnel flutter model tests. A hydraulically driven oscillatory aerodynamic vane excitation system utilized during the flight flutter test programs accomplished during Lockheed L-1011 and S-3A development is described.

  19. [A modified speech enhancement algorithm for electronic cochlear implant and its digital signal processing realization].

    PubMed

    Wang, Yulin; Tian, Xuelong

    2014-08-01

    In order to improve the speech quality and auditory perception of electronic cochlear implants under strong background noise, a speech enhancement system for the electronic cochlear implant front-end was constructed. Taking digital signal processing (DSP) as the core, the system combines its multi-channel buffered serial port (McBSP) data transmission channel with the extended audio interface chip TLV320AIC10, so that high-speed speech signal acquisition and output are realized. Meanwhile, because the traditional speech enhancement method suffers from poor adaptability, slow convergence speed and large steady-state error, the versiera function and the de-correlation principle were used to improve the existing adaptive filtering algorithm, which effectively enhanced the quality of voice communication. Test results verified the stability of the system and the de-noising performance of the algorithm, and also showed that it can provide clearer speech signals for deaf or tinnitus patients.
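
    For context, a baseline normalized-LMS adaptive noise canceller of the kind the paper improves upon; it is not the versiera/de-correlation variant described above, and the filter order, step size and signals are assumptions.

    ```python
    import numpy as np

    def nlms_cancel(primary, reference, order=32, mu=0.5, eps=1e-6):
        """primary = speech + noise; reference = correlated noise pickup; returns enhanced speech."""
        w = np.zeros(order)
        out = np.zeros(len(primary))
        for n in range(order - 1, len(primary)):
            x = reference[n - order + 1:n + 1][::-1]     # most recent reference samples first
            e = primary[n] - np.dot(w, x)                # enhanced speech sample
            w += (mu / (eps + np.dot(x, x))) * e * x     # normalized LMS weight update
            out[n] = e
        return out

    noise = np.random.randn(8000)
    speech = np.sin(2 * np.pi * 0.01 * np.arange(8000))  # placeholder "speech"
    enhanced = nlms_cancel(speech + 0.8 * noise, noise)
    ```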

  20. Versatile self-reconfigurable digital processing platform for satellite and aerospace applications

    NASA Astrophysics Data System (ADS)

    Cichocki, A.; Nowosielski, W.; Orleanski, P.

    2012-05-01

    This document presents the concept and implementation of a reconfigurable digital processing platform for airborne and satellite systems. Some recent trends in the technology development of on-board electronics were taken into consideration during the conceptual phase of the design, namely the use of commercial-off-the-shelf (COTS) components, the utilization of FPGAs, common interfaces and system re-programmability. On the other hand, reliability remains a constant and crucial challenge for these types of applications. The key feature of the described prototype device is a fusion of two different approaches: static functionality and the ability to self-reconfigure on the fly, while retaining high system availability, especially when the configuration is altered by space radiation.

  1. Spectral analysis of laser Doppler signals in real time using digital processing.

    PubMed

    Dougherty, G

    1994-01-01

    A versatile spectrum analyser was developed to generate and display laser Doppler shift signals, and derived parameters, continuously in real time using a digital signal processing chip. A major attraction of the system is that it is entirely programmable, so that both the algorithms and the attributes of the system, such as window function and frame overlap, can be easily altered. It was used to investigate the relative merits of a variety of algorithms using a blood-flow phantom. An index based on the first moment of the Doppler power spectrum was found to be the most reliable flow indicator, with linearity extending towards a velocity of 5 mm s-1 for a blood haematocrit of 5%. The system is not limited to analysis based on the fast Fourier transform (FFT), and is suitable for non-linear techniques such as maximum entropy spectral estimation (MESE). PMID:8162263
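
    A small sketch of the first-moment flow index described above: the power-weighted mean Doppler frequency of an FFT power spectrum. The sampling rate, window choice and test signal are assumptions.

    ```python
    import numpy as np

    def first_moment_index(x, fs):
        """Power-weighted mean frequency of the Doppler power spectrum."""
        window = np.hanning(len(x))
        spectrum = np.abs(np.fft.rfft(x * window)) ** 2
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return np.sum(freqs * spectrum) / np.sum(spectrum)   # proportional to mean velocity

    fs = 20000.0
    t = np.arange(1024) / fs
    x = np.sin(2 * np.pi * 1500 * t) + 0.3 * np.random.randn(t.size)   # placeholder Doppler signal
    print(first_moment_index(x, fs))
    ```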

  2. [A modified speech enhancement algorithm for electronic cochlear implant and its digital signal processing realization].

    PubMed

    Wang, Yulin; Tian, Xuelong

    2014-08-01

    In order to improve the speech quality and auditory perception of electronic cochlear implants under strong background noise, a speech enhancement system for the electronic cochlear implant front-end was constructed. Taking digital signal processing (DSP) as the core, the system combines its multi-channel buffered serial port (McBSP) data transmission channel with the extended audio interface chip TLV320AIC10, so that high-speed speech signal acquisition and output are realized. Meanwhile, because the traditional speech enhancement method suffers from poor adaptability, slow convergence speed and large steady-state error, the versiera function and the de-correlation principle were used to improve the existing adaptive filtering algorithm, which effectively enhanced the quality of voice communication. Test results verified the stability of the system and the de-noising performance of the algorithm, and also showed that it can provide clearer speech signals for deaf or tinnitus patients. PMID:25464779

  3. [A modified speech enhancement algorithm for electronic cochlear implant and its digital signal processing realization].

    PubMed

    Wang, Yulin; Tian, Xuelong

    2014-08-01

    In order to improve the speech quality and auditory perception of electronic cochlear implants under strong background noise, a speech enhancement system for the electronic cochlear implant front-end was constructed. Taking digital signal processing (DSP) as the core, the system combines its multi-channel buffered serial port (McBSP) data transmission channel with the extended audio interface chip TLV320AIC10, so that high-speed speech signal acquisition and output are realized. Meanwhile, because the traditional speech enhancement method suffers from poor adaptability, slow convergence speed and large steady-state error, the versiera function and the de-correlation principle were used to improve the existing adaptive filtering algorithm, which effectively enhanced the quality of voice communication. Test results verified the stability of the system and the de-noising performance of the algorithm, and also showed that it can provide clearer speech signals for deaf or tinnitus patients. PMID:25508410

  4. Families of atomic functions ch_{a,n}(x) and fup_n(x) in digital signal processing

    NASA Astrophysics Data System (ADS)

    Kravchenko, V. F.; Konovalov, Ya. Yu.; Pustovoit, V. I.

    2015-05-01

    A new class of weight functions constructed on the basis of the families of atomic functions ch_{a,n}(x) and fup_n(x) is proposed and substantiated. The study consists of three parts. In the first part, the definition of atomic functions and their convolutions is presented. In the second part, a new family of atomic functions ch_{a,n}(x) is defined as convolutions of h_a(x). In the third part, a family of weight functions is constructed by truncating ch_{a,n}(x) to its effective support. If smoothing with classical windows is applied after the truncation, combined weight functions are obtained. The physical characteristics of the weight functions constructed by direct truncation and in combination with the Hamming and Riesz windows are presented. The resulting functions can find wide application in digital signal processing, image restoration, radar, radiometry, radio astronomy, remote sensing, and other physical domains.

  5. Customisable-Off-The-Shelf Digital Signal Processing Board for Space Applications

    NASA Astrophysics Data System (ADS)

    Reichinger, H.; Süst, M.

    The space market, driven by tight budgets, is increasingly asking for Commercial Off The Shelf (COTS) solutions. For spaceborne Digital Signal Processing (DSP) boards, the usage of COTS products is often hampered by requirements for functionality or performance not originally provided by the board. Later implementation of such missing functionality easily consumes the initial cost advantage of COTS. AAE's new Customisable-Off-The-Shelf DSP board family, called USPM, circumvents this pitfall. The USPM can certainly be used in its standard configuration off the shelf. Customisation of the USPM hardware, software and test equipment is supported by a modular design approach and extensive configuration options. Additional hardware functionality can be accommodated in an FPGA on the board or on a piggyback module. The USPM product family will be available in all space quality levels. A special configuration avoiding any parts requiring a US export licence is also available.

  6. On Digital Simulation of Multicorrelated Random Processes and Its Applications. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sinha, A. K.

    1973-01-01

    Two methods are described to simulate, on a digital computer, a set of correlated, stationary, and Gaussian time series with zero mean from the given matrix of power spectral densities and cross spectral densities. The first method is based upon trigonometric series with random amplitudes and deterministic phase angles. The random amplitudes are generated by using a standard random number generator subroutine. An example is given which corresponds to three components of wind velocities at two different spatial locations for a total of six correlated time series. In the second method, the whole process is carried out using the Fast Fourier Transform approach. This method gives more accurate results and works about twenty times faster for a set of six correlated time series.
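
    A simplified single-series illustration of the FFT-based approach: a zero-mean Gaussian time series is synthesised from a target one-sided PSD by assigning random phases. The multicorrelated case additionally factorises the cross-spectral density matrix at each frequency (e.g., by Cholesky decomposition); the PSD shape used here is an assumption.

    ```python
    import numpy as np

    def synthesize(psd, fs):
        """Generate one realisation of a Gaussian series from a one-sided PSD via the inverse FFT."""
        n = 2 * (len(psd) - 1)
        df = fs / n
        amp = np.sqrt(psd * df / 2.0)                              # per-bin RMS amplitude
        phases = np.exp(1j * 2 * np.pi * np.random.rand(len(psd))) # independent random phases
        spec = n * amp * phases
        spec[0] = 0.0                                              # enforce zero mean
        return np.fft.irfft(spec, n)

    fs = 100.0
    freqs = np.fft.rfftfreq(4096, d=1.0 / fs)
    psd = 1.0 / (1.0 + (freqs / 5.0) ** 2)                         # assumed low-pass, wind-like spectrum
    series = synthesize(psd, fs)
    ```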

  7. A 'user friendly' geographic information system in a color interactive digital image processing system environment

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Goldberg, M.

    1982-01-01

    NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.

  8. DIGITAL PROCESSING TECHNIQUES FOR IMAGE MAPPING WITH LANDSAT TM AND SPOT SIMULATOR DATA.

    USGS Publications Warehouse

    Chavez, Pat S., Jr.

    1984-01-01

    To overcome certain problems associated with the visual selection of Landsat TM bands for image mapping, the author used a quantitative technique that ranks the 20 possible three-band combinations based upon their information content. Standard deviations and correlation coefficients can be used to compute a value called the Optimum Index Factor (OIF) for each of the 20 possible combinations. SPOT simulator images were digitally processed and compared with Landsat-4 Thematic Mapper (TM) images covering a semi-arid region in northern Arizona and a highly vegetated urban area near Washington, D.C. Statistical comparisons indicate that more radiometric or color information exists in certain TM three-band combinations than in the three SPOT bands.
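
    A short sketch of ranking three-band combinations by the Optimum Index Factor, computed here as the sum of the band standard deviations divided by the sum of the absolute pairwise correlation coefficients; the random arrays stand in for actual TM imagery.

    ```python
    from itertools import combinations
    import numpy as np

    bands = {i: np.random.rand(100, 100) for i in (1, 2, 3, 4, 5, 7)}   # placeholder non-thermal TM bands

    def oif(triplet):
        """Optimum Index Factor: sum of standard deviations over sum of |correlations|."""
        stds = sum(bands[b].std() for b in triplet)
        corrs = sum(abs(np.corrcoef(bands[a].ravel(), bands[b].ravel())[0, 1])
                    for a, b in combinations(triplet, 2))
        return stds / corrs

    ranking = sorted(combinations(bands, 3), key=oif, reverse=True)     # 20 combinations, best first
    print(ranking[:3])
    ```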

  9. Two dimensional recursive digital filters for near real time image processing

    NASA Technical Reports Server (NTRS)

    Olson, D.; Sherrod, E.

    1980-01-01

    A program was designed to demonstrate the feasibility of using two-dimensional recursive digital filters for subjective image processing applications that require rapid turnaround. The use of a dedicated minicomputer as the processor for this application was demonstrated. The minicomputer used was the HP 1000 series E with an RTE-2 disc operating system and 32K words of memory. A Grinnel 256 x 512 x 8 bit display system was used to display the images. Sample images were provided by NASA Goddard on an 800 BPI, 9-track tape. Four 512 x 512 images representing four spectral regions of the same scene were provided. These images were filtered with enhancement filters developed during this effort.
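
    A minimal example of a separable two-dimensional recursive (IIR) filter: a first-order smoother run along the rows and then along the columns of an image with SciPy. The coefficient is an assumption and the original enhancement filters are not reproduced here.

    ```python
    import numpy as np
    from scipy import signal

    def recursive_smooth(image, a=0.7):
        """Separable first-order recursion: y[n] = (1-a) x[n] + a y[n-1], rows then columns."""
        b = [1.0 - a]
        den = [1.0, -a]
        rows = signal.lfilter(b, den, image, axis=1)
        return signal.lfilter(b, den, rows, axis=0)

    img = np.random.rand(512, 512)        # placeholder 512 x 512 image
    smoothed = recursive_smooth(img)
    ```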

  10. Digital signal processing approaches for semiconductor phase noise tolerant coherent transmission systems

    NASA Astrophysics Data System (ADS)

    Iglesias Olmedo, Miguel; Pang, Xiaodan; Schatz, Richard; Zibar, Darko; Tafur Monroy, Idelfonso; Jacobsen, Gunnar; Popov, Sergei

    2015-01-01

    We discuss digital signal processing approaches that can enable coherent links based on semiconductor lasers. A state-of-the-art analysis of different carrier-phase recovery (CPR) techniques is presented. We show that these techniques are based on the assumption of a Lorentzian linewidth, which does not hold for monolithically integrated semiconductor lasers. We investigate the impact of such a lineshape on both the 3 dB and 20 dB linewidths and experimentally conduct a systematic study for 56-GBaud DP-QPSK and 28-GBaud DP-16QAM systems using a decision-directed phase-locked loop algorithm. We show how carrier-induced frequency noise has no impact on linewidth but a significant impact on system performance, which raises the question of whether the 3-dB linewidth should be used as a performance estimator for semiconductor lasers.

  11. Operation and performance of a longitudinal damping system using parallel digital signal processing

    SciTech Connect

    Fox, J.D.; Hindi, H.; Linscott, I.

    1994-06-01

    A programmable longitudinal feedback system based on four AT&T 1610 digital signal processors has been developed as a component of the PEP-II R&D program. This Longitudinal Quick Prototype is a proof of concept for the PEP-II system and implements full speed bunch-by-bunch signal processing for storage rings with bunch spacings of 4 ns. The design implements, via software, a general purpose feedback controller which allows the system to be operated at several accelerator facilities. The system configuration used for tests at the LBL Advanced Light Source is described. Open and closed loop results showing the detection and calculation of feedback signals from bunch motion are presented, and the system is shown to damp coupled-bunch instabilities in the ALS. Use of the system for accelerator diagnostics is illustrated via measurement of injection transients and analysis of open loop bunch motion.

  12. Automated identification of copepods using digital image processing and artificial neural network

    PubMed Central

    2015-01-01

    Background Copepods are planktonic organisms that play a major role in the marine food chain. Studying the community structure and abundance of copepods in relation to the environment is essential to evaluate their contribution to mangrove trophodynamics and coastal fisheries. The routine identification of copepods can be very technical, requiring taxonomic expertise, experience and much effort, and can be very time-consuming. Hence, there is an urgent need to introduce novel methods and approaches to automate the identification and classification of copepod specimens. This study aims to apply digital image processing and machine learning methods to build an automated identification and classification technique. Results We developed an automated technique to extract morphological features of copepod specimens from captured images using digital image processing techniques. An Artificial Neural Network (ANN) was used to classify the copepod specimens from the species Acartia spinicauda, Bestiolina similis, Oithona aruensis, Oithona dissimilis, Oithona simplex, Parvocalanus crassirostris, Tortanus barbatus and Tortanus forcipatus based on the extracted features. 60% of the dataset was used for training a two-layer feed-forward network and the remaining 40% was used as the testing dataset for system evaluation. Our approach demonstrated an overall classification accuracy of 93.13% (100% for A. spinicauda, B. similis and O. aruensis, 95% for T. barbatus, 90% for O. dissimilis and P. crassirostris, 85% for O. similis and T. forcipatus). Conclusions The methods presented in this study enable fast classification of copepods to the species level. Future studies should include more classes in the model, improve the selection of features, and reduce the time to capture the copepod images. PMID:26678287
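
    A hedged sketch of the classification stage: a two-layer feed-forward network trained on morphological feature vectors with a 60/40 split, using scikit-learn. The feature values, class labels and network size are placeholders; the real features come from the image processing step.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X = np.random.rand(400, 7)                       # placeholder: 7 shape descriptors per specimen
    y = np.random.randint(0, 8, size=400)            # placeholder labels for 8 copepod species

    X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.6, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)   # two-layer feed-forward net
    clf.fit(X_train, y_train)
    print("accuracy:", clf.score(X_test, y_test))    # evaluation on the 40% hold-out set
    ```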

  13. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

    Digital speckle pattern shearing interferometry, or speckle shearography, is widely adopted in many areas where one needs to measure in-plane and out-of-plane micro-displacements in biological and non-biological objects; it is based on the Michelson interferometer and uses a piezoelectric transducer (PZT) to phase-shift the fringes and thereby improve the quality of the final image. Despite its widespread use, creating the shifted images with a PZT has some drawbacks or limitations, such as the cost of the apparatus, the difficulty of applying the same mirror displacement repeatedly, and the fact that phase-shifting cannot be used in dynamic object measurement. The aim of this work was to create phase-shifted images digitally, avoiding the mechanical adjustments of the PZT, and to test them with the digital shearography method. The methodology was tested using a well-known object, a cantilever beam of aluminium under deformation. The results documented the ability to create deformation maps and curves with reliability and sensitivity, reducing the cost and improving the robustness and also the accessibility of digital speckle pattern shearing interferometry.

  14. A new general methodology for incorporating physico-chemical transformations into multi-phase wastewater treatment process models.

    PubMed

    Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P

    2015-05-01

    This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example a mathematical model has been constructed to describe a system for biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery.

  15. Methodology to Establish Associations between Data and Clinical Assessment for Computerized Nursing Process in Intensive Care Units.

    PubMed

    Couto Carvalho Barra, Daniela; Marcon Dal Sasso, Grace Teresinha; Paese, Fernanda

    2015-01-01

    Combining Information and Communication Technologies (ICT) with the Nursing Process (NP) is a way to support its development in health contexts. The alliance between ICT and the NP integrates and organizes a logical structure of data and clinical information, supporting nurses in decision-making. This manuscript describes the methodology used to articulate the clinical data and information of the Computerized Nursing Process (CNP), according to ICNP® 2.0, associating the detailed clinical assessment of each human system with its diagnoses, interventions, and patient outcomes. This is a methodological study and technological production, conducted in 2010 and developed in three stages. It was possible to restructure the CNP from the associations between the data and clinical information of all human systems (cardiovascular, neurological, respiratory, renal, gastrointestinal, cutaneous, musculoskeletal, female/male and biopsychosocial) and their nursing diagnoses, interventions, and outcomes. PMID:26262243

  16. A new general methodology for incorporating physico-chemical transformations into multi-phase wastewater treatment process models.

    PubMed

    Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P

    2015-05-01

    This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example, a mathematical model has been constructed to describe a system involving biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery. PMID:25746499

  17. Snow process monitoring in mountain forest environments with a digital camera network

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu; Menzel, Lucas

    2016-04-01

    Snow processes are important components of the hydrologic cycle in mountainous areas and at high latitudes. Sparse observations in remote regions, in combination with complex topography, local climate specifics and heterogeneous vegetation cover, complicate a detailed investigation of snow-related processes. In this study, a camera network is applied to monitor complex snow processes with high temporal resolution in montane forest environments (800-1200 m a.s.l.) in southwestern Germany. A typical feature of this region is the high temporal variability of weather conditions, with frequent snow accumulation and ablation processes and recurrent snow interception on conifers. We developed a semi-automatic procedure to interpret snow depths from the digital images, which shows high consistency with manual readings and station-based measurements. To extract canopy snow interception dynamics from the images, six binary classification methods are compared. The MaxEntropy classifier shows markedly better performance than the others under various illumination conditions and is therefore selected to quantify snow interception. The snow accumulation and ablation processes on the ground, as well as snow loading and unloading in forest canopies, are investigated based on the snow parameters derived from the time-lapse photography. In addition, the influences of meteorological conditions, forest cover and elevation on snow processes are considered. Further, our investigations serve to improve the snow and interception modules of a hydrological model. Time-lapse photography proves to be an effective and low-cost approach to collect useful snow-related information, supporting our understanding of snow processes and the further development of hydrological models. We will present selected results from our investigations over two consecutive winters.
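
    The sketch below is a hedged, generic realisation of a maximum-entropy ("MaxEntropy") binary classifier, here Kapur's entropy threshold applied to a greyscale image to produce a snow/no-snow mask; the study's actual classifier, features and training data are not reproduced.

import numpy as np

def max_entropy_threshold(img_u8):
    """Return the grey level maximising background + foreground entropy (Kapur)."""
    hist = np.bincount(img_u8.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    P = np.cumsum(p)                       # cumulative probability
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        if P[t] == 0.0 or P[t] == 1.0:
            continue
        pb = p[:t + 1] / P[t]              # normalised background distribution
        pf = p[t + 1:] / (1.0 - P[t])      # normalised foreground distribution
        hb = -np.sum(pb[pb > 0] * np.log(pb[pb > 0]))
        hf = -np.sum(pf[pf > 0] * np.log(pf[pf > 0]))
        if hb + hf > best_h:
            best_t, best_h = t, hb + hf
    return best_t

# snow_mask = image > max_entropy_threshold(image)   # image: uint8 greyscale frame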

  18. All-digital multicarrier demodulators for on-board processing satellites in mobile communication systems

    NASA Astrophysics Data System (ADS)

    Yim, Wan Hung

    Economical operation of future satellite systems for mobile communications can only be achieved by using dedicated on-board processing satellites, which would allow both cheap earth terminals and lower space segment costs. With on-board modems and codecs, the up-link and down-link can be optimized separately. An attractive scheme is to use frequency-division multiple access/single channel per carrier (FDMA/SCPC) on the up-link and time division multiplexing (TDM) on the down-link. This scheme allows mobile terminals to transmit a narrowband, low-power signal, resulting in smaller dishes and high-power amplifiers (HPAs) with lower output power. On the up-link, there are hundreds to thousands of FDM channels to be demodulated on-board. The most promising approach is the use of all-digital multicarrier demodulators (MCDs), where analog and digital hardware are efficiently shared among channels, and digital signal processing (DSP) is used at an early stage to take advantage of very large scale integration (VLSI) implementation. An MCD consists of a channellizer for separation of the frequency division multiplexing (FDM) channels, followed by individual demodulators for each channel. The major research areas in MCDs are multirate DSP and optimal estimation for synchronization, which form the basis of the thesis. Complex signal theories are central to the development of structured approaches for the sampling and processing of bandpass signals, which are the foundation of both channellizer and demodulator design. In multirate DSP, polyphase theories replace many ad hoc, tedious and error-prone design procedures. For example, a polyphase-matrix discrete Fourier transform (DFT) channellizer includes all efficient filter bank techniques as special cases. Also, a polyphase-lattice filter is derived, not only for sampling rate conversion but also capable of sampling-phase variation, which is required for symbol timing adjustment in all-digital
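
    For orientation, the following sketch channelizes a complex baseband FDM signal the straightforward way, by per-channel frequency shifting, low-pass filtering and decimation; a polyphase DFT filter bank of the kind discussed in the thesis produces the same set of channel outputs while sharing a single prototype filter and one FFT across all channels. Filter length, cutoff and channel spacing are illustrative assumptions.

import numpy as np
from scipy.signal import firwin, lfilter

def channelize(x, n_ch, fs):
    """Split a complex baseband FDM signal into n_ch equally spaced channels."""
    h = firwin(n_ch * 8, cutoff=0.5 * fs / n_ch, fs=fs)   # prototype low-pass filter
    n = np.arange(len(x))
    out = []
    for k in range(n_ch):
        fk = k * fs / n_ch                                # channel centre frequency
        shifted = x * np.exp(-2j * np.pi * fk * n / fs)   # shift channel k to baseband
        out.append(lfilter(h, [1.0], shifted)[::n_ch])    # filter, then decimate by n_ch
    return np.array(out)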

  19. Modified Methodology for the Quench Temperature Selection in Quenching and Partitioning (Q&P) Processing of Steels

    NASA Astrophysics Data System (ADS)

    Seo, Eun Jung; Cho, Lawrence; De Cooman, Bruno C.

    2016-08-01

    The original method for selecting the optimum quench temperature in quenching and partitioning (Q&P) processing aims to determine the quench temperature that yields the maximum volume fraction of retained austenite. In the present study, the original method was reviewed and refined by comparison with experimental results. The proposed methodology is based on the use of a modified Koistinen-Marburger equation for the kinetics of the athermal martensite transformation of steels containing C, Mn, Si, Cr, and B.
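
    As a hedged illustration of the underlying selection logic (not the paper's modified model), the sketch below combines the classical Koistinen-Marburger relation with an idealised full-carbon-partitioning assumption to locate the quench temperature that maximises retained austenite; the rate constant 0.011 /°C, the truncated Andrews-type Ms relation and the example composition are typical literature values assumed only for illustration.

import numpy as np

def ms_estimate(c, mn):
    """Truncated Andrews-type Ms relation (C and Mn terms only), in degC."""
    return 539.0 - 423.0 * c - 30.4 * mn

def retained_austenite(qt, c0=0.2, mn=1.5, alpha=0.011, t_room=25.0):
    ms1 = ms_estimate(c0, mn)
    f_m1 = 1.0 - np.exp(-alpha * (ms1 - qt))      # KM: primary martensite formed at QT
    f_a1 = 1.0 - f_m1                             # austenite remaining at QT
    c_aust = c0 / f_a1                            # idealised full carbon partitioning
    ms2 = ms_estimate(c_aust, mn)                 # Ms of the carbon-enriched austenite
    f_m2 = f_a1 * (1.0 - np.exp(-alpha * (ms2 - t_room))) if ms2 > t_room else 0.0
    return f_a1 - f_m2                            # austenite retained at room temperature

qts = np.linspace(150.0, 350.0, 201)
fractions = [retained_austenite(qt) for qt in qts]
print("optimum quench temperature ~", qts[int(np.argmax(fractions))], "degC")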

  20. Digital signal processing for a thermal neutron detector using ZnS(Ag):⁶LiF scintillating layers read out with WLS fibers and SiPMs

    NASA Astrophysics Data System (ADS)

    Mosset, J.-B.; Stoykov, A.; Greuter, U.; Hildebrandt, M.; Schlumpf, N.

    2016-07-01

    We present a digital signal processing system based on a photon counting approach, developed for a thermal neutron detector consisting of ZnS(Ag):⁶LiF scintillating layers read out with WLS fibers and SiPMs. Three digital filters have been evaluated: a moving sum, a moving sum after differentiation, and a digital CR-RC4 filter. The performance of the detector with each of these filters is presented. A fully analog signal processing chain using a CR-RC4 filter has been emulated digitally, and the detector performance obtained with this analog approach is compared with that obtained with the best-performing digital approach.
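
    A minimal sketch of two of the evaluated filter types, under assumed sampling-rate and shaping-time values: a moving sum, and a digital CR-RC4 shaper obtained by bilinear transform of the analog transfer function H(s) = sτ/(1 + sτ)⁵.

import numpy as np
from scipy.signal import bilinear, lfilter

fs = 50e6                    # sampling rate (assumed)
tau = 200e-9                 # shaping time constant (assumed)

def moving_sum(x, n):
    return lfilter(np.ones(n), [1.0], x)

def cr_rc4(x):
    num = [tau, 0.0]                                  # CR differentiator: s*tau
    den = np.poly([-1.0 / tau] * 5) * tau**5          # (1 + s*tau)^5: one CR + four RC poles
    b, a = bilinear(num, den, fs=fs)                  # analog -> digital via bilinear transform
    return lfilter(b, a, x)

# Example: shape a train of unit impulses standing in for single-photon pulses
x = np.zeros(4000); x[[500, 1500, 1600]] = 1.0
y_ms, y_cr = moving_sum(x, 32), cr_rc4(x)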

  1. Methodology and findings of the NRC's materials licensing process redesign

    SciTech Connect

    Rathbun, P.A.; Brown, K.D.; Madera, J.R.; Moriarty, M.; Pelchat, J.M.; Usilton, W.K.; Whitten, J.E.; Vacca, P.C.

    1996-04-01

    This report describes the work and vision of the team chartered to redesign the process for licensing users of nuclear materials. The Business Process Redesign team was chartered to improve the speed of the existing licensing process while maintaining or improving public safety and to achieve required resource levels. The report describes the team's methods for acquiring and analyzing information about the existing materials licensing process and the steps necessary to radically change this process to the envisioned future process.

  2. Digital Archive of UkrVO: first results of MAO NASU Solar System Bodies photographic plate processing

    NASA Astrophysics Data System (ADS)

    Ivanov, G.; Pakuliak, L.; Shatokhina, S.; Yizhakevych, E.; Kazantseva, L.; Andruk, V.

    Photographic plates with images of the outer planets and their satellites, from the archive collections of MAO NASU and the AO of Kiev University included in the UkrVO Joint Digital Archive (JDA), have been digitized and processed. The plates were obtained in the second half of the 20th century. The digitizing of JDA archive plates and the inclusion of plate preview images into the GPA database has been under way, using two models of flatbed scanners: Microtek ScanMaker 9800XL TMA and Epson Expression 10000XL. The database with plate metadata is allocated on the computational resources of MAO NASU (http://gua.db.ukr-vo.org). Plates have been scanned at 16-bit grey dynamic range, with a resolution of 1200-1600 dpi, and saved in TIFF format. Linear dimensions of the images are up to 13 thousand pixels (for 30×30 cm plates). The astrometric and photometric calibration procedures have been carried out in the LINUX/MIDAS/ROMAFOT environment, with Tycho-2 as the reference catalogue, using an image processing procedure specially developed for digitized images of large linear dimensions on the basis of the images' inherent traits. First results of the digitized plate processing give rms errors of 10 and 20 mas for RA and DEC, respectively. The (O-C) values for plates with Pluto, compared with JPL PLU021.DE405, are 140 mas (RA) and 270 mas (DEC).
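
    A hedged sketch of the simplest form of the astrometric calibration step, a six-constant linear plate solution fitted by least squares; in practice the reference standard coordinates would come from Tycho-2, and the real reduction includes higher-order and photometric terms not shown here.

import numpy as np

def fit_plate_constants(x_pix, y_pix, xi_ref, eta_ref):
    """Solve xi = a*x + b*y + c and eta = d*x + e*y + f in the least-squares sense."""
    A = np.column_stack([x_pix, y_pix, np.ones_like(x_pix)])
    (a, b, c), *_ = np.linalg.lstsq(A, xi_ref, rcond=None)
    (d, e, f), *_ = np.linalg.lstsq(A, eta_ref, rcond=None)
    return (a, b, c), (d, e, f)

# Residuals of the fit (rms in xi and eta) give the kind of per-plate
# accuracy figures quoted in the abstract.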

  3. Digital Signal Processing Using Stream High Performance Computing: A 512-Input Broadband Correlator for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Kocz, J.; Greenhill, L. J.; Barsdell, B. R.; Price, D.; Bernardi, G.; Bourke, S.; Clark, M. A.; Craig, J.; Dexter, M.; Dowell, J.; Eftekhari, T.; Ellingson, S.; Hallinan, G.; Hartman, J.; Jameson, A.; MacMahon, D.; Taylor, G.; Schinzel, F.; Werthimer, D.

    2015-03-01

    A "large-N" correlator that makes use of Field Programmable Gate Arrays and Graphics Processing Units has been deployed as the digital signal processing system for the Long Wavelength Array station at Owens Valley Radio Observatory (LWA-OV), to enable the Large Aperture Experiment to Detect the Dark Ages (LEDA). The system samples a ˜ 100 MHz baseband and processes signals from 512 antennas (256 dual polarization) over a ˜ 58 MHz instantaneous sub-band, achieving 16.8 Tops s-1 and 0.236 Tbit s-1 throughput in a 9 kW envelope and single rack footprint. The output data rate is 260 MB s-1 for 9-s time averaging of cross-power and 1 s averaging of total power data. At deployment, the LWA-OV correlator was the largest in production in terms of N and is the third largest in terms of complex multiply accumulations, after the Very Large Array and Atacama Large Millimeter Array. The correlator's comparatively fast development time and low cost establish a practical foundation for the scalability of a modular, heterogeneous, computing architecture.

  4. Speckle reduction process based on digital filtering and wavelet compounding in optical coherence tomography for dermatology

    NASA Astrophysics Data System (ADS)

    Gómez Valverde, Juan J.; Ortuño, Juan E.; Guerra, Pedro; Hermann, Boris; Zabihian, Behrooz; Rubio-Guivernau, José L.; Santos, Andrés.; Drexler, Wolfgang; Ledesma-Carbayo, Maria J.

    2015-07-01

    Optical Coherence Tomography (OCT) has shown great potential as a complementary imaging tool in the diagnosis of skin diseases. Speckle noise is the most prominent artifact present in OCT images and can limit interpretation and detection capabilities. In this work we propose a new speckle reduction process and compare it with various denoising filters with high edge-preserving potential, using several sets of dermatological OCT B-scans. To validate the performance we used a custom-designed spectral domain OCT and two different data set groups. The first group consisted of five datasets of a single B-scan captured N times (with N<20); the second consisted of five 3D volumes of 25 B-scans. As quality metrics we used the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and equivalent number of looks (ENL). Our results show that a process based on the combination of a 2D enhanced sigma digital filter and a wavelet compounding method achieves the best results in terms of the improvement of the quality metrics. In the first group of individual B-scans we achieved improvements in SNR, CNR and ENL of 16.87 dB, 2.19 and 328 respectively; for the 3D volume datasets the improvements were 15.65 dB, 3.44 and 1148. Our results suggest that the proposed enhancement process may significantly reduce speckle, increasing SNR, CNR and ENL and reducing the number of extra acquisitions of the same frame.
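
    The three quality metrics named above, written in commonly used forms (the paper's exact definitions may differ in detail); the signal, background and homogeneous regions are assumed to be user-selected patches of a B-scan.

import numpy as np

def snr_db(signal_roi, noise_roi):
    """Signal-to-noise ratio in dB: mean signal over background noise standard deviation."""
    return 20 * np.log10(signal_roi.mean() / noise_roi.std())

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio between a signal patch and a background patch."""
    return (abs(signal_roi.mean() - background_roi.mean())
            / np.sqrt(0.5 * (signal_roi.var() + background_roi.var())))

def enl(homogeneous_roi):
    """Equivalent number of looks of a homogeneous patch: mean^2 / variance."""
    return homogeneous_roi.mean() ** 2 / homogeneous_roi.var()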

  5. TERRAIN: A computer program to process digital elevation models for modeling surface flow

    SciTech Connect

    Schwartz, P.M.; Levine, D.A.; Hunsaker, C.T.; Timmins, S.P.

    1995-08-01

    This document provides a step-by-step procedure, TERRAIN, for processing digital elevation models to calculate overland flow paths, watershed boundaries, slope, and aspect. The algorithms incorporated into TERRAIN have been used at two different geographic scales: first for small research watersheds where surface wetness measurements are made, and second for regional water modeling for entire counties. For small areas, methods based on flow distribution may be more desirable, especially if time-dependent flow models are to be used. The main improvement of TERRAIN over the earlier programs on which it is based is that it combines the conditioning routines, which remove depressions to avoid water storage, into a single process. Efficiency has also been improved, reducing run times by as much as 10:1 and enabling the processing of very large grids in strips for regional modeling. Additionally, the ability to calculate the nutrient load delivered to any cell in a watershed has been added. These improvements make TERRAIN a powerful tool for modeling surface flow.
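
    A minimal sketch, not the TERRAIN code itself, of two of the derived quantities (slope and aspect) computed from a DEM grid by finite differences; the aspect convention shown (0° = north) is one of several in common use.

import numpy as np

def slope_aspect(dem, cellsize):
    """Slope (degrees) and aspect (degrees clockwise from north) from a DEM array."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)            # elevation gradients (rows = y, cols = x)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect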

  6. Classification of RF transients in space using digital signal processing and neural network techniques

    SciTech Connect

    Moore, K.R.; Blain, P.C.; Briles, S.D.; Jones, R.G.

    1995-02-01

    The FORTE (Fast On-Orbit Recording of Transient Events) small satellite experiment scheduled for launch in October 1995 will attempt to measure and classify electromagnetic transients as sensed from space. The FORTE payload will employ an Event Classifier to perform onboard classification of radio frequency transients from terrestrial sources such as lightning. These transients are often dominated by a constantly changing assortment of man-made "clutter" such as TV, FM, and radar signals. The FORTE Event Classifier, or EC, uses specialized hardware to implement various signal processing and neural network algorithms. The resulting system can process and classify digitized records of several thousand samples onboard the spacecraft at rates of about a second per record. In addition to reducing downlink rates, the EC minimizes command uplink data by normally using uploaded algorithm sequences rather than full code modules (although it is possible for full code modules to be uploaded from the ground). The FORTE Event Classifier experiment combines science and engineering in an evolutionary step toward useful and robust adaptive processing systems in space.

  7. Digital signal processing for velocity measurements in dynamical material's behaviour studies

    NASA Astrophysics Data System (ADS)

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities during dynamical material behaviour studies. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolutions. Three methods of analysis of the interferometric signals were studied. For Michelson interferometers, time-frequency analysis of the signals by Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine.
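
    A hedged sketch of the STFT-based analysis for the Michelson-type (heterodyne) configuration, where the beat frequency maps to velocity as v = λf/2; window length and sampling parameters are placeholders, not those of the experiments described above.

import numpy as np
from scipy.signal import stft

def velocity_from_stft(signal, fs, wavelength, nperseg=1024):
    """Velocity history from the dominant beat frequency in each STFT time slice."""
    f, t, Z = stft(signal, fs=fs, nperseg=nperseg)
    f_peak = f[np.argmax(np.abs(Z), axis=0)]      # dominant beat frequency per slice
    return t, wavelength * f_peak / 2.0           # v = lambda * f / 2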

  8. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities during dynamical material behaviour studies. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolutions. Three methods of analysis of the interferometric signals were studied. For Michelson interferometers, time-frequency analysis of the signals by Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine. PMID:24689622

  9. Preliminary development of digital elevation and relief models for ICESat-2 onboard processing

    NASA Astrophysics Data System (ADS)

    Leigh, H. W.; Magruder, L. A.; Carabajal, C. C.

    2012-12-01

    ATLAS (Advanced Topographic Laser Altimeter System) is a photon-counting laser ranging instrument that will fly onboard NASA's ICESat-2 mission to collect global altimetry data for the primary purpose of determining volumetric changes in the Polar Regions. While photon-counting systems provide the advantage of using small, low-power lasers, they are typically much more susceptible to noise and require the use of sophisticated algorithms, both onboard and in ground-based processing, to ensure capture of valid data and production of accurate data products. An onboard receiver algorithm is being developed for ATLAS to ensure that valid data are returned while adhering to the 577 Gb/day limit on data telemetry. The onboard receiver algorithm makes use of multiple onboard databases, two of which are the DEM (Digital Elevation Model) and the DRM (Digital Relief Map). The DEM provides start and stop times for software-induced range gating on the ATLAS detectors, and is a nested, three-tiered grid to account for a 6 km overall constraint on the allowable limit for ranging acquisition. The DRM contains the maximum values of relief seen across 140 m- and 700 m-long flight path segments, which are used in statistically determining the presence of a valid surface return and in deciding which bands to telemeter. Both onboard databases are to be constructed primarily from existing digital elevation models and must provide global coverage referenced to latitude and longitude. Production of the grids is complicated by the lack of global data products of sufficient resolution and accuracy; preliminary analysis is therefore required for DEM selection and usage, in addition to determining how to intelligently merge differing data sets. This initial investigation also focuses on determining the impact of the selected DEM quality on the ICESat-2 onboard algorithms, as well as the resulting error induced in the DRM. These results are required in order to determine the expected

  10. High-rate dead-time corrections in a general purpose digital pulse processing system.

    PubMed

    Abbene, Leonardo; Gerardi, Gaetano

    2015-09-01

    Dead-time losses are well-recognized and well-studied drawbacks in counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and a slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, are presented. The throughput formula for a series of two types of dead-times is also derived. The results of dead-time corrections, performed through different methods, are reported and discussed, pointing out the error in ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (using a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups.
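
    For orientation only, a minimal sketch of one standard correction, the non-paralyzable dead-time model; the paper evaluates several correction methods, including ones based on measured time-interval distributions, which are not reproduced here.

def true_rate_nonparalyzable(measured_rate, dead_time):
    """Recover the input counting rate n from the measured rate m = n / (1 + n*tau)."""
    return measured_rate / (1.0 - measured_rate * dead_time)

# Example: 1.0 Mcps measured with 100 ns dead time -> ~1.11 Mcps true input rate
print(true_rate_nonparalyzable(1.0e6, 100e-9))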

  11. High-rate dead-time corrections in a general purpose digital pulse processing system

    PubMed Central

    Abbene, Leonardo; Gerardi, Gaetano

    2015-01-01

    Dead-time losses are well-recognized and well-studied drawbacks in counting and spectroscopic systems. In this work, the dead-time correction capabilities of a real-time digital pulse processing (DPP) system for high-rate, high-resolution radiation measurements are presented. The DPP system, through a fast and a slow analysis of the output waveform from radiation detectors, is able to perform multi-parameter analysis (arrival time, pulse width, pulse height, pulse shape, etc.) at high input counting rates (ICRs), allowing accurate counting loss corrections even for variable or transient radiation. The fast analysis is used to obtain both the ICR and energy spectra with high throughput, while the slow analysis is used to obtain high-resolution energy spectra. A complete characterization of the counting capabilities, through both theoretical and experimental approaches, was performed. The dead-time modeling, the throughput curves, the experimental time-interval distributions (TIDs) and the counting uncertainty of the recorded events of both the fast and the slow channels, measured with a planar CdTe (cadmium telluride) detector, are presented. The throughput formula for a series of two types of dead-times is also derived. The results of dead-time corrections, performed through different methods, are reported and discussed, pointing out the error in ICR estimation and the simplicity of the procedure. Accurate ICR estimations (nonlinearity < 0.5%) were performed by using the time widths and the TIDs (using a 10 ns time bin width) of the detected pulses up to 2.2 Mcps. The digital system allows, after a simple parameter setting, different and sophisticated procedures for dead-time correction, traditionally implemented in complex/dedicated systems and time-consuming set-ups. PMID:26289270

  12. Using Digital Time-Lapse Videos to Teach Geomorphic Processes to Undergraduates

    NASA Astrophysics Data System (ADS)

    Clark, D. H.; Linneman, S. R.; Fuller, J.

    2004-12-01

    We demonstrate the use of relatively low-cost, computer-based digital imagery to create time-lapse videos of two distinct geomorphic processes in order to help students grasp the significance of the rates, styles, and temporal dependence of geologic phenomena. Student interviews indicate that such videos help them to understand the relationship between processes and landform development. Time-lapse videos have been used extensively in some sciences (e.g., biology - http://sbcf.iu.edu/goodpract/hangarter.html, meteorology - http://www.apple.com/education/hed/aua0101s/meteor/, chemistry - http://www.chem.yorku.ca/profs/hempsted/chemed/home.html) to demonstrate gradual processes that are difficult for many students to visualize. Most geologic processes are slower still, and are consequently even more difficult for students to grasp, yet time-lapse videos are rarely used in earth science classrooms. The advent of inexpensive web-cams and computers provides a new means to explore the temporal dimension of earth surface processes. To test the use of time-lapse videos in geoscience education, we are developing time-lapse movies that record the evolution of two landforms: a stream-table delta and a large, natural, active landslide. The former involves well-known processes in a controlled, repeatable laboratory experiment, whereas the latter tracks the developing dynamics of an otherwise poorly understood slope failure. The stream-table delta is small and grows in ca. 2 days; we capture a frame on an overhead web-cam every 3 minutes. Before seeing the video, students are asked to hypothesize how the delta will grow through time. The final time-lapse video, ca. 20-80 MB, elegantly shows channel migration, progradation rates, and formation of major geomorphic elements (topset, foreset, bottomset beds). The web-cam can also be "zoomed-in" to show smaller-scale processes, such as bedload transfer, and foreset slumping. Post-lab tests and interviews with students indicate that

  13. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    PubMed

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients, who present with a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through the application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  14. Simple Image Processing Techniques For The Contrast Enhancement Of Real-Time Digital Speckle Pattern Interferometry Fringes

    NASA Astrophysics Data System (ADS)

    Ganesan, A. R.; Kothiyal, M. P.; Sirohi, Rajpal S.

    1989-09-01

    Some simple image processing techniques are suggested that can be used for enhancing the contrast of real-time digital speckle pattern interferometry fringes. The techniques have been developed for the commercial Intellect 100 image processing system interfaced to a PDP-11/23+ microcomputer, but they can be adapted to any commercial image processing system with slight modifications, if necessary, depending on the hardware configuration of the system.

  15. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    NASA Astrophysics Data System (ADS)

    Singh, Preetpal

    to detect equipment failure and identify defective products at the assembly line. The research work in this thesis combines machine vision and image processing technology to build a digital imaging and processing system for monitoring and measuring lake ice thickness in real time. An ultra-compact USB camera is programmed to acquire and transmit high resolution imagery for processing with MATLAB Image Processing toolbox. The image acquisition and transmission process is fully automated; image analysis is semi-automated and requires limited user input. Potential design changes to the prototype and ideas on fully automating the imaging and processing procedure are presented to conclude this research work.

  16. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    SciTech Connect

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  17. Correlation and registration of ERTS multispectral imagery. [by a digital processing technique

    NASA Technical Reports Server (NTRS)

    Bonrud, L. O.; Henrikson, P. J.

    1974-01-01

    Examples of automatic digital processing demonstrate the feasibility of registering one ERTS multispectral scanner (MSS) image with another obtained on a subsequent orbit, and automatic matching, correlation, and registration of MSS imagery with aerial photography (multisensor correlation) is demonstrated. Excellent correlation was obtained with patch sizes exceeding 16 pixels square. Qualities which lead to effective control point selection are distinctive features, good contrast, and constant feature characteristics. Results of the study indicate that more than 300 degrees of freedom are required to register two standard ERTS-1 MSS frames covering 100 by 100 nautical miles to an accuracy of 0.6 pixel mean radial displacement error. An automatic strip processing technique demonstrates 600 to 1200 degrees of freedom over a quarter frame of ERTS imagery. Registration accuracies in the range of 0.3 pixel to 0.5 pixel mean radial error were confirmed by independent error analysis. Accuracies in the range of 0.5 pixel to 1.4 pixel mean radial error were demonstrated by semi-automatic registration over small geographic areas.
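
    As a stand-in for the correlation step (not the report's algorithm), the sketch below estimates the translational offset between two image patches by FFT-based phase correlation, a standard precursor to fine registration.

import numpy as np

def phase_correlation_shift(patch_a, patch_b):
    """Return the (row, col) integer shift that best aligns patch_b to patch_a."""
    Fa, Fb = np.fft.fft2(patch_a), np.fft.fft2(patch_b)
    cross_power = Fa * np.conj(Fb)
    cross_power /= np.abs(cross_power) + 1e-12       # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the patch size to negative offsets
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))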

  18. Real-time digital signal processing in multiphoton and time-resolved microscopy

    NASA Astrophysics Data System (ADS)

    Wilson, Jesse W.; Warren, Warren S.; Fischer, Martin C.

    2016-03-01

    The use of multiphoton interactions in biological tissue for imaging contrast requires highly sensitive optical measurements. These often involve signal processing and filtering steps between the photodetector and the data acquisition device, such as photon counting and lock-in amplification. These steps can be implemented as real-time digital signal processing (DSP) elements on field-programmable gate array (FPGA) devices, an approach that affords much greater flexibility than commercial photon counting or lock-in devices. We will present progress toward developing two new FPGA-based DSP devices for multiphoton and time-resolved microscopy applications. The first is a high-speed multiharmonic lock-in amplifier for transient absorption microscopy, which is being developed for real-time analysis of the intensity-dependence of melanin, with applications in vivo and ex vivo (noninvasive histopathology of melanoma and pigmented lesions). The second device is a kHz lock-in amplifier running on a low cost (50-200) development platform. It is our hope that these FPGA-based DSP devices will enable new, high-speed, low-cost applications in multiphoton and time-resolved microscopy.
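
    A minimal sketch of the core of a digital lock-in amplifier as it could be implemented offline or in FPGA fabric: mix with quadrature references at the modulation frequency, low-pass filter, and recover amplitude and phase. Filter order, cutoff and sample rate are illustrative assumptions, not the parameters of the devices described above.

import numpy as np
from scipy.signal import butter, filtfilt

def lockin(signal, fs, f_ref, cutoff=100.0):
    """Demodulate the amplitude and phase of the component of `signal` at f_ref."""
    t = np.arange(len(signal)) / fs
    i_mix = signal * np.cos(2 * np.pi * f_ref * t)     # in-phase mixing
    q_mix = signal * np.sin(2 * np.pi * f_ref * t)     # quadrature mixing
    b, a = butter(4, cutoff, btype="low", fs=fs)       # low-pass removes the 2*f_ref term
    I, Q = filtfilt(b, a, i_mix), filtfilt(b, a, q_mix)
    return 2 * np.hypot(I, Q), np.arctan2(Q, I)        # amplitude, phase (sign convention varies)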

  19. Visual investigation on the heat dissipation process of a heat sink by using digital holographic interferometry

    SciTech Connect

    Wu, Bingjing; Zhao, Jianlin; Wang, Jun; Di, Jianglei; Chen, Xin; Liu, Junjiang

    2013-11-21

    We present a method for visually and quantitatively investigating the heat dissipation process of plate-fin heat sinks by using digital holographic interferometry. A series of phase change maps reflecting the temperature distribution, and its variation trend, of the air field surrounding the heat sink during the heat dissipation process is numerically reconstructed based on double-exposure holographic interferometry. Using a phase unwrapping algorithm and the derived relationship between temperature and the phase change of the detection beam, the full-field temperature distributions are obtained quantitatively with reasonably high measurement accuracy. The impact of the heat sink's channel width on heat dissipation performance under natural convection is then analyzed. In addition, a comparison between simulation and experimental results is given to verify the reliability of the method. The experimental results confirm the feasibility and validity of the presented method for full-field, dynamic, and quantitative measurement of the air-field temperature distribution, which provides a basis for analyzing the heat dissipation performance of plate-fin heat sinks.
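
    A hedged sketch of the double-exposure principle: the phase-change map between two numerically reconstructed complex wavefronts, U1 (reference state) and U2 (during heat dissipation), followed by 2D phase unwrapping; converting the unwrapped phase to temperature requires the refractive-index-temperature relation used in the paper, which is not reproduced here. The scikit-image unwrapper is one available choice, assumed for illustration.

import numpy as np
from skimage.restoration import unwrap_phase

def phase_change_map(U1, U2):
    """Wrapped and unwrapped phase difference between two reconstructed complex fields."""
    dphi_wrapped = np.angle(U2 * np.conj(U1))   # double-exposure phase change, wrapped to (-pi, pi]
    return dphi_wrapped, unwrap_phase(dphi_wrapped)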

  20. The differential maturation of two processing rates related to digit span.

    PubMed

    Cowan, N

    1999-03-01

    Recent studies have proposed that rapid-speaking durations predict short-term memory spans. However, N. Cowan et al. (1998, Journal of Experimental Psychology: General, 127, 141-160) found two separate types of processing durations related to digit span for children in Grade 1 (7-8 years), Grade 3 (9-10 years), and Grade 5 (11-12 years): rapid-speaking durations (thought to index the rate of covert articulation) and the durations of interword pauses in the memory span task responses (thought to index the rate of short-term memory retrieval). The present analysis additionally establishes the differential maturation of those two durations. Within-age correlations between span and rapid-speaking durations were significant only in first graders; correlations between span and interword pauses, only in fifth graders. When subsamples were selected to match spans across age groups, both types of duration also were matched between Grades 1 and 3. However, fifth graders had considerably shorter interword pauses than their third-grade counterparts. Thus, a particular memory span is accompanied by different profiles of processing rates in children of different ages. PMID:10047439