Science.gov

Sample records for digital processing methodology

  1. Digital methodology to implement the ECOUTER engagement process

    PubMed Central

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available freeware mind-mapping software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  2. Digital methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available freeware mind-mapping software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  3. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'Real' part geometry and consequent assembly.

  4. Developing Digital Interventions: A Methodological Guide

    PubMed Central

    Watts, Sam; Yardley, Lucy; Lewith, George

    2014-01-01

    Digital interventions are becoming an increasingly popular method of delivering healthcare as they enable and promote patient self-management. This paper provides a methodological guide to the processes involved in developing effective digital interventions, detailing how to plan and develop such interventions to avoid common pitfalls. It demonstrates the need for mixed qualitative and quantitative methods in order to develop digital interventions which are effective, feasible, and acceptable to users and stakeholders. PMID:24648848

  5. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…

  6. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

  7. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These tools have dramatically advanced and simplified how we collect and analyze data while in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example, for stereoplot analysis). Requiring that all maps…

  8. Digital image classification by the Bessel masks methodology

    NASA Astrophysics Data System (ADS)

    Solorza, S.; Álvarez-Borrego, J.

    2014-10-01

    Since the evolution of computer hardware in the middle of the last century, automated processes have become highly productive in fields such as industry, security, engineering and science. In this work a pattern recognition methodology to classify images is presented. Here, the Bessel masks digital image system, invariant to position and rotation, is utilized to classify gray-scale images. Moreover, by use of the Fisher's Z distribution, the digital system achieves a 99% confidence level performance.
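
    The reported 99% figure points to a confidence statement on correlation values obtained via Fisher's Z transform. A minimal, illustrative sketch of that step (not the authors' code; the sample correlation and count are invented):

        import numpy as np
        from scipy import stats

        def fisher_z_interval(r, n, confidence=0.99):
            """Confidence interval for a correlation r from n samples,
            via Fisher's Z transform (illustrative only)."""
            z = np.arctanh(r)                # Fisher Z: 0.5*ln((1+r)/(1-r))
            se = 1.0 / np.sqrt(n - 3)        # standard error of Z
            zc = stats.norm.ppf(0.5 + confidence / 2.0)
            return np.tanh(z - zc * se), np.tanh(z + zc * se)

        print(fisher_z_interval(r=0.92, n=256))  # hypothetical match score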

  9. DIGITAL TECHNOLOGY BUSINESS CASE METHODOLOGY GUIDE & WORKBOOK

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam; Vlahoplus, Chris

    2014-09-01

    Performance advantages of the new digital technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on demonstrating actual cost reductions that can be credited to budgets and thereby truly reduce O&M or capital costs. Technology enhancements, while enhancing work methods and making work more efficient, often fail to eliminate workload such that it changes overall staffing and material cost requirements. It is critical to demonstrate cost reductions or impacts on non-cost performance objectives in order for the business case to justify investment by nuclear operators. This Business Case Methodology approaches building a business case for a particular technology or suite of technologies by detailing how they impact an operator in one or more of the three following areas: Labor Costs, Non-Labor Costs, and Key Performance Indicators (KPIs). Key to those impacts will be identifying where the savings are “harvestable,” meaning they result in an actual reduction in headcount and/or cost. The report consists of a Digital Technology Business Case Methodology Guide and an accompanying spreadsheet workbook that will enable the user to develop a business case.

  10. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Morgera, Salvatore D.; Krishna, Hari

    Computationally efficient digital signal-processing algorithms over finite fields are developed analytically, and the relationship of these algorithms to algebraic error-correcting codes is explored. A multidisciplinary approach is employed, in an effort to make the results accessible to engineers, mathematicians, and computer scientists. Chapters are devoted to systems of bilinear forms, efficient finite-field algorithms, multidimensional methods, a new class of linear codes, and a new error-control scheme.

  11. Digital processing clock

    NASA Technical Reports Server (NTRS)

    Phillips, D. H.

    1982-01-01

    The digital processing clock SG 1157/U is described. It is compatible with the PTTI world, where it can be driven by an external cesium source. Built-in test equipment shows synchronization with cesium through 1 pulse per second. It is built to be expandable to accommodate future time-keeping needs of the Navy as well as any other time-ordered functions. Examples of this expandability are the inclusion of an unmodulated XR3 time code and the 2137 modulated time code (XR3 with 1 kHz carrier).

  12. Aquarius Digital Processing Unit

    NASA Technical Reports Server (NTRS)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.

  13. Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Baher, H.

    The techniques of signal processing in both the analog and digital domains are addressed in a fashion suitable for undergraduate courses in modern electrical engineering. The topics considered include: spectral analysis of continuous and discrete signals, analysis of continuous and discrete systems and networks using transform methods, design of analog and digital filters, digitization of analog signals, power spectrum estimation of stochastic signals, FFT algorithms, finite word-length effects in digital signal processors, linear estimation, and adaptive filtering.

  14. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Meyer, G.

    The theory, realization techniques, and applications of digital filtering are surveyed, with an emphasis on the development of software, in a handbook for advanced students of electrical and electronic engineering and practicing development engineers. The foundations of the theory of discrete signals and systems are introduced. The design of one-dimensional linear systems is discussed, and the techniques are expanded to the treatment of two-dimensional discrete and multidimensional analog systems. Numerical systems, quantization and limiting, and the characteristics of particular signal-processing devices are considered in a section on design realization. An appendix contains definitions of the basic mathematical concepts, derivations and proofs, and tables of integration and differentiation formulas.

  15. Digital processing of bandpass signals

    NASA Astrophysics Data System (ADS)

    Jackson, M. C.; Matthewson, P.

    Modern radar and radio systems rely on digital signal processing to enhance the quality of received signals. Prior to such processing, these signals must be converted to digital form. The historical development of signal digitization is briefly discussed in this paper and leads to a description of some current work on digital mixing. A method of directly sampling a band-limited intermediate frequency (i.f.) signal is presented, using a pair of digital mixer channels to produce complex low-pass samples of the signal envelope. The method is found to produce well matched channel outputs. Finally, the applicability of the method to radar is discussed.
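
    In modern terms, the described pair of digital mixer channels is a quadrature digital down-converter. A minimal numpy/scipy sketch of the idea, with assumed sample rate, i.f. and filter parameters (not the paper's design):

        import numpy as np
        from scipy.signal import firwin, lfilter

        fs, f_if = 1.0e6, 250.0e3                  # sample rate and i.f. (assumed)
        t = np.arange(4096) / fs
        sig = np.cos(2 * np.pi * f_if * t + 0.3)   # toy band-limited i.f. input

        # Pair of digital mixers: multiply by cosine and sine of the carrier
        i_ch = sig * np.cos(2 * np.pi * f_if * t)
        q_ch = -sig * np.sin(2 * np.pi * f_if * t)

        # Matched low-pass filters remove the 2*f_if terms, leaving complex
        # low-pass samples of the signal envelope
        lp = firwin(numtaps=101, cutoff=50.0e3, fs=fs)
        envelope = lfilter(lp, 1.0, i_ch + 1j * q_ch)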

  16. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating both theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from a content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  17. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System are described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses are also provided for the above video data compression techniques transmitted over a channel-coded Gaussian channel. It is shown that substantial gains can be achieved by the combination of video source and channel coding.
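
    A toy illustration of the Hadamard-domain step named above (block size and coefficient threshold are assumptions, not the Shuttle design):

        import numpy as np
        from scipy.linalg import hadamard

        block = np.random.randint(0, 256, (8, 8)).astype(float)  # one video block
        H = hadamard(8).astype(float)              # symmetric, H @ H.T = 8*I

        coeffs = H @ block @ H.T / 8.0             # forward 2-D Hadamard transform
        coeffs[np.abs(coeffs) < 16] = 0.0          # crude compression: drop small terms
        recon = H.T @ coeffs @ H / 8.0             # inverse transform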

  18. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  19. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.

  20. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to a digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  1. Digital switching noise as a stochastic process

    NASA Astrophysics Data System (ADS)

    Boselli, Giorgio; Trucco, Gabriella; Liberali, Valentino

    2007-06-01

    Switching activity of logic gates in a digital system is a deterministic process, depending on both circuit parameters and input signals. However, the huge number of logic blocks in a digital system makes digital switching effectively a stochastic process. Switching activity is the source of the so-called "digital noise", which can be analyzed using a stochastic approach. For an asynchronous digital network, we can model digital switching currents as a shot noise process, deriving both its amplitude distribution and its power spectral density. From the spectral distribution of digital currents, we can also calculate the spectral distribution and the power of disturbances injected into the on-chip power supply lines.
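
    A minimal simulation of the shot-noise model described, with invented event rate and current-pulse shape, estimating the power spectral density with Welch's method:

        import numpy as np
        from scipy.signal import welch

        rng = np.random.default_rng(0)
        fs, n = 1e9, 1_000_000                 # 1 GS/s over 1 ms (assumed)
        rate = 5e7                             # mean switching events/s (assumed)

        # Shot noise: Poisson-distributed switching events per sample...
        events = rng.poisson(rate / fs, size=n).astype(float)

        # ...convolved with a single current-pulse shape (toy triangle)
        pulse = np.concatenate([np.linspace(0, 1, 5), np.linspace(1, 0, 5)])
        current = np.convolve(events, pulse, mode="same")

        f, psd = welch(current, fs=fs, nperseg=4096)  # spectral distribution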

  2. Digital Pulse Processing for Steff

    NASA Astrophysics Data System (ADS)

    Pollitt, A. J.; Dare, J. A.; Smith, A. G.; Tsekhanovich, I.

    2011-10-01

    The SpecTrometer for Exotic Fission Fragments (STEFF) is a Manchester-based experiment that uses two segmented Bragg chambers to determine fission fragment energies of a 252Cf source. In this work the pulses collected in the Bragg chambers are processed digitally using flash ADCs in GRT4 cards and a DAQ. Algorithms have been developed to sort the data pulses off-line. These include routines for the removal of noise and cross talk through digital filtering, adjusting the data for ballistic deficit and adding back pulses from the individual anode segments. From the corrected data, the mass and Z of the fission fragments are deducible. Current methods and present challenges will be discussed.

  3. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  4. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  5. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  6. Development of digital image processing based methodology to study, quantify and correlate the microstructure and three dimensional fracture surface morphology of aluminum alloy 7050

    NASA Astrophysics Data System (ADS)

    Dighe, Manish Deepak

    2000-10-01

    7XXX series wrought aluminum alloys are extensively used for structural aerospace applications due to their high strength-to-weight ratio, excellent corrosion resistance, and high fracture resistance. 7050 is an important alloy of this group, which is widely used for applications such as aircraft wing skin structures, aircraft landing gear parts, and fuselage frame structure. Therefore, it is of interest to investigate the fracture behavior of 7050 aluminum alloy, which is a typical alloy of the 7XXX series. The aim of this research is to quantitatively characterize and model the relationships among processing, microstructure, fracture surface morphology, and fracture toughness of hot-rolled, partially recrystallized, precipitation-hardened 7050 alloy. A new technique is developed which permits simultaneous viewing of the fracture surface and the microstructure just below and above the fracture surface. This technique is then applied to identify, validate and quantify various fracture micro-mechanisms observed on the fracture surface. To get a better insight into the shape and anisotropy of the recrystallized grains, the three-dimensional structure of the microstructure is reconstructed using serial sectioning. The gathered information is utilized to develop a mathematical model relating the various processing parameters and microstructural attributes to the fracture toughness.

  7. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  8. Process waste assessment methodology for mechanical departments

    SciTech Connect

    Hedrick, R.B.

    1992-12-01

    Process waste assessments (PWAs) were performed for three pilot processes to develop methodology for performing PWAs for all the various processes used throughout the mechanical departments. A material balance and process flow diagram identifying the raw materials utilized in the process and the quantity and types of materials entering the waste streams from the process is defined for each PWA. The data and information are used to determine "potential options" for eliminating hazardous materials or minimizing wastes generated.

  9. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
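
    The basic operations named above are simple per-pixel transforms. A sketch with stand-in image bands and assumed thresholds (LIGMALS itself is not reproduced here):

        import numpy as np

        band1 = np.random.randint(0, 256, (128, 128)).astype(float)
        band2 = np.random.randint(1, 256, (128, 128)).astype(float)

        # Level slicing: keep one gray-level interval, suppress the rest
        sliced = np.where((band1 >= 80) & (band1 <= 120), 255.0, 0.0)

        # Gray mapping: remap gray levels through a lookup table (a stretch here)
        lut = np.clip((np.arange(256) - 50) * 255.0 / 150.0, 0, 255)
        mapped = lut[band1.astype(int)]

        # Ratio processing: band-to-band ratio, rescaled for display
        ratio = band1 / band2
        ratio_img = 255.0 * (ratio - ratio.min()) / (ratio.max() - ratio.min())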

  10. Controlling the digital transfer process

    NASA Astrophysics Data System (ADS)

    Brunner, Felix

    1997-02-01

    The accuracy of today's color management systems fails to satisfy the requirements of the graphic arts market. A first explanation for this is that the color calibration charts on which these systems rely are, for technical printing reasons, subject to color deviations and inconsistencies. A second reason is that colorimetry describes the human visual perception of color differences and has no direct relation to the rendering technology itself of a proofing or printing device. The author explains that only firm process control of the many parameters in offset printing, by means of a system such as EUROSTANDARD System Brunner, can lead to accurate and consistent calibration of scanner, display, proof and print. The same principles hold for the quality management of digital presses.

  11. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques, and the accompanying software documentation, for the digital enhancement of radiographs are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
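
    The speed claim concerns recursive (IIR) filtering versus FFT-based convolution. A sketch of the two implementations of a comparable row-wise smoother (an illustrative filter, not the report's matched filters; the causal IIR output differs from the centered FFT result by a shift):

        import numpy as np
        from scipy.signal import lfilter, fftconvolve

        img = np.random.rand(512, 512)
        a = 0.25

        # Recursive implementation: first-order IIR low-pass along each row
        smoothed_iir = lfilter([a], [1.0, -(1.0 - a)], img, axis=1)

        # Nonrecursive counterpart: FFT convolution with the truncated
        # exponential impulse response of the same filter
        k = a * (1.0 - a) ** np.arange(64)
        smoothed_fft = fftconvolve(img, k[np.newaxis, :], mode="same")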

  12. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
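
    A minimal numpy sketch of steps (1) and (2), with assumed carrier, sample and symbol rates and an ideal synchronized carrier (a real receiver adds carrier and timing recovery):

        import numpy as np

        fs, fc, rs = 48_000, 6_000, 1_200   # sample, carrier, symbol rates (assumed)
        sps = fs // rs                      # samples per symbol
        bits = np.random.randint(0, 2, 64)
        phase = np.repeat(np.where(bits == 1, 0.0, np.pi), sps)  # 0/180 deg keying
        t = np.arange(phase.size) / fs
        bpsk = np.cos(2 * np.pi * fc * t + phase)

        # Demodulate: mix with a synchronized carrier, integrate over each symbol
        mixed = bpsk * np.cos(2 * np.pi * fc * t)
        symbols = mixed.reshape(-1, sps).sum(axis=1)   # integrate-and-dump
        rx_bits = (symbols > 0).astype(int)
        assert np.array_equal(rx_bits, bits)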

  13. A Methodology and Implementation for Annotating Digital Images for Context-appropriate Use in an Academic Health Care Environment

    PubMed Central

    Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.

    2004-01-01

    Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971

  14. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

    Described here is the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
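
    Transfer functions from measured excitation/response records are commonly estimated from cross- and auto-spectra. A single-channel sketch of that step (toy plant and white excitation; the AFW flight software's exact estimator is not specified here):

        import numpy as np
        from scipy.signal import csd, welch, lfilter

        fs = 200.0
        rng = np.random.default_rng(1)
        u = rng.standard_normal(20_000)            # excitation (assumed white)
        y = lfilter([0.2, 0.3], [1.0, -0.5], u)    # toy "plant" response
        y += 0.05 * rng.standard_normal(y.size)    # measurement noise

        f, Puy = csd(u, y, fs=fs, nperseg=1024)    # cross-spectrum
        _, Puu = welch(u, fs=fs, nperseg=1024)     # input auto-spectrum
        H1 = Puy / Puu                             # H1 transfer-function estimate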

  15. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.

  16. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton- and alpha-emitting nuclei. Recent developments extend the sensitivity and breadth of the applications. The digital signal processing capabilities, used predominately by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for measurements of alpha and beta-delayed neutron emitters, as well as for the next generation of superheavy experiments, are discussed.

  17. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

    In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.

  18. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the signal digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing, in the digital domain, the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the…
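
    A minimal sketch of the heritage-style correction the abstract contrasts itself with: subtracting per-row reference-pixel statistics from the active pixels (array layout and noise model are invented):

        import numpy as np

        rng = np.random.default_rng(2)
        frame = rng.normal(1000.0, 5.0, size=(2048, 2048))
        frame += rng.normal(0.0, 3.0, size=(2048, 1))   # common row-wise drift (toy)

        # Assume a 4-pixel-wide band of non-illuminated reference pixels
        # on the left and right array boundaries (a typical layout)
        ref = np.concatenate([frame[:, :4], frame[:, -4:]], axis=1)

        # Remove the per-row reference statistic from the active pixels
        corrected = frame[:, 4:-4] - ref.mean(axis=1, keepdims=True)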

  19. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitization of old globes, which brings with it the possibility to use globes in their digital form. The created digital models are available to the general public through modern technology on the Internet. This gives an opportunity to study old globes located in various historical collections, and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models both among themselves and with current map data by increasing the transparency of individual layers. Digitization is carried out using a special device that allows digitizing globes with a diameter ranging from 5 cm to 120 cm. This device can be easily disassembled and is fully mobile; the globes can therefore be digitized in the place of their storage. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. Acquired image data are then georeferenced by using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is in the form of a 3D model through the JavaScript library Cesium or the Google Earth plug-in in the Web browser. The second option is as a georeferenced map using a Tile Map Service.

  20. Checking Fits With Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Davis, R. M.; Geaslen, W. D.

    1988-01-01

    Computer-aided video inspection of mechanical and electrical connectors is feasible. The report discusses work done on digital image processing for computer-aided interface verification (CAIV). Two kinds of components were examined: a mechanical mating flange and an electrical plug.

  1. Digital images in the map revision process

    NASA Astrophysics Data System (ADS)

    Newby, P. R. T.

    Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.

  2. CT Image Processing Using Public Digital Networks

    PubMed Central

    Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

    1984-01-01

    Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

  3. Process independent automated sizing methodology for current steering DAC

    NASA Astrophysics Data System (ADS)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.

    2015-10-01

    This study introduces a process-independent automated sizing methodology based on a general regression neural network (GRNN) for current steering complementary metal-oxide semiconductor (CMOS) digital-to-analog converter (DAC) circuits. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling research that considers a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. Eventually, a high-dimensional database was developed consisting of transistor sizes of DAC designs and corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set including the simulation results of ON-SEMI 1.5 µm and 0.5 µm technology parameters, and the test data were constituted with only the simulation results of TSMC 0.35 µm technology parameters that had not been applied to the GRNN for training beforehand. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of the DAC static output specifications (differential non-linearity error, integral non-linearity error, monotonicity and gain error) as the inputs of the network.
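
    A GRNN is a one-pass Gaussian-kernel regressor (Nadaraya-Watson form). A compact illustrative implementation of its core; dimensions, training pairs and spread are placeholders, not the paper's database:

        import numpy as np

        def grnn_predict(X_train, y_train, X_query, sigma=0.5):
            """GRNN: kernel-weighted average of training targets."""
            d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))       # pattern-layer activations
            return (w @ y_train) / w.sum(axis=1, keepdims=True)

        # Toy usage: static specification errors in, transistor sizes out
        X_train = np.random.rand(40, 4)    # e.g., DNL, INL, gain error, monotonicity
        y_train = np.random.rand(40, 6)    # e.g., channel widths and lengths
        sizes = grnn_predict(X_train, y_train, np.random.rand(3, 4))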

  4. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  5. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  6. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.
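
    A minimal sketch of the two stages described, slow frequency tracking from phase differences followed by demodulation against the low-frequency estimate (toy signal and smoothing window, not the paper's algorithm):

        import numpy as np
        from scipy.ndimage import uniform_filter1d

        fs = 1000.0
        t = np.arange(100_000) / fs
        slow_f = 5.0 + 0.5 * np.sin(2 * np.pi * 0.01 * t)   # slow orbital drift (toy)
        scint = 0.2 * np.sin(2 * np.pi * 30.0 * t)          # fast "scintillation"
        z = np.exp(1j * (2 * np.pi * np.cumsum(slow_f) / fs + scint))

        # Frequency tracking: smoothed instantaneous frequency from phase steps
        inst_f = np.angle(z[1:] * np.conj(z[:-1])) * fs / (2 * np.pi)
        f_track = uniform_filter1d(inst_f, size=2001)       # slow component only

        # Dynamic demodulation against the low-frequency estimate
        phase_ref = 2 * np.pi * np.cumsum(np.r_[0.0, f_track]) / fs
        residual = z * np.exp(-1j * phase_ref)              # retains scintillation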

  7. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  8. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  9. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  10. Research Methodologies and the Doctoral Process.

    ERIC Educational Resources Information Center

    Creswell, John W.; Miller, Gary A.

    1997-01-01

    Doctoral students often select one of four common research methodologies that are popular in the social sciences and education today: positivist; interpretive; ideological; and pragmatic. But choice of methodology also influences the student's choice of course work, membership of dissertation committee, and the form and structure of the…

  11. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability. PMID:22208397
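
    The ε-constraint method named above minimizes one objective while sweeping a bound on the other. A generic two-objective sketch with scipy (toy cost and impact functions, not from any reviewed study):

        import numpy as np
        from scipy.optimize import minimize

        cost = lambda x: (x[0] - 1) ** 2 + x[1] ** 2      # economic objective (toy)
        impact = lambda x: x[0] ** 2 + (x[1] - 1) ** 2    # environmental objective (toy)

        pareto = []
        for eps in np.linspace(0.1, 2.0, 10):             # sweep the impact bound
            res = minimize(cost, x0=[0.5, 0.5],
                           constraints=[{"type": "ineq",
                                         "fun": lambda x, e=eps: e - impact(x)}])
            pareto.append((impact(res.x), cost(res.x)))   # one Pareto point per bound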

  12. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it receives the most emphasis. The other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors, as well as indirect flat-panel detectors and a system with four high-resolution CCD cameras. In a second step the most important image processing techniques are discussed: gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part the advantages and drawbacks of computed thoracic radiography are emphasized. The most important are the almost constantly good quality of the pictures and the possibilities of image processing. PMID:11567193
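
    Of the processing steps listed, unsharp masking is the simplest to illustrate. A generic sketch with an assumed blur radius and gain (not a vendor algorithm such as MUSICA):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        radiograph = np.random.rand(256, 256)           # stand-in for a chest image

        blurred = gaussian_filter(radiograph, sigma=8)  # low-frequency background
        mask = radiograph - blurred                     # the "unsharp mask" detail
        enhanced = radiograph + 1.5 * mask              # boost detail (gain assumed)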

  13. A methodology for use of digital image correlation for hot mix asphalt testing

    NASA Astrophysics Data System (ADS)

    Ramos, Estefany

    Digital Image Correlation (DIC) is a relatively new technology which aids in the measurement of material properties without the need for installation of sensors. DIC is a noncontact measuring technique that requires the specimen to be marked with a random speckle pattern and to be photographed during the test. The photographs are then post-processed based on the location of the pattern throughout the test. DIC can aid in calculating properties that would otherwise be too difficult to obtain even with other measuring instruments. The objective of this thesis is to discuss the methodology and validate the use of DIC in different hot mix asphalt (HMA) tests, such as the Overlay Tester (OT) Test, Indirect Tensile (IDT) Test, and the Semicircular Bending (SCB) Test. The DIC system provides displacements and strains on any visible surface. Properly calibrated 2-D or 3-D DIC data can be used to understand the complex stress and strain distributions and the modes of initiation and propagation of cracks. The use of this observational method will lead to further understanding of the complex boundary conditions of the different tests, and therefore allow it to be implemented in the analysis of other materials. The use of digital image correlation will bring insight and knowledge into what is happening during a test.
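
    The core DIC step is locating a speckle subset of the reference image within each deformed image. A minimal sketch using normalized cross-correlation from scikit-image (synthetic speckle and a rigid shift stand in for a real HMA test):

        import numpy as np
        from skimage.feature import match_template

        rng = np.random.default_rng(3)
        ref = rng.random((400, 400))                        # reference speckle image
        deformed = np.roll(ref, shift=(3, 5), axis=(0, 1))  # rigid shift as "strain"

        subset = ref[180:220, 180:220]              # one 40x40 speckle subset
        ncc = match_template(deformed, subset)      # normalized cross-correlation
        ij = np.unravel_index(np.argmax(ncc), ncc.shape)
        displacement = (ij[0] - 180, ij[1] - 180)   # recovered shift: (3, 5)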

  14. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
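
    A comparable workflow can be sketched outside ImageJ; the fragment below assumes scikit-image, a hypothetical scan file name, and simple Otsu thresholding in place of the paper's semi-automatic segmentation.

      import numpy as np
      from skimage import io, filters, measure

      img = io.imread("hand_sample_scan.png", as_gray=True)  # hypothetical hand-sample scan
      mask = img > filters.threshold_otsu(img)               # automatic global threshold
      labels = measure.label(mask)                           # connected components = clasts
      regions = measure.regionprops(labels)
      print(f"areal fraction of clasts: {mask.mean():.1%}")
      print(f"number of clasts: {len(regions)}")
      print(f"mean clast area (pixels): {np.mean([r.area for r in regions]):.0f}")

    Region properties such as area, orientation, and perimeter correspond to the clast size, preferred orientation, and shape-complexity measurements described above.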

  15. Weighting in digital synthetic aperture radar processing

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1979-01-01

    Weighting is employed in synthetic aperture radar (SAR) processing to reduce the sidelobe response at the expense of peak center response height and mainlobe resolution. The weighting effectiveness in digital processing depends not only on the choice of weighting function, but also on the fineness of sampling and quantization, on the time-bandwidth product, on the quadratic phase error, and on the azimuth antenna pattern. The results of simulations conducted to determine the effect of these parameters on azimuth weighting effectiveness are presented. In particular, it is shown that the multilook capabilities of future SAR systems may obviate the need to consider the antenna pattern, and that azimuth time-bandwidth products of over 200 are probably required before the digital results begin to approach the ideal results.
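
    For orientation, the basic trade-off can be reproduced in a few lines: a weighting (window) function lowers the peak sidelobe of the compressed response relative to a uniform aperture. The window choice and FFT size below are illustrative, and this idealized pattern ignores the sampling, quantization, and phase-error effects studied in the paper.

      import numpy as np

      def peak_sidelobe_db(weights, nfft=8192):
          """Peak sidelobe level (dB) of the pattern produced by an aperture weighting."""
          p = np.abs(np.fft.fft(weights, nfft))
          p /= p.max()
          half = p[:nfft // 2]                # one-sided pattern, mainlobe peak at bin 0
          i = 1
          while i < len(half) - 1 and half[i + 1] < half[i]:
              i += 1                          # walk down the mainlobe to the first null
          return 20 * np.log10(half[i:].max())

      n = 256
      print(f"uniform: {peak_sidelobe_db(np.ones(n)):6.1f} dB")     # about -13 dB
      print(f"hamming: {peak_sidelobe_db(np.hamming(n)):6.1f} dB")  # about -43 dB

    The roughly 30 dB sidelobe improvement comes at the cost of a wider mainlobe, which is the resolution penalty the abstract refers to.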

  16. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human operator without loss of concentration on the task at hand. An example of this is the acquisition of two-dimensional (2-D) computer-aided tomography (CAT) images, where a medical decision might be made while the patient is still under observation rather than days later.

  17. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large numbers of data, generated by spectrophotometers, are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  18. Digital signal processing the Tevatron BPM signals

    SciTech Connect

    Cancelo, G.; James, E.; Wolbers, S.; /Fermilab

    2005-05-01

    The Beam Position Monitor (TeV BPM) readout system at Fermilab's Tevatron has been updated and is currently being commissioned. The new BPMs use new analog and digital hardware to achieve better beam position measurement resolution. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton measurements. The signals provided by the two ends of the BPM pickups are processed by analog band-pass filters and sampled by 14-bit ADCs at 74.3 MHz. A crucial part of this work has been the design of digital filters that process the signal. This paper describes the digital processing and estimation techniques used to optimize the beam position measurement. The BPM electronics must operate in narrow-band and wide-band modes to enable measurements of closed-orbit and turn-by-turn positions. The filtering and timing conditions of the signals are tuned accordingly for the operational modes. The analysis and the optimized result for each mode are presented.
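
    For context, the standard stripline BPM position estimate is a difference-over-sum of the amplitudes from the two pickup ends; the sketch below assumes already-filtered amplitude values and a hypothetical sensitivity constant, and is not necessarily the exact estimator used in this system.

      import numpy as np

      def bpm_position(a, b, k_mm=26.0):
          """Difference-over-sum beam position estimate from opposing pickup amplitudes.
          k_mm is a hypothetical pickup sensitivity constant in millimetres."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          return k_mm * (a - b) / (a + b)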

  19. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift invariant and axial, with light propagation orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components; we use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPEG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
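
    Two of the simpler color processing blocks mentioned can be sketched directly; the gains and gamma below are hypothetical placeholders rather than the paper's calibrated values.

      import numpy as np

      def white_balance(rgb, gains=(1.8, 1.0, 1.4)):
          """Scale R, G, B channels; the gains here are hypothetical illuminant corrections."""
          return np.clip(rgb * np.asarray(gains), 0.0, 1.0)

      def gamma_encode(rgb, gamma=2.2):
          """Simple power-law gamma encoding for display."""
          return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

      # linear_rgb: H x W x 3 array in [0, 1] after demosaicing and color correction
      # display_rgb = gamma_encode(white_balance(linear_rgb))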

  20. Parallel digital signal processing architectures for image processing

    NASA Astrophysics Data System (ADS)

    Kshirsagar, Shirish P.; Hartley, David A.; Harvey, David M.; Hobson, Clifford A.

    1994-10-01

    This paper describes research into a high-speed image processing system which uses parallel digital signal processors for the processing of electro-optic images. The objective of the system is to reduce the processing time of non-contact inspection problems, including industrial and medical applications. A single processor cannot deliver the processing power required by such applications; hence, a MIMD system was designed and constructed to enable fast processing of electro-optic images. The Texas Instruments TMS320C40 digital signal processor is used due to its high-speed floating-point CPU and its support for the parallel processing environment. A custom-designed VISION bus is provided to transfer images between processors. The system is being applied to solder joint inspection of high-technology printed circuit boards.

  1. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled the development of high-efficiency gamma-ray spectrometers which can track the path of gamma rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by 100 MHz, 14-bit flash ADCs. Filters that provide timing and high-resolution energies are implemented on local FPGAs acting on the ADC data streams, while interaction point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look toward developing larger spectrometers such as GRETA, with full 4π solid angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.
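
    The energy filters implemented on the FPGAs are commonly trapezoidal (moving-window) shapers for digitized HPGe pulses; the sketch below shows that general recursion under hypothetical rise and gap lengths, and is not GRETINA's actual firmware.

      import numpy as np

      def trapezoidal_filter(pulse, rise=100, gap=40):
          """Moving-window trapezoidal shaping, a common digital energy filter for HPGe pulses."""
          c = np.cumsum(pulse, dtype=float)
          out = np.zeros_like(c)
          L, G = rise, gap
          for n in range(2 * L + G, len(pulse)):
              recent = c[n] - c[n - L]                    # sum over the trailing window
              earlier = c[n - L - G] - c[n - 2 * L - G]   # sum over the leading window
              out[n] = (recent - earlier) / L
          return out   # the flat-top amplitude is proportional to the deposited energy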

  2. C language algorithms for digital signal processing

    SciTech Connect

    Embree, P.M.; Kimble, B.

    1991-01-01

    The use of the C programming language to construct digital signal-processing (DSP) algorithms for operation on high-performance personal computers is described in a textbook for engineering students. Chapters are devoted to the fundamental principles of DSP, basic C programming techniques, user-interface and disk-storage routines, filtering routines, discrete Fourier transforms, matrix and vector routines, and image-processing routines. Also included is a floppy disk containing a library of standard C mathematics, character-string, memory-allocation, and I/O functions; a library of DSP functions; and several sample DSP programs. 83 refs.

  3. Applications of Digital Image Processing XI

    NASA Technical Reports Server (NTRS)

    Cho, Y. -C.

    1988-01-01

    A new technique, digital image velocimetry, is proposed for the measurement of instantaneous velocity fields of time-dependent flows. A time sequence of single-exposure images of seed particles is captured with a high-speed camera, and a finite number of the single-exposure images are sampled within a prescribed period in time. The sampled images are then digitized on an image processor, enhanced, and superimposed to construct an image which is equivalent to a multiple-exposure image used in both laser speckle velocimetry and particle image velocimetry. The superimposed image and a single-exposure image are digitally Fourier transformed for extraction of information on the velocity field. A great enhancement of the dynamic range of the velocity measurement is accomplished through the new technique by manipulating the Fourier transforms of both the single-exposure image and the superimposed image. Also, the direction of the velocity vector is unequivocally determined. With the use of a high-speed video camera, the whole process from image acquisition to velocity determination can be carried out electronically; thus this technique can be developed into a real-time capability.

  4. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  5. Digital signal processing in acoustics. I

    NASA Astrophysics Data System (ADS)

    Davies, H.; McNeil, D. J.

    1985-11-01

    Digital signal processing techniques have gained steadily in importance over the past few years in many areas of science and engineering and have transformed the character of instrumentation used in laboratory and plant. This is particularly marked in acoustics, which has both benefited from the developments in signal processing and provided significant stimulus for these developments. As a result acoustical techniques are now used in a very wide range of applications and acoustics is one area in which digital signal processing is exploited to its limits. For example, the development of fast algorithms for computing Fourier transforms and the associated developments in hardware have led to remarkable advances in the use of spectral analysis as a means of investigating the nature and characteristics of acoustic sources. Speech research has benefited considerably in this respect, and, in a rather more technological application, spectral analysis of machinery noise provides information about changes in machine condition which may indicate imminent failure. More recently the observation that human and animal muscles emit low intensity noise suggests that spectral analysis of this noise may yield information about muscle structure and performance.
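
    A minimal example of such spectral analysis, using Welch's averaged-periodogram method on a synthetic tone buried in noise; all parameters are hypothetical and SciPy is assumed.

      import numpy as np
      from scipy.signal import welch

      fs = 48_000                                    # hypothetical sample rate (Hz)
      t = np.arange(fs) / fs                         # one second of signal
      x = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(fs)  # tone in noise
      f, pxx = welch(x, fs=fs, nperseg=4096)         # FFT-based averaged periodogram
      print(f"dominant component: {f[np.argmax(pxx)]:.0f} Hz")

    In machinery monitoring, shifts or new peaks in such a spectrum are the indicators of changing machine condition mentioned above.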

  6. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  7. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and related areas. The identification can be done by transforming the gray levels so that the gray level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. The researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation, with time complexity O(max(m, n)), where m and n are the numbers of gray levels of the input picture and the reference picture, respectively. Using a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m^3 x n). The algorithm partition problem, an important issue in VLSI design, is discussed, and a verification of the proposed architecture is also given.
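
    For reference, the underlying gray-level transformation can be written serially as CDF-based histogram matching; this is the standard formulation, not the packing method or the VLSI mapping proposed in the paper.

      import numpy as np

      def match_histogram(src, ref):
          """Monotone gray-level transform making src's histogram approximate ref's."""
          s_vals, s_counts = np.unique(src, return_counts=True)
          r_vals, r_counts = np.unique(ref, return_counts=True)
          s_cdf = np.cumsum(s_counts) / src.size
          r_cdf = np.cumsum(r_counts) / ref.size
          mapped = np.interp(s_cdf, r_cdf, r_vals)   # ref gray level with the closest CDF value
          return np.interp(src, s_vals, mapped)      # apply the lookup to every pixel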

  8. Applicability of ACR breast dosimetry methodology to a digital mammography system

    SciTech Connect

    Tomon, John J.; Johnson, Thomas E.; Swenson, Kristin N.; Schauer, David A.

    2006-03-15

    Determination of mean glandular dose (MGD) to breast tissue is an essential aspect of mammography equipment evaluations and exposure controls. The American College of Radiology (ACR) Quality Control Manual outlines the procedure for MGD determination in screen-film mammography based upon conversions of entrance skin exposures (ESEs) measured with an ionization chamber (IC). The development of digital mammography has increased with the demand for improved object resolution and tissue contrast. This change in image receptor from screen-film to a solid-state detector has led to questions about the applicability of the ACR MGD methodology to digital mammography. This research has validated the applicability of the ACR MGD methodology to digital mammography in the GE digital mammography system Senographe 2000D®. MGD was determined using light output measurements from thermoluminescent dosimeters (MGD_TL), exposure measurements from an IC (MGD_IC) and conversion factors from the ACR Mammography Quality Control Manual. MGD_TL and MGD_IC data indicate that there is a statistically significant difference between the two measurements with the Senographe 2000D®. However, the applicability of the ACR's methodology was validated by calculating MGD at various depths in a 50/50 breast phantom. Additionally, the results of backscatter measurements from the image receptors of both mammography modalities indicate there is a difference (all P values <0.001) in the radiation backscattered from each image receptor.

  9. Seismic Rayleigh Wave Digital Processing Technology

    NASA Astrophysics Data System (ADS)

    Jie, Li

    2013-04-01

    In Rayleigh wave exploration, the digital processing of data plays a very important role and directly affects the interpretation results. Therefore, the use of accurate processing software and effective methods in Rayleigh wave exploration has important theoretical and practical significance. Previously, the Rayleigh wave dispersion curve was obtained by one-dimensional phase analysis. This method requires the channel spacing to be less than the effective wavelength, and a minimal phase error can cause great changes in the computed Rayleigh wave phase velocity. The damped least squares method is a local linear model, so the inversion objective function can easily fail to find the global optimal solution. Therefore, the methods and technology used in the past hardly meet the requirements of current Rayleigh wave exploration. This study focused on the technologies and algorithms of F-K domain dispersion curve extraction and GA global non-linear inversion, combined with the influence of Rayleigh wave data acquisition parameters and their characteristics; the design of Rayleigh wave exploration data processing software and the associated processing technology research were completed. Firstly, the article describes the theoretical basis of the Rayleigh wave method, which also underlies the subsequent processing, including the theoretical proof of the existence of Rayleigh wave dispersion in layered strata. Secondly, F-K domain dispersion curve extraction tests showed that the method can overcome the deficiencies of one-dimensional digital processing technology and make full use of the information in multi-channel Rayleigh wave data records. GA global non-linear inversion showed that the inversion does not easily get trapped in a local optimal solution. Thirdly, some examples illustrate the dispersion curve characteristics of each Rayleigh wave mode in the X-T domain, and tests demonstrated their impact on the extraction of dispersion curves. Parameters change example (including the X
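
    The F-K (frequency-wavenumber) transform at the heart of the dispersion-curve extraction is a 2-D Fourier transform of the shot gather; a minimal sketch follows, with hypothetical sampling parameters and no claim to reproduce the study's software.

      import numpy as np

      def fk_spectrum(gather, dt, dx):
          """Amplitude of the 2-D Fourier transform of a shot gather (time x offset)."""
          nt, nx = gather.shape
          spec = np.abs(np.fft.fftshift(np.fft.fft2(gather)))
          f = np.fft.fftshift(np.fft.fftfreq(nt, d=dt))   # temporal frequency axis (Hz)
          k = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))   # spatial frequency axis (1/m)
          return f, k, spec
      # Dispersion curves are picked along energy ridges in spec; the phase velocity
      # at a ridge point is c = f / k.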

  10. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by a classification of land use in Portland, Oregon. The VICAR image processing software system, which consists of a language translator that simplifies the execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs, is described.

  11. Wavelet processing techniques for digital mammography

    NASA Astrophysics Data System (ADS)

    Laine, Andrew F.; Song, Shuwu

    1992-09-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale space feature analysis. Similar to traditional coarse-to-fine matching strategies, the radiologist may first choose to look for coarse features (e.g., dominant mass) within low frequency levels of a wavelet transform and later examine finer features (e.g., microcalcifications) at higher frequency levels. In addition, features may be extracted by applying geometric constraints within each level of the transform. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet representations, enhanced by linear, exponential and constant weight functions through scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improved quality) while requiring less time to evaluate mammograms for most patients (lower costs).
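
    The linear weighting through scale space described above can be sketched with PyWavelets; the wavelet, level count, and per-scale gains are hypothetical, and this does not reproduce the paper's analyzing functions or weighting schemes.

      import numpy as np
      import pywt

      def wavelet_enhance(img, wavelet="db4", levels=3, gains=(2.0, 1.5, 1.2)):
          """Reconstruct after multiplying detail coefficients by per-scale weights."""
          coeffs = pywt.wavedec2(img, wavelet, level=levels)
          out = [coeffs[0]]                          # keep the coarse approximation
          for g, detail in zip(gains, coeffs[1:]):   # gains run from coarsest to finest level
              out.append(tuple(g * d for d in detail))
          return pywt.waverec2(out, wavelet)

    Boosting the finest levels emphasizes microcalcification-scale detail, while the coarser levels control the visibility of larger masses.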

  12. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke-switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  13. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  14. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.

  15. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as control charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), MR (moving…
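
    As a concrete illustration of control charting, the limits for X-bar and R charts can be computed from subgrouped data; the sketch below assumes subgroups of size 5 and the standard table constants for that size.

      import numpy as np

      def xbar_r_limits(subgroups):
          """Control limits for X-bar and R charts, subgroup size 5."""
          a2, d3, d4 = 0.577, 0.0, 2.114       # standard constants for n = 5
          data = np.asarray(subgroups, float)  # shape: (number of subgroups, 5)
          xbar = data.mean(axis=1)
          rng = data.max(axis=1) - data.min(axis=1)
          xbb, rbar = xbar.mean(), rng.mean()
          return {"xbar_limits": (xbb - a2 * rbar, xbb + a2 * rbar),
                  "r_limits": (d3 * rbar, d4 * rbar)}

    Points falling outside these limits signal that the process (here, an educational phenomenon tracked over time) is out of statistical control.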

  16. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    A Controller Performance Evaluation (CPE) methodology for multi-input/multi-output digital control systems was developed and tested on an aeroelastic wind-tunnel model. Modern signal processing methods were used to implement control laws and to acquire time domain data of the whole system (controller and plant) from which appropriate transfer matrices of the control system could be generated. Matrix computational procedures were used to calculate singular values of return-difference matrices at the plant input and output points to evaluate the performance of the control system. The CPE procedures effectively identified potentially destabilizing controllers and confirmed the satisfactory performance of stabilizing ones.
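
    The singular-value computation at the center of the CPE methodology can be sketched as follows; the loop transfer function is left abstract, and this is a generic formulation rather than the authors' implementation.

      import numpy as np

      def min_singular_values(loop_tf, freqs):
          """Minimum singular value of the return-difference matrix I + L(jw) at each frequency.
          loop_tf(w) must return the complex m x m open-loop transfer matrix at w (rad/s)."""
          m = loop_tf(freqs[0]).shape[0]
          return np.array([
              np.linalg.svd(np.eye(m) + loop_tf(w), compute_uv=False).min()
              for w in freqs
          ])
      # Small minimum singular values over some band flag frequencies where the
      # closed loop is close to instability, i.e., a potentially destabilizing controller.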

  17. [Process of perversion. Methodological and clinical study].

    PubMed

    Marchais, P

    1975-07-01

    Studies in classical psychiatry and psychoanalysis have reduced perversions to pathological phenomena, progressively lessening the moral criterion. Applying a comprehensive method to the study of acquired perversions leads one to consider various levels of observation. Properties common to each of these levels then appear, making it possible to isolate a general process of perversion. This process, which keeps undeniable links with mental pathology, must nevertheless be differentiated from it, for it cannot necessarily be assimilated or reduced to the latter. Moreover, it entails notable consequences on the social and cultural level. PMID:1233902

  18. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  19. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  20. SOME CONCEPTS PERTAINING TO INVESTIGATIVE METHODOLOGY FOR SUBSURFACE PROCESS RESEARCH

    EPA Science Inventory

    Problems of investigative methodology comprise a critical and often preponderant element of research to delineate and quantitate processes which govern the transport and fate of pollutants in subsurface environments. Examination of several recent research studies illustrates that...

  1. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article describes process changes made over a 4-year period as a result of applying the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discusses further work needed to continue to refine this critical nursing care process. PMID:21964442

  2. On digital image processing technology and application in geometric measure

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Xing, Ruonan; Liao, Na

    2014-04-01

    Digital image processing is an emerging technique that has developed alongside semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the principles of digital image processing in measurement and compares the technique with traditional optical measurement methods. Taking geometric measurement as an example, it discusses the development tendency of digital image processing technology from the perspective of technology application.

  3. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, 3-tier 3D detector with one sensor layer and two ASIC layers, containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full-custom analog pixel is 65 μm x 65 μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248 cm x 1.248 cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full-custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact, parasitically aware layout. The methodology uses the Cadence design platform, although it is not limited to this tool.

  4. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    NASA Astrophysics Data System (ADS)

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-01

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large-area, 3-tier 3D detector with one sensor layer and two ASIC layers, containing one analog and one digital tier, is built for x-ray photon time-of-arrival measurement and imaging. A full-custom analog pixel is 65 μm x 65 μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248 cm x 1.248 cm ASIC. Each chip has 720 bump-bond I/O connections on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full-custom, semi-custom and automated timing-driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact, parasitically aware layout. The methodology uses the Cadence design platform, although it is not limited to this tool.

  5. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of processes for creating and assigning DOIs and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014, and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider so chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering it with EZID. The DOI update process allows changing any DOI metadata except the DOI name, unless the name has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 DOIs are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  6. Design methodology for high-speed video processing system based on signal integrity analysis

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Zhang, Hao

    2009-07-01

    On account of the high performance requirements of video processing systems and the shortcomings of conventional circuit design methods, a design methodology based on signal integrity (SI) theory is proposed for a high-speed video processing system built around TI's TMS320DM642 digital signal processor. With this methodology, the PCB stack-up and construction of the system as well as the transmission line characteristic impedances are first set and calculated with the impedance control tool Si8000. Crucial signals, such as the SDRAM data lines, are then simulated and analyzed with IBIS models so that reasonable layout and routing rules can be established. Finally, the system's high-density PCB design is completed on the Cadence SPB15.7 platform. The design results show that this methodology can effectively restrain signal reflection, crosstalk, rail collapse noise and electromagnetic interference (EMI), thus significantly improving the stability of the system and shortening development cycles.
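
    Impedance control of the kind performed with Si8000 rests on closed-form transmission-line approximations; as a rough illustration, the widely used IPC-2141 surface-microstrip formula can be evaluated directly (the geometry and permittivity below are hypothetical, and field solvers such as Si8000 are more accurate).

      import math

      def microstrip_z0(h_mm, w_mm, t_mm, er):
          """IPC-2141 closed-form estimate of surface microstrip impedance (ohms).
          h: dielectric height, w: trace width, t: trace thickness, er: relative permittivity."""
          return 87.0 / math.sqrt(er + 1.41) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

      print(f"Z0 = {microstrip_z0(0.2, 0.3, 0.035, 4.3):.1f} ohm")  # hypothetical FR-4 geometry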

  7. Digital signal processing using virtual instruments

    NASA Astrophysics Data System (ADS)

    Anderson, James A.; Korrapati, Raghu; Swain, Nikunja K.

    2000-08-01

    The area of test and measurement is changing rapidly because of recent developments in software and hardware. Test and measurement systems are increasingly becoming PC based, and most of these PC-based systems use a graphical programming language to design test and measurement modules called virtual instruments (VIs). These VIs provide visual representations of data or models and make understanding abstract concepts and algorithms easier, allowing users to express their ideas in a concise manner. One such virtual instrument package is LabVIEW from National Instruments Corporation of Austin, Texas. This software package is one of the first graphical programming products and is currently used in a number of academic institutions and industries, and by the Department of Defense, the Department of Energy, and the National Aeronautics and Space Administration for various test, measurement, and control applications. LabVIEW has an extensive built-in VI library that can be used to design and develop solutions for different applications. Besides using the built-in VI modules in LabVIEW, the user can easily design new VI modules. This paper discusses the use of LabVIEW to design and develop digital signal processing VI modules such as Fourier analysis and windowing. Instructors can use these modules to teach some signal processing concepts effectively.

  8. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  9. Digital-Difference Processing For Collision Avoidance.

    NASA Technical Reports Server (NTRS)

    Shores, Paul; Lichtenberg, Chris; Kobayashi, Herbert S.; Cunningham, Allen R.

    1988-01-01

    Digital system for automotive crash avoidance measures and displays difference in frequency between two sinusoidal input signals of slightly different frequencies. Designed for use with Doppler radars. Characterized as digital mixer coupled to frequency counter measuring difference frequency in mixer output. Technique determines target path mathematically. Used for tracking cars, missiles, bullets, baseballs, and other fast-moving objects.

  10. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of art creation, or the creation process, has been the subject of much debate and research for at least the last fifty years, even though thinking about art and beauty was already a subject of analysis for ancient Greeks such as Plato and Aristotle. Even though intuitively it is a simple phenomenon, creativity, or the human ability to generate innovation (new ideas, concepts, etc.), is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. The development process that supports the intellectual act of creation is usually described as a procedure in which the artist experiments with the medium and explores it with one or more techniques, changing shapes, forms and appearances, seeking, beyond time and space, a way out to a clearing, i.e., envisaging a path from intention to realization. Duchamp, in his lecture "The Creative Act", states that the artist is never alone with his/her artwork; there is always the spectator, who will later react critically to the work of art. If the artist succeeds in transmitting his/her intentions, in terms of a message, emotion or feeling, to the spectator, then a form of aesthetic osmosis actually takes place through the inert matter (the medium) that enabled this communication or interaction phenomenon to occur. The role of the spectator may gradually become more active by interacting with the artwork itself, possibly changing it or becoming a part of it [2][4].

  11. The spatial structure of terrain - A process signal in satellite digital images

    NASA Technical Reports Server (NTRS)

    Craig, R. G.

    1984-01-01

    Pattern recognition procedures applied to Landsat imagery carry an implicit assumption that the digital data are independently distributed. That assumption is incorrect over virtually any terrain. Deviations from independence occur because slopes follow a systematic pattern of variation arising from the slope-forming processes. That pattern can be identified using the stochastic process methodology of Box and Jenkins. Angles of adjacent slopes are autocorrelated, and the bidirectional reflectance function transfers these systematic slope changes to the sensor. Imagery becomes autocorrelated through this transfer. Autocorrelation in the imagery can be removed through direct calculation from a digital elevation model or by use of stochastic process methodology. The latter has the advantage that the residuals are white noise, and it is applicable in any area, even where a D.E.M. is unavailable. The stochastic process signal can be used to study terrain processes.
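
    A minimal sketch of the Box-Jenkins whitening idea: fit an AR(1) model to an along-track profile and keep the residuals. The synthetic data and model order are hypothetical, and statsmodels is assumed to be available.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Hypothetical along-track sequence of slope angles (or pixel brightness values)
      rng = np.random.default_rng(0)
      profile = np.empty(500)
      profile[0] = 0.0
      for i in range(1, 500):                      # synthesize an AR(1) series
          profile[i] = 0.7 * profile[i - 1] + rng.normal()

      fit = ARIMA(profile, order=(1, 0, 0)).fit()  # Box-Jenkins AR(1) model
      print(fit.params)                            # constant, AR coefficient, noise variance
      residuals = fit.resid                        # approximately white noise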

  12. Closed orbit feedback with digital signal processing

    SciTech Connect

    Chung, Y.; Kirchman, J.; Lenkszus, F.

    1994-08-01

    The closed orbit feedback experiment conducted on the SPEAR using the singular value decomposition (SVD) technique and digital signal processing (DSP) is presented. The beam response matrix, defined as beam motion at beam position monitor (BPM) locations per unit kick by corrector magnets, was measured and then analyzed using SVD. Ten BPMs, sixteen correctors, and the eight largest SVD eigenvalues were used for closed orbit correction. The maximum sampling frequency for the closed loop feedback was measured at 37 Hz. Using the proportional and integral (PI) control algorithm with the gains Kp = 3 and Ki = 0.05 and the open-loop bandwidth corresponding to 1% of the sampling frequency, a correction bandwidth (-3 dB) of approximately 0.8 Hz was achieved. Time domain measurements showed that the response time of the closed loop feedback system for 1/e decay was approximately 0.25 second. This result implies an approximately 100 Hz correction bandwidth for the planned beam position feedback system for the Advanced Photon Source storage ring with the projected 4-kHz sampling frequency.
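
    The truncated-SVD correction described above can be sketched compactly; the response matrix and mode count are placeholders, and this is a generic formulation rather than the experiment's actual feedback code.

      import numpy as np

      def corrector_kicks(response, orbit, n_modes=8):
          """Corrector settings that cancel the measured orbit, using a truncated-SVD
          pseudo-inverse of the BPM/corrector response matrix (shape: n_bpm x n_corr)."""
          u, s, vt = np.linalg.svd(response, full_matrices=False)
          s_inv = np.zeros_like(s)
          s_inv[:n_modes] = 1.0 / s[:n_modes]    # keep only the largest singular values
          return -vt.T @ (s_inv * (u.T @ orbit))

    Truncating to the largest modes (eight of ten here) rejects poorly conditioned correction directions that would otherwise amplify BPM noise.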

  13. Multilingual subjective methodology and evaluation of low-rate digital voice processors

    NASA Astrophysics Data System (ADS)

    Dimolitsas, Spiros; Corcoran, Franklin L.; Baraniecki, Marion R.; Phipps, John G., Jr.

    The methodology and results of a multilingual evaluation of source encoding algorithms operating at 16 kbit/s are presented. The evaluation was conducted in three languages (English, French, and Mandarin), using listener opinion subjective assessments to determine whether 'toll-quality' performance is possible at 16 kbit/s. The study demonstrated that toll-quality voice is indeed possible at 16 kbit/s, and that several of the methods evaluated are more robust under high bit error conditions than either 32- or 64-kbit/s encoding. Thus, 16-kbit/s voice coding technology is currently suitable for many applications within the public switched telephone network, including the next generation of digital circuit multiplication equipment and integrated services digital network videotelephony.

  14. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  15. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  16. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.
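
    For orientation, the modulation mapping at the core of such a modem is simple to express digitally; the sketch below shows baseband Gray-coded QPSK with rectangular pulses, omitting the pulse shaping, carrier handling, and burst framing a real modem requires.

      import numpy as np

      def qpsk_modulate(bits, samples_per_symbol=8):
          """Map bit pairs to Gray-coded QPSK symbols on the unit circle (baseband,
          rectangular pulses; no pulse shaping or carrier)."""
          b = np.asarray(bits, int).reshape(-1, 2)
          symbols = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)
          return np.repeat(symbols, samples_per_symbol)

      waveform = qpsk_modulate([0, 1, 1, 1, 0, 0])   # three symbols

    BPSK is the degenerate case in which only the real axis is used, which is why a single programmable DSP chain can support both modes.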

  17. A Phenomenological Study of an Emergent National Digital Library, Part I: Theory and Methodological Framework

    ERIC Educational Resources Information Center

    Dalbello, Marija

    2005-01-01

    The activities surrounding the National Digital Library Program (NDLP) at the Library of Congress (1995-2000) are used to study institutional processes associated with technological innovation in the library context. The study identified modalities of successful innovation and the characteristics of creative decision making. Theories of social…

  18. Digital image processing: a primer for JVIR authors and readers: part 2: digital image acquisition.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-11-01

    This is the second installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first article of the series, we reviewed the fundamentals of digital image architecture. In this article, we describe the ways that an author can import digital images to the computer desktop. We explore the modern imaging network and explain how to import picture archiving and communications systems (PACS) images to the desktop. Options and techniques for producing digital hard copy film are also presented. PMID:14605101

  19. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e., the designers) to create a compelling digital story, experts have introduced various sets of processes. Nevertheless, the processes they suggest vary, and some are redundant. The main aim of this study is to propose a single process guide for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be applied to other multimedia materials that use the concept of DST.

  20. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consist of an all-digital, minimum-bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
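
    The digital carrier down-conversion step can be sketched as multiplication by a numerically controlled oscillator whose phase is advanced digitally; the sketch below is a generic illustration, not the patented minimum-bit implementation.

      import numpy as np

      def carrier_downconvert(samples, fs, f_carrier, phase0=0.0):
          """Mix digitized IF samples to complex baseband with a numerically
          controlled oscillator (NCO); the phase argument is advanced digitally."""
          n = np.arange(len(samples))
          lo = np.exp(-1j * (2 * np.pi * f_carrier * n / fs + phase0))
          return samples * lo
      # After down-conversion, correlation with a local replica of the C/A code
      # yields the code-phase (delay) and carrier-phase observables.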

  1. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  2. Methodology for utilizing CD distributions for optimization of lithographic processes

    NASA Astrophysics Data System (ADS)

    Charrier, Edward W.; Mack, Chris A.; Zuo, Qiang; Maslow, Mark J.

    1997-07-01

    As the critical dimension (CD) of optical lithography processes continues to decrease, the process latitude also decreases and CD control becomes more difficult. As this trend continues, lithography engineers will find that they require improved process optimization methods which take into account the random and systematic errors that are inherent in any manufacturing process. This paper shows the methodology of such an optimization method. Lithography simulation and analysis software, combined with experimental process error distributions, are used to perform optimizations of numerical aperture and partial coherence, as well as the selection of the best OPC pattern for a given mask.
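
    The flavor of the optimization can be sketched quickly. In the toy Python below, the CD model and the error distributions are invented placeholders (the paper uses lithography simulation software and measured distributions): each candidate (NA, partial coherence) setting is scored by the expected CD error under the process error distribution.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulated_cd(na, sigma, focus, dose):
            """Stand-in for a lithography simulator returning CD in nm."""
            return 180 + 40 * focus**2 * (1.5 - na) + 8 * (dose - 1.0) - 5 * (sigma - 0.5)

        focus_err = rng.normal(0.0, 0.15, 2000)    # focus errors, um (assumed)
        dose_err = rng.normal(1.0, 0.03, 2000)     # relative dose errors (assumed)

        target, best = 180.0, None
        for na in np.arange(0.50, 0.71, 0.05):         # numerical aperture candidates
            for sigma in np.arange(0.3, 0.81, 0.1):    # partial coherence candidates
                cd = simulated_cd(na, sigma, focus_err, dose_err)
                rms = np.sqrt(np.mean((cd - target) ** 2))
                if best is None or rms < best[0]:
                    best = (rms, na, sigma)
        print("RMS CD error %.2f nm at NA=%.2f, sigma=%.1f" % best)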

  3. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  4. Digital computer processing of X-ray photos

    NASA Technical Reports Server (NTRS)

    Nathan, R.; Selzer, R. H.

    1967-01-01

    Digital computers correct various distortions in medical and biological photographs. One of the principal methods of computer enhancement involves the use of a two-dimensional digital filter to modify the frequency spectrum of the picture. Another computer processing method is image subtraction.
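
    Both methods are easy to state in modern terms. A compact illustration (Python/NumPy, obviously not the 1967 implementation): a two-dimensional frequency-domain filter that attenuates low spatial frequencies to sharpen detail, plus plain image subtraction.

        import numpy as np

        def highpass_enhance(image, cutoff=0.1):
            """Modify the 2-D frequency spectrum: suppress low spatial frequencies."""
            F = np.fft.fftshift(np.fft.fft2(image))
            h, w = image.shape
            y, x = np.ogrid[-h//2:h - h//2, -w//2:w - w//2]
            radius = np.sqrt((y / h) ** 2 + (x / w) ** 2)
            F *= radius > cutoff                    # crude ideal high-pass mask
            return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

        rng = np.random.default_rng(1)
        before = rng.random((64, 64))
        after = before.copy()
        after[20:30, 20:30] += 0.5                  # simulated change
        difference = after - before                 # image subtraction
        enhanced = highpass_enhance(after)          # frequency-domain filtering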

  5. The Technology Transfer Process: Concepts, Framework and Methodology.

    ERIC Educational Resources Information Center

    Jolly, James A.

    This paper discusses the conceptual framework and methodology of the technology transfer process and develops a model of the transfer mechanism. This model is then transformed into a predictive model of technology transfer incorporating nine factors that contribute to the movement of knowledge from source to user. Each of these factors is examined…

  6. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
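
    In symbols (notation assumed here, since the abstract is truncated), the steady-state index described reads

        I_p = \frac{\sum_j \dot{m}_{\text{pollutant},\,j}}{\sum_k \dot{m}_{\text{product},\,k}}

    i.e., the total mass flow of pollutants leaving the process per unit mass flow of products, so that a lower index indicates a cleaner design.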

  7. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  8. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.

  9. The teaching of computer programming and digital image processing in radiography.

    PubMed

    Allan, G L; Zylinski, J

    1998-06-01

    The increased use of digital processing techniques in Medical Radiations imaging modalities, along with the rapid advance in information technology, has resulted in a significant change in the delivery of radiographic teaching programs. This paper details a methodology used to concurrently educate radiographers in both computer programming and image processing. The students learn to program in Visual Basic for Applications (VBA), and the programming skills are contextualised by requiring the students to write a digital subtraction angiography (DSA) package. Program code generation and the image presentation interface are provided by the Microsoft Excel spreadsheet. The user-friendly nature of this common interface enables all students to readily begin program creation. The teaching of programming and image processing skills by this method may be readily generalised to other vocational fields where digital image manipulation is a professional requirement. PMID:9726504
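
    The computational heart of the students' DSA package is a per-pixel subtraction of a pre-contrast mask from a contrast image, usually after a log transform. A minimal sketch of that idea follows, written in Python rather than the course's VBA/Excel environment:

        import numpy as np

        def dsa(mask, contrast):
            """Digital subtraction angiography: log-subtract the pre-contrast mask."""
            eps = 1e-6                                  # avoid log(0)
            sub = np.log(contrast + eps) - np.log(mask + eps)
            sub -= sub.min()                            # window into a display range
            return (255 * sub / np.ptp(sub)).astype(np.uint8)

        rng = np.random.default_rng(2)
        mask = 0.5 + 0.1 * rng.random((128, 128))       # anatomy only
        contrast = mask.copy()
        contrast[60:68, :] *= 0.7                       # simulated opacified vessel
        display = dsa(mask, contrast)                   # vessel stands out, anatomy cancels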

  10. Effective DQE (eDQE) and speed of digital radiographic systems: An experimental methodology

    PubMed Central

    Samei, Ehsan; Ranger, Nicole T.; MacKenzie, Alistair; Honey, Ian D.; Dobbins, James T.; Ravin, Carl E.

    2009-01-01

    Prior studies on performance evaluation of digital radiographic systems have primarily focused on the assessment of the detector performance alone. However, the clinical performance of such systems is also substantially impacted by magnification, focal spot blur, the presence of scattered radiation, and the presence of an antiscatter grid. The purpose of this study is to evaluate an experimental methodology to assess the performance of a digital radiographic system, including those attributes, and to propose a new metric, effective detective quantum efficiency (eDQE), a candidate for defining the efficiency or speed of digital radiographic imaging systems. The study employed a geometric phantom simulating the attenuation and scatter properties of the adult human thorax and a representative indirect flat-panel-based clinical digital radiographic imaging system. The noise power spectrum (NPS) was derived from images of the phantom acquired at three exposure levels spanning the operating range of the clinical system. The modulation transfer function (MTF) was measured using an edge device positioned at the surface of the phantom, facing the x-ray source. Scatter measurements were made using a beam stop technique. The eDQE was then computed from these measurements, along with measures of phantom attenuation and x-ray flux. The MTF results showed notable impact from the focal spot blur, while the NPS depicted a large component of structured noise resulting from use of an antiscatter grid. The eDQE was found to be an order of magnitude lower than the conventional DQE. At 120 kVp, eDQE(0) was in the 8%–9% range, fivefold lower than DQE(0) at the same technique. The eDQE method yielded reproducible estimates of the system performance in a clinically relevant context by quantifying the inherent speed of the system, that is, the actual signal to noise ratio that would be measured under clinical operating conditions. PMID:19746814
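
    For orientation, the conventional frequency-dependent DQE that the eDQE generalizes is

        \mathrm{DQE}(f) = \frac{\mathrm{MTF}^2(f)}{\bar{q}\,\mathrm{NNPS}(f)}

    where q-bar is the incident photon fluence and NNPS the normalized noise power spectrum. The eDQE of this paper is measured through the phantom, so its MTF, NNPS, and fluence terms additionally absorb focal spot blur, scatter fraction, and phantom transmission; the exact eDQE expression is given in the paper and is not reproduced here.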

  11. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
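
    The suggested trick, recovering a subroutine's input from its observed output, is ordinary root-finding on g(x) = f(x) - y. A generic sketch (Python; the subroutine below is a stand-in, not flight control code):

        def solve_inverse(f, y_target, x0, tol=1e-10, max_iter=50):
            """Newton-Raphson: find x such that f(x) = y_target."""
            x = x0
            for _ in range(max_iter):
                g = f(x) - y_target
                if abs(g) < tol:
                    return x
                h = 1e-6 * max(1.0, abs(x))
                slope = (f(x + h) - f(x)) / h      # finite-difference derivative
                x -= g / slope                     # Newton update
            raise RuntimeError("no convergence")

        # Invert a stand-in subroutine y = x**3 + 2x; converges in a few steps.
        x = solve_inverse(lambda x: x**3 + 2 * x, y_target=10.0, x0=1.0)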

  12. Categorical digital soil database for the Carpathian-basin using the modified e-SOTER methodology

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Vadnai, Péter; Micheli, Erika; Pasztor, Laszlo

    2015-04-01

    Harmonized, spatially and thematically consistent, high resolution soil data covering larger regions or several countries are needed for many applications in order to model different environmental and socio-economic scenarios. The only way to produce such data with large spatial coverage and high resolution is to make use of the available high resolution digital data sources and digital soil mapping tools in the development process. Digital soil mapping has become a very efficient tool in soil science, and several applications have been published on this topic recently. Many of these applications use environmental covariates such as remotely sensed images and digital elevation models, which are raster based data sources with block support. The majority of soil data users require data in raster format with values of certain properties, such as pH, clay content or soil organic matter content. However, the use of these soil properties alone is often limited; an adequate interpretation of the numbers requires knowledge of the soil system and its major processes and process associations. This soil system description can best be done using the existing knowledge of soil science as expressed in soil classification: as diagnostic features, materials and horizons, as important descriptive information, and as the classification categories. The most commonly used and internationally accepted classification system is the World Reference Base for soil description, the so-called WRB. Each soil classification category represents a complex association of processes and properties, which is difficult to use, understand and map due to the complex information behind the category names. The major advantage of diagnostics based classification systems such as the WRB is that the complex soil categories and classes can be interpreted as unique combinations of the diagnostic features. Therefore each class can be disaggregated into several diagnostics, each carrying independent useful information

  13. Application of digital image processing techniques to astronomical imagery 1977

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.

    1978-01-01

    Nine specific techniques or combinations of techniques developed for applying digital image processing technology to existing astronomical imagery are described. Photoproducts are included to illustrate the results of each of these investigations.

  14. Quantization effects in radiation spectroscopy based on digital pulse processing

    SciTech Connect

    Jordanov, V. T.; Jordanova, K. V.

    2011-07-01

    Radiation spectra represent inherently quantized data in the form of stacked channels of equal width. The spectrum is an experimental measurement of the discrete probability density function (PDF) of the detector pulse heights. The quantization granularity of the spectra depends on the total number of channels covering the full range of pulse heights. In analog pulse processing the total number of channels is equal to the total number of digital values produced by a spectroscopy analog-to-digital converter (ADC). In digital pulse processing each detector pulse is sampled and quantized by a fast ADC, producing a certain number of quantized numerical values. These digital values are linearly processed to obtain a digital quantity representing the peak of the digitally shaped pulse. Using digital pulse processing it is possible to acquire a spectrum with a total number of channels greater than the number of ADC values. Noise and sample averaging are important in the transformation of ADC quantized data into spectral quantized data. Analysis of this transformation is performed using an area sampling model of quantization. Spectrum differential nonlinearity (DNL) is shown to be related to the quantization at low noise levels and small numbers of averaged samples. Theoretical analysis and experimental measurements are used to obtain the condition to minimize the DNL due to quantization. (authors)
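
    The noise-plus-averaging mechanism is easy to reproduce numerically. In this Python sketch (parameters illustrative), noisy samples of each pulse are quantized by the "ADC", averaged, and histogrammed into spectral channels eight times finer than the ADC step; with little noise or few averaged samples the fine channels show large DNL, and with more of either the channel contents flatten out.

        import numpy as np

        rng = np.random.default_rng(3)
        n_avg = 16                                   # samples averaged per event
        heights = rng.uniform(100, 101, 200_000)     # true heights, in ADC steps

        noise = rng.normal(0.0, 0.5, (heights.size, n_avg))   # noise sigma, ADC steps
        quantized = np.round(heights[:, None] + noise)        # per-sample quantization
        estimates = quantized.mean(axis=1)                    # averaged pulse height

        # Spectrum with 8 channels per ADC step.
        counts, _ = np.histogram(estimates, bins=np.arange(100, 101.001, 1 / 8))
        dnl = counts / counts.mean() - 1             # per-channel deviation from uniform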

  15. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
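
    One of the listed outputs, the Allan deviation of phase, reduces to second differences of the phase record; a non-overlapping version is sketched below (Python; not the analyzer's actual code, and the sampling interval is assumed).

        import numpy as np

        def allan_deviation(phase, tau0, m):
            """Allan deviation at tau = m*tau0 from a phase record in seconds."""
            x = phase[::m]                     # decimate phase to spacing tau
            tau = m * tau0
            d2 = np.diff(x, n=2)               # x[i+2] - 2*x[i+1] + x[i]
            return np.sqrt(np.sum(d2**2) / (2 * tau**2 * d2.size))

        rng = np.random.default_rng(4)
        tau0 = 1e-3                            # 1 ms phase samples (assumed)
        phase = np.cumsum(rng.normal(0.0, 1e-9, 100_000))   # simulated phase noise, s
        adev = [allan_deviation(phase, tau0, m) for m in (1, 2, 4, 8, 16)]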

  16. Agricultural inventory capabilities of machine processed LANDSAT digital data

    NASA Technical Reports Server (NTRS)

    Dietrick, D. L.; Fries, R. E.; Egbert, D. D.

    1975-01-01

    Agricultural crop identification and acreage determination analysis of LANDSAT digital data was performed for two study areas. A multispectral image processing and analysis system was utilized to perform the man-machine interactive analysis. The developed techniques yielded crop acreage estimates with accuracy greater than 90% and as high as 99%. These results are encouraging evidence of the agricultural inventory capabilities of machine-processed LANDSAT digital data.

  17. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults must undergo clinical exams such as the serum bilirubin test, which can be traumatic for patients. Jaundice often accompanies liver diseases such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults by a pain-free method. By acquiring color digital images of the palms, soles, and forehead, we analyze RGB attributes and diffuse reflectance spectra as parameters to characterize patients with and without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine, we distinguish between healthy and sick patients.
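
    A toy version of the classification step (Python with scikit-learn; the data are synthetic, not the study's, and the color shift is only illustrative): mean RGB values per patient as features, an SVM separating jaundiced from healthy.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        healthy = rng.normal([180, 140, 120], 10, (50, 3))     # mean R, G, B
        jaundiced = rng.normal([200, 170, 100], 10, (50, 3))   # shifted toward yellow
        X = np.vstack([healthy, jaundiced])
        y = np.array([0] * 50 + [1] * 50)                      # 0 healthy, 1 jaundiced

        clf = SVC(kernel="rbf").fit(X, y)
        print(clf.predict([[195, 165, 105]]))                  # -> likely [1]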

  18. The processing of two-digit numbers in bilinguals.

    PubMed

    Macizo, Pedro; Herrera, Amparo; Román, Patricia; Martín, María Cruz

    2011-08-01

    We explored possible between-language influences when bilinguals processed two-digit numbers. Spanish/English bilinguals and German/English bilinguals performed a number comparison task with Arabic digits and verbal numbers in their first language (L1) and second language (L2) while unit-decade compatibility was manipulated. The two bilingual groups showed a regular compatibility effect with Arabic digits. In L1, Spanish/English bilinguals showed a reverse compatibility effect, while German/English bilinguals showed a regular compatibility effect. However, both groups of bilinguals presented a reverse compatibility effect in English (L2), which suggested that the bilinguals' L1 did not determine the processing of number words in their L2. The results indicated that bilinguals processed two-digit number words selectively in their L1 and L2 and that they did not transcode L2 numbers into Arabic format. PMID:21752000

  19. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    PubMed

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently no international standard, nor does a globally harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and the concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes; this contribution should, together with the rest of the model, enable more efficient and effective DFI while ensuring the admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. PMID:26258644

  20. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research areas associated with digital signal processing and control and estimation theory are identified. Particular attention is given to image processing, system identification problems (parameter identification, linear prediction, least squares, Kalman filtering), stability analyses (the use of the Liapunov theory, frequency domain criteria, passivity), and multiparameter systems, distributed processes, and random fields.

  1. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on interpersonal communication processes and describes them by example. Building on a brief account of the theoretical concepts of the digital factory, the communicative features of the digital factory are illustrated. Practical interrelations of interpersonal communication were analyzed from a human-oriented view in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed within the process analysis; it makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of a survey on communication analysis and on the process models produced with the modeling method, it was possible to structure the processes in a way suitable for humans and to achieve a positive effect on the communication processes.

  2. Digital image processing of earth observation sensor data

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1976-01-01

    This paper describes digital image processing techniques that were developed to precisely correct Landsat multispectral earth observation data and gives illustrations of the results achieved, e.g., geometric corrections with an error of less than one picture element, a relative error of one-fourth picture element, and no radiometric error effect. Techniques for enhancing the sensor data, digitally mosaicking multiple scenes, and extracting information are also illustrated.

  3. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for measurement of exponentially-decaying signals, such as those found in fluorescence, lifetime-based, fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive-detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decay (>500 µs) and 0.1% for fast decay.
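
    The underlying estimation problem, extracting a lifetime from an exponentially decaying record, can be sketched with an ordinary least-squares fit (Python/SciPy). The paper's modified phase-sensitive-detection scheme, with its tracking modulation period, is not reproduced here; all numbers below are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, a, tau, c):
            return a * np.exp(-t / tau) + c

        rng = np.random.default_rng(6)
        t = np.linspace(0.0, 5e-3, 500)                 # 5 ms record
        true_tau = 800e-6                               # 800 us lifetime (assumed)
        signal = decay(t, 1.0, true_tau, 0.02) + rng.normal(0.0, 0.01, t.size)

        (a, tau, c), _ = curve_fit(decay, t, signal, p0=(1.0, 1e-3, 0.0))
        print(f"estimated lifetime: {tau * 1e6:.1f} us")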

  4. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U.M.; Rubin, E.S.

    1994-02-01

    Parameter design is a method popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology for chemical processes reported that the use of Taguchi's method was not justified and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for application of parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
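
    The sampling idea is compact enough to show directly (Python; SciPy's qmc module stands in for the paper's ASPEN-coupled stochastic modeling capability, and the flowsheet and distributions are placeholders): each input dimension is stratified, so far fewer samples cover the uncertainty space than plain Monte Carlo.

        import numpy as np
        from scipy.stats import norm, qmc

        sampler = qmc.LatinHypercube(d=2, seed=7)
        u = sampler.random(n=50)                   # 50 stratified points in [0,1)^2

        # Map to uncertain process inputs (distributions assumed for illustration).
        feed_temp = norm(350.0, 5.0).ppf(u[:, 0])  # K
        feed_rate = norm(2.0, 0.1).ppf(u[:, 1])    # kg/s

        def flowsheet(temp, rate):
            """Placeholder for one simulator evaluation returning product yield."""
            return 0.9 - 1e-4 * (temp - 350.0) ** 2 - 0.2 * (rate - 2.0) ** 2

        yields = flowsheet(feed_temp, feed_rate)
        print(yields.mean(), yields.std())         # statistics of the design metric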

  5. Nuclear spectroscopy pulse height analysis based on digital signal processing techniques

    SciTech Connect

    Simoes, J.B.; Simoes, P.C.P.S.; Correia, C.M.B.A.

    1995-08-01

    A digital approach to pulse height analysis is presented. It consists of digitizing the entire pulse with a flash analog-to-digital converter (ADC); the pulse height is then estimated by a floating point digital signal processor (DSP) as one parameter of a model best fitted to the pulse samples. The differential nonlinearity (DNL) is reduced by simultaneously adding to the pulse, prior to its digitization, two analog signals provided by a digital-to-analog converter (DAC). One of them is a small amplitude dither signal used to eliminate a bias introduced by the fitting algorithm. The other, with large amplitude, corrects the ADC nonlinearities by a method similar to the well known Gatti's sliding scale. The simulations carried out showed that, using a 12-bit flash ADC, a 14-bit DAC and a dedicated floating point DSP performing a polynomial fitting to the samples around the pulse peak, it is possible to process about 10,000 events per second, with a dispersion for constant-height pulses of only 4 out of 8,192 channels and a very good differential linearity. A prototype system based on the Texas Instruments floating point DSP TMS320C31 and built following the presented methodology has already been tested and performed as expected.

  6. [Fundamental bases of digital information processing in nuclear cardiology (III)].

    PubMed

    Cuarón, A; González, C; García Moreira, C

    1984-01-01

    This article describes the transformation of gamma-camera images into digital form. The incidence of a gamma photon on the detector produces two voltage pulses, proportional to the coordinates of the incidence point, and a digital pulse indicating the occurrence of the event. The coordinate pulses pass through an analog-to-digital converter that is activated by that pulse. The result is a digital number at the output of the converter, proportional to the voltage at its input. This number is stored in the accumulation memory of the system, in either list mode or matrix mode. Static images can be stored in a single matrix. Dynamic data can be stored in a series of matrices, each representing a different period of acquisition. It is also possible to capture information in a series of matrices synchronized with the electrocardiogram of the patient; in this case, each matrix represents a distinct period of the cardiac cycle. Data stored in the memory can be used to process and display images and quantitative histograms on a video screen. To do so, it is necessary to translate the digital data in memory to voltage levels and to transform these into light levels on the screen; this is achieved through a digital-to-analog converter. The reading of the digital memory must be synchronous with the electronic scanning of the video screen. PMID:6466002

  7. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan; Tran, Binh-Nien

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the main processing speed, allowing the use of low cost FPGAs.
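
    The LUT trick is easy to imitate in software (Python; the constellation, word sizes and carrier ratio are illustrative): pre-compute one modulated IF segment per symbol, then emit segments by table lookup instead of multiplying.

        import numpy as np

        SPS = 8                                  # samples per symbol
        fc = 0.25                                # carrier freq / sample rate; chosen so
        n = np.arange(SPS)                       # each symbol spans whole carrier cycles,
        cos_c = np.cos(2 * np.pi * fc * n)       # keeping phase continuous across segments
        sin_c = np.sin(2 * np.pi * fc * n)

        # 4-QAM: one pre-computed IF waveform per symbol (the ROM LUT).
        lut = {s: i * cos_c - q * sin_c
               for s, (i, q) in enumerate([(1, 1), (1, -1), (-1, 1), (-1, -1)])}

        def modulate(bits):
            """Map bit pairs to symbol indices and concatenate their LUT segments."""
            pairs = bits.reshape(-1, 2)
            idx = 2 * pairs[:, 0] + pairs[:, 1]
            return np.concatenate([lut[int(s)] for s in idx])

        rng = np.random.default_rng(8)
        waveform = modulate(rng.integers(0, 2, 64))    # 32 symbols of digital IF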

  8. Dry Machining Process of Milling Machine using Axiomatic Green Methodology

    NASA Astrophysics Data System (ADS)

    Puspita Andriani, Gita; Akbar, Muhammad; Irianto, Dradjad

    2016-02-01

    Most companies know that there are strategies for becoming a green industry, and they realize that green efforts have impacts on product quality and cost. The Axiomatic Green Methodology models the relationship between green, quality, and cost. This methodology starts with determining the green improvement objective and then continues with mapping the functional, economic, and green requirements. From the mapping, variables which affect the requirements are identified. Afterwards, the effect of each variable is determined by performing experiments and regression modelling. In this research, the axiomatic green methodology was implemented for dry machining on a milling machine in order to reduce the amount of coolant. Dry machining will be feasible if it is not worse than the minimum required quality. As a result, dry machining is feasible without producing any defect. The proposed machining parameters are to reduce the coolant flow rate from 6.882 ml/minute to 0 ml/minute, set the depth of cut at 1.2 mm, the spindle rotation speed at 500 rpm, and the feed rate at 128 mm/minute. This solution also results in a cost reduction of 200.48 rupiah per process.

  9. Improved assembly processes for the Quartz Digital Accelerometer cantilever

    SciTech Connect

    Romero, A.M.; Gebert, C.T.

    1990-07-01

    This report covers the development of improved assembly processes for the Quartz Digital Accelerometer cantilever. In this report we discuss improved single-assembly tooling, the development of tooling and processes for precision application of polyimide adhesive, the development of the wafer scale assembly procedure, and the application of eutectic bonding to cantilever assembly. 2 refs., 17 figs.

  10. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using (Field Programmable Gate Arrays) FPGAs.
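
    A two-channel toy version of the structure (Python; the paper's FPGA banks are multirate FIR designs, replaced here by the simplest perfect-reconstruction pair, the Haar filters) shows the decompose-process-recombine pattern:

        import numpy as np

        def analysis(x):
            """Split x into half-rate lowpass and highpass subbands (Haar)."""
            pairs = x[: len(x) // 2 * 2].reshape(-1, 2)
            low = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
            high = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
            return low, high

        def synthesis(low, high):
            """Recombine the subbands into the full-rate signal."""
            x = np.empty(2 * low.size)
            x[0::2] = (low + high) / np.sqrt(2)
            x[1::2] = (low - high) / np.sqrt(2)
            return x

        x = np.sin(np.linspace(0.0, 20.0, 256))
        low, high = analysis(x)        # subbands can be processed independently here
        assert np.allclose(synthesis(low, high), x)   # perfect reconstruction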

  11. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  12. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising drying alternative is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on the moisture ratio, energy efficiency, and energy consumption of the EHD drying process. A three-level, four-factor Box–Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interrelationships between parameters were well captured by the RSM. PMID:24936289
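
    The RSM step, fitting a second-order response surface to designed-experiment data, looks like this in miniature (Python; two coded factors instead of the study's four, synthetic response values):

        import numpy as np

        rng = np.random.default_rng(9)
        levels = np.array([-1.0, 0.0, 1.0])            # coded factor levels
        V, E = (g.ravel() for g in np.meshgrid(levels, levels))   # voltage, field
        energy = 5 + 1.2*V - 0.8*E + 0.5*V*E - 0.6*V**2 + rng.normal(0, 0.1, V.size)

        # Quadratic surface: y = b0 + b1*V + b2*E + b12*V*E + b11*V^2 + b22*E^2
        X = np.column_stack([np.ones_like(V), V, E, V*E, V**2, E**2])
        beta, *_ = np.linalg.lstsq(X, energy, rcond=None)

        def predict(v, e):
            return np.array([1.0, v, e, v*e, v*v, e*e]) @ beta

        print(predict(0.5, -1.0))                      # response at an untried setting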

  13. Methodology and Process for Condition Assessment at Existing Hydropower Plants

    SciTech Connect

    Zhang, Qin Fen; Smith, Brennan T; Cones, Marvin; March, Patrick; Dham, Rajesh; Spray, Michael

    2012-01-01

    The Hydropower Advancement Project (HAP) was initiated by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy to develop and implement a systematic process with a standard methodology for identifying opportunities for performance improvement at existing hydropower facilities and for predicting and trending the overall condition of, and improvement opportunities within, the U.S. hydropower fleet. The concept of performance for the HAP focuses on water use efficiency: how well a plant or individual unit converts potential energy to electrical energy over a long-term averaging period of a year or more. Performance improvement involves not only optimization of plant dispatch and scheduling but also enhancement of efficiency and availability through advanced technology and asset upgrades, and thus requires inspection and condition assessment of equipment, control systems, and other generating assets. This paper discusses the standard methodology and process for condition assessment of approximately 50 nationwide facilities, including sampling techniques to ensure valid expansion of the 50 assessment results to the entire hydropower fleet. The application and refinement process and the results from three demonstration assessments are also presented in this paper.

  14. Image processing in digital chest radiography: effect on diagnostic efficacy.

    PubMed

    Manninen, H; Partanen, K; Lehtovirta, J; Matsi, P; Soimakallio, S

    1992-01-01

    The usefulness of digital image processing of chest radiographs was evaluated in a clinical study. In 54 patients, chest radiographs in the posteroanterior projection were obtained by both 14 inch digital image intensifier equipment and the conventional screen-film technique. The digital radiographs (512 x 512 image format) viewed on a 625 line monitor were processed in three different ways: (1) standard display; (2) digital edge enhancement for the standard display; and (3) inverse intensity display. The radiographs were interpreted independently by three radiologists. The diagnoses were confirmed by CT, follow-up radiographs and clinical records. Chest abnormalities of the films analyzed included 21 primary lung tumors, 44 pulmonary nodules, 16 cases with mediastinal disease and 17 cases with pneumonia/atelectasis. Interstitial lung disease, pleural plaques, and pulmonary emphysema were found in 30, 18 and 19 cases, respectively. The sensitivity of conventional radiography, when averaged over all findings, was better than that of the digital techniques (P less than 0.001). The differences in diagnostic accuracy measured by sensitivity and specificity between the three digital display modes were small. Standard image display showed better sensitivity for pulmonary nodules (0.74 vs 0.66; P less than 0.05) but poorer specificity for pulmonary emphysema (0.85 vs 0.93; P less than 0.05) compared with inverse intensity display. We conclude that when using a 512 x 512 image format, the routine use of digital edge enhancement and tone reversal in digital chest radiographs is not warranted. PMID:1563421

  15. Efficient multiprocessor architecture for digital signal processing

    SciTech Connect

    Auguin, M.; Boeri, F.

    1982-01-01

    There is continuing pressure for better processing performance in numerical signal processing. Effective utilization of LSI semiconductor technology allows the consideration of multiprocessor architectures, but the problem of interconnecting the components of the architecture arises. The authors describe a control algorithm for the Benes interconnection network in an asynchronous multiprocessor system. A simulation study of the time-shared bus, the omega network, the Benes network and the crossbar network compares their performance. 8 references.

  16. Digital pulse processing: new possibilities in nuclear spectroscopy

    PubMed

    Warburton; Momayezi; Hubbard-Nelson; Skulski

    2000-10-01

    Digital pulse processing is a signal processing technique in which detector (preamplifier output) signals are directly digitized and processed to extract quantities of interest. This approach has several significant advantages compared to traditional analog signal shaping. First, analyses can be developed which take pulse-by-pulse differences into account, as in making ballistic deficit compensations. Second, transient induced charge signals, which deposit no net charge on an electrode, can be analyzed to give, for example, information on the position of interaction within the detector. Third, deadtimes from transient overload signals are greatly reduced, from tens of microseconds to hundreds of nanoseconds. Fourth, signals are easily captured, so that more complex analyses can be postponed until the source event has been deemed "interesting". Fifth, signal capture and processing may easily be based on coincidence criteria between different detectors or different parts of the same detector. XIA's recently introduced CAMAC module, the DGF-4C, provides many of these features for four input channels, including two levels of digital processing and a FIFO for signal capture for each signal channel. The first level of digital processing is "immediate", taking place in a gate array at the 40 MHz digitization rate, and implements pulse detection, pileup inspection, trapezoidal energy filtering, and control of an external 25.6 µs long FIFO. The second level of digital processing is provided by a digital signal processor (DSP), where more complex algorithms can be implemented. To illustrate digital pulse processing's possibilities, we describe the application of the DGF-4C to a series of experiments. The first, for which the DGF was originally developed, involves locating gamma-ray interaction sites within large segmented Ge detectors. The goal of this work is to attain spatial resolutions of order 2 mm sigma within 70 mm x 90 mm detectors. We show how pulse shape analysis allows
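
    The trapezoidal energy filter mentioned is essentially a difference-of-delays followed by integration; a software rendition for an idealized step-like preamplifier pulse (Python; the DGF-4C does this in a gate array at the 40 MHz sample rate, and real pulses also need a pole-zero correction for their exponential decay):

        import numpy as np

        def trapezoid(v, k=8, m=4):
            """Shape a step into a trapezoid with rise k and flat top m samples."""
            d = v.astype(float).copy()
            # d(n) = v(n) - v(n-k) - v(n-k-m) + v(n-2k-m)
            for delay, sign in ((k, -1), (k + m, -1), (2 * k + m, +1)):
                d[delay:] += sign * v[:-delay]
            return np.cumsum(d) / k                # integrate the differences

        pulse = np.concatenate([np.zeros(20), np.ones(60)])   # idealized step
        shaped = trapezoid(pulse)
        energy = shaped.max()                      # flat-top value estimates height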

  17. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in the seasonal variations of the intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species in a Mediterranean site. The major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during the years 2012-2014. To retrieve phenological information from the digital images, a routine of commands to process the image files using MATLAB (R2013b, The MathWorks, Natick, Mass.) was specifically created. The images of the dataset were re-classified and renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the acquired panoramas, defined by the presence of the most representative species of
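
    A common reduction from ROI color intensities to a phenological signal is the green chromatic coordinate; one value per date-sorted image yields the seasonal trajectory whose inflection points mark phenological events. A sketch (Python standing in for the study's MATLAB routine; images and ROI invented):

        import numpy as np

        def gcc(image, roi):
            """Mean G/(R+G+B) inside a rectangular region of interest."""
            y0, y1, x0, x1 = roi
            patch = image[y0:y1, x0:x1, :].astype(float)
            r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
            return np.mean(g / (r + g + b + 1e-9))

        rng = np.random.default_rng(10)
        images = rng.integers(0, 255, (30, 100, 100, 3))   # 30 fake daily RGB frames
        series = [gcc(im, (40, 60, 40, 60)) for im in images]   # GCC time series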

  18. Digital Art Making as a Representational Process

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld

    2013-01-01

    In this article I bring artistic production into the learning sciences conversation by using the production of representations as a bridging concept between art making and the new literacies. Through case studies with 4 youth media arts organizations across the United States I ask how organizations structure the process of producing…

  19. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648

  20. Reservoir continuous process improvement six sigma methodology implementation

    SciTech Connect

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  1. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    The research and development of real-time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized to detect the heart tone event, and additional processing derives the heart rate. The linear predictor is adaptively 'trained' in a least-mean-square error sense on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
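
    The front end rests on an LMS-trained linear predictor: once trained on reference heart tones, a low prediction error on incoming data flags a heart-tone event. A bare-bones LMS training loop (Python; filter order and step size are illustrative):

        import numpy as np

        def lms_train(x, order=16, mu=0.01):
            """Adapt linear-predictor weights to a reference signal x via LMS."""
            w = np.zeros(order)
            for n in range(order, x.size):
                past = x[n - order:n][::-1]     # most recent samples first
                err = x[n] - w @ past           # prediction error
                w += mu * err * past            # least-mean-square weight update
            return w

        rng = np.random.default_rng(11)
        t = np.arange(4000) / 2000.0
        reference = np.sin(2 * np.pi * 40 * t) + 0.05 * rng.normal(size=t.size)
        w = lms_train(reference)                # predictor 'trained' on heart tones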

  2. Statistical process control for hospitals: methodology, user education, and challenges.

    PubMed

    Matthes, Nikolas; Ogunbo, Samuel; Pennington, Gaither; Wood, Nell; Hart, Marilyn K; Hart, Robert F

    2007-01-01

    The health care industry is slowly embracing the use of statistical process control (SPC) to monitor and study causes of variation in health care processes. While the statistics and principles underlying the use of SPC are relatively straightforward, there is a need to be cognizant of the perils that await the user who is not well versed in the key concepts of SPC. This article introduces the theory behind SPC methodology, describes successful tactics for educating users, and discusses the challenges associated with encouraging adoption of SPC among health care professionals. To illustrate these benefits and challenges, this article references the National Hospital Quality Measures, presents critical elements of SPC curricula, and draws examples from hospitals that have successfully embedded SPC into their overall approach to performance assessment and improvement. PMID:17627215
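
    As a concrete instance of the statistics involved, a p-chart for a monthly proportion measure (Python; the counts are invented, not National Hospital Quality Measures data): control limits vary with subgroup size, one of the subtleties that trips up new users.

        import numpy as np

        met = np.array([88, 91, 85, 90, 94, 78, 92, 89])        # cases meeting measure
        eligible = np.array([100, 105, 95, 102, 110, 98, 104, 100])

        p = met / eligible
        pbar = met.sum() / eligible.sum()                # center line
        sigma = np.sqrt(pbar * (1 - pbar) / eligible)    # varies with subgroup size
        ucl = pbar + 3 * sigma
        lcl = np.clip(pbar - 3 * sigma, 0, None)
        signals = (p > ucl) | (p < lcl)                  # special-cause variation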

  3. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  4. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, I/Q conversion, sampling and cross correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.
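
    The cross-correlation at the heart of SAR processing is matched filtering; for the single bright point target of the model, range compression looks like this (Python; a linear-FM chirp and all parameters are assumed for illustration):

        import numpy as np

        fs, T, K = 100e6, 10e-6, 4e12          # sample rate, chirp length, chirp rate
        t = np.arange(0.0, T, 1 / fs)
        chirp = np.exp(1j * np.pi * K * t**2)  # transmitted linear-FM waveform

        rng = np.random.default_rng(12)
        delay = 300                            # target range delay, samples
        rx = np.zeros(2048, dtype=complex)
        rx[delay:delay + chirp.size] = 0.1 * chirp          # weak point-target echo
        rx += 0.01 * (rng.standard_normal(2048) + 1j * rng.standard_normal(2048))

        # Matched filter: correlate with the conjugated, time-reversed replica.
        compressed = np.convolve(rx, np.conj(chirp[::-1]), mode="full")
        peak = np.argmax(np.abs(compressed)) - (chirp.size - 1)
        print(peak)                            # ~300: the target delay is recovered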

  5. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored. PMID:14654480

  6. Digital Light Processing update: status and future applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high process speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  7. Results of precision processing (scene correction) of ERTS-1 images using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1973-01-01

    ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to those of the NDPF electro-optical processor.

  8. Introduction and comparison of new EBSD post-processing methodologies.

    PubMed

    Wright, Stuart I; Nowell, Matthew M; Lindeman, Scott P; Camus, Patrick P; De Graef, Marc; Jackson, Michael A

    2015-12-01

    Electron Backscatter Diffraction (EBSD) provides a useful means for characterizing microstructure. However, it can be difficult to obtain indexable diffraction patterns from some samples. This can lead to noisy maps reconstructed from the scan data. Various post-processing methodologies have been developed to improve the scan data, generally based on correlating non-indexed or mis-indexed points with the orientations obtained at neighboring points in the scan grid. Two new approaches are introduced: (1) a re-scanning approach using local pattern averaging and (2) using the multiple solutions obtained by the triplet indexing method. These methodologies are applied to samples with noise introduced into the patterns artificially and by the operational settings of the EBSD camera. They are also applied to a heavily deformed and a fine-grained sample. In all cases, both techniques provide an improvement in the resulting scan data, with local pattern averaging providing the larger improvement of the two. However, local pattern averaging is most helpful when the noise in the patterns is due to the camera operating conditions as opposed to inherent challenges in the sample itself. A byproduct of this study was insight into the validity of various indexing success rate metrics. A metric given by the fraction of points with CI values greater than some tolerance (0.1 in this case) was confirmed to provide an accurate assessment of the indexing success rate. PMID:26342553

  9. Sliding mean edge estimation. [in digital image processing

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1978-01-01

    A method for determining the locations of the major edges of objects in digital images is presented. The method is based on an algorithm utilizing maximum likelihood concepts. An image line-scan interval is processed to determine if an edge exists within the interval and its location. The proposed algorithm has demonstrated good results even in noisy images.

  10. Digital frequency counter permits readout without disturbing counting process

    NASA Technical Reports Server (NTRS)

    Winkelstein, R.

    1966-01-01

    Digital frequency counter system enables readout accurately at one-second intervals without interrupting or disturbing the counting process. The system incorporates a master counter and a slave counter with novel logic interconnections. The counter can be readily adapted to provide frequency readouts at 0.1 second intervals.

  11. Digital Image Processing application to spray and flammability studies

    NASA Technical Reports Server (NTRS)

    Hernan, M. A.; Parikh, P.; Sarohia, V.

    1985-01-01

    Digital Image Processing has been integrated into a new technique for measurement of fuel spray characteristics. The advantages of this technique include a wide dynamic range of droplet sizes and an accounting for nonspherical droplet shapes, neither of which is possible with other spray assessment techniques. Finally, the technique has been applied to the study of turbojet engine fuel nozzle atomization performance with Jet A and antimisting fuel.

  12. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  13. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  14. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  15. Optical hybrid analog-digital signal processing based on spike processing in neurons

    NASA Astrophysics Data System (ADS)

    Fok, Mable P.; Tian, Yue; Rosenbluth, David; Deng, Yanhua; Prucnal, Paul R.

    2011-09-01

    Spike processing is one kind of hybrid analog-digital signal processing, which has the efficiency of analog processing and the robustness to noise of digital processing. When instantiated with optics, a hybrid analog-digital processing primitive has the potential to be scalable, computationally powerful, and have high operation bandwidth. These devices open up a range of processing applications for which electronic processing is too slow. Our approach is based on a hybrid analog/digital computational primitive that elegantly implements the functionality of an integrate-and-fire neuron using a Ge-doped non-linear optical fiber and off-the-shelf semiconductor devices. In this paper, we introduce our photonic neuron architecture and demonstrate the feasibility of implementing simple photonic neuromorphic circuits, including the auditory localization algorithm of the barn owl, which is useful for LIDAR localization, and the crayfish tail-flip escape response.
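
    The computational primitive being emulated optically is simple to state in software; a leaky integrate-and-fire neuron in a few lines (Python; all constants arbitrary):

        import numpy as np

        def lif(inputs, dt=1e-3, tau=20e-3, threshold=1.0):
            """Leaky integrate-and-fire: integrate input, fire and reset at threshold."""
            v, spikes = 0.0, []
            for i, x in enumerate(inputs):
                v += dt * (-v / tau + x)       # leaky integration of the analog input
                if v >= threshold:             # threshold crossing -> output spike
                    spikes.append(i * dt)
                    v = 0.0                    # reset after firing
            return spikes

        rng = np.random.default_rng(14)
        drive = 150.0 * rng.random(1000)       # noisy analog drive
        print(lif(drive)[:5])                  # first few spike times, s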

  16. Digital Signal Processing System for Active Noise Reduction

    NASA Astrophysics Data System (ADS)

    Edmonson, William W.; Tucker, Jerry

    2002-12-01

    different adaptive noise cancellation algorithms and provide an operational prototype to understand the behavior of the system under test. DSP software was required to interface the processor with the data converters using interrupt routines. The goal is to build a complete ANC system that can be placed on a flexible circuit with added memory circuitry that also contains the power supply, sensors and actuators. This work on the digital signal processing system for active noise reduction was completed in collaboration with another ASEE Fellow, Dr. Jerry Tucker from Virginia Commonwealth University, Richmond, VA.

  17. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
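
    A minimal instance of the Markov-modeling approach (Python; the failure and repair rates are invented): a two-state up/down model whose steady-state solution yields availability, the simplest of the RMA measures mentioned.

        import numpy as np

        lam, mu = 1e-4, 1e-1        # failure and repair rates, 1/h (assumed)

        # Generator matrix over states [up, down]; each row sums to zero.
        Q = np.array([[-lam, lam],
                      [mu, -mu]])

        # Steady state: pi @ Q = 0 with pi summing to 1.
        A = np.vstack([Q.T, np.ones(2)])
        b = np.array([0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        print(f"steady-state availability: {pi[0]:.6f}")   # equals mu / (lam + mu)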

  18. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.

  19. Gallium arsenide enhances digital signal processing in electronic warfare

    NASA Astrophysics Data System (ADS)

    Hoffman, B.; Apte, D.

    1985-07-01

    The higher electron mobility and velocity of GaAs digital signal processing IC devices for electronic warfare (EW) allow operating speeds several times faster than those of ICs based on silicon. Particular benefits are foreseen for the response time and broadband capability of ECM systems. Many data manipulation methods can be implemented in emitter-coupled logic (ECL) GaAs devices, and digital GaAs RF memories show great promise for improved ECM system performance, encompassing microwave frequency and chirp signal synthesis, repeater jamming, and multiple false target generation. EW digital frequency synthesizers are especially in need of GaAs IC technology, since bandwidth and resolution have been limited by ECL technology to about 250 MHz.

  20. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  1. Methodology to reduce chronic defect mechanisms in semiconductor processing

    NASA Astrophysics Data System (ADS)

    Ecton, Timothy W.; Frazee, Kenneth G.

    1990-06-01

    This paper documents a structured approach to defect elimination in semiconductor processing. Classical problem solving techniques were used to logically guide the defect reduction effort. Defect information was gathered using an automated wafer inspection system, and defects were classified by production workers on a remote review station. This approach distinguished actual causes from several probable causes, and a process change has reduced the defect mechanism. The methodology was applied to reduce Teflon perfluoroalkoxy (PFA) particles in a one micron semiconductor process. Electrical test structures identified a critical layer where yield loss was occurring. An audit procedure was established at this layer and defects were classified into broad categories. Further breakout of defect types by appearance was necessary to construct a meaningful Pareto chart and identify the most frequently occurring fatal defect. The critical process zone was segmented using automated wafer inspection to isolate the step causing the defect. An Ishikawa or cause-effect diagram was constructed with input from process engineers to outline all possible causes of the defect. The most probable branch was selected for investigation and pursued until it became clear that this branch was not related to the cause. At this point, new ideas were sought from a sister production facility. During the visit a breakthrough indicated a different path and ultimately led to identifying the source of the defect. A process change was implemented, and an evaluation of the change showed a substantial decrease in defect level. Further efforts to eliminate the defect source are in progress.

  2. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. Emerging micro/nano-fluidic devices provide high throughput and high repeatability, with very low response times and reduced device cost compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick film deposition of silicon dioxide using PECVD and the subsequent processing for the EWOD technique have been optimized in this work.

  3. Digital processing of histopathological aspects in renal transplantation

    NASA Astrophysics Data System (ADS)

    de Albuquerque Araujo, Arnaldo; de Andrade, Marcos C.; Bambirra, Eduardo A.; dos Santos, A. M. M.

    1993-07-01

    We describe here our initial experience with the digital image processing of histopathological aspects from multiple renal biopsies of a transplanted kidney in a patient treated with Cyclosporine (CsA), a powerful immunosuppressant whose use has improved the chances of a successful vascularized organ transplantation (Tx). Unfortunately, CsA promotes morphological alterations to the glomerular structure of the kidneys. To characterize this process, the distributions of glomeruli, tuft, and lumen areas are measured. The results are presented in the form of graphs.

  4. Real time speech recognition on a distributed digital processing array

    NASA Astrophysics Data System (ADS)

    Simpson, P.; Roberts, J. B. G.

    1983-08-01

    A compact digital signal processor based on the architecture of the ICL Distributed Array Processor (DAP) is under development for MOD applications in radar, ESM, image processing, etc. This Memorandum examines its applicability to speech recognition. In such a distributed processor, optimum mapping of the problem onto the array of processors is vital for efficiency. Three mappings of a dynamic time warping algorithm for isolated word recognition are examined, leading to a feasible real time capability for continuous speech processing. The compatibility found between dynamic programming methods and this class of machine enlarges the scope of signal processing algorithms foreseen as amenable to parallel processing.
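    For reference, a plain implementation of the dynamic time warping distance that underlies the isolated word recognizer (the DAP-specific mappings examined in the memorandum are beyond this sketch):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences.

    a, b: arrays of shape (frames, features), e.g. filterbank energies
    of an utterance and of a stored word template.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
            # Allowed warping moves: diagonal match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# Isolated word recognition: choose the template with the smallest distance.
```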

  5. Flow manipulation and control methodologies for vacuum infusion processes

    NASA Astrophysics Data System (ADS)

    Alms, Justin B.

    experienced. First, the effect on permeability is characterized, so the process can be simulated and the flow front patterns can be predicted. It was found that using the VIPR process in combination with tool side injection gates is a very effective method to control resin flow. Based on this understanding several control algorithms were developed to use the process in an automated manufacturing environment which were tested and validated in a virtual environment. To implement and demonstrate the approach, an experimental workstation was built and various infusion examples were performed in the automated environment to validate the capability of the VIPR process with the control methodologies. The VIPR process with control consistently performed better than the process without control. This contribution should prove useful in making VIPs more reliable in the production of large scale composite structures.

  6. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemicals, as well as US infrastructure, including bridges, pipelines and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial and error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models employing detailed scientific methodologies. However, these models are cumbersome and not easy to use, so they have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models, based on analytical and empirical equations, to drive their product development; the scope of these simple models is limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models for relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed to industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their welding process parameter and consumable composition selection. It is hypothesized that by expanding these tools throughout welding industry

  7. Digital pulse processing for NaI(Tl) detectors

    NASA Astrophysics Data System (ADS)

    Di Fulvio, A.; Shin, T. H.; Hamel, M. C.; Pozzi, S. A.

    2016-01-01

    We apply two different post-processing techniques to digital pulses induced by photons in a NaI(Tl) detector and compare the obtained energy resolution to the standard analog approach. Our digital acquisition is performed in a single stage with a fast digitizer. Both post-processing techniques we propose rely on signal integration. In the first, the pulse integral is calculated by directly numerically integrating the pulse's digital samples, while in the second the pulse integral is estimated by a model-based fitting of the pulse. Our study used a 7.62 cm×7.62 cm cylindrical NaI(Tl) detector that gave 7.60% energy resolution (at 662 keV) using the standard analog acquisition approach based on a pulse shaping amplifier. Direct numerical integration yielded a 6.52% energy resolution. The fitting approach yielded a 6.55% energy resolution and, although computationally heavier than numerical integration, is preferable when only the early samples of the pulse are available. We also evaluated the timing performance of a fast-slow detection system encompassing an EJ-309 and a NaI(Tl) scintillator, using two techniques to determine the pulse start time: constant fraction discrimination (CFD) and adaptive noise threshold timing (ANT). With the analog acquisition approach, we found system time resolutions of 5.8 ns and 7.3 ns using CFD and ANT, respectively. With the digital acquisition approach, a time resolution of 1.2 ns was achieved using the ANT method and 3.3 ns using CFD at 50% of the maximum to select the pulse start time. The proposed direct digital readout and post-processing techniques can improve the application of NaI(Tl) detectors, traditionally considered 'slow', for fast counting and correlation measurements, while maintaining good energy resolution.
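    A minimal sketch of the direct numerical integration approach: estimate the baseline from pre-trigger samples, subtract it, and sum the digitized pulse (window lengths and sampling interval are illustrative, not the paper's settings).

```python
import numpy as np

def pulse_integral(samples, baseline_n=20, dt=4e-9):
    """Estimate a pulse integral from an array of digitizer samples.

    baseline_n: number of pre-trigger samples used for the baseline.
    dt: sampling interval, e.g. 4 ns for a 250 MS/s digitizer.
    """
    baseline = samples[:baseline_n].mean()
    return (samples[baseline_n:] - baseline).sum() * dt

# An energy calibration (e.g. against the 662 keV line of Cs-137)
# then maps pulse integrals to deposited energy.
```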

  8. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. The Kern County Water Agency (KCWA) is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies that could ultimately be extrapolated to the entire 10,400 sq km Kern County watershed. Results indicate that digital image processing of Landsat 2 data will provide the usable land cover information required by KCWA as input to the SCS runoff model.
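    The SCS model referred to rests on the curve-number runoff relation, which is compact enough to state in code (the curve number below is illustrative; in the study it would come from the Landsat-derived land cover and soil data):

```python
def scs_runoff_depth(p_inches, curve_number):
    """SCS curve-number direct runoff: Q = (P - 0.2*S)**2 / (P + 0.8*S),
    where the potential retention is S = 1000/CN - 10 (inches)."""
    s = 1000.0 / curve_number - 10.0
    ia = 0.2 * s                       # initial abstraction
    if p_inches <= ia:
        return 0.0                     # all rainfall absorbed, no runoff
    return (p_inches - ia) ** 2 / (p_inches + 0.8 * s)

# Example: a 3-inch storm on land with CN = 70 yields ~0.71 in of runoff.
print(scs_runoff_depth(3.0, 70))
```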

  9. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  10. Viking image processing. [digital stereo imagery and computer mosaicking

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  11. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and center reference frequency, including tunable low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. We identify that the all-zero and all-pole networks are equivalent to the well-known optical principles of interference and resonance, respectively, which makes it very straightforward to implement tunable optical filters, a unique feature. PMID:19458728
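    In digital filter terms, the two basic structures are a single zero and a single pole near the unit circle; the sketch below evaluates their magnitude responses, showing the notch (interference) and peak (resonance) behaviour the abstract alludes to (coefficient value illustrative):

```python
import numpy as np

def freq_response(b, a, n_points=512):
    """Magnitude response of H(z) = B(z)/A(z) evaluated on the unit circle."""
    w = np.linspace(0.0, np.pi, n_points)
    z = np.exp(1j * w)
    num = sum(bk * z ** (-k) for k, bk in enumerate(b))
    den = sum(ak * z ** (-k) for k, ak in enumerate(a))
    return w, np.abs(num / den)

# First-order all-zero (FOAZOF-like): H(z) = 1 - r*z**-1, an interference notch.
w, h_zero = freq_response([1.0, -0.9], [1.0])
# First-order all-pole (FOAPOF-like): H(z) = 1/(1 - r*z**-1), a resonance peak.
w, h_pole = freq_response([1.0], [1.0, -0.9])
```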

  12. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
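    The single-scale retinex at the core of MSRCR is a log ratio between each pixel and its Gaussian-weighted surround; a minimal per-channel sketch follows (surround scale illustrative; MSRCR additionally averages several scales and applies a color restoration step):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(channel, sigma=80.0):
    """Single-scale retinex for one image channel (2-D float array).

    Returns log(pixel) - log(surround), where the surround is a
    large-scale Gaussian blur of the same channel.
    """
    channel = channel.astype(float) + 1.0     # avoid log(0)
    surround = gaussian_filter(channel, sigma)
    return np.log(channel) - np.log(surround)
```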

  13. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms used in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis, rather than linguistic analysis, of the speech signal. Several digital signal processing algorithms are available for voice recognition, among them Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Of these, LPC is the most widely used: it has a short execution time and does not require large memory storage, but it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency domain algorithms that make few assumptions, but they are not widely implemented or investigated. With recent advances in digital technology, namely signal processors, these two frequency domain algorithms may be investigated with a view to implementing them in voice recognition. This research is concerned with real time, microprocessor based recognition algorithms.
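    A compact sketch of the LPC analysis step, using the autocorrelation method with the Levinson-Durbin recursion (the speech frame is assumed pre-emphasized and windowed, e.g. with a Hamming window):

```python
import numpy as np

def lpc_coefficients(frame, order=10):
    """Prediction coefficients a[0..order] (a[0] = 1) for one speech frame."""
    # Autocorrelation sequence r[0..order] of the windowed frame.
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][:order + 1]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        # Reflection coefficient from the current prediction residual.
        acc = r[i] + a[1:i] @ r[1:i][::-1]
        k = -acc / err
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= 1.0 - k * k      # residual energy shrinks at each order
    return a, err
```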

  14. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently a repository should be evaluated on whether or not they are effective in their data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  15. Digital signal processing based on inverse scattering transform.

    PubMed

    Turitsyna, Elena G; Turitsyn, Sergei K

    2013-10-15

    Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal. PMID:24321955

  16. Adaptive control technique for accelerators using digital signal processing

    SciTech Connect

    Eaton, L.; Jachim, S.; Natter, E.

    1987-01-01

    The use of present Digital Signal Processing (DSP) techniques can drastically reduce the residual rf amplitude and phase error in an accelerating rf cavity. Accelerator beam loading contributes greatly to this residual error, and the low-level rf field control loops cannot completely absorb the fast transient of the error. A feedforward technique using DSP is required to maintain the very stringent rf field amplitude and phase specifications. 7 refs.

  17. In-Basket Exercises as a Methodology for Studying Information Processing.

    ERIC Educational Resources Information Center

    Dukerich, Janet M.; And Others

    1990-01-01

    Describes in-basket exercises as a research methodology for examining how managers allocate attention and respond to information in their environment. Issues in organizational information processing and decision making to which this methodology might be applicable are discussed, and other methodologies for studying information processing are…

  18. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long term storage in engineered surface facilities and, ultimately, geologic isolation. As part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for the DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree, or APET, which provides a probabilistic representation of the potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  19. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows accurate and objective measurement of both growth parameters and fine morphological glandular structures. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus widely used by scientists studying rodent mammary gland morphology. PMID:26910307
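    One of the named fine-structure measures, branch end-point detection, can be sketched with generic image processing tools: skeletonize the binarized ductal tree, then flag skeleton pixels with exactly one neighbour. This is an assumption-laden illustration, not the authors' pipeline.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def count_branch_endpoints(binary_mask):
    """Count branch end-points (tips) of a binarized ductal tree."""
    skel = skeletonize(binary_mask > 0)
    kernel = np.ones((3, 3))
    kernel[1, 1] = 0                      # count 8-connected neighbours only
    neighbours = convolve(skel.astype(int), kernel, mode="constant")
    endpoints = skel & (neighbours == 1)  # a tip touches exactly one pixel
    return int(endpoints.sum())

# End-point density would then be endpoints divided by epithelial area.
```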

  20. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    PubMed

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images into voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. To validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and acquired DICOM images. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation; the percentage differences were 3.07%, 0.77% and 6.15% for the axial, coronal and sagittal projections, respectively. PMID:21945017

  1. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal cases. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult; unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms", so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science; for example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared. PMID:14535661

  2. Digital signal processing (DSP) applications for multiband loudness correction digital hearing aids and cochlear implants.

    PubMed

    Dillier, N; Frölich, T; Kompis, M; Bögli, H; Lai, W K

    1993-01-01

    Single-chip digital signal processors (DSPs) allow the flexible implementation of a large variety of speech analysis, synthesis, and processing algorithms for the hearing impaired. A series of experiments was carried out to optimize parameters of the adaptive beamformer noise reduction algorithm and to evaluate its performance in realistic environments with normal-hearing and hearing-impaired subjects. An experimental DSP system has been used to implement a multiband loudness correction (MLC) algorithm for a digital hearing aid. Speech tests in quiet and noise with 13 users of conventional hearing aids demonstrated significant improvements in discrimination scores with the MLC algorithm. Various speech coding strategies for cochlear implants were implemented in real time on a DSP laboratory speech processor. Improved speech discrimination performance was achieved with high-rate stimulation. Hybrid strategies incorporating speech feature detectors and complex decision algorithms are currently being investigated. PMID:8263833

  3. Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization

    NASA Astrophysics Data System (ADS)

    Repola, L.; Memmolo, R.; Signoretti, D.

    2015-04-01

    In research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at surveying submerged archaeological sites, integrable with standard systems for geomorphological surveying of the coast, has been developed. The project involves the construction of hardware, consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras, and software for the production of point clouds and the initial processing of data. The software has features for stereoscopic vision system calibration, reduction of noise and of distortion in underwater captured images, searching for corresponding points in stereoscopic images using dense and sparse stereo-matching algorithms, and point cloud generation and filtering. Mastery of the methods for efficient data acquisition was achieved only after various calibration and survey tests carried out during the excavations envisaged in the project. The current development of the system has allowed generation of portions of digital models of real submerged scenes. A semi-automatic procedure for global registration of the partial models is under development as a useful aid for the study and musealization of sites.

  4. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. Conventional measurement techniques, however, require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  5. Digital metamaterials

    NASA Astrophysics Data System (ADS)

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, ‘0’ and ‘1’, in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call ‘metamaterial bits’, with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental ‘metamaterial bytes’ with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology.

  6. Digital metamaterials.

    PubMed

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, '0' and '1', in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call 'metamaterial bits', with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental 'metamaterial bytes' with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology. PMID:25218061
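    As a deliberately crude illustration of the metamaterial-bit idea, the sketch below inverts a simple volume-average mixing rule to find the filling fraction of one bit; the actual effective parameters in the paper depend on the spatial arrangement of the bits, so this linear rule is only a first-order caricature.

```python
def bit1_filling_fraction(eps_target, eps_bit0, eps_bit1):
    """Filling fraction f of 'bit 1' under the linear mixing assumption
    eps_eff = f*eps_bit1 + (1 - f)*eps_bit0 (a strong simplification)."""
    f = (eps_target - eps_bit0) / (eps_bit1 - eps_bit0)
    if not 0.0 <= f <= 1.0:
        raise ValueError("target permittivity not reachable with these bits")
    return f

# Example: mixing a dielectric bit (eps = 2.1) with a plasmonic bit
# (eps = -4.0) to approach an epsilon-near-zero response.
print(bit1_filling_fraction(0.01, 2.1, -4.0))
```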

  7. An overall digital processing architecture for an autonomous spacecraft

    NASA Astrophysics Data System (ADS)

    Fernandez, M.

    Attention is given to an autonomous spacecraft's digital processing system architecture, the task of which is to provide a system whose assets degrade systematically as the distributed architecture reaches the limits of its self-healing capabilities. The architecture offers high availability and reliability for the postulated data processing problem, and is affordable in such respects as onboard weight, power requirements, and cost. The advantages of the proposed architecture over conventional approaches are considered in the areas of fault coverage, functional availability, and 'graceful' degradation.

  8. A methodology for exploiting parallelism in the finite element process

    NASA Technical Reports Server (NTRS)

    Adams, L. M.; Voigt, R. G.

    1983-01-01

    A methodology is described for developing a parallel system using a top down approach taking into account the requirements of the user. Substructuring, a popular technique in structural analysis, is used to illustrate this approach.

  9. Digital system for monitoring and controlling remote processes

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.

    The need to operate increasingly complex and potentially hazardous facilities at higher degrees of efficiency can be met through the development of automated process control systems. The availability of microcomputers capable of interfacing to data acquisition and control equipment makes it possible to develop such systems at low investment cost. An automated control system is described which maintains a constant or time-varying pressure in a pressure vessel. Process control data acquisition and analysis are carried out using a commercially available microcomputer and a data scanner interface device. In this system, a computer interface was developed to allow precision positioning of custom designed proportional valves. Continuous real time process control is achieved through a direct digital control algorithm. The advantages to be gained by adapting this system to other process control applications are discussed. The modular design and the ability of this system to operate many types of hardware control mechanisms make it adaptable to a wide variety of industrial applications.
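    The direct digital control algorithm is not spelled out in the abstract; a discrete PID loop is the standard choice for this kind of pressure regulation and is sketched below (gains, limits and scan interval are illustrative):

```python
def make_pid(kp, ki, kd, dt, u_min=0.0, u_max=100.0):
    """Discrete positional PID controller returning a valve command (% open)."""
    state = {"integral": 0.0, "prev_err": 0.0}

    def step(setpoint, measurement):
        err = setpoint - measurement
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        u = kp * err + ki * state["integral"] + kd * deriv
        return min(max(u, u_min), u_max)   # clamp to actuator range

    return step

# Each scan cycle: read vessel pressure from the data scanner, compute
# the command, and drive the proportional valve positioner.
controller = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
```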

  10. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  11. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not previously been demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  12. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out to evaluate the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) digital TV standard in the VHF high-band. Measurements were performed in urban and suburban areas of a medium-sized Brazilian city. Besides direct measurements of received power and environmental noise, a measurement procedure involving the injection of additive Gaussian noise was employed to determine the signal-to-noise ratio threshold at each measurement site. The analysis includes results of static reception measurements evaluating the received field strength and the signal-to-noise ratio thresholds for correct signal decoding.

  13. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  14. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  15. Recent developments in digital image processing at the Image Processing Laboratory of JPL.

    NASA Technical Reports Server (NTRS)

    O'Handley, D. A.

    1973-01-01

    Review of some of the computer-aided digital image processing techniques recently developed. Special attention is given to mapping and mosaicking techniques and to preliminary developments in range determination from stereo image pairs. The discussed image processing utilization areas include space, biomedical, and robotic applications.

  16. Digital signal processor and programming system for parallel signal processing

    SciTech Connect

    Van den Bout, D.E.

    1987-01-01

    This thesis describes an integrated assault upon the problem of designing high-throughput, low-cost digital signal-processing systems. The dual prongs of this assault consist of: (1) the design of a digital signal processor (DSP) which efficiently executes signal-processing algorithms in either a uniprocessor or multiprocessor configuration, (2) the PaLS programming system which accepts an arbitrary algorithm, partitions it across a group of DSPs, synthesizes an optimal communication link topology for the DSPs, and schedules the partitioned algorithm upon the DSPs. The results of applying a new quasi-dynamic analysis technique to a set of high-level signal-processing algorithms were used to determine the uniprocessor features of the DSP design. For multiprocessing applications, the DSP contains an interprocessor communications port (IPC) which supports simple, flexible, dataflow communications while allowing the total communication bandwidth to be incrementally allocated to achieve the best link utilization. The net result is a DSP with a simple architecture that is easy to program for both uniprocessor and multi-processor modes of operation. The PaLS programming system simplifies the task of parallelizing an algorithm for execution upon a multiprocessor built with the DSP.

  17. Perspectives on Learning: Methodologies for Exploring Learning Processes and Outcomes

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    2014-01-01

    The papers in this Special Issue were initially prepared for an EARLI 2013 Symposium that was designed to examine methodologies in use by researchers from two sister communities, Learning and Instruction and Learning Sciences. The four papers reflect a common ground in advances in conceptions of learning since the early days of the "cognitive…

  18. Digital Image Processing Overview For Helmet Mounted Displays

    NASA Astrophysics Data System (ADS)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.

  19. Applying of digital signal processing to optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to a system with an optical equisignal zone, as a new method of evaluating the position of the optical equisignal zone. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the equisignal zone position is applied to processing on the array detector; it evaluates both the lateral displacement and the turning angles of the receiver relative to the projector. The interrelation of the parameters of the projector and the receiver is considered. An experimental setup was built according to the described principles and then characterized. The accuracy of the equisignal zone position evaluation is shown to depend on the size of the equivalent entrance pupil used in processing.

  20. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our possibilities to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  1. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids, by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities, to act as thermal and hydrodynamic tracers. In biomedical situations, e.g. skin diseases, breast cancer, blood circulation and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, the principal methods and tools are described, and some examples are presented.
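    The quantitative core of true-colour TLC thermography is a hue-to-temperature calibration; a minimal sketch under the assumption of a monotonic hue response over the crystals' active range (all calibration points invented for illustration):

```python
import numpy as np
from skimage.color import rgb2hsv

def temperature_map(rgb_image, hue_cal, temp_cal):
    """Convert a true-colour TLC image to a temperature map by
    interpolating each pixel's hue on a measured calibration curve."""
    hue = rgb2hsv(rgb_image)[..., 0]          # hue channel in [0, 1)
    return np.interp(hue, hue_cal, temp_cal)  # per-pixel lookup

# Calibration would be measured by imaging the coating at known uniform
# temperatures, e.g. hue rising from red to blue across 30-35 degrees C.
hue_cal = np.array([0.02, 0.15, 0.33, 0.50, 0.66])
temp_cal = np.array([30.0, 31.5, 32.5, 34.0, 35.0])
```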

  2. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  3. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  4. Incremental terrain processing for large digital elevation models

    NASA Astrophysics Data System (ADS)

    Ye, Z.

    2012-12-01

    Efficient analyses of large digital elevation models (DEMs) require generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEM to analyze has a large number of grid cells (usually > 1,000,000,000), the generation of these derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned into smaller processing units that can be processed efficiently. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are derived and the order in which they are processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront due to the size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis, i.e., the flow accumulation operation, to each of the processing units at the full resolution DEM. (5) As each processing unit is processed based on the processing order defined
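    Step (2) of the procedure relies on standard terrain preprocessing such as D8 flow direction, sketched below for a small in-memory grid (the partitioned, full-resolution workflow described above is beyond this illustration):

```python
import numpy as np

# 8 neighbour offsets for the D8 scheme.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
           (1, 1), (1, 0), (1, -1), (0, -1)]

def d8_flow_direction(dem):
    """D8 flow direction: each cell drains toward the steepest downslope
    neighbour (index into OFFSETS), or -1 for pits, flats and edge sinks."""
    rows, cols = dem.shape
    fdir = np.full(dem.shape, -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_drop, best_k = 0.0, -1
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = np.hypot(dr, dc)                 # 1 or sqrt(2)
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best_drop, best_k = drop, k
            fdir[r, c] = best_k
    return fdir
```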

  5. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  6. Naturalistic Observation of Health-Relevant Social Processes: The Electronically Activated Recorder (EAR) Methodology in Psychosomatics

    PubMed Central

    Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große

    2012-01-01

    This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  7. Naturalistic observation of health-relevant social processes: the electronically activated recorder methodology in psychosomatics.

    PubMed

    Mehl, Matthias R; Robbins, Megan L; Deters, Fenne Große

    2012-05-01

    This article introduces a novel observational ambulatory monitoring method called the electronically activated recorder (EAR). The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants' momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people's days as they naturally unfold. In sampling only a fraction of the time, it protects participants' privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer's account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, subtle emotional expressions). This article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior; (b) provide ecological observational measures of health-related social processes that are independent of self-report; and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional self-report-based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential aspects (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  8. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods they enable and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency, as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion is a linear process that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software-based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated by a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has better linewidth tolerance than phase-locked loops and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
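
    A minimal sketch of one of the DSP steps described above: compensating chromatic dispersion with a frequency-domain all-pass equalizer. The fiber parameters, sampling rate, and sign convention below are illustrative assumptions, not values from the thesis.

import numpy as np

# Assumed illustrative parameters (not from the abstract).
c = 3e8                      # speed of light, m/s
wavelength = 1550e-9         # carrier wavelength, m
D = 17e-6                    # dispersion, s/m^2 (i.e., 17 ps/nm/km)
L = 100e3                    # fiber length, m
fs = 56e9                    # receiver sampling rate, Hz
beta2 = -D * wavelength**2 / (2 * np.pi * c)   # group-velocity dispersion

def cd_compensate(samples):
    """All-pass frequency-domain equalizer that inverts the quadratic
    spectral phase chromatic dispersion applies to the optical field."""
    n = len(samples)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=1 / fs)
    H = np.exp(-1j * beta2 * L * omega**2 / 2)   # inverse of fiber response
    return np.fft.ifft(np.fft.fft(samples) * H)

rx = np.random.randn(4096) + 1j * np.random.randn(4096)  # stand-in field samples
print(cd_compensate(rx)[:3])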

  9. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  10. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  11. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  12. The Digital Fields Board for the FIELDS instrument suite on the Solar Probe Plus mission: Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Malaspina, David M.; Ergun, Robert E.; Bolton, Mary; Kien, Mark; Summers, David; Stevens, Ken; Yehle, Alan; Karlsson, Magnus; Hoxie, Vaughn C.; Bale, Stuart D.; Goetz, Keith

    2016-06-01

    The first in situ measurements of electric and magnetic fields in the near-Sun environment (< 0.25 AU from the Sun) will be made by the FIELDS instrument suite on the Solar Probe Plus mission. The Digital Fields Board (DFB) is an electronics board within FIELDS that performs analog and digital signal processing, as well as digitization, for signals between DC and 60 kHz from five voltage sensors and four search coil magnetometer channels. These nine input signals are processed on the DFB into 26 analog data streams. A specialized application-specific integrated circuit performs analog to digital conversion on all 26 analog channels simultaneously. The DFB then processes the digital data using a field programmable gate array (FPGA), generating a variety of data products, including digitally filtered continuous waveforms, high-rate burst capture waveforms, power spectra, cross spectra, band-pass filter data, and several ancillary products. While the data products are optimized for encounter-based mission operations, they are also highly configurable, a key design aspect for a mission of exploration. This paper describes the analog and digital signal processing used to ensure that the DFB produces high-quality science data, using minimal resources, in the challenging near-Sun environment.

  13. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, interfering with the process itself and generating inaccurate data due to human error during the stressful workday of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely care-related work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  14. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to collect information and time data, interfering with the process itself and generating inaccurate data due to human error during the stressful workday of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses and allowing them to engage in purely care-related work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395
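
    As a toy illustration of the process-mining idea behind both versions of this record, the sketch below derives a directly-follows process map from timestamped location events. The event log, zone names, and tuple layout are hypothetical, not the authors' data model.

from collections import Counter

# Hypothetical event log: (patient_id, timestamp, zone) tuples such as an
# indoor location system might emit for a surgical area.
log = [
    ("p1", 0, "reception"), ("p1", 10, "preparation"),
    ("p1", 45, "operating_room"), ("p1", 160, "recovery"),
    ("p2", 5, "reception"), ("p2", 20, "operating_room"),
    ("p2", 130, "recovery"),
]

def directly_follows(events):
    """Count zone-to-zone transitions per patient; the weighted edges form
    the simple process map that process-mining tools visualize."""
    edges = Counter()
    last_zone = {}
    for pid, t, zone in sorted(events, key=lambda e: (e[0], e[1])):
        if pid in last_zone:
            edges[(last_zone[pid], zone)] += 1
        last_zone[pid] = zone
    return edges

for (a, b), n in directly_follows(log).items():
    print(f"{a} -> {b}: {n}")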

  15. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system is presented. The system was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management, and display capabilities of the research computer system are described, showing how they combine into an effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  16. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S., Jr.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
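
    A minimal sketch of one GLORIA-specific correction named above, the slant-range to ground-range conversion, assuming a flat seafloor; the depth, ranges, and backscatter values are illustrative stand-ins.

import numpy as np

def slant_to_ground_range(slant_m, water_depth_m):
    """Convert slant range to horizontal ground range for a sidescan
    return over a flat seafloor: g = sqrt(s^2 - d^2)."""
    s = np.asarray(slant_m, dtype=float)
    return np.sqrt(np.maximum(s**2 - water_depth_m**2, 0.0))

# Resample one across-track line of pixels onto a uniform ground-range axis.
depth = 50.0                                # assumed water depth, m
slant = np.linspace(depth, 1000.0, 512)     # slant ranges of the raw pixels
line = np.random.rand(512)                  # raw backscatter samples (stand-in)
ground = slant_to_ground_range(slant, depth)
uniform = np.linspace(ground[0], ground[-1], 512)
corrected = np.interp(uniform, ground, line)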

  17. Intranets and Digital Organizational Information Resources: Towards a Portable Methodology for Design and Development.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1997-01-01

    Discusses the concept of the intranet, comparing and contrasting it with groupware, and presents an argument for its value based on technical and information management considerations. Presents an intranet development project for an academic organization and describes a portable, user-centered and team-based methodology for the design and…

  18. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  19. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the Digital Database for Screening Mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
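
    The sketch below illustrates the general flavor of the pipeline described above: a global threshold plus a morphological operator to isolate bright candidate regions. Otsu's threshold is used here as an assumed stand-in for the paper's texture-based segmentation, and the input image is synthetic.

import numpy as np
from skimage import filters, morphology, measure

def candidate_regions(mammogram):
    """Separate bright candidate structures (masses, microcalcifications)
    from background tissue with a global threshold, clean the mask with a
    morphological opening, and label the connected components."""
    t = filters.threshold_otsu(mammogram)
    mask = mammogram > t
    mask = morphology.binary_opening(mask, morphology.disk(2))
    return measure.label(mask)

img = np.random.rand(256, 256)        # stand-in for a digitized mammogram
labels = candidate_regions(img)
print(labels.max(), "candidate regions")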

  20. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  1. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  2. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
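
    A toy example in the spirit of both versions of this record: a gene-expression level modeled as a first-order discrete-time system and steered by a digital proportional controller. All parameters are illustrative, not taken from the article.

# Minimal sketch: one gene's expression treated as the first-order
# discrete-time system x[n+1] = a*x[n] + b*u[n], with proportional
# feedback steering expression toward a reference level.
a, b = 0.9, 0.5              # assumed decay (degradation) and input gain
x, target, kp = 0.0, 1.0, 0.8
trace = []
for n in range(50):
    u = kp * (target - x)    # digital proportional controller
    x = a * x + b * u        # plant update, once per sampling period
    trace.append(x)
# Proportional-only control leaves a steady-state offset:
# x -> b*kp*target / (1 - a + b*kp) = 0.8, not 1.0.
print(f"steady state ~ {trace[-1]:.3f}")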

  3. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  4. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  5. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung ratio was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung ratio values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy (control) subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.
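
    A minimal sketch of the heart/lung figure of merit described above, computed as a mean-counts ratio between two regions of interest; the volume, masks, and ROI placement below are synthetic stand-ins.

import numpy as np

def roi_ratio(volume, heart_mask, lung_mask):
    """Mean-counts ratio between two regions of interest, the kind of
    heart/lung figure the abstract reports (2.62 vs 2.84)."""
    return volume[heart_mask].mean() / volume[lung_mask].mean()

spect = np.random.poisson(20, size=(64, 64, 64)).astype(float)  # toy counts
heart = np.zeros(spect.shape, dtype=bool)
heart[28:36, 28:36, 28:36] = True        # assumed heart ROI
lung = np.zeros(spect.shape, dtype=bool)
lung[10:20, 10:20, 10:20] = True         # assumed lung ROI
print(round(roi_ratio(spect, heart, lung), 2))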

  6. Digital Image Processing for Noise Reduction in Medical Ultrasonics

    NASA Astrophysics Data System (ADS)

    Loupas, Thanasis

    Available from UMI in association with The British Library. Requires signed TDF. The purpose of this project was to investigate the application of digital image processing techniques as a means of reducing noise in medical ultrasonic imaging. Ultrasonic images suffer primarily from a type of acoustic noise, known as speckle, which is generally regarded as a major source of image quality degradation. The origin of speckle, its statistical properties as well as methods suggested to eliminate this artifact were reviewed. A simple model which can characterize the statistics of speckle on displays was also developed. A large number of digital noise reduction techniques was investigated. These include frame averaging techniques performed by commercially available devices and spatial filters implemented in software. Among the latter, some filters have been proposed in the scientific literature for ultrasonic, laser and microwave speckle or general noise suppression and the rest are original, developed specifically to suppress ultrasonic speckle. Particular emphasis was placed on adaptive techniques which adjust the processing performed at each point according to the local image content. In this way, they manage to suppress speckle with negligible loss of genuine image detail. Apart from preserving the diagnostically significant features of a scan, another requirement a technique must satisfy before it is accepted in routine clinical practice is real-time operation. A spatial filter capable of satisfying both these requirements was designed and built in hardware using low-cost and readily available components. The possibility of incorporating all the necessary filter circuitry into a single VLSI chip was also investigated. In order to establish the effectiveness and usefulness of speckle suppression, a representative sample from the techniques examined here was applied to a large number of abdominal scans and their effect on image quality was evaluated. Finally, further
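
    As one concrete representative of the adaptive spatial filters this thesis surveys, the sketch below implements the classic Lee speckle filter, which smooths flat regions while preserving high-variance detail; the window size and noise variance are assumed values, and the noisy image is synthetic.

import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7, noise_var=0.05):
    """Adaptive speckle filter (Lee, 1980): output = mean + k*(pixel - mean),
    where the gain k grows with local variance, so edges are preserved
    and homogeneous regions are smoothed."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    var = np.maximum(mean_sq - mean**2, 0.0)
    gain = var / (var + noise_var)
    return mean + gain * (img - mean)

noisy = np.random.gamma(4.0, 0.25, size=(128, 128))  # speckle-like stand-in
smoothed = lee_filter(noisy)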

  7. Towards a Methodology for Representing and Classifying Business Processes

    NASA Astrophysics Data System (ADS)

    Mili, Hafedh; Leshob, Abderrahmane; Lefebvre, Eric; Lévesque, Ghislain; El-Boussaidi, Ghizlane

    Organizations build information systems to support their business processes. Some of these business processes are industry or organization-specific, but most are common to many industries and are used, modulo a few modifications, in different contexts. A precise modeling of such processes would seem to be a necessary prerequisite for building information systems that are aligned with the business objectives of the organization and that fulfill the functional requirements of its users. Yet, there are few tools, conceptual or otherwise, that enable organizations to model their business processes precisely and efficiently, and fewer tools still to map such process models to the software components that are needed to support them. Our work deals with the problem of building tools to model business processes precisely, and to help map such models to software models. In this paper, we describe a representation and classification of business processes that supports the specification of organization-specific processes by (1) navigating a repository of generic business processes, and (2) automatically generating new process variants to accommodate the specifics of the organization. We present the principles underlying our approach, and describe the state of an ongoing implementation.

  8. Irdis: A Digital Scene Storage And Processing System For Hardware-In-The-Loop Missile Testing

    NASA Astrophysics Data System (ADS)

    Sedlar, Michael F.; Griffith, Jerry A.

    1988-07-01

    This paper describes the implementation of a Seeker Evaluation and Test Simulation (SETS) Facility at Eglin Air Force Base. This facility will be used to evaluate imaging infrared (IIR) guided weapon systems by performing various types of laboratory tests. One such test is termed Hardware-in-the-Loop (HIL) simulation (Figure 1), in which the actual flight of a weapon system is simulated as closely as possible in the laboratory. As shown in the figure, there are four major elements in the HIL test environment: the weapon/sensor combination, an aerodynamic simulator, an imagery controller, and an infrared imagery system. The paper concentrates on the approaches and methodologies used in the imagery controller and infrared imaging system elements for generating scene information. For procurement purposes, these two elements have been combined into an Infrared Digital Injection System (IRDIS) which provides scene storage, processing, and output interface to drive a radiometric display device or to directly inject digital video into the weapon system (bypassing the sensor). The paper describes in detail how standard and custom image processing functions have been combined with off-the-shelf mass storage and computing devices to produce a system which provides high sample rates (greater than 90 Hz), a large terrain database, high weapon rates of change, and multiple independent targets. A photo-based approach has been used to maximize terrain and target fidelity, thus providing a rich and complex scene for weapon/tracker evaluation.

  9. Unfolding-synthesis technique for digital pulse processing. Part 1: Unfolding

    NASA Astrophysics Data System (ADS)

    Jordanov, Valentin T.

    2016-01-01

    The unfolding-synthesis technique is used in the development of digital pulse-processing systems for radiation measurements. This technique is applied to digital signals obtained by digitization of analog signals that represent the combined response of the radiation detectors and the associated signal conditioning electronics. The salient features of the unfolding-synthesis technique are first the unfolding of the digital signals into unit impulses, followed by the synthesis of digital signal processing systems with unit impulse responses equivalent to the desired pulse shapes. Part 1 of this paper covers the unfolding part of this technique.
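
    A hedged illustration of the unfolding idea: if a digitized detector pulse decays exponentially with per-sample factor a, the difference u[n] = x[n] - a·x[n-1] collapses each pulse back to a unit impulse, even when pulses pile up. The decay constant, amplitudes, and threshold below are illustrative, not taken from the paper.

import numpy as np

a = np.exp(-1.0 / 50.0)          # assumed decay for tau = 50 samples
x = np.zeros(300)
for t0, amp in [(40, 1.0), (120, 0.6), (125, 0.4)]:   # piled-up pulses
    n = np.arange(300 - t0)
    x[t0:] += amp * a**n         # each pulse: amp * a**(t - t0)

# Unfolding step: exponential tails cancel, leaving weighted impulses.
u = x - a * np.concatenate(([0.0], x[:-1]))
peaks = np.nonzero(u > 0.05)[0]
print(peaks)                     # recovers arrival times 40, 120, 125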

  10. Development of next generation digital flat panel catheterization system: design principles and validation methodology

    NASA Astrophysics Data System (ADS)

    Belanger, B.; Betraoui, F.; Dhawale, P.; Gopinath, P.; Tegzes, Pal; Vagvolgyi, B.

    2006-03-01

    The design principles that drove the development of a new cardiovascular x-ray digital flat panel (DFP) detector system are presented, followed by assessments of imaging and dose performance achieved relative to other state-of-the-art FPD systems. The new system (GE Innova 2100 IQ™) incorporates a new detector with substantially improved DQE at fluoroscopic (73% @ 1 μR) and record (79% @ 114 μR) doses, an x-ray tube with higher continuous fluoro power (3.2kW), a collimator with a wide range of copper spectral filtration (up to 0.9mm), and an improved automatic x-ray exposure management system. The performance of this new system was compared to that of the previous generation GE product (Innova 2000) and to state-of-the-art cardiac digital x-ray flat panel systems from two other major manufacturers. Performance was assessed with the industry standard Cardiac X-ray NEMA/SCA&I phantom, and a new moving coronary artery stent (MCAS) phantom, designed to simulate cardiac clinical imaging conditions, composed of an anthropomorphic chest section with stents moving in a manner simulating normal coronary arteries. The NEMA/SCA&I phantom results showed the Innova 2100 IQ to exceed or equal the Innova 2000 in all of the performance categories, while operating at 28% lower dose on average, and to exceed the other DFP systems in most of the performance categories. The MCAS phantom tests showed the Innova 2100 IQ to be significantly better (p << 0.05) than the Innova 2000, and significantly better than the other DFP systems in most cases at comparable or lower doses, thereby verifying excellent performance against design goals.

  11. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  12. Digital signal processing techniques for coherent optical communication

    NASA Astrophysics Data System (ADS)

    Goldfarb, Gilad

    Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection which rely on the preservation of full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. Analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise for high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented. Interaction between fiber dispersion and nonlinearity remains the last major challenge

  13. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers, applicable to digitized multispectral aerial photography, are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.

  14. Social Information Processing, Emotions, and Aggression: Conceptual and Methodological Contributions of the Special Section Articles

    ERIC Educational Resources Information Center

    Arsenio, William F.

    2010-01-01

    This discussion summarizes some of the key conceptual and methodological contributions of the four articles in this special section on social information processing (SIP) and aggression. One major contribution involves the new methodological tools these studies provide for future researchers. Eye-tracking and mood induction techniques will make it…

  15. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criterion. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered as a measure of the instrument noise performance, and can be estimated as
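
    A minimal sketch of the first-stage filtering and decimation described above, using an optimum-equiripple (Parks-McClellan) FIR design; the band edges, tap count, and decimation factor are assumptions, and a real lowpass is shown in place of the instrument's complex filter.

import numpy as np
from scipy import signal

fs = 1.0                                   # normalized sampling rate
# Linear-phase even-length FIR lowpass via the equiripple criterion.
taps = signal.remez(64, [0, 0.10, 0.15, 0.5], [1, 0], fs=fs)

interferogram = np.random.randn(8192)      # stand-in for raw samples
filtered = signal.lfilter(taps, 1.0, interferogram)
decimated = filtered[::4]                  # keep every 4th sample
print(len(taps), len(decimated))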

  16. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  17. Radiometric calibration of digital cameras using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Schall, Martin; Grunwald, Michael; Umlauf, Georg; Franz, Matthias O.

    2015-05-01

    Digital cameras are subject to physical, electronic and optic effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting and different sensitivities of individual pixels. The task of a radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes. Gaussian processes are a regression method widely used in machine learning that is particularly useful in our context: Gaussian process regression is used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and thus are unsuitable for many practical applications. In contrast, a single regression model for an entire image with high spatial resolution leads to a low quality radiometric calibration, which also limits its practical use. The proposed algorithm is predicated on a partitioning of the pixels such that each pixel partition can be represented by one single regression model without quality loss. Partitioning is done by extracting features from the characteristic of each pixel and using them for lexicographic sorting. Splitting the sorted data into partitions with equal size yields the final partitions, each of which is represented by the partition centers. An individual Gaussian process regression and model selection is done for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. The experimental comparison of the proposed approach to classical flat field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
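
    The sketch below captures the core idea under stated assumptions: Gaussian process regression from observed gray values (plus exposure time) to true intensity for one pixel partition. The synthetic sensor model, kernel choice, and length scales are illustrative, not the paper's.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
exposure = rng.uniform(1, 10, 200)                 # exposure times
intensity = rng.uniform(0, 1, 200)                 # "true" light intensity
# Assumed toy sensor: gain * intensity * exposure + offset + noise.
gray = 200 * intensity * exposure + 5 + rng.normal(0, 2, 200)

X = np.column_stack([gray, exposure])
gp = GaussianProcessRegressor(RBF(length_scale=[50.0, 2.0]) + WhiteKernel(1.0))
gp.fit(X, intensity)                               # one model per partition
print(gp.predict(np.array([[500.0, 5.0]])))        # expect roughly 0.5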

  18. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  19. Environmental testing of a prototypic digital safety channel, phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-02-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the U.S. Nuclear Regulatory Commission. The goal of this program is to establish the technical basis for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR) and AP600. It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  20. Knowledge and Processes That Predict Proficiency in Digital Literacy

    ERIC Educational Resources Information Center

    Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.

    2014-01-01

    Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…

  1. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  2. Digital image processing: a primer for JVIR authors and readers: part 1: the fundamentals.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-10-01

    Online submission of manuscripts will be mandatory for most journals in the near future. To prepare authors for this requirement and to acquaint readers with this new development, herein the basics of digital image processing are described. From the fundamentals of digital image architecture, through acquisition, editing, and storage of digital images, the steps necessary to prepare an image for online submission are reviewed. In this article, the first of a three-part series, the structure of the digital image is described. In subsequent articles, the acquisition and editing of digital images will be reviewed. PMID:14551267

  3. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    SciTech Connect

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  4. How processing digital elevation models can affect simulated water budgets

    USGS Publications Warehouse

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
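
    A small sketch of the comparison described above: aggregating a fine DEM to coarse model cells using the value nearest the cell centroid versus the mean over the cell. The DEM, cell size, and terrain shape are synthetic.

import numpy as np

def to_model_cells(dem, cells_per_side):
    """Aggregate a fine DEM to coarse model cells two ways: the value at
    the cell centroid versus the mean over the cell."""
    f = dem.shape[0] // cells_per_side
    blocks = dem.reshape(cells_per_side, f, cells_per_side, f)
    mean_vals = blocks.mean(axis=(1, 3))
    centroid_vals = dem[f // 2::f, f // 2::f]
    return centroid_vals, mean_vals

# Toy terrain: a regional slope with local relief.
dem = np.add.outer(np.linspace(0, 100, 400),
                   5 * np.sin(np.linspace(0, 20, 400)))
centroid, mean = to_model_cells(dem, 40)
print(float(np.abs(centroid - mean).mean()))  # how far the two surfaces differ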

  5. Social work practice in the digital age: therapeutic e-mail as a direct practice methodology.

    PubMed

    Mattison, Marian

    2012-07-01

    The author addresses the risks and benefits of incorporating therapeutic e-mail communication into clinical social work practice. Consumer demand for online clinical services is growing faster than the professional response. E-mail, when used as an adjunct to traditional meetings with clients, offers distinct advantages and risks. Benefits include the potential to reach clients in geographically remote and underserved communities, enhancing and extending the therapeutic relationship and improving treatment outcomes. Risks include threats to client confidentiality and privacy, liability coverage for practitioners, licensing jurisdiction, and the lack of competency standards for delivering e-mail interventions. Currently, the social work profession does not have adequate instructive guidelines and best-practice standards for using e-mail as a direct practice methodology. Practitioners need (formal) academic training in the techniques connected to e-mail exchanges with clients. The author describes the ethical and legal risks for practitioners using therapeutic e-mail with clients and identifies recommendations for establishing best-practice standards. PMID:23252316

  6. Delicate visual artifacts of advanced digital video processing algorithms

    NASA Astrophysics Data System (ADS)

    Nicolas, Marina M.; Lebowsky, Fritz

    2005-03-01

    With the advent of digital TV, sophisticated video processing algorithms have been developed to improve the rendering of motion or colors. However, the perceived subjective quality of these new systems sometimes conflicts with the objective, measurable improvement we expect to get. In this presentation, we show examples where algorithms should visually improve the skin tone rendering of decoded pictures under normal conditions but surprisingly fail when the quality of MPEG encoding drops below a just-noticeable threshold. In particular, we demonstrate that simple objective criteria used for the optimization, such as SAD, PSNR or histograms, sometimes fail, partly because they are defined on a global scale, ignoring local characteristics of the picture content. We then integrate a simple human visual model to measure potential artifacts with regard to spatial and temporal variations of the objects' characteristics. Tuning some of the model's parameters allows correlating the perceived objective quality with compression metrics of various encoders. We show the evolution of our reference parameters with respect to the compression ratios. Finally, using the output of the model, we can control the parameters of the skin tone algorithm to reach an improvement in overall system quality.

  7. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images are presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information for the identification process. Results from both synthesized and solar images will be presented.

  8. Digital neuromorphic processing for a simplified algorithm of ultrasonic reception

    NASA Astrophysics Data System (ADS)

    Qiang, Lin; Clarke, Chris

    2001-05-01

    Previously, most research on mammalian auditory systems has concentrated on human sensory perception, where frequencies are below 20 kHz, and the implementations have almost always used analog VLSI design. Due to the complexity of the model, it is difficult to implement these algorithms using current digital technology. This paper introduces a simplified model of the biosonic reception system in bats and its implementation in the "Chiroptera Inspired Robotic CEphaloid" (CIRCE) project. This model consists of bandpass filters, a half-wave rectifier, low-pass filters, automatic gain control, and spike generation with thresholds. Due to the real-time requirements of the system, the system employs Butterworth filters and advanced field programmable gate array (FPGA) architectures to provide a viable solution. The ultrasonic signal processing is implemented on a Xilinx FPGA Virtex II device in real time. In the system, 12-bit input echo signals from the receivers are sampled at 1 M samples per second for a signal frequency range from 20 to 200 kHz. The system performs a 704-channel per ear auditory pipeline operating in real time. The output of the system is a coded time series of threshold crossing points. Comparing the hardware implementation with fixed-point software, the system shows significant performance gains with no loss of accuracy.
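
    The sketch below shows one channel of such an auditory pipeline under stated assumptions: Butterworth band-pass, half-wave rectification, low-pass envelope extraction, and threshold-crossing detection. The band edges, filter orders, and threshold are illustrative, and the echo is synthetic.

import numpy as np
from scipy import signal

fs = 1_000_000                       # 1 M samples/s, as in the abstract

def ear_channel(echo, f_lo, f_hi, threshold=0.05):
    """One auditory-pipeline channel: Butterworth band-pass, half-wave
    rectification, low-pass envelope, then threshold-crossing times."""
    sos_bp = signal.butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
    sos_lp = signal.butter(2, 5_000, btype="lowpass", fs=fs, output="sos")
    env = signal.sosfilt(sos_lp, np.maximum(signal.sosfilt(sos_bp, echo), 0.0))
    crossings = np.nonzero((env[1:] >= threshold) & (env[:-1] < threshold))[0]
    return crossings / fs            # coded time series of crossing points

t = np.arange(0, 0.005, 1 / fs)
# Synthetic 40 kHz echo with a Gaussian envelope centered at 2 ms.
echo = np.sin(2 * np.pi * 40_000 * t) * np.exp(-((t - 0.002) / 3e-4) ** 2)
print(ear_channel(echo, 35_000, 45_000))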

  9. Fully Digital: Policy and Process Implications for the AAS

    NASA Astrophysics Data System (ADS)

    Biemesderfer, Chris

    Over the past two decades, every scholarly publisher has migrated at least the mechanical aspects of their journal publishing so that they utilize digital means. The academy was comfortable with that for a while, but publishers are under increasing pressure to adapt further. At the American Astronomical Society (AAS), we think that means bringing our publishing program to the point of being fully digital, by establishing procedures and policies that regard the digital objects of publication primarily. We have always thought about our electronic journals as databases of digital articles, from which we can publish and syndicate articles one at a time, and we must now put flesh on those bones by developing practices that are consistent with the realities of article at a time publication online. As a learned society that holds the long-term rights to the literature, we have actively taken responsibility for the preservation of the digital assets that constitute our journals, and in so doing we have not forsaken the legacy pre-digital assets. All of us who serve as the long-term stewards of scholarship must begin to evolve into fully digital publishers.

  10. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper addresses the detection and removal of cracks on digitized paintings. The cracks are detected by thresholding. Afterwards, thin dark brush strokes that have been misidentified as cracks are removed using either a median radial basis function neural network on hue and saturation data or a semi-automatic procedure based on region growing. Finally, the cracks are filled using a Wiener filter. Most of the cracks on the digitized paintings tested were identified and removed, with a reported improvement rate of 90%. The approach applies not only to digitized paintings but also to medical images and BMP images. The method is implemented in MATLAB.
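
    A simplified stand-in for the pipeline described above: thresholding flags dark pixels as crack candidates and a median filter supplies fill values. The threshold, filter size, and synthetic image are assumptions; the paper's neural-network, region-growing, and Wiener-filter stages are not reproduced here.

import numpy as np
from scipy import ndimage

def detect_and_fill_cracks(painting, crack_threshold=0.25):
    """Flag dark pixels as crack candidates by thresholding, then replace
    them with values from a median-filtered copy of the image."""
    cracks = painting < crack_threshold          # dark pixels as candidates
    filled = painting.copy()
    filled[cracks] = ndimage.median_filter(painting, size=5)[cracks]
    return cracks, filled

img = np.clip(np.random.rand(128, 128) * 0.5 + 0.5, 0, 1)  # bright "paint"
img[64, 20:100] = 0.1                                       # synthetic crack
mask, restored = detect_and_fill_cracks(img)
print(mask.sum(), "crack pixels filled")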