Science.gov

Sample records for digital processing methodology

  1. Digital methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  2. Digital methodology to implement the ECOUTER engagement process

    PubMed Central

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes. PMID:27366320

  3. A Digital Methodology for the Design Process of Aerospace Assemblies with Sustainable Composite Processes & Manufacture

    NASA Astrophysics Data System (ADS)

    McEwan, W.; Butterfield, J.

    2011-05-01

    The well-established benefits of composite materials are driving a significant shift in design and manufacture strategies for original equipment manufacturers (OEMs). Thermoplastic composites have advantages over the traditional thermosetting materials with regards to sustainability and environmental impact, features which are becoming increasingly pertinent in the aerospace arena. However, when sustainability and environmental impact are considered as design drivers, integrated methods for part design and product development must be developed so that any benefits of sustainable composite material systems can be assessed during the design process. These methods must include mechanisms to account for process-induced part variation and techniques related to re-forming, recycling and decommissioning, which are in their infancy. It is proposed in this paper that predictive techniques related to material specification, part processing and product cost of thermoplastic composite components be integrated within a Through Life Management (TLM) product development methodology as part of a larger strategy of product system modeling to improve disciplinary concurrency, realistic part performance, and to place sustainability at the heart of the design process. This paper reports the enhancement of digital manufacturing tools as a means of drawing simulated part manufacturing scenarios, real-time costing mechanisms, and broader lifecycle performance data capture into the design cycle. The work demonstrates predictive processes for sustainable composite product manufacture and how a Product-Process-Resource (PPR) structure can be customised and enhanced to include design intent driven by 'Real' part geometry and consequent assembly.

  4. Developing Digital Interventions: A Methodological Guide

    PubMed Central

    Watts, Sam; Yardley, Lucy; Lewith, George

    2014-01-01

    Digital interventions are becoming an increasingly popular method of delivering healthcare as they enable and promote patient self-management. This paper provides a methodological guide to the processes involved in developing effective digital interventions, detailing how to plan and develop such interventions to avoid common pitfalls. It demonstrates the need for mixed qualitative and quantitative methods in order to develop digital interventions which are effective, feasible, and acceptable to users and stakeholders. PMID:24648848

  5. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…

  6. Digital image processing.

    PubMed

    Seeram, Euclid

    2004-01-01

    Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

  7. Integration Of Digital Methodologies (Field, Processing, and Presentation) In A Combined Sedimentology/Stratigraphy and Structure Course

    NASA Astrophysics Data System (ADS)

    Malinconico, L. L., Jr.; Sunderlin, D.; Liew, C. W.

    2015-12-01

    Over the course of the last three years we have designed, developed and refined two Apps for the iPad. GeoFieldBook and StratLogger allow for the real-time display of spatial (structural) and temporal (stratigraphic) field data as well as very easy in-field navigation. These techniques and methods for data acquisition and mapping have dramatically advanced and simplified how we collect and analyze data in the field. The Apps are not geologic mapping programs, but rather a way of bypassing the analog field book step to acquire digital data directly that can then be used in various analysis programs (GIS, Google Earth, Stereonet, spreadsheet and drawing programs). We now complete all of our fieldwork digitally. GeoFieldBook can be used to collect structural and other field observations. Each record includes location/date/time information, orientation measurements, formation names, text observations and photos taken with the tablet camera. Records are customizable, so users can add fields of their own choosing. Data are displayed on an image base in real time with oriented structural symbols. The image base is also used for in-field navigation. In StratLogger, the user records bed thickness, lithofacies, biofacies, and contact data in preset and modifiable fields. Each bed/unit record may also be photographed and geo-referenced. As each record is collected, a column diagram of the stratigraphic sequence is built in real time, complete with lithology color, lithology texture, and fossil symbols. The recorded data from any measured stratigraphic sequence can be exported as both the live-drawn column image and as a .csv formatted file for use in spreadsheet or other applications. Common to both Apps is the ability to export the data (via .csv files), photographs and maps or stratigraphic columns (images). Since the data are digital they are easily imported into various processing programs (for example for stereoplot analysis). Requiring that all maps

  8. Digital image classification by the Bessel masks methodology

    NASA Astrophysics Data System (ADS)

    Solorza, S.; Álvarez-Borrego, J.

    2014-10-01

    Since the evolution of computer hardware in the middle of the last century, automated processes have become highly productive in fields such as industry, security, engineering and science. In this work a pattern recognition methodology to classify images is presented. Here, the Bessel masks digital image system, invariant to position and rotation, is utilized to classify gray-scale images. Moreover, by use of Fisher's Z distribution the digital system achieves a 99% confidence level performance.
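
    The record above cites a 99% confidence level obtained via Fisher's Z distribution. As an illustration only (not code from the paper), the sketch below shows how a Fisher Z transform yields a 99% confidence interval for a correlation-based match score; the function name and sample values are assumptions.

      import numpy as np

      def fisher_z_interval(r, n, z_crit=2.576):
          """99% confidence interval for a correlation coefficient r from n samples,
          computed via the Fisher Z transform (z_crit = 2.576 for a 99% two-sided interval)."""
          z = np.arctanh(r)                 # Fisher Z transform of the correlation
          se = 1.0 / np.sqrt(n - 3)         # standard error of Z
          lo, hi = z - z_crit * se, z + z_crit * se
          return np.tanh(lo), np.tanh(hi)   # back-transform to the correlation scale

      print(fisher_z_interval(r=0.92, n=50))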

  9. DIGITAL TECHNOLOGY BUSINESS CASE METHODOLOGY GUIDE & WORKBOOK

    SciTech Connect

    Thomas, Ken; Lawrie, Sean; Hart, Adam; Vlahoplus, Chris

    2014-09-01

    Performance advantages of the new digital technologies are widely acknowledged, but it has proven difficult for utilities to derive business cases for justifying investment in these new capabilities. Lack of a business case is often cited by utilities as a barrier to pursuing wide-scale application of digital technologies to nuclear plant work activities. The decision to move forward with funding usually hinges on demonstrating actual cost reductions that can be credited to budgets and thereby truly reduce O&M or capital costs. Technology enhancements, while enhancing work methods and making work more efficient, often fail to eliminate workload such that it changes overall staffing and material cost requirements. It is critical to demonstrate cost reductions or impacts on non-cost performance objectives in order for the business case to justify investment by nuclear operators. This Business Case Methodology approaches building a business case for a particular technology or suite of technologies by detailing how they impact an operator in one or more of the three following areas: Labor Costs, Non-Labor Costs, and Key Performance Indicators (KPIs). Key to those impacts will be identifying where the savings are “harvestable,” meaning they result in an actual reduction in headcount and/or cost. The report consists of a Digital Technology Business Case Methodology Guide and an accompanying spreadsheet workbook that will enable the user to develop a business case.

  10. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Morgera, Salvatore D.; Krishna, Hari

    Computationally efficient digital signal-processing algorithms over finite fields are developed analytically, and the relationship of these algorithms to algebraic error-correcting codes is explored. A multidisciplinary approach is employed, in an effort to make the results accessible to engineers, mathematicians, and computer scientists. Chapters are devoted to systems of bilinear forms, efficient finite-field algorithms, multidimensional methods, a new class of linear codes, and a new error-control scheme.

  11. Digital processing clock

    NASA Technical Reports Server (NTRS)

    Phillips, D. H.

    1982-01-01

    The digital processing clock SG 1157/U is described. It is compatible with the PTTI world, where it can be driven by an external cesium source. Built-in test equipment shows synchronization with cesium through 1 pulse per second. It is built to be expandable to accommodate future time-keeping needs of the Navy as well as any other time-ordered functions. Examples of this expandability are the inclusion of an unmodulated XR3 time code and the 2137 modulated time code (XR3 with 1 kHz carrier).

  12. Aquarius Digital Processing Unit

    NASA Technical Reports Server (NTRS)

    Forgione, Joshua; Winkert, George; Dobson, Norman

    2009-01-01

    Three documents provide information on a digital processing unit (DPU) for the planned Aquarius mission, in which a radiometer aboard a spacecraft orbiting Earth is to measure radiometric temperatures from which data on sea-surface salinity are to be deduced. The DPU is the interface between the radiometer and an instrument-command-and-data system aboard the spacecraft. The DPU cycles the radiometer through a programmable sequence of states, collects and processes all radiometric data, and collects all housekeeping data pertaining to operation of the radiometer. The documents summarize the DPU design, with emphasis on innovative aspects that include mainly the following: a) In the radiometer and the DPU, conversion from analog voltages to digital data is effected by means of asynchronous voltage-to-frequency converters in combination with a frequency-measurement scheme implemented in field-programmable gate arrays (FPGAs). b) A scheme to compensate for aging and changes in the temperature of the DPU in order to provide an overall temperature-measurement accuracy within 0.01 K includes a high-precision, inexpensive DC temperature measurement scheme and a drift-compensation scheme that was used on the Cassini radar system. c) An interface among multiple FPGAs in the DPU guarantees setup and hold times.

  13. Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Baher, H.

    The techniques of signal processing in both the analog and digital domains are addressed in a fashion suitable for undergraduate courses in modern electrical engineering. The topics considered include: spectral analysis of continuous and discrete signals, analysis of continuous and discrete systems and networks using transform methods, design of analog and digital filters, digitization of analog signals, power spectrum estimation of stochastic signals, FFT algorithms, finite word-length effects in digital signal processors, linear estimation, and adaptive filtering.

  14. Digital signal processing

    NASA Astrophysics Data System (ADS)

    Meyer, G.

    The theory, realization techniques, and applications of digital filtering are surveyed, with an emphasis on the development of software, in a handbook for advanced students of electrical and electronic engineering and practicing development engineers. The foundations of the theory of discrete signals and systems are introduced. The design of one-dimensional linear systems is discussed, and the techniques are expanded to the treatment of two-dimensional discrete and multidimensional analog systems. Numerical systems, quantification and limitation, and the characteristics of particular signal-processing devices are considered in a section on design realization. An appendix contains definitions of the basic mathematical concepts, derivations and proofs, and tables of integration and differentiation formulas.

  15. Digital processing of bandpass signals

    NASA Astrophysics Data System (ADS)

    Jackson, M. C.; Matthewson, P.

    Modern radar and radio systems rely on digital signal processing to enhance the quality of received signals. Prior to such processing, these signals must be converted to digital form. The historical development of signal digitization is briefly discussed in this paper and leads to a description of some current work on digital mixing. A method of directly sampling a band-limited intermediate frequency (i.f.) signal is presented, using a pair of digital mixer channels to produce complex low-pass samples of the signal envelope. The method is found to produce well matched channel outputs. Finally, the applicability of the method to radar is discussed.
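
    As a rough illustration of the digital-mixing idea described in this record (a pair of digital mixer channels producing complex low-pass samples of a band-limited IF signal), the following Python sketch is illustrative only; the placeholder low-pass filter, decimation factor and signal parameters are assumptions, not the paper's implementation.

      import numpy as np

      def digital_mix_to_baseband(x, fs, f_if, decim=8):
          """Mix a real band-limited IF sample stream to complex baseband.

          x     : real-valued samples of the IF signal
          fs    : sampling rate in Hz
          f_if  : nominal IF centre frequency in Hz
          decim : decimation factor applied after low-pass filtering
          """
          n = np.arange(len(x))
          # Pair of digital mixer channels: in-phase and quadrature local oscillators
          lo_i = np.cos(2 * np.pi * f_if * n / fs)
          lo_q = -np.sin(2 * np.pi * f_if * n / fs)
          i_ch = x * lo_i
          q_ch = x * lo_q
          # Crude low-pass: moving average over the decimation window (placeholder filter)
          kernel = np.ones(decim) / decim
          i_lp = np.convolve(i_ch, kernel, mode="same")
          q_lp = np.convolve(q_ch, kernel, mode="same")
          # Complex low-pass samples of the signal envelope
          return (i_lp + 1j * q_lp)[::decim]

      # Example: a tone offset 1 kHz from a 10 kHz IF, sampled at 100 kHz
      fs, f_if = 100e3, 10e3
      t = np.arange(0, 0.01, 1 / fs)
      iq = digital_mix_to_baseband(np.cos(2 * np.pi * (f_if + 1e3) * t), fs, f_if)
      print(iq[:3])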

  16. Measuring user experience in digital gaming: theoretical and methodological issues

    NASA Astrophysics Data System (ADS)

    Takatalo, Jari; Häkkinen, Jukka; Kaistinen, Jyrki; Nyman, Göte

    2007-01-01

    There are innumerable concepts, terms and definitions for user experience. Few of them have a solid empirical foundation. In trying to understand user experience in interactive technologies such as computer games and virtual environments, reliable and valid concepts are needed for measuring relevant user reactions and experiences. Here we present our approach to creating theoretically and methodologically sound methods for quantification of the rich user experience in different digital environments. Our approach is based on the idea that the experience received from content presented with a specific technology is always the result of a complex psychological interpretation process, whose components should be understood. The main aim of our approach is to grasp the complex and multivariate nature of the experience and make it measurable. We present our two basic measurement frameworks, which have been developed and tested on a large data set (n=2182). The 15 measurement scales extracted from these models are applied to digital gaming with a head-mounted display and a table-top display. The results show how it is possible to map between experience, technology variables and the background of the user (e.g., gender). This approach can help to optimize, for example, the contents for specific viewing devices or viewing situations.

  17. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System are described: (1) For the uplink, a low-rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard-transformed components of this last vector. Mathematical models and data reliability analyses are also provided for the above video data compression techniques transmitted over a channel-encoded Gaussian channel. It is shown that substantial gains can be achieved by the combination of video source and channel coding.
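
    To make the Hadamard-domain motion-detection idea concrete, here is a minimal, hypothetical sketch (not the system described in the record): an 8x8 block is transformed with a Sylvester-constructed Hadamard matrix, and motion is flagged when the transform coefficients of successive frames differ by more than a threshold.

      import numpy as np

      def hadamard(order):
          """Sylvester construction of a Hadamard matrix of size 2**order."""
          h = np.array([[1]])
          for _ in range(order):
              h = np.block([[h, h], [h, -h]])
          return h

      def block_motion_detect(prev_block, curr_block, thresh=4.0):
          """Flag motion in an 8x8 block by comparing Hadamard-domain coefficients."""
          h = hadamard(3)                       # 8x8 Hadamard matrix
          t_prev = h @ prev_block @ h / 8.0     # 2-D Hadamard transform (normalised)
          t_curr = h @ curr_block @ h / 8.0
          return np.abs(t_curr - t_prev).sum() > thresh

      # Example: identical blocks produce no motion flag, a brightness step does
      block = np.arange(64.0).reshape(8, 8)
      print(block_motion_detect(block, block))          # -> False
      print(block_motion_detect(block, block + 3.0))    # -> True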

  18. Digital Storytelling: A Novel Methodology for Sexual Health Promotion

    ERIC Educational Resources Information Center

    Guse, Kylene; Spagat, Andrea; Hill, Amy; Lira, Andrea; Heathcock, Stephen; Gilliam, Melissa

    2013-01-01

    Digital storytelling draws on the power of narrative for personal and social transformation. This technique has many desirable attributes for sexuality education, including a participatory methodology, provision of a "safe space" to collaboratively address stigmatized topics, and an emphasis on the social and political contexts that…

  19. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
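
    The FFT convolver mentioned in this record can be illustrated with a short pulse-compression sketch. The code below is illustrative only; the chirp parameters and zero-padding strategy are assumptions and are not the ADSP design.

      import numpy as np

      def fft_compress(echo, reference):
          """Range-compress an echo line by FFT-domain correlation with the reference chirp."""
          n = len(echo) + len(reference) - 1
          nfft = 1 << (n - 1).bit_length()          # next power of two for the FFT convolver
          spec = np.fft.fft(echo, nfft) * np.conj(np.fft.fft(reference, nfft))
          return np.fft.ifft(spec)[:n]

      # Example: compress a linear FM chirp buried in noise
      fs, T, B = 1e6, 1e-3, 1e5                      # sample rate, pulse length, bandwidth
      t = np.arange(0, T, 1 / fs)
      chirp = np.exp(1j * np.pi * (B / T) * t**2)
      echo = np.concatenate([np.zeros(200), chirp, np.zeros(200)])
      echo += 0.1 * (np.random.randn(len(echo)) + 1j * np.random.randn(len(echo)))
      compressed = fft_compress(echo, chirp)
      print(int(np.argmax(np.abs(compressed))))      # peak near the target delay (~200 samples)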

  20. Digital Literacy: Tools and Methodologies for Information Society

    ERIC Educational Resources Information Center

    Rivoltella, Pier Cesare, Ed.

    2008-01-01

    Currently in a state of cultural transition, global society is moving from a literary society to digital one, adopting widespread use of advanced technologies such as the Internet and mobile devices. Digital media has an extraordinary impact on society's formative processes, forcing a pragmatic shift in their management and organization. This…

  1. Digital switching noise as a stochastic process

    NASA Astrophysics Data System (ADS)

    Boselli, Giorgio; Trucco, Gabriella; Liberali, Valentino

    2007-06-01

    Switching activity of logic gates in a digital system is a deterministic process, depending on both circuit parameters and input signals. However, the huge number of logic blocks in a digital system makes digital switching effectively a stochastic process. Switching activity is the source of the so-called "digital noise", which can be analyzed using a stochastic approach. For an asynchronous digital network, we can model digital switching currents as a shot noise process, deriving both its amplitude distribution and its power spectral density. From the spectral distribution of digital currents, we can also calculate the spectral distribution and the power of disturbances injected into the on-chip power supply lines.
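
    A minimal simulation of the shot-noise picture sketched in this abstract (Poisson-timed switching events convolved with a single current pulse, followed by a periodogram PSD estimate) is given below; all numerical values are illustrative assumptions, not figures from the paper.

      import numpy as np

      def shot_noise_current(rate_hz, fs, duration_s, pulse_q=1e-12, tau=1e-9):
          """Simulate switching current as a shot-noise process: Poisson-timed pulses
          of charge pulse_q with exponential decay tau (all values are illustrative)."""
          n = int(duration_s * fs)
          # Poisson number of switching events in each sample interval
          events = np.random.poisson(rate_hz / fs, n)
          t = np.arange(0, 10 * tau, 1 / fs)
          pulse = (pulse_q / tau) * np.exp(-t / tau)      # single switching current pulse
          return np.convolve(events, pulse)[:n]

      i = shot_noise_current(rate_hz=1e8, fs=1e10, duration_s=1e-6)
      psd = np.abs(np.fft.rfft(i))**2 / len(i)            # periodogram estimate of the PSD
      print(i.mean(), psd[:3])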

  2. Digital Pulse Processing for Steff

    NASA Astrophysics Data System (ADS)

    Pollitt, A. J.; Dare, J. A.; Smith, A. G.; Tsekhanovich, I.

    2011-10-01

    The SpecTrometer for Exotic Fission Fragments (STEFF) is a Manchester-based experiment that uses two segmented Bragg chambers to determine fission fragment energies of a 252Cf source. In this work the pulses collected in the Bragg chambers are processed digitally using flash ADCs in GRT4 cards and a DAQ. Algorithms have been developed to sort the data pulses off-line. These include routines for the removal of noise and cross talk through digital filtering, adjusting the data for ballistic deficit and adding back pulses from the individual anode segments. From the corrected data, the mass and Z of the fission fragments are deducible. Current methods and present challenges will be discussed.

  3. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  4. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  5. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.

  6. Development of digital image processing based methodology to study, quantify and correlate the microstructure and three dimensional fracture surface morphology of aluminum alloy 7050

    NASA Astrophysics Data System (ADS)

    Dighe, Manish Deepak

    2000-10-01

    7XXX series wrought aluminum alloys are extensively used for structural aerospace applications due to their high strength-to-weight ratio, excellent corrosion resistance, and high fracture resistance. 7050 is an important alloy of this group, which is widely used for applications such as aircraft wing skin structures, aircraft landing gear parts, and fuselage frame structures. Therefore, it is of interest to investigate the fracture behavior of 7050 aluminum alloy, which is a typical alloy of the 7XXX series. The aim of this research is to quantitatively characterize and model the relationships among processing, microstructure, fracture surface morphology, and fracture toughness of hot-rolled, partially recrystallized, precipitation-hardened 7050 alloy. A new technique is developed which permits simultaneous viewing of the fracture surface and the microstructure just below and above the fracture surface. This technique is then applied to identify, validate and quantify various fracture micro-mechanisms observed on the fracture surface. To gain better insight into the shape and anisotropy of the recrystallized grains, the three-dimensional structure of the microstructure is reconstructed using serial sectioning. The gathered information is utilized to develop a mathematical model relating the various processing parameters and microstructural attributes to the fracture toughness.

  7. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  8. Process waste assessment methodology for mechanical departments

    SciTech Connect

    Hedrick, R.B.

    1992-12-01

    Process waste assessments (PWAs) were performed for three pilot processes to develop methodology for performing PWAs for all the various processes used throughout the mechanical departments. A material balance and process flow diagram identifying the raw materials utilized in the process and the quantity and types of materials entering the waste streams from the process are defined for each PWA. The data and information are used to determine "potential options" for eliminating hazardous materials or minimizing wastes generated.

  9. Digital processing system for developing countries

    NASA Technical Reports Server (NTRS)

    Nanayakkara, C.; Wagner, H.

    1977-01-01

    An effort was undertaken to perform simple digital processing tasks using pre-existing general purpose digital computers. An experimental software package, LIGMALS, was obtained and modified for this purpose. The resulting software permits basic processing tasks to be performed including level slicing, gray mapping and ratio processing. The experience gained in this project indicates a possible direction which may be used by other developing countries to obtain digital processing capabilities.
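
    The basic operations named in this record (level slicing, gray mapping and ratio processing) can be expressed in a few lines. The following sketch is a modern illustration in Python, not the LIGMALS software itself; the function names and the synthetic test band are assumptions.

      import numpy as np

      def level_slice(band, lo, hi):
          """Binary mask of pixels whose values fall within [lo, hi]."""
          return ((band >= lo) & (band <= hi)).astype(np.uint8)

      def gray_map(band, lut):
          """Remap gray levels through a 256-entry look-up table."""
          return lut[band]

      def band_ratio(band_a, band_b, eps=1e-6):
          """Pixel-by-pixel ratio of two co-registered bands."""
          return band_a.astype(float) / (band_b.astype(float) + eps)

      # Example on a synthetic 8-bit band
      band = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
      print(level_slice(band, 100, 200))
      print(gray_map(band, np.flip(np.arange(256, dtype=np.uint8))))  # negative image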

  10. Controlling the digital transfer process

    NASA Astrophysics Data System (ADS)

    Brunner, Felix

    1997-02-01

    The accuracy of today's color management systems fails to satisfy the requirements of the graphic arts market. A first explanation for this is that the color calibration charts on which these systems rely are, for print-technical reasons, subject to color deviations and inconsistencies. A second reason is that colorimetry describes the human visual perception of color differences and has no direct relation to the rendering technology itself of a proofing or printing device. The author explains that only firm process control of the many parameters in offset printing, by means of a system such as the EUROSTANDARD System Brunner, can lead to accurate and consistent calibration of scanner, display, proof and print. The same principles hold for the quality management of digital presses.

  11. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Techniques and software documentation for the digital enhancement of radiographs are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
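
    As an aside on the recursive-versus-FFT point made above, a first-order recursive (IIR) smoothing filter applied along image rows needs only one multiply-accumulate per pixel per pass. The sketch below is illustrative only and is not the matched-filter class described in the report; the coefficient alpha is an assumption.

      import numpy as np

      def recursive_lowpass_rows(img, alpha=0.3):
          """First-order recursive low-pass applied along each image row.

          y[n] = alpha * x[n] + (1 - alpha) * y[n-1]; a forward pass followed by a
          backward pass gives an approximately zero-phase smoothing filter.
          """
          out = np.asarray(img, dtype=float).copy()
          for row in out:
              for n in range(1, len(row)):                 # forward recursion
                  row[n] = alpha * row[n] + (1 - alpha) * row[n - 1]
              for n in range(len(row) - 2, -1, -1):        # backward recursion
                  row[n] = alpha * row[n] + (1 - alpha) * row[n + 1]
          return out

      print(recursive_lowpass_rows(np.eye(5) * 10.0))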

  12. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f(s)). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
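
    For illustration, a minimal BPSK demodulator following the ideas outlined above (coherent mixing to baseband, integrate-and-dump over each bit, hard-decision slicing) might look like the sketch below. It assumes a known, phase-aligned carrier and ideal timing, which a real receiver would have to recover; it is not the project's implementation.

      import numpy as np

      def bpsk_demodulate(x, fs, fc, bit_rate):
          """Recover bits from a sampled BPSK signal assuming a known, phase-aligned carrier."""
          n = np.arange(len(x))
          baseband = x * np.cos(2 * np.pi * fc * n / fs)    # coherent mixing to baseband
          sps = int(fs / bit_rate)                          # samples per bit
          nbits = len(x) // sps
          # Integrate-and-dump over each bit interval, then hard-decision slicing
          sums = baseband[:nbits * sps].reshape(nbits, sps).sum(axis=1)
          return (sums > 0).astype(int)

      # Example: modulate and recover a short bit pattern
      fs, fc, bit_rate = 48_000, 6_000, 1_200
      bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
      sps = fs // bit_rate
      t = np.arange(len(bits) * sps)
      carrier = np.cos(2 * np.pi * fc * t / fs)
      signal = np.repeat(2 * bits - 1, sps) * carrier       # 0/180-degree phase keying
      print(bpsk_demodulate(signal, fs, fc, bit_rate))      # -> [1 0 1 1 0 0 1 0]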

  13. A Methodology and Implementation for Annotating Digital Images for Context-appropriate Use in an Academic Health Care Environment

    PubMed Central

    Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.

    2004-01-01

    Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971

  14. Development and testing of controller performance evaluation methodology for multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1991-01-01

    Described here is the development and implementation of an on-line, near-real-time controller performance evaluation (CPE) capability. Briefly discussed are the structure of data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.

  15. VLSI systems design for digital signal processing. Volume 1 - Signal processing and signal processors

    NASA Astrophysics Data System (ADS)

    Bowen, B. A.; Brown, W. R.

    This book is concerned with the design of digital signal processing systems which utilize VLSI (Very Large Scale Integration) components. The presented material is intended for use by electrical engineers at the senior undergraduate or introductory graduate level. It is the purpose of this volume to present an overview of the important elements of background theory, processing techniques, and hardware evolution. Digital signals are considered along with linear systems and digital filters, taking into account the transform analysis of deterministic signals, a statistical signal model, time domain representations of discrete-time linear systems, and digital filter design techniques and implementation issues. Attention is given to aspects of detection and estimation, digital signal processing algorithms and techniques, issues which must be resolved in a processor design methodology, the fundamental concepts of high performance processing in terms of two early super computers, and the extension of these concepts to more recent processors.

  16. Digital signal processing for radioactive decay studies

    SciTech Connect

    Miller, D.; Madurga, M.; Paulauskas, S. V.; Ackermann, D.; Heinz, S.; Hessberger, F. P.; Hofmann, S.; Grzywacz, R.; Miernik, K.; Rykaczewski, K.; Tan, H.

    2011-11-30

    The use of digital acquisition systems has been instrumental in the investigation of proton and alpha emitting nuclei. Recent developments extend the sensitivity and breadth of the application. The digital signal processing capabilities, used predominantly by UT/ORNL for decay studies, include digitizers with decreased dead time, increased sampling rates, and new innovative firmware. Digital techniques and these improvements are furthermore applicable to a range of detector systems. Improvements in experimental sensitivity for measurements of alpha and beta-delayed neutron emitters, as well as the next generation of superheavy experiments, are discussed.

  17. How Digital Image Processing Became Really Easy

    NASA Astrophysics Data System (ADS)

    Cannon, Michael

    1988-02-01

    In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.

  18. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies for processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the
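
    A highly simplified illustration of reference-pixel correction in the digital domain (subtracting a per-row statistic formed from non-illuminated edge columns) is sketched below. It is not the HHT-DPS method of the record, and the array geometry, reference-column width and noise values are assumptions.

      import numpy as np

      def reference_pixel_correct(frame, ref_width=4):
          """Subtract a per-row estimate of common-mode noise derived from the
          non-illuminated reference columns at both edges of the detector array."""
          ref = np.hstack([frame[:, :ref_width], frame[:, -ref_width:]])
          row_offset = ref.mean(axis=1, keepdims=True)      # spatial statistic per row
          return frame - row_offset

      # Synthetic frame with a row-wise drift that the reference pixels also see
      frame = np.random.normal(1000, 5, (16, 32)) + np.arange(16)[:, None]
      print(reference_pixel_correct(frame).mean(axis=1).round(2))   # drift removed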

  19. The Process of Digitizing of Old Globe

    NASA Astrophysics Data System (ADS)

    Ambrožová, K.; Havrlanta, J.; Talich, M.; Böhm, O.

    2016-06-01

    This paper describes the process of digitizing old globes, which makes it possible to use the globes in digital form. The created digital models are available to the general public through modern technology on the Internet. This provides an opportunity to study old globes held in various historical collections and prevents damage to the originals. Another benefit of digitization is the possibility of comparing different models both among themselves and with current map data by increasing the transparency of individual layers. Digitization is carried out using a special device that allows globes with a diameter ranging from 5 cm to 120 cm to be digitized. This device can be easily disassembled and is fully mobile; therefore the globes can be digitized at the place of their storage. Image data of the globe surface are acquired by a digital camera firmly fastened to the device. The acquired image data are then georeferenced using a method of complex adjustment. The last step of digitization is publication of the final models, which is realized in two ways. The first option is as a 3D model through the JavaScript library Cesium or the Google Earth plug-in in a Web browser. The second option is as a georeferenced map using a Tile Map Service.

  20. Checking Fits With Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Davis, R. M.; Geaslen, W. D.

    1988-01-01

    Computer-aided video inspection of mechanical and electrical connectors is feasible. The report discusses work done on digital image processing for computer-aided interface verification (CAIV). Two kinds of components were examined: a mechanical mating flange and an electrical plug.

  1. Digital images in the map revision process

    NASA Astrophysics Data System (ADS)

    Newby, P. R. T.

    Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.

  2. CT Image Processing Using Public Digital Networks

    PubMed Central

    Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

    1984-01-01

    Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

  3. Process independent automated sizing methodology for current steering DAC

    NASA Astrophysics Data System (ADS)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.

    2015-10-01

    This study introduces a process-independent automated sizing methodology based on a general regression neural network (GRNN) for a current steering complementary metal-oxide semiconductor (CMOS) digital-to-analog converter (DAC) circuit. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling studies that consider a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. Eventually, a high-dimensional database was developed consisting of transistor sizes of DAC designs and corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set including the simulation results of ON-SEMI 1.5 µm and 0.5 µm technology parameters, and the test data were constituted with only the simulation results of TSMC 0.35 µm technology parameters that had not been applied to the GRNN for training beforehand. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of DAC static output specifications such as Differential Non-linearity error, Integral Non-linearity error, monotonicity and gain error as the inputs of the network.
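
    A GRNN prediction reduces to a kernel-weighted average of the training targets. The toy sketch below illustrates that idea only; the feature values, target widths and smoothing parameter are invented and do not correspond to the study's database.

      import numpy as np

      def grnn_predict(x_train, y_train, x_query, sigma=0.5):
          """General regression neural network (Nadaraya-Watson) prediction:
          a Gaussian-kernel-weighted average of the training targets."""
          d2 = ((x_train - x_query) ** 2).sum(axis=1)
          w = np.exp(-d2 / (2 * sigma ** 2))
          return (w @ y_train) / w.sum()

      # Toy example: predict a transistor width from two normalised spec errors
      x_train = np.array([[0.1, 0.2], [0.4, 0.1], [0.8, 0.7]])
      y_train = np.array([2.0, 3.5, 6.0])        # widths in micrometres (illustrative)
      print(round(grnn_predict(x_train, y_train, np.array([0.3, 0.2])), 2))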

  4. Digital Signal Processing Based Biotelemetry Receivers

    NASA Technical Reports Server (NTRS)

    Singh, Avtar; Hines, John; Somps, Chris

    1997-01-01

    This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM signals encoding signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.

  5. Digital database architecture and delineation methodology for deriving drainage basins, and a comparison of digitally and non-digitally derived numeric drainage areas

    USGS Publications Warehouse

    Dupree, Jean A.; Crowfoot, Richard M.

    2012-01-01

    The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)

  6. Digital signal processing for ionospheric propagation diagnostics

    NASA Astrophysics Data System (ADS)

    Rino, Charles L.; Groves, Keith M.; Carrano, Charles S.; Gunter, Jacob H.; Parris, Richard T.

    2015-08-01

    For decades, analog beacon satellite receivers have generated multifrequency narrowband complex data streams that could be processed directly to extract total electron content (TEC) and scintillation diagnostics. With the advent of software-defined radio, modern digital receivers generate baseband complex data streams that require intermediate processing to extract the narrowband modulation imparted to the signal by ionospheric structure. This paper develops and demonstrates a processing algorithm for digital beacon satellite data that will extract TEC and scintillation components. For algorithm evaluation, a simulator was developed to generate noise-limited multifrequency complex digital signal realizations with representative orbital dynamics and propagation disturbances. A frequency-tracking procedure is used to capture the slowly changing frequency component. Dynamic demodulation against the low-frequency estimate captures the scintillation. The low-frequency reference can be used directly for dual-frequency TEC estimation.

  7. Digital processing of Mariner 9 television data.

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Seidman, J. B.

    1973-01-01

    The digital image processing performed by the Image Processing Laboratory (IPL) at JPL in support of the Mariner 9 mission is summarized. The support is divided into the general categories of image decalibration (the removal of photometric and geometric distortions from returned imagery), computer cartographic projections in support of mapping activities, and adaptive experimenter support (flexible support to provide qualitative digital enhancements and quantitative data reduction of returned imagery). Among the tasks performed were the production of maximum discriminability versions of several hundred frames to support generation of a geodetic control net for Mars, and special enhancements supporting analysis of Phobos and Deimos images.

  8. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

  9. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  10. Research Methodologies and the Doctoral Process.

    ERIC Educational Resources Information Center

    Creswell, John W.; Miller, Gary A.

    1997-01-01

    Doctoral students often select one of four common research methodologies that are popular in the social sciences and education today: positivist; interpretive; ideological; and pragmatic. But choice of methodology also influences the student's choice of course work, membership of dissertation committee, and the form and structure of the…

  11. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with these kinds of problems, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios within integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability. PMID:22208397

  12. [Digital thoracic radiology: devices, image processing, limits].

    PubMed

    Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

    2001-09-01

    In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it is emphasized most. The other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors, as well as indirect flat-panel detectors and the system with four high-resolution CCD cameras. In the second part, the most important image processing operations are discussed: the gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part, the advantages and drawbacks of computed thoracic radiography are emphasized. The most important are the almost constantly good quality of the pictures and the possibilities of image processing. PMID:11567193

  13. A methodology for use of digital image correlation for hot mix asphalt testing

    NASA Astrophysics Data System (ADS)

    Ramos, Estefany

    Digital Image Correlation (DIC) is a relatively new technology which aids in the measurement of material properties without the need for installation of sensors. DIC is a noncontact measuring technique that requires the specimen to be marked with a random speckled pattern and to be photographed during the test. The photographs are then post-processed based on the location of the pattern throughout the test. DIC can aid in calculating properties that would otherwise be too difficult to measure even with other measuring instruments. The objective of this thesis is to discuss the methodology and validate the use of DIC in different hot mix asphalt (HMA) tests, such as the Overlay Tester (OT) Test, the Indirect Tensile (IDT) Test, and the Semicircular Bending (SCB) Test. The DIC system provides displacements and strains on any visible surface. Properly calibrated 2-D or 3-D DIC data can be used to understand the complex stress and strain distributions and the modes of the initiation and propagation of cracks. The use of this observational method will lead to further understanding of the complex boundary conditions of the different tests, and therefore allow it to be implemented in the analysis of other materials. The use of digital image correlation will bring insight and knowledge into what is happening during a test.
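
    The core DIC step, locating a speckle subset of the reference image within the deformed image, can be illustrated with a brute-force normalised cross-correlation search. The sketch below is illustrative only (integer-pixel resolution, synthetic speckle); production DIC software adds subpixel interpolation and subset shape functions.

      import numpy as np

      def dic_subset_displacement(ref_img, def_img, center, half=10, search=5):
          """Estimate integer-pixel displacement of a speckle subset by maximising the
          normalised cross-correlation over a small search window."""
          r, c = center
          subset = ref_img[r - half:r + half + 1, c - half:c + half + 1].astype(float)
          subset = (subset - subset.mean()) / subset.std()
          best, best_uv = -np.inf, (0, 0)
          for du in range(-search, search + 1):
              for dv in range(-search, search + 1):
                  cand = def_img[r + du - half:r + du + half + 1,
                                 c + dv - half:c + dv + half + 1].astype(float)
                  cand = (cand - cand.mean()) / cand.std()
                  score = (subset * cand).mean()
                  if score > best:
                      best, best_uv = score, (du, dv)
          return best_uv

      # Example: a synthetic speckle image shifted by (2, -1) pixels
      rng = np.random.default_rng(0)
      ref = rng.random((64, 64))
      deformed = np.roll(ref, shift=(2, -1), axis=(0, 1))
      print(dic_subset_displacement(ref, deformed, center=(32, 32)))  # -> (2, -1)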

  14. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
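
    A hedged sketch of the areal-fraction estimate described above, using Otsu thresholding in Python rather than the ImageJ workflow of the paper; the synthetic image and threshold choice are assumptions:

```python
# Segment the bright component of a scanned hand-sample image with a global
# threshold, then estimate its areal fraction and count connected fragments.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

rng = np.random.default_rng(1)
scan = rng.normal(0.3, 0.05, (400, 400))
scan[100:180, 50:300] += 0.4                 # synthetic bright clasts

mask = scan > threshold_otsu(scan)           # segmented component
areal_fraction = mask.mean()                 # component area / total area
n_fragments = label(mask).max()              # number of connected fragments
print(f"areal fraction = {areal_fraction:.3f}, fragments = {n_fragments}")
```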

  15. Fundamental Concepts of Digital Image Processing

    DOE R&D Accomplishments Database

    Twogood, R. E.

    1983-03-01

    The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

  16. Weighting in digital synthetic aperture radar processing

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1979-01-01

    Weighting is employed in synthetic aperture radar (SAR) processing to reduce the sidelobe response at the expense of peak center response height and mainlobe resolution. The weighting effectiveness in digital processing depends not only on the choice of weighting function, but also on the fineness of sampling and quantization, on the time-bandwidth product, on the quadratic phase error, and on the azimuth antenna pattern. The results of simulations conducted to determine the effect of these parameters on azimuth weighting effectiveness are presented. In particular, it is shown that the multilook capabilities of future SAR systems may obviate the need to consider the antenna pattern, and that azimuth time-bandwidth products of over 200 are probably required before the digital results begin to approach the ideal results.
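
    The basic trade-off can be reproduced numerically: tapering the aperture weights lowers the peak sidelobe of the compressed response. The sketch below compares uniform and Hamming weighting and ignores the quantization, phase-error, and antenna-pattern effects treated in the paper:

```python
# Compare the peak sidelobe of the compressed response for uniform and Hamming
# aperture weighting; sampling, quantization and phase-error effects are omitted.
import numpy as np

def peak_sidelobe_db(weights, oversample=16):
    """Peak sidelobe level (dB below the peak) of the weighted response."""
    spec = np.abs(np.fft.fft(weights, len(weights) * oversample))
    r = 20 * np.log10(spec / spec.max())
    k = 1
    while r[k + 1] < r[k]:            # walk down the mainlobe to its first null
        k += 1
    return r[k:len(r) // 2].max()     # highest lobe beyond the mainlobe

N = 256
print("uniform  :", round(peak_sidelobe_db(np.ones(N)), 1), "dB")
print("hamming  :", round(peak_sidelobe_db(np.hamming(N)), 1), "dB")
```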

  17. REVIEW ARTICLE: Spectrophotometric applications of digital signal processing

    NASA Astrophysics Data System (ADS)

    Morawski, Roman Z.

    2006-09-01

    Spectrophotometry is more and more often the method of choice not only in analysis of (bio)chemical substances, but also in the identification of physical properties of various objects and their classification. The applications of spectrophotometry include such diversified tasks as monitoring of optical telecommunications links, assessment of eating quality of food, forensic classification of papers, biometric identification of individuals, detection of insect infestation of seeds and classification of textiles. In all those applications, large volumes of data generated by spectrophotometers are processed by various digital means in order to extract measurement information. The main objective of this paper is to review the state-of-the-art methodology for digital signal processing (DSP) when applied to data provided by spectrophotometric transducers and spectrophotometers. First, a general methodology of DSP applications in spectrophotometry, based on DSP-oriented models of spectrophotometric data, is outlined. Then, the most important classes of DSP methods for processing spectrophotometric data—the methods for DSP-aided calibration of spectrophotometric instrumentation, the methods for the estimation of spectra on the basis of spectrophotometric data, the methods for the estimation of spectrum-related measurands on the basis of spectrophotometric data—are presented. Finally, the methods for preprocessing and postprocessing of spectrophotometric data are overviewed. Throughout the review, the applications of DSP are illustrated with numerous examples related to broadly understood spectrophotometry.

  18. Digital signal processing the Tevatron BPM signals

    SciTech Connect

    Cancelo, G.; James, E.; Wolbers, S.; /Fermilab

    2005-05-01

    The Beam Position Monitor (TeV BPM) readout system at Fermilab's Tevatron has been updated and is currently being commissioned. The new BPMs use new analog and digital hardware to achieve better beam position measurement resolution. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton measurements. The signals provided by the two ends of the BPM pickups are processed by analog band-pass filters and sampled by 14-bit ADCs at 74.3 MHz. A crucial part of this work has been the design of digital filters that process the signal. This paper describes the digital processing and estimation techniques used to optimize the beam position measurement. The BPM electronics must operate in narrow-band and wide-band modes to enable measurements of closed-orbit and turn-by-turn positions. The filtering and timing conditions of the signals are tuned accordingly for the operational modes. The analysis and the optimized result for each mode are presented.
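
    A conceptual sketch of narrow-band position estimation from two pickup signals is given below; the band edges, filter order, and difference-over-sum scale factor are invented for illustration and are not the Tevatron BPM parameters:

```python
# Band-pass filter the two pickup signals around an assumed analysis band and
# form the standard difference-over-sum position estimate.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 74.3e6                                    # ADC rate quoted in the abstract
f0 = 7.5e6                                     # assumed analysis band centre
b, a = butter(4, [0.9 * f0, 1.1 * f0], btype="bandpass", fs=fs)

t = np.arange(4096) / fs
plate_a = 1.2 * np.sin(2 * np.pi * f0 * t) + 0.05 * np.random.randn(t.size)
plate_b = 0.8 * np.sin(2 * np.pi * f0 * t) + 0.05 * np.random.randn(t.size)

A = np.abs(filtfilt(b, a, plate_a)).mean()     # crude amplitude estimates
B = np.abs(filtfilt(b, a, plate_b)).mean()
k_mm = 10.0                                    # assumed pickup sensitivity (mm per unit)
print(f"estimated position: {k_mm * (A - B) / (A + B):.2f} mm")
```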

  19. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPEG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed pattern noise and sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
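
    Two of the color-processing blocks named above can be sketched as follows; the grey-world white balance rule and the power-law gamma of 2.2 are assumptions for illustration, not the specific algorithms of the simulation:

```python
# Grey-world white balance followed by power-law gamma correction on a linear
# RGB image with a deliberate colour cast.
import numpy as np

def grey_world_white_balance(rgb):
    """Scale each channel so the image mean is neutral grey."""
    gains = rgb.mean() / rgb.mean(axis=(0, 1))
    return np.clip(rgb * gains, 0.0, 1.0)

def gamma_correct(rgb, gamma=2.2):
    """Encode a linear image for display with a simple power-law curve."""
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

linear_image = np.random.rand(64, 64, 3) * np.array([0.9, 0.7, 0.5])
display_image = gamma_correct(grey_world_white_balance(linear_image))
```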

  20. Parallel digital signal processing architectures for image processing

    NASA Astrophysics Data System (ADS)

    Kshirsagar, Shirish P.; Hartley, David A.; Harvey, David M.; Hobson, Clifford A.

    1994-10-01

    This paper describes research into a high-speed image processing system using parallel digital signal processors for the processing of electro-optic images. The objective of the system is to reduce the processing time of non-contact inspection problems, including industrial and medical applications. A single processor cannot deliver the processing power required by these applications; hence, a MIMD system is designed and constructed to enable fast processing of electro-optic images. The Texas Instruments TMS320C40 digital signal processor is used due to its high-speed floating point CPU and its support for the parallel processing environment. A custom designed VISION bus is provided to transfer images between processors. The system is being applied to solder joint inspection of high technology printed circuit boards.

  1. Digital Signal Processing in the GRETINA Spectrometer

    NASA Astrophysics Data System (ADS)

    Cromaz, Mario

    2015-10-01

    Developments in the segmentation of large-volume HPGe crystals have enabled the development of high-efficiency gamma-ray spectrometers which have the ability to track the path of gamma rays scattering through the detector volume. This technology has been successfully implemented in the GRETINA spectrometer, whose high efficiency and ability to perform precise event-by-event Doppler correction have made it an important tool in nuclear spectroscopy. Tracking has required the spectrometer to employ a fully digital signal processing chain. Each of the system's 1120 channels is digitized by 100 MHz, 14-bit flash ADCs. Filters that provide timing and high-resolution energies are implemented on local FPGAs acting on the ADC data streams, while interaction point locations and tracks, derived from the trace on each detector segment, are calculated in real time on a computing cluster. In this presentation we will give a description of GRETINA's digital signal processing system, the impact of design decisions on system performance, and a discussion of possible future directions as we look towards developing larger spectrometers such as GRETA with full 4π solid angle coverage. This work was supported by the Office of Science in the Department of Energy under grant DE-AC02-05CH11231.

  2. C language algorithms for digital signal processing

    SciTech Connect

    Embree, P.M.; Kimble, B.

    1991-01-01

    The use of the C programming language to construct digital signal-processing (DSP) algorithms for operation on high-performance personal computers is described in a textbook for engineering students. Chapters are devoted to the fundamental principles of DSP, basic C programming techniques, user-interface and disk-storage routines, filtering routines, discrete Fourier transforms, matrix and vector routines, and image-processing routines. Also included is a floppy disk containing a library of standard C mathematics, character-string, memory-allocation, and I/O functions; a library of DSP functions; and several sample DSP programs. 83 refs.

  3. Applications of Digital Image Processing 11

    NASA Technical Reports Server (NTRS)

    Cho, Y. -C.

    1988-01-01

    A new technique, digital image velocimetry, is proposed for the measurement of instantaneous velocity fields of time dependent flows. A time sequence of single-exposure images of seed particles is captured with a high-speed camera, and a finite number of the single-exposure images are sampled within a prescribed period in time. The sampled images are then digitized on an image processor, enhanced, and superimposed to construct an image which is equivalent to a multiple exposure image used in both laser speckle velocimetry and particle image velocimetry. The superimposed image and a single-exposure image are digitally Fourier transformed for extraction of information on the velocity field. A great enhancement of the dynamic range of the velocity measurement is accomplished through the new technique by manipulating the Fourier transform of both the single-exposure image and the superimposed image. Also the direction of the velocity vector is unequivocally determined. With the use of a high-speed video camera, the whole process from image acquisition to velocity determination can be carried out electronically; thus this technique can be developed into a real-time capability.
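
    The underlying Fourier-correlation step can be illustrated on two synthetic exposures; note that the paper's own scheme manipulates the transform of a superimposed multi-exposure image, which this simplified sketch does not reproduce:

```python
# FFT-based cross-correlation of two synthetic particle-image exposures; the
# correlation peak gives the (circular) displacement between them.
import numpy as np

rng = np.random.default_rng(2)
frame1 = (rng.random((64, 64)) > 0.97).astype(float)    # sparse seed particles
frame2 = np.roll(frame1, shift=(2, 4), axis=(0, 1))     # uniform displacement

corr = np.fft.ifft2(np.fft.fft2(frame1).conj() * np.fft.fft2(frame2)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
dy, dx = [d if d <= corr.shape[0] // 2 else d - corr.shape[0] for d in (dy, dx)]
print("estimated displacement:", dy, dx)                # -> 2 4
```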

  4. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  5. Digital signal processing in acoustics. I

    NASA Astrophysics Data System (ADS)

    Davies, H.; McNeil, D. J.

    1985-11-01

    Digital signal processing techniques have gained steadily in importance over the past few years in many areas of science and engineering and have transformed the character of instrumentation used in laboratory and plant. This is particularly marked in acoustics, which has both benefited from the developments in signal processing and provided significant stimulus for these developments. As a result acoustical techniques are now used in a very wide range of applications and acoustics is one area in which digital signal processing is exploited to its limits. For example, the development of fast algorithms for computing Fourier transforms and the associated developments in hardware have led to remarkable advances in the use of spectral analysis as a means of investigating the nature and characteristics of acoustic sources. Speech research has benefited considerably in this respect, and, in a rather more technological application, spectral analysis of machinery noise provides information about changes in machine condition which may indicate imminent failure. More recently the observation that human and animal muscles emit low intensity noise suggests that spectral analysis of this noise may yield information about muscle structure and performance.
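
    As a generic illustration of the spectral analysis mentioned above (for example, of machinery noise), the sketch below estimates a power spectral density with Welch's method on a synthetic signal; all frequencies and rates are invented:

```python
# Welch power spectral density of a synthetic machinery-noise signal; tonal
# components stand out against the broadband noise floor.
import numpy as np
from scipy.signal import welch

fs = 10_000                                        # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
signal = (np.sin(2 * np.pi * 120 * t)              # shaft-related tone
          + 0.3 * np.sin(2 * np.pi * 1850 * t)     # synthetic "defect" tone
          + 0.5 * np.random.randn(t.size))

freqs, psd = welch(signal, fs=fs, nperseg=2048)
print("strongest component near", round(freqs[np.argmax(psd)], 1), "Hz")
```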

  6. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  7. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing and related areas. The identification can be done by transforming the gray levels so that the gray level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. Researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation with time complexity O(max(m,n)), where m and n are the numbers of gray levels of the input picture and the reference picture, respectively. Using a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m^3 x n). The algorithm partition problem, as an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
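
    The grey-level matching idea can be illustrated with a standard CDF-based histogram matching routine, which is not the packing method or the VLSI architecture proposed in the paper:

```python
# Map the grey levels of one picture onto another so their histograms match,
# making the two differently lit views of the same scene comparable.
import numpy as np
from skimage.exposure import match_histograms

rng = np.random.default_rng(3)
scene = rng.integers(0, 256, (128, 128)).astype(np.uint8)
darker = (scene * 0.6 + 10).astype(np.uint8)        # same scene, different lighting

matched = match_histograms(darker, scene)           # grey-level transformation
print("means before/after/reference:", darker.mean(), matched.mean(), scene.mean())
```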

  8. Applicability of ACR breast dosimetry methodology to a digital mammography system

    SciTech Connect

    Tomon, John J.; Johnson, Thomas E.; Swenson, Kristin N.; Schauer, David A.

    2006-03-15

    Determination of mean glandular dose (MGD) to breast tissue is an essential aspect of mammography equipment evaluations and exposure controls. The American College of Radiology (ACR) Quality Control Manual outlines the procedure for MGD determination in screen-film mammography based upon conversions of entrance skin exposures (ESEs) measured with an ionization chamber (IC). The development of digital mammography has increased with the demand for improved object resolution and tissue contrast. This change in image receptor from screen-film to a solid-state detector has led to questions about the applicability of the ACR MGD methodology to digital mammography. This research has validated the applicability of the ACR MGD methodology to digital mammography in the GE digital mammography system Senographe 2000D®. MGD was determined using light output measurements from thermoluminescent dosimeters (MGD_TL), exposure measurements from an IC (MGD_IC) and conversion factors from the ACR Mammography Quality Control Manual. MGD_TL and MGD_IC data indicate that there is a statistically significant difference between the two measurements with the Senographe 2000D®. However, the applicability of the ACR's methodology was validated by calculating MGD at various depths in a 50/50 breast phantom. Additionally, the results of backscatter measurements from the image receptors of both mammography modalities indicate there is a difference (all P values <0.001) in the radiation backscattered from each image receptor.

  9. Seismic Rayleigh Wave Digital Processing Technology

    NASA Astrophysics Data System (ADS)

    Jie, Li

    2013-04-01

    In Rayleigh wave exploration, the digital processing of the data plays a very important role and directly affects the interpretation results. The use of accurate processing software and effective methods in Rayleigh wave exploration therefore has important theoretical and practical significance. Previously, the Rayleigh wave dispersion curve was obtained by one-dimensional phase analysis. This method requires the channel spacing to be less than the effective wavelength, and even a minimal phase error causes large changes in the computed Rayleigh wave phase velocity. The damped least squares method is a locally linearized model, so the inversion objective function can easily fail to reach the global optimal solution. The methods and technology used in the past are therefore difficult to apply to the requirements of current Rayleigh wave exploration. This study focused on the technologies and algorithms for F-K domain dispersion curve extraction and GA global non-linear inversion, combined with the influence of the Rayleigh wave data acquisition parameters and characteristics; the design of Rayleigh wave exploration data processing software and the associated processing technology research were completed. Firstly, the article describes the theoretical basis of the Rayleigh wave method, which also underlies the subsequent processing, including a theoretical proof of the existence of Rayleigh wave dispersion in layered strata. Secondly, F-K domain dispersion curve extraction tests showed that the method overcomes the deficiencies of one-dimensional digital processing and makes full use of the information in multi-channel Rayleigh wave records. GA global non-linear inversion indicated that the inversion does not easily get trapped in a local optimal solution. Thirdly, some examples illustrate the characteristics of each mode's Rayleigh wave dispersion curve in the X-T domain, and tests demonstrated the impact on the extraction of dispersion curves. Parameters change example (including the X
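
    A hedged sketch of the F-K domain step is shown below: a multi-channel record is transformed with a 2-D FFT into frequency-wavenumber space, where dispersion curves appear as energy ridges; the geometry and the synthetic record are invented, and ridge picking is omitted:

```python
# 2-D FFT of a multi-channel record (time x offset) into the f-k domain, where
# dispersion curves appear as ridges of spectral amplitude.
import numpy as np

dt, dx = 0.001, 2.0                    # sample interval (s), receiver spacing (m)
nt, nx = 1024, 48                      # samples per trace, number of channels
record = np.random.randn(nt, nx)       # stand-in for a field shot record

fk = np.fft.fftshift(np.fft.fft2(record))
freqs = np.fft.fftshift(np.fft.fftfreq(nt, dt))        # Hz (axis 0)
wavenumbers = np.fft.fftshift(np.fft.fftfreq(nx, dx))  # cycles/m (axis 1)
amplitude = np.abs(fk)                 # ridges of this panel trace dispersion curves
print(amplitude.shape, freqs.max(), wavenumbers.max())
```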

  10. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview of the basic techniques used to process Landsat images with a digital computer, and the VICAR image processing software developed at JPL and available to users through the NASA sponsored COSMIC computer program distribution center is presented. Examples of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections are given. Examples are illustrated by Landsat scenes of the Andes mountains and Altyn-Tagh fault zone in China before and after contrast enhancement and classification of land use in Portland, Oregon. The VICAR image processing software system which consists of a language translator that simplifies execution of image processing programs and provides a general purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs is described.
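
    A simple example of the subjective contrast enhancement mentioned above is a percentile-based linear stretch; the percentile limits are an assumption, not the VICAR defaults:

```python
# Linear contrast stretch of a single band between its 2nd and 98th percentiles.
import numpy as np

def percentile_stretch(band, lo=2, hi=98):
    """Rescale a band to 0-255 between the given percentile limits."""
    p_lo, p_hi = np.percentile(band, [lo, hi])
    stretched = np.clip((band - p_lo) / (p_hi - p_lo), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)

band = np.random.gamma(2.0, 20.0, (512, 512))   # stand-in for a Landsat band
display = percentile_stretch(band)
```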

  11. Wavelet processing techniques for digital mammography

    NASA Astrophysics Data System (ADS)

    Laine, Andrew F.; Song, Shuwu

    1992-09-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale space feature analysis. Similar to traditional coarse to fine matching strategies, the radiologist may first choose to look for coarse features (e.g., dominant mass) within low frequency levels of a wavelet transform and later examine finer features (e.g., microcalcifications) at higher frequency levels. In addition, features may be extracted by applying geometric constraints within each level of the transform. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency, results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet representations, enhanced by linear, exponential and constant weight functions through scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).
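
    The scale-space enhancement idea can be sketched with a 2-D wavelet decomposition whose detail levels are reweighted before reconstruction; the 'db4' wavelet and the linear weights are illustrative assumptions, not the paper's enhancement functions:

```python
# 2-D wavelet decomposition of an image, level-dependent reweighting of the
# detail coefficients, and reconstruction of the enhanced image.
import numpy as np
import pywt

image = np.random.rand(256, 256)                 # stand-in for a mammogram
coeffs = pywt.wavedec2(image, wavelet="db4", level=4)

enhanced = [coeffs[0]]                           # keep the coarse approximation
for lvl, details in enumerate(coeffs[1:], start=1):   # finer levels come last
    weight = 1.0 + 0.5 * lvl                     # emphasise finer detail
    enhanced.append(tuple(weight * d for d in details))

result = pywt.waverec2(enhanced, wavelet="db4")
```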

  12. An investigation of radiometer design using digital processing techniques

    NASA Technical Reports Server (NTRS)

    Lawrence, R. W.

    1981-01-01

    The use of digital signal processing techniques in Dicke switching radiometer design was investigated. The general approach was to develop an analytical model of the existing analog radiometer and identify factors which adversely affect its performance. A digital processor was then proposed to verify the feasibility of using digital techniques to minimize these adverse effects and improve the radiometer performance. Analysis and preliminary test results comparing the digital and analog processing approaches in radiometer design are presented.

  13. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  14. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.

  15. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the null X (mean), R (Range), X (individual observations), MR (moving…
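
    For one of the chart types listed above, the X (individual observations) chart, the control limits can be computed as in the sketch below; the 2.66 factor is the standard moving-range constant and the data are invented:

```python
# Individuals (X) chart with moving-range control limits for a short series of
# invented weekly scores.
import numpy as np

scores = np.array([72, 75, 71, 74, 78, 73, 70, 76, 74, 75], dtype=float)
moving_range = np.abs(np.diff(scores))

centre = scores.mean()
mr_bar = moving_range.mean()
ucl = centre + 2.66 * mr_bar          # upper control limit
lcl = centre - 2.66 * mr_bar          # lower control limit
print(f"centre {centre:.1f}, limits ({lcl:.1f}, {ucl:.1f})")

out_of_control = scores[(scores > ucl) | (scores < lcl)]
```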

  16. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    A Controller Performance Evaluation (CPE) methodology for multi-input/multi-output digital control systems was developed and tested on an aeroelastic wind-tunnel model. Modern signal processing methods were used to implement control laws and to acquire time domain data of the whole system (controller and plant) from which appropriate transfer matrices of the control system could be generated. Matrix computational procedures were used to calculate singular values of return-difference matrices at the plant input and output points to evaluate the performance of the control system. The CPE procedures effectively identified potentially destabilizing controllers and confirmed the satisfactory performance of stabilizing ones.
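
    Conceptually, the CPE computation can be sketched by evaluating the minimum singular value of the return-difference matrix I + L(jw) over frequency; the 2x2 loop transfer matrix below is invented purely to show the calculation:

```python
# Minimum singular value of the return-difference matrix I + L(jw) over
# frequency, a multivariable stability-margin indicator.
import numpy as np

def loop_transfer(w):
    """Invented 2x2 open-loop transfer matrix L(jw)."""
    s = 1j * w
    return np.array([[2.0 / (s + 1.0), 0.5 / (s + 2.0)],
                     [0.2 / (s + 1.5), 1.5 / (s + 0.8)]])

freqs = np.logspace(-2, 2, 200)                       # rad/s
margins = [np.linalg.svd(np.eye(2) + loop_transfer(w), compute_uv=False).min()
           for w in freqs]
print(f"minimum singular value of I + L(jw): {min(margins):.3f}")
```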

  17. [Process of perversion. Methodological and clinical study].

    PubMed

    Marchais, P

    1975-07-01

    Studies in classical psychiatry and psychoanalysis have reduced perversions to pathological phenomena, progressively lessening the moral criterion. Applying a comprehensive method to the study of acquired perversions leads one to consider various levels of observation. Properties common to each of these levels then appear, making it possible to isolate a general process of perversion. This process, which keeps undeniable links with mental pathology, must however be differentiated from it, for it cannot necessarily be assimilated or reduced to the latter. Moreover, it entails notable consequences on the social and cultural level. PMID:1233902

  18. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  19. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  20. SOME CONCEPTS PERTAINING TO INVESTIGATIVE METHODOLOGY FOR SUBSURFACE PROCESS RESEARCH

    EPA Science Inventory

    Problems of investigative methodology comprise a critical and often preponderant element of research to delineate and quantitate processes which govern the transport and fate of pollutants in subsurface environments. Examination of several recent research studies illustrates that...

  1. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article will describe process changes made over a 4-year period as result of application of the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discuss further work needed to continue to refine this critical nursing care process. PMID:21964442

  2. On digital image processing technology and application in geometric measure

    NASA Astrophysics Data System (ADS)

    Yuan, Jiugen; Xing, Ruonan; Liao, Na

    2014-04-01

    Digital image processing is an emerging technology that has developed along with semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the digital image processing technique and its principles in measurement, compared with the traditional optical measurement method. It takes geometric measurement as an example and discusses the development trend of digital image processing technology from the perspective of technology application.

  3. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    NASA Astrophysics Data System (ADS)

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-01

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  4. Design methodology: edgeless 3D ASICs with complex in-pixel processing for pixel detectors

    SciTech Connect

    Fahim, Farah; Deptuch, Grzegorz W.; Hoff, James R.; Mohseni, Hooman

    2015-08-28

    The design methodology for the development of 3D integrated edgeless pixel detectors with in-pixel processing using Electronic Design Automation (EDA) tools is presented. A large area 3 tier 3D detector with one sensor layer and two ASIC layers containing one analog and one digital tier, is built for x-ray photon time of arrival measurement and imaging. A full custom analog pixel is 65μm x 65μm. It is connected to a sensor pixel of the same size on one side, and on the other side it has approximately 40 connections to the digital pixel. A 32 x 32 edgeless array without any peripheral functional blocks constitutes a sub-chip. The sub-chip is an indivisible unit, which is further arranged in a 6 x 6 array to create the entire 1.248cm x 1.248cm ASIC. Each chip has 720 bump-bond I/O connections, on the back of the digital tier to the ceramic PCB. All the analog tier power and biasing is conveyed through the digital tier from the PCB. The assembly has no peripheral functional blocks, and hence the active area extends to the edge of the detector. This was achieved by using a few flavors of almost identical analog pixels (minimal variation in layout) to allow for peripheral biasing blocks to be placed within pixels. The 1024 pixels within a digital sub-chip array have a variety of full custom, semi-custom and automated timing driven functional blocks placed together. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout. The methodology uses the Cadence design platform, however it is not limited to this tool.

  5. Process Architecture for Managing Digital Object Identifiers

    NASA Astrophysics Data System (ADS)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of procedures for creating and assigning DOIs and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating the DOIs. The flexibility of reserving the DOI allows data providers to embed and test the DOI in the data product metadata before formally registering it with EZID. The DOI update process allows changing any DOI metadata except the DOI name, unless the name has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  6. Design methodology for high-speed video processing system based on signal integrity analysis

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Zhang, Hao

    2009-07-01

    Owing to the high performance requirements of video processing systems and the shortcomings of conventional circuit design methods, a design methodology based on signal integrity (SI) theory is proposed for a high-speed video processing system built around TI's TMS320DM642 digital signal processor. With this methodology, the PCB stack-up and construction of the system, as well as the transmission line characteristic impedances, are first set and calculated with the impedance control tool Si8000. Some crucial signals, such as the SDRAM data lines, are then simulated and analyzed with IBIS models so that reasonable layout and routing rules can be established. Finally, the system's high-density PCB design is completed on the Cadence SPB 15.7 platform. The design result shows that this methodology can effectively restrain signal reflection, crosstalk, rail collapse noise and electromagnetic interference (EMI), thus significantly improving the stability of the system and shortening the development cycle.

  7. Digital signal processing using virtual instruments

    NASA Astrophysics Data System (ADS)

    Anderson, James A.; Korrapati, Raghu; Swain, Nikunja K.

    2000-08-01

    The area of test and measurement is changing rapidly because of recent developments in software and hardware. Test and measurement systems are increasingly becoming PC based. Most of these PC-based systems use a graphical programming language to design test and measurement modules called virtual instruments (VIs). These VIs provide visual representations of data or models and make understanding abstract concepts and algorithms easier. This allows users to express their ideas in a concise manner. One such virtual instrument package is LabVIEW from National Instruments Corporation of Austin, Texas. This software package is one of the first graphical programming products and is currently used in a number of academic institutions, industries, the Department of Defense, the Department of Energy, and the National Aeronautics and Space Administration for various test, measurement, and control applications. LabVIEW has an extensive built-in VI library that can be used to design and develop solutions for different applications. Besides using the built-in VI modules in LabVIEW, the user can easily design new VI modules. This paper discusses the use of LabVIEW to design and develop digital signal processing VI modules such as Fourier analysis and windowing. Instructors can use these modules to teach some signal processing concepts effectively.
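
    The Fourier analysis and windowing modules described above are LabVIEW VIs; the snippet below illustrates the same concept in Python (not LabVIEW), showing how a Hann window reduces the spectral leakage of an off-bin tone; all numbers are illustrative:

```python
# A Hann window reduces the spectral leakage of a tone that falls between FFT
# bins; the leakage is measured a fixed number of bins away from the peak.
import numpy as np

fs, n = 1000, 512
t = np.arange(n) / fs
tone = np.sin(2 * np.pi * 123.4 * t)                 # off-bin frequency

spectrum_rect = np.abs(np.fft.rfft(tone))
spectrum_hann = np.abs(np.fft.rfft(tone * np.hanning(n)))

def leakage_db(spectrum, offset=20):
    """Level 'offset' bins from the peak, relative to the peak (dB)."""
    peak = spectrum.argmax()
    return 20 * np.log10(spectrum[peak + offset] / spectrum[peak])

print("leakage, rectangular:", round(leakage_db(spectrum_rect), 1), "dB")
print("leakage, Hann window:", round(leakage_db(spectrum_hann), 1), "dB")
```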

  8. Design Methodology: ASICs with complex in-pixel processing for Pixel Detectors

    SciTech Connect

    Fahim, Farah

    2014-10-31

    The development of Application Specific Integrated Circuits (ASIC) for pixel detectors with complex in-pixel processing using Computer Aided Design (CAD) tools that are, themselves, mainly developed for the design of conventional digital circuits requires a specialized approach. Mixed signal pixels often require parasitically aware detailed analog front-ends and extremely compact digital back-ends with more than 1000 transistors in small areas below 100μm x 100μm. These pixels are tiled to create large arrays, which have the same clock distribution and data readout speed constraints as in, for example, micro-processors. The methodology uses a modified mixed-mode on-top digital implementation flow to not only harness the tool efficiency for timing and floor-planning but also to maintain designer control over compact parasitically aware layout.

  9. Digital-Difference Processing For Collision Avoidance.

    NASA Technical Reports Server (NTRS)

    Shores, Paul; Lichtenberg, Chris; Kobayashi, Herbert S.; Cunningham, Allen R.

    1988-01-01

    Digital system for automotive crash avoidance measures and displays difference in frequency between two sinusoidal input signals of slightly different frequencies. Designed for use with Doppler radars. Characterized as digital mixer coupled to frequency counter measuring difference frequency in mixer output. Technique determines target path mathematically. Used for tracking cars, missiles, bullets, baseballs, and other fast-moving objects.

  10. The Creation Process in Digital Art

    NASA Astrophysics Data System (ADS)

    Marcos, Adérito Fernandes; Branco, Pedro Sérgio; Zagalo, Nelson Troca

    The process behind the act of artistic creation, or the creation process, has been the subject of much debate and research for at least the last fifty years, even though art and beauty were already subjects of analysis for the ancient Greeks such as Plato and Aristotle. Even though it is intuitively a simple phenomenon, creativity, the human ability to generate innovation (new ideas, concepts, etc.), is in fact quite complex. It has been studied from the perspectives of behavioral and social psychology, cognitive science, artificial intelligence, philosophy, history, design research, digital art, and computational aesthetics, among others. In spite of many years of discussion and research there is no single, authoritative perspective or definition of creativity, i.e., there is no standardized measurement technique. The development process that supports the intellectual act of creation is usually described as a procedure in which the artist experiments with the medium, explores it with one or more techniques, changing shapes, forms and appearances, and, beyond time and space, seeks a way out to a clearing, i.e., envisages a path from intention to realization. Duchamp, in his lecture "The Creative Act", states that the artist is never alone with his/her artwork; there is always the spectator who will later react critically to the work of art. If the artist succeeds in transmitting his/her intentions, in terms of a message, emotion or feeling, to the spectator, then a form of aesthetic osmosis takes place through the inert matter (the medium) that enabled this communication or interaction to occur. The role of the spectator may gradually become more active by interacting with the artwork itself, possibly changing or becoming a part of it [2][4].

  11. The spatial structure of terrain - A process signal in satellite digital images

    NASA Technical Reports Server (NTRS)

    Craig, R. G.

    1984-01-01

    Pattern recognition procedures applied to Landsat imagery carry an implicit assumption that the digital data are independently distributed. That assumption is incorrect over virtually any terrain. Deviations from independence occur because slopes follow a systematic pattern of variation arising from the slope-forming processes. That pattern can be identified using the stochastic process methodology of Box and Jenkins. Angles of adjacent slopes are autocorrelated, and the bidirectional reflectance function transfers these systematic slope changes to the sensor. Imagery becomes autocorrelated through this transfer. Autocorrelation in the imagery can be removed through direct calculation from a digital elevation model or by use of stochastic process methodology. The latter has the advantage that the residuals are white noise, and it is applicable in any area, even where a D.E.M. is unavailable. The stochastic process signal can be used to study terrain processes.
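
    The whitening idea can be sketched with a simple AR(1) model, which is an assumed stand-in for the Box-Jenkins identification described in the paper; the synthetic slope series is invented:

```python
# Estimate a lag-1 (AR(1)) coefficient from a synthetic autocorrelated slope
# series and check that the model residuals are close to white noise.
import numpy as np

rng = np.random.default_rng(4)
n, phi_true = 2000, 0.7
slopes = np.zeros(n)
for i in range(1, n):                       # synthetic autocorrelated slope angles
    slopes[i] = phi_true * slopes[i - 1] + rng.normal(scale=2.0)

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

phi_hat = lag1_autocorr(slopes)             # AR(1) estimate
residuals = slopes[1:] - phi_hat * slopes[:-1]
print(f"phi = {phi_hat:.2f}, residual lag-1 autocorrelation = {lag1_autocorr(residuals):.3f}")
```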

  12. Closed orbit feedback with digital signal processing

    SciTech Connect

    Chung, Y.; Kirchman, J.; Lenkszus, F.

    1994-08-01

    The closed orbit feedback experiment conducted on the SPEAR using the singular value decomposition (SVD) technique and digital signal processing (DSP) is presented. The beam response matrix, defined as beam motion at beam position monitor (BPM) locations per unit kick by corrector magnets, was measured and then analyzed using SVD. Ten BPMs, sixteen correctors, and the eight largest SVD eigenvalues were used for closed orbit correction. The maximum sampling frequency for the closed loop feedback was measured at 37 Hz. Using the proportional and integral (PI) control algorithm with the gains K_p = 3 and K_I = 0.05 and the open-loop bandwidth corresponding to 1% of the sampling frequency, a correction bandwidth (-3 dB) of approximately 0.8 Hz was achieved. Time domain measurements showed that the response time of the closed loop feedback system for 1/e decay was approximately 0.25 second. This result implies approximately 100 Hz correction bandwidth for the planned beam position feedback system for the Advanced Photon Source storage ring with the projected 4-kHz sampling frequency.
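
    A hedged sketch of the SVD orbit-correction step follows, with random matrices standing in for the measured SPEAR response matrix and orbit; only the truncated pseudo-inverse solve is shown, not the PI loop or timing:

```python
# Truncated-SVD solve for corrector settings that cancel a measured orbit,
# keeping the eight largest singular values as in the experiment.
import numpy as np

rng = np.random.default_rng(5)
n_bpm, n_corr, n_keep = 10, 16, 8
R = rng.normal(size=(n_bpm, n_corr))          # response matrix: orbit per unit kick
orbit = rng.normal(scale=0.3, size=n_bpm)     # measured closed-orbit distortion

U, s, Vt = np.linalg.svd(R, full_matrices=False)
s_inv = np.where(np.arange(s.size) < n_keep, 1.0 / s, 0.0)   # truncate to 8 values
kicks = -(Vt.T * s_inv) @ (U.T @ orbit)       # corrector settings

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

print("rms orbit before/after:", rms(orbit), rms(orbit + R @ kicks))
```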

  13. Multilingual subjective methodology and evaluation of low-rate digital voice processors

    NASA Astrophysics Data System (ADS)

    Dimolitsas, Spiros; Corcoran, Franklin L.; Baraniecki, Marion R.; Phipps, John G., Jr.

    The methodology and results for a multilingual evaluation of source encoding algorithms operating at 16 kbit/s are presented. The evaluation was conducted in three languages (English, French, and Mandarin), using listener opinion subjective assessments to determine whether 'toll-quality' performance is possible at 16 kbit/s. The study demonstrated that toll-quality voice is indeed possible at 16 kbit/s, and that several of the methods evaluated are more robust under high bit error conditions than either 32- or 64-kbit/s encoding. Thus, 16-kbit/s voice coding technology is currently suitable for many applications within the public switched telephone network, including the next generation of digital circuit multiplication equipment, and integrated services digital network videotelephony.

  14. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  15. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  16. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Naveh, Arad

    1992-01-01

    The need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK) modulation is discussed. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. The design trade-offs in each portion of the modulator and demodulator subsystem are outlined.

  17. A Phenomenological Study of an Emergent National Digital Library, Part I: Theory and Methodological Framework

    ERIC Educational Resources Information Center

    Dalbello, Marija

    2005-01-01

    The activities surrounding the National Digital Library Program (NDLP) at the Library of Congress (1995-2000) are used to study institutional processes associated with technological innovation in the library context. The study identified modalities of successful innovation and the characteristics of creative decision making. Theories of social…

  18. Digital image processing: a primer for JVIR authors and readers: part 2: digital image acquisition.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-11-01

    This is the second installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first article of the series, we reviewed the fundamentals of digital image architecture. In this article, we describe the ways that an author can import digital images to the computer desktop. We explore the modern imaging network and explain how to import picture archiving and communications systems (PACS) images to the desktop. Options and techniques for producing digital hard copy film are also presented. PMID:14605101

  19. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. In order for educators (i.e., the designers) to create a compelling digital story, there are sets of processes introduced by experts. Nevertheless, the experts suggest a variety of processes to guide them, some of which are redundant. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be implemented in other multimedia materials that use the concept of DST.

  20. Digital signal processor and processing method for GPS receivers

    NASA Technical Reports Server (NTRS)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consists of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1, and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.
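
    The code-correlation concept can be sketched as below; a random ±1 sequence stands in for the real C/A code, and the carrier wipe-off, chip and phase advancers, and tracking loops of the patent are omitted:

```python
# Correlate received samples against delayed replicas of a spreading code; the
# correlation peak gives the code phase (delay).
import numpy as np

rng = np.random.default_rng(6)
code = rng.choice([-1.0, 1.0], size=1023)     # stand-in for a 1023-chip C/A code
true_delay = 345                              # chips

received = np.roll(code, true_delay) + rng.normal(scale=1.0, size=code.size)
correlation = np.array([np.dot(received, np.roll(code, d)) for d in range(code.size)])
print("estimated code delay:", int(np.argmax(correlation)), "chips  (true:", true_delay, ")")
```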

  1. Pedagogical reforms of digital signal processing education

    NASA Astrophysics Data System (ADS)

    Christensen, Michael

    The future of the engineering discipline is arguably predicated heavily upon appealing to the future generation, in all its sensibilities. The greatest burden in doing so, one might rightly believe, lies on the shoulders of the educators. In examining the causal means by which the profession arrived at such a state, one finds that the technical revolution, precipitated by global war, had, as its catalyst, institutions as expansive as the government itself to satisfy the demand for engineers, who, as a result of such an existential crisis, were taught predominantly theoretical underpinnings to address a finite purpose. By contrast, the modern engineer, having expanded upon this vision and adapted to an evolving society, is increasingly placed in the proverbial role of the worker who must don many hats: not solely a scientist, yet often an artist; not a businessperson alone, but neither financially naive; not always a representative, though frequently a collaborator. Inasmuch as change then serves as the only constancy in a global climate, therefore, the educational system - if it is to mimic the demands of the industry - is left with an inherent need for perpetual revitalization to remain relevant. This work aims to serve that end. Motivated by existing research in engineering education, an epistemological challenge is molded into the framework of the electrical engineer with emphasis on digital signal processing. In particular, it is investigated whether students are better served by a learning paradigm that tolerates and, when feasible, encourages error via a medium free of traditional adjudication. Through the creation of learning modules using the Adobe Captivate environment, a wide range of fundamental knowledge in signal processing is challenged within the confines of existing undergraduate courses. It is found that such an approach not only conforms to the research agenda outlined for the engineering educator, but also reflects an often neglected reality

  2. Methodology for utilizing CD distributions for optimization of lithographic processes

    NASA Astrophysics Data System (ADS)

    Charrier, Edward W.; Mack, Chris A.; Zuo, Qiang; Maslow, Mark J.

    1997-07-01

    As the critical dimension (CD) of optical lithography processes continues to decrease, the process latitude also decreases and CD control becomes more difficult. As this trend continues, lithography engineers will find that they require improved process optimization methods which take into account the random and systematic errors that are inherent in any manufacturing process. This paper shows the methodology of such an optimization method. Lithography simulation and analysis software, combined with experimental process error distributions, are used to perform optimizations of numerical aperture and partial coherence, as well as the selection of the best OPC pattern for a given mask.

  3. The Technology Transfer Process: Concepts, Framework and Methodology.

    ERIC Educational Resources Information Center

    Jolly, James A.

    This paper discusses the conceptual framework and methodology of the technology transfer process and develops a model of the transfer mechanism. This model is then transformed into a predictive model of technology transfer incorporating nine factors that contribute to the movement of knowledge from source to user. Each of these factors is examined…

  4. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
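    The steady-state pollution index described above can be sketched as a simple ratio; the symbols below are ours, chosen for illustration, and are not quoted from the report.

```latex
% Illustrative form of a steady-state pollution index (symbol names are assumptions):
I_{\text{pollution}} \;=\; \frac{\sum_i \dot{m}_{\text{pollutant},\,i}}{\sum_j \dot{m}_{\text{product},\,j}}
\qquad \left[\frac{\text{mass of pollutant}}{\text{mass of product}}\right]
```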

  5. Rethinking the Purposes and Processes for Designing Digital Portfolios

    ERIC Educational Resources Information Center

    Hicks, Troy; Russo, Anne; Autrey, Tara; Gardner, Rebecca; Kabodian, Aram; Edington, Cathy

    2007-01-01

    As digital portfolios become more prevalent in teacher education, the purposes and processes for creating them have become contested. Originally meant to be critical and reflective spaces for learning about multimedia and conceived as contributing to professional growth, research shows that digital portfolios are now increasingly being used to…

  6. Digital computer processing of X-ray photos

    NASA Technical Reports Server (NTRS)

    Nathan, R.; Selzer, R. H.

    1967-01-01

    Digital computers correct various distortions in medical and biological photographs. One of the principal methods of computer enhancement involves the use of a two-dimensional digital filter to modify the frequency spectrum of the picture. Another computer processing method is image subtraction.
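    A minimal numpy sketch of the two enhancement ideas mentioned, frequency-domain filtering and image subtraction, is shown below; the Gaussian mask shape and cutoff are arbitrary choices, not the paper's filters.

```python
import numpy as np

def filter_spectrum(image, cutoff=0.1):
    """Modify an image's frequency spectrum with a smooth low-pass mask
    (mask shape and cutoff are illustrative, not from the paper)."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    radius = np.sqrt(((y - rows / 2) / rows) ** 2 + ((x - cols / 2) / cols) ** 2)
    mask = np.exp(-(radius / cutoff) ** 2)          # smooth low-pass weighting
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

def subtract_images(image, reference):
    """Image subtraction: highlight differences between two registered exposures."""
    return image.astype(float) - reference.astype(float)
```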

  7. Powerful Practices in Digital Learning Processes

    ERIC Educational Resources Information Center

    Sørensen, Birgitte Holm; Levinsen, Karin Tweddell

    2015-01-01

    The present paper is based on two empirical research studies. The "Netbook 1:1" project (2009-2012), funded by the municipality of Gentofte and Microsoft Denmark, is complete, while "Students' digital production and students as learning designers" (2013-2015), funded by the Danish Ministry of Education, is ongoing. Both…

  8. Preliminary development of digital signal processing in microwave radiometers

    NASA Technical Reports Server (NTRS)

    Stanley, W. D.

    1980-01-01

    Topics covered involve a number of closely related tasks including: the development of several control loop and dynamic noise model computer programs for simulating microwave radiometer measurements; computer modeling of an existing stepped frequency radiometer in an effort to determine its optimum operational characteristics; investigation of the classical second order analog control loop to determine its ability to reduce the estimation error in a microwave radiometer; investigation of several digital signal processing unit designs; initiation of efforts to develop required hardware and software for implementation of the digital signal processing unit; and investigation of the general characteristics and peculiarities of digital processing of noiselike microwave radiometer signals.

  9. The teaching of computer programming and digital image processing in radiography.

    PubMed

    Allan, G L; Zylinski, J

    1998-06-01

    The increased use of digital processing techniques in Medical Radiations imaging modalities, along with the rapid advance in information technology, has resulted in a significant change in the delivery of radiographic teaching programs. This paper details a methodology used to concurrently educate radiographers in both computer programming and image processing. The students learn to program in Visual Basic for Applications (VBA), and the programming skills are contextualised by requiring the students to write a digital subtraction angiography (DSA) package. Program code generation and the image presentation interface are handled within the Microsoft Excel spreadsheet. The user-friendly nature of this common interface enables all students to readily begin program creation. The teaching of programming and image processing skills by this method may be readily generalised to other vocational fields where digital image manipulation is a professional requirement. PMID:9726504

  10. Effective DQE (eDQE) and speed of digital radiographic systems: An experimental methodology

    PubMed Central

    Samei, Ehsan; Ranger, Nicole T.; MacKenzie, Alistair; Honey, Ian D.; Dobbins, James T.; Ravin, Carl E.

    2009-01-01

    Prior studies on performance evaluation of digital radiographic systems have primarily focused on the assessment of the detector performance alone. However, the clinical performance of such systems is also substantially impacted by magnification, focal spot blur, the presence of scattered radiation, and the presence of an antiscatter grid. The purpose of this study is to evaluate an experimental methodology to assess the performance of a digital radiographic system, including those attributes, and to propose a new metric, effective detective quantum efficiency (eDQE), a candidate for defining the efficiency or speed of digital radiographic imaging systems. The study employed a geometric phantom simulating the attenuation and scatter properties of the adult human thorax and a representative indirect flat-panel-based clinical digital radiographic imaging system. The noise power spectrum (NPS) was derived from images of the phantom acquired at three exposure levels spanning the operating range of the clinical system. The modulation transfer function (MTF) was measured using an edge device positioned at the surface of the phantom, facing the x-ray source. Scatter measurements were made using a beam stop technique. The eDQE was then computed from these measurements, along with measures of phantom attenuation and x-ray flux. The MTF results showed notable impact from the focal spot blur, while the NPS depicted a large component of structured noise resulting from use of an antiscatter grid. The eDQE was found to be an order of magnitude lower than the conventional DQE. At 120 kVp, eDQE(0) was in the 8%–9% range, fivefold lower than DQE(0) at the same technique. The eDQE method yielded reproducible estimates of the system performance in a clinically relevant context by quantifying the inherent speed of the system, that is, the actual signal to noise ratio that would be measured under clinical operating conditions. PMID:19746814
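    For orientation, one commonly used form of the conventional detector DQE that the eDQE generalizes is given below; the eDQE of this study additionally folds in phantom attenuation, scatter fraction, and geometric effects measured at the phantom surface (the exact expression is defined in the paper and not reproduced here).

```latex
% Conventional detector DQE in a standard form (conventional symbols, not quoted from the paper):
% MTF(f): modulation transfer function, NNPS(f): normalized noise power spectrum,
% \bar{q}: incident photon fluence.
DQE(f) \;=\; \frac{MTF^{2}(f)}{\bar{q}\;NNPS(f)}
```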

  11. Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System

    NASA Technical Reports Server (NTRS)

    Yau, M.; Guarro, S.; Apostolakis, G.

    1993-01-01

    Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
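    The reverse-solution idea mentioned above, numerically inverting a subroutine's input-output relation, can be sketched with a generic Newton-Raphson root finder; the example subroutine below is made up purely for illustration.

```python
def newton_raphson(f, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Find x such that f(x) = 0, e.g. f(x) = subroutine(x) - desired_output.
    The derivative is approximated by a central difference."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = (f(x + h) - f(x - h)) / (2 * h)
        x -= fx / dfx
    return x

# Example: invert a hypothetical subroutine y = x**3 + 2*x to find the input giving y = 10.
subroutine = lambda x: x**3 + 2 * x
x_in = newton_raphson(lambda x: subroutine(x) - 10.0, x0=1.0)
print(x_in, subroutine(x_in))
```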

  12. Categorical digital soil database for the Carpathian-basin using the modified e-SOTER methodology

    NASA Astrophysics Data System (ADS)

    Dobos, Endre; Vadnai, Péter; Micheli, Erika; Pasztor, Laszlo

    2015-04-01

    Harmonized, spatially and thematically consistent, high resolution soil data covering larger regions and several countries are needed for many applications, such as modeling different environmental and socio-economic scenarios. The only way to obtain such data with large spatial coverage and high resolution is to make use of the available high resolution digital data sources and digital soil mapping tools in the development process. Digital soil mapping has become a very efficient tool in soil science, and several applications on this topic have been published recently. Many of these applications use environmental covariates like remotely sensed images and digital elevation models, which are raster based data sources with block support. The majority of soil data users require data in raster format with values of certain properties, like pH, clay content or soil organic matter content. However, the use of these soil properties is often limited; an adequate interpretation of these numbers requires knowledge of the soil system and its major processes and process associations. This soil system description can best be done using the existing knowledge of soil science expressed in soil classification: its diagnostic features, materials and horizons as important descriptive information, and the classification categories. The most commonly used and internationally accepted classification system is the World Reference Base for soil description, the so-called WRB. Each soil classification category represents a complex association of processes and properties, which is difficult to use, understand and map due to the complex information behind the category names. The major advantage of diagnostics based classification systems, like WRB, is that the complex soil categories and classes can be interpreted as unique combinations of the diagnostic features. Therefore each class can be disaggregated into several diagnostics, each of which carries independent useful information

  13. Application of digital image processing techniques to astronomical imagery 1977

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.; Lynn, D. J.

    1978-01-01

    Nine specific techniques or combinations of techniques developed for applying digital image processing technology to existing astronomical imagery are described. Photoproducts are included to illustrate the results of each of these investigations.

  14. Quantization effects in radiation spectroscopy based on digital pulse processing

    SciTech Connect

    Jordanov, V. T.; Jordanova, K. V.

    2011-07-01

    Radiation spectra are inherently quantized data in the form of stacked channels of equal width. The spectrum is an experimental measurement of the discrete probability density function (PDF) of the detector pulse heights. The quantization granularity of the spectra depends on the total number of channels covering the full range of pulse heights. In analog pulse processing the total number of channels is equal to the total digital values produced by a spectroscopy analog-to-digital converter (ADC). In digital pulse processing each detector pulse is sampled and quantized by a fast ADC producing a certain number of quantized numerical values. These digital values are linearly processed to obtain a digital quantity representing the peak of the digitally shaped pulse. Using digital pulse processing it is possible to acquire a spectrum with the total number of channels greater than the number of ADC values. Noise and sample averaging are important in the transformation of ADC quantized data into spectral quantized data. Analysis of this transformation is performed using an area sampling model of quantization. Spectrum differential nonlinearity (DNL) is shown to be related to the quantization at low noise levels and small numbers of averaged samples. Theoretical analysis and experimental measurements are used to obtain the condition to minimize the DNL due to quantization. (authors)
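    A toy simulation of the effect described, where noise and sample averaging allow spectral channels finer than the ADC grid, together with a simple per-channel DNL estimate, might look like the sketch below; all numbers are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
true_heights = rng.uniform(99.0, 102.0, size=200_000)     # uniformly distributed pulse heights (ADC units)
n_avg, noise_sigma = 8, 0.5                               # samples averaged per pulse, rms noise

# Each pulse height is estimated as the mean of n_avg noisy, integer-quantized ADC samples.
samples = np.round(true_heights[:, None] + rng.normal(0, noise_sigma, (true_heights.size, n_avg)))
estimates = samples.mean(axis=1)

# Build a spectrum whose channel width (0.25 ADC units) is finer than one ADC step.
counts, _ = np.histogram(estimates, bins=np.linspace(100.0, 101.0, 5))
dnl = counts / counts.mean() - 1.0                        # differential nonlinearity per channel
print(dnl)
```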

  15. Digital signal processing in the radio science stability analyzer

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
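    As an illustration of one of the listed outputs, a textbook non-overlapping Allan deviation estimator for a phase record is sketched below; it is not the analyzer's internal implementation, and the simulated noise level is arbitrary.

```python
import numpy as np

def allan_deviation(phase, tau0, m):
    """Non-overlapping Allan deviation of a phase record (in seconds) sampled every
    tau0 seconds, for averaging factor m. Standard textbook estimator only."""
    x = phase[::m]                       # decimate phase to averaging time tau = m * tau0
    tau = m * tau0
    d2 = np.diff(x, n=2)                 # second differences of the phase
    avar = np.sum(d2 ** 2) / (2.0 * (len(x) - 2) * tau ** 2)
    return np.sqrt(avar)

# Usage with simulated white phase noise.
rng = np.random.default_rng(0)
phase = rng.normal(0.0, 1e-12, 100_000)          # phase in seconds
print([allan_deviation(phase, tau0=0.01, m=m) for m in (1, 10, 100)])
```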

  16. Agricultural inventory capabilities of machine processed LANDSAT digital data

    NASA Technical Reports Server (NTRS)

    Dietrick, D. L.; Fries, R. E.; Egbert, D. D.

    1975-01-01

    Agricultural crop identification and acreage determination analysis of LANDSAT digital data was performed for two study areas. A multispectral image processing and analysis system was utilized to perform the man-machine interactive analysis. The developed techniques yielded crop acreage estimate results with accuracy greater than 90% and as high as 99%. These results are encouraging evidence of agricultural inventory capabilities of machine processed LANDSAT digital data.

  17. Detecting jaundice by using digital image processing

    NASA Astrophysics Data System (ADS)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults should undergo a clinical exam such as a "serum bilirubin" test, which can cause trauma in patients. Jaundice often occurs in liver disease such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults using a painless method. By acquiring digital color images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as parameters to characterize patients with and without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine we distinguish between healthy and sick patients.
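    The classification step can be sketched with scikit-learn as below; the mean-RGB features and the synthetic "yellow shift" for the jaundice class are stand-ins for illustration only, not the study's data or feature set.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: mean RGB values from a skin region, with a yellow shift
# (higher red/green, lower blue) for the jaundiced class.
rng = np.random.default_rng(0)
healthy = rng.normal([180, 140, 120], 10, size=(200, 3))
jaundiced = rng.normal([190, 160, 90], 10, size=(200, 3))
X = np.vstack([healthy, jaundiced])
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```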

  18. The processing of two-digit numbers in bilinguals.

    PubMed

    Macizo, Pedro; Herrera, Amparo; Román, Patricia; Martín, María Cruz

    2011-08-01

    We explored possible between-language influences when bilinguals processed two-digit numbers. Spanish/English bilinguals and German/English bilinguals performed a number comparison task with Arabic digits and verbal numbers in their first language (L1) and second language (L2) while the unit-decade compatibility was manipulated. The two bilingual groups showed a regular compatibility effect with Arabic digits. In L1, Spanish/English bilinguals showed a reverse compatibility effect, while German/English bilinguals showed a regular compatibility effect. However, both groups of bilinguals presented a reverse compatibility effect in English (L2), which suggested that the bilinguals' L1 did not determine the processing of number words in their L2. The results indicated that bilinguals processed two-digit number words selectively in their L1 and L2 and that they did not transcode L2 numbers into Arabic format. PMID:21752000

  19. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    PubMed

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. PMID:26258644

  20. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research areas associated with digital signal processing and control and estimation theory are identified. Particular attention is given to image processing, system identification problems (parameter identification, linear prediction, least squares, Kalman filtering), stability analyses (the use of the Liapunov theory, frequency domain criteria, passivity), and multiparameter systems, distributed processes, and random fields.

  1. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of the digital factory's influence on interpersonal communication processes and provides an exemplary description of them. Building on a brief description of the theoretical concepts underlying the digital factory, communicative features within the digital factory are illustrated. Practical aspects of interpersonal communication were analyzed from a human-oriented perspective in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed within the process analysis. This method makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of a survey on communication analysis and on process models of existing modeling methods, it was possible to structure the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  2. Digital signal processing for fiber-optic thermometers

    SciTech Connect

    Fernicola, V.; Crovini, L.

    1994-12-31

    A digital signal processing scheme for measurement of exponentially-decaying signals, such as those found in fluorescence, lifetime-based, fiber-optic sensors, is proposed. The instrument uses a modified digital phase-sensitive-detection technique with the phase locked to a fixed value and the modulation period tracking the measured lifetime. Typical resolution of the system is 0.05% for slow decay (>500 µs) and 0.1% for fast decay.
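    The phase-sensitive scheme itself is not reproduced here; as a much simpler illustration of recovering a decay lifetime from digitized samples, a log-linear least-squares fit is sketched below with simulated data (all parameters invented).

```python
import numpy as np

def estimate_lifetime(t, v, baseline=0.0):
    """Estimate the time constant of v(t) ~ A*exp(-t/tau) + baseline by a log-linear
    least-squares fit (illustrative only; not the phase-sensitive-detection scheme)."""
    y = np.log(v - baseline)
    slope, _ = np.polyfit(t, y, 1)
    return -1.0 / slope

# Simulated 500 microsecond decay sampled every microsecond with a little noise.
t = np.arange(0, 2000e-6, 1e-6)
v = np.exp(-t / 500e-6) + np.random.default_rng(0).normal(0, 1e-3, t.size)
print(estimate_lifetime(t, np.clip(v, 1e-6, None)))
```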

  3. Digital image processing of earth observation sensor data

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1976-01-01

    This paper describes digital image processing techniques that were developed to precisely correct Landsat multispectral earth observation data and gives illustrations of the results achieved, e.g., geometric corrections with an error of less than one picture element, a relative error of one-fourth picture element, and no radiometric error effect. Techniques for enhancing the sensor data, digitally mosaicking multiple scenes, and extracting information are also illustrated.

  4. Parametric design methodology for chemical processes using a simulator

    SciTech Connect

    Diwekar, U.M.; Rubin, E.S.

    1994-02-01

    Parameter design is a method popularized by the Japanese quality expert G. Taguchi, for designing products and manufacturing processes that are robust in the face of uncontrollable variations. At the design stage, the goal of parameter design is to identify design settings that make the product performance less sensitive to the effects of manufacturing and environmental variations and deterioration. Because parameter design reduces performance variation by reducing the influence of the sources of variation rather than by controlling them, it is a cost-effective technique for improving quality. A recent study on the application of parameter design methodology for chemical processes reported that the use of Taguchi's method was not justified and a method based on Monte Carlo simulation combined with optimization was shown to be more effective. However, this method is computationally intensive as a large number of samples are necessary to achieve the given accuracy. Additionally, determination of the number of sample runs required is based on experimentation due to a lack of systematic sampling methods. In an attempt to overcome these problems, the use of a stochastic modeling capability combined with an optimizer is presented in this paper. The objective is that of providing an effective means for application of parameter design methodologies to chemical processes using the ASPEN simulator. This implementation not only presents a generalized tool for use by chemical engineers at large but also provides systematic estimates of the number of sample runs required to attain the specified accuracy. The stochastic model employs the technique of Latin hypercube sampling instead of the traditional Monte Carlo technique and hence has a great potential to reduce the required number of samples. The methodology is illustrated via an example problem of designing a chemical process.
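    The Latin hypercube sampling step can be sketched with SciPy as below; the variable names and bounds are invented, and the ASPEN flowsheet evaluation is represented by a placeholder function.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of two uncertain process parameters (bounds are invented).
sampler = qmc.LatinHypercube(d=2, seed=0)
unit_samples = sampler.random(n=50)                             # 50 runs instead of a large Monte Carlo set
samples = qmc.scale(unit_samples, [350.0, 1.0], [450.0, 5.0])   # e.g. temperature (K), pressure (bar)

def flowsheet_yield(temperature, pressure):
    """Placeholder for a simulator evaluation (an ASPEN call in the paper's setting)."""
    return 0.9 - 1e-5 * (temperature - 400.0) ** 2 + 0.01 * pressure

yields = np.array([flowsheet_yield(T, P) for T, P in samples])
print("mean yield:", yields.mean(), "std:", yields.std())
```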

  5. Nuclear spectroscopy pulse height analysis based on digital signal processing techniques

    SciTech Connect

    Simoes, J.B.; Simoes, P.C.P.S.; Correia, C.M.B.A.

    1995-08-01

    A digital approach to pulse height analysis is presented. It consists of digitizing the entire pulse with a flash analog-to-digital converter (ADC), with the pulse height estimated by a floating point digital signal processor (DSP) as one parameter of a model best fitted to the pulse samples. The differential nonlinearity (DNL) is reduced by simultaneously adding to the pulse, prior to its digitization, two analog signals provided by a digital-to-analog converter (DAC). One of them is a small amplitude dither signal used to eliminate a bias introduced by the fitting algorithm. The other, with large amplitude, corrects the ADC nonlinearities by a method similar to the well known Gatti's sliding scale. The simulations carried out showed that, using a 12-bit flash ADC, a 14-bit DAC and a dedicated floating point DSP performing a polynomial fitting to the samples around the pulse peak, it is actually possible to process about 10,000 events per second, with a constant height pulse dispersion of only 4 on 8,192 channels and a very good differential linearity. A prototype system based on the Texas Instruments floating point DSP TMS320C31 and built following the presented methodology has already been tested and performed as expected.

  6. [Fundamental bases of digital information processing in nuclear cardiology (III)].

    PubMed

    Cuarón, A; González, C; García Moreira, C

    1984-01-01

    This article describes the transformation of gamma-camera images into digital form. The incidence of a gamma photon on the detector produces two voltage pulses, which are proportional to the coordinates of the incidence point, and a digital pulse indicating the occurrence of the event. The coordinate pulses pass through an analog-to-digital converter that is activated by this pulse. The result is the appearance of a digital number at the output of the converter, which is proportional to the voltage at its input. This number is stored in the accumulation memory of the system, either in list mode or in matrix mode. Static images can be stored on a single matrix. Dynamic data can be stored on a series of matrixes, each representing a different acquisition period. It is also possible to capture information on a series of matrixes synchronized with the electrocardiogram of the patient. In this instance, each matrix represents a distinct period of the cardiac cycle. Data stored in the memory can be used to process and display images and quantitative histograms on a video screen. In order to do that, it is necessary to translate the digital data in memory to voltage levels, and to transform these into light levels on the screen. This is achieved through a digital-to-analog converter. The reading of the digital memory must be synchronous with the electronic scanning of the video screen. PMID:6466002

  7. Digital intermediate frequency QAM modulator using parallel processing

    DOEpatents

    Pao, Hsueh-Yuan; Tran, Binh-Nien

    2008-05-27

    The digital Intermediate Frequency (IF) modulator applies to various modulation types and offers a simple and low cost method to implement a high-speed digital IF modulator using field programmable gate arrays (FPGAs). The architecture eliminates multipliers and sequential processing by storing the pre-computed modulated cosine and sine carriers in ROM look-up-tables (LUTs). The high-speed input data stream is parallel processed using the corresponding LUTs, which reduces the main processing speed, allowing the use of low cost FPGAs.
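    The look-up-table idea, precomputing the modulated carrier samples per symbol so that the per-sample path needs no multipliers, can be sketched in software as below; the 16-QAM constellation, samples-per-symbol ratio, and IF-to-sample-rate ratio are arbitrary illustrative choices, not the patented design.

```python
import numpy as np

# Precompute one table per 16-QAM symbol: the modulated IF carrier samples for that symbol.
samples_per_symbol, f_if_over_fs = 8, 0.25            # illustrative ratios
levels = np.array([-3.0, -1.0, 1.0, 3.0])
n = np.arange(samples_per_symbol)
cos_lut = np.cos(2 * np.pi * f_if_over_fs * n)
sin_lut = np.sin(2 * np.pi * f_if_over_fs * n)
symbol_lut = np.array([[i_amp * cos_lut - q_amp * sin_lut
                        for q_amp in levels] for i_amp in levels])   # shape (4, 4, 8)

def modulate(symbols_iq):
    """Map a stream of (i_index, q_index) symbols to IF samples by table lookup only."""
    return np.concatenate([symbol_lut[i, q] for i, q in symbols_iq])

print(modulate([(0, 3), (2, 1)]))
```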

  8. Dry Machining Process of Milling Machine using Axiomatic Green Methodology

    NASA Astrophysics Data System (ADS)

    Puspita Andriani, Gita; Akbar, Muhammad; Irianto, Dradjad

    2016-02-01

    Most companies know that there are strategies for becoming a green industry, and they realize that green efforts have impacts on product quality and cost. The Axiomatic Green Methodology models the relationship between green, quality, and cost. This methodology starts with determining the green improvement objective and then continues with mapping the functional, economic, and green requirements. From the mapping, variables which affect the requirements are identified. Afterwards, the effect of each variable is determined by performing experiments and regression modelling. In this research, the axiomatic green methodology was applied to dry machining on a milling machine in order to reduce the amount of coolant. Dry machining will be feasible if the result is not worse than the minimum required quality. As a result, dry machining is feasible without producing any defect. The proposed machining parameter is to reduce the coolant flow rate from 6.882 ml/minute to 0 ml/minute, set the depth of cut at 1.2 mm, spindle rotation speed at 500 rpm, and feed rate at 128 mm/minute. This solution also results in a cost reduction of 200.48 rupiahs per process.

  9. Improved assembly processes for the Quartz Digital Accelerometer cantilever

    SciTech Connect

    Romero, A.M.; Gebert, C.T.

    1990-07-01

    This report covers the development of improved assembly processes for the Quartz Digital Accelerometer cantilever. In this report we discuss improved single-assembly tooling, the development of tooling and processes for precision application of polyimide adhesive, the development of the wafer scale assembly procedure, and the application of eutectic bonding to cantilever assembly. 2 refs., 17 figs.

  10. FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Berner, Stephan; DeLeon, Phillip

    1999-01-01

    One approach to parallel digital signal processing decomposes a high bandwidth signal into multiple lower bandwidth (rate) signals by an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
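    A small software model of the analysis/synthesis idea is sketched below (not the FPGA implementation); the modulated FIR filters are toy designs, and the reconstruction is approximate since neither gain nor aliasing is compensated as a proper polyphase design would do.

```python
import numpy as np
from scipy.signal import firwin, lfilter, resample_poly

M = 4                                     # number of subbands (illustrative)
proto = firwin(64, 1.0 / M)               # prototype low-pass filter
n = np.arange(proto.size)
bands = [proto * np.exp(2j * np.pi * (k + 0.5) / M * n) for k in range(M)]   # modulated filters

def analysis(x):
    """Split a fullband signal into M decimated subband signals."""
    return [lfilter(h, 1.0, x)[::M] for h in bands]

def synthesis(subbands):
    """Recombine subbands: upsample, re-filter, and sum (toy reconstruction only)."""
    y = sum(lfilter(h, 1.0, resample_poly(s, M, 1)) for h, s in zip(bands, subbands))
    return np.real(y)

x = np.random.default_rng(0).standard_normal(1024)
print(synthesis(analysis(x)).shape)
```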

  11. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research directions in the fields of digital signal processing and modern control and estimation theory are discussed. Stability theory, linear prediction and parameter identification, system synthesis and implementation, two-dimensional filtering, decentralized control and estimation, and image processing are considered in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the disciplines.

  12. Modeling of electrohydrodynamic drying process using response surface methodology

    PubMed Central

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-01-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box–Behnken design was employed to evaluate the effects of independent variables on system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interrelationships between parameters were well defined by RSM. PMID:24936289
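    The RSM fitting step, estimating a second-order polynomial response surface by least squares, can be sketched as below; the factor names follow the abstract, but the coded design points and response values are synthetic stand-ins.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and two-factor interaction terms,
    i.e. the usual second-order RSM model."""
    cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]
    cols += [X[:, a] * X[:, b] for a, b in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Synthetic coded factors (voltage, field strength, electrodes, air velocity) and a fake response.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(27, 4))
y = 0.6 - 0.1 * X[:, 0] + 0.05 * X[:, 1] ** 2 + 0.02 * X[:, 0] * X[:, 3] + rng.normal(0, 0.01, 27)

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(coef)
```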

  13. Methodology and Process for Condition Assessment at Existing Hydropower Plants

    SciTech Connect

    Zhang, Qin Fen; Smith, Brennan T; Cones, Marvin; March, Patrick; Dham, Rajesh; Spray, Michael

    2012-01-01

    The Hydropower Advancement Project (HAP) was initiated by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy to develop and implement a systematic process with a standard methodology to identify the opportunities of performance improvement at existing hydropower facilities and to predict and trend the overall condition and improvement opportunity within the U.S. hydropower fleet. The concept of performance for the HAP focuses on water use efficiency: how well a plant or individual unit converts potential energy to electrical energy over a long-term averaging period of a year or more. The performance improvement involves not only optimization of plant dispatch and scheduling but also enhancement of efficiency and availability through advanced technology and asset upgrades, and thus requires inspection and condition assessment for equipment, control systems, and other generating assets. This paper discusses the standard methodology and process for condition assessment of approximately 50 nationwide facilities, including sampling techniques to ensure valid expansion of the 50 assessment results to the entire hydropower fleet. The application and refining process and the results from three demonstration assessments are also presented in this paper.

  14. Image processing in digital chest radiography: effect on diagnostic efficacy.

    PubMed

    Manninen, H; Partanen, K; Lehtovirta, J; Matsi, P; Soimakallio, S

    1992-01-01

    The usefulness of digital image processing of chest radiographs was evaluated in a clinical study. In 54 patients, chest radiographs in the posteroanterior projection were obtained by both 14 inch digital image intensifier equipment and the conventional screen-film technique. The digital radiographs (512 x 512 image format) viewed on a 625 line monitor were processed in three different ways: (1) standard display; (2) digital edge enhancement for the standard display; and (3) inverse intensity display. The radiographs were interpreted independently by three radiologists. The diagnoses were confirmed by CT, follow-up radiographs and clinical records. Chest abnormalities of the films analyzed included 21 primary lung tumors, 44 pulmonary nodules, 16 cases with mediastinal disease and 17 cases with pneumonia/atelectasis. Interstitial lung disease, pleural plaques, and pulmonary emphysema were found in 30, 18 and 19 cases, respectively. The sensitivity of conventional radiography when averaged over all findings was better than that of the digital techniques (P less than 0.001). The differences in diagnostic accuracy measured by sensitivity and specificity between the three digital display modes were small. Standard image display showed better sensitivity for pulmonary nodules (0.74 vs 0.66; P less than 0.05) but poorer specificity for pulmonary emphysema (0.85 vs. 0.93; P less than 0.05) compared with inverse intensity display. We conclude that when using the 512 x 512 image format, the routine use of digital edge enhancement and tone reversal in digital chest radiography is not warranted. PMID:1563421

  15. Efficient multiprocessor architecture for digital signal processing

    SciTech Connect

    Auguin, M.; Boeri, F.

    1982-01-01

    There is continuing pressure for better processing performance in numerical signal processing. Effective utilization of LSI semiconductor technology allows the consideration of multiprocessor architectures. The problem of interconnecting the components of the architecture arises. The authors describe a control algorithm for the Benes interconnection network in an asynchronous multiprocessor system. A simulation study of the time-shared bus, the omega network, the Benes network and the crossbar network gives a comparison of performances. 8 references.

  16. Digital pulse processing: new possibilities in nuclear spectroscopy

    PubMed

    Warburton; Momayezi; Hubbard-Nelson; Skulski

    2000-10-01

    Digital pulse processing is a signal processing technique in which detector (preamplifier output) signals are directly digitized and processed to extract quantities of interest. This approach has several significant advantages compared to traditional analog signal shaping. First, analyses can be developed which take pulse-by-pulse differences into account, as in making ballistic deficit compensations. Second, transient induced charge signals, which deposit no net charge on an electrode, can be analyzed to give, for example, information on the position of interaction within the detector. Third, deadtimes from transient overload signals are greatly reduced, from tens of µs to hundreds of ns. Fourth, signals are easily captured, so that more complex analyses can be postponed until the source event has been deemed "interesting". Fifth, signal capture and processing may easily be based on coincidence criteria between different detectors or different parts of the same detector. XIA's recently introduced CAMAC module, the DGF-4C, provides many of these features for four input channels, including two levels of digital processing and a FIFO for signal capture for each signal channel. The first level of digital processing is "immediate", taking place in a gate array at the 40 MHz digitization rate, and implements pulse detection, pileup inspection, trapezoidal energy filtering, and control of an external 25.6 µs long FIFO. The second level of digital processing is provided by a digital signal processor (DSP), where more complex algorithms can be implemented. To illustrate digital pulse processing's possibilities, we describe the application of the DGF-4C to a series of experiments. The first, for which the DGF was originally developed, involves locating gamma-ray interaction sites within large segmented Ge detectors. The goal of this work is to attain spatial resolutions of order 2 mm sigma within 70 mm x 90 mm detectors. We show how pulse shape analysis allows
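    The trapezoidal energy filtering mentioned above can be sketched as the difference of two delayed moving sums; in the toy version below, k sets the rise/fall time and m the flat top, and the pole-zero correction needed for an exponentially decaying preamplifier pulse is omitted.

```python
import numpy as np

def trapezoidal_filter(v, k, m):
    """Trapezoidal shaping of a step-like pulse: a moving sum of length k minus the
    same sum delayed by k + m samples (simplified; no pole-zero correction)."""
    s = np.cumsum(v)
    movsum = s - np.concatenate(([0.0] * k, s[:-k]))              # running sum of the last k samples
    delayed = np.concatenate(([0.0] * (k + m), movsum[:-(k + m)]))
    return movsum - delayed

# A noisy unit step at sample 100 is shaped into a trapezoid whose flat top is about k.
pulse = np.concatenate([np.zeros(100), np.ones(400)]) + np.random.default_rng(0).normal(0, 0.01, 500)
shaped = trapezoidal_filter(pulse, k=40, m=20)
print(shaped.max() / 40)                                          # rough pulse-height estimate
```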

  17. Application of automated methodologies based on digital images for phenological behaviour analysis in Mediterranean species

    NASA Astrophysics Data System (ADS)

    Cesaraccio, Carla; Piga, Alessandra; Ventura, Andrea; Arca, Angelo; Duce, Pierpaolo; Granados, Joel

    2015-04-01

    The importance of phenological research for understanding the consequences of global environmental change on vegetation is highlighted in the most recent IPCC reports. Collecting time series of phenological events appears to be of crucial importance to better understand how vegetation systems respond to climatic regime fluctuations, and, consequently, to develop effective management and adaptation strategies. Vegetation monitoring based on "near-surface" remote sensing techniques has been proposed in recent research. In particular, the use of digital cameras has become more common for phenological monitoring. Digital images provide spectral information in the red, green, and blue (RGB) wavelengths. Inflection points in seasonal variations of intensities of each color channel can be used to identify phenological events. In this research, an Automated Phenological Observation System (APOS), based on digital image sensors, was used for monitoring the phenological behavior of shrubland species in a Mediterranean site. Major species of the shrubland ecosystem that were analyzed were: Cistus monspeliensis L., Cistus incanus L., Rosmarinus officinalis L., Pistacia lentiscus L., and Pinus halepensis Mill. The system was developed under the INCREASE (an Integrated Network on Climate Change Research) EU-funded research infrastructure project, which is based upon large scale field experiments with non-intrusive climatic manipulations. Monitoring of phenological behavior was conducted during the years 2012-2014. In order to retrieve phenological information from the digital images, a routine of commands to process the digital image files using the program MATLAB (R2013b, The MathWorks, Natick, Mass.) was specifically created. The images of the dataset were re-classified and renamed according to the date and time of acquisition. The analysis was focused on regions of interest (ROIs) of the panoramas acquired, defined by the presence of the most representative species of
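    A common way to turn per-ROI RGB intensities into a seasonal greenness signal is the green chromatic coordinate; the abstract does not state the exact index used, so the sketch below (Pillow + numpy rather than the MATLAB routine) is purely illustrative.

```python
import numpy as np
from PIL import Image

def green_chromatic_coordinate(path, roi):
    """Mean green chromatic coordinate gcc = G / (R + G + B) over a region of interest.
    roi = (top, bottom, left, right) in pixels; the index choice is an assumption."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    top, bottom, left, right = roi
    patch = rgb[top:bottom, left:right, :]
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    return np.mean(g / (r + g + b + 1e-9))

# Typical use: one value per date-sorted image yields a seasonal greenness time series
# whose inflection points mark phenological events.
# series = [green_chromatic_coordinate(p, roi=(100, 400, 200, 600)) for p in sorted(image_paths)]
```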

  18. Digital Art Making as a Representational Process

    ERIC Educational Resources Information Center

    Halverson, Erica Rosenfeld

    2013-01-01

    In this article I bring artistic production into the learning sciences conversation by using the production of representations as a bridging concept between art making and the new literacies. Through case studies with 4 youth media arts organizations across the United States I ask how organizations structure the process of producing…

  19. Digitizing Dissertations for an Institutional Repository: A Process and Cost Analysis*

    PubMed Central

    Piorun, Mary; Palmer, Lisa A.

    2008-01-01

    Objective: This paper describes the Lamar Soutter Library's process and costs associated with digitizing 300 doctoral dissertations for a newly implemented institutional repository at the University of Massachusetts Medical School. Methodology: Project tasks included identifying metadata elements, obtaining and tracking permissions, converting the dissertations to an electronic format, and coordinating workflow between library departments. Each dissertation was scanned, reviewed for quality control, enhanced with a table of contents, processed through an optical character recognition function, and added to the institutional repository. Results: Three hundred and twenty dissertations were digitized and added to the repository for a cost of $23,562, or $0.28 per page. Seventy-four percent of the authors who were contacted (n = 282) granted permission to digitize their dissertations. Processing time per title was 170 minutes, for a total processing time of 906 hours. In the first 17 months, full-text dissertations in the collection were downloaded 17,555 times. Conclusion: Locally digitizing dissertations or other scholarly works for inclusion in institutional repositories can be cost effective, especially if small, defined projects are chosen. A successful project serves as an excellent recruitment strategy for the institutional repository and helps libraries build new relationships. Challenges include workflow, cost, policy development, and copyright permissions. PMID:18654648

  20. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development is presented of real time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provides favorable indications of the feasibility of the acoustic monitor for clinical use.
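    A minimal LMS-trained linear predictor is sketched below with synthetic data; the paper's detector was trained on recorded fetal heart tones, whereas here the event simply shows up as a burst in the prediction-error signal, which is one simple detection strategy rather than the clinical implementation.

```python
import numpy as np

def lms_predictor(x, order=16, mu=0.01):
    """Adapt a linear predictor with the LMS rule and return the prediction-error
    signal; bursts of large error flag events the predictor does not model."""
    w = np.zeros(order)
    error = np.zeros(len(x))
    for n in range(order, len(x)):
        window = x[n - order:n][::-1]          # most recent samples first
        prediction = w @ window
        error[n] = x[n] - prediction
        w += mu * error[n] * window            # LMS weight update
    return error

# Synthetic noise with a tone burst added around sample 5000.
rng = np.random.default_rng(0)
x = rng.normal(0, 0.1, 10_000)
x[5000:5200] += 0.5 * np.sin(2 * np.pi * 0.05 * np.arange(200))
err = lms_predictor(x)
print(np.argmax(np.abs(err)))
```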

  1. Reservoir continuous process improvement six sigma methodology implementation

    SciTech Connect

    Wannamaker, A.L.

    1996-12-01

    The six sigma methodology adopted by AlliedSignal Inc. for implementing continuous improvement activity was applied to a new manufacturing assignment for Federal Manufacturing & Technologies (FM&T). The responsibility for reservoir development/production was transferred from Rocky Flats to FM&T. Pressure vessel fabrication was new to this facility. No fabrication history for this type of product existed in-house. Statistical tools such as process mapping, failure mode and effects analysis, and design of experiments were used to define and fully characterize the machine processes to be used in reservoir production. Continuous improvement with regard to operating efficiencies and product quality is an ongoing activity at FM&T.

  2. Statistical process control for hospitals: methodology, user education, and challenges.

    PubMed

    Matthes, Nikolas; Ogunbo, Samuel; Pennington, Gaither; Wood, Nell; Hart, Marilyn K; Hart, Robert F

    2007-01-01

    The health care industry is slowly embracing the use of statistical process control (SPC) to monitor and study causes of variation in health care processes. While the statistics and principles underlying the use of SPC are relatively straightforward, there is a need to be cognizant of the perils that await the user who is not well versed in the key concepts of SPC. This article introduces the theory behind SPC methodology, describes successful tactics for educating users, and discusses the challenges associated with encouraging adoption of SPC among health care professionals. To illustrate these benefits and challenges, this article references the National Hospital Quality Measures, presents critical elements of SPC curricula, and draws examples from hospitals that have successfully embedded SPC into their overall approach to performance assessment and improvement. PMID:17627215

  3. Digital image processing of vascular angiograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

    1975-01-01

    A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

  4. Synthetic aperture radar and digital processing: An introduction

    NASA Technical Reports Server (NTRS)

    Dicenzo, A.

    1981-01-01

    A tutorial on synthetic aperture radar (SAR) is presented with emphasis on digital data collection and processing. Background information on waveform frequency and phase notation, mixing, I/Q conversion, sampling and cross correlation operations is included for clarity. The fate of a SAR signal from transmission to processed image is traced in detail, using the model of a single bright point target against a dark background. Some of the principal problems connected with SAR processing are also discussed.
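    The cross-correlation step for the single bright point target can be sketched as matched filtering of a linear-FM chirp; all parameters below are arbitrary illustrative values, not the tutorial's worked example.

```python
import numpy as np

fs, bandwidth, n_chirp = 100e6, 30e6, 1000              # illustrative radar parameters
t = np.arange(n_chirp) / fs
pulse_len = n_chirp / fs
chirp = np.exp(1j * np.pi * (bandwidth / pulse_len) * t ** 2)   # linear-FM reference

# Echo from a single point target: a delayed, attenuated chirp buried in noise.
rng = np.random.default_rng(0)
delay_samples = 700
echo = np.zeros(4096, dtype=complex)
echo[delay_samples:delay_samples + n_chirp] = 0.1 * chirp
echo += (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) * 0.05

# Pulse compression: cross-correlate the echo with the reference chirp.
compressed = np.correlate(echo, chirp, mode="valid")
print("estimated delay (samples):", int(np.argmax(np.abs(compressed))))
```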

  5. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored. PMID:14654480

  6. Digital Light Processing update: status and future applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1999-05-01

    Digital Light Processing (DLP) projection displays based on the Digital Micromirror Device (DMD) were introduced to the market in 1996. Less than 3 years later, DLP-based projectors are found in such diverse applications as mobile, conference room, video wall, home theater, and large-venue. They provide high-quality, seamless, all-digital images that have exceptional stability as well as freedom from both flicker and image lag. Marked improvements have been made in the image quality of DLP-based projection displays, including brightness, resolution, contrast ratio, and border image. DLP-based mobile projectors that weighed about 27 pounds in 1996 now weigh only about 7 pounds. This weight reduction has been responsible for the definition of an entirely new projector class, the ultraportable. New applications are being developed for this important new projection display technology; these include digital photofinishing for high process speed minilab and maxilab applications and DLP Cinema for the digital delivery of films to audiences around the world. This paper describes the status of DLP-based projection display technology, including its manufacturing, performance improvements, and new applications, with emphasis on DLP Cinema.

  7. Results of precision processing (scene correction) of ERTS-1 images using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Bernstein, R.

    1973-01-01

    ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to the NDPF electro-optical processor.

  8. Introduction and comparison of new EBSD post-processing methodologies.

    PubMed

    Wright, Stuart I; Nowell, Matthew M; Lindeman, Scott P; Camus, Patrick P; De Graef, Marc; Jackson, Michael A

    2015-12-01

    Electron Backscatter Diffraction (EBSD) provides a useful means for characterizing microstructure. However, it can be difficult to obtain index-able diffraction patterns from some samples. This can lead to noisy maps reconstructed from the scan data. Various post-processing methodologies have been developed to improve the scan data, generally based on correlating non-indexed or mis-indexed points with the orientations obtained at neighboring points in the scan grid. Two new approaches are introduced: (1) a re-scanning approach using local pattern averaging and (2) using the multiple solutions obtained by the triplet indexing method. These methodologies are applied to samples with noise introduced into the patterns artificially and by the operational settings of the EBSD camera. They are also applied to a heavily deformed and a fine-grained sample. In all cases, both techniques provide an improvement in the resulting scan data, with local pattern averaging providing the most improvement of the two. However, local pattern averaging is most helpful when the noise in the patterns is due to the camera operating conditions as opposed to inherent challenges in the sample itself. A byproduct of this study was insight into the validity of various indexing success rate metrics. A metric given by the fraction of points with CI values greater than some tolerance value (0.1 in this case) was confirmed to provide an accurate assessment of the indexing success rate. PMID:26342553

  9. Digital-Computer Processing of Graphical Data. Final Report.

    ERIC Educational Resources Information Center

    Freeman, Herbert

    The final report of a two-year study concerned with the digital-computer processing of graphical data. Five separate investigations carried out under this study are described briefly, and a detailed bibliography, complete with abstracts, is included in which are listed the technical papers and reports published during the period of this program.…

  10. Sliding mean edge estimation. [in digital image processing

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1978-01-01

    A method for determining the locations of the major edges of objects in digital images is presented. The method is based on an algorithm utilizing maximum likelihood concepts. An image line-scan interval is processed to determine if an edge exists within the interval and its location. The proposed algorithm has demonstrated good results even in noisy images.

  11. Digital frequency counter permits readout without disturbing counting process

    NASA Technical Reports Server (NTRS)

    Winkelstein, R.

    1966-01-01

    Digital frequency counter system enables readout accurately at one-second intervals without interrupting or disturbing the counting process. The system incorporates a master counter and a slave counter with novel logic interconnections. The counter can be readily adapted to provide frequency readouts at 0.1 second intervals.

  12. Digital Image Processing application to spray and flammability studies

    NASA Technical Reports Server (NTRS)

    Hernan, M. A.; Parikh, P.; Sarohia, V.

    1985-01-01

    Digital Image Processing has been integrated into a new technique for measurements of fuel spray characteristics. The advantages of this technique are a wide dynamic range of droplet sizes and the ability to account for nonspherical droplet shapes, which is not possible with other spray assessment techniques. Finally, the technique has been applied to the study of turbojet engine fuel nozzle atomization performance with Jet A and antimisting fuel.

  13. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  14. Trust in Numbers? Digital Education Governance and the Inspection Process

    ERIC Educational Resources Information Center

    Ozga, Jenny

    2016-01-01

    The aim of the paper is to contribute to the critical study of digital data use in education, through examination of the processes surrounding school inspection judgements. The interaction between pupil performance data and other (embodied, enacted) sources of inspection judgement is scrutinised and discussed with a focus on the interaction…

  15. Optical hybrid analog-digital signal processing based on spike processing in neurons

    NASA Astrophysics Data System (ADS)

    Fok, Mable P.; Tian, Yue; Rosenbluth, David; Deng, Yanhua; Prucnal, Paul R.

    2011-09-01

    Spike processing is one kind of hybrid analog-digital signal processing, which has the efficiency of analog processing and the robustness to noise of digital processing. When instantiated with optics, a hybrid analog-digital processing primitive has the potential to be scalable, computationally powerful, and have high operation bandwidth. These devices open up a range of processing applications for which electronic processing is too slow. Our approach is based on a hybrid analog/digital computational primitive that elegantly implements the functionality of an integrate-and-fire neuron using a Ge-doped non-linear optical fiber and off-the-shelf semiconductor devices. In this paper, we introduce our photonic neuron architecture and demonstrate the feasibility of implementing simple photonic neuromorphic circuits, including the auditory localization algorithm of the barn owl, which is useful for LIDAR localization, and the crayfish tail-flip escape response.

  16. Digital Signal Processing System for Active Noise Reduction

    NASA Astrophysics Data System (ADS)

    Edmonson, William W.; Tucker, Jerry

    2002-12-01

    different adaptive noise cancellation algorithms and provide an operational prototype to understand the behavior of the system under test. DSP software was required to interface the processor with the data converters using interrupt routines. The goal is to build a complete ANC system that can be placed on a flexible circuit with added memory circuitry that also contains the power supply, sensors and actuators. This work on the digital signal processing system for active noise reduction was completed in collaboration with another ASEE Fellow, Dr. Jerry Tucker from Virginia Commonwealth University, Richmond, VA.

  17. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  18. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
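    For orientation, the most common summary form of operational availability is given below in conventional symbols; the report itself develops a more general, state-based treatment rather than this single ratio.

```latex
% Standard textbook form of operational availability (conventional symbols, not the report's notation):
% MTBM = mean time between maintenance, MDT = mean downtime including logistics delays.
A_{o} \;=\; \frac{\text{Uptime}}{\text{Uptime} + \text{Downtime}} \;=\; \frac{MTBM}{MTBM + MDT}
```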

  19. Gallium arsenide enhances digital signal processing in electronic warfare

    NASA Astrophysics Data System (ADS)

    Hoffman, B.; Apte, D.

    1985-07-01

    The higher electron mobility and velocity of GaAs digital signal processing IC devices for electronic warfare (EW) allow operation times that are several times faster than those of ICs based on silicon. Particular benefits are foreseen for the response time and broadband capability of ECM systems. Many data manipulation methods can be implemented in emitter-coupled logic (ECL) GaAs devices, and digital GaAs RF memories are noted to show great promise for improved ECM system performance while encompassing microwave frequency and chirp signal synthesis, repeater jamming, and multiple false target generation. EW digital frequency synthesizers are especially in need of GaAs IC technology, since bandwidth and resolution have been limited by ECL technology to about 250 MHz.

  20. Digital image processing of bone - Problems and potentials

    NASA Technical Reports Server (NTRS)

    Morey, E. R.; Wronski, T. J.

    1980-01-01

    The development of a digital image processing system for bone histomorphometry and fluorescent marker monitoring is discussed. The system in question is capable of making measurements of UV or light microscope features on a video screen with either video or computer-generated images, and comprises a microscope, low-light-level video camera, video digitizer and display terminal, color monitor, and PDP 11/34 computer. Capabilities demonstrated in the analysis of an undecalcified rat tibia include the measurement of perimeter and total bone area, and the generation of microscope images, false color images, digitized images and contoured images for further analysis. Software development will be based on an existing software library, specifically the mini-VICAR system developed at JPL. It is noted that the potentials of the system in terms of speed and reliability far exceed any problems associated with hardware and software development.

  1. Methodology to reduce chronic defect mechanisms in semiconductor processing

    NASA Astrophysics Data System (ADS)

    Ecton, Timothy W.; Frazee, Kenneth G.

    1990-06-01

    This paper documents a structured approach to defect elimination in semiconductor processing. Classical problem solving techniques were used to logically guide the defect reduction effort. Defect information was gathered using an automated wafer inspection system and defects were classified by production workers on a remote review station. This approach distinguished actual causes from several probable causes. A process change has reduced the defect mechanism. This methodology was applied to reduce perfluoroalkoxy (PFA) particles in a one micron semiconductor process. Electrical test structures identified a critical layer where yield loss was occurring. An audit procedure was established at this layer and defects were classified into broad categories. Further breakout of defect types by appearance was necessary to construct a meaningful Pareto chart and identify the most frequently occurring fatal defect. The critical process zone was segmented using automated wafer inspection to isolate the step causing the defect. An Ishikawa or cause-effect diagram was constructed with input from process engineers to outline all possible causes of the defect. A most probable branch was selected for investigation and pursued until it became clear that this branch was not related to the cause. At this point, new ideas were sought from a sister production facility. During the visit a breakthrough indicated a different path and ultimately led to identifying the source of the defect. A process change was implemented. An evaluation of the change showed a substantial decrease in defect level. Further efforts to eliminate the defect source are in progress.

  2. Process sequence optimization for digital microfluidic integration using EWOD technique

    NASA Astrophysics Data System (ADS)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

    Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. The emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent process steps for the EWOD technique have been optimized in this work.

  3. Digital processing of histopathological aspects in renal transplantation

    NASA Astrophysics Data System (ADS)

    de Albuquerque Araujo, Arnaldo; de Andrade, Marcos C.; Bambirra, Eduardo A.; dos Santos, A. M. M.

    1993-07-01

    We describe here our initial experience with the digital image processing of histopathological aspects from multiple renal biopsies of transplanted kidney in a patient treated with Cyclosporine (CsA), a powerful immunosuppressor drug whose use has improved the chances of a successful vascularized organ transplantation (Tx). Unfortunately, CsA promotes morphological alterations to the glomerular structure of the kidneys. To characterize this process, the distributions of glomerulus, tuft, and lumen areas are measured. The results are presented in the form of graphs.

  4. Real time speech recognition on a distributed digital processing array

    NASA Astrophysics Data System (ADS)

    Simpson, P.; Roberts, J. B. G.

    1983-08-01

    A compact digital signal processor based on the architecture of the ICL Distributed Array Processor (DAP) is under development for MOD applications in Radar, ESM, Image Processing, etc. This Memorandum examines its applicability to speech recognition. In such a distributed processor, optimum mapping of the problem onto the array of processors is vital for efficiency. Three mappings of a dynamic time warping algorithm for isolated word recognition are examined, leading to a feasible real time capability for continuous speech processing. The compatibility found between dynamic programming methods and this class of machine enlarges the scope of signal processing algorithms foreseen as amenable to parallel processing.
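    The abstract refers to dynamic time warping for isolated word recognition without detailing it. The sketch below is a generic serial DTW matcher, not one of the three DAP mappings examined in the memorandum; the feature dimensions and toy templates are assumptions.

```python
# Minimal dynamic time warping (DTW) sketch for isolated word matching,
# assuming each word is already reduced to a sequence of feature vectors
# (e.g., filterbank or LPC frames).
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """a, b: (n_frames, n_features) arrays; returns cumulative warp distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy usage: pick the reference template closest to the test utterance.
rng = np.random.default_rng(0)
templates = {"yes": rng.normal(size=(40, 12)), "no": rng.normal(size=(35, 12))}
test = rng.normal(size=(38, 12))
best = min(templates, key=lambda w: dtw_distance(test, templates[w]))
print("recognized word:", best)
```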

  5. Flow manipulation and control methodologies for vacuum infusion processes

    NASA Astrophysics Data System (ADS)

    Alms, Justin B.

    experienced. First, the effect on permeability is characterized, so the process can be simulated and the flow front patterns can be predicted. It was found that using the VIPR process in combination with tool side injection gates is a very effective method to control resin flow. Based on this understanding several control algorithms were developed to use the process in an automated manufacturing environment which were tested and validated in a virtual environment. To implement and demonstrate the approach, an experimental workstation was built and various infusion examples were performed in the automated environment to validate the capability of the VIPR process with the control methodologies. The VIPR process with control consistently performed better than the process without control. This contribution should prove useful in making VIPs more reliable in the production of large scale composite structures.

  6. Novel Optimization Methodology for Welding Process/Consumable Integration

    SciTech Connect

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of the future, including steel, mining, and chemical, as well as US infrastructure, including bridges, pipelines and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial and error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models by employing detailed scientific methodologies. However, these models are cumbersome and not easy to use. Therefore, these scientific models have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development. The scopes of these simple models are limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models for relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed for industrial users through a user-friendly graphical interface. In principle, welding industry users can use these modular tools to guide their welding process parameter and consumable composition selection. It is hypothesized that by expanding these tools throughout the welding industry

  7. Digital pulse processing for NaI(Tl) detectors

    NASA Astrophysics Data System (ADS)

    Di Fulvio, A.; Shin, T. H.; Hamel, M. C.; Pozzi, S. A.

    2016-01-01

    We apply two different post-processing techniques to digital pulses induced by photons in a NaI(Tl) detector and compare the obtained energy resolution to the standard analog approach. Our digital acquisition approach is performed using a single-stage acquisition with a fast digitizer. Both the post-processing techniques we propose rely on signal integration. In the first, the pulse integral is calculated by directly numerically integrating the pulse digital samples, while in the second the pulse integral is estimated by a model-based fitting of the pulse. Our study used a 7.62 cm×7.62 cm cylindrical NaI(Tl) detector that gave a 7.60% energy resolution (at 662 keV), using the standard analog acquisition approach, based on a pulse shaping amplifier. The new direct numerical integration yielded a 6.52% energy resolution. The fitting approach yielded a 6.55% energy resolution, and, although computationally heavier than numerical integration, is preferable when only the early samples of the pulse are available. We also evaluated the timing performance of a fast-slow detection system, encompassing an EJ-309 and a NaI(Tl) scintillator. We use two techniques to determine the pulse start time: constant fraction discrimination (CFD) and adaptive noise threshold timing (ANT), for both the analog and digital acquisition approach. With the analog acquisition approach, we found a system time resolution of 5.8 ns and 7.3 ns, using the constant fraction discrimination and adaptive noise threshold timing, respectively. With the digital acquisition approach, a time resolution of 1.2 ns was achieved using the ANT method and 3.3 ns using CFD at 50% of the maximum, to select the pulse start time. The proposed direct digital readout and post-processing techniques can improve the application of NaI(Tl) detectors, traditionally considered 'slow', for fast counting and correlation measurements, while maintaining a good measurement of the energy resolution.
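    As a rough illustration of the two generic operations described above, the sketch below integrates a baseline-subtracted digitized pulse and picks a constant-fraction start time at 50% of the maximum. The toy waveform, sampling rate, and baseline window are assumptions, not the authors' acquisition settings.

```python
# Hedged sketch: energy estimation by direct numerical integration of the
# baseline-subtracted samples, plus a constant-fraction style start time at
# 50% of the pulse maximum, on a synthetic NaI-like pulse.
import numpy as np

fs = 500e6                                   # assumed sampling rate, 500 MS/s
t = np.arange(2000) / fs
# Toy pulse: constant baseline, instant rise at 1 us, 230 ns exponential decay.
pulse = 0.02 + np.where(t > 1e-6, np.exp(-(t - 1e-6) / 230e-9), 0.0)

baseline = pulse[:400].mean()                # baseline from pre-trigger samples
signal = pulse - baseline

integral = signal.sum() / fs                 # rectangle-rule pulse integral (arb. energy units)

# Constant fraction discrimination at 50% of maximum: first threshold crossing,
# refined by linear interpolation between the two bracketing samples.
thr = 0.5 * signal.max()
idx = np.argmax(signal >= thr)
frac = (thr - signal[idx - 1]) / (signal[idx] - signal[idx - 1])
t_start = (idx - 1 + frac) / fs
print(f"integral = {integral:.3e}, start time = {t_start * 1e9:.1f} ns")
```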

  8. Automated image processing of LANDSAT 2 digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    The U.S. Soil Conservation Service (SCS) model for watershed runoff prediction uses soil and land cover information as its major drivers. Kern County Water Agency is implementing the SCS model to predict runoff for 10,400 sq km of mountainous watershed in Kern County, California. The Remote Sensing Unit, University of California, Santa Barbara, was commissioned by KCWA to conduct a 230 sq km feasibility study in the Lake Isabella, California region to evaluate remote sensing methodologies which could be ultimately extrapolated to the entire 10,400 sq km Kern County watershed. Digital results indicate that digital image processing of Landsat 2 data will provide usable land cover required by KCWA for input to the SCS runoff model.
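    The abstract only names the SCS runoff model; for orientation, its standard curve-number relations (US customary units, depths in inches) are reproduced below. The exact formulation used by KCWA may differ.

```latex
% Standard SCS curve-number relations, shown for reference only.
\[
Q \;=\; \frac{(P - I_a)^2}{P - I_a + S} \quad (P > I_a;\ Q = 0 \text{ otherwise}),
\qquad
I_a = 0.2\,S,
\qquad
S = \frac{1000}{\mathrm{CN}} - 10,
\]
% where Q is direct runoff, P is rainfall, I_a is the initial abstraction,
% and the curve number CN is looked up from the soil group and the land
% cover class derived from the imagery.
```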

  9. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  10. Viking image processing. [digital stereo imagery and computer mosaicking

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  11. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
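    A minimal sketch of the multi-scale retinex core is given below, omitting the color restoration step; the scale values are typical choices from the retinex literature and are not necessarily those used in the NASA MSRCR implementation.

```python
# Multi-scale retinex core for a single-channel image in [0, 1]: the average,
# over several Gaussian surround scales, of log(image) - log(surround).
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(img: np.ndarray, sigmas=(15, 80, 250)) -> np.ndarray:
    eps = 1e-6                                   # avoid log(0)
    out = np.zeros_like(img, dtype=float)
    for sigma in sigmas:
        surround = gaussian_filter(img, sigma)   # local illumination estimate
        out += np.log(img + eps) - np.log(surround + eps)
    out /= len(sigmas)
    # Stretch to [0, 1] for display; MSRCR applies gain/offset and color restoration here.
    return (out - out.min()) / (out.max() - out.min() + eps)

# Toy usage with a synthetic, unevenly lit image.
rng = np.random.default_rng(1)
img = np.outer(np.linspace(0.1, 1.0, 256), np.ones(256)) * rng.uniform(0.5, 1.0, (256, 256))
enhanced = multi_scale_retinex(img)
```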

  12. Tunable photonic filters: a digital signal processing design approach.

    PubMed

    Binh, Le Nguyen

    2009-05-20

    Digital signal processing techniques are used for synthesizing tunable optical filters with variable bandwidth and centered reference frequency including the tunability of the low-pass, high-pass, bandpass, and bandstop optical filters. Potential applications of such filters are discussed, and the design techniques and properties of recursive digital filters are outlined. The basic filter structures, namely, the first-order all-pole optical filter (FOAPOF) and the first-order all-zero optical filter (FOAZOF), are described, and finally the design process of tunable optical filters and the designs of the second-order Butterworth low-pass, high-pass, bandpass, and bandstop tunable optical filters are presented. Indeed, we identify that the all-zero and all-pole networks correspond to the well-known optical principles of interference and resonance, respectively. It is thus very straightforward to implement tunable optical filters, which is a unique feature. PMID:19458728
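    The DSP-side starting point of such a design, a second-order Butterworth prototype of the kind the paper maps onto optical structures, can be sketched as below; the normalized cutoff frequencies are illustrative and the optical synthesis step is not shown.

```python
# DSP-side sketch only: second-order Butterworth low-pass and high-pass
# prototypes in normalized frequency; bandpass/bandstop designs follow the
# same pattern with a pair of band edges.
import numpy as np
from scipy.signal import butter, freqz

fs = 1.0                                             # normalized sampling rate
b_lp, a_lp = butter(2, 0.2, btype="low", fs=fs)      # low-pass, cutoff 0.2 (assumed)
b_hp, a_hp = butter(2, 0.2, btype="high", fs=fs)     # high-pass, cutoff 0.2 (assumed)

w, h = freqz(b_lp, a_lp, worN=512, fs=fs)            # frequency response of the low-pass
print("low-pass  b:", np.round(b_lp, 4), "a:", np.round(a_lp, 4))
print("high-pass b:", np.round(b_hp, 4), "a:", np.round(a_hp, 4))
print("low-pass |H| at DC ~", abs(h[0]))
```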

  13. Digital signal processing algorithms for automatic voice recognition

    NASA Technical Reports Server (NTRS)

    Botros, Nazeih M.

    1987-01-01

    Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on the digital signal analysis of the speech signal rather than on linguistic analysis. Several digital signal processing algorithms are available for voice recognition. Some of these algorithms are: Linear Predictive Coding (LPC), Short-time Fourier Analysis, and Cepstrum Analysis. Among these algorithms, LPC is the most widely used. This algorithm has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency domain algorithms with fewer assumptions, but they are not widely implemented or investigated. However, with recent advances in digital technology, namely signal processors, these two frequency domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real time, microprocessor based recognition algorithms.
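    A generic sketch of the autocorrelation-method LPC analysis referred to above is shown below; the frame length, model order, and synthetic test frame are assumptions, and pre-processing steps such as pre-emphasis and endpointing are omitted.

```python
# Autocorrelation-method LPC sketch: solve the Toeplitz normal equations for
# the prediction coefficients of a single windowed frame.
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_coefficients(frame: np.ndarray, order: int = 10) -> np.ndarray:
    """Return prediction coefficients a[1..order] minimizing the residual energy."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]   # autocorrelation r[0..]
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])     # R a = r
    return a

# Toy usage: a damped sinusoid standing in for a voiced speech frame.
n = np.arange(240)
frame = np.sin(2 * np.pi * 0.07 * n) * np.exp(-n / 200) * np.hamming(len(n))
a = lpc_coefficients(frame, order=10)
print("LPC coefficients:", np.round(a, 3))
```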

  14. Audit and Certification Process for Science Data Digital Repositories

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.

    2011-12-01

    Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified and the relationship of these standards to the Open Archive Information System (OAIS) reference model will be shown. Six test audits have been conducted with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.

  15. Digital signal processing based on inverse scattering transform.

    PubMed

    Turitsyna, Elena G; Turitsyn, Sergei K

    2013-10-15

    Through numerical modeling, we illustrate the possibility of a new approach to digital signal processing in coherent optical communications based on the application of the so-called inverse scattering transform. Considering without loss of generality a fiber link with normal dispersion and quadrature phase shift keying signal modulation, we demonstrate how an initial information pattern can be recovered (without direct backward propagation) through the calculation of nonlinear spectral data of the received optical signal. PMID:24321955

  16. Adaptive control technique for accelerators using digital signal processing

    SciTech Connect

    Eaton, L.; Jachim, S.; Natter, E.

    1987-01-01

    The use of present Digital Signal Processing (DSP) techniques can drastically reduce the residual rf amplitude and phase error in an accelerating rf cavity. Accelerator beam loading contributes greatly to this residual error, and the low-level rf field control loops cannot completely absorb the fast transient of the error. A feedforward technique using DSP is required to maintain the very stringent rf field amplitude and phase specifications. 7 refs.

  17. In-Basket Exercises as a Methodology for Studying Information Processing.

    ERIC Educational Resources Information Center

    Dukerich, Janet M.; And Others

    1990-01-01

    Describes in-basket exercises as a research methodology for examining how managers allocate attention and respond to information in their environment. Issues in organizational information processing and decision making to which this methodology might be applicable are discussed, and other methodologies for studying information processing are…

  18. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  19. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus widely used by scientists studying rodent mammary gland morphology. PMID:26910307
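    A hedged sketch of two of the fine-structure measurements named above (branch end-points and branch points), computed from a binary mask of the ductal tree by skeletonization and neighbour counting, is given below; it is a generic recipe, not the authors' exact pipeline.

```python
# Generic branch-structure statistics on a binary mask: skeletonize, then
# classify skeleton pixels by their number of skeleton neighbours.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branch_statistics(mask: np.ndarray) -> dict:
    skel = skeletonize(mask > 0)
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(skel.astype(int), kernel, mode="constant")
    endpoints = np.logical_and(skel, neighbours == 1).sum()      # branch end-points
    branchpoints = np.logical_and(skel, neighbours >= 3).sum()   # branching points
    return {"endpoints": int(endpoints),
            "branchpoints": int(branchpoints),
            "epithelial_area_px": int(mask.sum())}

# Toy usage on a small synthetic cross-shaped structure.
mask = np.zeros((64, 64), dtype=bool)
mask[20:23, 10:54] = True          # horizontal bar
mask[10:54, 30:33] = True          # vertical bar
print(branch_statistics(mask))
```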

  20. Application of digital image processing for the generation of voxels phantoms for Monte Carlo simulation.

    PubMed

    Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A

    2012-01-01

    This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images into voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for axial, coronal and sagittal projections, respectively. PMID:21945017
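    As a hedged illustration of the generic image-to-voxel-model step described above, the sketch below reads a CT series with pydicom, converts it to Hounsfield units, and bins voxels into material indices; the paths, thresholds, and material numbering are assumptions, and writing the Scan2MCNP/MCNP input is not shown.

```python
# Generic CT-to-material-index sketch (not the authors' pipeline).
import glob
import numpy as np
import pydicom

slices = [pydicom.dcmread(f) for f in sorted(glob.glob("ct_series/*.dcm"))]  # hypothetical path
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))    # order slices by z position

volume = np.stack([s.pixel_array for s in slices]).astype(float)
hu = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

# Simple material binning (air / lung / soft tissue / bone); thresholds are illustrative.
bins = [-900, -300, 300]                  # HU thresholds
materials = np.digitize(hu, bins)         # 0=air, 1=lung, 2=soft tissue, 3=bone
print("voxel counts per material:", np.bincount(materials.ravel()))
```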

  1. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not for the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared. PMID:14535661

  2. Digital signal processing (DSP) applications for multiband loudness correction digital hearing aids and cochlear implants.

    PubMed

    Dillier, N; Frölich, T; Kompis, M; Bögli, H; Lai, W K

    1993-01-01

    Single-chip digital signal processors (DSPs) allow the flexible implementation of a large variety of speech analysis, synthesis, and processing algorithms for the hearing impaired. A series of experiments was carried out to optimize parameters of the adaptive beamformer noise reduction algorithm and to evaluate its performance in realistic environments with normal-hearing and hearing-impaired subjects. An experimental DSP system has been used to implement a multiband loudness correction (MLC) algorithm for a digital hearing aid. Speech tests in quiet and noise with 13 users of conventional hearing aids demonstrated significant improvements in discrimination scores with the MLC algorithm. Various speech coding strategies for cochlear implants were implemented in real time on a DSP laboratory speech processor. Improved speech discrimination performance was achieved with high-rate stimulation. Hybrid strategies incorporating speech feature detectors and complex decision algorithms are currently being investigated. PMID:8263833

  3. Instruments and Methodologies for the Underwater Tridimensional Digitization and Data Musealization

    NASA Astrophysics Data System (ADS)

    Repola, L.; Memmolo, R.; Signoretti, D.

    2015-04-01

    In the research started within the SINAPSIS project of the Università degli Studi Suor Orsola Benincasa, an underwater stereoscopic scanning system aimed at surveying submerged archaeological sites, integrable with standard systems for geomorphological detection of the coast, has been developed. The project involves the construction of hardware consisting of an aluminum frame supporting a pair of GoPro Hero Black Edition cameras, and of software for the production of point clouds and the initial processing of data. The software has features for stereoscopic vision system calibration, reduction of noise and distortion in underwater captured images, searching for corresponding points of stereoscopic images using stereo-matching algorithms (dense and sparse), and point cloud generation and filtering. Only after various calibration and survey tests carried out during the excavations envisaged in the project was mastery of the methods for efficient data acquisition achieved. The current development of the system has allowed generation of portions of digital models of real submerged scenes. A semi-automatic procedure for global registration of partial models is under development as a useful aid for the study and musealization of sites.

  4. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM relevant applications for historic building documentation purposes has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  5. Digital metamaterials

    NASA Astrophysics Data System (ADS)

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, ‘0’ and ‘1’, in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call ‘metamaterial bits’, with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental ‘metamaterial bytes’ with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology.
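    The paper derives its own spatial-mixture rules for the two metamaterial bits; purely as a classical point of reference (not the authors' formulation), the Maxwell Garnett effective-medium relation for spherical inclusions of permittivity ε_i at volume fraction f in a host of permittivity ε_h is:

```latex
% Classical Maxwell Garnett mixing rule, shown only as background for the
% idea of synthesizing an effective permittivity from two constituents.
\[
\frac{\varepsilon_{\mathrm{eff}} - \varepsilon_h}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_h}
\;=\;
f \, \frac{\varepsilon_i - \varepsilon_h}{\varepsilon_i + 2\varepsilon_h}.
\]
```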

  6. Digital metamaterials.

    PubMed

    Della Giovampaola, Cristian; Engheta, Nader

    2014-12-01

    Balancing complexity and simplicity has played an important role in the development of many fields in science and engineering. One of the well-known and powerful examples of such balance can be found in Boolean algebra and its impact on the birth of digital electronics and the digital information age. The simplicity of using only two numbers, '0' and '1', in a binary system for describing an arbitrary quantity made the fields of digital electronics and digital signal processing powerful and ubiquitous. Here, inspired by the binary concept, we propose to develop the notion of digital metamaterials. Specifically, we investigate how one can synthesize an electromagnetic metamaterial with a desired permittivity, using as building blocks only two elemental materials, which we call 'metamaterial bits', with two distinct permittivity functions. We demonstrate, analytically and numerically, how proper spatial mixtures of such metamaterial bits lead to elemental 'metamaterial bytes' with effective material parameters that are different from the parameters of the metamaterial bits. We then apply this methodology to several design examples of optical elements, such as digital convex lenses, flat graded-index digital lenses, digital constructs for epsilon-near-zero (ENZ) supercoupling and digital hyperlenses, thus highlighting the power and simplicity of the methodology. PMID:25218061

  7. A methodology for exploiting parallelism in the finite element process

    NASA Technical Reports Server (NTRS)

    Adams, L. M.; Voigt, R. G.

    1983-01-01

    A methodology is described for developing a parallel system using a top down approach taking into account the requirements of the user. Substructuring, a popular technique in structural analysis, is used to illustrate this approach.

  8. An overall digital processing architecture for an autonomous spacecraft

    NASA Astrophysics Data System (ADS)

    Fernandez, M.

    Attention is given to an autonomous spacecraft's digital processing system architecture. This architecture has as its task the provision of a system whose assets can systematically degrade, as limitations on self-healing capabilities are reached by the distributed architecture. The architecture possesses high availability and reliability for the postulated data processing problem, and is noted to be affordable in such respects as onboard weight, power requirements, and cost. Attention is given to the advantages of the proposed architecture, by comparison to conventional approaches, in the areas of fault coverage, functional availability, and 'graceful' degradation.

  9. Digital system for monitoring and controlling remote processes

    NASA Astrophysics Data System (ADS)

    Roach, Dennis P.

    The need to operate increasingly complex and potentially hazardous facilities at higher degrees of efficiency can be met through the development of automated process control systems. The availability of microcomputers capable of interfacing to data acquisition and control equipment results in the possibility of developing such systems at low investment costs. An automated control system is described which maintains a constant or time varying pressure in a pressure vessel. Process control data acquisition and analysis is carried out using a commercially available microcomputer and data scanner interface device. In this system, a computer interface is developed to allow precision positioning of custom designed proportional valves. Continuous real time process control is achieved through a direct digital control algorithm. The advantages to be gained by adapting this system to other process control applications are discussed. The modular design and ability of this system to operate many types of hardware control mechanisms make it adaptable to a wide variety of industrial applications.

  10. Programmable rate modem utilizing digital signal processing techniques

    NASA Technical Reports Server (NTRS)

    Bunya, George K.; Wallace, Robert L.

    1989-01-01

    The engineering development study to follow was written to address the need for a Programmable Rate Digital Satellite Modem capable of supporting both burst and continuous transmission modes with either binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) modulation. The preferred implementation technique is an all digital one which utilizes as much digital signal processing (DSP) as possible. Here design tradeoffs in each portion of the modulator and demodulator subsystem are outlined, and viable circuit approaches which are easily repeatable, have low implementation losses and have low production costs are identified. The research involved for this study was divided into nine technical papers, each addressing a significant region of concern in a variable rate modem design. Trivial portions and basic support logic designs surrounding the nine major modem blocks were omitted. In brief, the nine topic areas were: (1) Transmit Data Filtering; (2) Transmit Clock Generation; (3) Carrier Synthesizer; (4) Receive AGC; (5) Receive Data Filtering; (6) RF Oscillator Phase Noise; (7) Receive Carrier Selectivity; (8) Carrier Recovery; and (9) Timing Recovery.

  11. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  12. Measurements methodology for evaluation of Digital TV operation in VHF high-band

    NASA Astrophysics Data System (ADS)

    Pudwell Chaves de Almeida, M.; Vladimir Gonzalez Castellanos, P.; Alfredo Cal Braz, J.; Pereira David, R.; Saboia Lima de Souza, R.; Pereira da Soledade, A.; Rodrigues Nascimento Junior, J.; Ferreira Lima, F.

    2016-07-01

    This paper describes the experimental setup of field measurements carried out to evaluate the operation of the ISDB-TB (Integrated Services Digital Broadcasting, Terrestrial, Brazilian version) digital TV standard in the VHF high-band. Measurements were performed in urban and suburban areas in a medium-sized Brazilian city. Besides the direct measurements of received power and environmental noise, a measurement procedure involving the injection of Gaussian additive noise was employed to achieve the signal to noise ratio threshold at each measurement site. The analysis includes results of static reception measurements for evaluating the received field strength and the signal to noise ratio thresholds for correct signal decoding.

  13. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  14. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  15. Perspectives on Learning: Methodologies for Exploring Learning Processes and Outcomes

    ERIC Educational Resources Information Center

    Goldman, Susan R.

    2014-01-01

    The papers in this Special Issue were initially prepared for an EARLI 2013 Symposium that was designed to examine methodologies in use by researchers from two sister communities, Learning and Instruction and Learning Sciences. The four papers reflect a common ground in advances in conceptions of learning since the early days of the "cognitive…

  16. Recent developments in digital image processing at the Image Processing Laboratory of JPL.

    NASA Technical Reports Server (NTRS)

    O'Handley, D. A.

    1973-01-01

    Review of some of the computer-aided digital image processing techniques recently developed. Special attention is given to mapping and mosaicking techniques and to preliminary developments in range determination from stereo image pairs. The discussed image processing utilization areas include space, biomedical, and robotic applications.

  17. Digital signal processor and programming system for parallel signal processing

    SciTech Connect

    Van den Bout, D.E.

    1987-01-01

    This thesis describes an integrated assault upon the problem of designing high-throughput, low-cost digital signal-processing systems. The dual prongs of this assault consist of: (1) the design of a digital signal processor (DSP) which efficiently executes signal-processing algorithms in either a uniprocessor or multiprocessor configuration, (2) the PaLS programming system which accepts an arbitrary algorithm, partitions it across a group of DSPs, synthesizes an optimal communication link topology for the DSPs, and schedules the partitioned algorithm upon the DSPs. The results of applying a new quasi-dynamic analysis technique to a set of high-level signal-processing algorithms were used to determine the uniprocessor features of the DSP design. For multiprocessing applications, the DSP contains an interprocessor communications port (IPC) which supports simple, flexible, dataflow communications while allowing the total communication bandwidth to be incrementally allocated to achieve the best link utilization. The net result is a DSP with a simple architecture that is easy to program for both uniprocessor and multi-processor modes of operation. The PaLS programming system simplifies the task of parallelizing an algorithm for execution upon a multiprocessor built with the DSP.

  18. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our possibilities to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  19. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade thermochromic liquid crystals (TLC) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, and silicone oils) in very small quantities to use as thermal and hydrodynamic tracers. In biomedical situations, e.g., skin diseases, breast cancer, blood circulation, and other medical applications, TLC and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are also presented.

  20. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  1. Digital Image Processing Overview For Helmet Mounted Displays

    NASA Astrophysics Data System (ADS)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.

  2. Applying of digital signal processing to optical equisignal zone system

    NASA Astrophysics Data System (ADS)

    Maraev, Anton A.; Timofeev, Aleksandr N.; Gusarov, Vadim F.

    2015-05-01

    In this work we assess the application of array detectors and digital information processing to a system with an optical equisignal zone as a new method of evaluating the optical equisignal zone position. Peculiarities of optical equisignal zone formation are described. The algorithm for evaluating the optical equisignal zone position is applied to processing on the array detector. This algorithm enables evaluation of both the lateral displacement and the turning angles of the receiver relative to the projector. The interrelation of the parameters of the projector and the receiver is considered. According to the described principles, an experimental setup was built and then characterized. The accuracy of equisignal zone position evaluation is shown to depend on the size of the equivalent entrance pupil used in processing.

  3. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  4. Incremental terrain processing for large digital elevation models

    NASA Astrophysics Data System (ADS)

    Ye, Z.

    2012-12-01

    Efficient analyses of large digital elevation models (DEM) require generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEMs to analyze have a large number of grid cells (usually > 1,000,000,000) the generation of these DEM derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned in smaller processing units that can be efficiently processed. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are to be derived and the order in which they are to be processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront due to size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis, i.e., the flow accumulation operation, to each of the processing units at the full resolution DEM. (5) As each processing unit is processed based on the processing order defined
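    For orientation, the sketch below computes one of the per-cell DEM derivatives mentioned above (D8 flow direction) for a tiny in-memory grid; the grid values and the simple edge handling are assumptions, and flat/sink resolution, partitioning and mosaicking are not shown.

```python
# Minimal D8 flow-direction sketch: each cell drains toward the neighbour
# with the steepest downhill drop, encoded with ESRI-style direction codes.
import numpy as np

# 8 neighbour offsets (row, col) and their D8 direction codes.
OFFSETS = [(-1, 0, 64), (-1, 1, 128), (0, 1, 1), (1, 1, 2),
           (1, 0, 4), (1, -1, 8), (0, -1, 16), (-1, -1, 32)]

def d8_flow_direction(dem: np.ndarray) -> np.ndarray:
    rows, cols = dem.shape
    direction = np.zeros_like(dem, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_drop, best_code = 0.0, 0
            for dr, dc, code in OFFSETS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best_drop, best_code = drop, code
            direction[r, c] = best_code      # 0 marks a pit or flat cell
    return direction

dem = np.array([[9, 8, 7],
                [8, 6, 5],
                [7, 5, 3]], dtype=float)
print(d8_flow_direction(dem))
```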

  5. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  6. Naturalistic observation of health-relevant social processes: the electronically activated recorder methodology in psychosomatics.

    PubMed

    Mehl, Matthias R; Robbins, Megan L; Deters, Fenne Große

    2012-05-01

    This article introduces a novel observational ambulatory monitoring method called the electronically activated recorder (EAR). The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants' momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people's days as they naturally unfold. In sampling only a fraction of the time, it protects participants' privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer's account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, subtle emotional expressions). This article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior; (b) provide ecological observational measures of health-related social processes that are independent of self-report; and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional self-report-based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential aspects (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  7. Naturalistic Observation of Health-Relevant Social Processes: The Electronically Activated Recorder (EAR) Methodology in Psychosomatics

    PubMed Central

    Mehl, Matthias R.; Robbins, Megan L.; Deters, Fenne große

    2012-01-01

    This article introduces a novel, observational ambulatory monitoring method called the Electronically Activated Recorder or EAR. The EAR is a digital audio recorder that runs on a handheld computer and periodically and unobtrusively records snippets of ambient sounds from participants’ momentary environments. In tracking moment-to-moment ambient sounds, it yields acoustic logs of people’s days as they naturally unfold. In sampling only a fraction of the time, it protects participants’ privacy and makes large observational studies feasible. As a naturalistic observation method, it provides an observer’s account of daily life and is optimized for the objective assessment of audible aspects of social environments, behaviors, and interactions (e.g., habitual preferences for social settings, idiosyncratic interaction styles, and subtle emotional expressions). The article discusses the EAR method conceptually and methodologically, reviews prior research with it, and identifies three concrete ways in which it can enrich psychosomatic research. Specifically, it can (a) calibrate psychosocial effects on health against frequencies of real-world behavior, (b) provide ecological, observational measures of health-related social processes that are independent of self-report, and (c) help with the assessment of subtle and habitual social behaviors that evade self-report but have important health implications. An important avenue for future research lies in merging traditional, self-report based ambulatory monitoring methods with observational approaches such as the EAR to allow for the simultaneous yet methodologically independent assessment of inner, experiential (e.g., loneliness) and outer, observable aspects (e.g., social isolation) of real-world social processes to reveal their unique effects on health. PMID:22582338

  8. Coherent detection and digital signal processing for fiber optic communications

    NASA Astrophysics Data System (ADS)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software-based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated by a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has better linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
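    The quasi-exact compensation of chromatic dispersion mentioned above can be illustrated with a small numpy sketch that applies the inverse of the fiber's quadratic spectral phase in the frequency domain; the parameter values (symbol rate, beta2, link length) are illustrative, and the sign of the exponent may need to be flipped depending on the Fourier convention used.

```python
import numpy as np

def cd_compensate(samples, sample_rate, beta2, length):
    """Remove chromatic dispersion by applying the inverse of the fiber's
    quadratic spectral phase exp(-1j * beta2/2 * w^2 * L) in the frequency domain.
    The sign depends on the chosen Fourier convention; flip it if needed."""
    n = samples.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / sample_rate)   # angular frequency grid
    inverse_phase = np.exp(1j * (beta2 / 2.0) * w ** 2 * length)
    return np.fft.ifft(np.fft.fft(samples) * inverse_phase)

# Illustrative parameters: 28 GBd signal sampled at 2 samples/symbol,
# beta2 ~ -21.7 ps^2/km for standard single-mode fiber, 1000 km link.
fs = 56e9
beta2 = -21.7e-27        # s^2/m
link = 1000e3            # m
rx = np.random.randn(4096) + 1j * np.random.randn(4096)   # stand-in received field samples
equalized = cd_compensate(rx, fs, beta2, link)
```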

  9. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
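    A toy Python sketch of the phase-wrapping pattern described above (preparation, submission, checking, quality control), with hypothetical pipeline and handler names; it only illustrates how pipelines can be chained so that each phase must succeed before the next runs.

```python
# Hypothetical sketch of chaining pipelines through fixed phases, in the spirit
# of the factory described above (names and phases are illustrative).
PHASES = ["prepare", "submit", "check", "quality_control"]

def run_pipeline(name, handlers):
    """Run one pipeline's phases in order; stop if any phase reports failure."""
    for phase in PHASES:
        ok = handlers[phase](name)
        print(f"{name}: {phase} -> {'ok' if ok else 'FAILED'}")
        if not ok:
            return False
    return True

def run_factory(pipelines):
    """Chain pipelines so each starts only after the previous one succeeds."""
    for name, handlers in pipelines:
        if not run_pipeline(name, handlers):
            break

# Example wiring with trivial stand-in handlers.
noop = {p: (lambda name: True) for p in PHASES}
run_factory([("imaging", noop), ("astrometry", noop), ("spectro-1d", noop)])
```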

  10. On the Development of Arabic Three-Digit Number Processing in Primary School Children

    ERIC Educational Resources Information Center

    Mann, Anne; Moeller, Korbinian; Pixner, Silvia; Kaufmann, Liane; Nuerk, Hans-Christoph

    2012-01-01

    The development of two-digit number processing in children, and in particular the influence of place-value understanding, has recently received increasing research interest. However, place-value influences leading to decomposed processing have not yet been investigated for multi-digit numbers beyond the two-digit number range in children.…

  11. Microcomputer-based digital image processing - A tutorial package for exploration geologists

    NASA Technical Reports Server (NTRS)

    Harrington, J. A., Jr.; Cartin, K. F.

    1985-01-01

    An Apple II microcomputer-based software package for analysis of digital data developed at the University of Oklahoma, the Digital Image Analysis System (DIAS), provides a relatively low-cost, portable alternative to large, dedicated minicomputers for digital image processing education. Digital processing techniques for analysis of Landsat MSS data and a series of tutorial exercises for exploration geologists are described and evaluated. DIAS allows in-house training that does not interfere with computer-based prospect analysis objectives.

  12. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395
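    A minimal pandas sketch (with hypothetical column and zone names) of the first step such a methodology needs: turning raw indoor-location readings into the case/activity/timestamp event log that process mining tools consume, where each patient's move into a new zone becomes one event.

```python
import pandas as pd

# Hypothetical raw readings from an indoor location system:
# one row per (patient tag, zone, time) observation.
raw = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "zone":       ["reception", "pre-op", "operating_room", "reception", "pre-op"],
    "timestamp":  pd.to_datetime([
        "2015-02-03 08:00", "2015-02-03 08:40", "2015-02-03 09:10",
        "2015-02-03 08:15", "2015-02-03 09:00"]),
})

# Keep only zone changes per patient: each change becomes one event
# (case id, activity, timestamp) -- the standard process-mining event log.
raw = raw.sort_values(["patient_id", "timestamp"])
changed = raw["zone"] != raw.groupby("patient_id")["zone"].shift()
event_log = raw[changed].rename(columns={"patient_id": "case_id", "zone": "activity"})
print(event_log)
```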

  13. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  14. The Digital Fields Board for the FIELDS instrument suite on the Solar Probe Plus mission: Analog and digital signal processing

    NASA Astrophysics Data System (ADS)

    Malaspina, David M.; Ergun, Robert E.; Bolton, Mary; Kien, Mark; Summers, David; Stevens, Ken; Yehle, Alan; Karlsson, Magnus; Hoxie, Vaughn C.; Bale, Stuart D.; Goetz, Keith

    2016-06-01

    The first in situ measurements of electric and magnetic fields in the near-Sun environment (< 0.25 AU from the Sun) will be made by the FIELDS instrument suite on the Solar Probe Plus mission. The Digital Fields Board (DFB) is an electronics board within FIELDS that performs analog and digital signal processing, as well as digitization, for signals between DC and 60 kHz from five voltage sensors and four search coil magnetometer channels. These nine input signals are processed on the DFB into 26 analog data streams. A specialized application-specific integrated circuit performs analog to digital conversion on all 26 analog channels simultaneously. The DFB then processes the digital data using a field programmable gate array (FPGA), generating a variety of data products, including digitally filtered continuous waveforms, high-rate burst capture waveforms, power spectra, cross spectra, band-pass filter data, and several ancillary products. While the data products are optimized for encounter-based mission operations, they are also highly configurable, a key design aspect for a mission of exploration. This paper describes the analog and digital signal processing used to ensure that the DFB produces high-quality science data, using minimal resources, in the challenging near-Sun environment.
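    Two of the data-product types mentioned above, an averaged power spectrum and a band-pass filtered channel, can be illustrated offline with a short scipy sketch; the sample rate, band edges, and filter orders are illustrative and do not reflect the actual DFB implementation.

```python
import numpy as np
from scipy import signal

fs = 150e3                                   # illustrative sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 7e3 * t) + 0.1 * np.random.randn(t.size)   # stand-in waveform

# Averaged power spectrum (Welch's method): analogous to an on-board spectral product.
freqs, psd = signal.welch(x, fs=fs, nperseg=4096)

# Band-pass filter data product: isolate a 5-10 kHz band with a Butterworth filter.
sos = signal.butter(4, [5e3, 10e3], btype="bandpass", fs=fs, output="sos")
band = signal.sosfiltfilt(sos, x)
```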

  15. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS database management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management, and display capabilities of the research computer system are described, showing how they combine to provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  16. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S., Jr.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
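    The slant-range correction mentioned above can be illustrated with a minimal numpy sketch that assumes a flat seafloor and a known height of the towfish above it; the actual GLORIA processing applies its own geometry and resampling.

```python
import numpy as np

def slant_to_ground_range(slant_ranges, water_depth):
    """Convert slant ranges to ground ranges for a flat seafloor:
    ground = sqrt(slant^2 - depth^2). Returns NaN inside the water-column zone."""
    slant_ranges = np.asarray(slant_ranges, dtype=float)
    ground = np.full_like(slant_ranges, np.nan)
    valid = slant_ranges >= water_depth
    ground[valid] = np.sqrt(slant_ranges[valid] ** 2 - water_depth ** 2)
    return ground

# One across-track line of pixel slant ranges (metres), towfish 400 m above the seafloor.
slant = np.linspace(0, 22500, 1024)
ground = slant_to_ground_range(slant, water_depth=400.0)
```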

  17. Intranets and Digital Organizational Information Resources: Towards a Portable Methodology for Design and Development.

    ERIC Educational Resources Information Center

    Rosenbaum, Howard

    1997-01-01

    Discusses the concept of the intranet, comparing and contrasting it with groupware, and presents an argument for its value based on technical and information management considerations. Presents an intranet development project for an academic organization and describes a portable, user-centered and team-based methodology for the design and…

  18. A Digital Ecosystem for the Collaborative Production of Open Textbooks: The LATIn Methodology

    ERIC Educational Resources Information Center

    Silveira, Ismar Frango; Ochôa, Xavier; Cuadros-Vargas, Alex; Pérez Casas, Alén; Casali, Ana; Ortega, Andre; Sprock, Antonio Silva; Alves, Carlos Henrique; Collazos Ordoñez, Cesar Alberto; Deco, Claudia; Cuadros-Vargas, Ernesto; Knihs, Everton; Parra, Gonzalo; Muñoz-Arteaga, Jaime; Gomes dos Santos, Jéssica; Broisin, Julien; Omar, Nizam; Motz, Regina; Rodés, Virginia; Bieliukas, Yosly Hernández C.

    2013-01-01

    Access to books in higher education is an issue to be addressed, especially in the context of underdeveloped countries, such as those in Latin America. More than just financial issues, cultural aspects and need for adaptation must be considered. The present conceptual paper proposes a methodology framework that would support collaborative open…

  19. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
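    A toy scipy sketch in the same spirit as the approach described above (suppress the slowly varying background with a morphological operator, then threshold and label bright candidate regions); it is not the authors' algorithm, and the structuring-element size and threshold are illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_bright_lesions(image, structure_size=15, k=3.0):
    """Toy detector for bright, compact structures (masses/microcalcifications):
    white top-hat to suppress the slowly varying background tissue, then a
    mean + k*std intensity threshold, then connected-component labeling."""
    tophat = ndimage.white_tophat(image, size=structure_size)
    mask = tophat > tophat.mean() + k * tophat.std()
    labels, n_regions = ndimage.label(mask)
    return labels, n_regions

# Stand-in mammogram: smooth background plus two small bright spots.
img = ndimage.gaussian_filter(np.random.rand(256, 256), 20)
img[100:104, 100:104] += 1.0
img[200:202, 50:52] += 1.0
labels, n = detect_bright_lesions(img)
print(f"{n} candidate regions found")
```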

  20. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also selected via the keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.

  1. Digital Signal Processing and Control for the Study of Gene Networks

    PubMed Central

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828
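    A minimal sketch of the idea, assuming a first-order discrete-time gene expression model x[k+1] = a*x[k] + b*u[k] and a simple digital PI controller that drives expression toward a set point; the model and gains are illustrative and not taken from the article.

```python
import numpy as np

# First-order discrete-time gene expression model: x[k+1] = a*x[k] + b*u[k],
# where x is the protein level and u is an external input (e.g., inducer dose).
a, b = 0.95, 0.08
setpoint = 1.0
kp, ki = 2.0, 0.5            # illustrative PI controller gains

x, integral = 0.0, 0.0
trajectory = []
for k in range(200):
    error = setpoint - x
    integral += error
    u = max(0.0, kp * error + ki * integral)   # inducer dose cannot be negative
    x = a * x + b * u
    trajectory.append(x)

print(f"final expression level: {trajectory[-1]:.3f}")
```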

  2. Digital Signal Processing and Control for the Study of Gene Networks.

    PubMed

    Shin, Yong-Jun

    2016-01-01

    Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks. PMID:27102828

  3. Digital Transformation of Words in Learning Processes: A Critical View.

    ERIC Educational Resources Information Center

    Saga, Hiroo

    1999-01-01

    Presents some negative aspects of society's dependence on digital transformation of words by referring to works by Walter Ong and Martin Heidegger. Discusses orality, literacy and digital literacy and describes three aspects of the digital transformation of words. Compares/contrasts art with technology and discusses implications for education.…

  4. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.

  5. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung ratio was calculated from the SPECT image information. There were no statistically significant differences between the heart/lung ratio values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.
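    A small numpy sketch of the heart/lung ratio computation described above, assuming the SPECT volume and binary ROI masks (drawn with the co-registered CT as a spatial reference) are already available as arrays; the data below are synthetic stand-ins.

```python
import numpy as np

def heart_lung_ratio(spect_volume, heart_mask, lung_mask):
    """Mean counts inside the heart ROI divided by mean counts inside the lung ROI."""
    return spect_volume[heart_mask].mean() / spect_volume[lung_mask].mean()

# Stand-in 3-D SPECT volume and ROI masks (in practice defined on the CT reference).
volume = np.random.poisson(lam=20, size=(64, 64, 64)).astype(float)
heart = np.zeros(volume.shape, dtype=bool); heart[28:36, 28:36, 28:36] = True
lung = np.zeros(volume.shape, dtype=bool);  lung[10:20, 10:20, 10:20] = True
volume[heart] *= 2.6        # simulate higher cardiac uptake
print(f"heart/lung ratio: {heart_lung_ratio(volume, heart, lung):.2f}")
```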

  6. Towards a Methodology for Representing and Classifying Business Processes

    NASA Astrophysics Data System (ADS)

    Mili, Hafedh; Leshob, Abderrahmane; Lefebvre, Eric; Lévesque, Ghislain; El-Boussaidi, Ghizlane

    Organizations build information systems to support their business processes. Some of these business processes are industry or organization-specific, but most are common to many industries and are used, modulo a few modifications, in different contexts. A precise modeling of such processes would seem to be a necessary prerequisite for building information systems that are aligned with the business objectives of the organization and that fulfill the functional requirements of its users. Yet, there are few tools, conceptual or otherwise, that enable organizations to model their business processes precisely and efficiently, and fewer tools still, to map such process models to the software components that are needed to support them. Our work deals with the problem of building tools to model business processes precisely, and to help map such models to software models. In this paper, we describe a representation and classification of business processes that supports the specification of organization-specific processes by, 1) navigating a repository of generic business processes, and 2) automatically generating new process variants to accommodate the specifics of the organization. We present the principles underlying our approach, and describe the state of an ongoing implementation.

  7. Digital Image Processing for Noise Reduction in Medical Ultrasonics

    NASA Astrophysics Data System (ADS)

    Loupas, Thanasis

    Available from UMI in association with The British Library. Requires signed TDF. The purpose of this project was to investigate the application of digital image processing techniques as a means of reducing noise in medical ultrasonic imaging. Ultrasonic images suffer primarily from a type of acoustic noise, known as speckle, which is generally regarded as a major source of image quality degradation. The origin of speckle, its statistical properties as well as methods suggested to eliminate this artifact were reviewed. A simple model which can characterize the statistics of speckle on displays was also developed. A large number of digital noise reduction techniques was investigated. These include frame averaging techniques performed by commercially available devices and spatial filters implemented in software. Among the latter, some filters have been proposed in the scientific literature for ultrasonic, laser and microwave speckle or general noise suppression and the rest are original, developed specifically to suppress ultrasonic speckle. Particular emphasis was placed on adaptive techniques which adjust the processing performed at each point according to the local image content. In this way, they manage to suppress speckle with negligible loss of genuine image detail. Apart from preserving the diagnostically significant features of a scan another requirement a technique must satisfy before it is accepted in routine clinical practice is real-time operation. A spatial filter capable of satisfying both these requirements was designed and built in hardware using low-cost and readily available components. The possibility of incorporating all the necessary filter circuitry into a single VLSI chip was also investigated. In order to establish the effectiveness and usefulness of speckle suppression, a representative sample from the techniques examined here was applied to a large number of abdominal scans and their effect on image quality was evaluated. Finally, further
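    One representative adaptive local-statistics technique, a Lee-type filter, can be sketched in a few lines of scipy; it is shown only as an illustration of the adaptive filters discussed above, not as the specific filters developed in the thesis, and the window size and noise-variance parameter are illustrative.

```python
import numpy as np
from scipy import ndimage

def lee_filter(image, window=7, noise_var=0.05):
    """Adaptive Lee-type speckle filter: smooth strongly where the local variance
    looks like pure speckle, and preserve detail where local variance is high."""
    mean = ndimage.uniform_filter(image, window)
    mean_sq = ndimage.uniform_filter(image ** 2, window)
    var = mean_sq - mean ** 2
    gain = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0.0, 1.0)
    return mean + gain * (image - mean)

# Stand-in ultrasound frame: smooth structure corrupted by multiplicative speckle.
rng = np.random.default_rng(0)
clean = ndimage.gaussian_filter(rng.random((256, 256)), 10)
speckled = clean * rng.exponential(1.0, clean.shape)
despeckled = lee_filter(speckled)
```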

  8. Irdis: A Digital Scene Storage And Processing System For Hardware-In-The-Loop Missile Testing

    NASA Astrophysics Data System (ADS)

    Sedlar, Michael F.; Griffith, Jerry A.

    1988-07-01

    This paper describes the implementation of a Seeker Evaluation and Test Simulation (SETS) Facility at Eglin Air Force Base. This facility will be used to evaluate imaging infrared (IIR) guided weapon systems by performing various types of laboratory tests. One such test is termed Hardware-in-the-Loop (HIL) simulation (Figure 1) in which the actual flight of a weapon system is simulated as closely as possible in the laboratory. As shown in the figure, there are four major elements in the HIL test environment; the weapon/sensor combination, an aerodynamic simulator, an imagery controller, and an infrared imagery system. The paper concentrates on the approaches and methodologies used in the imagery controller and infrared imaging system elements for generating scene information. For procurement purposes, these two elements have been combined into an Infrared Digital Injection System (IRDIS) which provides scene storage, processing, and output interface to drive a radiometric display device or to directly inject digital video into the weapon system (bypassing the sensor). The paper describes in detail how standard and custom image processing functions have been combined with off-the-shelf mass storage and computing devices to produce a system which provides high sample rates (greater than 90 Hz), a large terrain database, high weapon rates of change, and multiple independent targets. A photo based approach has been used to maximize terrain and target fidelity, thus providing a rich and complex scene for weapon/tracker evaluation.

  9. Unfolding-synthesis technique for digital pulse processing. Part 1: Unfolding

    NASA Astrophysics Data System (ADS)

    Jordanov, Valentin T.

    2016-01-01

    The unfolding-synthesis technique is used in the development of digital pulse processing systems used in radiation measurements. This technique is applied to digital signals obtained by digitization of analog signals that represent the combined response of the radiation detectors and the associated signal conditioning electronics. The salient features of the unfolding-synthesis technique are first the unfolding of the digital signals into unit impulses, followed by the synthesis of digital signal processing systems with unit impulse responses equivalent to the desired pulse shapes. Part 1 of this paper covers the unfolding part of this technique.
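    A simplified numpy illustration of the unfolding-synthesis idea for exponentially decaying detector pulses (not the paper's exact algorithm): deconvolving the known exponential response unfolds each pulse into an approximate unit impulse scaled by its amplitude, and convolving the unfolded train with a desired finite shape, here a trapezoid, performs the synthesis.

```python
import numpy as np

# Simulated digitized signal: two overlapping exponential pulses plus a little noise.
n, tau = 600, 50.0
decay = np.exp(-1.0 / tau)
t = np.arange(n)
x = 1.0 * np.exp(-(t - 100) / tau) * (t >= 100) \
  + 0.6 * np.exp(-(t - 130) / tau) * (t >= 130)
x += 0.002 * np.random.randn(n)

# Unfolding: deconvolve the exponential response, leaving (near) unit impulses
# at the pulse arrival times, scaled by the pulse amplitudes.
impulses = x - decay * np.concatenate(([0.0], x[:-1]))

# Synthesis: convolve the unfolded impulses with a desired finite impulse response,
# here a simple trapezoid (rise = 20 samples, flat top = 10 samples).
rise, flat = 20, 10
trapezoid = np.concatenate((np.linspace(0, 1, rise, endpoint=False),
                            np.ones(flat),
                            np.linspace(1, 0, rise, endpoint=False)))
shaped = np.convolve(impulses, trapezoid)[:n]
```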

  10. Development of next generation digital flat panel catheterization system: design principles and validation methodology

    NASA Astrophysics Data System (ADS)

    Belanger, B.; Betraoui, F.; Dhawale, P.; Gopinath, P.; Tegzes, Pal; Vagvolgyi, B.

    2006-03-01

    The design principles that drove the development of a new cardiovascular x-ray digital flat panel (DFP) detector system are presented, followed by assessments of imaging and dose performance achieved relative to other state of the art FPD systems. The new system (GE Innova 2100 IQ™) incorporates a new detector with substantially improved DQE at fluoroscopic (73%@1μR) and record (79%@114μR) doses, an x-ray tube with higher continuous fluoro power (3.2kW), a collimator with a wide range of copper spectral filtration (up to 0.9mm), and an improved automatic x-ray exposure management system. The performance of this new system was compared to that of the previous generation GE product (Innova 2000) and to state-of-the-art cardiac digital x-ray flat panel systems from two other major manufacturers. Performance was assessed with the industry standard Cardiac X-ray NEMA/SCA&I phantom, and a new moving coronary artery stent (MCAS) phantom, designed to simulate cardiac clinical imaging conditions, composed of an anthropomorphic chest section with stents moving in a manner simulating normal coronary arteries. The NEMA/SCA&I phantom results showed the Innova 2100 IQ to exceed or equal the Innova 2000 in all of the performance categories, while operating at 28% lower dose on average, and to exceed the other DFP systems in most of the performance categories. The MCAS phantom tests showed the Innova 2100 IQ to be significantly better (p << 0.05) than the Innova 2000, and significantly better than the other DFP systems in most cases at comparable or lower doses, thereby verifying excellent performance against design goals.

  11. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  12. Digital signal processing techniques for coherent optical communication

    NASA Astrophysics Data System (ADS)

    Goldfarb, Gilad

    Coherent detection with subsequent digital signal processing (DSP) is developed, analyzed theoretically and numerically and experimentally demonstrated in various fiber-optic transmission scenarios. The use of DSP in conjunction with coherent detection unleashes the benefits of coherent detection which rely on the preservation of full information of the incoming field. These benefits include high receiver sensitivity, the ability to achieve high spectral-efficiency and the use of advanced modulation formats. With the immense advancements in DSP speeds, many of the problems hindering the use of coherent detection in optical transmission systems have been eliminated. Most notably, DSP alleviates the need for hardware phase-locking and polarization tracking, which can now be achieved in the digital domain. The complexity previously associated with coherent detection is hence significantly diminished and coherent detection is once again considered a feasible detection alternative. In this thesis, several aspects of coherent detection (with or without subsequent DSP) are addressed. Coherent detection is presented as a means to extend the dispersion limit of a duobinary signal using an analog decision-directed phase-lock loop. Analytical bit-error ratio estimation for quadrature phase-shift keying signals is derived. To validate the promise for high spectral efficiency, the orthogonal-wavelength-division multiplexing scheme is suggested. In this scheme the WDM channels are spaced at the symbol rate, thus achieving the spectral efficiency limit. Theory, simulation and experimental results demonstrate the feasibility of this approach. Infinite impulse response filtering is shown to be an efficient alternative to finite impulse response filtering for chromatic dispersion compensation. Theory, design considerations, simulation and experimental results relating to this topic are presented. Interaction between fiber dispersion and nonlinearity remains the last major challenge

  13. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers applicable to digitized multispectral aerial photography are described, with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four band multispectral classification into healthy and declining categories.

  14. Social Information Processing, Emotions, and Aggression: Conceptual and Methodological Contributions of the Special Section Articles

    ERIC Educational Resources Information Center

    Arsenio, William F.

    2010-01-01

    This discussion summarizes some of the key conceptual and methodological contributions of the four articles in this special section on social information processing (SIP) and aggression. One major contribution involves the new methodological tools these studies provide for future researchers. Eye-tracking and mood induction techniques will make it…

  15. Digital Signal Processing Techniques for the GIFTS SM EDU

    NASA Technical Reports Server (NTRS)

    Tian, Jialin; Reisse, Robert A.; Gazarik, Michael J.

    2007-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes several digital signal processing (DSP) techniques involved in the development of the calibration model. In the first stage, the measured raw interferograms must undergo a series of processing steps that include filtering, decimation, and detector nonlinearity correction. The digital filtering is achieved by employing a linear-phase even-length FIR complex filter that is designed based on the optimum equiripple criteria. Next, the detector nonlinearity effect is compensated for using a set of pre-determined detector response characteristics. In the next stage, a phase correction algorithm is applied to the decimated interferograms. This is accomplished by first estimating the phase function from the spectral phase response of the windowed interferogram, and then correcting the entire interferogram based on the estimated phase function. In the calibration stage, we first compute the spectral responsivity based on the previous results and the ideal Planck blackbody spectra at the given temperatures, from which the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. In the post-calibration stage, we estimate the Noise Equivalent Spectral Radiance (NESR) from the calibrated ABB and HBB spectra. The NESR is generally considered a measure of the instrument noise performance, and can be estimated as
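    The calibration stage described above can be illustrated with a compact numpy sketch of the standard two-point (ambient/hot blackbody) radiometric calibration using the Planck function; the temperatures, band, and fake instrument gain are illustrative, and the complex-spectrum and phase handling of the real GIFTS processing is omitted.

```python
import numpy as np

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck_radiance(wavenumber_cm, temp_k):
    """Planck spectral radiance as a function of wavenumber (given in cm^-1)."""
    nu = np.asarray(wavenumber_cm) * 100.0              # convert to m^-1
    return (2 * H * C ** 2 * nu ** 3) / (np.exp(H * C * nu / (KB * temp_k)) - 1.0)

def calibrate(scene_counts, abb_counts, hbb_counts, wn, t_abb=293.0, t_hbb=333.0):
    """Two-point calibration: responsivity from the two blackbody views, then
    scene radiance = (scene - ABB) / responsivity + Planck(T_ABB)."""
    b_abb, b_hbb = planck_radiance(wn, t_abb), planck_radiance(wn, t_hbb)
    responsivity = (hbb_counts - abb_counts) / (b_hbb - b_abb)
    return (scene_counts - abb_counts) / responsivity + b_abb

wn = np.linspace(680, 1150, 512)                         # LWIR band, cm^-1
abb = planck_radiance(wn, 293.0) * 1e20 + 5.0            # fake instrument counts
hbb = planck_radiance(wn, 333.0) * 1e20 + 5.0
scene = planck_radiance(wn, 310.0) * 1e20 + 5.0
radiance = calibrate(scene, abb, hbb, wn)                # recovers the 310 K spectrum
```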

  16. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  17. Radiometric calibration of digital cameras using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Schall, Martin; Grunwald, Michael; Umlauf, Georg; Franz, Matthias O.

    2015-05-01

    Digital cameras are subject to physical, electronic and optic effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting and different sensitivities of individual pixels. The task of a radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes. Gaussian processes are a regression method widely used in machine learning that is particularly useful in our context. Gaussian process regression is then used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and thus are unsuitable for many practical applications. In contrast, a single regression model for an entire image with high spatial resolution leads to a low quality radiometric calibration, which also limits its practical use. The proposed algorithm is predicated on a partitioning of the pixels such that each pixel partition can be represented by one single regression model without quality loss. Partitioning is done by extracting features from the characteristics of each pixel and using them for lexicographic sorting. Splitting the sorted data into partitions of equal size yields the final partitions, each of which is represented by its partition center. An individual Gaussian process regression and model selection is done for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. The experimental comparison of the proposed approach to classical flat field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
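    A rough scikit-learn sketch of the partition-then-regress idea described above, using synthetic calibration data; the per-pixel features, partition count, kernel, and the assumption that true intensity is proportional to exposure time are all illustrative simplifications.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic calibration data: per-pixel gray values observed at several exposure
# times, with the "true" intensity assumed proportional to exposure time.
rng = np.random.default_rng(1)
n_pixels = 1000
exposures = np.array([1.0, 2.0, 4.0, 8.0])
gain = rng.normal(1.0, 0.05, n_pixels)                   # pixel-to-pixel sensitivity
dark = rng.normal(0.1, 0.02, n_pixels)                   # pixel-dependent offset
gray = gain[:, None] * exposures[None, :] + dark[:, None] \
     + rng.normal(0.0, 0.02, (n_pixels, exposures.size))

# Partition pixels: extract simple per-pixel features, sort lexicographically,
# and split the sorted order into equally sized partitions.
features = np.column_stack([gray.mean(axis=1), gray[:, -1] - gray[:, 0]])
order = np.lexsort(features.T[::-1])
partitions = np.array_split(order, 10)

# One Gaussian process regression per partition: (gray value, exposure) -> intensity.
models = []
for part in partitions:
    X = np.column_stack([gray[part].ravel(), np.tile(exposures, part.size)])
    y = np.tile(exposures, part.size)                    # stand-in "true" intensities
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    models.append(gp.fit(X, y))
```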

  18. Environmental testing of a prototypic digital safety channel, phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-02-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the U.S. Nuclear Regulatory Commission. The goal of this program is to establish the technical basis for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR) and AP600. It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  19. Environmental testing of a prototypic digital safety channel, Phase I: System design and test methodology

    SciTech Connect

    Korsah, K.; Turner, G.W.; Mullens, J.A.

    1995-04-01

    A microprocessor-based reactor trip channel has been assembled for environmental testing under an Instrumentation and Control (I&C) Qualification Program sponsored by the US Nuclear Regulatory Commission. The goal of this program is to establish the technical basis and acceptance criteria for the qualification of advanced I&C systems. The trip channel implemented for this study employs technologies and digital subsystems representative of those proposed for use in some advanced light-water reactors (ALWRs) such as the Simplified Boiling Water Reactor (SBWR). It is expected that these tests will reveal any potential system vulnerabilities for technologies representative of those proposed for use in ALWRs. The experimental channel will be purposely stressed considerably beyond what it is likely to experience in a normal nuclear power plant environment, so that the tests can uncover the worst-case failure modes (i.e., failures that are likely to prevent an entire trip system from performing its safety function when required to do so). Based on information obtained from this study, it may be possible to recommend tests that are likely to indicate the presence of such failure mechanisms. Such recommendations would be helpful in augmenting current qualification guidelines.

  20. Knowledge and Processes That Predict Proficiency in Digital Literacy

    ERIC Educational Resources Information Center

    Bulger, Monica E.; Mayer, Richard E.; Metzger, Miriam J.

    2014-01-01

    Proficiency in digital literacy refers to the ability to read and write using online sources, and includes the ability to select sources relevant to the task, synthesize information into a coherent message, and communicate the message with an audience. The present study examines the determinants of digital literacy proficiency by asking 150…

  1. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    ERIC Educational Resources Information Center

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)

  2. Digital image processing: a primer for JVIR authors and readers: part 1: the fundamentals.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-10-01

    Online submission of manuscripts will be mandatory for most journals in the near future. To prepare authors for this requirement and to acquaint readers with this new development, herein the basics of digital image processing are described. From the fundamentals of digital image architecture, through acquisition, editing, and storage of digital images, the steps necessary to prepare an image for online submission are reviewed. In this article, the first of a three-part series, the structure of the digital image is described. In subsequent articles, the acquisition and editing of digital images will be reviewed. PMID:14551267

  3. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    SciTech Connect

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared.

  4. How processing digital elevation models can affect simulated water budgets

    USGS Publications Warehouse

    Kuniansky, E.L.; Lowery, M.A.; Campbell, B.G.

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEMs for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is because the average groundwater gradients associated with the centroid values are about one-third greater than those associated with the mean values. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
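    The two DEM-processing choices discussed above can be contrasted with a short numpy sketch that aggregates a synthetic fine-resolution DEM to model cells using either the value nearest the cell centroid or the mean of all values in the cell, and then compares the resulting cell-to-cell gradients.

```python
import numpy as np

rng = np.random.default_rng(2)
fine = rng.random((400, 400)).cumsum(axis=1) / 40.0      # synthetic sloping, rough DEM
cells = 10                                               # model cells per side
block = fine.shape[0] // cells                           # 40 x 40 DEM points per cell

centroid = fine[block // 2::block, block // 2::block]            # value nearest each cell centroid
mean = fine.reshape(cells, block, cells, block).mean(axis=(1, 3))  # mean DEM value per cell

# Compare average absolute gradients between adjacent model cells.
grad_centroid = np.abs(np.diff(centroid, axis=1)).mean()
grad_mean = np.abs(np.diff(mean, axis=1)).mean()
print(f"centroid-based gradient: {grad_centroid:.3f}, mean-based gradient: {grad_mean:.3f}")
```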

  5. Social work practice in the digital age: therapeutic e-mail as a direct practice methodology.

    PubMed

    Mattison, Marian

    2012-07-01

    The author addresses the risks and benefits of incorporating therapeutic e-mail communication into clinical social work practice. Consumer demand for online clinical services is growing faster than the professional response. E-mail, when used as an adjunct to traditional meetings with clients, offers distinct advantages and risks. Benefits include the potential to reach clients in geographically remote and underserved communities, enhancing and extending the therapeutic relationship and improving treatment outcomes. Risks include threats to client confidentiality and privacy, liability coverage for practitioners, licensing jurisdiction, and the lack of competency standards for delivering e-mail interventions. Currently, the social work profession does not have adequate instructive guidelines and best-practice standards for using e-mail as a direct practice methodology. Practitioners need (formal) academic training in the techniques connected to e-mail exchanges with clients. The author describes the ethical and legal risks for practitioners using therapeutic e-mail with clients and identifies recommendations for establishing best-practice standards. PMID:23252316

  6. Delicate visual artifacts of advanced digital video processing algorithms

    NASA Astrophysics Data System (ADS)

    Nicolas, Marina M.; Lebowsky, Fritz

    2005-03-01

    With the advent of digital TV, sophisticated video processing algorithms have been developed to improve the rendering of motion or colors. However, the perceived subjective quality of these new systems is sometimes at odds with the objective, measurable improvement we expect to obtain. In this presentation, we show examples where algorithms should visually improve the skin tone rendering of decoded pictures under normal conditions, but surprisingly fail when the quality of MPEG encoding drops below a just-noticeable threshold. In particular, we demonstrate that simple objective criteria used for the optimization, such as SAD, PSNR or histograms, sometimes fail, partly because they are defined on a global scale, ignoring local characteristics of the picture content. We then integrate a simple human visual model to measure potential artifacts with regard to spatial and temporal variations of the objects' characteristics. Tuning some of the model's parameters allows the perceived quality to be correlated with the compression metrics of various encoders. We show the evolution of our reference parameters with respect to the compression ratios. Finally, using the output of the model, we can control the parameters of the skin tone algorithm to reach an improvement in overall system quality.

  7. Digital neuromorphic processing for a simplified algorithm of ultrasonic reception

    NASA Astrophysics Data System (ADS)

    Qiang, Lin; Clarke, Chris

    2001-05-01

    Previously, most research on mammalian auditory systems has concentrated on human sensory perception, where frequencies are below 20 kHz. The implementations have almost always used analog VLSI design. Due to the complexity of the model, it is difficult to implement these algorithms using current digital technology. This paper introduces a simplified model of the biosonic reception system in bats and its implementation in the "Chiroptera Inspired Robotic CEphaloid" (CIRCE) project. This model consists of bandpass filters, a half-wave rectifier, low-pass filters, automatic gain control, and spike generation with thresholds. Due to the real-time requirements of the system, it employs Butterworth filters and advanced field programmable gate array (FPGA) architectures to provide a viable solution. The ultrasonic signal processing is implemented on a Xilinx FPGA Virtex II device in real time. In the system, 12-bit input echo signals from receivers are sampled at 1 M samples per second for a signal frequency range from 20 to 200 kHz. The system implements a 704-channel-per-ear auditory pipeline operating in real time. The output of the system is a coded time series of threshold crossing points. Compared with a fixed-point software implementation, the hardware shows significant performance gains with no loss of accuracy.
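    One channel of the simplified reception model described above (band-pass filter, half-wave rectifier, low-pass filter, threshold-crossing spike times) can be sketched with scipy; the center frequency, bandwidth, filter orders, and threshold are illustrative, and the real system implements 704 such channels per ear in FPGA hardware.

```python
import numpy as np
from scipy import signal

fs = 1_000_000                          # 1 M samples per second, as described above
t = np.arange(0, 0.005, 1 / fs)
echo = np.sin(2 * np.pi * 60e3 * t) * np.exp(-((t - 0.002) / 0.0004) ** 2)  # toy 60 kHz echo

def channel(x, center_hz, bandwidth_hz=5e3, threshold=0.05):
    """One auditory-pipeline channel: band-pass, half-wave rectify, smooth, threshold."""
    low, high = center_hz - bandwidth_hz / 2, center_hz + bandwidth_hz / 2
    sos = signal.butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    bp = signal.sosfilt(sos, x)
    rectified = np.maximum(bp, 0.0)                      # half-wave rectifier
    lp = signal.sosfilt(signal.butter(2, 2e3, fs=fs, output="sos"), rectified)
    crossings = np.flatnonzero((lp[1:] >= threshold) & (lp[:-1] < threshold))
    return crossings / fs                                # threshold-crossing (spike) times, s

spike_times = channel(echo, center_hz=60e3)
```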

  8. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on a study of computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parameter space search via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity as possible candidates for digital loop detection schemes. We have developed synthesized images for the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.

  9. Fully Digital: Policy and Process Implications for the AAS

    NASA Astrophysics Data System (ADS)

    Biemesderfer, Chris

    Over the past two decades, every scholarly publisher has migrated at least the mechanical aspects of their journal publishing so that they utilize digital means. The academy was comfortable with that for a while, but publishers are under increasing pressure to adapt further. At the American Astronomical Society (AAS), we think that means bringing our publishing program to the point of being fully digital, by establishing procedures and policies that regard the digital objects of publication primarily. We have always thought about our electronic journals as databases of digital articles, from which we can publish and syndicate articles one at a time, and we must now put flesh on those bones by developing practices that are consistent with the realities of article at a time publication online. As a learned society that holds the long-term rights to the literature, we have actively taken responsibility for the preservation of the digital assets that constitute our journals, and in so doing we have not forsaken the legacy pre-digital assets. All of us who serve as the long-term stewards of scholarship must begin to evolve into fully digital publishers.

  10. Recognition and inference of crevice processing on digitized paintings

    NASA Astrophysics Data System (ADS)

    Karuppiah, S. P.; Srivatsa, S. K.

    2013-03-01

    This paper presents a method for detecting and removing cracks on digitized paintings. Cracks are first detected by thresholding. Thin dark brush strokes that have been misidentified as cracks are then removed using either a median radial basis function neural network operating on hue and saturation data or a semi-automatic procedure based on region growing. Finally, the cracks are filled using a Wiener filter. The method identifies and removes most of the cracks on digitized paintings, with a reported effectiveness of about 90%, and is also applicable to medical and bitmap images. The approach is implemented in MATLAB.
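    A toy scipy version of the detect-then-fill idea (flag unusually dark, thin pixels as cracks, then replace them from a locally smoothed estimate); the paper's MRBF network, region growing, and Wiener-filter filling are not reproduced here, and the threshold and window size are illustrative.

```python
import numpy as np
from scipy import ndimage

def remove_cracks(image, dark_threshold=0.25, smooth_size=7):
    """Toy crack removal: flag pixels that are much darker than their median-smoothed
    surroundings as cracks, then fill them from that smoothed background."""
    background = ndimage.median_filter(image, size=smooth_size)
    crack_mask = (background - image) > dark_threshold
    restored = image.copy()
    restored[crack_mask] = background[crack_mask]
    return restored, crack_mask

# Stand-in digitized painting: smooth texture with a dark, one-pixel-wide "crack".
rng = np.random.default_rng(3)
painting = ndimage.gaussian_filter(rng.random((128, 128)), 5) + 0.5
painting[40, 10:110] -= 0.6
restored, mask = remove_cracks(painting)
```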

  11. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

    This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the depiction of conservation processes. In particular, the specific steps undertaken for developing and applying the CIDOC-CRM extensions for defining the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece, are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the NDT&E methods and their output data, as standardized for conservation documentation through the CIDOC-CRM extensions of the DOC-CULTURE project, are further reported in Kouis et al. (2).

  12. Emissions involved in acidic deposition processes: Methodology and results

    SciTech Connect

    Placet, M.

    1990-01-01

    Data on the emissions involved in atmospheric acid-base chemistry are crucial to the assessment of acidic deposition and its effects. Sulfur dioxide (SO2), nitrogen oxides (NOx), and volatile organic compounds (VOCs) are the primary chemical compounds involved in acidic deposition processes. In addition, other emission species -- e.g., ammonia, alkaline dust particles, hydrogen chloride, and hydrogen fluoride -- are involved in atmospheric acid-base chemistry, either by contributing acidic constituents or by neutralizing acidic species. Several emissions data bases have been developed under the auspices of the National Acid Precipitation Program (NAPAP). In addition to those developed by NAPAP, emissions data bases and emissions trends estimates also have been developed by organizations such as the Electric Power Research Institute (EPRI) and the U.S. Environmental Protection Agency (EPA). This paper briefly describes and compares the methods used in developing these emissions data bases and presents an overview of their emissions estimates. A more detailed discussion of these topics can be found in the State-of-Science Report on emissions recently released by NAPAP and in the references cited in that report. 14 refs., 4 figs., 1 tab.

  13. Document Conversion Methodology.

    ERIC Educational Resources Information Center

    Bovee, Donna

    1990-01-01

    Discusses digital imaging technology and examines document database conversion considerations. Two types of document imaging systems are described: (1) a work in process system, and (2) a storage and retrieval system. Conversion methodology is outlined, and a document conversion scenario is presented as a practical guide to conversion. (LRW)

  14. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  15. Exploring the Developmental Changes in Automatic Two-Digit Number Processing

    ERIC Educational Resources Information Center

    Chan, Winnie Wai Lan; Au, Terry K.; Tang, Joey

    2011-01-01

    Even when two-digit numbers are irrelevant to the task at hand, adults process them. Do children process numbers automatically, and if so, what kind of information is activated? In a novel dot-number Stroop task, children (Grades 1-5) and adults were shown two different two-digit numbers made up of dots. Participants were asked to select the…

  16. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method to collect data for Earth science research. A huge amount of remote sensing data, most of it in image form, has been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise to remove these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high performance cluster server with large disk storage and fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory

  17. Geometric processing of digital images of the planets

    NASA Technical Reports Server (NTRS)

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.

  18. ISSUES IN DIGITAL IMAGE PROCESSING OF AERIAL PHOTOGRAPHY FOR MAPPING SUBMERSED AQUATIC VEGETATION

    EPA Science Inventory

    The paper discusses the numerous issues that needed to be addressed when developing a methodology for mapping Submersed Aquatic Vegetation (SAV) from digital aerial photography. Specifically, we discuss 1) choice of film; 2) consideration of tide and weather constraints; 3) in-s...

  19. Modeling the simulation execution process with digital objects

    NASA Astrophysics Data System (ADS)

    Cubert, Robert M.; Fishwick, Paul A.

    1999-06-01

    Object Oriented Physical Modeling (OOPM), formerly known as MOOSE, and its implementation of behavior multimodels provide an ability to manage arbitrarily complex patterns of behavioral abstraction in web-friendly simulation modeling. In an OOPM model, one object stands as surrogate for another object, and these surrogates cognitively map to the real world. This 'physical object' principle mitigates the impact of incomplete knowledge and ambiguity because its real-world metaphors enable model authors to draw on intuition, facilitating reuse and integration, as well as consistency in collaborative efforts. A 3D interface for modeling and simulation visualization, under construction to augment the existing 2D GUI, obeys the physical object principle, providing a means to create, change, reuse, and integrate digital worlds made of digital objects. The implementation includes a Distributed Simulation Executive, a Digital object MultiModel Language, a Digital Object Warehouse, and a multimodel Translator. This approach is powerful and its capabilities have steadily grown; however, it has lacked a formal basis, which we now provide: we define multimodels, represent digital objects as multimodels, transform multimodels to simulations, demonstrate the correctness of the execution sequence of the simulations, and show closure under coupling of digital objects. These theoretical results complement and enhance the practical aspects of physical multimodeling.

  20. A digital signal processing module for gamma-ray tracking detectors

    NASA Astrophysics Data System (ADS)

    Cromaz, M.; Riot, V. J.; Fallon, P.; Gros, S.; Holmes, B.; Lee, I. Y.; Macchiavelli, A. O.; Vu, C.; Yaver, H.; Zimmermann, S.

    2008-12-01

    We have designed and constructed an 8-channel digital signal processing board for the GRETINA spectrometer. The digitizer samples each of 8 inputs at 100 MHz with 12-bit resolution. Employing a large on-board FPGA, the board derives an energy, leading-edge time, and constant-fraction time from the input signal providing the functionality of a conventional analog electronics system. Readout and control of the digitizer is done over a VME bus. The digitizer's performance met all requirements for processing signals from the GRETINA spectrometer.
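
    As a rough illustration of how such a board can derive an energy, a leading-edge time, and a constant-fraction time from a sampled waveform, the Python sketch below applies textbook versions of those algorithms to one digitized pulse. The baseline length, thresholds, CFD fraction, and delay are arbitrary assumptions and are not the GRETINA firmware parameters.

      import numpy as np

      def pulse_parameters(samples, fs=100e6, baseline_n=20,
                           le_threshold=0.1, cfd_fraction=0.3, cfd_delay=4):
          """Derive energy, leading-edge time and digital constant-fraction time
          from one digitized pulse (illustrative parameter values)."""
          s = samples - samples[:baseline_n].mean()          # baseline subtraction
          energy = s.max()                                   # simple pulse-height energy

          # leading-edge time: first sample above a fixed fraction of the peak
          le_idx = np.argmax(s > le_threshold * energy)
          t_le = le_idx / fs

          # digital CFD: attenuated pulse minus delayed pulse, zero crossing located
          cfd = np.zeros_like(s)
          cfd[cfd_delay:] = cfd_fraction * s[cfd_delay:] - s[:-cfd_delay]
          k = np.argmax((cfd[:-1] > 0) & (cfd[1:] <= 0))     # first + to - crossing
          frac = cfd[k] / (cfd[k] - cfd[k + 1])              # linear interpolation
          t_cfd = (k + frac) / fs
          return energy, t_le, t_cfd

      # Toy pulse: fast rise, slow exponential decay, sampled at 100 MHz
      t = np.arange(256)
      pulse = np.where(t < 30, 0.0, (1 - np.exp(-(t - 30) / 3)) * np.exp(-(t - 30) / 60))
      print(pulse_parameters(pulse))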

  1. Beyond roots alone: Novel methodologies for analyzing complex soil and minirhizotron imagery using image processing and GIS tools

    NASA Astrophysics Data System (ADS)

    Silva, Justina A.

    Quantifying belowground dynamics is critical to our understanding of plant and ecosystem function and belowground carbon cycling, yet currently available tools for complex belowground image analyses are insufficient. We introduce novel techniques combining digital image processing tools and geographic information systems (GIS) analysis to permit semi-automated analysis of complex root and soil dynamics. We illustrate methodologies with imagery from microcosms, minirhizotrons, and a rhizotron, in upland and peatland soils. We provide guidelines for correct image capture, a method that automatically stitches together numerous minirhizotron images into one seamless image, and image analysis using image segmentation and classification in SPRING or change analysis in ArcMap. These methods facilitate spatial and temporal root and soil interaction studies, providing a framework to expand a more comprehensive understanding of belowground dynamics.

  2. The application of digital signal processing techniques to a teleoperator radar system

    NASA Technical Reports Server (NTRS)

    Pujol, A.

    1982-01-01

    A digital signal processing system was studied for the determination of the spectral frequency distribution of echo signals from a teleoperator radar system. The system consisted of a sample and hold circuit, an analog-to-digital converter, a digital filter, and a Fast Fourier Transform. The system is interfaced to a 16-bit microprocessor. The microprocessor is programmed to control the complete digital signal processing. The digital filtering and Fast Fourier Transform functions are implemented by a S2815 digital filter/utility peripheral chip and a S2814A Fast Fourier Transform chip. The S2815 initially implements a low-pass Butterworth filter, with later expansion to synthesize complete filter circuits (bandpass and highpass).
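
    The record above describes a hardware chain (a filter chip followed by an FFT chip). As a software analogue, the following Python sketch low-pass filters a toy echo signal with a Butterworth filter and then takes its spectrum; the sample rate, cutoff, and test frequencies are placeholder assumptions.

      import numpy as np
      from scipy import signal

      fs = 20_000.0                       # sample rate in Hz (placeholder)
      t = np.arange(0, 0.1, 1 / fs)
      echo = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 7_000 * t)  # toy echo

      # 4th-order low-pass Butterworth, 1 kHz cutoff (stand-in for the filter stage)
      b, a = signal.butter(4, 1_000, btype="low", fs=fs)
      filtered = signal.lfilter(b, a, echo)

      # FFT stage: spectral frequency distribution of the filtered echo
      spectrum = np.abs(np.fft.rfft(filtered))
      freqs = np.fft.rfftfreq(filtered.size, d=1 / fs)
      print("dominant frequency: %.0f Hz" % freqs[spectrum.argmax()])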

  3. Development of Coriolis mass flowmeter with digital drive and signal processing technology.

    PubMed

    Hou, Qi-Li; Xu, Ke-Jun; Fang, Min; Liu, Cui; Xiong, Wen-Jun

    2013-09-01

    The Coriolis mass flowmeter (CMF) often suffers from two-phase flow, which may cause flowtube stalling. To solve this problem, a digital drive method and a digital signal processing method for the CMF are studied and implemented in this paper. A positive-negative step signal is used to initiate the flowtube oscillation without knowing the natural frequency of the flowtube. A digital zero-crossing detection method based on Lagrange interpolation is adopted to calculate the frequency and phase difference of the sensor output signals in order to synthesize the digital drive signal. The digital drive approach is implemented by a multiplying digital-to-analog converter (MDAC) and a direct digital synthesizer (DDS). A digital Coriolis mass flow transmitter is developed with a digital signal processor (DSP) to control the digital drive and realize the signal processing. Water flow calibrations and gas-liquid two-phase flow experiments are conducted to examine the performance of the transmitter. The experimental results show that the transmitter shortens the start-up time and can maintain the oscillation of the flowtube under two-phase flow conditions. PMID:23721742
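
    The zero-crossing step can be pictured with the Python sketch below: crossings are located on a local Lagrange-style polynomial fit, the frequency is taken from the spacing of successive crossings, and the phase difference between two sensor signals comes from the time offset of their crossings. The interpolation order, sample rate, and test signals are assumptions for illustration, not the transmitter's actual design.

      import numpy as np

      def zero_crossings(sig, fs):
          """Interpolated times of negative-to-positive zero crossings, found on a
          local (up to quadratic) Lagrange-style polynomial fit around each crossing."""
          t = np.arange(sig.size) / fs
          idx = np.nonzero((sig[:-1] < 0) & (sig[1:] >= 0))[0]
          times = []
          for k in idx:
              lo, hi = max(k - 1, 0), min(k + 2, sig.size)   # 3-point stencil
              deg = min(2, hi - lo - 1)
              tc = (t[lo:hi] - t[k]) * fs                    # local, sample-unit coords
              coeff = np.polyfit(tc, sig[lo:hi], deg)
              roots = np.roots(coeff) / fs + t[k]
              roots = roots[np.isreal(roots)].real
              roots = roots[(roots >= t[k]) & (roots <= t[k + 1])]
              if roots.size:
                  times.append(roots[0])
          return np.asarray(times)

      fs = 2_000.0                                           # placeholder sample rate
      t = np.arange(0, 0.5, 1 / fs)
      f0, dphi, phi0 = 83.0, 0.02, 0.7                       # toy tube frequency, phase shift
      s1 = np.sin(2 * np.pi * f0 * t + phi0)
      s2 = np.sin(2 * np.pi * f0 * t + phi0 - dphi)

      z1, z2 = zero_crossings(s1, fs), zero_crossings(s2, fs)
      freq = 1.0 / np.mean(np.diff(z1))                      # period from crossing spacing
      n = min(z1.size, z2.size)
      phase_diff = 2 * np.pi * freq * np.mean(z2[:n] - z1[:n])
      print("frequency %.2f Hz, phase difference %.4f rad" % (freq, phase_diff))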

  4. Digital mapping of side-scan sonar data with the Woods Hole Image Processing System software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution sidescan sonar data. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for pre-processing sidescan sonar data. To extend the capabilities of the UNIX-based programs, digital mapping techniques have been developed. This report describes the initial development of an automated digital mapping procedure. Included are a description of the programs and steps required to complete the digital mosaicking on a UNIX-based computer system and a comparison of techniques from which the user may wish to select.

  5. Post-processing of compressed video using a unified metric for digital video processing

    NASA Astrophysics Data System (ADS)

    Boroczky, Lilla; Yang, Yibin

    2004-01-01

    In this paper we propose a novel post-processing system for compressed video sources. The proposed system explores the interaction between artifact reduction and sharpness/resolution enhancement to achieve optimal video quality for compressed (e.g., MPEG-2) sources. It is based on the Unified Metric for Digital Video Processing (UMDVP), which adaptively controls the post-processing algorithms according to the coding characteristics of the decoded video. The experiments carried out on several MPEG-2 encoded video sequences have shown significant improvement in picture quality compared to a system without the UMDVP control and to a system that did not exploit the interaction between artifact reduction and video enhancement. The UMDVP as well as the proposed post-processing system can easily be adapted to different coding standards, such as MPEG-4 and H.26x.

  6. Terahertz digital holography image processing based on MAP algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Guang-Hao; Li, Qi

    2015-04-01

    Terahertz digital holography combines terahertz technology with digital holography and fully exploits the advantages of both. Unfortunately, the quality of terahertz digital holography reconstruction images is gravely harmed by speckle noise, which hinders the popularization of this technology. In this paper, a maximum a posteriori (MAP) estimation filter is used to restore the digital reconstruction images. The filtering results are compared with images filtered by a Wiener filter and by conventional frequency-domain filters, from both subjective and objective perspectives. For objective assessment, we adopted the speckle index (SPKI) and the edge preserving index (EPI) to quantify image quality. A Canny edge detector is also used to outline the target in the original and reconstructed images, which then plays an important role in the evaluation of filter performance. All the analyses indicate that the MAP estimation filtering algorithm outperforms the other two filters considered and enhances the terahertz digital holography reconstruction images to a certain degree, allowing more accurate boundary identification.
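
    The objective scores mentioned above can be pictured with the Python sketch below, which computes one common speckle-index variant (the average ratio of local standard deviation to local mean) for a noisy image and a filtered version of it. The exact SPKI and EPI definitions used in the paper are not reproduced here; the window size and noise model are assumptions for illustration.

      import numpy as np
      from scipy import ndimage

      def speckle_index(img, win=7):
          """Speckle-index variant: mean ratio of local standard deviation to
          local mean (lower means less speckle). Definition assumed, not taken
          from the cited paper."""
          img = img.astype(float)
          mean = ndimage.uniform_filter(img, win)
          mean_sq = ndimage.uniform_filter(img ** 2, win)
          std = np.sqrt(np.clip(mean_sq - mean ** 2, 0, None))
          return np.mean(std / (mean + 1e-12))

      # Compare a speckled reconstruction with a median-filtered version of it
      rng = np.random.default_rng(0)
      clean = np.tile(np.linspace(50, 200, 128), (128, 1))
      noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # multiplicative speckle
      filtered = ndimage.median_filter(noisy, size=5)
      print("SPKI noisy: %.3f  filtered: %.3f" % (speckle_index(noisy), speckle_index(filtered)))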

  7. Digital process for an implant-supported fixed dental prosthesis: A clinical report.

    PubMed

    Brandt, Jan; Lauer, Hans-Christoph; Peter, Thorsten; Brandt, Silvia

    2015-10-01

    A digital process is presented for an implant-supported single-tooth and a 3-unit fixed dental prosthesis (FDP) with customized abutments and monolithic prosthetic zirconia restorations. The digital impression on the implant level was made with a TRIOS intraoral scanner (3Shape). This process included the fabrication of an implant cast with the fused deposition modeling technique and a 3-dimensional printing process with integrated implant analogs. The process enabled the FDPs to be designed with CAD/CAM on the cast before patient contact. Designing a printed implant cast expands the use of the digital workflow in the dental field. PMID:26187099

  8. How to Find Exculpatory and Inculpatory Evidence Using a Circular Digital Forensics Process Model

    NASA Astrophysics Data System (ADS)

    Khatir, Marjan; Hejazi, Seyed Mahmood

    With the rising number of cyber crimes, the need for a proper digital forensic process also increases. Although digital forensics has been practiced in recent years, there is still a big gap between previously suggested digital forensics processes and what really needs to be done in real cases. Some problems with current processes are a lack of flexible transition between phases, the absence of a clear method or complete scenario for addressing reliable evidence, and insufficient attention to management aspects and team roles. This paper provides a process model that pays special attention to team roles and management aspects as well as to both exculpatory and inculpatory evidence.

  9. Processing multi-digit numbers: a translingual eye-tracking study.

    PubMed

    Bahnmueller, Julia; Huber, Stefan; Nuerk, Hans-Christoph; Göbel, Silke M; Moeller, Korbinian

    2016-05-01

    The present study aimed at investigating the underlying cognitive processes and language specificities of three-digit number processing. More specifically, it was intended to clarify whether the single digits of three-digit numbers are processed in parallel and/or sequentially and whether processing strategies are influenced by the inversion of number words with respect to the Arabic digits [e.g., 43: dreiundvierzig ("three and forty")] and/or by differences in reading behavior of the respective first language. Therefore, English- and German-speaking adults had to complete a three-digit number comparison task while their eye-fixation behavior was recorded. Replicating previous results, reliable hundred-decade-compatibility effects (e.g., 742_896: hundred-decade compatible because 7 < 8 and 4 < 9; 362_517: hundred-decade incompatible because 3 < 5 but 6 > 1) for English- as well as hundred-unit-compatibility effects for English- and German-speaking participants were observed, indicating parallel processing strategies. While no indices of partial sequential processing were found for the English-speaking group, about half of the German-speaking participants showed an inverse hundred-decade-compatibility effect accompanied by longer inspection time on the hundred digit indicating additional sequential processes. Thereby, the present data revealed that in transition from two- to higher multi-digit numbers, the homogeneity of underlying processing strategies varies between language groups. The regular German orthography (allowing for letter-by-letter reading) and its associated more sequential reading behavior may have promoted sequential processing strategies in multi-digit number processing. Furthermore, these results indicated that the inversion of number words alone is not sufficient to explain all observed language differences in three-digit number processing. PMID:26669690

  10. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  11. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  12. Methodology development for the sustainability process assessment of sheet metal forming of complex-shaped products

    NASA Astrophysics Data System (ADS)

    Pankratov, D. L.; Kashapova, L. R.

    2015-06-01

    A methodology was developed for automated assessment of the reliability of the sheet metal forming process in order to reduce defects in the manufacture of complex components. The article identifies the range of allowable values of the stamp parameters needed to obtain defect-free punching of truck spars.

  13. Using Dual-Task Methodology to Dissociate Automatic from Nonautomatic Processes Involved in Artificial Grammar Learning

    ERIC Educational Resources Information Center

    Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.

    2013-01-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…

  14. A New Methodology for Studying Dynamics of Aerosol Particles in Sneeze and Cough Using a Digital High-Vision, High-Speed Video System and Vector Analyses

    PubMed Central

    Nishimura, Hidekazu; Sakata, Soichiro; Kaga, Akikazu

    2013-01-01

    Microbial pathogens of respiratory infectious diseases are often transmitted through particles in sneeze and cough. Therefore, understanding the particle movement is important for infection control. Images of a sneeze induced by nasal cavity stimulation in healthy adult volunteers were taken by a digital high-vision, high-speed video system equipped with a computer system and treated as a research model. The obtained images were enhanced electronically, converted to digital images every 1/300 s, and subjected to vector analysis of the bioparticles contained in the whole sneeze cloud using automatic image processing software. The initial velocity of the particles or their clusters in the sneeze was greater than 6 m/s, but decreased as the particles moved forward; the momentum of the particles seemed to be lost by 0.15–0.20 s, after which they started a diffusion movement. An approximate equation for their velocity as a function of elapsed time was obtained from the vector analysis to represent the dynamics of the front-line particles. This methodology was also applied to a cough. Microclouds contained in smoke exhaled with a voluntary cough by a volunteer after taking one breath of a cigarette were traced as visible aerodynamic surrogates for the invisible bioparticles of a cough. The smoke cough microclouds had an initial velocity greater than 5 m/s. The fastest microclouds were located at the forefront of the cloud mass moving forward; however, their velocity clearly decreased after 0.05 s and they began to diffuse in the environmental airflow. The maximum direct reaches of the particles and microclouds driven by sneezing and coughing, unaffected by environmental airflows, were estimated by calculation using the obtained equations to be about 84 cm and 30 cm from the mouth, respectively, both achieved in about 0.2 s, suggesting that data on the dynamics of sneezing and coughing can be obtained by calculation. PMID:24312206

  15. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Requirements for systems used to process digitally signed orders. 1311.55 Section 1311.55 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE REQUIREMENTS FOR ELECTRONIC ORDERS AND PRESCRIPTIONS Obtaining and Using Digital Certificates for Electronic Orders § 1311.55...

  16. The Effects of Digital Portfolio Assessment Process on Students' Writing and Drawing Performances

    ERIC Educational Resources Information Center

    Tezci, Erdogan; Dikici, Ayhan

    2006-01-01

    In this paper, the effect of the digital portfolio assessment process on the drawing and story writing performances of students aged 14-15 was investigated. For this reason, a digital portfolio assessment rubric was prepared in order to evaluate students' drawing and story writing works. For validity and reliability, analysis was applied to…

  17. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  18. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment

    PubMed Central

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd.

    2016-01-01

    Advances in communication network technology have made it easy to transmit digital medical images to healthcare professionals via internal networks or public networks (e.g., the Internet), but they also expose the transmitted images to security threats, such as tampering or the insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnosis purposes; thus digital watermarking of medical images is introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in real environments. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermarking processing of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time of parallel watermarking processing is much shorter than that of sequential watermarking processing. PMID:26981111

  19. Parallel Digital Watermarking Process on Ultrasound Medical Images in Multicores Environment.

    PubMed

    Khor, Hui Liang; Liew, Siau-Chuin; Zain, Jasni Mohd

    2016-01-01

    Advances in communication network technology have made it easy to transmit digital medical images to healthcare professionals via internal networks or public networks (e.g., the Internet), but they also expose the transmitted images to security threats, such as tampering or the insertion of false data, which may cause inaccurate diagnosis and treatment. Medical image distortion is not to be tolerated for diagnosis purposes; thus digital watermarking of medical images is introduced. So far most watermarking research has been done on single-frame medical images, which is impractical in real environments. In this paper, digital watermarking of multiframe medical images is proposed. In order to speed up multiframe watermarking processing time, parallel watermarking processing of medical images utilizing multicore technology is introduced. Experimental results have shown that the elapsed time of parallel watermarking processing is much shorter than that of sequential watermarking processing. PMID:26981111

  20. Characterization of digital signal processing in the DiDAC data acquisition system

    SciTech Connect

    Parson, J.D.; Olivier, T.L.; Habbersett, R.C.; Martin, J.C.; Wilder, M.E.; Jett, J.H. )

    1993-01-01

    A new generation data acquisition system for flow cytometers has been constructed. This Digital Data Acquisition and Control (DiDAC) system is based on the VME architecture and uses both the standard VME bus and a private bus for system communication and data transfer. At the front end of the system is a free-running 20 MHz ADC. The output of a detector preamp provides the signal for digitization. The digitized waveform is passed to a custom-built digital signal processing circuit that extracts the height, width, and integral of the waveform. Calculation of these parameters is started (and stopped) when the waveform exceeds (and falls below) a preset threshold value. The free-running ADC is specified to have 10-bit accuracy at 25 MHz. The authors have characterized it and compared the results with those obtained with conventional analog signal processing followed by digitization. Comparisons are made between the two approaches in terms of measurement CV, linearity, and other aspects.
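
    The threshold-gated parameter extraction described above can be sketched in a few lines of Python: the height, width, and integral are computed over the segment of the waveform that exceeds a preset threshold. The sample rate, threshold, and test pulse are assumptions for illustration, not DiDAC firmware values.

      import numpy as np

      def pulse_features(samples, threshold, fs=20e6):
          """Height, width and integral of the waveform segment above a preset
          threshold, in the spirit of the DiDAC description (details assumed)."""
          above = samples > threshold
          if not above.any():
              return None
          start = np.argmax(above)                       # first sample over threshold
          stop = len(samples) - np.argmax(above[::-1])   # one past the last such sample
          seg = samples[start:stop]
          height = seg.max()
          width = (stop - start) / fs                    # seconds above threshold
          integral = seg.sum() / fs                      # area (amplitude * seconds)
          return height, width, integral

      # Toy detector pulse sampled at 20 MHz
      t = np.arange(200)
      wave = 100 * np.exp(-0.5 * ((t - 80) / 12.0) ** 2)
      print(pulse_features(wave, threshold=10.0))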

  1. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capability to perform high-pass, band-pass, low-pass, and wedge filtering. These filters are applied to the analysis of satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
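
    The Gaussian-type notch idea used by the point and line filters can be illustrated with the Python sketch below, which suppresses one symmetric pair of frequency components in a 2-D grid via the FFT. This is not the USGS toolbox code; the notch location, width, and test pattern are placeholder assumptions.

      import numpy as np

      def gaussian_notch_filter(data, notch_uv, sigma=3.0):
          """Suppress one symmetric pair of frequency components in 2-D gridded
          data with a Gaussian-type notch (illustrative only)."""
          F = np.fft.fftshift(np.fft.fft2(data))
          rows, cols = data.shape
          u = np.arange(rows) - rows // 2
          v = np.arange(cols) - cols // 2
          V, U = np.meshgrid(v, u)

          u0, v0 = notch_uv
          d1 = (U - u0) ** 2 + (V - v0) ** 2
          d2 = (U + u0) ** 2 + (V + v0) ** 2              # conjugate-symmetric notch
          H = (1 - np.exp(-d1 / (2 * sigma ** 2))) * (1 - np.exp(-d2 / (2 * sigma ** 2)))
          return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

      # Remove a periodic stripe (e.g., scan-line noise) from a synthetic grid
      x = np.arange(256)
      grid = np.outer(np.ones(256), np.sin(2 * np.pi * 20 * x / 256))  # 20 cycles across columns
      cleaned = gaussian_notch_filter(grid, notch_uv=(0, 20))
      print("residual stripe amplitude: %.4f" % np.abs(cleaned).max())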

  2. Dynamic range control of audio signals by digital signal processing

    NASA Astrophysics Data System (ADS)

    Gilchrist, N. H. C.

    It is often necessary to reduce the dynamic range of musical programs, particularly those comprising orchestral and choral music, for them to be received satisfactorily by listeners to conventional FM and AM broadcasts. With the arrival of DAB (Digital Audio Broadcasting) a much wider dynamic range will become available for radio broadcasting, although some listeners may prefer to have a signal with a reduced dynamic range. This report describes a digital processor developed by the BBC to control the dynamic range of musical programs in a manner similar to that of a trained Studio Manager. It may be used prior to transmission in conventional broadcasting, replacing limiters or other compression equipment. In DAB, it offers the possibility of providing a dynamic range control signal to be sent to the receiver via an ancillary data channel, simultaneously with the uncompressed audio, giving the listener the option of the full dynamic range or a reduced dynamic range.

  3. GEOMETRIC PROCESSING OF DIGITAL IMAGES OF THE PLANETS.

    USGS Publications Warehouse

    Edwards, Kathleen

    1987-01-01

    New procedures and software have been developed for geometric transformations of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases.

  4. Digital Microfluidic Processing of Mammalian Embryos for Vitrification

    PubMed Central

    Abdelgawad, Mohamed; Sun, Yu

    2014-01-01

    Cryopreservation is a key technology in biology and clinical practice. This paper presents a digital microfluidic device that automates sample preparation for mammalian embryo vitrification. Individual micro droplets manipulated on the microfluidic device were used as micro-vessels to transport a single mouse embryo through a complete vitrification procedure. Advantages of this approach, compared to manual operation and channel-based microfluidic vitrification, include automated operation, cryoprotectant concentration gradient generation, and feasibility of loading and retrieval of embryos. PMID:25250666

  5. Applications of digital processing for noise removal from plasma diagnostics

    SciTech Connect

    Kane, R.J.; Candy, J.V.; Casper, T.A.

    1985-11-11

    The use of digital signal techniques for removal of noise components present in plasma diagnostic signals is discussed, particularly with reference to diamagnetic loop signals. These signals contain noise due to power supply ripple in addition to plasma characteristics. The application of noise canceling techniques, such as adaptive noise canceling and model-based estimation, will be discussed. The use of computer codes such as SIG is described. 19 refs., 5 figs.

  6. From CAD to Digital Modeling: the Necessary Hybridization of Processes

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Bernardi, F.; Cristofolini, A.

    2011-09-01

    The essay deals with the digital representation of architecture, drawing on several years of teaching activity within the Automatic Design course of the Engineering/Architecture degree program at the University of Trento. With the development of CAD systems, architectural representation lies less in the tracing of a simple drawing and more in a series of acts that build a complex digital model, which can be used as a database recording all stages of design and interpretation work and from which final drawings and documents can be derived. The advent of digital technology has made it increasingly difficult to find explicit connections between a given operation and its outcome; hence the growing need for guidelines, the need to understand changes in order to anticipate them, and the desire not to be overwhelmed by uncontrollable influences from hardware and software systems used solely according to the principle of maximum productivity. Education occupies a crucial role because it can direct the profession toward a thoughtful and selective use of specific applications; teaching must build logical routes through the fluid world of infographics, and the only way to do so is to describe its contours through methodological indications: understanding, studying, and communicating what does not change amid this mobility (procedural issues) rather than what is transitory in its fixity (manual questions).

  7. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment is discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  8. Processing results of digitized photographic observations of Pluto from the collections of the Ukrainian Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Kazantseva, L. V.; Shatokhina, S. V.; Protsyuk, Yu. I.; Kovylianska, O. E.; Andruk, V. M.

    2015-01-01

    The catalogue of 59 equatorial coordinates and magnitudes of the Pluto-Charon system for the period 1961-1990 was created based on digitized photographic observations from collections of the Joint Digital Archive of the Ukrainian Virtual Observatory obtained from five telescopes of three Ukrainian observatories. Developed software and scan processing techniques were successfully used. Parametric processing models were analyzed. The mean accuracy for the equatorial coordinates of Tycho-2 reference stars on digitized astronegatives is ±90 mas. The mean error of Tycho-2 B magnitudes is ±0.32 mag. Pluto positions were compared with the ephemeris JPL PLU43-DE431.

  9. Processing results of digitized photographic observations of Pluto from the collections of the Ukrainian Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Kazantseva, L. V.; Shatokhina, S. V.; Protsyuk, Yu. I.; Kovylianska, O. E.; Andruk, V. M.

    2015-01-01

    The catalogue of 59 equatorial coordinates and magnitudes of the Pluto-Charon system for the period 1961-1990 was created based on digitized photographic observations from collections of the Joint Digital Archive of the Ukrainian Virtual Observatory obtained from five telescopes of three Ukrainian observatories. Developed software and scan processing techniques were successfully used. Parametric processing models were analyzed. The mean accuracy for the equatorial coordinates of Tycho-2 reference stars on digitized astronegatives is ±90 mas. The mean error of Tycho-2 BT magnitudes is ±0.32 mag. Pluto positions were compared with the ephemeris JPL PLU43-DE431.

  10. Proper restorative material selection, digital processes allow highly esthetic shade match combined with layered porcelain.

    PubMed

    Kahng, Luke S

    2014-03-01

    Today's digital technologies are affording dentists and laboratory technicians more control over material choices for creating restorations and fabricating dental prostheses. Digital processes can potentially enable technicians to create ideal marginal areas and account for the thickness and support of layering porcelain over substructures in the design process. In this case report of a restoration of a single central incisor, a number of issues are addressed that are central to using the newest digital technology. As demonstrated, shade selection is a crucial early step in any restorative case preparation. PMID:24773196

  11. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S., Jr.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, S.L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the various digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections, as well as to enhance and digitally mosaic sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two different stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
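
    A minimal Python sketch of the feathering idea is given below: two adjacent strips are blended across their overlap with a linear ramp so that no hard seam remains. The real MIPS procedure is interactive and more elaborate; the strip geometry and overlap width here are assumptions.

      import numpy as np

      def feather_overlap(strip_left, strip_right, overlap):
          """Blend two side-by-side image strips across `overlap` columns with a
          linear ramp, a simplified stand-in for the interactive feathering /
          stenciling step described above."""
          h, w_l = strip_left.shape
          w_r = strip_right.shape[1]
          out = np.zeros((h, w_l + w_r - overlap), dtype=float)

          alpha = np.linspace(1.0, 0.0, overlap)          # weight of the left strip
          out[:, :w_l - overlap] = strip_left[:, :w_l - overlap]
          out[:, w_l:] = strip_right[:, overlap:]
          out[:, w_l - overlap:w_l] = (alpha * strip_left[:, w_l - overlap:] +
                                       (1 - alpha) * strip_right[:, :overlap])
          return out

      # Two strips whose tones differ (a visible seam without feathering)
      left = np.full((100, 60), 120.0)
      right = np.full((100, 60), 150.0)
      mosaic = feather_overlap(left, right, overlap=20)
      print(mosaic.shape, mosaic[0, 55:65].round(1))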

  12. The Presence and Function of Metaphor in Supervision Sessions: An Intensive Case Study Using Process Methodology.

    ERIC Educational Resources Information Center

    Newton, Fred B.; Wilson, Marlin W.

    1991-01-01

    Investigated presence, co-occurrence, characteristics, and functions of metaphor around judged points of insight in counseling supervision sessions. Used case study methodology to view process of change in supervision sessions. Found both novel and frozen metaphors present in best and least effective sessions. Only novel metaphors found in best…

  13. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  14. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model developed within the commercial finite element package ABAQUS (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize the cooling rate or target a specific value. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to improvements in process productivity and product quality.
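
    To show the shape of such a setup, the Python sketch below wires a SciPy SLSQP optimization around a toy surrogate of the casting model: the objective penalizes deviation from a target cooling rate and an inequality constraint enforces an assumed cooling-start ordering for directional solidification. The surrogate function, target rate, region labels, and constraint offsets are all illustrative assumptions; in the published methodology each evaluation would involve an ABAQUS simulation and results extraction.

      import numpy as np
      from scipy.optimize import minimize

      TARGET_RATE = 4.0          # desired solidification cooling rate, K/s (placeholder)

      def run_casting_model(cooling_start_times):
          """Stand-in for one casting simulation plus results extraction.
          Returns per-region cooling rates as a toy analytic function of the
          three die-cooling start times (purely illustrative)."""
          t = np.asarray(cooling_start_times)
          return 6.0 - 0.05 * t + 0.2 * np.sin(t / 10.0)

      def objective(t):
          rates = run_casting_model(t)
          return np.sum((rates - TARGET_RATE) ** 2)          # target a specific cooling rate

      def directional_solidification(t):
          # assumed ordering: region 0 starts cooling before region 1, which starts before region 2
          return np.array([t[1] - t[0] - 5.0, t[2] - t[1] - 5.0])   # >= 0 when satisfied

      result = minimize(objective, x0=[10.0, 30.0, 50.0],
                        bounds=[(0.0, 120.0)] * 3,
                        constraints=[{"type": "ineq", "fun": directional_solidification}],
                        method="SLSQP")
      print(result.x.round(1), result.fun)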

  15. Digitizing rocks standardizing the geological description process using workstations

    SciTech Connect

    Saunders, M.R. , Windsor, Berkshire ); Shields, J.A. ); Taylor, M.R. )

    1993-09-01

    The preservation of geological knowledge in a standardized digital form presents a challenge. Data sources, inherently fuzzy, range in scale from the macroscopic (e.g., outcrop) through the mesoscopic (e.g., hand specimen, core, and sidewall core) to the microscopic (e.g., drill cuttings, thin sections, and microfossils). Each scale change results in increased heterogeneity and potentially contradictory data, and the providers of such data may vary in experience level. To address these issues with respect to cores and drill cuttings, a geological description workstation has been developed and is undergoing field trials. Over 1000 carefully defined geological attributes are currently available within a depth-indexed, relational database. Attributes are stored in digital form, allowing multiple users to select familiar usage (e.g., diabase vs. dolerite). Data can be entered in one language and retrieved in other languages. The database structure allows groupings of similar elements (e.g., rhyolites in acidic, igneous or volcanics subgroups or the igneous rock group), permitting different users to analyze details appropriate to the scale of the usage. Data entry uses a graphical user interface, allowing the geologist to make quick, logical selections in a standardized or custom-built format with extensive menus, on-screen graphics and help screens available. Description ranges are permissible. Entries for lithology, petrology, structures (sedimentary, organic and deformational), reservoir characteristics (porosity and hydrocarbon shows), and macrofossils are available. Sampling points for thin sections, core analysis, geochemistry, or micropaleontology studies are also recorded. Using digital data storage, geological logs using graphical, alphanumeric and symbolic depictions are possible. Data can be integrated with drilling and mud gas data, MWD and wireline data, and off-well-site analyses to produce composite formation evaluation logs and interpretational crossplots.

  16. Performance of the SIR-B digital image processing subsystem

    NASA Technical Reports Server (NTRS)

    Curlander, J. C.

    1986-01-01

    A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. This system is designed to achieve the maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image data. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.

  17. Experimental Methodology for Determining Optimum Process Parameters for Production of Hydrous Metal Oxides by Internal Gelation

    SciTech Connect

    Collins, J.L.

    2005-10-28

    The objective of this report is to describe a simple but very useful experimental methodology that was used to determine optimum process parameters for preparing several hydrous metal-oxide gel spheres by the internal gelation process. The method is inexpensive and very effective in collection of key gel-forming data that are needed to prepare the hydrous metal-oxide microspheres of the best quality for a number of elements.

  18. Reaction Wheel Friction Telemetry Data Processing Methodology and On-Orbit Experience

    NASA Astrophysics Data System (ADS)

    Hacker, Johannes M.; Ying, Jiongyu; Lai, Peter C.

    2015-09-01

    A Globalstar 2nd generation satellite experienced a reaction wheel mechanical failure, and in response Globalstar has been closely monitoring reaction wheel bearing friction. To prevent another reaction wheel hardware failure and subsequent shortened satellite mission life, a friction data processing methodology was developed as an on-orbit monitoring tool for the ground to issue early warning and take appropriate action on any hardware degradation or potential failure. The methodology, reaction wheel friction behavior, and its application to an on-orbit anomaly experience will be presented.

  19. Implementation of real-time digital endoscopic image processing system

    NASA Astrophysics Data System (ADS)

    Song, Chul Gyu; Lee, Young Mook; Lee, Sang Min; Kim, Won Ky; Lee, Jae Ho; Lee, Myoung Ho

    1997-10-01

    Endoscopy has become a crucial diagnostic and therapeutic procedure in clinical areas. Over the past four years, we have developed a computerized system to record and store clinical data pertaining to endoscopic surgery of laparoscopic cholecystectomy, pelviscopic endometriosis, and surgical arthroscopy. In this study, we developed a computer system which is composed of a frame grabber, a sound board, a VCR control board, a LAN card and EDMS. The computer system also controls peripheral instruments such as a color video printer, a video cassette recorder, and endoscopic input/output signals. The digital endoscopic data management system is based on an open architecture and a set of widely available industry standards, namely Microsoft Windows as the operating system, TCP/IP as the network protocol, and a time-sequential database that handles both images and speech. For data storage, we used MOD and CD-R. The digital endoscopic system was designed to store, recreate, change, and compress signals and medical images. Computerized endoscopy enables us to generate and manipulate the original visual document, making it accessible to a virtually unlimited number of physicians.

  20. Integrating rock mechanics issues with repository design through design process principles and methodology

    SciTech Connect

    Bieniawski, Z.T.

    1996-04-01

    A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but must also have knowledge about designing (appropriate principles and a systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking engineering design principles and methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process directed to integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  1. Digital processing of signals arising from organic liquid scintillators for applications in the mixed-field assessment of nuclear threats

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Mackin, R. O.; Jarrah, Z.; Peyton, A. J.

    2008-10-01

    The nuclear aspect of the CBRN* threat is often divided amongst radiological substances posing no criticality risk, often referred to as 'dirty bomb' scenarios, and fissile threats. The latter have the theoretical potential for a criticality excursion, resulting in elevated neutron fluxes in addition to the γ-ray component that is common to dirty bombs. Even in isolation of the highly unlikely criticality scenario, fissile substances often exhibit radiation fields comprising a significant neutron component, which can require considerably different counterterrorism measures and clean-up methodologies. The contrast between these threats can indicate important differences in the relative sophistication of the perpetrators and their organizations. Consequently, the detection and discrimination of nuclear perils in terms of mixed-field content is an important assay in combating terrorist threats. In this paper we report on the design and implementation of a fast digitizer and embedded processor for on-the-fly signal processing of events from organic liquid scintillators. A digital technique, known as Pulse Gradient Analysis (PGA), has been developed at Lancaster University for the digital discrimination of neutrons and γ rays. PGA has been deployed on bespoke hardware and demonstrates remarkable improvement over analogue methods for the assay of mixed fields and the real-time discrimination of neutrons and γ rays. In this regard the technology constitutes an attractive and affordable means for the discrimination of the radiation fields arising from fissile threats and those from dirty bombs. Data are presented demonstrating this capability with sealed radioactive sources.
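
    A very rough flavor of gradient-based discrimination is given by the Python sketch below, which compares the pulse amplitude a fixed number of samples after the peak with the peak amplitude itself; slower-decaying pulses (neutron-like in many organic scintillators) yield a larger ratio. This is only a schematic stand-in for the published PGA technique; the delay, threshold, and toy pulse shapes are assumptions.

      import numpy as np

      def pga_feature(pulse, delay=10):
          """PGA-style feature (illustration only): amplitude a fixed number of
          samples after the peak, relative to the peak amplitude."""
          peak = np.argmax(pulse)
          if peak + delay >= pulse.size:
              return None
          return pulse[peak + delay] / pulse[peak]

      def classify(pulse, threshold=0.35, delay=10):
          r = pga_feature(pulse, delay)
          return "neutron-like" if r is not None and r > threshold else "gamma-like"

      # Toy pulses: same rise, different decay constants for the two event types
      t = np.arange(100.0)
      gamma_pulse   = np.where(t < 20, t / 20.0, np.exp(-(t - 20) / 8.0))
      neutron_pulse = np.where(t < 20, t / 20.0, np.exp(-(t - 20) / 25.0))
      print(classify(gamma_pulse), classify(neutron_pulse))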

  2. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  3. Digital pulse processing and optimization of the front-end electronics for nuclear instrumentation.

    PubMed

    Bobin, C; Bouchard, J; Thiam, C; Ménesguen, Y

    2014-05-01

    This article describes an algorithm developed for the digital processing of signals provided by a high-efficiency well-type NaI(Tl) detector used to apply the 4πγ technique. In order to achieve a low-energy threshold, a new front-end electronics has been specifically designed to optimize the coupling to an analog-to-digital converter (14 bit, 125 MHz) connected to a digital development kit produced by Altera®. The digital pulse processing is based on an IIR (Infinite Impulse Response) approximation of the Gaussian filter (and its derivatives) that can be applied to the real-time processing of digitized signals. Based on measurements obtained with the photon emissions generated by a 241Am source, the energy threshold is estimated to be equal to ~2 keV corresponding to the physical threshold of the NaI(Tl) detector. An algorithm developed for a Silicon Drift Detector used for low-energy x-ray spectrometry is also described. In that case, the digital pulse processing is specifically designed for signals provided by a reset-type preamplifier (55Fe source). PMID:24326314
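
    One way to picture an IIR approximation of Gaussian smoothing is the Python sketch below, which cascades identical first-order recursive sections run forward and backward; the cascade's impulse response approaches a Gaussian shape as the number of passes grows. The coefficients and pass count are illustrative assumptions and are not the filter published in the article.

      import numpy as np
      from scipy import signal

      def iir_gaussian_smooth(x, alpha=0.2, passes=4):
          """Approximate Gaussian smoothing with a cascade of identical
          first-order IIR sections run forward and backward (zero phase).
          Coefficients are illustrative, not the cited pulse processor's."""
          b, a = [alpha], [1.0, -(1.0 - alpha)]          # y[n] = alpha*x[n] + (1-alpha)*y[n-1]
          y = np.asarray(x, dtype=float)
          for _ in range(passes):
              y = signal.lfilter(b, a, y)                # forward pass
              y = signal.lfilter(b, a, y[::-1])[::-1]    # backward pass
          return y

      # Smooth a noisy scintillator-like pulse
      rng = np.random.default_rng(1)
      t = np.arange(400)
      pulse = np.exp(-0.5 * ((t - 150) / 20.0) ** 2) + 0.05 * rng.standard_normal(t.size)
      smoothed = iir_gaussian_smooth(pulse)
      print("peak position:", int(np.argmax(smoothed)))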

  4. Introduction to computers and digital processing in medical imaging

    SciTech Connect

    Kuni, C.C.

    1988-01-01

    The author provides a nontechnical, nonmathematical explanation of computers, programs, peripheral devices, and imaging applications so that radiologists can more completely control the digital devices they use. There are additional reasons for radiologists to understand computers and computing. First, knowledge of computers allows a fundamental understanding of the next generation of imaging devices and leads to more intelligent interpretation of images. Second, recognition of artifacts and system failures is facilitated. Finally, the radiologist with such knowledge will remain a central figure in imaging departments. This book is organized into three sections. The first section of five chapters is devoted to the fundamentals of computers, image formation, manipulation, and display. The second section comprises five chapters, each on a specific modality, such as computed tomography or magnetic resonance imaging. The final section is a chapter that discusses networks and archiving.

  5. Digital pulse-shape processing for CdTe detectors

    NASA Astrophysics Data System (ADS)

    Bargholtz, Chr.; Fumero, E.; Mårtensson, L.; Wachtmeister, S.

    2001-09-01

    CdTe detectors suffer from low photo-peak efficiency and poor energy resolution. These problems are due to the drift properties of charge carriers in CdTe where particularly the holes have small mobility and trapping time. This is reflected in the amplitude and the shape of the detector output. To improve this situation a digital method is introduced where a sampling ADC is used to make a detailed measurement of the time evolution of the pulse. The measured pulse shape is fitted with a model. For the detector under study a model taking hole trapping into account significantly improves the photo-peak efficiency. The description of the hole component is, however, not fully satisfactory since for pulses with a large hole contribution a broadening of the full-energy peak occurs. Allowing for inhomogeneities in the detector material within the model partially remedies this deficiency.

  6. Digital active material processing platform effort (DAMPER), SBIR phase 2

    NASA Technical Reports Server (NTRS)

    Blackburn, John; Smith, Dennis

    1992-01-01

    Applied Technology Associates, Inc., (ATA) has demonstrated that inertial actuation can be employed effectively in digital, active vibration isolation systems. Inertial actuation involves the use of momentum exchange to produce corrective forces which act directly on the payload being actively isolated. In a typical active vibration isolation system, accelerometers are used to measure the inertial motion of the payload. The signals from the accelerometers are then used to calculate the corrective forces required to counteract, or 'cancel out' the payload motion. Active vibration isolation is common technology, but the use of inertial actuation in such systems is novel, and is the focus of the DAMPER project. A May 1991 report was completed which documented the successful demonstration of inertial actuation, employed in the control of vibration in a single axis. In the 1 degree-of-freedom (1DOF) experiment a set of air bearing rails was used to suspend the payload, simulating a microgravity environment in a single horizontal axis. Digital Signal Processor (DSP) technology was used to calculate in real time, the control law between the accelerometer signals and the inertial actuators. The data obtained from this experiment verified that as much as 20 dB of rejection could be realized by this type of system. A discussion is included of recent tests performed in which vibrations were actively controlled in three axes simultaneously. In the three degree-of-freedom (3DOF) system, the air bearings were designed in such a way that the payload is free to rotate about the azimuth axis, as well as translate in the two horizontal directions. The actuator developed for the DAMPER project has applications beyond payload isolation, including structural damping and source vibration isolation. This report includes a brief discussion of these applications, as well as a commercialization plan for the actuator.

  7. Image processing for a tactile/vision substitution system using digital CNN.

    PubMed

    Lin, Chien-Nan; Yu, Sung-Nien; Hu, Jin-Cheng

    2006-01-01

    In view of the parallel processing and easy implementation properties of CNN, we propose to use digital CNN as the image processor of a tactile/vision substitution system (TVSS). The digital CNN processor is used to execute the wavelet down-sampling filtering and the half-toning operations, aiming to extract important features from the images. A template combination method is used to embed the two image processing functions into a single CNN processor. The digital CNN processor is implemented as an intellectual property (IP) core on a XILINX VIRTEX II 2000 FPGA board. Experiments are designed to test the capability of the CNN processor in the recognition of characters and human subjects in different environments. The experiments demonstrate impressive results, which show that the proposed digital CNN processor is a powerful component in the design of efficient tactile/vision substitution systems for visually impaired people. PMID:17946687

  8. Optical fiber diameter measurement by the diffraction method with digital processing of the light scattering indicatrix

    NASA Astrophysics Data System (ADS)

    Kokodii, N. G.; Natarova, A. O.

    2016-07-01

    Relations between the position of the first diffraction minima and the fiber diameter are derived based on the solution of the problem of electromagnetic wave diffraction on a transparent fiber with a circular cross section. The obtained formulas are used to measure the fiber diameter. The diffraction pattern is recorded with a digital camera. The obtained image is digitally processed to determine the positions of the first two scattering indicatrix minima.
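

    For illustration, the sketch below estimates the diameter from the position of a diffraction minimum using the Fraunhofer, slit-like relation d·sin(θ_m) = m·λ (via Babinet's principle a thin fiber diffracts like a slit of the same width). The exact relations derived in the paper are not reproduced; the numerical values in the example are hypothetical.

      import math

      def fiber_diameter_from_minimum(x_m, L, wavelength, m=1):
          """Estimate the fiber diameter from the distance x_m of the m-th
          diffraction minimum from the optical axis, recorded a distance L
          behind the fiber: d * sin(theta_m) = m * lambda."""
          theta = math.atan2(x_m, L)          # angular position of the minimum
          return m * wavelength / math.sin(theta)

      # Hypothetical example: first minimum 6.4 mm off-axis, camera 1.0 m away, 633 nm laser
      d = fiber_diameter_from_minimum(x_m=6.4e-3, L=1.0, wavelength=633e-9)
      print(f"estimated diameter: {d * 1e6:.1f} um")   # roughly 99 um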

  9. Considerations in developing geographic informations systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  10. Digital Libraries.

    ERIC Educational Resources Information Center

    Fox, Edward A.; Urs, Shalini R.

    2002-01-01

    Provides an overview of digital libraries research, practice, and literature. Highlights include new technologies; redefining roles; historical background; trends; creating digital content, including conversion; metadata; organizing digital resources; services; access; information retrieval; searching; natural language processing; visualization;…

  11. Erosion processes by water in agricultural landscapes: a low-cost methodology for post-event analyses

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo

    2015-04-01

    Throughout the world, agricultural landscapes assume a great importance, especially for supplying food and a livelihood. Among the land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To estimate quantitatively soil losses due to bank erosion and rill processes, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented with fine resolution, and their monitoring may entail too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique that may even rely on smartphones, for the post-event analyses of i) bank erosion affecting agricultural channels, and ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smartphone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for quantitatively estimating erosion volumes and, in the bank erosion case, deposited material as well. SfM applied to pictures taken by smartphones is useful for the analysis of the topography
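
    The volume estimation step described above amounts to differencing two co-registered DTMs. A minimal Python sketch under that assumption follows; the noise threshold is an illustrative placeholder for a detection limit that would normally be derived from the SfM error budget.

      import numpy as np

      def erosion_deposition_volumes(dtm_before, dtm_after, cell_size, min_change=0.01):
          """Erosion and deposition volumes (m^3) from two co-registered DTMs
          (2-D arrays of elevations in metres, identical grids).  Elevation
          changes smaller than `min_change` (m) are treated as noise."""
          dod = np.asarray(dtm_after, float) - np.asarray(dtm_before, float)
          dod[np.abs(dod) < min_change] = 0.0
          cell_area = cell_size ** 2
          erosion = -dod[dod < 0].sum() * cell_area      # material removed
          deposition = dod[dod > 0].sum() * cell_area    # material deposited
          return erosion, deposition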

  12. Integrating the human element into the systems engineering process and MBSE methodology.

    SciTech Connect

    Tadros, Michael Samir.

    2013-12-01

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources is organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and Model-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  13. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  14. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  15. An efficient forward-secure group certificate digital signature scheme to enhance EMR authentication process.

    PubMed

    Yu, Yao-Chang; Hou, Ting-Wei

    2014-05-01

    The frequently used digital signature algorithms, such as RSA and the Digital Signature Algorithm (DSA), lack forward security. The result is that, when private keys are renewed, trustworthiness is lost. In other words, electronic medical records (EMRs) signed by revoked private keys are no longer trusted. This significant security threat stands in the way of EMR adoption. This paper proposes an efficient forward-secure group certificate digital signature scheme that is based on Shamir's (t,n) threshold scheme and Schnorr's digital signature scheme to ensure trustworthiness is maintained when private keys are renewed and to increase the efficiency of EMRs' authentication processes in terms of number of certificates, number of keys, forward-security capability and searching time. PMID:24652661
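
    For orientation, the sketch below shows a textbook Schnorr signature over a prime-order subgroup, which is one of the two building blocks named in the abstract; it is not the paper's forward-secure group scheme, and the tiny parameters are deliberately insecure, chosen only so the example runs quickly.

      import hashlib, secrets

      # Toy, insecure demonstration parameters: p prime, q | p-1, g of order q.
      p, q, g = 607, 101, 64          # 64 = 2**((607-1)//101) mod 607

      def H(r, msg):
          """Hash the commitment and message down to an exponent modulo q."""
          return int.from_bytes(hashlib.sha256(f"{r}|{msg}".encode()).digest(), "big") % q

      def keygen():
          x = secrets.randbelow(q - 1) + 1       # private key
          return x, pow(g, x, p)                 # (private x, public y = g^x mod p)

      def sign(x, msg):
          k = secrets.randbelow(q - 1) + 1       # per-signature nonce
          r = pow(g, k, p)
          e = H(r, msg)
          s = (k + x * e) % q
          return e, s

      def verify(y, msg, sig):
          e, s = sig
          # y^{-e} = y^{q-e} because y lies in the subgroup of order q
          r = (pow(g, s, p) * pow(y, (q - e) % q, p)) % p
          return H(r, msg) == e

      x, y = keygen()
      sig = sign(x, "EMR record #42")
      print(verify(y, "EMR record #42", sig))    # True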

  16. Real-time digital signal processing-based optical coherence tomography and Doppler optical coherence tomography.

    PubMed

    Schaefer, Alexander W; Reynolds, J Joshua; Marks, Daniel L; Boppart, Stephen A

    2004-01-01

    We present the development and use of a real-time digital signal processing (DSP)-based optical coherence tomography (OCT) and Doppler OCT system. Images of microstructure and transient fluid-flow profiles are acquired using the DSP architecture for real-time processing of computationally intensive calculations. This acquisition system is readily configurable for a wide range of real-time signal processing and image processing applications in OCT. PMID:14723509

  17. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    ERIC Educational Resources Information Center

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  18. Optimization of the processing technology of Fructus Arctii by response surface methodology.

    PubMed

    Liu, Qi-Di; Qin, Kun-Ming; Shen, Bao-Jia; Cai, Hao; Cai, Bao-Chang

    2015-03-01

    The present study was designed to optimize the processing of Fructus Arctii by response surface methodology (RSM). Based on single-factor studies, a three-variable, three-level Box-Behnken design (BBD) was used to monitor the effects of independent variables, including processing temperature and time, on the dependent variables. Response surfaces and contour plots of the contents of total lignans, chlorogenic acid, arctiin, and arctigenin were obtained through ultraviolet and visible (UV-Vis) monitoring and high performance liquid chromatography (HPLC). Fructus Arctii should be processed by heating, with the pot at 311 °C and the medicine at 119 °C, for 123 s with frequent flipping. The experimental values obtained under the optimized processing technology were consistent with the predicted values. In conclusion, RSM is an effective method to optimize the processing of traditional Chinese medicine (TCM). PMID:25835367
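
    The core RSM step is fitting a second-order polynomial to the designed experiments and locating its stationary point. A minimal Python sketch with two coded factors follows; the design points and responses are hypothetical and stand in for the paper's measured data.

      import numpy as np

      def quadratic_design_matrix(X):
          """Full second-order model in two coded factors: 1, x1, x2, x1^2, x2^2, x1*x2."""
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

      # Hypothetical coded settings (e.g. temperature, time) and measured responses
      X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                    [0, 0], [0, 0], [0, 0],
                    [-1, 0], [1, 0], [0, -1], [0, 1]], float)
      y = np.array([7.2, 8.1, 7.9, 8.0, 9.1, 9.0, 9.2, 8.4, 8.6, 8.2, 8.5])

      beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
      b0, b1, b2, b11, b22, b12 = beta

      # Stationary point of the fitted surface: solve the gradient = 0 system
      B = np.array([[2 * b11, b12], [b12, 2 * b22]])
      x_opt = np.linalg.solve(B, -np.array([b1, b2]))
      print("coded optimum:", x_opt)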

  19. A Federated Digital Identity Management Approach for Business Processes

    NASA Astrophysics Data System (ADS)

    Bertino, Elisa; Ferrini, Rodolfo; Musci, Andrea; Paci, Federica; Steuer, Kevin J.

    Business processes have gained a lot of attention because of the pressing need for integrating existing resources and services to better fulfill customer needs. A key feature of business processes is that they are built from composable services, referred to as component services, that may belong to different domains. In such a context, flexible multi-domain identity management solutions are crucial for increased security and user convenience. In particular, it is important that during the execution of a business process the component services be able to verify the identity of the client to check that it has the required permissions for accessing the services. To address the problem of multi-domain identity management, we propose a multi-factor identity attribute verification protocol for business processes that assures clients' privacy and handles naming heterogeneity.

  20. Study on the improvement of overall optical image quality via digital image processing

    NASA Astrophysics Data System (ADS)

    Tsai, Cheng-Mu; Fang, Yi Chin; Lin, Yu Chin

    2008-12-01

    This paper studies the effects of improving overall optical image quality via Digital Image Processing (DIP) and compares the enhanced optical image with the unprocessed one. From the standpoint of the optical system, image quality is strongly affected by chromatic and monochromatic aberrations. However, overall image capture systems, such as cellphones and digital cameras, include not only the basic optical system but also many other factors, such as the electronic circuit system, transducer system, and so forth, whose quality can directly affect the image quality of the whole picture. Therefore, in this work Digital Image Processing technology is utilized to improve the overall image. It is shown experimentally that the system modulation transfer function (MTF) of a comparatively poor optical system, with the proposed DIP technology applied, can be comparable to, or even superior to, the system MTF derived from a good optical system.

  1. The design, fabrication, and test of a new VLSI hybrid analog-digital neural processing element

    NASA Technical Reports Server (NTRS)

    Deyong, Mark R.; Findley, Randall L.; Fields, Chris

    1992-01-01

    A hybrid analog-digital neural processing element with the time-dependent behavior of biological neurons has been developed. The hybrid processing element is designed for VLSI implementation and offers the best attributes of both analog and digital computation. Custom VLSI layout reduces the layout area of the processing element, which in turn increases the expected network density. The hybrid processing element operates at the nanosecond time scale, which enables it to produce real-time solutions to complex spatiotemporal problems found in high-speed signal processing applications. VLSI prototype chips have been designed, fabricated, and tested with encouraging results. Systems utilizing the time-dependent behavior of the hybrid processing element have been simulated and are currently in the fabrication process. Future applications are also discussed.

  2. Evaluation of a Change Detection Methodology by Means of Binary Thresholding Algorithms and Informational Fusion Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier

    2012-01-01

    Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database of the changes that have occurred allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are computed; then, different change/no_change thresholding algorithms are applied to these indices in order to better estimate the statistical parameters of the two categories; finally, the indices are integrated into a multisource change detection fusion process, which generates a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient at identifying the change detection index with the highest contribution. PMID:22737023
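
    As a hedged illustration of the index/threshold/fusion chain (not the paper's specific indices or fusion rule), the Python sketch below computes two simple change indices from co-registered single-band images, thresholds each with Otsu's method, and fuses them by agreement.

      import numpy as np
      from skimage.filters import threshold_otsu

      def change_map(img_t1, img_t2):
          """Binary change map from two co-registered single-band images:
          two change indices are thresholded independently with Otsu's
          method and fused conservatively (a pixel must be flagged by both)."""
          a = np.asarray(img_t1, float)
          b = np.asarray(img_t2, float)
          diff = np.abs(b - a)                            # absolute-difference index
          ratio = np.abs(np.log((b + 1.0) / (a + 1.0)))   # log-ratio index
          masks = [idx > threshold_otsu(idx) for idx in (diff, ratio)]
          return np.logical_and.reduce(masks)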

  3. Airy-Kaup-Kupershmidt filters applied to digital image processing

    NASA Astrophysics Data System (ADS)

    Hoyos Yepes, Laura Cristina

    2015-09-01

    The Kaup-Kupershmidt operator is applied to the two-dimensional solution of the Airy-diffusion equation and the resulting filter is applied via convolution to image processing. The full procedure is implemented using Maple code with the package ImageTools. Some experiments were performed using a wide category of images, including biomedical images generated by magnetic resonance, computerized axial tomography, positron emission tomography, infrared imaging and photon diffusion. The Airy-Kaup-Kupershmidt filter can be used as a powerful edge detector and as a powerful enhancement tool in image processing. It is expected that the Airy-Kaup-Kupershmidt filter could be incorporated in standard programs for image processing such as ImageJ.

  4. Digital signal processing utilizing a generic instruction set

    NASA Astrophysics Data System (ADS)

    Mosley, V. V. W.; Bronder, J.; Wenk, A.

    In order to maintain a degree of technological equivalence between software and hardware in advanced VLSI development efforts, a set of generic instructions has been defined in the form of Ada-callable procedures which invoke a complex sequence of events for the execution of vector instructions in signal processing modules. Attention is presently given to real time signal processing functions in the cases of fighter aircraft fire control radar, passive sonar surveillance, communications systems' FSK demodulation and bit regeneration, and electronic warfare support measures and countermeasures. Generalized examples of each application are given as data flow graphs.

  5. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  6. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
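
    The value-improvement side of the approach described above can be illustrated with a compact Python sketch of value iteration for a discounted, finite Markov decision process; the data structures and stopping tolerance are illustrative assumptions, not the MARKOV program's interface.

      import numpy as np

      def value_iteration(P, R, gamma=0.95, tol=1e-8):
          """Solve a discounted infinite-horizon MDP by value iteration.

          P[a][s, s'] : transition probabilities for action a
          R[a][s]     : expected immediate reward for action a in state s
          Returns the optimal value function and a greedy (optimal) policy."""
          n_actions, n_states = len(P), P[0].shape[0]
          V = np.zeros(n_states)
          while True:
              Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_actions)])
              V_new = Q.max(axis=0)
              if np.max(np.abs(V_new - V)) < tol:
                  return V_new, Q.argmax(axis=0)
              V = V_new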

  7. Methodology development of an engineering design expert system utilizing a modular knowledge-base inference process

    NASA Astrophysics Data System (ADS)

    Winter, Steven John

    Methodology development was conducted to incorporate a modular knowledge-base representation into an expert system engineering design application. The objective for using multidisciplinary methodologies in defining a design system was to develop a system framework that would be applicable to a wide range of engineering applications. The technique of "knowledge clustering" was used to construct a general decision tree for all factual information relating to the design application. This construction combined the design process surface knowledge and specific application depth knowledge. Utilization of both levels of knowledge created a system capable of processing multiple controlling tasks, including organizing factual information relative to the cognitive levels of the design process, building finite element models for depth knowledge analysis, developing a standardized finite element code for parallel processing, and determining a best solution generated by design optimization procedures. Proof of concept for the methodology developed here is shown in the implementation of an application defining the analysis and optimization of a composite aircraft canard subjected to a general compound loading condition. This application contained a wide range of factual information and heuristic rules. The analysis tools used included a finite element (FE) processor and a numerical optimizer. An advisory knowledge-base was also developed to provide a standard for conversion of serial FE code for parallel processing. All knowledge-bases developed operated as advisory, selection, or classification systems. Laminate properties are limited to even-numbered, quasi-isotropic ply stacking sequences. This retains the full influence of the coupled in-plane and bending effects in the structure's behavior. The canard is modeled as a constant thickness plate and discretized into a varying number of four or nine-noded, quadrilateral, shear-deformable plate elements. The benefit gained by

  8. Rapid processing of letters, digits and symbols: what purely visual-attentional deficit in developmental dyslexia?

    PubMed

    Ziegler, Johannes C; Pech-Georgel, Catherine; Dufau, Stéphane; Grainger, Jonathan

    2010-07-01

    Visual-attentional theories of dyslexia predict deficits for dyslexic children not only for the perception of letter strings but also for non-alphanumeric symbol strings. This prediction was tested in a two-alternative forced-choice paradigm with letters, digits, and symbols. Children with dyslexia showed significant deficits for letter and digit strings but not for symbol strings. This finding is difficult to explain for visual-attentional theories of dyslexia which postulate identical deficits for letters, digits and symbols. Moreover, dyslexics showed normal W-shaped serial position functions for letter and digit strings, which suggests that their deficit is not due to an abnormally small attentional window. Finally, the size of the deficit was identical for letters and digits, which suggests that poor letter perception is not just a consequence of the lack of reading. Together then, our results show that symbols that map onto phonological codes are impaired (i.e. letters and digits), whereas symbols that do not map onto phonological codes are not impaired. This dissociation suggests that impaired symbol-sound mapping rather than impaired visual-attentional processing is the key to understanding dyslexia. PMID:20590718

  9. Techniques for the processing of remotely sensed imagery. [digital processing of satellite imagery

    NASA Technical Reports Server (NTRS)

    Deutsch, E. S.; Rosenfeld, A.

    1974-01-01

    The following techniques are considered for classifying low resolution satellite imagery: (1) Gradient operations; (2) histogram methods; (3) gray level detection; (4) frequency domain operations; (5) Hadamard transform in digital image matching; and (6) edge and line detection schemes.

  10. Methodology for the Elimination of Reflection and System Vibration Effects in Particle Image Velocimetry Data Processing

    NASA Technical Reports Server (NTRS)

    Bremmer, David M.; Hutcheson, Florence V.; Stead, Daniel J.

    2005-01-01

    A methodology to eliminate model reflection and system vibration effects from post processed particle image velocimetry data is presented. Reflection and vibration lead to loss of data, and biased velocity calculations in PIV processing. A series of algorithms were developed to alleviate these problems. Reflections emanating from the model surface caused by the laser light sheet are removed from the PIV images by subtracting an image in which only the reflections are visible from all of the images within a data acquisition set. The result is a set of PIV images where only the seeded particles are apparent. Fiduciary marks painted on the surface of the test model were used as reference points in the images. By locating the centroids of these marks it was possible to shift all of the images to a common reference frame. This image alignment procedure as well as the subtraction of model reflection are performed in a first algorithm. Once the images have been shifted, they are compared with a background image that was recorded under no flow conditions. The second and third algorithms find the coordinates of fiduciary marks in the acquisition set images and the background image and calculate the displacement between these images. The final algorithm shifts all of the images so that fiduciary mark centroids lie in the same location as the background image centroids. This methodology effectively eliminated the effects of vibration so that unbiased data could be used for PIV processing. The PIV data used for this work was generated at the NASA Langley Research Center Quiet Flow Facility. The experiment entailed flow visualization near the flap side edge region of an airfoil model. Commercial PIV software was used for data acquisition and processing. In this paper, the experiment and the PIV acquisition of the data are described. The methodology used to develop the algorithms for reflection and system vibration removal is stated, and the implementation, testing and
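
    The two core steps described here (subtracting a reflection-only image and shifting each frame so its fiducial centroid matches the background image) can be sketched in Python as below. This is a simplified stand-in for the authors' algorithms, assuming grayscale arrays and a single bright fiducial mark; the threshold is a hypothetical parameter.

      import numpy as np
      from scipy.ndimage import shift

      def remove_reflection(frame, reflection_only):
          """Subtract an image containing only the laser-sheet reflections so that
          only seeded particles remain; negative values are clipped to zero."""
          return np.clip(frame.astype(float) - reflection_only.astype(float), 0, None)

      def fiducial_centroid(img, threshold):
          """Centroid (row, col) of the fiducial-mark pixels above a brightness threshold."""
          rows, cols = np.nonzero(img > threshold)
          return rows.mean(), cols.mean()

      def align_to_background(frame, background, threshold):
          """Shift a frame so its fiducial centroid coincides with the background's."""
          fr, fc = fiducial_centroid(frame, threshold)
          br, bc = fiducial_centroid(background, threshold)
          return shift(frame, (br - fr, bc - fc), order=1, mode="nearest")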

  11. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R.; Bingham, Philip R.

    2006-10-03

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method includes of obtaining multiple spatially-heterodyned holograms, includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first, object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
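
    The Fourier-space origin shift and digital filtering described in the claims can be illustrated with a short NumPy sketch for a single hologram; the carrier frequency and cutoff are assumed known (in practice they follow from the reference/object beam angle), and this is an illustration of the general principle rather than the patented processing chain.

      import numpy as np

      def demodulate_hologram(holo, carrier, cutoff):
          """Recover the complex field from one spatially-heterodyned hologram.

          holo    : 2-D real array of recorded intensity
          carrier : (fy, fx) spatial-heterodyne carrier frequency, cycles/pixel
          cutoff  : radius of the digital low-pass filter, cycles/pixel"""
          ny, nx = holo.shape
          y, x = np.mgrid[0:ny, 0:nx]
          # Multiplying by the conjugate carrier shifts the sideband to the origin
          shifted = holo * np.exp(-2j * np.pi * (carrier[0] * y + carrier[1] * x))
          F = np.fft.fftshift(np.fft.fft2(shifted))
          fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
          fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
          F[fy**2 + fx**2 > cutoff**2] = 0            # cut off signals away from the new origin
          field = np.fft.ifft2(np.fft.ifftshift(F))
          return field                                 # abs() -> amplitude, angle() -> phase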

  12. Faster processing of multiple spatially-heterodyned direct to digital holograms

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-09-09

    Systems and methods are described for faster processing of multiple spatially-heterodyned direct to digital holograms. A method includes of obtaining multiple spatially-heterodyned holograms, includes: digitally recording a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; digitally recording a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a first angle between a first reference beam and a first object beam; applying a first digital filter to cut off signals around the first original origin and performing an inverse Fourier transform on the result; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram including spatial heterodyne fringes in Fourier space to sit on top of a spatial-heterodyne carrier frequency defined as a second angle between a second reference beam and a second object beam; and applying a second digital filter to cut off signals around the second original origin and performing an inverse Fourier transform on the result, wherein digitally recording the first spatially-heterodyned hologram is completed before digitally recording the second spatially-heterodyned hologram and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.

  13. Experimental study of digital image processing techniques for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Rifman, S. S. (Principal Investigator); Allendoerfer, W. B.; Caron, R. H.; Pemberton, L. J.; Mckinnon, D. M.; Polanski, G.; Simon, K. W.

    1976-01-01

    The author has identified the following significant results. Results are reported for: (1) subscene registration, (2) full scene rectification and registration, (3) resampling techniques, (4) and ground control point (GCP) extraction. Subscenes (354 pixels x 234 lines) were registered to approximately 1/4 pixel accuracy and evaluated by change detection imagery for three cases: (1) bulk data registration, (2) precision correction of a reference subscene using GCP data, and (3) independently precision processed subscenes. Full scene rectification and registration results were evaluated by using a correlation technique to measure registration errors of 0.3 pixel rms thoughout the full scene. Resampling evaluations of nearest neighbor and TRW cubic convolution processed data included change detection imagery and feature classification. Resampled data were also evaluated for an MSS scene containing specular solar reflections.

  14. Implementation of a Digital Signal Processing Subsystem for a Long Wavelength Array Station

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Navarro, Robert; D'Addario, Larry; Sigman, Elliott; Wang, Douglas

    2011-01-01

    This paper describes the implementation of a Digital Signal Processing (DSP) subsystem for a single Long Wavelength Array (LWA) station. The LWA is a radio telescope that will consist of many phased array stations. Each LWA station consists of 256 pairs of dipole-like antennas operating over the 10-88 MHz frequency range. The Digital Signal Processing subsystem digitizes up to 260 dual-polarization signals at 196 MHz from the LWA Analog Receiver, adjusts the delay and amplitude of each signal, and forms four independent beams. Coarse delay is implemented using a first-in-first-out buffer and fine delay is implemented using a finite impulse response filter. Amplitude adjustment and polarization corrections are implemented using a 2x2 matrix multiplication.
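
    A floating-point NumPy sketch of the three per-signal operations named above (integer-sample delay, fractional delay with a short FIR, and a 2x2 matrix correction) is given below for orientation; it is not the fixed-point FPGA implementation, and the filter length and windowing are illustrative choices.

      import numpy as np

      def coarse_delay(x, n):
          """Integer-sample delay (implemented with a FIFO buffer in hardware)."""
          return np.concatenate([np.zeros(n), x[:len(x) - n]])

      def fine_delay(x, frac, taps=16):
          """Fractional-sample delay using a windowed-sinc FIR filter."""
          n = np.arange(taps) - (taps - 1) / 2
          h = np.sinc(n - frac) * np.hamming(taps)
          h /= h.sum()
          return np.convolve(x, h, mode="same")

      def polarization_correct(x_pol, y_pol, J):
          """Apply a 2x2 (Jones-style) matrix to a dual-polarization sample stream."""
          v = np.vstack([x_pol, y_pol])
          return J @ v    # rows are the corrected X and Y streams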

  15. Pulsed digital holography system recording ultrafast process of the femtosecond order

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Zhai, Hongchen; Mu, Guoguang

    2006-06-01

    We report, for the first time to our knowledge, a pulsed digital microholographic system with spatial angular multiplexing for recording the ultrafast process of the femtosecond order. The optimized design of the two sets of subpulse-train generators in this system makes it possible to implement a digital holographic recording with spatial angular multiplexing of a frame interval of the femtosecond order, while keeping the incident angle of the object beams unchanged. Three pairs of amplitude and phase images from the same view angle digitally reconstructed by the system demonstrated the ultrafast dynamic process of laser-induced ionization of ambient air at a wavelength of 800 nm, with a time resolution of 50 fs and a frame interval of 300 fs.

  16. Time-resolved digital holographic microscopy of laser-induced forward transfer process

    PubMed Central

    Ma, H.; Venugopalan, V.

    2014-01-01

    We develop a method for time-resolved digital holographic microscopy to obtain time-resolved 3-D deformation measurements of laser induced forward transfer (LIFT) processes. We demonstrate nanometer axial resolution and nanosecond temporal resolution of our method which is suitable for measuring dynamic morphological changes in LIFT target materials. Such measurements provide insight into the early dynamics of the LIFT process and a means to examine the effect of laser and material parameters on LIFT process dynamics. PMID:24748724

  17. Time-resolved digital holographic microscopy of laser-induced forward transfer process.

    PubMed

    Ma, H; Venugopalan, V

    2014-03-01

    We develop a method for time-resolved digital holographic microscopy to obtain time-resolved 3-D deformation measurements of laser induced forward transfer (LIFT) processes. We demonstrate nanometer axial resolution and nanosecond temporal resolution of our method which is suitable for measuring dynamic morphological changes in LIFT target materials. Such measurements provide insight into the early dynamics of the LIFT process and a means to examine the effect of laser and material parameters on LIFT process dynamics. PMID:24748724

  18. Application of digital image processing techniques to astronomical imagery, 1979

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1979-01-01

    Several areas of applications of image processing to astronomy were identified and discussed. These areas include: (1) deconvolution for atmospheric seeing compensation; a comparison between maximum entropy and conventional Wiener algorithms; (2) polarization in galaxies from photographic plates; (3) time changes in M87 and methods of displaying these changes; (4) comparing emission line images in planetary nebulae; and (5) log intensity, hue saturation intensity, and principal component color enhancements of M82. Examples are presented of these techniques applied to a variety of objects.

  19. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    PubMed

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative, and stream-lined workflow in order to provide high quality of patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of treatment workflow for improving the delivery systems. Project management techniques, Responsibility matrix and a swim-lane activity diagram representing sequence of activities can be combined for data collection, presentation, and evaluation of the patient care. This paper presents this integrated methodology using multidisciplinary meetings and walking the route approach for data collection, integrated responsibility matrix and swim-lane activity diagram with activity time for data representation and 5-why and gap analysis approach for data analysis. This enables collection of right detail of information in a shorter time frame by identifying process flaws and deficiencies while being independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal training of staff and rapid implementation. PMID:22092497

  20. Improvement of the detection rate in digital watermarked images against image degradation caused by image processing

    NASA Astrophysics Data System (ADS)

    Nishio, Masato; Ando, Yutaka; Tsukamoto, Nobuhiro; Kawashima, Hironao; Nakamura, Shinya

    2004-04-01

    In the current environment of medical information disclosure, general-purpose image formats such as JPEG/BMP, which do not require special software for viewing, are suitable for carrying and managing medical image information individually. However, these formats carry no patient or study information. We have therefore developed two kinds of ID embedding methods: one is a bit-swapping method for embedding an alteration-detection ID, and the other is a data-imposing method in the transform domain using the Discrete Cosine Transform (DCT) for embedding an original-image-source ID. We then applied these two digital watermark methods to four modality images (Chest X-ray, Head CT, Abdomen CT, Bone scintigraphy). However, there were some cases where the digitally watermarked ID could not be detected correctly due to image degradation caused by image processing. In this study, we improved the detection rate in digitally watermarked images using several techniques: an error correction method, a majority correction method, and a scramble location method. We applied these techniques to digitally watermarked images subjected to image processing (smoothing) and evaluated their effectiveness. As a result, the majority correction method proved effective in improving the detection rate of watermarked images against image degradation.
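
    The majority correction idea, as described, amounts to recovering several copies of the embedded ID and taking a bit-wise vote. The Python sketch below shows only that voting step; the embedding and extraction stages, and the example bit pattern, are not taken from the paper.

      import numpy as np

      def majority_correct(extracted_copies):
          """Bit-wise majority vote over several extracted copies of a watermark ID.

          `extracted_copies` is a list of equal-length 0/1 sequences recovered from
          different image regions; voting suppresses bit errors introduced by image
          processing such as smoothing."""
          votes = np.sum(np.asarray(extracted_copies, dtype=int), axis=0)
          return (votes * 2 > len(extracted_copies)).astype(int)

      # e.g. three noisy copies of a hypothetical 8-bit ID 10110010
      copies = [[1, 0, 1, 1, 0, 0, 1, 0],
                [1, 0, 1, 0, 0, 0, 1, 0],
                [1, 1, 1, 1, 0, 0, 1, 1]]
      print(majority_correct(copies))    # [1 0 1 1 0 0 1 0]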

  1. Optimization of laser-assisted glass frit bonding process by response surface methodology

    NASA Astrophysics Data System (ADS)

    Wang, Wen; Xiao, Yanyi; Wu, Xingyang; Zhang, Jianhua

    2016-03-01

    In this work, a systematic study on laser-assisted glass frit bonding process was carried out by response surface methodology (RSM). Laser power, sealing speed and spot diameter were considered as key bonding parameters. Combined with a central rotatable experimental design, RSM was employed to establish mathematical model to predict the relationship between the shear force after bonding and the bonding process parameters. The model was validated experimentally. Based on the model, the interaction effects of the process parameters on the shear force were analyzed and the optimum bonding parameters were achieved. The results indicate that the model can be used to illustrate the relationship between the shear force and the bonding parameters. The predicted results obtained under the optimized parameters by the models are consistent with the experimental results.

  2. A comparison of orthogonal transformations for digital speech processing.

    NASA Technical Reports Server (NTRS)

    Campanella, S. J.; Robinson, G. S.

    1971-01-01

    Discrete forms of the Fourier, Hadamard, and Karhunen-Loeve transforms are examined for their capacity to reduce the bit rate necessary to transmit speech signals. To rate their effectiveness in accomplishing this goal the quantizing error (or noise) resulting for each transformation method at various bit rates is computed and compared with that for conventional companded PCM processing. Based on this comparison, it is found that Karhunen-Loeve provides a reduction in bit rate of 13.5 kbits/s, Fourier 10 kbits/s, and Hadamard 7.5 kbits/s as compared with the bit rate required for companded PCM. These bit-rate reductions are shown to be somewhat independent of the transmission bit rate.

  3. Digital Signal Processing for the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2015-08-01

    A broad international collaboration is building the Event Horizon Telescope (EHT). The aim is to test Einstein’s theory of General Relativity in one of the very few places it could break down: the strong gravity regime right at the edge of a black hole. The EHT is an earth-size VLBI array operating at the shortest radio wavelengths, that has achieved unprecedented angular resolution of a few tens of μarcseconds. For nearby super massive black holes (SMBH) this size scale is comparable to the Schwarzschild Radius, and emission in the immediate neighborhood of the event horizon can be directly observed. We give an introduction to the science behind the CASPER-enabled EHT, and outline technical developments, with emphasis on the secret sauce of high speed signal processing.

  4. Proceedings of the Fourth Annual Workshop on the Use of Digital Computers in Process Control.

    ERIC Educational Resources Information Center

    Smith, Cecil L., Ed.

    Contents: Computer hardware testing (results of vendor-user interaction); CODIL (a new language for process control programing); the design and implementation of control systems utilizing CRT display consoles; the systems contractor - valuable professional or unnecessary middle man; power station digital computer applications; from inspiration to…

  5. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... private key from the system memory to prevent the unauthorized access to, or use of, the private key. (7... certificate holder and recipient of an electronic order may use any system to write, track, or maintain orders provided that the system has been enabled to process digitally signed documents and that it meets...

  6. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... private key from the system memory to prevent the unauthorized access to, or use of, the private key. (7... certificate holder and recipient of an electronic order may use any system to write, track, or maintain orders provided that the system has been enabled to process digitally signed documents and that it meets...

  7. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... private key from the system memory to prevent the unauthorized access to, or use of, the private key. (7... certificate holder and recipient of an electronic order may use any system to write, track, or maintain orders provided that the system has been enabled to process digitally signed documents and that it meets...

  8. 21 CFR 1311.55 - Requirements for systems used to process digitally signed orders.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... deactivated, the system must clear the plain text private key from the system memory to prevent the.... (a) A CSOS certificate holder and recipient of an electronic order may use any system to write, track, or maintain orders provided that the system has been enabled to process digitally signed...

  9. Electronic post-compensation of WDM transmission impairments using coherent detection and digital signal processing.

    PubMed

    Li, Xiaoxu; Chen, Xin; Goldfarb, Gilad; Mateo, Eduardo; Kim, Inwoong; Yaman, Fatih; Li, Guifang

    2008-01-21

    A universal post-compensation scheme for fiber impairments in wavelength-division multiplexing (WDM) systems is proposed based on coherent detection and digital signal processing (DSP). Transmission of 10 x 10 Gbit/s binary-phase-shift-keying (BPSK) signals at a channel spacing of 20 GHz over 800 km dispersion shifted fiber (DSF) has been demonstrated numerically. PMID:18542162

  10. An Undergraduate Course and Laboratory in Digital Signal Processing with Field Programmable Gate Arrays

    ERIC Educational Resources Information Center

    Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.

    2010-01-01

    In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…

  11. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  12. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  13. Digital image processing applications in the ignition and combustion of char/coal particles

    SciTech Connect

    Annamalai, K.; Kharbat, E.; Goplakrishnan, C.

    1992-12-01

    Digital image processing is employed in this research study in order to visually investigate the ignition and combustion characteristics of isolated char/coal particles as well as the effect of interactive combustion in two-particle char/coal arrays. Preliminary experiments are conducted on miniature isolated candles as well as two-candle arrays.

  14. Wavelet image processing applied to optical and digital holography: past achievements and future challenges

    NASA Astrophysics Data System (ADS)

    Jones, Katharine J.

    2005-08-01

    The link between wavelets and optics goes back to the work of Dennis Gabor who both invented holography and developed Gabor decompositions. Holography involves 3-D images. Gabor decompositions involves 1-D signals. Gabor decompositions are the predecessors of wavelets. Wavelet image processing of holography, both optical holography and digital holography, will be examined with respect to past achievements and future challenges.

  15. Advanced Signal Processing Methods Applied to Digital Mammography

    NASA Technical Reports Server (NTRS)

    Stauduhar, Richard P.

    1997-01-01

    The work reported here is an extension of the earlier proposal of the same title, August 1994-June 1996. The report for that work is also being submitted. The work reported there forms the foundation for this work from January 1997 to September 1997. After the earlier work was completed there were a few items that needed to be completed prior to submission of a new and more comprehensive proposal for further research. Those tasks have been completed and two new proposals have been submitted, one to NASA and one to Health & Human Services (HHS). The main purpose of this extension was to refine some of the techniques that lead to automatic large-scale evaluation of full mammograms. Progress on each of the proposed tasks follows. Task 1: A multiresolution segmentation of background from breast has been developed and tested. The method is based on the different noise characteristics of the two fields. The breast field has more power in the lower octaves, and the off-breast field behaves similarly to a wideband process, where more power is in the high-frequency octaves. After the two fields are separated by lowpass filtering, a region labeling routine is used to find the largest contiguous region, the breast. Task 2: A wavelet expansion that can decompose the image without zero padding has been developed. The method preserves all properties of the power-of-two wavelet transform and does not add appreciably to computation time or storage. This work is essential for analysis of the full mammogram, as opposed to selecting sections from the full mammogram. Task 3: A clustering method has been developed based on a simple counting mechanism. No ROC analysis has been performed (and was not proposed), so we cannot finally evaluate this work without further support. Task 4: Further testing of the filter reveals that different wavelet bases do yield slightly different qualitative results. We cannot provide quantitative conclusions about this for all possible bases
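
    The breast/background separation in Task 1 is based on the two fields' different frequency content. A simplified Python stand-in is sketched below (it is not the reported multiresolution algorithm): it compares local low- and high-frequency power after a lowpass split and keeps the largest contiguous region. The filter widths and the power-ratio threshold are illustrative assumptions.

      import numpy as np
      from scipy import ndimage

      def segment_breast(img, sigma=4, win=15):
          """Separate breast from off-breast background using local frequency content:
          the breast carries most of its power at low frequencies, while the background
          behaves more like wideband noise."""
          img = img.astype(float)
          low = ndimage.gaussian_filter(img, sigma)              # low-frequency part
          high = img - low                                       # high-frequency residual
          low_power = ndimage.uniform_filter(low**2, win)
          high_power = ndimage.uniform_filter(high**2, win) + 1e-12
          mask = low_power / high_power > 1.0                    # low-frequency dominated
          labels, n = ndimage.label(mask)                        # region labeling
          if n == 0:
              return mask
          sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
          return labels == (1 + int(np.argmax(sizes)))           # largest contiguous region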

  16. A Novel Optical/digital Processing System for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Boone, Bradley G.; Shukla, Oodaye B.

    1993-01-01

    This paper describes two processing algorithms that can be implemented optically: the Radon transform and angular correlation. These two algorithms can be combined in one optical processor to extract all the basic geometric and amplitude features from objects embedded in video imagery. We show that the internal amplitude structure of objects is recovered by the Radon transform, which is a well-known result, but, in addition, we show simulation results that calculate angular correlation, a simple but unique algorithm that extracts object boundaries from suitably thresholded images, from which length, width, area, aspect ratio, and orientation can be derived. In addition to circumventing scale and rotation distortions, these simulations indicate that the features derived from the angular correlation algorithm are relatively insensitive to tracking shifts and image noise. Some optical architecture concepts, including one based on micro-optical lenslet arrays, have been developed to implement these algorithms. Simulation testing and evaluation using simple synthetic object data will be described, including results of a study that uses object boundaries (derivable from angular correlation) to classify simple objects using a neural network.
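
    A digital illustration of the two ingredients is sketched below in Python: the Radon transform via scikit-image, and, as a rough stand-in for the angular correlation feature (not the paper's algorithm), a polar boundary signature of a thresholded object from which length, width, aspect ratio and orientation could be estimated.

      import numpy as np
      from skimage.transform import radon

      def radon_profile(image, angles=np.arange(0.0, 180.0, 1.0)):
          """Radon transform: line-integral projections that expose the internal
          amplitude structure of an object."""
          return radon(image, theta=angles, circle=False)

      def boundary_signature(binary_obj, n_angles=360):
          """Distance from the object centroid to the farthest object pixel in each
          direction; a simple boundary descriptor used here only as an illustrative
          proxy for angular-correlation-style boundary extraction."""
          rows, cols = np.nonzero(binary_obj)
          r0, c0 = rows.mean(), cols.mean()
          ang = np.arctan2(rows - r0, cols - c0)
          dist = np.hypot(rows - r0, cols - c0)
          bins = ((ang + np.pi) / (2 * np.pi) * n_angles).astype(int) % n_angles
          sig = np.zeros(n_angles)
          np.maximum.at(sig, bins, dist)
          return sig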

  17. Matching rendered and real world images by digital image processing

    NASA Astrophysics Data System (ADS)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated images (CGI) have been used in commercial and industrial photography, providing a broad scope in product advertising. Mixing real world images with those rendered by virtual-space software shows a more or less visible mismatch in image quality between the two. Rendered images are produced by software whose quality is limited only by the output resolution. Real world images are taken with cameras subject to image degradation factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color pattern demosaicing, etc. The effect of all those image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object by the system PSF, its characterization shows the amount of image degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real world image qualities. The system MTF is determined by the slanted edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different final image regions.
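
    Assuming the camera PSF is modelled as a Gaussian, its width can be tied to the slanted-edge MTF measurement through MTF(f) = exp(-2*pi^2*sigma^2*f^2), and the rendered image blurred accordingly. The Python sketch below illustrates this; the MTF50-based parameterization is an assumption for illustration, not the authors' exact fitting procedure.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def sigma_from_mtf50(mtf50_cyc_per_px):
          """Invert MTF(f) = exp(-2*pi^2*sigma^2*f^2) at the measured MTF50
          frequency (cycles/pixel) from the slanted-edge test."""
          return np.sqrt(np.log(2.0) / (2.0 * np.pi**2 * mtf50_cyc_per_px**2))

      def degrade_render(rendered_rgb, mtf50_cyc_per_px):
          """Blur a rendered image so its sharpness roughly matches the camera's."""
          sigma = sigma_from_mtf50(mtf50_cyc_per_px)
          return np.stack([gaussian_filter(rendered_rgb[..., c].astype(float), sigma)
                           for c in range(rendered_rgb.shape[-1])], axis=-1)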

  18. Discovery of inscriptions on the shroud of Turin by digital image processing

    NASA Astrophysics Data System (ADS)

    Marion, Andre

    1998-08-01

    Almost invisible Latin and Greek inscriptions were discovered on the shroud of Turin around the image of the face. This result was obtained by using digital image processing techniques. An original method is developed to eliminate the texture of the cloth and also to combine data from different photographs or from the same plate digitized under various conditions. According to some paleographists, the revealed characters are thought to date back to before the Middle Ages. This conclusion could be a new argument in favor of the authenticity of the shroud.

  19. Realization of guitar audio effects using methods of digital signal processing

    NASA Astrophysics Data System (ADS)

    Buś, Szymon; Jedrzejewski, Konrad

    2015-09-01

    The paper is devoted to the study of possibilities for realizing guitar audio effects by means of digital signal processing methods. As a result of this research, selected audio effects suited to the specifics of guitar sound were realized as a real-time system called the Digital Guitar Multi-effect. Before implementation in the system, the selected effects were investigated using a dedicated application with a graphical user interface created in the Matlab environment. In the second stage, the real-time system based on a microcontroller and an audio codec was designed and realized. The system is designed to perform audio effects on the output signal of an electric guitar.
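
    Two of the typical effects such a system implements can be sketched offline in a few lines. The example below, assuming NumPy, shows a tanh overdrive and a feedback delay; parameter values and function names are illustrative, and the real-time microcontroller implementation in the paper is of course different.

      import numpy as np

      def overdrive(x, gain=8.0):
          # Soft-clipping distortion: tanh saturation of the amplified signal.
          return np.tanh(gain * np.asarray(x, dtype=float))

      def feedback_delay(x, fs, delay_ms=350.0, feedback=0.4, mix=0.35):
          # Echo effect realized sample by sample with a circular delay buffer.
          x = np.asarray(x, dtype=float)
          d = max(1, int(fs * delay_ms / 1000.0))
          buf = np.zeros(d)
          y = np.empty_like(x)
          for n in range(len(x)):
              delayed = buf[n % d]
              y[n] = (1.0 - mix) * x[n] + mix * delayed
              buf[n % d] = x[n] + feedback * delayed
          return y

      # e.g. processed = feedback_delay(overdrive(guitar_samples), fs=44100)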

  20. The heart's fibre alignment assessed by comparing two digitizing systems. Methodological investigation into the inclination angle towards wall thickness.

    PubMed

    Lunkenheimer, P P; Redmann, K; Dietl, K H; Cryer, C; Richter, K D; Whimster, W F; Niederer, P

    1997-04-01

    Myocardial contractile pathways which are not aligned strictly parallel to the heart's epicardial surface give rise to forces which also act in the ventricular dilating direction. We developed a method which allows us to assess any fibre orientation in the three-dimensional myocardial weave. Decollagenized hearts were prepared by peeling-off fibre strands, following their main fibre orientation down to near the endocardium. In the subepicardium the strands followed a course more or less parallel to the epicardium, whereas from the mid-wall on they tended to dive progressively deeper into the wall. The preparation displays more or less rugged surfaces rather than smooth layers. The grooves and crests on the exposed surfaces were sequentially digitized by two methods: (1) Using a magnetic tablet (3 Draw Digitizer System, Polhemus, Colchester, VT 05446, USA) on a dilated pig heart we manually followed the crests using a stylus, handling each groove and crest as an individual contractile pathway. (2) A constricted cow heart was digitized using a contact-free optical system (opto TOP, Dr. Breuckmann, Meersburg, Germany), which is based on the principle of imaging triangulation. Using specially developed software the inclination angles of selected crests and grooves with respect to the epicardial surface were calculated. The two digitizing methods yield comparable results. We found a depth- and side-specific weave component inclined to the epi-endocardial direction. This oblique netting component was more pronounced in the inner 1/3 of the wall than in the subepicardium. The inclination angle probably increases with increasing wall thickness during the ejection period. Manual digitizing is an easy and fast method which delivers consistent results comparable with those obtained by the cumbersome high resolution optical method. The rationales for the assessment of transmural fibre inclination are (1) the putative existence of dilating forces inherent in the myocardial weave

  1. Image processing system architecture using parallel arrays of digital signal processors

    NASA Astrophysics Data System (ADS)

    Kshirsagar, Shirish P.; Hobson, Clifford A.; Hartley, David A.; Harvey, David M.

    1993-10-01

    The paper describes the requirements of a high definition, high speed image processing system. Different types of parallel architectures were considered for the system. Advantages and limitations of SIMD and MIMD architectures are briefly discussed for image processing applications. A parallel image processing system based on MIMD architecture has been developed using multiple digital signal processors which can communicate with each other through an interconnection network. Texas Instruments TMS320C40 digital signal processors have been selected because they have a powerful floating point CPU supported by fast parallel communication ports, a DMA coprocessor and two memory interfaces. A five processor system is described in the paper. The EISA bus is used as the host interface and VISION bus is used to transfer images between the processors. The system is being used for automated non-contact inspection in which electro-optic signals are processed to identify manufacturing problems.

  2. Digital Light Processing for high-brightness high-resolution applications

    NASA Astrophysics Data System (ADS)

    Hornbeck, Larry J.

    1997-05-01

    Electronic projection display technology for high-brightness applications had its origins in the Gretag Eidophor, an oil film-based projection system developed in the early 1940s. A number of solid state technologies have challenged the Eidophor, including CRT-addressed LCD light valves and active-matrix-addressed LCD panels. More recently, in response to various limitations of the LCD technologies, high-brightness systems have been developed based on Digital Light Processing technology. At the heart of the DLP projection display is the Digital Micromirror Device, a semiconductor-based array of fast, reflective digital light switches that precisely control a light source using a binary pulsewidth modulation technique. This paper describes the design, operation, performance, and advantages of DLP- based projection systems for high-brightness, high- resolution applications. It also presents the current status of high-brightness products that will soon be on the market.

  3. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from microchannel plate (MCP) based time-of-flight detectors and ionization based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms a part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse processing algorithms to suit this application are described in detail.

  4. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards for overthrust exploration. However the seismic data recorded in many overthrust areas is disappointingly poor. Challenges such as rough topography, complex subsurface structure, presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  5. Pre-Processing of Point-Data from Contact and Optical 3D Digitization Sensors

    PubMed Central

    Budak, Igor; Vukelić, Djordje; Bračun, Drago; Hodolič, Janko; Soković, Mirko

    2012-01-01

    Contemporary 3D digitization systems employed by reverse engineering (RE) feature ever-growing scanning speeds with the ability to generate a large quantity of points in a unit of time. Although advantageous for the quality and efficiency of RE modelling, the huge number of data points can turn into a serious practical problem later on, when the CAD model is generated. In addition, 3D digitization processes are very often plagued by measuring errors, which can be attributed to the very nature of measuring systems, various characteristics of the digitized objects and subjective errors by the operator, which also contribute to problems in the CAD model generation process. This paper presents an integral system for the pre-processing of point data, i.e., filtering, smoothing and reduction, based on a cross-sectional RE approach. In the course of the proposed system development, major emphasis was placed on the module for point data reduction, which was designed according to a novel approach with integrated deviation analysis and fuzzy logic reasoning. The developed system was verified through its application on three case studies, on point data from objects of versatile geometries obtained by contact and laser 3D digitization systems. The obtained results demonstrate the effectiveness of the system. PMID:22368513
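
    The reduction step can be illustrated with a simple deviation criterion on one digitized cross-section. The sketch below, assuming NumPy, keeps a point only if it deviates sufficiently from the chord between the last kept point and the next neighbour; it does not implement the paper's fuzzy-logic reasoning, and the tolerance value and function name are illustrative.

      import numpy as np

      def reduce_section(points, tol=0.05):
          # `points` is an ordered 2-D cross-section polyline; a point is kept only
          # if it deviates from the chord joining the last kept point and its next
          # neighbour by at least `tol` (a simple deviation-analysis criterion).
          pts = np.asarray(points, dtype=float)
          if len(pts) < 3:
              return pts
          keep = [0]
          for i in range(1, len(pts) - 1):
              a, b, p = pts[keep[-1]], pts[i + 1], pts[i]
              chord = b - a
              norm = np.hypot(chord[0], chord[1])
              # Perpendicular distance of p from the chord a-b.
              dev = abs(chord[0] * (p - a)[1] - chord[1] * (p - a)[0]) / norm if norm else 0.0
              if dev >= tol:
                  keep.append(i)
          keep.append(len(pts) - 1)
          return pts[keep]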

  6. Digital processing of signals from LaBr3:Ce scintillation detectors

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.; Podolyak, Zs; Regan, P. H.

    2014-12-01

    In this paper, we report on the results of digital signal processing of LaBr3(Ce) detectors. The photomultiplier (PMT) output signals from two cylindrical LaBr3(Ce) detectors (1.5'' diameter and 2'' tall) were directly digitized with an ultrafast digitizer (sampling rate up to 4 GSample/s and 10-bit resolution) and the energy and timing information were extracted through offline analysis of the pulses. It is shown that at high sampling rates (4 GS/s) a simple integration of pulses is sufficient to reproduce the analogue energy resolution of the detectors (3.5% at 662 keV energy) and by employing a digital version of constant-fraction discrimination (CFD) timing a time resolution of 240 ps (FWHM) is achieved at the energy lines of 60Co. The effects of pulse sampling rate were studied, indicating a degradation of detector performance as the pulse sampling rate is reduced. In particular, it was found that at sampling rates below 1 GS/s, the digital timing can be limited by the aliasing error. By using an anti-aliasing filter, a time resolution of 375 ps (FWHM) and an energy resolution of 3.5% at 662 keV were achieved with a sampling rate of 500 MS/s.
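
    The two offline operations described, pulse integration for energy and a digital constant-fraction discriminator for timing, can be sketched as follows, assuming NumPy and a pulse record that starts with a baseline segment; the fraction, delay and baseline length are illustrative and not the values used with the 4 GS/s digitizer.

      import numpy as np

      def pulse_energy(samples, baseline_n=20):
          # Energy by simple integration of the baseline-subtracted pulse.
          s = np.asarray(samples, dtype=float)
          return float(np.sum(s - s[:baseline_n].mean()))

      def cfd_time(samples, fraction=0.3, delay=4, baseline_n=20):
          # Digital constant-fraction discrimination: the attenuated pulse minus its
          # delayed copy gives a bipolar signal whose zero crossing is nearly
          # amplitude independent; linear interpolation yields sub-sample timing
          # relative to the start of the record.
          s = np.asarray(samples, dtype=float)
          s = s - s[:baseline_n].mean()
          shaped = fraction * s[delay:] - s[:-delay]
          i0 = int(np.argmax(shaped))
          for i in range(i0, len(shaped) - 1):
              if shaped[i] > 0.0 >= shaped[i + 1]:
                  return i + shaped[i] / (shaped[i] - shaped[i + 1])
          return None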

  7. Digital processing of the Mariner 10 images of Venus and Mercury

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.

    1977-01-01

    An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.

  8. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  9. Digital Storytelling in a Science Curriculum: The Process of Digital Storytelling to Help the Needs of Fourth Grade Students Understand the Concepts of Food Chains

    NASA Astrophysics Data System (ADS)

    Titus, Una-Bellelinda

    In this study I investigate whether the digital storytelling process helps fourth grade students in an elementary school setting learn science concepts, specifically food chains. I focused on three students who varied in social and academic skills/behaviors to investigate their process in working on a digital story. My findings showed that digital storytelling scripts, storyboards, and graphic organizers helped students create a story telling what happened in their food chain, but students could not retain the information on food chains to help them in taking their post-test. The graphic organizers were able to scaffold and help organize students' thinking. The digital scripts allowed students to comprehend science concepts and explain them to peers.

  10. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology.

    PubMed

    Miles, Elizabeth N

    2006-03-17

    In 1996, Health & Safety introduced an incident investigation process called Learning to Look to Johnson & Johnson. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process analyzed involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication for work-related serious injuries and illness as well as lost workday cases to Corporate Headquarters within 72 h of the incident with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all due to be filed electronically. It is incumbent on the principal investigator and his or her investigative teams to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented, and a full report submitted to Corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances, not all causes were identified. Process excellence was the process used to study the issue. The team used six sigma DMAI2C methodologies to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that lead to system errors. This report will discuss the study findings, recommended improvements, and methods used to monitor the new improved process. PMID:16225990

  11. Study of heat dissipation process from heat sink using lensless Fourier transform digital holographic interferometry.

    PubMed

    Kumar, Varun; Shakher, Chandra

    2015-02-20

    This paper presents the results of experimental investigations about the heat dissipation process of plate fin heat sink using digital holographic interferometry. Visual inspection of reconstructed phase difference maps of the air field around the heat sink with and without electric power in the load resistor provides qualitative information about the variation of temperature and the heat dissipation process. Quantitative information about the temperature distribution is obtained from the relationship between the digitally reconstructed phase difference map of ambient air and heated air. Experimental results are presented for different current and voltage in the load resistor to investigate the heat dissipation process. The effect of fin spacing on the heat dissipation performance of the heat sink is also investigated in the case of natural heat convection. From experimental data, heat transfer parameters, such as local heat flux and convective heat transfer coefficients, are also calculated. PMID:25968185

  12. Computer-enhanced video microscopy: digitally processed microscope images can be produced in real time.

    PubMed Central

    Walter, R J; Berns, M W

    1981-01-01

    Digital processing techniques can be used to greatly enhance the available information in an optical image. Although this technology has been routinely used in many fields for a number of years, little application of digital image-processing techniques has been made toward analysis and enhancement of the types of images seen most often by the research biologist. We describe here a computer-based video microscope system that is capable of performing extensive manipulation and enhancement of microscope images in real time. The types of manipulations possible with these techniques greatly surpass the enhancement capabilities of photographic or video techniques alone. The speed and flexibility of this system enables experimental manipulation of the microscopic specimen based on its live processed image. These features greatly extend the power and versatility of the light microscope. PMID:6947267

  13. A digital process for additive manufacturing of occlusal splints: a clinical pilot study.

    PubMed

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-07-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  14. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum-dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  15. Digital image processing system for a high-powered CO2 laser radar

    NASA Astrophysics Data System (ADS)

    Corbett, Francis J.; Groden, Michael; Dryden, Gordon L.; Pfeiffer, George; Boos, Robert; Youmans, Douglas G.

    1996-11-01

    Textron has designed and built a high-powered CO2 laser radar for long range targeting and remote sensing. This is a coherent, multi-wavelength system with a 2D, wide-band image processing capability. The digital processor produces several output products from the transmitter return signals including range, velocity, angle, and 2D range-Doppler images of hard-body targets (LADAR mode). In addition, the processor sorts and reports on data acquired from gaseous targets by wavelength and integrated path absorption (LIDAR mode). The digital processor has been developed from commercial components with a SUN SPARC 20 serving as the operator workstation and display. The digital output products are produced in real time and stored off-line for post-mission analysis and further target enhancements. This LADAR is distinguished from other designs primarily by the waveforms produced by the laser for target interrogation. The digital processing algorithms are designed to extract certain features through operation on each of the two waveforms. The waveforms are a pulse-tone and a pulse-burst, designed for target acquisition and track, and for 2D imaging, respectively. The algorithms are categorized by function as acquisition/track, 2D imaging, integrated absorption for gaseous targets, and post mission enhancements such as tomographic reconstruction for multiple looks at targets from different perspectives. Field tests are now in progress and results acquired from February to June 1996 will be reported. The digital imaging system, its architecture, algorithms, simulations, and products will be described.
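
    The pulse-burst imaging step can be illustrated by a conventional range-Doppler construction, assuming the burst returns are available as complex baseband samples ordered by pulse. The sketch below, using NumPy, is a generic textbook formulation and not the Textron processor's algorithm; the window choice is illustrative.

      import numpy as np

      def range_doppler_image(echoes, window=True):
          # `echoes`: complex baseband returns with shape (n_pulses, n_range_bins),
          # fast time (range) along columns and slow time (pulse number) along rows.
          data = np.asarray(echoes, dtype=complex)
          if window:
              data = data * np.hanning(data.shape[0])[:, None]  # slow-time taper
          # FFT across the pulse (slow-time) dimension resolves Doppler per range bin.
          doppler = np.fft.fftshift(np.fft.fft(data, axis=0), axes=0)
          return 20.0 * np.log10(np.abs(doppler) + 1e-12)       # image in dB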

  16. Optimization of coagulation-flocculation process for palm oil mill effluent using response surface methodology.

    PubMed

    Ahmad, A L; Ismail, S; Bhatia, S

    2005-04-15

    The coagulation-flocculation process incorporated with membrane separation technology will become a new approach for palm oil mill effluent (POME) treatment as well as water reclamation and reuse. In our current research, a membrane pilot plant has been used for POME treatment where the coagulation-flocculation process plays an important role as a pretreatment process for the mitigation of membrane fouling problems. Pretreated POME with low turbidity values and high water recovery are the main objectives to be achieved through the coagulation-flocculation process. Therefore, treatment optimization to serve these purposes was performed using jar tests and applying a response surface methodology (RSM) to the results. A 2^3 full-factorial central composite design (CCD) was chosen to explain the effect and interaction of three factors: coagulant dosage, flocculant dosage, and pH. The CCD is successfully demonstrated to efficiently determine the optimized parameters: 78% water recovery with a 20 NTU turbidity value can be obtained at optimum values of coagulant dosage, flocculant dosage, and pH of 15 000 mg/L, 300 mg/L, and 6, respectively. PMID:15884382
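
    The RSM fitting and optimization workflow described here can be sketched generically, assuming NumPy and SciPy. The function names, the coded factor range [-1, 1] and the arrays X and y (which would hold the coded jar-test settings and the measured water recovery) are illustrative stand-ins, not the study's actual analysis.

      import numpy as np
      from itertools import combinations
      from scipy.optimize import minimize

      def quadratic_design_matrix(X):
          # Columns: intercept, x_i, x_i^2 and all two-factor interactions x_i*x_j.
          X = np.atleast_2d(np.asarray(X, dtype=float))
          n, k = X.shape
          cols = [np.ones(n)]
          cols += [X[:, i] for i in range(k)]
          cols += [X[:, i] ** 2 for i in range(k)]
          cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
          return np.column_stack(cols)

      def fit_response_surface(X, y):
          # Ordinary least-squares fit of the second-order polynomial model.
          beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X),
                                     np.asarray(y, dtype=float), rcond=None)
          return beta

      # With beta fitted, the predicted response can be maximized over the coded
      # region [-1, 1]^3 (three factors):
      #   beta = fit_response_surface(X, y)
      #   res = minimize(lambda x: -(quadratic_design_matrix(x) @ beta)[0],
      #                  x0=np.zeros(3), bounds=[(-1, 1)] * 3)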

  17. Optimization of permeabilization process of yeast cells for catalase activity using response surface methodology

    PubMed Central

    Trawczyńska, Ilona; Wójcik, Marek

    2015-01-01

    Biotransformation processes accompanied by whole yeast cells as biocatalyst are a promising area of the food industry. Among the chemical sanitizers currently used in food technology, hydrogen peroxide is a very effective microbicidal and bleaching agent. In this paper, permeabilization has been applied to Saccharomyces cerevisiae yeast cells aiming at increased intracellular catalase activity for the decomposition of H2O2. Ethanol, which is non-toxic, biodegradable and easily available, has been used as the permeabilization factor. Response surface methodology (RSM) has been applied in determining the influence of different parameters on the permeabilization process. The aim of the study was to find such values of the process parameters that would yield maximum activity of catalase during decomposition of hydrogen peroxide. The optimum operating conditions for the permeabilization process obtained by RSM were as follows: 53% (v/v) ethanol concentration, temperature of 14.8 °C and treatment time of 40 min. After permeabilization, the activity of catalase increased ca. 40 times and its maximum value equalled 4711 U/g. PMID:26019618

  18. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    PubMed Central

    Arulmathi, P.; Elangovan, G.; Begum, A. Farjana

    2015-01-01

    The distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables was reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables, and chemical oxygen demand (COD) and color removal efficiency were considered as response variables for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively. PMID:26491716

  19. Microspectrofluorometry by digital image processing: measurement of cytoplasmic pH

    SciTech Connect

    Tanasugarn, L.; McNeil, P.; Reynolds, G.T.; Taylor, D.L.

    1984-02-01

    Our microspectrofluorometer has been interfaced with an image processing system to perform microspectrofluorometric measurements in living cells by digital image processing. Fluorescence spectroscopic parameters can be measured by digital image processing directly from microscopic images of cells, and are automatically normalized for pathlength and accessible volume. Thus, an accurate cytoplasmic map of various spectroscopic parameters can be produced. The resting cytoplasmic pH of fibroblasts (3T3 cells) has been determined by measuring the ratio of fluorescein fluorescence excited at two successive wavelengths (489 and 452 nm). Fluorescein-labeled dextran microinjected into the cells is used as a pH indicator, since it is trapped in the cytoplasm but is excluded from the nucleus and other organelles. The average cytoplasmic pH is 6.83 (+/- 0.38). However, cytoplasmic pH exhibits a non-unimodal distribution, the lower mean pH being 6.74 (+/- 0.23). When 3T3 cells pinocytose medium containing fluorescein dextran, pinosomes peripheral to the nucleus exhibit a lower pH than those closer to the ruffling edge of the cell. The present image processing system is analyzed for linearity of detection, light scattering artifacts, signal to noise ratio, standard curves, and spatial resolution. The results obtained from digital image analysis are shown to be comparable to the results from standard microspectrofluorometry. We also discuss several other applications of this ratio imaging technique in cell biology.
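
    The ratio-imaging calculation can be sketched as follows, assuming NumPy and a monotonic in-vitro calibration curve relating the 489/452 nm excitation ratio to pH; the background handling, calibration arrays and function name are illustrative.

      import numpy as np

      def ratio_ph_map(img_489, img_452, calib_ratio, calib_ph, background=0.0):
          # Excitation-ratio imaging: the 489/452 nm fluorescence ratio is pathlength
          # and concentration independent, and is mapped to pH through an in-vitro
          # calibration curve (assumed monotonic so interpolation is valid).
          num = np.asarray(img_489, dtype=float) - background
          den = np.clip(np.asarray(img_452, dtype=float) - background, 1e-6, None)
          ratio = num / den
          order = np.argsort(calib_ratio)
          return np.interp(ratio,
                           np.asarray(calib_ratio, dtype=float)[order],
                           np.asarray(calib_ph, dtype=float)[order])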

  20. Process and methodology of developing Cassini G and C Telemetry Dictionary

    NASA Astrophysics Data System (ADS)

    Kan, Edwin P.

    1994-11-01

    While the Cassini spacecraft telemetry design had taken on the new approach of 'packetized telemetry', the AACS (Attitude and Articulation Subsystem) had further extended into the design of 'mini-packets' in its telemetry system. Such telemetry packet and mini-packet design produced the AACS Telemetry Dictionary; iterations of the latter in turn provided changes to the former. The ultimate goals were to achieve maximum telemetry packing density, optimize the 'freshness' of more time-critical data, and to effect flexibility, i.e., multiple AACS data collection schemes, without needing to change the overall spacecraft telemetry mode. This paper describes such a systematic process and methodology, evidenced by various design products related to, or as part of, the AACS Telemetry Dictionary.

  1. Application of response surface methodology in optimizing the sulfation-roasting-leaching process of nickel laterite

    NASA Astrophysics Data System (ADS)

    Guo, Xue-Yi; Li, Dong; Wu, Zhan; Tian, Qing-Hua

    2012-03-01

    Nickel was recovered from nickel laterite using a sulfation-roasting-leaching process and the effects of operation parameters including acid addition, roasting temperature, and roasting time on nickel extraction and iron dissolution were investigated using response surface methodology (RSM). Two second-order polynomial models of high significance were presented to show the relationship between the responses and the variables. The analysis of variance (ANOVA) showed high coefficients of determination (R^2) of 0.894 and 0.980 for the two models, respectively. Optimum areas of ≥80% Ni extraction and ≤5% Fe dissolution were obtained by the overlaid contours. Verification experiments in the optimum areas were conducted and the results indicate a close agreement with the predicted values obtained from the models.

  2. Process optimization of microencapsulation of curcumin in γ-polyglutamic acid using response surface methodology.

    PubMed

    Ko, Wen-Ching; Chang, Chao-Kai; Wang, Hsiu-Ju; Wang, Shian-Jen; Hsieh, Chang-Wei

    2015-04-01

    The aim of this study was to develop an optimal microencapsulation method for an oil-soluble component (curcumin) using γ-PGA. The results show that Span80 significantly enhances the encapsulation efficiency (EE) of γ-Na(+)-PGA microcapsules. Therefore, the effects of γ-Na(+)-PGA, curcumin and Span80 concentration on EE of γ-Na(+)-PGA microcapsules were studied by means of response surface methodology (RSM). It was found that the optimal microencapsulation process is achieved by using γ-Na(+)-PGA 6.05%, curcumin 15.97% and Span80 0.61% with a high EE% (74.47 ± 0.20%). Furthermore, the models explain 98% of the variability in the responses. γ-Na(+)-PGA seems to be a good carrier for the encapsulation of curcumin. In conclusion, this simple and versatile approach can potentially be applied to the microencapsulation of various oil-soluble components for food applications. PMID:25442584

  3. Process and methodology of developing Cassini G and C Telemetry Dictionary

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    While the Cassini spacecraft telemetry design had taken on the new approach of 'packetized telemetry', the AACS (Attitude and Articulation Subsystem) had further extended into the design of 'mini-packets' in its telemetry system. Such telemetry packet and mini-packet design produced the AACS Telemetry Dictionary; iterations of the latter in turn provided changes to the former. The ultimate goals were to achieve maximum telemetry packing density, optimize the 'freshness' of more time-critical data, and to effect flexibility, i.e., multiple AACS data collection schemes, without needing to change the overall spacecraft telemetry mode. This paper describes such a systematic process and methodology, evidenced by various design products related to, or as part of, the AACS Telemetry Dictionary.

  4. Electronic polarization-division demultiplexing based on digital signal processing in intensity-modulation direct-detection optical communication systems.

    PubMed

    Kikuchi, Kazuro

    2014-01-27

    We propose a novel configuration of optical receivers for intensity-modulation direct-detection (IM · DD) systems, which can cope with dual-polarization (DP) optical signals electrically. Using a Stokes analyzer and a newly-developed digital signal-processing (DSP) algorithm, we can achieve polarization tracking and demultiplexing in the digital domain after direct detection. Simulation results show that the power penalty stemming from digital polarization manipulations is negligibly small. PMID:24515206

  5. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  6. Accuracy, security, and processing time comparisons of biometric fingerprint recognition system using digital and optical enhancements

    NASA Astrophysics Data System (ADS)

    Alsharif, Salim; El-Saba, Aed; Jagapathi, Rajendarreddy

    2011-06-01

    Fingerprint recognition is one of the most commonly used forms of biometrics and has been widely used in daily life due to its feasibility, distinctiveness, permanence, accuracy, reliability, and acceptability. Besides cost, issues related to accuracy, security, and processing time in practical biometric recognition systems represent the most critical factors that make these systems widely acceptable. Accurate and secure biometric systems often require sophisticated enhancement and encoding techniques that burden the overall processing time of the system. In this paper we present a comparison between common digital and optical enhancement/encoding techniques with respect to their accuracy, security and processing time, when applied to biometric fingerprint systems.

  7. Two-dimensional quantification of the corrosion process in metal surfaces using digital speckle pattern interferometry

    SciTech Connect

    Andres, N.; Lobera, J.; Arroyo, M. P.; Angurel, L. A.

    2011-04-01

    The applicability of digital speckle pattern interferometry (DSPI) to the analysis of surface corrosion processes has been evaluated by studying the evolution of an Fe surface immersed in sulfuric acid. This work describes the analysis process required to obtain quantitative information about the corrosion process. It has been possible to evaluate the corrosion rate, and the results agree with those derived from the weight loss method. In addition, a two-dimensional analysis has been applied, showing that DSPI measurements can be used to extract information about the corrosion rate at any region of the surface.

  8. Digital signal processing and data acquisition employing diode lasers for lidar-hygrometer

    NASA Astrophysics Data System (ADS)

    Naboko, Sergei V.; Pavlov, Lyubomir Y.; Penchev, Stoyan P.; Naboko, Vassily N.; Pencheva, Vasilka H.; Donchev, T.

    2003-11-01

    The paper addresses novel aspects of applying laser radar (LIDAR) to differential absorption spectroscopy and atmospheric gas monitoring, emphasizing the advantages of the class of powerful pulsed laser diodes. The task of determining atmospheric humidity, a major greenhouse gas, and the measurement demands it imposes match well the potential of the acquisition system. The projected system delegates operations to a Digital Signal Processing (DSP) module, preserving the informative part of the signal through real-time pre-processing followed by post-processing on a personal computer.

  9. Searching early bone metastasis on plain radiography by using digital imaging processing

    NASA Astrophysics Data System (ADS)

    Jaramillo-Núñez, A.; Pérez-Meza, M.

    2012-10-01

    Some authors mention that it is not possible to detect early bone metastasis on plain radiography. In this work we use digital image processing to analyze three radiographs taken from a patient with bone metastasis discomfort in the right shoulder. The time period between the first and second radiographs was approximately one month, and between the first and the third, one year. This procedure is a first approach to determine whether, in this particular case, it was possible to detect early bone metastasis. The results obtained suggest that digital processing makes it possible to detect the metastasis, since the radiograph contains the information even though it cannot be observed visually.

  10. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to developing cooperation between world-class experts, young scientists, students and post-graduate students, and to exchanging information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by:
    - Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University) - Saint Petersburg (Russia)
    - Chernyshevsky Saratov State University - Saratov (Russia)
    - National Research Nuclear University "MEPhI" (NRNU MEPhI) - Moscow (Russia)
    The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  11. The Development of a Digital Processing System for Accurate Range Determinations. [for Teleoperator Maneuvering Systems]

    NASA Technical Reports Server (NTRS)

    Pujol, A., Jr.

    1983-01-01

    The development of an accurate close range (from 0.0 meters to 30.0 meters) radar system for Teleoperator Maneuvering Systems (TMS) is discussed. The system under investigation is a digital processor that converts incoming signals from the radar system into their related frequency spectra. Identification will be attempted by correlating spectral characteristics with accurate range determinations. The system will utilize an analog to digital converter for sampling and converting the signal from the radar system into 16-bit digital words (two bytes) for RAM storage, data manipulations, and computations. To remove unwanted frequency components the data will be retrieved from RAM and digitally filtered using large scale integration (LSI) circuits. Filtering will be performed by a biquadratic routine within the chip which carries out the required filter algorithm. For conversion to a frequency spectrum the filtered data will be processed by a Fast Fourier Transform chip. Analysis and identification of spectral characteristics for accurate range determinations will be made by microcomputer computations.
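
    A software analogue of the described chain, biquadratic filtering followed by a Fourier transform, is sketched below using SciPy second-order sections; the band edges and window are illustrative, and the mapping from spectral characteristics to range is left to the correlation step described in the abstract.

      import numpy as np
      from scipy.signal import butter, sosfilt

      def range_spectrum(samples, fs, band=(500.0, 20000.0)):
          # Band-pass the digitized return with a second-order-section (biquad)
          # filter, then transform to the frequency domain; the resulting spectral
          # characteristics can then be correlated with range.
          sos = butter(2, band, btype="bandpass", fs=fs, output="sos")
          filtered = sosfilt(sos, np.asarray(samples, dtype=float))
          windowed = filtered * np.hanning(len(filtered))
          spectrum = np.abs(np.fft.rfft(windowed))
          freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
          return freqs, spectrum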

  12. Universal Michelson Gires-Tournois interferometer optical interleaver based on digital signal processing.

    PubMed

    Zhang, Juan; Yang, Xiaowei

    2010-03-01

    Optical interleavers based on Michelson Gires-Tournois interferometer (MGTI) with arbitrary cascaded reflectors for symmetrical or asymmetrical periodic frequency response with arbitrary duty cycles are defined as universal MGTI optical interleaver (UMGTIOI). It can significantly enhance flexibility and applicability of optical networks. A novel and simple method based on digital signal processing is proposed for the design of UMGTIOI. Different kinds of design examples are given to confirm effectiveness of the method. PMID:20389520

  13. a Real-Time Optical/digital Radon Space Image Processing System

    NASA Astrophysics Data System (ADS)

    Woolven, Steve

    A unique hybrid optical/digital general image processing system which potentially functions at real-time rates and performs analysis on low object-to-background contrast images in Radon space is investigated. The system is capable of some real-time functions which are invariant to object distortions. This research is presented in three stages: the development and analysis of the theory of Radon space, the hardware and software design and implementation of the working system, and the results achieved. This original system functions by using the forward Radon transform (a mathematical tomographic transform of image data from two-dimensional image space to a one-dimensional space, the Radon space), which is achieved by a front-end optical processor, followed by a digital processing subsystem operating in Radon space instead of the more familiar image space. The system works by converting the two dimensional image data into a series of one dimensional projections, and it is demonstrated that several digital image processing functions can potentially be performed faster on the projection data than on the original image data. Using the transform, it is shown that the system is theoretically capable of performing real-time two dimensional Fourier transforms and matched filtering operations. Also, this document presents and demonstrates a method of potential real-time object-moment analysis which allows objects to undergo distortions and continue to be recognized as the original object. It is shown that these moments can be calculated in Radon space using significantly less image data and fewer digital processing operations than in image space. The optical system is potentially capable of performing 6.04 × 10^10 operations per second on the two dimensional image data.

  14. Processing of post-consumer HDPE reinforced with wood flour: Effect of functionalization methodology

    NASA Astrophysics Data System (ADS)

    Catto, A. L.; Montagna, L. S.; Rossini, K.; Santana, R. M. C.

    2014-05-01

    A very interesting route for reusing waste cellulose derivatives such as wood flour is their incorporation into a thermoplastic matrix. As olefinic polymers have no interaction with cellulose derivatives, chemical treatments have been used to modify vegetable fibers and increase the interfacial adhesion between the cellulosic reinforcement and the polymeric matrix. In this sense, the objective of this study was to evaluate the influence of the methodology for incorporating a compatibilizer agent (CA) into the polyolefin matrix and to evaluate the mechanical and morphological properties of the composites. HDPE, wood flour from the Eucalyptus grandis species (EU) and polyethylene grafted with maleic anhydride (CA) were used in the composites, which were extruded and then injection molded. The mixtures were processed in a single screw extruder (L/D: 22), with a temperature profile from 170° to 190°C. In a first step, the materials were processed together in the extruder, and the samples were then injection molded at a temperature of 185°C and a pressure of 600 bar. In a second step, the HDPE and the compatibilizer agent were first processed in the extruder in order to functionalize the polyolefin, and the sieved wood flour (EU, 30% w/w) was added afterwards. Results showed that composites with CA had a higher mechanical performance compared to non-compatibilized ones. They also showed that composites previously compatibilized in the extruder with CA performed better than those where the polymer matrix was not previously compatibilized.

  15. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topography modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends extracting from the original LANDSAT data the component resulting from target reflectance. Considering that the role of topographic modulation over the pixel information will vary with solar incidence angle, the results of this technique of digital processing will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle over the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing at the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate to MSS data acquired under higher Sun elevation angles. Topographic modulation components applied to low Sun elevation angles lessen rather than enhance topography.

  16. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.
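
    A heavily simplified, single-frame sketch of this style of metric is given below, assuming NumPy and SciPy; it keeps only the blockwise DCT, local-contrast and threshold-unit steps with RMS pooling, and omits the colour transforms, temporal filtering and contrast masking of the actual DVQ method. The flat threshold matrix and the function name are assumptions for illustration.

      import numpy as np
      from scipy.fft import dctn

      def dvq_like_error(reference, test, block=8, thresholds=None):
          # Blockwise DCT -> local contrast -> threshold units -> difference -> pooling.
          if thresholds is None:
              thresholds = np.ones((block, block))   # flat visual thresholds (assumed)

          def to_threshold_units(img):
              img = np.asarray(img, dtype=float)
              h = (img.shape[0] // block) * block
              w = (img.shape[1] // block) * block
              blocks = img[:h, :w].reshape(h // block, block, w // block, block).swapaxes(1, 2)
              coeffs = dctn(blocks, axes=(-2, -1), norm="ortho")
              dc = np.maximum(np.abs(coeffs[..., :1, :1]), 1e-6)
              return (coeffs / dc) / thresholds      # local contrast in threshold units

          err = to_threshold_units(reference) - to_threshold_units(test)
          return float(np.sqrt(np.mean(err ** 2)))   # simple RMS pooling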

  17. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Astrophysics Data System (ADS)

    Watson, Andrew B.

    2002-12-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.

  18. Real-time design of N-dimensional digital filters for image processing

    NASA Astrophysics Data System (ADS)

    Drynkin, Vladimir N.

    1995-12-01

    The main body of remote sensing data is obtained with the aid of optoelectronic and photographic devices. This data is usually referred to as video information since it may be presented as images of the terrestrial surface along a satellite track or an airway. This explains the increasing interest of specialists in remote sensing device design in methods for synthesizing optimal data processing hardware. The design of effective systems for the formation and transmission of remote sensing data is impossible without state-of-the-art synthesis methods for digital image processing systems that take into account the characteristic properties of the message source and its recipient. It is possible to take account of these characteristic properties only on the basis of optimal N-dimensional digital filtering. From this point of view, an N-dimensional filter used for filtering video images becomes optimal only when the pass band region of its spatial frequency response (SFR) coincides with the isoenergetic surface of the image spectrum, with allowance for the characteristics of human vision. In the light of the above, the problem of designing N-dimensional digital filters with a given pass band region configuration becomes topical. Of practical interest, first of all, are methods that allow structures to be designed at relatively low hardware cost which, on the one hand, operate in real time and, on the other, best approach the given characteristics. In this case it is necessary to ensure stability during their operation. In the following we present the results of developing a synthesis method for N-dimensional digital filters with guaranteed stability and a given pass band region configuration, realizing image processing in real time.

  19. Optimization of spray drying process for developing seabuckthorn fruit juice powder using response surface methodology.

    PubMed

    Selvamuthukumaran, Meenakshisundaram; Khanum, Farhath

    2014-12-01

    The response surface methodology was used to optimize the spray drying process for development of seabuckthorn fruit juice powder. The independent variables were different levels of inlet air temperature and maltodextrin concentration. The responses were moisture, solubility, dispersibility, vitamin C and overall color difference value. Statistical analysis revealed that independent variables significantly affected all the responses. The Inlet air temperature showed maximum influence on moisture and vitamin C content, while the maltodextrin concentration showed similar influence on solubility, dispersibility and overall color difference value. Contour plots for each response were used to generate an optimum area by superimposition. The seabuckthorn fruit juice powder was developed using the derived optimum processing conditions to check the validity of the second order polynomial model. The experimental values were found to be in close agreement to the predicted values and were within the acceptable limits indicating the suitability of the model in predicting quality attributes of seabuckthorn fruit juice powder. The recommended optimum spray drying conditions for drying 100 g fruit juice slurry were inlet air temperature and maltodextrin concentration of 162.5 °C and 25 g, respectively. The spray dried juice powder contains higher amounts of antioxidants viz., vitamin C, vitamin E, total carotenoids, total anthocyanins and total phenols when compared to commercial fruit juice powders and they are also found to be free flowing without any physical alterations such as caking, stickiness, collapse and crystallization by exhibiting greater glass transition temperature. PMID:25477639

  20. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and broken down into its essential components. First, three essential dimensions of static complexity are investigated, including interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis and the aggregation technique is demonstrated using notional data for four lunar oxygen production processes.

  1. Methodology to assess the environmental impact of a product and its processes

    NASA Astrophysics Data System (ADS)

    Kumar, K. R.; Lee, Dongwon; Malhotra, Arvind

    2001-02-01

    This study presents a methodology for capturing the environmental impact of a product and its processes throughout the life cycle in discrete part manufacturing. The objectives of the study are to identify opportunities to enhance environmental friendliness of a product in its design stage, and assess whether the environmental impact has actually been reduced or has simply been shifted elsewhere in the life cycle of the product. Using the bill of materials and the process route sheet, we build the environmental status of its operations as a vector of measurable attributes, categorized under the taxonomy of social, ecological, and economic impact that can be aggregated and evaluated at a business unit level. The vector of social impact deals with the effects of materials used and wastes produced on people through the life cycle. The vector of ecological impact consists of effects of recycling, reuse, and remanufacturing of a product based on the notion of materials balance. Finally, the vector of economic impact represents the conversion of the previous two vectors into managerially relevant costs to the firm, expressed in dollar amounts, so that managers in any position can readily appraise their operations and communicate with each other in the same language.

  2. Modeling and optimization of red currants vacuum drying process by response surface methodology (RSM).

    PubMed

    Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir

    2016-07-15

    Fresh red currants were dried by a vacuum drying process under different drying conditions. A Box-Behnken experimental design with response surface methodology was used for optimization of the drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of the dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model, where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of the simultaneously optimized responses were a temperature of 70.2 °C, a pressure of 39 mbar and a drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample. PMID:26948639

  3. Evaluation of superconducting quantum interference devices interfaced with digital signal processing electronics for biomagnetic applications

    SciTech Connect

    Kung, Pang-Jen, Flynn, E.R.; Bracht, R.R.; Lewis, P.S.

    1994-08-01

    The performance of a dc-SQUID magnetometer driven by both analog electronics and digital signal processors is investigated and compared for biomagnetic applications. Low-noise ( < 5 {mu} {Phi} {sub 0}/{radical}Hz at 1 Hz) dc-SQUIDs were fabricated by Conductus, Inc. using the all-refractory Nb/Al/Al{sub 2}O{sub 3}/Nb process on silicon substrates with on-chip modulation coils and integral washer damping resistors. A second-order gradiometer was magnetically coupled to the input coil of the SQUID to maximize the detected signal strength. The readout of this SQUID gradiometer was achieved using a conventional flux-locked loop (FLL) circuit to provide a linearized voltage output proportional to the flux applied to the SQUID. A shielded cylinder was constructed to house the magnetometer and reduce ambient field noise. To realize the digital feedback loop, the analog FLL, except for the preamplifier, is replaced by a digital signal processing board with dual 16-bit A/D and D/A converters. This approach shows several advantages over the analog scheme, including operational flexibility, cost reduction and, potentially, enhanced dynamic range and slew rate.

  4. Automatic rice crop height measurement using a field server and digital image processing.

    PubMed

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-01

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required. PMID:24451465
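
    The four processing steps described above (band selection, filtering, thresholding, height measurement) can be sketched in a few lines of Python with OpenCV. The synthetic frame, the threshold, the calibration bar length and the pixel-to-cm scale below are illustrative assumptions, not values from the paper.

    import cv2
    import numpy as np

    # Synthetic stand-in for a field-server frame: dark canopy with a bright marker bar.
    frame = np.full((480, 640, 3), 60, np.uint8)
    frame[100:420, 315:325] = 230                           # visible (unoccluded) part of the bar

    band = frame[:, :, 2]                                   # band selection: red channel
    smooth = cv2.GaussianBlur(band, (5, 5), 0)              # filtering
    _, mask = cv2.threshold(smooth, 200, 255, cv2.THRESH_BINARY)   # thresholding

    rows = np.where(mask.any(axis=1))[0]                    # rows occupied by the marker bar
    visible_bar_px = rows.max() - rows.min() + 1

    INITIAL_BAR_PX = 400                                    # bar height with no plants (calibration)
    CM_PER_PX = 0.5                                         # illustrative pixel-to-cm scale
    crop_height_cm = (INITIAL_BAR_PX - visible_bar_px) * CM_PER_PX
    print("estimated rice crop height: %.1f cm" % crop_height_cm)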

  5. Using digital flow cytometry to assess the degradation of three cyanobacteria species after oxidation processes.

    PubMed

    Wert, Eric C; Dong, Mei Mei; Rosario-Ortiz, Fernando L

    2013-07-01

    Depending on drinking water treatment conditions, oxidation processes may result in the degradation of cyanobacteria cells causing the release of toxic metabolites (microcystin), odorous metabolites (MIB, geosmin), or disinfection byproduct precursors. In this study, a digital flow cytometer (FlowCAM(®)) in combination with chlorophyll-a analysis was used to evaluate the ability of ozone, chlorine, chlorine dioxide, and chloramine to damage or lyse cyanobacteria cells added to Colorado River water. Microcystis aeruginosa (MA), Oscillatoria sp. (OSC) and Lyngbya sp. (LYN) were selected for the study due to their occurrence in surface water supplies, metabolite production, and morphology. Results showed that cell damage was observed without complete lysis or fragmentation of the cell membrane under many of the conditions tested. During ozone and chlorine experiments, the unicellular MA was more susceptible to oxidation than the filamentous OSC and LYN. Rate constants were developed based on the loss of chlorophyll-a and oxidant exposure, which showed the oxidants degraded MA, OSC, and LYN according to the order of ozone > chlorine ~ chlorine dioxide > chloramine. Digital and binary images taken by the digital flow cytometer provided qualitative insight regarding cell damage. When applying this information, drinking water utilities can better understand the risk of cell damage or lysis during oxidation processes. PMID:23726712
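
    A rate constant of the kind reported above can be estimated by regressing the logarithm of the chlorophyll-a loss against oxidant exposure (CT), assuming first-order behavior in exposure. The Python sketch below uses illustrative numbers, not the study's measurements.

    import numpy as np

    ct = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # oxidant exposure, mg*min/L (hypothetical)
    chl = np.array([10.0, 7.8, 6.1, 3.7, 1.4])      # chlorophyll-a, ug/L (hypothetical)

    # First-order model: ln(C/C0) = -k * CT, fitted by linear regression
    slope, intercept = np.polyfit(ct, np.log(chl / chl[0]), 1)
    k = -slope
    print("estimated rate constant k = %.2f L/(mg*min)" % k)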

  6. Digital Audio Signal Processing and Nde: AN Unlikely but Valuable Partnership

    NASA Astrophysics Data System (ADS)

    Gaydecki, Patrick

    2008-02-01

    In the Digital Signal Processing (DSP) group, within the School of Electrical and Electronic Engineering at The University of Manchester, research is conducted into two seemingly distinct and disparate subjects: instrumentation for nondestructive evaluation, and DSP systems & algorithms for digital audio. We have often found that many of the hardware systems and algorithms employed to recover, extract or enhance audio signals may also be applied to signals provided by ultrasonic or magnetic NDE instruments. Furthermore, modern DSP hardware is so fast (typically performing hundreds of millions of operations per second) that much of the processing and signal reconstruction may be performed in real time. Here, we describe some of the hardware systems we have developed, together with algorithms that can be implemented both in real time and offline. A next generation system has now been designed, which incorporates a processor operating at 0.55 Giga MMACS, six input and eight output analogue channels, digital input/output in the form of S/PDIF, a JTAG and a USB interface. The software allows the user, with no knowledge of filter theory or programming, to design and run standard or arbitrary FIR, IIR and adaptive filters. Using audio as a vehicle, we can demonstrate the remarkable properties of modern reconstruction algorithms when used in conjunction with such hardware; applications in NDE include signal enhancement and recovery in acoustic, ultrasonic, magnetic and eddy current modalities.
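
    The "design and run a standard FIR filter" idea mentioned above can be illustrated in software. The Python/SciPy sketch below designs a low-pass FIR filter and streams a noisy test tone through it; the sample rate, tap count and cutoff are illustrative and unrelated to the hardware described.

    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 48_000                                        # sample rate, Hz
    taps = firwin(numtaps=101, cutoff=5_000, fs=fs)    # 101-tap low-pass FIR, 5 kHz cutoff

    t = np.arange(0, 0.01, 1 / fs)
    signal = np.sin(2 * np.pi * 1_000 * t) + 0.5 * np.random.randn(t.size)  # noisy test tone
    enhanced = lfilter(taps, 1.0, signal)              # streaming (causal) filtering
    print("output RMS:", np.sqrt(np.mean(enhanced**2)))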

  7. Automatic Rice Crop Height Measurement Using a Field Server and Digital Image Processing

    PubMed Central

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-01

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required. PMID:24451465

  8. Automatic Analysis for the Chemical Testing of Urine Examination Using Digital Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Vilardy, Juan M.; Peña, Jose C.; Daza, Miller F.; Torres, Cesar O.; Mattos, Lorenzo

    2008-04-01

    To perform the chemical testing of a urine examination, a dipstick is used, which contains pads that have incorporated within them the reagents for chemical reactions for the detection of a number of substances in the urine. Urine is added to the pads for reaction by dipping the dipstick into the urine and then slowly withdrawing it. The subsequent colorimetric reactions are timed to an endpoint; the extent of color formation is directly related to the level of the urine constituent. The colors can be read manually by comparison with color charts or with the use of automated reflectance meters. The aim of the system described in this paper is to analyze and determine automatically the color changes on the dipstick when it is withdrawn from the urine sample, and to compare the results with color charts for the diagnosis of many common diseases such as diabetes. The system consists of: (a) a USB camera; (b) a computer; (c) Matlab v7.4 software. Image analysis begins with digital capture of the image as data. Once the image is acquired in digital format, the data can be manipulated through digital image processing. Our objective was to develop a computerised image processing system and an interactive software package to support clinicians, medical research and medical students.
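
    The color-comparison step can be sketched as a nearest-reference-color match on the mean color of a reagent pad. The Python sketch below is a stand-in for the Matlab system described: the pad coordinates, the chart values and the classify_pad helper are all hypothetical.

    import numpy as np

    def classify_pad(image, pad_slice, chart):
        """Return the chart label whose reference color is closest to the pad's mean color."""
        mean_rgb = image[pad_slice].reshape(-1, 3).mean(axis=0)
        labels, refs = zip(*chart.items())
        dists = np.linalg.norm(np.array(refs, float) - mean_rgb, axis=1)
        return labels[int(np.argmin(dists))]

    # Hypothetical glucose reference chart (RGB values are illustrative)
    glucose_chart = {"negative": (95, 160, 190), "trace": (120, 150, 120),
                     "250 mg/dL": (140, 120, 60), "1000 mg/dL": (110, 70, 40)}

    frame = np.random.randint(0, 255, (480, 640, 3))           # stand-in for a USB-camera frame
    print(classify_pad(frame, np.s_[200:220, 300:320], glucose_chart))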

  9. Fully automated digital holographic processing for monitoring the dynamics of a vesicle suspension under shear flow

    PubMed Central

    Minetti, Christophe; Podgorski, Thomas; Coupier, Gwennou; Dubois, Frank

    2014-01-01

    We investigate the dynamics of a vesicle suspension under shear flow between plates using DHM with a spatially reduced coherent source. Holograms are grabbed at a frequency of 24 frames/sec. The distribution of the vesicle suspension is obtained after numerical processing of the digital holograms sequence resulting in a 4D distribution. Obtaining this distribution is not straightforward and requires special processing to automate the analysis. We present an original method that fully automates the analysis and provides distributions that are further analyzed to extract physical properties of the fluid. Details of the numerical implementation, as well as sample experimental results are presented. PMID:24877015

  10. Design of digital response in enzyme-based bioanalytical systems for information processing applications.

    PubMed

    Domanskyi, Sergii; Privman, Vladimir

    2012-11-26

    We investigate performance and optimization of the "digital" bioanalytical response. Specifically, we consider the recently introduced approach of a partial input conversion into inactive compounds, resulting in the "branch point effect" similar to that encountered in biological systems. This corresponds to an "intensity filter," which can yield a binary-type sigmoid-response output signal of interest in information and signal processing and in biosensing applications. We define measures for optimizing the response for information processing applications based on the kinetic modeling of the enzymatic reactions involved, and apply the developed approach to the recently published data for glucose detection. PMID:23098224
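
    The binary-type sigmoid response referred to above can be illustrated with a Hill-type function, in which the output stays near zero below a threshold and saturates above it. The form and parameters below are illustrative stand-ins, not the kinetic model developed in the paper.

    import numpy as np

    def sigmoid_response(x, threshold=0.5, n=4):
        """Normalized Hill-type output: near 0 below the threshold, near 1 above it."""
        return x**n / (threshold**n + x**n)

    inputs = np.linspace(0, 1, 6)       # normalized input (e.g. glucose concentration)
    print([round(sigmoid_response(x), 3) for x in inputs])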

  11. Software for Processing of Digitized Astronegatives from Archives and Databases of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.

    The paper discusses and illustrates the steps of basic processing of digitized images of astronegatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX / MIDAS / ROMAFOT environment. The program can automatically process the specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL with the ability to work in a LINUX or WINDOWS environment. They were used for: identification of stars, separation and exclusion of diffraction satellites and double and triple exposures, elimination of image defects, and reduction to the equatorial coordinates and magnitudes of reference catalogs.

  12. Applications of digital image processing techniques to problems of data registration and correlation

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the evolution of the computer configuration at JPL's Image Processing Laboratory (IPL). The development of techniques for the geometric transformation of digital imagery is discussed and consideration is given to automated and semiautomated image registration, and the registration of imaging and nonimaging data. The increasing complexity of image processing tasks at IPL is illustrated with examples of various applications from the planetary program and earth resources activities. It is noted that the registration of existing geocoded data bases with Landsat imagery will continue to be important if the Landsat data is to be of genuine use to the user community.

  13. Digital image processing applied to analysis of geophysical and geochemical data for southern Missouri

    NASA Technical Reports Server (NTRS)

    Guinness, E. A.; Arvidson, R. E.; Leff, C. E.; Edwards, M. H.; Bindschadler, D. L.

    1983-01-01

    Digital image-processing techniques have been used to analyze a variety of geophysical and geochemical map data covering southern Missouri, a region with important basement and strata-bound mineral deposits. Gravity and magnetic anomaly patterns, which have been reformatted to image displays, indicate a deep crustal structure cutting northwest-southeast through southern Missouri. In addition, geologic map data, topography, and Landsat multispectral scanner images have been used as base maps for the digital overlay of aerial gamma-ray and stream sediment chemical data for the 1 x 2-deg Rolla quadrangle. Results indicate enrichment of a variety of elements within the clay-rich alluvium covering many of the interfluvial plains, as well as a complicated pattern of enrichment for the sedimentary units close to the Precambrian rhyolites and granites of the St. Francois Mountains.

  14. Methodology used to produce an encoded 1:100,000-scale digital hydrographic data layer for the Pacific Northwest

    USGS Publications Warehouse

    Fisher, B.J.

    1996-01-01

    The U.S. Geological Survey (USGS) has produced a River Reach File data layer for the Pacific Northwest for use in water-resource management applications. The Pacific Northwest (PNW) River Reach Files, a geo-referenced river reach data layer at 1:100,000-scale, are encoded with the U.S. Environmental Protection Agency's (EPA) reach numbers. The encoding was a primary task of the River Reach project, because EPA's reach identifiers are also an integral hydrologic component in a regional Northwest Environmental Data Base, an ongoing effort by Federal and State agencies to compile information on reach-specific resources on rivers in Oregon, Idaho, Washington, and western Montana. A unique conflation algorithm was developed by the USGS to transfer the EPA reach codes and other meaningful attributes from the 1:250,000-scale EPA TRACE graphic files to the PNW Reach Files. The PNW Reach Files also were designed so that reach-specific information upstream or downstream from a point in the stream network could be extracted from feature attribute tables or from a Geographic Information System. This report documents the methodology used to create this 1:100,000-scale hydrologic data layer.

  15. Detecting Buried Archaeological Remains by the Use of Geophysical Data Processing with 'Diffusion Maps' Methodology

    NASA Astrophysics Data System (ADS)

    Eppelbaum, Lev

    2015-04-01

    Geophysical methods are prompt, non-invasive and low-cost tools for quantitative delineation of buried archaeological targets. However, taking into account the complexity of geological-archaeological media, some unfavourable environments and the known ambiguity of geophysical data analysis, examination by a single geophysical method might be insufficient (Khesin and Eppelbaum, 1997). Besides this, it is well known that the majority of inverse-problem solutions in geophysics are ill-posed (e.g., Zhdanov, 2002), which means, according to Hadamard (1902), that the solution does not exist, or is not unique, or is not a continuous function of the observed geophysical data (small perturbations in the observations will cause arbitrary errors in the solution). This fact has wide application for informational, probabilistic and wavelet methodologies in archaeological geophysics (Eppelbaum, 2014a). The goal of modern geophysical data examination is to detect the geophysical signatures of buried targets in noisy areas via the analysis of physical parameters with a minimal number of false alarms and missed detections (Eppelbaum et al., 2011; Eppelbaum, 2014b). The proposed wavelet approach to the recognition of archaeological targets (AT) through geophysical method integration consists of advanced processing of each geophysical method and nonconventional integration of the different methods with one another. The recently developed technique of diffusion clustering, combined with the abovementioned wavelet methods, was utilized to integrate the geophysical data and detect existing irregularities. The approach is based on wavelet packet techniques applied to the geophysical images (or graphs) as functions of the coordinates. Practically all geophysical methods (magnetic, gravity, seismic, GPR, ERT, self-potential, etc.) may be utilized for such an analysis. In the first stage of the proposed investigation, a few tens of typical physical-archaeological models (PAM

  16. Prototyping scalable digital signal processing systems for radio astronomy using dataflow models

    NASA Astrophysics Data System (ADS)

    Sane, N.; Ford, J.; Harris, A. I.; Bhattacharyya, S. S.

    2012-05-01

    There is a growing trend toward using high-level tools for design and implementation of radio astronomy digital signal processing (DSP) systems. Such tools, for example, those from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are usually platform-specific, and lack high-level, platform-independent, portable, scalable application specifications. This limits the designer's ability to experiment with designs at a high-level of abstraction and early in the development cycle. We address some of these issues using a model-based design approach employing dataflow models. We demonstrate this approach by applying it to the design of a tunable digital downconverter (TDD) used for narrow-bandwidth spectroscopy. Our design is targeted toward an FPGA platform, called the Interconnect Break-out Board (IBOB), that is available from the CASPER. We use the term TDD to refer to a digital downconverter for which the decimation factor and center frequency can be reconfigured without the need for regenerating the hardware code. Such a design is currently not available in the CASPER DSP library. The work presented in this paper focuses on two aspects. First, we introduce and demonstrate a dataflow-based design approach using the dataflow interchange format (DIF) tool for high-level application specification, and we integrate this approach with the CASPER tool flow. Secondly, we explore the trade-off between the flexibility of TDD designs and the low hardware cost of fixed-configuration digital downconverter (FDD) designs that use the available CASPER DSP library. We further explore this trade-off in the context of a two-stage downconversion scheme employing a combination of TDD or FDD designs.
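
    The tunable digital downconverter concept can be sketched in software as mixing with a numerically controlled oscillator, anti-alias filtering and decimation, with the center frequency and decimation factor as runtime parameters. The Python sketch below is only an illustration of the signal-processing idea; the real TDD is an FPGA design and the sample rate and signal below are made up.

    import numpy as np
    from scipy.signal import firwin, lfilter

    def tdd(samples, fs, center_freq, dec_factor):
        """Shift center_freq to baseband, low-pass filter, and reduce the rate by dec_factor."""
        n = np.arange(samples.size)
        nco = np.exp(-2j * np.pi * center_freq * n / fs)          # numerically controlled oscillator
        baseband = samples * nco
        taps = firwin(numtaps=129, cutoff=0.8 * (fs / dec_factor) / 2, fs=fs)  # anti-alias LPF
        return lfilter(taps, 1.0, baseband)[::dec_factor]

    fs = 200e6                                                    # 200 MS/s ADC stream (illustrative)
    t = np.arange(0, 1e-4, 1 / fs)
    x = np.cos(2 * np.pi * 45e6 * t)                              # narrow-band signal at 45 MHz
    narrowband = tdd(x, fs, center_freq=45e6, dec_factor=16)
    print(narrowband.shape)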

  17. [A study on digitally processed sounds designed to improve speech sound perception].

    PubMed

    Arai, M

    1994-08-01

    This study was performed to improve the speech sound perception of patients with sensorineural deafness by using digitally processed sounds. Fourteen CV sounds (/pa, ta, ka, ba, da, ga, ha, sa, za, ma, na, ra, ja, wa/) were selected in this study, and the consonant burst and/or voice onset time (VOT) of these sounds were doubled and/or amplified by digital processing and stored on DAT. These processed sounds and the original unprocessed sounds were presented to patients with moderate sensorineural deafness. The following results were obtained from patients who had made a mistake in discriminating the original sounds. 1. The correct answer rate for /ta/ and /sa/ was improved by amplification of the consonant burst or VOT, and for /ka/, amplification and/or repetition improved the correct answer rate. Amplification of the consonant burst, or of the consonant burst with VOT, was especially effective for unvoiced explosive sounds (/pa, ta, ka/). 2. In voiced sounds, for /za/ and /ra/ the correct answer rate was improved by repetition or elongation of the consonant burst or transition part, and for /ga/, /ma/, and /na/ the rate was improved by amplification. 3. Semivowels (/wa, ja/) and glottal sounds (/ha/) were seldom misunderstood, and required no processing. Digital filtering was performed on monosyllables with the "s" sound (/sa, su, se, so/), and these filtered sounds were presented to the patients and compared with the original sounds. As a result, it was revealed that the correct answer rate could be improved by filtering, although the pass band was changed slightly by the succeeding vowel. This improvement was more apparent in the presence of environmental noise than under quiet conditions. PMID:7931805

  18. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. PMID:27244070
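
    Capability and performance indices of the kind quoted above are computed from process data against specification limits. The Python sketch below shows the standard Cp/Cpk formulas applied to hypothetical lead-time measurements; the spec limits and data are illustrative, not the Naval Academy figures.

    import numpy as np

    lead_times = np.array([6.2, 5.8, 6.5, 6.1, 5.9, 6.3, 6.0, 6.4])   # minutes, illustrative
    lsl, usl = 4.0, 8.0                                               # specification limits, minutes

    mu, sigma = lead_times.mean(), lead_times.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                                    # capability index
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)                       # centering-adjusted index
    print("Cp = %.2f, Cpk = %.2f" % (cp, cpk))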

  19. Morphometrics of aeolian blowouts from high-resolution digital elevation data: methodological considerations, shape metrics, and scaling

    NASA Astrophysics Data System (ADS)

    Hamilton, T. K.; Duke, G.; Brown, O.; Koenig, D.; Barchyn, T. E.; Hugenholtz, C.

    2011-12-01

    Aeolian blowouts are wind erosion hollows that form in vegetated aeolian landscapes. They are especially pervasive in dunefields of the northern Great Plains, yielding highly pitted or hummocky terrain, and adding to the spatial variability of microenvironments. Their development is thought to be linked to feedbacks between morphology and airflow; however, few measurements are available to test this hypothesis. Currently, a dearth of morphology data is limiting modeling progress. From a systematic program of blowout mapping with high-resolution airborne LiDAR data, we used a GIS to calculate morphometrics for 1373 blowouts in Great Sand Hills, Saskatchewan, Canada. All of the blowouts selected for this investigation were covered by grassland vegetation and inactive; their morphology represents the final stage of evolution. We first outline methodological considerations for delineating blowouts and measuring their volume. In particular, we present an objective method to enhance edge definition and reduce operator error and bias. We show that blowouts are slightly elongate and that 49% of the sample blowouts are oriented parallel to the prevailing westerly winds. We also show that their size distribution is heavy-tailed, meaning that most blowouts are relatively small and rarely increase in size beyond 400 m3. Given that blowout growth is dominated by a positive feedback between sediment transport and vegetation erosion, these results suggest several possible mechanisms: i) blowouts simultaneously evolved and stabilized as a result of external climate forcing, ii) blowouts are slaved to exogenous biogenic disturbance patterns (e.g., bison wallows), or iii) a morphodynamic limiting mechanism restricts blowout size. Overall, these data will serve as a foundation for future study, providing insight into an understudied landform that is common in many dunefields.

  20. Choosing between Methodologies: An Inquiry into English Learning Processes in a Taiwanese Indigenous School

    ERIC Educational Resources Information Center

    Lin, Wen-Chuan

    2012-01-01

    Traditional, cognitive-oriented theories of English language acquisition tend to employ experimental modes of inquiry and neglect social, cultural and historical contexts. In this paper, I review the theoretical debate over methodology by examining ontological, epistemological and methodological controversies around cognitive-oriented theories. I…

  1. Processing of the GALILEOTM fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    NASA Astrophysics Data System (ADS)

    Mailhe, P.; Barbier, B.; Garnier, Ch.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, Ph.

    2014-06-01

    The availability of reliable tools and an associated methodology able to accurately predict the LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEOTM along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEOTM code benchmarking process against its extended experimental database and on the assessment of the GALILEOTM model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEOTM model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated on some application examples. With the submittal of the Topical Report for GALILEOTM to the U.S. NRC in 2013, GALILEOTM and its methodology are on the way to being industrially used in a wide range of irradiation conditions.
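
    The Monte Carlo ingredient of such a realistic methodology is the random sampling of the input and model uncertainties followed by propagation through the fuel-performance model. The Python sketch below illustrates only this sampling-and-propagation pattern; the toy temperature model, the distributions and the 95th-percentile figure of merit are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(42)
    N = 10_000

    # Hypothetical relative input/model uncertainties (means and spreads are made up)
    gap_conductance = rng.normal(1.0, 0.05, N)
    power_peaking   = rng.normal(1.0, 0.03, N)
    conductivity    = rng.normal(1.0, 0.04, N)

    def toy_fuel_temperature(g, p, k, nominal=1200.0):
        """Stand-in response: peak fuel temperature in degC as a function of relative inputs."""
        return nominal * p / (g * k)

    temps = toy_fuel_temperature(gap_conductance, power_peaking, conductivity)
    print("95th-percentile peak temperature: %.0f degC" % np.percentile(temps, 95))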

  2. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are

  3. Merged GLORIA sidescan and hydrosweep pseudo-sidescan: Processing and creation of digital mosaics

    USGS Publications Warehouse

    Bird, R.T.; Searle, R.C.; Paskevich, V.; Twichell, D.C.

    1996-01-01

    We have replaced the usual band of poor-quality data in the near-nadir region of our GLORIA long-range sidescan-sonar imagery with a shaded-relief image constructed from swath bathymetry data (collected simultaneously with GLORIA) which completely cover the nadir area. We have developed a technique to enhance these "pseudo-sidescan" images in order to mimic the neighbouring GLORIA backscatter intensities. As a result, the enhanced images greatly facilitate the geologic interpretation of the adjacent GLORIA data, and geologic features evident in the GLORIA data may be correlated with greater confidence across track. Features interpreted from the pseudo-sidescan may be extrapolated from the near-nadir region out into the GLORIA range where they may not have been recognized otherwise, and therefore the pseudo-sidescan can be used to ground-truth GLORIA interpretations. Creation of digital sidescan mosaics utilized an approach not previously used for GLORIA data. Pixels were correctly placed in cartographic space and the time required to complete a final mosaic was significantly reduced. Computer software for digital mapping and mosaic creation is incorporated into the newly-developed Woods Hole Image Processing System (WHIPS) which can process both low- and high-frequency sidescan, and can interchange data with the Mini Image Processing System (MIPS) most commonly used for GLORIA processing. These techniques are tested by creating digital mosaics of merged GLORIA sidescan and Hydrosweep pseudo-sidescan data from the vicinity of the Juan Fernandez microplate along the East Pacific Rise (EPR). ?? 1996 Kluwer Academic Publishers.

  4. Microprocessor instruments for measuring nonlinear distortions; algorithms for digital processing of the measurement signal and an estimate of the errors

    SciTech Connect

    Mints, M.Ya.; Chinkov, V.N.

    1995-09-01

    Rational algorithms are described for measuring the harmonic coefficient in microprocessor instruments for measuring nonlinear distortions, based on digital processing of the codes of the instantaneous values of the signal being investigated, and the errors of such instruments are obtained.
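
    A common way to estimate the harmonic (nonlinear distortion) coefficient from digitized instantaneous values is an FFT-based total harmonic distortion calculation. The Python sketch below shows this standard estimate on a synthetic signal; it is not the instrument's specific algorithm, and the amplitudes are illustrative.

    import numpy as np

    fs, f0, n = 8192, 64, 8192                       # sample rate, fundamental, record length
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t) \
        + 0.01 * np.sin(2 * np.pi * 3 * f0 * t)      # test signal with known harmonics

    spectrum = np.abs(np.fft.rfft(x)) / (n / 2)
    fund = spectrum[f0]                              # fundamental bin (1 Hz resolution here)
    harmonics = spectrum[2 * f0 : 10 * f0 : f0]      # 2nd..9th harmonic bins
    thd = np.sqrt(np.sum(harmonics**2)) / fund
    print("harmonic coefficient (THD): %.3f %%" % (100 * thd))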

  5. Development of a Digital Signal Processing System for the X-ray Microcalorimeter onboard ASTRO-H

    NASA Astrophysics Data System (ADS)

    Seta, Hiromi; Tashiro, Makoto S.; Terada, Yukikatsu; Shimoda, Yuya; Onda, Kaori; Ishisaki, Yoshitaka; Tsujimoto, Masahiro; Hagihara, Toshishige; Takei, Yoh; Mitsuda, Kazuhisa; Boyce, Kevin R.; Szymkowiak, Andrew E.

    2009-12-01

    A digital signal processing system for the X-ray microcalorimeter array (SXS) is being developed for the next Japanese X-ray astronomy satellite, ASTRO-H. The SXS digital signal processing system evaluates each pulse by an optimal filtering process. For the ASTRO-H project, we decided to employ digital electronics hardware, which includes a digital I/O board based upon FPGAs, and a separate CPU board. It is crucially important for the FPGA to be able to detect the presence of "secondary" pulses on the tail of an initial pulse. In order to detect these contaminating pulses, we have developed a new finite impulse response filter to compensate for the undershoot in the derivative. By employing the filter it is possible for the FPGA to detect a secondary pulse very close to the first pulse, and to reduce the load on the CPU in the secondary pulse search process.
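
    The optimal-filtering step mentioned above amounts, in its simplest time-domain form, to correlating each pulse record with a normalized template to estimate the pulse height. The Python sketch below shows only that generic matched-filter idea; the exponential pulse shape and noise level are illustrative and this is not the SXS filter design.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1024
    t = np.arange(n)
    template = np.exp(-t / 200.0) - np.exp(-t / 20.0)       # toy pulse shape
    template /= np.sum(template**2)                         # normalize so the dot product returns amplitude

    record = 5.0 * (np.exp(-t / 200.0) - np.exp(-t / 20.0)) + 0.05 * rng.standard_normal(n)
    height = np.dot(template, record)                       # optimal (matched-filter) height estimate
    print("estimated pulse height: %.2f" % height)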

  6. A Novel Methodology for Processing of Plutonium-Bearing Waste as Ammonium Plutonium(III)-Oxalate

    SciTech Connect

    Sali, Sanjay Krishnarao; Noronha, Donal Marshal; Mhatre, Hemakant Ramkrishna; Mahajan, Murlidhar Anna; Chander, Keshav; Aggarwal, Suresh Kumar; Venugopal, Venkatarama

    2005-09-15

    A novel methodology has been developed for the recovery of Pu from different types of waste solutions generated during various operations involved in the chemical quality control/assurance of nuclear fuels. The method is based on the precipitation of Pu as ammonium plutonium(III)-oxalate and involves the adjustment of the acidity of the Pu solution to 1 N, the addition of ascorbic acid (0.05 M) to reduce Pu to Pu(III), followed by the addition of (NH{sub 4}){sub 2}SO{sub 4} (0.5 M) and a stoichiometric amount of saturated oxalic acid, maintaining a 0.2 M excess of oxalic acid concentration in the supernatant. The precipitate was characterized by X-ray powder diffraction and thermal and chemical analysis and was found to have the composition NH{sub 4}Pu(C{sub 2}O{sub 4}){sub 2}.3H{sub 2}O. This compound can be easily decomposed to PuO{sub 2} on heating in air at 823 K. Decontamination factors determined for U, Fe, and Cr showed quantitative removal of these ions during the precipitation of Pu as ammonium plutonium(III)-oxalate. A semiautomatic assembly based on the transfer of solutions by a suction arrangement was designed and fabricated for processing large volumes of Pu solution. This assembly reduced the corrosion of the glove-box material and offered the advantage of lower radiation exposure to the working personnel.

  7. Leaching zinc from spent catalyst: process optimization using response surface methodology.

    PubMed

    Zhang, Zhengyong; Peng, Jinhui; Srinivasakannan, C; Zhang, Zebiao; Zhang, Libo; Fernández, Y; Menéndez, J A

    2010-04-15

    The spent catalyst from vinyl acetate synthesis contains a large quantity of zinc. The present study attempts to leach zinc using a mixture of ammonia, ammonium carbonate and water solution, after microwave treatment. The effect of important parameters such as leaching time, liquid/solid ratio and ammonia concentration was investigated, and the process conditions were optimized using response surface methodology (RSM) based on a central composite design (CCD). The optimum condition for leaching of zinc from the spent catalyst was identified to be a leaching time of 2.50 h, a liquid/solid ratio of 6 and an ammonia concentration of 5.37 mol/L. A maximum of 97% of the zinc was recovered under the optimum experimental conditions. The proposed model equation using RSM has shown good agreement with the experimental data, with a correlation coefficient (R(2)) of 0.95. The samples were characterized before and after leaching using X-ray diffraction (XRD), nitrogen adsorption and scanning electron microscopy (SEM). PMID:20060224

  8. A simple methodology to assess endolysosomal protease activity involved in antigen processing in human primary cells

    PubMed Central

    2013-01-01

    Background Endolysosomes play a key role in maintaining the homeostasis of the cell. They are made of a complex set of proteins that degrade lipids, proteins and sugars. Studies involving endolysosome contribution to cellular functions such as MHC class I and II epitope production have used recombinant endolysosomal proteins, knockout mice that lack one of the enzymes or purified organelles from human tissue. Each of these approaches has some caveats in analyzing endolysosomal enzyme functions. Results In this study, we have developed a simple methodology to assess endolysosomal protease activity. By varying the pH in crude lysate from human peripheral blood mononuclear cells (PBMCs), we documented increased endolysosomal cathepsin activity in acidic conditions. Using this new method, we showed that the degradation of HIV peptides in low pH extracts analyzed by mass spectrometry followed similar kinetics and degradation patterns as those performed with purified endolysosomes. Conclusion By using crude lysate in the place of purified organelles this method will be a quick and useful tool to assess endolysosomal protease activities in primary cells of limited availability. This quick method will especially be useful to screen peptide susceptibility to degradation in endolysosomal compartments for antigen processing studies, following which detailed analysis using purified organelles may be used to study specific peptides. PMID:23937268

  9. Photothermal heating as a methodology for post processing of polymeric nanofibers

    NASA Astrophysics Data System (ADS)

    Gorga, Russell; Clarke, Laura; Bochinski, Jason; Viswanath, Vidya; Maity, Somsubhra; Dong, Ju; Firestone, Gabriel

    2015-03-01

    Metal nanoparticles embedded within polymeric systems can be made to act as localized heat sources thereby aiding in-situ polymer processing. This is made possible by the surface plasmon resonance (SPR) mediated photothermal effect of metal (in this case gold) nanoparticles, wherein incident light absorbed by the nanoparticle generates a non-equilibrium electron distribution which subsequently transfers this energy into the surrounding medium, resulting in a temperature increase in the immediate region around the particle. Here we demonstrate this effect in polymer nanocomposite systems, specifically electrospun polyethylene oxide nanofibrous mats, which have been annealed at temperatures above the glass transition. A non-contact temperature measurement technique utilizing embedded fluorophores (perylene) has been used to monitor the average temperature within samples. The effect of annealing methods (conventional and photothermal) and annealing conditions (temperature and time) on the fiber morphology, overall crystallinity, and mechanical properties is discussed. This methodology is further utilized in core-sheath nanofibers to crosslink the core material, which is a pre-cured epoxy thermoset. NSF Grant CMMI-1069108.

  10. Digital signal processing of cylinder pressure data for combustion diagnostics of HCCI engine

    NASA Astrophysics Data System (ADS)

    Kumar Maurya, Rakesh; Pal, Dev Datt; Kumar Agarwal, Avinash

    2013-03-01

    Diagnosis of combustion is necessary for the estimation of combustion quality and the control of combustion timing in advanced combustion concepts like HCCI. Combustion diagnostics is often performed using digital processing of pressure signals measured using a piezoelectric sensor installed in the combustion chamber of the engine. Four-step pressure signal processing, consisting of (i) absolute pressure correction, (ii) phasing w.r.t. crank angle, (iii) cycle averaging and (iv) smoothing, is used to get cylinder pressure data from the engine experiments, which is further analyzed to get information about combustion characteristics. This study focuses on various aspects of signal processing (cycle averaging and smoothing) of the in-cylinder pressure signal from a HCCI engine acquired using a piezoelectric pressure sensor. Experimental investigations are conducted on a HCCI combustion engine operating at different engine speed/load/air-fuel ratio conditions. The cylinder pressure history of 3000 consecutive engine cycles is acquired for analysis using a piezoelectric pressure sensor. This study determines the optimum number of engine cycles to be acquired for reasonably good pressure signals based on the standard deviation of the in-cylinder pressure, rate of pressure rise and rate of heat release signals. Different signal smoothing methods (using various digital filters) are also analyzed and their results compared. This study also presents the effect of signal processing methods on pressure, pressure rise rate and rate of heat release curves at different engine operating conditions.
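
    Two of the four steps, cycle averaging and smoothing, can be sketched directly on a crank-angle-resolved pressure array. The Python sketch below uses a synthetic pressure trace and a Savitzky-Golay filter as one possible smoothing filter; the trace, the cycle count and the filter settings are illustrative, not the paper's choices.

    import numpy as np
    from scipy.signal import savgol_filter

    n_cycles, samples_per_cycle = 300, 720          # one sample per degree of crank angle (illustrative)
    theta = np.linspace(-360, 360, samples_per_cycle, endpoint=False)

    rng = np.random.default_rng(1)
    cycles = 30 + 25 * np.exp(-((theta - 5) / 40.0) ** 2)                 # toy pressure curve, bar
    cycles = cycles + rng.normal(0, 0.5, (n_cycles, samples_per_cycle))   # cycle-to-cycle noise

    avg_pressure = cycles.mean(axis=0)                                    # cycle averaging
    smooth_pressure = savgol_filter(avg_pressure, window_length=21, polyorder=3)  # smoothing
    dp_dtheta = np.gradient(smooth_pressure, theta)                       # rate of pressure rise
    print("peak pressure %.1f bar at %.0f deg CA; max dp/dtheta %.2f bar/deg"
          % (smooth_pressure.max(), theta[smooth_pressure.argmax()], dp_dtheta.max()))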

  11. Digital processing considerations for extraction of ocean wave image spectra from raw synthetic aperture radar data

    NASA Technical Reports Server (NTRS)

    Lahaie, I. J.; Dias, A. R.; Darling, G. D.

    1984-01-01

    The digital processing requirements of several algorithms for extracting the spectrum of a detected synthetic aperture radar (SAR) image from the raw SAR data are described and compared. The most efficient algorithms for image spectrum extraction from raw SAR data appear to be those containing an intermediate image formation step. It is shown that a recently developed compact formulation of the image spectrum in terms of the raw data is computationally inefficient when evaluated directly, in comparison with the classical method where matched-filter image formation is an intermediate result. It is also shown that a proposed indirect procedure for digitally implementing the same compact formulation is somewhat more efficient than the classical matched-filtering approach. However, this indirect procedure includes the image formation process as part of the total algorithm. Indeed, the computational savings afforded by the indirect implementation are identical to those obtained in SAR image formation processing when the matched-filtering algorithm is replaced by the well-known 'dechirp-Fourier transform' technique. Furthermore, corrections to account for slant-to-ground range conversion, spherical earth, etc., are often best implemented in the image domain, making intermediate image formation a valuable processing feature.

  12. A digital archiving system and distributed server-side processing of large datasets

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Aylward, Stephen R.; Marion, Charles; Lee, Joowhi; Styner, Martin

    2009-02-01

    In this paper, we present MIDAS, a web-based digital archiving system that processes large collections of data. Medical imaging research often involves interdisciplinary teams, each performing a separate task, from acquiring datasets to analyzing the processing results. Moreover, the number and size of the datasets continue to increase every year due to recent advancements in acquisition technology. As a result, many research laboratories centralize their data and rely on distributed computing power. We created a web-based digital archiving repository based on open standards. The MIDAS repository is specifically tuned for medical and scientific datasets and provides a flexible data management facility, a search engine, and an online image viewer. MIDAS enables users to run a set of extensible image processing algorithms from the web on the selected datasets and to add new algorithms to the MIDAS system, facilitating the dissemination of users' work to different research partners. The MIDAS system is currently running in several research laboratories and has demonstrated its ability to streamline the full image processing workflow from data acquisition to image analysis and reports.

  13. Development of software for digital image processing for analysis of neuroangiogenesis

    NASA Astrophysics Data System (ADS)

    Gonzalez, M. A.; Ballarin, V. L.; Celín, A. R.; Rapacioli, M.; López-Costa, J. J.; Flores, V.

    2011-12-01

    The process of formation, growth and distribution of vessels within the developing central nervous system is difficult to analyze due to the complexity of the paths and branches within the system. The study of images of this area poses particular problems because high levels of noise, blurring and poor contrast often prevent the objects of interest from being detected correctly. The design of digital image processing algorithms suitable for this type of imagery remains a constant challenge. The aim of this work is to develop a computer tool to assist the specialist in processing these images. This paper proposes the use of morphological grayscale reconstruction and other morphological operators in order to segment the images properly. The results show that the algorithms allow a suitable segmentation of the objects of interest. Moreover, the processing interface developed enables specialists to analyze them easily and simply.
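
    Morphological grayscale reconstruction is commonly used to estimate and remove a smooth background before thresholding bright structures. The Python/scikit-image sketch below illustrates that generic pattern on a synthetic image; it is not the authors' algorithm, and the image and parameters are invented.

    import numpy as np
    from skimage.morphology import reconstruction
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(3)
    image = rng.normal(0.2, 0.05, (256, 256))              # noisy background
    image[100:110, 40:200] += 0.6                          # bright vessel-like structure

    # Reconstruction by dilation from a lowered copy recovers the smooth background,
    # so image - background isolates the bright structures before thresholding.
    seed = image - 0.3
    background = reconstruction(seed, image, method='dilation')
    vessels = image - background
    mask = vessels > threshold_otsu(vessels)
    print("segmented pixels:", int(mask.sum()))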

  14. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.
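
    The DPCM idea behind such a codec is to transmit quantized differences between each pixel and a prediction formed from the previously reconstructed pixel. The Python sketch below shows a textbook previous-pixel DPCM loop with an 8-level quantizer; it illustrates the principle only and is not the broadcast-quality algorithm described above.

    import numpy as np

    def dpcm_encode_decode(line, step=8):
        """Encode one scan line with previous-pixel prediction, return the reconstruction."""
        recon = np.empty_like(line, dtype=float)
        prev = 128.0                                   # fixed predictor start value
        for i, pixel in enumerate(line):
            diff = pixel - prev
            q = np.clip(np.round(diff / step), -4, 3)  # 3-bit quantized difference (8 levels)
            recon[i] = prev + q * step                 # decoder mirrors this reconstruction
            prev = recon[i]
        return recon

    line = np.array([120, 122, 128, 135, 150, 180, 181, 179], dtype=float)
    print(dpcm_encode_decode(line))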

  15. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.

  16. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  17. Perceptual and category processing of the Uncanny Valley hypothesis' dimension of human likeness: some methodological issues.

    PubMed

    Cheetham, Marcus; Jancke, Lutz

    2013-01-01

    Mori's Uncanny Valley Hypothesis(1,2) proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings (3, 4, 5, 6). One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) (7). Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated. PMID:23770728

  18. Process evaluation of community monitoring under national health mission at Chandigarh, union territory: Methodology and challenges

    PubMed Central

    Tripathy, Jaya Prasad; Aggarwal, Arun Kumar; Patro, Binod Kumar; Verma, Himbala

    2015-01-01

    Background: Community monitoring was introduced in a pilot mode in 36 selected districts of India in a phased manner. In Chandigarh, it was introduced in the year 2009–2010. A preliminary evaluation of the program was undertaken with special emphasis on the inputs and the processes. Methodology: Quantitative methods included verification against checklists and record reviews. Nonparticipant observation was used to evaluate the conduct of trainings, interviews, and group discussions. The health system had trained health system functionaries (nursing students and Village Health Sanitation Committee [VHSC] members) to generate village-based scorecards for assessing community needs. Community needs were assessed independently for two villages under the study area to validate the scores generated by the health system. Results: VHSCs were formed in all 22 villages but without a chairperson or convener. The involvement of VHSC members in the community monitoring process was minimal. The conduct of group discussions was below par due to poor moderation and unequal responses from the group. The community monitoring committees at the state level had limited representation from the non-health sector, lower committees, and the nongovernmental organizations/civil societies. Agreement between the report cards generated by the investigator and the health system in the selected villages was found to be fair (0.369), whereas the weighted kappa (0.504) was moderate. Conclusion: In spite of all these limitations and challenges, the government has taken a valiant step by trying to involve the community in the monitoring of health services. The dynamic nature of the community warrants incorporation of an evaluation framework into the planning of such programs. PMID:26985413

  19. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    PubMed

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize individual patients' raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) the isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610

  20. Methodology for processing backscattered electron images. Application to Aguada archaeological paints.

    PubMed

    Galván Josa, V; Bertolino, S R; Riveros, J A; Castellano, G

    2009-12-01

    Scanning electron microscopy is a powerful technique in several fields of science and technology. In particular, it is an important complement in the characterization of materials for which X-ray analysis is not possible. Such is the case of thin paint layers on ceramic pots, in which, even for low incident energies, the electron interaction volume can be greater than the paint thickness, in addition to the problem arising from similar compositions. With the aim of complementing other common techniques used in compositional materials characterization, in this work image-processing software has been developed which implements a new methodology for the treatment of backscattered electron (BSE) images in order to bring to evidence small mean atomic number contrasts, usually imperceptible to the human eye. The program was used to study black and white pigments of ceramic pieces belonging to the Ambato style of the "Aguada" culture (Catamarca province, Argentina, IV-XII centuries AD). Although the BSE images acquired for these samples showed no apparent contrast between the sherd and the black and white pigments, chemical contrast between regions was brought to evidence, with minor detail loss, through image-processing algorithms using different spatial filters. This was accomplished by applying a smoothing filter, after which the main routine for contrast enhancement reveals details in the grey-level region of interest; finally, a filter for edge enhancement recovers some details lost in the previous steps, achieving satisfactory results for the painted sherd samples analyzed. In order to validate the mean atomic number differences found between each pigment and the ceramic body, X-ray diffraction diagrams were refined with the Rietveld method using the software DIFFRACplus Topas, arriving at mineralogical differences that agree with the results obtained. As a consequence of this study, the program developed has proven to be a suitable tool for routine
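
    The three processing steps named above (smoothing, contrast enhancement over a grey-level window of interest, edge enhancement) can be sketched generically. The Python sketch below applies Gaussian smoothing, a linear stretch of a narrow grey-level window and unsharp masking to a synthetic BSE-like image; the image, the window limits and the gain are illustrative assumptions, not the authors' routines.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(7)
    bse = rng.normal(0.50, 0.01, (256, 256))            # ceramic body, mean grey level ~0.50
    bse[64:128, 64:192] += 0.008                        # paint layer: tiny mean-atomic-number contrast

    smoothed = gaussian_filter(bse, sigma=2)            # 1) smoothing filter

    lo, hi = 0.49, 0.52                                 # 2) stretch the grey-level window of interest
    stretched = np.clip((smoothed - lo) / (hi - lo), 0, 1)

    blurred = gaussian_filter(stretched, sigma=3)       # 3) unsharp masking for edge enhancement
    enhanced = np.clip(stretched + 1.5 * (stretched - blurred), 0, 1)
    print("contrast (paint vs body):", round(float(enhanced[96, 128] - enhanced[32, 32]), 3))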

  1. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing Archival/Methodology, and Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.

    2003-01-01

    A four-station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long time series of observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
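
    A minimal sketch of the kind of time-of-arrival location retrieval such a network performs is shown below. It is not the NASA/Marshall analytic inversion algorithm: the flat-plane geometry, station coordinates and least-squares solver are assumptions for illustration.

```python
# Minimal time-of-arrival (TOA) stroke location sketch on a flat plane.
# Assumed setup: known sensor coordinates (km) and measured arrival times (s);
# this is a generic least-squares analogue, not the NASA/Marshall inversion.
import numpy as np
from scipy.optimize import least_squares

C = 299792.458  # signal propagation speed, km/s

def locate_stroke(stations, arrival_times):
    """stations: (N, 2) array of x, y in km; arrival_times: length-N array in s."""
    def residuals(p):
        x, y, t0 = p
        dist = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
        return (t0 + dist / C) - arrival_times
    x0 = np.append(stations.mean(axis=0), arrival_times.min())
    return least_squares(residuals, x0).x  # (x, y, emission time)

# Example: synthetic stroke at (40, 25) km observed by four sensors
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_xy, t0 = np.array([40.0, 25.0]), 0.0
times = t0 + np.hypot(*(stations - true_xy).T) / C
print(locate_stroke(stations, times))
```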

  2. Digital image capture, processing, and recording system upgrade for the APS-94F SLAR

    NASA Astrophysics Data System (ADS)

    Ferraris, Guillermo L.

    2000-11-01

    The Argentine Army has been operating the APS-94F SLAR systems, on board the venerable OV-1D MOHAWK aircraft, since 1996. These systems were received from the U.S. Government through the FMS program. One major handicap of the system is its now obsolete imagery recording subsystem, which includes complex optical, thermal and electro-mechanical processes and components that account for most of the degradation and distortion in the images obtained (not to mention that images are recorded on 9.5-inch silver halide film, which has to be kept at -17 degrees C and brought to thermal equilibrium with the environment eight hours before the mission). An integral digital capture, processing and recording subsystem was developed at CITEFA (Instituto de Investigaciones Cientificas y Tecnicas de las Fuerzas Armadas) to replace the old analog RO-495/U recorder, as an upgrade to this very robust and proven imaging radar system. The subsystem developed includes three custom designed ISA boards: (1) a radar video and aircraft attitude signal conditioning board, (2) a microprocessor-controlled two-channel high speed digitizing board and (3) an integrated 12-channel GPS OEM board. The operator's software interface is a PC-based GUI C++ application, including radar imagery forming and processing algorithms, slant range to ground range conversion, digitally controlled image gain, bias and contrast adjustments, image registration (GPS), image file disk recording and retrieval functions, real time mensuration and MTI/FTI (moving target indication/fixed target indication) image correlation. The system also provides the added capability to send compressed still radar images in NRT (near real time) to a ground receiving station through a secure data link. Due to serious space limitations inside the OV-1D two-seat cockpit, a military grade ruggedized laptop computer and docking station hardware implementation was selected.
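
    The slant-range to ground-range conversion mentioned above reduces, under a flat-earth approximation with known aircraft altitude, to simple geometry. The sketch below is an assumed illustration, not CITEFA's implementation, which also has to handle terrain relief and antenna geometry.

```python
# Flat-earth slant-range to ground-range conversion for a side-looking radar.
# Assumed illustration only: real SLAR processing also handles terrain height,
# depression-angle limits and earth curvature.
import numpy as np

def slant_to_ground_range(slant_range_m, aircraft_altitude_m):
    """Return ground range (m); samples closer than the altitude have no ground projection."""
    slant = np.asarray(slant_range_m, dtype=float)
    valid = slant >= aircraft_altitude_m
    ground = np.full_like(slant, np.nan)
    ground[valid] = np.sqrt(slant[valid] ** 2 - aircraft_altitude_m ** 2)
    return ground

print(slant_to_ground_range([1200.0, 5000.0, 12000.0], aircraft_altitude_m=1500.0))
```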

  3. Non-digitized diffractive beam splitters for high-throughput laser materials processing

    NASA Astrophysics Data System (ADS)

    Amako, J.; Fujii, E.

    2014-03-01

    We report a non-digitized diffractive beam splitter with a split count of 45, a 95% splitting efficiency, and a 0.90 splitting uniformity. The splitter was iteratively designed and was created on fused silica by laser writing lithography. Antireflection coatings were added to the splitter to ensure high efficiency. This splitter was applied to the manufacture of inkjet printer heads, in which silicon wafers were drilled with a 532-nm, nanosecond pulse laser with an average output of 10 W and were wet-etched to produce microfluidic channels. We also discuss large beam arrays for process throughput and subwavelength structures formed on the splitter for efficient laser power use.
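
    Given measured powers of the individual split beams, the two figures of merit quoted above can be computed as below. The uniformity definition used here, 1 - (Pmax - Pmin)/(Pmax + Pmin), is one common convention and may differ from the authors'; the synthetic spot powers are assumptions.

```python
# Splitting efficiency and uniformity from measured beam-spot powers.
# Assumptions: 'spot_powers' are the 45 diffracted-order powers and
# 'incident_power' the input power; the uniformity formula is one common
# convention, not necessarily the one used in the paper.
import numpy as np

def splitter_metrics(spot_powers, incident_power):
    p = np.asarray(spot_powers, dtype=float)
    efficiency = p.sum() / incident_power                   # fraction of input power in the wanted orders
    uniformity_error = (p.max() - p.min()) / (p.max() + p.min())
    return efficiency, 1.0 - uniformity_error               # (efficiency, uniformity)

rng = np.random.default_rng(0)
spots = rng.normal(loc=0.21, scale=0.005, size=45)           # watts, synthetic example
print(splitter_metrics(spots, incident_power=10.0))
```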

  4. Combined flatland ST radar and digital-barometer network observations of mesoscale processes

    NASA Technical Reports Server (NTRS)

    Clark, W. L.; Vanzandt, T. E.; Gage, K. S.; Einaudi, F. E.; Rottman, J. W.; Hollinger, S. E.

    1991-01-01

    The paper describes a six-station digital-barometer network centered on the Flatland ST radar to support observational studies of gravity waves and other mesoscale features at the Flatland Atmospheric Observatory in central Illinois. The network's current mode of operation is examined, and a preliminary example of an apparent group of waves evident throughout the network as well as throughout the troposphere is presented. Preliminary results demonstrate the capabilities of the current operational system to study wave convection, wave-front, and other coherent mesoscale interactions and processes throughout the troposphere. Unfiltered traces for the pressure and horizontal zonal wind, for days 351 to 353 UT, 1990, are illustrated.

  5. Automated image processing of Landsat II digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention

  6. IBIS - A geographic information system based on digital image processing and image raster datatype

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1976-01-01

    IBIS (Image Based Information System) is a geographic information system which makes use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remotely sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set. The first applications (St. Tammany Parish, Louisiana, and Los Angeles County) have been restricted to the design of a land resource inventory and analysis system. It is thought that the algorithms and the hardware interfaces developed will be readily applicable to other Landsat imagery.

  7. Rocket engine plume diagnostics using video digitization and image processing - Analysis of start-up

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Shoe, B.; Dhawan, A. P.

    1991-01-01

    Video digitization techniques have been developed to analyze the exhaust plume of the Space Shuttle Main Engine. Temporal averaging and a frame-by-frame analysis provide data used to evaluate the capabilities of image processing techniques for use as measurement tools. Capabilities include determining the time required for the Mach disk to reach a fully-developed state. Other results show that the Mach disk tracks the nozzle for short time intervals, and that dominant frequencies exist for the nozzle and Mach disk movement.
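
    Temporal averaging and frame-by-frame differencing of digitized video frames are single reductions over the frame axis; the sketch below is a generic illustration on a synthetic frame stack, not the original SSME plume analysis code.

```python
# Temporal averaging and frame-by-frame differencing of a digitized video sequence.
# Generic illustration on a synthetic frame stack; not the original SSME plume code.
import numpy as np

def temporal_average(frames):
    """frames: array shaped (n_frames, rows, cols); returns the time-averaged image."""
    return frames.mean(axis=0)

def frame_differences(frames):
    """Absolute change between consecutive frames, e.g. for tracking Mach disk motion."""
    return np.abs(np.diff(frames.astype(float), axis=0))

frames = np.random.default_rng(1).random((30, 64, 64))   # 30 synthetic 64x64 frames
print(temporal_average(frames).shape, frame_differences(frames).shape)
```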

  8. Inventories of Delaware's coastal vegetation and land-use utilizing digital processing of ERTS-1 imagery

    NASA Technical Reports Server (NTRS)

    Klemas, V. (Principal Investigator); Bartlett, D.; Rogers, R.; Reed, L.

    1974-01-01

    The author has identified the following significant results. Analysis of ERTS-1 color composite images using analog processing equipment confirmed that all the major wetlands plant species were distinguishable at ERTS-1 scale. Furthermore, human alterations of the coastal zone were easily recognized, since such alterations typically involve removal of vegetative cover, resulting in a change of spectral signature. The superior spectral resolution of the CCTs as compared with single band or composite imagery has indeed provided good discrimination through digital analysis of the CCTs, with the added advantage of rapid production of thematic maps and data.

  9. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  10. LANDSAT digital data processing: A near real-time application. [Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Barker, J. L.; Bohn, C.; Stuart, L.; Hill, J.

    1975-01-01

    An application of rapid generation of classed digital images from LANDSAT-1 was demonstrated and its feasibility evaluated by NASA in conjunction with the Environmental Protection Agency (EPA), Texas A and M University (TAMU), and the Cousteau Society. The primary purpose was to show that satellite data could be processed and transmitted to the Calypso, which was used as a research vessel, in time for use in directing it to specific locations of possible plankton upwellings, sediment, or other anomalies in the coastal water areas along the Gulf of Mexico.

  11. Digital Storytelling as a Narrative Health Promotion Process: Evaluation of a Pilot Study.

    PubMed

    DiFulvio, Gloria T; Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah E; Del Toro-Mejias, Lizbeth Marie

    2016-04-01

    Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. The process of individuals telling their own stories has not been well assessed as a mechanism of health behavior change. This study looks at outcomes associated with engaging in the DST process for vulnerable youth. The project focused on the experiences of Puerto Rican Latinas between the ages of 15 and 21. A total of 30 participants enrolled in 4-day DST workshops, with 29 completing a 1- to 3-minute digital story. Self-reported data on several scales (self-esteem, social support, empowerment, and sexual attitudes and behaviors) were collected and analyzed. Participants showed an increase in positive social interactions from baseline to 3 months post-workshop. Participants also demonstrated increases in optimism and control over the future immediately after the workshop, but this change was not sustained at 3 months. Qualitative results and their implications are discussed. PMID:27166356

  12. Virtual and flexible digital signal processing system based on software PnP and component works

    NASA Astrophysics Data System (ADS)

    He, Tao; Wu, Qinghua; Zhong, Fei; Li, Wei

    2005-05-01

    An idea of software PnP (Plug & Play), analogous to hardware PnP, is put forward. Based on this idea, a flexible virtual digital signal processing system (FVDSPS) is implemented. FVDSPS is composed of a main control center, many sub-function modules and other hardware I/O modules. The main control center sends commands to the sub-function modules and manages the running order, parameters and results of the sub-functions. The software kernel of FVDSPS is the DSP (Digital Signal Processing) module, which communicates with the main control center through defined protocols, accepting commands and sending requests. Data sharing and exchange between the main control center and the DSP modules are carried out and managed through the file system of the Windows operating system via this communication. FVDSPS is oriented to objects, to engineers and to engineering problems. With FVDSPS, users can freely plug and play, and quickly reconfigure a signal processing system according to the engineering problem without programming: what you see is what you get. Thus, an engineer can address engineering problems directly, pay more attention to the problems themselves, and improve the flexibility, reliability and accuracy of the testing system. Because FVDSPS is built on the TCP/IP protocol, testing engineers and technology experts can be connected freely over the Internet regardless of location, so engineering problems can be resolved quickly and effectively. FVDSPS can be used in many fields such as instrumentation, fault diagnosis, device maintenance and quality control.

  13. The use of response surface methodology for modelling and analysis of water and wastewater treatment processes: a review.

    PubMed

    Nair, Abhilash T; Makwana, Abhipsa R; Ahammed, M Mansoor

    2014-01-01

    In recent years, response surface methodology (RSM) has been used for modelling and optimising a variety of water and wastewater treatment processes. RSM is a collection of mathematical and statistical techniques for building models, evaluating the effects of several variables, and obtaining the values of process variables that produce desirable values of the response. This paper reviews the recent information on the use of RSM in different water and wastewater treatment processes. The theoretical principles and steps for its application are first described. The recent investigations on its application in coagulation-flocculation, adsorption, advanced oxidation processes, electro-chemical processes and disinfection are reviewed. The limitations of the methodology are highlighted. Attempts made to improve the RSM by combining it with other modelling techniques are also described. PMID:24552716
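
    At the core of RSM is the fit of a second-order polynomial model to designed-experiment data, from which the factor settings that optimise the predicted response are located. A minimal two-factor sketch on synthetic data is given below; the model form and data are illustrative assumptions, not taken from the reviewed studies.

```python
# Minimal response surface methodology sketch: fit a second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to synthetic
# treatment data and report the fitted coefficients. Data and coefficients
# are illustrative assumptions, not from the reviewed studies.
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = rng.uniform(-1, 1, 40), rng.uniform(-1, 1, 40)       # coded factor levels
y = 5 + 1.2 * x1 - 0.8 * x2 - 2.0 * x1**2 - 1.5 * x2**2 + 0.5 * x1 * x2
y += rng.normal(scale=0.1, size=40)                            # experimental noise

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b11", "b22", "b12"], coeffs.round(3))))
```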

  14. Systemic Inflammation: Methodological Approaches to Identification of the Common Pathological Process

    PubMed Central

    Zotova, N. V.; Chereshnev, V. A.; Gusev, E. Yu.

    2016-01-01

    We defined Systemic Inflammation (SI) as a "typical, multi-syndrome, phase-specific pathological process, developing from systemic damage and characterized by the total inflammatory reactivity of endotheliocytes, plasma and blood cell factors, connective tissue and, at the final stage, by microcirculatory disorders in vital organs and tissues." The goal of the work was to determine methodological approaches and particular methodical solutions for the problem of identifying SI as a common pathological process. SI can be defined by the presence in plasma of systemic proinflammatory cell stress products (cytokines and other inflammatory mediators), and also by a combination of other process signs. We have developed two scales: 1) the Reactivity Level (RL) scale, from 0 to 5 points: 0 is the normal level; RL-5 confirms the systemic nature of inflammatory mediator release, and RL 2–4 defines different degrees of event probability. 2) The SI scale, considering additional criteria along with RL, addresses more integral criteria of SI: the presence of ≥5 points according to the SI scale indicates a high probability of developing SI. To calculate the RL scale, concentrations of 4 cytokines (IL-6, IL-8, IL-10, TNF-α) and C-reactive protein in plasma were examined. Additional criteria of the SI scale were the following: D-dimers >500 ng/ml, cortisol >1380 or <100 nmol/l, troponin I ≥0.2 ng/ml and/or myoglobin ≥800 ng/ml. 422 patients with different septic (n=207) and aseptic (n=215) pathologies were included in the study. In 190 of the 422 cases there were signs of SI (lethality 38.4%, n=73). In only 5 of the 78 lethal cases was lethality not accompanied by the presence of SI. SI was registered in 100% of cases with septic shock (n=31). There were no significant differences between the AU-ROC of CR, the SI scale and SOFA for predicting death in patients with sepsis and trauma. PMID:27153324

  15. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  16. Construction of an integrated biomodule composed of microfluidics and digitally controlled microelectrodes for processing biomolecules.

    NASA Astrophysics Data System (ADS)

    Wagler, Patrick F.; Tangen, Uwe; Maeke, Thomas; Mathis, Harald P.; McCaskill, John S.

    2003-01-01

    This work focuses on the development of an online programmable microfluidic bioprocessing unit (BioModule) using digital logic microelectrodes for rapid pipelined selection and transfer of DNA molecules and other charged biopolymers. The design and construction technique for this hybrid programmable biopolymer processing device is presented along with the first proof of principle functionality. The electronically controlled collection, separation and channel transfer of the biomolecules is monitored by a sensitive fluorescence setup. This hybrid reconfigurable architecture couples electronic and biomolecular information processing via a single module combination of fluidics and electronics and opens new fields of applications not only in DNA computing and molecular diagnostics but also in applications of combinatorial chemistry and lab-on-a-chip biotechnology to the drug discovery process. Fundamentals of the design and silicon-PDMS-based construction of these electronic microfluidic devices and their functions are described as well as the experimental results.

  17. Microfabrication of a BioModule composed of microfluidics and digitally controlled microelectrodes for processing biomolecules

    NASA Astrophysics Data System (ADS)

    Wagler, Patrick F.; Tangen, Uwe; Maeke, Thomas; Mathis, Harald P.; McCaskill, John S.

    2003-10-01

    This work focuses on the development of an online programmable microfluidic bioprocessing unit (BioModule) using digital logic microelectrodes for rapid pipelined selection and transfer of deoxyribonucleic acid (DNA) molecules and other charged biopolymers. The design and construction technique for this hybrid programmable biopolymer processing device is presented along with the first proof of principle functionality. The electronically controlled collection, separation and channel transfer of the biomolecules is monitored by a sensitive fluorescence set-up. This hybrid reconfigurable architecture couples electronic and biomolecular information processing via a single module combination of fluidics and electronics and opens new fields of applications not only in DNA computing and molecular diagnostics but also in applications of combinatorial chemistry and lab-on-a-chip biotechnology to the drug discovery process. Fundamentals of the design and silicon-polydimethylsiloxane (PDMS)-based construction of these electronic microfluidic devices and their functions are described as well as the experimental results.

  18. X-SAR/SRTM Digital Height Models: Processing Status and Results

    NASA Astrophysics Data System (ADS)

    Werner, M.; Roth, A.; Knoepfle, W.; Breit, H.; Eineder, M.; Suchandt, S.

    2003-04-01

    The Shuttle Radar Topography Mission in February 2000 carried two "single pass" interferometric radar systems on board: the C-band system of NASA/JPL and the X-band system from DLR. Both systems were operated simultaneously during the mission. Independent processors have been developed to produce the digital terrain models. During SRTM the two C- and X-band radar systems of SIR-C/X-SAR were used as active illumination sources and were supplemented by two passive antennas mounted on the top of a 60 m long boom. Due to this re-use of existing hardware, the X-band system covered only a 50 km wide swath, providing a net of elevation data during this 11-day mission. After a difficult and long calibration phase, in which mainly the baseline determination errors had to be removed, we began the operational processing of the X-band data in December 2001. The processing proceeds continent by continent, starting from the ocean, which is used as a reference. Each swath crossing, as well as the overlaps between adjacent swaths, is used to support the height fixation and to check the height and location accuracy. Meanwhile the western part of Europe (30 deg longitude) has been processed, Africa and South America have been finished, and the digital height models and radar images are available to the public. Height error maps accompany the digital terrain models in tiles with a size of 15 arc-minutes. More than 1000 products have been delivered so far to the principal investigators, customers and DLR's own research team. The interferometric processing of the whole raw data set, amounting to 3600 Gbyte, is already completed and the data are ready for the geocoding and mosaicking process. There are some problematic areas where the phase unwrapping failed, which now have to be reworked with different procedures and matched parameters. In this paper we present the status of the processing and the future schedule as well as some results from comparison with reference DEMs. The performance

  19. A Methodological Reflection on the Process of Narrative Analysis: Alienation and Identity in the Life Histories of English Language Teachers

    ERIC Educational Resources Information Center

    Menard-Warwick, Julia

    2011-01-01

    This article uses data from life-history interviews with English language teachers in Chile and California to illustrate methodological processes in teacher identity research through narrative analysis. To this end, the author describes the steps she took in identifying an issue to be examined, selecting particular narratives as representative of…

  20. COMPUTER AIDED CHEMICAL PROCESS DESIGN METHODOLOGIES FOR POLLUTION REDUCTION(SYSTEMS ANALYSIS BRANCH, SUSTAINABLE TECHNOLOGY DIVISION, NRMRL)

    EPA Science Inventory

    The objective of the project is to develop computer optimization and simulation methodologies for the design of economical chemical manufacturing processes with a minimum of impact on the environment. The computer simulation and optimization tools developed in this project can be...

  1. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    PubMed

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance. PMID:25383483

  2. Beyond data collection in digital mapping: interpretation, sketching and thought process elements in geological map making

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare; Butler, Rob

    2016-04-01

    Geological mapping techniques have advanced significantly in recent years from paper fieldslips to Toughbook, smartphone and tablet mapping; but how do the methods used to create a geological map affect the thought processes that result in the final map interpretation? Geological maps have many key roles in the field of geosciences including understanding geological processes and geometries in 3D, interpreting geological histories and understanding stratigraphic relationships in 2D and 3D. Here we consider the impact of the methods used to create a map on the thought processes that result in the final geological map interpretation. As mapping technology has advanced in recent years, the way in which we produce geological maps has also changed. Traditional geological mapping is undertaken using paper fieldslips, pencils and compass clinometers. The map interpretation evolves through time as data is collected. This interpretive process that results in the final geological map is often supported by recording in a field notebook, observations, ideas and alternative geological models explored with the use of sketches and evolutionary diagrams. In combination the field map and notebook can be used to challenge the map interpretation and consider its uncertainties. These uncertainties and the balance of data to interpretation are often lost in the creation of published 'fair' copy geological maps. The advent of Toughbooks, smartphones and tablets in the production of geological maps has changed the process of map creation. Digital data collection, particularly through the use of inbuilt gyrometers in phones and tablets, has changed smartphones into geological mapping tools that can be used to collect lots of geological data quickly. With GPS functionality this data is also geospatially located, assuming good GPS connectivity, and can be linked to georeferenced infield photography. In contrast line drawing, for example for lithological boundary interpretation and sketching

  3. Statistical properties of 4000 raw and processed digital mammograms from a GE Senograph 2000D

    NASA Astrophysics Data System (ADS)

    Bloomquist, Aili K.; Yaffe, Martin J.; Mawdsley, Gordon E.; Rico, Dan; Bright, Stewart

    2003-06-01

    Optimization of the display of digital mammograms is an important challenge and requires knowledge of the characteristics of actual patient images. This work aims to describe some of the fundamental statistical properties of a large volume of images acquired on an FDA-approved device as used in clinical practice. 4569 digital mammograms (1246 patients) were acquired between October 2001 and August 2002 on a GE Senograph 2000D at Sunnybrook and Women's College Health Sciences Centre. Images were saved in "raw" format. The breast was then segmented from the background in the image using a technique based on thresholding and some connectivity rules. The histogram of pixel values within the breast only was then calculated for both the raw and processed versions of the image. The region of constant thickness, where the breast is in contact with the compression paddle, was also segmented from the CC-view raw images. The histogram and statistical properties in this central region were also calculated. Assorted statistical descriptors of the histograms were examined (dynamic range, mean, standard deviation, median and mode). The effect of image processing on the dynamic range in the periphery and central area of the breast was evaluated. The results were compared against the automatic exposure algorithm and acquisition parameters, projection (view), and breast thickness.
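
    A sketch of the segmentation-plus-histogram step described above is given below; the threshold value, the largest-connected-component rule and the synthetic image are assumptions for illustration, not the processing chain actually used on the Senograph data.

```python
# Illustrative sketch (not the Sunnybrook processing chain): segment the breast from
# the background by thresholding plus a largest-connected-component rule, then compute
# histogram statistics of the pixels inside the breast. Threshold value is an assumption.
import numpy as np
from scipy import ndimage

def breast_statistics(raw_image, background_threshold):
    mask = raw_image > background_threshold                    # crude foreground threshold
    labels, n = ndimage.label(mask)                            # connectivity rule: 4-connected components
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    pixels = raw_image[labels == (np.argmax(sizes) + 1)]       # keep the largest component only
    values, counts = np.unique(pixels, return_counts=True)
    return {
        "dynamic_range": float(values.max() - values.min()),
        "mean": float(pixels.mean()),
        "std": float(pixels.std()),
        "median": float(np.median(pixels)),
        "mode": float(values[np.argmax(counts)]),
    }

# Synthetic 12-bit "raw" image: a bright disc (breast) on a dark background
yy, xx = np.mgrid[:256, :256]
raw = np.where((yy - 128) ** 2 + (xx - 90) ** 2 < 80 ** 2, 2500, 30)
raw = raw + np.random.default_rng(5).integers(0, 50, size=raw.shape)
print(breast_statistics(raw, background_threshold=500))
```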

  4. Combination of digital signal processing methods towards an improved analysis algorithm for structural health monitoring.

    NASA Astrophysics Data System (ADS)

    Pentaris, Fragkiskos P.; Makris, John P.

    2013-04-01

    In Structural Health Monitoring (SHM) it is of great importance to reveal valuable information from the recorded SHM data that could be used to predict or indicate structural fault or damage in a building. In this work a combination of digital signal processing methods, namely the FFT together with the wavelet transform, is applied, together with a proposed algorithm to study frequency dispersion, in order to depict non-linear characteristics of SHM data collected in two university buildings under natural or anthropogenic excitation. The selected buildings are of great importance from a civil protection point of view, as they are the premises of a public higher education institute, undergoing heavy use, stress, and visits from academic staff and students. The SHM data are collected from two neighboring buildings of different age (4 and 18 years old, respectively). The proposed digital signal processing methods are applied to the data, presenting a comparison of the structural behavior of both buildings in response to seismic activity, weather conditions and man-made activity. Acknowledgments: This work was supported in part by the Archimedes III Program of the Ministry of Education of Greece, through the Operational Program "Educational and Lifelong Learning", in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)» and is co-financed by the European Union (European Social Fund) and Greek national funds.
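
    A minimal version of the FFT-plus-wavelet analysis described above, applied to a single channel, could look like the following; the Ricker (Mexican-hat) wavelet, the scales and the 100 Hz sampling rate are assumptions, and the project's combined algorithm for frequency dispersion is not reproduced.

```python
# Minimal FFT + continuous wavelet transform of one SHM channel.
# The Ricker wavelet, the scales and the 100 Hz sampling rate are assumed for
# illustration; the paper's combined algorithm is not reproduced here.
import numpy as np

def ricker(n_points, width):
    """Ricker (Mexican-hat) wavelet sampled on n_points."""
    t = np.arange(n_points) - (n_points - 1) / 2.0
    a = 2.0 / (np.sqrt(3.0 * width) * np.pi ** 0.25)
    return a * (1.0 - (t / width) ** 2) * np.exp(-0.5 * (t / width) ** 2)

def fft_spectrum(signal, fs):
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs, np.abs(np.fft.rfft(signal)) / signal.size

def cwt(signal, widths):
    # One row per scale: convolution of the signal with a scaled Ricker wavelet.
    return np.array([np.convolve(signal, ricker(min(10 * w, signal.size), w), mode="same")
                     for w in widths])

fs = 100.0                                              # Hz, assumed sampling rate
t = np.arange(0, 10, 1.0 / fs)
x = np.sin(2 * np.pi * 2.5 * t) + 0.3 * np.random.default_rng(3).normal(size=t.size)
freqs, amp = fft_spectrum(x, fs)
scalogram = cwt(x, widths=np.arange(1, 31))
print(freqs[np.argmax(amp)], scalogram.shape)           # dominant frequency and scalogram size
```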

  5. Processing of multi-digit additions in high math-anxious individuals: psychophysiological evidence

    PubMed Central

    Núñez-Peña, María Isabel; Suárez-Pellicioni, Macarena

    2015-01-01

    We investigated the time course of neural processing of multi-digit additions in high- (HMA) and low-math anxious (LMA) individuals. Seventeen HMA and 17 LMA individuals were presented with two-digit additions and were asked to perform a verification task. Behavioral data showed that HMA individuals were slower and more error prone than their LMA peers, and that incorrect solutions were solved more slowly and less accurately than correct ones. Moreover, HMA individuals tended to need more time and commit more errors when having to verify incorrect solutions than correct ones. ERPs time-locked to the presentation of the addends (calculation phase) and to the presentation of the proposed solution (verification phase) were also analyzed. In both phases, a P2 component of larger amplitude was found for HMA individuals than for their LMA peers. Because the P2 component is considered to be a biomarker of the mobilization of attentional resources toward emotionally negative stimuli, these results suggest that HMA individuals may have invested more attentional resources both when processing the addends (calculation phase) and when they had to report whether the proposed solution was correct or not (verification phase), as compared to their LMA peers. Moreover, in the verification phase, LMA individuals showed a larger late positive component (LPC) for incorrect solutions at parietal electrodes than their HMA counterparts. The smaller LPC shown by HMA individuals when verifying incorrect solutions suggests that these solutions may have been appeared more plausible to them than to their LMA counterparts. PMID:26347705

  6. A Comparison of the Safety Analysis Process and the Generation IV Proliferation Resistance/Physical Protection Assessment Methodology

    SciTech Connect

    T. A. Bjornard; M. D. Zentner

    2006-05-01

    The Generation IV International Forum (GIF) is a vehicle for the cooperative international development of future nuclear energy systems. The Generation IV program has established primary objectives in the areas of sustainability, economics, safety and reliability, and Proliferation Resistance and Physical Protection (PR&PP). In order to help meet the latter objective, a program was launched in December 2002 to develop a rigorous means to assess nuclear energy systems with respect to PR&PP. The study of the Physical Protection of a facility is a relatively well established methodology, but an approach to evaluate the Proliferation Resistance of a nuclear fuel cycle is not. This paper examines the Proliferation Resistance (PR) evaluation methodology being developed by the PR group, which is largely a new approach, and compares it to generally accepted nuclear facility safety evaluation methodologies. Safety evaluation methods have been the subject of decades of development and use. Further, safety design and analysis is fairly broadly understood, as well as being the subject of federally mandated procedures and requirements. It is therefore extremely instructive to compare and contrast the proposed new PR evaluation methodology with that used in safety analysis. By so doing, instructive and useful conclusions can be derived from the comparison that will help to strengthen the PR methodological approach as it is developed further. From the comparison made in this paper it is evident that there are very strong parallels between the two processes. Most importantly, it is clear that the proliferation resistance aspects of nuclear energy systems are best considered beginning at the very outset of the design process. Only in this way can the designer identify and cost-effectively incorporate intrinsic features that might be difficult to implement at some later stage. Also, just like safety, the process to implement proliferation resistance should be a dynamic

  7. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    USGS Publications Warehouse

    Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.

    2011-01-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  8. A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS

    NASA Astrophysics Data System (ADS)

    Frigeri, Alessandro; Hare, Trent; Neteler, Markus; Coradini, Angioletta; Federico, Costanzo; Orosei, Roberto

    2011-09-01

    Since the beginning of planetary exploration, mapping has been fundamental to summarize observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features from the planetary surfaces by means of processing. Interpretative mapping makes use of instrumental observations to produce thematic maps that summarize observations of actual data into a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The advancements in technology of the last 30 years have allowed us to develop specialized systems where the mapping process can be made entirely in the digital domain. The spread of networked computers on a global scale allowed the rapid propagation of software and digital data such that every researcher can now access digital mapping facilities on his desktop. The efforts to maintain planetary missions data accessible to the scientific community have led to the creation of standardized digital archives that facilitate the access to different datasets by software capable of processing these data from the raw level to the map projected one. Geographic Information Systems (GIS) have been developed to optimize the storage, the analysis, and the retrieval of spatially referenced Earth based environmental geodata; since the last decade these computer programs have become popular among the planetary science community, and recent mission data start to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, their development history, their policy of distribution and their support system. The first, the Integrated Software for Imagers and Spectrometers (ISIS) developed by the United States Geological Survey

  9. Process development for high speed superconductor microelectronics for digital and mixed signal applications

    NASA Astrophysics Data System (ADS)

    Yohannes, Daniel T.

    After half a century of enormous successes and complete dominance, semiconductor electronics based on silicon Metal-Oxide-Semiconductor Field Effect Transistors (MOSFETs) is fast approaching its limits for high-end applications in telecommunications, computing and routing. Digital superconductor electronics (SCE) based on Rapid Single Flux Quantum (RSFQ) logic is considered a viable low-risk alternative to Si CMOS circuits, due to its potential for ultra-high operating frequency and ultra-low power dissipation. The most developed and reliable superconductor electronics fabrication technology is based on externally shunted Nb/Al/AlOx/Nb Josephson tunnel junctions (JJs). The technology level is characterized by the Nb/Al/AlOx/Nb trilayer critical current density, jc, and the minimum junction size, a. The maximum clock frequency of RSFQ-based SCE circuits scales as the square root of jc and inversely with a. The main goals of the thesis work are: first, to research the physical limitations of the existing methods of making SCE and the restrictions on circuit complexity and speed; second, to develop a reliable and scalable SCE fabrication process capable of making high-speed complex circuits for digital and mixed signal applications; and third, to implement the results at a commercial SCE foundry at HYPRES Inc. To this end, an advanced fabrication process with 4.5 kA/cm2 critical current density JJs has been developed. The process is based on enhanced lithography and thin film processes and incorporates an additional anodization step for JJ protection. A simple approach for scaling existing circuit designs to newer, higher-jc processes has been proposed and implemented. A great number of complex digital circuits (> 10^4 JJs) operating at clock frequencies in excess of 30 GHz have been fabricated for the first time, as well as less complex (about 500 JJs) circuits operating above 40 GHz and simple circuits with about 20 JJs operating to about

  10. Digital seismo-acoustic signal processing aboard a wireless sensor platform

    NASA Astrophysics Data System (ADS)

    Marcillo, O.; Johnson, J. B.; Lorincz, K.; Werner-Allen, G.; Welsh, M.

    2006-12-01

    We are developing a low-power, low-cost wireless sensor array to conduct real-time signal processing of earthquakes at active volcanoes. The sensor array, which integrates data from both seismic and acoustic sensors, is based on Moteiv TMote Sky wireless sensor nodes (www.moteiv.com). The nodes feature a Texas Instruments MSP430 microcontroller, 48 Kbytes of program memory, 10 Kbytes of static RAM, 1 Mbyte of external flash memory, and a 2.4-GHz Chipcon CC2420 IEEE 802.15.4 radio. The TMote Sky is programmed in TinyOS. Basic signal processing occurs on an array of three peripheral sensor nodes. These nodes are tied into a dedicated GPS receiver node, which is focused on time synchronization, and a central communications node, which handles data integration and additional processing. The sensor nodes incorporate dual 12-bit digitizers sampling a seismic sensor and a pressure transducer at 100 samples per second. The wireless capabilities of the system allow flexible array geometry, with a maximum aperture of 200 m. We have already developed the digital signal processing routines on board the Moteiv TMote sensor nodes. The developed routines accomplish Real-time Seismic-Amplitude Measurement (RSAM), Seismic Spectral-Amplitude Measurement (SSAM), and a user-configured Short Term Averaging / Long Term Averaging (STA/LTA) ratio, which is used to detect first arrivals. The processed data from individual nodes are transmitted back to a central node, where additional processing may be performed. Such processing will include back azimuth determination and other wave field analyses. Future on-board signal processing will focus on event characterization utilizing pattern recognition and spectral characterization. The processed data are intended as low-bandwidth information which can be transmitted periodically and at low cost through satellite telemetry to a web server. The processing is limited by the computational capabilities (RAM, ROM) of the nodes. Nevertheless, we
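
    The STA/LTA trigger mentioned above has a compact reference form; the window lengths and trigger threshold below are assumed illustrative values, not the configuration used on the motes, which must work within the nodes' fixed-point and memory limits.

```python
# Classic short-term / long-term average (STA/LTA) ratio for first-arrival picking,
# computed with causal (trailing) windows over the signal energy. Window lengths and
# the trigger threshold are assumed illustrative values, not the on-mote settings.
import numpy as np

def sta_lta(signal, sta_len, lta_len):
    """Return the causal STA/LTA ratio of the squared signal (same length as input)."""
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    idx = np.arange(energy.size)
    sta_start = np.maximum(idx - sta_len + 1, 0)
    lta_start = np.maximum(idx - lta_len + 1, 0)
    sta = (csum[idx + 1] - csum[sta_start]) / (idx + 1 - sta_start)
    lta = (csum[idx + 1] - csum[lta_start]) / (idx + 1 - lta_start)
    return sta / np.maximum(lta, 1e-12)

fs = 100                                                # samples per second, as on the nodes
x = np.random.default_rng(4).normal(scale=0.1, size=20 * fs)
x[10 * fs:] += np.sin(2 * np.pi * 5.0 * np.arange(10 * fs) / fs)   # synthetic event at t = 10 s
ratio = sta_lta(x, sta_len=1 * fs, lta_len=10 * fs)
print("first sample exceeding the trigger:", int(np.argmax(ratio > 3.0)))
```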

  11. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology

    PubMed Central

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it, by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using the methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  12. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    PubMed

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it, by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using the methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology was found to be useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) played a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  13. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Astrophysics Data System (ADS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    The task of differentiating very small metal aggregates supported on amorphous films from the phase-contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis, even for micrographs where, by bare-eye examination, the distinction between particles and spurious substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions, and it is concluded that these particles consist of a 'magic number' of 13 atoms and have cuboctahedral or icosahedral crystal structure.

  14. Digital Signal Processing by Virtual Instrumentation of a MEMS Magnetic Field Sensor for Biomedical Applications

    PubMed Central

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M.; Manjarrez, Elías; Tapia, Jesús A.; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A.; Herrera-May, Agustín L.

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG). PMID:24196434
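
    The configurable IIR filtering stage described above can be sketched with standard SciPy design routines; the filter family, order, cut-off and sampling rate below are assumptions for illustration, not the virtual instrument's actual settings.

```python
# Illustrative configurable IIR filter stage (not the LabVIEW virtual instrument):
# design a Butterworth low-pass filter and apply it to one channel of samples.
# Sampling rate, order and cut-off frequency are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

def design_iir(fs, cutoff_hz, order=4, btype="lowpass"):
    return butter(order, cutoff_hz, btype=btype, fs=fs)    # returns (b, a) coefficients

def apply_iir(samples, b, a):
    return filtfilt(b, a, samples)                         # zero-phase filtering

fs = 1000.0                                                # Hz, assumed acquisition rate
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 3 * t) + 0.2 * np.sin(2 * np.pi * 120 * t)  # slow signal + interference
b, a = design_iir(fs, cutoff_hz=20.0)
clean = apply_iir(signal, b, a)
print(np.abs(clean - np.sin(2 * np.pi * 3 * t)).max())     # residual after removing the 120 Hz tone
```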

  15. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Technical Reports Server (NTRS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    1989-01-01

    The task of differentiating very small metal aggregates supported on amorphous films from the phase contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis even for micrographs where by bare eye examination the distribution between particles and erroneous substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions and it is concluded that these particles consist of 'a magic number' of 13 atoms and have cubooctahedral or icosahedral crystal structure.

  16. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.

  17. Digital signal processing by virtual instrumentation of a MEMS magnetic field sensor for biomedical applications.

    PubMed

    Juárez-Aguirre, Raúl; Domínguez-Nicolás, Saúl M; Manjarrez, Elías; Tapia, Jesús A; Figueras, Eduard; Vázquez-Leal, Héctor; Aguilera-Cortés, Luz A; Herrera-May, Agustín L

    2013-01-01

    We present a signal processing system with virtual instrumentation of a MEMS sensor to detect magnetic flux density for biomedical applications. This system consists of a magnetic field sensor, electronic components implemented on a printed circuit board (PCB), a data acquisition (DAQ) card, and a virtual instrument. It allows the development of a semi-portable prototype with the capacity to filter small electromagnetic interference signals through digital signal processing. The virtual instrument includes an algorithm to implement different configurations of infinite impulse response (IIR) filters. The PCB contains a precision instrumentation amplifier, a demodulator, a low-pass filter (LPF) and a buffer with operational amplifier. The proposed prototype is used for real-time non-invasive monitoring of magnetic flux density in the thoracic cage of rats. The response of the rat respiratory magnetogram displays a similar behavior as the rat electromyogram (EMG). PMID:24196434

  18. Improvement of FBG peak wavelength demodulation using digital signal processing algorithms

    NASA Astrophysics Data System (ADS)

    Harasim, Damian; Gulbahar, Yussupova

    2015-09-01

    A spectrum reflected or transmitted by a fiber Bragg grating (FBG) in a laboratory environment usually has a smooth shape with a high signal-to-noise ratio, similar to a Gaussian curve. However, in some applications the reflected spectrum can include strong noise, especially where the sensing array contains a large number of FBGs or where a broadband, low-power source is used. This paper presents a possibility for extracting the fiber Bragg grating peak wavelength from spectra with a weak signal-to-noise ratio using the most frequently used digital signal processing algorithms. The accuracy of the function minimum, centroid and Gaussian-fitting methods for peak wavelength detection is compared. The linearity of the processing characteristics of an extended FBG, measured first with a reference high-power source and second with a low-power source, is shown and compared.
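
    The centroid and Gaussian-fitting peak estimators compared in the paper have compact forms; the sketch below applies both to a synthetic noisy FBG reflection spectrum whose shape and noise level are assumed for illustration.

```python
# Centroid and Gaussian-fit estimators of an FBG peak wavelength applied to a
# synthetic noisy reflection spectrum. Spectrum shape and noise are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def centroid_peak(wl, power):
    w = power - power.min()                        # remove baseline before weighting
    return float(np.sum(wl * w) / np.sum(w))

def gaussian(wl, amp, center, sigma, offset):
    return amp * np.exp(-0.5 * ((wl - center) / sigma) ** 2) + offset

def gaussian_peak(wl, power):
    p0 = [power.max() - power.min(), wl[np.argmax(power)], 0.1, power.min()]
    popt, _ = curve_fit(gaussian, wl, power, p0=p0)
    return float(popt[1])

wl = np.linspace(1549.0, 1551.0, 500)              # nm
true_peak = 1550.12
spectrum = gaussian(wl, 1.0, true_peak, 0.15, 0.05)
spectrum += np.random.default_rng(5).normal(scale=0.05, size=wl.size)   # weak-SNR case
print(centroid_peak(wl, spectrum), gaussian_peak(wl, spectrum))
```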

  19. Digital ultrasonics signal processing: Flaw data post processing use and description

    NASA Technical Reports Server (NTRS)

    Buel, V. E.

    1981-01-01

    A modular system composed of two sets of tasks, which interprets the flaw data and allows compensation of the data for transducer characteristics, is described. The hardware configuration consists of two main units. A DEC LSI-11 processor, running under the RT-11 single-job, version 2C-02 operating system, controls the scanner hardware and the ultrasonic unit. A DEC PDP-11/45 processor, also running under the RT-11, version 2C-02, operating system, stores, processes and displays the flaw data. The software developed, the Ultrasonics Evaluation System, is divided into two categories: transducer characterization and flaw classification. Each category is divided further into two functional tasks: a data acquisition task and a postprocessing task. The flaw characterization task collects data, compresses it, and writes it to a disk file. The data are then processed by the flaw classification postprocessing task. The use and operation of the flaw data postprocessor are described.

  20. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    NASA Astrophysics Data System (ADS)

    Michaľčonok, German; Kalinová, Michaela Horalová; Németh, Martin

    2014-12-01

    The aim of this paper is to present the possibilities of applying data mining techniques to the problem of analyzing structural relationships in a system of stationary random processes. We first introduce the area of random processes, present the process of structural analysis, and select suitable data mining methods applicable to structural analysis. We then propose a methodology for structural analysis in a system of stationary stochastic processes using data mining methods with an active experimental approach, based on this theoretical background.

  1. Phenopix: a R package to process digital images of a vegetation cover

    NASA Astrophysics Data System (ADS)

    Filippa, Gianluca; Cremonese, Edoardo; Migliavacca, Mirco; Galvagno, Marta; Morra di Cella, Umberto; Richardson, Andrew

    2015-04-01

    Plant phenology is a globally recognized indicator of the effects of climate change on the terrestrial biosphere. Accordingly, new tools to automatically track the seasonal development of a vegetation cover are becoming available and increasingly deployed. Among them, near-continuous digital images are being collected in several networks in the US, Europe, Asia and Australia in a range of different ecosystems, including agricultural lands, deciduous and evergreen forests, and grasslands. The growing scientific interest in vegetation image analysis highlights the need for easy-to-use, flexible and standardized processing techniques. In this contribution we illustrate a new open source package called "phenopix", written in the R language, that allows users to process images of a vegetation cover. The main features include: (i) definition of one or more areas of interest on an image and processing of the pixel information within them; (ii) computation of vegetation indexes based on the red, green and blue channels; (iii) fitting of a curve to the seasonal trajectory of the vegetation indexes and extraction of relevant dates (thresholds) on that trajectory; (iv) pixel-by-pixel analysis to extract spatially explicit phenological information. The utilities of the package are illustrated in detail for two subalpine sites, a grassland and a larch stand at about 2000 m in the Italian Western Alps. The phenopix package is a free and easy-to-use tool for processing digital images of a vegetation cover in a standardized, flexible and reproducible way. The software is available for download at the R forge web site (r-forge.r-project.org/projects/phenopix/).
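
    phenopix itself is an R package; purely as a language-agnostic illustration of the kind of index it computes, the sketch below derives the green chromatic coordinate for a rectangular region of interest of an RGB frame in Python. The array shape, ROI bounds and the use of a random stand-in image are assumptions for the example only.

```python
# Illustrative Python analogue of one phenopix-style step: compute the green
# chromatic coordinate (GCC) inside a region of interest. A random array
# stands in for a camera frame; ROI bounds are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
frame = rng.uniform(0, 255, size=(480, 640, 3))    # stand-in for an RGB image
roi = frame[200:400, 300:600, :]                   # assumed area of interest

r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
gcc = g.mean() / (r.mean() + g.mean() + b.mean())  # green chromatic coordinate
print(f"GCC for this frame: {gcc:.3f}")
```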

  2. ADAPT: A knowledge-based synthesis tool for digital signal processing system design

    SciTech Connect

    Cooley, E.S.

    1988-01-01

    A computer aided synthesis tool for expansion, compression, and filtration of digital images is described. ADAPT, the Autonomous Digital Array Programming Tool, uses an extensive design knowledge base to synthesize a digital signal processing (DSP) system. Input to ADAPT can be either a behavioral description in English, or a block level specification via Petri Nets. The output from ADAPT comprises code to implement the DSP system on an array of processors. ADAPT is constructed using C, Prolog, and X Windows on a SUN 3/280 workstation. ADAPT knowledge encompasses DSP component information and the design algorithms and heuristics of a competent DSP designer. The knowledge is used to form queries for design capture, to generate design constraints from the user's responses, and to examine the design constraints. These constraints direct the search for possible DSP components and target architectures. Constraints are also used for partitioning the target systems into less complex subsystems. The subsystems correspond to architectural building blocks of the DSP design. These subsystems inherit design constraints and DSP characteristics from their parent blocks. Thus, a DSP subsystem or parent block, as designed by ADAPT, must meet the user's design constraints. Design solutions are sought by searching the Components section of the design knowledge base. Component behavior which matches or is similar to that required by the DSP subsystems is sought. Each match, which corresponds to a design alternative, is evaluated in terms of its behavior. When a design is sufficiently close to the behavior required by the user, detailed mathematical simulations may be performed to accurately determine exact behavior.

  3. Electrocoagulation and nanofiltration integrated process application in purification of bilge water using response surface methodology.

    PubMed

    Akarsu, Ceyhun; Ozay, Yasin; Dizge, Nadir; Elif Gulsen, H; Ates, Hasan; Gozmen, Belgin; Turabik, Meral

    2016-01-01

    Marine pollution has been considered an increasing problem because of the day-by-day growth in sea transportation. Therefore, a large volume of bilge water, which contains petroleum, oil and hydrocarbons in high concentrations, is generated by all types of ships. In this study, treatment of bilge water by an integrated electrocoagulation/electroflotation and nanofiltration process is investigated as a function of voltage, time, and initial pH, with aluminum electrodes as both anode and cathode. Moreover, a commercial NF270 flat-sheet membrane was also used for further purification. A Box-Behnken design combined with response surface methodology was used to study the response pattern and determine the optimum conditions for maximum chemical oxygen demand (COD) removal and minimum metal ion content of the bilge water. Three independent variables, namely voltage (5-15 V), initial pH (4.5-8.0) and time (30-90 min), were transformed to coded values. The COD removal percent, UV absorbance at 254 nm, pH value (after treatment), and concentration of metal ions (Ti, As, Cu, Cr, Zn, Sr, Mo) were obtained as responses. Analysis of variance results showed that all the models were significant except for Zn (P > 0.05), because the calculated F values for these models were less than the critical F value for the considered probability (P = 0.05). The obtained R(2) and Radj(2) values signified the correlation between the experimental data and predicted responses: except for the model of Zn concentration after treatment, the high R(2) values showed the goodness of fit of the model. While the increase in the applied voltage showed negative effects, the increases in time and pH showed a positive effect on COD removal efficiency; the most effective linear term was found to be time. A positive sign of the interactive coefficients of the voltage-time and pH-time systems indicated a synergistic effect on COD removal efficiency, whereas the interaction between voltage and pH showed an antagonistic effect.

  4. The DFP 9200 Digital Noise Reducer, A Real-Time High-Resolution Digital Video Processing System For X-Ray Fluoroscopy

    NASA Astrophysics Data System (ADS)

    McMann, Renville H.; Baron, Stanley; Kreinik, Stephen; Epperson, Don; Kruger, Robert A.

    1981-11-01

    A dedicated digital processor is described capable of digitizing a high resolution video signal from a fluoroscopic TV camera into an 810 x 600 matrix in real time. For less demanding applications, a 512 x 512 matrix can be substituted. The sampling clock frequency is 15 Megahertz, giving a Nyquist bandwidth limit of 7.5 MHz. A 7 MHz phase equalized elliptical filter at the input prevents aliasing and the production of false artifacts in the picture. Eleven bit digital processing follows an 8 bit analog to digital converter. Noise reduction is accomplished by a one frame recursive filter in which the filter coefficients are adjusted by a patented motion detector on a pixel by pixel basis to reduce motion smear. The lower perceived noise permits X-ray dose reduction of 2 to 8 times while retaining high quality pictures. A noise reduced spot picture can be frozen by a foot controlled switch, permitting a further reduction of dosage and eliminating the need for a troublesome disc recorder. This noise reduced picture can also be used as a subtraction mask in an optional version of the equipment. A minimum of front panel operator controls, for the best human interface, is achieved by the use of programmed read-only memories to control all functions including noise reduction and frame storage.
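
    The patented motion detector is not described in detail in this record; the sketch below shows only the general idea of a one-frame recursive filter whose blending coefficient is reduced per pixel where the frame difference suggests motion. The coefficient values and threshold are assumptions.

```python
# Sketch of a one-frame recursive noise filter with a per-pixel,
# motion-adaptive blending coefficient (not the patented detector in the
# paper). Coefficients and threshold are illustrative assumptions.
import numpy as np

def recursive_denoise(prev_filtered: np.ndarray, current: np.ndarray,
                      k_static: float = 0.9, k_moving: float = 0.2,
                      motion_threshold: float = 20.0) -> np.ndarray:
    """Blend the current frame with the previous filtered frame.

    Pixels whose temporal difference exceeds the threshold are treated as
    moving and are blended less strongly to limit motion smear.
    """
    prev = prev_filtered.astype(float)
    curr = current.astype(float)
    diff = np.abs(curr - prev)
    k = np.where(diff > motion_threshold, k_moving, k_static)
    return k * prev + (1.0 - k) * curr
```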

  5. Comparison of breast percent density estimation from raw versus processed digital mammograms

    NASA Astrophysics Data System (ADS)

    Li, Diane; Gavenonis, Sara; Conant, Emily; Kontos, Despina

    2011-03-01

    We compared breast percent density (PD%) measures obtained from raw and post-processed digital mammographic (DM) images. Bilateral raw and post-processed medio-lateral oblique (MLO) images from 81 screening studies were retrospectively analyzed. Image acquisition was performed with a GE Healthcare DS full-field DM system. Image post-processing was performed using the PremiumViewTM algorithm (GE Healthcare). Area-based breast PD% was estimated by a radiologist using a semi-automated image thresholding technique (Cumulus, Univ. Toronto). Comparison of breast PD% between raw and post-processed DM images was performed using the Pearson correlation (r), linear regression, and Student's t-test. Intra-reader variability was assessed with a repeat read on the same data-set. Our results show that breast PD% measurements from raw and post-processed DM images have a high correlation (r=0.98, R2=0.95, p<0.001). Paired t-test comparison of breast PD% between the raw and the post-processed images showed a statistically significant difference equal to 1.2% (p = 0.006). Our results suggest that the relatively small magnitude of the absolute difference in PD% between raw and post-processed DM images is unlikely to be clinically significant in breast cancer risk stratification. Therefore, it may be feasible to use post-processed DM images for breast PD% estimation in clinical settings. Since most breast imaging clinics routinely use and store only the post-processed DM images, breast PD% estimation from post-processed data may accelerate the integration of breast density in breast cancer risk assessment models used in clinical practice.
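
    For readers who want to reproduce this kind of paired comparison, the sketch below applies the three tests named in the abstract (Pearson correlation, linear regression, paired t-test) to two paired arrays of PD% values; the synthetic arrays merely stand in for the 81 studies and are not the authors' data.

```python
# Sketch of the paired statistical comparison described above. The two
# arrays are synthetic placeholders, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pd_raw = rng.uniform(5.0, 60.0, size=81)               # placeholder raw PD%
pd_processed = pd_raw + rng.normal(1.2, 2.0, size=81)  # placeholder processed PD%

r, p_corr = stats.pearsonr(pd_raw, pd_processed)
slope, intercept, r_lin, p_lin, stderr = stats.linregress(pd_raw, pd_processed)
t_stat, p_paired = stats.ttest_rel(pd_raw, pd_processed)

print(f"r = {r:.3f}, paired t-test p = {p_paired:.4f}")
```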

  6. A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process

    ERIC Educational Resources Information Center

    Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.

    2010-01-01

    Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…

  7. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    ERIC Educational Resources Information Center

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  8. Camera model and calibration process for high-accuracy digital image metrology of inspection planes

    NASA Astrophysics Data System (ADS)

    Correia, Bento A. B.; Dinis, Joao

    1998-10-01

    High accuracy digital image based metrology must rely on an integrated model of image generation that is able to consider simultaneously the geometry of the camera vs. object positioning and the conversion of the optical image on the sensor into an electronic digital format. In applications of automated visual inspection involving the analysis of approximately plane objects, these models are generally simplified in order to facilitate the process of camera calibration. In this context, the lack of rigor in the determination of the intrinsic parameters in such models is particularly relevant. Aiming at the high accuracy metrology of contours of objects lying on an analysis plane, and involving sub-pixel measurements, this paper presents a three-stage camera model that includes an extrinsic component of perspective distortion and the intrinsic components of radial lens distortion and sensor misalignment. The latter two factors are crucial in applications of machine vision that rely on the use of low cost optical components. A polynomial model for the negative radial lens distortion of wide field of view CCTV lenses is also established.
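
    The paper's polynomial distortion model is not reproduced here; as a hedged sketch of the standard even-order radial model such a calibration typically uses, the function below maps ideal image coordinates to distorted ones. The coefficients and principal point are placeholders, not fitted values.

```python
# Sketch of an even-order polynomial radial distortion model of the kind
# commonly fitted for CCTV lenses. Coefficients k1, k2 and the principal
# point (cx, cy) are placeholders, not values from the paper.
import numpy as np

def apply_radial_distortion(x: np.ndarray, y: np.ndarray,
                            cx: float, cy: float,
                            k1: float, k2: float):
    """Forward model: ideal image coordinates -> distorted coordinates."""
    xu, yu = x - cx, y - cy
    r2 = xu ** 2 + yu ** 2
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2   # k1 < 0 gives barrel distortion
    return cx + xu * scale, cy + yu * scale
```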

  9. A new approach to pre-processing digital image for wavelet-based watermark

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. It is therefore strategic to identify and develop methods and numerical algorithms that are stable, have low computational cost, and allow us to address these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, non-blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to a pre-processing step that resizes the original image to a size suitable for the wavelet transform. The watermark signal is calculated in correlation with the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked image according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the scheme to be resistant to geometric, filtering, and StirMark attacks with a low false alarm rate.
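
    The algorithm in this record embeds a feature-correlated watermark and detects it with a Neyman-Pearson criterion; the fragment below is only a minimal additive DWT-domain embedding sketch using PyWavelets, with the embedding strength, wavelet and random watermark chosen arbitrarily for illustration.

```python
# Minimal additive wavelet-domain embedding sketch (not the paper's full
# scheme). Wavelet, strength and the random watermark are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(42)
image = rng.uniform(0, 255, size=(256, 256))     # stand-in for one color channel

cA, (cH, cV, cD) = pywt.dwt2(image, "haar")      # one-level 2-D DWT
watermark = rng.standard_normal(cH.shape)
alpha = 2.0                                      # embedding strength (assumed)
cH_marked = cH + alpha * watermark               # embed in horizontal details

watermarked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")
```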

  10. Interaction of image noise, spatial resolution, and low contrast fine detail preservation in digital image processing

    NASA Astrophysics Data System (ADS)

    Artmann, Uwe; Wueller, Dietmar

    2009-01-01

    We present a method to improve the validity of noise and resolution measurements on digital cameras. If non-linear adaptive noise reduction is part of the signal processing in the camera, the measurement results for image noise and spatial resolution can be good while the image quality is low, due to the loss of fine details and a watercolor-like appearance of the image. To improve the correlation between objective measurement and subjective image quality we propose to supplement the standard test methods with an additional measurement of the texture preserving capabilities of the camera. The proposed method uses a test target showing white Gaussian noise. The camera under test reproduces this target and the image is analyzed. We propose to use the kurtosis of the derivative of the image as a metric for the texture preservation of the camera. Kurtosis is a statistical measure of the closeness of a distribution to the Gaussian distribution. It can be shown that the distribution of digital values in the derivative of the image of the chart becomes more leptokurtic (higher kurtosis) the more strongly the noise reduction affects the image.
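
    The metric itself can be stated in a few lines; the sketch below computes the kurtosis of a finite-difference derivative of the reproduced noise chart, with a horizontal difference used as the derivative (an assumption, since the record does not fix the direction here).

```python
# Sketch of the proposed texture-preservation metric: kurtosis of the
# derivative of the camera's reproduction of a white-Gaussian-noise chart.
# A horizontal finite difference is used as the derivative (assumption).
import numpy as np
from scipy.stats import kurtosis

def texture_kurtosis(image: np.ndarray) -> float:
    """Higher (more leptokurtic) values indicate stronger texture loss."""
    derivative = np.diff(image.astype(float), axis=1)
    return kurtosis(derivative, axis=None, fisher=True)  # 0 for a Gaussian
```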

  11. Joint digital signal processing of Nyquist-wavelength division multiplexing superchannel with group detection

    NASA Astrophysics Data System (ADS)

    Zhang, Tianyun; Yao, Shuchang; Fu, Songnian; Tang, Ming; Liu, Deming

    2014-12-01

    To relax the limited sampling rate of an analog-to-digital converter (ADC) and to reduce the complexity of conventional fiber-optical superchannel coherent detection, we propose and demonstrate a joint digital signal processing (DSP) technique for a Nyquist-wavelength division multiplexing superchannel with group detection. At the receiver side, three Nyquist-spaced channels carrying 12.5 Gbaud polarization multiplexing-quadrature phase shift keying signals are group detected with a sampling rate per channel of 1.33 times the normal sampling rate. A modified carrier separation technique is then put forward to retrieve the high-frequency interference component of both the designated channel and its adjacent channels, which can subsequently be used to recover the designated channel with new constant modulus algorithm-based joint multi-input multi-output equalizers. The results show that the proposed group detection and joint DSP algorithm can simultaneously improve the transmission performance and reduce the complexity of both the transmitter and receiver, regardless of bandwidth restrictions from the waveshaper, ADC module, and coherent receiver.
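
    The joint MIMO equalizer structure of the paper is not reproduced here; purely to illustrate the constant modulus algorithm it builds on, the sketch below runs a single-channel CMA tap update on a complex sample stream. Tap count, step size and target modulus are assumed values.

```python
# Single-channel constant modulus algorithm (CMA) sketch, illustrating the
# blind update rule underlying the joint equalizers described above.
# Tap count, step size and target modulus are illustrative assumptions.
import numpy as np

def cma_equalize(x: np.ndarray, n_taps: int = 11,
                 mu: float = 1e-3, r2: float = 1.0) -> np.ndarray:
    """Blind FIR equalization driving |y|^2 towards the constant modulus r2."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                       # centre-spike initialization
    out = np.zeros(x.size - n_taps, dtype=complex)
    for n in range(x.size - n_taps):
        xn = x[n:n + n_taps][::-1]             # most recent sample first
        y = np.dot(w, xn)
        err = r2 - np.abs(y) ** 2              # CMA error term
        w += mu * err * y * np.conj(xn)        # stochastic gradient update
        out[n] = y
    return out
```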

  12. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2016-07-01

    Traditional aggregate size measuring methods are time-consuming, taxing, and do not deliver online measurements. A new online detection system for determining aggregate size and shape based on a digital camera with a charge-coupled device, and subsequent digital image processing, have been developed to overcome these problems. The system captures images of aggregates while falling and flat lying. Using these data, the particle size and shape distribution can be obtained in real time. Here, we calibrate this method using standard globules. Our experiments show that the maximum particle size distribution error was only 3 wt%, while the maximum particle shape distribution error was only 2 wt% for data derived from falling aggregates, having good dispersion. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt%, and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. However, they performed well for single-graded aggregates, and did not require a dispersion device. Our system is low-cost and easy to install. It can successfully achieve online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.
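
    As a hedged sketch of the image-processing core of such a system, the function below segments particles in a grayscale frame with OpenCV and reports an approximate size and elongation per particle; the threshold, minimum blob area and mm-per-pixel calibration are assumptions, and the system's calibration against standard globules is not reproduced.

```python
# Sketch of one step of such a system: segment aggregate particles in a
# grayscale frame and estimate per-particle size and elongation with OpenCV.
# Threshold, minimum area and the mm-per-pixel scale are assumptions.
import cv2
import numpy as np

MM_PER_PIXEL = 0.1   # calibration factor (assumed)

def measure_particles(gray: np.ndarray):
    _, binary = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY)
    # OpenCV 4 return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        if cv2.contourArea(c) < 50:             # skip small noise blobs
            continue
        (_, _), (w, h), _ = cv2.minAreaRect(c)  # oriented bounding box
        size_mm = max(w, h) * MM_PER_PIXEL      # major-axis length in mm
        elongation = max(w, h) / max(min(w, h), 1e-6)
        results.append((size_mm, elongation))
    return results
```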

  13. A method for the processing and analysis of digital terrain elevation data. [Shiprock and Gallup Quadrangles, Arizona and New Mexico

    NASA Technical Reports Server (NTRS)

    Junkin, B. G. (Principal Investigator)

    1979-01-01

    A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
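
    Slope and aspect layers of the kind this record describes can be derived directly from a gridded elevation array; the sketch below does so with finite differences, using an assumed cell size and one common aspect convention (degrees clockwise from north).

```python
# Sketch: derive slope and aspect layers from a gridded DEM with finite
# differences. Cell size and the aspect convention are assumptions.
import numpy as np

def slope_aspect(elevation: np.ndarray, cell_size: float = 30.0):
    """Return slope (degrees) and aspect (degrees clockwise from north)."""
    dz_dy, dz_dx = np.gradient(elevation.astype(float), cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect
```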

  14. Digital Archiving and Preservation: Technologies and Processes for a Trusted Repository

    ERIC Educational Resources Information Center

    Jantz, Ronald; Giarlo, Michael

    2006-01-01

    This article examines what is implied by the term "trusted" in the phrase "trusted digital repositories." Digital repositories should be able to preserve electronic materials for periods at least comparable to existing preservation methods. Our collective lack of experience with preserving digital objects and consensus about the reliability of our…

  15. The Application of Six Sigma Methodologies to University Processes: The Use of Student Teams

    ERIC Educational Resources Information Center

    Pryor, Mildred Golden; Alexander, Christine; Taneja, Sonia; Tirumalasetty, Sowmya; Chadalavada, Deepthi

    2012-01-01

    The first student Six Sigma team (activated under a QEP Process Sub-team) evaluated the course and curriculum approval process. The goal was to streamline the process and thereby shorten process cycle time and reduce confusion about how the process works. Members of this team developed flowcharts on how the process is supposed to work (by…

  16. Reengineering the picture archiving and communication system (PACS) process for digital imaging networks PACS.

    PubMed

    Horton, M C; Lewis, T E; Kinsey, T V

    1999-05-01

    Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS into military treatment facilities through a new Department of Defense (DoD) contract vehicle called digital imaging networks (DIN)-PACS. The JITPO reengineered this process incorporating multiple organizations and politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership in PACS. The JITPO continues to provide information and services that assist multiple groups and users in rendering PACS planning and implementation decisions. Local site project managers are involved from the outset and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process saved DoD sites millions by having PACS plans developed within the government and proposed to vendors second, and then having vendors respond specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation will describe the chronology of changes, encountered obstacles, and lessons learned within the reengineering of the PACS process for DIN-PACS. PMID:10342167

  17. Definition of the fundamentals for the automatic generation of digitalization processes with a 3D laser sensor

    NASA Astrophysics Data System (ADS)

    Davillerd, Stephane; Sidot, Benoit; Bernard, Alain; Ris, Gabriel

    1998-12-01

    This paper introduces the first results of research work carried out on the automation of the digitizing process for complex parts using a precision 3D laser sensor. Indeed, most digitization operations are generally still performed manually; as a result, redundancies, gaps or omissions in point acquisition are possible. Moreover, the digitization time of a part, i.e. the time the machine is immobilized, is thus not optimized overall. After introducing the context in which reverse engineering operates, we briefly present non-contact sensors and machines usable to digitize a part. The digitization environment under consideration is also modeled, but in a general way, in order to preserve the system's capacity for upgrading. The machine and sensor actually used are then presented and their integration described. The current digitization process is then detailed, after which a critical analysis from the considered point of view is carried out and some solutions are suggested. The paper concludes with the prospects laid down and the next planned developments.

  18. Comparative positioning of ships on the basis of neural processing of digital images

    NASA Astrophysics Data System (ADS)

    Stateczny, A.

    2003-04-01

    Satellite and radar systems have been the main information sources in marine navigation in recent years. Apart from commonly known anti-collision functions, the marine navigational radar constitutes the basis for a future comparative system of ship positioning. The sonar is an additional source of image information in the system. In this way, the data are derived from observing the surroundings of the ship's total measuring area. The system of comparative navigation is an attractive alternative to satellite navigation due to its autonomy and independence from external appliances. The methods of analytic comparison of digitally recorded images applied so far are based on complex and time-consuming calculation algorithms. A new approach in comparative navigation is the application of artificial neural networks for plotting the ship's position. In the positioning process, previously registered images can be used, together with their positions plotted, for instance, by means of the GPS system or by geodetic methods. The teaching sequence consists of the registered images correlated with their positions; it is prepared beforehand and can last for any length of time. After the process of teaching the network is completed, the dynamically registered images are fed to the network input as they arrive, and a position is interpolated based on the images recognized as closest to the analyzed image. A merit of this method is that the network is taught with real images, along with their disturbances and distortions. The teaching sequence includes images analogous to those that will be used in practice. While the system is operating, the response of the network (plotting the ship's position) is almost immediate. A basic problem of this method is the need for prior registration of numerous real images in various hydrometeorological conditions. The registered images should be subjected to digital processing, in particular to compression. One of the processing methods is

  19. Design criteria for a multiple input land use system. [digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Bryant, N. A.

    1975-01-01

    A design is presented that proposes the use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remote sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set, and that images taken of thematic maps or from remote sensing platforms can be converted to a raster scan. A major advantage of the raster format is that x, y coordinates are implicitly recognized by their position in the scan, and z values can be treated as Boolean layers in a three-dimensional data space. Such a system permits the rapid incorporation of data sets, rapid comparison of data sets, and adaptation to variable scales by resampling the raster scans.
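
    The Boolean-layer idea above translates directly into array logic; the sketch below overlays two placeholder thematic layers registered to the same raster grid. Layer names, extents and the query are invented for illustration.

```python
# Sketch of the raster overlay idea: thematic attributes stored as Boolean
# layers on a common grid can be combined with element-wise logic.
# Layer names, extents and the query are illustrative placeholders.
import numpy as np

rows, cols = 512, 512
urban = np.zeros((rows, cols), dtype=bool)        # placeholder thematic layer
flood_plain = np.zeros((rows, cols), dtype=bool)  # placeholder thematic layer
urban[100:200, 100:200] = True
flood_plain[150:300, 150:300] = True

at_risk = urban & flood_plain    # cells that are urban AND in the flood plain
print("cells flagged:", int(at_risk.sum()))
```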

  20. Digital processing of pulse signal from light-to-frequency converter under dynamic condition

    NASA Astrophysics Data System (ADS)

    Pawlowski, Eligiusz

    2014-08-01

    The frequency of the output signal from a light-to-frequency converter (LFC) is proportional to light intensity. Under dynamic conditions, instantaneous frequency values represent instantaneous values of light intensity. To determine the frequency of the pulse signal precisely in a short time, its successive periods must be measured. However, if the light intensity changes, the time between successive pulses of the LFC output signal changes too, which prevents light measurements from being obtained at regular time intervals. This work presents an algorithm for digital processing of a pulse frequency signal from an LFC to obtain instantaneous values of light intensity at regular time intervals. Appropriate analytical dependences and examples of measurement results are also presented. The measurement circuit was built using a PCI-6602 DAQ card and the National Instruments LabVIEW package.
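
    The published algorithm is not reproduced here; the sketch below shows one straightforward way to turn LFC pulse timestamps into light-intensity samples on a regular time grid, by converting successive periods to instantaneous frequency and interpolating. The frequency-to-intensity factor and the choice of linear interpolation are assumptions.

```python
# Sketch: convert LFC pulse timestamps into intensity samples on a regular
# time grid. The frequency-to-intensity factor is an assumed constant and
# linear interpolation is an illustrative choice, not the paper's method.
import numpy as np

def lfc_to_regular_grid(pulse_times: np.ndarray, dt: float,
                        lux_per_hz: float = 0.01):
    """pulse_times: strictly increasing rising-edge timestamps in seconds."""
    periods = np.diff(pulse_times)                  # successive pulse periods
    inst_freq = 1.0 / periods                       # instantaneous frequency
    mid_times = pulse_times[:-1] + periods / 2.0    # timestamp of each estimate
    grid = np.arange(mid_times[0], mid_times[-1], dt)
    intensity = np.interp(grid, mid_times, inst_freq * lux_per_hz)
    return grid, intensity
```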