#### Sample records for image processing mathematics

1. Mathematical Methods in Medical Image Processing

PubMed Central

Angenent, Sigurd; Pichon, Eric; Tannenbaum, Allen

2013-01-01

In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation. PMID:23645963

2. Image processing and classification of metal materials based on mathematical morphology

Weng, Guirong; Ye, Ping

2008-12-01

Observing and analyzing a metal fracture surface can ascertain the condition of a failed component and yield valuable clues for failure research. A method is described for processing images of metal fracture surfaces: eliminating noise, detecting the edges of the image, and classifying the material from image features obtained with mathematical morphology. The feature descriptors are invariant under rotation, zooming, and translation of the images, so the images can then be classified using an artificial neural network. 72 images of stainless steel, brass and red copper, processed through corrosion fatigue fracture and equalized dimple fracture, are used as training samples and 54 images as test samples. The experimental data demonstrate the feasibility of the method, and the causes of error are analyzed.
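The morphology-based edge detection this record describes can be sketched with generic tools. The morphological gradient (dilation minus erosion) below is a standard operator, not the authors' exact feature pipeline; the synthetic "fracture" image is invented for illustration:

```python
import numpy as np
from scipy import ndimage

def morphological_gradient(img, size=3):
    """Edge strength as dilation minus erosion (the morphological gradient)."""
    dil = ndimage.grey_dilation(img, size=(size, size))
    ero = ndimage.grey_erosion(img, size=(size, size))
    return dil - ero

# Synthetic "fracture surface": a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0

edges = morphological_gradient(img)
print(edges[16, 16], edges[16, 8])   # interior -> 0.0, boundary -> 1.0
```

Features such as these gradient maps would then feed a classifier, as in the paper's neural-network step.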

3. Applying Mathematical Processes (AMP)

ERIC Educational Resources Information Center

Kathotia, Vinay

2011-01-01

This article provides insights into the "Applying Mathematical Processes" resources, developed by the Nuffield Foundation. It features Nuffield AMP activities--and related ones from Bowland Maths--that were designed to support the teaching and assessment of key processes in mathematics--representing a situation mathematically, analysing,…

4. Mathematical modeling and high-speed imaging of technical combustion processes

Wolfrum, Juergen M.

1995-05-01

The high spectral brightness and short pulse duration of tunable high-power excimer lasers allow the 2D and 3D application of techniques such as laser-induced fluorescence (LIF), Mie scattering and Rayleigh scattering for high-speed imaging in industrial applications. The construction of these lasers allows easy transportation and installation to perform measurements at industrial facilities which cannot be moved. In combination with suitable filters and gated image-intensified CCD cameras, techniques are now available to measure multidimensional distributions of temperatures and concentrations. Simultaneous measurements of temperature fields and hydroxyl radical distributions were performed to study the influence of turbulence on large premixed natural gas flames. A combination of temperature and nitric oxide concentration measurements yielded information about the correlations between NO formation and burner design in domestic gas burners. Detailed experimental studies on the carbon dioxide-laser induced ignition of CH3OH/O2 mixtures in a quartz reactor were performed to supply quantitative data for direct comparison with the numerical results of a mathematical model for ignition processes in 2D geometries. Temporally and spatially resolved measurements of flame position and OH concentration are presented for different conditions and compared directly to the computational results. LIF, Rayleigh and Mie scattering were used for measurements of temperature fields, fuel and OH radical distributions in engines. Finally, a novel type of combustion control system for municipal waste incinerators is described, using fast infrared thermography to obtain information about the temperature distribution in the furnace interior. A fast scanner camera operating in the mid-infrared was installed which allows the direct imaging of the fuel bed through the overlying

5. Processing of microCT implant-bone systems images using Fuzzy Mathematical Morphology

Bouchet, A.; Colabella, L.; Omar, S.; Ballarre, J.; Pastore, J.

2016-04-01

The relationship between a metallic implant and the existing bone in a permanent surgical prosthesis is of great importance, since the fixation and osseointegration of the system determine the failure or success of the surgery. Micro computed tomography (microCT) is a technique that helps to visualize the structure of the bone. In this study, microCT is used to analyze implant-bone system images. However, one of the problems in the reconstruction of these images is the effect of iron-based implants: a halo or fluorescence scattering distorts the microCT image and leads to poor 3D reconstructions. In this work we introduce an automatic method, based on the application of Compensatory Fuzzy Mathematical Morphology, for eliminating the effect of AISI 316L iron materials in the implant-bone system, to enable future investigation of the structural and mechanical properties of bone and cancellous materials.
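As an illustration of the general idea only (not the compensatory fuzzy operators used in the paper), a minimal fuzzy erosion with the Łukasiewicz implicator I(a, b) = min(1, 1 - a + b) might look like this; image and structuring-element values are membership degrees in [0, 1]:

```python
import numpy as np

def fuzzy_erosion(img, se):
    """Fuzzy erosion: out(x) = min over the structuring element of I(se, img),
    with the Lukasiewicz implicator I(a, b) = min(1, 1 - a + b)."""
    h, w = img.shape
    k = se.shape[0] // 2
    padded = np.pad(img, k, constant_values=1.0)  # pad with 1 so borders are neutral
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 2*k + 1, j:j + 2*k + 1]
            out[i, j] = np.min(np.minimum(1.0, 1.0 - se + window))
    return out

img = np.zeros((9, 9)); img[2:7, 2:7] = 1.0
se = np.ones((3, 3))   # flat element: fuzzy erosion reduces to a local minimum
print(fuzzy_erosion(img, se)[4, 4], fuzzy_erosion(img, se)[2, 2])
```

With a flat structuring element this reduces to classical grayscale erosion, a useful sanity check on any fuzzy generalization.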

6. Investigating Teachers' Images of Mathematics

ERIC Educational Resources Information Center

2008-01-01

Research suggests that understanding new images of mathematics is very challenging and can contribute to teacher resistance. An explicit exploration of personal views of mathematics may be necessary for pedagogical change. One possible way for exploring these images is through mathematical metaphors. As metaphors focus on similarities, they can be…

7. Semantic Processing of Mathematical Gestures

ERIC Educational Resources Information Center

Lim, Vanessa K.; Wilson, Anna J.; Hamm, Jeff P.; Phillips, Nicola; Iwabuchi, Sarina J.; Corballis, Michael C.; Arzarello, Ferdinando; Thomas, Michael O. J.

2009-01-01

Objective: To examine whether or not university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) similarly to the way they process action gestures and sentences. Semantic processing was indexed by the N400 effect. Results: The N400 effect elicited by words primed with mathematical gestures…

8. Mathematical Methods for Diffusion MRI Processing

PubMed Central

Lenglet, C.; Campbell, J.S.W.; Descoteaux, M.; Haro, G.; Savadjiev, P.; Wassermann, D.; Anwander, A.; Deriche, R.; Pike, G.B.; Sapiro, G.; Siddiqi, K.; Thompson, P.

2009-01-01

In this article, we review recent mathematical models and computational methods for the processing of diffusion Magnetic Resonance Images, including state-of-the-art reconstruction of diffusion models, cerebral white matter connectivity analysis, and segmentation techniques. We focus on Diffusion Tensor Images (DTI) and Q-Ball Images (QBI). PMID:19063977
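The DTI reconstruction mentioned here reduces, in its simplest form, to a log-linear least-squares fit of the tensor D in S = S0 exp(-b g^T D g). The sketch below uses synthetic data and standard linear algebra; it is not taken from the article:

```python
import numpy as np

def fit_tensor(signals, bvecs, bval, S0):
    """Least-squares fit of the diffusion tensor D from S = S0 * exp(-b g^T D g).
    Returns the symmetric 3x3 tensor."""
    g = bvecs
    # Design matrix rows: [gx^2, gy^2, gz^2, 2 gx gy, 2 gx gz, 2 gy gz]
    A = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                         2*g[:, 0]*g[:, 1], 2*g[:, 0]*g[:, 2], 2*g[:, 1]*g[:, 2]])
    y = -np.log(signals / S0) / bval
    d, *_ = np.linalg.lstsq(A, y, rcond=None)
    Dxx, Dyy, Dzz, Dxy, Dxz, Dyz = d
    return np.array([[Dxx, Dxy, Dxz], [Dxy, Dyy, Dyz], [Dxz, Dyz, Dzz]])

# Simulate noiseless signals from a known anisotropic tensor and recover it.
rng = np.random.default_rng(0)
g = rng.normal(size=(30, 3)); g /= np.linalg.norm(g, axis=1, keepdims=True)
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
S = 1000.0 * np.exp(-1000.0 * np.einsum('ij,jk,ik->i', g, D_true, g))
D_fit = fit_tensor(S, g, 1000.0, 1000.0)
print(np.allclose(D_fit, D_true, atol=1e-8))  # → True
```

Q-ball and other high-angular-resolution models reviewed in the article replace this single-tensor assumption.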

9. Images, Anxieties and Attitudes toward Mathematics

ERIC Educational Resources Information Center

Belbase, Shashidhar

2010-01-01

Images, anxieties, and attitudes towards mathematics are of common interest among mathematics teachers, teacher educators and researchers. The main purpose of this literature-review-based paper is to discuss and analyze images, anxieties, and attitudes towards mathematics in order to foster meaningful teaching and learning of mathematics. Images of…

10. Images, Anxieties, and Attitudes toward Mathematics

ERIC Educational Resources Information Center

Belbase, Shashidhar

2013-01-01

The purpose of this paper is to discuss and analyze images, anxieties, and attitudes towards mathematics in order to foster meaningful teaching and learning of mathematics. Images of mathematics seem to be profoundly shaped by epistemological, philosophical, and pedagogical perspectives of one who views mathematics either as priori or a…

11. Exploring Mathematical Definition Construction Processes

ERIC Educational Resources Information Center

Ouvrier-Buffet, Cecile

2006-01-01

The definition of "definition" cannot be taken for granted. The problem has been treated from various angles in different journals. Among other questions raised on the subject we find: the notions of "concept definition" and "concept image", conceptions of mathematical definitions, redefinitions, and from a more axiomatic point of view, how to…

12. Introduction to computer image processing

NASA Technical Reports Server (NTRS)

Moik, J. G.

1973-01-01

Theoretical backgrounds and digital techniques for a class of image processing problems are presented. Image formation in the context of linear system theory, image evaluation, noise characteristics, and mathematical operations on images and their implementation are discussed. Various techniques for image restoration and image enhancement are presented. Methods for object extraction and the problem of pictorial pattern recognition and classification are discussed.
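Image restoration under the linear-system view described above is often illustrated with a regularized frequency-domain inverse. This Wiener-style sketch, with a made-up box blur, is illustrative only:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-9):
    """Frequency-domain Wiener-style filter: H* / (|H|^2 + k), a regularized inverse."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H)**2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

# Blur a test image with a small box PSF (circular convolution), then restore it.
img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))

restored = wiener_deconvolve(blurred, psf)
print(np.abs(restored - img).max() < 1e-3)
```

The constant `k` stands in for the noise-to-signal ratio; with real noisy data it must be chosen much larger.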

13. Image Processing

NASA Technical Reports Server (NTRS)

1993-01-01

Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Center for use on the space shuttle orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground-based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, the company could not be located, so contact/product information is no longer valid.

14. Image Processing for Teaching.

ERIC Educational Resources Information Center

Greenberg, R.; And Others

1993-01-01

The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…

15. Image processing and reconstruction

SciTech Connect

Chartrand, Rick

2012-06-15

This talk will examine some mathematical methods for image processing and the solution of underdetermined, linear inverse problems. The talk will have a tutorial flavor, mostly accessible to undergraduates, while still presenting research results. The primary approach is the use of optimization problems. We will find that relaxing the usual assumption of convexity will give us much better results.
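As a convex baseline for the underdetermined inverse problems the talk addresses, iterative soft-thresholding (ISTA) solves the l1-regularized least-squares problem. The nonconvex l_p penalties (p < 1) the abstract alludes to would replace the shrinkage step; they are not shown here, and the problem sizes are invented:

```python
import numpy as np

def ista(A, b, lam=0.01, iters=2000):
    """Iterative soft-thresholding for min (1/2)||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2)**2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - A.T @ (A @ x - b) / L    # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

# Underdetermined system: 40 measurements of a 100-dim, 4-sparse signal.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100)) / np.sqrt(40)
x_true = np.zeros(100); x_true[[5, 30, 60, 90]] = [1.0, -1.0, 2.0, 1.5]
b = A @ x_true
x_hat = ista(A, b)
print(np.abs(x_hat - x_true).max() < 0.1)
```

The point of the talk's nonconvex relaxation is that an l_p shrinkage with p < 1 typically recovers sparser solutions from fewer measurements than this l1 baseline.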

16. Workbook, Basic Mathematics and Wastewater Processing Calculations.

ERIC Educational Resources Information Center

New York State Dept. of Environmental Conservation, Albany.

This workbook serves as a self-learning guide to basic mathematics and treatment plant calculations and also as a reference and source book for the mathematics of sewage treatment and processing. In addition to basic mathematics, the workbook discusses processing and process control, laboratory calculations and efficiency calculations necessary in…

17. Mathematical Modeling: A Structured Process

ERIC Educational Resources Information Center

Anhalt, Cynthia Oropesa; Cortez, Ricardo

2015-01-01

Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

18. Mathematical Approaches to the Composing Process.

ERIC Educational Resources Information Center

Hall, Dennis R.

Rhetoric and mathematics have much in common that can help explain the composing process. Common elements of rhetoric and mathematics important to the teaching of writing are (1) relationships between syntax and semantics, (2) practices of representation, and (3) focus on problem solving. Recent emphasis on "repair processes" in mathematics is…

ERIC Educational Resources Information Center

Arikan, Elif Esra; Unal, Hasan

2015-01-01

The aim of this study is to investigate the metaphorical images that gifted students hold of mathematics. The sample of the study consists of 82 gifted students, 2nd through 7th graders, from Istanbul. Data were collected by asking students to complete the sentence: "Mathematics is as …, because…". In the study content analysis was…

20. Mathematization Competencies of Pre-Service Elementary Mathematics Teachers in the Mathematical Modelling Process

ERIC Educational Resources Information Center

Yilmaz, Suha; Tekin-Dede, Ayse

2016-01-01

Mathematization competency is considered in the field as the focus of modelling process. Considering the various definitions, the components of the mathematization competency are determined as identifying assumptions, identifying variables based on the assumptions and constructing mathematical model/s based on the relations among identified…

1. Self and Peer Assessment of Mathematical Processes

ERIC Educational Resources Information Center

Onion, Alice; Javaheri, Elnaz

2011-01-01

This article explores using Bowland assessment tasks and Nuffield Applying Mathematical Processes (AMP) activities as part of a scheme of work. The Bowland tasks and Nuffield AMP activities are designed to develop students' mathematical thinking; they are focused on key processes. Unfamiliar demands are made on the students and they are challenged…

2. Iterative Processes in Mathematics Education

ERIC Educational Resources Information Center

Mudaly, Vimolan

2009-01-01

There are many arguments that reflect on inductive versus deductive methods in mathematics. Claims are often made that teaching from the general to the specific does make understanding better for learners or vice versa. I discuss an intervention conducted with Grade 10 (15-year-old) learners in a small suburb in South Africa. I reflect on the…

3. Mathematics from Still and Moving Images

ERIC Educational Resources Information Center

Pierce, Robyn; Stacey, Kaye; Ball, Lynda

2005-01-01

Digital photos and digital movies offer an excellent way of bringing real world situations into the mathematics classroom. The technologies surveyed here are feasible for everyday classroom use and inexpensive. Examples are drawn from the teaching of Cartesian coordinates, linear functions, ratio and Pythagoras' theorem using still images, and…

4. An Emergent Framework: Views of Mathematical Processes

ERIC Educational Resources Information Center

Sanchez, Wendy B.; Lischka, Alyson E.; Edenfield, Kelly W.; Gammill, Rebecca

2015-01-01

The findings reported in this paper were generated from a case study of teacher leaders at a state-level mathematics conference. Investigation focused on how participants viewed the mathematical processes of communication, connections, representations, problem solving, and reasoning and proof. Purposeful sampling was employed to select nine…

5. Mathematical and physical modelling of materials processing

NASA Technical Reports Server (NTRS)

1982-01-01

Mathematical and physical modeling of turbulence phenomena in metals processing, electromagnetically driven flows in materials processing, gas-solid reactions, rapid solidification processes, the electroslag casting process, the role of cathodic depolarizers in the corrosion of aluminum in sea water, and predicting viscoelastic flows are described.

6. Image Processing

NASA Technical Reports Server (NTRS)

1987-01-01

A new spinoff product was derived from Geospectra Corporation's expertise in processing LANDSAT data in a software package. Called ATOM (for Automatic Topographic Mapping), it's capable of digitally extracting elevation information from stereo photos taken by spaceborne cameras. ATOM offers a new dimension of realism in applications involving terrain simulations, producing extremely precise maps of an area's elevations at a lower cost than traditional methods. ATOM has a number of applications involving defense training simulations and offers utility in architecture, urban planning, forestry, petroleum and mineral exploration.

7. Digital image processing.

PubMed

Seeram, Euclid

2004-01-01

Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists. PMID:15352557

8. Filler segmentation of SEM paper images based on mathematical morphology.

PubMed

Ait Kbir, M; Benslimane, Rachid; Princi, Elisabetta; Vicini, Silvia; Pedemonte, Enrico

2007-07-01

Recent developments in microscopy and image processing have made digital measurements on high-resolution images of fibrous materials possible. This helps to gain a better understanding of the structure and other properties of the material at the micro level. In this paper, SEM image segmentation based on mathematical morphology is proposed. In fact, the paper model images (Whatman, Murillo, Watercolor, Newsprint paper) selected in the context of the Euro Mediterranean PaperTech Project have different distributions of fibers and fillers, caused by the presence of SiAl and CaCO3 particles. It is a microscopy challenge to make filler particles in the sheet distinguishable from the other components of the paper surface. This objective is reached here by using suitable structural elements and mathematical morphology operators. PMID:17867540

9. A Mathematical Analysis of a Biology Process

Juratoni, A.; Bundǎu, O.; Chevereşan, A.

2010-09-01

We present a mathematical model of tumor growth with an immune response. We analyze the problem of maximizing the effects of immunotherapy while minimizing the number of tumor cells and the cost of the control, which is given by the medical treatment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon. We give an existence result and prove the necessary conditions for the optimal control problem.

10. Lensless ghost imaging based on mathematical simulation and experimental simulation

Liu, Yanyan; Wang, Biyi; Zhao, Yingchao; Dong, Junzhang

2014-02-01

The differences between conventional imaging and correlated imaging are discussed in this paper. The mathematical model of a lensless ghost imaging system is set up, and the image of a double slit is computed by mathematical simulation. The results are then verified experimentally. Both the simulation and the experimental verification show that the mathematical model, based on statistical optics principles, is consistent with real experimental results.
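The correlation reconstruction behind ghost imaging can be simulated in a few lines: the object is sensed only through a per-pattern "bucket" sum, and recovered from the covariance of those sums with the illumination patterns. This is a simplified computational sketch, not the authors' optical setup:

```python
import numpy as np

# Double-slit object, never imaged directly.
rng = np.random.default_rng(42)
obj = np.zeros((16, 16)); obj[:, 4] = 1.0; obj[:, 11] = 1.0

n_patterns = 20000
patterns = rng.random((n_patterns, 16, 16))                   # random illumination
bucket = np.tensordot(patterns, obj, axes=([1, 2], [0, 1]))   # one number per shot

# Ensemble correlation <(B - <B>) * I(x, y)> reconstructs the object.
ghost = np.tensordot(bucket - bucket.mean(), patterns, axes=(0, 0)) / n_patterns
ghost -= ghost.min(); ghost /= ghost.max()                    # normalize to [0, 1]
print(ghost[:, 4].mean() > ghost[:, 5].mean())                # slit brighter than gap
```

Because the patterns are statistically independent pixel to pixel, the covariance acts as a delta function and the correlation is proportional to the object transmittance plus noise that shrinks with the number of shots.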

11. Mathematics of Information Processing and the Internet

ERIC Educational Resources Information Center

Hart, Eric W.

2010-01-01

The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

12. Image-Processing Educator

NASA Technical Reports Server (NTRS)

Gunther, F. J.

1986-01-01

Apple Image-Processing Educator (AIPE) explores ability of microcomputers to provide personalized computer-assisted instruction (CAI) in digital image processing of remotely sensed images. AIPE is "proof-of-concept" system, not polished production system. User-friendly prompts provide access to explanations of common features of digital image processing and of sample programs that implement these features.

13. Multispectral imaging and image processing

Klein, Julie

2014-02-01

The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given over several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.
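Reconstructing a spectrum from a handful of broadband filter measurements, as described above, is an ill-posed linear inversion. A minimal Tikhonov-regularized sketch, with invented Gaussian filter curves rather than the institute's measured ones:

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)          # nm, 10 nm steps
centers = np.linspace(420, 680, 7)               # 7 hypothetical bandpass filters
F = np.exp(-0.5 * ((wavelengths[None, :] - centers[:, None]) / 30.0)**2)

spectrum = np.exp(-0.5 * ((wavelengths - 550.0) / 60.0)**2)   # "true" reflectance
m = F @ spectrum                                               # camera measurements

# Tikhonov-regularized inverse: (F^T F + a I)^{-1} F^T m
a = 1e-3
recon = np.linalg.solve(F.T @ F + a * np.eye(31), F.T @ m)
print(abs(wavelengths[np.argmax(recon)] - 550.0) <= 20.0)      # peak near 550 nm
```

Real multispectral pipelines add noise models and physically motivated smoothness priors, but the structure of the inversion is the same.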

14. Design of smart imagers with image processing

Serova, Evgeniya N.; Shiryaev, Yury A.; Udovichenko, Anton O.

2005-06-01

This paper is devoted to the creation of novel CMOS APS imagers with focal-plane parallel image preprocessing for smart technical vision and electro-optical systems based on neural implementation. Using an analysis of the main features of biological vision, the desired characteristics of artificial vision are defined, and the image processing tasks that can be implemented by smart focal-plane preprocessing CMOS imagers with neural networks are determined. The results are important for medicine, aerospace and ecological monitoring, and for ways of implementing neural networks in CMOS APS imagers. To reduce real image preprocessing time, special methods based on edge detection and neighboring-frame subtraction are considered and simulated. To select optimal methods and mathematical operators for edge detection, various medical, technical and aerospace images are tested. An important research direction is the analogue implementation of the main preprocessing operations (addition, subtraction, neighboring-frame subtraction, modulus, and edge detection of pixel signals) in the focal plane of CMOS APS imagers. We present the following results: an algorithm of edge detection suitable for analog realization, and patented focal-plane circuits for analog image preprocessing (edge detection and motion detection).

15. Processes and priorities in planning mathematics teaching

Sullivan, Peter; Clarke, David J.; Clarke, Doug M.; Farrell, Lesley; Gerrard, Jessica

2013-12-01

Insights into teachers' planning of mathematics reported here were gathered as part of a broader project examining aspects of the implementation of the Australian curriculum in mathematics (and English). In particular, the responses of primary and secondary teachers to a survey of various aspects of decisions that inform their use of curriculum documents and assessment processes to plan their teaching are discussed. Teachers appear to have a clear idea of the overall topic as the focus of their planning, but they are less clear when asked to articulate the important ideas in that topic. While there is considerable diversity in the processes that teachers use for planning and in the ways that assessment information informs that planning, a consistent theme was that teachers make active decisions at all stages in the planning process. Teachers use a variety of assessment data in various ways, but these are not typically data extracted from external assessments. This research has important implications for those responsible for supporting teachers in the transition to the Australian Curriculum: Mathematics.

16. The (Mathematical) Modeling Process in Biosciences

PubMed Central

Torres, Nestor V.; Santos, Guido

2015-01-01

In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

17. Pre-Service Mathematics Teachers' Concept Images of Radian

ERIC Educational Resources Information Center

Akkoc, Hatice

2008-01-01

This study investigates pre-service mathematics teachers' concept images of radian and possible sources of such images. A multiple-case study was conducted for this study. Forty-two pre-service mathematics teachers completed a questionnaire, which aims to assess their understanding of radian. Six of them were selected for individual interviews on…

18. Hyperspectral image processing

Technology Transfer Automated Retrieval System (TEKTRAN)

Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

19. Hyperspectral image processing methods

Technology Transfer Automated Retrieval System (TEKTRAN)

Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

20. Mathematical morphology for TOFD image analysis and automatic crack detection.

PubMed

Merazi-Meksen, Thouraya; Boudraa, Malika; Boudraa, Bachir

2014-08-01

The aim of this work is to automate the interpretation of ultrasonic images during the non-destructive testing (NDT) technique called time-of-flight diffraction (TOFD) to aid in decision making. In this paper, the mathematical morphology approach is used to extract relevant pixels corresponding to the presence of a discontinuity, and a pattern recognition technique is used to characterize the discontinuity. The watershed technique is exploited to determine the region of interest, and the image background is removed using an erosion process, thereby improving the detection of connected shapes present in the image. Remaining shapes are finally reduced to curves using a skeletonization technique. In the case of crack defects, the curve formed by such pixels has a parabolic form that can be automatically detected using the randomized Hough transform. PMID:24709071
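The pipeline described here (background suppression by morphology, then detecting a parabola in the surviving pixels) can be caricatured with generic tools. In this sketch an ordinary least-squares parabola fit stands in for the randomized Hough transform, and the "crack" image is synthetic:

```python
import numpy as np
from scipy import ndimage

# Synthetic TOFD-like image: a thickened parabolic trace plus an isolated noise pixel.
img = np.zeros((40, 60), dtype=bool)
xs = np.arange(10, 50)
ys = (0.03 * (xs - 30)**2 + 5).astype(int)       # parabolic "crack" signature
img[ys, xs] = True
img = ndimage.binary_dilation(img)               # thicken the trace slightly
img[25, 55] = True                               # isolated noise pixel

# Morphological opening (erosion then dilation) removes small isolated shapes.
opened = ndimage.binary_opening(img, structure=np.ones((2, 2)))
ys_d, xs_d = np.nonzero(opened)

# Fit y = a*x^2 + b*x + c to the remaining pixels.
A = np.column_stack([xs_d**2, xs_d, np.ones_like(xs_d)])
coef, *_ = np.linalg.lstsq(A.astype(float), ys_d.astype(float), rcond=None)
print(coef[0] > 0)                               # curvature recovered as positive
```

A Hough-style vote over (a, b, c) would be more robust to outliers than this single least-squares fit, which is exactly why the paper uses it.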

1. Hybrid image processing

NASA Technical Reports Server (NTRS)

Juday, Richard D.

1990-01-01

Partly-digital, partly-optical 'hybrid' image processing attempts to use the properties of each domain to synergistic advantage: while Fourier optics furnishes speed, digital processing allows the use of much greater algorithmic complexity. The video-rate image-coordinate transformation used is a critical technology for real-time hybrid image-pattern recognition. Attention is given to the separation of pose variables, image registration, and both single- and multiple-frame registration.

2. Subroutines For Image Processing

NASA Technical Reports Server (NTRS)

Faulcon, Nettie D.; Monteith, James H.; Miller, Keith W.

1988-01-01

Image Processing Library computer program, IPLIB, is collection of subroutines facilitating use of COMTAL image-processing system driven by HP 1000 computer. Functions include addition or subtraction of two images with or without scaling, display of color or monochrome images, digitization of image from television camera, display of test pattern, manipulation of bits, and clearing of screen. Provides capability to read or write points, lines, and pixels from image; read or write at location of cursor; and read or write array of integers into COMTAL memory. Written in FORTRAN 77.

3. A description of a system of programs for mathematically processing on unified series (YeS) computers photographic images of the Earth taken from spacecraft

NASA Technical Reports Server (NTRS)

Zolotukhin, V. G.; Kolosov, B. I.; Usikov, D. A.; Borisenko, V. I.; Mosin, S. T.; Gorokhov, V. N.

1980-01-01

A description is presented of a batch of programs for the YeS-1040 computer, combined into an automated system for processing photographic (and video) images of the Earth's surface taken from spacecraft. Individual programs are presented, with detailed discussion of the algorithmic and programming facilities needed by the user. The basic principles for assembling the system, and the control programs, are included. The exchange format, within whose framework any programs recommended for the processing system will be catalogued in the future, is also described.

4. Differential morphology and image processing.

PubMed

Maragos, P

1996-01-01

Image processing via mathematical morphology has traditionally used geometry to intuitively understand morphological signal operators and set or lattice algebra to analyze them in the space domain. We provide a unified view and analytic tools for morphological image processing that is based on ideas from differential calculus and dynamical systems. This includes ideas on using partial differential or difference equations (PDEs) to model distance propagation or nonlinear multiscale processes in images. We briefly review some nonlinear difference equations that implement discrete distance transforms and relate them to numerical solutions of the eikonal equation of optics. We also review some nonlinear PDEs that model the evolution of multiscale morphological operators and use morphological derivatives. Among the new ideas presented, we develop some general 2-D max/min-sum difference equations that model the space dynamics of 2-D morphological systems (including the distance computations) and some nonlinear signal transforms, called slope transforms, that can analyze these systems in a transform domain in ways conceptually similar to the application of Fourier transforms to linear systems. Thus, distance transforms are shown to be bandpass slope filters. We view the analysis of the multiscale morphological PDEs and of the eikonal PDE solved via weighted distance transforms as a unified area in nonlinear image processing, which we call differential morphology, and briefly discuss its potential applications to image processing and computer vision. PMID:18285181
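The claim above that distance transforms arise from min-sum difference equations is easy to make concrete: the classical two-pass chamfer algorithm propagates distances exactly as described. A city-block sketch:

```python
import numpy as np

def chamfer_distance(binary):
    """City-block distance to the nearest foreground pixel, computed by two
    sequential min-sum passes (a discrete difference-equation form of
    distance propagation)."""
    INF = 10**6
    d = np.where(binary, 0, INF).astype(np.int64)
    h, w = d.shape
    for i in range(h):                       # forward pass: top-left to bottom-right
        for j in range(w):
            if i > 0: d[i, j] = min(d[i, j], d[i-1, j] + 1)
            if j > 0: d[i, j] = min(d[i, j], d[i, j-1] + 1)
    for i in range(h - 1, -1, -1):           # backward pass: bottom-right to top-left
        for j in range(w - 1, -1, -1):
            if i < h - 1: d[i, j] = min(d[i, j], d[i+1, j] + 1)
            if j < w - 1: d[i, j] = min(d[i, j], d[i, j+1] + 1)
    return d

grid = np.zeros((7, 7), dtype=bool); grid[3, 3] = True
print(chamfer_distance(grid)[0, 0])   # → 6 (city-block distance to the center)
```

In the paper's terms, each pass is a min-sum difference equation, and chamfer masks with weighted diagonal moves approximate the Euclidean (eikonal) case.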

5. Developing Mathematical Processes (DMP): Field Test Evaluation, 1972-73.

ERIC Educational Resources Information Center

Schall, William E.; And Others

Developing Mathematical Processes (DMP) is a research-based, innovative, process-oriented elementary mathematics program that was developed at the Research and Development Center for Cognitive Learning, University of Wisconsin-Madison. The program utilizes an activities approach to mathematics. Emphasis is on manipulative materials and sequencing…

6. Mathematical Analysis and Optimization of Infiltration Processes

NASA Technical Reports Server (NTRS)

Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

1997-01-01

A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

7. Collective Mathematical Understanding as an Improvisational Process

ERIC Educational Resources Information Center

Martin, Lyndon C.; Towers, Jo

2003-01-01

This paper explores the phenomenon of mathematical understanding, and offers a response to the question raised by Martin (2001) at PME-NA about the possibility for and nature of collective mathematical understanding. In referring to collective mathematical understanding we point to the kinds of learning and understanding we may see occurring when…

8. Mathematical modeling of biomass fuels formation process

SciTech Connect

2008-07-01

The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, numerous technical processes produce a huge mass of wastes. A segregated and converted combustible fraction of the wastes, with a relatively high calorific value, may be used as a component of formed fuels. Utilizing formed-fuel components from segregated groups of waste in associated co-combustion with conventional fuels yields significant savings from the partial replacement of fossil fuels and reduces environmental pollution directly by limiting the migration of waste into the environment (soil, atmospheric air, surface and underground water). The realization of technological processes utilizing formed fuel in associated thermal systems should be qualified by technical criteria, meaning that elementary processes, as well as factors of sustainable development from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures that uniquely identify a real process, together with algorithms, based on linear programming, that convert these data. The paper also presents the optimization of the parameters of the fuel-forming process using a modified simplex algorithm with polynomial running time. The model serves as a reference point in the numerical modeling of real processes, allowing precise determination of the optimal elementary composition of formed-fuel components under the assumed constraints and decision variables of the task.

9. Image Processing Software

NASA Technical Reports Server (NTRS)

1992-01-01

To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

10. Apple Image Processing Educator

NASA Technical Reports Server (NTRS)

Gunther, F. J.

1981-01-01

A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

11. Image processing mini manual

NASA Technical Reports Server (NTRS)

Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

1992-01-01

The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

12. How Digital Image Processing Became Really Easy

Cannon, Michael

1988-02-01

In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of, or analyzing the contents of, images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic and the rapid growth in the number of commercial companies marketing digital image processing software and hardware.

13. Image Processing System

NASA Technical Reports Server (NTRS)

1986-01-01

Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.

ERIC Educational Resources Information Center

Engelbrecht, Johann

2010-01-01

The transition process to advanced mathematical thinking is experienced as traumatic by many students. The experiences that students had of school mathematics differ greatly from what is expected of them at university. Success in school mathematics meant applying different methods to get an answer. Students are not familiar with logical…

15. Visual color image processing

Qiu, Guoping; Schaefer, Gerald

1999-12-01

In this paper, we propose a color image processing method that combines modern signal processing techniques with knowledge of the properties of the human color vision system. Color signals are processed differently according to their visual importance. The emphasis of the technique is on preserving the total visual quality of the image while simultaneously taking computational efficiency into account. A specific color image enhancement technique, termed Hybrid Vector Median Filtering, is presented. Computer simulations have been performed to demonstrate that the new approach is technically sound, with results comparable to or better than traditional methods.
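The building block behind hybrid vector median filtering is the classical vector median filter, which treats each color pixel as a vector and picks the window pixel minimizing the total distance to all others. The sketch below shows that classical operation only, under our own naming; it is not the paper's hybrid variant.

```python
def vector_median(pixels):
    """Return the pixel (an RGB tuple) from the filter window that
    minimizes the sum of Euclidean distances to all other window pixels.
    Unlike channel-wise median filtering, this never invents new colors."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(pixels, key=lambda p: sum(dist(p, q) for q in pixels))
```

Applied over a sliding window, this suppresses impulsive color noise (an outlier like a saturated pixel can never be selected as the median of an otherwise homogeneous window).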

16. Students' Images of Mathematics

ERIC Educational Resources Information Center

Martin, Lee; Gourley-Delaney, Pamela

2014-01-01

Students' judgments about "what counts" as mathematics in and out of school have important consequences for problem solving and transfer, yet our understanding of the source and nature of these judgments remains incomplete. Thirty-five sixth grade students participated in a study focused on what activities students judge as…

17. Mathematical Problem Solving through Sequential Process Analysis

ERIC Educational Resources Information Center

Codina, A.; Cañadas, M. C.; Castro, E.

2015-01-01

Introduction: The macroscopic perspective is one of the frameworks for research on problem solving in mathematics education. Coming from this perspective, our study addresses the stages of thought in mathematical problem solving, offering an innovative approach because we apply sequential relations and global interrelations between the different…

18. Processes and Priorities in Planning Mathematics Teaching

ERIC Educational Resources Information Center

Sullivan, Peter; Clarke, David J.; Clarke, Doug M.; Farrell, Lesley; Gerrard, Jessica

2013-01-01

Insights into teachers' planning of mathematics reported here were gathered as part of a broader project examining aspects of the implementation of the Australian curriculum in mathematics (and English). In particular, the responses of primary and secondary teachers to a survey of various aspects of decisions that inform their use of…

19. The Image of Mathematics Held by Irish Post-Primary Students

ERIC Educational Resources Information Center

Lane, Ciara; Stynes, Martin; O'Donoghue, John

2014-01-01

The image of mathematics held by Irish post-primary students was examined and a model for the image found was constructed. Initially, a definition for "image of mathematics" was adopted with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. Research…

20. Meteorological image processing applications

NASA Technical Reports Server (NTRS)

Bracken, P. A.; Dalton, J. T.; Hasler, A. F.; Adler, R. F.

1979-01-01

Meteorologists at NASA's Goddard Space Flight Center are conducting an extensive program of research in weather and climate related phenomena. This paper focuses on meteorological image processing applications directed toward gaining a detailed understanding of severe weather phenomena. In addition, the paper discusses the ground data handling and image processing systems used at the Goddard Space Flight Center to support severe weather research activities and describes three specific meteorological studies which utilized these facilities.

1. Cognition in Children's Mathematical Processing: Bringing Psychology to the Classroom

ERIC Educational Resources Information Center

Witt, Marcus

2010-01-01

Introduction: The cognitive processes that underpin successful mathematical processing in children have been well researched by experimental psychologists, but are not widely understood among teachers of primary mathematics. This is a shame, as an understanding of these cognitive processes could be highly useful to practitioners. This paper…

2. Methods in Astronomical Image Processing

Jörsäter, S.

Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future

3. Stochastic processes, estimation theory and image enhancement

NASA Technical Reports Server (NTRS)

Assefi, T.

1978-01-01

An introductory account of stochastic processes, estimation theory, and image enhancement is presented. The book is primarily intended for first-year graduate students and practicing engineers and scientists whose work requires an acquaintance with the theory. Fundamental concepts of probability that are required to support the main topics are reviewed, and the appendices discuss the remaining mathematical background.

4. Meaning and Process in Mathematics and Programming.

ERIC Educational Resources Information Center

Grogono, Peter

1989-01-01

Trends in computer programing language design are described and children's difficulties in learning to write programs for mathematics problems are considered. Languages are compared under the headings of imperative programing, functional programing, logic programing, and pictures. (DC)

5. Retrospective Study on Mathematical Modeling Based on Computer Graphic Processing

Zhang, Kai Li

Graphics and image making is an important field of computer application in which visualization software has been widely used because it is convenient and fast. However, modeling designers have found such software limited in function and flexibility because it provides no mathematical modeling platform. Non-visualization graphics software, by contrast, gives graphics and image design a sound mathematical modeling platform. In this paper, a polished pyramid is constructed using a multivariate spline function algorithm, validating that non-visualization software is well suited to mathematical modeling.

6. Onboard image processing

NASA Technical Reports Server (NTRS)

Martin, D. R.; Samulon, A. S.

1979-01-01

The possibility of onboard geometric correction of Thematic Mapper type imagery to make possible image registration is considered. Typically, image registration is performed by processing raw image data on the ground. The geometric distortion (e.g., due to variation in spacecraft location and viewing angle) is estimated by using a Kalman filter updated by correlating the received data with a small reference subimage, which has known location. Onboard image processing dictates minimizing the complexity of the distortion estimation while offering the advantages of a real time environment. In keeping with this, the distortion estimation can be replaced by information obtained from the Global Positioning System and from advanced star trackers. Although not as accurate as the conventional ground control point technique, this approach is capable of achieving subpixel registration. Appropriate attitude commands can be used in conjunction with image processing to achieve exact overlap of image frames. The magnitude of the various distortion contributions, the accuracy with which they can be measured in real time, and approaches to onboard correction are investigated.
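The conventional ground-control-point technique the abstract contrasts against works by correlating the received data with a small reference subimage of known location. A minimal illustrative sketch of that matching step, using exhaustive sum-of-squared-differences search (our own simplification, not the Kalman-filter pipeline described), is:

```python
def locate_subimage(image, template):
    """Find the (row, col) offset in `image` at which the small reference
    `template` best matches, by exhaustive sum-of-squared-differences (SSD)
    search. The estimated offset feeds the geometric-distortion estimate."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_ssd, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos
```

The onboard approach replaces this search with position and attitude data from GPS and star trackers precisely because exhaustive correlation over every candidate offset is expensive in a real-time environment.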

7. Mathematical Knowledge and School Work. A Case Study of the Teaching of Developing Mathematical Processes (DMP).

ERIC Educational Resources Information Center

Stephens, W. M.; Romberg, T. A.

This study examined the aspirations of the Developing Mathematical Processes (DMP) program and sought to ascertain the extent to which it has been implemented in observed classrooms. DMP was intended to reshape conceptions of mathematical knowledge and school work and to create a pedagogy in which children would be active in creating and testing…

8. Image sets for satellite image processing systems

Peterson, Michael R.; Horner, Toby; Temple, Asael

2011-06-01

The development of novel image processing algorithms requires a diverse and relevant set of training images to ensure the general applicability of such algorithms for their required tasks. Images must be appropriately chosen for the algorithm's intended applications. Image processing algorithms often employ the discrete wavelet transform (DWT) algorithm to provide efficient compression and near-perfect reconstruction of image data. Defense applications often require the transmission of images and video across noisy or low-bandwidth channels. Unfortunately, the DWT algorithm's performance deteriorates in the presence of noise. Evolutionary algorithms are often able to train image filters that outperform DWT filters in noisy environments. Here, we present and evaluate two image sets suitable for the training of such filters for satellite and unmanned aerial vehicle imagery applications. We demonstrate the use of the first image set as a training platform for evolutionary algorithms that optimize discrete wavelet transform (DWT)-based image transform filters for satellite image compression. We evaluate the suitability of each image as a training image during optimization. Each image is ranked according to its suitability as a training image and its difficulty as a test image. The second image set provides a test-bed for holdout validation of trained image filters. These images are used to independently verify that trained filters will provide strong performance on unseen satellite images. Collectively, these image sets are suitable for the development of image processing algorithms for satellite and reconnaissance imagery applications.
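The DWT's near-perfect reconstruction property mentioned above is easiest to see in the simplest wavelet, the Haar transform, where one level splits a signal into averages and differences. The sketch below shows that textbook case only (names are our own); the evolved filters discussed in the abstract replace these fixed coefficients with trained ones.

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: split an
    even-length signal into approximation (average) and detail
    (difference) coefficients, halving the length of each band."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse Haar step: perfect reconstruction of the original signal."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out
```

Compression comes from quantizing or discarding small detail coefficients; it is this lossy step whose behavior degrades under channel noise, motivating the evolutionary-filter work.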

9. Your Students' Images of Mathematicians and Mathematics.

ERIC Educational Resources Information Center

Picker, Susan H.; Berry, John S.

2001-01-01

Discusses the subliminal images that students might have of mathematicians. Presents the disparity between boys and girls in envisioning mathematicians of their own sex. Explores implications for pedagogy. (KHR)

10. Image-Processing Program

NASA Technical Reports Server (NTRS)

Roth, D. J.; Hull, D. R.

1994-01-01

IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are sub-subroutines, also selected via keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.

11. Mechanical-mathematical modeling for landslide process

Svalova, V.

2009-04-01

500 m and displacement of a landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a movement of 53 cm. Catastrophic activation of the deep block-glide landslide in the Khoroshevo area of Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new creeping block 220 m long separated from the plateau and began sinking, the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid-nineteenth century. The Khoroshevo sliding area had been stable for a long time without manifestations of activity. Revealing the causes of the deformation and developing means of protection against deep landslide motion is an extremely urgent and difficult problem whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of the activation and protective measures are discussed, and the structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used to model the behavior of matter on landslide slopes, based on the equation of continuity and an approximated Navier-Stokes equation for slow motion in a thin layer. The results of the modeling make it possible to locate the point of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used to calibrate monitoring equipment and makes it possible to investigate some fundamental aspects of the movement of matter on a landslide slope.

12. Cognitive Processes that Underlie Mathematical Precociousness in Young Children

ERIC Educational Resources Information Center

Swanson, H. Lee

2006-01-01

The working memory (WM) processes that underlie young children's (ages 6-8 years) mathematical precociousness were examined. A battery of tests that assessed components of WM (phonological loop, visual-spatial sketchpad, and central executive), naming speed, random generation, and fluency was administered to mathematically precocious and…

13. Enhancing the Teaching and Learning of Mathematical Visual Images

ERIC Educational Resources Information Center

Quinnell, Lorna

2014-01-01

The importance of mathematical visual images is indicated by the introductory paragraph in the Statistics and Probability content strand of the Australian Curriculum, which draws attention to the importance of learners developing skills to analyse and draw inferences from data and "represent, summarise and interpret data and undertake…

14. First Year Mathematics Undergraduates' Settled Images of Tangent Line

ERIC Educational Resources Information Center

2010-01-01

This study concerns 182 first year mathematics undergraduates' perspectives on the tangent line of function graph in the light of a previous study on Year 12 pupils' perspectives. The aim was the investigation of tangency images that settle after undergraduates' distancing from the notion for a few months and after their participation in…

15. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing.

PubMed

Hsu, Chun-Wei; Goh, Joshua O S

2016-01-01

When comparing the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466

16. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing

PubMed Central

Hsu, Chun-Wei; Goh, Joshua O. S.

2016-01-01

When comparing the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466

17. Retinomorphic image processing.

PubMed

Ghosh, Kuntal; Bhaumik, Kamales; Sarkar, Sandip

2008-01-01

The present work is aimed at understanding and explaining some of the aspects of visual signal processing at the retinal level while exploiting the same towards the development of some simple techniques in the domain of digital image processing. Classical studies on retinal physiology revealed the nature of contrast sensitivity of the receptive field of bipolar or ganglion cells, which lie in the outer and inner plexiform layers of the retina. To explain these observations, a difference of Gaussian (DOG) filter was suggested, which was subsequently modified to a Laplacian of Gaussian (LOG) filter for computational ease in handling two-dimensional retinal inputs. To date, almost all image processing algorithms used in various branches of science and engineering have followed the LOG or one of its variants. Recent observations in retinal physiology, however, indicate that the retinal ganglion cells receive input from a larger area than the classical receptive fields. We have proposed an isotropic model for the non-classical receptive field of the retinal ganglion cells, corroborated by these recent observations, by introducing higher-order derivatives of Gaussian expressed as linear combinations of Gaussians only. In digital image processing, this provides a new mechanism of edge detection on one hand and image half-toning on the other. It has also been found that living systems may sometimes prefer to "perceive" the external scenario by adding noise to the received signals at the pre-processing level for arriving at better information on light and shade in the edge map. The proposed model also provides explanations for many brightness-contrast illusions hitherto unexplained not only by the classical isotropic model but also by some other Gestalt and Constructivist models or by non-isotropic multi-scale models. The proposed model is easy to implement both in the analog and digital domain. A scheme for implementation in the analog domain generates a new silicon retina.
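The classical DOG receptive-field model mentioned above subtracts a wide inhibitory surround Gaussian from a narrow excitatory center Gaussian. The following is an illustrative 1-D construction of such a kernel (our own minimal sketch, not the paper's higher-order-derivative extension):

```python
import math

def gaussian_kernel(sigma, radius):
    """Sampled 1-D Gaussian on [-radius, radius], normalized to unit sum."""
    vals = [math.exp(-x * x / (2 * sigma * sigma))
            for x in range(-radius, radius + 1)]
    total = sum(vals)
    return [v / total for v in vals]

def dog_kernel(sigma_center, sigma_surround, radius):
    """Difference-of-Gaussians kernel: narrow excitatory center minus
    wider inhibitory surround, the classical receptive-field model."""
    center = gaussian_kernel(sigma_center, radius)
    surround = gaussian_kernel(sigma_surround, radius)
    return [c - s for c, s in zip(center, surround)]
```

Because both Gaussians are normalized, the kernel's coefficients sum to (nearly) zero, so convolving with it suppresses uniform regions and responds at edges, which is exactly the contrast-sensitive behavior the retinal cells exhibit.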

18. Image processing technology

SciTech Connect

Van Eeckhout, E.; Pope, P.; Balick, L.

1996-07-01

This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.

19. scikit-image: image processing in Python.

PubMed

van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

2014-01-01

scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

20. scikit-image: image processing in Python

PubMed Central

Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

2014-01-01

scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

1. Mathematical Modeling of Primary Wood Processing

Szyszka, Barbara; Rozmiarek, Klaudyna

2008-09-01

This work presents a way of optimizing the conversion of wood logs into semi-products. Calculating algorithms have been used to choose the cutting patterns and the number of logs needed to fulfill an order, including task specification. The authors' computer program TARPAK1 visualizes the results, generates wood-log conversion patterns for given entry parameters, and predicts sawn timber manufacture. The program has been created with the intention of introducing it to small and medium sawmills in Poland. The project has been financed from government resources and carried out by workers of the Institute of Mathematics (Poznan University of Technology) and the Department of Mechanical Wood Technology (Poznan University of Life Sciences).

2. Analysis of physical processes via imaging vectors

Volovodenko, V.; Efremova, N.; Efremov, V.

2016-06-01

Practically all modeling processes are random in one way or another. The most fully developed theoretical foundation covers Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. In a Markov process, the model of the future therefore does not change when information about earlier times is extended or refined. Basically, modeling physical fields involves processes changing in time, i.e., non-stationary processes. In this case, applying the Laplace transformation introduces unjustified complications into the description, whereas a transition to other representations yields an explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and in the analysis itself. The flexibility of a model built on a polynomial basis allows a rapid change of the mathematical model and accelerates further analysis. It should be noted that the mathematical description permits an operator representation; conversely, operator representation of the structures, algorithms, and data processing procedures significantly improves the flexibility of the modeling process.

3. Image Processing Diagnostics: Emphysema

McKenzie, Alex

2009-10-01

Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT images show clearly whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, which appears merely as subtle, barely distinct, dark spots on the lung. Our goal is to create a software plug-in that interfaces with existing open-source medical imaging software to automate the process of accurately diagnosing emphysema and determining its severity. This will be accomplished by performing a number of statistical calculations on data taken from CT scan images of several patients representing a wide range of disease severity. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
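
The skewness measure described above can be sketched numerically. The following is a minimal illustration on synthetic data, not the authors' actual pipeline: the Hounsfield-unit distributions and their parameters are invented for the example (emphysematous tissue tends toward very low densities, near -1000 HU).

```python
import numpy as np

def skewness(values):
    """Sample skewness: third standardized moment of the voxel intensities."""
    v = np.asarray(values, dtype=float)
    m = v.mean()
    s = v.std()
    return ((v - m) ** 3).mean() / s ** 3

# Synthetic example: a healthy-like lung HU distribution vs. an
# emphysema-like one with a heavy tail toward very low densities.
rng = np.random.default_rng(0)
healthy = rng.normal(-750, 50, 10_000)
emphysema = np.concatenate([rng.normal(-750, 50, 8_000),
                            rng.normal(-950, 20, 2_000)])

print(skewness(healthy))    # near 0 for a roughly normal distribution
print(skewness(emphysema))  # clearly negative: tail toward -1000 HU
```

The sign and magnitude of the skewness thus summarize how far the intensity histogram departs from a normal distribution, which is the statistic the abstract proposes to exploit.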

4. Elaborative and Integrative Thought Processes in Mathematics Learning.

ERIC Educational Resources Information Center

Swing, Susan; Peterson, Penelope

1988-01-01

The effects on mathematics achievement of having fifth graders engage in elaborative and integrative processing were studied using 121 students. Subjects were randomly assigned to cognitive-processing, fact-sheet, or control groups. Results show that cognitive processing is related to memory and understanding and is more effective for higher…

5. Computer image processing and recognition

NASA Technical Reports Server (NTRS)

Hall, E. L.

1979-01-01

A systematic introduction to the concepts and techniques of computer image processing and recognition is presented. Consideration is given to such topics as image formation and perception; computer representation of images; image enhancement and restoration; reconstruction from projections; digital television, encoding, and data compression; scene understanding; scene matching and recognition; and processing techniques for linear systems.

6. A Review on Mathematical Modeling for Textile Processes

2015-10-01

A mathematical model is a powerful engineering tool for studying a variety of problems related to the design and development of products and processes, the optimization of manufacturing processes, understanding a phenomenon and predicting a product's behaviour in actual use. An insight into the process and the use of appropriate mathematical tools are necessary for developing models. In the present paper, the types of models, the procedures followed in developing them and their limitations are reviewed. Modeling techniques used in a few textile processes available in the literature are cited as examples.

7. Mathematical modelling in the computer-aided process planning

Mitin, S.; Bochkarev, P.

2016-04-01

This paper presents new approaches to the organization of manufacturing preparation and mathematical models related to the development of a computer-aided multi-product process planning (CAMPP) system. The CAMPP system has some peculiarities compared to existing computer-aided process planning (CAPP) systems: fully formalized development of the machining operations; the capacity to create and formalize the interrelationships among design, process planning and process implementation; and procedures for taking real manufacturing conditions into account. The paper describes the structure of the CAMPP system and shows the mathematical models and methods used to formalize the design procedures.

8. A unified mathematical theory of electrophoretic processes

NASA Technical Reports Server (NTRS)

Bier, M.; Palusinski, O. A.; Mosher, R. A.; Graham, A.; Saville, D. A.

1983-01-01

A mathematical theory is presented which shows that each of the four classical electrophoretic modes (zone electrophoresis, moving boundary electrophoresis, isotachophoresis, and isoelectric focusing) is based on the same general principles and can collectively be described in terms of a single set of equations. This model can predict the evolution of the four electrophoretic modes as a function of time. The model system is one-dimensional, neglecting the effects of electroosmosis, temperature gradients, and any bulk flows of liquid. The model is based on equations which express the components' dissociation equilibria, the mass transport due to electromigration and diffusion, electroneutrality, and the conservation of mass and charge. The model consists of a system of coupled partial differential and nonlinear algebraic equations which can be solved numerically by use of a computer. The versatility of this model was verified using an example of a three-component system containing cacodylate, tris(hydroxymethyl)aminomethane, and histidine. Results show that this model not only correctly predicts the characteristic features of each electrophoretic mode, but also gives details of the concentration, pH, and conductivity profiles not easily amenable to direct experimental measurement.
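
The full model couples dissociation equilibria, electroneutrality and conservation laws; a minimal sketch of just the transport part (electromigration plus diffusion for a single ionic species, explicit finite differences on a periodic grid, illustrative parameters not taken from the paper) might look like:

```python
import numpy as np

# 1-D transport sketch: dc/dt = D * d2c/dx2 - mu*E * dc/dx
# (diffusion plus electromigration for one species; parameters invented).
nx, dx, dt = 200, 1e-3, 1e-3   # grid points, spacing, time step
D, muE = 1e-4, 0.05            # diffusivity, mobility * field (drift speed)

x = np.arange(nx) * dx
c = np.exp(-((x - 0.05) / 0.01) ** 2)   # initial Gaussian sample zone
mass0 = c.sum() * dx

for _ in range(500):
    diff = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    adv = -muE * (np.roll(c, -1) - np.roll(c, 1)) / (2 * dx)
    c = c + dt * (diff + adv)

peak = x[np.argmax(c)]
print(peak)   # the zone has drifted toward larger x while spreading
```

The real model adds one such transport equation per component, coupled through the local field and the algebraic equilibrium/electroneutrality conditions, which is what makes the numerical solution nontrivial.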

9. Smart Image Enhancement Process

NASA Technical Reports Server (NTRS)

Jobson, Daniel J. (Inventor); Rahman, Zia-ur (Inventor); Woodell, Glenn A. (Inventor)

2012-01-01

Contrast and lightness measures are used to first classify the image as being one of non-turbid and turbid. If turbid, the original image is enhanced to generate a first enhanced image. If non-turbid, the original image is classified in terms of a merged contrast/lightness score based on the contrast and lightness measures. The non-turbid image is enhanced to generate a second enhanced image when a poor contrast/lightness score is associated therewith. When the second enhanced image has a poor contrast/lightness score associated therewith, this image is enhanced to generate a third enhanced image. A sharpness measure is computed for one image that is selected from (i) the non-turbid image, (ii) the first enhanced image, (iii) the second enhanced image when a good contrast/lightness score is associated therewith, and (iv) the third enhanced image. If the selected image is not-sharp, it is sharpened to generate a sharpened image. The final image is selected from the selected image and the sharpened image.
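
The classification-and-enhancement cascade described above can be sketched as control flow. The contrast, lightness and sharpness measures and all thresholds below are simple stand-ins (standard deviation, mean, gradient energy), not the patented measures:

```python
import numpy as np

def scores(img):
    return img.std(), img.mean()          # stand-in contrast, lightness

def is_turbid(img):
    c, l = scores(img)
    return c < 0.05 and l > 0.7           # hypothetical thresholds

def good(img):
    c, l = scores(img)
    return c > 0.15 and 0.3 < l < 0.7     # stand-in merged score

def enhance(img):
    img = img - img.min()
    return img / max(img.max(), 1e-9)     # stand-in: contrast stretch

def is_sharp(img):
    gy, gx = np.gradient(img)
    return (gx**2 + gy**2).mean() > 1e-4  # stand-in sharpness measure

def sharpen(img):
    blur = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4
    return np.clip(img + (img - blur), 0, 1)   # unsharp masking

def smart_enhance(img):
    if is_turbid(img):
        img = enhance(img)                # "first enhanced image"
    elif not good(img):
        img = enhance(img)                # "second enhanced image"
        if not good(img):
            img = enhance(img)            # "third enhanced image"
    if not is_sharp(img):
        img = sharpen(img)                # "sharpened image"
    return img

rng = np.random.default_rng(1)
hazy = 0.8 + 0.02 * rng.random((64, 64))  # bright, low-contrast ("turbid")
out = smart_enhance(hazy)
print(out.std() > hazy.std())             # contrast improved
```

The point of the cascade is that each enhancement stage is applied only when the preceding measurement says it is needed, so already-good images pass through untouched.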

10. Infrared image enhancement based on the edge detection and mathematical morphology

Zhang, Linlin; Zhao, Yuejin; Dong, Liquan; Liu, Xiaohua; Yu, Xiaomei; Hui, Mei; Chu, Xuhong; Gong, Cheng

2010-11-01

Un-cooled infrared imaging technology developed out of military necessity; at present it is widely applied in industry, medicine, and scientific and technological research, allowing the infrared radiation temperature distribution of a measured object's surface to be observed visually. The infrared images collected in our laboratory have the following characteristics: strong spatial correlation; low contrast and poor visual effect; grey-scale images without color or shadow, and with low resolution; lower definition than visible-light images; and many kinds of noise introduced by random disturbances of the external environment. Digital image processing is widely applied in many areas and has become an important extension of human vision. Traditional image enhancement methods, however, cannot capture the geometric information of such images and tend to amplify noise. To remove noise and improve the visual effect, a mathematical model of the focal plane array (FPA) unit was constructed based on matrix transformation theory, and an image enhancement algorithm combining mathematical morphology with edge detection was established according to the characteristics of the FPA. First, the image profile is obtained by edge detection combined with mathematical morphological operators. Then, by filling the template profile with the original image, an ideal background image is obtained, on the basis of which the image noise can be removed. Experiments show that the proposed algorithm enhances image detail and improves the signal-to-noise ratio.
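
A generic version of the morphological edge step can be sketched as follows. This is a minimal illustration with a 3x3 structuring element on synthetic data (the morphological gradient, dilation minus erosion, as the edge map), not the authors' FPA-specific pipeline, and it omits their background-filling stage:

```python
import numpy as np

def dilate(img):
    """Grey-scale dilation with a 3x3 square structuring element."""
    shifts = [np.roll(np.roll(img, i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)]
    return np.max(shifts, axis=0)

def erode(img):
    """Grey-scale erosion with the same 3x3 element."""
    shifts = [np.roll(np.roll(img, i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)]
    return np.min(shifts, axis=0)

def morph_gradient(img):
    """Edge map: dilation minus erosion."""
    return dilate(img) - erode(img)

# Synthetic low-contrast "infrared" frame: a warm square on a cool background.
img = np.full((32, 32), 0.40)
img[10:22, 10:22] = 0.45
edges = morph_gradient(img)

enhanced = np.clip(img + 2.0 * edges, 0, 1)   # boost detail along the edges
print(edges.max())   # nonzero response only at the square's boundary
```

Because the gradient is zero inside uniform regions, adding it back to the image sharpens object outlines without amplifying flat-area noise as strongly as a plain high-pass filter would.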

11. IMAGES: An interactive image processing system

NASA Technical Reports Server (NTRS)

Jensen, J. R.

1981-01-01

The IMAGES interactive image processing system was created specifically for undergraduate remote sensing education in geography. The system is interactive, relatively inexpensive to operate, almost hardware independent, and responsive to numerous users at one time in a time-sharing mode. Most important, it provides a medium whereby theoretical remote sensing principles discussed in lecture may be reinforced in laboratory as students perform computer-assisted image processing. In addition to its use in academic and short course environments, the system has also been used extensively to conduct basic image processing research. The flow of information through the system is discussed including an overview of the programs.

12. Processing Visual Images

SciTech Connect

Litke, Alan

2006-03-27

The back of the eye is lined by an extraordinary biological pixel detector, the retina. This neural network is able to extract vital information about the external visual world, and transmit this information in a timely manner to the brain. In this talk, Professor Litke will describe a system that has been implemented to study how the retina processes and encodes dynamic visual images. Based on techniques and expertise acquired in the development of silicon microstrip detectors for high energy physics experiments, this system can simultaneously record the extracellular electrical activity of hundreds of retinal output neurons. After presenting first results obtained with this system, Professor Litke will describe additional applications of this incredible technology.

13. ASPIC: STARLINK image processing package

Davenhall, A. C.; Hartley, Ken F.; Penny, Alan J.; Kelly, B. D.; King, Dave J.; Lupton, W. F.; Tudhope, D.; Pike, C. D.; Cooke, J. A.; Pence, W. D.; Wallace, Patrick T.; Brownrigg, D. R. K.; Baines, Dave W. T.; Warren-Smith, Rodney F.; McNally, B. V.; Bell, L. L.; Jones, T. A.; Terrett, Dave L.; Pearce, D. J.; Carey, J. V.; Currie, Malcolm J.; Benn, Chris; Beard, S. M.; Giddings, Jack R.; Balona, Luis A.; Harrison, B.; Wood, Roger; Sparkes, Bill; Allan, Peter M.; Berry, David S.; Shirt, J. V.

2015-10-01

ASPIC handled basic astronomical image processing. Early releases concentrated on image arithmetic, standard filters, expansion/contraction/selection/combination of images, and displaying and manipulating images on the ARGS and other devices. Later releases added new astronomy-specific applications to this sound framework. The ASPIC collection of about 400 image-processing programs was written using the Starlink "interim" environment in the 1980s; the software is now obsolete.

14. Developing Mathematical Processes (DMP). Field Test Evaluation, 1972-1973.

ERIC Educational Resources Information Center

Schall, William E.; And Others

The field test of the Developing Mathematical Processes (DMP) program was conducted jointly by the Falconer Central School, St. Mary's Elementary School in Dunkirk, New York, and the Teacher Education Research Center at the State University College in Fredonia, New York. DMP is a research-based, innovative, process-oriented elementary mathematics…

15. The Mathematical Models of the Periodical Literature Publishing Process.

ERIC Educational Resources Information Center

Guang, Yu; Daren, Yu; Yihong, Rong

2000-01-01

Describes two mathematical models of the periodical publishing process based on a theoretical analysis. Discusses the publishing process for periodical literature, explains the continuous model and the discrete model, presents partial differential equations, and demonstrates the adaptability and the validity of the models. (LRW)

16. Mathematical Modelling of Continuous Biotechnological Processes

ERIC Educational Resources Information Center

Pencheva, T.; Hristozov, I.; Shannon, A. G.

2003-01-01

Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

17. Post-Primary Students' Images of Mathematics: Findings from a Survey of Irish Ordinary Level Mathematics Students

ERIC Educational Resources Information Center

Lane, Ciara; Stynes, Martin; O'Donoghue, John

2016-01-01

A questionnaire survey was carried out as part of a PhD research study to investigate the image of mathematics held by post-primary students in Ireland. The study focused on students in fifth year of post-primary education studying ordinary level mathematics for the Irish Leaving Certificate examination--the final examination for students in…

18. Mathematical abilities in dyslexic children: a diffusion tensor imaging study.

PubMed

Koerte, Inga K; Willems, Anna; Muehlmann, Marc; Moll, Kristina; Cornell, Sonia; Pixner, Silvia; Steffinger, Denise; Keeser, Daniel; Heinen, Florian; Kubicki, Marek; Shenton, Martha E; Ertl-Wagner, Birgit; Schulte-Körne, Gerd

2016-09-01

Dyslexia is characterized by a deficit in language processing which mainly affects word decoding and spelling skills. In addition, children with dyslexia also show problems in mathematics. However, for the latter, the underlying structural correlates have not been investigated. Sixteen children with dyslexia (mean age 9.8 years [0.39]) and 24 typically developing children (mean age 9.9 years [0.29]) group matched for age, gender, IQ, and handedness underwent 3 T MR diffusion tensor imaging as well as cognitive testing. Tract-Based Spatial Statistics were performed to correlate behavioral data with diffusion data. Children with dyslexia performed worse than controls in standardized verbal number tasks, such as arithmetic efficiency tests (addition, subtraction, multiplication, division). In contrast, the two groups did not differ in the nonverbal number line task. Arithmetic efficiency, representing the total score of the four arithmetic tasks, multiplication, and division, correlated with diffusion measures in widespread areas of the white matter, including bilateral superior and inferior longitudinal fasciculi in children with dyslexia compared to controls. Children with dyslexia demonstrated lower performance in verbal number tasks but performed similarly to controls in a nonverbal number task. Further, an association between verbal arithmetic efficiency and diffusion measures was demonstrated in widespread areas of the white matter suggesting compensatory mechanisms in children with dyslexia compared to controls. Taken together, poor fact retrieval in children with dyslexia is likely a consequence of deficits in the language system, which not only affects literacy skills but also impacts on arithmetic skills. PMID:26286825

19. Growing Mathematical Understanding through Collective Image Making, Collective Image Having, and Collective Property Noticing

ERIC Educational Resources Information Center

Martin, Lyndon C.; Towers, Jo

2015-01-01

In the research reported in this paper, we develop a theoretical perspective to describe and account for the growth of collective mathematical understanding. We discuss collective processes in mathematics, drawing in particular on theoretical work in the domains of improvisational jazz and theatre. Using examples of data from a study of elementary…

20. FORTRAN Algorithm for Image Processing

NASA Technical Reports Server (NTRS)

Roth, Don J.; Hull, David R.

1987-01-01

FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.

1. Basic research planning in mathematical pattern recognition and image analysis

NASA Technical Reports Server (NTRS)

Bryant, J.; Guseman, L. F., Jr.

1981-01-01

Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization, computer architectures and parallel processing, and the applicability of "expert systems" to interactive analysis.

2. A Case Study on Pre-Service Secondary School Mathematics Teachers' Cognitive-Metacognitive Behaviours in Mathematical Modelling Process

ERIC Educational Resources Information Center

Sagirli, Meryem Özturan

2016-01-01

The aim of the present study is to investigate pre-service secondary mathematics teachers' cognitive-metacognitive behaviours during the mathematical problem-solving process considering class level. The study, in which the case study methodology was employed, was carried out with eight pre-service mathematics teachers, enrolled at a university in…

3. The APL image processing laboratory

NASA Technical Reports Server (NTRS)

Jenkins, J. O.; Randolph, J. P.; Tilley, D. G.; Waters, C. A.

1984-01-01

The present and proposed capabilities of the Central Image Processing Laboratory, which provides a powerful resource for the advancement of programs in missile technology, space science, oceanography, and biomedical image analysis, are discussed. The use of image digitizing, digital image processing, and digital image output permits a variety of functional capabilities, including: enhancement, pseudocolor, convolution, computer output microfilm, presentation graphics, animations, transforms, geometric corrections, and feature extractions. The hardware and software of the Image Processing Laboratory, consisting of digitizing and processing equipment, software packages, and display equipment, is described. Attention is given to applications for imaging systems, map geometric correction, raster movie display of Seasat ocean data, Seasat and Skylab scenes of Nantucket Island, Space Shuttle imaging radar, differential radiography, and a computerized tomographic scan of the brain.

4. Multiscale Image Processing of Solar Image Data

Young, C.; Myers, D. C.

2001-12-01

It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for the analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer, and little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formalize the human ability to view and comprehend phenomena on different scales, so they could be used to quantify the image processing done by the observer's eyes and brain. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.
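
One widely used transform of this family, the undecimated "a trous" (starlet) wavelet transform, can be sketched in a few lines. This is a generic illustration with the standard B3-spline kernel, not the authors' code:

```python
import numpy as np

def atrous_smooth(img, step):
    """Separable B3-spline smoothing with holes of size `step` (a trous)."""
    k = np.array([1, 4, 6, 4, 1], dtype=float) / 16
    out = img
    for axis in (0, 1):
        acc = np.zeros_like(out)
        for w, off in zip(k, (-2, -1, 0, 1, 2)):
            acc += w * np.roll(out, off * step, axis=axis)
        out = acc
    return out

def starlet(img, nscales):
    """Undecimated (starlet) transform: detail planes plus a final smooth plane."""
    planes, smooth = [], img.astype(float)
    for j in range(nscales):
        next_smooth = atrous_smooth(smooth, 2 ** j)
        planes.append(smooth - next_smooth)   # detail at scale j
        smooth = next_smooth
    planes.append(smooth)
    return planes

# Reconstruction is just the sum of all planes, so features can be isolated,
# thresholded (a "multiresolution support"), or enhanced scale by scale.
rng = np.random.default_rng(2)
img = rng.random((64, 64))
planes = starlet(img, 4)
recon = np.sum(planes, axis=0)
print(np.abs(recon - img).max())   # invertible up to floating-point roundoff
```

Thresholding each detail plane against its noise level is one common way to build the kind of multiresolution support the abstract mentions.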

5. Improving Resolution and Depth of Astronomical Observations via Modern Mathematical Methods for Image Analysis

Castellano, M.; Ottaviani, D.; Fontana, A.; Merlin, E.; Pilo, S.; Falcone, M.

2015-09-01

In the past years modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have rarely, if ever, been tried on astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition allows the recovery of faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution, better-sampled image out of a set of low-resolution frames, thus mitigating data-analysis problems arising from the difference in resolution/sampling between different instruments, as in the case of the EUCLID VIS and NIR imagers.
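
The additive split image = structure + texture can be illustrated with a toy example. The crude linear smoother below (isotropic diffusion) is only a stand-in for the variational, TV-based structure component used in this line of work:

```python
import numpy as np

def diffuse(img, iters=20, lam=0.2):
    """Crude 'structure' component via isotropic diffusion (heat equation).
    Stand-in for a TV-based variational decomposition; lam <= 0.25 for
    stability of the explicit scheme."""
    u = img.astype(float)
    for _ in range(iters):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        u = u + lam * lap
    return u

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 128)
structure_true = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))
img = structure_true + 0.3 * rng.standard_normal((128, 128))

structure = diffuse(img)
texture = img - structure                    # img == structure + texture
rough = lambda a: np.abs(np.diff(a, axis=0)).mean()
print(rough(structure) < rough(img))         # structure is the smooth part
```

In the astronomical application, faint sources live in the structure component while the oscillatory texture/noise component is what previously hid them.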

6. Developing Mathematical Processes (DMP). Field Test Evaluation, 1973-1974.

ERIC Educational Resources Information Center

Schall, William; And Others

The Developing Mathematical Processes (DMP) program was field-tested in the kindergarten and first three grades of one parochial and five public schools. DMP is an activity-based program developed around a comprehensive list of behavioral objectives. The program is concerned with the development of intuitive geometric concepts as well as…

7. The Development from Effortful to Automatic Processing in Mathematical Cognition.

ERIC Educational Resources Information Center

Kaye, Daniel B.; And Others

This investigation capitalizes upon the information processing models that depend upon measurement of latency of response to a mathematical problem and the decomposition of reaction time (RT). Simple two term addition problems were presented with possible solutions for true-false verification, and accuracy and RT to response were recorded. Total…

8. Cooperative processes in image segmentation

NASA Technical Reports Server (NTRS)

Davis, L. S.

1982-01-01

Research into the role of cooperative, or relaxation, processes in image segmentation is surveyed. Cooperative processes can be employed at several levels of the segmentation process as a preprocessing enhancement step, during supervised or unsupervised pixel classification and, finally, for the interpretation of image segments based on segment properties and relations.
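
A minimal sketch of a cooperative (relaxation) process for two-class pixel labeling, generic rather than any specific algorithm from the survey: each pixel holds a probability of being foreground, and repeated neighborhood averaging combined with a data term pulls ambiguous pixels toward their neighbors' labels.

```python
import numpy as np

rng = np.random.default_rng(4)
truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0
obs = np.clip(truth + 0.4 * rng.standard_normal(truth.shape), 0, 1)

p = obs.copy()
for _ in range(25):
    neigh = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
             np.roll(p, 1, 1) + np.roll(p, -1, 1)) / 4
    p = 0.5 * obs + 0.5 * neigh          # data term + cooperative support

labels = (p > 0.5).astype(float)
err_before = np.mean((obs > 0.5) != truth)
err_after = np.mean(labels != truth)
print(err_before, err_after)             # error rate before vs. after relaxation
```

This is the "preprocessing enhancement / pixel classification" use of cooperation mentioned in the abstract; higher-level variants propagate constraints between whole image segments instead of pixels.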

9. Voyager image processing at the Image Processing Laboratory

NASA Technical Reports Server (NTRS)

Jepsen, P. L.; Mosher, J. A.; Yagi, G. M.; Avis, C. C.; Lorre, J. J.; Garneau, G. W.

1980-01-01

This paper discusses new digital processing techniques as applied to the Voyager Imaging Subsystem and devised to explore atmospheric dynamics, spectral variations, and the morphology of Jupiter, Saturn and their satellites. Radiometric and geometric decalibration processes, the modulation transfer function, and processes to determine and remove photometric properties of the atmosphere and surface of Jupiter and its satellites are examined. It is exhibited that selected images can be processed into 'approach at constant longitude' time lapse movies which are useful in observing atmospheric changes of Jupiter. Photographs are included to illustrate various image processing techniques.

10. Mathematical modeling of DNA's transcription process for the cancer study

Morales-Peñaloza, A.; Meza-López, C. D.; Godina-Nava, J. J.

2012-10-01

Cancer is a phenomenon caused by an anomaly in DNA's transcription process; it is therefore necessary to know how such an anomaly is generated in order to implement alternative therapies to combat it. We propose to use mathematical modeling to treat the problem. A simulation of the transcription process is implemented, and the transport properties in the heterogeneous case are studied using nonlinear dynamics.

11. Industrial Applications of Image Processing

2014-11-01

The recent advances in sensor quality and processing power provide us with excellent tools for designing more complex image processing and pattern recognition tasks. In this paper we review the existing applications of image processing and pattern recognition in industrial engineering. First we define the role of vision in an industrial environment. Then a survey of some image processing techniques, feature extraction, object recognition and industrial robotic guidance is presented. Moreover, examples of implementations of such techniques in industry are presented. Such implementations include automated visual inspection, process control, part identification and robot control. Finally, we present some conclusions regarding the investigated topics and directions for future investigation.

12. Generalized Mathematical Model Predicting the Mechanical Processing Topography

Leonov, S. L.; Markov, A. M.; Belov, A. B.; Sczygol, N.

2016-04-01

We propose a unified approach to constructing mathematical models of surface topography formation and to calculating its roughness parameters for different machining processes. The approach is based on the geometric copying of the tool into the material, onto which plastico-elastic deformation, oscillatory phenomena during processing and random components of the profile are superimposed. The unified approach makes it possible to reduce the time for creating a simulated stochastic model for a specific type of processing and to guarantee the accuracy of the calculated geometric parameters of the surface. As an application example, the generalized model is used to calculate the density distribution of the roughness parameter Ra in external sharpening.
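
The roughness parameter Ra mentioned above is the arithmetic mean deviation of the profile from its mean line. A sketch on a simulated profile, echoing the paper's decomposition into a deterministic tool-geometry component plus a random component (all parameters invented):

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 10, 2000)                     # position along surface, mm
feed_marks = 2.0 * np.abs((x * 4) % 2 - 1)       # periodic tool geometry, um
random_part = 0.3 * rng.standard_normal(x.size)  # stochastic component, um
z = feed_marks + random_part                     # simulated profile height

# Ra: mean absolute deviation of the profile from its mean line.
Ra = np.mean(np.abs(z - z.mean()))
print(Ra)   # micrometres
```

Repeating such simulations over many random realizations is one way to estimate the density distribution of Ra that the paper computes for external sharpening.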

13. An image processing algorithm for PPCR imaging

Cowen, Arnold R.; Giles, Anthony; Davies, Andrew G.; Workman, A.

1993-09-01

During 1990 the UK Department of Health installed two Photostimulable Phosphor Computed Radiography (PPCR) systems in the General Infirmary at Leeds with a view to evaluating the clinical and physical performance of the technology prior to its introduction into the NHS. An issue that came to light from the outset of the project was the radiologists' reservations about the influence of the standard PPCR computerized image processing on image quality and diagnostic performance. An investigation was set up by FAXIL to develop an algorithm to produce single-format, high-quality PPCR images that would be easy to implement and would allay the concerns of radiologists.

14. SWNT Imaging Using Multispectral Image Processing

Blades, Michael; Pirbhai, Massooma; Rotkin, Slava V.

2012-02-01

A flexible optical system was developed to image carbon single-wall nanotube (SWNT) photoluminescence using the multispectral capabilities of a typical CCD camcorder. The built-in Bayer filter of the CCD camera was utilized, using OpenCV C++ libraries for image processing, to decompose the image generated in a high-magnification epifluorescence microscope setup into three pseudo-color channels. By carefully calibrating the filter beforehand, it was possible to extract spectral data from these channels and effectively isolate the SWNT signals from the background.
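
The channel-decomposition idea can be sketched with synthetic data. The authors used OpenCV C++ on real camcorder frames; this stand-in instead isolates a narrow-band signal from a broadband background by simple channel arithmetic, with all thresholds and geometry invented:

```python
import numpy as np

rng = np.random.default_rng(6)
h, w = 48, 48
background = 0.2 * rng.random((h, w, 3))     # broadband: present in all channels
frame = background.copy()
frame[20:28, 20:28, 0] += 0.6                # "SWNT" emission mostly in channel 0

# Decompose into pseudo-color channels, then suppress anything broadband.
r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
signal = np.clip(r - 0.5 * (g + b), 0, None)
mask = signal > 0.3
print(mask.sum())                            # pixels flagged as emission
```

With a calibrated filter response, the same channel arithmetic can be turned into a crude three-band spectral estimate per pixel, which is the spirit of the paper's approach.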

15. Subband/Transform MATLAB Functions For Processing Images

NASA Technical Reports Server (NTRS)

Glover, D.

1995-01-01

SUBTRANS software is a package of routines implementing image-data-processing functions for use with MATLAB(TM) software. Provides capability to transform image data with block transforms and to produce spatial-frequency subbands of transformed data. Functions cascaded to provide further decomposition into more subbands. Also used in image-data-compression systems. For example, transforms used to prepare data for lossy compression. Written for use in MATLAB mathematical-analysis environment.
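
A block-transform subband split of the kind such packages provide can be sketched with an 8x8 orthonormal DCT. This is a generic illustration in Python, not the MATLAB package itself:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix (the transform used in many block coders)."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] /= np.sqrt(n)
    M[1:] *= np.sqrt(2 / n)
    return M

def block_dct(img, n=8):
    """Apply the n x n block transform to an image whose sides divide by n."""
    M = dct_matrix(n)
    h, w = img.shape
    blocks = img.reshape(h // n, n, w // n, n)
    out = np.einsum('ab,hbwc,dc->hawd', M, blocks, M)   # M @ block @ M.T
    return out.reshape(h, w)

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 64)
img = np.outer(x, x) + 0.01 * rng.standard_normal((64, 64))  # smooth + noise
coef = block_dct(img)

# Energy compaction: for smooth data the DC subband (one coefficient per
# block) carries most of the energy, which is what makes lossy compression work.
dc_energy = (coef[::8, ::8] ** 2).sum()
total = (coef ** 2).sum()
print(dc_energy / total)
```

Grouping coefficients of the same frequency index across blocks yields the spatial-frequency subbands the abstract describes; cascading the transform on a subband refines the decomposition further.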

16. On the mathematical modeling of wound healing angiogenesis in skin as a reaction-transport process

PubMed Central

Flegg, Jennifer A.; Menon, Shakti N.; Maini, Philip K.; McElwain, D. L. Sean

2015-01-01

Over the last 30 years, numerous research groups have attempted to provide mathematical descriptions of the skin wound healing process. The development of theoretical models of the interlinked processes that underlie the healing mechanism has yielded considerable insight into aspects of this critical phenomenon that remain difficult to investigate empirically. In particular, the mathematical modeling of angiogenesis, i.e., capillary sprout growth, has offered new paradigms for the understanding of this highly complex and crucial step in the healing pathway. With the recent advances in imaging and cell tracking, the time is now ripe for an appraisal of the utility and importance of mathematical modeling in wound healing angiogenesis research. The purpose of this review is to pedagogically elucidate the conceptual principles that have underpinned the development of mathematical descriptions of wound healing angiogenesis, specifically those that have utilized a continuum reaction-transport framework, and highlight the contribution that such models have made toward the advancement of research in this field. We aim to draw attention to the common assumptions made when developing models of this nature, thereby bringing into focus the advantages and limitations of this approach. A deeper integration of mathematical modeling techniques into the practice of wound healing angiogenesis research promises new perspectives for advancing our knowledge in this area. To this end we detail several open problems related to the understanding of wound healing angiogenesis, and outline how these issues could be addressed through closer cross-disciplinary collaboration. PMID:26483695

17. An interactive image processing system.

PubMed

Troxel, D E

1981-01-01

A multiuser multiprocessing image processing system has been developed. It is an interactive picture manipulation and enhancement facility which is capable of executing a variety of image processing operations while simultaneously controlling real-time input and output of pictures. It was designed to provide a reliable picture processing system which would be cost-effective in the commercial production environment. Additional goals met by the system include flexibility and ease of operation and modification. PMID:21868923

18. Conceptions and Images of Mathematics Professors on Teaching Mathematics in School.

ERIC Educational Resources Information Center

Pehkonen, Erkki

1999-01-01

Clarifies what kind of mathematical beliefs are conveyed to student teachers during their studies. Interviews mathematics professors (n=7) from five Finnish universities who were responsible for mathematics teacher education. Professors estimated that teachers' basic knowledge was poor and old-fashioned, requiring improvement, and they emphasized…

19. Image Processing: Some Challenging Problems

Huang, T. S.; Aizawa, K.

1993-11-01

Image processing can be broadly defined as the manipulation of signals which are inherently multidimensional. The most common such signals are photographs and video sequences. The goals of processing or manipulation can be (i) compression for storage or transmission; (ii) enhancement or restoration; (iii) analysis, recognition, and understanding; or (iv) visualization for human observers. The use of image processing techniques has become almost ubiquitous; they find applications in such diverse areas as astronomy, archaeology, medicine, video communication, and electronic games. Nonetheless, many important problems in image processing remain unsolved. It is the goal of this paper to discuss some of these challenging problems. In Section I, we mention a number of outstanding problems. Then, in the remainder of this paper, we concentrate on one of them: very-low-bit-rate video compression. This is chosen because it involves almost all aspects of image processing.

20. Image processing of aerodynamic data

NASA Technical Reports Server (NTRS)

Faulcon, N. D.

1985-01-01

The use of digital image processing techniques in analyzing and evaluating aerodynamic data is discussed. An image processing system that converts images derived from digital data or from transparent film into black and white, full color, or false color pictures is described. Applications to black and white images of a model wing with a NACA 64-210 section in simulated rain and to computed flow properties for transonic flow past a NACA 0012 airfoil are presented. Image processing techniques are used to visualize the variations of water film thicknesses on the wing model and to illustrate the contours of computed Mach numbers for the flow past the NACA 0012 airfoil. Since the computed data for the NACA 0012 airfoil are available only at discrete spatial locations, an interpolation method is used to provide values of the Mach number over the entire field.
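
Since the computed Mach numbers exist only at discrete grid points, an interpolation step fills in the field. A minimal sketch of bilinear interpolation, the simplest such scheme (the paper's actual method is not specified, and the grid values here are hypothetical):

```python
import math

def bilinear(grid, x, y):
    """Bilinearly interpolate grid[i][j] (row i = y, col j = x) at fractional (x, y)."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - x0, y - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x0 + 1] * fx
    bot = grid[y0 + 1][x0] * (1 - fx) + grid[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

# Hypothetical Mach numbers sampled on a coarse 2x2 grid
mach = [[0.70, 0.80],
        [0.90, 1.00]]
print(bilinear(mach, 0.5, 0.5))  # midpoint of the four samples, about 0.85
```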

1. Preventing clonal evolutionary processes in cancer: Insights from mathematical models

PubMed Central

Rodriguez-Brenes, Ignacio A.; Wodarz, Dominik

2015-01-01

Clonal evolutionary processes can drive pathogenesis in human diseases, with cancer being a prominent example. To prevent or treat cancer, mechanisms that can potentially interfere with clonal evolutionary processes need to be understood better. Mathematical modeling is an important research tool that plays an ever-increasing role in cancer research. This paper discusses how mathematical models can be useful to gain insights into mechanisms that can prevent disease initiation, help analyze treatment responses, and aid in the design of treatment strategies to combat the emergence of drug-resistant cells. The discussion will be done in the context of specific examples. Among defense mechanisms, we explore how replicative limits and cellular senescence induced by telomere shortening can influence the emergence and evolution of tumors. Among treatment approaches, we consider the targeted treatment of chronic lymphocytic leukemia (CLL) with tyrosine kinase inhibitors. We illustrate how basic evolutionary mathematical models have the potential to make patient-specific predictions about disease and treatment outcome, and argue that evolutionary models could become important clinical tools in the field of personalized medicine. PMID:26195751

2. Mathematical Modelling of Bacterial Populations in Bio-remediation Processes

Vasiliadou, Ioanna A.; Vayenas, Dimitris V.; Chrysikopoulos, Constantinos V.

2011-09-01

An understanding of bacterial behaviour concerns many field applications, such as the enhancement of water, wastewater and subsurface bio-remediation, the prevention of environmental pollution and the protection of human health. Numerous microorganisms have been identified as able to degrade chemical pollutants; thus, a variety of bacteria are known that can be used in bio-remediation processes. In this study the development of mathematical models capable of describing bacterial behaviour considered in bio-augmentation plans, such as bacterial growth, consumption of nutrients, removal of pollutants, bacterial transport and attachment in porous media, is presented. The mathematical models may be used as a guide in designing and assessing the conditions under which areas contaminated with pollutants can be better remediated.

3. The Mathematics of Medical Imaging in the Classroom.

ERIC Educational Resources Information Center

Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.

2002-01-01

Presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty. Reviews clinical medical practice and theoretical and empirical literature in mathematics education and radiology to develop and pilot model integrative classroom topics and activities. Suggests mathematical applications in numeration and…

4. Fuzzy image processing in sun sensor

NASA Technical Reports Server (NTRS)

Mobasser, S.; Liebe, C. C.; Howard, A.

2003-01-01

This paper describes how fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing with a more conventional image processing algorithm is provided, showing that the fuzzy approach yields better accuracy than conventional image processing.

5. Signal and Image Processing Operations

Energy Science and Technology Software Center (ESTSC)

1995-05-10

VIEW is a software system for processing arbitrary multidimensional signals. It provides facilities for numerical operations, signal displays, and signal databasing. The major emphasis of the system is on the processing of time-sequences and multidimensional images. The system is designed to be both portable and extensible. It runs currently on UNIX systems, primarily SUN workstations.

6. Associative architecture for image processing

1997-09-01

This article presents a new generation of parallel processing architecture for real-time image processing. The approach is implemented in a real-time image processor chip, called the Xium™-2, based on combining a fully associative array, which provides the parallel engine, with a serial RISC core on the same die. The architecture is fully programmable and can implement a wide range of color image processing, computer vision and media processing functions in real time. The associative part of the chip is based on patent-pending methodology of Associative Computing Ltd. (ACL), which condenses 2048 associative processors, each of 128 'intelligent' bits. Each bit can be a processing bit or a memory bit. At only 33 MHz, in a 0.6 micron manufacturing process, the chip has a computational power of 3 billion ALU operations per second and 66 billion string search operations per second. The fully programmable nature of the Xium™-2 chip enables developers to use ACL tools to write their own proprietary algorithms combined with existing image processing and analysis functions from ACL's extended set of libraries.

7. Mathematics Stories: Preservice Teachers' Images and Experiences as Learners of Mathematics

ERIC Educational Resources Information Center

Guillaume, Andrea M.; Kirtman, Lisa

2010-01-01

This study seeks to determine whether national trends in subject matter knowledge and in curricular experiences hold true for prospective teachers who attended K-12 schooling during the reform period. It further seeks to determine other influences on teachers' visions of mathematics and goals for themselves as mathematics teachers. Teachers' past…

8. Mathematical Formulation Requirements and Specifications for the Process Models

SciTech Connect

Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

2010-11-01

The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

9. Digital processing of radiographic images

NASA Technical Reports Server (NTRS)

Bond, A. D.; Ramapriyan, H. K.

1973-01-01

Some techniques, and the corresponding software documentation, are presented for the digital enhancement of radiographs. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of the data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing the image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
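
The speed claim for recursive filters comes down to operation counts: a first-order recursive (IIR) filter does constant work per sample regardless of its effective length, while a k-tap nonrecursive (FIR) filter does k operations per sample. A toy illustration of the two implementations (not the paper's matched filters):

```python
def recursive_smooth(signal, alpha=0.5):
    """First-order recursive low-pass: y[n] = alpha*x[n] + (1-alpha)*y[n-1].
    One multiply-add per sample, whatever the effective filter length."""
    y, prev = [], 0.0
    for x in signal:
        prev = alpha * x + (1 - alpha) * prev
        y.append(prev)
    return y

def moving_average(signal, k):
    """Nonrecursive (FIR) k-tap average: k operations per output sample."""
    out = []
    for n in range(len(signal)):
        window = signal[max(0, n - k + 1):n + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [0, 10, 0, 10, 0, 10]
print(recursive_smooth(noisy))
print(moving_average(noisy, 3))
```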

10. Seismic Imaging Processing and Migration

Energy Science and Technology Software Center (ESTSC)

2000-06-26

Salvo is a 3D, finite difference, prestack, depth migration code for parallel computers. It is also capable of processing 2D and poststack data. The code requires as input a seismic dataset, a velocity model and a file of parameters that allows the user to select various options. The code uses this information to produce a seismic image. Some of the options available to the user include the application of various filters and imaging conditions. The code also incorporates phase encoding (patent applied for) to process multiple shots simultaneously.

11. Fingerprint recognition using image processing

Dholay, Surekha; Mishra, Akassh A.

2011-06-01

Fingerprint recognition is concerned with the difficult task of efficiently matching the image of a person's fingerprint against the fingerprints present in a database. It is used in forensic science to help identify criminals and in the authentication of individuals, since the fingerprint is unique to each person. The present paper describes fingerprint recognition methods using various edge detection techniques, and shows how to detect a fingerprint correctly from camera images. The method does not require a special device; a simple camera, such as that in a mobile phone, can be used. Factors affecting the process include poor illumination, noise, viewpoint dependence, climate and imaging conditions, so various image enhancement techniques are applied to improve image quality and remove noise. The described technique applies contour tracking to the fingerprint image, performs edge detection on the contour, and then matches the edges inside the contour.
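
Edge detection of the kind the paper relies on can be sketched with the standard Sobel operator (an assumption for illustration; the abstract does not name a specific edge detector):

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| with 3x3 Sobel kernels.
    Border pixels are left at zero."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(kx[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            gy = sum(ky[a][b] * img[i - 1 + a][j - 1 + b]
                     for a in range(3) for b in range(3))
            out[i][j] = abs(gx) + abs(gy)
    return out

# A vertical step edge: dark left half, bright right half
step = [[0, 0, 9, 9]] * 4
edges = sobel_magnitude(step)  # strong response along the step, zero elsewhere
```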

12. The Mathematics of Medical Imaging in the Classroom

ERIC Educational Resources Information Center

Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.

2002-01-01

The article presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty together with related classroom activities. Clinical medical practice and theoretical and empirical literature in mathematics education and radiology were reviewed to develop and pilot model integrative classroom topics and…

13. Analysis of electronic autoradiographs by mathematical post-processing

Ghosh, S.; Baier, M.; Schütz, J.; Schneider, F.; Scherer, U. W.

2016-02-01

Autoradiography is a well-established method of nuclear imaging. When different radionuclides are present simultaneously, additional processing is needed to distinguish distributions of radionuclides. In this work, a method is presented where aluminium absorbers of different thickness are used to produce images with different cut-off energies. By subtracting images pixel-by-pixel one can generate images representing certain ranges of β-particle energies. The method is applied to the measurement of irradiated reactor graphite samples containing several radionuclides to determine the spatial distribution of these radionuclides within pre-defined energy windows. The process was repeated under fixed parameters after thermal treatment of the samples. The greyscale images of the distribution after treatment were subtracted from the corresponding pre-treatment images. Significant changes in the intensity and distribution of radionuclides could be observed in some samples. Due to the thermal treatment parameters the most significant differences were observed in the 3H and 14C inventory and distribution.
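
The pixel-by-pixel subtraction used to isolate an energy window can be sketched as follows (hypothetical greyscale count values, with negative differences clamped to zero):

```python
def energy_window(img_thin, img_thick):
    """Subtract two absorber images pixel-by-pixel; clamp negatives to zero.
    The difference keeps only counts from betas stopped between the two
    absorber cut-off energies."""
    return [[max(a - b, 0) for a, b in zip(ra, rb)]
            for ra, rb in zip(img_thin, img_thick)]

# Hypothetical counts behind a thin and a thick aluminium absorber
thin  = [[12, 40], [7, 55]]
thick = [[ 2, 35], [9, 20]]
window = energy_window(thin, thick)  # [[10, 5], [0, 35]]
```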

14. Computer image processing: Geologic applications

NASA Technical Reports Server (NTRS)

Abrams, M. J.

1978-01-01

Computer image processing of digital data was performed to support several geological studies. The specific goals were to: (1) relate the mineral content to the spectral reflectance of certain geologic materials, (2) determine the influence of environmental factors, such as atmosphere and vegetation, and (3) improve image processing techniques. For detection of spectral differences related to mineralogy, the technique of band ratioing was found to be the most useful. The influence of atmospheric scattering and methods to correct for the scattering were also studied. Two techniques were used to correct for atmospheric effects: (1) dark object subtraction, (2) normalization of use of ground spectral measurements. Of the two, the first technique proved to be the most successful for removing the effects of atmospheric scattering. A digital mosaic was produced from two side-lapping LANDSAT frames. The advantages were that the same enhancement algorithm can be applied to both frames, and there is no seam where the two images are joined.
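
The two correction and enhancement steps the abstract names, dark object subtraction and band ratioing, can be sketched as follows (hypothetical band values with a constant scattering offset):

```python
def dark_object_subtract(band):
    """Remove additive atmospheric scattering by subtracting the scene minimum,
    assuming the darkest pixel should be near zero reflectance."""
    dark = min(min(row) for row in band)
    return [[v - dark for v in row] for row in band]

def band_ratio(band_a, band_b):
    """Ratio two corrected bands pixel-by-pixel to suppress illumination
    (topographic) effects and emphasize spectral (mineralogical) differences."""
    return [[a / b if b else 0.0 for a, b in zip(ra, rb)]
            for ra, rb in zip(band_a, band_b)]

# Hypothetical two-band scene with a scattering offset of 5 in both bands
b1 = [[15, 25], [35, 5]]
b2 = [[10, 15], [20, 5]]
ratio = band_ratio(dark_object_subtract(b1), dark_object_subtract(b2))
```

Without the dark-object step the constant offset would distort every ratio; with it, pixels of the same material ratio to the same value.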

15. An Exploration of Mathematical Problem-Solving Processes.

ERIC Educational Resources Information Center

Webb, Norman

The problem-solving strategies used by tenth-grade students in the solution of mathematical problems were investigated. Forty students selected from four high schools were given pretests generating sixteen scores related to mathematics achievement, attitude toward mathematics, and other ability measures. These students were then asked to solve…

16. The image of mathematics held by Irish post-primary students

Lane, Ciara; Stynes, Martin; O'Donoghue, John

2014-08-01

The image of mathematics held by Irish post-primary students was examined and a model for the image found was constructed. Initially, a definition for 'image of mathematics' was adopted with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. Research focused on students studying ordinary level mathematics for the Irish Leaving Certificate examination - the final examination for students in second-level or post-primary education. Students were aged between 15 and 18 years. A questionnaire was constructed with both quantitative and qualitative aspects. The questionnaire survey was completed by 356 post-primary students. Responses were analysed quantitatively using Statistical Package for the Social Sciences (SPSS) and qualitatively using the constant comparative method of analysis and by reviewing individual responses. Findings provide an insight into Irish post-primary students' images of mathematics and offer a means for constructing a theoretical model of image of mathematics which could be beneficial for future research.

17. Linear Algebra and Image Processing

ERIC Educational Resources Information Center

Allali, Mohamed

2010-01-01

We use the computing technology digital image processing (DIP) to enhance the teaching of linear algebra so as to make the course more visual and interesting. Certainly, this visual approach by using technology to link linear algebra to DIP is interesting and unexpected to both students as well as many faculty. (Contains 2 tables and 11 figures.)
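
One classroom-style example of the linear algebra/DIP link (not taken from the article): mirroring an image is right-multiplication by the exchange matrix, so a familiar matrix operation produces an immediately visible effect:

```python
def flip_horizontal(img):
    """Mirror an image left-right by right-multiplying each row vector
    by the n-by-n exchange (anti-identity) matrix J."""
    n = len(img[0])
    J = [[1 if i + j == n - 1 else 0 for j in range(n)] for i in range(n)]
    return [[sum(row[k] * J[k][j] for k in range(n)) for j in range(n)]
            for row in img]

img = [[1, 2, 3],
       [4, 5, 6]]
print(flip_horizontal(img))  # [[3, 2, 1], [6, 5, 4]]
```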

18. Concept Learning through Image Processing.

ERIC Educational Resources Information Center

Cifuentes, Lauren; Yi-Chuan, Jane Hsieh

This study explored computer-based image processing as a study strategy for middle school students' science concept learning. Specifically, the research examined the effects of computer graphics generation on science concept learning and the impact of using computer graphics to show interrelationships among concepts during study time. The 87…

19. Optimal single stage restoration of split-beam sonar images via mathematical morphology

Rystrom, Larry R.; Haralick, Robert M.; Katz, Philip L.

1994-05-01

Split-beam sonar binary images are inherently noisy and have large quantities of shot noise as well as many missing data points. We address the problem of their restoration via mathematical morphology. Conventional restoration techniques for these types of images do not make use of any of the spatial relationships between data points, such as a qualitative observation that outliers tend to have much larger distances to neighboring pixels. We first define an explicit noise model that characterizes the image degradation process for split-beam sonar images. A key feature of the model is that the degradation is split into two parts, a foreground component and a background component. The amount of noise occurring in the background decreases with distance from the underlying signal object. Thus outliers in the model have the same statistical properties as those observed in training data. Next we propose two different restoration algorithms for these kinds of images based respectively on morphological distance transforms and dilation with a toroid shaped structuring element followed by intersection. Finally we generalize to processing other kinds of imagery where applicable.
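
Morphological removal of binary shot noise of the kind described can be sketched with a plain opening (erosion followed by dilation). This is a toy baseline only; the paper's algorithms use distance transforms and a toroidal structuring element, which this sketch does not reproduce:

```python
def erode(img):
    """Binary erosion with a 3x3 square structuring element (borders clipped)."""
    h, w = len(img), len(img[0])
    return [[1 if all(img[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if 0 <= i + di < h and 0 <= j + dj < w) else 0
             for j in range(w)] for i in range(h)]

def dilate(img):
    """Binary dilation with a 3x3 square structuring element (borders clipped)."""
    h, w = len(img), len(img[0])
    return [[1 if any(img[i + di][j + dj]
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if 0 <= i + di < h and 0 <= j + dj < w) else 0
             for j in range(w)] for i in range(h)]

# A solid 3x3 object plus one isolated outlier pixel (shot noise)
noisy = [[1, 1, 1, 0, 0],
         [1, 1, 1, 0, 0],
         [1, 1, 1, 0, 1],
         [0, 0, 0, 0, 0]]
opened = dilate(erode(noisy))  # erosion removes the outlier, dilation restores the object
```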

20. Mathematical modeling heat and mass transfer processes in porous media

Akhmed-Zaki, Darkhan

2013-11-01

In the late development stages of oil fields, a complex problem of declining oil recovery appears. One solution approach is to inject surfactant together with water, as an active impurity, into the productive layer, in order to decrease oil viscosity and the capillary forces in the oil-water phase system. In the flow, the surfactant can be in three states: dissolved in water, dissolved in oil, and adsorbed on the walls of pore channels. The surfactant's invasion into the reservoir is tracked through its diffusion with the reservoir liquid and its mass exchange with the liquid and solid components of the porous structure. In addition, heat exchange between the fluids (injected and residual) and the framework of the porous medium is of practical importance for evaluating the influence of temperature on enhanced oil recovery. The problem of designing an adequate mathematical model describing simultaneous heat and mass transfer processes in an anisotropic, heterogeneous porous medium during surfactant injection at various temperature regimes has not been fully researched. This work presents a 2D mathematical model of surfactant injection into an oil reservoir. Heat and mass transfer processes in the porous medium are described through differential and kinetic equations. A modified version of the IMPES method is used to design the computational algorithm. Sequential and parallel computational algorithms are developed on adaptive curvilinear meshes that take the heterogeneous porous structure into account. In this way the boundaries of the process flows, the invasion, heat and mass transfer fronts, can be evaluated according to changes in the pressure, temperature and concentration gradients.
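
In its simplest form, the heat-transfer part of such a model reduces to a diffusion equation. A minimal explicit finite-difference sketch in one dimension (hypothetical units; the paper's 2D IMPES-based scheme is far more elaborate):

```python
def diffuse_1d(u, d, steps):
    """Explicit finite-difference stepping for 1-D diffusion u_t = d * u_xx
    (unit grid spacing and time step; stable for d <= 0.5).
    Fixed (Dirichlet) values are kept at both ends."""
    for _ in range(steps):
        u = [u[0]] + [u[i] + d * (u[i - 1] - 2 * u[i] + u[i + 1])
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

# Hot fluid injected at the left boundary of a cold reservoir (hypothetical values):
# the thermal front gradually propagates to the right
profile = diffuse_1d([100, 0, 0, 0, 0, 0], d=0.25, steps=50)
```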

1. Investigation of Primary Mathematics Student Teachers' Concept Images: Cylinder and Cone

ERIC Educational Resources Information Center

Ertekin, Erhan; Yazici, Ersen; Delice, Ali

2014-01-01

The aim of the present study is to determine the influence of concept definitions of cylinder and cone on primary mathematics student teachers' construction of relevant concept images. The study had a relational survey design and the participants were 238 primary mathematics student teachers. Statistical analyses implied the following:…

2. Investigation of Prospective Primary Mathematics Teachers' Perceptions and Images for Quadrilaterals

ERIC Educational Resources Information Center

Turnuklu, Elif; Gundogdu Alayli, Funda; Akkas, Elif Nur

2013-01-01

The object of this study was to show how prospective elementary mathematics teachers define and classify the quadrilaterals and to find out their images. This research was a qualitative study. It was conducted with 36 prospective elementary mathematics teachers studying at 3rd and 4th years in an educational faculty. The data were collected by…

3. Real-time contrast medium detection in x-ray images by mathematical morphology operators

Ly, Dieu Sang; Beucher, Serge; Bilodeau, Michel

2015-11-01

This paper proposes a solution to contrast agent (CA) detection in angiograms by considering x-ray images as intensity images and applying mathematical morphology operators. We present two detection approaches, one based on the intensity infimum and the other based on the dual reconstruction. The evaluation using several data sets shows that both techniques are able to detect the presence of the contrast medium volume. Moreover, the dual reconstruction-based method is proven to be faster in processing time and more effective than the intensity infimum-based method in distinguishing the intensity change at the same location from the displacement of the same region. In addition, we show how to track the CA passage through a region of interest by observing the intensity evolution in successive submasks.
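
Morphological reconstruction of the kind underlying the dual-reconstruction approach can be sketched as geodesic dilation iterated to stability (a generic greyscale version for illustration, not the authors' implementation):

```python
def reconstruct_by_dilation(marker, mask):
    """Greyscale geodesic reconstruction: repeatedly dilate the marker
    (3x3 square element) and clip it under the mask, until stable.
    Requires marker <= mask pointwise."""
    h, w = len(mask), len(mask[0])
    cur = [row[:] for row in marker]
    while True:
        nxt = [[min(mask[i][j],
                    max(cur[i + di][j + dj]
                        for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if 0 <= i + di < h and 0 <= j + dj < w))
                for j in range(w)] for i in range(h)]
        if nxt == cur:
            return cur
        cur = nxt

# The mask holds two bright regions separated by a zero column; the marker
# seeds only the left one, so reconstruction recovers that region alone
mask   = [[5, 5, 0, 7, 7],
          [5, 5, 0, 7, 7]]
marker = [[5, 0, 0, 0, 0],
          [0, 0, 0, 0, 0]]
recovered = reconstruct_by_dilation(marker, mask)
```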

4. Mathematical Knowledge and School Work: A Case Study of the Teaching of Developing Mathematical Processes.

ERIC Educational Resources Information Center

Stephens, Walter Maxwell

This study considered the meaning that was given to knowing/doing mathematics in classrooms comprising the observational study conducted by the Wisconsin Center for Education Research during 1978-81. The study interprets the work of teachers and students, and considers what constitutes appropriate mathematical knowledge for children to learn. A…

5. Proceedings of the NASA Symposium on Mathematical Pattern Recognition and Image Analysis

NASA Technical Reports Server (NTRS)

Guseman, L. F., Jr.

1983-01-01

The application of mathematical and statistical analyses techniques to imagery obtained by remote sensors is described by Principal Investigators. Scene-to-map registration, geometric rectification, and image matching are among the pattern recognition aspects discussed.

6. ImageJ: Image processing and analysis in Java

Rasband, W. S.

2012-06-01

ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.

7. Modelling Of Flotation Processes By Classical Mathematical Methods - A Review

Jovanović, Ivana; Miljanović, Igor

2015-12-01

Flotation process modelling is not a simple task, mostly because of the complexity of the process, i.e. the presence of a large number of variables that (to a lesser or a greater extent) affect the final outcome of the separation of mineral particles based on the differences in their surface properties. Attempts toward the development of a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper gives a review of published research directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis given exclusively to the modelling of the flotation process, regardless of a model's application in a particular control system. In accordance with contemporary considerations, the models are classified as empirical, probabilistic, kinetic and population-balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
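
The kinetic model type mentioned above is classically the first-order rate equation R(t) = R_max(1 - e^(-kt)). A sketch with hypothetical parameters:

```python
import math

def recovery(t, r_max, k):
    """Classical first-order flotation kinetics: R(t) = R_max * (1 - exp(-k*t)).
    r_max is the ultimate (asymptotic) recovery, k the flotation rate constant."""
    return r_max * (1.0 - math.exp(-k * t))

# Hypothetical parameters: ultimate recovery 90 %, rate constant 0.5 per minute
curve = [recovery(t, 90.0, 0.5) for t in (0, 1, 2, 4, 8)]  # recovery rises toward 90 %
```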

8. Universal Gestational Age Effects on Cognitive and Basic Mathematic Processing: 2 Cohorts in 2 Countries

PubMed Central

Wolke, Dieter; Strauss, Vicky Yu-Chun; Johnson, Samantha; Gilmore, Camilla; Marlow, Neil; Jaekel, Julia

2015-01-01

Objective To determine whether general cognitive ability, basic mathematic processing, and mathematic attainment are universally affected by gestation at birth, as well as whether mathematic attainment is more strongly associated with cohort-specific factors such as schooling than basic cognitive and mathematical abilities. Study design The Bavarian Longitudinal Study (BLS, 1289 children, 27-41 weeks gestational age [GA]) was used to estimate effects of GA on IQ, basic mathematic processing, and mathematic attainment. These estimations were used to predict IQ, mathematic processing, and mathematic attainment in the EPICure Study (171 children <26 weeks GA). Results For children born <34 weeks GA, each lower week decreased IQ and mathematic attainment scores by 2.34 (95% CI: −2.99, −1.70) and 2.76 (95% CI: −3.40, −2.11) points, respectively. There were no differences among children born 34-41 weeks GA. Similarly, for children born <36 weeks GA, mathematic processing scores decreased by 1.77 (95% CI: −2.20, −1.34) points with each lower GA week. The prediction function generated using BLS data accurately predicted the effect of GA on IQ and mathematic processing among EPICure children. However, these children had better attainment than predicted by BLS. Conclusions Prematurity has adverse effects on basic mathematic processing following birth at all gestations <36 weeks and on IQ and mathematic attainment <34 weeks GA. The ability to predict IQ and mathematic processing scores from one cohort to another among children cared for in different eras and countries suggests that universal neurodevelopmental factors may explain the effects of gestation at birth. In contrast, mathematic attainment may be improved by schooling. PMID:25842966

9. Defective Number Module or Impaired Access? Numerical Magnitude Processing in First Graders with Mathematical Difficulties

ERIC Educational Resources Information Center

De Smedt, Bert; Gilmore, Camilla K.

2011-01-01

This study examined numerical magnitude processing in first graders with severe and mild forms of mathematical difficulties, children with mathematics learning disabilities (MLD) and children with low achievement (LA) in mathematics, respectively. In total, 20 children with MLD, 21 children with LA, and 41 regular achievers completed a numerical…

10. Interactivity of Visual Mathematical Representations: Factors Affecting Learning and Cognitive Processes

ERIC Educational Resources Information Center

Sedig, Kamran; Liang, Hai-Ning

2006-01-01

Computer-based mathematical cognitive tools (MCTs) are a category of external aids intended to support and enhance learning and cognitive processes of learners. MCTs often contain interactive visual mathematical representations (VMRs), where VMRs are graphical representations that encode properties and relationships of mathematical concepts. In…

11. Relationships between the Process Standards: Process Elicited through Letter Writing between Preservice Teachers and High School Mathematics Students

ERIC Educational Resources Information Center

Kosko, Karl Wesley; Norton, Anderson

2012-01-01

The current body of literature suggests an interactive relationship between several of the process standards advocated by National Council of Teachers of Mathematics. Verbal and written mathematical communication has often been described as an alternative to typical mathematical representations (e.g., charts and graphs). Therefore, the…

12. Medical Image Segmentation using the HSI color space and Fuzzy Mathematical Morphology

Gasparri, J. P.; Bouchet, A.; Abras, G.; Ballarin, V.; Pastore, J. I.

2011-12-01

Diabetic retinopathy is the most common cause of blindness among the active population in developed countries. An early ophthalmologic examination followed by proper treatment can prevent blindness. The purpose of this work is to develop an automated method for segmenting the vasculature in retinal images, in order to assist the expert in evaluating the evolution of a specific treatment or in diagnosing a potential pathology. Since the HSI space has the ability to separate intensity from the intrinsic color information, its use is recommended in digital image processing when images are affected by lighting changes, which are characteristic of the images under study. Applying color filters artificially changes the tone of the blood vessels, distinguishing them better from the background. This technique, combined with fuzzy mathematical morphology tools such as the top-hat transformation, creates images of the retina in which the vascular branches are markedly enhanced over the original. These images help the specialist visualize the blood vessels.
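
The top-hat transformation mentioned above keeps only features narrower than the structuring element. A minimal crisp 1-D sketch (the paper uses the fuzzy 2-D version):

```python
def opening(signal, k):
    """Greyscale opening with a flat structuring element of width k (1-D)."""
    def erode(s):
        return [min(s[max(0, i - k // 2):i + k // 2 + 1]) for i in range(len(s))]
    def dilate(s):
        return [max(s[max(0, i - k // 2):i + k // 2 + 1]) for i in range(len(s))]
    return dilate(erode(signal))

def top_hat(signal, k):
    """White top-hat: original minus opening, keeping only bright features
    narrower than the structuring element (e.g. a thin vessel profile)."""
    return [a - b for a, b in zip(signal, opening(signal, k))]

# A narrow bright peak (width 1) riding on a slowly varying background
profile = [1, 1, 1, 9, 1, 2, 2, 2]
print(top_hat(profile, 3))  # [0, 0, 0, 8, 0, 0, 0, 0]
```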

13. Dynamic infrared imaging in identification of breast cancer tissue with combined image processing and frequency analysis.

PubMed

Joro, R; Lääperi, A-L; Soimakallio, S; Järvenpää, R; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Dastidar, P

2008-01-01

Five combinations of image-processing algorithms were applied preoperatively to dynamic infrared (IR) images of six breast cancer patients to establish optimal enhancement of cancer tissue before frequency analysis. Mid-wave photovoltaic (PV) IR cameras with 320x254 and 640x512 pixels were used. The signal-to-noise ratio and the specificity for breast cancer were evaluated for each image-processing combination from the image series of each patient. Before image processing and frequency analysis, the effect of patient movement was minimized with a stabilization program developed and tested in the study, which stabilizes image slices using surface markers set as measurement points on the skin of the imaged breast. A mathematical equation for a superiority value was developed for comparing the key ratios of the image-processing combinations. The ability of each combination to locate the mammography finding of breast cancer in each patient was compared. Our results show that data collected with a 640x512-pixel mid-wave PV camera, processed with methods that optimize the signal-to-noise ratio, morphological image processing and linear image restoration before frequency analysis, possess the greatest superiority value, showing the cancer area most clearly, including in the centre matched to the mammography estimation. PMID:18666012

14. Applications in Digital Image Processing

ERIC Educational Resources Information Center

Silverman, Jason; Rosen, Gail L.; Essinger, Steve

2013-01-01

Students are immersed in a mathematically intensive, technological world. They engage daily with iPods, HDTVs, and smartphones--technological devices that rely on sophisticated but accessible mathematical ideas. In this article, the authors provide an overview of four lab-type activities that have been used successfully in high school mathematics…

15. Vehicle positioning using image processing

Kaur, Amardeep; Watkins, Steve E.; Swift, Theresa M.

2009-03-01

An image-processing approach is described that detects the position of a vehicle on a bridge. A load-bearing vehicle must be carefully positioned on a bridge for quantitative bridge monitoring. The personnel required for setup and testing and the time required for bridge closure or traffic control are important management and cost considerations. Consequently, bridge monitoring and inspections are good candidates for smart embedded systems. The objectives of this work are to reduce the need for personnel time and to minimize the time for bridge closure. An approach is proposed that uses a passive target on the bridge and camera instrumentation on the load vehicle. The orientation of the vehicle-mounted camera relative to the target determines the position. The experiment used pre-defined concentric circles as the target, a FireWire camera for image capture, and MATLAB for computer processing. Various image-processing techniques are compared for determining the orientation of the target circles, with respect to speed and accuracy in the positioning application. The techniques for determining the target orientation use algorithms based on the centroid feature, template matching, color features, and Hough transforms. Timing parameters are determined for each algorithm to assess the feasibility of real-time use in a position-triggering system. Also, the effect of variations in the size and color of the circles is examined. The development can be combined with embedded sensors and sensor nodes for a complete automated procedure. As the load vehicle moves to the proper position, the image-based system can trigger an embedded measurement, which is then transmitted back to the vehicle control computer through a wireless link.
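
The paper's own implementation is in MATLAB; as a language-neutral illustration, the centroid-feature step can be sketched in Python (the threshold and the synthetic disk target are assumptions, not details from the paper):

```python
import numpy as np

def target_centroid(img, thresh=0.5):
    """Centroid (row, col) of pixels brighter than `thresh`; the offset
    from the image centre indicates the camera-to-target misalignment."""
    ys, xs = np.nonzero(img > thresh)
    return ys.mean(), xs.mean()

# synthetic target: a bright disk centred at (20, 40) in a 64 x 64 frame
yy, xx = np.mgrid[0:64, 0:64]
target = ((yy - 20) ** 2 + (xx - 40) ** 2 <= 100).astype(float)
cy, cx = target_centroid(target)
```

The centroid approach is the cheapest of the four techniques compared, which is why timing it against template matching and Hough transforms matters for real-time triggering.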

16. Identifying Cognitive Processes Important to Mathematics Learning but Often Overlooked

ERIC Educational Resources Information Center

Turner, Ross

2011-01-01

In August 2010, ACER held its annual conference in Melbourne. The theme of the 2010 conference--"Teaching Mathematics? Make It Count"--was chosen to highlight that mathematics education is an area of high priority in Australia. In the author's own presentation to the conference, he outlined research into an area that he believes is very important…

17. The Importance of Dialogic Processes to Conceptual Development in Mathematics

ERIC Educational Resources Information Center

Kazak, Sibel; Wegerif, Rupert; Fujita, Taro

2015-01-01

We argue that dialogic theory, inspired by the Russian scholar Mikhail Bakhtin, has a distinct contribution to the analysis of the genesis of understanding in the mathematics classroom. We begin by contrasting dialogic theory to other leading theoretical approaches to understanding conceptual development in mathematics influenced by Jean Piaget…

18. Image processing photosensor for robots

Vinogradov, Sergey L.; Shubin, Vitaly E.

1995-01-01

Some aspects of the possible applications of a new, nontraditional generation of advanced photosensors with inherent internal image processing for multifunctional optoelectronic systems, such as machine vision systems (MVS), are discussed. The optical information in these solid-state photosensors, so-called photoelectric structures with memory (PESM), is registered and stored in the form of 2D charge and potential patterns in the plane of the layers, and then may be transferred and transformed in the normal direction due to interaction of these patterns. PESM ensure a high potential for massively parallel processing, with effective rates up to 10^14 operations/bit/s in such integral operations as addition, subtraction, contouring, and correlation of images. A wide variety of devices and apparatus may be developed on this basis, ranging from automatic rangefinders to MVS for robotized industries. Principal features, the physical background of the main primary operations, and complex functional algorithms for object selection, tracking, and guidance are briefly described. Examples are presented of the possible application of the PESM as an intelligent 'supervideosensor' that combines a high-quality imager, memory media, and a high-capacity special-purpose processor.

19. Image processing software for imaging spectrometry

NASA Technical Reports Server (NTRS)

Mazer, Alan S.; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

1988-01-01

The paper presents a software system, Spectral Analysis Manager (SPAM), which has been specifically designed and implemented to provide the exploratory analysis tools necessary for imaging spectrometer data, using only modest computational resources. The basic design objectives are described as well as the major algorithms designed or adapted for high-dimensional images. Included in a discussion of system implementation are interactive data display, statistical analysis, image segmentation and spectral matching, and mixture analysis.

20. Networks for image acquisition, processing and display

NASA Technical Reports Server (NTRS)

1990-01-01

The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

1. Circular Samples as Objects for Magnetic Resonance Imaging - Mathematical Simulation, Experimental Results

Frollo, Ivan; Krafčík, Andrej; Andris, Peter; Přibil, Jiří; Dermek, Tomáš

2015-12-01

Circular samples are frequent objects of "in-vitro" investigation using imaging methods based on magnetic resonance principles. The goal of our investigation is the imaging of thin planar layers without the slice-selection procedure, that is, pure 2D imaging, as opposed to imaging selected layers of samples in circular vessels or Eppendorf tubes, which necessarily uses slice selection. Although standard imaging methods were used, some specific issues arise when mathematical modeling of these procedures is introduced. In this paper several mathematical models are presented and compared with real experimental results. Circular magnetic samples were placed in the homogeneous magnetic field of a low-field imager based on nuclear magnetic resonance. For experimental verification an MRI 0.178 Tesla ESAOTE Opera imager was used.

2. Multispectral Image Processing for Plants

NASA Technical Reports Server (NTRS)

Miles, Gaines E.

1991-01-01

The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.

3. Image processing technique for arbitrary image positioning in holographic stereogram

Kang, Der-Kuan; Yamaguchi, Masahiro; Honda, Toshio; Ohyama, Nagaaki

1990-12-01

In a one-step holographic stereogram, if the series of original images are used just as they are taken from perspective views, three-dimensional images are usually reconstructed in back of the hologram plane. In order to enhance the sense of perspective of the reconstructed images and minimize blur of the interesting portions, we introduce an image processing technique for making a one-step flat format holographic stereogram in which three-dimensional images can be observed at an arbitrary specified position. Experimental results show the effect of the image processing. Further, we show results of a medical application using this image processing.

4. Using n-Dimensional Volumes for Mathematical Applications in Spectral Image Analysis

Ziemann, Amanda K.

The ability to detect an object or activity -- such as a military vehicle, construction area, campsite, or vehicle tracks -- is highly important to both military and civilian applications. Sensors that process multi and hyperspectral images provide a medium for performing such tasks. Hyperspectral imaging is a technique for collecting and processing imagery at a large number of visible and non-visible wavelengths. Different materials exhibit different trends in their spectra, which can be used to analyze the image. For an image collected at n different wavelengths, the spectrum of each pixel can be mathematically represented as an n-element vector. The algorithm established in this work, the Simplex Volume Estimation algorithm (SVE), focuses specifically on change detection and large area search. In hyperspectral image analysis, a set of pixels constitutes a data cloud, with each pixel corresponding to a vector endpoint in Euclidean space. The SVE algorithm takes a geometrical approach to image analysis based on the linear mixture model, which describes each pixel in an image collected at n spectral bands as a linear combination of n+1 pure-material component spectra (known as endmembers). Iterative endmember identification is used to construct a volume function, where the Gram matrix is used to calculate the hypervolume of the data at each iteration as the endmembers are considered in Euclidean spaces of increasing dimensionality. Linear algebraic theory substantiates that the volume function accurately characterizes the inherent dimensionality of a set of data, and supports that the volume function provides a tool for identifying the subspace in which the magnitude of the spread of the data is the greatest. A metric is extracted from the volume function, and is used to quantify the relative complexity within a single image or the change in complexity across multiple images. The SVE algorithm was applied to hyperspectral images for the tasks of change detection
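
The Gram-matrix hypervolume computation at the core of the SVE volume function can be written compactly; this sketch (function name ours) computes the volume of the simplex spanned by candidate endmembers in n-dimensional band space:

```python
import numpy as np
from math import factorial

def simplex_volume(vertices):
    """Volume of the simplex with the given vertices (rows in R^n).
    The Gram matrix of the k edge vectors gives the squared volume of
    the spanned parallelepiped, valid in any ambient dimension n >= k;
    dividing by k! converts parallelepiped volume to simplex volume."""
    v = np.asarray(vertices, dtype=float)
    edges = v[1:] - v[0]          # k edge vectors
    gram = edges @ edges.T        # k x k Gram matrix
    k = edges.shape[0]
    return np.sqrt(max(np.linalg.det(gram), 0.0)) / factorial(k)
```

Because only the Gram determinant is needed, the same formula works as the endmembers are considered in spaces of increasing dimensionality, which is how the volume function is built up iteratively.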

5. Possible effects of English-Chinese language differences on the processing of mathematical text: A review

Galligan, Linda

2001-09-01

When comparing Chinese and English language, large differences in orthography, syntax, semantics, and phonetics are found. These differences may have consequences in the processing of mathematical text, yet little consideration is given to them when the mathematical abilities of students from these different cultures are compared. This paper reviews the differences between English and Mandarin Chinese language, evaluates current research, discusses the possible consequences for processing mathematical text in both languages, and outlines future research possibilities.

6. Mathematical modeling of olive mill waste composting process.

PubMed

Vasiliadou, Ioanna A; Muktadirul Bari Chowdhury, Abu Khayer Md; Akratos, Christos S; Tekerlekopoulou, Athanasia G; Pavlou, Stavros; Vayenas, Dimitrios V

2015-09-01

The present study aimed at developing an integrated mathematical model for the composting process of olive mill waste. The multi-component model was developed to simulate the composting of three-phase olive mill solid waste with olive leaves and different materials as bulking agents. The modeling system included heat transfer, organic substrate degradation, oxygen consumption, carbon dioxide production, water content change, and biological processes. First-order kinetics were used to describe the hydrolysis of insoluble organic matter, followed by formation of biomass. Microbial biomass growth was modeled with a double-substrate limitation by hydrolyzed available organic substrate and oxygen using Monod kinetics. The inhibitory factors of temperature and moisture content were included in the system. The production and consumption of nitrogen and phosphorus were also included in the model. In order to evaluate the kinetic parameters and to validate the model, six pilot-scale composting experiments in controlled laboratory conditions were used. Low hydrolysis rates were observed (0.002841/d), consistent with the high cellulose and lignin content of the composting materials used. Model simulations were in good agreement with the experimental results. Sensitivity analysis was performed and the modeling efficiency was determined to further evaluate the model predictions. Results revealed that oxygen simulations were more sensitive to the input parameters of the model than those of water, temperature, and insoluble organic matter. Finally, the Nash-Sutcliffe index (E) showed that the experimental data of insoluble organic matter (E>0.909) and temperature (E>0.678) were better simulated than those of water. PMID:26174354
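
The full model couples many state variables; the growth kernel alone can be sketched as a forward-Euler step. All parameter values below are illustrative placeholders, not the fitted values from the paper, except the hydrolysis rate, which uses the reported 0.002841/d:

```python
# Forward-Euler step for the core kinetics: first-order hydrolysis of
# insoluble organics feeding Monod growth with a double-substrate
# limitation (hydrolysed substrate S and oxygen O).
K_H = 0.002841                              # hydrolysis rate, 1/d (from the paper)
MU_MAX, K_S, K_O, Y = 2.0, 0.5, 0.1, 0.4    # hypothetical kinetic parameters

def euler_step(x_i, s, x, o, dt):
    hyd = K_H * x_i                                   # first-order hydrolysis flux
    mu = MU_MAX * (s / (K_S + s)) * (o / (K_O + o))   # double-substrate Monod
    growth = mu * x
    return (x_i - hyd * dt,               # insoluble organic matter declines
            s + (hyd - growth / Y) * dt,  # soluble substrate: hydrolysis in, uptake out
            x + growth * dt,              # microbial biomass grows
            o)                            # oxygen held constant in this sketch
```

The temperature and moisture inhibition factors of the paper would multiply `mu` in the same way as the two Monod terms.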

7. Concurrent Image Processing Executive (CIPE)

NASA Technical Reports Server (NTRS)

Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.

1988-01-01

The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high-performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local-memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.

8. Image Processing for Teaching: Transforming a Scientific Research Tool into an Educational Technology.

ERIC Educational Resources Information Center

Greenberg, Richard

1998-01-01

Describes the Image Processing for Teaching (IPT) project which provides digital image processing to excite students about science and mathematics as they use research-quality software on microcomputers. Provides information on IPT whose components of this dissemination project have been widespread teacher education, curriculum-based materials…

9. Visualization of children's mathematics solving process using near infrared spectroscopic approach

Kuroda, Yasufumi; Okamoto, Naoko; Chance, Britton; Nioka, Shoko; Eda, Hideo; Maesako, Takanori

2009-02-01

Over the past decade, the application of results from brain science research to education research has been a controversial topic. A NIRS imaging system shows images of Hb parameters in the brain. Measurements using NIRS are safe, easy and the equipment is portable, allowing subjects to tolerate longer research periods. The purpose of this research is to examine the characteristics of Hb using NIRS at the moment of understanding. We measured Hb in the prefrontal cortex of children while they were solving mathematical problems (tangram puzzles). As a result of the experiment, we were able to classify the children into three groups based on their solution methods. Hb continually increased in a group which could not develop a problem solving strategy for the tangram puzzles. Hb declined steadily for a group which was able to develop a strategy for the tangram puzzles. Hb was steady for a certain group that had already developed a strategy before solving the problems. Our experiments showed that the brain data from NIRS enables the visualization of children's mathematical solution processes.

10. Mathematics Education as a Proving-Ground for Information-Processing Theories.

ERIC Educational Resources Information Center

Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

1990-01-01

Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

11. Investigating the Representational Fluency of Pre-Service Mathematics Teachers in a Modelling Process

ERIC Educational Resources Information Center

Delice, Ali; Kertil, Mahmut

2015-01-01

This article reports the results of a study that investigated pre-service mathematics teachers' modelling processes in terms of representational fluency in a modelling activity related to a cassette player. A qualitative approach was used in the data collection process. Students' individual and group written responses to the mathematical modelling…

12. Mathematical Modeling of a Solar Arrays Deploying Process at Ground Tests

Tomilin, A.; Shpyakin, I.

2016-04-01

This paper focuses on the creation of a mathematical model of the solar array deploying process during ground tests. The Lagrange equation was used to obtain the model. The distinctive feature of this mathematical model is the possibility of taking into account the influence of the gravity compensation system on the structure during deployment, as well as the aerodynamic resistance during ground tests.

13. Prospective Elementary Mathematics Teachers' Thought Processes on a Model Eliciting Activity

ERIC Educational Resources Information Center

Eraslan, Ali

2012-01-01

Mathematical model and modeling are one of the topics that have been intensively discussed in recent years. The purpose of this study is to examine prospective elementary mathematics teachers' thought processes on a model eliciting activity and reveal difficulties or blockages in the processes. The study includes forty-five seniors taking the…

14. Image enhancement based on gamma map processing

Tseng, Chen-Yu; Wang, Sheng-Jyh; Chen, Yi-An

2010-05-01

This paper proposes a novel image enhancement technique based on Gamma Map Processing (GMP). In this approach, a base gamma map is directly generated according to the intensity image. After that, a sequence of gamma map processing steps is performed to generate a channel-wise gamma map. By mapping through the estimated gamma maps, the details, colorfulness, and sharpness of the original image are automatically improved. In addition, the dynamic range of the images can be virtually expanded.
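
As a rough illustration of the idea (a toy stand-in, not the authors' GMP pipeline), a per-pixel gamma map can be derived from intensity so that dark regions are brightened and bright regions compressed; the gamma range is an assumed choice:

```python
import numpy as np

def gamma_map_enhance(intensity, g_low=0.6, g_high=1.6):
    """Toy gamma-map enhancement: each pixel's gamma is interpolated
    from its own intensity in [0, 1], then applied pointwise, so the
    correction adapts locally instead of using one global gamma."""
    gamma = g_low + (g_high - g_low) * intensity   # per-pixel gamma map
    return np.power(intensity, gamma)
```

Dark pixels receive gamma < 1 (a brightening power law) while bright pixels receive gamma > 1, which is the adaptive behavior a single global gamma cannot provide.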

15. Cluster-based parallel image processing toolkit

Squyres, Jeffery M.; Lumsdaine, Andrew; Stevenson, Robert L.

1995-03-01

Many image processing tasks exhibit a high degree of data locality and parallelism and map quite readily to specialized massively parallel computing hardware. However, as network technologies continue to mature, workstation clusters are becoming a viable and economical parallel computing resource, so it is important to understand how to use these environments for parallel image processing as well. In this paper we discuss our implementation of a parallel image processing software library (the Parallel Image Processing Toolkit). The Toolkit uses a message-passing model of parallelism designed around the Message Passing Interface (MPI) standard. Experimental results are presented to demonstrate the parallel speedup obtained with the Parallel Image Processing Toolkit in a typical workstation cluster over a wide variety of image processing tasks. We also discuss load balancing and the potential for parallelizing portions of image processing tasks that seem to be inherently sequential, such as visualization and data I/O.
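
The Toolkit's data decomposition can be illustrated without MPI itself: split the image into contiguous row blocks, let each "worker" apply the operator, and gather results in rank order. In a real MPI program, Scatter/Gather calls would replace the plain list used in this sketch:

```python
import numpy as np

def split_rows(image, workers):
    """Partition an image into contiguous row blocks, one per worker,
    the decomposition a message-passing toolkit would scatter."""
    return np.array_split(image, workers, axis=0)

def process_and_gather(blocks, op):
    # each worker applies op to its block; results are gathered in order
    return np.vstack([op(b) for b in blocks])
```

Pointwise operators parallelize trivially this way; neighborhood operators additionally need halo rows exchanged at block boundaries, which is where load balancing and communication cost enter.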

16. Mathematical modelling of the steam explosion treatment process for pre-impregnated lignocellulosic material

Prosvirnikov, D. B.; Ziatdinova, D. F.; Timerbaev, N. F.; Saldaev, V. A.; Gilfanov, K. H.

2016-04-01

The article analyses the physical picture of the steam explosion treatment of pre-impregnated lignocellulosic material, on the basis of which a mathematical model of the process is built. The model is represented in the form of differential equations with boundary conditions. The resulting mathematical description allows the degree of influence of various factors on the kinetics of the process to be identified and enables a rational selection of operating parameters for the processes considered, in terms of the application tasks at hand.

17. Mathematics for generative processes: Living and non-living systems

2006-05-01

The traditional Differential Calculus often shows its limits when describing living systems. These in fact present such a richness of characteristics that are, in the majority of cases, much wider than the description capabilities of the usual differential equations. Such an aspect became particularly evident during the research (completed in 2001) for an appropriate formulation of Odum's Maximum Em-Power Principle (proposed by the Author as a possible Fourth Thermodynamic Principle). In fact, in such a context, the particular non-conservative Algebra, adopted to account for both Quality and quantity of generative processes, suggested we introduce a faithfully corresponding concept of "derivative" (of both integer and fractional order) to describe dynamic conditions however variable. The new concept not only succeeded in pointing out the corresponding differential bases of all the rules of Emergy Algebra, but also represented the preferential guide in order to recognize the most profound physical nature of the basic processes which mostly characterize self-organizing Systems (co-production, co-injection, inter-action, feed-back, splits, etc.). From a mathematical point of view, the most important novelties introduced by such a new approach are: (i) the derivative of any integer or fractional order can be obtained independently from the evaluation of its lower order derivatives; (ii) the exponential function plays an extremely hinge role, much more marked than in the case of traditional differential equations; (iii) wide classes of differential equations, traditionally considered as being non-linear, become "intrinsically" linear when reconsidered in terms of "incipient" derivatives; (iv) their corresponding explicit solutions can be given in terms of new classes of functions (such as "binary" and "duet" functions); (v) every solution shows a sort of "persistence of form" when representing the product generated with respect to the agents of the generating process

18. Thinking Process of Pseudo Construction in Mathematics Concepts

ERIC Educational Resources Information Center

Subanji; Nusantara, Toto

2016-01-01

This article aims at studying pseudo construction of student thinking in mathematical concepts, integer number operation, algebraic forms, area concepts, and triangle concepts. 391 junior high school students from four districts of East Java Province Indonesia were taken as the subjects. Data were collected by means of distributing the main…

19. Combining image-processing and image compression schemes

NASA Technical Reports Server (NTRS)

Greenspan, H.; Lee, M.-C.

1995-01-01

An investigation into the combining of image-processing schemes, specifically an image enhancement scheme, with existing compression schemes is discussed. Results are presented on the pyramid coding scheme, the subband coding scheme, and progressive transmission. Encouraging results are demonstrated for the combination of image enhancement and pyramid image coding schemes, especially at low bit rates. Adding the enhancement scheme to progressive image transmission allows enhanced visual perception at low resolutions. In addition, further processing of the transmitted images, such as edge detection schemes, can gain from the added image resolution via the enhancement.

20. Applications Of Image Processing In Criminalistics

Krile, Thomas F.; Walkup, John F.; Barsallo, Adonis; Olimb, Hal; Tarng, Jaw-Horng

1987-01-01

A review of some basic image processing techniques for enhancement and restoration of images is given. Both digital and optical approaches are discussed. Fingerprint images are used as examples to illustrate the various processing techniques and their potential applications in criminalistics.

1. Intentional and automatic numerical processing as predictors of mathematical abilities in primary school children

PubMed Central

Pina, Violeta; Castillo, Alejandro; Cohen Kadosh, Roi; Fuentes, Luis J.

2015-01-01

Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such a relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1–6. Participants were tested in an ample range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size) was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by a small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing in mathematical skills, but only when inhibitory control is also involved. PMID:25873909

2. Some aspects of mathematical and chemical modeling of complex chemical processes

NASA Technical Reports Server (NTRS)

Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.

1983-01-01

Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical process are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.

3. Secondary School Students' Understanding of Mathematical Induction: Structural Characteristics and the Process of Proof Construction

ERIC Educational Resources Information Center

Palla, Marina; Potari, Despina; Spyrou, Panagiotis

2012-01-01

In this study, we investigate the meaning students attribute to the structure of mathematical induction (MI) and the process of proof construction using mathematical induction in the context of a geometric recursion problem. Two hundred and thirteen 17-year-old students of an upper secondary school in Greece participated in the study. Students'…

4. A Mathematical Experience Involving Defining Processes: In-Action Definitions and Zero-Definitions

ERIC Educational Resources Information Center

Ouvrier-Buffet, Cecile

2011-01-01

In this paper, a focus is made on defining processes at stake in an unfamiliar situation coming from discrete mathematics which brings surprising mathematical results. The epistemological framework of Lakatos is questioned and used for the design and the analysis of the situation. The cognitive background of Vergnaud's approach enriches the study…

5. Speed of Information Processing in Generally Gifted and Excelling-in-Mathematics Adolescents

ERIC Educational Resources Information Center

Paz-Baruch, N.; Leikin, M.; Aharon-Peretz, J.; Leikin, R.

2014-01-01

A considerable amount of recent evidence suggests that speed of information processing (SIP) may be related to general giftedness as well as contributing to higher mathematical ability. To date, no study has examined SIP associated with both general giftedness (G) and excellence in mathematics (EM). This paper presents a part of more extensive…

6. Examining Prospective Mathematics Teachers' Proof Processes for Algebraic Concepts

ERIC Educational Resources Information Center

Güler, Gürsel; Dikici, Ramazan

2014-01-01

The aim of this study was to examine prospective mathematics teachers' proof processes for algebraic concepts. The study was conducted with 10 prospective teachers who were studying at the department of secondary mathematics teaching and who volunteered to participate in the study. The data were obtained via task-based clinical interviews…

7. Mathematical modeling of sedimentation process of nanoparticles in gradient medium

Ezhenkova, S. I.; Chivilikhin, S. A.

2015-11-01

A mathematical model has been built describing the motion of a light ray in a medium with a varying index of refraction formed by particles settling in a liquid. We obtained the size distribution of the particles settling in the liquid, calculated the light ray's trajectory in the medium, investigated the dependence of the trajectory on the initial particle concentration, and solved the equation of convective diffusion for the nanoparticles.

8. Spatial Mapping of Translational Diffusion Coefficients Using Diffusion Tensor Imaging: A Mathematical Description

PubMed Central

SHETTY, ANIL N.; CHIANG, SHARON; MALETIC-SAVATIC, MIRJANA; KASPRIAN, GREGOR; VANNUCCI, MARINA; LEE, WESLEY

2016-01-01

In this article, we discuss the theoretical background for diffusion weighted imaging and diffusion tensor imaging. Molecular diffusion is a random process involving thermal Brownian motion. In biological tissues, the underlying microstructures restrict the diffusion of water molecules, making diffusion directionally dependent. Water diffusion in tissue is mathematically characterized by the diffusion tensor, the elements of which contain information about the magnitude and direction of diffusion and is a function of the coordinate system. Thus, it is possible to generate contrast in tissue based primarily on diffusion effects. Expressing diffusion in terms of the measured diffusion coefficient (eigenvalue) in any one direction can lead to errors. Nowhere is this more evident than in white matter, due to the preferential orientation of myelin fibers. The directional dependency is removed by diagonalization of the diffusion tensor, which then yields a set of three eigenvalues and eigenvectors, representing the magnitude and direction of the three orthogonal axes of the diffusion ellipsoid, respectively. For example, the eigenvalue corresponding to the eigenvector along the long axis of the fiber corresponds qualitatively to diffusion with least restriction. Determination of the principal values of the diffusion tensor and various anisotropic indices provides structural information. We review the use of diffusion measurements using the modified Stejskal–Tanner diffusion equation. The anisotropy is analyzed by decomposing the diffusion tensor based on symmetrical properties describing the geometry of diffusion tensor. We further describe diffusion tensor properties in visualizing fiber tract organization of the human brain.
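
The diagonalization step and two of the standard rotation-invariant indices it yields can be sketched directly; the formulas below are the usual mean diffusivity (MD) and fractional anisotropy (FA) definitions:

```python
import numpy as np

def tensor_metrics(D):
    """Eigen-decompose a symmetric 3x3 diffusion tensor and return its
    eigenvalues plus two rotation-invariant indices:
    MD = mean(eigenvalues);  FA = sqrt(3/2 * sum((l - MD)^2) / sum(l^2))."""
    ev = np.linalg.eigvalsh(D)          # sorted eigenvalues, coordinate-free
    md = ev.mean()
    fa = np.sqrt(1.5 * np.sum((ev - md) ** 2) / np.sum(ev ** 2))
    return ev, md, fa
```

An isotropic tensor gives FA = 0, while a tensor dominated by one eigenvalue, as along a coherent myelin fiber, gives FA near 1, which is why FA avoids the directional errors of reading any single measured diffusion coefficient.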

9. Programmable remapper for image processing

NASA Technical Reports Server (NTRS)

Juday, Richard D. (Inventor); Sampsell, Jeffrey B. (Inventor)

1991-01-01

A video-rate coordinate remapper includes a memory for storing a plurality of transformations on look-up tables for remapping input images from one coordinate system to another. Such transformations are operator selectable. The remapper includes a collective processor by which certain input pixels of an input image are transformed to a portion of the output image in a many-to-one relationship. The remapper includes an interpolative processor by which the remaining input pixels of the input image are transformed to another portion of the output image in a one-to-many relationship. The invention includes certain specific transforms for creating output images useful for compensating certain visual defects of visually impaired people. The invention also includes means for shifting input pixels and means for scrolling the output matrix.
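The look-up-table idea at the core of the patent can be sketched in a few lines. This is a hypothetical pure-Python illustration, not the patented hardware design: each output pixel stores the (row, col) of the input pixel it samples, so an operator-selectable transform is simply a different table.

```python
def build_flip_lut(width, height):
    # One LUT entry per output pixel: the (row, col) of the input pixel to sample.
    # Here the transform is a horizontal mirror; any coordinate mapping works.
    return {(y, x): (y, width - 1 - x) for y in range(height) for x in range(width)}

def remap(image, lut):
    # Apply a precomputed coordinate transform: pure table lookups, no arithmetic,
    # which is what makes video-rate operation feasible in hardware.
    height, width = len(image), len(image[0])
    return [[image[lut[(y, x)][0]][lut[(y, x)][1]] for x in range(width)]
            for y in range(height)]

img = [[1, 2, 3],
       [4, 5, 6]]
print(remap(img, build_flip_lut(3, 2)))
```

Swapping in a different `build_*_lut` function changes the geometry without touching the inner loop, mirroring the operator-selectable transforms in the remapper.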

10. Uses and Values of Prototypic Visual Images in High School Mathematics.

ERIC Educational Resources Information Center

Presmeg, Norma C.

Imagery use in high school mathematics classrooms was studied. A visual image was defined as a mental scheme depicting visual or spatial information, but this definition was not spelled out to teachers or students, in order to learn what they meant by the concept. Subjects were 13 high school teachers and 54 of their students interviewed over 3…

11. New method of contour image processing based on the formalism of spiral light beams

SciTech Connect

Volostnikov, Vladimir G; Kishkin, S A; Kotova, S P

2013-07-31

The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented. (optical image processing)

12. Handbook on COMTAL's Image Processing System

NASA Technical Reports Server (NTRS)

Faulcon, N. D.

1983-01-01

An image processing system is the combination of an image processor with other control and display devices plus the necessary software needed to produce an interactive capability to analyze and enhance image data. Such an image processing system installed at NASA Langley Research Center, Instrument Research Division, Acoustics and Vibration Instrumentation Section (AVIS) is described. Although much of the information contained herein can be found in the other references, it is hoped that this single handbook will give the user better access, in concise form, to pertinent information and usage of the image processing system.

13. System-theoretical approach to multistage image processing

Grudin, Maxim A.; Timchenko, Leonid I.; Harvey, David M.; Gel, Vladimir P.

1996-08-01

We present a novel three-dimensional network and its application to pattern analysis. This is a multistage architecture which investigates partial correlations between structural image components. A mathematical description of the multistage hierarchical processing is provided together with the network architecture. Initially, the image is partitioned to be processed in parallel channels. In each channel, the structural components are transformed and subsequently separated depending on their informational activity, to be mixed with the components from other channels for further processing. This procedure of temporal decomposition creates a flexible processing hierarchy, which reflects structural image complexity. The output result is represented as a pattern vector, whose components are computed one at a time to allow the quickest possible response. While several applications of the multistage network are possible, this paper presents an algorithm applied to image classification. The input gray-scale image is transformed so that each pixel contains information about the spatial structure of its neighborhood. A three-level representation of the gray-scale image is used in order for each pixel to contain the maximum amount of structural information. The investigation of spatial regularities at all hierarchical levels provides a unified approach to pattern analysis. The most correlated information is extracted first, making the algorithm tolerant to minor structural changes.

14. Sequential Processes In Image Generation.

ERIC Educational Resources Information Center

Kosslyn, Stephen M.; And Others

1988-01-01

Results of three experiments are reported, which indicate that images of simple two-dimensional patterns are formed sequentially. The subjects included 48 undergraduates and 16 members of the Harvard University (Cambridge, Mass.) community. A new objective methodology indicates that images of complex letters require more time to generate. (TJH)

15. Image processing on the IBM personal computer

NASA Technical Reports Server (NTRS)

Myers, H. J.; Bernstein, R.

1985-01-01

An experimental, personal computer image processing system has been developed which provides a variety of processing functions in an environment that connects programs by means of a 'menu' for both casual and experienced users. The system is implemented by a compiled BASIC program that is coupled to assembly language subroutines. Image processing functions encompass subimage extraction, image coloring, area classification, histogramming, contrast enhancement, filtering, and pixel extraction.
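Two of the listed functions, histogramming and contrast enhancement, reduce to a few lines each. The sketch below is my own minimal version (not the compiled-BASIC/assembly implementation described), using a linear min-max stretch for 8-bit gray levels.

```python
def histogram(pixels, levels=256):
    # Count occurrences of each gray level in a flat pixel list.
    counts = [0] * levels
    for p in pixels:
        counts[p] += 1
    return counts

def contrast_stretch(pixels, out_max=255):
    # Linear min-max stretch: map the occupied range [lo, hi] onto [0, out_max].
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [0] * len(pixels)  # flat image: nothing to stretch
    return [(p - lo) * out_max // (hi - lo) for p in pixels]

print(contrast_stretch([50, 100, 150]))
print(histogram([0, 0, 255])[0])
```

The stretch uses integer arithmetic throughout, in the spirit of the 1985-era hardware the record describes.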

16. Semi-automated Image Processing for Preclinical Bioluminescent Imaging

PubMed Central

Slavine, Nikolai V; McColl, Roderick W

2015-01-01

Objective Bioluminescent imaging is a valuable noninvasive technique for investigating tumor dynamics and specific biological molecular events in living animals to better understand the effects of human disease in animal models. The purpose of this study was to develop and test a strategy behind automated methods for bioluminescence image processing from data acquisition to obtaining 3D images. Methods In order to optimize this procedure, a semi-automated image processing approach with a multi-modality image handling environment was developed. To identify a bioluminescent source location and strength, we used the light flux detected on the surface of the imaged object by CCD cameras. For phantom calibration tests and object surface reconstruction we used the MLEM algorithm. For internal bioluminescent sources, we used the diffusion approximation, balancing the internal and external intensities on the boundary of the medium, to determine an initial-order approximation for the photon fluence; we subsequently applied a novel iterative deconvolution method to obtain the final reconstruction result. Results We find that the reconstruction techniques successfully used the depth-dependent light transport approach and semi-automated image processing to provide a realistic 3D model of the lung tumor. Our image processing software can optimize and decrease the time of volumetric imaging and quantitative assessment. Conclusion The data obtained from light phantom and mouse lung tumor images demonstrate the utility of the image reconstruction algorithms and the semi-automated approach to the bioluminescent image processing procedure. We suggest that the developed image processing approach can be applied to preclinical imaging studies: characterizing tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment. PMID:26618187

17. Image processing applied to laser cladding process

SciTech Connect

Meriaudeau, F.; Truchetet, F.

1996-12-31

The laser cladding process, which consists of adding a melted powder to a substrate in order to improve or change the behavior of the material against corrosion, fatigue and so on, involves many parameters. In order to produce good tracks, some parameters need to be controlled during the process. The authors present here a low-cost, high-performance system using two CCD matrix cameras. One camera provides surface temperature measurements while the other gives information related to the powder distribution or geometric characteristics of the tracks. The surface temperature (via the Beer-Lambert law) enables one to detect variations in the mass feed rate. Using such a system the authors are able to detect fluctuations of 2 to 3 g/min in the mass flow rate. The other camera gives them information related to the powder distribution; a simple algorithm applied to the data acquired from the CCD matrix camera allows them to see very weak fluctuations in both gas flows (carrier or shielding gas). During the process, this camera is also used to perform geometric measurements. The height and the width of the track are obtained in real time and enable the operator to find information related to process parameters such as the processing speed and the mass flow rate. The authors display the results provided by their system in order to enhance the efficiency of the laser cladding process. The conclusion is dedicated to a summary of the presented work and expectations for the future.

18. Possible Effects of English-Chinese Language Differences on the Processing of Mathematical Text: A Review.

ERIC Educational Resources Information Center

Galligan, Linda

2001-01-01

Reviews the differences between the English and Mandarin Chinese language, evaluates current research, discusses the possible consequences for processing mathematical text in both languages, and outlines future research possibilities. (Author/MM)

19. Computers in Public Schools: Changing the Image with Image Processing.

ERIC Educational Resources Information Center

Raphael, Jacqueline; Greenberg, Richard

1995-01-01

The kinds of educational technologies selected can make the difference between uninspired, rote computer use and challenging learning experiences. University of Arizona's Image Processing for Teaching Project has worked with over 1,000 teachers to develop image-processing techniques that provide students with exciting, open-ended opportunities for…

20. Image Processing in Intravascular OCT

Wang, Zhao; Wilson, David L.; Bezerra, Hiram G.; Rollins, Andrew M.

Coronary artery disease is the leading cause of death in the world. Intravascular optical coherence tomography (IVOCT) is rapidly becoming a promising imaging modality for characterization of atherosclerotic plaques and evaluation of coronary stenting. OCT has several unique advantages over alternative technologies, such as intravascular ultrasound (IVUS), due to its better resolution and contrast. For example, OCT is currently the only imaging modality that can measure the thickness of the fibrous cap of an atherosclerotic plaque in vivo. OCT also has the ability to accurately assess the coverage of individual stent struts by neointimal tissue over time. However, it is extremely time-consuming to analyze IVOCT images manually to derive quantitative diagnostic metrics. In this chapter, we introduce some computer-aided methods to automate the common IVOCT image analysis tasks.

1. A mathematical model of neuro-fuzzy approximation in image classification

Gopalan, Sasi; Pinto, Linu; Sheela, C.; Arun Kumar M., N.

2016-06-01

Image digitization and the explosion of the World Wide Web have made traditional search an inefficient method for retrieving required grassland image data from large databases. For a given input query image, a Content-Based Image Retrieval (CBIR) system retrieves the similar images from a large database. Advances in technology have increased the use of grassland image data in diverse areas such as agriculture, art galleries, education, and industry. In all the above-mentioned areas it is necessary to retrieve grassland image data efficiently from a large database in order to perform an assigned task and make a suitable decision. A CBIR system based on grassland image properties, aided by a feed-forward back-propagation neural network for effective image retrieval, is proposed in this paper. Fuzzy memberships play an important role in the input space of the proposed system, which leads to a combined neuro-fuzzy approximation in image classification. The mathematical model of the proposed CBIR system gives more clarity about the fuzzy-neuro approximation and the convergence of the image features in a grassland image.

2. Combining advanced imaging processing and low cost remote imaging capabilities

Rohrer, Matthew J.; McQuiddy, Brian

2008-04-01

Target images are very important for evaluating the situation when Unattended Ground Sensors (UGS) are deployed. These images add a significant amount of information to determine the difference between hostile and non-hostile activities, the number of targets in an area, the difference between animals and people, the movement dynamics of targets, and when specific activities of interest are taking place. The imaging capabilities of UGS systems need to provide images only of target activity, not images without targets in the field of view. The current UGS remote imaging systems are neither optimized for target processing nor low cost. McQ describes in this paper an architectural and technological approach for significantly improving the processing of images to provide target information while reducing the cost of the intelligent remote imaging capability.

3. Matching rendered and real world images by digital image processing

Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

2010-05-01

Recent advances in computer-generated imagery (CGI) have been used in commercial and industrial photography, providing a broad scope in product advertising. Mixing real-world images with those rendered from virtual-space software shows a more or less visible mismatch between the corresponding image quality performance. Rendered images are produced by software whose quality is limited only by the output resolution. Real-world images are taken with cameras subject to image degradation factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color-pattern demosaicing, etc. The effect of all those image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object with the system PSF, its characterization shows the amount of image degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real-world image qualities. The system MTF is determined by the slanted-edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different final image regions.
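The degradation step, filtering the rendered image with a Gaussian derived from the measured PSF, can be sketched as below. The kernel construction is standard; the sigma value would come from the slanted-edge MTF measurement, and the value used here is purely illustrative.

```python
import math

def gaussian_kernel_1d(sigma, radius=None):
    # Separable 1-D Gaussian, normalized to unit sum so mean brightness is preserved.
    if radius is None:
        radius = max(1, int(3 * sigma))  # cover ~3 sigma on each side
    taps = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(taps)
    return [t / s for t in taps]

def blur_row(row, kernel):
    # 1-D convolution with edge clamping; applying it along rows and then columns
    # gives the separable 2-D Gaussian blur used to degrade the rendered image.
    radius = len(kernel) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for k, w in enumerate(kernel):
            xi = min(max(x + k - radius, 0), len(row) - 1)
            acc += w * row[xi]
        out.append(acc)
    return out

kernel = gaussian_kernel_1d(sigma=1.2)  # sigma from the measured PSF (illustrative)
print(round(sum(kernel), 6))
print(blur_row([0, 0, 255, 0, 0], kernel))
```

Blurring a single bright pixel spreads its energy symmetrically, which is exactly the "softening" a real lens and sensor impose on an otherwise perfect render.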

4. Programmable Iterative Optical Image And Data Processing

NASA Technical Reports Server (NTRS)

Jackson, Deborah J.

1995-01-01

Proposed method of iterative optical image and data processing overcomes limitations imposed by loss of optical power after repeated passes through many optical elements - especially, beam splitters. Involves selective, timed combination of optical wavefront phase conjugation and amplification to regenerate images in real time to compensate for losses in optical iteration loops; timing such that amplification turned on to regenerate desired image, then turned off so as not to regenerate other, undesired images or spurious light propagating through loops from unwanted reflections.

5. Utilizing image processing techniques to compute herbivory.

PubMed

Olson, T E; Barlow, V M

2001-01-01

Leafy spurge (Euphorbia esula L. sensu lato) is a perennial weed species common to the north-central United States and southern Canada. The plant is a foreign species toxic to cattle. Spurge infestation can reduce cattle carrying capacity by 50 to 75 percent [1]. University of Wyoming Entomology doctoral candidate Vonny Barlow is conducting research in the area of biological control of leafy spurge via the Aphthona nigriscutis Foudras flea beetle. He is addressing the question of variability within leafy spurge and its potential impact on flea beetle herbivory. One component of Barlow's research consists of measuring the herbivory of leafy spurge plant specimens after introducing adult beetles. Herbivory is the degree of consumption of the plant's leaves and was measured in two different ways. First, Barlow assigned each consumed plant specimen a visual rank from 1 to 5. Second, image processing techniques were applied to "before" and "after" images of each plant specimen in an attempt to quantify herbivory more accurately. Standardized techniques were used to acquire images before and after beetles were allowed to feed on plants for a period of 12 days. MATLAB was used as the image processing tool. The image processing algorithm allowed the user to crop the portion of the "before" image containing only plant foliage. Then MATLAB cropped the "after" image with the same dimensions and converted both images from RGB to grayscale. Each grayscale image was converted to binary based on a user-defined threshold value. Finally, herbivory was computed from the number of black pixels in the "before" and "after" images. The image processing results were mixed. Although this image processing technique depends on user input and non-ideal images, the data are useful to Barlow's research and offer insight into better imaging systems and processing algorithms. PMID:11347423
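The pixel-counting step of the algorithm described above reduces to the following sketch. This is my own reconstruction, not Barlow's MATLAB code; the threshold and the tiny example images are illustrative. Dark pixels after binarization are taken as foliage, and herbivory is the fraction of foliage pixels lost.

```python
def foliage_pixels(gray_image, threshold):
    # Binarize: pixels darker than the user-defined threshold count as plant foliage.
    return sum(1 for row in gray_image for p in row if p < threshold)

def herbivory(before, after, threshold=128):
    # Fraction of foliage area consumed between the "before" and "after" images.
    fb = foliage_pixels(before, threshold)
    fa = foliage_pixels(after, threshold)
    return 0.0 if fb == 0 else 1.0 - fa / fb

before = [[40, 40, 40, 200],
          [40, 40, 40, 200]]   # 6 dark (foliage) pixels
after  = [[40, 200, 200, 200],
          [40, 40, 200, 200]]  # 3 dark pixels remain
print(herbivory(before, after))
```

As the abstract notes, the result is only as good as the cropping and the threshold, which is where the user-input dependence comes from.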

6. Non-linear Post Processing Image Enhancement

NASA Technical Reports Server (NTRS)

Hunt, Shawn; Lopez, Alex; Torres, Angel

1997-01-01

A non-linear filter for image post-processing based on the feedforward neural network topology is presented. This study was undertaken to investigate the usefulness of "smart" filters in image post-processing. The filter has been shown to be useful in recovering high frequencies, such as those lost during the JPEG compression-decompression process. The filtered images have a higher signal-to-noise ratio and a higher perceived image quality. Simulation studies comparing the proposed filter with the optimum mean-square non-linear filter, examples of the high-frequency recovery, and the statistical properties of the filter are given.

7. Quantitative image processing in fluid mechanics

NASA Technical Reports Server (NTRS)

Hesselink, Lambertus; Helman, James; Ning, Paul

1992-01-01

The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

8. Blurred Star Image Processing for Star Sensors under Dynamic Conditions

PubMed Central

Zhang, Weina; Quan, Wei; Guo, Lei

2012-01-01

The precision of star point location is significant to identify the star map and to acquire the aircraft attitude for star sensors. Under dynamic conditions, star images are not only corrupted by various noises, but also blurred due to the angular rate of the star sensor. According to different angular rates under dynamic conditions, a novel method is proposed in this article, which includes a denoising method based on adaptive wavelet threshold and a restoration method based on the large angular rate. The adaptive threshold is adopted for denoising the star image when the angular rate is in the dynamic range. Then, the mathematical model of motion blur is deduced so as to restore the blurred star map due to large angular rate. Simulation results validate the effectiveness of the proposed method, which is suitable for blurred star image processing and practical for attitude determination of satellites under dynamic conditions. PMID:22778666
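The denoising stage in the method above relies on wavelet coefficient thresholding. A minimal sketch of soft thresholding, the shrinkage rule commonly paired with adaptive thresholds (this is a generic illustration, not the paper's adaptive-threshold scheme, and the coefficients are made up), is:

```python
def soft_threshold(coeffs, t):
    # Shrink each wavelet detail coefficient toward zero by t;
    # coefficients smaller than t (presumed noise) are zeroed entirely.
    out = []
    for c in coeffs:
        mag = abs(c) - t
        out.append(0.0 if mag <= 0 else (mag if c > 0 else -mag))
    return out

# Large coefficients (star signal) survive shrunken; small ones (noise) vanish.
print(soft_threshold([5.0, -0.3, 0.8, -4.0], t=1.0))
```

An adaptive variant, as in the paper, would choose `t` per subband from the estimated noise level rather than using a single fixed value.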

9. Anthropological methods of optical image processing

Ginzburg, V. M.

1981-12-01

Some applications of the new method for optical image processing considered in a previous paper, based on prior separation of informative elements (IE) by means of a defocusing equal to the average defocusing of the eye, are described. A diagram of a "drawing" robot making use of defocusing and other mechanisms of the human visual system (VS) is given. Methods of narrowing the TV channel bandwidth and eliminating noise in computer image processing by prior image defocusing are described.

10. Water surface capturing by image processing

Technology Transfer Automated Retrieval System (TEKTRAN)

An alternative means of measuring the water surface interface during laboratory experiments is processing a series of sequentially captured images. Image processing can provide a continuous, non-intrusive record of the water surface profile whose accuracy is not dependent on water depth. More trad...

11. The mathematical model reduces the effect of distance to the scatter images gray level

Sun, Li-na

2013-09-01

In an x-ray scanning system, scatter images are obtained to provide information on material density. The forward and backward scatter is related to the solid angle; scatter is therefore dependent on the distance of the scanned object from the x-ray source. In the real world, an object may be placed anywhere on the conveyor belt, so the measured intensity will contain errors relative to the ideal intensity, which makes classification results less reliable. Extracting the characteristic value L associated with density requires knowing the gray levels of the scatter images, so determining the scatter-image gray level from the forward-scatter and backscatter images is the first problem to solve. The author combined the forward-scatter and backscatter images and established a higher-order gray-level mathematical model of the scatter images, in order to eliminate the impact of distance on the scatter images and obtain a more accurate gray level. The errors of the LMS and LS algorithms for solving the mathematical model parameters were then compared; the LS algorithm ultimately proved to have less error, and experiments validated its superiority.
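The model-fitting step can be illustrated with an ordinary least-squares fit of a second-order (quadratic) gray-level model via the normal equations. This is a generic LS sketch, not the article's exact model, and the data points are synthetic.

```python
def fit_quadratic_ls(xs, ys):
    # Least-squares fit of y = a + b*x + c*x^2 via the normal equations (A^T A) p = A^T y.
    sx = [sum(x ** k for x in xs) for k in range(5)]                 # sums of x^0..x^4
    sy = [sum((x ** k) * y for x, y in zip(xs, ys)) for k in range(3)]  # sums of x^k * y
    m = [[sx[0], sx[1], sx[2], sy[0]],
         [sx[1], sx[2], sx[3], sy[1]],
         [sx[2], sx[3], sx[4], sy[2]]]
    # Gauss-Jordan elimination with partial pivoting on the 3x4 augmented matrix.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

# Synthetic data generated from y = 2 + 0.5*x + 0.1*x^2; the fit recovers a, b, c.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 + 0.5 * x + 0.1 * x * x for x in xs]
a, b, c = fit_quadratic_ls(xs, ys)
print(round(a, 6), round(b, 6), round(c, 6))
```

On noisy gray-level measurements the same solve yields the minimum-squared-error coefficients, which is the sense in which LS outperformed the iterative LMS update in the article.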

12. Automatic processing, analysis, and recognition of images

Abrukov, Victor S.; Smirnov, Evgeniy V.; Ivanov, Dmitriy G.

2004-11-01

New approaches and computer codes (A&CC) for automatic processing, analysis and recognition of images are offered. The A&CC are based on representing an object image as a collection of pixels of various colours and on successively and automatically painting distinct parts of the image. The A&CC have technical objectives centred on such directions as: 1) image processing, 2) image feature extraction, 3) image analysis, and some others, in any sequence and combination. The A&CC allow various geometrical and statistical parameters of an object image and its parts to be obtained. Additional possibilities of A&CC usage involve artificial neural network technologies. We believe that the A&CC can be used in creating systems of testing and control in various fields of industry and military applications (airborne imaging systems, tracking of moving objects), in medical diagnostics, in creating new software for CCDs, in industrial vision and decision-making systems, etc. The capabilities of the A&CC have been tested on image analysis of model fires, plumes of sprayed fluid, and ensembles of particles, on decoding of interferometric images, on digitization of paper diagrams of electrical signals, on text recognition, on noise elimination and image filtering, on analysis of astronomical images and aerial photography, and on object detection.

13. SUPRIM: easily modified image processing software.

PubMed

Schroeter, J P; Bretaudiere, J P

1996-01-01

A flexible, modular software package intended for the processing of electron microscopy images is presented. The system consists of a set of image processing tools or filters, written in the C programming language, and a command line style user interface based on the UNIX shell. The pipe and filter structure of UNIX and the availability of command files in the form of shell scripts eases the construction of complex image processing procedures from the simpler tools. Implementation of a new image processing algorithm in SUPRIM may often be performed by construction of a new shell script, using already existing tools. Currently, the package has been used for two- and three-dimensional image processing and reconstruction of macromolecules and other structures of biological interest. PMID:8742734
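The pipe-and-filter structure SUPRIM borrows from the UNIX shell can be mimicked in a few lines. This Python sketch is my own illustration (not SUPRIM code): each filter is a small independent tool, and a "procedure" is just a chain of them, the way a shell script chains programs with pipes.

```python
def invert(img):
    # Filter 1: invert 8-bit gray levels.
    return [[255 - p for p in row] for row in img]

def threshold(img, t=128):
    # Filter 2: binarize at threshold t.
    return [[255 if p >= t else 0 for p in row] for row in img]

def pipeline(img, *filters):
    # Compose filters left to right, like "tool1 | tool2 | ..." in a shell script.
    for f in filters:
        img = f(img)
    return img

print(pipeline([[0, 200], [90, 255]], invert, threshold))
```

Adding a new processing step means writing one more small function and inserting it into the chain, which is the extensibility argument the abstract makes for shell scripts over monolithic programs.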

14. Student Understanding Of The Physics And Mathematics Of Process Variables In P-V Diagrams

Pollock, Evan B.; Thompson, John R.; Mountcastle, Donald B.

2007-11-01

Students in an upper-level thermal physics course were asked to compare quantities related to the First Law of Thermodynamics along with similar mathematical questions devoid of all physical context. We report on a comparison of student responses to physics questions involving interpretation of ideal gas processes on P-V diagrams and to analogous mathematical qualitative questions about the signs of and comparisons between the magnitudes of various integrals. Student performance on individual questions combined with performance on the paired questions shows evidence of isolated understanding of physics and mathematics. Some difficulties are addressed by instruction.

15. Parallel-Processing Software for Creating Mosaic Images

NASA Technical Reports Server (NTRS)

Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

2008-01-01

A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
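The slice-per-CPU scheme described above can be sketched with a thread pool as a simplified stand-in for the multi-CPU implementation. Here `warp_slice` is a trivial placeholder for the real camera-model warp, and the one-row-per-slice partition is illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def warp_slice(args):
    # Placeholder for the real per-slice warp: each worker fills its rows of the mosaic.
    slice_id, rows = args
    return slice_id, [[p + 1 for p in row] for row in rows]  # trivial "warp"

def build_mosaic(image_rows, n_workers=4):
    # Divide the mosaic into contiguous row slices, process them concurrently,
    # then gather the results back into their original positions.
    slices = [(i, [image_rows[i]]) for i in range(len(image_rows))]
    mosaic = [None] * len(image_rows)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for slice_id, rows in pool.map(warp_slice, slices):
            mosaic[slice_id] = rows[0]
    return mosaic

print(build_mosaic([[0, 0], [1, 1], [2, 2]]))
```

Because the slices are independent, throughput scales with the number of workers until gathering becomes the bottleneck, matching the abstract's note that mosaic time depends on CPU count and speed.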

16. Image processing for cameras with fiber bundle image relay.

PubMed

Olivas, Stephen J; Arianpour, Ashkan; Stamenov, Igor; Morrison, Rick; Stack, Ron A; Johnson, Adam R; Agurok, Ilya P; Ford, Joseph E

2015-02-10

Some high-performance imaging systems generate a curved focal surface and so are incompatible with focal plane arrays fabricated by conventional silicon processing. One example is a monocentric lens, which forms a wide field-of-view high-resolution spherical image with a radius equal to the focal length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems suffer from artifacts due to image sampling and incoherent light transfer by the fiber bundle as well as resampling by the focal plane, resulting in a fixed obscuration pattern. Here, we describe digital image processing techniques to improve image quality in a compact 126° field-of-view, 30 megapixel panoramic imager, where a 12 mm focal length F/1.35 lens made of concentric glass surfaces forms a spherical image surface, which is fiber-coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image formation onto the 2.5 μm pitch fiber bundle, image transfer by the fiber bundle, and sensing by a 1.75 μm pitch backside illuminated color focal plane. We demonstrate methods to mitigate moiré artifacts and local obscuration, correct for sphere to plane mapping distortion and vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with a 10× larger commercial camera with comparable field-of-view and light collection. PMID:25968031

17. ESO C Library for an Image Processing Software Environment (eclipse)

Devillard, N.

Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems. Running on all Unix-like platforms, eclipse is portable. A high-level interface to Python is foreseen that would allow programmers to prototype their applications much faster than through C programs.

18. Eclipse: ESO C Library for an Image Processing Software Environment

Devillard, Nicolas

2011-12-01

Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.

19. [Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].

PubMed

Shinohara, Hiroyuki; Hashimoto, Takeyuki

2015-01-01

We developed a text-data based learning tool that integrates image processing and displaying using Excel. The knowledge required for programming this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 pixels or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those obtained using the C language, and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images shown by binary-data display software. This comparison indicated that the image quality of the Excel worksheets was nearly equal to the latter in visual impression. Since image processing is performed on text data, the process is visible and can be checked against the mathematical equations within the program. We concluded that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing. PMID:27125125
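The tool's spatial filtering amounts to a formula such as =AVERAGE(B2:D4) copied across a worksheet of pixel values. A sketch of the same 3x3 mean filter in conventional code (our illustration, not the tool itself):

```python
import numpy as np

def mean_filter_3x3(img):
    """3x3 mean filter: the computation an Excel formula like
    =AVERAGE(B2:D4), filled across the worksheet, performs per cell.
    Border cells without a full 3x3 neighborhood are dropped."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            out += img[dy:dy + h - 2, dx:dx + w - 2]
    return out / 9.0

img = np.arange(25, dtype=float).reshape(5, 5)
smooth = mean_filter_3x3(img)
```

Because each output cell is an explicit formula over named neighbors, the worksheet version makes the relative-reference pattern of the filter directly visible, which is the pedagogical point of the tool.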

20. CT Image Processing Using Public Digital Networks

PubMed Central

Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.

1984-01-01

Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.

1. Image processing for drawing recognition

Feyzkhanov, Rustem; Zhelavskaya, Irina

2014-03-01

The task of recognizing the edges of rectangular structures is well known. Still, almost all existing approaches work with static images and place no limit on processing time. We propose applying homography estimation to a video stream obtained from a webcam, and we present an algorithm suited to this setting. One of the main use cases of such an application is recognizing drawings made by a person on a piece of paper in front of the webcam.
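The abstract does not specify how the homography is computed; a standard choice is the direct linear transform (DLT) from four or more point correspondences between the paper and the camera frame. A minimal sketch under that assumption:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src from >= 4
    point correspondences via the direct linear transform (DLT).
    A textbook sketch, not the authors' implementation."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    # H is the null vector of A: right singular vector of the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Sanity check: a pure translation by (2, 3).
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(x + 2, y + 3) for x, y in src]
H = homography_dlt(src, dst)
```

For a live stream, one would re-estimate H per frame (or track it incrementally) against the reference rectangle of the paper sheet.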

2. Parallel digital signal processing architectures for image processing

Kshirsagar, Shirish P.; Hartley, David A.; Harvey, David M.; Hobson, Clifford A.

1994-10-01

This paper describes research into a high-speed image processing system that uses parallel digital signal processors for the processing of electro-optic images. The objective of the system is to reduce the processing time of non-contact inspection problems, including industrial and medical applications. A single processor cannot deliver the processing power these applications require; hence, a MIMD system is designed and constructed to enable fast processing of electro-optic images. The Texas Instruments TMS320C40 digital signal processor is used for its high-speed floating-point CPU and its support for parallel processing. A custom-designed VISION bus is provided to transfer images between processors. The system is being applied to solder joint inspection of high-technology printed circuit boards.
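The MIMD idea, independent processors each handling part of the image, can be sketched in software with a worker pool. This is only an analogue of the DSP array (the strip partitioning and the trivial point operation are our assumptions, not details from the paper):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def invert_strip(strip):
    """Per-strip kernel: a trivial point operation standing in for a
    real inspection filter running on one DSP node."""
    return 255 - strip

def process_parallel(img, n_workers=4):
    """Split the image into horizontal strips, process them on a pool
    of workers (the software analogue of a MIMD DSP array), and
    reassemble the result."""
    strips = np.array_split(img, n_workers, axis=0)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        done = list(pool.map(invert_strip, strips))
    return np.vstack(done)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
out = process_parallel(img)
```

In the real system the strip transfers would go over the VISION bus rather than shared memory, and load balancing across nodes becomes the main design concern.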

3. Stable image acquisition for mobile image processing applications

Henning, Kai-Fabian; Fritze, Alexander; Gillich, Eugen; Mönks, Uwe; Lohweg, Volker

2015-02-01

Today, mobile devices (smartphones, tablets, etc.) are widespread and of high importance for their users. Their performance as well as their versatility increases over time. This creates the opportunity to use such devices for more specific tasks, like image processing in an industrial context. For the analysis of images, requirements such as image quality (blur, illumination, etc.) as well as a defined relative position of the object to be inspected are crucial. Since mobile devices are handheld and used in constantly changing environments, the challenge is to fulfill these requirements. We present an approach that overcomes these obstacles and stabilizes the image capturing process so that image analysis on mobile devices improves significantly. To this end, image processing methods are combined with sensor fusion concepts. The approach consists of three main parts. First, pose estimation methods are used to guide the user in moving the device to a defined position. Second, the sensor data and the pose information are combined for relative motion estimation. Finally, the image capturing process is automated: it is triggered depending on the alignment of the device and the object, as well as on the image quality achievable under the prevailing motion and environmental effects.
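The paper does not name its image quality measure; a common sharpness proxy for such capture triggers is the variance of the Laplacian (low variance suggests blur). A sketch under that assumption, with a hypothetical threshold:

```python
import numpy as np

def laplacian_variance(img):
    """Sharpness proxy: variance of the 4-neighbor Laplacian.
    Low values suggest motion blur or defocus.  (The metric choice is
    our assumption; the paper does not specify its quality measure.)"""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def should_capture(img, threshold=10.0):
    """Automated trigger: capture only when the frame is sharp enough.
    The threshold is illustrative and would be tuned per application."""
    return laplacian_variance(img) >= threshold

rng = np.random.default_rng(0)
sharp = rng.uniform(0, 255, size=(32, 32))   # high-frequency content
blurry = np.full((32, 32), 128.0)            # flat frame, no detail
```

In the full system this check would be combined with the pose and motion estimates, so capture fires only when both alignment and quality criteria hold.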

4. Mathematical modeling of a single stage ultrasonically assisted distillation process.

PubMed

2015-05-01

The ability of sonication phenomena in facilitating separation of azeotropic mixtures presents a promising approach for the development of more intensified and efficient distillation systems than conventional ones. To expedite the much-needed development, a mathematical model of the system based on conservation principles, vapor-liquid equilibrium and sonochemistry was developed in this study. The model that was founded on a single stage vapor-liquid equilibrium system and enhanced with ultrasonic waves was coded using MATLAB simulator and validated with experimental data for ethanol-ethyl acetate mixture. The effects of both ultrasonic frequency and intensity on the relative volatility and azeotropic point were examined, and the optimal conditions were obtained using genetic algorithm. The experimental data validated the model with a reasonable accuracy. The results of this study revealed that the azeotropic point of the mixture can be totally eliminated with the right combination of sonication parameters and this can be utilized in facilitating design efforts towards establishing a workable ultrasonically intensified distillation system. PMID:25432400

5. Applications of Digital Image Processing 11

NASA Technical Reports Server (NTRS)

Cho, Y. -C.

1988-01-01

A new technique, digital image velocimetry, is proposed for the measurement of instantaneous velocity fields of time-dependent flows. A time sequence of single-exposure images of seed particles is captured with a high-speed camera, and a finite number of the single-exposure images are sampled within a prescribed period in time. The sampled images are then digitized on an image processor, enhanced, and superimposed to construct an image which is equivalent to the multiple-exposure image used in both laser speckle velocimetry and particle image velocimetry. The superimposed image and a single-exposure image are digitally Fourier transformed for extraction of information on the velocity field. A great enhancement of the dynamic range of the velocity measurement is accomplished through the new technique by manipulating the Fourier transforms of both the single-exposure image and the superimposed image. Also, the direction of the velocity vector is unequivocally determined. With the use of a high-speed video camera, the whole process from image acquisition to velocity determination can be carried out electronically; thus this technique can be developed into a real-time capability.
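The core Fourier step in correlation velocimetry is recovering particle displacement from the peak of an FFT-based cross-correlation. A minimal sketch (the paper's pipeline additionally superimposes multiple exposures and manipulates the transforms to extend dynamic range, which is not shown here):

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Recover the integer shift between two frames from the peak of
    their FFT-based cross-correlation."""
    F = np.fft.fft2(frame_a)
    G = np.fft.fft2(frame_b)
    # ifft2(conj(F) * G) peaks at the displacement of b relative to a.
    corr = np.fft.ifft2(np.conj(F) * G).real
    return np.unravel_index(np.argmax(corr), corr.shape)

rng = np.random.default_rng(1)
a = rng.uniform(size=(64, 64))                # random seed-particle pattern
b = np.roll(a, (3, 5), axis=(0, 1))           # pattern moved 3 down, 5 right
```

Subpixel accuracy is usually obtained afterwards by fitting a parabola or Gaussian around the correlation peak.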

6. Process perspective on image quality evaluation

Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

2008-01-01

The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations for the test image content, but not for the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.

7. Mathematical simulation of hemodynamical processes and medical technologies

Tsitsyura, Nadiya; Novyc'kyy, Victor V.; Lushchyk, Ulyana B.

2001-06-01

Vascular pathologies constitute a significant part of human diseases, and their rate tends to increase. Numerous investigations of brain blood flow in normal and pathological conditions have created a new branch of modern medicine: angioneurology. It combines information on brain angioarchitecture and on blood supply in normal and pathological conditions. Investigation of a disease's development constitutes an important problem of modern medicine. Cerebral blood supply is regulated by arterial inflow and venous outflow but, unfortunately, in the available literature the arterial and venous beds are considered separately. This causes a one-sided interpretation of atherosclerotic and discirculatory encephalopathies. As arterial inflow and venous outflow are interrelated, it seems expedient to perform a complex estimation of arteriovenous interactions, establish a correlation between the beds, and express the dependence in the form of a mathematical function. The results can then be observed clearly in graphs. A total of 139 patients aged 2 to 70 were examined in the 'Istyna' Scientific Medical Ultrasound Center by means of a Logidop 2 apparatus manufactured by Kranzbuhler, Germany, using a technique of cerebral artery and vein ultrasound location (invented and patented by Ulyana Lushchyk, State Patent of Ukraine N10262 of 19/07/1995). A clinical interpretation of the results obtained was performed. With the help of this technique and ultrasound Dopplerography, the blood flow in the major head and cervical arteries was investigated. In the visual graphic analysis we paid attention to changes in the hemodynamical parameters of the carotid artery (CA), internal jugular vein (IJV), and supratrochlear artery (STA). Generally accepted blood flow parameters were measured: FS, the maximal systolic frequency, and FD, the minimal diastolic frequency. The correlation between different combinations of parameters in the vessels mentioned
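The correlation the study seeks between parameters such as FS in different vessels is, at its simplest, a Pearson coefficient. A sketch with synthetic numbers (the values below are purely illustrative, not patient data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two parameter series,
    e.g. systolic frequencies FS measured in two vessels."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

# Synthetic example: a perfectly linear arteriovenous relationship.
fs_artery = [2.0, 2.5, 3.0, 3.5, 4.0]
fs_vein = [1.1, 1.3, 1.5, 1.7, 1.9]
r = pearson_r(fs_artery, fs_vein)
```

A coefficient near 1 would support fitting the arteriovenous dependence with a simple functional form, as the abstract proposes.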

8. Numerical magnitude processing in abacus-trained children with superior mathematical ability: an EEG study.

PubMed

Huang, Jian; Du, Feng-lei; Yao, Yuan; Wan, Qun; Wang, Xiao-Song; Chen, Fei-Yan

2015-08-01

Distance effect has been regarded as the best established marker of basic numerical magnitude processes and is related to individual mathematical abilities. A larger behavioral distance effect is suggested to be concomitant with lower mathematical achievement in children. However, the relationship between distance effect and superior mathematical abilities is unclear. One could get superior mathematical abilities by acquiring the skill of abacus-based mental calculation (AMC), which can be used to solve calculation problems with exceptional speed and high accuracy. In the current study, we explore the relationship between distance effect and superior mathematical abilities by examining whether and how the AMC training modifies numerical magnitude processing. Thus, mathematical competencies were tested in 18 abacus-trained children (who accepted the AMC training) and 18 non-trained children. Electroencephalography (EEG) waveforms were recorded when these children executed numerical comparison tasks in both Arabic digit and dot array forms. We found that: (a) the abacus-trained group had superior mathematical abilities than their peers; (b) distance effects were found both in behavioral results and on EEG waveforms; (c) the distance effect size of the average amplitude on the late negative-going component was different between groups in the digit task, with a larger effect size for abacus-trained children; (d) both the behavioral and EEG distance effects were modulated by the notation. These results revealed that the neural substrates of magnitude processing were modified by AMC training, and suggested that the mechanism of the representation of numerical magnitude for children with superior mathematical abilities was different from their peers. In addition, the results provide evidence for a view of non-abstract numerical representation. PMID:26238541

9. Numerical magnitude processing in abacus-trained children with superior mathematical ability: an EEG study*

PubMed Central

Huang, Jian; Du, Feng-lei; Yao, Yuan; Wan, Qun; Wang, Xiao-song; Chen, Fei-yan

2015-01-01

Distance effect has been regarded as the best established marker of basic numerical magnitude processes and is related to individual mathematical abilities. A larger behavioral distance effect is suggested to be concomitant with lower mathematical achievement in children. However, the relationship between distance effect and superior mathematical abilities is unclear. One could get superior mathematical abilities by acquiring the skill of abacus-based mental calculation (AMC), which can be used to solve calculation problems with exceptional speed and high accuracy. In the current study, we explore the relationship between distance effect and superior mathematical abilities by examining whether and how the AMC training modifies numerical magnitude processing. Thus, mathematical competencies were tested in 18 abacus-trained children (who accepted the AMC training) and 18 non-trained children. Electroencephalography (EEG) waveforms were recorded when these children executed numerical comparison tasks in both Arabic digit and dot array forms. We found that: (a) the abacus-trained group had superior mathematical abilities than their peers; (b) distance effects were found both in behavioral results and on EEG waveforms; (c) the distance effect size of the average amplitude on the late negative-going component was different between groups in the digit task, with a larger effect size for abacus-trained children; (d) both the behavioral and EEG distance effects were modulated by the notation. These results revealed that the neural substrates of magnitude processing were modified by AMC training, and suggested that the mechanism of the representation of numerical magnitude for children with superior mathematical abilities was different from their peers. In addition, the results provide evidence for a view of non-abstract numerical representation. PMID:26238541

10. Interactive image processing in swallowing research

Dengel, Gail A.; Robbins, JoAnne; Rosenbek, John C.

1991-06-01

Dynamic radiographic imaging of the mouth, larynx, pharynx, and esophagus during swallowing is used commonly in clinical diagnosis, treatment and research. Images are recorded on videotape and interpreted conventionally by visual perceptual methods, limited to specific measures in the time domain and binary decisions about the presence or absence of events. An image processing system using personal computer hardware and original software has been developed to facilitate measurement of temporal, spatial and temporospatial parameters. Digitized image sequences derived from videotape are manipulated and analyzed interactively. Animation is used to preserve context and increase efficiency of measurement. Filtering and enhancement functions heighten image clarity and contrast, improving visibility of details which are not apparent on videotape. Distortion effects and extraneous head and body motions are removed prior to analysis, and spatial scales are controlled to permit comparison among subjects. Effects of image processing on intra- and interjudge reliability and research applications are discussed.

11. Screw thread parameter measurement system based on image processing method

Rao, Zhimin; Huang, Kanggao; Mao, Jiandong; Zhang, Yaya; Zhang, Fan

2013-08-01

In industrial production, the screw thread is an important transmission part applied extensively in many automation equipments. The traditional measurement methods for screw thread parameters, including integrated multi-parameter test methods and single-parameter measurement methods, are contact measurement methods. In practice, contact measurement has some disadvantages, such as relatively high time cost, susceptibility to human error, and possible thread damage. In this paper, a screw thread parameter measurement system based on image processing, a real-time and non-contact measurement method, is developed to accurately measure the outside diameter, inside diameter, pitch diameter, pitch, thread height, and other parameters of a screw thread. In the system, an industrial camera is employed to acquire the image of the screw thread, image processing methods are used to obtain the image profile of the thread, and a mathematical model is established to compute the parameters. C++ Builder 6.0 is employed as the software development platform to realize the image processing and the computation of screw thread parameters. To verify the feasibility of the measurement system, some experiments were carried out and the measurement errors were analyzed. The experimental results show that the image measurement system satisfies the measurement requirements and is suitable for real-time detection of the screw thread parameters mentioned above. Compared with the traditional methods, the system based on image processing has some advantages: it is non-contact, easy to operate, highly accurate, free of workpiece damage, and fast in error analysis. In industrial production, this measurement system can provide an important reference for the development of similar parameter measurement systems.
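One of the parameters, the pitch, can be read off the extracted thread profile as the spacing between crest positions. A simplified sketch of that step (the paper's full model also yields the diameters and thread height; the synthetic triangular profile and pixel scale below are our assumptions):

```python
import numpy as np

def thread_pitch(profile, px_per_mm):
    """Estimate screw-thread pitch as the mean spacing between crest
    maxima of the extracted edge profile, converted to millimeters."""
    p = np.asarray(profile, float)
    # Local maxima: strictly higher than both neighbors.
    peaks = np.nonzero((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
    spacing_px = np.diff(peaks).mean()
    return spacing_px / px_per_mm

# Synthetic triangular thread profile with crests every 20 pixels.
x = np.arange(200)
profile = -np.abs((x % 20) - 10)      # peaks where x % 20 == 10
pitch_mm = thread_pitch(profile, px_per_mm=10.0)
```

On real edge data one would first smooth the profile so that noise does not create spurious local maxima between crests.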

12. Cognitive components of a mathematical processing network in 9-year-old children

PubMed Central

Szűcs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence

2014-01-01

We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are phonological processing, verbal knowledge, visuo-spatial short-term and working memory, spatial ability and general executive functioning. The model was highly specific to predicting arithmetic performance. There were no strong relations between mathematical achievement and verbal short-term and working memory, sustained attention, response inhibition, finger knowledge and symbolic number comparison performance. Non-verbal intelligence measures were also non-significant predictors when added to our model. Number sense variables were non-significant predictors in the model and they were also non-significant predictors when entered into regression analysis with only a single visuo-spatial WM measure. Number sense variables were predicted by sustained attention. Results support a network theory of mathematical competence in primary school children and falsify the importance of a proposed modular ‘number sense’. We suggest an ‘executive memory function centric’ model of mathematical processing. Mapping a complex processing network requires that studies consider the complex predictor space of mathematics rather than just focusing on a single or a few explanatory factors. PMID:25089322

13. Cognitive components of a mathematical processing network in 9-year-old children.

PubMed

Szűcs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence

2014-07-01

We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are phonological processing, verbal knowledge, visuo-spatial short-term and working memory, spatial ability and general executive functioning. The model was highly specific to predicting arithmetic performance. There were no strong relations between mathematical achievement and verbal short-term and working memory, sustained attention, response inhibition, finger knowledge and symbolic number comparison performance. Non-verbal intelligence measures were also non-significant predictors when added to our model. Number sense variables were non-significant predictors in the model and they were also non-significant predictors when entered into regression analysis with only a single visuo-spatial WM measure. Number sense variables were predicted by sustained attention. Results support a network theory of mathematical competence in primary school children and falsify the importance of a proposed modular 'number sense'. We suggest an 'executive memory function centric' model of mathematical processing. Mapping a complex processing network requires that studies consider the complex predictor space of mathematics rather than just focusing on a single or a few explanatory factors. PMID:25089322

14. Image-plane processing of visual information

NASA Technical Reports Server (NTRS)

Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.

1984-01-01

Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.

15. Earth Observation Services (Image Processing Software)

NASA Technical Reports Server (NTRS)

1992-01-01

San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.

16. Image Algebra Matlab language version 2.3 for image processing and compression research

Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric

2010-08-01

Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida over more than 15 years, beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with image algebra implementations in the languages FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of iac++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1. In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation
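The central operator the abstract mentions, image-template convolution, combines a translated template with the image at every pixel. In the linear case (multiply-accumulate), a minimal sketch looks like the following; this is an illustration of the operator, not the IAM API:

```python
import numpy as np

def template_convolve(image, template):
    """Image-template convolution, linear case: each output pixel is
    the multiply-accumulate of the image against the translated
    template.  Image algebra generalizes the (+, *) pair to other
    semirings, e.g. (max, +) for morphology."""
    th, tw = template.shape
    h, w = image.shape
    out = np.zeros((h - th + 1, w - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + th, j:j + tw] * template).sum()
    return out

img = np.ones((4, 4))
tmpl = np.array([[0.0, 1.0], [1.0, 0.0]])
res = template_convolve(img, tmpl)
```

Swapping the multiply-accumulate for max-plus in the inner step turns the same skeleton into grayscale dilation, which is why a single operator family covers both linear filtering and morphology.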

17. Nonlinear Optical Image Processing with Bacteriorhodopsin Films

NASA Technical Reports Server (NTRS)

Downie, John D.; Deiss, Ron (Technical Monitor)

1994-01-01

The transmission properties of some bacteriorhodopsin film spatial light modulators are uniquely suited to allow nonlinear optical image processing operations to be applied to images with multiplicative noise characteristics. A logarithmic amplitude transmission feature of the film permits the conversion of multiplicative noise to additive noise, which may then be linearly filtered out in the Fourier plane of the transformed image. The bacteriorhodopsin film displays the logarithmic amplitude response for write beam intensities spanning a dynamic range greater than 2.0 orders of magnitude. We present experimental results demonstrating the principle and capability for several different image and noise situations, including deterministic noise and speckle. Using the bacteriorhodopsin film, we successfully filter out image noise from the transformed image that cannot be removed from the original image.
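The key identity behind the film's logarithmic response is that log(signal x noise) = log(signal) + log(noise): after the log, multiplicative corruption becomes exactly additive and is thus amenable to linear Fourier-plane filtering. A numerical sketch of just this conversion (the optical filtering itself is not modeled):

```python
import numpy as np

rng = np.random.default_rng(2)
signal = rng.uniform(1.0, 2.0, size=256)      # synthetic image row
noise = rng.uniform(0.5, 1.5, size=256)       # multiplicative noise

observed = signal * noise                     # multiplicative noise model
log_obs = np.log(observed)                    # the film's log response

# After the log, the corruption is additive: log_obs = log(signal)
# + log(noise), so a linear filter can now separate the two terms.
additive_error = log_obs - np.log(signal)
```

This is why purely linear Fourier-plane filtering, which cannot remove multiplicative noise directly, succeeds once the bacteriorhodopsin film applies the logarithm optically.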

18. Image compression/decompression based on mathematical transform, reduction/expansion, and image sharpening

DOEpatents

Fu, Chi-Yung; Petrich, Loren I.

1997-01-01

An image represented in a first image array of pixels is first decimated in two dimensions before being compressed by a predefined compression algorithm such as JPEG. Another possible predefined compression algorithm can involve a wavelet technique. The compressed, reduced image is then transmitted over the limited bandwidth transmission medium, and the transmitted image is decompressed using an algorithm which is an inverse of the predefined compression algorithm (such as reverse JPEG). The decompressed, reduced image is then interpolated back to its original array size. Edges (contours) in the image are then sharpened to enhance the perceptual quality of the reconstructed image. Specific sharpening techniques are described.
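The patented pipeline is decimate, compress, transmit, decompress, interpolate, sharpen. A sketch of the image-side steps with the compression stage stubbed out (the 2x factor, nearest-neighbor interpolation, and unsharp masking are our stand-ins; the patent leaves the compression algorithm and sharpening technique configurable):

```python
import numpy as np

def decimate2(img):
    """Sender side: 2x decimation in both dimensions."""
    return img[::2, ::2]

def interpolate2(img):
    """Receiver side: nearest-neighbor expansion back to the original
    array size (the patent leaves the interpolation method open)."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def unsharp(img, amount=1.0):
    """Edge sharpening: add back the detail removed by a 3x3 box blur
    (unsharp masking, one possible sharpening technique)."""
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return img + amount * (img - blur)

img = np.arange(64, dtype=float).reshape(8, 8)
reduced = decimate2(img)                   # this is what gets JPEG-compressed
restored = unsharp(interpolate2(reduced))  # decompressed, expanded, sharpened
```

The point of the design is that decimation shrinks the payload before JPEG ever runs, and the final sharpening compensates perceptually for the detail lost to the reduce/expand round trip.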

19. Image compression/decompression based on mathematical transform, reduction/expansion, and image sharpening

DOEpatents

Fu, C.Y.; Petrich, L.I.

1997-12-30

An image represented in a first image array of pixels is first decimated in two dimensions before being compressed by a predefined compression algorithm such as JPEG. Another possible predefined compression algorithm can involve a wavelet technique. The compressed, reduced image is then transmitted over the limited bandwidth transmission medium, and the transmitted image is decompressed using an algorithm which is an inverse of the predefined compression algorithm (such as reverse JPEG). The decompressed, reduced image is then interpolated back to its original array size. Edges (contours) in the image are then sharpened to enhance the perceptual quality of the reconstructed image. Specific sharpening techniques are described. 22 figs.

20. Accelerated image processing on FPGAs.

PubMed

Draper, Bruce A; Beveridge, J Ross; Böhm, A P Willem; Ross, Charles; Chawathe, Monica

2003-01-01

The Cameron project has developed a language called single assignment C (SA-C), and a compiler for mapping image-based applications written in SA-C to field programmable gate arrays (FPGAs). The paper tests this technology by implementing several applications in SA-C and compiling them to an Annapolis Microsystems (AMS) WildStar board with a Xilinx XV2000E FPGA. The performance of these applications on the FPGA is compared to the performance of the same applications written in assembly code or C for an 800 MHz Pentium III. (Although no comparison across processors is perfect, these chips were the first of their respective classes fabricated at 0.18 microns, and are therefore of comparable ages.) We find that applications written in SA-C and compiled to FPGAs are between 8 and 800 times faster than the equivalent program run on the Pentium III. PMID:18244709

1. Digital Image Processing in Private Industry.

ERIC Educational Resources Information Center

Moore, Connie

1986-01-01

Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

2. Mathematical Thinking Process of Autistic Students in Terms of Representational Gesture

ERIC Educational Resources Information Center

Mustafa, Sriyanti; Nusantara, Toto; Subanji; Irawati, Santi

2016-01-01

The aim of this study is to describe the mathematical thinking process of autistic students in terms of gesture, using a qualitative approach. Data collecting is conducted by using 3 (three) audio-visual cameras. During the learning process, both teacher and students' activity are recorded using handy cam and digital camera (full HD capacity).…

3. PASS Processes and Early Mathematics Skills in Dutch and Italian Kindergarteners

ERIC Educational Resources Information Center

Kroesbergen, Evelyn H.; Van Luit, Johannes E. H.; Naglieri, Jack A.; Taddei, Stefano; Franchi, Elena

2010-01-01

The purpose of this study was to investigate the relation between early mathematical skills and cognitive processing abilities for two samples of children in Italy (N = 40) and the Netherlands (N = 59) who completed both a cognitive test that measures Planning, Attention, Simultaneous, and Successive (PASS) processing and an early mathematical…

4. Responsiveness and Affective Processes in the Interactive Construction of Understanding in Mathematics.

ERIC Educational Resources Information Center

Owens, Kay; Perry, Bob; Conroy, John; Howe, Peter; Geoghegan, Noel

1998-01-01

Reports on important learning processes that emerged during adult mathematics classes that used a teaching approach compatible with a social constructivist theory of knowing. Concludes that affective processes precipitated students' responsiveness, modifying the immediate learning context which influenced student thinking, creating a snowball…

5. Mathematical formulation and numerical simulation of bird flu infection process within a poultry farm

Putri, Arrival Rince; Nova, Tertia Delia; Watanabe, M.

2016-02-01

Bird flu infection processes within a poultry farm are formulated mathematically. A spatial effect is taken into account for the virus concentration through a diffusive term. The infection process is represented in terms of traveling wave solutions. For a small removal rate, a singular perturbation analysis leads to the existence of traveling wave solutions that correspond to progressive infection in one direction.
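A formulation of this kind is typically a reaction-diffusion system. The generic form below is an illustrative sketch, not the authors' exact equations: S and I denote susceptible and infected birds, V the virus concentration carrying the diffusive term, and β, γ, c, m, D are hypothetical rate and diffusion parameters:

```latex
\begin{aligned}
\frac{\partial S}{\partial t} &= -\beta S V, \\
\frac{\partial I}{\partial t} &= \beta S V - \gamma I, \\
\frac{\partial V}{\partial t} &= D \frac{\partial^2 V}{\partial x^2} + c I - m V,
\end{aligned}
```

where γ plays the role of the removal rate, and traveling waves are sought as solutions of the form V(x, t) = φ(x − vt) for a wave speed v.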

6. Cognitive Components of a Mathematical Processing Network in 9-Year-Old Children

ERIC Educational Resources Information Center

Szucs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence

2014-01-01

We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are…

7. Checking Fits With Digital Image Processing

NASA Technical Reports Server (NTRS)

Davis, R. M.; Geaslen, W. D.

1988-01-01

Computer-aided video inspection of mechanical and electrical connectors is feasible. The report discusses work done on digital image processing for computer-aided interface verification (CAIV). Two kinds of components were examined: a mechanical mating flange and an electrical plug.

8. Recent developments in digital image processing at the Image Processing Laboratory of JPL.

NASA Technical Reports Server (NTRS)

O'Handley, D. A.

1973-01-01

Review of some of the computer-aided digital image processing techniques recently developed. Special attention is given to mapping and mosaicking techniques and to preliminary developments in range determination from stereo image pairs. The discussed image processing utilization areas include space, biomedical, and robotic applications.

9. Command Line Image Processing System (CLIPS)

Fleagle, S. R.; Meyers, G. L.; Kulinski, R. G.

1985-06-01

An interactive image processing language (CLIPS) has been developed for use in an image processing environment. CLIPS uses a simple syntax with extensive on-line help to allow even the most naive user to perform complex image processing tasks. In addition, CLIPS functions as an interpretive language complete with data structures and program control statements. CLIPS statements fall into one of three categories: command, control, and utility statements. Command statements are expressions comprised of intrinsic functions and/or arithmetic operators which act directly on image or user-defined data. Some examples of CLIPS intrinsic functions are ROTATE, FILTER, and EXPONENT. Control statements allow a structured programming style through the use of statements such as DO WHILE and IF-THEN-ELSE. Utility statements such as DEFINE, READ, and WRITE support I/O and user-defined data structures. Since CLIPS uses a table-driven parser, it is easily adapted to any environment. New commands may be added to CLIPS by writing the procedure in a high-level language such as Pascal or FORTRAN and inserting the syntax for that command into the table. However, CLIPS was designed by incorporating most imaging operations into the language as intrinsic functions. CLIPS allows the user to generate new procedures easily with these powerful functions, either interactively or off line using a text editor. The fact that CLIPS can be used to generate complex procedures quickly or perform basic image processing functions interactively makes it a valuable tool in any image processing environment.
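The table-driven design described above is easy to illustrate: commands live in a table mapping names to procedures, so extending the language means adding one table entry. The sketch below is a hypothetical Python analogue, not CLIPS's actual syntax or API; the intrinsic functions `rotate` and `negate` are toy stand-ins.

```python
# Hypothetical sketch of a table-driven command dispatcher in the spirit of
# CLIPS; the command names and handlers are illustrative, not CLIPS's own.

def rotate(image, degrees):
    """Toy 'intrinsic function': rotate an image in 90-degree steps (clockwise)."""
    result = image
    for _ in range((int(degrees) // 90) % 4):
        result = [list(row) for row in zip(*result[::-1])]
    return result

def negate(image):
    """Toy intrinsic: invert 8-bit pixel values."""
    return [[255 - p for p in row] for row in image]

# The parser table: adding a command means adding one entry here, which
# mirrors how CLIPS extends its syntax table with new procedures.
COMMAND_TABLE = {
    "ROTATE": rotate,
    "NEGATE": negate,
}

def run_command(line, image):
    """Parse 'NAME arg1 arg2 ...' and dispatch through the table."""
    name, *args = line.split()
    handler = COMMAND_TABLE[name.upper()]
    return handler(image, *args)

print(run_command("ROTATE 90", [[0, 10], [20, 30]]))  # [[20, 0], [30, 10]]
```

Because dispatch goes through the table rather than hard-coded branches, a new command written elsewhere only needs its name and handler registered, which is the portability property the abstract emphasizes.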

Ahlers, Rolf-Juergen; Rauh, W.

1990-08-01

Image processing systems have found wide application in industry. For most computer-integrated manufacturing facilities it is necessary to adapt these systems so that they can automate the interaction with, and the integration of, CAD and CAM systems. In this paper new approaches are described that make use of the coupling of CAD and image processing, as well as the automatic generation of programmes for the machining of products.

11. Mathematical simulation of the process of condensing natural gas

Tastandieva, G. M.

2015-01-01

A two-dimensional unsteady model of heat transfer during the condensation of natural gas at low temperatures is presented. Calculations of the heat and mass transfer processes in liquefied natural gas (LNG) storage tanks of cylindrical shape are performed, and the influence of model parameters on the nature of the heat transfer is examined. Temperature regimes that eliminate evaporation by cooling the liquefied natural gas are defined, and the dependence of the vapor condensation mass flow rate on gas temperature is obtained. The results identify the possibility of regulating the "cooling down" of liquefied natural gas under partial evaporation at low energy cost.

12. Examining the Underlying Values of Turkish and German Mathematics Teachers' Decision Making Processes in Group Studies

ERIC Educational Resources Information Center

Dede, Yuksel

2013-01-01

The purpose of this study was to explore the values underlying the decision-making processes in group studies for Turkish and German mathematics teachers. This study presented a small part of a wider study investigating German and Turkish mathematics teachers' and their students' values (Values in Mathematics Teaching in Turkey and…

13. A novel mathematical setup for fault tolerant control systems with state-dependent failure process

Chitraganti, S.; Aberkane, S.; Aubrun, C.

2014-12-01

In this paper, we consider a fault tolerant control system (FTCS) with state-dependent failures and provide a tractable mathematical model to handle them. By assuming abrupt changes in system parameters, we use jump process modelling of the failure process and the fault detection and isolation (FDI) process. In particular, we assume that the failure rates of the failure process vary according to which set the state of the system belongs to.

14. Color image processing for date quality evaluation

Lee, Dah Jye; Archibald, James K.

2010-01-01

Many agricultural non-contact visual inspection applications use color image processing techniques because color is often a good indicator of product quality. Color evaluation is an essential step in the processing and inventory control of fruits and vegetables that directly affects profitability. Most color spaces such as RGB and HSV represent colors with three-dimensional data, which makes using color image processing a challenging task. Since most agricultural applications only require analysis on a predefined set or range of colors, mapping these relevant colors to a small number of indexes allows simple and efficient color image processing for quality evaluation. This paper presents a simple but efficient color mapping and image processing technique that is designed specifically for real-time quality evaluation of Medjool dates. In contrast with more complex color image processing techniques, the proposed color mapping method makes it easy for a human operator to specify and adjust color-preference settings for different color groups representing distinct quality levels. Using this color mapping technique, the color image is first converted to a color map in which one color index represents the color value of each pixel. Fruit maturity level is evaluated based on these color indices. A skin lamination threshold is then determined based on the fruit surface characteristics. This adaptive threshold is used to detect delaminated fruit skin and hence determine the fruit quality. This robust color grading technique has been used for real-time Medjool date grading.
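The core idea, mapping each pixel to the index of its nearest predefined color group, can be sketched briefly. The reference colors below are illustrative assumptions (the paper's calibrated, operator-tuned values are not given), and the nearest-neighbor rule in RGB space is one simple way to realize the mapping:

```python
# Sketch of color-index mapping: each RGB pixel is replaced by the index of
# the nearest reference color. Reference colors are hypothetical examples,
# not the paper's operator-specified maturity levels.

REFERENCE_COLORS = [
    (200, 60, 40),    # index 0: red (hypothetical "less mature")
    (150, 100, 40),   # index 1: brown (hypothetical intermediate)
    (80, 50, 30),     # index 2: dark brown (hypothetical "fully mature")
]

def color_index(pixel):
    """Return the index of the reference color closest in squared RGB distance."""
    def dist2(ref):
        return sum((a - b) ** 2 for a, b in zip(pixel, ref))
    return min(range(len(REFERENCE_COLORS)), key=lambda i: dist2(REFERENCE_COLORS[i]))

def color_map(image):
    """Convert an RGB image (nested lists of tuples) to a map of color indices."""
    return [[color_index(px) for px in row] for row in image]

img = [[(205, 65, 38), (82, 48, 33)]]
print(color_map(img))  # [[0, 2]]
```

Once the image is reduced to indices, quality statistics (e.g. the fraction of pixels in each maturity group) become simple integer counts instead of three-dimensional color computations.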

15. Image processing technique based on image understanding architecture

Kuvychko, Igor

2000-12-01

The effectiveness of image applications rests directly on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the solution of the image understanding problem. This article presents a generic computational framework necessary for the solution of the image understanding problem: the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. Dual representation provides natural transformation of the continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time and are able to perform graph and diagrammatic operations, which are the basis of intelligence. They can create derivative structures that play the role of context, or 'measurement device,' giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new simplified methods of computational intelligence. That makes images and scenes self-describing and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation could be done via matching of their derivative structures. The proposed architecture does not require supercomputers, opening the way to new image technologies.

16. Nanosecond image processing using stimulated photon echoes.

PubMed

Xu, E Y; Kröll, S; Huestis, D L; Kachru, R; Kim, M K

1990-05-15

Processing of two-dimensional images on a nanosecond time scale is demonstrated using the stimulated photon echoes in a rare-earth-doped crystal (0.1 at. % Pr³⁺:LaF₃). Two spatially encoded laser pulses (pictures) resonant with the ³P₀–³H₄ transition of Pr³⁺ were stored by focusing the image pulses sequentially into the Pr³⁺:LaF₃ crystal. The stored information is retrieved and processed by a third read pulse, generating the echo that is the spatial convolution or correlation of the input images. Application of this scheme to high-speed pattern recognition is discussed. PMID:19768008
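The operation the echo performs, spatial correlation of two stored images, has a simple discrete analogue. The pure-Python sketch below computes a 2D cross-correlation over fully overlapping positions only; it illustrates the mathematics of the echo output, not the optical apparatus:

```python
# Discrete 2D cross-correlation ('valid' region only), the operation the
# stimulated photon echo performs optically on the two stored images.

def cross_correlate(image, template):
    """Slide the template over the image and sum elementwise products."""
    H, W = len(image), len(image[0])
    h, w = len(template), len(template[0])
    out = []
    for i in range(H - h + 1):
        row = []
        for j in range(W - w + 1):
            row.append(sum(image[i + di][j + dj] * template[di][dj]
                           for di in range(h) for dj in range(w)))
        out.append(row)
    return out

image = [[0, 0, 0],
         [0, 1, 0],
         [0, 0, 0]]
template = [[1, 0],
            [0, 1]]
print(cross_correlate(image, template))  # [[1, 0], [0, 1]]
```

For pattern recognition, peaks in the correlation output mark positions where the template matches the image, which is why an echo proportional to the correlation enables high-speed matching.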

17. New approach for underwater imaging and processing

Wen, Yanan; Tian, Weijian; Zheng, Bing; Zhou, Guozun; Dong, Hui; Wu, Qiong

2014-05-01

Due to the absorptive and scattering nature of water, the characteristics of underwater images differ from those of images taken in air; underwater images suffer from poor visibility and noise. Obtaining a clear original image and processing that image are two key problems to be solved in underwater clear-vision research. In this paper a new approach is presented to solve these problems. Firstly, an inhomogeneous illumination method is developed to obtain a clear original image. A normal illumination imaging system and an inhomogeneous illumination imaging system are used to capture the image at the same distance. The results show that the contrast and definition of the processed image are greatly improved by the inhomogeneous illumination method. Secondly, based on the theory of photon transmission in water and the particularities of underwater target detection, the characteristics of laser scattering on underwater target surfaces and the spatial and temporal characteristics of the oceanic optical channel have been studied. Based on Monte Carlo simulation, we studied how water quality and other system parameters affect light transmission through water in the spatial and temporal regions, providing theoretical support for enhancing the SNR and operational distance.

18. Image processing via ultrasonics - Status and promise

NASA Technical Reports Server (NTRS)

Kornreich, P. G.; Kowel, S. T.; Mahapatra, A.; Nouhi, A.

1979-01-01

Acousto-electric devices for electronic imaging of light are discussed. These devices are more versatile than line scan imaging devices in current use. They have the capability of presenting the image information in a variety of modes. The image can be read out in the conventional line scan mode. It can be read out in the form of the Fourier, Hadamard, or other transform. One can take the transform along one direction of the image and line scan in the other direction, or perform other combinations of image processing functions. This is accomplished by applying the appropriate electrical input signals to the device. Since the electrical output signal of these devices can be detected in a synchronous mode, substantial noise reduction is possible.

19. Image-processing with augmented reality (AR)

Babaei, Hossein R.; Mohurutshe, Pagiel L.; Habibi Lashkari, Arash

2013-03-01

In this project, the aim is to discuss and articulate the intent to create an image-based Android application. The basis of this study is real-time image detection and processing. It is a new, convenient measure that allows users to gain information on imagery right on the spot. Past studies have revealed attempts to create image-based applications, but these have only gone as far as creating image finders that work only with images already stored within some form of database. The Android platform is rapidly spreading around the world and provides by far the most interactive and technical platform for smart-phones, which is why it was important to base the study and research on it. Augmented reality allows the user to manipulate the data and can add enhanced features (video, GPS tags) to the image taken.

20. The Merging Of Computer Graphics And Image Processing Technologies And Applications

Brammer, Robert F.; Stephenson, Thomas P.

1990-01-01

Historically, computer graphics and image processing technologies and applications have been distinct, both in their research communities and in their hardware and software product suppliers. Computer graphics deals with synthesized visual depictions of outputs from computer models*, whereas image processing (and analysis) deals with computational operations on input data from "imaging sensors"**. Furthermore, the fundamental storage and computational aspects of these two fields are different from one another. For example, many computer graphics applications store data using vector formats whereas image processing applications generally use raster formats. Computer graphics applications may involve polygonal representations, floating point operations, and mathematical models of physical phenomena such as lighting conditions, surface reflecting properties, etc. Image processing applications may involve pixel operations, fixed point representations, global operations (e.g. image rotations), and nonlinear signal processing algorithms.

1. Aberrant functional activation in school age children at-risk for mathematical disability: a functional imaging study of simple arithmetic skill.

PubMed

Davis, Nicole; Cannistraci, Christopher J; Rogers, Baxter P; Gatenby, J Christopher; Fuchs, Lynn S; Anderson, Adam W; Gore, John C

2009-10-01

We used functional magnetic resonance imaging (fMRI) to explore the patterns of brain activation associated with different levels of performance in exact and approximate calculation tasks in well-defined cohorts of children with mathematical calculation difficulties (MD) and typically developing controls. Both groups of children activated the same network of brain regions; however, children in the MD group had significantly increased activation in parietal, frontal, and cingulate cortices during both calculation tasks. A majority of the differences occurred in anatomical brain regions associated with cognitive resources such as executive functioning and working memory that are known to support higher level arithmetic skill but are not specific to mathematical processing. We propose that these findings are evidence that children with MD use the same types of problem solving strategies as TD children, but their weak mathematical processing system causes them to employ a more developmentally immature and less efficient form of the strategies. PMID:19410589

3. Overview on METEOSAT geometrical image data processing

NASA Technical Reports Server (NTRS)

Diekmann, Frank J.

1994-01-01

Digital images acquired from the geostationary METEOSAT satellites are processed and disseminated at ESA's European Space Operations Centre (ESOC) in Darmstadt, Germany. Their scientific value depends mainly on their radiometric quality and geometric stability. This paper gives an overview of the image processing activities performed at ESOC, concentrating on geometrical restoration and quality evaluation. The performance of the rectification process for the various satellites over the past years is presented, and the impacts of external events, such as the Pinatubo eruption in 1991, are explained. Special developments in both hardware and software, necessary to cope with demanding tasks such as new image resampling or correcting for spacecraft anomalies, are presented as well. The rotating lens of MET-5, which causes severe geometrical image distortions, is an example of the latter.

4. A mathematical model of color and orientation processing in V1.

PubMed

Smirnova, Elena Y; Chizhkova, Ekaterina A; Chizhov, Anton V

2015-10-01

Orientation processing in the primary visual cortex (V1) has been experimentally investigated in detail and reproduced in models, while color processing remains unclear. Thus, we have constructed a mathematical model of color and orientation processing in V1. The model is mainly based on the following experimental evidence concerning color blobs: A blob contains overlapping neuronal patches activated by different hues, so that each blob represents a full gamut of hue and might be structured with a loop (Xiao et al. in NeuroImage 35:771-786, 2007). The proposed model describes a set of orientation hypercolumns and color blobs, in which color and orientation preferences are represented by the poloidal and toroidal angles of a torus, correspondingly. The model consists of color-insensitive (CI) and color-sensitive (CS) neuronal populations, which are described by a firing-rate model. The set of CI neurons is described by the classical ring model (Ben-Yishai et al. in Proc Natl Acad Sci USA 92:3844-3848, 1995) with recurrent connections in the orientation space; similarly, the set of CS neurons is described in the color space and also receives input from CI neurons of the same orientation preference. The model predictions are as follows: (1) responses to oriented color stimuli are significantly stronger than those to non-oriented color stimuli; (2) the activity of CS neurons in total is higher than that of CI neurons; (3) a random color can be illusorily perceived in the case of gray oriented stimulus; (4) in response to two-color stimulus in the marginal phase, the network chooses either one of the colors or the intermediate color; (5) input to a blob has rather continual representation of a hue than discrete one (with two narrowly tuned opponent signals). PMID:26330361
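The ring model the authors build on (Ben-Yishai et al., 1995) can be sketched numerically: a set of orientation-tuned populations with cosine-shaped recurrent weights, driven by a stimulus-tuned feedforward input. All parameter values, the connectivity profile, and the input tuning below are illustrative assumptions, not the paper's fitted model:

```python
import math

# Illustrative firing-rate ring model (after Ben-Yishai et al., 1995):
# N orientation columns, cosine-tuned recurrent weights, rectified-linear
# rate dynamics integrated with explicit (synchronous) Euler steps.
N = 32
J0, J2 = -1.0, 1.0                              # untuned inhibition, tuned excitation
theta = [math.pi * i / N for i in range(N)]     # preferred orientations in [0, pi)
W = [[(J0 + J2 * math.cos(2 * (a - b))) / N for b in theta] for a in theta]

def relu(x):
    return x if x > 0.0 else 0.0

def simulate(stimulus_angle, contrast=1.0, steps=2000, dt=0.01):
    """Integrate dr/dt = -r + [feedforward + recurrent]_+ to steady state."""
    r = [0.0] * N
    for _ in range(steps):
        new_r = []
        for i in range(N):
            ff = contrast * (1.0 + 0.3 * math.cos(2 * (theta[i] - stimulus_angle)))
            rec = sum(W[i][j] * r[j] for j in range(N))
            new_r.append(r[i] + dt * (-r[i] + relu(ff + rec)))
        r = new_r
    return r

rates = simulate(math.pi / 4)
peak = max(range(N), key=lambda i: rates[i])    # the population 'votes' for pi/4
```

The steady-state activity is a bump of firing centered on the stimulus orientation; the paper's CS populations obey analogous dynamics in color space, with an added input from CI neurons of the same orientation preference.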

5. Real-time optical image processing techniques

NASA Technical Reports Server (NTRS)

Liu, Hua-Kuang

1988-01-01

Nonlinear real-time optical processing based on spatial pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse frequency modulated halftone screens and the modification of micro-channel spatial light modulators (MSLMs). Micro-channel spatial light modulators are modified via the Fabry-Perot method to achieve the high gamma operation required for nonlinear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM. The experiments showed the effectiveness of the thresholding and also the need for higher SBP for image processing. The Hughes LCLV has been characterized and found to yield high gamma (about 1.7) when operated in low frequency and low bias mode. Cascading of two LCLVs should also provide enough gamma for nonlinear processing. In this case, the SBP of the LCLV is sufficient but the uniformity of the LCLV needs improvement. Applications studied include image correlation, computer generation of holograms, pseudo-color image encoding for image enhancement, and associative retrieval in neural processing. The discovery of the only known optical method for dynamic range compression of an input image in real time using GaAs photorefractive crystals is reported. Finally, a new architecture for nonlinear multiple sensory, neural processing has been suggested.

6. A low-cost vector processor boosting compute-intensive image processing operations

NASA Technical Reports Server (NTRS)

1992-01-01

Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
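The Richardson-Lucy iteration named above is easy to state. The minimal sketch below works in 1D with a small symmetric blur kernel; the actual HST restoration is 2D and typically FFT-accelerated, and the point-spread function here is an arbitrary illustrative choice:

```python
# 1D sketch of the Richardson-Lucy deconvolution iteration: the estimate is
# repeatedly multiplied by the back-projected ratio of observed to predicted data.

def convolve(signal, kernel):
    """'Same'-size 1D convolution with zero padding (kernel assumed symmetric)."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:
                s += signal[idx] * kernel[j]
        out.append(s)
    return out

def richardson_lucy(observed, psf, iterations=50):
    """Iteratively estimate the undistorted signal from a blurred observation."""
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(observed)          # flat, non-negative starting guess
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

psf = [0.25, 0.5, 0.25]
truth = [0, 0, 0, 4, 0, 0, 0]
observed = convolve(truth, psf)               # [0, 0, 1, 2, 1, 0, 0]
restored = richardson_lucy(observed, psf)     # re-sharpens toward 'truth'
```

The multiplicative update preserves non-negativity, which suits photon-count images; each iteration is dominated by convolutions, which is exactly the workload a vector board (or FFT) accelerates.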

7. Bistatic SAR: Signal Processing and Image Formation.

SciTech Connect

Wahl, Daniel E.; Yocky, David A.

2014-10-01

This report describes the significant processing steps that were used to take the raw recorded digitized signals from the bistatic synthetic aperture radar (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the process steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history, and, finally, image formation. Various plots and values are shown at most steps to illustrate the processing for a bistatic COSMO-SkyMed collection gathered on June 10, 2013 at Kirtland Air Force Base, New Mexico.

8. Twofold processing for denoising ultrasound medical images.

PubMed

Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y

2015-01-01

Ultrasound medical (US) imaging non-invasively pictures the inside of a human body for disease diagnostics. Speckle noise attacks ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold uses block-based thresholding, both hard (BHT) and soft (BST), on pixels in the wavelet domain with 8, 16, 32 and 64 non-overlapping block sizes. This first-fold process reduces speckle effectively but also induces blurring of the object of interest. The second-fold process then restores object boundaries and texture with adaptive wavelet fusion. The restoration of the degraded object in the block-thresholded US image is carried out through wavelet coefficient fusion of the object in the original US image and the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with normalized differential mean (NDF), to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. Thus the proposed twofold methods are named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate visual quality improvement to an interesting level with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal-to-noise ratio (PSNR), normalized cross-correlation coefficient (NCC), edge strength (ES), image quality index (IQI) and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. Validation of the proposed method is done by comparing with anisotropic diffusion (AD), total variational filtering (TVF) and empirical mode decomposition (EMD) for enhancement of US images. The US images are provided by AMMA hospital radiology labs at Vijayawada, India. PMID:26697285
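The hard and soft thresholding rules used in the first fold are standard and easy to state. The sketch below shows only the rules themselves, applied to a plain list of coefficients; the wavelet transform, block partitioning, and threshold selection from the paper are omitted:

```python
# Standard hard and soft thresholding rules for wavelet coefficients
# (pure-Python sketch; the paper applies these per block in the wavelet domain).

def hard_threshold(coeffs, t):
    """Keep coefficients whose magnitude exceeds t; zero the rest."""
    return [c if abs(c) > t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Zero small coefficients and shrink the survivors toward zero by t."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0
            for c in coeffs]

coeffs = [3.0, -0.5, 1.2, -2.4, 0.1]
print(hard_threshold(coeffs, 1.0))   # [3.0, 0.0, 1.2, -2.4, 0.0]
print(soft_threshold(coeffs, 1.0))   # approximately [2.0, 0.0, 0.2, -1.4, 0.0]
```

Hard thresholding preserves the magnitude of strong edges but can leave isolated noise spikes; soft thresholding suppresses noise more smoothly at the cost of slightly attenuating the signal, which motivates the paper's second, restorative fold.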

9. 3D seismic image processing for interpretation

Wu, Xinming

Extracting fault, unconformity, and horizon surfaces from a seismic image is useful for interpretation of geologic structures and stratigraphic features. Although interpretation of these surfaces has been automated to some extent by others, significant manual effort is still required for extracting each type of these geologic surfaces. I propose methods to automatically extract all the fault, unconformity, and horizon surfaces from a 3D seismic image. To a large degree, these methods involve only image or array processing, achieved by efficiently solving partial differential equations. For fault interpretation, I propose a linked data structure, which is simpler than triangle or quad meshes, to represent a fault surface. In this simple data structure, each sample of a fault corresponds to exactly one image sample. Using this linked data structure, I extract complete and intersecting fault surfaces without holes from 3D seismic images. I use the same structure in subsequent processing to estimate fault slip vectors. I further propose two methods, using precomputed fault surfaces and slips, to undo faulting in seismic images by simultaneously moving fault blocks and faults themselves. For unconformity interpretation, I first propose a new method to compute an unconformity likelihood image that highlights both the termination areas and the corresponding parallel unconformities and correlative conformities. I then extract unconformity surfaces from the likelihood image and use these surfaces as constraints to more accurately estimate seismic normal vectors that are discontinuous near the unconformities. Finally, I use the estimated normal vectors and the unconformities as constraints to compute a flattened image, in which seismic reflectors are all flat and vertical gaps correspond to the unconformities. Horizon extraction is straightforward after computing a map of image flattening; we can first extract horizontal slices in the flattened space

10. Image Processing Application for Cognition (IPAC) - Traditional and Emerging Topics in Image Processing in Astronomy (Invited)

Pesenson, M.; Roby, W.; Helou, G.; McCollum, B.; Ly, L.; Wu, X.; Laine, S.; Hartley, B.

2008-08-01

A new application framework for advanced image processing for astronomy is presented. It implements standard two-dimensional operators, and recent developments in the field of non-astronomical image processing (IP), as well as original algorithms based on nonlinear partial differential equations (PDE). These algorithms are especially well suited for multi-scale astronomical images since they increase signal to noise ratio without smearing localized and diffuse objects. The visualization component is based on the extensive tools that we developed for Spitzer Space Telescope's observation planning tool Spot and archive retrieval tool Leopard. It contains many common features, combines images in new and unique ways and interfaces with many astronomy data archives. Both interactive and batch mode processing are incorporated. In the interactive mode, the user can set up simple processing pipelines, and monitor and visualize the resulting images from each step of the processing stream. The system is platform-independent and has an open architecture that allows extensibility by addition of plug-ins. This presentation addresses astronomical applications of traditional topics of IP (image enhancement, image segmentation) as well as emerging new topics like automated image quality assessment (QA) and feature extraction, which have potential for shaping future developments in the field. Our application framework embodies a novel synergistic approach based on integration of image processing, image visualization and image QA (iQA).

11. Mathematical modeling of physical processes in inorganic chemistry

SciTech Connect

Chiu, H.L.

1988-01-01

The first part deals with the rapid calculation of steady-state concentration profiles in contactors using the Purex process. Most of the computer codes simulating the reprocessing of spent nuclear fuel generate the steady-state properties by calculating the transient behavior of the contactors. In this study, the author simulates the steady-state concentration profiles directly without first generating the transient behavior. Two computer codes are developed, PUMA (Plutonium-Uranium-Matrix-Algorithm) and PUNE (Plutonium-Uranium-Non-Equilibrium). The first one simulates the steady-state concentration profiles under conditions of equilibrium mass transfer. The second one accounts for deviations from mass transfer equilibrium. The second part of this dissertation shows how to use the classical trajectory method to study the equilibrium and saddle-point geometries of MX_n (n = 2-7) molecules. Two nuclear potential functions that have the property of invariance to the operations of the permutation group of nuclei in molecules of the general formula MX_n are described. Such potential functions allow equivalent isomers to have equal energies so that various statistical mechanical properties can be simply determined. The first function contains two-center interactions between pairs of peripheral atoms and is defined by V(r) = (1/2) Σ_α k Δr_αμ² + Σ_{α<β} Q R_αβ^(−n) (n = 1, 2, ...). The second function contains two- and three-center interactions and is defined by V(Θ) = (1/2) Σ_α K Δ_αμ² + (1/2) Σ_{α<β} Q r₀² (Θ_αμβ − π)².

12. Thermal Imaging Processes of Polymer Nanocomposite Coatings

Meth, Jeffrey

2015-03-01

Laser induced thermal imaging (LITI) is a process whereby infrared radiation impinging on a coating on a donor film transfers that coating to a receiving film to produce a pattern. This talk describes how LITI patterning can print color filters for liquid crystal displays, and details the physical processes that are responsible for transferring the nanocomposite coating in a coherent manner that does not degrade its optical properties. Unique features of this process involve heating rates of 10^7 K/s, and cooling rates of 10^4 K/s, which implies that not all of the relaxation modes of the polymer are accessed during the imaging process. On the microsecond time scale, the polymer flow is forced by devolatilization of solvents, followed by deformation akin to the constrained blister test, and then fracture caused by differential thermal expansion. The unique combination of disparate physical processes demonstrates the gamut of physics that contribute to advanced material processing in an industrial setting.

13. A Pipeline Tool for CCD Image Processing

Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.

MSSSO is part of a collaboration developing a wide field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI-based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand-alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using SUN Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat field corrections, there is scope to extend it to other processing.

14. Digital-image processing and image analysis of glacier ice

USGS Publications Warehouse

Fitzpatrick, Joan J.

2013-01-01

This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document form a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
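
The grain statistics described here start from identifying individual grains in a segmented image. A toy sketch of that single step, assuming a pre-made binary grain mask (the report's actual workflow uses the FoveaPro/Photoshop toolset, not this code):

```python
def grain_areas(mask):
    """Label 4-connected regions of a binary grain mask (nested lists of
    0/1) with an iterative flood fill and return the sorted pixel area
    of each grain."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                area, stack = 0, [(i, j)]
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return sorted(areas)
```

From the per-grain areas, summary statistics such as mean grain size or a size distribution follow directly.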

15. Fundamental Concepts of Digital Image Processing

DOE R&D Accomplishments Database

Twogood, R. E.

1983-03-01

The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human processor without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.

16. The Answering Process for Multiple-Choice Questions in Collaborative Learning: A Mathematical Learning Model Analysis

ERIC Educational Resources Information Center

Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro

2014-01-01

In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…

17. Developing Mathematical Processes: A Report of the 1971-72 Field Test.

ERIC Educational Resources Information Center

Hubbard, W. Donald

This document reports on a field test of Developing Mathematical Processes (DMP), a research based instructional program for elementary school children developed from psychological principles. The field test was conducted in eight schools; four were multiunit schools in small towns and large cities; four were conventionally organized and located…

18. Developing Mathematical Processes: 1972-73 Field Test Report. Technical Report No. 324.

ERIC Educational Resources Information Center

Hubbard, W. Donald; Buchanan, Anne E.

A continuation of the field test of Developing Mathematical Processes (DMP) was conducted in eight schools. Four were multiunit schools located in settings ranging from small town to large city; the remaining schools were conventionally organized and located in large urban areas. The purpose of the field test was (1) to determine the effectiveness…

19. Number Magnitude Processing and Basic Cognitive Functions in Children with Mathematical Learning Disabilities

ERIC Educational Resources Information Center

2012-01-01

The study sought to extend our knowledge regarding the origin of mathematical learning disabilities (MLD) in children by testing different hypotheses in the same samples of children. Different aspects of cognitive functions and number processing were assessed in fifth- and sixth-graders (11-13 years old) with MLD and compared to controls. The…

20. A Process of Students and Their Instructor Developing a Final Closed-Book Mathematics Exam

ERIC Educational Resources Information Center

Rapke, Tina

2016-01-01

This article describes a study, from a Canadian technical institute's upgrading mathematics course, where students played a role in developing the final closed-book exam that they sat. The study involved a process where students developed practice exams and solutions keys, students sat each other's practice exams, students evaluated classmates'…

1. Socially Shared Metacognition of Dyads of Pupils in Collaborative Mathematical Problem-Solving Processes

ERIC Educational Resources Information Center

Iiskala, Tuike; Vauras, Marja; Lehtinen, Erno; Salonen, Pekka

2011-01-01

This study investigated how metacognition appears as a socially shared phenomenon within collaborative mathematical word-problem solving processes of dyads of high-achieving pupils. Four dyads solved problems of different difficulty levels. The pupils were 10 years old. The problem-solving activities were videotaped and transcribed in terms of…

2. Error Analysis in High School Mathematics. Conceived as Information-Processing Pathology.

ERIC Educational Resources Information Center

Davis, Robert B.

This paper, presented at the 1979 meeting of the American Educational Research Association (AERA), investigates student errors in high school mathematics. A conceptual framework of hypothetical information-handling processes such as procedures, frames, retrieval from memory, visually-moderated sequences (VMS sequences), the integrated sequence,…

3. Understanding Expertise-Based Training Effects on the Software Evaluation Process of Mathematics Education Teachers

ERIC Educational Resources Information Center

Incikabi, Lutfi; Sancar Tokmak, Hatice

2012-01-01

This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…

4. The Instructional Quality of Classroom Processes and Pupils' Mathematical Attainment Concerning Decimal Fractions

ERIC Educational Resources Information Center

Pitkäniemi, Harri; Häkkinen, Kaija

2015-01-01

The objective of our study is to understand and analyse what significance cognitive and emotional networks of classroom processes have in mathematics learning. The subject of the study includes two classrooms from Year 5 of a teacher training school, their pupils (NA=17, NB=19) and student teachers (N=4). The course on decimals, which consisted of…

5. Understanding Prospective Mathematics Teachers' Processes for Making Sense of Students' Work with Technology

ERIC Educational Resources Information Center

Wilson, P. Holt; Lee, Hollylynne Stohl; Hollebrands, Karen F.

2011-01-01

This study investigated the processes used by prospective mathematics teachers as they examined middle-school students' work solving statistical problems using a computer software program. Ways in which the model may be used by other researchers and implications for the design of pedagogical tasks for prospective teachers are discussed. (Contains…

6. Image processing of angiograms: A pilot study

NASA Technical Reports Server (NTRS)

Larsen, L. E.; Evans, R. A.; Roehm, J. O., Jr.

1974-01-01

The technology transfer application this report describes is the result of a pilot study of image-processing methods applied to the image enhancement, coding, and analysis of arteriograms. Angiography is a subspecialty of radiology that employs the introduction of media with high X-ray absorption into arteries in order to study vessel pathology as well as to infer disease of the organs supplied by the vessel in question.

7. Future projects in pulse image processing

Kinser, Jason M.

1999-03-01

Pulse-Coupled Neural Networks have generated considerable interest as image processing tools. Past applications include image segmentation, edge extraction, texture extraction, de-noising, object isolation, foveation and fusion. These past applications do not comprise a complete list of useful applications of the PCNN. Future avenues of research will include level set analysis, binary (optical) correlators, artificial life simulations, maze running and filter jet analysis. This presentation will explore these future avenues of PCNN research.

8. CCD architecture for spacecraft SAR image processing

NASA Technical Reports Server (NTRS)

Arens, W. E.

1977-01-01

A real-time synthetic aperture radar (SAR) image processing architecture amenable to future on-board spacecraft applications is currently under development. Using state-of-the-art charge-coupled device (CCD) technology, low cost and power are inherent features. Other characteristics include the ability to reprogram correlation reference functions, correct for range migration, and compensate for antenna beam pointing errors on the spacecraft in real time. The first spaceborne demonstration is scheduled to be flown as an experiment on a 1982 Shuttle imaging radar mission (SIR-B). This paper describes the architecture and implementation characteristics of this initial spaceborne CCD SAR image processor.
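
The heart of such a processor is correlation of the received echo against a reprogrammable reference function. A toy software sketch of that single operation (the real system performs it in analog CCD hardware, and the range-migration and pointing compensation mentioned above are omitted):

```python
def compress(received, reference):
    """Sliding cross-correlation of a sampled echo against a reference
    function; the output peaks where the reference pattern occurs."""
    n, m = len(received), len(reference)
    return [sum(received[i + j] * reference[j] for j in range(m))
            for i in range(n - m + 1)]
```

Correlating a noisy echo containing the reference pattern yields a sharp peak at the pattern's delay, which is the basis of pulse compression.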

9. Infrared image processing and data analysis

Ibarra-Castanedo, C.; González, D.; Klein, M.; Pilla, M.; Vallerand, S.; Maldague, X.

2004-12-01

Infrared thermography in nondestructive testing provides images (thermograms) in which zones of interest (defects) sometimes appear only as subtle signatures. In this context, raw images are often inadequate, since most defects will be missed. In other cases, what is needed is a quantitative analysis, such as for defect detection and characterization. In this paper, various methods of data analysis required at the preprocessing and/or processing stages are presented. References from the literature are provided for the known methods, which are briefly discussed, while the novelties are elaborated in more detail in the text, which also includes experimental results.

10. The ‘hit’ phenomenon: a mathematical model of human dynamics interactions as a stochastic process

Ishii, Akira; Arakaki, Hisashi; Matsuda, Naoya; Umemura, Sanae; Urushidani, Tamiko; Yamagata, Naoya; Yoshida, Narihiko

2012-06-01

A mathematical model for the ‘hit’ phenomenon in entertainment within a society is presented as a stochastic process of human dynamics interactions. The model uses only the advertisement budget time distribution as an input, and word-of-mouth (WOM), represented by posts on social network systems, is used as data to make a comparison with the calculated results. The unit of time is days. The WOM distribution in time is found to be very close to the revenue distribution in time. Calculations for the Japanese motion picture market based on the mathematical model agree well with the actual revenue distribution in time.
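
The abstract does not reproduce the model's equations; the sketch below is only a schematic daily-update caricature of this class of model, with coefficients and their names chosen here for illustration: purchase intention is driven by the advertisement budget, decays over time, and is reinforced by word-of-mouth in proportion to the current intention.

```python
def simulate_intention(adv, decay=0.1, c_adv=1.0, c_wom=0.05):
    """Toy daily dynamics of 'intention' I: input advertisement budget
    adv[t] per day, exponential decay, and a word-of-mouth term
    proportional to I itself. Returns the daily intention series."""
    I, series = 0.0, []
    for a in adv:
        I += -decay * I + c_adv * a + c_wom * I
        series.append(I)
    return series
```

With a single burst of advertising followed by silence, intention decays geometrically at the net rate (decay − c_wom), mimicking the post-release tail the authors compare against revenue data.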

11. Industrial Holography Combined With Image Processing

Schorner, J.; Rottenkolber, H.; Roid, W.; Hinsch, K.

1988-01-01

Holographic test methods have become a valuable tool for the engineer in research and development. In the field of non-destructive quality control, holographic test equipment is now also accepted for tests within the production line. Producers of aircraft tyres, for example, use holographic tests to back the guarantee of their tyres. Together with image processing, the whole test cycle is automated: defects within the tyre are found automatically and listed on a printout. The power-engine industry uses holographic vibration tests to optimize its designs. In the plastics industry, tanks, wheels, seats and fans are tested holographically to find the optimum shape. The automotive industry makes holography a tool for noise reduction. Instant holography and image processing techniques for quantitative analysis have led to economic application of holographic test methods. New developments of holographic units in combination with image processing are presented.

12. DSP based image processing for retinal prosthesis.

PubMed

Parikh, Neha J; Weiland, James D; Humayun, Mark S; Shah, Saloni S; Mohile, Gaurav S

2004-01-01

The real-time image processing in a retinal prosthesis consists of the implementation of various image processing algorithms such as edge detection, edge enhancement, and decimation. The algorithmic computations in real time may have a high level of computational complexity, and hence the use of digital signal processors (DSPs) for the implementation of such algorithms is proposed here. This application requires DSPs that are highly computationally efficient while operating at low power. DSPs offer computational capabilities of hundreds of millions of instructions per second (MIPS) or millions of floating-point operations per second (MFLOPS), with certain processor configurations operating at low power. The various image processing algorithms, the DSP requirements, and the capabilities of different platforms are discussed in this paper. PMID:17271974

13. Three-dimensional image signals: processing methods

Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

2010-11-01

Over the years, extensive studies have been carried out to apply coherent-optics methods to real-time processing, communications and image transmission. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of a literature investigation of processing methods for three-dimensional image signals. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. Images captured with an advanced 3D digital camera can be displayed on the screen of a 3D digital viewer with or without special glasses. This requires considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry, we can capture "digital holograms": holograms that can be stored on a computer and transmitted over conventional networks. We present some methods to process digital holograms for Internet transmission, and results.

14. Support Routines for In Situ Image Processing

NASA Technical Reports Server (NTRS)

Deen, Robert G.; Pariser, Oleg; Yeates, Matthew C.; Lee, Hyun H.; Lorre, Jean

2013-01-01

This software consists of a set of application programs that support ground-based image processing for in situ missions. These programs represent a collection of utility routines that perform miscellaneous functions in the context of the ground data system. Each one fulfills some specific need as determined via operational experience. The most unique aspect to these programs is that they are integrated into the large, in situ image processing system via the PIG (Planetary Image Geometry) library. They work directly with in situ data, understanding the appropriate image meta-data fields and updating them properly. The programs themselves are completely multimission; all mission dependencies are handled by PIG. This suite of programs consists of: (1) marscahv: Generates a linearized, epi-polar aligned image given a stereo pair of images. These images are optimized for 1-D stereo correlations, (2) marscheckcm: Compares the camera model in an image label with one derived via kinematics modeling on the ground, (3) marschkovl: Checks the overlaps between a list of images in order to determine which might be stereo pairs. This is useful for non-traditional stereo images like long-baseline or those from an articulating arm camera, (4) marscoordtrans: Translates mosaic coordinates from one form into another, (5) marsdispcompare: Checks a Left Right stereo disparity image against a Right Left disparity image to ensure they are consistent with each other, (6) marsdispwarp: Takes one image of a stereo pair and warps it through a disparity map to create a synthetic opposite- eye image. For example, a right eye image could be transformed to look like it was taken from the left eye via this program, (7) marsfidfinder: Finds fiducial markers in an image by projecting their approximate location and then using correlation to locate the markers to subpixel accuracy. These fiducial markers are small targets attached to the spacecraft surface. This helps verify, or improve, the

15. Processing infrared images of aircraft lapjoints

NASA Technical Reports Server (NTRS)

Syed, Hazari; Winfree, William P.; Cramer, K. E.

1992-01-01

Techniques for processing IR images of aging-aircraft lapjoint data are discussed. Attention is given to a technique for detecting disbonds in aircraft lapjoints which clearly delineates the disbonded region from the bonded regions. The technique is weak on unpainted aircraft skin surfaces, but this can be overcome by using a self-adhering contact sheet. Neural network analysis on raw temperature data has been shown to be an effective tool for visualization of images. Numerical simulation results show the above processing technique to be an effective tool in delineating the disbonds.

16. Results of precision processing (scene correction) of ERTS-1 images using digital image processing techniques

NASA Technical Reports Server (NTRS)

Bernstein, R.

1973-01-01

ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to those of the NDPF electro-optical processor.

17. Fundamental remote sensing science research program. Part 1: Status report of the mathematical pattern recognition and image analysis project

NASA Technical Reports Server (NTRS)

Heydorn, R. D.

1984-01-01

The Mathematical Pattern Recognition and Image Analysis (MPRIA) Project is concerned with basic research problems related to the study of the Earth from remotely sensed measurements of its surface characteristics. The program goal is to better understand how to analyze the digital image that represents the spatial, spectral, and temporal arrangement of these measurements for purposes of making selected inferences about the Earth.

18. Construction of mathematical models of extraction processes with nonlocal conditions by a spatial variable

Orazov, Issabek; Ayaz, Sultanbek Zh.

2016-08-01

In this paper, we consider issues of constructing mathematical models of extraction processes from solid polydisperse porous materials. The models take into account the porous structure of the particles and the dependence of the residence time of fractions in the extractant on particle size, and are based on inverse problems of recovering the coefficients of diffusion processes under various variants of boundary conditions in a spatial variable.

19. FLIPS: Friendly Lisp Image Processing System

Gee, Shirley J.

1991-08-01

The Friendly Lisp Image Processing System (FLIPS) is the interface to Advanced Target Detection (ATD), a multi-resolutional image analysis system developed by Hughes in conjunction with the Hughes Research Laboratories. Both menu- and graphics-driven, FLIPS enhances system usability by supporting the interactive nature of research and development. Although much progress has been made, fully automated image understanding technology that is both robust and reliable is not a reality. In situations where highly accurate results are required, skilled human analysts must still verify the findings of these systems. Furthermore, the systems often require processing times several orders of magnitude greater than that needed by veteran personnel to analyze the same image. The purpose of FLIPS is to facilitate the ability of an image analyst to take statistical measurements on digital imagery in a timely fashion, a capability critical in research environments where a large percentage of time is expended in algorithm development. In many cases, this entails minor modifications or code tinkering. Without a well-developed man-machine interface, throughput is unduly constricted. FLIPS provides mechanisms which support rapid prototyping for ATD. This paper examines the ATD/FLIPS system. The philosophy of ATD in addressing image understanding problems is described, and the capabilities of FLIPS are discussed, along with a description of the interaction between ATD and FLIPS. Finally, an overview of current plans for the system is outlined.

20. Cognitive processing and mathematical achievement: a study with schoolchildren between fourth and sixth grade of primary education.

PubMed

Iglesias-Sarmiento, Valentín; Deaño, Manuel

2011-01-01

This investigation analyzed the relation between cognitive functioning and mathematical achievement in 114 students in fourth, fifth, and sixth grades. Differences in cognitive performance were studied concurrently in three selected achievement groups: mathematical learning disability group (MLD), low achieving group (LA), and typically achieving group (TA). For this purpose, performance in verbal memory and in the PASS cognitive processes of planning, attention, and simultaneous and successive processing was assessed at the end of the academic course. Correlational analyses showed that phonological loop and successive and simultaneous processing were related to mathematical achievement at all three grades. Regression analysis revealed simultaneous processing as a cognitive predictor of mathematical performance, although phonological loop was also associated with higher achievement. Simultaneous and successive processing were the elements that differentiated the MLD group from the LA group. These results show that, of all the variables analyzed in this study, simultaneous processing was the best predictor of mathematical performance. PMID:21444928

1. Product review: lucis image processing software.

PubMed

Johnson, J E

1999-04-01

Lucis is a software program that allows the manipulation of images through the process of selective contrast pattern emphasis. Using an image-processing algorithm called Differential Hysteresis Processing (DHP), Lucis extracts and highlights patterns based on variations in image intensity (luminance). The result is that details can be seen that would otherwise be hidden in deep shadow or excessive brightness. The software is contained on a single floppy disk, is easy to install on a PC, simple to use, and runs on Windows 95, Windows 98, and Windows NT operating systems. The cost is $8,500 for a license, but is estimated to save a great deal of money in photographic materials, time, and labor that would have otherwise been spent in the darkroom. Superb images are easily obtained from unstained (no lead or uranium) sections, and stored image files sent to laser printers are of publication quality. The software can be used not only for all types of microscopy, including color fluorescence light microscopy, biological and materials science electron microscopy (TEM and SEM), but will be beneficial in medicine, such as X-ray films (pending approval by the FDA), and in the arts. PMID:10206154

2. Processing Images of Craters for Spacecraft Navigation

NASA Technical Reports Server (NTRS)

Cheng, Yang; Johnson, Andrew E.; Matthies, Larry H.

2009-01-01

A crater-detection algorithm has been conceived to enable automation of what, heretofore, have been manual processes for utilizing images of craters on a celestial body as landmarks for navigating a spacecraft flying near or landing on that body. The images are acquired by an electronic camera aboard the spacecraft, then digitized, then processed by the algorithm, which consists mainly of the following steps: 1. Edges in an image are detected and placed in a database. 2. Crater rim edges are selected from the edge database. 3. Edges that belong to the same crater are grouped together. 4. An ellipse is fitted to each group of crater edges. 5. Ellipses are refined directly in the image domain to reduce errors introduced in the detection of edges and fitting of ellipses. 6. The quality of each detected crater is evaluated. It is planned to utilize this algorithm as the basis of a computer program for automated, real-time, onboard processing of crater-image data. Experimental studies have led to the conclusion that this algorithm is capable of a detection rate >93 percent, a false-alarm rate <5 percent, a geometric error <0.5 pixel, and a position error <0.3 pixel.
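
Step 4, fitting an ellipse to a group of crater-edge points, can be posed as a linear least-squares problem. A simplified sketch (the flight algorithm's exact formulation is not given in the abstract): solve A x² + B xy + C y² + D x + E y = 1 over the points via the normal equations.

```python
def fit_ellipse(xs, ys):
    """Least-squares conic fit: returns [A, B, C, D, E] minimizing the
    algebraic residual of A x^2 + B xy + C y^2 + D x + E y = 1."""
    rows = [[x * x, x * y, y * y, x, y] for x, y in zip(xs, ys)]
    n = 5
    # Normal equations (M^T M) c = M^T 1
    mtm = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    mtb = [sum(r[i] for r in rows) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(mtm[r][col]))
        mtm[col], mtm[piv] = mtm[piv], mtm[col]
        mtb[col], mtb[piv] = mtb[piv], mtb[col]
        for r in range(col + 1, n):
            f = mtm[r][col] / mtm[col][col]
            for c in range(col, n):
                mtm[r][c] -= f * mtm[col][c]
            mtb[r] -= f * mtb[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (mtb[i] - sum(mtm[i][j] * coef[j]
                                for j in range(i + 1, n))) / mtm[i][i]
    return coef
```

For edge points lying on a circle of radius 2 about the origin, the fit recovers A = C = 0.25 and B = D = E = 0, i.e. the conic x²/4 + y²/4 = 1.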

3. Onboard Image Processing System for Hyperspectral Sensor.

PubMed

Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

2015-01-01

Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which lead to reducing onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, and fabrication cost. PMID:26404281
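
The residual-coding step of a FELICS-style coder can be sketched compactly. The example below fixes the Rice parameter k and zigzag-maps signed residuals; the flight implementation's adaptive parameter selection and context modeling are omitted, and the predictor shown is a simplified neighbor average rather than the paper's exact two-dimensional interpolation scheme.

```python
def predict(left, up):
    """Simplified 2-D predictor: average of the left and upper
    neighbors (real FELICS-style context modeling is more elaborate)."""
    return (left + up) // 2

def golomb_rice_encode(residual, k):
    """Encode one signed prediction residual with fixed Rice parameter
    k: zigzag-map it to a non-negative integer u, then emit the
    quotient u >> k in unary ('1's terminated by '0') and the low k
    bits of u as the remainder."""
    u = (residual << 1) if residual >= 0 else (-residual << 1) - 1
    q, r = u >> k, u & ((1 << k) - 1)
    bits = '1' * q + '0'
    if k:
        bits += format(r, '0{}b'.format(k))
    return bits
```

Small residuals, which dominate after good prediction, thus map to short codewords, which is what makes the scheme fast and effective for decorrelated image data.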

4. Onboard Image Processing System for Hyperspectral Sensor

PubMed Central

Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

2015-01-01

Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS’s performance of image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which lead to reducing onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, and fabrication cost. PMID:26404281

5. Feedback regulation of microscopes by image processing.

PubMed

2013-05-01

Computational microscope systems are becoming a major part of imaging biological phenomena, and the development of such systems requires the design of automated regulation of microscopes. An important aspect of automated regulation is feedback regulation, which is the focus of this review. As modern microscope systems become more complex, often with many independent components that must work together, computer control is inevitable, since the exact orchestration of parameters and timings for these multiple components is critical to acquire proper images. A number of techniques have been developed for biological imaging to accomplish this. Here, we summarize the basics of computational microscopy for the purpose of building automatically regulated microscopes, with a focus on feedback regulation by image processing. These techniques allow high-throughput data acquisition while monitoring both short- and long-term dynamic phenomena, which cannot be achieved without an automated system. PMID:23594233
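
A concrete instance of feedback regulation by image processing is software autofocus: score each acquired image with a sharpness metric and move the stage toward higher scores. A minimal sketch (the metric and the hill-climbing policy are common generic choices, not ones prescribed by the review; `acquire` is a hypothetical camera callback returning a 2-D intensity array):

```python
def focus_metric(img):
    """Sharpness proxy: mean squared discrete Laplacian over interior
    pixels of a 2-D intensity array (list of lists)."""
    h, w = len(img), len(img[0])
    total = n = 0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            lap = (img[i - 1][j] + img[i + 1][j] +
                   img[i][j - 1] + img[i][j + 1] - 4 * img[i][j])
            total += lap * lap
            n += 1
    return total / n

def autofocus(acquire, z0, step=1.0, iters=10):
    """Feedback loop: acquire an image at stage position z, score it,
    and hill-climb toward higher sharpness, refining the step size
    when no move improves the score."""
    z = z0
    best = focus_metric(acquire(z))
    for _ in range(iters):
        for dz in (step, -step):
            score = focus_metric(acquire(z + dz))
            if score > best:
                z, best = z + dz, score
                break
        else:
            step /= 2.0
    return z
```

The same acquire-score-actuate loop structure underlies other feedback tasks such as exposure control or stage drift correction.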

6. FITSH: Software Package for Image Processing

Pál, András

2011-11-01

FITSH provides a standalone environment for analysis of data acquired by imaging astronomical detectors. The package provides utilities both for the full pipeline of subsequent related data processing steps (including image calibration, astrometry, source identification, photometry, differential analysis, low-level arithmetic operations, multiple image combinations, spatial transformations and interpolations, etc.) and for aiding the interpretation of the (mainly photometric and/or astrometric) results. The package also features a consistent implementation of photometry based on image subtraction, point spread function fitting and aperture photometry, and provides easy-to-use interfaces for comparisons and for picking the most suitable method for a particular problem. The utilities in the package are built on top of the commonly used UNIX/POSIX shells (hence the name of the package); therefore, frequently used and well-documented tools for such environments can be exploited, and managing massive amounts of data is rather convenient.

7. Simplified labeling process for medical image segmentation.

PubMed

Gao, Mingchen; Huang, Junzhou; Huang, Xiaolei; Zhang, Shaoting; Metaxas, Dimitris N

2012-01-01

Image segmentation plays a crucial role in many medical imaging applications by automatically locating the regions of interest. Typically, supervised learning based segmentation methods require a large set of accurately labeled training data. However, the labeling process is tedious, time consuming and sometimes not necessary. We propose a robust logistic regression algorithm to handle label outliers, such that doctors do not need to waste time on precisely labeling images for the training set. To validate its effectiveness and efficiency, we conduct carefully designed experiments on cervigram image segmentation in the presence of label outliers. Experimental results show that the proposed robust logistic regression algorithms achieve superior performance compared to previous methods, which validates the benefits of the proposed algorithms. PMID:23286072
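The abstract does not give the authors' exact formulation. As a rough illustration of the idea, the sketch below fits a logistic regression by gradient descent while dropping the gradient contribution of samples whose labels the current model strongly contradicts; the trimming rule and the threshold `c` are our assumptions, not the paper's algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def robust_logreg(X, y, iters=500, lr=0.1, c=0.7):
    """Logistic regression with trimming of likely label outliers.
    A sample whose predicted probability contradicts its label by
    more than c is excluded from the gradient at that iteration."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ w)
        resid = np.abs(y - p)                 # 1.0 = total contradiction
        weight = (resid <= c).astype(float)   # hard trimming
        w -= lr * X.T @ (weight * (p - y)) / len(y)
    return w
```

On toy 1-D data with one flipped label, the flipped point ends up trimmed rather than dragging the decision boundary toward it; softer weighting schemes are common variants of the same idea.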

8. Enhanced neutron imaging detector using optical processing

SciTech Connect

Hutchinson, D.P.; McElhaney, S.A.

1992-08-01

Existing neutron imaging detectors have limited count rates due to inherent property and electronic limitations. The popular multiwire proportional counter is limited by gas recombination to a count rate of less than 10{sup 5} n/s over the entire array, and the neutron Anger camera, even though improved with new fiber optic encoding methods, can only achieve 10{sup 6} cps over a limited array. We present a preliminary design for a new type of neutron imaging detector with a resolution of 2--5 mm and a count rate capability of 10{sup 6} cps per pixel element. We propose to combine optical and electronic processing to economically increase the throughput of advanced detector systems while simplifying computing requirements. By placing a scintillator screen ahead of an optical image processor followed by a detector array, a high throughput imaging detector may be constructed.

9. Mariner 9 - Image processing and products.

NASA Technical Reports Server (NTRS)

Levinthal, E. C.; Green, W. B.; Cutts, J. A.; Jahelka, E. D.; Johansen, R. A.; Sander, M. J.; Seidman, J. B.; Young, A. T.; Soderblom, L. A.

1972-01-01

The purpose of this paper is to describe the system for the display, processing, and production of image data products created to support the Mariner 9 Television Experiment. Of necessity, the system was large in order to respond to the needs of a large team of scientists with a broad scope of experimental objectives. The desire to generate processed data products as rapidly as possible to take advantage of adaptive planning during the mission, coupled with the complexities introduced by the nature of the vidicon camera, greatly increased the scale of the ground image processing effort. This paper describes the systems that carried out the processes and delivered the products necessary for real-time and near-real-time analyses. References are made to the computer algorithms used for the different levels of decalibration and analysis.

10. Mariner 9 - Image processing and products.

NASA Technical Reports Server (NTRS)

Levinthal, E. C.; Green, W. B.; Cutts, J. A.; Jahelka, E. D.; Johansen, R. A.; Sander, M. J.; Seidman, J. B.; Young, A. T.; Soderblom, L. A.

1973-01-01

The purpose of this paper is to describe the system for the display, processing, and production of image-data products created to support the Mariner 9 Television Experiment. Of necessity, the system was large in order to respond to the needs of a large team of scientists with a broad scope of experimental objectives. The desire to generate processed data products as rapidly as possible, coupled with the complexities introduced by the nature of the vidicon camera, greatly increased the scale of the ground-image processing effort. This paper describes the systems that carried out the processes and delivered the products necessary for real-time and near-real-time analyses. References are made to the computer algorithms used for the different levels of decalibration and analysis.

11. Mariner 9-Image processing and products

USGS Publications Warehouse

Levinthal, E.C.; Green, W.B.; Cutts, J.A.; Jahelka, E.D.; Johansen, R.A.; Sander, M.J.; Seidman, J.B.; Young, A.T.; Soderblom, L.A.

1973-01-01

The purpose of this paper is to describe the system for the display, processing, and production of image-data products created to support the Mariner 9 Television Experiment. Of necessity, the system was large in order to respond to the needs of a large team of scientists with a broad scope of experimental objectives. The desire to generate processed data products as rapidly as possible to take advantage of adaptive planning during the mission, coupled with the complexities introduced by the nature of the vidicon camera, greatly increased the scale of the ground-image processing effort. This paper describes the systems that carried out the processes and delivered the products necessary for real-time and near-real-time analyses. References are made to the computer algorithms used for the different levels of decalibration and analysis. © 1973.

12. Web-based document image processing

Walker, Frank L.; Thoma, George R.

1999-12-01

Increasing numbers of research libraries are turning to the Internet for electronic interlibrary loan and for document delivery to patrons. This has been made possible through the widespread adoption of software such as Ariel and DocView. Ariel, a product of the Research Libraries Group, converts paper-based documents to monochrome bitmapped images and delivers them over the Internet. The National Library of Medicine's DocView is primarily designed for library patrons. While libraries and their patrons are beginning to reap the benefits of this new technology, barriers exist, e.g., differences in image file format, that lead to difficulties in the use of library document information. To research how to overcome such barriers, the Communications Engineering Branch of the Lister Hill National Center for Biomedical Communications, an R and D division of NLM, has developed a web site called the DocMorph Server. This is part of an ongoing intramural R and D program in document imaging that has spanned many aspects of electronic document conversion and preservation, Internet document transmission and document usage. The DocMorph Server web site is designed to fill two roles. First, in a role that will benefit both libraries and their patrons, it allows Internet users to upload scanned image files for conversion to alternative formats, thereby enabling wider delivery and easier usage of library document information. Second, the DocMorph Server provides the design team an active test bed for evaluating the effectiveness and utility of new document image processing algorithms and functions, so that they may be evaluated for possible inclusion in other image processing software products being developed at NLM or elsewhere. This paper describes the design of the prototype DocMorph Server and the image processing functions being implemented on it.

13. Digital image processing of vascular angiograms

NASA Technical Reports Server (NTRS)

Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.

1975-01-01

The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.

14. A mathematical model for estimating the axial stress of the common carotid artery wall from ultrasound images.

PubMed

Soleimani, Effat; Mokhtari-Dizaji, Manijhe; Saberi, Hajir; Sharif-Kashani, Shervin

2016-08-01

Clarifying the complex interaction between mechanical and biological processes in healthy and diseased conditions requires constitutive models for arterial walls. In this study, a mathematical model for the displacement of the carotid artery wall in the longitudinal direction is defined, providing a satisfactory representation of the axial stress applied to the arterial wall. The proposed model was applied to the carotid artery wall motion estimated from ultrasound image sequences of 10 healthy adults, and the axial stress waveform exerted on the artery wall was extracted. Consecutive ultrasonic images (30 frames per second) of the common carotid artery of 10 healthy subjects (age 44 ± 4 years) were recorded and transferred to a personal computer. Longitudinal displacement and acceleration were extracted from ultrasonic image processing using a block-matching algorithm. Furthermore, images were examined using a maximum gradient algorithm, and time rate changes of the internal diameter and intima-media thickness were extracted. Finally, axial stress was estimated using an appropriate constitutive equation for thin-walled tubes. Performance of the proposed model was evaluated using goodness of fit between approximated and measured longitudinal displacement statistics. Goodness-of-fit statistics indicated a high quality of fit for all investigated subjects, with a mean adjusted R-square of 0.86 ± 0.08 and a root mean squared error of 0.08 ± 0.04 mm. According to the results of the present study, the maximum and minimum axial stresses exerted on the arterial wall are 1.7 ± 0.6 and -1.5 ± 0.5 kPa, respectively. These results reveal the potential of this technique to provide a new method to assess arterial stress from ultrasound images, overcoming the limitations of the finite element and other simulation techniques. PMID:26563198
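Block matching of the kind used above tracks a reference block from one frame inside a search window of the next frame. The sketch below is a generic normalised cross-correlation block matcher, not the authors' specific algorithm; the block size and search range are arbitrary illustrative values.

```python
import numpy as np

def block_match(prev, curr, top, left, size=8, search=4):
    """Find the displacement (dy, dx) of a size x size block from `prev`
    within a +/- `search` pixel window in `curr`, by maximising the
    zero-mean normalised cross-correlation."""
    block = prev[top:top + size, left:left + size].astype(float)
    block -= block.mean()
    best_score, best = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > curr.shape[0] or x + size > curr.shape[1]:
                continue  # candidate window falls outside the frame
            cand = curr[y:y + size, x:x + size].astype(float)
            cand -= cand.mean()
            denom = np.sqrt((block ** 2).sum() * (cand ** 2).sum())
            if denom == 0:
                continue
            score = (block * cand).sum() / denom
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

Applied frame by frame along the arterial wall, the per-frame displacements give the longitudinal motion waveform whose derivatives feed the stress model.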

15. Motivational Beliefs and Cognitive Processes in Mathematics Achievement, Analyzed in the Context of Cultural Differences: A Korean Elementary School Example

ERIC Educational Resources Information Center

Seo, Daeryong; Taherbhai, Husein

2009-01-01

The relations among students' motivational beliefs, cognitive processes, and academic achievement were investigated. A 51-item questionnaire together with a mathematics achievement test was administered to 459 fifth graders in Korean elementary school mathematics classrooms. Results indicated that, in general, students' cognitive processes related…

16. Individual Differences in Working Memory, Nonverbal IQ, and Mathematics Achievement and Brain Mechanisms Associated with Symbolic and Nonsymbolic Number Processing

ERIC Educational Resources Information Center

Gullick, Margaret M.; Sprute, Lisa A.; Temple, Elise

2011-01-01

Individual differences in mathematics performance may stem from domain-general factors like working memory and intelligence. Parietal and frontal brain areas have been implicated in number processing, but the influence of such cognitive factors on brain activity during mathematics processing is not known. The relationship between brain mechanisms…

17. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

ERIC Educational Resources Information Center

Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

2013-01-01

The aim of the present study is to conceptualize the approaches displayed for validation of model and thought processes provided in mathematical modeling process performed in technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

18. Mathematical models in simulation process in rehabilitation of persons with disabilities

Gorie, Nina; Dolga, Valer; Mondoc, Alina

2012-11-01

The problems of people with disabilities are varied. A disability may be physical, cognitive, mental, sensory, emotional, developmental, or some combination of these. Major disabilities that can appear in people's lives include blindness, deafness, limb-girdle muscular dystrophy, orthopedic impairment, and visual impairment. Disability is an umbrella term, covering impairments, activity limitations and participation restrictions. A disability may occur during a person's lifetime or may be present from birth. The authors note that some of these disabilities (physical, cognitive, mental, sensory, emotional, developmental) can be rehabilitated. Starting from this state of affairs, the authors briefly present the possibility of using certain mechatronic systems for the rehabilitation of persons with different disabilities, focusing on the alternative of a Stewart platform to achieve the proposed goal. They present a systems-theory mathematical model of this parallel structure and describe its contents, and analyze a mathematical model describing the rehabilitation procedure. From the biomechanics of the affected function, and taking medical recommendations into account, the authors illustrate the mathematical models of the rehabilitation work. They assemble an overall mathematical model of the parallel structure together with the rehabilitation process, run simulations, and highlight the estimated results. The paper closes with the results of the analysis, conclusions, and steps for a future work program.

19. Progressive band processing for hyperspectral imaging

Schultz, Robert C.

Hyperspectral imaging has emerged as an image processing technique in many applications. The data are called hyperspectral mainly because of the massive amount of information provided by the hundreds of spectral bands that can be used for data analysis. However, due to very high band-to-band correlation, much information may also be redundant. Consequently, how to effectively and best utilize such rich spectral information becomes very challenging. One general approach is data dimensionality reduction, which can be performed by data compression techniques, such as data transforms, and data reduction techniques, such as band selection. This dissertation presents a new area in hyperspectral imaging, called progressive hyperspectral imaging, which has not been explored in the past. Specifically, it derives a new theory, called Progressive Band Processing (PBP), of hyperspectral data that can significantly reduce computing time and can also be realized in real time. It is particularly suited for application areas such as hyperspectral data communications and transmission, where data can be communicated and transmitted progressively through spectral or satellite channels with limited data storage. Most importantly, PBP allows users to screen preliminary results before deciding to continue with processing the complete data set. These advantages benefit users of hyperspectral data by reducing processing time and increasing the timeliness of crucial decisions made based on the data, such as identifying key intelligence information when a required response time is short.
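The progressive idea can be illustrated with a toy example: a matched-filter score per pixel that is refined after every band received, so preliminary maps can be screened before the full cube arrives. This is only a sketch of the concept under our own simplifications, not the PBP theory derived in the dissertation.

```python
import numpy as np

def progressive_scores(cube, target):
    """Yield a per-pixel matched-filter score after each new band,
    using only the bands received so far.
    cube: (bands, pixels) array; target: (bands,) signature."""
    acc = np.zeros(cube.shape[1])   # running <target, pixel> inner product
    norm = 0.0                      # running ||target||^2 over seen bands
    for l in range(cube.shape[0]):
        acc = acc + target[l] * cube[l]
        norm += target[l] ** 2
        yield acc / norm            # score from bands 0..l only
```

Because each update touches one band, the cost per step is constant and intermediate results are available immediately, which is the operational appeal of band-by-band processing.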

20. Calculation procedures for oil free scroll compressors based on mathematical modelling of working process

Paranin, Y.; Burmistrov, A.; Salikeev, S.; Fomina, M.

2015-08-01

Basic propositions of calculation procedures for the characteristics of oil-free scroll compressors are presented. It is shown that mathematical modelling of the working process in a scroll compressor makes it possible to take into account such factors influencing the working process as heat and mass exchange, mechanical interaction in working chambers, leakage through slots, etc. The basic mathematical model may be supplemented by taking into account external heat exchange, elastic deformation of scrolls, inlet and outlet losses, etc. To evaluate the influence of the procedure on the accuracy of the calculated scroll compressor characteristics, different calculations were carried out. Internal adiabatic efficiency was chosen as the comparative parameter, since it evaluates the perfection of the internal thermodynamic and gas-dynamic compressor processes. Calculated characteristics are compared with experimental values obtained for a compressor pilot sample.

1. Improving Synthetic Aperture Image by Image Compounding in Beamforming Process

Martínez-Graullera, Oscar; Higuti, Ricardo T.; Martín, Carlos J.; Ullate, Luis. G.; Romero, David; Parrilla, Montserrat

2011-06-01

In this work, signal processing techniques are used to improve the quality of images based on multi-element synthetic aperture techniques. Using several apodization functions to obtain different side lobe distributions, a polarity function and a threshold criterion are used to develop an image compounding technique. The spatial diversity is increased using an additional array, which generates complementary information about the defects, improving the results of the proposed algorithm and producing high resolution and contrast images. The inspection of isotropic plate-like structures using linear arrays and Lamb waves is presented. Experimental results are shown for a 1-mm-thick isotropic aluminum plate with artificial defects, using linear arrays formed by 30 piezoelectric elements with the low-dispersion symmetric mode S0 at a frequency of 330 kHz.
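The compounding rule in the abstract combines images beamformed with different apodizations using a polarity function and a threshold criterion. The sketch below is our simplified stand-in for that style of rule: main lobes coincide across apodizations while side lobes differ, so requiring polarity agreement and keeping the smaller magnitude suppresses side-lobe artefacts. The exact rule and the threshold value are assumptions, not the authors' formulation.

```python
import numpy as np

def compound(rf_a, rf_b, thresh=0.25):
    """Compound two beamformed RF images obtained with different
    apodizations: keep a sample only where both agree in polarity and
    the smaller magnitude exceeds `thresh`; otherwise output zero."""
    same_sign = np.sign(rf_a) == np.sign(rf_b)
    mag = np.minimum(np.abs(rf_a), np.abs(rf_b))
    return np.where(same_sign & (mag > thresh), np.sign(rf_a) * mag, 0.0)
```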

2. Limiting liability via high resolution image processing

SciTech Connect

1996-12-31

The utilization of high resolution image processing allows forensic analysts and visualization scientists to assist detectives by enhancing field photographs, and by providing the tools and training to increase the quality and usability of field photos. Through the use of digitized photographs and computerized enhancement software, field evidence can be obtained and processed as 'evidence ready', even in poor lighting and shadowed conditions or darkened rooms. These images, which are most often unusable when taken with standard camera equipment, can be shot in the worst of photographic conditions and be processed as usable evidence. Visualization scientists have taken digital photographic image processing and moved the processing of crime scene photos into the technology age. The use of high resolution technology will assist law enforcement in making better use of crime scene photography and positive identification of prints. Valuable courtroom and investigation time can be saved by this accurate, performance-based process. Inconclusive evidence does not lead to convictions. Enhancement addresses a major problem with crime scene photos: taken with standard equipment and without the benefit of enhancement software, they would often be inconclusive, allowing guilty parties to go free for lack of evidence.

3. Processing Infrared Images For Fire Management Applications

Warren, John R.; Pratt, William K.

1981-12-01

The USDA Forest Service has used airborne infrared systems for forest fire detection and mapping for many years. The transfer of the images from plane to ground and the transposition of fire spots and perimeters to maps has been performed manually. A new system has been developed which uses digital image processing, transmission, and storage. Interactive graphics, high resolution color display, calculations, and computer model compatibility are featured in the system. Images are acquired by an IR line scanner and converted to 1024 x 1024 x 8 bit frames for transmission to the ground at a 1.544 Mbit/s rate over a 14.7 GHz carrier. Individual frames are received and stored, then transferred to a solid state memory to refresh the display at a conventional 30 frames per second rate. Line length and area calculations, false color assignment, X-Y scaling, and image enhancement are available. Fire spread can be calculated for display and fire perimeters plotted on maps. The performance requirements, basic system, and image processing will be described.

4. Visual parameter optimisation for biomedical image processing

PubMed Central

2015-01-01

Background Biomedical image processing methods require users to optimise input parameters to ensure high-quality output. This presents two challenges. First, it is difficult to optimise multiple input parameters for multiple input images. Second, it is difficult to achieve an understanding of underlying algorithms, in particular, relationships between input and output. Results We present a visualisation method that transforms users' ability to understand algorithm behaviour by integrating input and output, and by supporting exploration of their relationships. We discuss its application to a colour deconvolution technique for stained histology images and show how it enabled a domain expert to identify suitable parameter values for the deconvolution of two types of images, and metrics to quantify deconvolution performance. It also enabled a breakthrough in understanding by invalidating an underlying assumption about the algorithm. Conclusions The visualisation method presented here provides analysis capability for multiple inputs and outputs in biomedical image processing that is not supported by previous analysis software. The analysis supported by our method is not feasible with conventional trial-and-error approaches. PMID:26329538

5. Subband/transform functions for image processing

NASA Technical Reports Server (NTRS)

Glover, Daniel

1993-01-01

Functions for image data processing written for use with the MATLAB(TM) software package are presented. These functions provide the capability to transform image data with block transformations (such as the Walsh Hadamard) and to produce spatial frequency subbands of the transformed data. Block transforms are equivalent to simple subband systems. The transform coefficients are reordered using a simple permutation to give subbands. The low frequency subband is a low resolution version of the original image, while the higher frequency subbands contain edge information. The transform functions can be cascaded to provide further decomposition into more subbands. If the cascade is applied to all four of the first stage subbands (in the case of a four band decomposition), then a uniform structure of sixteen bands is obtained. If the cascade is applied only to the low frequency subband, an octave structure of seven bands results. Functions for the inverse transforms are also given. These functions can be used for image data compression systems. The transforms do not in themselves produce data compression, but prepare the data for quantization and compression. Sample quantization functions for subbands are also given. A typical compression approach is to subband the image data, quantize it, then use statistical coding (e.g., run-length coding followed by Huffman coding) for compression. Contour plots of image data and subbanded data are shown.
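As a sketch of the block-transform-to-subbands idea (in Python rather than the MATLAB functions described), here is a one-level 2x2 Walsh-Hadamard block transform whose coefficients are regrouped by position into four subbands, plus its inverse; function names are illustrative, not those of the package.

```python
import numpy as np

def subband_2x2(img):
    """2x2 Walsh-Hadamard block transform; coefficients regrouped into
    four quarter-size subbands (LL, LH, HL, HH)."""
    a = img[0::2, 0::2].astype(float)  # top-left of each 2x2 block
    b = img[0::2, 1::2].astype(float)  # top-right
    c = img[1::2, 0::2].astype(float)  # bottom-left
    d = img[1::2, 1::2].astype(float)  # bottom-right
    ll = (a + b + c + d) / 4.0  # low-pass: quarter-resolution image
    lh = (a - b + c - d) / 4.0  # high-pass horizontally
    hl = (a + b - c - d) / 4.0  # high-pass vertically
    hh = (a - b - c + d) / 4.0  # high-pass in both directions
    return ll, lh, hl, hh

def inverse_2x2(ll, lh, hl, hh):
    """Exact inverse of subband_2x2."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = ll + lh + hl + hh
    out[0::2, 1::2] = ll - lh + hl - hh
    out[1::2, 0::2] = ll + lh - hl - hh
    out[1::2, 1::2] = ll - lh - hl + hh
    return out
```

Cascading `subband_2x2` on the `ll` output yields the octave structure described above; applying it to all four subbands yields the uniform sixteen-band structure.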

6. Remote online processing of multispectral image data

Groh, Christine; Rothe, Hendrik

2005-10-01

Within the scope of this paper, a compact and economical data acquisition system for multispectral images is described. It consists of a CCD camera and a liquid crystal tunable filter, in combination with an associated concept for data processing. Despite their limited functionality (e.g. regarding calibration) in comparison with commercial systems such as AVIRIS, the use of these upcoming compact multispectral camera systems can be advantageous in many applications. Additional benefit can be derived by adding online data processing. In order to maintain the system's low weight and price, this work proposes to separate the data acquisition and processing modules, and to transmit pre-processed camera data online to a stationary high performance computer for further processing. The inevitable data transmission has to be optimised because of bandwidth limitations. All mentioned considerations hold especially for applications involving mini unmanned aerial vehicles (mini-UAVs); due to their limited internal payload, the use of a lightweight, compact camera system is of particular importance. This work emphasises the optimal software interface between pre-processed data (from the camera system), transmitted data (regarding small bandwidth) and post-processed data (on the high performance computer). Discussed parameters are pre-processing algorithms, channel bandwidth, and the resulting accuracy in the classification of multispectral image data. The benchmarked pre-processing algorithms include diagnostic statistics, tests of internal determination coefficients, as well as loss-free and lossy data compression methods. The resulting classification precision is computed in comparison to a classification performed with the original image dataset.

7. Color Imaging management in film processing

Tremeau, Alain; Konik, Hubert; Colantoni, Philippe

2003-12-01

The latest research projects in the LIGIV laboratory concern the capture, processing, archiving and display of color images, considering the trichromatic nature of the Human Visual System (HVS). Among these projects, one addresses digital cinematographic film sequences of high resolution and dynamic range. This project aims to optimize the use of content for post-production operators and for the end user. The studies presented in this paper address the use of metadata to optimise the consumption of video content on a device of the user's choice, independent of the nature of the equipment that captured the content. Optimising consumption includes enhancing the quality of image reconstruction on a display. Another part of this project addresses the content-based adaptation of image display. The main focus is on Regions of Interest (ROI) operations, based on the ROI concepts of MPEG-7. The aim of this second part is to characterize and ensure the conditions of display even if the display device or display media changes. This requires, firstly, the definition of a reference color space and of bi-directional color transformations for each peripheral device (camera, display, film recorder, etc.). The complicating factor is that different devices have different color gamuts, depending on the chromaticity of their primaries and the ambient illumination under which they are viewed. To match the displayed image to the intended appearance, all kinds of production metadata (camera specification, camera colour primaries, lighting conditions) should be associated with the film material. Metadata and content together build rich content. The author is assumed to specify conditions as known from digital graphic arts. To control image pre-processing and post-processing, these specifications should be contained in the film's metadata. The specifications are related to ICC profiles but additionally need to consider mesopic viewing conditions.

8. Bitplane Image Coding With Parallel Coefficient Processing.

PubMed

Auli-Llinas, Francesc; Enfedaque, Pablo; Moure, Juan C; Sanchez, Victor

2016-01-01

Image coding systems have been traditionally tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in the codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data. Most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep, synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to their inherently sequential coding tasks. This paper presents bitplane image coding with parallel coefficient (BPC-PaCo) processing, a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been re-formulated. The experimental results suggest that the penalization in coding performance of BPC-PaCo with respect to the traditional strategies is almost negligible. PMID:26441420
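Bitplane coding scans coefficients one bit plane at a time; within a plane, each coefficient contributes a single bit, which is the structure a SIMD coder like BPC-PaCo can process in lockstep. A minimal sketch of plane extraction and reconstruction (not the paper's coder):

```python
import numpy as np

def bitplanes(coeffs, nbits=8):
    """Split non-negative integer coefficients into bit planes,
    most significant plane first."""
    return [(coeffs >> b) & 1 for b in range(nbits - 1, -1, -1)]

def from_bitplanes(planes):
    """Reassemble coefficients from most-significant-first planes."""
    nbits = len(planes)
    return sum(p << (nbits - 1 - i) for i, p in enumerate(planes))
```

In a full coder, the bits of one plane across a codeblock are context-modelled and entropy-coded before moving to the next, less significant plane, giving an embedded (quality-progressive) bitstream.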

9. [Digital thoracic radiology: devices, image processing, limits].

PubMed

Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E

2001-09-01

In the first part, the different techniques of digital thoracic radiography are described. Since computed radiography with phosphor plates is the most widely commercialized, it is emphasized most, but the other detectors are also described, such as the selenium-coated drum and direct digital radiography with selenium detectors. Indirect flat-panel detectors and a system with four high-resolution CCD cameras are also studied. In the second part, the most important image processing techniques are discussed: gradation curves, unsharp mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part, the advantages and drawbacks of computed thoracic radiography are discussed. The most important advantages are the almost constant good quality of the pictures and the possibilities of image processing. PMID:11567193

10. Image processing via VLSI: A concept paper

NASA Technical Reports Server (NTRS)

Nathan, R.

1982-01-01

Implementing specific image processing algorithms via very large scale integrated systems offers a potent solution to the problem of handling high data rates. Two algorithms stand out as being particularly critical: geometric map transformation and filtering or correlation. These two functions form the basis for data calibration, registration and mosaicking. VLSI presents itself as an inexpensive ancillary function to be added to almost any general purpose computer, and if the geometry and filter algorithms are implemented in VLSI, the processing rate bottleneck would be significantly relieved. A development effort is described that identifies the image processing functions limiting present systems with respect to future throughput needs, translates these functions into algorithms, implements them via VLSI technology, and interfaces the hardware to a general purpose digital computer.

11. EOS image data processing system definition study

NASA Technical Reports Server (NTRS)

Gilbert, J.; Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.

1973-01-01

The Image Processing System (IPS) requirements and configuration are defined for NASA-sponsored advanced technology Earth Observatory System (EOS). The scope included investigation and definition of IPS operational, functional, and product requirements considering overall system constraints and interfaces (sensor, etc.) The scope also included investigation of the technical feasibility and definition of a point design reflecting system requirements. The design phase required a survey of present and projected technology related to general and special-purpose processors, high-density digital tape recorders, and image recorders.

12. Electronics Signal Processing for Medical Imaging

Turchetta, Renato

This paper describes the way the signal coming from a radiation detector is conditioned and processed to produce images useful for medical applications. First, the small signal produced by the radiation is processed by analogue electronics specifically designed to produce a good signal-to-noise ratio. The optimised analogue signal produced at this stage can then be processed and transformed into digital information that is eventually stored in a computer, where it can be further processed as required. After an introduction to the general requirements of the processing electronics, we review the basic building blocks that process the `tiny' analogue signal coming from a radiation detector. In particular, we analyse how the signal-to-noise ratio of the electronics can be optimised; some exercises, developed in the tutorial, will help in understanding this fundamental part. The blocks needed to process the analogue signal and transform it into a digital code are then described, and a description of electronics systems used for medical imaging concludes the lecture.

13. Removing DC offset and de-noising for inspecting signal based on mathematical morphology filter processing

Yang, Jianning; Sun, Jun; Ni, Jun

2008-12-01

A new signal processing method, based on mathematical morphology filtering, is proposed for removing the DC offset and noise pollution from on-site sampled data. By analyzing the inherent meaning of the basic operations of mathematical morphology, and then constructing a combination morphological filter from Maragos-type open-close and close-open filters, it is found that the ratio of the signal's numerical frequency to the length of the flat structuring element determines the attenuation magnitude of the signal, and a principle for choosing the length of the flat structuring element is established. The method offers high calculation speed, easy hardware implementation, and good practical value. The algorithm was simulated in Matlab, ported to a DSP hardware platform, and applied on-site. It is shown that the proposed method can efficiently process sampled data corrupted by on-site noise and DC offset, and that the processed results meet real-time requirements.
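
The Maragos-style open-close/close-open combination described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' DSP code; the structuring-element length and the equal-weight averaging of the two cascades are assumptions:

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def morph_filter(signal, length):
    """Average of open-close and close-open cascades with a flat
    structuring element of the given length."""
    oc = grey_closing(grey_opening(signal, size=length), size=length)
    co = grey_opening(grey_closing(signal, size=length), size=length)
    return 0.5 * (oc + co)
```

Averaging the two cascades cancels the opposite biases of opening (which clips peaks) and closing (which fills valleys); as the abstract notes, the attenuation depends on the ratio of the signal frequency to the structuring-element length.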

14. An image-processing system, motion analysis oriented (IPS-100), applied to microscopy.

PubMed

Gualtieri, P; Coltelli, P

1991-09-01

This paper describes a real-time video image processing system, suitable for image analysis of stationary and moving images. It consists of a high-quality microscope, a general-purpose personal computer, a commercially available image-processing hardware module plugged into the computer bus, a b/w TV-camera, video monitors and a software package. The structure and the capability of this system are explained. The software is menu-driven and performs real-time image enhancements, real-time mathematical and morphological filters, image segmentation and labelling, real-time identification of moving objects, and real-time analysis of their movements. The program is available in listing form. PMID:1760921

15. Mathematical modeling of thermal processing of individual solid-fuel particles

SciTech Connect

Patskov, V.P.; Dudnik, A.N.; Anishchenko, A.A.

1995-08-01

A mathematical model, an algorithm, and a program for calculating the thermal processing of individual solid-fuel particles are developed, accounting for moisture evaporation, escape of volatiles, and burn-out of the carbon residue. Numerical calculations of the influence of regime conditions on the gasification-combustion of individual particles of Chelyabinsk brown coal are performed, and a comparison with experiment is made.

16. Computer image processing in marine resource exploration

NASA Technical Reports Server (NTRS)

Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

1976-01-01

Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.

17. IMAGE 100: The interactive multispectral image processing system

NASA Technical Reports Server (NTRS)

Schaller, E. S.; Towles, R. W.

1975-01-01

The need for rapid, cost-effective extraction of useful information from vast quantities of multispectral imagery available from aircraft or spacecraft has resulted in the design, implementation and application of a state-of-the-art processing system known as IMAGE 100. Operating on the general principle that all objects or materials possess unique spectral characteristics or signatures, the system uses this signature uniqueness to identify similar features in an image by simultaneously analyzing signatures in multiple frequency bands. Pseudo-colors, or themes, are assigned to features having identical spectral characteristics. These themes are displayed on a color CRT, and may be recorded on tape, film, or other media. The system was designed to incorporate key features such as interactive operation, user-oriented displays and controls, and rapid-response machine processing. Owing to these features, the user can readily control and/or modify the analysis process based on his knowledge of the input imagery. Effective use can be made of conventional photographic interpretation skills and state-of-the-art machine analysis techniques in the extraction of useful information from multispectral imagery. This approach results in highly accurate multitheme classification of imagery in seconds or minutes rather than the hours often involved in processing using other means.
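
The signature-matching idea described above can be illustrated with a deliberately simplified per-pixel classifier. This nearest-signature rule is a generic stand-in, not the actual IMAGE 100 logic (which relied on interactively trained signature limits in each band):

```python
import numpy as np

def classify_themes(cube, signatures):
    """Assign each pixel the index ('theme') of the nearest spectral
    signature, measured across all bands.

    cube: (H, W, B) multispectral image; signatures: (K, B) array.
    """
    diff = cube[:, :, None, :] - signatures[None, None, :, :]
    return np.linalg.norm(diff, axis=-1).argmin(axis=2)  # (H, W) theme map
```

The resulting theme map is what a system like IMAGE 100 would display in pseudo-colors on a CRT.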

18. Students, Computers and Mathematics the Golden Trilogy in the Teaching-Learning Process

ERIC Educational Resources Information Center

García-Santillán, Arturo; Escalera-Chávez, Milka Elena; López-Morales, José Satsumi; Córdova Rangel, Arturo

2014-01-01

In this paper we examine the relationships between students' attitudes towards mathematics and technology. To this end, we use Galbraith and Hines' scale (1998, 2000) covering mathematics confidence, computer confidence, computer-mathematics interaction, mathematics motivation, computer motivation, and mathematics engagement. 164…

19. Sorting Olive Batches for the Milling Process Using Image Processing.

PubMed

Aguilera Puerto, Daniel; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan

2015-01-01

The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, and to classify automatically different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing has been employed, and two classification techniques have been used: these are discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729

20. Sorting Olive Batches for the Milling Process Using Image Processing

PubMed Central

Puerto, Daniel Aguilera; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan

2015-01-01

The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, and to classify automatically different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing has been employed, and two classification techniques have been used: these are discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729
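
The feature-extraction step described above, building a feature vector from the image histogram, can be sketched as follows. The nearest-centroid rule stands in for the discriminant analysis and neural networks used in the paper, and the bin count is an arbitrary choice:

```python
import numpy as np

def histogram_feature(image, bins=16):
    """Normalized grey-level histogram used as the feature vector."""
    counts, _ = np.histogram(image, bins=bins, range=(0, 256))
    return counts / counts.sum()

def classify(feature, centroids):
    """Assign the class whose mean feature vector is nearest."""
    return int(np.linalg.norm(centroids - feature, axis=1).argmin())
```

In practice each class centroid would be the mean feature vector of labelled training samples (e.g. ground vs. tree olives).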

1. Digital image processing of vascular angiograms

NASA Technical Reports Server (NTRS)

Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.

1975-01-01

A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.

2. IPLIB (Image processing library) user's manual

NASA Technical Reports Server (NTRS)

Faulcon, N. D.; Monteith, J. H.; Miller, K.

1985-01-01

IPLIB is a collection of HP FORTRAN 77 subroutines and functions that facilitate the use of a COMTAL image processing system driven by an HP-1000 computer. It is intended for programmers who want to use the HP 1000 to drive the COMTAL Vision One/20 system. It is assumed that the programmer knows HP 1000 FORTRAN 77 or at least one FORTRAN dialect. It is also assumed that the programmer has some familiarity with the COMTAL Vision One/20 system.

3. Novel image processing approach to detect malaria

Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

2015-09-01

In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel. Changes in dark pixels indicate intracellular activity, and hence the presence of the malaria parasite inside the cell. Preliminary experimental results, involving analysis of red blood cells that were either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.
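
The core idea, flagging pixels whose intensity varies over time, can be sketched as follows; the per-pixel standard deviation and the fixed threshold are illustrative assumptions, not the authors' exact statistic:

```python
import numpy as np

def activity_map(frames, threshold):
    """Flag pixels whose temporal standard deviation over a stack of
    frames (T, H, W) exceeds a threshold."""
    frames = np.asarray(frames, dtype=float)
    return frames.std(axis=0) > threshold
```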

4. Color Image Processing and Object Tracking System

NASA Technical Reports Server (NTRS)

Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.

1996-01-01

This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high resolution digital camera mounted on an x-y-z micro-positioning stage, an S-VHS tapedeck, a Hi8 tapedeck, a video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system, including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.

5. Processing of space, time, and number contributes to mathematical abilities above and beyond domain-general cognitive abilities.

PubMed

Skagerlund, Kenny; Träff, Ulf

2016-03-01

The current study investigated whether processing of number, space, and time contributes to mathematical abilities beyond previously known domain-general cognitive abilities in a sample of 8- to 10-year-old children (N=133). Multiple regression analyses revealed that executive functions and general intelligence predicted all aspects of mathematics and overall mathematical ability. Working memory capacity did not contribute significantly to our models, whereas spatial ability was a strong predictor of achievement. The study replicates earlier research showing that non-symbolic number processing seems to lose predictive power of mathematical abilities once the symbolic system is acquired. Novel findings include the fact that time discrimination ability was tied to calculation ability. Therefore, a conclusion is that magnitude processing in general contributes to mathematical achievement. PMID:26637947

6. Optical processing of imaging spectrometer data

NASA Technical Reports Server (NTRS)

Liu, Shiaw-Dong; Casasent, David

1988-01-01

The data-processing problems associated with imaging spectrometer data are reviewed, and new algorithms and optical processing solutions are advanced for this computationally intensive application. Optical decision net, directed graph, and neural net solutions are considered; decision nets and mineral element determination of non-mixture data are emphasized here. A new Fisher/minimum-variance clustering algorithm is advanced; initialization using minimum-variance clustering is found to be preferred and fast. Tests on a 500-class problem show the excellent performance of this algorithm.
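
Minimum-variance clustering of the kind used here for initialization is closely related to k-means. The following Lloyd-style sketch is a generic stand-in, not the authors' Fisher/minimum-variance algorithm:

```python
import numpy as np

def kmeans(data, k, iters=20, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-center assignment and
    center recomputation, reducing the within-cluster variance."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.linalg.norm(
            data[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
        for j in range(k):
            members = data[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers
```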

7. A Pilot Study in the Application of the Analytic Hierarchy Process to Predict Student Performance in Mathematics

ERIC Educational Resources Information Center

Warwick, Jon

2007-01-01

The decline in the development of mathematical skills in students prior to university entrance has been a matter of concern to UK higher education staff for a number of years. This article describes a pilot study that uses the Analytic Hierarchy Process to quantify the mathematical experiences of computing students prior to the start of a first…

8. Report of the Coordinators' Training for Large Scale Field Testing of Developing Mathematical Processes. Technical Report No. 296.

ERIC Educational Resources Information Center

Montgomery, Mary E.; Whitaker, Donald R.

This report describes a 1972-73 field test regarding the development of procedures and materials for training coordinators to implement the Developing Mathematical Processes (DMP) program. DMP is a research-based, elementary school mathematics program under development at the Wisconsin Research and Development Center for Cognitive Learning. To…

9. A Three Month Trial of Developing Mathematical Processes (DMP) with Ten Educable Mentally Retarded Children. Technical Report No. 336.

ERIC Educational Resources Information Center

Abernatha, Evelyn; Wiles, Clyde A.

The purpose of this study was to assess the usefulness of the elementary mathematics program Developing Mathematical Processes (DMP) for educable mentally retarded (EMR) students. The subjects of this study were 10 children from an intact class designated EMR. The children ranged in age from 7 to 12 years. The 1972 Developmental Edition of the DMP…

10. The Prospective Mathematics Teachers' Thought Processes and Views about Using Problem-Based Learning in Statistics Education

ERIC Educational Resources Information Center

Canturk-Gunhan, Berna; Bukova-Guzel, Esra; Ozgur, Zekiye

2012-01-01

The purpose of this study is to determine prospective mathematics teachers' views about using problem-based learning (PBL) in statistics teaching and to examine their thought processes. It is a qualitative study conducted with 15 prospective mathematics teachers from a state university in Turkey. The data were collected via participant observation…

11. Noticing Children's Learning Processes--Teachers Jointly Reflect on Their Own Classroom Interaction for Improving Mathematics Teaching

ERIC Educational Resources Information Center

Scherer, Petra; Steinbring, Heinz

2006-01-01

One could focus on many different aspects of improving the quality of mathematics teaching. For a better understanding of children's mathematical learning processes or teaching and learning in general, reflection on and analysis of concrete classroom situations are of major importance. On the basis of experiences gained in a collaborative research…

12. Comparison of breast tissue measurements using magnetic resonance imaging, digital mammography and a mathematical algorithm

Lu, Lee-Jane W.; Nishino, Thomas K.; Johnson, Raleigh F.; Nayeem, Fatima; Brunder, Donald G.; Ju, Hyunsu; Leonard, Morton H., Jr.; Grady, James J.; Khamapirad, Tuenchit

2012-11-01

Women with mostly mammographically dense fibroglandular tissue (breast density, BD) have a four- to six-fold increased risk for breast cancer compared to women with little BD. BD is most frequently estimated from two-dimensional (2D) views of mammograms by a histogram segmentation approach (HSM) and more recently by a mathematical algorithm consisting of mammographic imaging parameters (MATH). Two non-invasive clinical magnetic resonance imaging (MRI) protocols, 3D gradient-echo (3DGRE) and short tau inversion recovery (STIR), were modified for 3D volumetric reconstruction of the breast for measuring fatty and fibroglandular tissue volumes by a Gaussian-distribution curve-fitting algorithm. Replicate breast exams (N = 2 to 7 replicates in six women) by 3DGRE and STIR were highly reproducible for all tissue-volume estimates (coefficients of variation <5%). Reliability studies compared measurements from four methods, 3DGRE, STIR, HSM, and MATH (N = 95 women), by linear regression and intra-class correlation (ICC) analyses. Rsqr, regression slopes, and ICC, respectively, were (1) 0.76-0.86, 0.8-1.1, and 0.87-0.92 for %-gland tissue, (2) 0.72-0.82, 0.64-0.96, and 0.77-0.91 for glandular volume, (3) 0.87-0.98, 0.94-1.07, and 0.89-0.99 for fat volume, and (4) 0.89-0.98, 0.94-1.00, and 0.89-0.98 for total breast volume. For all values estimated, the correlation was stronger between the two MRI protocols than between each MRI protocol and mammography, and stronger between each MRI protocol and MATH than between each MRI protocol and HSM. All ICC values were >0.75, indicating that all four methods were reliable for measuring BD and that the mathematical algorithm and the two complementary non-invasive MRI protocols can objectively and reliably estimate different types of breast tissue.

13. A fuzzy mathematics model for radioactive waste characterization by process knowledge

SciTech Connect

Smith, M.; Stevens, S.; Elam, K.; Vrba, J.

1994-12-31

Fuzzy mathematics and fuzzy logic are means for making decisions that can integrate complicated combinations of hard and soft factors and produce mathematically validated results that can be independently verified. In this particular application, several sources of information regarding the waste stream have been compiled, including facility operating records, other waste generated from the facility in the past, laboratory analysis results, and interviews with facility personnel. A fuzzy mathematics model is used to interrelate these various sources of information and arrive at a defensible estimate of the contaminant concentration in the final waste product. The model accounts for the separate process knowledge-based contaminant concentrations by providing a weighted averaging technique to incorporate information from the various sources. Reliability estimates are provided for each of the component pieces of information and combined, using the model, into an estimate that provides a near-probabilistic value for contaminant concentration. The spreadsheet accounts for the estimated uncertainty in the concentration on the basis of "reliability curves," which are derived from personal process knowledge as well as limited independent measurements.
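
The weighted-averaging step, combining per-source concentration estimates according to their reliabilities, reduces to a few lines. This is a sketch of the general technique only, not the authors' spreadsheet model or its reliability curves:

```python
def weighted_estimate(estimates, reliabilities):
    """Reliability-weighted average of per-source contaminant
    concentration estimates (reliabilities typically in [0, 1])."""
    total = sum(reliabilities)
    return sum(e * r for e, r in zip(estimates, reliabilities)) / total
```

A source judged twice as reliable simply pulls the combined estimate twice as hard toward its own value.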

14. Automated synthesis of image processing procedures using AI planning techniques

NASA Technical Reports Server (NTRS)

Chien, Steve; Mortensen, Helen

1994-01-01

This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

15. High-speed imaging and image processing in voice disorders

Tigges, Monika; Wittenberg, Thomas; Rosanowski, Frank; Eysholdt, Ulrich

1996-12-01

A digital high-speed camera system for the endoscopic examination of the larynx delivers recording speeds of up to 10,000 frames/s. Recordings of up to 1 s duration can be stored and used for further evaluation. Maximum resolution is 128 x 128 pixels. The acoustic and electroglottographic signals are recorded simultaneously. An image processing program especially developed for this purpose renders time-displacement waveforms (high-speed glottograms) at several locations on the vocal cords. From these graphs, all of the known objective voice parameters can be derived. Results of examinations in normal subjects and patients are presented.

16. A mathematical model for the secondary drying of a freeze-drying process

Font, F.; Lee, W.

2015-09-01

In this manuscript a mathematical model describing the secondary drying stage of a freeze-drying process is presented. The model consists of governing equations for the transport of an air-vapour mixture in a porous medium. The production of water vapour due to the desorption of bound water is accounted for by means of a source term in the equation for the water vapour concentration. We show how, in the limit of small Peclet numbers, the model can be solved analytically. In addition, we provide an explicit expression for the total time of the secondary drying stage, amenable to real-time control applications.

17. Mathematical modeling in biological populations through branching processes. Application to salmonid populations.

PubMed

Molina, Manuel; Mota, Manuel; Ramos, Alfonso

2015-01-01

This work deals with mathematical modeling through branching processes. We consider sexually reproducing animal populations where, in each generation, the number of progenitor couples is determined in a non-predictable environment. By using a class of two-sex branching processes, we describe their demographic dynamics and provide several probabilistic and inferential contributions. They include results about the extinction of the population and the estimation of the offspring distribution and its main moments. We also present an application to salmonid populations. PMID:24526259

18. The Effect of Dynamic and Interactive Mathematics Learning Environments (DIMLE), Supporting Multiple Representations, on Perceptions of Elementary Mathematics Pre-Service Teachers in Problem Solving Process

ERIC Educational Resources Information Center

Ozdemir, S.; Reis, Z. Ayvaz

2013-01-01

Mathematics is an important discipline, providing crucial tools, such as problem solving, to improve our cognitive abilities. In order to solve a problem, it is better to envision and represent through multiple means. Multiple representations can help a person to redefine a problem with his/her own words in that envisioning process. Dynamic and…

19. Vector processing enhancements for real-time image analysis.

SciTech Connect

Shoaf, S.; APS Engineering Support Division

2008-01-01

A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.

20. Spatial Data Exploring by Satellite Image Distributed Processing

Mihon, V. D.; Colceriu, V.; Bektas, F.; Allenbach, K.; Gvilava, M.; Gorgan, D.

2012-04-01

Societal needs and environmental prediction encourage the development of applications oriented toward supervising and analyzing different Earth Science related phenomena. Satellite images can be explored to discover information concerning land cover, hydrology, air quality, and water and soil pollution. Spatial and environment-related data can be acquired by imagery classification, consisting of data mining throughout the multispectral bands. The process takes into account a large set of variables, such as the satellite image type (e.g. MODIS, Landsat), the particular geographic area, soil composition, vegetation cover, and the general context (e.g. clouds, snow, and season). These specific and variable conditions require flexible tools and applications to support an optimal search for appropriate solutions, as well as high-power computation resources. The research concerns experiments with flexible, visual descriptions of satellite image processing over distributed infrastructures (e.g. Grid, Cloud, and GPU clusters). This presentation highlights the Grid-based implementation of the GreenLand application, whose development is based on simple but powerful notions of mathematical operators and workflows used in distributed and parallel executions over the Grid infrastructure. Currently it is used in three major case studies concerning the Istanbul geographical area, the Rioni River in Georgia, and the Black Sea catchment region. The GreenLand application offers a friendly user interface for viewing and editing workflows and operators. The description involves the basic operators provided by the GRASS [1] library as well as many other image-related operators supported by the ESIP platform [2]. The processing workflows are represented as directed graphs, giving the user a fast and easy way to describe complex parallel algorithms without prior knowledge of any programming language or application commands.

1. Thermal Imaging System For Material Processing

Auric, Daniel; Hanonge, Eric; Kerrand, Emmanuel; de Miscault, Jean-Claude; Cornillault, Jean

1987-09-01

In the field of lasers for welding and surface processing, the map of surface temperatures must be measured in order to control the process in real time by adjusting the laser power, the beam pointing and focusing, and the workpiece moving speed. For that purpose, we studied, built, and evaluated a model of a thermal imaging system operating at two wavelengths in the mid-infrared. The device is connected to a 3-axis table and to a 3 kW CO2 laser. The range of measured temperatures is 800 °C to 1,500 °C. The device includes two AGEMA infrared cameras fixed to the welding torch, each operating with a choice of filters in the 3, 4, and 5 micrometre bands. The field of view of each is about 14 mm by 38 mm. The cameras are connected to a microcomputer based on the M68000 microprocessor family, into which the images enter at a rate of 6.25 Hz with 64 x 128 pixels per image at both wavelengths. The microcomputer stores the pictures in memory and on floppy disk, displays them in false colours, and calculates for each pixel the surface temperature of the material under the grey-body assumption. The results have been compared with metallurgical analysis of the samples. The precision is about 20 °C in most cases and depends on the sample's surface state. Simplifications of the laboratory device should lead to a cheap, convenient, and reliable product.
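
Under the grey-body assumption mentioned above, the emissivity cancels in the ratio of intensities at the two wavelengths, which is what makes a two-wavelength measurement attractive. A sketch of the standard Wien-approximation inversion (the example wavelengths are placeholders, not the instrument's actual filter bands):

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant, m*K

def two_colour_temperature(i1, i2, lam1, lam2):
    """Temperature (K) from the ratio of spectral intensities i1, i2 at
    wavelengths lam1, lam2 (m), assuming Wien's law and equal emissivity."""
    return (C2 * (1.0 / lam2 - 1.0 / lam1)
            / (np.log(i1 / i2) - 5.0 * np.log(lam2 / lam1)))
```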

2. Numerical magnitude processing deficits in children with mathematical difficulties are independent of intelligence.

PubMed

Brankaer, Carmen; Ghesquière, Pol; De Smedt, Bert

2014-11-01

Developmental dyscalculia (DD) is thought to arise from difficulties in the ability to process numerical magnitudes. Most research relied on IQ-discrepancy based definitions of DD and only included individuals with normal IQ, yet little is known about the role of intelligence in the association between numerical magnitude processing and mathematical difficulties (MD). The present study examined numerical magnitude processing in matched groups of 7- to 8-year-olds (n=42) who had either discrepant MD (poor math scores, average IQ), nondiscrepant MD (poor math scores, below-average IQ) or no MD. Both groups of children with MD showed similar impairments in numerical magnitudes processing compared to controls, suggesting that the association between numerical magnitude processing deficits and MD is independent of intelligence. PMID:25036314

3. The Airborne Ocean Color Imager - System description and image processing

NASA Technical Reports Server (NTRS)

Wrigley, Robert C.; Slye, Robert E.; Klooster, Steven A.; Freedman, Richard S.; Carle, Mark; Mcgregor, Lloyd F.

1992-01-01

The Airborne Ocean Color Imager was developed as an aircraft instrument to simulate the spectral and radiometric characteristics of the next generation of satellite ocean color instrumentation. Data processing programs have been developed as extensions of the Coastal Zone Color Scanner algorithms for atmospheric correction and bio-optical output products. The latter include several bio-optical algorithms for estimating phytoplankton pigment concentration, as well as one for the diffuse attenuation coefficient of the water. Additional programs have been developed to geolocate these products and remap them into a georeferenced data base, using data from the aircraft's inertial navigation system. Examples illustrate the sequential data products generated by the processing system, using data from flightlines near the mouth of the Mississippi River: from raw data to atmospherically corrected data, to bio-optical data, to geolocated data, and, finally, to georeferenced data.

4. Improvement of the detection rate in digital watermarked images against image degradation caused by image processing

Nishio, Masato; Ando, Yutaka; Tsukamoto, Nobuhiro; Kawashima, Hironao; Nakamura, Shinya

2004-04-01

In the current environment of medical information disclosure, general-purpose image formats such as JPEG/BMP, which do not require special software for viewing, are suitable for carrying and managing medical image information individually. These formats, however, provide no way to store patient and study information. We have therefore developed two kinds of ID embedding methods: a bit-swapping method for embedding an alteration-detection ID, and a data-imposing method in the frequency domain using the Discrete Cosine Transform (DCT) for embedding an original-image-source ID. We then applied these two digital watermark methods to four modality images (chest X-ray, head CT, abdomen CT, bone scintigraphy). However, there were some cases where the watermarked ID could not be detected correctly due to image degradation caused by image processing. In this study, we improved the detection rate in digital watermarked images using several techniques: an error correction method, a majority correction method, and a scramble location method. We applied these techniques to digital watermarked images subjected to image processing (smoothing) and evaluated their effectiveness. As a result, the majority correction method proved effective in improving the detection rate in digital watermarked images against image degradation.

5. Stent segmentation in IOCT-TD images using gradient combination and mathematical morphology

Cardona Cardenas, Diego A.; Cardoso Moraes, Matheus; Furuie, Sérgio S.

2015-01-01

In 2010, cardiovascular disease (CVD) caused 33% of the total deaths in Brazil. Modalities such as Intravascular Optical Coherence Tomography (IOCT) provide in vivo coronary imaging for detecting and monitoring the progression of CVDs. Specifically, this modality is widely used in the investigation of neo-intimal post-stent re-stenosis. Computational methods applied to IOCT images can render objective structural information, such as areas and perimeters, allowing more accurate diagnostics. However, the variety of methods in the literature applied to IOCT is still small compared to other related modalities. Therefore, we propose a stent segmentation approach based on features extracted by gradient operations and mathematical morphology. The methodology can be summarized as follows: first, the lumen is segmented and a contrast-stretched image is generated, both to be used as auxiliary information. Second, the edges of objects are obtained by gradient computation. Next, a stent extractor finds and selects relevant stent information. Finally, an interpolation procedure followed by morphological operations completes the segmentation. To evaluate the method, 160 images from pig coronaries were segmented and compared to their gold standards; the images were acquired 30, 90, and 180 days after stent implantation. The proposed approach presents good accuracy: True Positive (TP(%)) = 96.51±5.10, False Positive (FP(%)) = 6.09±5.32, False Negative (FN(%)) = 3.49±5.10. In conclusion, the good results and the low complexity encourage the use and continued evolution of the current approach. However, only images of IOCT-TD technology were evaluated; therefore, further investigations should adapt this approach to work with IOCT-FD technology as well.
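The TP/FP/FN figures reported above can be computed as pixel-overlap percentages between the segmented mask and the gold standard. A minimal sketch (not the authors' code, and with a hypothetical 2×4 example) follows:

```python
# Pixel-overlap metrics between a segmented stent mask and a gold-standard
# mask, expressed as percentages of the gold-standard area.

def overlap_metrics(segmented, gold):
    """Return (TP%, FP%, FN%) for two binary masks given as 2D lists."""
    tp = fp = fn = 0
    for seg_row, gold_row in zip(segmented, gold):
        for s, g in zip(seg_row, gold_row):
            if s and g:
                tp += 1          # correctly detected stent pixel
            elif s and not g:
                fp += 1          # spurious detection
            elif g and not s:
                fn += 1          # missed stent pixel
    gold_area = tp + fn          # total gold-standard pixels
    return (100.0 * tp / gold_area,
            100.0 * fp / gold_area,
            100.0 * fn / gold_area)

gold = [[0, 1, 1, 0],
        [0, 1, 1, 0]]
seg  = [[0, 1, 1, 1],
        [0, 1, 0, 0]]
print(overlap_metrics(seg, gold))  # (75.0, 25.0, 25.0)
```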

6. Imaging spectrometer for process industry applications

Herrala, Esko; Okkonen, Jukka T.; Hyvarinen, Timo S.; Aikio, Mauri; Lammasniemi, Jorma

1994-11-01

This paper presents an imaging spectrometer principle based on a novel prism-grating-prism (PGP) element as the dispersive component, together with advanced camera solutions for on-line applications. The PGP element uses a volume-type holographic plane transmission grating made of dichromated gelatin (DCG). Currently, spectrographs have been realized for the 400 - 1050 nm region, but the applicable spectral region of the PGP is 380 - 1800 nm. Spectral resolution is typically between 1.5 and 5 nm. The on-axis optical configuration and simple rugged tubular optomechanical construction of the spectrograph provide good image quality and resistance to harsh environmental conditions. Spectrograph optics are designed to be interfaced to any standard CCD camera. Special camera structures and operating modes can be used for applications requiring on-line data interpretation and process control.

7. Processing Neutron Imaging Data - Quo Vadis?

Kaestner, A. P.; Schulz, M.

Once an experiment has ended at a neutron imaging instrument, users often ask themselves how to proceed with the collected data. Large amounts of data have been obtained, but first-time users often have no plan or experience for evaluating the obtained information. The users then depend on support from the local contact, who unfortunately does not have the time to perform in-depth studies for every user. By instructing the users and providing evaluation tools, either on-site or as free software, this situation can be improved. With the continuous development of new instrument features that require increasingly complex analysis methods, there is a deficit in the development of tools that bring the new features to the user community. We propose to start a common platform for the open-source development of analysis tools dedicated to processing neutron imaging data.

8. Development of the SOFIA Image Processing Tool

NASA Technical Reports Server (NTRS)

2011-01-01

The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5-meter infrared telescope capable of operating at altitudes of between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible-light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.

9. Individual differences in children's mathematics achievement: The roles of symbolic numerical magnitude processing and domain-general cognitive functions.

PubMed

Vanbinst, K; De Smedt, B

2016-01-01

This contribution reviewed the available evidence on the domain-specific and domain-general neurocognitive determinants of children's arithmetic development, other than nonsymbolic numerical magnitude processing, which might have been overemphasized as a core factor of individual differences in mathematics and dyscalculia. We focused on symbolic numerical magnitude processing, working memory, and phonological processing, as these determinants have been most researched and their roles in arithmetic can be predicted against the background of brain imaging data. Our review indicates that symbolic numerical magnitude processing is a major determinant of individual differences in arithmetic. Working memory, particularly the central executive, also plays a role in learning arithmetic, but its influence appears to be dependent on the learning stage and experience of children. The available evidence on phonological processing suggests that it plays a more subtle role in children's acquisition of arithmetic facts. Future longitudinal studies should investigate these factors in concert to understand their relative contribution as well as their mediating and moderating roles in children's arithmetic development. PMID:27339010

10. HYMOSS signal processing for pushbroom spectral imaging

NASA Technical Reports Server (NTRS)

Ludwig, David E.

1991-01-01

The objective of the Pushbroom Spectral Imaging Program was to develop on-focal-plane electronics which compensate for detector array non-uniformities. The approach taken was to implement a simple two-point calibration algorithm on the focal plane which allows for offset and linear gain correction. The key on-focal-plane features which made this technique feasible were the use of a high-quality transimpedance amplifier (TIA) and an analog-to-digital converter for each detector channel. Gain compensation is accomplished by varying the feedback capacitance of the integrate-and-dump TIA. Offset correction is performed by storing offsets in a special on-focal-plane offset register and digitally subtracting the offsets from the readout data during the multiplexing operation. A custom integrated circuit was designed, fabricated, and tested on this program which proved that non-uniformity-compensated, analog-to-digital converting circuits may be used to read out infrared detectors. Irvine Sensors Corporation (ISC) successfully demonstrated the following innovative on-focal-plane functions that allow for correction of detector non-uniformities. Most of the circuit functions demonstrated on this program are finding their way onto future ICs because of their impact on reduced downstream processing, increased focal plane performance, simplified focal plane control, and a reduced number of dewar connections, as well as the noise immunity of a digital interface dewar. The potential commercial applications for this integrated circuit are primarily in imaging systems. These imaging systems may be used for security monitoring, manufacturing process monitoring, robotics, and spectral imaging in analytical instrumentation.
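The two-point calibration described above can be sketched in software. This is an assumption about the arithmetic, not the program's actual circuit logic: each detector channel gets a per-channel offset subtraction and linear gain correction derived from two reference exposures (dark and flat).

```python
def two_point_calibrate(raw, dark, flat, flat_level=1.0):
    """Correct per-channel offset and gain.

    raw, dark, flat -- lists of readings, one entry per detector channel:
      dark: response with no illumination (offset),
      flat: response to a uniform reference illumination.
    flat_level -- the known radiometric level of the flat reference.
    """
    corrected = []
    for r, d, f in zip(raw, dark, flat):
        gain = (f - d) / flat_level   # per-channel responsivity
        corrected.append((r - d) / gain)
    return corrected

# Two channels with different offsets/gains viewing the same scene level 5.0:
dark = [10.0, 20.0]
flat = [30.0, 60.0]              # responses to a flat reference of level 1.0
raw  = [10.0 + 5.0 * 20.0, 20.0 + 5.0 * 40.0]
print(two_point_calibrate(raw, dark, flat))  # [5.0, 5.0]
```

Both channels report the same corrected value once offset and gain differences are removed, which is the point of the on-focal-plane compensation.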

12. A New Image Processing and GIS Package

NASA Technical Reports Server (NTRS)

Rickman, D.; Luvall, J. C.; Cheng, T.

1998-01-01

The image processing and GIS package ELAS was developed during the 1980s by NASA. It proved to be a popular, influential, and powerful tool for the manipulation of digital imagery. Before the advent of PCs it was used by hundreds of institutions, mostly schools. It is the unquestioned, direct progenitor of two commercial GIS remote sensing packages, ERDAS and MapX, and influenced others, such as PCI. Its power was demonstrated by its use for work far beyond its original purpose: it has been applied to several different types of medical imagery, photomicrographs of rock, images of turtle flippers, and numerous other esoteric imagery. Although development largely stopped in the early 1990s, the package still offers as much or more power and flexibility than any other roughly comparable package, public or commercial. It is a huge body of code, representing more than a decade of work by full-time, professional programmers. The current versions all have several deficiencies compared to current software standards and usage, notably a strictly command-line interface. In order to support their research needs, the authors are in the process of fundamentally changing ELAS, and in the process greatly increasing its power, utility, and ease of use. The new software is called ELAS II. This paper discusses the design of ELAS II.

13. Some notes on the application of discrete wavelet transform in image processing

SciTech Connect

Caria, Egydio C. S.; Costa A, Trajano A. de; Rebello, Joao Marcos A.

2011-06-23

Mathematical transforms are used in signal processing in order to extract what is known as 'hidden' information. One of these mathematical tools is the Discrete Wavelet Transform (DWT), which has been increasingly employed in non-destructive testing and, more specifically, in image processing. The main concern in the present work is to employ the DWT to suppress noise without losing relevant image features. However, some aspects must be taken into consideration when applying the DWT in image processing, mainly in the case of weld radiographs, in order to achieve consistent results. Three topics were selected as representative of these difficulties, as follows: 1) How can the image matrix be filled to fit the 2^n lines and 2^n rows requirement? 2) How can the most suitable decomposition level of the DWT function and the correct choice of coefficient suppression be selected? 3) Is there any influence of the scanning direction of the weld radiograph image, e.g., longitudinal or transversal, on the final processed image? It is known that some artifacts may be present in weld radiograph images. Indeed, the weld surface is frequently rough and rippled, which can be seen as gray-level variation on the radiograph, sometimes mistaken for defective areas. Depending on the position of these artifacts, longitudinal or transversal to the weld bead, they may have different influences on the image processing procedure. This influence is clearly seen in the distribution of the DWT function coefficients. In the present work, examples of two weld radiographs of quite different image quality are given to illustrate these points.
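The denoising idea above can be illustrated with a single-level Haar DWT in pure Python (a sketch, not the authors' procedure): decompose a signal into averages and details, suppress small detail coefficients below a threshold, then reconstruct. The 2^n length requirement discussed in topic 1 appears here directly, since each level halves the signal length.

```python
def haar_forward(x):
    # pairwise averages (approximation) and differences (detail)
    avg = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    det = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out

def denoise(x, threshold):
    avg, det = haar_forward(x)
    # hard thresholding: zero out small (noise-like) detail coefficients
    det = [d if abs(d) > threshold else 0.0 for d in det]
    return haar_inverse(avg, det)

signal = [4.0, 4.2, 4.1, 3.9, 8.0, 8.1, 7.9, 8.2]  # step edge + small noise
print(denoise(signal, threshold=0.2))
```

The small fluctuations are flattened while the step edge (a relevant image feature, analogous to a defect boundary) survives, because its energy sits in the approximation coefficients rather than the thresholded details.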

15. Mathematical Thinking of Kindergarten Boys and Girls: Similar Achievement, Different Contributing Processes

ERIC Educational Resources Information Center

Klein, Pnina S.; Adi-Japha, Esther; Hakak-Benizri, Simcha

2010-01-01

The objective of this study was to examine gender differences in the relations between verbal, spatial, mathematics, and teacher-child mathematics interaction variables. Kindergarten children (N = 80) were videotaped playing games that require mathematical reasoning in the presence of their teachers. The children's mathematics, spatial, and verbal…

16. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

PubMed

Boix, Macarena; Cantó, Begoña

2013-04-01

Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. To do so, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet, namely the one that yields a segmentation with the largest area in the cell. We study different wavelet families and conclude that the db1 wavelet is the best, and that it can serve for future work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MatLab environment, is verified on selected blood cell images. PMID:23458301
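The morphological side of such a pipeline can be sketched in pure Python (an illustration, not the paper's MatLab code): erosion shrinks objects, dilation grows them, and opening (erosion then dilation) removes specks smaller than the structuring element, which is how small noise remnants are cleaned from a binary cell mask.

```python
CROSS = ((0, 0), (0, 1), (0, -1), (1, 0), (-1, 0))  # 4-connected structuring element

def erode(img, se=CROSS):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # pixel survives only if every structuring-element neighbor is set
            out[y][x] = int(all(
                0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                for dy, dx in se))
    return out

def dilate(img, se=CROSS):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # pixel is set if any structuring-element neighbor is set
            out[y][x] = int(any(
                0 <= y + dy < h and 0 <= x + dx < w and img[y + dy][x + dx]
                for dy, dx in se))
    return out

def opening(img):
    return dilate(erode(img))

# The 1-pixel speck (top-left) is removed; the 3x3 blob keeps a cross-shaped core.
img = [[1, 0, 0, 0, 0],
       [0, 0, 1, 1, 1],
       [0, 0, 1, 1, 1],
       [0, 0, 1, 1, 1]]
print(opening(img))
```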

17. Using Image Processing to Determine Emphysema Severity

2010-10-01

Currently X-rays and computerized tomography (CT) scans are used to detect emphysema, but other tests are required to accurately quantify the amount of lung that has been affected by the disease. These images clearly show if a patient has emphysema, but are unable, by visual scan alone, to quantify the degree of the disease, as it presents as subtle, dark spots on the lung. Our goal is to use these CT scans to accurately diagnose and determine emphysema severity levels in patients. This will be accomplished by performing several different analyses of CT scan images of patients representing a wide range of severity of the disease. In addition to analyzing the original CT data, this process will convert the data to one- and two-bit images and will then examine the deviation from a normal distribution curve to determine skewness. Our preliminary results show that this method of assessment appears to be more accurate and robust than the currently utilized methods, which involve looking at percentages of radiodensities in the air passages of the lung.
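The distribution-shape measure described above can be sketched as follows (an assumption about the analysis, not the study's code): compute the sample skewness of the pixel-value distribution. A roughly symmetric distribution has skewness near zero, while low-density regions add a tail that shifts it.

```python
def skewness(values):
    """Fisher-Pearson sample skewness of a flat list of pixel values."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n   # variance (biased)
    m3 = sum((v - mean) ** 3 for v in values) / n   # third central moment
    return m3 / m2 ** 1.5

symmetric = [-2, -1, -1, 0, 0, 0, 1, 1, 2]
tailed = symmetric + [-8, -9]       # extra low-density tail (hypothetical)
print(round(skewness(symmetric), 3))  # 0.0
print(skewness(tailed) < 0)           # True: left-skewed
```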

18. Image processing to optimize wave energy converters

Bailey, Kyle Marc-Anthony

The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WECs), but only recently have they become a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WECs. Wave power is produced by placing a mechanical device, either onshore or offshore, that captures the energy within ocean surface waves. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This is achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete-time Fourier transform spectrum. The maximum energy of the 2D complex modulated lapped transform subbands is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by applications to simulated and real satellite images where the frequency is known.
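The final step above can be sketched with a plain 2D DFT standing in for the complex modulated lapped orthogonal transform (an illustrative simplification, not the paper's filter bank): find the frequency bin of maximum energy to get the horizontal and vertical spatial frequencies, then project them onto the WEC's heading by simple trigonometry.

```python
import cmath
import math

def dominant_frequency(img, heading_rad):
    """Return the spatial frequency (cycles/pixel) along `heading_rad`."""
    n = len(img)
    best, fx, fy = -1.0, 0.0, 0.0
    for kx in range(1, n // 2):          # skip DC, keep positive frequencies
        for ky in range(1, n // 2):
            coeff = sum(img[y][x] * cmath.exp(-2j * math.pi * (kx * x + ky * y) / n)
                        for y in range(n) for x in range(n))
            if abs(coeff) > best:
                best, fx, fy = abs(coeff), kx / n, ky / n
    # project the (fx, fy) wave vector onto the converter's direction
    return fx * math.cos(heading_rad) + fy * math.sin(heading_rad)

n = 16
wave = [[math.cos(2 * math.pi * (2 * x + 3 * y) / n) for x in range(n)]
        for y in range(n)]               # synthetic wave: fx = 2/16, fy = 3/16
print(dominant_frequency(wave, heading_rad=0.0))  # 0.125 (= 2/16 cycles/pixel)
```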

19. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

NASA Technical Reports Server (NTRS)

Nashman, Marilyn; Chaconas, Karen J.

1988-01-01

The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The sensory processing system is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing for tasks representative of those envisioned for the Space Station, such as assembly and maintenance, are described.

20. Multispectral image processing: the nature factor

Watkins, Wendell R.

1998-09-01

The images processed by our brain represent our window into the world. For some animals this window is derived from a single eye; for others, including humans, two eyes provide stereo imagery; for others, like the black widow spider, several eyes are used (8 eyes); and some insects, like the common housefly, utilize thousands of eyes (ommatidia). Still other animals, like the bat and dolphin, have eyes for regular vision but employ acoustic sonar for seeing where their regular eyes don't work, such as in pitch-black caves or turbid water. Of course, other animals have adapted to dark environments by bringing along their own lighting, such as the firefly and several creatures from the depths of the ocean floor. Animal vision is truly varied and has developed over millennia in many remarkable ways. We have learned a lot about vision processes by studying these animal systems and can still learn even more.

1. Mathematical Modeling of Ultracold Few-Body Processes in Atomic Traps

Melezhik, V. S.

2016-02-01

We discuss computational aspects of the developed mathematical models for ultracold few-body processes in atomic traps. The key element of the elaborated computational schemes is a nondirect product discrete variable representation (npDVR) we have suggested and applied to the time-dependent and stationary Schrödinger equations with a few spatial variables. It turned out that this approach is very efficient in the quantitative analysis of low-dimensional ultracold few-body systems arising in the confined geometry of atomic traps. The efficiency of the method is demonstrated here on two examples. A brief review is also given of novel results obtained recently.

2. The mathematical modeling of rapid solidification processing. Ph.D. Thesis. Final Report

NASA Technical Reports Server (NTRS)

Gutierrez-Miravete, E.

1986-01-01

The detailed formulation of and the results obtained from a continuum mechanics-based mathematical model of the planar flow melt spinning (PFMS) rapid solidification system are presented and discussed. The numerical algorithm proposed is capable of computing the cooling and freezing rates as well as the fluid flow and capillary phenomena which take place inside the molten puddle formed in the PFMS process. The FORTRAN listings of some of the most useful computer programs and a collection of appendices describing the basic equations used for the modeling are included.

3. Platform for distributed image processing and image retrieval

Gueld, Mark O.; Thies, Christian J.; Fischer, Benedikt; Keysers, Daniel; Wein, Berthold B.; Lehmann, Thomas M.

2003-06-01

We describe a platform for the implementation of a system for content-based image retrieval in medical applications (IRMA). To cope with constantly evolving medical knowledge, the platform offers a flexible feature model to store and uniformly access all feature types required within a multi-step retrieval approach. A structured generation history for each feature allows the automatic identification and re-use of already computed features. The platform uses directed acyclic graphs composed of processing steps and control elements to model arbitrary retrieval algorithms. This visually intuitive, data-flow-oriented representation vastly improves the interdisciplinary communication between computer scientists and physicians during the development of new retrieval algorithms. The execution of the graphs is fully automated within the platform. Each processing step is modeled as a feature transformation. Due to a high degree of system transparency, both the implementation and the evaluation of retrieval algorithms are accelerated significantly. The platform uses a client-server architecture consisting of a central database, a central job scheduler, instances of a daemon service, and clients which embed user-implemented feature transformations. Automatically distributed batch processing and distributed feature storage enable the cost-efficient use of an existing workstation cluster.
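The DAG execution model described above can be sketched as follows (an illustration, not the IRMA platform's code; the three-step pipeline is hypothetical): each node is a feature transformation run once its inputs are available, and a cache lets already-computed features be re-used instead of recomputed.

```python
def run_dag(steps, deps):
    """Execute a DAG of feature transformations.

    steps: name -> function(dict of input features) -> feature value.
    deps:  name -> list of upstream step names.
    """
    cache = {}

    def compute(name):
        if name not in cache:                      # re-use computed features
            inputs = {d: compute(d) for d in deps.get(name, [])}
            cache[name] = steps[name](inputs)
        return cache[name]

    for name in steps:
        compute(name)
    return cache

# Hypothetical three-step pipeline: load -> histogram -> normalize.
steps = {
    "load":      lambda _: [2, 4, 2],
    "histogram": lambda inp: {v: inp["load"].count(v) for v in set(inp["load"])},
    "normalize": lambda inp: {k: c / 3 for k, c in inp["histogram"].items()},
}
deps = {"histogram": ["load"], "normalize": ["histogram"]}
print(run_dag(steps, deps)["normalize"])
```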

4. Kinetics of the zinc slag-Fuming Process: part II. mathematical model

Richards, G. G.; Brimacombe, J. K.

1985-09-01

A mathematical model of zinc slag fuming has been formulated based on the kinetic conception of the process developed in Part I of this paper. Each of the major reaction zones in the furnace (the slag bath, where reduction of zinc oxide and ferric oxide takes place, and the tuyere gas column, where oxidation of coal and ferrous oxide occurs) has been characterized mathematically. The two zones and the water-jacketed furnace wall have been linked by overall heat and mass balances. Insufficient information is available, however, to characterize quantitatively two of the important kinetic processes occurring in the furnace: the division of coal between entrainment in the slag, combustion in the tuyere gas column, and bypass; and oxygen utilization. To overcome this problem the model has been fitted to the data from eleven industrial fuming cycles. Consistent values have been obtained for these kinetic parameters over five different fuming operations, indicating that the kinetic conception of the process is sound. The results indicate that about 33 pct of the injected coal is entrained in the slag, 55 pct combusts in the tuyere gas column, and 12 pct bypasses the bath completely. Oxygen utilization has been found to be high and can be correlated with bath depth.

5. Imaging fault zones using 3D seismic image processing techniques

Iacopini, David; Butler, Rob; Purves, Steve

2013-04-01

Significant advances in the structural analysis of deep-water structures, salt tectonics, and extensional rift basins come from descriptions of fault system geometries imaged in 3D seismic data. However, even where seismic data are excellent, in most cases the trajectory of thrust faults is highly conjectural, and significant uncertainty still exists as to the patterns of deformation that develop between the main fault segments, and even as to the fault architectures themselves. Moreover, structural interpretations that conventionally define faults by breaks and apparent offsets of seismic reflectors are commonly conditioned by a narrow range of theoretical models of fault behavior. For example, almost all interpretations of thrust geometries on seismic data rely on theoretical 'end-member' behaviors in which concepts such as strain localization or multilayer mechanics are simply avoided. Yet analogue outcrop studies confirm that such descriptions are commonly unsatisfactory and incomplete. In order to fill these gaps and improve the 3D visualization of deformation in the subsurface, seismic attribute methods are developed here in conjunction with conventional mapping of reflector amplitudes (Marfurt & Chopra, 2007). These recently developed signal processing techniques, applied especially by the oil industry, use variations in the amplitude and phase of the seismic wavelet. These seismic attributes improve signal interpretation and are calculated and applied to the entire 3D seismic dataset. In this contribution we will show 3D seismic examples of fault structures from gravity-driven deep-water thrust structures and extensional basin systems to indicate how 3D seismic image processing methods can not only improve the geometrical interpretation of the faults but also begin to map both strain and damage through the amplitude/phase properties of the seismic signal. This is done by quantifying and delineating short-range anomalies in the intensity of reflector amplitudes.

6. MISR Browse Images: Cold Land Processes Experiment (CLPX)

Atmospheric Science Data Center

2013-04-02

MISR Browse Images: Cold Land Processes Experiment (CLPX). These MISR Browse images provide a ... over the region observed during the NASA Cold Land Processes Experiment (CLPX). CLPX involved ground, airborne, and satellite measurements ...

7. DKIST visible broadband imager data processing pipeline

Beard, Andrew; Cowan, Bruce; Ferayorni, Andrew

2014-07-01

The Daniel K. Inouye Solar Telescope (DKIST) Data Handling System (DHS) provides the technical framework and building blocks for developing on-summit instrument quality assurance and data reduction pipelines. The DKIST Visible Broadband Imager (VBI) is a first light instrument that alone will create two data streams with a bandwidth of 960 MB/s each. The high data rate and data volume of the VBI require near-real time processing capability for quality assurance and data reduction, and will be performed on-summit using Graphics Processing Unit (GPU) technology. The VBI data processing pipeline (DPP) is the first designed and developed using the DKIST DHS components, and therefore provides insight into the strengths and weaknesses of the framework. In this paper we lay out the design of the VBI DPP, examine how the underlying DKIST DHS components are utilized, and discuss how integration of the DHS framework with GPUs was accomplished. We present our results of the VBI DPP alpha release implementation of the calibration, frame selection reduction, and quality assurance display processing nodes.

8. ATM experiment S-056 image processing requirements definition

NASA Technical Reports Server (NTRS)

1972-01-01

A plan is presented for satisfying the image data processing needs of the S-056 Apollo Telescope Mount experiment. The report is based on information gathered from related technical publications, consultation with numerous image processing experts, and experience gained in working on related image processing tasks over a two-year period.

9. Quantification technology study on flaws in steam-filled pipelines based on image processing

Sun, Lina; Yuan, Peixin

2009-07-01

Starting from the development of an applied detection system for gas transmission pipelines, a set of X-ray image processing methods and quantitative pipeline-flaw evaluation methods is proposed. Defective and non-defective columns and rows in the gray image were extracted and their waveforms obtained; defects can be distinguished by dividing the two gray images for contrast. According to the gray values of defects of different thicknesses, a gray-level versus depth curve is established. Exponential and polynomial fitting are then used to obtain a mathematical model of the attenuation of the beam as it penetrates the pipeline, from which flaw depth is obtained. Tests were performed on PPR pipe with simulated hole and crack flaws, using a 135 kV X-ray source. Test results show that the X-ray image processing method, which meets the needs of efficient flaw detection and provides a quality safeguard for heavy oil recovery, can be used successfully in detecting corrosion of insulated pipe.
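The exponential-fit step can be sketched as follows. This is an assumption about the form of the model (standard exponential attenuation, I = I0 · exp(-mu · t)), not the paper's fitted curve, and the calibration numbers are hypothetical: fit mu from two known thicknesses, then invert the model to estimate penetrated thickness, and hence flaw depth, from a measured gray value.

```python
import math

def fit_attenuation(t1, i1, t2, i2):
    """Fit I(t) = I0 * exp(-mu * t) through two calibration points."""
    mu = math.log(i1 / i2) / (t2 - t1)
    i0 = i1 * math.exp(mu * t1)
    return i0, mu

def thickness_from_gray(i, i0, mu):
    """Invert the attenuation model to recover penetrated thickness."""
    return math.log(i0 / i) / mu

# Hypothetical calibration: gray 200 at 2 mm of material, gray 100 at 6 mm.
i0, mu = fit_attenuation(2.0, 200.0, 6.0, 100.0)
print(round(thickness_from_gray(141.42, i0, mu), 1))  # 4.0 (mm)
```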

10. Mathematical modeling and multi-criteria optimization of rotary electrical discharge machining process

Shrinivas Balraj, U.

2015-12-01

In this paper, mathematical modeling of three performance characteristics, namely material removal rate, surface roughness, and electrode wear rate in rotary electrical discharge machining of RENE80 nickel super alloy, is done using a regression approach. The parameters considered are peak current, pulse on time, pulse off time, and electrode rotational speed. The regression approach is very effective for mathematical modeling when the performance characteristic is influenced by many variables. The modeling of these characteristics is helpful in predicting the performance under a given combination of input process parameters. The adequacy of the developed models is tested by the correlation coefficient and Analysis of Variance. It is observed that the developed models are adequate in establishing the relationship between input parameters and performance characteristics. Further, multi-criteria optimization of process parameter levels is carried out using the grey-based Taguchi method. The experiments are planned based on Taguchi's L9 orthogonal array. The proposed method employs a single grey relational grade as a performance index to obtain optimum levels of parameters. It is found that peak current and electrode rotational speed are the most influential parameters on these characteristics. Confirmation experiments were conducted to validate the optimal parameters, revealing improvements in material removal rate, surface roughness, and electrode wear rate of 13.84%, 12.91%, and 19.42%, respectively.
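
The grey relational grade step can be sketched as follows; the response matrix is assumed toy data (rows standing in for L9-style trials, columns for responses normalized so that 1.0 is best), not the paper's measurements:

```python
import numpy as np

# Assumed normalized responses: 3 trials x 3 characteristics, 1.0 = ideal.
norm = np.array([
    [0.40, 0.55, 0.30],
    [0.90, 0.35, 0.60],
    [0.65, 0.80, 0.95],
])

delta = 1.0 - norm                  # deviation from the ideal sequence
zeta = 0.5                          # distinguishing coefficient (common default)
# Grey relational coefficient: (d_min + zeta*d_max) / (d_i + zeta*d_max)
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = coeff.mean(axis=1)          # single grey relational grade per trial
best_trial = int(np.argmax(grade))  # trial whose parameter levels are preferred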

11. Effects of image processing on the detective quantum efficiency

Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

2010-04-01

Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate the factors affecting the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) according to the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of the hand in the posterior-anterior (PA) projection for measuring signal-to-noise ratio (SNR), slit images for measuring MTF, and white images for measuring NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results showed that all of the modifications considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by post-processing had higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when image quality is evaluated. In conclusion, the control parameters of image processing should be taken into account when characterizing image quality in a consistent way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
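
The slit-image MTF step can be sketched numerically. The following minimal illustration assumes a synthetic Gaussian line-spread function in place of a measured slit profile; the pixel pitch and width are invented:

```python
import numpy as np

pixel_pitch = 0.1                             # mm, assumed detector pitch
x = (np.arange(256) - 128) * pixel_pitch
lsf = np.exp(-0.5 * (x / 0.15) ** 2)          # synthetic LSF, sigma = 0.15 mm

# The MTF is the modulus of the Fourier transform of the LSF, normalized
# to unity at zero spatial frequency.
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freq = np.fft.rfftfreq(lsf.size, d=pixel_pitch)  # cycles/mm
```

With a real slit image, the same normalization lets MTF curves from differently post-processed images (e.g., different MUSICA settings) be compared on one axis.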

12. Mathematical Approaches to Understanding and Imaging Atrial Fibrillation: Significance for Mechanisms and Management

PubMed Central

Trayanova, Natalia A

2014-01-01

Atrial fibrillation (AF) is the most common sustained arrhythmia in humans. The mechanisms that govern AF initiation and persistence are highly complex, of dynamic nature, and involve interactions across multiple temporal and spatial scales in the atria. This article aims to review the mathematical modeling and computer simulation approaches to understanding AF mechanisms and aiding in its management. Various atrial modeling approaches are presented, with descriptions of the methodological basis and advancements in both lower-dimensional and realistic geometry models. A review of the most significant mechanistic insights made by atrial simulations is provided. The article showcases the contributions that atrial modeling and simulation have made not only to our understanding of the pathophysiology of atrial arrhythmias, but also to the development of AF management approaches. A summary of the future developments envisioned for the field of atrial simulation and modeling is also presented. The review contends that computational models of the atria assembled with data from clinical imaging modalities that incorporate electrophysiological and structural remodeling could become a first line of screening for new AF therapies and approaches, new diagnostic developments, and new methods for arrhythmia prevention. PMID:24763468

13. Methods for processing and imaging marsh foraminifera

USGS Publications Warehouse

Dreher, Chandra A.; Flocks, James G.

2011-01-01

This study is part of a larger U.S. Geological Survey (USGS) project to characterize the physical conditions of wetlands in southwestern Louisiana. Within these wetlands, groups of benthic foraminifera (shelled amoeboid protists living near or on the sea floor) can be used as agents to measure land subsidence, relative sea-level rise, and storm impact. In the Mississippi River Delta region, intertidal-marsh foraminiferal assemblages and biofacies were established in studies that pre-date the 1970s, with a very limited number of more recent studies. This fact sheet outlines this project's improved methods, handling, and modified preparations for the use of Scanning Electron Microscope (SEM) imaging of these foraminifera. The objective is to identify marsh foraminifera to the taxonomic species level by using improved processing methods and SEM imaging for morphological characterization in order to evaluate changes in distribution and frequency relative to other environmental variables. The majority of benthic marsh foraminifera consists of agglutinated forms, which can be more delicate than porcelaneous forms. Agglutinated tests (shells) are made of particles such as sand grains or silt and clay material, whereas porcelaneous tests consist of calcite.

14. Image processing methods to elucidate spatial characteristics of retinal microglia after optic nerve transection.

PubMed

Zhang, Yudong; Peng, Bo; Wang, Shuihua; Liang, Yu-Xiang; Yang, Jiquan; So, Kwok-Fai; Yuan, Ti-Fei

2016-01-01

Microglia are mononuclear phagocytes with various functions in the central nervous system, and their morphologies reflect different stages and functions. In the optic nerve transection model of the retina, the retrograde degeneration of retinal ganglion cells induces microglial activation into a unique morphology termed rod microglia. A few studies have described rod microglia in the cortex and retina; however, the spatial characteristics of rod microglia are not fully understood. In this study, we built a mathematical model to characterize the spatial traits of rod microglia. In addition, we developed a Matlab-based image processing pipeline that consists of log enhancement, image segmentation, mathematical morphology based cell detection, area calculation, and angle analysis. This computer program provides researchers a powerful tool to quickly analyze the spatial traits of rod microglia. PMID:26888347
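
The pipeline stages named in the abstract can be sketched on a toy image; the image, the threshold, and the shape statistic below are illustrative assumptions, not the authors' Matlab parameters:

```python
import numpy as np
from scipy import ndimage

# Synthetic retina patch: Poisson background plus one bright rod-shaped "cell".
rng = np.random.default_rng(0)
img = rng.poisson(5, size=(64, 64)).astype(float)
img[20:24, 10:40] += 60.0

log_img = np.log1p(img)                       # log enhancement
mask = log_img > log_img.mean() + 3 * log_img.std()   # crude segmentation
labels, n_cells = ndimage.label(mask)         # connected-component detection

# Elongation of the detected component hints at rod-like morphology.
ys, xs = np.nonzero(labels == 1)
elongation = (np.ptp(xs) + 1) / (np.ptp(ys) + 1)
```

A high elongation ratio is one simple way to flag a component as a candidate rod microglia before the finer morphology and angle analysis.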

15. Image processing methods to elucidate spatial characteristics of retinal microglia after optic nerve transection

PubMed Central

Zhang, Yudong; Peng, Bo; Wang, Shuihua; Liang, Yu-Xiang; Yang, Jiquan; So, Kwok-Fai; Yuan, Ti-Fei

2016-01-01

Microglia are mononuclear phagocytes with various functions in the central nervous system, and their morphologies reflect different stages and functions. In the optic nerve transection model of the retina, the retrograde degeneration of retinal ganglion cells induces microglial activation into a unique morphology termed rod microglia. A few studies have described rod microglia in the cortex and retina; however, the spatial characteristics of rod microglia are not fully understood. In this study, we built a mathematical model to characterize the spatial traits of rod microglia. In addition, we developed a Matlab-based image processing pipeline that consists of log enhancement, image segmentation, mathematical morphology based cell detection, area calculation, and angle analysis. This computer program provides researchers a powerful tool to quickly analyze the spatial traits of rod microglia. PMID:26888347

16. Corn plant locating by image processing

Jia, Jiancheng; Krutz, Gary W.; Gibson, Harry W.

1991-02-01

The feasibility of using machine vision technology to locate corn plants is an important issue for field production automation in the agricultural industry. This paper presents an approach developed to locate the center of a corn plant using image processing techniques. Corn plants were first identified using a main-vein detection algorithm, which detects a local feature of corn leaves (the leaf main veins) based on the spectral difference between veins and leaves; the center of the plant could then be located by a center-locating algorithm that traces and extends each detected vein line and estimates the plant center from the intersection points of those lines. The experimental results show the usefulness of the algorithm for machine vision applications related to corn plant identification. Such a technique can be used for precise spraying of pesticides or biotech chemicals.

17. Intelligent elevator management system using image processing

Narayanan, H. Sai; Karunamurthy, Vignesh; Kumar, R. Barath

2015-03-01

In the modern era, the increase in the number of shopping malls and industrial buildings has led to an exponential increase in the usage of elevator systems, and thus an increased need for an effective control system to manage them. This paper introduces an effective method to control the movement of elevators by considering various cases wherein the location of each person is found and the elevators are controlled based on conditions such as load and proximity. The method continuously monitors the weight limit of each elevator while also using image processing to determine the number of persons waiting for an elevator on each floor. The Canny edge detection technique is used to find the number of persons waiting for an elevator. The algorithm thus takes many cases into account and selects the correct elevator to serve the persons waiting on different floors.
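
The counting idea can be sketched with a simplified stand-in for the Canny step: gradient-magnitude edges plus connected-component counting on a synthetic camera frame. The scene, the threshold, and the one-blob-per-person assumption are all illustrative:

```python
import numpy as np
from scipy import ndimage

# Synthetic "waiting area" frame with two person-like silhouettes.
frame = np.zeros((80, 120))
frame[20:60, 10:25] = 1.0
frame[20:60, 50:65] = 1.0

gx = ndimage.sobel(frame, axis=1)
gy = ndimage.sobel(frame, axis=0)
# Crude edge map; full Canny adds non-maximum suppression and hysteresis.
edges = np.hypot(gx, gy) > 1.0

# Each closed silhouette outline forms one connected edge component.
_, waiting_count = ndimage.label(edges, structure=np.ones((3, 3)))
```

In a deployed system this per-floor count, combined with each car's load reading, would feed the dispatch logic the paper describes.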

18. How to build a course in mathematical-biological modeling: content and processes for knowledge and skill.

PubMed

Hoskinson, Anne-Marie

2010-01-01

Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments. PMID:20810966

19. Mathematical modeling of the process of filling a mold during injection molding of ceramic products

Kulkov, S. N.; Korobenkov, M. V.; Bragin, N. A.

2015-10-01

Predicting the filling of a mold during injection molding of ceramic products is of great importance, because the strength of the final product is directly related to the presence of voids in the molding; early prediction makes it possible to detect inaccuracies in the mold prior to manufacturing. The calculations were performed with the software package Fluent, formulated as a mathematical model of the turbulent hydrodynamic process of filling a predetermined volume with a viscous liquid. The model was used to determine mold filling and to evaluate the influence of the density and viscosity of the feedstock, and of the injection pressure, on the filling process, in order to predict the formation of voids caused by defects in the mold geometry.

20. On the mathematical modeling of the transient process of spontaneous heating in a moist coal stockpile

SciTech Connect

Chen, X.D. )

1992-08-01

This paper reports that the influence of moisture transfer on the maximum temperature rise in a coal stockpile has been analyzed using the simplified one-dimensional differential equations that govern the spontaneous heating process. Analytical solutions for the maximum temperature rise have been obtained using well-established coefficients. The solutions suggest that, if saturation of the gas stream in a coal stockpile with moisture is assumed, the numerically predicted temperature of the stockpile will be well below 100 °C. However, as the relative humidity of the gas stream is reduced, the predicted maximum temperature increases to over 100 °C. These analytical solutions strongly support the idea of introducing the equilibrium relationship between the relative humidity of the gas and the moisture content of coal into the transient mathematical model of the heating process that has been developed over 20 years at the University of Canterbury.

1. Teaching Image-Processing Concepts in Junior High School: Boys' and Girls' Achievements and Attitudes towards Technology

ERIC Educational Resources Information Center

2012-01-01

Background: This research focused on the development, implementation and evaluation of a course on image-processing principles aimed at middle-school students. Purpose: The overarching purpose of the study was that of integrating the learning of subjects in science, technology, engineering and mathematics (STEM), and linking the learning of these…

2. Image processing and products for the Magellan mission to Venus

NASA Technical Reports Server (NTRS)

Clark, Jerry; Alexander, Doug; Andres, Paul; Lewicki, Scott; Mcauley, Myche

1992-01-01

The Magellan mission to Venus is providing planetary scientists with massive amounts of new data about the surface geology of Venus. Digital image processing is an integral part of the ground data system that provides data products to the investigators. The mosaicking of synthetic aperture radar (SAR) image data from the spacecraft is being performed at JPL's Multimission Image Processing Laboratory (MIPL). MIPL hosts and supports the Image Data Processing Subsystem (IDPS), which was developed in a VAXcluster environment of hardware and software that includes optical disk jukeboxes and the TAE-VICAR (Transportable Applications Executive-Video Image Communication and Retrieval) system. The IDPS is being used by processing analysts of the Image Data Processing Team to produce the Magellan image data products. Various aspects of the image processing procedure are discussed.

3. Filters in 2D and 3D Cardiac SPECT Image Processing

PubMed Central

Ploussi, Agapi; Synefia, Stella

2014-01-01

Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key for accurate diagnosis. Image filtering, a mathematical process, compensates for loss of detail in an image while reducing image noise; it can improve image resolution and limit degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect image quality and, in turn, the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast. PMID:24804144
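
As a rough illustration of the filter family the study favored, the sketch below applies a frequency-domain Butterworth low-pass to a noisy 2-D image; the cutoff and order are invented, not the study's critical frequencies:

```python
import numpy as np

def butterworth_lowpass(shape, cutoff=0.25, order=5):
    """2-D Butterworth gain H(f) = 1 / (1 + (f/fc)^(2n)), f in cycles/pixel."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.hypot(fy, fx)                       # radial spatial frequency
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

# Apply the filter to white noise standing in for a noisy SPECT slice.
noisy = np.random.default_rng(1).normal(size=(128, 128))
H = butterworth_lowpass(noisy.shape)
smoothed = np.real(np.fft.ifft2(np.fft.fft2(noisy) * H))
```

Raising the order steepens the roll-off while the cutoff sets where detail starts to be suppressed, which is exactly the trade-off the study explored across critical frequencies and orders.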

4. Validation of 2DH hydrodynamic and morphological mathematical models. A methodology based on SAR imaging

Canelas, Ricardo; Heleno, Sandra; Pestana, Rita; Ferreira, Rui M. L.

2014-05-01

The objective of the present work is to devise a methodology to validate 2DH shallow-water models suitable to simulate flow hydrodynamics and channel morphology. For this purpose, a 2DH mathematical model, assembled at CEHIDRO, IST, is employed to model Tagus river floods over a 70 km reach, and Synthetic Aperture Radar (SAR) images are collected to retrieve planar inundation extents. The model is suited for highly unsteady discontinuous flows over complex, time-evolving geometries, employing a finite-volume discretization scheme based on a flux-splitting technique incorporating a revised version of the Roe Riemann solver. Novel closure terms for the non-equilibrium sediment transport model are included. New boundary conditions are employed, based on the Riemann variables associated with the outgoing characteristic fields, coping with the provided hydrographs in a mathematically coherent manner. A high resolution Digital Elevation Model (DEM) is used and levee structures are considered as fully erodible elements. Spatially heterogeneous roughness characteristics are derived from land-use databases such as CORINE LandCover 2006. SAR satellite imagery of the floods is available and is used to validate the simulation results, with particular emphasis on the 2000/2001 flood. The delimited areas from the satellite and simulations are superimposed. The quality of the adjustment depends on the calibration of the roughness coefficients and on the spatial discretization of small structures, with lengths on the order of the grid spacing. Flow depths and registered discharges are recovered from the simulation and compared with data from a measuring station in the domain, with the comparison revealing remarkably high accuracy, both in terms of amplitudes and phase. Further inclusion of topographical detail should improve the comparison of flood extents against satellite data. The validated model was then employed to simulate 100-year floods in the same reach.

5. Improving Primary School Prospective Teachers' Understanding of the Mathematics Modeling Process

ERIC Educational Resources Information Center

Bal, Aytgen Pinar; Doganay, Ahmet

2014-01-01

The development of mathematical thinking plays an important role on the solution of problems faced in daily life. Determining the relevant variables and necessary procedural steps in order to solve problems constitutes the essence of mathematical thinking. Mathematical modeling provides an opportunity for explaining thoughts in real life by making…

6. Growth Processes and Formal Logic. Comments on History and Mathematics Regarded as Combined Educational Tools

ERIC Educational Resources Information Center

Seltman, Muriel; Seltman, P. E. J.

1978-01-01

The authors stress the importance of bringing together the causal logic of history and the formal logic of mathematics in order to humanize mathematics and make it more accessible. An example of such treatment is given in a discussion of the centrality of Euclid and the Euclidean system to mathematics development. (MN)

7. Hong Kong and U.S. Teachers' Perceptions of Mathematical Disagreements and Their Resolution Processes

ERIC Educational Resources Information Center

Barlow, Angela T.; Huang, Rongjin; Law, Huk-Yuen; Chan, Yip Cheung; Zhang, Qiaoping; Baxter, Wesley A.; Gaddy, Angeline K.

2016-01-01

Mathematical disagreements occur when students challenge each other's ideas related to a mathematical concept. In this research, we examined Hong Kong and U.S. elementary teachers' perceptions of mathematical disagreements and their resolutions using a video-stimulated survey. Participants were directed to give particular attention to the…

8. Spot restoration for GPR image post-processing

DOEpatents

Paglieroni, David W; Beer, N. Reginald

2014-05-20

A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.

9. Vision-sensing image analysis for GTAW process control

SciTech Connect

Long, D.D.

1994-11-01

Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial (GTAW) electrode holder. Video data was obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.

10. Hybrid implementation of a real-time Radon-space image-processing system.

PubMed

Woolven, S; Ristic, V M; Chevrette, P

1993-11-10

A unique hybrid optical-digital image-processing system that functions at real-time rates and performs analysis in Radon space is presented. This system functions by using the forward Radon transform (a mathematical tomographic transform of image data from two-dimensional image space to one-dimensional Radon space), which is achieved by a front-end optical processor followed by a digital processing subsystem operating in Radon space. The system works by optically converting the two-dimensional image data into a series of one-dimensional projections. All further processing is performed digitally in Radon space on the one-dimensional projections. Using the system in transform space, we show that it can perform real-time detection of minimum-resolvable-temperature-difference measurement targets better than a human observer. Also, this paper discusses the potential of real-time object-moment analysis in Radon space. These object moments can be calculated in Radon space with significantly less image data and fewer digital processing operations than in image space. The optical front end is capable of performing 6.04 × 10^10 operations/s on the two-dimensional image data. PMID:20856498
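
The forward Radon transform the system performs optically can be sketched digitally as rotate-and-sum; the angles and test image below are illustrative:

```python
import numpy as np
from scipy import ndimage

def radon_projections(image, angles_deg):
    """One 1-D projection per angle: rotate the image, then sum one axis."""
    return np.array([
        ndimage.rotate(image, angle, reshape=False, order=1).sum(axis=0)
        for angle in angles_deg
    ])

# Small square object at the center of the frame.
img = np.zeros((65, 65))
img[30:35, 30:35] = 1.0
sino = radon_projections(img, [0, 45, 90])
```

Total mass (the zeroth moment) is preserved in every projection, which illustrates why object moments can be computed in Radon space from far less data than in image space.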

11. V-Sipal - a Virtual Laboratory for Satellite Image Processing and Analysis

Buddhiraju, K. M.; Eeti, L.; Tiwari, K. K.

2011-09-01

In this paper a virtual laboratory for Satellite Image Processing and Analysis (v-SIPAL), being developed at the Indian Institute of Technology Bombay, is described. v-SIPAL comprises a set of experiments that are normally carried out by students learning digital processing and analysis of satellite images using commercial software. Currently, the experiments available on the server include Image Viewer, Image Contrast Enhancement, Image Smoothing, Edge Enhancement, Principal Component Transform, Texture Analysis by the Co-occurrence Matrix method, Image Indices, Color Coordinate Transforms, Fourier Analysis, Mathematical Morphology, Unsupervised Image Classification, Supervised Image Classification, and Accuracy Assessment. The virtual laboratory includes a theory module for each option of every experiment, a description of the procedure to perform each experiment, the menu to choose and perform the experiment, a module on interpretation of results when performed with a given image and pre-specified options, a bibliography, links to useful internet resources, and user feedback. The user can upload his/her own images for performing the experiments and can also reuse outputs of one experiment in another experiment where applicable. Some of the other experiments currently under development include georeferencing of images, data fusion, feature evaluation by divergence and J-M distance, image compression, wavelet image analysis, and change detection. Additions to the theory module include self-assessment quizzes, audio-video clips on selected concepts, and a discussion of elements of visual image interpretation. v-SIPAL is at the stage of internal evaluation within IIT Bombay and will soon be open to selected educational institutions in India for evaluation.

12. Quasi-Three-Dimensional Mathematical Modeling of Morphological Processes Based on Equilibrium Sediment Transport

Charafi, My. M.; Sadok, A.; Kamal, A.; Menai, A.

A quasi-three-dimensional mathematical model has been developed to study morphological processes based on an equilibrium sediment transport method. The flow velocities are computed by a two-dimensional horizontal depth-averaged flow model (H2D) in combination with logarithmic velocity profiles. The transport of sediment particles by flowing water has been considered in the form of bed load and suspended load. The bed load transport rate is defined as the transport of particles by rolling and saltating along the bed surface and is given by the Van Rijn (1987) relationship. The equilibrium suspended load transport is described in terms of an equilibrium sediment concentration profile (ce) and a logarithmic velocity profile (u). Based on the equilibrium transport, the bed change rate is given by integration of the sediment mass-balance equation. The model results have been compared with Van Rijn's results (equilibrium approach) and good agreement has been found.
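
The bed change rate obtained by integrating the sediment mass-balance equation is conventionally written as the Exner equation; a standard form (symbols assumed here, not taken from the paper) is:

```latex
\frac{\partial z_b}{\partial t} = -\frac{1}{1 - p}\,\nabla \cdot \mathbf{q}_s
```

where z_b is the bed elevation, p the bed porosity, and q_s the total (bed-load plus suspended) volumetric sediment flux per unit width; a divergence of the flux computed from the equilibrium relations thus drives erosion or deposition.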

13. Mathematic modeling of the Earth's surface and the process of remote sensing

NASA Technical Reports Server (NTRS)

Balter, B. M.

1979-01-01

It is shown that real data from remote sensing of the Earth from outer space are not best suited to the search for optimal procedures with which to process such data. To work out the procedures, it was proposed that data synthesized with the help of mathematical modeling be used. A criterion for similarity to reality was formulated. The basic principles for constructing methods for modeling the data from remote sensing are recommended. A concrete method is formulated for modeling a complete cycle of radiation transformations in remote sensing. A computer program is described which realizes the proposed method. Some results from calculations are presented which show that the method satisfies the requirements imposed on it.

14. Mathematical modeling of quartz particle melting process in plasma-chemical reactor

Volokitin, Oleg; Vlasov, Viktor; Volokitin, Gennady; Skripnikova, Nelli; Shekhovtsov, Valentin

2016-01-01

Among silica-based materials, vitreous silica has a special place. The paper presents the melting process of a quartz particle under conditions of low-temperature plasma. A mathematical model is designed for the stages of melting in the experimental plasma-chemical reactor. As the calculation data show, quartz particles with radii 0.21 ≤ rp ≤ 0.64 mm completely melt at a particle feed rate of W = 0.65 l/s, depending on the Nusselt number, while particles with 0.14 ≤ rp ≤ 0.44 mm melt at W = 1.4 l/s. The calculation data showed that 2 mm and 0.4 mm quartz particles completely melted during and 0.1 s, respectively. Thus, the phase transformations occurring in silicon dioxide play an important part in its heating up to the melting temperature.

15. Instruction of Engineering Exercises in Information Processing Including Electric Engineering, Mathematics, Information Basics, and Experiments

Nogaku, Mitsuharu

We propose to instruct engineering exercise in the part of information processing including electric engineering, mathematics, information basics, and experiments. We give first year students four themes for exercise; (1) the trigonometric function, (2) the solving equations, (3) Fourier series, and (4) the model of electric field in the dielectric materials. The used software is Microsoft Excel on Windows XP. At the fourth theme, all the students are arranged as lattice points of a model case, and work together to calculate voltage values by desk calculators. The results from their handwork are compared to simulated values of the Excel software. In each theme, the graphical method of simulated values leads students to understandings of theories and phenomena.

16. Fundamental remote science research program. Part 2: Status report of the mathematical pattern recognition and image analysis project

NASA Technical Reports Server (NTRS)

Heydorn, R. P.

1984-01-01

The Mathematical Pattern Recognition and Image Analysis (MPRIA) Project is concerned with basic research problems related to the study of the Earth from remotely sensed measurements of its surface characteristics. The program goal is to better understand how to analyze the digital image that represents the spatial, spectral, and temporal arrangement of these measurements for the purpose of making selected inferences about the Earth. This report summarizes the progress that has been made toward this program goal by each of the principal investigators in the MPRIA Program.

17. Individual Differences in Children's Mathematical Competence Are Related to the Intentional but Not Automatic Processing of Arabic Numerals

ERIC Educational Resources Information Center

Bugden, Stephanie; Ansari, Daniel

2011-01-01

In recent years, there has been an increasing focus on the role played by basic numerical magnitude processing in the typical and atypical development of mathematical skills. In this context, tasks measuring both the intentional and automatic processing of numerical magnitude have been employed to characterize how children's representation and…

18. A Mathematical Analysis of the Learning Production Process and a Model for Determining What Matters in Education.

ERIC Educational Resources Information Center

Bacdayan, Andrew W.

1997-01-01

Describes in economic and mathematical terms the learning production process at the individual level. Presents a model to determine which factors influence this process. Tests the model, using data from two widely divergent learning situations, ranging from lecture-oriented college economics courses to programmed instruction to learn eighth-grade…

19. Mathematical Modeling of Nitrous Oxide Production during Denitrifying Phosphorus Removal Process.

PubMed

Liu, Yiwen; Peng, Lai; Chen, Xueming; Ni, Bing-Jie

2015-07-21

A denitrifying phosphorus removal process undergoes frequent alternating anaerobic/anoxic conditions to achieve phosphate release and uptake, during which microbial internal storage polymers (e.g., polyhydroxyalkanoate (PHA)) can be produced and consumed dynamically. These PHA turnovers play an important role in nitrous oxide (N2O) accumulation during the denitrifying phosphorus removal process. In this work, a mathematical model is developed, for the first time, to describe N2O dynamics and the key role of PHA consumption in N2O accumulation during the denitrifying phosphorus removal process. In this model, the four-step anoxic storage of polyphosphate and the four-step anoxic growth on PHA using nitrate, nitrite, nitric oxide (NO), and N2O consecutively by denitrifying polyphosphate accumulating organisms (DPAOs) are taken into account to describe all potential N2O accumulation steps in the denitrifying phosphorus removal process. The developed model is successfully applied to reproduce experimental data on N2O production obtained from four independent denitrifying phosphorus removal studies with different experimental conditions. The model satisfactorily describes the N2O accumulation, nitrogen reduction, phosphate release and uptake, and PHA dynamics for all systems, supporting the validity and applicability of the model. The results indicate a substantial role of PHA consumption in N2O accumulation, due to the relatively low N2O reduction rate when using PHA during denitrifying phosphorus removal. PMID:26114730

20. Viewpoints on Medical Image Processing: From Science to Application

PubMed Central

Deserno (né Lehmann), Thomas M.; Handels, Heinz; Maier-Hein (né Fritzsche), Klaus H.; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas

2013-01-01

Medical image processing provides core innovation for medical imaging. This paper is focused on recent developments from science to applications analyzing the past fifteen years of history of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of views: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment. PMID:24078804

1. a Real-Time Optical/digital Radon Space Image Processing System

Woolven, Steve

A unique hybrid optical/digital general image processing system which potentially functions at real-time rates and performs analysis on low object-to-background contrast images in Radon space is investigated. The system is capable of some real-time functions which are invariant to object distortions. This research is presented in three stages: the development and analysis of the theory of Radon space, the hardware and software design and implementation of the working system, and the results achieved. This original system functions by using the forward Radon transform (a mathematical tomographic transform of image data from two-dimensional image space to a one-dimensional space, Radon space), which is achieved by a front-end optical processor, followed by a digital processing subsystem operating in Radon space instead of the more familiar image space. The system works by converting the two-dimensional image data into a series of one-dimensional projections, and it is demonstrated that several digital image processing functions can potentially be performed faster on the projection data than on the original image data. Using the transform, it is shown that the system is theoretically capable of performing real-time two-dimensional Fourier transforms and matched filtering operations. This document also presents and demonstrates a method of potential real-time object-moment analysis which allows objects to undergo distortions and continue to be recognized as the original object. It is shown that these moments can be calculated in Radon space using significantly less image data and fewer digital processing operations than in image space. The optical system is potentially capable of performing 6.04 × 10^10 operations per second on the two-dimensional image data.
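As a minimal numerical sketch of the projection idea (my own illustration, not the paper's optical implementation), the forward Radon transform can be approximated by rotating the image and summing along one axis; the name `radon_projections` is assumed:

```python
import numpy as np
from scipy.ndimage import rotate

def radon_projections(image, angles_deg):
    """Forward Radon transform (sketch): one 1-D projection per angle,
    obtained by rotating the image and summing along the rows."""
    return np.stack([
        rotate(image, angle, reshape=False, order=1).sum(axis=0)
        for angle in angles_deg
    ])

# Each projection conserves total image mass (up to interpolation error),
# which is one reason object moments can be computed directly in Radon space.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
sinogram = radon_projections(img, angles_deg=[0, 45, 90])
```

Summing any single projection recovers the image's zeroth moment, so moment computations operate on a handful of one-dimensional arrays instead of the full two-dimensional image.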

2. Interactive image processing for mobile devices

Shaw, Rodney

2009-01-01

As the number of consumer digital images escalates by tens of billions each year, an increasing proportion of these images are being acquired using the latest generations of sophisticated mobile devices. The characteristics of the cameras embedded in these devices now yield image-quality outcomes that approach those of the parallel generations of conventional digital cameras, and all aspects of the management and optimization of these vast new image populations become of utmost importance in providing ultimate consumer satisfaction. However, this satisfaction is still limited by the fact that a substantial proportion of all images are perceived to have inadequate image quality, and a lesser proportion are completely unacceptable (for sharing, archiving, printing, etc.). In past years at this same conference, the author has described various aspects of a consumer digital-image interface based entirely on an intuitive image-choice-only operation. Demonstrations have been given of this facility in operation, essentially allowing critical-path navigation through approximately a million possible image-quality states within a matter of seconds. This was made possible by the definition of a set of orthogonal image vectors, with all excursions defined in terms of a fixed linear visual-pixel model, independent of the image attribute. During recent months this methodology has been extended to yield specific user-interactive image-quality solutions in the form of custom software, which at less than 100 kB is readily embedded in the latest generations of unlocked portable devices. This has also necessitated the design of new user interfaces and controls, as well as streamlined and more intuitive versions of the user quality-choice hierarchy. The technical challenges and details are described for these modified versions of the enhancement methodology, together with initial practical experience with typical images.

3. Image processing software for imaging spectrometry data analysis

NASA Technical Reports Server (NTRS)

Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

1988-01-01

Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.

4. VIP: Vortex Image Processing pipeline for high-contrast direct imaging of exoplanets

Gomez Gonzalez, Carlos Alberto; Wertz, Olivier; Christiaens, Valentin; Absil, Olivier; Mawet, Dimitri

2016-03-01

VIP (Vortex Image Processing pipeline) provides pre- and post-processing algorithms for high-contrast direct imaging of exoplanets. Written in Python, VIP provides a very flexible framework for data exploration and image processing and supports high-contrast imaging observational techniques, including angular, reference-star and multi-spectral differential imaging. Several post-processing algorithms for PSF subtraction based on principal component analysis are available as well as the LLSG (Local Low-rank plus Sparse plus Gaussian-noise decomposition) algorithm for angular differential imaging. VIP also implements the negative fake companion technique coupled with MCMC sampling for rigorous estimation of the flux and position of potential companions.

5. Bessel filters applied in biomedical image processing

Mesa Lopez, Juan Pablo; Castañeda Saldarriaga, Diego Leon

2014-06-01

Magnetic resonance imaging uses magnets and radio waves to create images of the body; however, in some images it is difficult to recognize organs or foreign agents present in the body. The objective of applying Bessel filters is to significantly increase the effective resolution of magnetic resonance images, making them much clearer in order to detect anomalies and diagnose illness. Bessel functions arise in solving the Schrödinger equation for a particle enclosed in a cylinder; the corresponding filters modify the colors and contours of the image, and therein lies their effectiveness, since a clearer outline makes abnormalities inside the body more defined and easier to recognize.

6. DTV color and image processing: past, present, and future

Kim, Chang-Yeong; Lee, SeongDeok; Park, Du-Sik; Kwak, Youngshin

2006-01-01

The image processor in digital TV has started to play an important role due to customers' growing desire for higher-quality images. Customers want more vivid and natural images without any visual artifact. Image processing techniques are designed to meet customers' needs in spite of the physical limitations of the panel. In this paper, developments in image processing techniques for DTV, in conjunction with developments in display technologies at Samsung R&D, are reviewed. The introduced algorithms cover techniques required to solve problems caused by the characteristics of the panel itself, and techniques for enhancing the image quality of input signals optimized for the panel and human visual characteristics.

7. Quantifying the effect of tissue deformation on diffusion-weighted MRI: a mathematical model and an efficient simulation framework applied to cardiac diffusion imaging.

PubMed

Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie

2016-08-01

Cardiac motion presents a major challenge in diffusion weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch-Torrey equations, takes into account deformations according to the laws of continuum mechanics. By approximating this mathematical model with the finite element method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered, and the numerically computed diffusion weighted MR signals are compared to available results in the literature. Our numerical model can identify two time points in the cardiac cycle at which the diffusion signal is unaffected by myocardial strain and cardiac motion; these time points depend, of course, on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either a spin echo technique with acceleration-motion-compensated diffusion gradients or a stimulated echo acquisition mode with unipolar and bipolar diffusion gradients. PMID:27385441
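For reference, the static-tissue Bloch-Torrey equation on which such models are built can be sketched as follows (a standard textbook form; the paper's deformation terms from continuum mechanics are not reproduced here):

```latex
\frac{\partial M(\mathbf{x},t)}{\partial t}
  = -\,i\,\gamma\,\bigl(\mathbf{g}(t)\cdot\mathbf{x}\bigr)\,M(\mathbf{x},t)
  + \nabla\cdot\bigl(\mathbf{D}\,\nabla M(\mathbf{x},t)\bigr)
```

where \(M\) is the transverse magnetization, \(\gamma\) the gyromagnetic ratio, \(\mathbf{g}(t)\) the diffusion-encoding gradient waveform, and \(\mathbf{D}\) the diffusion tensor; tissue motion enters when the equation is rewritten over the deforming material domain.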

8. Quantifying the effect of tissue deformation on diffusion-weighted MRI: a mathematical model and an efficient simulation framework applied to cardiac diffusion imaging

Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie

2016-08-01

Cardiac motion presents a major challenge in diffusion weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch–Torrey equations, takes into account deformations according to the laws of continuum mechanics. By approximating this mathematical model with the finite element method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered, and the numerically computed diffusion weighted MR signals are compared to available results in the literature. Our numerical model can identify two time points in the cardiac cycle at which the diffusion signal is unaffected by myocardial strain and cardiac motion; these time points depend, of course, on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either a spin echo technique with acceleration-motion-compensated diffusion gradients or a stimulated echo acquisition mode with unipolar and bipolar diffusion gradients.

9. Two satellite image sets for the training and validation of image processing systems for defense applications

Peterson, Michael R.; Aldridge, Shawn; Herzog, Britny; Moore, Frank

2010-04-01

Many image processing algorithms utilize the discrete wavelet transform (DWT) to provide efficient compression and near-perfect reconstruction of image data. Defense applications often require the transmission of data at high levels of compression over noisy channels. In recent years, evolutionary algorithms (EAs) have been utilized to optimize image transform filters that outperform standard wavelets for bandwidth-constrained compression of satellite images. The optimization of these filters requires the use of training images appropriately chosen for the image processing system's intended applications. This paper presents two robust sets of fifty images each intended for the training and validation of satellite and unmanned aerial vehicle (UAV) reconnaissance image processing algorithms. Each set consists of a diverse range of subjects consisting of cities, airports, military bases, and landmarks representative of the types of images that may be captured during reconnaissance missions. Optimized algorithms may be "overtrained" for a specific problem instance and thus exhibit poor performance over a general set of data. To reduce the risk of overtraining an image filter, we evaluate the suitability of each image as a training image. After evolving filters using each image, we assess the average compression performance of each filter across the entire set of images. We thus identify a small subset of images from each set that provide strong performance as training images for the image transform optimization problem. These images will also provide a suitable platform for the development of other algorithms for defense applications. The images are available upon request from the contact author.

10. Image processing techniques for digital orthophotoquad production

USGS Publications Warehouse

Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

1989-01-01

Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tile and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

11. An Image Processing Algorithm Based On FMAT

NASA Technical Reports Server (NTRS)

Wang, Lui; Pal, Sankar K.

1995-01-01

Information deleted in ways minimizing adverse effects on reconstructed images. New grey-scale generalization of medial axis transformation (MAT), called FMAT (short for Fuzzy MAT) proposed. Formulated by making natural extension to fuzzy-set theory of all definitions and conditions (e.g., characteristic function of disk, subset condition of disk, and redundancy checking) used in defining MAT of crisp set. Does not need image to have any kind of priori segmentation, and allows medial axis (and skeleton) to be fuzzy subset of input image. Resulting FMAT (consisting of maximal fuzzy disks) capable of reconstructing exactly original image.

12. Cardiovascular Imaging and Image Processing: Theory and Practice - 1975

NASA Technical Reports Server (NTRS)

Harrison, Donald C. (Editor); Sandler, Harold (Editor); Miller, Harry A. (Editor); Hood, Manley J. (Editor); Purser, Paul E. (Editor); Schmidt, Gene (Editor)

1975-01-01

Ultrasonography was examined in regard to the developmental highlights and present applications of cardiac ultrasound. Doppler ultrasonic techniques and the technology of miniature acoustic element arrays were reported. X-ray angiography was discussed with special consideration of quantitative three-dimensional dynamic imaging of structure and function of the cardiopulmonary and circulatory systems in all regions of the body. Nuclear cardiography and scintigraphy, three-dimensional imaging of the myocardium with isotopes, and the commercialization of the echocardioscope were studied.

13. Viking image processing. [digital stereo imagery and computer mosaicking

NASA Technical Reports Server (NTRS)

Green, W. B.

1977-01-01

The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

14. Mathematical modelling of thermal process to aquatic environment with different hydrometeorological conditions.

PubMed

Issakhov, Alibek

2014-01-01

This paper presents a mathematical model of the thermal discharge from a thermal power plant into the aquatic environment of its reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered; it is solved using the three-dimensional Navier-Stokes equations and a temperature equation for incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion; the intermediate velocity field is solved by the fractional steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, the transfer is due only to the pressure gradient. The numerical method captures the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively under different hydrometeorological conditions. PMID:24991644
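The three-stage splitting described above matches a standard Chorin-type projection scheme, which can be sketched (notation assumed, not taken from the paper) as:

```latex
\frac{\mathbf{u}^{*}-\mathbf{u}^{n}}{\Delta t}
   = -(\mathbf{u}^{n}\!\cdot\!\nabla)\,\mathbf{u}^{n}
     + \nu\,\nabla^{2}\mathbf{u}^{n}
   \quad\text{(convection and diffusion only)},\\[4pt]
\nabla^{2} p^{\,n+1} = \frac{1}{\Delta t}\,\nabla\!\cdot\!\mathbf{u}^{*}
   \quad\text{(Poisson equation for pressure)},\\[4pt]
\frac{\mathbf{u}^{n+1}-\mathbf{u}^{*}}{\Delta t} = -\nabla p^{\,n+1}
   \quad\text{(transfer due to the pressure gradient only)}.
```

Here \(\mathbf{u}^{*}\) is the intermediate velocity field, and the pressure correction in the third stage enforces \(\nabla\cdot\mathbf{u}^{n+1}=0\).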

15. Mathematical modeling and analysis of EDM process parameters based on Taguchi design of experiments

Laxman, J.; Raj, K. Guru

2015-12-01

Electro Discharge Machining is a process used for machining very hard metals, and deep and complex shapes, by metal erosion in all types of electro-conductive materials. The metal is removed through the action of an electric discharge of short duration and high current density between the tool and the work piece. The eroded metal on the surface of both the work piece and the tool is flushed away by the dielectric fluid. The objective of this work is to develop a mathematical model for an Electro Discharge Machining process which provides the equations necessary to predict the metal removal rate, electrode wear rate, and surface roughness. Regression analysis is used to investigate the relationship between the various process parameters. The input parameters are peak current, pulse-on time, pulse-off time, and tool lift time; the metal removal rate, electrode wear rate, and surface roughness are the responses. Experiments are conducted on a titanium superalloy based on the Taguchi design of experiments, i.e., an L27 orthogonal array.
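The regression step can be sketched as an ordinary least-squares fit; the design-matrix rows and responses below are hypothetical placeholders, not the paper's L27 data:

```python
import numpy as np

# Hypothetical rows of a Taguchi-style design: columns are peak current,
# pulse-on time, pulse-off time and tool lift time (made-up values).
X = np.array([[6.0, 50.0, 10.0, 2.0],
              [6.0, 75.0, 20.0, 4.0],
              [9.0, 50.0, 20.0, 6.0],
              [9.0, 75.0, 10.0, 2.0]])
y = np.array([1.2, 1.8, 2.1, 2.9])   # e.g. metal removal rate (made up)

# First-order regression model: y ~ b0 + b1*I + b2*Ton + b3*Toff + b4*Tlift
A = np.column_stack([np.ones(len(X)), X])
coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
```

A separate fit of the same form would be made for each response (electrode wear rate, surface roughness); higher-order or interaction terms simply add columns to `A`.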

16. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

PubMed Central

Issakhov, Alibek

2014-01-01

This paper presents the mathematical model of the thermal process from thermal power plant to aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 Km to the north-east of Ekibastuz town. The thermal process in reservoir-cooler with different hydrometeorological conditions is considered, which is solved by three-dimensional Navier-Stokes equations and temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method, divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. Intermediate velocity field is solved by fractional steps method. At the second stage, three-dimensional Poisson equation is solved by the Fourier method in combination with tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is expected that the transfer is only due to the pressure gradient. Numerical method determines the basic laws of the hydrothermal processes that qualitatively and quantitatively are approximated depending on different hydrometeorological conditions. PMID:24991644

17. Multispectral image restoration of historical documents based on LAAMs and mathematical morphology

Lechuga-S., Edwin; Valdiviezo-N., Juan C.; Urcid, Gonzalo

2014-09-01

This research introduces an automatic technique designed for the digital restoration of the damaged parts of historical documents. For this purpose an imaging spectrometer is used to acquire a set of images in the wavelength interval from 400 to 1000 nm. Assuming the presence of linearly mixed spectral pixels registered from the multispectral image, our technique uses two lattice autoassociative memories to extract the set of pure pigments composing a given document. Through a spectral unmixing analysis, our method produces fractional abundance maps indicating the distribution of each pigment in the scene. These maps are then used to locate cracks and holes in the document under study. The restoration process is performed by the application of a region filling algorithm, based on morphological dilation, followed by a color interpolation to restore the original appearance of the filled areas. This procedure has been successfully applied to the analysis and restoration of three multispectral data sets: two corresponding to artificially superimposed scripts and one of real data acquired from a Mexican pre-Hispanic codex, whose restoration results are presented.
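The dilation-based filling step can be sketched as the classic iteration "grow a seed by dilation, masked by the complement of the crack boundary, until convergence" (a generic sketch with assumed names, not the authors' implementation):

```python
import numpy as np
from scipy.ndimage import binary_dilation

def fill_region(boundary, seed):
    """Morphological region filling: X_k = dilate(X_{k-1}) & ~boundary,
    iterated to convergence, then united with the boundary itself."""
    struct = np.array([[0, 1, 0],
                       [1, 1, 1],
                       [0, 1, 0]], dtype=bool)   # 4-connectivity
    filled = seed.astype(bool)
    while True:
        grown = binary_dilation(filled, structure=struct) & ~boundary
        if (grown == filled).all():
            return filled | boundary
        filled = grown
```

In the paper the filled pixels are subsequently color-interpolated to restore the document's original appearance.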

18. Optoelectronic image processing for cervical cancer screening

Narayanswamy, Ramkumar; Sharpe, John P.; Johnson, Kristina M.

1994-05-01

Automation of the Pap-smear cervical screening method is highly desirable, as it relieves tedium for the human operators, reduces cost, and should increase accuracy and provide repeatability. We present here the design of a high-throughput optoelectronic system which forms the first stage of a two-stage system to automate Pap-smear screening. We use a mathematical morphological technique called the hit-or-miss transform to identify the suspicious areas on a Pap-smear slide. This algorithm is implemented using a VanderLugt architecture and a time-sequential ANDing smart pixel array.
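The hit-or-miss transform matches a foreground template and a background template simultaneously; a digital equivalent of the optically implemented operation can be sketched with `scipy.ndimage` (illustrative templates for isolated-pixel detection, not the paper's screening templates):

```python
import numpy as np
from scipy.ndimage import binary_hit_or_miss

# Detect isolated single pixels: structure1 must match the foreground,
# structure2 must match the background (here: all four 4-neighbours empty).
img = np.zeros((7, 7), dtype=bool)
img[1, 1] = True               # isolated pixel -> should match
img[4, 4] = img[4, 5] = True   # part of a pair -> should not match

hit = np.array([[0, 0, 0],
                [0, 1, 0],
                [0, 0, 0]], dtype=bool)
miss = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]], dtype=bool)

found = binary_hit_or_miss(img, structure1=hit, structure2=miss)
```

Shapes of interest (e.g. suspicious cell nuclei) are targeted by choosing `hit`/`miss` templates that encode the object and its required surround.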

19. GStreamer as a framework for image processing applications in image fusion

Burks, Stephen D.; Doe, Joshua M.

2011-05-01

Multiple source band image fusion can sometimes be a multi-step process that consists of several intermediate image processing steps. Typically, each of these steps is required to be in a particular arrangement in order to produce a unique output image. GStreamer is an open source, cross platform multimedia framework, and using this framework, engineers at NVESD have produced a software package that allows for real time manipulation of processing steps for rapid prototyping in image fusion.

20. On digital image processing technology and application in geometric measure

Yuan, Jiugen; Xing, Ruonan; Liao, Na

2014-04-01

Digital image processing is an emerging discipline that has developed alongside semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the principles of digital image processing in measurement, compared with traditional optical measurement methods. Taking geometric measurement as an example, it discusses the development tendency of digital image processing technology from the perspective of technology application.

1. Image-Processing Software For A Hypercube Computer

NASA Technical Reports Server (NTRS)

Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

1992-01-01

Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

2. Optimizing signal and image processing applications using Intel libraries

Landré, Jérôme; Truchetet, Frédéric

2007-01-01

This paper presents optimized signal and image processing libraries from Intel Corporation. Intel Performance Primitives (IPP) is a low-level signal and image processing library developed by Intel Corporation to optimize code on Intel processors. Open Computer Vision library (OpenCV) is a high-level library dedicated to computer vision tasks. This article describes the use of both libraries to build flexible and efficient signal and image processing applications.

3. Image processing methods for visual prostheses based on DSP

Liu, Huwei; Zhao, Ying; Tian, Yukun; Ren, Qiushi; Chai, Xinyu

2008-12-01

Visual prostheses for extreme vision impairment have come closer to reality in recent years. The task of this research has been to design external devices and to study image processing algorithms and methods for images of differing complexity. We have developed a real-time system, based on a DSP (Digital Signal Processor), capable of image capture and processing to obtain the most available and important image features for recognition and simulation experiments. Beyond developing the hardware system, we introduce algorithms such as resolution reduction, information extraction, dilation and erosion, square (circular) pixelization, and Gaussian pixelization. We classify images into stages of differing complexity: simple images, moderately complex images, and complex images. As a result, this paper obtains the signals needed for transmission to the electrode array and the images for the simulation experiment.

4. Similarity of Fibroglandular Breast Tissue Content Measured from Magnetic Resonance and Mammographic Images and by a Mathematical Algorithm

PubMed Central

Ju, Hyunsu; Brunder, Donald G.; Anderson, Karl E.; Khamapirad, Tuenchit; Lu, Lee-Jane W.

2014-01-01

Women with high breast density (BD) have a 4- to 6-fold greater risk for breast cancer than women with low BD. We found that BD can be easily computed from a mathematical algorithm using routine mammographic imaging data or by a curve-fitting algorithm using fat and nonfat suppression magnetic resonance imaging (MRI) data. These BD measures in a strictly defined group of premenopausal women providing both mammographic and breast MRI images were predicted as well by the same set of strong predictor variables as were measures from a published laborious histogram segmentation method and a full field digital mammographic unit in multivariate regression models. We also found that the number of completed pregnancies, C-reactive protein, aspartate aminotransferase, and progesterone were more strongly associated with amounts of glandular tissue than adipose tissue, while fat body mass, alanine aminotransferase, and insulin-like growth factor-II appear to be more associated with the amount of breast adipose tissue. Our results show that methods of breast imaging and modalities for estimating the amount of glandular tissue have no effects on the strength of these predictors of BD. Thus, the more convenient mathematical algorithm and the safer MRI protocols may facilitate prospective measurements of BD. PMID:25132995

5. Research on non-destructive testing method of silkworm cocoons based on image processing technology

Gan, Yong; Kong, Qing-hua; Wei, Li-fu

2008-03-01

The major subject studied in this dissertation is a non-destructive testing method for silkworm cocoon quality, based on digital image processing and photoelectric technology. Through image collection and the analysis, processing, and calculation of the tested silkworm cocoons with the non-destructive testing technology, the application automatically reckons all items of the classification indexes, from which the classification result and the purchase price of the silkworm cocoons can be concluded. Against the domestic classification standard for silkworm cocoons, the author surveys the various testing methods currently in use or under exploration, and devises a non-destructive testing scheme based on digital image processing and photoelectric technology. The experimental design of the project is discussed and the precision of all the instruments is demonstrated. Several mathematical models for estimating the weight of dried silkworm cocoon shells are established, compared with each other, and analyzed for precision against a database in order to select the best model. The classification methods for all the complementary items are designed accurately. The testing method has a small error and reaches the advanced level of present domestic non-destructive testing technology for silkworm cocoons.

6. Image processing in a maritime environment

Pietrzak, Kenneth A.; Alberg, Matthew T.

2015-05-01

The performance of mast-mounted imaging sensors operating near the marine boundary layer can be severely impacted by environmental issues. Haze, atmospheric turbulence, and rough seas can all impact imaging system performance; examples of these impacts are provided in this paper. In addition, sensor artifacts such as deinterlace artifacts can also degrade imaging performance. Deinterlace artifacts caused by a rotating mast are often too severe for an operator to use the imagery for detection of contacts. An artifact edge minimization approach is presented that eliminates these global-motion-based deinterlace artifacts.

7. Identifying potential misfit items in cognitive process of learning engineering mathematics based on Rasch model

Ataei, Sh; Mahmud, Z.; Khalid, M. N.

2014-04-01

Student learning outcomes clarify what students should know and be able to demonstrate after completing a course, so one central issue in teaching and learning is how to assess student learning. This paper describes an application of the dichotomous Rasch measurement model to measuring the cognitive processes of engineering students learning mathematics. The study examines the cognitive ability of 54 engineering students learning Calculus III, using 31 items classified according to Bloom's Taxonomy. The results show that some of the examination questions are either too difficult or too easy for the majority of the students. The analysis yields fit statistics that identify where the data depart from the Rasch theoretical model. Potential misfit items were flagged by their ZSTD measures and removed when the outfit MNSQ was above 1.3 or below 0.7. It is therefore recommended that these items be reviewed or revised to better match the range of students' ability in the course.
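
The outfit mean-square statistic used above has a standard closed form under the dichotomous Rasch model; a minimal sketch (the person measures `thetas` and item difficulty `b` are assumed known here, whereas in practice they come from Rasch estimation software):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def outfit_mnsq(responses, thetas, b):
    """Outfit mean-square for one item: the mean squared standardized
    residual over persons. responses[i] in {0, 1} for person i."""
    total = 0.0
    for x, theta in zip(responses, thetas):
        p = rasch_p(theta, b)
        total += (x - p) ** 2 / (p * (1.0 - p))
    return total / len(responses)

def is_misfit(mnsq, lo=0.7, hi=1.3):
    """Flag an item using the paper's 0.7-1.3 outfit MNSQ window."""
    return mnsq < lo or mnsq > hi
```

For example, an easy item that able students always answer correctly yields an outfit MNSQ well below 0.7 (overfit) and is flagged.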

8. Experiments with recursive estimation in astronomical image processing

NASA Technical Reports Server (NTRS)

Busko, I.

1992-01-01

Recursive estimation concepts have been applied to image enhancement problems since the 1970s; however, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for two-dimensional images, from the well-known theory of Kalman filtering in one dimension. Historically, these techniques were applied to digital images because of the images' scanned nature: the temporal output of a scanner device could be processed on-line by techniques borrowed directly from one-dimensional recursive signal analysis. However, recursive estimation has properties that keep it attractive even today, when large computer memories make the full scanned image available to the processor at any time. One particularly important property is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena whose statistical properties vary in time (or with position in a 2-D image). Many image processing methods make underlying stationarity assumptions, either for the stochastic field being imaged, for the imaging system properties, or both; they underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, in contrast, make adaptive processing feasible, that is, processing the image with a processor whose properties are tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by phenomena such as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio, and the autocorrelation function. The software was developed under IRAF and will be made available to interested users.
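
As a hedged sketch of the recursive idea (not Busko's actual IRAF implementation), here is a one-dimensional Kalman-style estimator run along a scan line; the gain adapts as the estimate variance evolves, which is the mechanism that adaptive variants tune to local image statistics:

```python
def recursive_denoise(line, process_var=1.0, noise_var=4.0):
    """One-dimensional recursive (Kalman-style) estimator along a scan
    line: the state is the local intensity, updated pixel by pixel.
    A larger process_var lets the estimate track edges faster."""
    x = line[0]            # state estimate, seeded with the first pixel
    p = noise_var          # estimate variance
    out = [x]
    for z in line[1:]:
        p += process_var               # predict: intensity drifts between pixels
        k = p / (p + noise_var)        # Kalman gain
        x += k * (z - x)               # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return out
```

On a constant signal the estimate locks onto the true level; on noisy data the output is a smoothed version whose responsiveness is controlled by the two variance parameters.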

9. Sliding mean edge estimation [in digital image processing]

NASA Technical Reports Server (NTRS)

Ford, G. E.

1978-01-01

A method for determining the locations of the major edges of objects in digital images is presented. The method is based on an algorithm utilizing maximum likelihood concepts. An image line-scan interval is processed to determine if an edge exists within the interval and its location. The proposed algorithm has demonstrated good results even in noisy images.
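
A minimal sketch of the maximum-likelihood idea (assuming a two-level step plus Gaussian noise within the interval; the original algorithm's details may differ): the most likely edge position is the split that minimizes the total squared deviation from the two side means.

```python
def ml_edge_location(interval):
    """Maximum-likelihood edge estimate for a scan-line interval under a
    two-level-plus-Gaussian-noise model."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    best_k, best = None, sse(interval)      # 'no edge' baseline
    for k in range(1, len(interval)):
        s = sse(interval[:k]) + sse(interval[k:])
        if s < best:
            best_k, best = k, s
    return best_k                           # None means no edge improves the fit

edge = ml_edge_location([1, 1, 1, 1, 9, 9, 9, 9])   # → 4 (edge between indices 3 and 4)
```

The `None` return doubles as the "does an edge exist in this interval?" test mentioned in the abstract.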

10. Experiences with digital processing of images at INPE

NASA Technical Reports Server (NTRS)

Mascarenhas, N. D. A. (Principal Investigator)

1984-01-01

Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

11. Proceedings of the Second Annual Symposium on Mathematical Pattern Recognition and Image Analysis Program

NASA Technical Reports Server (NTRS)

Guseman, L. F., Jr. (Principal Investigator)

1984-01-01

Several papers addressing image analysis and pattern recognition techniques for satellite imagery are presented. Texture classification, image rectification and registration, spatial parameter estimation, and surface fitting are discussed.

12. New Light on Old Horizon: Constructing Mathematical Concepts, Underlying Abstraction Processes, and Sense Making Strategies

ERIC Educational Resources Information Center

Scheiner, Thorsten

2016-01-01

The initial assumption of this article is that there is an overemphasis on abstraction-from-actions theoretical approaches in research on knowing and learning mathematics. This article uses a critical reflection on research on students' ways of constructing mathematical concepts to distinguish between abstraction-from-actions theoretical…

13. An Examination of Connections in Mathematical Processes in Students' Problem Solving: Connections between Representing and Justifying

ERIC Educational Resources Information Center

Stylianou, Despina A.

2013-01-01

Representation and justification are two central "mathematical practices". In the past, each has been examined to gain insights in the functions that they have in students' mathematical problem solving. Here, we examine the ways that representation and justification interact and influence the development of one another. We focus on the…

14. From the Everyday, through the Inauthentic, to Mathematics: Reflection on the Process of Teaching from Contexts

ERIC Educational Resources Information Center

Sethole, Godfrey

2005-01-01

This paper highlights an attempt by two grade 8 teachers, Bulelwa and Kevin, to draw in the everyday in the teaching of mathematics. Though located in different South African contexts and settings, both teachers tend to enable their learners' access to mathematics by rendering the everyday inauthentic. I argue that inauthenticating the everyday is…

15. Modeling Scientific Processes with Mathematics Equations Enhances Student Qualitative Conceptual Understanding and Quantitative Problem Solving

ERIC Educational Resources Information Center

Schuchardt, Anita M.; Schunn, Christian D.

2016-01-01

Amid calls for integrating science, technology, engineering, and mathematics (iSTEM) in K-12 education, there is a pressing need to uncover productive methods of integration. Prior research has shown that increasing contextual linkages between science and mathematics is associated with student problem solving and conceptual understanding. However,…

16. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

NASA Technical Reports Server (NTRS)

Masuoka, E.; Rose, J.; Quattromani, M.

1981-01-01

Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

17. A system and mathematical framework to model shear flow effects in biomedical DW-imaging and spectroscopy‡

PubMed Central

Nevo, Uri; Özarslan, Evren; Komlosh, Michal E.; Koay, Cheng Guan; Sarlls, Joelle E.; Basser, Peter J.

2014-01-01

The pulsed-field gradient (PFG) MR experiment enables one to measure particle displacements, velocities, and even higher moments of complex fluid motions. In diffusion-weighted MRI (DWI) in living tissue, where the PFG MRI experiment is used to measure diffusion, Brownian motion is assumed to dominate the displacements causing the observed signal loss. However, motions of water molecules caused by various active biological processes occurring at different length and time scales may also cause additional dephasing of magnetization and signal loss. To help understand their relative effects on the DWI signal attenuation, we used an integrated experimental and theoretical framework: a Rheo-NMR, which served as an experimental model system to precisely prescribe a microscopic velocity distribution; and a mathematical model that relates the DW signal intensity in the Rheo-NMR to experimental parameters that characterize the impressed velocity field. A technical innovation reported here is our use of 'natural' (in this case, polar) coordinates both to simplify the description of the fluid motion within the Couette cell of the Rheo-NMR and to acquire and reconstruct magnitude and phase MR images obtained within it. We use this integrated model system to demonstrate how shear flow appears as pseudo-diffusion in magnitude DW MR signals obtained using PFG spin-echo (PGSE) NMR and MRI sequences. Our results lead us to reinterpret the possible causes of signal loss in DWI in vivo, in particular to revise and generalize the previous notion of intra-voxel incoherent motion (IVIM) in order to describe activity-driven flows that appear as pseudo-diffusion over multiple length and time scales in living tissues. PMID:20886564

18. Airy-Kaup-Kupershmidt filters applied to digital image processing

Hoyos Yepes, Laura Cristina

2015-09-01

The Kaup-Kupershmidt operator is applied to the two-dimensional solution of the Airy-diffusion equation, and the resulting filter is applied to images via convolution. The full procedure is implemented in Maple using the ImageTools package. Experiments were performed on a wide range of images, including biomedical images generated by magnetic resonance, computerized axial tomography, positron emission tomography, infrared imaging, and photon diffusion. The Airy-Kaup-Kupershmidt filter can be used as a powerful edge detector and as a powerful enhancement tool in image processing. It is expected that the Airy-Kaup-Kupershmidt filter could be incorporated into standard image processing programs such as ImageJ.
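
The Airy-Kaup-Kupershmidt kernel itself comes from the equation's two-dimensional solution and is not given in the abstract; the mechanism, though, is ordinary 2-D convolution. A self-contained sketch, with a generic Laplacian edge-detection kernel standing in for the derived filter:

```python
def convolve2d(img, kernel):
    """Direct 2-D convolution with zero padding. Kernel flipping is
    omitted, which is harmless for symmetric kernels like the one below."""
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for i in range(kh):
                for j in range(kw):
                    yy, xx = y + i - oy, x + j - ox
                    if 0 <= yy < h and 0 <= xx < w:
                        s += img[yy][xx] * kernel[i][j]
            out[y][x] = s
    return out

laplacian = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]  # generic edge detector
img = [[5.0] * 5 for _ in range(5)]
edges = convolve2d(img, laplacian)
```

On a flat region the Laplacian response is zero; only intensity transitions (edges) produce non-zero output.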

19. Using quantum filters to process images of diffuse axonal injury

Pineda Osorio, Mateo

2014-06-01

Images corresponding to diffuse axonal injury (DAI) are processed using several quantum filters, such as Hermite, Weibull, and Morse filters. Diffuse axonal injury is a particular, common, and severe form of traumatic brain injury (TBI): it involves global damage to brain tissue at the microscopic scale and causes serious neurological abnormalities. New imaging techniques provide excellent images showing the cellular damage associated with DAI. These images can be processed with quantum filters, which resolve dendritic and axonal structures at high resolution in both normal and pathological states. Using the Laplacian operators of the new quantum filters, excellent edge detectors for neurofiber resolution are obtained. The quantum processing of DAI images is carried out using computer algebra, specifically Maple. The construction of quantum-filter plugins that could be incorporated into the ImageJ software package, making the method simpler for medical personnel to use, is proposed as a line of future research.

20. The Development of Sun-Tracking System Using Image Processing

PubMed Central

Lee, Cheng-Dar; Huang, Hong-Cheng; Yeh, Hong-Yih

2013-01-01

This article presents the development of an image-based Sun position sensor and an algorithm for aiming precisely at the Sun using image processing. Four-quadrant light sensors and bar-shadow photo sensors have been used to detect the Sun's position in the past, but neither can maintain high accuracy under low-irradiation conditions; an image-based Sun position sensor with image processing can address this drawback. To verify the performance of the Sun-tracking system, comprising an image-based Sun position sensor and a tracking controller with an embedded image processing algorithm, we established a Sun-image tracking platform and tested its performance in the laboratory. The results show that the proposed Sun-tracking system can overcome unstable tracking in cloudy weather and achieve a tracking accuracy of 0.04°. PMID:23615582
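
The abstract does not give the sensor algorithm, but the core of any image-based Sun sensor is locating the solar disc in the frame; a hedged sketch using an intensity-weighted centroid (the threshold value is an assumption):

```python
def sun_centroid(img, threshold=200):
    """Intensity-weighted centroid of above-threshold pixels; the offset
    of the centroid from the image center gives the pointing error that
    a tracking controller would drive to zero."""
    sx = sy = w = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if v >= threshold:
                sx += x * v
                sy += y * v
                w += v
    if w == 0:
        return None                 # Sun not found (e.g., heavy cloud)
    return (sx / w, sy / w)

img = [[0] * 5 for _ in range(5)]
img[2][3] = 255                     # bright blob right of center
cx, cy = sun_centroid(img)
```

The `None` case models the cloudy-weather condition in which the tracker must fall back to its last known position rather than chase noise.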

1. Stochastic Process Underlying Emergent Recognition of Visual Objects Hidden in Degraded Images

PubMed Central

Murata, Tsutomu; Hamada, Takashi; Shimokawa, Tetsuya; Tanifuji, Manabu; Yanagida, Toshio

2014-01-01

When a degraded two-tone image such as a “Mooney” image is seen for the first time, it is unrecognizable in the initial seconds. The recognition of such an image is facilitated by giving prior information on the object, which is known as top-down facilitation and has been intensively studied. Even in the absence of any prior information, however, we experience sudden perception of the emergence of a salient object after continued observation of the image, whose processes remain poorly understood. This emergent recognition is characterized by a comparatively long reaction time ranging from seconds to tens of seconds. In this study, to explore this time-consuming process of emergent recognition, we investigated the properties of the reaction times for recognition of degraded images of various objects. The results show that the time-consuming component of the reaction times follows a specific exponential function related to levels of image degradation and subject's capability. Because generally an exponential time is required for multiple stochastic events to co-occur, we constructed a descriptive mathematical model inspired by the neurophysiological idea of combination coding of visual objects. Our model assumed that the coincidence of stochastic events complement the information loss of a degraded image leading to the recognition of its hidden object, which could successfully explain the experimental results. Furthermore, to see whether the present results are specific to the task of emergent recognition, we also conducted a comparison experiment with the task of perceptual decision making of degraded images, which is well known to be modeled by the stochastic diffusion process. The results indicate that the exponential dependence on the level of image degradation is specific to emergent recognition. The present study suggests that emergent recognition is caused by the underlying stochastic process which is based on the coincidence of multiple stochastic events

2. Image processing techniques used in a large-FOV compound eye imaging system

Cao, Axiu; Shi, Lifang; Shi, Ruiying; Deng, Qiling; Du, Chunlei

2012-11-01

Biological inspiration has produced successful solutions for different imaging systems. Inspired by the compound eyes of insects, this paper presents image processing techniques used in a spherical compound eye imaging system. By analyzing the relationship between a system with a large field of view (FOV) and each individual lens, an imaging system based on compound eyes has been designed in which 37 lenses pointing in different directions are arranged on a spherical substrate. An image processing technique is proposed that relates each lens position to the geometrical shape of the corresponding image in order to realize large-FOV detection. To verify the technique, experiments were carried out on the designed compound eye imaging system; the results show that an image with a FOV over 166° can be acquired while maintaining excellent image quality.

3. Study on the improvement of overall optical image quality via digital image processing

Tsai, Cheng-Mu; Fang, Yi Chin; Lin, Yu Chin

2008-12-01

This paper studies the improvement of overall optical image quality via Digital Image Processing (DIP) and compares the processed optical image with the unprocessed one. From the standpoint of the optical system alone, image quality is strongly affected by chromatic and monochromatic aberrations. However, complete image capture systems, such as cell phones and digital cameras, include not only the basic optical system but also many other components, such as the electronic circuitry and the transducer, whose quality directly affects the quality of the final picture. In this work, digital image processing is therefore used to improve the overall image. Experiments show that the system modulation transfer function (MTF) obtained by applying the proposed DIP technology to a comparatively poor optical system can be comparable to, and possibly even superior to, the system MTF of a good optical system.

4. Magnetic Resonance Current Density Imaging of Chemical Processes and Reactions

Beravs, Katarina; Demšar, Alojz; Demsar, Franci

1999-03-01

Electric current density imaging was used to image conductivity changes that occur as a chemical process or reaction progresses. Feasibility was assessed in two models representing the dissolving of an ionic solid and the formation of an insoluble precipitate. In both models, temporal and spatial changes in ionic concentrations were obtained on current density images. As expected, the images showed significant signal enhancement along the ionization/dissociation sites.

5. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

ERIC Educational Resources Information Center

Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

2016-01-01

We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

6. How to Classify the Diversity of Seventh Grade Students' Mathematical Process Skills: An Application of Latent Profile Analysis

ERIC Educational Resources Information Center

Kaosa-ard, Chanapat; Erawan, Waraporn; Damrongpanit, Suntonrapot; Suksawang, Poonpong

2015-01-01

The researcher applied latent profile analysis to study the difference of the students' mathematical process skill. These skills are problem solving skills, reasoning skills, communication and presentation skills, connection knowledge skills, and creativity skills. Samples were 2,485 seventh-grade students obtained from Multi-stage Random…

7. The Process of Making Meaning: The Interplay between Teachers' Knowledge of Mathematical Proofs and Their Classroom Practices

ERIC Educational Resources Information Center

2009-01-01

The purpose of this study was to investigate and describe how middle school mathematics teachers "make meaning" of proofs and the process of proving in the context of their classroom practices. A framework of "making meaning," created by the researcher, guided the data collection and analysis phases of the study. This framework describes the five…

8. A Functional mathematical index for predicting effects of food processing on eight sweet potato(Ipomoea batatas)cultivars

Technology Transfer Automated Retrieval System (TEKTRAN)

In this paper we apply an improved functional mathematical index (FMI), modified from those presented in previous publications, to define the influence of different cooking processes of eight sweet potato (Ipomoea batatas) cultivars on composition of six bioactive phenolic compounds (flavonoids). Th...

9. IPL processing of the Viking orbiter images of Mars

NASA Technical Reports Server (NTRS)

Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.

1977-01-01

The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.

10. High resolution image processing on low-cost microcomputers

NASA Technical Reports Server (NTRS)

Miller, R. L.

1993-01-01

Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program which offers sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide-range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.

11. The system integration of image processing

Chen, Qi-xing; Wu, Qin-zhang; Gao, Xiao-dong; Ren, Guo-qiang

2008-03-01

An integration system was designed for the remote communication of optical and electronic detection systems; it was built around a programmable DSP and an FPGA chip, in addition to a few Application-Specific Integrated Circuits (ASICs). It performs image binarization, image enhancement, data encryption, image compression encoding, channel encoding, data interleaving, etc., and the algorithms for these functions can easily be renewed or updated. With a CCD color camera as the signal source, experiments were performed on a platform with one DSP chip and one FPGA chip. The FPGA chip mainly handles the reconstruction of the image's brightness signal and the generation of the various timing signals, while the DSP chip handles the remaining functions. The image compression algorithms are based on the discrete cosine transform (DCT) and the discrete wavelet transform (DWT), respectively. The experimental results show that the developed platform is flexible, programmable, and reconfigurable; the integration system is well suited to the remote communication of optical and electronic detection systems.
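
The DCT compression path mentioned above rests on the standard 2-D DCT-II. A naive, self-contained reference implementation (a real DSP build would use a fast factorized routine; this is only for illustration):

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of an N x N block: the transform behind
    DCT-based image compression (energy compacts into low frequencies,
    which are then quantized and entropy-coded)."""
    n = len(block)
    def alpha(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

coeffs = dct2([[2.0] * 4 for _ in range(4)])
```

A flat block compacts entirely into the DC coefficient, with all AC coefficients zero; that is the property quantization exploits.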

12. Image processing of globular clusters - Simulation for deconvolution tests (GlencoeSim)

Blazek, Martin; Pata, Petr

2016-10-01

This paper presents an algorithmic approach to efficiency tests of deconvolution algorithms in astronomical image processing. Because of the noise in astronomical data, there is no certainty that a mathematically exact result of stellar deconvolution exists, so iterative methods, or other methods such as aperture or PSF-fitting photometry, are commonly used. Iterative methods are important chiefly for crowded fields (e.g., globular clusters). To test the efficiency of these iterative methods on various stellar fields, information about the true fluxes of the sources is essential. For this purpose, a simulator of artificial images of crowded stellar fields provides ground-truth source fluxes for a robust statistical comparison of deconvolution methods. The "GlencoeSim" simulator and the algorithms presented in this paper consider various settings of point-spread functions, noise types, and spatial distributions, with the aim of producing as realistic an astronomical optical stellar image as possible.
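
A simulator of this kind reduces to placing PSFs with known fluxes and adding noise; a minimal sketch (the Gaussian PSF and Gaussian read noise are simplifying assumptions here, whereas GlencoeSim supports several PSF and noise models):

```python
import math
import random

def simulate_field(size, n_stars, fwhm=3.0, noise_sigma=1.0, seed=0):
    """Generate a crowded artificial star field: Gaussian PSFs with known
    fluxes plus Gaussian read noise. Returns (image, true_sources) so a
    deconvolution method's recovered fluxes can be scored against truth."""
    rng = random.Random(seed)
    sigma = fwhm / 2.3548                   # FWHM -> Gaussian sigma
    img = [[0.0] * size for _ in range(size)]
    truth = []
    for _ in range(n_stars):
        x0 = rng.uniform(0, size - 1)
        y0 = rng.uniform(0, size - 1)
        flux = rng.uniform(100.0, 1000.0)
        truth.append((x0, y0, flux))
        norm = flux / (2 * math.pi * sigma ** 2)
        for y in range(size):
            for x in range(size):
                r2 = (x - x0) ** 2 + (y - y0) ** 2
                img[y][x] += norm * math.exp(-r2 / (2 * sigma ** 2))
    for y in range(size):                   # add read noise last
        for x in range(size):
            img[y][x] += rng.gauss(0.0, noise_sigma)
    return img, truth

img, truth = simulate_field(16, 3)
```

The returned `truth` list is exactly the "initial information on source fluxes" the abstract describes as essential for comparing deconvolution methods.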

13. Protocols for Image Processing based Underwater Inspection of Infrastructure Elements

O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; Pakrashi, Vikram

2015-07-01

Image processing can be an important tool for inspecting underwater infrastructure elements such as bridge piers and pile wharves. Underwater inspection often relies on the visual descriptions of divers, who are not necessarily trained in the specifics of structural degradation, so the information may be vague, prone to error, or open to significant variation in interpretation. Underwater vehicles, on the other hand, can be quite expensive to deploy for such inspections. Additionally, there is now significant encouragement globally towards the deployment of more offshore renewable wind turbines and wave devices, and the requirement for underwater inspection can be expected to increase significantly in the coming years. While the merit of image-processing-based assessment of the condition of underwater structures is understood to a certain degree, there is no existing protocol for such image-based methods. This paper describes an image processing protocol for the underwater inspection of structures. A stereo-imaging method is considered, and protocols are suggested for image storage, imaging, diving, and inspection. A combined underwater imaging protocol is finally presented that can be used in a variety of situations across a range of image scenes and environmental conditions affecting the imaging. An example of detecting marine growth on a structure in Cork Harbour, Ireland, is presented.

14. Development of automatic hologram synthesizer for medical use III: image processing for original medical images

Yamamoto, Toshifumi; Misaki, Toshikazu; Kato, Tsutomu

1992-05-01

An image processing system for providing original images for synthesizing multiplex holograms is developed. This system reconstructs 3D surface rendering images of internal organs and/or bones of a patient from a series of tomograms such as computed tomography. Image processing includes interpolation, enhancement, extraction of diseased parts, selection of axis of projection, and compensation of distortions. This paper presents the features of this system, along with problems and resolutions encountered in actual test operation at hospitals.

15. Resolution modification and context based image processing for retinal prosthesis

Naghdy, Golshah; Beston, Chris; Seo, Jong-Mo; Chung, Hum

2006-08-01

This paper focuses on simulating image processing algorithms and explores issues related to reducing high-resolution images to the 25 x 25 pixels suitable for the retinal implant. The field of view (FoV) is explored, and a novel method of virtual eye movement is discussed. Several issues beyond the normal model of human vision are addressed through context-based processing.
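
Reducing a camera frame to the 25 x 25 prosthesis grid can be done by block averaging; a minimal sketch (block averaging is an assumption for illustration — the paper evaluates its own resolution-modification algorithms):

```python
def downsample(img, out_h=25, out_w=25):
    """Reduce an image to out_h x out_w by averaging each source block:
    the simplest resolution reduction a prosthesis front end might use."""
    h, w = len(img), len(img[0])
    out = []
    for i in range(out_h):
        row = []
        y0, y1 = i * h // out_h, (i + 1) * h // out_h
        for j in range(out_w):
            x0, x1 = j * w // out_w, (j + 1) * w // out_w
            vals = [img[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

small = downsample([[4.0] * 50 for _ in range(50)])
```

Integer block boundaries computed this way handle source sizes that are not exact multiples of 25 without dropping pixels.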

16. Image Processing In Laser-Beam-Steering Subsystem

NASA Technical Reports Server (NTRS)

Lesh, James R.; Ansari, Homayoon; Chen, Chien-Chung; Russell, Donald W.

1996-01-01

Conceptual design of image-processing circuitry developed for proposed tracking apparatus described in "Beam-Steering Subsystem For Laser Communication" (NPO-19069). In proposed system, desired frame rate achieved by "windowed" readout scheme in which only pixels containing and surrounding two spots read out and others skipped without being read. Image data processed rapidly and efficiently to achieve high frequency response.

17. From Image to Text: Using Images in the Writing Process

ERIC Educational Resources Information Center

Andrzejczak, Nancy; Trainin, Guy; Poldberg, Monique

2005-01-01

This study looks at the benefits of integrating visual art creation and the writing process. The qualitative inquiry uses student, parent, and teacher interviews coupled with field observation, and artifact analysis. Emergent coding based on grounded theory clearly shows that visual art creation enhances the writing process. Students used more…

18. Mathematical modelling of flow and transport processes in tissue engineering bioreactors

Waters, Sarah; Pearson, Natalie; Oliver, James; Shipley, Rebecca

2014-11-01

To engineer tissues artificially, numerous biophysical and biochemical processes must be integrated to produce tissues with the desired in vivo properties. Tissue engineering bioreactors are cell culture systems that aim to mimic the in vivo environment. We consider a hollow fibre membrane bioreactor (HFMB), which utilises fluid flow to enhance the delivery of growth factors and nutrients to, and the removal of metabolites from, the cells, as well as to provide appropriate mechanical stimuli. Biological tissues comprise a wide variety of interacting components, and multiphase models provide a natural framework for investigating such interactions. We present a suite of mathematical models (capturing different experimental setups) that describe the fluid flow, solute transport, and cell yield and distribution within a HFMB. The governing equations are simplified by exploiting the slender geometry of the bioreactor, so that, for example, lubrication theory may be used to describe flow in the lumen. We interrogate the models to illustrate the typical behaviour of each setup in turn and highlight the dependence of the results on key experimentally controllable parameters. Once validated, such models can be used to inform and direct future experiments.

19. Automated Processing of LASCO Coronal Images: Spurious Point-Source-Filtering and Missing-Blocks Correction

Pagot, E.; Lamy, P.; Llebaria, A.; Boclet, B.

2014-04-01

We report on automated procedures for correcting the images of the LASCO coronagraph for i) spurious quasi-point-sources such as the impacts of cosmic rays, stars, and planets, and ii) the absence of signal due to transmission errors or dropouts, which results in blocks of missing information in the images. Correcting these undesirable artifacts is mandatory for all quantitative work on the solar corona that requires data inversion and/or long series of images. The nonlinear filtering of spike noise and point-like objects is based on mathematical morphology and implements the procedure of opening by morphological reconstruction; however, a simple opening filter is applied whenever the fractional area of corrupted pixels exceeds 50% of the original image. We describe different strategies for reconstructing the missing information blocks. In general, it is possible to apply the method of averaged neighbors using the two images obtained immediately before and after the corrupted image. For the other cases, in particular when missing blocks overlap in three consecutive images, we developed an original procedure of weighted interpolation along radial profiles from the center of the Sun that intercept the missing block(s). This procedure is also suitable for the saturated images of bright planets (such as Venus) that bleed into neighboring pixels. Missing blocks in polarized images can generally be reconstructed using the associated unpolarized image of the same format, but for overlapping missing blocks we again apply our weighted-interpolation procedure. All tests performed on numerous LASCO-C2 images at various periods of solar activity (i.e., varying complexity of coronal structure) demonstrate the excellent performance of these new procedures, with results vastly superior to the methods implemented so far in the pipeline processing of the LASCO images.
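
The "averaged neighbors" reconstruction for missing blocks is straightforward to sketch (frame names and the mask representation below are assumptions; the paper's radial weighted interpolation for overlapping gaps is more involved):

```python
def fill_missing_block(prev_img, next_img, img, mask):
    """Averaged-neighbors correction: every pixel flagged missing in
    `mask` is replaced by the mean of the temporally adjacent frames.
    This covers the simple case only; blocks missing in overlapping
    positions across consecutive frames need a different strategy."""
    out = [row[:] for row in img]
    for y, row in enumerate(mask):
        for x, missing in enumerate(row):
            if missing:
                out[y][x] = (prev_img[y][x] + next_img[y][x]) / 2.0
    return out
```

Pixels not flagged in the mask pass through untouched, so the correction never alters valid coronal signal.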

20. Advanced technology development for image gathering, coding, and processing

NASA Technical Reports Server (NTRS)

Huck, Friedrich O.

1990-01-01

Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

1. Linguistic Processing in a Mathematics Tutoring System: Cooperative Input Interpretation and Dialogue Modelling

Wolska, Magdalena; Buckley, Mark; Horacek, Helmut; Kruijff-Korbayová, Ivana; Pinkal, Manfred

Formal domains, such as mathematics, require exact language to communicate the intended content. Special symbolic notations are used to express the semantics precisely, compactly, and unambiguously. Mathematical textbooks offer plenty of examples of concise, accurate presentations. This effective communication is enabled by interleaved use of formulas and natural language. Since natural language interaction has been shown to be an important factor in the efficiency of human tutoring [29], it would be desirable to enhance interaction with Intelligent Tutoring Systems for mathematics by allowing elements of mixed language combining the exactness of formal expressions with natural language flexibility.

2. Constructing mathematical models for simulating the technological processes in thermal power equipment on the basis of statistical approximation methods

Kolchev, K. K.; Mezin, S. V.

2015-07-01

A technique for constructing mathematical models that simulate the technological processes in thermal power equipment, developed on the basis of the statistical approximation method, is described. The method was used in a software module (plug-in) intended for calculating nonlinear mathematical models of gas turbine units and for diagnosing them. The mathematical models constructed using this module describe the current state of a system: deviations of the system's actual state from the estimate obtained with the mathematical model point to malfunctions in the operation of this system. The multidimensional interpolation and approximation method and the theory of random functions serve as the theoretical basis of the developed technique. By using the developed technique it is possible to construct complex static models of plants that are subject to control and diagnostics. The module developed using the proposed technique makes it possible to carry out periodic diagnostics of the operating equipment to reveal deviations from its normal mode of operation. The specific features of constructing such mathematical models are considered, and examples of their application to observations obtained on gas-turbine-unit equipment are given.
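As a rough illustration of the statistical-approximation idea, the sketch below fits a quadratic surrogate of a steady-state plant map by least squares and flags an observation that deviates from it. The input variables, the "healthy" response, and the abnormal measurement are all invented for the example; the paper's actual approximation machinery is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 2))       # e.g. load and ambient temperature
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2   # "healthy" steady-state response

def design(X):
    """Quadratic polynomial basis in two variables."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1 * x2, x2**2])

# least-squares fit of the surrogate model from observations
coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

def predict(X):
    return design(X) @ coef

# diagnostics: a measurement far from the model estimate suggests a malfunction
x_new = np.array([[0.5, 0.5]])
measured = 10.0                                 # abnormal reading (illustrative)
residual = abs(predict(x_new)[0] - measured)
```

The diagnostic decision then reduces to comparing `residual` against a tolerance derived from the scatter of the healthy training data.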

3. Review of biomedical signal and image processing

PubMed Central

2013-01-01

This article is a review of the book “Biomedical Signal and Image Processing” by Kayvan Najarian and Robert Splinter, which is published by CRC Press, Taylor & Francis Group. It will evaluate the contents of the book and discuss its suitability as a textbook, while mentioning highlights of the book, and providing comparison with other textbooks.

4. MOPEX: a software package for astronomical image processing and visualization

Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

2006-06-01

We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many of its functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software

5. STEM Images Revealing STEM Conceptions of Pre-Service Chemistry and Mathematics Teachers

ERIC Educational Resources Information Center

Akaygun, Sevil; Aslan-Tutak, Fatma

2016-01-01

Science, technology, engineering, and mathematics (STEM) education has been an integral part of many countries' educational policies. In last decade, various practices have been implemented to make STEM areas valuable for 21st century generation. These actions require reconsideration of both pre- and in-service teacher education because those who…

6. Differences between Experts' and Students' Conceptual Images of the Mathematical Structure of Taylor Series Convergence

ERIC Educational Resources Information Center

Martin, Jason

2013-01-01

Taylor series convergence is a complicated mathematical structure which incorporates multiple concepts. Therefore, it can be very difficult for students to initially comprehend. How might students make sense of this structure? How might experts make sense of this structure? To answer these questions, an exploratory study was conducted using…

7. Assessment of vessel diameters for MR brain angiography processed images

Moraru, Luminita; Obreja, Cristian-Dragos; Moldovanu, Simona

2015-12-01

The motivation was to develop an assessment method to measure (in)visible differences between the original and the processed images in MR brain angiography as a way of evaluating the status of the vessel segments (i.e. the existence of occlusions or intracerebral vessels damaged by aneurysms). Generally, the image quality is limited, so we improve the performance of the evaluation through digital image processing. The goal is to determine the processing method that allows the most accurate assessment of patients with cerebrovascular diseases. A total of 10 MR brain angiography images were processed with the following techniques: histogram equalization, Wiener filtering, linear contrast adjustment, contrast-limited adaptive histogram equalization, bias correction and the Marr-Hildreth filter. Each original image and its processed versions were analyzed in a stacking procedure so that the same vessel and its corresponding diameter were measured in all of them. Original and processed images were evaluated by measuring the vessel diameter (in pixels) along an established direction and at a precise anatomic location. The vessel diameter was calculated using an ImageJ plugin. Mean diameter measurements differ significantly across the same segment and for different processing techniques. The best results are provided by the Wiener filter and linear contrast adjustment methods, and the worst by the Marr-Hildreth filter.
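One of the listed processing steps (contrast-limited adaptive histogram equalization) and a pixel-count diameter measurement can be mocked up as follows. The "vessel" image is synthetic, the thresholding rule is an illustrative stand-in for the manual measurement, and the study itself used an ImageJ plugin rather than this code:

```python
import numpy as np
from skimage import exposure

# hypothetical angiography-like slice: a bright vertical "vessel" 4 px wide
img = np.zeros((64, 64))
img[:, 30:34] = 0.6
img += 0.05 * np.random.default_rng(2).random(img.shape)

# contrast-limited adaptive histogram equalization (CLAHE)
clahe = exposure.equalize_adapthist(img, clip_limit=0.03)

def vessel_diameter(row, frac=0.5):
    """Diameter in pixels along one image row: count pixels brighter
    than a fraction of the row maximum (a crude stand-in for the
    manual measurement along an established direction)."""
    return int((row > frac * row.max()).sum())

d_raw = vessel_diameter(img[32])
d_clahe = vessel_diameter(clahe[32])
```

Comparing `d_raw` against the diameter measured on each processed version is the essence of the paper's evaluation protocol.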

8. Graphical user interface for image acquisition and processing

DOEpatents

Goldberg, Kenneth A.

2002-01-01

An event-driven GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

9. Optical Processing of Speckle Images with Bacteriorhodopsin for Pattern Recognition

NASA Technical Reports Server (NTRS)

Downie, John D.; Tucker, Deanne (Technical Monitor)

1994-01-01

Logarithmic processing of images with multiplicative noise characteristics can be utilized to transform the image into one with an additive noise distribution. This simplifies subsequent image processing steps for applications such as image restoration or correlation for pattern recognition. One particularly common form of multiplicative noise is speckle, for which the logarithmic operation not only produces additive noise, but also makes it of constant variance (signal-independent). We examine the optical transmission properties of some bacteriorhodopsin films here and find them well suited to implement such a pointwise logarithmic transformation optically in a parallel fashion. We present experimental results of the optical conversion of speckle images into transformed images with additive, signal-independent noise statistics using the real-time photochromic properties of bacteriorhodopsin. We provide an example of improved correlation performance in terms of correlation peak signal-to-noise for such a transformed speckle image.
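The variance-stabilizing effect of the logarithm on speckle can be checked numerically. Here the speckle is modeled as unit-mean exponential multiplicative noise, an illustrative assumption; the bacteriorhodopsin film performs this pointwise transformation optically, whereas the sketch does it in software:

```python
import numpy as np

rng = np.random.default_rng(3)
signal = np.linspace(1.0, 10.0, 10_000)                  # varying scene intensity
speckle = rng.exponential(scale=1.0, size=signal.size)   # multiplicative noise
observed = signal * speckle

# the log turns multiplicative, signal-dependent noise into additive noise
# of (approximately) constant variance:
log_obs = np.log(observed)            # = log(signal) + log(speckle)
noise = log_obs - np.log(signal)      # additive residual, independent of signal
```

Before the transform, the residual `observed - signal` has a spread proportional to the signal; after it, the residual `noise` has the same distribution at every signal level, which is exactly what makes subsequent correlation or restoration steps simpler.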

10. Partial difference operators on weighted graphs for image processing on surfaces and point clouds.

PubMed

Lozes, Francois; Elmoataz, Abderrahim; Lezoray, Olivier

2014-09-01

Partial difference equations (PDEs) and variational methods for image processing on Euclidean domains are very well established because they make it possible to solve a large range of real computer vision problems. With the recent advent of many 3D sensors, there is a growing interest in transposing and solving PDEs on surfaces and point clouds. In this paper, we propose a simple method to solve such PDEs using the framework of PDEs on graphs. This latter approach enables us to transcribe, for surfaces and point clouds, many models and algorithms designed for image processing. To illustrate our proposal, three problems are considered: (1) p-Laplacian restoration and inpainting; (2) PDE-based mathematical morphology; and (3) active-contour segmentation. PMID:25020095
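The p = 2 special case of such graph processing, heat diffusion driven by the graph Laplacian of a k-nearest-neighbor graph built on a point cloud, can be sketched as follows. The Gaussian weights, neighborhood size, and step size are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
pts = rng.random((100, 2))                         # a small "point cloud"
f = pts[:, 0] + 0.2 * rng.standard_normal(100)     # noisy signal on the vertices

# symmetric k-NN graph with Gaussian edge weights
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
k = 8
idx = np.argsort(d2, axis=1)[:, 1:k + 1]           # skip self (column 0)
W = np.zeros((100, 100))
rows = np.repeat(np.arange(100), k)
W[rows, idx.ravel()] = np.exp(-d2[rows, idx.ravel()] / 0.05)
W = np.maximum(W, W.T)                             # symmetrize
deg = W.sum(1)

def smooth(f, steps=50, tau=0.3):
    """Explicit graph-Laplacian diffusion: f <- f - (tau/deg_max) L f,
    with L = D - W; the step is small enough to be stable."""
    for _ in range(steps):
        f = f + tau * (W @ f - deg * f) / deg.max()
    return f

g = smooth(f)
```

Each iteration strictly decreases the graph Dirichlet energy f^T L f (the p = 2 restoration functional), while the total mass of f is conserved because the rows of L sum to zero.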

11. The Khoros software development environment for image and signal processing.

PubMed

Konstantinides, K; Rasure, J R

1994-01-01

Data flow visual language systems allow users to graphically create a block diagram of their applications and interactively control input, output, and system variables. Khoros is an integrated software development environment for information processing and visualization. It is particularly attractive for image processing because of its rich collection of tools for image and digital signal processing. This paper presents a general overview of Khoros with emphasis on its image processing and DSP tools. Various examples are presented and the future direction of Khoros is discussed. PMID:18291923

12. Measurement of glucose concentration by image processing of thin film slides

Piramanayagam, Sankaranaryanan; Saber, Eli; Heavner, David

2012-02-01

Measurement of glucose concentration is important for diagnosis and treatment of diabetes mellitus and other medical conditions. This paper describes a novel image-processing based approach for measuring glucose concentration. A fluid drop (patient sample) is placed on a thin film slide. Glucose, present in the sample, reacts with reagents on the slide to produce a color dye. The color intensity of the dye formed varies with glucose at different concentration levels. Current methods use spectrophotometry to determine the glucose level of the sample. Our proposed algorithm uses an image of the slide, captured at a specific wavelength, to automatically determine glucose concentration. The algorithm consists of two phases: training and testing. Training datasets consist of images at different concentration levels. The dye-occupied image region is first segmented using a Hough based technique and then an intensity based feature is calculated from the segmented region. Subsequently, a mathematical model that describes a relationship between the generated feature values and the given concentrations is obtained. During testing, the dye region of a test slide image is segmented followed by feature extraction. These two initial steps are similar to those done in training. However, in the final step, the algorithm uses the model (feature vs. concentration) obtained from the training and feature generated from test image to predict the unknown concentration. The performance of the image-based analysis was compared with that of a standard glucose analyzer.
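The training-phase calibration, fitting a model from dye-intensity features to known concentrations and then inverting it on a test feature, might look like the sketch below. The intensity/concentration pairs are fabricated for illustration, and a quadratic polynomial stands in for whatever model the authors actually fit:

```python
import numpy as np

# training data: mean dye-region intensity at known glucose concentrations
conc = np.array([50.0, 100.0, 150.0, 200.0, 250.0])    # mg/dL (invented)
intensity = np.array([0.82, 0.65, 0.51, 0.40, 0.31])   # darker dye = more glucose

# fit an intensity -> concentration calibration curve (quadratic, illustrative)
model = np.polyfit(intensity, conc, deg=2)

def predict_concentration(feature):
    """Map an intensity feature from a segmented test-slide region to a
    predicted glucose concentration via the fitted calibration curve."""
    return float(np.polyval(model, feature))

c = predict_concentration(0.55)
```

In the paper's testing phase, the feature would come from the Hough-segmented dye region of a new slide image; everything else is identical to this lookup.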

13. Image processing system to analyze droplet distributions in sprays

NASA Technical Reports Server (NTRS)

Bertollini, Gary P.; Oberdier, Larry M.; Lee, Yong H.

1987-01-01

An image processing system was developed which automatically analyzes the size distributions in fuel-spray video images. Images are generated by using pulsed laser light to freeze droplet motion in the spray sample volume under study. This coherent illumination source produces images which contain droplet diffraction patterns representing the droplets' degree of focus. The analysis is performed by extracting feature data describing the droplet diffraction patterns in the images. This allows the system to distinguish droplets from image anomalies and measure only those droplets considered in focus. Unique features of the system are the totally automated analysis and droplet feature measurement from the grayscale image. The feature extraction and image restoration algorithms used in the system are described. Preliminary performance data are also given for two experiments. One compares a synthesized distribution measured manually and automatically. The second compares a real spray distribution measured using current methods against the automatic system.
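Once in-focus droplets have been segmented, the sizing step reduces to labeling connected components and converting areas to equivalent diameters. The toy sketch below shows only that final step on a synthetic image; the diffraction-pattern focus discrimination that is the heart of the system is skipped entirely:

```python
import numpy as np
from scipy import ndimage

# synthetic spray frame: two bright in-focus "droplets" on a dark background
img = np.zeros((64, 64))
img[10:14, 10:14] = 1.0      # 4x4-pixel droplet
img[30:36, 40:46] = 1.0      # 6x6-pixel droplet
img += 0.05 * np.random.default_rng(5).random(img.shape)

binary = img > 0.5
labels, n = ndimage.label(binary)                       # connected components
areas = ndimage.sum(binary, labels, index=range(1, n + 1))
diameters = 2.0 * np.sqrt(areas / np.pi)                # equivalent-circle diameter
```

A histogram of `diameters` over many frames is the size distribution the system reports.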

14. Processing of polarimetric SAR images. Final report

SciTech Connect

Warrick, A.L.; Delaney, P.A.

1995-09-01

The objective of this work was to develop a systematic method of combining multifrequency polarized SAR images. It is shown that the traditional methods of correlation, hard targets, and template matching fail to produce acceptable results. Hence, a new algorithm was developed and tested. The new approach combines the three traditional methods with an interpolation method. An example is shown that demonstrates the new algorithm's performance. The results are summarized, and suggestions for future research are presented.
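One ingredient of such registration, FFT-based cross-correlation for estimating the integer shift between two images, can be sketched as follows. This is a generic building block on synthetic data, not the report's combined algorithm:

```python
import numpy as np

def estimate_shift(ref, moving):
    """Estimate the cyclic integer (row, col) shift taking `ref` onto
    `moving` by locating the peak of their FFT cross-correlation."""
    r = ref - ref.mean()
    m = moving - moving.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(r) * np.conj(np.fft.fft2(m))))
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    return tuple((-peak) % np.array(corr.shape))

rng = np.random.default_rng(6)
ref = rng.random((64, 64))
moving = np.roll(ref, shift=(5, 9), axis=(0, 1))   # known displacement
shift = estimate_shift(ref, moving)
```

On real multifrequency SAR data this simple correlator is exactly what "fails to produce acceptable results" on its own, which motivates combining it with hard-target and interpolation cues as the report does.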

15. Processing ISS Images of Titan's Surface

NASA Technical Reports Server (NTRS)

Perry, Jason; McEwen, Alfred; Fussner, Stephanie; Turtle, Elizabeth; West, Robert; Porco, Carolyn; Knowles, Ben; Dawson, Doug

2005-01-01

One of the primary goals of the Cassini-Huygens mission, in orbit around Saturn since July 2004, is to understand the surface and atmosphere of Titan. Surface investigations are primarily accomplished with RADAR, the Visual and Infrared Mapping Spectrometer (VIMS), and the Imaging Science Subsystem (ISS) [1]. The latter two use methane "windows", regions in Titan's reflectance spectrum where its atmosphere is most transparent, to observe the surface. For VIMS, this produces clear views of the surface near 2 and 5 microns [2]. ISS uses a narrow continuum band filter (CB3) at 938 nanometers. While these methane windows provide our best views of the surface, the images produced are not as crisp as ISS images of satellites like Dione and Iapetus [3] due to the atmosphere. Given a reasonable estimate of contrast (approx. 30%), the apparent resolution of features is approximately 5 pixels due to the effects of the atmosphere and the Modulation Transfer Function of the camera [1,4]. The atmospheric haze also reduces contrast, especially with increasing emission angles [5].

16. Image processing of underwater multispectral imagery

USGS Publications Warehouse

2003-01-01

Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.

17. Distributed image processing for automatic target recognition

Cozien, Roger F.

2001-02-01

Our purpose is, in medium term, to detect in air images, characteristic shapes and objects such as airports, industrial plants, planes, tanks, trucks, . with great accuracy and low rate of mistakes. However, we also want to value whether the link between neural networks and multi-agents systems is relevant and effective. If it appears to be really effective, we hope to use this kind of technology in other fields. That would be an easy and convenient way to depict and to use the agents' knowledge which is distributed and fragmented. After a first phase of preliminary tests to know if agents are able to give relevant information to a neural network, we verify that only a few agents running on an image are enough to inform the network and let it generalize the agents' distributed and fragmented knowledge. In a second phase, we developed a distributed architecture allowing several multi-agents systems running at the same time on different computers with different images. All those agents send information to a "multi neural networks system" whose job is to identify the shapes detected by the agents. The name we gave to our project is Jarod.

18. Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry

NASA Technical Reports Server (NTRS)

Hong, Yie-Ming

1973-01-01

Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.
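For Poisson-distributed photoelectron counts like those described above, one common software-side trick is the Anscombe transform, which approximately stabilizes the variance so that machinery designed for additive Gaussian noise applies. This is a standard technique offered as context, not necessarily the restoration method of the thesis; a quick numerical check:

```python
import numpy as np

def anscombe(counts):
    """Anscombe transform: maps Poisson counts to values whose variance
    is approximately 1, independent of the underlying mean."""
    return 2.0 * np.sqrt(counts + 3.0 / 8.0)

rng = np.random.default_rng(8)
dim = rng.poisson(5.0, size=100_000)       # low-illuminance region
bright = rng.poisson(50.0, size=100_000)   # high-illuminance region

v_dim = anscombe(dim).var()
v_bright = anscombe(bright).var()
```

The raw variances differ by a factor of ten (Poisson variance equals the mean), while the transformed variances are both close to 1, so a single Gaussian-noise filter can be applied across the whole image.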

19. A model for simulation and processing of radar images

NASA Technical Reports Server (NTRS)

Stiles, J. A.; Frost, V. S.; Shanmugam, K. S.; Holtzman, J. C.

1981-01-01

A model for the recording, processing, presentation, and analysis of radar images in digital form is presented. The observed image is represented as having two random components. One models the variation due to the coherent addition of electromagnetic energy scattered from different objects in the illuminated area; this component is referred to as fading. The other is a representation of the terrain variation, which can be described as the actual signal the radar is attempting to measure. The combination of these two components describes radar images as the output of a linear space-variant filter operating on the product of the fading and terrain random processes. In addition, the model is applied to a digital image processing problem through the design and implementation of a scene-enhancement procedure. Finally, parallel approaches are being pursued as possible means of solving other processing problems such as SAR image map-matching, data compression, and pattern recognition.
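The two-component model, terrain multiplied by a unit-mean fading process, is easy to simulate, and multilook averaging shows why the fading component is treated as noise. The exponential fading law and the scene below are illustrative assumptions, not the paper's specific model parameters:

```python
import numpy as np

rng = np.random.default_rng(9)

# slowly varying "terrain" signal and unit-mean multiplicative fading
terrain = np.outer(np.linspace(1.0, 4.0, 64), np.ones(64))
fading = rng.exponential(1.0, size=(64, 64))
observed = terrain * fading              # the product-model radar image

# multilook processing: incoherent averaging of N independent looks
# reduces the fading fluctuations around the terrain signal
N = 16
looks = terrain[None] * rng.exponential(1.0, size=(N, 64, 64))
multilook = looks.mean(axis=0)
```

Averaging N looks shrinks the fading standard deviation by roughly 1/sqrt(N), which is the classic trade of spatial resolution for radiometric accuracy in SAR processing.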

20. Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis

NASA Technical Reports Server (NTRS)

Guseman, L. F., Jr.

1985-01-01

Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.