Image analysis and mathematical modelling for the supervision of the dough fermentation process
NASA Astrophysics Data System (ADS)
Zettel, Viktoria; Paquet-Durand, Olivier; Hecker, Florian; Hitzmann, Bernd
2016-10-01
The fermentation (proofing) of dough is one of the quality-determining steps in the production of baked goods. Besides the fluffiness, the foundations of which are laid during fermentation, the flavour of the final product is strongly influenced at this production stage. Until now, however, no on-line measurement system has been available to supervise this important process step. This investigation evaluates the potential of an image analysis system that enables determination of the volume of fermenting dough pieces. The camera moves around the fermenting pieces and collects images of them from different angles (over a 360° range). Image analysis algorithms then determine the volume increase of the individual dough pieces. The fermentation process is supervised on the basis of a detailed mathematical description of the volume increase, which is based on the Bernoulli equation, the carbon dioxide production rate of the yeast cells, and the diffusion of carbon dioxide. Important process parameters, such as the carbon dioxide production rate of the yeast cells and the dough viscosity, can be estimated after just 300 s of proofing. The mean percentage error for forecasting the further evolution of the relative volume of the dough pieces is just 2.3%. A forecast of the further evolution can therefore be performed and used for fault detection.
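A toy sketch (not the authors' model) of how such a forecast might look: assume, hypothetically, that retained CO2 inflates the dough at a rate proportional to a constant production rate, giving an exponential relative-volume curve, and score forecasts with a mean percentage error as in the abstract.

```python
import numpy as np

def relative_volume(t, q_co2=0.004, v0=1.0):
    # Hypothetical toy model: retained CO2 inflates the dough piece at a
    # rate proportional to a constant production rate q_co2, so relative
    # volume grows exponentially. This is only a stand-in for the
    # authors' Bernoulli-based description; q_co2 is an assumed value.
    return v0 * np.exp(q_co2 * np.asarray(t, dtype=float))

def mean_percentage_error(predicted, observed):
    # Mean percentage error between forecast and measured relative volume.
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 100.0 * np.mean(np.abs(predicted - observed) / observed)
```

In the paper, such parameters are estimated from the first 300 s of image data rather than assumed.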
Processing of microCT implant-bone systems images using Fuzzy Mathematical Morphology
NASA Astrophysics Data System (ADS)
Bouchet, A.; Colabella, L.; Omar, S.; Ballarre, J.; Pastore, J.
2016-04-01
The relationship between a metallic implant and the existing bone in a permanent surgical prosthesis is of great importance, since the fixation and osseointegration of the system determine the failure or success of the surgery. Micro computed tomography (microCT) is a technique that helps to visualize the structure of the bone. In this study, microCT is used to analyze implant-bone system images. However, one of the problems presented in the reconstruction of these images is the effect of iron-based implants: a halo, or fluorescence scattering, distorts the microCT image and leads to poor 3D reconstructions. In this work we introduce an automatic method, based on the application of Compensatory Fuzzy Mathematical Morphology, for eliminating the effect of AISI 316L iron materials in implant-bone system images, to support future investigation of the structural and mechanical properties of bone and cancellous materials.
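For orientation, a minimal sketch of classical grayscale erosion, the crisp special case that fuzzy morphological operators generalize; the paper's compensatory fuzzy operators are more elaborate than this.

```python
import numpy as np

def gray_erode(img, size=3):
    # Grayscale erosion with a flat size x size structuring element:
    # each output pixel is the minimum over its neighborhood. This is
    # the crisp limit of fuzzy morphological erosion; edge padding
    # keeps the borders neutral.
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + size, j:j + size].min()
    return out
```

Erosion suppresses small bright structures such as halo artifacts, which is why morphological operators are natural tools for this cleanup task.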
Investigating Teachers' Images of Mathematics
ERIC Educational Resources Information Center
Sterenberg, Gladys
2008-01-01
Research suggests that understanding new images of mathematics is very challenging and can contribute to teacher resistance. An explicit exploration of personal views of mathematics may be necessary for pedagogical change. One possible way for exploring these images is through mathematical metaphors. As metaphors focus on similarities, they can be…
Semantic Processing of Mathematical Gestures
ERIC Educational Resources Information Center
Lim, Vanessa K.; Wilson, Anna J.; Hamm, Jeff P.; Phillips, Nicola; Iwabuchi, Sarina J.; Corballis, Michael C.; Arzarello, Ferdinando; Thomas, Michael O. J.
2009-01-01
Objective: To examine whether or not university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) similarly to the way they process action gestures and sentences. Semantic processing was indexed by the N400 effect. Results: The N400 effect elicited by words primed with mathematical gestures…
Mathematical Methods for Diffusion MRI Processing
Lenglet, C.; Campbell, J.S.W.; Descoteaux, M.; Haro, G.; Savadjiev, P.; Wassermann, D.; Anwander, A.; Deriche, R.; Pike, G.B.; Sapiro, G.; Siddiqi, K.; Thompson, P.
2009-01-01
In this article, we review recent mathematical models and computational methods for the processing of diffusion Magnetic Resonance Images, including state-of-the-art reconstruction of diffusion models, cerebral white matter connectivity analysis, and segmentation techniques. We focus on Diffusion Tensor Images (DTI) and Q-Ball Images (QBI). PMID:19063977
Images, Anxieties and Attitudes toward Mathematics
ERIC Educational Resources Information Center
Belbase, Shashidhar
2010-01-01
Images, anxieties, and attitudes towards mathematics are of common interest among mathematics teachers, teacher educators, and researchers. The main purpose of this literature-review-based paper is to discuss and analyze images, anxieties, and attitudes towards mathematics in order to foster meaningful teaching and learning of mathematics. Images of…
Images, Anxieties, and Attitudes toward Mathematics
ERIC Educational Resources Information Center
Belbase, Shashidhar
2013-01-01
The purpose of this paper is to discuss and analyze images, anxieties, and attitudes towards mathematics in order to foster meaningful teaching and learning of mathematics. Images of mathematics seem to be profoundly shaped by the epistemological, philosophical, and pedagogical perspectives of one who views mathematics either as a priori or a…
Image Processing for Teaching.
ERIC Educational Resources Information Center
Greenberg, R.; And Others
1993-01-01
The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…
Image processing and reconstruction
Chartrand, Rick
2012-06-15
This talk will examine some mathematical methods for image processing and the solution of underdetermined, linear inverse problems. The talk will have a tutorial flavor, mostly accessible to undergraduates, while still presenting research results. The primary approach is the use of optimization problems. We will find that relaxing the usual assumption of convexity will give us much better results.
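One standard nonconvex approach of this kind is iterative hard thresholding, which enforces an l0 sparsity constraint directly instead of the usual convex l1 relaxation; the following is a minimal sketch of that generic technique, not the speaker's specific method.

```python
import numpy as np

def iht(A, b, k, iters=200, step=None):
    # Iterative hard thresholding for the underdetermined system A x = b:
    # gradient step on ||A x - b||^2, then keep only the k largest-
    # magnitude entries (a nonconvex l0 constraint). k is the assumed
    # sparsity level of the solution.
    n = A.shape[1]
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # safe step from spectral norm
    x = np.zeros(n)
    for _ in range(iters):
        x = x + step * A.T @ (b - A @ x)
        keep = np.argsort(np.abs(x))[-k:]
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0
    return x
```

Such nonconvex constraints typically recover sparse signals from fewer measurements than l1 minimization, which is the talk's central point about relaxing convexity.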
NASA Technical Reports Server (NTRS)
1982-01-01
Images are prepared from data acquired by the multispectral scanner aboard Landsat, which views Earth in four ranges of the electromagnetic spectrum: two visible bands and two infrared. The scanner picks up radiation from ground objects and converts the radiation signatures to digital signals, which are relayed to Earth and recorded on tape. Each tape contains "pixels," or picture elements, covering a ground area; computerized equipment processes the tapes and plots each pixel, line by line, to produce the basic image. The image can be further processed to correct sensor errors, to heighten contrast for feature emphasis, or to enhance the end product in other ways. A key factor in the conversion of digital data to visual form is the precision of the processing equipment. Jet Propulsion Laboratory prepared a digital mosaic that was plotted and enhanced by Optronics International, Inc. using the company's C-4300 Colorwrite, a high-precision, high-speed system that manipulates and analyzes digital data and presents it in visual form on film. Optronics manufactures a complete family of image enhancement processing systems to meet all users' needs. Enhanced imagery is useful to geologists, hydrologists, land use planners, agricultural specialists, geographers, and others.
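The contrast-heightening step mentioned above is, in its simplest form, a linear stretch of raw counts onto the display range; a minimal sketch:

```python
import numpy as np

def contrast_stretch(pixels, lo=None, hi=None):
    # Linear contrast stretch of raw scanner counts onto the 0..255
    # display range, the simplest of the enhancement operations
    # described above. lo/hi default to the image min/max.
    p = np.asarray(pixels, dtype=float)
    lo = p.min() if lo is None else lo
    hi = p.max() if hi is None else hi
    if hi == lo:  # flat image: nothing to stretch
        return np.zeros_like(p, dtype=np.uint8)
    return np.clip((p - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)
```

Production systems such as the one described use more elaborate, sensor-calibrated mappings, but the principle is the same.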
Mathematical refocusing of images in electronic holography
Stetson, Karl A.
2009-07-01
This paper presents an illustration of mathematical refocusing of images obtained by the HoloFringe300K electronic holography program. The purpose is to demonstrate that this form of electronic holography is equivalent to image-plane, phase-stepped digital holography. The mathematical refocusing method used here differs from those in common use and may have some advantages.
Workbook, Basic Mathematics and Wastewater Processing Calculations.
ERIC Educational Resources Information Center
New York State Dept. of Environmental Conservation, Albany.
This workbook serves as a self-learning guide to basic mathematics and treatment plant calculations and also as a reference and source book for the mathematics of sewage treatment and processing. In addition to basic mathematics, the workbook discusses processing and process control, laboratory calculations and efficiency calculations necessary in…
Mathematical Modeling: A Structured Process
ERIC Educational Resources Information Center
Anhalt, Cynthia Oropesa; Cortez, Ricardo
2015-01-01
Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…
Gifted Students' Metaphor Images about Mathematics
ERIC Educational Resources Information Center
Arikan, Elif Esra; Unal, Hasan
2015-01-01
The aim of this study is to investigate the metaphor images of gifted students about mathematics. The sample of the study consists of 82 gifted students in grades 2 through 7 from Istanbul. Data were collected by asking students to complete the sentence: "Mathematics is as …, because…". In the study content analysis was…
Kuo, Chung-Feng Jeffrey; Chu, Yueng-Hsiang; Wang, Po-Chun; Lai, Chun-Yu; Chu, Wen-Lin; Leu, Yi-Shing; Wang, Hsing-Won
2013-12-01
The human larynx is an important organ for voice production and respiratory mechanisms. The vocal cords are approximated for voice production and open for breathing. The videolaryngoscope is widely used for vocal cord examination. At present, physicians usually diagnose vocal cord diseases by manually selecting the image in which the vocal cords open to the largest extent (abduction), thus maximally exposing the vocal cord lesion. On the other hand, the severity of diseases such as vocal cord palsy and vocal cord atrophy largely depends on the vocal cords closing to the smallest extent (adduction). Diseases can therefore be assessed from the image of the vocal cords at maximum opening, while the seriousness of a breathy voice is closely correlated with the gap between the vocal cords at minimum closure. The aim of this study was to design an automatic vocal cord image selection system to improve the conventional selection process performed by physicians and enhance diagnostic efficiency. In addition, because the examination process yields unwanted fuzzy images caused by human factors, as well as non-vocal cord images, texture analysis is added in this study: image entropy is measured to establish a screening and elimination system that effectively enhances the accuracy of selecting the image of the vocal cords closing to the smallest extent.
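The entropy screening step can be sketched as the Shannon entropy of the gray-level histogram (blurred or featureless frames tend to have lower entropy than sharp ones); a minimal illustration, not the authors' exact implementation:

```python
import numpy as np

def image_entropy(img, bins=256):
    # Shannon entropy (bits) of the gray-level histogram. Frames with
    # little texture -- e.g. defocused or non-vocal-cord images -- tend
    # to concentrate their histogram in few bins and score low.
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins; 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())
```

A screening system would then discard frames whose entropy falls below a threshold before the adduction/abduction frame selection.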
A Mathematical Framework for Image Analysis
1991-08-01
The results reported here were derived from the research project "A Mathematical Framework for Image Analysis" supported by the Office of Naval Research, contract N00014-88-K-0289 to Brown University. A common theme for the work reported is the use of probabilistic methods for problems in image analysis and image reconstruction. Five areas of research are described: rigid body recognition using a decision tree/combinatorial approach; nonrigid…
ERIC Educational Resources Information Center
Yilmaz, Suha; Tekin-Dede, Ayse
2016-01-01
Mathematization competency is considered in the field as the focus of modelling process. Considering the various definitions, the components of the mathematization competency are determined as identifying assumptions, identifying variables based on the assumptions and constructing mathematical model/s based on the relations among identified…
A Unified Mathematical Approach to Image Analysis.
1987-08-31
describes four instances of the paradigm in detail. Directions for ongoing and future research are also indicated. Keywords: Image processing; Algorithms; Segmentation; Boundary detection; Tomography; Global image analysis.
Seeram, Euclid
2004-01-01
Digital image processing is now commonplace in radiology, nuclear medicine and sonography. This article outlines underlying principles and concepts of digital image processing. After completing this article, readers should be able to: List the limitations of film-based imaging. Identify major components of a digital imaging system. Describe the history and application areas of digital image processing. Discuss image representation and the fundamentals of digital image processing. Outline digital image processing techniques and processing operations used in selected imaging modalities. Explain the basic concepts and visualization tools used in 3-D and virtual reality imaging. Recognize medical imaging informatics as a new area of specialization for radiologic technologists.
Self and Peer Assessment of Mathematical Processes
ERIC Educational Resources Information Center
Onion, Alice; Javaheri, Elnaz
2011-01-01
This article explores using Bowland assessment tasks and Nuffield Applying Mathematical Processes (AMP) activities as part of a scheme of work. The Bowland tasks and Nuffield AMP activities are designed to develop students' mathematical thinking; they are focused on key processes. Unfamiliar demands are made on the students and they are challenged…
NASA Astrophysics Data System (ADS)
Berry, Richard
1994-04-01
Today's personal computers are more powerful than the mainframes that processed images during the early days of space exploration. We have entered an age in which anyone can do image processing. Topics covering the following aspects of image processing are discussed: digital-imaging basics, image calibration, image analysis, scaling, spatial enhancements, and compositing.
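The image calibration step mentioned above commonly means dark-frame subtraction followed by flat-field division; a minimal sketch under those standard assumptions:

```python
import numpy as np

def calibrate(raw, dark, flat):
    # Standard CCD calibration: subtract the dark frame (sensor offset
    # and thermal signal), then divide by the normalized flat field to
    # remove pixel-to-pixel sensitivity and vignetting.
    flat_corr = flat - dark
    flat_corr = flat_corr / flat_corr.mean()
    return (raw - dark) / flat_corr
```

The remaining topics in the article (scaling, spatial enhancement, compositing) operate on the calibrated frames this step produces.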
Mathematical Problems in Imaging in Random Media
2015-01-15
AFRL-OSR-VA-TR-2015-0030. Mathematical Problems in Imaging in Random Media. Beatrice Riviere, William Marsh Rice University, Houston, TX. Final Report, 01/15/2015. … Liliana Borcea, University of Michigan, as a subcontractor to Rice University. DOD: Air Force Office of Scientific Research. Publicly available. … Students advised: 1. Wang Yingpei, Rice University, PhD 2014. Thesis topic: imaging in high-contrast media. Now at Oracle, San Francisco. 2. …
Mathematics from Still and Moving Images
ERIC Educational Resources Information Center
Pierce, Robyn; Stacey, Kaye; Ball, Lynda
2005-01-01
Digital photos and digital movies offer an excellent way of bringing real world situations into the mathematics classroom. The technologies surveyed here are feasible for everyday classroom use and inexpensive. Examples are drawn from the teaching of Cartesian coordinates, linear functions, ratio and Pythagoras' theorem using still images, and…
Iterative Processes in Mathematics Education
ERIC Educational Resources Information Center
Mudaly, Vimolan
2009-01-01
There are many arguments that reflect on inductive versus deductive methods in mathematics. Claims are often made that teaching from the general to the specific does make understanding better for learners or vice versa. I discuss an intervention conducted with Grade 10 (15-year-old) learners in a small suburb in South Africa. I reflect on the…
An Emergent Framework: Views of Mathematical Processes
ERIC Educational Resources Information Center
Sanchez, Wendy B.; Lischka, Alyson E.; Edenfield, Kelly W.; Gammill, Rebecca
2015-01-01
The findings reported in this paper were generated from a case study of teacher leaders at a state-level mathematics conference. Investigation focused on how participants viewed the mathematical processes of communication, connections, representations, problem solving, and reasoning and proof. Purposeful sampling was employed to select nine…
Sources of mathematical thinking: behavioral and brain-imaging evidence.
Dehaene, S; Spelke, E; Pinel, P; Stanescu, R; Tsivkin, S
1999-05-07
Does the human capacity for mathematical intuition depend on linguistic competence or on visuo-spatial representations? A series of behavioral and brain-imaging experiments provides evidence for both sources. Exact arithmetic is acquired in a language-specific format, transfers poorly to a different language or to novel facts, and recruits networks involved in word-association processes. In contrast, approximate arithmetic shows language independence, relies on a sense of numerical magnitudes, and recruits bilateral areas of the parietal lobes involved in visuo-spatial processing. Mathematical intuition may emerge from the interplay of these brain systems.
NASA Technical Reports Server (NTRS)
Gunther, F. J.
1986-01-01
Apple Image-Processing Educator (AIPE) explores the ability of microcomputers to provide personalized computer-assisted instruction (CAI) in digital image processing of remotely sensed images. AIPE is a "proof-of-concept" system, not a polished production system. User-friendly prompts provide access to explanations of common features of digital image processing and of sample programs that implement these features.
A Mathematical Analysis of a Biology Process
NASA Astrophysics Data System (ADS)
Juratoni, A.; Bundǎu, O.; Chevereşan, A.
2010-09-01
We present a mathematical model of tumor growth with an immune response. We analyze the problem of maximizing the effects of the immunotherapy while minimizing the number of tumor cells and the cost of the control, which is given by the medical treatment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon. We give an existence result and prove the necessary conditions for the optimal control problem.
Industrial applications of process imaging and image processing
NASA Astrophysics Data System (ADS)
Scott, David M.; Sunshine, Gregg; Rosen, Lou; Jochen, Ed
2001-02-01
Process imaging is the art of visualizing events inside closed industrial processes. Image processing is the art of mathematically manipulating digitized images to extract quantitative information about such processes. Ongoing advances in camera and computer technology have made it feasible to apply these abilities to measurement needs in the chemical industry. To illustrate the point, this paper describes several applications developed at DuPont, where a variety of measurements are based on in-line, at-line, and off-line imaging. Application areas include compounding, melt extrusion, crystallization, granulation, media milling, and particle characterization. Polymer compounded with glass fiber is evaluated by a patented radioscopic (real-time X-ray imaging) technique to measure concentration and dispersion uniformity of the glass. Contamination detection in molten polymer (important for extruder operations) is provided by both proprietary and commercial on-line systems. Crystallization in production reactors is monitored using in-line probes and flow cells. Granulation is controlled by at-line measurements of granule size obtained from image processing. Tomographic imaging provides feedback for improved operation of media mills. Finally, particle characterization is provided by a robotic system that measures individual size and shape for thousands of particles without human supervision. Most of these measurements could not be accomplished with other (non-imaging) techniques.
Multispectral imaging and image processing
NASA Astrophysics Data System (ADS)
Klein, Julie
2014-02-01
The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given of several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by the filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of the multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics, like stereo multispectral imaging and goniometric multispectral measurements, that are further explored with his expertise will also be presented in this work.
Hyperspectral image processing methods
Technology Transfer Automated Retrieval System (TEKTRAN)
Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...
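A minimal example of the spectral band math such pipelines perform is a normalized-difference index computed from two bands of the cube; the band positions here are hypothetical placeholders, not values from the text.

```python
import numpy as np

def band_ratio_index(cube, b_nir, b_red):
    # Normalized-difference index from two bands of a hyperspectral
    # cube shaped (height, width, bands). b_nir and b_red are
    # illustrative band indices chosen by the caller; the small
    # epsilon guards against division by zero.
    nir = cube[..., b_nir].astype(float)
    red = cube[..., b_red].astype(float)
    return (nir - red) / (nir + red + 1e-12)
```

Real hyperspectral workflows combine many such per-pixel spectral operations with spatial processing over the full cube.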
Reconstruction of a Collaborative Mathematical Learning Process
ERIC Educational Resources Information Center
Pijls, Monique; Dekker, Rijkje; Van Hout-Wolters, Bernadette
2007-01-01
The study focused on the interaction between two secondary school students while they were working on computerized mathematical investigation tasks related to probability theory. The aim was to establish how such interaction helped the students to learn from one another, and how it may have hindered their learning process. The assumption was that…
Mathematics of Information Processing and the Internet
ERIC Educational Resources Information Center
Hart, Eric W.
2010-01-01
The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
Huang, H K
1981-01-01
Biomedical image processing is a very broad field; it covers everything from biomedical signal gathering, image forming, picture processing, and image display to medical diagnosis based on features extracted from images. This article reviews the topic in both its fundamentals and its applications. In its fundamentals, some basic image processing techniques, including outlining, deblurring, noise cleaning, filtering, search, classical analysis and texture analysis, are reviewed together with examples. State-of-the-art image processing systems are introduced and discussed in two categories: general-purpose image processing systems and image analyzers. For these systems to be effective for biomedical applications, special biomedical image processing languages have to be developed. The combination of hardware and software leads to clinical imaging devices. Two different types of clinical imaging devices are discussed. There are radiological imaging modalities, which include radiography, thermography, ultrasound, nuclear medicine and CT. Among these, thermography is the most noninvasive but is limited in application due to the low energy of its source. X-ray CT is excellent for static anatomical images and is moving toward the measurement of dynamic function, whereas nuclear imaging is moving toward organ metabolism and ultrasound toward tissue physical characteristics. Heart imaging is one of the most interesting and challenging research topics in biomedical image processing; current methods, including the invasive technique of cineangiography and the noninvasive ultrasound, nuclear medicine, transmission, and emission CT methodologies, are reviewed. Two current federally funded research projects in heart imaging, the dynamic spatial reconstructor and the dynamic cardiac three-dimensional densitometer, should bring some fruitful results in the near future. Microscopic imaging technique is very different from the radiological imaging technique in the sense that
Differential morphology and image processing.
Maragos, P
1996-01-01
Image processing via mathematical morphology has traditionally used geometry to intuitively understand morphological signal operators and set or lattice algebra to analyze them in the space domain. We provide a unified view and analytic tools for morphological image processing that is based on ideas from differential calculus and dynamical systems. This includes ideas on using partial differential or difference equations (PDEs) to model distance propagation or nonlinear multiscale processes in images. We briefly review some nonlinear difference equations that implement discrete distance transforms and relate them to numerical solutions of the eikonal equation of optics. We also review some nonlinear PDEs that model the evolution of multiscale morphological operators and use morphological derivatives. Among the new ideas presented, we develop some general 2-D max/min-sum difference equations that model the space dynamics of 2-D morphological systems (including the distance computations) and some nonlinear signal transforms, called slope transforms, that can analyze these systems in a transform domain in ways conceptually similar to the application of Fourier transforms to linear systems. Thus, distance transforms are shown to be bandpass slope filters. We view the analysis of the multiscale morphological PDEs and of the eikonal PDE solved via weighted distance transforms as a unified area in nonlinear image processing, which we call differential morphology, and briefly discuss its potential applications to image processing and computer vision.
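The discrete distance transforms that the paper relates to min-sum difference equations can be sketched with the classic two-pass (chamfer) scheme using city-block weights:

```python
import numpy as np

def distance_transform(binary):
    # Two-pass min-sum (chamfer) distance transform with city-block
    # weights: a discrete difference-equation scheme of the kind the
    # paper relates to numerical solutions of the eikonal equation.
    # binary is a boolean array; True marks the feature (zero-distance) set.
    h, w = binary.shape
    INF = h + w  # larger than any achievable city-block distance
    d = np.where(binary, 0, INF).astype(int)
    for i in range(h):                 # forward (causal) pass
        for j in range(w):
            if i > 0:
                d[i, j] = min(d[i, j], d[i - 1, j] + 1)
            if j > 0:
                d[i, j] = min(d[i, j], d[i, j - 1] + 1)
    for i in range(h - 1, -1, -1):     # backward (anti-causal) pass
        for j in range(w - 1, -1, -1):
            if i < h - 1:
                d[i, j] = min(d[i, j], d[i + 1, j] + 1)
            if j < w - 1:
                d[i, j] = min(d[i, j], d[i, j + 1] + 1)
    return d
```

In the paper's slope-transform view, this min-sum recursion acts as a bandpass slope filter; weighted variants approximate eikonal solutions more closely.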
Pre-Service Mathematics Teachers' Concept Images of Radian
ERIC Educational Resources Information Center
Akkoc, Hatice
2008-01-01
This study investigates pre-service mathematics teachers' concept images of radian and possible sources of such images. A multiple-case study was conducted for this study. Forty-two pre-service mathematics teachers completed a questionnaire, which aims to assess their understanding of radian. Six of them were selected for individual interviews on…
Phase in Optical Image Processing
NASA Astrophysics Data System (ADS)
Naughton, Thomas J.
2010-04-01
The use of phase has a long standing history in optical image processing, with early milestones being in the field of pattern recognition, such as VanderLugt's practical construction technique for matched filters, and (implicitly) Goodman's joint Fourier transform correlator. In recent years, the flexibility afforded by phase-only spatial light modulators and digital holography, for example, has enabled many processing techniques based on the explicit encoding and decoding of phase. One application area concerns efficient numerical computations. Pushing phase measurement to its physical limits, designs employing the physical properties of phase have ranged from the sensible to the wonderful, in some cases making computationally easy problems easier to solve and in other cases addressing mathematics' most challenging computationally hard problems. Another application area is optical image encryption, in which, typically, a phase mask modulates the fractional Fourier transformed coefficients of a perturbed input image, and the phase of the inverse transform is then sensed as the encrypted image. The inherent linearity that makes the system so elegant mitigates against its use as an effective encryption technique, but we show how a combination of optical and digital techniques can restore confidence in that security. We conclude with the concept of digital hologram image processing, and applications of same that are uniquely suited to optical implementation, where the processing, recognition, or encryption step operates on full field information, such as that emanating from a coherently illuminated real-world three-dimensional object.
Apple Image Processing Educator
NASA Technical Reports Server (NTRS)
Gunther, F. J.
1981-01-01
A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.
NASA Technical Reports Server (NTRS)
1992-01-01
To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
The (Mathematical) Modeling Process in Biosciences
Torres, Nestor V.; Santos, Guido
2015-01-01
In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063
NASA Technical Reports Server (NTRS)
1986-01-01
Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.
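The contrast-enhancement routines in Mini-VICAR/IBIS are not described in detail; the simplest member of that family is a linear contrast stretch, sketched here in pure Python with a hypothetical helper name:

```python
def stretch_contrast(pixels, out_min=0, out_max=255):
    """Linearly remap pixel values so the darkest maps to out_min
    and the brightest to out_max (a basic contrast enhancement)."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:  # flat image: nothing to stretch
        return [out_min] * len(pixels)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

# Example: a low-contrast row of gray levels, e.g. from an angiogram
row = [100, 110, 120, 130, 140]
print(stretch_contrast(row))  # -> [0, 64, 128, 191, 255]
```

The same remapping applied to every row of a 2-D image spreads a narrow histogram over the full display range, which is what makes faint vessel boundaries visible.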
How Digital Image Processing Became Really Easy
NASA Astrophysics Data System (ADS)
Cannon, Michael
1988-02-01
In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of, or analyzing the contents of, images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic and the rapid increase in the number of commercial companies marketing digital image processing software and hardware.
NASA Technical Reports Server (NTRS)
Zolotukhin, V. G.; Kolosov, B. I.; Usikov, D. A.; Borisenko, V. I.; Mosin, S. T.; Gorokhov, V. N.
1980-01-01
A description of a batch of programs for the YeS-1040 computer, combined into an automated system for processing photographic (and video) images of the Earth's surface taken from spacecraft, is presented. Individual programs are presented with a detailed discussion of the algorithmic and programmatic facilities needed by the user. The basic principles for assembling the system, and the control programs, are included. The exchange format, within whose framework any programs recommended for the processing system will be catalogued in the future, is also described.
NASA Astrophysics Data System (ADS)
Bosio, M. A.
1990-11-01
ABSTRACT: A brief description of astronomical image processing software is presented. This software was developed on a Digital MicroVAX II computer system. Subject headings: DATA ANALYSIS - IMAGE PROCESSING
Mathematical modeling of the coating process.
Toschkoff, Gregor; Khinast, Johannes G
2013-12-05
Coating of tablets is a common unit operation in the pharmaceutical industry. In most cases, the final product must meet strict quality requirements; to meet them, a detailed understanding of the coating process is required. To this end, numerous experimental studies have been performed. However, to acquire a mechanistic understanding, experimental data must be interpreted in the light of mathematical models. In recent years, a combination of analytical modeling and computational simulation has enabled deeper insights into the nature of the coating process. This paper presents an overview of modeling and simulation approaches to the coating process, covering various relevant aspects from scale-up considerations to coating mass uniformity investigations and models for drop atomization. The most important analytical and computational concepts are presented and the findings are compared.
Methods in Astronomical Image Processing
NASA Astrophysics Data System (ADS)
Jörsäter, S.
A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future
Mathematical Analysis and Optimization of Infiltration Processes
NASA Technical Reports Server (NTRS)
Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.
1997-01-01
A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.
Image processing occupancy sensor
Brackney, Larry J.
2016-09-27
A system and method of detecting occupants in a building automation system environment using image-based occupancy detection and position determination. In one example, the system includes an image processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the number and position of the occupants, the system can finely control these elements to optimize conditions for the occupants and optimize energy usage, among other advantages.
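The patent text does not disclose the detection algorithm; a common way to obtain occupant counts and positions from a fixed camera is background subtraction followed by connected-component counting, sketched below in pure Python on a made-up occupancy grid (thresholds and data are illustrative, not the patented method):

```python
def count_occupants(frame, background, threshold=30):
    """Count foreground blobs: pixels differing from the background
    by more than `threshold` are foreground; each 4-connected group
    of foreground pixels is counted as one occupant."""
    rows, cols = len(frame), len(frame[0])
    fg = [[abs(frame[r][c] - background[r][c]) > threshold
           for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if fg[r][c] and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]          # flood-fill this blob
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and fg[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blobs

bg = [[0] * 6 for _ in range(4)]
frame = [[0,  0,  0, 0,  0, 0],
         [0, 90, 90, 0,  0, 0],
         [0,  0,  0, 0, 80, 0],
         [0,  0,  0, 0, 80, 0]]
print(count_occupants(frame, bg))  # -> 2
```

The blob centroids (not computed here) would give the positions that drive per-zone lighting and ventilation control.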
Stochastic processes, estimation theory and image enhancement
NASA Technical Reports Server (NTRS)
Assefi, T.
1978-01-01
An introductory account of stochastic processes, estimation theory, and image enhancement is presented. The book is primarily intended for first-year graduate students and practicing engineers and scientists whose work requires an acquaintance with the theory. Fundamental concepts of probability that are required to support the main topics are reviewed. The appendices discuss the remaining mathematical background.
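As a one-line illustration of the estimation theory such a book covers: for a zero-mean signal observed in independent zero-mean noise, the linear minimum-mean-square-error (LMMSE) estimate scales the observation by the ratio of signal variance to total variance. A sketch (notation mine, not the book's):

```python
def lmmse_estimate(y, var_signal, var_noise):
    """Linear MMSE estimate of a zero-mean signal x from y = x + n:
    x_hat = var_x / (var_x + var_n) * y.
    High noise shrinks the estimate toward the prior mean (zero)."""
    gain = var_signal / (var_signal + var_noise)
    return gain * y

print(lmmse_estimate(4.0, var_signal=1.0, var_noise=3.0))  # -> 1.0
```

Applying this gain pixel-by-pixel (with locally estimated variances) is the core of classical Wiener-type image denoising.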
Collective Mathematical Understanding as an Improvisational Process
ERIC Educational Resources Information Center
Martin, Lyndon C.; Towers, Jo
2003-01-01
This paper explores the phenomenon of mathematical understanding, and offers a response to the question raised by Martin (2001) at PME-NA about the possibility for and nature of collective mathematical understanding. In referring to collective mathematical understanding we point to the kinds of learning and understanding we may see occurring when…
NASA Astrophysics Data System (ADS)
Mastriani, Mario
2017-01-01
This paper presents a number of problems concerning the practical (real) implementation of the techniques known as quantum image processing. The most serious problem is the recovery of the outcomes after quantum measurement, which this work demonstrates is equivalent to a noise measurement and which is not considered in the literature on the subject. It is noteworthy that this is due to several factors: (1) a classical algorithm that uses Dirac's notation and is then coded in MATLAB does not constitute a quantum algorithm; (2) the literature emphasizes the internal representation of the image but says nothing about the classical-to-quantum and quantum-to-classical interfaces and how these are affected by decoherence; (3) the literature does not mention how to implement these proposed internal representations in a practical way (in the laboratory); and (4) given that quantum image processing works with generic qubits, it requires measurements along all axes of the Bloch sphere; among other factors. In return, the technique known as quantum Boolean image processing is mentioned, which works exclusively with computational basis states (CBS). This methodology avoids the problem of quantum measurement, which alters the measured results except in the case of CBS. What has been said so far extends to quantum algorithms outside image processing as well.
Adding Structure to the Transition Process to Advanced Mathematical Activity
ERIC Educational Resources Information Center
Engelbrecht, Johann
2010-01-01
The transition process to advanced mathematical thinking is experienced as traumatic by many students. The experiences that students have had of school mathematics differ greatly from what is expected of them at university. Success in school mathematics meant application of different methods to get an answer. Students are not familiar with logical…
Subdivision processes in mathematics and science
NASA Astrophysics Data System (ADS)
Stavy, Ruth; Tirosh, Dina
In the course of a research project now in progress, three successive division problems were presented to students in Grades 7-12. The first problem concerned a geometrical line segment, while the other two dealt with material substances (copper wire and water). All three problems involved the same process: successive division. Two of the problems (line segment and copper wire) were also figurally similar. Our data indicate that the similarity in the process had a profound effect on students' responses. The effect of the similarity in process suggests that the repeated process of division has a coercive effect, imposing itself on students' responses and encouraging them to view successive division processes as finite or infinite regardless of the content of the problem. It is possible to trace out, step by step, a more or less parallel process of development for the ideas of points and continuity and those dealing with atoms and physical objects in the child's conception of the ideal world. The only difference between these two processes is that to the child's way of thinking physical points or atoms still possess surface and volume, whereas mathematical points tend to lose all extension (though during the stages of development which concern us here, this remains only a tendency) (Piaget & Inhelder, 1948, p. 126). Our first naive impression of nature and matter is that of continuity. Be it a piece of matter or a volume of liquid, we invariably conceive it as divisible into infinity, and even so small a part of it appears to us to possess the same properties as the whole (Hilbert, 1925, p. 162).
NASA Astrophysics Data System (ADS)
Hou, H. S.
1985-07-01
An overview of the recent progress in the area of digital processing of binary images in the context of document processing is presented here. The topics covered include input scan, adaptive thresholding, halftoning, scaling and resolution conversion, data compression, character recognition, electronic mail, digital typography, and output scan. Emphasis has been placed on illustrating the basic principles rather than descriptions of a particular system. Recent technology advances and research in this field are also mentioned.
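Of the topics listed, adaptive thresholding is the easiest to make concrete: instead of one global level, each pixel is binarized against the mean of its local window, which keeps document text legible under uneven illumination. A minimal pure-Python sketch (window size and offset are arbitrary choices, not from the paper):

```python
def adaptive_threshold(img, window=1, offset=0):
    """Binarize: pixel -> 1 if it exceeds the mean of its
    (2*window+1)^2 neighbourhood minus `offset`, else 0."""
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [img[y][x]
                    for y in range(max(0, r - window), min(rows, r + window + 1))
                    for x in range(max(0, c - window), min(cols, c + window + 1))]
            local_mean = sum(vals) / len(vals)
            out[r][c] = 1 if img[r][c] > local_mean - offset else 0
    return out

# A bright mark on a uniform background is isolated by the local test
img = [[10, 10, 10],
       [10, 90, 10],
       [10, 10, 10]]
print(adaptive_threshold(img))  # -> [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
```

Production document scanners use integral images to compute the local means in constant time per pixel; the logic is otherwise the same.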
The Image of Mathematics Held by Irish Post-Primary Students
ERIC Educational Resources Information Center
Lane, Ciara; Stynes, Martin; O'Donoghue, John
2014-01-01
The image of mathematics held by Irish post-primary students was examined and a model for the image found was constructed. Initially, a definition for "image of mathematics" was adopted with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. Research…
NASA Technical Reports Server (NTRS)
Roth, D. J.; Hull, D. R.
1994-01-01
IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are sub-subroutines, also selected via keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.
Adding structure to the transition process to advanced mathematical activity
NASA Astrophysics Data System (ADS)
Engelbrecht, Johann
2010-03-01
The transition process to advanced mathematical thinking is experienced as traumatic by many students. The experiences that students have had of school mathematics differ greatly from what is expected of them at university. Success in school mathematics meant application of different methods to get an answer. Students are not familiar with the logical deductive reasoning required in advanced mathematics. It is necessary to assist students in this transition process of moving from general to mathematical thinking. In this article some structure is suggested for this transition period. The essay is an argumentative exposition supported by personal experience and the international literature, which makes the study theoretical rather than empirical.
Students' Images of Mathematics
ERIC Educational Resources Information Center
Martin, Lee; Gourley-Delaney, Pamela
2014-01-01
Students' judgments about "what counts" as mathematics in and out of school have important consequences for problem solving and transfer, yet our understanding of the source and nature of these judgments remains incomplete. Thirty-five sixth grade students participated in a study focused on what activities students judge as…
1975-09-30
Technical Journal, Vol. 36, pp. 653-709, May 1957. [Remainder of this record is garbled OCR; recoverable section headings: 4. Image Restoration and Enhancement Projects; Color Image Scanner Calibration.]
Image processing techniques for acoustic images
NASA Astrophysics Data System (ADS)
Murphy, Brian P.
1991-06-01
The primary goal of this research is to test the effectiveness of various image processing techniques applied to acoustic images generated in MATLAB. The simulated acoustic images have the same characteristics as those generated by a computer model of a high-resolution imaging sonar. Edge detection and segmentation are the two image processing techniques discussed in this study. The two methods tested are a modified version of Kalman filtering and median filtering.
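Median filtering, one of the two techniques tested, replaces each pixel by the median of its neighbourhood and is a classic remedy for the speckle typical of sonar imagery. A minimal 3x3 version in pure Python (edge pixels left unchanged for brevity):

```python
def median_filter3(img):
    """Replace each interior pixel with the median of its 3x3
    neighbourhood; suppresses isolated speckle/salt noise."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]          # copy; borders untouched
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = sorted(img[r + dr][c + dc]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1))
            out[r][c] = window[4]          # median of 9 values
    return out

# A single bright speckle in a uniform field is removed entirely
noisy = [[5, 5, 5],
         [5, 200, 5],
         [5, 5, 5]]
print(median_filter3(noisy)[1][1])  # -> 5
```

Unlike linear smoothing, the median preserves step edges, which is why it pairs well with subsequent edge detection on sonar returns.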
Processes and Priorities in Planning Mathematics Teaching
ERIC Educational Resources Information Center
Sullivan, Peter; Clarke, David J.; Clarke, Doug M.; Farrell, Lesley; Gerrard, Jessica
2013-01-01
Insights into teachers' planning of mathematics reported here were gathered as part of a broader project examining aspects of the implementation of the Australian curriculum in mathematics (and English). In particular, the responses of primary and secondary teachers to a survey of various aspects of decisions that inform their use of curriculum…
Mathematical Problem Solving through Sequential Process Analysis
ERIC Educational Resources Information Center
Codina, A.; Cañadas, M. C.; Castro, E.
2015-01-01
Introduction: The macroscopic perspective is one of the frameworks for research on problem solving in mathematics education. Coming from this perspective, our study addresses the stages of thought in mathematical problem solving, offering an innovative approach because we apply sequential relations and global interrelations between the different…
Retinomorphic image processing.
Ghosh, Kuntal; Bhaumik, Kamales; Sarkar, Sandip
2008-01-01
The present work is aimed at understanding and explaining some of the aspects of visual signal processing at the retinal level while exploiting the same towards the development of some simple techniques in the domain of digital image processing. Classical studies on retinal physiology revealed the nature of contrast sensitivity of the receptive field of bipolar or ganglion cells, which lie in the outer and inner plexiform layers of the retina. To explain these observations, a difference of Gaussian (DOG) filter was suggested, which was subsequently modified to a Laplacian of Gaussian (LOG) filter for computational ease in handling two-dimensional retinal inputs. To date, almost all image processing algorithms used in various branches of science and engineering have followed the LOG or one of its variants. Recent observations in retinal physiology, however, indicate that the retinal ganglion cells receive input from a larger area than the classical receptive fields. We have proposed an isotropic model for the non-classical receptive field of the retinal ganglion cells, corroborated by these recent observations, by introducing higher-order derivatives of Gaussian expressed as linear combinations of Gaussians only. In digital image processing, this provides a new mechanism of edge detection on one hand and image half-toning on the other. It has also been found that living systems may sometimes prefer to "perceive" the external scenario by adding noise to the received signals at the pre-processing level for arriving at better information on light and shade in the edge map. The proposed model also provides an explanation for many brightness-contrast illusions hitherto unexplained not only by the classical isotropic model but also by some other Gestalt and Constructivist models or by non-isotropic multi-scale models. The proposed model is easy to implement both in the analog and digital domain. A scheme for implementation in the analog domain generates a new silicon retina.
Van Eeckhout, E.; Pope, P.; Balick, L.
1996-07-01
This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The primary objective of this project was to advance image processing and visualization technologies for environmental characterization. This was effected by developing and implementing analyses of remote sensing data from satellite and airborne platforms, and demonstrating their effectiveness in visualization of environmental problems. Many sources of information were integrated as appropriate using geographic information systems.
Teaching the Inquiry Process through Experimental Mathematics
ERIC Educational Resources Information Center
Pudwell, Lara
2017-01-01
In this paper, we discuss the Experimental Mathematics course taught at Valparaiso University since 2009. We focus on aspects of the course that facilitate students' abilities to ask and explore their own research questions.
scikit-image: image processing in Python
Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony
2014-01-01
scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921
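A minimal example of the library in use (assumes scikit-image is installed; `data.camera()` is a test image bundled with the package):

```python
from skimage import data, filters

image = data.camera()                    # grayscale test image
edges = filters.sobel(image)             # Sobel edge-magnitude map
thresh = filters.threshold_otsu(image)   # global Otsu threshold
binary = image > thresh                  # boolean foreground mask
print(edges.shape, binary.dtype)
```

The API style shown (plain functions on NumPy arrays, organized into submodules such as `filters`, `segmentation`, and `measure`) is the design the paper highlights as making the library easy to combine with the rest of the scientific Python stack.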
Image processing in planetology
NASA Astrophysics Data System (ADS)
Fulchignoni, M.; Picchiotti, A.
The authors summarize the state of the art in the field of planetary image processing in terms of available data, required procedures and possible improvements. More than a technical description of the adopted algorithms, which are considered the normal background of any research activity dealing with the interpretation of planetary data, the authors outline the advances in planetology achieved as a consequence of the availability of better data and more sophisticated hardware. An overview of the available data base and of the organizational efforts to make the data accessible and up to date constitutes a valuable reference for those interested in obtaining information. A short description of the processing sequence, illustrated by an example that shows the quality of the obtained products and the improvement at each successive step of the processing procedure, gives an idea of the possible uses of this kind of information.
Image Processing Diagnostics: Emphysema
NASA Astrophysics Data System (ADS)
McKenzie, Alex
2009-10-01
Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure more accurately the amount of affected lung. CT scan images clearly show whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, as it appears merely as subtle, barely distinct, dark spots on the lung. Our goal is to create a software plug-in that interfaces with existing open-source medical imaging software, to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations using data taken from CT scan images of several patients representing a wide range of severity of the disease. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently utilized methods, which involve looking at percentages of radiodensities in air passages of the lung.
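The skewness statistic mentioned (the deviation of the CT radiodensity histogram from a normal distribution) is the Fisher-Pearson moment coefficient; a sketch on made-up density values:

```python
def skewness(values):
    """Fisher-Pearson moment coefficient of skewness: g1 = m3 / m2^1.5,
    where m2 and m3 are the second and third central moments."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n
    m3 = sum((v - mean) ** 3 for v in values) / n
    return m3 / m2 ** 1.5

# A long right tail gives positive skew; a shifted CT density
# histogram in emphysematous lung changes this statistic.
sample = [1, 2, 3, 4, 10]
print(round(skewness(sample), 3))  # -> 1.138
```

Applied to the voxel intensities of a segmented lung, a single number like this can be tracked across patients and compared against severity grades.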
Computer image processing and recognition
NASA Technical Reports Server (NTRS)
Hall, E. L.
1979-01-01
A systematic introduction to the concepts and techniques of computer image processing and recognition is presented. Consideration is given to such topics as image formation and perception; computer representation of images; image enhancement and restoration; reconstruction from projections; digital television, encoding, and data compression; scene understanding; scene matching and recognition; and processing techniques for linear systems.
Image processing and recognition for biological images
Uchida, Seiichi
2013-01-01
This paper reviews image processing and pattern recognition techniques that are useful for analyzing bioimages. Although the paper does not provide their technical details, it makes it possible to grasp their main tasks and the typical tools used to handle those tasks. Image processing is a large research area concerned with improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique of classifying an input image into one of a set of predefined classes and is also a large research area. This paper overviews its two main modules: the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target for even state-of-the-art image processing and pattern recognition techniques due to noise, deformations, etc. This paper is expected to serve as a tutorial guide bridging biology and image processing researchers for further collaboration to tackle such a difficult target. PMID:23560739
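Two of the introductory tasks named, gray-level transformation and binarization, reduce to a few lines each; a sketch using a gamma transform and a fixed threshold (parameter values are arbitrary, not from the paper):

```python
def gamma_transform(img, gamma=0.5, max_val=255):
    """Gray-level transformation: brighten mid-tones (gamma < 1)
    or darken them (gamma > 1)."""
    return [[round(max_val * (p / max_val) ** gamma) for p in row]
            for row in img]

def binarize(img, threshold=128):
    """Binarization: 1 for foreground (bright), 0 for background."""
    return [[1 if p >= threshold else 0 for p in row] for row in img]

img = [[0, 64], [128, 255]]
bright = gamma_transform(img)   # gamma 0.5 lifts mid-tones
print(bright)                   # -> [[0, 128], [181, 255]]
print(binarize(bright))         # -> [[0, 1], [1, 1]]
```

In bioimage pipelines these two steps typically precede the harder tasks (segmentation, tracking), which is why the paper presents them first.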
Smart Image Enhancement Process
NASA Technical Reports Server (NTRS)
Jobson, Daniel J. (Inventor); Rahman, Zia-ur (Inventor); Woodell, Glenn A. (Inventor)
2012-01-01
Contrast and lightness measures are used first to classify the image as either non-turbid or turbid. If turbid, the original image is enhanced to generate a first enhanced image. If non-turbid, the original image is classified in terms of a merged contrast/lightness score based on the contrast and lightness measures. The non-turbid image is enhanced to generate a second enhanced image when a poor contrast/lightness score is associated therewith. When the second enhanced image has a poor contrast/lightness score associated therewith, this image is enhanced to generate a third enhanced image. A sharpness measure is computed for one image that is selected from (i) the non-turbid image, (ii) the first enhanced image, (iii) the second enhanced image when a good contrast/lightness score is associated therewith, and (iv) the third enhanced image. If the selected image is not sharp, it is sharpened to generate a sharpened image. The final image is selected from the selected image and the sharpened image.
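The patented selection logic can be paraphrased as a cascade of classify-then-enhance steps. The sketch below uses stand-in measures (mean for lightness, standard deviation for contrast) and a hypothetical `enhance` step; the inventors' actual measures and enhancement operators are not given in the abstract:

```python
import statistics

def lightness(pixels):
    return statistics.mean(pixels)

def contrast(pixels):
    return statistics.pstdev(pixels)

def is_turbid(pixels, light_hi=200, contrast_lo=15):
    """Stand-in turbidity test: bright but washed-out images."""
    return lightness(pixels) > light_hi and contrast(pixels) < contrast_lo

def enhance(pixels):
    """Hypothetical enhancement: stretch to the full 0-255 range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return pixels[:]
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

def smart_enhance(pixels, good_contrast=40):
    """Classify -> enhance -> re-check, as in the patented cascade."""
    img = enhance(pixels) if is_turbid(pixels) else pixels
    while contrast(img) < good_contrast:
        better = enhance(img)
        if contrast(better) <= contrast(img):  # no further gain
            break
        img = better
    return img

hazy = [210, 215, 220, 225, 230]   # turbid: bright, low contrast
print(smart_enhance(hazy))  # -> [0, 64, 128, 191, 255]
```

The key design idea the abstract describes, re-measuring quality after each enhancement and only escalating when the score is still poor, is what the loop above mimics.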
Enhancing the Teaching and Learning of Mathematical Visual Images
ERIC Educational Resources Information Center
Quinnell, Lorna
2014-01-01
The importance of mathematical visual images is indicated by the introductory paragraph in the Statistics and Probability content strand of the Australian Curriculum, which draws attention to the importance of learners developing skills to analyse and draw inferences from data and "represent, summarise and interpret data and undertake…
First Year Mathematics Undergraduates' Settled Images of Tangent Line
ERIC Educational Resources Information Center
Biza, Irene; Zachariades, Theodossios
2010-01-01
This study concerns 182 first year mathematics undergraduates' perspectives on the tangent line of function graph in the light of a previous study on Year 12 pupils' perspectives. The aim was the investigation of tangency images that settle after undergraduates' distancing from the notion for a few months and after their participation in…
Mechanical-mathematical modeling for landslide process
NASA Astrophysics Data System (ADS)
Svalova, V.
2009-04-01
500 m and displacement of a landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a movement of 53 cm. Catastrophic activation of the deep blockglide landslide in the area of Khoroshevo in Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new creeping block 220 m long separated from the plateau and began sinking, with the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid-nineteenth century. The sliding area of Khoroshevo had been stable for a long time without manifestations of activity. Identifying the causes of the deformation and developing means of protection against deep landslide motion is an extremely urgent and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of the activation and protective measures are discussed. The structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used for modelling the behaviour of matter on landslide slopes. The equation of continuity and an approximated Navier-Stokes equation for slow motions in a thin layer were used. The results of the modelling make it possible to identify the location of highest velocity on the landslide surface, which would be the best place for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of the movement of matter on a landslide slope.
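The governing equations named in the abstract (continuity plus a slow-motion, thin-layer approximation of Navier-Stokes for a highly viscous fluid) take, in standard lubrication form, roughly the following shape; the author's exact formulation is not given, so this is only the textbook sketch:

```latex
\frac{\partial h}{\partial t} + \frac{\partial}{\partial x}\bigl(h\,\bar{u}\bigr) = 0,
\qquad
\mu \frac{\partial^{2} u}{\partial z^{2}} = \frac{\partial p}{\partial x} - \rho g \sin\theta,
\qquad
\frac{\partial p}{\partial z} = -\rho g \cos\theta,
```

where $h(x,t)$ is the layer thickness, $\bar{u}$ the depth-averaged downslope velocity, $\mu$ the viscosity, $\rho$ the density, and $\theta$ the slope angle. Solving the momentum balance for $u(z)$ and inserting $\bar{u}$ into the continuity equation yields the surface-velocity profile from which the fastest-moving point, the suggested monitoring-post location, can be read off.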
IMAGES: An interactive image processing system
NASA Technical Reports Server (NTRS)
Jensen, J. R.
1981-01-01
The IMAGES interactive image processing system was created specifically for undergraduate remote sensing education in geography. The system is interactive, relatively inexpensive to operate, almost hardware independent, and responsive to numerous users at one time in a time-sharing mode. Most important, it provides a medium whereby theoretical remote sensing principles discussed in lecture may be reinforced in laboratory as students perform computer-assisted image processing. In addition to its use in academic and short course environments, the system has also been used extensively to conduct basic image processing research. The flow of information through the system is discussed including an overview of the programs.
Characterising the Cognitive Processes in Mathematical Investigation
ERIC Educational Resources Information Center
Yeo, Joseph B. W.; Yeap, Ban Har
2010-01-01
Many educators believe that mathematical investigation involves both problem posing and problem solving, but some teachers have taught their students to investigate during problem solving. The confusion about the relationship between investigation and problem solving may affect how teachers teach their students and how researchers conduct their…
Hsu, Chun-Wei; Goh, Joshua O. S.
2016-01-01
When comparing between the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466
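The whole-brain spatial correlation analysis reported amounts to correlating voxel-wise activation values between task maps; a toy pure-Python Pearson correlation over flattened "maps" (the numbers are made up for illustration):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length activation maps."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

value_map = [0.1, 0.5, 0.9, 0.3]   # hypothetical voxel activations
math_map  = [0.2, 0.6, 0.8, 0.4]
emo_map   = [0.9, 0.1, 0.2, 0.8]

# Valuation resembling mathematical more than emotional processing
print(pearson(value_map, math_map) > pearson(value_map, emo_map))  # -> True
```

In the actual study the vectors would be whole-brain statistical maps (one value per voxel), but the comparison of correlation coefficients is the same.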
Visual Processing in Generally Gifted and Mathematically Excelling Adolescents
ERIC Educational Resources Information Center
Paz-Baruch, Nurit; Leikin, Roza; Leikin, Mark
2016-01-01
Little empirical data are available concerning the cognitive abilities of gifted individuals in general and especially those who excel in mathematics. We examined visual processing abilities distinguishing between general giftedness (G) and excellence in mathematics (EM). The research population consisted of 190 students from four groups of 10th-…
Litke, Alan
2006-03-27
The back of the eye is lined by an extraordinary biological pixel detector, the retina. This neural network is able to extract vital information about the external visual world, and transmit this information in a timely manner to the brain. In this talk, Professor Litke will describe a system that has been implemented to study how the retina processes and encodes dynamic visual images. Based on techniques and expertise acquired in the development of silicon microstrip detectors for high energy physics experiments, this system can simultaneously record the extracellular electrical activity of hundreds of retinal output neurons. After presenting first results obtained with this system, Professor Litke will describe additional applications of this incredible technology.
Filter for biomedical imaging and image processing.
Mondal, Partha P; Rajan, K; Ahmad, Imteyaz
2006-07-01
Image filtering techniques have numerous potential applications in biomedical imaging and image processing. The design of filters largely depends on the a priori knowledge about the type of noise corrupting the image. This makes the standard filters application specific. Widely used filters such as average, Gaussian, and Wiener reduce noisy artifacts by smoothing. However, this operation normally smooths the edges as well. On the other hand, sharpening filters enhance the high-frequency details, making the image nonsmooth. An integrated general approach to designing a finite impulse response filter based on Hebbian learning is proposed for optimal image filtering. This algorithm exploits the interpixel correlation by updating the filter coefficients using Hebbian learning. The algorithm is made iterative to achieve efficient learning from the neighborhood pixels. It performs optimal smoothing of the noisy image while preserving high-frequency as well as low-frequency features. Evaluation results show that the proposed finite impulse response filter is robust under various noise distributions such as Gaussian noise, salt-and-pepper noise, and speckle noise. Furthermore, the proposed approach does not require any a priori knowledge about the type of noise. The number of unknown parameters is small, and most of them are obtained adaptively from the processed image. The proposed filter is successfully applied to image reconstruction in a positron emission tomography (PET) imaging modality. The images reconstructed by the proposed algorithm are found to be superior in quality to those reconstructed by existing PET image reconstruction methodologies.
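The coefficient-adaptation idea described in this abstract can be sketched in a few lines. The kernel size, learning rate, normalization and exact update rule below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def hebbian_filter(image, size=3, eta=1e-4, sweeps=2):
    """FIR filtering with coefficients adapted by a Hebbian rule on
    neighborhood pixels. A rough sketch of the idea only: the kernel
    size, learning rate and normalization are illustrative choices."""
    pad = size // 2
    w = np.full((size, size), 1.0 / size ** 2)  # start as an averaging filter
    padded = np.pad(np.asarray(image, float), pad, mode="edge")
    out = np.zeros(image.shape, float)
    for _ in range(sweeps):
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                x = padded[i:i + size, j:j + size]  # local neighborhood
                y = float((w * x).sum())            # filter response
                out[i, j] = y
                w += eta * y * (x - x.mean())       # Hebbian correlation update
                w /= max(np.abs(w).sum(), 1e-12)    # keep coefficients bounded
    return out
```

The update strengthens coefficients that correlate with the filter's own output, which is how interpixel correlation enters; a real implementation would also handle the noise-robustness aspects the abstract claims.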
ERIC Educational Resources Information Center
Maines, David R.; And Others
Investigated were those long-term processes which contribute to high rates of attrition for women out of mathematics. It is based on the contention that university students drop out of mathematics as a consequence of prior socialization, educational career contingencies, and goal commitment and career aspirations, with the mix of these factors…
FORTRAN Algorithm for Image Processing
NASA Technical Reports Server (NTRS)
Roth, Don J.; Hull, David R.
1987-01-01
FORTRAN computer algorithm containing various image-processing analysis and enhancement functions developed. Algorithm developed specifically to process images of developmental heat-engine materials obtained with sophisticated nondestructive evaluation instruments. Applications of program include scientific, industrial, and biomedical imaging for studies of flaws in materials, analyses of steel and ores, and pathology.
The APL image processing laboratory
NASA Technical Reports Server (NTRS)
Jenkins, J. O.; Randolph, J. P.; Tilley, D. G.; Waters, C. A.
1984-01-01
The present and proposed capabilities of the Central Image Processing Laboratory, which provides a powerful resource for the advancement of programs in missile technology, space science, oceanography, and biomedical image analysis, are discussed. The use of image digitizing, digital image processing, and digital image output permits a variety of functional capabilities, including: enhancement, pseudocolor, convolution, computer output microfilm, presentation graphics, animations, transforms, geometric corrections, and feature extractions. The hardware and software of the Image Processing Laboratory, consisting of digitizing and processing equipment, software packages, and display equipment, is described. Attention is given to applications for imaging systems, map geometric correction, raster movie display of Seasat ocean data, Seasat and Skylab scenes of Nantucket Island, Space Shuttle imaging radar, differential radiography, and a computerized tomographic scan of the brain.
Infrared image enhancement based on the edge detection and mathematical morphology
NASA Astrophysics Data System (ADS)
Zhang, Linlin; Zhao, Yuejin; Dong, Liquan; Liu, Xiaohua; Yu, Xiaomei; Hui, Mei; Chu, Xuhong; Gong, Cheng
2010-11-01
Un-cooled infrared imaging technology grew out of military necessity; at present it is widely applied in industry, medicine, and scientific and technological research, since the infrared radiation temperature distribution of a measured object's surface can be observed directly. The infrared images collected in our laboratory have the following characteristics: strong spatial correlation; low contrast and poor visual effect; grayscale output without color or shadows, at low resolution; low definition compared to visible-light images; and many kinds of noise introduced by random disturbances from the external environment. Digital image processing is widely applied in many areas and has become an important extension of human vision. Traditional methods for image enhancement, however, cannot capture the geometric information of images and tend to amplify noise. To remove noise, improve the visual effect, and overcome these enhancement issues, a mathematical model of the focal plane array (FPA) unit was constructed based on matrix transformation theory. According to the characteristics of the FPA, an image enhancement algorithm combining mathematical morphology and edge detection was established. First, the image profile is obtained using edge detection combined with mathematical morphological operators. Then the template profile is filled from the original image to obtain an ideal background image, on the basis of which the image noise can be removed. The experiments show that the proposed algorithm enhances image detail and improves the signal-to-noise ratio.
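The general morphology-plus-edge-detection strategy can be sketched generically. The square structuring element and the particular combination of gradient and opening below are assumptions for illustration, not the paper's exact algorithm:

```python
import numpy as np

def _sweep(img, size, op):
    # grey-scale erosion (op=np.min) or dilation (op=np.max), square element
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = op(p[i:i + size, j:j + size])
    return out

def enhance(img, size=3):
    """Combine a morphological edge profile (gradient) with an opening-based
    background estimate: subtracting the background suppresses slowly varying
    clutter while adding the gradient lifts edges. A generic sketch of the
    combined strategy, not the paper's algorithm."""
    img = np.asarray(img, float)
    eroded = _sweep(img, size, np.min)
    dilated = _sweep(img, size, np.max)
    gradient = dilated - eroded                # morphological edge profile
    background = _sweep(eroded, size, np.max)  # opening = smooth background
    return img - background + gradient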
Cooperative processes in image segmentation
NASA Technical Reports Server (NTRS)
Davis, L. S.
1982-01-01
Research into the role of cooperative, or relaxation, processes in image segmentation is surveyed. Cooperative processes can be employed at several levels of the segmentation process as a preprocessing enhancement step, during supervised or unsupervised pixel classification and, finally, for the interpretation of image segments based on segment properties and relations.
A Mathematical Model for Simulating Infrared Images of Ships
1986-12-01
DEFENCE RESEARCH CENTRE SALISBURY, SOUTH AUSTRALIA. TECHNICAL REPORT ERL-0396-TR: A MATHEMATICAL MODEL FOR SIMULATING INFRARED IMAGES OF SHIPS. UNCLASSIFIED AR-004-885
Mathematical Modeling of Primary Wood Processing
NASA Astrophysics Data System (ADS)
Szyszka, Barbara; Rozmiarek, Klaudyna
2008-09-01
This work presents a way of optimizing the conversion of wood logs into semi-products. Calculation algorithms have been used to choose the cutting patterns and the number of logs needed to fulfil an order, including the task specification. The authors' computer program TARPAK1 visualizes the results, generates the pattern of wood log conversion for given entry parameters, and predicts sawn timber manufacture. The program has been created with the intention of introducing it to small and medium sawmills in Poland. The project has been financed from government resources and carried out by staff of the Institute of Mathematics (Poznan University of Technology) and the Department of Mechanical Wood Technology (Poznan University of Life Sciences).
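The pattern-selection step of such a planner can be sketched as a greedy first-fit-decreasing heuristic. Real sawmill planners such as the TARPAK1 program described above use richer optimization; this only illustrates choosing cutting patterns and counting the logs an order needs:

```python
def plan_logs(log_length, demand):
    """Greedy first-fit-decreasing sketch of choosing cutting patterns and
    the number of logs needed for an order. `demand` maps a piece length
    to the ordered quantity. Illustrative only; ignores kerf and grading."""
    pieces = sorted((length for length, qty in demand.items()
                     for _ in range(qty)), reverse=True)
    logs = []  # each entry is one log's cutting pattern (list of piece lengths)
    for piece in pieces:
        for pattern in logs:
            if sum(pattern) + piece <= log_length:  # piece fits in this log
                pattern.append(piece)
                break
        else:
            logs.append([piece])                    # open a new log
    return logs
```

For example, plan_logs(6, {3: 2, 2: 3}) packs the order into two fully used logs.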
Mathematical modelling in the computer-aided process planning
NASA Astrophysics Data System (ADS)
Mitin, S.; Bochkarev, P.
2016-04-01
This paper presents new approaches to the organization of manufacturing preparation and the mathematical models related to development of the computer-aided multi-product process planning (CAMPP) system. The CAMPP system has several distinctive features compared to existing computer-aided process planning (CAPP) systems: fully formalized development of the machining operations; the capacity to create and formalize the interrelationships among design, process planning and process implementation; and procedures for taking real manufacturing conditions into account. The paper describes the structure of the CAMPP system and presents the mathematical models and methods used to formalize the design procedures.
Parallel asynchronous systems and image processing algorithms
NASA Technical Reports Server (NTRS)
Coon, D. D.; Perera, A. G. U.
1989-01-01
A new hardware approach to implementation of image processing algorithms is described. The approach is based on silicon devices which would permit an independent analog processing channel to be dedicated to every pixel. A laminar architecture consisting of a stack of planar arrays of the device would form a two-dimensional array processor with a 2-D array of inputs located directly behind a focal plane detector array. A 2-D image data stream would propagate in neuronlike asynchronous pulse-coded form through the laminar processor. Such systems would integrate image acquisition and image processing. Acquisition and processing would be performed concurrently as in natural vision systems. The research is aimed at implementation of algorithms, such as the intensity-dependent summation algorithm and pyramid processing structures, which are motivated by the operation of natural vision systems. Implementation of natural vision algorithms would benefit from the use of neuronlike information coding and the laminar, 2-D parallel, vision-system-type architecture. Besides providing a neural network framework for implementation of natural vision algorithms, a 2-D parallel approach could eliminate the serial bottleneck of conventional processing systems. Conversion to serial format would occur only after raw intensity data had been substantially processed. An interesting challenge arises from the fact that the mathematical formulation of natural vision algorithms does not specify the means of implementation, so that hardware implementation poses intriguing questions involving vision science.
Pattern Recognition and Image Processing of Infrared Astronomical Satellite Images
NASA Astrophysics Data System (ADS)
He, Lun Xiong
1996-01-01
The Infrared Astronomical Satellite (IRAS) images with wavelengths of 60 mu m and 100 mu m contain mainly information on both extra-galactic sources and low-temperature interstellar media. The low-temperature interstellar media in the Milky Way impose a "cirrus" screen on IRAS images, especially at the 100 mu m wavelength. This dissertation deals with the techniques of removing the "cirrus" clouds from the 100 mu m band in order to achieve accurate determinations of point sources and their intensities (fluxes). We employ an image filtering process which utilizes mathematical morphology and wavelet analysis as the key tools in removing the "cirrus" foreground emission. The filtering process consists of extraction and classification of the size information, and then use of the classification results in removal of the cirrus component from each pixel of the image. Extraction of size information is the most important step in this process. It is achieved by either mathematical morphology or wavelet analysis. In the mathematical morphological method, extraction of size information is done using the "sieving" process. In the wavelet method, multi-resolution techniques are employed instead. The classification of size information distinguishes extra-galactic sources from cirrus using their averaged size information. The cirrus component for each pixel is then removed by using the averaged cirrus size information. The filtered image contains much less cirrus. Intensity alteration for extra-galactic sources in the filtered image is discussed. It is possible to retain the fluxes of the point sources when we weigh the cirrus component differently pixel by pixel. The importance of uni-directional size information extraction is addressed in this dissertation. Such uni-directional extraction is achieved by constraining the structuring elements, or by constraining the sieving process to be sequential. The generalizations of mathematical morphology operations based
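The core of the morphological "sieving" idea can be sketched minimally: an opening with an element larger than a point source flattens the compact sources but keeps the smooth background, so the residual is the point-source component. The full method classifies size information per pixel; this sketch and its parameters are assumptions, not the dissertation's algorithm:

```python
import numpy as np

def _opening(img, size):
    # grey-scale opening: erosion then dilation with a square element
    pad = size // 2
    def sweep(a, op):
        p = np.pad(a, pad, mode="edge")
        out = np.empty_like(a)
        for i in range(a.shape[0]):
            for j in range(a.shape[1]):
                out[i, j] = op(p[i:i + size, j:j + size])
        return out
    return sweep(sweep(img, np.min), np.max)

def remove_cirrus(img, source_size=3):
    """Keep structures smaller than `source_size` (compact point sources)
    and drop the smooth large-scale 'cirrus' component: a minimal sketch
    of morphological sieving, not the dissertation's full per-pixel
    classification scheme."""
    background = _opening(np.asarray(img, float), source_size)
    return img - background
```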
A unified mathematical theory of electrophoretic processes
NASA Technical Reports Server (NTRS)
Bier, M.; Palusinski, O. A.; Mosher, R. A.; Graham, A.; Saville, D. A.
1983-01-01
A mathematical theory is presented which shows that each of the four classical electrophoretic modes (zone electrophoresis, moving boundary electrophoresis, isotachophoresis, and isoelectric focusing) is based on the same general principles and can collectively be described in terms of a single set of equations. This model can predict the evolution of the four electrophoretic modes as a function of time. The model system is one-dimensional, neglecting the effects of electroosmosis, temperature gradients, and any bulk flows of liquid. The model is based on equations which express the components' dissociation equilibria, the mass transport due to electromigration and diffusion, electroneutrality, and the conservation of mass and charge. The model consists of a system of coupled partial differential and nonlinear algebraic equations which can be solved numerically by use of a computer. The versatility of this model was verified using an example of a three-component system containing cacodylate, tris hydroxylmethylaminomethane, and histidine. Results show that this model not only correctly predicts the characteristic features of each electrophoretic mode, but also gives details of the concentration, pH, and conductivity profiles not easily amenable to direct experimental measurement.
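In generic notation (not necessarily the authors' symbols), the shared one-dimensional core of all four modes couples component transport by electromigration and diffusion with electroneutrality and charge conservation; a sketch of the form such a set takes, with the dissociation equilibria entering as additional algebraic constraints:

```latex
% mass transport of component i (signed effective mobility \Omega_i, field E)
\frac{\partial c_i}{\partial t}
  = \frac{\partial}{\partial x}\!\left(
      D_i \frac{\partial c_i}{\partial x} - \Omega_i c_i E \right)
% electroneutrality of the mixture
\sum_i z_i c_i = 0
% conservation of charge: the current density j is uniform in x
\frac{\partial j}{\partial x} = 0, \qquad
j = F \sum_i z_i \left( \Omega_i c_i E
      - D_i \frac{\partial c_i}{\partial x} \right)
```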
Industrial Applications of Image Processing
NASA Astrophysics Data System (ADS)
Ciora, Radu Adrian; Simion, Carmen Mihaela
2014-11-01
The recent advances in sensor quality and processing power provide us with excellent tools for designing more complex image processing and pattern recognition tasks. In this paper we review the existing applications of image processing and pattern recognition in industrial engineering. First we define the role of vision in an industrial setting. Then an overview of image processing techniques for feature extraction, object recognition and industrial robotic guidance is presented. Moreover, examples of implementations of such techniques in industry are presented, including automated visual inspection, process control, part identification and robot control. Finally, we present some conclusions regarding the investigated topics and directions for future investigation.
[Imaging center - optimization of the imaging process].
Busch, H-P
2013-04-01
Hospitals around the world are under increasing pressure to optimize the economic efficiency of treatment processes. Imaging is responsible for a great part of the success, but also of the costs, of treatment. In routine work, an excessive supply of imaging methods leads to an "as well as" strategy up to the limit of capacity, without critical reflection. Exams that have no predictable influence on the clinical outcome are an unjustified burden for the patient. They are useless and threaten the financial situation and existence of the hospital. In recent years the focus of process optimization was exclusively on the quality and efficiency of single examinations. In the future, critical discussion of the effectiveness of single exams in relation to the clinical outcome will be more important. Unnecessary exams can be avoided only if, in addition to the optimization of single exams (efficiency), there is an optimization strategy for the total imaging process (efficiency and effectiveness). This requires a new definition of processes (Imaging Pathway), new organizational structures (Imaging Center) and a new kind of thinking on the part of the medical staff. Motivation has to be shifted from gratification for performed exams to gratification for process quality (medical quality, service quality, economics), including the avoidance of additional (unnecessary) exams.
SWNT Imaging Using Multispectral Image Processing
NASA Astrophysics Data System (ADS)
Blades, Michael; Pirbhai, Massooma; Rotkin, Slava V.
2012-02-01
A flexible optical system was developed to image carbon single-wall nanotube (SWNT) photoluminescence using the multispectral capabilities of a typical CCD camcorder. The built-in Bayer filter of the CCD camera was utilized, with OpenCV C++ libraries for image processing, to decompose the image generated in a high-magnification epifluorescence microscope setup into three pseudo-color channels. By carefully calibrating the filter beforehand, it was possible to extract spectral data from these channels and effectively isolate the SWNT signals from the background.
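The channel-decomposition step can be sketched directly on a raw mosaic. An RGGB site layout is assumed here for illustration; actual sensors vary (BGGR, GRBG, ...), and the real system additionally calibrated each channel's spectral response before isolating the SWNT signal:

```python
import numpy as np

def split_bayer(raw):
    """Decompose a raw Bayer-mosaic frame into three pseudo-color
    channels, assuming an RGGB layout (an assumption for this sketch)."""
    raw = np.asarray(raw, float)
    red = raw[0::2, 0::2]                              # R sites
    green = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # two G sites, averaged
    blue = raw[1::2, 1::2]                             # B sites
    return red, green, blue
```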
NASA Astrophysics Data System (ADS)
Stobie, R. S.; Dodd, R. J.; MacGillivray, H. T.
1981-12-01
It is noted that astronomers have for some time been fascinated by the possibility of automatic plate measurement, and that measuring engines have been constructed with an ever increasing degree of automation. A description is given of the COSMOS (CoOrdinates, Sizes, Magnitudes, Orientations, and Shapes) system at the Royal Observatory in Edinburgh. An automatic high-speed microdensitometer controlled by a minicomputer is linked to a very fast microcomputer that performs immediate image analysis. The movable carriage, whose position in two coordinates is controlled digitally to an accuracy of 0.5 micron (0.0005 mm), will take plates as large as 356 mm on a side. Currently the machine operates primarily in the Image Analysis Mode, in which COSMOS must first detect the presence of an image. It does this by scanning and digitizing the photograph in 'raster' fashion and then searching for local enhancements in the density of the exposed emulsion.
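The "local enhancement" search can be sketched as thresholding each pixel against its local background statistics. The window size and k-sigma rule below are toy assumptions, not the COSMOS machine's actual detection logic:

```python
import numpy as np

def detect_images(density, window=7, k=3.0):
    """Flag pixels whose emulsion density stands out from the local
    background by k standard deviations: a toy sketch of searching for
    local enhancements during a raster scan."""
    pad = window // 2
    p = np.pad(np.asarray(density, float), pad, mode="edge")
    mask = np.zeros(density.shape, dtype=bool)
    for i in range(density.shape[0]):
        for j in range(density.shape[1]):
            patch = p[i:i + window, j:j + window]
            mask[i, j] = density[i, j] > patch.mean() + k * patch.std()
    return mask
```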
1982-11-16
Spectral analysis, texture image analysis and classification, image software package, automatic spatial clustering.
ERIC Educational Resources Information Center
Martin, Lyndon C.; Towers, Jo
2015-01-01
In the research reported in this paper, we develop a theoretical perspective to describe and account for the growth of collective mathematical understanding. We discuss collective processes in mathematics, drawing in particular on theoretical work in the domains of improvisational jazz and theatre. Using examples of data from a study of elementary…
Adequate mathematical modelling of environmental processes
NASA Astrophysics Data System (ADS)
Chashechkin, Yu. D.
2012-04-01
In environmental observations and laboratory visualization, both large-scale flow components, such as currents, jets, vortices and waves, and a fine structure are registered (different examples are given). Conventional mathematical modeling, both analytical and numerical, is directed mostly at describing the energetically important flow components; the role of fine structures remains obscure. The variety of existing models makes it difficult to choose the most adequate one and to assess their mutual degree of correspondence. The goal of the talk is to give a careful analysis of the kinematics and dynamics of flows. A difference is underlined between the concept of "motion", as a transformation of a vector space into itself that conserves distance, and the concept of "flow", as displacement and rotation of deformable "fluid particles". Basic physical quantities of the flow, namely density, momentum, energy (entropy) and admixture concentration, are selected as physical parameters defined by the fundamental set, which includes the differential D'Alembert, Navier-Stokes, Fourier and/or Fick equations and a closing equation of state. All of them are observable and independent. Calculations of continuous Lie groups show that only the fundamental set is characterized by the ten-parameter Galilean group reflecting the basic principles of mechanics. The presented analysis demonstrates that conventionally used approximations dramatically change the symmetries of the governing equation sets, which leads to their incompatibility or even degeneration. The fundamental set is analyzed taking into account the condition of compatibility. The high order of the set indicates a complex structure of complete solutions corresponding to the physical structure of real flows. Analytical solutions of a number of problems, including flows induced by diffusion on topography and generation of periodic internal waves by compact sources in weakly dissipative media, as well as numerical solutions of the same
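In standard notation, a "fundamental set" of the kind referred to above takes the following general form (a sketch; the talk's exact transport coefficients and closure may differ):

```latex
% conservation of mass (continuity)
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0
% momentum balance (Navier-Stokes)
\rho \left( \frac{\partial \mathbf{v}}{\partial t}
  + (\mathbf{v} \cdot \nabla)\mathbf{v} \right)
  = -\nabla P + \mu \nabla^{2}\mathbf{v} + \rho \mathbf{g}
% transport of heat (Fourier) and admixture (Fick)
\frac{\partial T}{\partial t} + \mathbf{v}\cdot\nabla T
  = \kappa_T \nabla^{2} T,
\qquad
\frac{\partial s}{\partial t} + \mathbf{v}\cdot\nabla s
  = \kappa_s \nabla^{2} s
% closing equation of state
\rho = \rho(P, T, s)
```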
Mathematical abilities in dyslexic children: a diffusion tensor imaging study.
Koerte, Inga K; Willems, Anna; Muehlmann, Marc; Moll, Kristina; Cornell, Sonia; Pixner, Silvia; Steffinger, Denise; Keeser, Daniel; Heinen, Florian; Kubicki, Marek; Shenton, Martha E; Ertl-Wagner, Birgit; Schulte-Körne, Gerd
2016-09-01
Dyslexia is characterized by a deficit in language processing which mainly affects word decoding and spelling skills. In addition, children with dyslexia also show problems in mathematics. However, for the latter, the underlying structural correlates have not been investigated. Sixteen children with dyslexia (mean age 9.8 years [0.39]) and 24 typically developing children (mean age 9.9 years [0.29]) group matched for age, gender, IQ, and handedness underwent 3 T MR diffusion tensor imaging as well as cognitive testing. Tract-Based Spatial Statistics were performed to correlate behavioral data with diffusion data. Children with dyslexia performed worse than controls in standardized verbal number tasks, such as arithmetic efficiency tests (addition, subtraction, multiplication, division). In contrast, the two groups did not differ in the nonverbal number line task. Arithmetic efficiency, representing the total score of the four arithmetic tasks, multiplication, and division, correlated with diffusion measures in widespread areas of the white matter, including bilateral superior and inferior longitudinal fasciculi in children with dyslexia compared to controls. Children with dyslexia demonstrated lower performance in verbal number tasks but performed similarly to controls in a nonverbal number task. Further, an association between verbal arithmetic efficiency and diffusion measures was demonstrated in widespread areas of the white matter suggesting compensatory mechanisms in children with dyslexia compared to controls. Taken together, poor fact retrieval in children with dyslexia is likely a consequence of deficits in the language system, which not only affects literacy skills but also impacts on arithmetic skills.
Basic research planning in mathematical pattern recognition and image analysis
NASA Technical Reports Server (NTRS)
Bryant, J.; Guseman, L. F., Jr.
1981-01-01
Fundamental problems encountered while attempting to develop automated techniques for applications of remote sensing are discussed under the following categories: (1) geometric and radiometric preprocessing; (2) spatial, spectral, temporal, syntactic, and ancillary digital image representation; (3) image partitioning, proportion estimation, and error models in object scene inference; (4) parallel processing and image data structures; and (5) continuing studies in polarization; computer architectures and parallel processing; and the applicability of "expert systems" to interactive analysis.
NASA Astrophysics Data System (ADS)
Castellano, M.; Ottaviani, D.; Fontana, A.; Merlin, E.; Pilo, S.; Falcone, M.
2015-09-01
In the past years, modern mathematical methods for image analysis have led to a revolution in many fields, from computer vision to scientific imaging. However, some recently developed image processing techniques successfully exploited by other sectors have rarely, if ever, been tried on astronomical observations. We present here tests of two classes of variational image enhancement techniques, "structure-texture decomposition" and "super-resolution", showing that they are effective in improving the quality of observations. Structure-texture decomposition allows the recovery of faint sources previously hidden by the background noise, effectively increasing the depth of available observations. Super-resolution yields a higher-resolution and better-sampled image out of a set of low-resolution frames, thus mitigating problems in data analysis arising from differences in resolution/sampling between different instruments, as in the case of the EUCLID VIS and NIR imagers.
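The nature of a structure-texture split can be illustrated minimally. The variational methods referred to above (e.g. TV-L1 decomposition) use edge-preserving regularizers; the repeated box blur below is only a crude linear stand-in for the smoother, shown to make the idea of the split concrete:

```python
import numpy as np

def structure_texture(img, passes=10):
    """Split an image into a smooth 'structure' part and an oscillatory
    'texture' residual. Crude linear sketch: iterated 3x3 box blur as a
    stand-in for a variational (e.g. TV-L1) smoother."""
    structure = np.asarray(img, float).copy()
    h, w = structure.shape
    for _ in range(passes):
        p = np.pad(structure, 1, mode="edge")
        structure = sum(p[di:di + h, dj:dj + w]
                        for di in range(3) for dj in range(3)) / 9.0
    return structure, img - structure
```

By construction structure + texture reconstructs the input exactly; faint compact sources end up in the texture component, which is what makes them easier to separate from a smooth background.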
ERIC Educational Resources Information Center
Lane, Ciara; Stynes, Martin; O'Donoghue, John
2016-01-01
A questionnaire survey was carried out as part of a PhD research study to investigate the image of mathematics held by post-primary students in Ireland. The study focused on students in fifth year of post-primary education studying ordinary level mathematics for the Irish Leaving Certificate examination--the final examination for students in…
NASA Astrophysics Data System (ADS)
Lane, Ciara; Stynes, Martin; O'Donoghue, John
2016-10-01
A questionnaire survey was carried out as part of a PhD research study to investigate the image of mathematics held by post-primary students in Ireland. The study focused on students in fifth year of post-primary education studying ordinary level mathematics for the Irish Leaving Certificate examination - the final examination for students in second-level or post-primary education. At the time this study was conducted, ordinary level mathematics students constituted approximately 72% of Leaving Certificate students. Students were aged between 15 and 18 years. A definition for 'image of mathematics' was adapted from Lim and Wilson, with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. A questionnaire was composed incorporating 84 fixed-response items chosen from eight pre-established scales by Aiken, Fennema and Sherman, Gourgey and Schoenfeld. This paper focuses on the findings from the questionnaire survey. Students' images of mathematics are compared with regard to gender, type of post-primary school attended and prior mathematical achievement.
Subband/Transform MATLAB Functions For Processing Images
NASA Technical Reports Server (NTRS)
Glover, D.
1995-01-01
SUBTRANS software is package of routines implementing image-data-processing functions for use with MATLAB (TM) software. Provides capability to transform image data with block transforms and to produce spatial-frequency subbands of transformed data. Functions cascaded to provide further decomposition into more subbands. Also used in image-data-compression systems. For example, transforms used to prepare data for lossy compression. Written for use in MATLAB mathematical-analysis environment.
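One level of such a block-transform subband decomposition can be sketched with a 2x2 Haar transform. This is a generic sketch in the spirit of the SUBTRANS routines, not that package's actual API; feeding the LL band back in cascades the decomposition into further subbands:

```python
import numpy as np

def haar_subbands(img):
    """One level of a 2x2 block (Haar) transform producing four
    spatial-frequency subbands: LL (block averages), LH and HL
    (horizontal and vertical detail) and HH (diagonal detail)."""
    a = np.asarray(img, float)
    tl, tr = a[0::2, 0::2], a[0::2, 1::2]  # the four pixels of each block
    bl, br = a[1::2, 0::2], a[1::2, 1::2]
    ll = (tl + tr + bl + br) / 4.0
    lh = (tl - tr + bl - br) / 4.0
    hl = (tl + tr - bl - br) / 4.0
    hh = (tl - tr - bl + br) / 4.0
    return ll, lh, hl, hh
```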
Image processing: some challenging problems.
Huang, T S; Aizawa, K
1993-01-01
Image processing can be broadly defined as the manipulation of signals which are inherently multidimensional. The most common such signals are photographs and video sequences. The goals of processing or manipulation can be (i) compression for storage or transmission; (ii) enhancement or restoration; (iii) analysis, recognition, and understanding; or (iv) visualization for human observers. The use of image processing techniques has become almost ubiquitous; they find applications in such diverse areas as astronomy, archaeology, medicine, video communication, and electronic games. Nonetheless, many important problems in image processing remain unsolved. It is the goal of this paper to discuss some of these challenging problems. In Section I, we mention a number of outstanding problems. Then, in the remainder of this paper, we concentrate on one of them: very-low-bit-rate video compression. This is chosen because it involves almost all aspects of image processing. PMID:8234312
Image processing for optical mapping.
Ravindran, Prabu; Gupta, Aditya
2015-01-01
Optical Mapping is an established single-molecule, whole-genome analysis system, which has been used to gain a comprehensive understanding of genomic structure and to study structural variation of complex genomes. A critical component of the Optical Mapping system is the image processing module, which extracts single-molecule restriction maps from image datasets of immobilized, restriction-digested and fluorescently stained large DNA molecules. In this review, we describe robust and efficient image processing techniques to process these massive datasets and extract accurate restriction maps in the presence of noise, ambiguity and confounding artifacts. We also highlight a few applications of the Optical Mapping system.
[Mathematical approach to modeling of the treatment of suppurative processes].
Men'shikov, D D; Enileev, R Kh
1989-03-01
Consideration of an inflammation focus as an "open system" provided an analogy between microbiological processes in inflamed wounds and in systems of continuous cultivation of microorganisms. Mathematical modeling of such systems is widely used, and some of its methods were applied to chemoprophylaxis and chemotherapy of postoperative wounds. In modeling continuous cultivation of microorganisms it is usually necessary to determine the optimal conditions for the maximum yield of biomass; in modeling wound treatment, the aim was instead to determine the process parameters providing the minimum biomass. The simple models described showed that there could be a certain optimal flow rate of the washing fluid in the aspiration-washing procedure for wound treatment at which the drug was not completely washed out while the growth rate of the microbial population was minimal. Such mathematical models were shown to be valuable in optimizing the use of bactericidal and bacteriostatic antibiotics.
Image Processing REST Web Services
2013-03-01
collections, deblurring, contrast enhancement, and super resolution. [Figure: "Chaining the image processing algorithms": (1) original image with target chip to super-resolve; (2) unenhanced extracted target chip; (3) super-resolved target chip; (4) super-resolved, deblurred target chip; (5) super-resolved, deblurred and contrast-enhanced target chip.] There are two types of resources associated with these
ERIC Educational Resources Information Center
Sagirli, Meryem Özturan
2016-01-01
The aim of the present study is to investigate pre-service secondary mathematics teachers' cognitive-metacognitive behaviours during the mathematical problem-solving process considering class level. The study, in which the case study methodology was employed, was carried out with eight pre-service mathematics teachers, enrolled at a university in…
SOFT-1: Imaging Processing Software
NASA Technical Reports Server (NTRS)
1984-01-01
Five levels of image processing software are enumerated and discussed: (1) logging and formatting; (2) radiometric correction; (3) correction for geometric camera distortion; (4) geometric/navigational corrections; and (5) general software tools. Specific concerns about access to and analysis of digital imaging data within the Planetary Data System are listed.
Photographic image enhancement and processing
NASA Technical Reports Server (NTRS)
Lockwood, H. E.
1975-01-01
Image processing techniques (computer and photographic) are described which are used within the JSC Photographic Technology Division. Two purely photographic techniques used for specific subject isolation are discussed in detail. Sample imagery is included.
Mathematical Development: The Role of Broad Cognitive Processes
ERIC Educational Resources Information Center
Calderón-Tena, Carlos O.
2016-01-01
This study investigated the role of broad cognitive processes in the development of mathematics skills among children and adolescents. Four hundred and forty-seven students (age mean [M] = 10.23 years, 73% boys and 27% girls) from an elementary school district in the US southwest participated. Structural equation modelling tests indicated that…
Mathematical modeling of the neuron morphology using two dimensional images.
Rajković, Katarina; Marić, Dušica L; Milošević, Nebojša T; Jeremic, Sanja; Arsenijević, Valentina Arsić; Rajković, Nemanja
2016-02-07
In this study, mathematical analyses such as the analysis of area and length, fractal analysis and modified Sholl analysis were applied to two-dimensional (2D) images of neurons from the adult human dentate nucleus (DN). Using these mathematical analyses, the main morphological properties were obtained, including the size of the neuron and soma, the length of all dendrites, the density of dendritic arborization, the position of the maximum density and the irregularity of dendrites. Response surface methodology (RSM) was used for modeling the size of neurons and the length of all dendrites. However, it was only possible to apply the RSM model based on the second-order polynomial equation to correlate changes in the size of the neuron with the other properties of its morphology. Modeling data provided evidence that the size of DN neurons statistically depended on the size of the soma, the density of dendritic arborization and the irregularity of dendrites. The low value of the mean relative percent deviation (MRPD) between the experimental data and the predicted neuron size showed that the RSM model was suitable for modeling the size of DN neurons. Therefore, RSM can be used generally for modeling neuron size from 2D images.
Sgraffito simulation through image processing
NASA Astrophysics Data System (ADS)
Guerrero, Roberto A.; Serón Arbeloa, Francisco J.
2011-10-01
This paper presents a tool for simulating the traditional Sgraffito technique through digital image processing. The tool is based on a digital image pile and a set of attributes recovered from the image at the bottom of the pile using the Streit and Buchanan multiresolution image pyramid. This technique tries to preserve the principles of artistic composition by means of the attributes of color, luminance and shape recovered from the foundation image. A couple of simulated scratching objects establish how the recovered attributes have to be painted. Different attributes can be painted by using different scratching primitives. The resulting image will be a colorimetric composition reached from the image on the top of the pile, the color of the images revealed by scratching and the inner characteristics of each scratching primitive. The technique combines elements of image processing, art and computer graphics, allowing users to make their own free compositions and providing a means for the development of visual communication skills within the user-observer relationship. The technique enables the application of the given concepts in non-artistic fields with specific subject tools.
Mathematical modeling of DNA's transcription process for the cancer study
NASA Astrophysics Data System (ADS)
Morales-Peñaloza, A.; Meza-López, C. D.; Godina-Nava, J. J.
2012-10-01
Cancer is a phenomenon caused by an anomaly in the DNA transcription process; it is therefore necessary to know how such an anomaly is generated in order to implement alternative therapies to combat it. We propose to use mathematical modeling to treat the problem. A simulation of the transcription process is implemented, and the transport properties in the heterogeneous case are studied using nonlinear dynamics.
Photographic patterns in macular images: representation by a mathematical model.
Smith, R Theodore; Nagasaki, Takayuki; Sparrow, Janet R; Barbazetto, Irene; Koniarek, Jan P; Bickmann, Lee J
2004-01-01
Normal macular photographic patterns are geometrically described and mathematically modeled. Forty normal color fundus photographs were digitized. The green channel gray-level data were filtered and contrast enhanced, then analyzed for concentricity, convexity, and radial resolution. The foveal data for five images were fit with elliptic quadratic polynomials in two zones: a central ellipse and a surrounding annulus. The ability of the model to reconstruct the entire foveal data from selected pixel values was tested. The gray-level patterns were nested sets of concentric ellipses. Gray levels increased radially, with retinal vessels changing the patterns to star shaped in the peripheral fovea. The elliptic polynomial model could fit a high-resolution green channel foveal image with mean absolute errors of 6.1% of the gray-level range. Foveal images were reconstructed from small numbers of selected pixel values with mean errors of 7.2%. Digital analysis of normal fundus photographs shows finely resolved concentric elliptical foveal and star-shaped parafoveal patterns, which are consistent with anatomical structures. A two-zone elliptic quadratic polynomial model can approximate foveal data, and can also reconstruct it from small subsets, allowing improved macular image analysis.
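The fitting step described above reduces to linear least squares on the six coefficients of g(x, y) = ax^2 + by^2 + cxy + dx + ey + f. A minimal sketch with synthetic gray-level data (the coefficient values below are illustrative, not the paper's):

```python
import numpy as np

def fit_quadratic_surface(x, y, g):
    # Linear least squares on the six coefficients of the elliptic
    # quadratic polynomial g(x, y) = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f.
    A = np.column_stack([x**2, y**2, x * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, g, rcond=None)
    return coeffs

# Synthetic "foveal" gray levels: concentric ellipses (illustrative values)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = rng.uniform(-1, 1, 500)
true = np.array([2.0, 1.0, 0.0, 0.0, 0.0, 10.0])
g = true @ np.array([x**2, y**2, x * y, x, y, np.ones_like(x)])

est = fit_quadratic_surface(x, y, g)
print(np.round(est, 3))
```

The same fit, applied separately to a central ellipse and a surrounding annulus, gives the paper's two-zone model.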
Fuzzy image processing in sun sensor
NASA Technical Reports Server (NTRS)
Mobasser, S.; Liebe, C. C.; Howard, A.
2003-01-01
This paper describes how fuzzy image processing is implemented in the instrument. A comparison of the fuzzy image processing and a more conventional image processing algorithm is provided and shows that fuzzy image processing yields better accuracy than conventional image processing.
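The abstract does not spell out the algorithm; as a generic illustration of how fuzzy membership weighting can enter a sun-sensor centroid computation (threshold values here are hypothetical, not from the paper):

```python
import numpy as np

def fuzzy_centroid(row, low=50.0, high=200.0):
    # Ramp membership: pixels below `low` get grade 0 (certainly not sun),
    # above `high` grade 1 (certainly sun), linear in between. The
    # centroid is then the membership-weighted mean pixel position.
    mu = np.clip((row - low) / (high - low), 0.0, 1.0)
    return float(np.sum(mu * np.arange(len(row))) / np.sum(mu))

row = np.array([0, 0, 60, 255, 255, 60, 0, 0], dtype=float)
print(fuzzy_centroid(row))
```

A hard threshold would discard the partially lit pixels entirely; the fuzzy grades let them contribute in proportion to their brightness.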
Image processing using reconfigurable FPGAs
NASA Astrophysics Data System (ADS)
Ferguson, Lee
1996-10-01
The use of reconfigurable field-programmable gate arrays (FPGAs) for imaging applications shows considerable promise to fill the gap that often occurs when digital signal processor chips fail to meet performance specifications. Single-chip DSPs do not have the overall performance to meet the needs of many imaging applications, particularly in real-time designs. Using multiple DSPs to boost performance often presents major design challenges in maintaining data alignment and process synchronization. These challenges can impose serious cost, power consumption and board space penalties. Image processing requires manipulating massive amounts of data at high speed. Although DSP chips can process data at high speeds, their architectures can inhibit overall system performance in real-time imaging. The rate of operations can be increased when they are performed in dedicated hardware, such as special-purpose imaging devices and FPGAs, which provide the horsepower necessary to implement real-time image processing products successfully and cost-effectively. For many fixed applications, non-SRAM-based (antifuse or flash-based) FPGAs provide the raw speed to accomplish standard high-speed functions. However, in applications where algorithms are continuously changing and compute operations must be modified, only SRAM-based FPGAs give enough flexibility. The addition of reconfigurable FPGAs as a flexible hardware facility enables DSP chips to perform optimally. The benefits primarily stem from optimizing the hardware for the algorithms or the use of reconfigurable hardware to enhance the product architecture. And with SRAM-based FPGAs that are capable of partial dynamic reconfiguration, such as the Cache-Logic FPGAs from Atmel, continuous modification of data and logic is not only possible, it is practical as well. First we review the particular demands of image processing. Then we present various applications and discuss strategies for exploiting the capabilities of
Integrating image processing in PACS.
Faggioni, Lorenzo; Neri, Emanuele; Cerri, Francesca; Turini, Francesca; Bartolozzi, Carlo
2011-05-01
Integration of RIS and PACS services into a single solution has become a widespread reality in daily radiological practice, allowing substantial acceleration of workflow with greater ease of work compared with older generation film-based radiological activity. In particular, the fast and spectacular recent evolution of digital radiology (with special reference to cross-sectional imaging modalities, such as CT and MRI) has been paralleled by the development of integrated RIS--PACS systems with advanced image processing tools (either two- and/or three-dimensional) that were an exclusive task of costly dedicated workstations until a few years ago. This new scenario is likely to further improve productivity in the radiology department with reduction of the time needed for image interpretation and reporting, as well as to cut costs for the purchase of dedicated standalone image processing workstations. In this paper, a general description of typical integrated RIS--PACS architecture with image processing capabilities will be provided, and the main available image processing tools will be illustrated.
Enhanced imaging process for xeroradiography
NASA Astrophysics Data System (ADS)
Fender, William D.; Zanrosso, Eddie M.
1993-09-01
An enhanced mammographic imaging process has been developed which is based on the conventional powder-toner selenium technology used in the Xerox 125/126 x-ray imaging system. The process is derived from improvements in the amorphous selenium x-ray photoconductor, the blue powder toner and the aerosol powder dispersion process. Comparisons of image quality and x-ray dose using the Xerox aluminum-wedge breast phantom and the Radiation Measurements Model 152D breast phantom have been made between the new Enhanced Process, the standard Xerox 125/126 System and screen-film at mammographic x-ray exposure parameters typical for each modality. When comparing the Enhanced Xeromammographic Process with the standard 125/126 System, a distinct advantage is seen for the Enhanced Process: equivalent mass detection and superior fiber and speck detection. The broader imaging latitude of enhanced and standard Xeroradiography, in comparison to film, is illustrated in images made using the aluminum-wedge breast phantom.
Image Processing Language. Phase 2.
1988-11-01
knowledge engineering of coherent collections of methodological tools as they appear in the literature, and the implementation of expert knowledge in... knowledge representation becomes even more desirable. The role of morphology (Reference 30) as a knowledge formalization tool is another area which is... sets of image processing algorithms. These analyses are to be carried out in several modes including a complete translation to image algebra machine
Digital processing of radiographic images
NASA Technical Reports Server (NTRS)
Bond, A. D.; Ramapriyan, H. K.
1973-01-01
Some techniques, and the software documentation for the digital enhancement of radiographs, are presented. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of format of data from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both the frequency and the spatial domain approaches are presented for the design and implementation of the image processing operation. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
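The speed claim for recursive filters can be made concrete with a toy first-order low-pass (our own example, not one of the paper's matched filters): the recursive form costs one multiply-accumulate per sample, while the equivalent nonrecursive convolution costs O(N x taps) directly, or O(N log N) via the FFT.

```python
import numpy as np

def recursive_lowpass(x, a=0.25):
    # O(N): y[n] = a*x[n] + (1 - a)*y[n-1], one multiply-accumulate per sample
    y = np.empty(len(x))
    acc = 0.0
    for n, v in enumerate(x):
        acc = a * v + (1.0 - a) * acc
        y[n] = acc
    return y

def nonrecursive_lowpass(x, a=0.25, taps=200):
    # O(N*taps): direct convolution with the (truncated) impulse response
    h = a * (1.0 - a) ** np.arange(taps)
    return np.convolve(x, h)[:len(x)]

x = np.random.default_rng(1).normal(size=1000)
diff = np.max(np.abs(recursive_lowpass(x) - nonrecursive_lowpass(x)))
print(diff)   # the two implementations agree to machine precision
```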
On the mathematical modeling of wound healing angiogenesis in skin as a reaction-transport process
Flegg, Jennifer A.; Menon, Shakti N.; Maini, Philip K.; McElwain, D. L. Sean
2015-01-01
Over the last 30 years, numerous research groups have attempted to provide mathematical descriptions of the skin wound healing process. The development of theoretical models of the interlinked processes that underlie the healing mechanism has yielded considerable insight into aspects of this critical phenomenon that remain difficult to investigate empirically. In particular, the mathematical modeling of angiogenesis, i.e., capillary sprout growth, has offered new paradigms for the understanding of this highly complex and crucial step in the healing pathway. With the recent advances in imaging and cell tracking, the time is now ripe for an appraisal of the utility and importance of mathematical modeling in wound healing angiogenesis research. The purpose of this review is to pedagogically elucidate the conceptual principles that have underpinned the development of mathematical descriptions of wound healing angiogenesis, specifically those that have utilized a continuum reaction-transport framework, and highlight the contribution that such models have made toward the advancement of research in this field. We aim to draw attention to the common assumptions made when developing models of this nature, thereby bringing into focus the advantages and limitations of this approach. A deeper integration of mathematical modeling techniques into the practice of wound healing angiogenesis research promises new perspectives for advancing our knowledge in this area. To this end we detail several open problems related to the understanding of wound healing angiogenesis, and outline how these issues could be addressed through closer cross-disciplinary collaboration. PMID:26483695
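A minimal instance of the continuum reaction-transport framework discussed in the review is a Fisher-type equation, u_t = D u_xx + r u(1 - u), for a capillary-tip density u. The sketch below (a toy discretization with illustrative parameters, not a model from the review) integrates it with an explicit scheme and zero-flux boundaries:

```python
import numpy as np

def simulate(n=200, steps=4000, D=0.1, r=1.0, dx=1.0, dt=0.1):
    # u_t = D u_xx + r u (1 - u): diffusion plus logistic growth.
    # Parameters chosen so the explicit scheme is stable (dt*D/dx^2 = 0.01).
    u = np.zeros(n)
    u[:10] = 1.0                        # sprouts start at the wound edge
    for _ in range(steps):
        lap = np.empty_like(u)
        lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
        lap[0] = u[1] - u[0]            # zero-flux (Neumann) boundaries
        lap[-1] = u[-2] - u[-1]
        u = u + dt * (D * lap / dx**2 + r * u * (1.0 - u))
    return u

u = simulate()
print(u[0], u[-1])   # the travelling front has invaded the whole domain
```

The travelling-wave behaviour of this equation is the simplest mathematical picture of capillary sprouts advancing into the wound space.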
Image processing of galaxy photographs
NASA Technical Reports Server (NTRS)
Arp, H.; Lorre, J.
1976-01-01
New computer techniques for analyzing and processing photographic images of galaxies are presented, with interesting scientific findings gleaned from the processed photographic data. Discovery and enhancement of very faint and low-contrast nebulous features, improved resolution of near-limit detail in nebulous and stellar images, and relative colors of a group of nebulosities in the field are attained by the methods. Digital algorithms, nonlinear pattern-recognition filters, linear convolution filters, plate averaging and contrast enhancement techniques, and an atmospheric deconvolution technique are described. New detail is revealed in images of NGC 7331, Stephan's Quintet, Seyfert's Sextet, and the jet in M87, via processes of addition of plates, star removal, contrast enhancement, standard deviation filtering, and computer ratioing to bring out qualitative color differences.
FITS Liberator: Image processing software
NASA Astrophysics Data System (ADS)
Lindberg Christensen, Lars; Nielsen, Lars Holm; Nielsen, Kaspar K.; Johansen, Teis; Hurt, Robert; de Martin, David
2012-06-01
The ESA/ESO/NASA FITS Liberator makes it possible to process and edit astronomical science data in the FITS format to produce stunning images of the universe. Formerly a plugin for Adobe Photoshop, the current version of FITS Liberator is a stand-alone application and no longer requires Photoshop. This image processing software makes it possible to create color images using raw observations from a range of telescopes; the FITS Liberator continues to support the FITS and PDS formats, preferred by astronomers and planetary scientists respectively, which enables data to be processed from a wide range of telescopes and planetary probes, including ESO's Very Large Telescope, the NASA/ESA Hubble Space Telescope, NASA's Spitzer Space Telescope, ESA's XMM-Newton Telescope and Cassini-Huygens or Mars Reconnaissance Orbiter.
Fingerprint recognition using image processing
NASA Astrophysics Data System (ADS)
Dholay, Surekha; Mishra, Akassh A.
2011-06-01
Fingerprint recognition is concerned with the difficult task of efficiently matching the image of a person's fingerprint against the fingerprints present in a database. It is used in forensic science to help identify criminals, and in the authentication of individuals, since the fingerprint is unique to each person. The present paper describes fingerprint recognition methods using various edge detection techniques, and shows how to detect a fingerprint correctly from camera images. The method does not require a special device; a simple camera suffices, so the technique can also be used with a simple camera mobile phone. The factors affecting the process include poor illumination, noise, viewpoint dependence, climate factors and imaging conditions, so various image enhancement techniques must be applied to increase image quality and remove noise. The paper describes the technique of applying contour tracking to the fingerprint image, then edge detection on the contour, and finally matching the edges inside the contour.
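The edge detection step described above can be sketched with a plain Sobel operator (a generic choice of edge detector, not necessarily the one the paper uses), here in NumPy only:

```python
import numpy as np

def sobel_magnitude(img):
    # 3x3 Sobel kernels; correlation computed over the valid interior region
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    H, W = img.shape
    gx = np.zeros((H - 2, W - 2))
    gy = np.zeros((H - 2, W - 2))
    for i in range(3):
        for j in range(3):
            patch = img[i:i + H - 2, j:j + W - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)      # gradient magnitude

img = np.zeros((10, 10))
img[:, 5:] = 1.0                 # vertical step edge between columns 4 and 5
mag = sobel_magnitude(img)
print(mag[4])                    # response peaks at the edge columns
```

On a real fingerprint image the ridge boundaries produce the strong responses that the contour-matching stage then works with.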
Conceptions and Images of Mathematics Professors on Teaching Mathematics in School.
ERIC Educational Resources Information Center
Pehkonen, Erkki
1999-01-01
Clarifies what kind of mathematical beliefs are conveyed to student teachers during their studies. Interviews mathematics professors (n=7) from five Finnish universities who were responsible for mathematics teacher education. Professors estimated that teachers' basic knowledge was poor and old-fashioned, requiring improvement, and they emphasized…
Linear algebra and image processing
NASA Astrophysics Data System (ADS)
Allali, Mohamed
2010-09-01
We use the computing technology digital image processing (DIP) to enhance the teaching of linear algebra so as to make the course more visual and interesting. Certainly, this visual approach by using technology to link linear algebra to DIP is interesting and unexpected to both students as well as many faculty.
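A classroom-style exercise in the article's spirit (the specific example is ours, not the author's): treating an image as a matrix makes the SVD's best rank-k approximation, and hence image compression, a natural bridge between linear algebra and DIP.

```python
import numpy as np

# An "image" here is a synthetic rank-8 matrix (our stand-in for a real
# photograph) so that the rank-k behaviour is easy to verify exactly.
rng = np.random.default_rng(0)
img = rng.normal(size=(64, 8)) @ rng.normal(size=(8, 64))

U, s, Vt = np.linalg.svd(img, full_matrices=False)
rank4 = (U[:, :4] * s[:4]) @ Vt[:4]    # best rank-4 approximation (Eckart-Young)
rank8 = (U[:, :8] * s[:8]) @ Vt[:8]    # eight terms recover img exactly
print(np.linalg.norm(img - rank4), np.linalg.norm(img - rank8))
```

With a real grayscale photograph the same few lines show visually how detail returns as k grows, which is exactly the kind of linkage between theory and pictures the course exploits.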
Concept Learning through Image Processing.
ERIC Educational Resources Information Center
Cifuentes, Lauren; Yi-Chuan, Jane Hsieh
This study explored computer-based image processing as a study strategy for middle school students' science concept learning. Specifically, the research examined the effects of computer graphics generation on science concept learning and the impact of using computer graphics to show interrelationships among concepts during study time. The 87…
Linear Algebra and Image Processing
ERIC Educational Resources Information Center
Allali, Mohamed
2010-01-01
We use the computing technology digital image processing (DIP) to enhance the teaching of linear algebra so as to make the course more visual and interesting. Certainly, this visual approach by using technology to link linear algebra to DIP is interesting and unexpected to both students as well as many faculty. (Contains 2 tables and 11 figures.)
Mathematical Modelling of Bacterial Populations in Bio-remediation Processes
NASA Astrophysics Data System (ADS)
Vasiliadou, Ioanna A.; Vayenas, Dimitris V.; Chrysikopoulos, Constantinos V.
2011-09-01
An understanding of bacterial behaviour concerns many field applications, such as the enhancement of water, wastewater and subsurface bio-remediation, the prevention of environmental pollution and the protection of human health. Numerous microorganisms have been identified to be able to degrade chemical pollutants, thus, a variety of bacteria are known that can be used in bio-remediation processes. In this study the development of mathematical models capable of describing bacterial behaviour considered in bio-augmentation plans, such as bacterial growth, consumption of nutrients, removal of pollutants, bacterial transport and attachment in porous media, is presented. The mathematical models may be used as a guide in designing and assessing the conditions under which areas contaminated with pollutants can be better remediated.
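One standard building block for the bacterial growth and nutrient consumption mentioned above is Monod kinetics. The sketch below (parameter values are illustrative assumptions, not from the study) integrates a batch culture with forward Euler:

```python
def monod_batch(X0=0.01, S0=1.0, mu_max=0.5, Ks=0.1, Y=0.4, dt=0.01, T=40.0):
    # Batch growth: dX/dt = mu(S) X, dS/dt = -mu(S) X / Y,
    # with Monod specific growth rate mu(S) = mu_max * S / (Ks + S).
    X, S = X0, S0
    for _ in range(int(T / dt)):
        mu = mu_max * S / (Ks + S)
        dX = mu * X
        dS = -dX / Y                 # yield Y couples substrate to biomass
        X += dX * dt
        S = max(S + dS * dt, 0.0)
    return X, S

X, S = monod_batch()
print(round(X, 3), S)   # X approaches X0 + Y*S0 = 0.41 as S is exhausted
```

Coupling such growth terms to advection-dispersion and attachment terms gives the transport models used in bio-augmentation design.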
The Mathematics of Medical Imaging in the Classroom.
ERIC Educational Resources Information Center
Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.
2002-01-01
Presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty. Reviews clinical medical practice and theoretical and empirical literature in mathematics education and radiology to develop and pilot model integrative classroom topics and activities. Suggests mathematical applications in numeration and…
New optical scheme for parallel processing of 1D gray images
NASA Astrophysics Data System (ADS)
Huang, Guoliang; Jin, Guofan; Wu, Minxian; Yan, Yingbai
1994-06-01
Based on mathematical morphology and a digital umbra shading and shadowing algorithm, a new scheme for realizing the fundamental morphological operations on one-dimensional gray images is proposed. The mathematical formulae for the parallel processing of 1D gray images are summarized, and some important conclusions on extending morphological processing from binary images to gray images are obtained. The advantages of this scheme are its simple structure, high gray-level resolution, and good parallelism. It can greatly raise the speed of morphological processing of gray images and obtain more accurate results.
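For a flat structuring element, the fundamental gray-scale operations realized by the scheme reduce to running maxima (dilation) and minima (erosion) over a window. A minimal 1-D software sketch (our own illustration, not the optical implementation):

```python
import numpy as np

def gray_dilate(f, size=3):
    # Flat gray-scale dilation: running maximum over the window
    pad = size // 2
    fp = np.pad(f, pad, mode='edge')
    return np.array([fp[i:i + size].max() for i in range(len(f))])

def gray_erode(f, size=3):
    # Flat gray-scale erosion: running minimum over the window
    pad = size // 2
    fp = np.pad(f, pad, mode='edge')
    return np.array([fp[i:i + size].min() for i in range(len(f))])

f = np.array([0, 1, 5, 1, 0, 2, 2, 0])
print(gray_dilate(f))   # [1 5 5 5 2 2 2 2]
print(gray_erode(f))    # [0 0 1 0 0 0 0 0]
```

Opening and closing follow by composing the two, which is what the umbra construction generalizes from binary to gray images.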
NASA Astrophysics Data System (ADS)
Sulentic, J. W.
1984-05-01
Digital technology has been used to improve enhancement techniques in astronomical image processing. Continuous tone variations in photographs are assigned density number (DN) values which are arranged in an array. DN locations are processed by computer and turned into pixels which form a reconstruction of the original scene on a television monitor. Digitized data can be manipulated to enhance contrast and filter out gross patterns of light and dark which obscure small-scale features. Separate black-and-white frames exposed at different wavelengths can be digitized and processed individually, then recombined to produce a final image in color. Several examples of the use of the technique are provided, including photographs of the spiral galaxy M33; four galaxies in Coma Berenices (NGC 4169, 4173, 4174, and 4175); and Stephan's Quintet.
ImageJ: Image processing and analysis in Java
NASA Astrophysics Data System (ADS)
Rasband, W. S.
2012-06-01
ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.
Image post-processing in dental practice.
Gormez, Ozlem; Yilmaz, Hasan Huseyin
2009-10-01
Image post-processing of dental digital radiographs, a function commonly used in dental practice, is presented in this article. Digital radiography has been available in dentistry for more than 25 years and its use by dental practitioners is steadily increasing. Digital acquisition of radiographs enables computer-based image post-processing to enhance image quality and increase the accuracy of interpretation. Image post-processing applications can easily be practiced in the dental office with a computer and image processing programs. In this article, image post-processing operations such as image restoration, image enhancement, image analysis, image synthesis, and image compression, and their diagnostic efficacy, are described. In addition, this article provides general dental practitioners with a broad overview of the benefits of the different image post-processing operations to help them understand the role that the technology can play in their practices.
Mathematical Formulation Requirements and Specifications for the Process Models
Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.
2010-11-01
The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally
Applications in Digital Image Processing
ERIC Educational Resources Information Center
Silverman, Jason; Rosen, Gail L.; Essinger, Steve
2013-01-01
Students are immersed in a mathematically intensive, technological world. They engage daily with iPods, HDTVs, and smartphones--technological devices that rely on sophisticated but accessible mathematical ideas. In this article, the authors provide an overview of four lab-type activities that have been used successfully in high school mathematics…
Image processing software for imaging spectrometry
NASA Technical Reports Server (NTRS)
Mazer, Alan S.; Martin, Miki; Lee, Meemong; Solomon, Jerry E.
1988-01-01
The paper presents a software system, Spectral Analysis Manager (SPAM), which has been specifically designed and implemented to provide the exploratory analysis tools necessary for imaging spectrometer data, using only modest computational resources. The basic design objectives are described as well as the major algorithms designed or adapted for high-dimensional images. Included in a discussion of system implementation are interactive data display, statistical analysis, image segmentation and spectral matching, and mixture analysis.
The image of mathematics held by Irish post-primary students
NASA Astrophysics Data System (ADS)
Lane, Ciara; Stynes, Martin; O'Donoghue, John
2014-08-01
The image of mathematics held by Irish post-primary students was examined and a model for the image found was constructed. Initially, a definition for 'image of mathematics' was adopted with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. Research focused on students studying ordinary level mathematics for the Irish Leaving Certificate examination - the final examination for students in second-level or post-primary education. Students were aged between 15 and 18 years. A questionnaire was constructed with both quantitative and qualitative aspects. The questionnaire survey was completed by 356 post-primary students. Responses were analysed quantitatively using Statistical Package for the Social Sciences (SPSS) and qualitatively using the constant comparative method of analysis and by reviewing individual responses. Findings provide an insight into Irish post-primary students' images of mathematics and offer a means for constructing a theoretical model of image of mathematics which could be beneficial for future research.
The Mathematics of Medical Imaging in the Classroom
ERIC Educational Resources Information Center
Funkhouser, Charles P.; Jafari, Farhad; Eubank, William B.
2002-01-01
The article presents an integrated exposition of aspects of secondary school mathematics and a medical science specialty together with related classroom activities. Clinical medical practice and theoretical and empirical literature in mathematics education and radiology were reviewed to develop and pilot model integrative classroom topics and…
Biomedical signal and image processing.
Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro
2011-01-01
Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are more completely dealt with in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first one more oriented to physiological issues and how to model them and the second one more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. Just to give simple examples, topics such as brain-machine or brain-computer interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at the building of advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring, among others, do require an intelligent fusion of modeling and signal processing competences that are certainly peculiar to our discipline of BME.
Analysis of electronic autoradiographs by mathematical post-processing
NASA Astrophysics Data System (ADS)
Ghosh, S.; Baier, M.; Schütz, J.; Schneider, F.; Scherer, U. W.
2016-02-01
Autoradiography is a well-established method of nuclear imaging. When different radionuclides are present simultaneously, additional processing is needed to distinguish distributions of radionuclides. In this work, a method is presented where aluminium absorbers of different thickness are used to produce images with different cut-off energies. By subtracting images pixel-by-pixel one can generate images representing certain ranges of β-particle energies. The method is applied to the measurement of irradiated reactor graphite samples containing several radionuclides to determine the spatial distribution of these radionuclides within pre-defined energy windows. The process was repeated under fixed parameters after thermal treatment of the samples. The greyscale images of the distribution after treatment were subtracted from the corresponding pre-treatment images. Significant changes in the intensity and distribution of radionuclides could be observed in some samples. Due to the thermal treatment parameters the most significant differences were observed in the 3H and 14C inventory and distribution.
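The pixel-by-pixel subtraction described in the abstract can be sketched as follows; the function name and pixel values are illustrative, not from the paper.

```python
def energy_window_image(img_thin, img_thick):
    """Pixel-by-pixel subtraction of two absorber images.

    img_thin:  image behind the thinner absorber (lower cut-off energy)
    img_thick: image behind the thicker absorber (higher cut-off energy)
    The difference retains only counts from beta particles whose energy
    falls between the two cut-offs; negative values (noise) are clipped.
    """
    return [[max(a - b, 0) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_thin, img_thick)]

thin  = [[10, 7], [4, 9]]
thick = [[ 6, 2], [5, 3]]
print(energy_window_image(thin, thick))  # -> [[4, 5], [0, 6]]
```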
Image processing technique for arbitrary image positioning in holographic stereogram
NASA Astrophysics Data System (ADS)
Kang, Der-Kuan; Yamaguchi, Masahiro; Honda, Toshio; Ohyama, Nagaaki
1990-12-01
In a one-step holographic stereogram, if the series of original images is used just as it is taken from perspective views, three-dimensional images are usually reconstructed behind the hologram plane. In order to enhance the sense of perspective of the reconstructed images and minimize blur in the portions of interest, we introduce an image processing technique for making a one-step flat-format holographic stereogram in which three-dimensional images can be observed at an arbitrary specified position. Experimental results show the effect of the image processing. Further, we show results of a medical application using this image processing.
Multispectral Image Processing for Plants
NASA Technical Reports Server (NTRS)
Miles, Gaines E.
1991-01-01
The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.
Framelet lifting in image processing
NASA Astrophysics Data System (ADS)
Lu, Da-Yong; Feng, Tie-Yong
2010-08-01
To obtain appropriate framelets in image processing, we often need to lift existing framelets. For this purpose the paper presents some methods which allow us to modify existing framelets or filters to construct new ones. The relationships of the matrices and their eigenvalues used in the lifting schemes show that the frame bounds of the lifted wavelet frames are optimal. Moreover, the examples given in Section 4 indicate that the lifted framelets can play the roles of operators such as the weighted average operator, the Sobel operator and the Laplacian operator, which are often used in edge detection and motion estimation applications.
Investigation of Prospective Primary Mathematics Teachers' Perceptions and Images for Quadrilaterals
ERIC Educational Resources Information Center
Turnuklu, Elif; Gundogdu Alayli, Funda; Akkas, Elif Nur
2013-01-01
The object of this study was to show how prospective elementary mathematics teachers define and classify the quadrilaterals and to find out their images. This research was a qualitative study. It was conducted with 36 prospective elementary mathematics teachers studying at 3rd and 4th years in an educational faculty. The data were collected by…
Investigation of Primary Mathematics Student Teachers' Concept Images: Cylinder and Cone
ERIC Educational Resources Information Center
Ertekin, Erhan; Yazici, Ersen; Delice, Ali
2014-01-01
The aim of the present study is to determine the influence of concept definitions of cylinder and cone on primary mathematics student teachers' construction of relevant concept images. The study had a relational survey design and the participants were 238 primary mathematics student teachers. Statistical analyses implied the following: mathematics…
Processing of medical images using Maple
NASA Astrophysics Data System (ADS)
Toro Betancur, V.
2013-05-01
Maple's Image Tools package was used to process medical images. The results showed clearer images and records of their intensities and entropy. The medical images of a rhinocerebral mucormycosis patient, who was not diagnosed early, were processed and analyzed using Maple's tools, which showed, more clearly, the affected parts in the perinasal cavities.
Proceedings of the NASA Symposium on Mathematical Pattern Recognition and Image Analysis
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.
1983-01-01
The application of mathematical and statistical analyses techniques to imagery obtained by remote sensors is described by Principal Investigators. Scene-to-map registration, geometric rectification, and image matching are among the pattern recognition aspects discussed.
Mathematical modelling of the composting process: a review.
Mason, I G
2006-01-01
In this paper mathematical models of the composting process are examined and their performance evaluated. Mathematical models of the composting process have been derived from both energy and mass balance considerations, with solutions typically derived in time, and in some cases, spatially. Both lumped and distributed parameter models have been reported, with lumped parameter models presently predominating in the literature. Biological energy production functions within the models included first-order, Monod-type or empirical expressions, and these have predicted volatile solids degradation, oxygen consumption or carbon dioxide production, with heat generation derived using heat quotient factors. Rate coefficient correction functions for temperature, moisture, oxygen and/or free air space have been incorporated in a number of the first-order and Monod-type expressions. The most successful models in predicting temperature profiles were those which incorporated either empirical kinetic expressions for volatile solids degradation or CO2 production, or which utilised a first-order model for volatile solids degradation, with empirical corrections for temperature and moisture variations. Models incorporating Monod-type kinetic expressions were less successful. No models were able to predict maximum, average and peak temperatures to within criteria of 5, 2 and 2 degrees C, respectively, or to predict the times to reach peak temperatures to within 8 h. Limitations included the modelling of forced aeration systems only and the generation of temperature validation data for relatively short time periods in relation to those used in full-scale composting practice. Moisture and solids profiles were well predicted by two models, but oxygen and carbon dioxide profiles were generally poorly modelled. Further research to obtain more extensive substrate degradation data, develop improved first-order biological heat production models, investigate mechanistically-based moisture
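The first-order kinetics with temperature correction discussed in the review can be sketched as below. The Haug-style correction coefficients are illustrative assumptions, not values taken from any specific model in the review.

```python
def rate_coefficient(k20, temp_c):
    """Temperature-corrected first-order rate coefficient (Haug-style
    form; the coefficients 1.066 and 1.21 are illustrative)."""
    return k20 * (1.066 ** (temp_c - 20) - 1.21 ** (temp_c - 60))

def degrade_vs(vs0, k20, temp_c, hours, dt=0.1):
    """Explicit-Euler integration of dVS/dt = -k(T) * VS."""
    k = rate_coefficient(k20, temp_c)
    vs = vs0
    t = 0.0
    while t < hours:
        vs -= k * vs * dt
        t += dt
    return vs

# Volatile solids degrade faster at 55 degC than at 30 degC.
print(degrade_vs(100.0, 0.01, 55, 48) < degrade_vs(100.0, 0.01, 30, 48))  # True
```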
Concurrent Image Processing Executive (CIPE)
NASA Technical Reports Server (NTRS)
Lee, Meemong; Cooper, Gregory T.; Groom, Steven L.; Mazer, Alan S.; Williams, Winifred I.
1988-01-01
The design and implementation of a Concurrent Image Processing Executive (CIPE), which is intended to become the support system software for a prototype high performance science analysis workstation, are discussed. The target machine for this software is a JPL/Caltech Mark IIIfp Hypercube hosted by either a MASSCOMP 5600 or a Sun-3 or Sun-4 workstation; however, the design will accommodate other concurrent machines of similar architecture, i.e., local memory, multiple-instruction-multiple-data (MIMD) machines. The CIPE system provides both a multimode user interface and an applications programmer interface, and has been designed around four loosely coupled modules: (1) user interface, (2) host-resident executive, (3) hypercube-resident executive, and (4) application functions. The loose coupling between modules allows modification of a particular module without significantly affecting the other modules in the system. In order to enhance hypercube memory utilization and to allow expansion of image processing capabilities, a specialized program management method, incremental loading, was devised. To minimize data transfer between host and hypercube, a data management method which distributes, redistributes, and tracks data set information was implemented.
Extraction of cross-sea bridges from GF-2 PMS satellite images using mathematical morphology
NASA Astrophysics Data System (ADS)
Chen, Chao; Sui, Xinxin; Zhen, Guangwei; Guo, Biyun; Chen, Xiaowei
2016-11-01
Cross-sea bridges are typical man-made targets in coastal scenes, and extracting them from remote sensing images is of practical significance. In this study, given the complexity of ground objects in GF-2 PMS satellite images, direction-augmented linear structuring elements were selected for the extraction of cross-sea bridges. First, the original image is pre-processed; second, the NDVI is calculated and water bodies are extracted; third, according to the orientation of the water bodies, appropriate direction-augmented linear structuring elements are selected, mathematical morphology operations are applied to the water bodies, and, assisted by prior knowledge of bridges, the bridge objects are extracted; finally, an experimental area is selected to verify the effectiveness and applicability of the method. Our work shows that direction-augmented linear structuring elements are effective and efficient for extracting bridges of different orientations in GF-2 PMS satellite images, and that the approach is important for expanding the application areas of domestic high-resolution remote sensing images.
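The NDVI-based water extraction step can be sketched as below; the threshold and band values are illustrative assumptions, and the subsequent directional morphology step is omitted.

```python
def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red)."""
    return [[(n - r) / (n + r) if (n + r) else 0.0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]

def water_mask(nir, red, threshold=0.0):
    """1 where a pixel is classed as water (low NDVI), else 0.
    The zero threshold is a common heuristic, not from the paper."""
    return [[1 if v <= threshold else 0 for v in row]
            for row in ndvi(nir, red)]

nir = [[0.1, 0.6], [0.1, 0.7]]
red = [[0.3, 0.2], [0.4, 0.1]]
print(water_mask(nir, red))  # -> [[1, 0], [1, 0]]
```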
Medical Image Segmentation using the HSI color space and Fuzzy Mathematical Morphology
NASA Astrophysics Data System (ADS)
Gasparri, J. P.; Bouchet, A.; Abras, G.; Ballarin, V.; Pastore, J. I.
2011-12-01
Diabetic retinopathy is the most common cause of blindness among the active population in developed countries. An early ophthalmologic examination followed by proper treatment can prevent blindness. The purpose of this work is to develop an automated method for segmenting the vasculature in retinal images, in order to assist the expert in following the evolution of a specific treatment or in the diagnosis of a potential pathology. Since the HSI space separates intensity from the intrinsic color information, its use is recommended for the digital processing of images affected by lighting changes, a characteristic of the images under study. The application of color filters artificially changes the tone of the blood vessels so that they are better distinguished from the background. This technique, combined with fuzzy mathematical morphology tools such as the top-hat transformation, produces images of the retina in which the vascular branches are markedly enhanced over the original. These images facilitate the visualization of the blood vessels by the specialist.
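The white top-hat transformation mentioned above is the image minus its morphological opening; it keeps bright structures narrower than the structuring element. A minimal crisp (non-fuzzy) 1-D sketch, with illustrative values:

```python
def erode(signal, width):
    r = width // 2
    n = len(signal)
    return [min(signal[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def dilate(signal, width):
    r = width // 2
    n = len(signal)
    return [max(signal[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def top_hat(signal, width):
    """White top-hat: original minus its morphological opening
    (opening = dilation of the erosion)."""
    opened = dilate(erode(signal, width), width)
    return [s - o for s, o in zip(signal, opened)]

# A narrow bright 'vessel' (value 9) on a flat background is isolated.
scanline = [5, 5, 5, 9, 5, 5, 5]
print(top_hat(scanline, 3))  # -> [0, 0, 0, 4, 0, 0, 0]
```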
ERIC Educational Resources Information Center
Kosko, Karl Wesley; Norton, Anderson
2012-01-01
The current body of literature suggests an interactive relationship between several of the process standards advocated by National Council of Teachers of Mathematics. Verbal and written mathematical communication has often been described as an alternative to typical mathematical representations (e.g., charts and graphs). Therefore, the…
Modelling Of Flotation Processes By Classical Mathematical Methods - A Review
NASA Astrophysics Data System (ADS)
Jovanović, Ivana; Miljanović, Igor
2015-12-01
Flotation process modelling is not a simple task, mostly because of the process complexity, i.e. the presence of a large number of variables that (to a lesser or a greater extent) affect the final outcome of the separation of mineral particles based on the differences in their surface properties. Attempts to develop a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper gives a review of published research directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis placed exclusively on the modelling of the flotation process itself, regardless of the model's application in a certain control system. In accordance with contemporary considerations, the models were classified as empirical, probabilistic, kinetic and population-balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
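Among the kinetic models the review covers, the classical first-order form is the simplest; a minimal sketch with illustrative parameter values:

```python
import math

def recovery(t, r_max, k):
    """Classical first-order flotation kinetics:
    R(t) = R_inf * (1 - exp(-k * t)), where R_inf is the ultimate
    recovery (%) and k the flotation rate constant (1/min)."""
    return r_max * (1.0 - math.exp(-k * t))

# Recovery rises toward its ultimate value R_inf as flotation time grows.
print(round(recovery(0.0, 90.0, 0.5), 1))   # -> 0.0
print(round(recovery(2.0, 90.0, 0.5), 1))   # -> 56.9
print(round(recovery(60.0, 90.0, 0.5), 1))  # -> 90.0
```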
ERIC Educational Resources Information Center
Greenberg, Richard
1998-01-01
Describes the Image Processing for Teaching (IPT) project, which provides digital image processing to excite students about science and mathematics as they use research-quality software on microcomputers. Provides information on IPT, whose dissemination components have included widespread teacher education and curriculum-based materials…
ERIC Educational Resources Information Center
Dede, Yuksel
2013-01-01
The purpose of this study was to explore the values underlying the decision-making processes in group studies for Turkish and German mathematics teachers. This study presented a small part of a wider study investigating German and Turkish mathematics teachers' and their students' values (Values in Mathematics Teaching in Turkey and Germany…
ERIC Educational Resources Information Center
Sedig, Kamran; Liang, Hai-Ning
2006-01-01
Computer-based mathematical cognitive tools (MCTs) are a category of external aids intended to support and enhance learning and cognitive processes of learners. MCTs often contain interactive visual mathematical representations (VMRs), where VMRs are graphical representations that encode properties and relationships of mathematical concepts. In…
ERIC Educational Resources Information Center
De Smedt, Bert; Gilmore, Camilla K.
2011-01-01
This study examined numerical magnitude processing in first graders with severe and mild forms of mathematical difficulties, children with mathematics learning disabilities (MLD) and children with low achievement (LA) in mathematics, respectively. In total, 20 children with MLD, 21 children with LA, and 41 regular achievers completed a numerical…
A Low-Achiever's Learning Process in Mathematics: Shirley's Fraction Learning
ERIC Educational Resources Information Center
Keijzer, Ronald; Terwel, Jan
2004-01-01
Research in mathematics education offers a considerable body of evidence that both high and low-achievers can benefit from learning mathematics in meaningful contexts. This case study offers an in-depth analysis of the learning process of a low-achieving student in the context of Realistic Mathematics Education (RME). The focus is on the use of…
Mathematical modeling and fluorescence imaging to study the Ca2+ turnover in skinned muscle fibers.
Uttenweiler, D; Weber, C; Fink, R H
1998-01-01
A mathematical model was developed for the simulation of the spatial and temporal time course of Ca2+ ion movement in caffeine-induced calcium transients of chemically skinned muscle fiber preparations. Our model assumes cylindrical symmetry and quantifies the radial profile of Ca2+ ion concentration by solving the diffusion equations for Ca2+ ions and various mobile buffers, and the rate equations for Ca2+ buffering (mobile and immobile buffers) and for the release and reuptake of Ca2+ ions by the sarcoplasmic reticulum (SR), with a finite-difference algorithm. The results of the model are compared with caffeine-induced spatial Ca2+ transients obtained from saponin skinned murine fast-twitch fibers by fluorescence photometry and imaging measurements using the ratiometric dye Fura-2. The combination of mathematical modeling and digital image analysis provides a tool for the quantitative description of the total Ca2+ turnover and the different contributions of all interacting processes to the overall Ca2+ transient in skinned muscle fibers. It should thereby strongly improve the use of skinned fibers as quantitative assay systems for many parameters of the SR and the contractile apparatus, also helping to bridge the gap to the intact muscle fiber. PMID:9545029
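The core of such a model is a finite-difference scheme for radial diffusion under cylindrical symmetry. The sketch below shows only that diffusion step with illustrative, dimensionless parameters; the buffering and SR release/reuptake terms of the full model are omitted.

```python
def diffuse_radial(c, d_coef, dr, dt, steps):
    """Explicit finite-difference integration of radial diffusion with
    cylindrical symmetry: dC/dt = D * (d2C/dr2 + (1/r) * dC/dr).
    c[0] lies on the fibre axis (symmetry); c[-1] is held fixed (bath)."""
    n = len(c)
    for _ in range(steps):
        new = c[:]
        # Symmetry at r = 0: dC/dr = 0, so the Laplacian ~ 4*(c1 - c0)/dr^2.
        new[0] = c[0] + dt * d_coef * 4.0 * (c[1] - c[0]) / dr**2
        for i in range(1, n - 1):
            r = i * dr
            d2 = (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dr**2
            d1 = (c[i + 1] - c[i - 1]) / (2.0 * dr)
            new[i] = c[i] + dt * d_coef * (d2 + d1 / r)
        c = new
    return c

# A central Ca2+ peak spreads outward and flattens over time.
profile = [1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
out = diffuse_radial(profile, d_coef=0.1, dr=0.1, dt=0.01, steps=200)
print(out[0] < 1.0 and out[3] > 0.0)  # True
```

The time step satisfies the explicit-scheme stability bound (D·dt/dr² = 0.1 ≤ 0.25), a standard requirement for this kind of integration.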
ERIC Educational Resources Information Center
Zeytun, Aysel Sen; Cetinkaya, Bulent; Erbas, Ayhan Kursat
2017-01-01
This paper investigates how prospective teachers develop mathematical models while they engage in modeling tasks. The study was conducted in an undergraduate elective course aiming to improve prospective teachers' mathematical modeling abilities, while enhancing their pedagogical knowledge for the integrating of modeling tasks into their future…
ERIC Educational Resources Information Center
Baki, Mujgan
2015-01-01
This study aims to explore the role of lesson analysis in the development of mathematical knowledge for teaching. For this purpose, a graduate course based on lesson analysis was designed for novice mathematics teachers. Throughout the course the teachers watched videos of group-mates and discussed the issues they identified in terms of…
Eliminating "Hotspots" in Digital Image Processing
NASA Technical Reports Server (NTRS)
Salomon, P. M.
1984-01-01
Signals from defective picture elements rejected. Image processing program for use with charge-coupled device (CCD) or other mosaic imager augmented with algorithm that compensates for common type of electronic defect. Algorithm prevents false interpretation of "hotspots". Used for robotics, image enhancement, image analysis and digital television.
Halftoning and Image Processing Algorithms
1999-02-01
screening techniques with the quality advantages of error diffusion in the halftoning of color maps, and on color image enhancement for halftone ...image quality. Our goals in this research were to advance the understanding in image science for our new halftone algorithm and to contribute to...image retrieval and noise theory for such imagery. In the field of color halftone printing, research was conducted on deriving a theoretical model of our
Combining image-processing and image compression schemes
NASA Technical Reports Server (NTRS)
Greenspan, H.; Lee, M.-C.
1995-01-01
An investigation into the combining of image-processing schemes, specifically an image enhancement scheme, with existing compression schemes is discussed. Results are presented on the pyramid coding scheme, the subband coding scheme, and progressive transmission. Encouraging results are demonstrated for the combination of image enhancement and pyramid image coding schemes, especially at low bit rates. Adding the enhancement scheme to progressive image transmission allows enhanced visual perception at low resolutions. In addition, further processing of the transmitted images, such as edge detection schemes, can gain from the added image resolution via the enhancement.
Precision processing of earth image data
NASA Technical Reports Server (NTRS)
Bernstein, R.; Stierhoff, G. C.
1976-01-01
Precise corrections of Landsat data are useful for generating land-use maps, detecting various crops and determining their acreage, and detecting changes. The paper discusses computer processing and visualization techniques for Landsat data so that users can get more information from the imagery. The elementary unit of data in each band of each scene is the integrated value of intensity of reflected light detected in the field of view by each sensor. To develop the basic mathematical approach for precision correction of the data, differences between positions of ground control points on the reference map and the observed control points in the scene are used to evaluate the coefficients of cubic time functions of roll, pitch, and yaw, and a linear time function of altitude deviation from normal height above local earth's surface. The resultant equation, termed a mapping function, corrects the warped data image into one that approximates the reference map. Applications are discussed relative to shade prints, extraction of road features, and atlas of cities.
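The mapping-function idea above, fitting a coordinate transform from differences between reference-map and observed control points, can be sketched with an affine stand-in for the paper's cubic time functions; the control-point values and helper names are illustrative.

```python
def det3(m):
    """Determinant of a 3x3 matrix (cofactor expansion)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_affine(scene_pts, map_pts):
    """Fit x' = a*x + b*y + c and y' = d*x + e*y + f from three
    control-point pairs, via Cramer's rule (an affine stand-in for
    the cubic mapping functions described in the paper)."""
    A = [[x, y, 1.0] for x, y in scene_pts]
    det_a = det3(A)
    coeffs = []
    for k in (0, 1):  # k=0 fits the x' equation, k=1 the y' equation
        rhs = [p[k] for p in map_pts]
        row = []
        for col in range(3):
            M = [r[:] for r in A]
            for i in range(3):
                M[i][col] = rhs[i]
            row.append(det3(M) / det_a)
        coeffs.append(row)
    return coeffs

scene = [(0, 0), (1, 0), (0, 1)]        # observed control points
ref   = [(10, 20), (12, 20), (10, 23)]  # reference-map positions
print(fit_affine(scene, ref))  # -> [[2.0, 0.0, 10.0], [0.0, 3.0, 20.0]]
```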
NASA Astrophysics Data System (ADS)
Frollo, Ivan; Krafčík, Andrej; Andris, Peter; Přibil, Jiří; Dermek, Tomáš
2015-12-01
Circular samples are frequent objects of in-vitro investigation using imaging methods based on magnetic resonance principles. The goal of our investigation is the imaging of thin planar layers without using the slice-selection procedure (i.e., pure 2D imaging), as well as the imaging of selected layers of samples in circular vessels or Eppendorf tubes, which necessarily requires slice selection. Although standard imaging methods were used, some specific issues arise when mathematical modelling of these procedures is introduced. In this paper several mathematical models are presented and compared with real experimental results. Circular magnetic samples were placed in the homogeneous magnetic field of a low-field imager based on nuclear magnetic resonance. For experimental verification a 0.178 Tesla ESAOTE Opera MRI imager was used.
Investigating Preservice Mathematics Teachers' Manipulative Material Design Processes
ERIC Educational Resources Information Center
Sandir, Hakan
2016-01-01
Students use concrete manipulatives to form an imperative affiliation between conceptual and procedural knowledge (Balka, 1993). Hence, it is necessary to design specific mathematics manipulatives that focus on different mathematical concepts. Preservice teachers need to know how to make and use manipulatives that stimulate students' thinking as…
The Importance of Dialogic Processes to Conceptual Development in Mathematics
ERIC Educational Resources Information Center
Kazak, Sibel; Wegerif, Rupert; Fujita, Taro
2015-01-01
We argue that dialogic theory, inspired by the Russian scholar Mikhail Bakhtin, has a distinct contribution to the analysis of the genesis of understanding in the mathematics classroom. We begin by contrasting dialogic theory to other leading theoretical approaches to understanding conceptual development in mathematics influenced by Jean Piaget…
Does Early Mathematics Intervention Change the Processes Underlying Children's Learning?
ERIC Educational Resources Information Center
Watts, Tyler W.; Clements, Douglas H.; Sarama, Julie; Wolfe, Christopher B.; Spitler, Mary Elaine; Bailey, Drew H.
2017-01-01
Early educational intervention effects typically fade in the years following treatment, and few studies have investigated why achievement impacts diminish over time. The current study tested the effects of a preschool mathematics intervention on two aspects of children's mathematical development. We tested for separate effects of the intervention…
ERIC Educational Resources Information Center
Costellano, Janet; Scaffa, Matthew
The product of a Special Studies Institute, this teacher developed resource guide for the emotionally handicapped (K-6) presents 37 activities designed to develop mathematics concepts and skills utilizing the urban out-of-doors. Focus is on experiencing math models, patterns, problems, and relationships found in an urban environment. Activities…
Prabhakaran, V; Rypma, B; Gabrieli, J D
2001-01-01
Brain activation was examined using functional magnetic resonance imaging during mathematical problem solving in 7 young healthy participants. Problems were selected from the Necessary Arithmetic Operations Test (NAOT; R. B. Ekstrom, J. W. French, H. H. Harman, & D. Dermen, 1976). Participants solved 3 types of problems: 2-operation problems requiring mathematical reasoning and text processing, 1-operation problems requiring text processing but minimal mathematical reasoning, and 0-operation problems requiring minimal text processing and controlling sensorimotor demands of the NAOT problems. Two-operation problems yielded major activations in bilateral frontal regions similar to those found in other problem-solving tasks, indicating that the processes mediated by these regions subserve many forms of reasoning. Findings suggest a dissociation in mathematical problem solving between reasoning, mediated by frontal cortex, and text processing, mediated by temporal cortex.
The effect of mathematics anxiety on the processing of numerical magnitude.
Maloney, Erin A; Ansari, Daniel; Fugelsang, Jonathan A
2011-01-01
In an effort to understand the origins of mathematics anxiety, we investigated the processing of symbolic magnitude by high mathematics-anxious (HMA) and low mathematics-anxious (LMA) individuals by examining their performance on two variants of the symbolic numerical comparison task. In two experiments, a numerical distance by mathematics anxiety (MA) interaction was obtained, demonstrating that the effect of numerical distance on response times was larger for HMA than for LMA individuals. These data support the claim that HMA individuals have less precise representations of numerical magnitude than their LMA peers, suggesting that MA is associated with low-level numerical deficits that compromise the development of higher level mathematical skills.
Image processing for medical diagnosis using CNN
NASA Astrophysics Data System (ADS)
Arena, Paolo; Basile, Adriano; Bucolo, Maide; Fortuna, Luigi
2003-01-01
Medical diagnosis is one of the most important areas in which image processing procedures are usefully applied. Image processing is an important phase for improving the accuracy of both the diagnostic procedure and the surgical operation. One of these fields is tumor/cancer detection using microarray analysis. The research studies in the Cancer Genetics Branch are mainly involved in a range of experiments including the identification of inherited mutations predisposing family members to malignant melanoma, prostate and breast cancer. In the bio-medical field real-time processing is very important, but image processing is often a quite time-consuming phase, so techniques able to speed up the elaboration play an important role. From this point of view, in this work a novel approach to image processing has been developed. The new idea is to use Cellular Neural Networks to investigate diagnostic images, such as magnetic resonance imaging, computed tomography, and fluorescent cDNA microarray images.
Amplitude image processing by diffractive optics.
Cagigal, Manuel P; Valle, Pedro J; Canales, V F
2016-02-22
In contrast to standard digital image processing, which operates on the detected image intensity, we propose to perform amplitude image processing. Amplitude processing, like low-pass or high-pass filtering, is carried out using diffractive optical elements (DOE), since they allow one to operate on the complex field amplitude before it has been detected. We show the procedure for designing the DOE that corresponds to each operation. Furthermore, we present an analysis of amplitude image processing performance. In particular, a DOE Laplacian filter is applied to simulated astronomical images to detect two stars one Airy ring apart. We also check by numerical simulations that the use of a Laplacian amplitude filter produces less noisy images than standard digital image processing.
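The distinction the paper draws, filtering the complex amplitude versus the detected intensity, can be illustrated with a discrete 1-D Laplacian (a digital stand-in for the optical DOE filter; the data are illustrative). A pure phase step is invisible to intensity-based filtering but is detected when the filter acts on the amplitude.

```python
def laplacian(values):
    """Discrete 1-D Laplacian, with zero padding at the ends."""
    n = len(values)
    return [(values[i - 1] if i > 0 else 0)
            - 2 * values[i]
            + (values[i + 1] if i < n - 1 else 0)
            for i in range(n)]

# A field with a phase step: the intensity is flat, the amplitude is not.
amplitude = [1 + 0j, 1 + 0j, -1 + 0j, -1 + 0j]
intensity = [abs(a) ** 2 for a in amplitude]

int_filtered = laplacian(intensity)                      # filter after detection
amp_filtered = [abs(v) ** 2 for v in laplacian(amplitude)]  # filter before detection
print(int_filtered)  # -> [-1, 0, 0, -1]: interior zeros, the edge is missed
print(amp_filtered)  # -> [1.0, 4.0, 4.0, 1.0]: the phase edge is detected
```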
Programmable remapper for image processing
NASA Technical Reports Server (NTRS)
Juday, Richard D. (Inventor); Sampsell, Jeffrey B. (Inventor)
1991-01-01
A video-rate coordinate remapper includes a memory for storing a plurality of transformations on look-up tables for remapping input images from one coordinate system to another. Such transformations are operator selectable. The remapper includes a collective processor by which certain input pixels of an input image are transformed to a portion of the output image in a many-to-one relationship. The remapper includes an interpolative processor by which the remaining input pixels of the input image are transformed to another portion of the output image in a one-to-many relationship. The invention includes certain specific transforms for creating output images useful for certain defects of visually impaired people. The invention also includes means for shifting input pixels and means for scrolling the output matrix.
Handbook on COMTAL's Image Processing System
NASA Technical Reports Server (NTRS)
Faulcon, N. D.
1983-01-01
An image processing system is the combination of an image processor with other control and display devices plus the necessary software needed to produce an interactive capability to analyze and enhance image data. Such an image processing system installed at NASA Langley Research Center, Instrument Research Division, Acoustics and Vibration Instrumentation Section (AVIS) is described. Although much of the information contained herein can be found in the other references, it is hoped that this single handbook will give the user better access, in concise form, to pertinent information and usage of the image processing system.
NASA Regional Planetary Image Facility image retrieval and processing system
NASA Technical Reports Server (NTRS)
Slavney, Susan
1986-01-01
The general design and analysis functions of the NASA Regional Planetary Image Facility (RPIF) image workstation prototype are described. The main functions of the MicroVAX II based workstation will be database searching, digital image retrieval, and image processing and display. The uses of the Transportable Applications Executive (TAE) in the system are described. File access and image processing programs use TAE tutor screens to receive parameters from the user and TAE subroutines are used to pass parameters to applications programs. Interface menus are also provided by TAE.
Visualization of children's mathematics solving process using near infrared spectroscopic approach
NASA Astrophysics Data System (ADS)
Kuroda, Yasufumi; Okamoto, Naoko; Chance, Britton; Nioka, Shoko; Eda, Hideo; Maesako, Takanori
2009-02-01
Over the past decade, the application of results from brain science research to education research has been a controversial topic. A NIRS imaging system shows images of Hb parameters in the brain. Measurements using NIRS are safe, easy and the equipment is portable, allowing subjects to tolerate longer research periods. The purpose of this research is to examine the characteristics of Hb using NIRS at the moment of understanding. We measured Hb in the prefrontal cortex of children while they were solving mathematical problems (tangram puzzles). As a result of the experiment, we were able to classify the children into three groups based on their solution methods. Hb continually increased in a group which could not develop a problem solving strategy for the tangram puzzles. Hb declined steadily for a group which was able to develop a strategy for the tangram puzzles. Hb was steady for a certain group that had already developed a strategy before solving the problems. Our experiments showed that the brain data from NIRS enables the visualization of children's mathematical solution processes.
Coordination in serial-parallel image processing
NASA Astrophysics Data System (ADS)
Wójcik, Waldemar; Dubovoi, Vladymyr M.; Duda, Marina E.; Romaniuk, Ryszard S.; Yesmakhanova, Laura; Kozbakova, Ainur
2015-12-01
Serial-parallel systems are used for image conversion, and controlling their operation requires solving a coordination problem. The paper summarizes a model for coordinating resource allocation in relation to the task of synchronizing parallel processes; a genetic coordination algorithm is developed and its adequacy verified on a parallel image processing task.
Mathematical modeling of olive mill waste composting process.
Vasiliadou, Ioanna A; Muktadirul Bari Chowdhury, Abu Khayer Md; Akratos, Christos S; Tekerlekopoulou, Athanasia G; Pavlou, Stavros; Vayenas, Dimitrios V
2015-09-01
The present study aimed at developing an integrated mathematical model for the composting process of olive mill waste. The multi-component model was developed to simulate the composting of three-phase olive mill solid waste with olive leaves and different materials as bulking agents. The modeling system included heat transfer, organic substrate degradation, oxygen consumption, carbon dioxide production, water content change, and biological processes. First-order kinetics were used to describe the hydrolysis of insoluble organic matter, followed by formation of biomass. Microbial biomass growth was modeled with a double-substrate limitation by hydrolyzed available organic substrate and oxygen using Monod kinetics. The inhibitory factors of temperature and moisture content were included in the system. The production and consumption of nitrogen and phosphorous were also included in the model. In order to evaluate the kinetic parameters, and to validate the model, six pilot-scale composting experiments in controlled laboratory conditions were used. Low hydrolysis rates were observed (0.002841/d), consistent with the high cellulose and lignin content of the composting materials used. Model simulations were in good agreement with the experimental results. Sensitivity analysis was performed and the modeling efficiency was determined to further evaluate the model predictions. Results revealed that oxygen simulations were more sensitive to the input parameters of the model than those of water, temperature and insoluble organic matter. Finally, the Nash–Sutcliffe index (E) showed that the experimental data of insoluble organic matter (E>0.909) and temperature (E>0.678) were better simulated than those of water.
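The core of such a model, first-order hydrolysis of insoluble organics feeding double-substrate Monod growth, can be sketched as a simple explicit-Euler integration. Apart from the hydrolysis rate reported in the abstract, all parameter values below are invented for illustration, and the full model additionally tracks heat, moisture, nitrogen and phosphorus.

```python
# Hypothetical parameters; the paper fits its own from six pilot experiments.
K_H = 0.002841    # 1/d, first-order hydrolysis rate (value from the abstract)
MU_MAX = 0.5      # 1/d, maximum specific growth rate
K_S, K_O = 5.0, 0.2  # half-saturation constants (substrate, oxygen)
Y = 0.4           # biomass yield on substrate

def simulate(insoluble0, soluble0, biomass0, oxygen, days, dt=0.01):
    """Explicit-Euler sketch of the model core. Oxygen is held constant
    here (forced aeration); I = insoluble organics, S = hydrolyzed
    substrate, X = microbial biomass."""
    I, S, X = insoluble0, soluble0, biomass0
    f_o2 = oxygen / (K_O + oxygen)            # oxygen limitation term
    for _ in range(int(days / dt)):
        hydrolysis = K_H * I                  # first-order hydrolysis
        mu = MU_MAX * S / (K_S + S) * f_o2    # double-substrate Monod
        I += dt * (-hydrolysis)
        S += dt * (hydrolysis - mu * X / Y)
        X += dt * mu * X
    return I, S, X
```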
Semi-automated Image Processing for Preclinical Bioluminescent Imaging
Slavine, Nikolai V; McColl, Roderick W
2015-01-01
Objective: Bioluminescent imaging is a valuable noninvasive technique for investigating tumor dynamics and specific biological molecular events in living animals to better understand the effects of human disease in animal models. The purpose of this study was to develop and test a strategy behind automated methods for bioluminescence image processing, from data acquisition to obtaining 3D images. Methods: In order to optimize this procedure, a semi-automated image processing approach with a multi-modality image handling environment was developed. To identify a bioluminescent source location and strength, we used the light flux detected on the surface of the imaged object by CCD cameras. For phantom calibration tests and object surface reconstruction we used the MLEM algorithm. For internal bioluminescent sources we used the diffusion approximation, balancing the internal and external intensities on the boundary of the medium; after determining an initial approximation for the photon fluence, we applied a novel iterative deconvolution method to obtain the final reconstruction result. Results: We found that the reconstruction techniques successfully used the depth-dependent light transport approach and semi-automated image processing to provide a realistic 3D model of the lung tumor. Our image processing software can optimize and decrease the time required for volumetric imaging and quantitative assessment. Conclusion: The data obtained from light phantom and mouse lung tumor images demonstrate the utility of the image reconstruction algorithms and the semi-automated approach for the bioluminescent image processing procedure. We suggest that the developed image processing approach can be applied to preclinical imaging studies: characterizing tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment. PMID:26618187
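The iterative reconstruction step can be illustrated with the textbook MLEM/Richardson-Lucy update; the paper's deconvolution method is a novel variant, so the FFT-based sketch below is only a stand-in under a circular-convolution assumption.

```python
import numpy as np

def mlem_deconvolve(blurred, psf, n_iter=30):
    """Richardson-Lucy / MLEM-style multiplicative deconvolution.
    psf must be centered in its array; convolution is circular (FFT)."""
    psf = psf / psf.sum()
    otf = np.fft.rfft2(np.fft.ifftshift(psf))
    est = np.full_like(blurred, blurred.mean())  # flat initial estimate
    for _ in range(n_iter):
        # forward model: current estimate blurred by the PSF
        conv = np.fft.irfft2(otf * np.fft.rfft2(est), s=blurred.shape)
        ratio = blurred / np.maximum(conv, 1e-12)
        # multiplicative update: correlate the ratio with the PSF
        est *= np.fft.irfft2(np.conj(otf) * np.fft.rfft2(ratio), s=blurred.shape)
    return est
```

Each iteration re-blurs the estimate, compares it with the data, and applies a multiplicative correction, which preserves nonnegativity for nonnegative inputs.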
Image processing on the IBM personal computer
NASA Technical Reports Server (NTRS)
Myers, H. J.; Bernstein, R.
1985-01-01
An experimental, personal computer image processing system has been developed which provides a variety of processing functions in an environment that connects programs by means of a 'menu' for both casual and experienced users. The system is implemented by a compiled BASIC program that is coupled to assembly language subroutines. Image processing functions encompass subimage extraction, image coloring, area classification, histogramming, contrast enhancement, filtering, and pixel extraction.
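One of the listed functions, contrast enhancement, is commonly implemented by histogram equalization; a minimal 8-bit grayscale sketch (our own, not the system's BASIC/assembly implementation) is:

```python
import numpy as np

def equalize_histogram(image):
    """Histogram equalization for an 8-bit grayscale image: remap gray
    levels through the normalized cumulative histogram so the output
    uses the full 0-255 range."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[np.nonzero(cdf)[0][0]]        # first occupied level
    scale = max(cdf[-1] - cdf_min, 1)
    lut = np.clip(np.round((cdf - cdf_min) / scale * 255), 0, 255).astype(np.uint8)
    return lut[image]                            # apply as a look-up table
```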
New method of contour image processing based on the formalism of spiral light beams
Volostnikov, Vladimir G; Kishkin, S A; Kotova, S P
2013-07-31
The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented. (optical image processing)
ERIC Educational Resources Information Center
Delice, Ali; Kertil, Mahmut
2015-01-01
This article reports the results of a study that investigated pre-service mathematics teachers' modelling processes in terms of representational fluency in a modelling activity related to a cassette player. A qualitative approach was used in the data collection process. Students' individual and group written responses to the mathematical modelling…
Analysis of Mathematics Teachers' Self-Efficacy Levels Concerning the Teaching Process
ERIC Educational Resources Information Center
Ünsal, Serkan; Korkmaz, Fahrettin; Perçin, Safiye
2016-01-01
The purpose of this study is to identify mathematics teachers' self-efficacy levels concerning the teaching process, and to examine those self-efficacy beliefs with regard to specific variables. The study was conducted in Turkey during the second term of the 2015-2016 academic year. The study sample consisted of…
Prospective Elementary Mathematics Teachers' Thought Processes on a Model Eliciting Activity
ERIC Educational Resources Information Center
Eraslan, Ali
2012-01-01
Mathematical model and modeling are one of the topics that have been intensively discussed in recent years. The purpose of this study is to examine prospective elementary mathematics teachers' thought processes on a model eliciting activity and reveal difficulties or blockages in the processes. The study includes forty-five seniors taking the…
Computers in Public Schools: Changing the Image with Image Processing.
ERIC Educational Resources Information Center
Raphael, Jacqueline; Greenberg, Richard
1995-01-01
The kinds of educational technologies selected can make the difference between uninspired, rote computer use and challenging learning experiences. University of Arizona's Image Processing for Teaching Project has worked with over 1,000 teachers to develop image-processing techniques that provide students with exciting, open-ended opportunities for…
Image Processing in Intravascular OCT
NASA Astrophysics Data System (ADS)
Wang, Zhao; Wilson, David L.; Bezerra, Hiram G.; Rollins, Andrew M.
Coronary artery disease is the leading cause of death in the world. Intravascular optical coherence tomography (IVOCT) is rapidly becoming a promising imaging modality for characterization of atherosclerotic plaques and evaluation of coronary stenting. OCT has several unique advantages over alternative technologies, such as intravascular ultrasound (IVUS), due to its better resolution and contrast. For example, OCT is currently the only imaging modality that can measure the thickness of the fibrous cap of an atherosclerotic plaque in vivo. OCT also has the ability to accurately assess the coverage of individual stent struts by neointimal tissue over time. However, it is extremely time-consuming to analyze IVOCT images manually to derive quantitative diagnostic metrics. In this chapter, we introduce some computer-aided methods to automate the common IVOCT image analysis tasks.
Matching rendered and real world images by digital image processing
NASA Astrophysics Data System (ADS)
Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume
2010-05-01
Recent advances in computer-generated imagery (CGI) have been used in commercial and industrial photography, providing a broad scope for product advertising. Mixing real-world images with those rendered from virtual-space software shows a more or less visible mismatch between the corresponding image quality performances. Rendered images are produced by software whose quality is limited only by the output resolution. Real-world images are taken with cameras that introduce some image degradation from factors such as residual lens aberrations, diffraction, sensor low-pass anti-aliasing filters, color-pattern demosaicing, etc. The effect of all these image quality degradation factors can be characterized by the system Point Spread Function (PSF). Because the image is the convolution of the object with the system PSF, its characterization shows the amount of degradation added to any picture taken. This work explores the use of image processing to degrade the rendered images following the parameters indicated by the real system PSF, attempting to match the virtual and real-world image qualities. The system MTF is determined by the slanted-edge method, both in laboratory conditions and in the real picture environment, in order to compare the influence of the working conditions on the device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered with a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different regions of the final image.
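The matching step, blurring the rendered image with a Gaussian derived from the measured PSF, can be sketched as a separable convolution. The function names are ours, and the sigma value would in practice come from the slanted-edge MTF measurement.

```python
import numpy as np

def gaussian_psf(sigma, radius=None):
    """1D Gaussian kernel approximating the measured system PSF;
    sigma is in pixels."""
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def degrade_rendered(image, sigma):
    """Blur a rendered image with the Gaussian PSF so its sharpness
    matches that of the real camera. Separable: rows, then columns."""
    k = gaussian_psf(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)
```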
Combining advanced imaging processing and low cost remote imaging capabilities
NASA Astrophysics Data System (ADS)
Rohrer, Matthew J.; McQuiddy, Brian
2008-04-01
Target images are very important for evaluating the situation when Unattended Ground Sensors (UGS) are deployed. These images add a significant amount of information to determine the difference between hostile and non-hostile activities, the number of targets in an area, the difference between animals and people, the movement dynamics of targets, and when specific activities of interest are taking place. The imaging capabilities of UGS systems need to provide only target activity, not images without targets in the field of view. Current UGS remote imaging systems are not optimized for target processing and are not low cost. In this paper, McQ describes an architectural and technological approach for significantly improving the processing of images to provide target information while reducing the cost of the intelligent remote imaging capability.
Image processing utilizing an APL interface
NASA Astrophysics Data System (ADS)
Zmola, Carl; Kapp, Oscar H.
1991-03-01
The past few years have seen the growing use of digital techniques in the analysis of electron microscope image data. This trend is driven by the need to maximize the information extracted from the electron micrograph by submitting its digital representation to the broad spectrum of analytical techniques made available by the digital computer. We are developing an image processing system for the analysis of digital images obtained with a scanning transmission electron microscope (STEM) and a scanning electron microscope (SEM). This system, run on an IBM PS/2 model 70/A21, uses menu-based image processing and an interactive APL interface which permits the direct manipulation of image data.
Parallel processing considerations for image recognition tasks
NASA Astrophysics Data System (ADS)
Simske, Steven J.
2011-01-01
Many image recognition tasks are well-suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images. From this standpoint, then, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that will be considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows (as diverse as optical character recognition [OCR], document classification, and barcode reading) to parallel pipelines. This can substantially decrease time to completion for the document tasks. For this approach, each parallel pipeline is generally performing a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach. Examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
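Parallel processing by image region can be sketched as a tile-level map-reduce. The per-tile task below (a bright-pixel count) is a deliberately simple stand-in for skew detection or face detection, and the names are our own.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def tile(image, n_rows, n_cols):
    """Split an image into a list of sub-regions (the 'map' inputs)."""
    return [block for row in np.array_split(image, n_rows, axis=0)
                  for block in np.array_split(row, n_cols, axis=1)]

def count_bright_pixels(region, threshold=128):
    """Per-tile task: a stand-in for a real per-region analysis."""
    return int((region > threshold).sum())

def map_reduce(image, n_rows=2, n_cols=2):
    """Same task on every tile in parallel, then a reduce step
    combining the per-tile results."""
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(count_bright_pixels, tile(image, n_rows, n_cols)))
    return sum(partials)
```

Because the tiles partition the image exactly, the reduced result equals the sequential computation on the whole image.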
Programmable Iterative Optical Image And Data Processing
NASA Technical Reports Server (NTRS)
Jackson, Deborah J.
1995-01-01
Proposed method of iterative optical image and data processing overcomes limitations imposed by loss of optical power after repeated passes through many optical elements - especially, beam splitters. Involves selective, timed combination of optical wavefront phase conjugation and amplification to regenerate images in real time to compensate for losses in optical iteration loops; timing such that amplification turned on to regenerate desired image, then turned off so as not to regenerate other, undesired images or spurious light propagating through loops from unwanted reflections.
Non-linear Post Processing Image Enhancement
NASA Technical Reports Server (NTRS)
Hunt, Shawn; Lopez, Alex; Torres, Angel
1997-01-01
A non-linear filter for image post processing based on the feedforward Neural Network topology is presented. This study was undertaken to investigate the usefulness of "smart" filters in image post processing. The filter has been shown to be useful in recovering high frequencies, such as those lost during the JPEG compression-decompression process. The filtered images have a higher signal to noise ratio and a higher perceived image quality. Simulation studies comparing the proposed filter with the optimum mean-square non-linear filter, examples of the high-frequency recovery, and the statistical properties of the filter are given.
Real-time video image processing
NASA Astrophysics Data System (ADS)
Smedley, Kirk G.; Yool, Stephen R.
1990-11-01
Lockheed has designed and implemented a prototype real-time Video Enhancement Workbench (VEW) using commercial off-the-shelf hardware and custom software. The hardware components include a Sun workstation, Aspex PIPE image processor, time base corrector, VCR, video camera, and real-time disk subsystem. A comprehensive set of image processing functions can be invoked by the analyst at any time during processing, enabling interactive enhancement and exploitation of video sequences. Processed images can be transmitted and stored within the system in digital or video form. VEW also provides image output to a laser printer and to Interleaf technical publishing software.
Quantitative image processing in fluid mechanics
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus; Helman, James; Ning, Paul
1992-01-01
The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.
Heuristic and algorithmic processing in English, mathematics, and science education.
Sharps, Matthew J; Hess, Adam B; Price-Sharps, Jana L; Teh, Jane
2008-01-01
Many college students experience difficulties in basic academic skills. Recent research suggests that much of this difficulty may lie in heuristic competency--the ability to use and successfully manage general cognitive strategies. In the present study, the authors evaluated this possibility. They compared participants' performance on a practice California Basic Educational Skills Test and on a series of questions in the natural sciences with heuristic and algorithmic performance on a series of mathematics and reading comprehension exercises. Heuristic competency in mathematics was associated with better scores in science and mathematics. Verbal and algorithmic skills were associated with better reading comprehension. These results indicate the importance of including heuristic training in educational contexts and highlight the importance of a relatively domain-specific approach to questions of cognition in higher education.
Image processing algorithm of equiarm delay line in SAIL
NASA Astrophysics Data System (ADS)
Xu, Nan; Liu, Liren; Lu, Wei
2010-08-01
The phase errors due to the nonlinear chirp of a tunable laser reduce the range resolution in Synthetic Aperture Imaging Ladar (SAIL). Analogue and digital image processing algorithms have been developed, all employing matched or non-matched optical delay lines. In this paper, a theory of the equiarm delay line for compensating the nonlinear-chirp phase errors is proposed. This image processing algorithm comprises three methods of differing compensation precision and implementation difficulty, improving application flexibility. First, we derive the result that the impact of the nonlinear chirp is suppressed by decreasing the delay-time difference between the echo signal and the LO signal. Based on this theory, we propose three methods of establishing the equiarm delay line: establishing a matched target LO path; establishing a reference path with dual coherent detections; and establishing a reference path with phase-shifting calculation. The construction of the signal processing system and the mathematical flow of the algorithm are then established. Simulations of an airborne synthetic aperture imaging ladar model confirm that the three methods suppress the nonlinear-chirp phase errors to varying extents and improve the range resolution. The characteristics and applicability of the three methods are discussed.
Water surface capturing by image processing
Technology Transfer Automated Retrieval System (TEKTRAN)
An alternative means of measuring the water surface interface during laboratory experiments is processing a series of sequentially captured images. Image processing can provide a continuous, non-intrusive record of the water surface profile whose accuracy is not dependent on water depth. More trad...
Digital image processing in cephalometric analysis.
Jäger, A; Döler, W; Schormann, T
1989-01-01
Digital image processing methods were applied to improve the practicability of cephalometric analysis. The individual X-ray film was digitized with the aid of a high-resolution microscope-photometer. Digital processing was done using a VAX 8600 computer system. An improvement in image quality was achieved by means of various digital enhancement and filtering techniques.
NASA Astrophysics Data System (ADS)
Prosvirnikov, D. B.; Ziatdinova, D. F.; Timerbaev, N. F.; Saldaev, V. A.; Gilfanov, K. H.
2016-04-01
The article analyses the physical picture of the steam explosion treatment of pre-impregnated lignocellulosic material, on the basis of which a mathematical model of the process is developed. The model is represented in the form of differential equations with boundary conditions. The resulting mathematical description allows the degree of influence of various factors on the kinetics of the process to be identified, and a rational selection of operating parameters to be made for the considered processes in terms of the set of application tasks.
Mathematics for generative processes: Living and non-living systems
NASA Astrophysics Data System (ADS)
Giannantoni, Corrado
2006-05-01
The traditional Differential Calculus often shows its limits when describing living systems. These in fact present a richness of characteristics that is, in the majority of cases, much wider than the description capabilities of the usual differential equations. Such an aspect became particularly evident during the research (completed in 2001) for an appropriate formulation of Odum's Maximum Em-Power Principle (proposed by the Author as a possible Fourth Thermodynamic Principle). In fact, in such a context, the particular non-conservative Algebra, adopted to account for both Quality and quantity of generative processes, suggested we introduce a faithfully corresponding concept of "derivative" (of both integer and fractional order) to describe dynamic conditions however variable. The new concept not only succeeded in pointing out the corresponding differential bases of all the rules of Emergy Algebra, but also represented the preferential guide for recognizing the most profound physical nature of the basic processes which most characterize self-organizing Systems (co-production, co-injection, inter-action, feed-back, splits, etc.). From a mathematical point of view, the most important novelties introduced by such a new approach are: (i) the derivative of any integer or fractional order can be obtained independently from the evaluation of its lower order derivatives; (ii) the exponential function plays an extremely hinge-like role, much more marked than in the case of traditional differential equations; (iii) wide classes of differential equations, traditionally considered as being non-linear, become "intrinsically" linear when reconsidered in terms of "incipient" derivatives; (iv) their corresponding explicit solutions can be given in terms of new classes of functions (such as "binary" and "duet" functions); (v) every solution shows a sort of "persistence of form" when representing the product generated with respect to the agents of the generating process.
Thinking Process of Pseudo Construction in Mathematics Concepts
ERIC Educational Resources Information Center
Subanji; Nusantara, Toto
2016-01-01
This article aims at studying pseudo construction of student thinking in mathematical concepts, integer number operation, algebraic forms, area concepts, and triangle concepts. 391 junior high school students from four districts of East Java Province Indonesia were taken as the subjects. Data were collected by means of distributing the main…
Thinking Process of Naive Problem Solvers to Solve Mathematical Problems
ERIC Educational Resources Information Center
Mairing, Jackson Pasini
2017-01-01
Solving problems is not only a goal of mathematical learning. Students acquire ways of thinking, habits of persistence and curiosity, and confidence in unfamiliar situations by learning to solve problems. In fact, there were students who had difficulty in solving problems. The students were naive problem solvers. This research aimed to describe…
Learning Elementary School Mathematics as a Culturally Conditioned Process.
ERIC Educational Resources Information Center
Vasco, Carlos E.
Mathematics is thought to be the most culturally independent of all academic subjects. "New Math" textbooks printed in the United States or Belgium were translated into Spanish and Portuguese with only minor variations in the story problems and are now taught in most Latin-American countries. Looking backwards, it was not different in past years…
Image processing for cameras with fiber bundle image relay.
Olivas, Stephen J; Arianpour, Ashkan; Stamenov, Igor; Morrison, Rick; Stack, Ron A; Johnson, Adam R; Agurok, Ilya P; Ford, Joseph E
2015-02-10
Some high-performance imaging systems generate a curved focal surface and so are incompatible with focal plane arrays fabricated by conventional silicon processing. One example is a monocentric lens, which forms a wide field-of-view high-resolution spherical image with a radius equal to the focal length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems suffer from artifacts due to image sampling and incoherent light transfer by the fiber bundle as well as resampling by the focal plane, resulting in a fixed obscuration pattern. Here, we describe digital image processing techniques to improve image quality in a compact 126° field-of-view, 30 megapixel panoramic imager, where a 12 mm focal length F/1.35 lens made of concentric glass surfaces forms a spherical image surface, which is fiber-coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image formation onto the 2.5 μm pitch fiber bundle, image transfer by the fiber bundle, and sensing by a 1.75 μm pitch backside illuminated color focal plane. We demonstrate methods to mitigate moiré artifacts and local obscuration, correct for sphere to plane mapping distortion and vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with a 10× larger commercial camera with comparable field-of-view and light collection.
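One of the corrections described, removing a fixed obscuration/vignetting pattern, is commonly done by flat-field division; a minimal sketch (our own, not the authors' pipeline) is:

```python
import numpy as np

def flat_field_correct(raw, flat, eps=1e-6):
    """Correct a fixed obscuration/vignetting pattern by dividing by the
    normalized response to a uniformly illuminated scene (the flat)."""
    gain = flat / max(float(flat.mean()), eps)  # unit-mean gain map
    return raw / np.maximum(gain, eps)
```

If the raw frame is the true scene multiplied by the gain pattern, the division recovers the scene exactly (up to the clamping floor).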
Pina, Violeta; Castillo, Alejandro; Cohen Kadosh, Roi; Fuentes, Luis J.
2015-01-01
Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1–6. Participants were tested in an ample range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size) was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing in mathematical skills, but when inhibitory control is also involved. PMID:25873909
SHETTY, ANIL N.; CHIANG, SHARON; MALETIC-SAVATIC, MIRJANA; KASPRIAN, GREGOR; VANNUCCI, MARINA; LEE, WESLEY
2016-01-01
In this article, we discuss the theoretical background for diffusion weighted imaging and diffusion tensor imaging. Molecular diffusion is a random process involving thermal Brownian motion. In biological tissues, the underlying microstructures restrict the diffusion of water molecules, making diffusion directionally dependent. Water diffusion in tissue is mathematically characterized by the diffusion tensor, the elements of which contain information about the magnitude and direction of diffusion and is a function of the coordinate system. Thus, it is possible to generate contrast in tissue based primarily on diffusion effects. Expressing diffusion in terms of the measured diffusion coefficient (eigenvalue) in any one direction can lead to errors. Nowhere is this more evident than in white matter, due to the preferential orientation of myelin fibers. The directional dependency is removed by diagonalization of the diffusion tensor, which then yields a set of three eigenvalues and eigenvectors, representing the magnitude and direction of the three orthogonal axes of the diffusion ellipsoid, respectively. For example, the eigenvalue corresponding to the eigenvector along the long axis of the fiber corresponds qualitatively to diffusion with least restriction. Determination of the principal values of the diffusion tensor and various anisotropic indices provides structural information. We review the use of diffusion measurements using the modified Stejskal–Tanner diffusion equation. The anisotropy is analyzed by decomposing the diffusion tensor based on symmetrical properties describing the geometry of diffusion tensor. We further describe diffusion tensor properties in visualizing fiber tract organization of the human brain. PMID:27441031
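The diagonalization described above can be sketched directly with a symmetric eigendecomposition, yielding the standard mean diffusivity (MD) and fractional anisotropy (FA) indices; the function name is ours.

```python
import numpy as np

def tensor_metrics(D):
    """Diagonalize a 3x3 symmetric diffusion tensor and return its
    eigenvalues, mean diffusivity (MD) and fractional anisotropy (FA)."""
    evals = np.linalg.eigvalsh(D)       # real eigenvalues, ascending
    md = evals.mean()
    num = np.sqrt(((evals - md) ** 2).sum())
    den = np.sqrt((evals ** 2).sum())
    fa = np.sqrt(1.5) * num / den if den > 0 else 0.0
    return evals, md, fa
```

An isotropic tensor gives FA = 0, while a tensor with diffusion along a single axis (as an idealized fiber) gives FA = 1.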
Examining Prospective Mathematics Teachers' Proof Processes for Algebraic Concepts
ERIC Educational Resources Information Center
Güler, Gürsel; Dikici, Ramazan
2014-01-01
The aim of this study was to examine prospective mathematics teachers' proof processes for algebraic concepts. The study was conducted with 10 prospective teachers who were studying at the department of secondary mathematics teaching and who volunteered to participate in the study. The data were obtained via task-based clinical interviews…
Speed of Information Processing in Generally Gifted and Excelling-in-Mathematics Adolescents
ERIC Educational Resources Information Center
Paz-Baruch, N.; Leikin, M.; Aharon-Peretz, J.; Leikin, R.
2014-01-01
A considerable amount of recent evidence suggests that speed of information processing (SIP) may be related to general giftedness as well as to higher mathematical ability. To date, no study has examined SIP associated with both general giftedness (G) and excellence in mathematics (EM). This paper presents a part of a more extensive…
Some aspects of mathematical and chemical modeling of complex chemical processes
NASA Technical Reports Server (NTRS)
Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.
1983-01-01
Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical processes are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.
The Impact of the Data Teams Process on Student Mathematics Achievement
ERIC Educational Resources Information Center
Walters, Mokysha Benford
2012-01-01
The purpose of this study is to determine the difference in the mathematics academic achievement of students when teachers engage in Data Teams, a continuous improvement process, and when they do not. Additionally, this study will examine differences in mathematics academic achievement of students by ethnicity, gender, and socio-economic status…
ERIC Educational Resources Information Center
Palla, Marina; Potari, Despina; Spyrou, Panagiotis
2012-01-01
In this study, we investigate the meaning students attribute to the structure of mathematical induction (MI) and the process of proof construction using mathematical induction in the context of a geometric recursion problem. Two hundred and thirteen 17-year-old students of an upper secondary school in Greece participated in the study. Students'…
A Mathematical Experience Involving Defining Processes: In-Action Definitions and Zero-Definitions
ERIC Educational Resources Information Center
Ouvrier-Buffet, Cecile
2011-01-01
In this paper, the focus is on defining processes at stake in an unfamiliar situation from discrete mathematics that yields surprising mathematical results. The epistemological framework of Lakatos is questioned and used for the design and analysis of the situation. The cognitive background of Vergnaud's approach enriches the study…
CT Image Processing Using Public Digital Networks
Rhodes, Michael L.; Azzawi, Yu-Ming; Quinn, John F.; Glenn, William V.; Rothman, Stephen L.G.
1984-01-01
Nationwide commercial computer communication is now commonplace for those applications where digital dialogues are generally short and widely distributed, and where bandwidth does not exceed that of dial-up telephone lines. Image processing using such networks is prohibitive because of the large volume of data inherent to digital pictures. With a blend of increasing bandwidth and distributed processing, network image processing becomes possible. This paper examines characteristics of a digital image processing service for a nationwide network of CT scanner installations. Issues of image transmission, data compression, distributed processing, software maintenance, and interfacility communication are also discussed. Included are results that show the volume and type of processing experienced by a network of over 50 CT scanners for the last 32 months.
Parallel-Processing Software for Creating Mosaic Images
NASA Technical Reports Server (NTRS)
Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric
2008-01-01
A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
Eclipse: ESO C Library for an Image Processing Software Environment
NASA Astrophysics Data System (ADS)
Devillard, Nicolas
2011-12-01
Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.
Image processing of digital chest ionograms.
Yarwood, J R; Moores, B M
1988-10-01
A number of image-processing techniques have been applied to a digital ionographic chest image in order to evaluate their possible effects on this type of image. In order to quantify any effect, a simulated lesion was superimposed on the image at a variety of locations representing different types of structural detail. Visualization of these lesions was evaluated by a number of observers both before and after the processing operations. The operations employed included grey-scale transformations, histogram operations, edge-enhancement and smoothing functions. The resulting effects of these operations on the visualization of the simulated lesions are discussed.
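Two of the operation families listed, grey-scale transformations and histogram operations, can be sketched in a few lines. This is a generic illustration on synthetic data, not the study's code:

```python
import numpy as np

def window(img, lo, hi):
    # Grey-scale transformation: linearly stretch the band [lo, hi]
    # to the full 0..255 display range.
    out = np.clip((img.astype(float) - lo) / (hi - lo), 0, 1)
    return (out * 255).astype(np.uint8)

def equalize(img):
    # Histogram equalization: map each grey level through the
    # normalized cumulative histogram.
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    return (cdf[img] * 255).astype(np.uint8)

# Synthetic low-contrast image occupying grey levels 100..159.
img = np.tile(np.arange(100, 160, dtype=np.uint8), (32, 1))
print(window(img, 100, 159).max(), equalize(img).max())
```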
The Mathematical Model of Image, Generated by Scanning Digital Radiography System
NASA Astrophysics Data System (ADS)
Udod, V. A.; Osipov, S. P.; Wang, Yanzhao
2017-01-01
A mathematical model of the image generated by a scanning digital radiography system is presented. The model takes into account the transformation of the X-ray energy spectrum by the test object and the noise due to the quantum nature of radiation. The calculation results confirm the importance of fluctuations of the absorbed energy of the registered photons for small scintillation detector sizes.
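The quantum-noise ingredient of such a model can be illustrated with a small simulation: the absorbed energy per detector element is a sum over a Poisson number of registered photons, so its relative fluctuation grows as the detector (and hence the photon count) shrinks. The exponential per-photon energy below is a hypothetical stand-in, not the paper's spectrum model:

```python
import numpy as np

rng = np.random.default_rng(0)

def absorbed_energy(mean_photons, mean_e=60.0, n_trials=2000):
    # Number of registered photons per exposure is Poisson distributed.
    n = rng.poisson(mean_photons, n_trials)
    # Each photon deposits a random energy (exponential is a hypothetical
    # stand-in for the real absorbed-energy spectrum).
    e = np.array([rng.exponential(mean_e, k).sum() for k in n])
    return e.mean(), e.std() / e.mean()   # mean and relative fluctuation

m_big, s_big = absorbed_energy(1000)      # large detector: many photons
m_small, s_small = absorbed_energy(10)    # small detector: few photons
print(s_big, s_small)
```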
[Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].
Shinohara, Hiroyuki; Hashimoto, Takeyuki
2015-01-01
We developed a text-data based learning tool that integrates image processing and displaying in Excel. The knowledge required for programming this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 pixels or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those using the C language, and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images shown using binary-data display software. This comparison indicated that the image quality of the Excel worksheets was nearly equal to that of the latter in visual impression. Since image processing is performed on text data, the process is visible and easy to relate to the mathematical equations within the program. We concluded that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing.
Process perspective on image quality evaluation
NASA Astrophysics Data System (ADS)
Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte
2008-01-01
The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation-Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape, and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations of the test image, but not of the other image contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.
On Processing Hexagonally Sampled Images
2011-07-01
Euclidean and "city-block" distances (on the image plane) are defined between two hexagonally addressed points p1 = (a1, r1, c1) and p2 = (a2, r2, c2).
Image processing technology for enhanced situational awareness
NASA Astrophysics Data System (ADS)
Page, S. F.; Smith, M. I.; Hickman, D.
2009-09-01
This paper discusses the integration of a number of advanced image and data processing technologies in support of the development of next-generation Situational Awareness systems for counter-terrorism and crime-fighting applications. In particular, the paper discusses the European Union Framework 7 'SAMURAI' project, which is investigating novel approaches to interactive Situational Awareness using cooperative networks of heterogeneous imaging sensors. Specific focus is given to novel Data Fusion aspects of the research which aim to improve system performance through intelligently fusing both image and non-image data sources, resolving human-machine conflicts, and refining the Situational Awareness picture. In addition, the paper highlights some recent advances in supporting image processing technologies. Finally, future trends in image-based Situational Awareness are identified, such as Post-Event Analysis (also known as 'Back-Tracking'), and the associated technical challenges are discussed.
Distinct neural substrates for deductive and mathematical processing.
Kroger, James K; Nystrom, Leigh E; Cohen, Jonathan D; Johnson-Laird, Philip N
2008-12-03
In an effort to clarify how deductive reasoning is accomplished, an fMRI study was performed to observe the neural substrates of logical reasoning and mathematical calculation. Participants viewed a problem statement and three premises, and then either a conclusion or a mathematical formula. They had to indicate whether the conclusion followed from the premises, or to solve the mathematical formula. Language areas of the brain (Broca's and Wernicke's area) responded as the premises and the conclusion were read, but solution of the problems was then carried out by non-language areas. Regions in right prefrontal cortex and inferior parietal lobe were more active for reasoning than for calculation, whereas regions in left prefrontal cortex and superior parietal lobe were more active for calculation than for reasoning. In reasoning, only those problems calling for a search for counterexamples to conclusions recruited right frontal pole. These results have important implications for understanding how higher cognition, including deduction, is implemented in the brain. Different sorts of thinking recruit separate neural substrates, and logical reasoning goes beyond linguistic regions of the brain.
Energy preserving QMF for image processing.
Lian, Jian-ao; Wang, Yonghui
2014-07-01
Implementation of new biorthogonal filter banks (BFBs) for image compression and denoising is performed, using test images with diversified characteristics. These new BFBs are linear-phase, have odd lengths, and possess a critical feature: the filters preserve signal energy very well. Experimental results show that the proposed filter banks demonstrate promising performance improvement over filter banks widely used in the image processing area, such as the CDF 9/7.
Earth Observation Services (Image Processing Software)
NASA Technical Reports Server (NTRS)
1992-01-01
San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.
Image-plane processing of visual information
NASA Technical Reports Server (NTRS)
Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.
1984-01-01
Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.
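The band-pass, edge-enhancing response described above resembles a difference-of-Gaussians (center-surround) filter. The one-dimensional sketch below is our illustration of that behavior, not the paper's design procedure:

```python
import numpy as np

def gaussian1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def dog_filter(signal, s_center=1.0, s_surround=3.0, radius=9):
    # Center-surround kernel: it sums to zero, so uniform regions map to
    # zero (compressing dynamic range) while edges produce a biphasic,
    # band-pass response similar to early human vision.
    k = gaussian1d(s_center, radius) - gaussian1d(s_surround, radius)
    return np.convolve(signal, k, mode="same")

step = np.r_[np.zeros(50), np.ones(50)]   # a luminance edge
resp = dog_filter(step)
print(resp.min() < 0 < resp.max())
```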
Nonlinear Optical Image Processing with Bacteriorhodopsin Films
NASA Technical Reports Server (NTRS)
Downie, John D.; Deiss, Ron (Technical Monitor)
1994-01-01
The transmission properties of some bacteriorhodopsin film spatial light modulators are uniquely suited to allow nonlinear optical image processing operations to be applied to images with multiplicative noise characteristics. A logarithmic amplitude transmission feature of the film permits the conversion of multiplicative noise to additive noise, which may then be linearly filtered out in the Fourier plane of the transformed image. The bacteriorhodopsin film displays the logarithmic amplitude response for write beam intensities spanning a dynamic range greater than 2.0 orders of magnitude. We present experimental results demonstrating the principle and capability for several different image and noise situations, including deterministic noise and speckle. Using the bacteriorhodopsin film, we successfully filter out image noise from the transformed image that cannot be removed from the original image.
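The underlying principle, a log transform turns multiplicative noise additive so it can be linearly filtered in the Fourier plane and then removed by exponentiation, can be sketched in software (the film performs the logarithm optically; the one-dimensional signals below are synthetic):

```python
import numpy as np

n = 256
x = np.arange(n) / n
image = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * x)    # low-frequency scene
noise = 1.0 + 0.2 * np.sin(2 * np.pi * 60 * x)   # high-frequency multiplicative noise
observed = image * noise

logged = np.log(observed)                        # multiplicative -> additive
spec = np.fft.fft(logged)
freq = np.fft.fftfreq(n, d=1 / n)                # integer cycles per record
spec[np.abs(freq) > 30] = 0                      # linear low-pass in the Fourier plane
restored = np.exp(np.fft.ifft(spec).real)        # undo the logarithm

err_before = np.abs(observed - image).max()
err_after = np.abs(restored - image).max()
print(err_before, err_after)
```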
Sun, Zhiwei; Sun, Jianfeng; Hou, Peipei; Zhou, Yu; Xu, Qian; Zhang, Ning; Liu, Liren
2015-02-01
A conceptual scheme for a lensless optical processor for synthetic-aperture imaging ladar (SAIL) is proposed. The data collected by the SAIL are first digitally modulated with a quadratic phase in the range direction. These data are then uploaded to a liquid-crystal spatial light modulator to modulate the incident light, and the target image is obtained through two-dimensional (2D) free-space Fresnel diffraction. The imaging process is mathematically analyzed using a 2D data-collection equation of strip-mode side-looking SAIL. The design equation, imaging resolutions, and target-image compression ratios are presented. Based on this scheme, we construct an experimental optical SAIL processor and present the imaging result for data obtained from a SAIL demonstrator. Because it is a lensless system, the optical processor exhibits the flexibility of digital processing as well as the fast processing capability of optical means.
Screw thread parameter measurement system based on image processing method
NASA Astrophysics Data System (ADS)
Rao, Zhimin; Huang, Kanggao; Mao, Jiandong; Zhang, Yaya; Zhang, Fan
2013-08-01
In industrial production, the screw thread is an important transmission part applied extensively in many kinds of automation equipment. The traditional measurement methods for screw thread parameters, including integrated multi-parameter test methods and single-parameter measurement methods, are contact measurement methods. In practice, contact measurement has some disadvantages, such as relatively high time cost, susceptibility to human error, and thread damage. In this paper, a real-time, non-contact screw thread parameter measurement system based on an image processing method is developed to accurately measure the outside diameter, inside diameter, pitch diameter, pitch, thread height and other parameters of a screw thread. In the system, an industrial camera is employed to acquire the image of the screw thread, image processing methods are used to obtain the image profile of the thread, and a mathematical model is established to compute the parameters. C++ Builder 6.0 is employed as the software development platform to realize the image processing and the computation of the screw thread parameters. To verify the feasibility of the measurement system, experiments were carried out and the measurement errors were analyzed. The experimental results show that the image measurement system satisfies the measurement requirements and is suitable for real-time detection of the screw thread parameters mentioned above. Compared with traditional methods, the system based on image processing has advantages such as non-contact operation, ease of use, high measuring accuracy, no workpiece damage, and fast error analysis. In industrial production, this measurement system can provide an important reference for the development of similar parameter measurement systems.
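The measurement idea can be illustrated with a toy sketch: from a binarized side-view silhouette of a thread, per-column widths give the major (outside) and minor (inside) diameters. The synthetic profile, scale, and function names below are ours, not the paper's:

```python
import numpy as np

def thread_diameters(binary, mm_per_px):
    # Column-wise silhouette heights: the widest column spans the crests
    # (major/outside diameter), the narrowest spans the roots (minor/
    # inside diameter).
    widths = binary.sum(axis=0)
    return widths.max() * mm_per_px, widths.min() * mm_per_px

# Synthetic thread silhouette: triangular crest profile mirrored about the axis.
h, w = 160, 200
cols = np.arange(w)
half_height = 60 + 20 * np.abs(((cols / 25) % 2) - 1)   # 60..80 px from the axis
rows = np.arange(h)[:, None]
binary = (np.abs(rows - h / 2) < half_height).astype(int)

major, minor = thread_diameters(binary, mm_per_px=0.05)
print(major, minor)
```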
A mathematical model of neuro-fuzzy approximation in image classification
NASA Astrophysics Data System (ADS)
Gopalan, Sasi; Pinto, Linu; Sheela, C.; Arun Kumar M., N.
2016-06-01
Image digitization and the explosion of the World Wide Web have made traditional search an inefficient method for retrieving required grassland image data from large databases. For a given input query image, a Content-Based Image Retrieval (CBIR) system retrieves similar images from a large database. Advances in technology have increased the use of grassland image data in diverse areas such as agriculture, art galleries, education and industry. In all of these areas it is necessary to retrieve grassland image data efficiently from a large database in order to perform an assigned task and make a suitable decision. This paper proposes a CBIR system based on grassland image properties that uses a feed-forward back-propagation neural network for effective image retrieval. Fuzzy memberships play an important role in the input space of the proposed system, leading to a combined neuro-fuzzy approximation in image classification. The mathematical model in the proposed work gives more clarity about the neuro-fuzzy approximation and the convergence of the image features in a grassland image.
Digital Image Processing in Private Industry.
ERIC Educational Resources Information Center
Moore, Connie
1986-01-01
Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…
Command Line Image Processing System (CLIPS)
NASA Astrophysics Data System (ADS)
Fleagle, S. R.; Meyers, G. L.; Kulinski, R. G.
1985-06-01
An interactive image processing language (CLIPS) has been developed for use in an image processing environment. CLIPS uses a simple syntax with extensive on-line help to allow even the most naive user to perform complex image processing tasks. In addition, CLIPS functions as an interpretive language complete with data structures and program control statements. CLIPS statements fall into one of three categories: command, control, and utility statements. Command statements are expressions comprised of intrinsic functions and/or arithmetic operators which act directly on image or user-defined data. Some examples of CLIPS intrinsic functions are ROTATE, FILTER and EXPONENT. Control statements allow a structured programming style through the use of statements such as DO WHILE and IF-THEN-ELSE. Utility statements such as DEFINE, READ, and WRITE support I/O and user-defined data structures. Since CLIPS uses a table-driven parser, it is easily adapted to any environment. New commands may be added to CLIPS by writing the procedure in a high-level language such as Pascal or FORTRAN and inserting the syntax for that command into the table. However, CLIPS was designed by incorporating most imaging operations into the language as intrinsic functions. CLIPS allows the user to generate new procedures easily with these powerful functions, either interactively or off line using a text editor. The fact that CLIPS can be used to generate complex procedures quickly or perform basic image processing functions interactively makes it a valuable tool in any image processing environment.
Yuan, Jing-Ping; Chen, Chuang; Sun, Sheng-Rong; Hu, Ming-Bai; Liu, Juan; Li, Yan
2013-01-01
Background The expanding and invasive features of tumor nests reflect the malignant biological behaviors of breast invasive ductal carcinoma. Useful information on cancer invasiveness hidden within tumor nests can be extracted and analyzed by computer image processing and big-data analysis. Methods Tissue microarrays from invasive ductal carcinoma (n = 202) were first stained for cytokeratin by an immunohistochemical method to clearly demarcate the tumor nests. An expert-aided computer analysis system was then developed to study the mathematical and geometrical features of the tumor nests. A computer recognition system and imaging analysis software extracted tumor nest information, and mathematical features of the tumor nests were calculated. The relationship between the tumor nest mathematical parameters and patients' 5-year disease-free survival was studied. Results Eight mathematical parameters were extracted by the expert-aided computer analysis system. Three parameters (number, circularity and total perimeter) with area under the curve >0.5 and four parameters (average area, average perimeter, total area/total perimeter, average (area/perimeter)) with area under the curve <0.5 in ROC analysis were combined into integrated parameter 1 and integrated parameter 2, respectively. Multivariate analysis showed that integrated parameter 1 (P = 0.040) was an independent prognostic factor for patients' 5-year disease-free survival. The hazard ratio of integrated parameter 1 was 1.454 (95% CI [1.017–2.078]), higher than that of N stage (HR 1.396, 95% CI [1.125–1.733]) and hormone receptor status (HR 0.575, 95% CI [0.353–0.936]), but lower than that of histological grading (HR 3.370, 95% CI [1.125–5.364]) and T stage (HR 1.610, 95% CI [1.026–2.527]). Conclusions This study indicated that integrated parameter 1, combining mathematical features (number, circularity and total perimeter) of tumor nests, could be a useful parameter to predict the prognosis
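One of the listed features, circularity, is commonly defined as 4πA/P². The sketch below uses that standard definition (the abstract does not give the authors' exact formulation):

```python
import numpy as np

def circularity(area, perimeter):
    # 4*pi*A/P^2: equals 1 for a perfect circle and decreases as the
    # outline becomes more irregular (more invasive-looking).
    return 4 * np.pi * area / perimeter ** 2

r = 10.0
circle = circularity(np.pi * r**2, 2 * np.pi * r)   # circle of radius 10
square = circularity(20.0 ** 2, 4 * 20.0)           # square of side 20
print(circle, square)
```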
Fingerprint image enhancement by differential hysteresis processing.
Blotta, Eduardo; Moler, Emilce
2004-05-10
A new method to enhance defective fingerprint images through digital image processing tools is presented in this work. When fingerprints have been taken without care, blurred and in some cases mostly illegible, as in the case presented here, their classification and comparison become nearly impossible. A combination of spatial-domain filters, including a technique called differential hysteresis processing (DHP), is applied to improve these kinds of images. This set of filtering methods proved satisfactory in a wide range of cases by uncovering hidden details that helped to identify persons. Dactyloscopy experts from Policia Federal Argentina and the EAAF have validated these results.
Image processing for HTS SQUID probe microscope
NASA Astrophysics Data System (ADS)
Hayashi, T.; Koetitz, R.; Itozaki, H.; Ishikawa, T.; Kawabe, U.
2005-10-01
An HTS SQUID probe microscope has been developed using a high-permeability needle to enable high spatial resolution measurement of samples in air even at room temperature. Image processing techniques have also been developed to improve the magnetic field images obtained from the microscope. Artifacts in the data occur due to electromagnetic interference from electric power lines, line drift and flux trapping. The electromagnetic interference could successfully be removed by eliminating the noise peaks from the power spectrum of fast Fourier transforms of line scans of the image. The drift between lines was removed by interpolating the mean field value of each scan line. Artifacts in line scans occurring due to flux trapping or unexpected noise were removed by the detection of a sharp drift and interpolation using the line data of neighboring lines. Highly detailed magnetic field images were obtained from the HTS SQUID probe microscope by the application of these image processing techniques.
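Two of the described steps, notching an interference peak out of each line scan's spectrum and removing line-to-line drift via line means, can be sketched as follows. This is our simplified illustration with synthetic data, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(2)

def clean(image, notch_bin):
    spec = np.fft.rfft(image, axis=1)
    spec[:, notch_bin] = 0                       # notch out the interference peak
    rows = np.fft.irfft(spec, n=image.shape[1], axis=1)
    drift = rows.mean(axis=1, keepdims=True)     # per-line offset
    return rows - drift + drift.mean()           # re-center on the global mean

n_lines, n_pix = 32, 128
t = np.arange(n_pix)
field = np.tile(np.sin(2 * np.pi * t / n_pix), (n_lines, 1))   # true field map
interference = 0.5 * np.sin(2 * np.pi * 16 * t / n_pix)        # power-line pickup, bin 16
drift = rng.normal(0, 0.3, (n_lines, 1))                       # line-to-line drift
drifted = field + interference + drift
cleaned = clean(drifted, 16)
print(np.abs(cleaned - field).max())
```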
Image-processing with augmented reality (AR)
NASA Astrophysics Data System (ADS)
Babaei, Hossein R.; Mohurutshe, Pagiel L.; Habibi Lashkari, Arash
2013-03-01
The aim of this project is to discuss and articulate the intent to create an image-based Android application. The basis of this study is real-time image detection and processing. It is a new, convenient measure that allows users to gain information about imagery on the spot. Past studies have revealed attempts to create image-based applications, but these have only gone as far as creating image finders that work with images already stored within some form of database. The Android platform is rapidly spreading around the world and provides by far the most interactive and technical platform for smart-phones, which is why it was important to base the study and research on it. Augmented reality allows the user to manipulate the data and add enhanced features (video, GPS tags) to the image taken.
Image processing via ultrasonics - Status and promise
NASA Technical Reports Server (NTRS)
Kornreich, P. G.; Kowel, S. T.; Mahapatra, A.; Nouhi, A.
1979-01-01
Acousto-electric devices for electronic imaging of light are discussed. These devices are more versatile than the line-scan imaging devices in current use, and have the capability of presenting the image information in a variety of modes. The image can be read out in the conventional line-scan mode. It can be read out in the form of the Fourier, Hadamard, or other transform. One can take the transform along one direction of the image and line scan in the other direction, or perform other combinations of image processing functions. This is accomplished by applying the appropriate electrical input signals to the device. Since the electrical output signal of these devices can be detected in a synchronous mode, substantial noise reduction is possible.
A mathematical study of a random process proposed as an atmospheric turbulence model
NASA Technical Reports Server (NTRS)
Sidwell, K.
1977-01-01
A random process is formed by the product of a local Gaussian process and a random amplitude process, and the sum of that product with an independent mean value process. The mathematical properties of the resulting process are developed, including the first and second order properties and the characteristic function of general order. An approximate method for the analysis of the response of linear dynamic systems to the process is developed. The transition properties of the process are also examined.
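The construction can be illustrated with a small simulation; white (uncorrelated) samples stand in for the processes, and the amplitude distribution (Rayleigh) and zero mean-value process are our illustrative choices, not fixed by the report:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(n):
    g = rng.standard_normal(n)        # local Gaussian process
    a = rng.rayleigh(1.0, n)          # random amplitude process
    m = np.zeros(n)                   # independent mean-value process (zero here)
    return a * g + m

x = simulate(100_000)
# The fourth-moment ratio exceeds the Gaussian value of 3, showing the
# heavier tails produced by the random-amplitude modulation.
kurtosis = np.mean(x**4) / np.mean(x**2) ** 2
print(kurtosis)
```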
Overview on METEOSAT geometrical image data processing
NASA Technical Reports Server (NTRS)
Diekmann, Frank J.
1994-01-01
Digital images acquired from the geostationary METEOSAT satellites are processed and disseminated at ESA's European Space Operations Centre in Darmstadt, Germany. Their scientific value depends mainly on their radiometric quality and geometric stability. This paper gives an overview of the image processing activities performed at ESOC, concentrating on geometrical restoration and quality evaluation. The performance of the rectification process for the various satellites over the past years is presented, and the impact of external events, such as the Pinatubo eruption in 1991, is explained. Special developments in both hardware and software, necessary to cope with demanding tasks such as new image resampling or correcting for spacecraft anomalies, are presented as well. The rotating lens of MET-5, which caused severe geometrical image distortions, is an example of the latter.
Real-time optical image processing techniques
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang
1988-01-01
Nonlinear real-time optical processing based on spatial pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse-frequency-modulated halftone screens and the modification of micro-channel spatial light modulators (MSLMs). Micro-channel spatial light modulators are modified via the Fabry-Perot method to achieve the high gamma operation required for nonlinear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM. The experiments showed the effectiveness of the thresholding and also showed the need for higher space-bandwidth product (SBP) for image processing. The Hughes LCLV has been characterized and found to yield high gamma (about 1.7) when operated in low-frequency, low-bias mode. Cascading two LCLVs should also provide enough gamma for nonlinear processing. In this case, the SBP of the LCLV is sufficient, but the uniformity of the LCLV needs improvement. Other accomplishments include image correlation, computer generation of holograms, pseudo-color image encoding for image enhancement, and associative retrieval in neural processing. The discovery of the only known optical method for dynamic range compression of an input image in real time by using GaAs photorefractive crystals is reported. Finally, a new architecture for nonlinear multiple-sensory neural processing has been suggested.
Mathematical Modeling of Black-and-White Chromogenic Image Stability.
1982-10-01
from special purpose applications and diffusion transfer processes, has changed very little during this century. Recently, however, Agfa-Gevaert ... Modern Photography, 44, 98-101 (1981). 7. D. O'Neill, "Agfa Vario XL vs Ilford XP1," Camera 35, 26, 56-59, 72-73 (1981). 8. D. C. Hubbell, R. G. ... p. 7. 18. Agfa-Gevaert Technical Bulletin, "Agfapan Vario-XL Professional Film," Dec. 16, 1980. 19. Bard, et al., p. 43. 20. A. D. Rickmers
Bistatic SAR: Signal Processing and Image Formation.
Wahl, Daniel E.; Yocky, David A.
2014-10-01
This report describes the significant processing steps that were used to take the raw recorded digitized signals from the bistatic synthetic aperture radar (SAR) hardware built for the NCNS Bistatic SAR project to a final bistatic SAR image. In general, the process steps herein are applicable to bistatic SAR signals that include the direct-path signal and the reflected signal. The steps include preprocessing, data extraction to form a phase history, and finally, image formation. Various plots and values are shown at most steps to illustrate the processing for a bistatic COSMO-SkyMed collection gathered on June 10, 2013, at Kirtland Air Force Base, New Mexico.
Palm print image processing with PCNN
NASA Astrophysics Data System (ADS)
Yang, Jun; Zhao, Xianhong
2010-08-01
Pulse-coupled neural networks (PCNN) are based on Eckhorn's model of the cat visual cortex and imitate mammalian visual processing; palm prints, in turn, have long been used as a personal biometric feature. This inspired us to combine the two: a novel method for palm print processing is proposed, which includes pre-processing and feature extraction of the palm print image using PCNN; the extracted features are then used for identification. Our experiment shows that a verification rate of 87.5% can be achieved under ideal conditions. We also find that the verification rate decreases due to rotation or shift of the palm.
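The pulsing mechanism the abstract relies on can be sketched with a simplified textbook PCNN iteration (this is not the authors' implementation; the parameters beta, alpha_t, and v_t are illustrative). Each neuron's internal activity is its pixel intensity modulated by linking input from neighbouring pulses, compared against a decaying threshold:

```python
import numpy as np

def link_input(Y):
    """8-neighbour sum of the previous pulse output (zero-padded borders)."""
    P = np.pad(Y, 1)
    return (P[:-2, :-2] + P[:-2, 1:-1] + P[:-2, 2:] +
            P[1:-1, :-2]               + P[1:-1, 2:] +
            P[2:, :-2]  + P[2:, 1:-1]  + P[2:, 2:])

def pcnn(img, steps=10, beta=0.2, alpha_t=0.3, v_t=20.0):
    """Simplified PCNN: returns the first-firing-time map (0 = never fired)."""
    F = img.astype(float)                  # feeding input = pixel intensity
    Y = np.zeros_like(F)                   # pulse output
    theta = np.full_like(F, F.max())       # dynamic threshold
    fire_time = np.zeros_like(F)
    for t in range(1, steps + 1):
        U = F * (1.0 + beta * link_input(Y))       # modulated internal activity
        Y = (U > theta).astype(float)              # neurons pulse when U exceeds theta
        theta = theta * np.exp(-alpha_t) + v_t * Y  # decay, plus a jump on firing
        fire_time[(fire_time == 0) & (Y == 1)] = t
    return fire_time
```

Brighter pixels fire earlier, and linking pulls similar neighbouring pixels into synchronous firing; the firing-time map can then serve as a feature image.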
ERIC Educational Resources Information Center
Davis, C. E.; Osler, James E.
2013-01-01
This paper details the outcomes of a qualitative in-depth investigation into teacher education mathematics preparation. This research is grounded in the notion that mathematics teacher education students (as "degree seeking candidates") need to develop strong foundations of mathematical practice as defined by the Common Core State…
NASA Astrophysics Data System (ADS)
Pesenson, M.; Roby, W.; Helou, G.; McCollum, B.; Ly, L.; Wu, X.; Laine, S.; Hartley, B.
2008-08-01
A new application framework for advanced image processing for astronomy is presented. It implements standard two-dimensional operators, and recent developments in the field of non-astronomical image processing (IP), as well as original algorithms based on nonlinear partial differential equations (PDE). These algorithms are especially well suited for multi-scale astronomical images since they increase signal to noise ratio without smearing localized and diffuse objects. The visualization component is based on the extensive tools that we developed for Spitzer Space Telescope's observation planning tool Spot and archive retrieval tool Leopard. It contains many common features, combines images in new and unique ways and interfaces with many astronomy data archives. Both interactive and batch mode processing are incorporated. In the interactive mode, the user can set up simple processing pipelines, and monitor and visualize the resulting images from each step of the processing stream. The system is platform-independent and has an open architecture that allows extensibility by addition of plug-ins. This presentation addresses astronomical applications of traditional topics of IP (image enhancement, image segmentation) as well as emerging new topics like automated image quality assessment (QA) and feature extraction, which have potential for shaping future developments in the field. Our application framework embodies a novel synergistic approach based on integration of image processing, image visualization and image QA (iQA).
3D seismic image processing for interpretation
NASA Astrophysics Data System (ADS)
Wu, Xinming
Extracting fault, unconformity, and horizon surfaces from a seismic image is useful for interpretation of geologic structures and stratigraphic features. Although interpretation of these surfaces has been automated to some extent by others, significant manual effort is still required for extracting each type of these geologic surfaces. I propose methods to automatically extract all the fault, unconformity, and horizon surfaces from a 3D seismic image. To a large degree, these methods involve only image processing or array processing, achieved by efficiently solving partial differential equations. For fault interpretation, I propose a linked data structure, which is simpler than triangle or quad meshes, to represent a fault surface. In this simple data structure, each sample of a fault corresponds to exactly one image sample. Using this linked data structure, I extract complete and intersecting fault surfaces without holes from 3D seismic images. I use the same structure in subsequent processing to estimate fault slip vectors. I further propose two methods, using precomputed fault surfaces and slips, to undo faulting in seismic images by simultaneously moving fault blocks and faults themselves. For unconformity interpretation, I first propose a new method to compute an unconformity likelihood image that highlights both the termination areas and the corresponding parallel unconformities and correlative conformities. I then extract unconformity surfaces from the likelihood image and use these surfaces as constraints to more accurately estimate seismic normal vectors that are discontinuous near the unconformities. Finally, I use the estimated normal vectors and the unconformities as constraints to compute a flattened image, in which seismic reflectors are all flat and vertical gaps correspond to the unconformities. Horizon extraction is straightforward after computing a map of image flattening; we can first extract horizontal slices in the flattened space
Digital-image processing and image analysis of glacier ice
Fitzpatrick, Joan J.
2013-01-01
This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
Employing image processing techniques for cancer detection using microarray images.
Dehghan Khalilabad, Nastaran; Hassanpour, Hamid
2017-02-01
Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes), and extracting raw data from the images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, cancerous cases are recognized from the extracted data. To evaluate the performance of the proposed system, microarray databases are employed which include breast cancer, myeloid leukemia, and lymphoma samples from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively.
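The gridding step (locating genes) can be illustrated with intensity projections: spot rows and columns appear as peaks in the row and column sums of the image. A toy sketch, not the authors' algorithm, with a hypothetical `grid_positions` helper:

```python
import numpy as np

def grid_positions(projection, threshold=None):
    """Centres of contiguous above-threshold runs in a 1-D intensity projection."""
    if threshold is None:
        threshold = projection.mean()
    above = projection > threshold
    centres, start = [], None
    for i, a in enumerate(above):
        if a and start is None:
            start = i                       # run begins
        elif not a and start is not None:
            centres.append((start + i - 1) // 2)  # run ends: record its centre
            start = None
    if start is not None:                   # run extends to the array edge
        centres.append((start + len(above) - 1) // 2)
    return centres

# Synthetic 2x3 spot grid: 5x5 bright spots centred on rows 10, 30 and cols 10, 30, 50
img = np.zeros((40, 60))
for r in (10, 30):
    for c in (10, 30, 50):
        img[r - 2:r + 3, c - 2:c + 3] = 1.0
rows = grid_positions(img.sum(axis=1))
cols = grid_positions(img.sum(axis=0))
```

Intersecting the detected row and column positions yields the grid of spot centres from which raw expression values can be sampled.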
The mathematical model for synthesis process management of the carbon nanostructures
NASA Astrophysics Data System (ADS)
Chistyakova, T. B.; Petrov, D. N.
2017-01-01
This article describes the key difficulties of managing the carbon nanostructure synthesis process. Tasks of optimal control of the synthesis process, and of management in emergency situations, are formulated. A mathematical model of carbon nanostructure synthesis is offered, with equations for calculating quantitative and qualitative indexes and indicators of the safety and operability of the engineering procedure. The use of a mathematical model of carbon nanostructure synthesis is motivated by improvement of quality and quantity, a decrease in the production cost of carbon nanostructures, and an increase in the safety of the engineering procedure for obtaining them. The mathematical model was tested and validated on an industrial fullerene production line, and its suitability for production control, both in the optimal-control mode and for management in emergency situations, is confirmed. The obtained solution is recommended for implementation at enterprises of a similar purpose.
A low-cost vector processor boosting compute-intensive image processing operations
NASA Technical Reports Server (NTRS)
Adorf, Hans-Martin
1992-01-01
Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation of the standard Tarasko-Richardson-Lucy restoration algorithm is presented on an Intel i860-based VP board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
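The Richardson-Lucy iteration named above is compact enough to sketch. This NumPy version is illustrative only (the paper's implementation ran on i860 vector hardware); the multiplicative update is estimate <- estimate * (psf* corr (observed / (psf conv estimate))):

```python
import numpy as np

def convolve_same(x, k):
    """2-D 'same'-size correlation via zero padding (small kernels only).
    For the symmetric PSFs used here, correlation equals convolution."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * xp[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def richardson_lucy(observed, psf, iterations=20, eps=1e-12):
    """Richardson-Lucy deconvolution with a flat initial estimate."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]            # adjoint of the blur operator
    for _ in range(iterations):
        blurred = convolve_same(estimate, psf)
        ratio = observed / (blurred + eps)  # data / model
        estimate *= convolve_same(ratio, psf_mirror)
    return estimate
```

The update preserves non-negativity and (approximately) total flux, which is why the algorithm is a standard choice for astronomical restoration.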
Fundamental Concepts of Digital Image Processing
DOE R&D Accomplishments Database
Twogood, R. E.
1983-03-01
The field of digital image processing has experienced dramatic growth and increasingly widespread applicability in recent years. Fortunately, advances in computer technology have kept pace with the rapid growth in the volume of image data in these and other applications. Digital image processing has become economical in many fields of research and in industrial and military applications. While each application has requirements unique from the others, all are concerned with faster, cheaper, more accurate, and more extensive computation. The trend is toward real-time and interactive operations, where the user of the system obtains preliminary results within a short enough time that the next decision can be made by the human operator without loss of concentration on the task at hand. An example of this is the obtaining of two-dimensional (2-D) computer-aided tomography (CAT) images. A medical decision might be made while the patient is still under observation rather than days later.
A Pipeline Tool for CCD Image Processing
NASA Astrophysics Data System (ADS)
Bell, Jon F.; Young, Peter J.; Roberts, William H.; Sebo, Kim M.
MSSSO is part of a collaboration developing a wide-field imaging CCD mosaic (WFI). As part of this project, we have developed a GUI-based pipeline tool that is an integrated part of MSSSO's CICADA data acquisition environment and processes CCD FITS images as they are acquired. The tool is also designed to run as a stand-alone program to process previously acquired data. IRAF tasks are used as the central engine, including the new NOAO mscred package for processing multi-extension FITS files. The STScI OPUS pipeline environment may be used to manage data and process scheduling. The Motif GUI was developed using SUN Visual Workshop. C++ classes were written to facilitate launching of IRAF and OPUS tasks. While this first version implements calibration processing up to and including flat-field corrections, there is scope to extend it to other processing.
Thermal Imaging Processes of Polymer Nanocomposite Coatings
NASA Astrophysics Data System (ADS)
Meth, Jeffrey
2015-03-01
Laser-induced thermal imaging (LITI) is a process whereby infrared radiation impinging on a coating on a donor film transfers that coating to a receiving film to produce a pattern. This talk describes how LITI patterning can print color filters for liquid crystal displays, and details the physical processes that are responsible for transferring the nanocomposite coating in a coherent manner that does not degrade its optical properties. Unique features of this process involve heating rates of 10^7 K/s and cooling rates of 10^4 K/s, which implies that not all of the relaxation modes of the polymer are accessed during the imaging process. On the microsecond time scale, the polymer flow is forced by devolatilization of solvents, followed by deformation akin to the constrained blister test, and then fracture caused by differential thermal expansion. The unique combination of disparate physical processes demonstrates the gamut of physics that contributes to advanced material processing in an industrial setting.
Shape in Picture: Mathematical Description of Shape in Grey-Level Images
1992-09-11
vision, and artificial intelligence all make use of descriptions of shape in grey-level images. Most existing algorithms for the automatic recognition and ... analysis, computer vision, and artificial intelligence. The NATO Advanced Research Workshop "Shape in Picture" was organised with a twofold objective ... Annals of Maths. and Artificial Intelligence, special issue on "Mathematics in Pattern Recognition", to appear. 46. Ronse, C. (1989). Fourier analysis
Radiographic image processing for industrial applications
NASA Astrophysics Data System (ADS)
Dowling, Martin J.; Kinsella, Timothy E.; Bartels, Keith A.; Light, Glenn M.
1998-03-01
One advantage of working with digital images is the opportunity for enhancement. While it is important to preserve the original image, variations can be generated that yield greater understanding of object properties. It is often possible to effectively increase dynamic range, improve contrast in regions of interest, emphasize subtle features, reduce background noise, and provide more robust detection of faults. This paper describes and illustrates some of these processes using real world examples.
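One standard way to "effectively increase dynamic range" as described above is histogram equalisation, which spreads the occupied grey levels over the full range. A minimal 8-bit sketch (illustrative; the paper's specific enhancement chain is not given):

```python
import numpy as np

def equalize(img, levels=256):
    """Histogram equalisation for an 8-bit greyscale image (uint8 array)."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    if cdf[-1] == cdf_min:          # flat image: nothing to equalise
        return img.copy()
    # Map each grey level through the normalised cumulative histogram.
    lut = np.clip(np.round((cdf - cdf_min) * (levels - 1) / (cdf[-1] - cdf_min)),
                  0, levels - 1).astype(img.dtype)
    return lut[img]
```

A region-of-interest variant simply builds the lookup table from the histogram of the ROI alone, which is one way to "improve contrast in regions of interest".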
Image processing of angiograms: A pilot study
NASA Technical Reports Server (NTRS)
Larsen, L. E.; Evans, R. A.; Roehm, J. O., Jr.
1974-01-01
The technology transfer application this report describes is the result of a pilot study of image-processing methods applied to the image enhancement, coding, and analysis of arteriograms. Angiography is a subspecialty of radiology that employs the introduction of media with high X-ray absorption into arteries in order to study vessel pathology as well as to infer disease of the organs supplied by the vessel in question.
A Method for Identifying Contours in Processing Digital Images from Computer Tomograph
NASA Astrophysics Data System (ADS)
Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela
2011-09-01
The first step in digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the joint work of specialists in medicine and in applied mathematics and computer science on elaborating new algorithms and methods for medical 2D and 3D imagery.
System identification by video image processing
NASA Astrophysics Data System (ADS)
Shinozuka, Masanobu; Chung, Hung-Chi; Ichitsubo, Makoto; Liang, Jianwen
2001-07-01
Emerging image processing techniques demonstrate their potential applications in earthquake engineering, particularly in the area of system identification. In this respect, the objectives of this research are to demonstrate the underlying principle that permits system identification, non-intrusively and remotely, with the aid of a video camera and, as a proof of concept, to apply the principle to a system identification problem involving relative motion, on the basis of the images. In structural control, accelerations at different stories of a building are usually measured and fed back for processing and control. As an alternative, this study attempts to identify the relative motion between different stories of a building for the purpose of on-line structural control by digitizing the images taken by a video camera. For this purpose, the video image of the vibration of a structure base-isolated by a friction device on a shaking table was used successfully to observe relative displacement between the isolated structure and the shaking table. This proof-of-concept experiment demonstrates that the proposed identification method based on digital image processing can be used, with appropriate modifications, to identify many other engineering-significant quantities remotely. In addition to the system identification study in structural dynamics mentioned above, a preliminary study is described involving the video imaging of the state of crack damage of road and highway pavement.
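At its simplest, tracking relative displacement from video frames reduces to locating a reference template in each frame. A brute-force normalised cross-correlation sketch (illustrative only; practical systems add subpixel refinement and camera calibration):

```python
import numpy as np

def locate(template, frame):
    """Best-match (row, col) of template in frame by normalised cross-correlation."""
    th, tw = template.shape
    fh, fw = frame.shape
    tz = template - template.mean()
    best, pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            win = frame[r:r + th, c:c + tw]
            wz = win - win.mean()
            denom = np.sqrt((tz * tz).sum() * (wz * wz).sum())
            score = (tz * wz).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, pos = score, (r, c)
    return pos
```

Tracking the same target (e.g. a marker on one story of the structure) across frames and differencing the positions of two targets yields the relative displacement history.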
Fu, Chi-Yung; Petrich, Loren I.
1997-01-01
An image represented in a first image array of pixels is first decimated in two dimensions before being compressed by a predefined compression algorithm such as JPEG. Another possible predefined compression algorithm can involve a wavelet technique. The compressed, reduced image is then transmitted over the limited bandwidth transmission medium, and the transmitted image is decompressed using an algorithm which is an inverse of the predefined compression algorithm (such as reverse JPEG). The decompressed, reduced image is then interpolated back to its original array size. Edges (contours) in the image are then sharpened to enhance the perceptual quality of the reconstructed image. Specific sharpening techniques are described.
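The decimate, compress, transmit, decompress, interpolate, sharpen pipeline can be sketched end to end. In this sketch the JPEG (or wavelet) stage is replaced by an identity placeholder, so only the resolution-reduction and edge-sharpening stages are shown; this is an illustration, not the patented method:

```python
import numpy as np

def decimate(img, f=2):
    """Average f x f blocks: simple anti-aliased decimation."""
    h, w = img.shape
    return img[:h - h % f, :w - w % f].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def interpolate(img, f=2):
    """Nearest-neighbour upsampling back to the original array size."""
    return img.repeat(f, axis=0).repeat(f, axis=1)

def box_blur(img):
    """3x3 mean filter with edge replication."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def sharpen(img, amount=1.0):
    """Unsharp masking: add back the high-frequency residual to restore edges."""
    return img + amount * (img - box_blur(img))

def pipeline(img):
    reduced = decimate(img)   # sender side: shrink before compression
    received = reduced        # JPEG encode / transmit / decode would go here
    return sharpen(interpolate(received))
```

Decimating before compression shrinks the payload for the limited-bandwidth link, and the sharpening pass compensates for the edges softened by decimation and interpolation.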
Support Routines for In Situ Image Processing
NASA Technical Reports Server (NTRS)
Deen, Robert G.; Pariser, Oleg; Yeates, Matthew C.; Lee, Hyun H.; Lorre, Jean
2013-01-01
This software consists of a set of application programs that support ground-based image processing for in situ missions. These programs represent a collection of utility routines that perform miscellaneous functions in the context of the ground data system. Each one fulfills some specific need as determined via operational experience. The most unique aspect of these programs is that they are integrated into the large, in situ image processing system via the PIG (Planetary Image Geometry) library. They work directly with in situ mission data, understanding the appropriate image meta-data fields and updating them properly. The programs themselves are completely multimission; all mission dependencies are handled by PIG. This suite of programs consists of: (1) marscahv: Generates a linearized, epipolar-aligned image given a stereo pair of images. These images are optimized for 1-D stereo correlations. (2) marscheckcm: Compares the camera model in an image label with one derived via kinematics modeling on the ground. (3) marschkovl: Checks the overlaps between a list of images in order to determine which might be stereo pairs. This is useful for non-traditional stereo images like long-baseline or those from an articulating arm camera. (4) marscoordtrans: Translates mosaic coordinates from one form into another. (5) marsdispcompare: Checks a Left-Right stereo disparity image against a Right-Left disparity image to ensure they are consistent with each other. (6) marsdispwarp: Takes one image of a stereo pair and warps it through a disparity map to create a synthetic opposite-eye image. For example, a right-eye image could be transformed to look like it was taken from the left eye via this program. (7) marsfidfinder: Finds fiducial markers in an image by projecting their approximate location and then using correlation to locate the markers to subpixel accuracy. These fiducial markers are small targets attached to the spacecraft surface. This helps verify, or improve, the
Huang, Jian; Du, Feng-lei; Yao, Yuan; Wan, Qun; Wang, Xiao-Song; Chen, Fei-Yan
2015-08-01
Distance effect has been regarded as the best established marker of basic numerical magnitude processes and is related to individual mathematical abilities. A larger behavioral distance effect is suggested to be concomitant with lower mathematical achievement in children. However, the relationship between distance effect and superior mathematical abilities is unclear. One could get superior mathematical abilities by acquiring the skill of abacus-based mental calculation (AMC), which can be used to solve calculation problems with exceptional speed and high accuracy. In the current study, we explore the relationship between distance effect and superior mathematical abilities by examining whether and how the AMC training modifies numerical magnitude processing. Thus, mathematical competencies were tested in 18 abacus-trained children (who accepted the AMC training) and 18 non-trained children. Electroencephalography (EEG) waveforms were recorded when these children executed numerical comparison tasks in both Arabic digit and dot array forms. We found that: (a) the abacus-trained group had superior mathematical abilities than their peers; (b) distance effects were found both in behavioral results and on EEG waveforms; (c) the distance effect size of the average amplitude on the late negative-going component was different between groups in the digit task, with a larger effect size for abacus-trained children; (d) both the behavioral and EEG distance effects were modulated by the notation. These results revealed that the neural substrates of magnitude processing were modified by AMC training, and suggested that the mechanism of the representation of numerical magnitude for children with superior mathematical abilities was different from their peers. In addition, the results provide evidence for a view of non-abstract numerical representation.
NASA Technical Reports Server (NTRS)
Bernstein, R.
1973-01-01
ERTS-1 MSS and RBV data recorded on computer compatible tapes have been analyzed and processed, and preliminary results have been obtained. No degradation of intensity (radiance) information occurred in implementing the geometric correction. The quality and resolution of the digitally processed images are very good, due primarily to the fact that the number of film generations and conversions is reduced to a minimum. Processing times of digitally processed images are about equivalent to the NDPF electro-optical processor.
Cognitive components of a mathematical processing network in 9-year-old children.
Szűcs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence
2014-07-01
We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are phonological processing, verbal knowledge, visuo-spatial short-term and working memory, spatial ability and general executive functioning. The model was highly specific to predicting arithmetic performance. There were no strong relations between mathematical achievement and verbal short-term and working memory, sustained attention, response inhibition, finger knowledge and symbolic number comparison performance. Non-verbal intelligence measures were also non-significant predictors when added to our model. Number sense variables were non-significant predictors in the model and they were also non-significant predictors when entered into regression analysis with only a single visuo-spatial WM measure. Number sense variables were predicted by sustained attention. Results support a network theory of mathematical competence in primary school children and falsify the importance of a proposed modular 'number sense'. We suggest an 'executive memory function centric' model of mathematical processing. Mapping a complex processing network requires that studies consider the complex predictor space of mathematics rather than just focusing on a single or a few explanatory factors.
Mathematical simulation of hemodynamical processes and medical technologies
NASA Astrophysics Data System (ADS)
Tsitsyura, Nadiya; Novyc'kyy, Victor V.; Lushchyk, Ulyana B.
2001-06-01
Vascular pathologies constitute a significant part of human diseases, and their rate tends to increase. Numerous investigations of brain blood flow in normal and pathological conditions have created a new branch of modern medicine: angioneurology. It combines information on brain angioarchitecture and on blood supply in normal and pathological conditions. Investigating the development of a disease is an important problem of modern medicine. Cerebral blood supply is regulated by arterial inflow and venous outflow, but in the available literature the arterial and venous beds are considered separately. This causes a one-sided interpretation of atherosclerotic and discirculatory encephalopathies. As arterial inflow and venous outflow are interrelated, it seems expedient to perform a complex estimation of arteriovenous interactions, establish a correlation between the beds, and find the dependence in the form of a mathematical function, with the results displayed clearly in graphs. A total of 139 patients aged 2 to 70 were examined at the 'Istyna' Scientific Medical Ultrasound Center with a Logidop 2 apparatus (Kranzbuhler, Germany), using a technique of ultrasound location of cerebral arteries and veins (invented and patented by Ulyana Lushchyk, State Patent of Ukraine N10262 of 19/07/1995). A clinical interpretation of the obtained results was performed. With this technique and ultrasound Dopplerography, the blood flow in major head and cervical arteries was investigated. In the visual graphic analysis, we paid attention to changes of the hemodynamical parameters of the carotid artery (CA), internal jugular vein (IJV) and supratrochlear artery (STA). Generally accepted blood flow parameters were measured: FS, the maximal systolic frequency, and FD, the minimal diastolic frequency. The correlation between different combinations of parameters in the vessels mentioned
De Smedt, Bert; Gilmore, Camilla K
2011-02-01
This study examined numerical magnitude processing in first graders with severe and mild forms of mathematical difficulties, children with mathematics learning disabilities (MLD) and children with low achievement (LA) in mathematics, respectively. In total, 20 children with MLD, 21 children with LA, and 41 regular achievers completed a numerical magnitude comparison task and an approximate addition task, which were presented in a symbolic and a nonsymbolic (dot arrays) format. Children with MLD and LA were impaired on tasks that involved the access of numerical magnitude information from symbolic representations, with the LA children showing a less severe performance pattern than children with MLD. They showed no deficits in accessing magnitude from underlying nonsymbolic magnitude representations. Our findings indicate that this performance pattern occurs in children from first grade onward and generalizes beyond numerical magnitude comparison tasks. These findings shed light on the types of intervention that may help children who struggle with learning mathematics.
Processing Images of Craters for Spacecraft Navigation
NASA Technical Reports Server (NTRS)
Cheng, Yang; Johnson, Andrew E.; Matthies, Larry H.
2009-01-01
A crater-detection algorithm has been conceived to enable automation of what, heretofore, have been manual processes for utilizing images of craters on a celestial body as landmarks for navigating a spacecraft flying near or landing on that body. The images are acquired by an electronic camera aboard the spacecraft, then digitized, then processed by the algorithm, which consists mainly of the following steps: 1. Edges in an image are detected and placed in a database. 2. Crater rim edges are selected from the edge database. 3. Edges that belong to the same crater are grouped together. 4. An ellipse is fitted to each group of crater edges. 5. Ellipses are refined directly in the image domain to reduce errors introduced in the detection of edges and fitting of ellipses. 6. The quality of each detected crater is evaluated. It is planned to utilize this algorithm as the basis of a computer program for automated, real-time, onboard processing of crater-image data. Experimental studies have led to the conclusion that this algorithm is capable of a detection rate >93 percent, a false-alarm rate <5 percent, a geometric error <0.5 pixel, and a position error <0.3 pixel.
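Step 4, fitting an ellipse to a group of crater-rim edge points, can be sketched as a linear least-squares conic fit (one common choice; the paper's exact fitting method is not specified here):

```python
import numpy as np

def fit_ellipse(x, y):
    """Least-squares fit of a general conic a*x^2 + b*x*y + c*y^2 + d*x
    + e*y = 1 to edge points (i.e. normalising the constant term)."""
    D = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(D, np.ones_like(x), rcond=None)
    return coef  # (a, b, c, d, e)

# synthetic crater rim: ellipse centred at (10, 5) with semi-axes 4 and 2
t = np.linspace(0.0, 2.0 * np.pi, 100)
x = 10.0 + 4.0 * np.cos(t)
y = 5.0 + 2.0 * np.sin(t)
a, b, c, d, e = fit_ellipse(x, y)
residual = np.max(np.abs(a*x*x + b*x*y + c*y*y + d*x + e*y - 1.0))
```

Step 5 of the algorithm would then refine such a fit against the image intensities directly.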
[Image processing of early gastric cancer cases].
Inamoto, K; Umeda, T; Inamura, K
1992-11-25
Computer image processing was used to enhance gastric lesions in order to improve the detection of stomach cancer. Digitization was performed in 25 cases of early gastric cancer that had been confirmed surgically and pathologically. The image processing consisted of grey scale transformation, edge enhancement (Sobel operator), and high-pass filtering (unsharp masking). Grey scale transformation improved image quality for the detection of gastric lesions. The Sobel operator enhanced linear and curved margins and, consequently, suppressed the rest. High-pass filtering with unsharp masking was superior for visualizing the texture pattern on the mucosa. Eight of 10 small lesions (less than 2.0 cm) were successfully demonstrated. However, the detection of two lesions in the antrum was difficult even with the aid of image enhancement. In the other 15 lesions (more than 2.0 cm), the tumor surface pattern and the margin between the tumor and non-pathological mucosa were clearly visualized. Image processing was considered to contribute to the detection of small early gastric cancer lesions by enhancing the pathological lesions.
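The Sobel edge enhancement and unsharp-masking steps can be sketched in pure NumPy on a synthetic image (the kernel sizes and the 1.5 sharpening weight are illustrative assumptions):

```python
import numpy as np

def filter2(a, k):
    """2-D cross-correlation with edge padding (sufficient for this sketch)."""
    kh, kw = k.shape
    pad = np.pad(a, ((kh // 2,) * 2, (kw // 2,) * 2), mode="edge")
    out = np.zeros_like(a)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * pad[i:i + a.shape[0], j:j + a.shape[1]]
    return out

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0            # synthetic "lesion"

# edge enhancement: Sobel gradient magnitude
kx = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], dtype=float)
edges = np.hypot(filter2(img, kx), filter2(img, kx.T))

# unsharp masking: add back a weighted high-pass residual
blurred = filter2(img, np.full((3, 3), 1.0 / 9.0))
unsharp = img + 1.5 * (img - blurred)
```

The `edges` map highlights the lesion margin, while `unsharp` boosts local contrast above the original intensity range at texture boundaries.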
Mathematical modeling of material behaviors in the fracture process zone
NASA Astrophysics Data System (ADS)
Zhu, Mingcheng
2000-10-01
This Ph.D. research focuses on employing cohesive crack models to investigate fracture process zone behavior. The main contributions are summarized as follows: (1) A generalized mixed mode Dugdale model is developed. Research shows that crack interaction will result in highly nonsymmetrical fracture process zone behavior. The nonsymmetrical fracture process zone behavior may be important in the evaluation of effective properties of cracked materials if the local unsymmetrical loading induced by neighboring crack interactions cannot be ignored. (2) A closed form solution of the stress history effect on the mixed mode Dugdale crack is obtained. A numerical procedure is then proposed for studying the residual stress behavior of the loading- and unloading-path-dependent Dugdale crack. (3) A general weight function method is developed for simulating the fracture process zone behavior. With this method the fracture process zone behavior can be easily simulated with singular solutions. (4) A numerical procedure is developed to investigate the strain-hardening or strain-softening effect on the Dugdale crack. Numerical examples show that, for a given Jc, the far-field failure stress of strain-hardening or strain-softening materials is very close to the Dugdale solution, which implies that the fracture failure criteria used in elastic-plastic materials can be extended to strain-hardening or strain-softening materials in the static loading situation. Stress distributions in the process zone have been calculated for several strain-hardening and strain-softening materials. An empirical equation of power-law type is proposed to represent the stress distribution as a function of the position in the process zone. It is shown that the power-law index varies linearly with the size of the fracture process zone. For static loading, Jc is the controlling parameter and the fracture process zone behavior is a secondary issue.
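The empirical power-law representation of the process-zone stress described above can be written, in assumed notation (the abstract does not give the paper's symbols), as:

```latex
\sigma(x) \;=\; \sigma_c \left( \frac{x}{l_p} \right)^{m},
\qquad m \;=\; \alpha + \beta\, l_p ,
```

where $x$ is the position in the process zone, $l_p$ the process-zone size, $\sigma_c$ a reference stress, and $\alpha$, $\beta$ fitting constants expressing the stated linear variation of the power-law index with process-zone size.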
Onboard Image Processing System for Hyperspectral Sensor.
Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun
2015-09-25
Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large-volume and high-speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed and is implemented in the onboard correction circuitry for the sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance in image decorrelation and entropy coding, we apply two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance in terms of speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reduced onboard hardware by multiplexing sensor signals into a smaller number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost.
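The Golomb-Rice entropy-coding stage can be illustrated with a minimal Rice coder for nonnegative integers (a sketch; FELICS additionally maps signed prediction residuals to nonnegative values and adapts the parameter k per context):

```python
def rice_encode(values, k):
    """Rice-code nonnegative integers with parameter k: the quotient
    v >> k in unary (ones terminated by a zero), then k remainder bits."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits += [1] * q + [0] + [(r >> i) & 1 for i in reversed(range(k))]
    return bits

def rice_decode(bits, k, n):
    """Decode n values from a Rice-coded bit list."""
    out, pos = [], 0
    for _ in range(n):
        q = 0
        while bits[pos] == 1:          # unary quotient
            q += 1
            pos += 1
        pos += 1                       # skip the terminating 0
        r = 0
        for _ in range(k):             # k-bit remainder
            r = (r << 1) | bits[pos]
            pos += 1
        out.append((q << k) | r)
    return out

data = [3, 0, 7, 12, 1]
coded = rice_encode(data, k=2)
```

Small residuals produce short codes, which is why good prediction (decorrelation) directly improves the compression ratio.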
Feedback regulation of microscopes by image processing.
Tsukada, Yuki; Hashimoto, Koichi
2013-05-01
Computational microscope systems are becoming a major part of imaging biological phenomena, and the development of such systems requires the design of automated regulation of microscopes. An important aspect of automated regulation is feedback regulation, which is the focus of this review. As modern microscope systems become more complex, often with many independent components that must work together, computer control is inevitable, since the exact orchestration of parameters and timings for these multiple components is critical to acquiring proper images. A number of techniques have been developed in biological imaging to accomplish this. Here, we summarize the basics of computational microscopy for the purpose of building automatically regulated microscopes, focusing on feedback regulation by image processing. These techniques allow high-throughput data acquisition while monitoring both short- and long-term dynamic phenomena, which cannot be achieved without an automated system.
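A minimal example of feedback regulation by image processing is a hill-climbing autofocus loop driven by a variance-of-Laplacian sharpness measure (an illustrative construction, not a system from the review; the "microscope" here is a toy renderer):

```python
import numpy as np

def sharpness(img):
    """Variance-of-Laplacian focus measure, a common feedback signal."""
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def render(defocus):
    """Toy 'microscope': a bright square blurred by repeated averaging;
    larger defocus means more blur."""
    img = np.zeros((40, 40))
    img[15:25, 15:25] = 1.0
    for _ in range(defocus):
        img = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5.0
    return img

# feedback loop: step the (integer) focus position toward higher sharpness
z = 6
for _ in range(20):
    if sharpness(render(z - 1)) > sharpness(render(z)):
        z -= 1
    elif sharpness(render(z + 1)) > sharpness(render(z)):
        z += 1
    else:
        break
```

The loop converges to the in-focus position because each acquired image is immediately analyzed and fed back into the actuator command.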
Enhanced neutron imaging detector using optical processing
Hutchinson, D.P.; McElhaney, S.A.
1992-01-01
Existing neutron imaging detectors have limited count rates due to inherent material properties and electronic limitations. The popular multiwire proportional counter is limited by gas recombination to a count rate of less than 10^5 n/s over the entire array, and the neutron Anger camera, even though improved with new fiber optic encoding methods, can only achieve 10^6 cps over a limited array. We present a preliminary design for a new type of neutron imaging detector with a resolution of 2--5 mm and a count rate capability of 10^6 cps per pixel element. We propose to combine optical and electronic processing to economically increase the throughput of advanced detector systems while simplifying computing requirements. By placing a scintillator screen ahead of an optical image processor followed by a detector array, a high-throughput imaging detector may be constructed.
3D integral imaging with optical processing
NASA Astrophysics Data System (ADS)
Martínez-Corral, Manuel; Martínez-Cuenca, Raúl; Saavedra, Genaro; Javidi, Bahram
2008-04-01
Integral imaging (InI) systems are imaging devices that provide auto-stereoscopic images of 3D intensity objects. Since the birth of this new technology, InI systems have satisfactorily overcome many of their initial drawbacks. Basically, two kinds of procedures have been used: digital and optical. The "3D Imaging and Display Group" at the University of Valencia, with the essential collaboration of Prof. Javidi, has centered its efforts on 3D InI with optical processing. Among other achievements, our Group has proposed annular amplitude modulation for enlargement of the depth of field, dynamic focusing for reduction of the facet-braiding effect, and the TRES and MATRES devices to enlarge the viewing angle.
Simplified labeling process for medical image segmentation.
Gao, Mingchen; Huang, Junzhou; Huang, Xiaolei; Zhang, Shaoting; Metaxas, Dimitris N
2012-01-01
Image segmentation plays a crucial role in many medical imaging applications by automatically locating the regions of interest. Typically, supervised learning based segmentation methods require a large set of accurately labeled training data. However, the labeling process is tedious, time consuming and sometimes not necessary. We propose a robust logistic regression algorithm to handle label outliers such that doctors do not need to waste time on precisely labeling images for the training set. To validate its effectiveness and efficiency, we conduct carefully designed experiments on cervigram image segmentation in the presence of label outliers. Experimental results show that the proposed robust logistic regression algorithms achieve superior performance compared to previous methods, which validates the benefits of the proposed algorithms.
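One generic way to robustify logistic regression against label outliers is to clip per-sample gradients so that a few mislabeled points cannot dominate the update (this is an illustration of the problem setting, not the paper's specific algorithm):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def robust_logreg(X, y, n_iter=200, lr=0.5, clip=2.0):
    """Gradient-descent logistic regression with per-sample gradient
    clipping (a generic robustification; hyperparameters are illustrative)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = (sigmoid(X @ w) - y)[:, None] * X   # per-sample gradients
        g = np.clip(g, -clip, clip)             # bound outlier influence
        w -= lr * g.mean(axis=0)
    return w

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 1)), rng.normal(2, 0.5, (20, 1))])
X = np.hstack([X, np.ones((40, 1))])            # bias column
y_clean = np.r_[np.zeros(20), np.ones(20)]
y = y_clean.copy()
y[0] = 1.0                                      # inject one label outlier
w = robust_logreg(X, y)
acc = ((sigmoid(X @ w) > 0.5) == y_clean).mean()
```

Despite training on a corrupted label, the classifier still recovers the clean decision boundary.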
NASA Astrophysics Data System (ADS)
Putri, Arrival Rince; Nova, Tertia Delia; Watanabe, M.
2016-02-01
Bird flu infection processes within a poultry farm are formulated mathematically. A spatial effect is taken into account for the virus concentration with a diffusive term. The infection process is represented in terms of traveling wave solutions. For a small removal rate, a singular perturbation analysis leads to the existence of traveling wave solutions that correspond to progressive infection in one direction.
Cognitive Components of a Mathematical Processing Network in 9-Year-Old Children
ERIC Educational Resources Information Center
Szucs, Dénes; Devine, Amy; Soltesz, Fruzsina; Nobes, Alison; Gabriel, Florence
2014-01-01
We determined how various cognitive abilities, including several measures of a proposed domain-specific number sense, relate to mathematical competence in nearly 100 9-year-old children with normal reading skill. Results are consistent with an extended number processing network and suggest that important processing nodes of this network are…
PASS Processes and Early Mathematics Skills in Dutch and Italian Kindergarteners
ERIC Educational Resources Information Center
Kroesbergen, Evelyn H.; Van Luit, Johannes E. H.; Naglieri, Jack A.; Taddei, Stefano; Franchi, Elena
2010-01-01
The purpose of this study was to investigate the relation between early mathematical skills and cognitive processing abilities for two samples of children in Italy (N = 40) and the Netherlands (N = 59) who completed both a cognitive test that measures Planning, Attention, Simultaneous, and Successive (PASS) processing and an early mathematical…
Transition Process of Procedural to Conceptual Understanding in Solving Mathematical Problems
ERIC Educational Resources Information Center
Fatqurhohman
2016-01-01
This article aims to describe the transition process from procedural understanding to conceptual understanding in solving mathematical problems. Subjects in this study were three students from 20 fifth grade students of SDN 01 Sumberberas Banyuwangi selected based on the results of the students' answers. The transition process from procedural to…
Mathematical Thinking Process of Autistic Students in Terms of Representational Gesture
ERIC Educational Resources Information Center
Mustafa, Sriyanti; Nusantara, Toto; Subanji; Irawati, Santi
2016-01-01
The aim of this study is to describe the mathematical thinking process of autistic students in terms of gesture, using a qualitative approach. Data collecting is conducted by using 3 (three) audio-visual cameras. During the learning process, both teacher and students' activity are recorded using handy cam and digital camera (full HD capacity).…
ERIC Educational Resources Information Center
Hunsader, Patricia D.; Thompson, Denisse R.; Zorin, Barbara
2013-01-01
In this paper, we present a framework used to analyze the extent to which assessments (i.e., chapter tests) accompanying three published elementary grades 3-5 curricula in the United States provide students with opportunities to engage with key mathematical processes. The framework uses indicators for five criteria to assess the processes of…
Mariner 9-Image processing and products
Levinthal, E.C.; Green, W.B.; Cutts, J.A.; Jahelka, E.D.; Johansen, R.A.; Sander, M.J.; Seidman, J.B.; Young, A.T.; Soderblom, L.A.
1973-01-01
The purpose of this paper is to describe the system for the display, processing, and production of image-data products created to support the Mariner 9 Television Experiment. Of necessity, the system was large in order to respond to the needs of a large team of scientists with a broad scope of experimental objectives. The desire to generate processed data products as rapidly as possible to take advantage of adaptive planning during the mission, coupled with the complexities introduced by the nature of the vidicon camera, greatly increased the scale of the ground-image processing effort. This paper describes the systems that carried out the processes and delivered the products necessary for real-time and near-real-time analyses. References are made to the computer algorithms used for the different levels of decalibration and analysis. © 1973.
Web-based document image processing
NASA Astrophysics Data System (ADS)
Walker, Frank L.; Thoma, George R.
1999-12-01
Increasing numbers of research libraries are turning to the Internet for electronic interlibrary loan and for document delivery to patrons. This has been made possible through the widespread adoption of software such as Ariel and DocView. Ariel, a product of the Research Libraries Group, converts paper-based documents to monochrome bitmapped images and delivers them over the Internet. The National Library of Medicine's DocView is primarily designed for library patrons. While libraries and their patrons are beginning to reap the benefits of this new technology, barriers exist, e.g., differences in image file format, that lead to difficulties in the use of library document information. To research how to overcome such barriers, the Communications Engineering Branch of the Lister Hill National Center for Biomedical Communications, an R and D division of NLM, has developed a web site called the DocMorph Server. This is part of an ongoing intramural R and D program in document imaging that has spanned many aspects of electronic document conversion and preservation, Internet document transmission and document usage. The DocMorph Server web site is designed to fill two roles. First, in a role that will benefit both libraries and their patrons, it allows Internet users to upload scanned image files for conversion to alternative formats, thereby enabling wider delivery and easier usage of library document information. Second, the DocMorph Server provides the design team an active test bed for evaluating the effectiveness and utility of new document image processing algorithms and functions, so that they may be evaluated for possible inclusion in other image processing software products being developed at NLM or elsewhere. This paper describes the design of the prototype DocMorph Server and the image processing functions being implemented on it.
The Role of Hellinger Processes in Mathematical Finance
NASA Astrophysics Data System (ADS)
Choulli, T.; Hurd, T. R.
2001-09-01
This paper illustrates the natural role that Hellinger processes can play in solving problems from finance. We propose an extension of the concept of Hellinger process applicable to entropy distance and f-divergence distances, where f is a convex logarithmic function or a convex power function with general order q, q ≠ 0, q < 1. These concepts lead to a new approach to Merton's optimal portfolio problem and its dual in general Lévy markets.
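For orientation, the standard definitions behind the distances mentioned are (textbook forms, not the paper's notation):

```latex
H^2(P, Q) \;=\; \tfrac{1}{2} \int \left( \sqrt{\mathrm{d}P} - \sqrt{\mathrm{d}Q} \right)^{2},
\qquad
D_f(P \,\|\, Q) \;=\; \int f\!\left( \frac{\mathrm{d}P}{\mathrm{d}Q} \right) \mathrm{d}Q ,
```

where $f(x) = x \log x$ recovers the entropy (Kullback-Leibler) distance and the convex power functions $f(x) = x^{q}$ give the power family the abstract refers to.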
Ultrafast laser processing of glass-phase materials: mathematical simulation
NASA Astrophysics Data System (ADS)
Sokolova, Tatiana N.; Surmenko, Elena L.; Chebotarevsky, Yury V.; Konyushin, Alexander V.; Popov, Ivan A.; Bessonov, Dmitry A.
2013-11-01
Glass-phase materials, such as glass-carbon, ceramics etc., are a wide class of substances applied in the electronic industry. These materials often need special technologies for their processing. Unlike traditional methods of micromachining, focused ultrashort laser pulses of sufficiently high fluence make it possible not only to avoid the majority of side effects, including thermal ones, but also to create a qualitatively new laser technology for "hard materials". When using ultrafast lasers in micromachining processes it is necessary to account for the possible negative effects that occur in the processing of brittle materials. Removing material from the surface in a cold ablation process caused by laser light, in such a short period of time and at such a high rate, creates an area of high pressure in the interaction zone that could cause microdamage of brittle materials. To study the stress-strain state arising in brittle materials under the influence of ultrafast lasers, a special physical-mathematical model of the process was formulated. As a measure of the mechanical action of laser radiation on the processed material in cold ablation, the reactive force was taken. As the mechanical reaction of the treated glass-carbon substrate, the back pressure generated by the reactive force was considered. Brittle materials suffer plastic deformation, as a rule, only in areas of high-temperature heating. Hence, in the case of picosecond treatment in a cold ablation process the material, from a mechanical point of view, was regarded as perfectly elastic up to its destruction. From a geometrical point of view, the processed object was represented as a thin rectangular plate, loosely supported on an elastic base.
Davis, Nicole; Cannistraci, Christopher J.; Rogers, Baxter P.; Gatenby, J. Christopher; Fuchs, Lynn S.; Anderson, Adam W.; Gore, John C.
2009-01-01
We used functional magnetic resonance imaging (fMRI) to explore the patterns of brain activation associated with different levels of performance in exact and approximate calculation tasks in well-defined cohorts of children with mathematical calculation difficulties (MD) and typically developing (TD) controls. Both groups of children activated the same network of brain regions; however, children in the MD group had significantly increased activation in parietal, frontal, and cingulate cortices during both calculation tasks. A majority of the differences occurred in anatomical brain regions associated with cognitive resources such as executive functioning and working memory that are known to support higher level arithmetic skill but are not specific to mathematical processing. We propose that these findings are evidence that children with MD use the same types of problem solving strategies as TD children, but their weak mathematical processing system causes them to employ a more developmentally immature and less efficient form of the strategies. PMID:19410589
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.
1975-01-01
The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.
Wavelet-aided pavement distress image processing
NASA Astrophysics Data System (ADS)
Zhou, Jian; Huang, Peisen S.; Chiang, Fu-Pen
2003-11-01
A wavelet-based pavement distress detection and evaluation method is proposed. This method consists of two main parts, real-time processing for distress detection and offline processing for distress evaluation. The real-time processing part includes wavelet transform, distress detection and isolation, and image compression and noise reduction. When a pavement image is decomposed into different frequency subbands by wavelet transform, the distresses, which are usually irregular in shape, appear as high-amplitude wavelet coefficients in the high-frequency detail subbands, while the background appears in the low-frequency approximation subband. Two statistical parameters, high-amplitude wavelet coefficient percentage (HAWCP) and high-frequency energy percentage (HFEP), are established and used as criteria for real-time distress detection and distress image isolation. For compression of isolated distress images, a modified EZW (Embedded Zerotrees of Wavelet coding) is developed, which can simultaneously compress the images and reduce the noise. The compressed data are saved to the hard drive for further analysis and evaluation. The offline processing includes distress classification, distress quantification, and reconstruction of the original image for distress segmentation, distress mapping, and maintenance decision-making. The compressed data are first loaded and decoded to obtain wavelet coefficients. The Radon transform is then applied, and the parameters related to the peaks in the Radon domain are used for distress classification. For distress quantification, a norm is defined that can be used as an index for evaluating the severity and extent of the distress. Compared to visual or manual inspection, the proposed method has the advantages of being objective, high-speed, safe, automated, and applicable to different types of pavements and distresses.
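The two detection statistics can be sketched with a single-level Haar transform (the paper's wavelet basis and its exact HFEP formula are not given here, so the forms below are assumptions):

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform: approximation LL plus the three
    detail subbands LH, HL, HH (a minimal stand-in for the paper's wavelet)."""
    a = (img[::2] + img[1::2]) / 2.0            # rows: low-pass
    d = (img[::2] - img[1::2]) / 2.0            # rows: high-pass
    LL, LH = (a[:, ::2] + a[:, 1::2]) / 2.0, (a[:, ::2] - a[:, 1::2]) / 2.0
    HL, HH = (d[:, ::2] + d[:, 1::2]) / 2.0, (d[:, ::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def distress_indicators(img, thresh=0.1):
    LL, LH, HL, HH = haar2(img)
    details = np.concatenate([s.ravel() for s in (LH, HL, HH)])
    hawcp = np.mean(np.abs(details) > thresh)   # HAWCP: fraction of large coeffs
    # HFEP (assumed form): detail energy relative to total energy
    hfep = details.var() / (details.var() + LL.var() + 1e-12)
    return hawcp, hfep

smooth = np.ones((32, 32))                      # intact pavement
cracked = smooth.copy()
cracked[:, 16] = 0.0                            # a vertical crack
h1, _ = distress_indicators(smooth)
h2, _ = distress_indicators(cracked)
```

A crack concentrates energy in the detail subbands, so HAWCP separates distressed from intact frames.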
Hemispheric superiority for processing a mirror image.
Garren, R B; Gehlsen, G M
1981-04-01
39 adult subjects were administered a test using tachistoscopic half-field presentations to determine hemispheric dominance, and a mirror-tracing task to determine whether a hemispheric superiority exists for processing a mirror image. The results indicate superiority of the nondominant hemisphere for this task.
Image Processing Using a Parallel Architecture.
1987-12-01
A novel mathematical setup for fault tolerant control systems with state-dependent failure process
NASA Astrophysics Data System (ADS)
Chitraganti, S.; Aberkane, S.; Aubrun, C.
2014-12-01
In this paper, we consider a fault tolerant control system (FTCS) with state-dependent failures and provide a tractable mathematical model to handle the state-dependent failures. By assuming abrupt changes in system parameters, we use a jump process modelling of the failure process and of the fault detection and isolation (FDI) process. In particular, we assume that the rates of the failure process vary according to which set the state of the system belongs to.
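The notion of a failure rate that depends on which set the state belongs to can be illustrated by simulating a scalar system with a region-dependent hazard (a toy construction, not the paper's FTCS model; all parameters are illustrative):

```python
import numpy as np

def failure_time(x0, a, rate_low, rate_high, rng, t_max=100.0, dt=0.01):
    """Scalar dynamics x' = a*x with a failure hazard that jumps from
    rate_low to rate_high when the state leaves the set {|x| < 1}."""
    x, t = x0, 0.0
    while t < t_max:
        rate = rate_low if abs(x) < 1.0 else rate_high
        if rng.random() < rate * dt:    # failure occurs in this step
            return t
        x += dt * a * x                 # Euler step of the dynamics
        t += dt
    return t_max

# starting inside vs. outside the low-hazard set, averaged over seeds
in_safe = [failure_time(0.1, 0.5, 0.01, 2.0, np.random.default_rng(s))
           for s in range(30)]
in_fail = [failure_time(5.0, 0.5, 0.01, 2.0, np.random.default_rng(s))
           for s in range(30)]
```

Trajectories that start (or drift) into the high-hazard region fail much sooner, which is exactly the state-dependence the jump-process model captures.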
The integration of quantitative multi-modality imaging data into mathematical models of tumors
NASA Astrophysics Data System (ADS)
Atuegwu, Nkiruka C.; Gore, John C.; Yankeelov, Thomas E.
2010-05-01
Quantitative imaging data obtained from multiple modalities may be integrated into mathematical models of tumor growth and treatment response to achieve additional insights of practical predictive value. We show how this approach can describe the development of tumors that appear realistic in terms of producing proliferating tumor rims and necrotic cores. Two established models (the logistic model with and without the effects of treatment) and one novel model built a priori from available imaging data have been studied. We modify the logistic model to predict the spatial expansion of a tumor driven by tumor cell migration after a voxel's carrying capacity has been reached. Depending on the efficacy of a simulated cytotoxic treatment, we show that the tumor may either continue to expand or contract. The novel model includes hypoxia as a driver of tumor cell movement. The starting conditions for these models are based on imaging data related to the tumor cell number (as estimated from diffusion-weighted MRI), apoptosis (from 99mTc-Annexin-V SPECT), and cell proliferation and hypoxia (from PET). We conclude that integrating multi-modality imaging data into mathematical models of tumor growth is a promising combination that can capture the salient features of tumor growth and treatment response, and this indicates the direction for additional research.
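The logistic model with treatment can be sketched per voxel with forward-Euler integration (the parameter values and the linear kill term are illustrative assumptions, not the paper's calibrated model):

```python
def simulate_logistic(n0, k, theta, n_steps=100, dt=0.1, kill=0.0):
    """Forward-Euler integration of logistic tumor-cell growth with a
    generic treatment term: dN/dt = k*N*(1 - N/theta) - kill*N,
    where theta is the voxel's carrying capacity."""
    n = n0
    for _ in range(n_steps):
        n = n + dt * (k * n * (1.0 - n / theta) - kill * n)
    return n

# illustrative parameters: 100 initial cells, capacity 10^4
untreated = simulate_logistic(100.0, k=0.5, theta=1e4)
treated = simulate_logistic(100.0, k=0.5, theta=1e4, kill=1.0)
```

When the kill rate exceeds the growth rate the tumor contracts; otherwise it grows toward the carrying capacity, after which the spatial-expansion terms of the full model take over.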
Limiting liability via high resolution image processing
Greenwade, L.E.; Overlin, T.K.
1996-12-31
The utilization of high resolution image processing allows forensic analysts and visualization scientists to assist detectives by enhancing field photographs, and by providing the tools and training to increase the quality and usability of field photos. Through the use of digitized photographs and computerized enhancement software, field evidence can be obtained and processed as 'evidence ready', even in poor lighting and shadowed conditions or darkened rooms. These images, which are most often unusable when taken with standard camera equipment, can be shot in the worst of photographic conditions and be processed as usable evidence. Visualization scientists have taken digital photographic image processing and moved the handling of crime scene photos into the technology age. The use of high resolution technology will assist law enforcement in making better use of crime scene photography and in positive identification of prints. Valuable courtroom and investigation time can be saved and better served by this accurate, performance-based process. Inconclusive evidence does not lead to convictions. Enhancement of the photographic capability helps solve one major problem with crime scene photos: images that, if taken with standard equipment and without the benefit of enhancement software, would be inconclusive, allowing guilty parties to go free for lack of evidence.
ERIC Educational Resources Information Center
Kuzle, Ana
2013-01-01
Despite a great deal of research on the benefits of writing in mathematics, writing plays a minimal role, if any, in secondary and tertiary mathematics education. In order for teachers to use writing in their classrooms, they themselves have to experience writing mathematics within the teacher education programme. The present paper reports on a…
Mathematical simulation of the process of condensing natural gas
NASA Astrophysics Data System (ADS)
Tastandieva, G. M.
2015-01-01
A two-dimensional unsteady model of heat transfer during the condensation of natural gas at low temperatures is presented. Calculations of the heat and mass transfer of liquefied natural gas (LNG) in storage tanks of cylindrical shape were performed. The influence of the model parameters on the nature of the heat transfer is analyzed. Temperature regimes that eliminate evaporation by cooling the liquefied natural gas are defined. The dependence of the vapor condensation mass flow rate on gas temperature is obtained. The possibility of regulating the "cooling down" of liquefied natural gas under partial evaporation at low energy cost is identified.
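The unsteady heat-transfer core of such a model reduces, in one dimension, to an explicit finite-difference scheme (a minimal sketch; the paper's model is two-dimensional and includes condensation terms, and all values below are illustrative):

```python
import numpy as np

def heat_step(T, alpha, dx, dt):
    """One explicit finite-difference step of the 1-D heat equation
    dT/dt = alpha * d2T/dx2; stability requires alpha*dt/dx**2 <= 0.5."""
    Tn = T.copy()
    Tn[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return Tn

# hot boundary on the left, cold interior and right boundary held fixed
T = np.zeros(50)
T[0] = 1.0
for _ in range(500):
    T = heat_step(T, alpha=1.0, dx=1.0, dt=0.4)
```

The temperature front diffuses inward from the boundary, the basic mechanism that the full 2-D tank model resolves together with evaporation and condensation.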
Image processing techniques for passive millimeter-wave imaging
NASA Astrophysics Data System (ADS)
Lettington, Alan H.; Gleed, David G.
1998-08-01
We present our results on the application of image processing techniques for passive millimeter-wave imaging and discuss possible future trends. Passive millimeter-wave imaging is useful in poor weather such as fog and cloud. Its spatial resolution, however, can be restricted by the diffraction limit of the front aperture. Its resolution may be increased using super-resolution techniques, but often at the expense of processing time. Linear methods may be implemented in real time, but non-linear methods, which are required to restore missing spatial frequencies, are usually more time consuming. In the present paper we describe fast super-resolution techniques which are potentially capable of being applied in real time. Associated issues such as reducing the influence of noise and improving recognition capability will be discussed. Various techniques have been used to enhance passive millimeter wave images, giving excellent results and providing a significant quantifiable increase in spatial resolution. Examples of applying these techniques to imagery will be given.
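As an example of the class of nonlinear restoration methods referred to, Richardson-Lucy deconvolution can recover detail suppressed by an aperture blur (an illustrative classic scheme, not the paper's fast algorithm; the 1-D point-spread function below is an assumption):

```python
import numpy as np

def rl_deconv(y, psf, n_iter=50):
    """Richardson-Lucy deconvolution: a nonlinear, iterative restoration
    x <- x * (psf_flipped * (y / (psf * x))), with * denoting convolution."""
    x = np.full_like(y, y.mean())               # flat positive start
    psf_flip = psf[::-1]
    for _ in range(n_iter):
        est = np.convolve(x, psf, mode="same")
        ratio = y / np.maximum(est, 1e-12)      # guard against divide-by-zero
        x = x * np.convolve(ratio, psf_flip, mode="same")
    return x

psf = np.array([0.25, 0.5, 0.25])               # assumed aperture blur
truth = np.zeros(32)
truth[16] = 1.0                                 # a point source
blurred = np.convolve(truth, psf, mode="same")
restored = rl_deconv(blurred, psf)
```

The iteration progressively re-concentrates the blurred point source, illustrating why nonlinear methods can exceed the diffraction-limited resolution at the cost of extra computation.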
Mathematical simulation of a steady process of anisotropic filtration
NASA Astrophysics Data System (ADS)
Badriev, I. B.; Banderov, V. V.; Pankratova, O. V.; Shangaraeva, A. I.
2016-11-01
This article discusses the methods of approximate solution of mixed variational inequalities with operators of monotone type. The functional, which is included in this variational inequality, is separable, in other words, it is the sum of a number of nondifferentiable functionals. These variational inequalities appear, in particular, in the description of steady incompressible filtration processes of highly viscous fluids in anisotropic medium.
Visual parameter optimisation for biomedical image processing
2015-01-01
Background Biomedical image processing methods require users to optimise input parameters to ensure high-quality output. This presents two challenges. First, it is difficult to optimise multiple input parameters for multiple input images. Second, it is difficult to achieve an understanding of underlying algorithms, in particular, relationships between input and output. Results We present a visualisation method that transforms users' ability to understand algorithm behaviour by integrating input and output, and by supporting exploration of their relationships. We discuss its application to a colour deconvolution technique for stained histology images and show how it enabled a domain expert to identify suitable parameter values for the deconvolution of two types of images, and metrics to quantify deconvolution performance. It also enabled a breakthrough in understanding by invalidating an underlying assumption about the algorithm. Conclusions The visualisation method presented here provides analysis capability for multiple inputs and outputs in biomedical image processing that is not supported by previous analysis software. The analysis supported by our method is not feasible with conventional trial-and-error approaches. PMID:26329538
MRI Image Processing Based on Fractal Analysis
Marusina, Mariya Y; Mochalina, Alexandra P; Frolova, Ekaterina P; Satikov, Valentin I; Barchuk, Anton A; Kuznetcov, Vladimir I; Gaidukov, Vadim S; Tarakanov, Segrey A
2017-01-01
Background: Cancer is one of the most common causes of human mortality, with about 14 million new cases and 8.2 million deaths reported in 2012. Early diagnosis of cancer through screening allows interventions to reduce mortality. Fractal analysis of medical images may be useful for this purpose. Materials and Methods: In this study, we examined magnetic resonance (MR) images of healthy livers and livers containing metastases from colorectal cancer. The fractal dimension and the Hurst exponent were chosen as diagnostic features for tomographic imaging. The ImageJ software package was used for image processing, and its FracLac plugin was applied for fractal analysis with a 120x150 pixel area. Calculations of the fractal dimensions of pathological and healthy tissue samples were performed using the box-counting method. Results: In pathological cases (foci formation), the Hurst exponent was less than 0.5 (the region of unstable statistical characteristics). For healthy tissue, the Hurst exponent was greater than 0.5 (the zone of stable characteristics). Conclusions: The study indicated the possibility of employing rapid fractal analysis for the detection of focal lesions of the liver. The Hurst exponent can be used as an important diagnostic characteristic for the analysis of medical images.
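The box-counting method named in this abstract can be sketched as follows: count the boxes N(s) that contain foreground at each box size s and fit log N(s) against log s; the negative slope estimates the fractal dimension D. This is a minimal illustration on a binary mask, not the FracLac implementation used in the study (the box sizes and the least-squares fit are assumptions).

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension D of a binary image by
    fitting log N(s) = -D log s + c over the given box sizes."""
    counts = []
    for s in sizes:
        h, w = mask.shape
        # trim so the image tiles exactly into s-by-s boxes
        trimmed = mask[: h - h % s, : w - w % s]
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s)
        occupied = boxes.any(axis=(1, 3)).sum()
        counts.append(max(occupied, 1))   # guard the log against N = 0
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

A filled square yields D close to 2 and a single line D close to 1, which is a quick sanity check before applying the estimator to segmented tissue regions.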
Subband/transform functions for image processing
NASA Technical Reports Server (NTRS)
Glover, Daniel
1993-01-01
Functions for image data processing written for use with the MATLAB(TM) software package are presented. These functions provide the capability to transform image data with block transformations (such as the Walsh Hadamard) and to produce spatial frequency subbands of the transformed data. Block transforms are equivalent to simple subband systems. The transform coefficients are reordered using a simple permutation to give subbands. The low frequency subband is a low resolution version of the original image, while the higher frequency subbands contain edge information. The transform functions can be cascaded to provide further decomposition into more subbands. If the cascade is applied to all four of the first stage subbands (in the case of a four band decomposition), then a uniform structure of sixteen bands is obtained. If the cascade is applied only to the low frequency subband, an octave structure of seven bands results. Functions for the inverse transforms are also given. These functions can be used for image data compression systems. The transforms do not in themselves produce data compression, but prepare the data for quantization and compression. Sample quantization functions for subbands are also given. A typical compression approach is to subband the image data, quantize it, then use statistical coding (e.g., run-length coding followed by Huffman coding) for compression. Contour plots of image data and subbanded data are shown.
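The reordering of block-transform coefficients into subbands that this abstract describes can be illustrated with a single 2x2 Walsh-Hadamard stage. The following is a Python sketch of the general idea, not the original MATLAB functions; it assumes an even image height and width.

```python
import numpy as np

def wht2x2_subbands(img):
    """One stage of a 2x2 Walsh-Hadamard block transform, with the
    coefficients permuted into four spatial-frequency subbands.
    Returns (LL, LH, HL, HH); LL is a half-resolution version of img."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    ll = (a + b + c + d) / 2.0   # low-pass: half-resolution image
    lh = (a - b + c - d) / 2.0   # horizontal detail (vertical edges)
    hl = (a + b - c - d) / 2.0   # vertical detail (horizontal edges)
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def wht2x2_inverse(ll, lh, hl, hh):
    """Invert the transform exactly (the 2x2 WHT is orthogonal)."""
    a = (ll + lh + hl + hh) / 2.0
    b = (ll - lh + hl - hh) / 2.0
    c = (ll + lh - hl - hh) / 2.0
    d = (ll - lh - hl + hh) / 2.0
    out = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    out[0::2, 0::2], out[0::2, 1::2] = a, b
    out[1::2, 0::2], out[1::2, 1::2] = c, d
    return out
```

Cascading `wht2x2_subbands` on all four outputs gives the uniform sixteen-band structure mentioned in the abstract; cascading only on `ll` gives the octave structure of seven bands.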
Color Imaging management in film processing
NASA Astrophysics Data System (ADS)
Tremeau, Alain; Konik, Hubert; Colantoni, Philippe
2003-12-01
The latest research projects in the LIGIV laboratory concern the capture, processing, archiving and display of color images, taking into account the trichromatic nature of the human visual system (HVS). One of these projects addresses digital cinematographic film sequences of high resolution and dynamic range, and aims to optimize the use of content for post-production operators and for the end user. The studies presented in this paper address the use of metadata to optimise the consumption of video content on a device of the user's choice, independent of the nature of the equipment that captured the content. Optimising consumption includes enhancing the quality of image reconstruction on a display. Another part of this project addresses the content-based adaptation of image display, focusing on Region of Interest (ROI) operations based on the ROI concepts of MPEG-7. The aim of this second part is to characterize and ensure the conditions of display even if the display device or display medium changes. This first requires the definition of a reference color space and of bi-directional color transformations for each peripheral device (camera, display, film recorder, etc.). The complicating factor is that different devices have different color gamuts, depending on the chromaticity of their primaries and the ambient illumination under which they are viewed. To match the displayed image to the intended appearance, all kinds of production metadata (camera specification, camera colour primaries, lighting conditions) should be associated with the film material; metadata and content together build rich content. The author is assumed to specify conditions as known from the digital graphic arts. To control image pre-processing and post-processing, these specifications should be contained in the film's metadata. The specifications are related to ICC profiles but additionally need to consider mesopic viewing conditions.
[Digital thoracic radiology: devices, image processing, limits].
Frija, J; de Géry, S; Lallouet, F; Guermazi, A; Zagdanski, A M; De Kerviler, E
2001-09-01
In the first part, the different techniques of digital thoracic radiography are described. Computed radiography with phosphor plates, being the most widely commercialized, is emphasized, but the other detectors are also described: the selenium-coated drum, direct digital radiography with selenium detectors, indirect flat-panel detectors, and a system with four high-resolution CCD cameras. In the second part, the most important image processing methods are discussed: gradation curves, unsharp-mask processing, the MUSICA system, dynamic range compression or reduction, and dual-energy subtraction. In the last part, the advantages and drawbacks of computed thoracic radiography are emphasized. The most important advantages are the consistently good quality of the images and the possibilities of image processing.
Image processing via VLSI: A concept paper
NASA Technical Reports Server (NTRS)
Nathan, R.
1982-01-01
Implementing specific image processing algorithms via very large scale integrated systems offers a potent solution to the problem of handling high data rates. Two algorithms stand out as being particularly critical: geometric map transformation and filtering or correlation. These two functions form the basis for data calibration, registration and mosaicking. VLSI presents itself as an inexpensive ancillary function to be added to almost any general-purpose computer, and if the geometry and filter algorithms are implemented in VLSI, the processing-rate bottleneck would be significantly relieved. A system is developed that identifies the image processing functions limiting present systems with respect to future throughput needs, translates these functions into algorithms, implements them via VLSI technology, and interfaces the hardware to a general-purpose digital computer.
Gaia astrometric instrument calibration and image processing
NASA Astrophysics Data System (ADS)
Castaneda, J.; Fabricius, C.; Portell, J.; Garralda, N.; González-Vidal, J. J.; Clotet, M.; Torra, J.
2017-03-01
The astrometric instrument calibration and image processing are an integral and critical part of the Gaia mission. The data processing starts with a preliminary daily treatment of the most recently received data and continues with the execution of several processing chains included in a cyclic reduction system. The cyclic processing chains reprocess all the accumulated data in each iteration, adding the latest measurements and recomputing the outputs to improve the quality of the results. This cyclic processing lasts until convergence of the results is achieved, and the catalogue is consolidated and published periodically. In this paper we describe the core of the data processing which has made possible the first catalogue release from the Gaia mission.
EOS image data processing system definition study
NASA Technical Reports Server (NTRS)
Gilbert, J.; Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.
1973-01-01
The Image Processing System (IPS) requirements and configuration are defined for the NASA-sponsored advanced technology Earth Observatory System (EOS). The scope included investigation and definition of IPS operational, functional, and product requirements, considering overall system constraints and interfaces (sensor, etc.). The scope also included investigation of the technical feasibility and definition of a point design reflecting system requirements. The design phase required a survey of present and projected technology related to general and special-purpose processors, high-density digital tape recorders, and image recorders.
Advanced communications technologies for image processing
NASA Technical Reports Server (NTRS)
Likens, W. C.; Jones, H. W.; Shameson, L.
1984-01-01
It is essential for image analysts to have the capability to link to remote facilities as a means of accessing both data bases and high-speed processors. This can increase productivity through enhanced data access and minimization of delays. New technology is emerging to provide the high communication data rates needed in image processing. These developments include multi-user sharing of high bandwidth (60 megabits per second) Time Division Multiple Access (TDMA) satellite links, low-cost satellite ground stations, and high speed adaptive quadrature modems that allow 9600 bit per second communications over voice-grade telephone lines.
Image processing with JPEG2000 coders
NASA Astrophysics Data System (ADS)
Śliwiński, Przemysław; Smutnicki, Czesław; Chorażyczewski, Artur
2008-04-01
In this note, several wavelet-based image processing algorithms are presented. The denoising algorithm is derived from Donoho's thresholding. The rescaling algorithm reuses the subdivision scheme of Sweldens' lifting together with a sensor linearization procedure exploiting system identification algorithms developed for nonlinear dynamic systems. The proposed autofocus algorithm is passive, works in the wavelet domain, and relies on properties of the lens transfer function. The common advantage of these algorithms is that they can easily be implemented within a JPEG2000 image compression standard encoder, offering simplification of the final circuitry (or software package) and a reduction in power consumption (program size, respectively) when compared to solutions based on separate components.
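A minimal sketch of wavelet-domain denoising by Donoho-style soft thresholding, using a single-level 1-D Haar transform for simplicity. The universal threshold t = sigma * sqrt(2 log n) and the single decomposition level are assumptions for illustration; the paper's encoder-integrated implementation differs.

```python
import numpy as np

def haar_1d(x):
    """One level of the 1-D orthonormal Haar transform (even-length x)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # smooth coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def ihaar_1d(s, d):
    """Invert haar_1d exactly."""
    x = np.empty(2 * s.size)
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Donoho's soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise(x, sigma):
    """Threshold the detail coefficients with the universal threshold
    t = sigma * sqrt(2 log n), then reconstruct."""
    s, d = haar_1d(np.asarray(x, dtype=float))
    t = sigma * np.sqrt(2.0 * np.log(x.size))
    return ihaar_1d(s, soft_threshold(d, t))
```

Because a JPEG2000 encoder already computes the forward wavelet transform, this kind of denoising amounts to one extra shrinkage step on coefficients the coder produces anyway, which is the source of the circuit-simplification argument in the abstract.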
Translational motion compensation in ISAR image processing.
Wu, H; Grenier, D; Delisle, G Y; Fang, D G
1995-01-01
In inverse synthetic aperture radar (ISAR) imaging, the target rotational motion with respect to the radar line of sight contributes to the imaging ability, whereas the translational motion must be compensated out. This paper presents a novel two-step approach to translational motion compensation using an adaptive range tracking method for range bin alignment and a recursive multiple-scatterer algorithm (RMSA) for signal phase compensation. The initial step of RMSA is equivalent to the dominant-scatterer algorithm (DSA). An error-compensating point source is then recursively synthesized from the selected range bins, where each contains a prominent scatterer. Since the clutter-induced phase errors are reduced by phase averaging, the image speckle noise can be reduced significantly. Experimental data processing for a commercial aircraft and computer simulations confirm the validity of the approach.
Computer image processing in marine resource exploration
NASA Technical Reports Server (NTRS)
Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.
1976-01-01
Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
IMAGE 100: The interactive multispectral image processing system
NASA Technical Reports Server (NTRS)
Schaller, E. S.; Towles, R. W.
1975-01-01
The need for rapid, cost-effective extraction of useful information from vast quantities of multispectral imagery available from aircraft or spacecraft has resulted in the design, implementation and application of a state-of-the-art processing system known as IMAGE 100. Operating on the general principle that all objects or materials possess unique spectral characteristics or signatures, the system uses this signature uniqueness to identify similar features in an image by simultaneously analyzing signatures in multiple frequency bands. Pseudo-colors, or themes, are assigned to features having identical spectral characteristics. These themes are displayed on a color CRT, and may be recorded on tape, film, or other media. The system was designed to incorporate key features such as interactive operation, user-oriented displays and controls, and rapid-response machine processing. Owing to these features, the user can readily control and/or modify the analysis process based on his knowledge of the input imagery. Effective use can be made of conventional photographic interpretation skills and state-of-the-art machine analysis techniques in the extraction of useful information from multispectral imagery. This approach results in highly accurate multitheme classification of imagery in seconds or minutes rather than the hours often involved in processing using other means.
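Signature-based theme classification of the kind described here can be sketched as a per-band level-slice (parallelepiped) test: a pixel is assigned to a theme when every band value lies within limits learned from training pixels. This is an illustrative reconstruction, not the actual IMAGE 100 algorithm; the min/max limits and the optional tolerance are assumptions.

```python
import numpy as np

def train_limits(samples, tol=0.0):
    """Per-band min/max limits of a training set (pixels x bands),
    optionally widened by a tolerance."""
    lo = samples.min(axis=0) - tol
    hi = samples.max(axis=0) + tol
    return lo, hi

def classify_theme(image, lo, hi):
    """Boolean theme map for an (H, W, bands) image: a pixel matches
    the signature when every band value lies within the trained limits."""
    inside = (image >= lo) & (image <= hi)   # broadcasts over the band axis
    return inside.all(axis=-1)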
Multidimensional energy operator for image processing
NASA Astrophysics Data System (ADS)
Maragos, Petros; Bovik, Alan C.; Quatieri, Thomas F.
1992-11-01
The 1-D nonlinear differential operator Ψ(f) = (f′)² − f f″ has recently been introduced to signal processing and has been found very useful for estimating the parameters of sinusoids and the modulating signals of AM-FM signals. It is called an energy operator because it can track the energy of an oscillator source generating a sinusoidal signal. In this paper we introduce the multidimensional extension Φ(f) = ‖∇f‖² − f ∇²f of the 1-D energy operator and briefly outline some of its applications to image processing. We discuss some interesting properties of the multidimensional operator and develop demodulation algorithms to estimate the amplitude envelope and instantaneous frequencies of 2-D spatially-varying AM-FM signals, which can model image texture. The attractive features of the multidimensional operator and the related amplitude/frequency demodulation algorithms are their simplicity, efficiency, and ability to track instantaneously-varying spatial modulation patterns.
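The 1-D energy operator has a standard discrete counterpart, psi[n] = x[n]^2 - x[n-1]*x[n+1] (the usual Teager-Kaiser discretisation, assumed here since the abstract does not give it), which for a sinusoid cos(W*n) evaluates exactly to sin(W)^2. The 2-D finite-difference version below is likewise an illustrative sketch, not the authors' implementation.

```python
import numpy as np

def teager_kaiser(x):
    """Discrete 1-D energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1].
    For A*cos(W*n + phi) it is constant at A^2 * sin(W)^2, tracking
    the energy of the oscillator that generated the signal."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

def energy_2d(f):
    """Discrete 2-D extension Phi(f) = ||grad f||^2 - f * laplacian(f),
    built from central differences via np.gradient."""
    f = np.asarray(f, dtype=float)
    fy, fx = np.gradient(f)
    lap = (np.gradient(np.gradient(f, axis=0), axis=0)
           + np.gradient(np.gradient(f, axis=1), axis=1))
    return fx ** 2 + fy ** 2 - f * lap
```

Since psi depends only on three neighbouring samples, it tracks amplitude and frequency changes almost instantaneously, which is the property the abstract highlights.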
Novel image processing approach to detect malaria
NASA Astrophysics Data System (ADS)
Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev
2015-09-01
In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel. Changes in dark pixels indicate intracellular activity, and hence the presence of the malaria parasite inside the cell. Preliminary experimental results involving analysis of red blood cells, either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.
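The per-pixel temporal analysis could be sketched as below; the standard-deviation statistic and the fixed threshold are assumptions for illustration, as the abstract does not specify the exact measure of temporal variation.

```python
import numpy as np

def activity_map(frames, threshold):
    """Flag 'active' pixels in a (frames, H, W) stack: compute the
    temporal standard deviation of each pixel's intensity and mark
    pixels that fluctuate more than `threshold`."""
    std = np.std(np.asarray(frames, dtype=float), axis=0)
    return std > threshold
```

Restricting the map to dark (cell-interior) pixels, as the paper proposes, would then isolate fluctuation caused by activity inside the cell rather than by background motion.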
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.
1975-01-01
A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.
IPLIB (Image processing library) user's manual
NASA Technical Reports Server (NTRS)
Faulcon, N. D.; Monteith, J. H.; Miller, K.
1985-01-01
IPLIB is a collection of HP FORTRAN 77 subroutines and functions that facilitate the use of a COMTAL image processing system driven by an HP-1000 computer. It is intended for programmers who want to use the HP 1000 to drive the COMTAL Vision One/20 system. It is assumed that the programmer knows HP 1000 FORTRAN 77 or at least one FORTRAN dialect. It is also assumed that the programmer has some familiarity with the COMTAL Vision One/20 system.
Sorting Olive Batches for the Milling Process Using Image Processing
Puerto, Daniel Aguilera; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan
2015-01-01
The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, to classify automatically the different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing methods have been employed, and two classification techniques have been used: discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729
ERIC Educational Resources Information Center
Wilson, P. Holt; Lee, Hollylynne Stohl; Hollebrands, Karen F.
2011-01-01
This study investigated the processes used by prospective mathematics teachers as they examined middle-school students' work solving statistical problems using a computer software program. Ways in which the model may be used by other researchers and implications for the design of pedagogical tasks for prospective teachers are discussed. (Contains…
Pre-Service Secondary Mathematics Teachers' Behaviors in the Proving Process
ERIC Educational Resources Information Center
Ugurel, Isikhan; Morali, Sevgi; Melike Yigit Koyunkaya,; Karahan, Özge
2016-01-01
Pre-service secondary mathematics teachers' (PSMTs) understanding and ability of constructing a proof is not only important for their own learning process, but also important for these PSMTs to help their future students learn how to do proofs. Therefore, this study is focused on and explains PSMTs' behaviors that they revealed throughout the…
ERIC Educational Resources Information Center
Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro
2014-01-01
In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…
ERIC Educational Resources Information Center
Incikabi, Lutfi; Sancar Tokmak, Hatice
2012-01-01
This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…
ERIC Educational Resources Information Center
Andersson, Ulf; Ostergren, Rickard
2012-01-01
The study sought to extend our knowledge regarding the origin of mathematical learning disabilities (MLD) in children by testing different hypotheses in the same samples of children. Different aspects of cognitive function and number processing were assessed in fifth- and sixth-graders (11-13 years old) with MLD and compared to controls. The…
ERIC Educational Resources Information Center
Ong, Ewe Gnoh; Lim, Chap Sam; Ghazali, Munirah
2010-01-01
The purpose of this study was to examine the changes in novice and experienced mathematics teachers' questioning techniques. This study was conducted in Sarawak where ten (experienced and novice) teachers from two schools underwent the lesson study process for fifteen months. Four data collection methods namely, observation, interview, lesson…
ERIC Educational Resources Information Center
Iiskala, Tuike; Vauras, Marja; Lehtinen, Erno; Salonen, Pekka
2011-01-01
This study investigated how metacognition appears as a socially shared phenomenon within collaborative mathematical word-problem solving processes of dyads of high-achieving pupils. Four dyads solved problems of different difficulty levels. The pupils were 10 years old. The problem-solving activities were videotaped and transcribed in terms of…
NASA Astrophysics Data System (ADS)
Fedyaev, V. L.; Galimov, E. R.; Galimova, N. Ya; Takhaviev, M. S.; Siraev, A. R.
2017-01-01
The deposition of polymeric powder particles onto the surface of a treated body, and onto a layer of previously deposited particles, is considered using mathematical modeling. Basic provisions of impact theory are used. Relationships for evaluating the characteristic parameters of the processes under study are given, and the results of their analysis, together with recommendations for improving the deposition efficiency, are presented.
ERIC Educational Resources Information Center
Baltaci, Serdal
2016-01-01
It is a widely known fact that gifted students have different skills compared to their peers. However, to what extent gifted students use mathematical thinking skills during the probability problem-solving process emerges as a significant question. Hence, the main aim of the present study is to examine 8th grade gifted students' probability…
A Process of Students and Their Instructor Developing a Final Closed-Book Mathematics Exam
ERIC Educational Resources Information Center
Rapke, Tina
2016-01-01
This article describes a study, from a Canadian technical institute's upgrading mathematics course, where students played a role in developing the final closed-book exam that they sat. The study involved a process where students developed practice exams and solutions keys, students sat each other's practice exams, students evaluated classmates'…
The Development and Validation of Scores on the Mathematics Information Processing Scale (MIPS).
ERIC Educational Resources Information Center
Bessant, Kenneth C.
1997-01-01
This study reports on the development and psychometric properties of a new 87-item Mathematics Information Processing Scale that explores learning strategies, metacognitive problem-solving skills, and attentional deployment. Results with 340 college students support the use of the instrument, for which factor analysis identified five theoretically…
Mathematical modeling of physical processes in inorganic chemistry
Chiu, H.L.
1988-01-01
The first part deals with the rapid calculation of steady-state concentration profiles in contactors using the Purex process. Most of the computer codes simulating the reprocessing of spent nuclear fuel generate the steady-state properties by calculating the transient behavior of the contactors. In this study, the author simulates the steady-state concentration profiles directly, without first generating the transient behavior. Two computer codes are developed: PUMA (Plutonium-Uranium-Matrix-Algorithm) and PUNE (Plutonium-Uranium-Non-Equilibrium). The first simulates the steady-state concentration profiles under conditions of equilibrium mass transfer; the second accounts for deviations from mass-transfer equilibrium. The second part of this dissertation shows how to use the classical trajectory method to study the equilibrium and saddle-point geometries of MX_n (n = 2-7) molecules. Two nuclear potential functions are described that are invariant to the operations of the permutation group of nuclei in molecules of the general formula MX_n. Such potential functions allow equivalent isomers to have equal energies, so that various statistical-mechanical properties can be simply determined. The first function contains two-center interactions between pairs of peripheral atoms and is defined by V(r) = (1/2) Σ_α k (Δr_{αμ})² + Σ_{α<β} Q R_{αβ}^(−n) (n = 1, 2, ...). The second function contains two- and three-center interactions and is defined by V(Θ) = (1/2) Σ_α K (Δr_{αμ})² + (1/2) Σ_{α<β} Q r₀² (Θ_{αμβ} − π)².
Color Image Processing and Object Tracking System
NASA Technical Reports Server (NTRS)
Klimek, Robert B.; Wright, Ted W.; Sielken, Robert S.
1996-01-01
This report describes a personal computer based system for automatic and semiautomatic tracking of objects on film or video tape, developed to meet the needs of the Microgravity Combustion and Fluids Science Research Programs at the NASA Lewis Research Center. The system consists of individual hardware components working under computer control to achieve a high degree of automation. The most important hardware components include 16-mm and 35-mm film transports, a high-resolution digital camera mounted on an x-y-z micro-positioning stage, an S-VHS tapedeck, a Hi8 tapedeck, a video laserdisk, and a framegrabber. All of the image input devices are remotely controlled by a computer. Software was developed to integrate the overall operation of the system, including device frame incrementation, grabbing of image frames, image processing of the object's neighborhood, locating the position of the object being tracked, and storing the coordinates in a file. This process is performed repeatedly until the last frame is reached. Several different tracking methods are supported. To illustrate the process, two representative applications of the system are described. These applications represent typical uses of the system and include tracking the propagation of a flame front and tracking the movement of a liquid-gas interface with extremely poor visibility.
NASA Technical Reports Server (NTRS)
Heydorn, R. D.
1984-01-01
The Mathematical Pattern Recognition and Image Analysis (MPRIA) Project is concerned with basic research problems related to the study of the Earth from remotely sensed measurements of its surface characteristics. The program goal is to better understand how to analyze the digital image that represents the spatial, spectral, and temporal arrangement of these measurements for the purpose of making selected inferences about the Earth.
The ‘hit’ phenomenon: a mathematical model of human dynamics interactions as a stochastic process
NASA Astrophysics Data System (ADS)
Ishii, Akira; Arakaki, Hisashi; Matsuda, Naoya; Umemura, Sanae; Urushidani, Tamiko; Yamagata, Naoya; Yoshida, Narihiko
2012-06-01
A mathematical model for the ‘hit’ phenomenon in entertainment within a society is presented as a stochastic process of human dynamics interactions. The model uses only the advertisement budget time distribution as an input, and word-of-mouth (WOM), represented by posts on social network systems, is used as data to make a comparison with the calculated results. The unit of time is days. The WOM distribution in time is found to be very close to the revenue distribution in time. Calculations for the Japanese motion picture market based on the mathematical model agree well with the actual revenue distribution in time.
Specialization of the Right Intraparietal Sulcus for Processing Mathematics During Development.
Schel, Margot A; Klingberg, Torkel
2016-08-27
Mathematical ability, especially the perception of numbers and the performance of arithmetic, is known to rely on the activation of the intraparietal sulcus (IPS). However, reasoning ability and working memory, two highly associated abilities, also activate partly overlapping regions. Most studies aimed at localizing mathematical function have used group averages, where individual variability is averaged out, thus confounding the anatomical specificity of localized cognitive functions. Here, we analyze the functional anatomy of the intraparietal cortex using individual analysis of subregions of the IPS based on how they are structurally connected to frontal, parietal, and occipital cortex. Analysis of cortical thickness showed that the right anterior IPS, defined by its connections to the frontal lobe, was associated with both visuospatial working memory and mathematics in 6-year-old children. This region specialized during development to become specifically related to mathematics, but not visuospatial working memory, in adolescents and adults. This could be an example of interactive specialization, where interacting with the environment, in combination with interactions between cortical regions, leads from a more general role of the right anterior IPS in spatial processing to a specialization of this region for mathematics.
Soleimani, Effat; Mokhtari-Dizaji, Manijhe; Saberi, Hajir; Sharif-Kashani, Shervin
2016-08-01
Clarifying the complex interaction between mechanical and biological processes in healthy and diseased conditions requires constitutive models for arterial walls. In this study, a mathematical model for the displacement of the carotid artery wall in the longitudinal direction is defined, providing a satisfactory representation of the axial stress applied to the arterial wall. The proposed model was applied to the carotid artery wall motion estimated from ultrasound image sequences of 10 healthy adults, and the axial stress waveform exerted on the artery wall was extracted. Consecutive ultrasonic images (30 frames per second) of the common carotid artery of 10 healthy subjects (age 44 ± 4 years) were recorded and transferred to a personal computer. Longitudinal displacement and acceleration were extracted from the ultrasonic images using a block-matching algorithm. Furthermore, the images were examined using a maximum-gradient algorithm, and the time rates of change of the internal diameter and intima-media thickness were extracted. Finally, axial stress was estimated using an appropriate constitutive equation for thin-walled tubes. Performance of the proposed model was evaluated using the goodness of fit between approximated and measured longitudinal displacement. The goodness-of-fit statistics indicated a high quality of fit for all investigated subjects, with a mean adjusted R-squared of 0.86 ± 0.08 and a root mean squared error of 0.08 ± 0.04 mm. According to the results of the present study, the maximum and minimum axial stresses exerted on the arterial wall are 1.7 ± 0.6 and -1.5 ± 0.5 kPa, respectively. These results reveal the potential of this technique to provide a new method for assessing arterial stress from ultrasound images, overcoming the limitations of finite element and other simulation techniques.
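The block-matching step mentioned in the abstract above can be sketched as an exhaustive search that minimises the sum of absolute differences (SAD) between a reference block in one frame and candidate blocks in the next. This is a minimal generic illustration, not the authors' implementation; the function name, block size, and search range are assumptions chosen for the sketch.

```python
import numpy as np

def block_match(prev, curr, top, left, size=8, search=4):
    """Estimate the displacement (dy, dx) of one block between two frames
    by exhaustive search minimising the sum of absolute differences (SAD)."""
    ref = prev[top:top + size, left:left + size].astype(float)
    best, best_dy, best_dx = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            # skip candidate positions that fall outside the frame
            if y < 0 or x < 0 or y + size > curr.shape[0] or x + size > curr.shape[1]:
                continue
            sad = np.abs(curr[y:y + size, x:x + size].astype(float) - ref).sum()
            if sad < best:
                best, best_dy, best_dx = sad, dy, dx
    return best_dy, best_dx
```

Applied frame by frame along the artery wall, the per-block displacements give the longitudinal motion trace from which velocity and acceleration can be differenced.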
Vector processing enhancements for real-time image analysis.
Shoaf, S.; APS Engineering Support Division
2008-01-01
A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.
Spatial Data Exploring by Satellite Image Distributed Processing
NASA Astrophysics Data System (ADS)
Mihon, V. D.; Colceriu, V.; Bektas, F.; Allenbach, K.; Gvilava, M.; Gorgan, D.
2012-04-01
Societal needs and environmental predictions encourage the development of applications oriented toward supervising and analyzing different Earth science related phenomena. Satellite images can be explored to discover information concerning land cover, hydrology, air quality, and water and soil pollution. Spatial and environment-related data can be acquired by imagery classification, consisting of data mining throughout the multispectral bands. The process takes into account a large set of variables such as the satellite image type (e.g. MODIS, Landsat), the particular geographic area, soil composition, vegetation cover, and the general context (e.g. clouds, snow, and season). All these specific and variable conditions require flexible tools and applications to support an optimal search for appropriate solutions, as well as high-power computation resources. The research concerns experiments with flexible, visual descriptions of satellite image processing over distributed infrastructures (e.g. Grid, Cloud, and GPU clusters). This presentation highlights the Grid-based implementation of the GreenLand application. The GreenLand application development is based on simple, but powerful, notions of mathematical operators and workflows that are used in distributed and parallel executions over the Grid infrastructure. Currently it is used in three major case studies concerning the Istanbul geographical area, the Rioni River in Georgia, and the Black Sea catchment region. The GreenLand application offers a friendly user interface for viewing and editing workflows and operators. The description involves the basic operators provided by the GRASS [1] library as well as many other image-related operators supported by the ESIP platform [2]. The processing workflows are represented as directed graphs, giving the user a fast and easy way to describe complex parallel algorithms without having any prior knowledge of any programming language or application commands.
Development of the SOFIA Image Processing Tool
NASA Technical Reports Server (NTRS)
Adams, Alexander N.
2011-01-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes of between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.
ERIC Educational Resources Information Center
Gullick, Margaret M.; Sprute, Lisa A.; Temple, Elise
2011-01-01
Individual differences in mathematics performance may stem from domain-general factors like working memory and intelligence. Parietal and frontal brain areas have been implicated in number processing, but the influence of such cognitive factors on brain activity during mathematics processing is not known. The relationship between brain mechanisms…
ERIC Educational Resources Information Center
Hidiroglu, Çaglar Naci; Bukova Güzel, Esra
2013-01-01
The aim of the present study is to conceptualize the approaches used for validating models and the thought processes involved in mathematical modeling performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…
ERIC Educational Resources Information Center
Seo, Daeryong; Taherbhai, Husein
2009-01-01
The relations among students' motivational beliefs, cognitive processes, and academic achievement were investigated. A 51-item questionnaire together with a mathematics achievement test was administered to 459 fifth graders in Korean elementary school mathematics classrooms. Results indicated that, in general, students' cognitive processes related…
HYMOSS signal processing for pushbroom spectral imaging
NASA Technical Reports Server (NTRS)
Ludwig, David E.
1991-01-01
The objective of the Pushbroom Spectral Imaging Program was to develop on-focal-plane electronics which compensate for detector array non-uniformities. The approach taken was to implement a simple two-point calibration algorithm on the focal plane which allows for offset and linear gain correction. The key on-focal-plane features which made this technique feasible were the use of a high-quality transimpedance amplifier (TIA) and an analog-to-digital converter for each detector channel. Gain compensation is accomplished by varying the feedback capacitance of the integrate-and-dump TIA. Offset correction is performed by storing offsets in a special on-focal-plane offset register and digitally subtracting the offsets from the readout data during the multiplexing operation. A custom integrated circuit was designed, fabricated, and tested on this program, which proved that nonuniformity-compensated, analog-to-digital converting circuits may be used to read out infrared detectors. Irvine Sensors Corporation (ISC) successfully demonstrated these innovative on-focal-plane functions that allow for correction of detector non-uniformities. Most of the circuit functions demonstrated on this program are finding their way onto future ICs because of their impact on reduced downstream processing, increased focal plane performance, simplified focal plane control, and a reduced number of dewar connections, as well as the noise immunity of a digital interface dewar. The potential commercial applications for this integrated circuit are primarily in imaging systems, which may be used for security monitoring, manufacturing process monitoring, robotics, and spectral imaging in analytical instrumentation.
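The two-point calibration described above amounts to a per-pixel offset and linear gain correction. A minimal numerical sketch follows; the function names and the simple detector model (measured = offset + responsivity × signal, calibrated from a dark frame and one uniform exposure) are assumptions for illustration, not the program's circuit-level algorithm.

```python
import numpy as np

def two_point_calibration(dark, flat, flat_level):
    """Derive per-pixel offset and gain from a dark frame (zero signal)
    and a uniform 'flat' exposure of known signal level."""
    offset = dark.astype(float)
    gain = flat_level / (flat.astype(float) - offset)
    return offset, gain

def correct(raw, offset, gain):
    """Apply offset subtraction followed by linear gain correction."""
    return gain * (raw.astype(float) - offset)
```

With these two calibration frames stored, every subsequent readout maps back to the true signal regardless of per-detector offset and responsivity spread.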
Survey: interpolation methods in medical image processing.
Lehmann, T M; Gönner, C; Spitzer, K
1999-11-01
Image interpolation techniques often are required in medical imaging for image generation (e.g., discrete back projection for inverse Radon transform) and processing such as compression or resampling. Since the ideal interpolation function is spatially unlimited, several interpolation kernels of finite size have been introduced. This paper compares 1) truncated and windowed sinc; 2) nearest neighbor; 3) linear; 4) quadratic; 5) cubic B-spline; 6) cubic; 7) Lagrange; and 8) Gaussian interpolation and approximation techniques with kernel sizes from 1 x 1 up to 8 x 8. The comparison is done by: 1) spatial and Fourier analyses; 2) computational complexity as well as runtime evaluations; and 3) qualitative and quantitative interpolation error determinations for particular interpolation tasks which were taken from common situations in medical image processing. For local and Fourier analyses, a standardized notation is introduced and fundamental properties of interpolators are derived. Successful methods should be direct current (DC)-constant and interpolators rather than DC-inconstant or approximators. Each method's parameters are tuned with respect to those properties. This results in three novel kernels, which are introduced in this paper and proven to be within the best choices for medical image interpolation: the 6 x 6 Blackman-Harris windowed sinc interpolator, and the C2-continuous cubic kernels with N = 6 and N = 8 supporting points. For quantitative error evaluations, a set of 50 direct digital X-rays was used. They were selected arbitrarily from clinical routine. In general, large kernel sizes were found to be superior to small interpolation masks. Except for truncated sinc interpolators, all kernels with N = 6 or larger sizes perform significantly better than N = 2 or N = 3 point methods (p < 0.005). However, the differences within the group of large-sized kernels were not significant. Summarizing the results, the cubic 6 x 6 interpolator with continuous
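As an illustration of the kernel families compared in this survey, the widely used 4-point cubic convolution (Keys) interpolator with a = -0.5 can be written in a few lines. This sketch is generic rather than code from the paper; the border handling (replicating edge samples) is an arbitrary choice. Note it is DC-constant: the kernel weights sum to 1 for any sample position, so a constant signal is reproduced exactly.

```python
import numpy as np

def keys_kernel(x, a=-0.5):
    """Keys' cubic convolution kernel: a 4-point, C1-continuous interpolator."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m1 = x <= 1
    m2 = (x > 1) & (x < 2)
    out[m1] = (a + 2) * x[m1]**3 - (a + 3) * x[m1]**2 + 1
    out[m2] = a * (x[m2]**3 - 5 * x[m2]**2 + 8 * x[m2] - 4)
    return out

def interp1d(samples, t):
    """Interpolate uniformly spaced samples at fractional position t."""
    samples = np.asarray(samples, dtype=float)
    i = int(np.floor(t))
    positions = np.arange(i - 1, i + 3)             # 4 nearest sample positions
    idx = np.clip(positions, 0, len(samples) - 1)   # replicate border samples
    weights = keys_kernel(t - positions)
    return float(np.dot(weights, samples[idx]))
```

At integer positions the weights collapse to (0, 1, 0, 0), so the interpolator passes through the samples, distinguishing it from approximators such as the cubic B-spline.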
NASA Astrophysics Data System (ADS)
Paranin, Y.; Burmistrov, A.; Salikeev, S.; Fomina, M.
2015-08-01
Basic propositions of calculation procedures for the characteristics of oil-free scroll compressors are presented. It is shown that mathematical modelling of the working process in a scroll compressor makes it possible to take into account factors influencing the working process such as heat and mass exchange, mechanical interaction in the working chambers, leakage through slots, etc. The basic mathematical model may be supplemented by taking into account external heat exchange, elastic deformation of the scrolls, inlet and outlet losses, etc. To evaluate the influence of the calculation procedure on the accuracy of the scroll compressor characteristics, different calculations were carried out. Internal adiabatic efficiency was chosen as the comparative parameter, as it evaluates the perfection of the internal thermodynamic and gas-dynamic compressor processes. The calculated characteristics are compared with experimental values obtained for a compressor pilot sample.
Mathematical Simulation of the Crystallization Process in a Continuous Linear Crystallizer
NASA Astrophysics Data System (ADS)
Veselov, S. N.; Volk, V. I.; Kashcheev, V. A.; Podymova, T. V.; Posenitskiy, E. A.
2017-01-01
A mathematical model of the crystallization of uranium in a continuous linear crystallizer, designed for the crystallization separation of desired products in the processing of an irradiated nuclear fuel, is proposed. This model defines the dynamics of growth/dissolution of uranyl nitrate hexahydrate crystals in a nitric acid solution of uranyl nitrate. Results of a numerical simulation of the indicated process, pointing to the existence of stationary conditions in the working space of the crystallizer, are presented. On the basis of these results, the characteristic time of establishment of the stationary regime at different parameters of the process was estimated. The mathematical model proposed was validated on the basis of a comparison of the results of calculations carried out within its framework with experimental data.
Mathematical models in simulation process in rehabilitation of persons with disabilities
NASA Astrophysics Data System (ADS)
Gorie, Nina; Dolga, Valer; Mondoc, Alina
2012-11-01
The problems of people with disabilities are varied. A disability may be physical, cognitive, mental, sensory, emotional, developmental, or some combination of these. Major disabilities that can appear in people's lives include blindness, deafness, limb-girdle muscular dystrophy, orthopedic impairment, and visual impairment. Disability is an umbrella term covering impairments, activity limitations, and participation restrictions; a disability may occur during a person's lifetime or may be present from birth. The authors conclude that some of these disabilities, whether physical, cognitive, mental, sensory, emotional, or developmental, can be rehabilitated. Starting from this state of affairs, the authors briefly present the possibility of using certain mechatronic systems for the rehabilitation of persons with different disabilities, focusing on the Stewart platform as an alternative for achieving the proposed goal. A mathematical model of the parallel system is presented from a systems-theory perspective, together with a mathematical description of the rehabilitation procedure. Based on the biomechanics of the affected function, and taking medical recommendations into account, the authors formulate mathematical models of the rehabilitation exercises, assemble a combined model of the parallel structure and the rehabilitation process, carry out simulations, and highlight the estimated results. The paper closes with the results of the analysis, conclusions, and steps for a future work program.
A New Image Processing and GIS Package
NASA Technical Reports Server (NTRS)
Rickman, D.; Luvall, J. C.; Cheng, T.
1998-01-01
The image processing and GIS package ELAS was developed by NASA during the 1980s. It proved to be a popular, influential, and powerful tool for the manipulation of digital imagery. Before the advent of PCs it was used by hundreds of institutions, mostly schools. It is the unquestioned, direct progenitor of two commercial GIS remote sensing packages, ERDAS and MapX, and influenced others, such as PCI. Its power was demonstrated by its use far beyond its original purpose, on several different types of medical imagery, photomicrographs of rock, images of turtle flippers, and numerous other esoteric imagery. Although development largely stopped in the early 1990s, the package still offers as much or more power and flexibility than any other roughly comparable package, public or commercial. It is a huge body of code, representing more than a decade of work by full-time professional programmers. The current versions all have several deficiencies compared with current software standards and usage, notably a strictly command-line interface. In order to support their research needs, the authors are in the process of fundamentally changing ELAS, and in the process greatly increasing its power, utility, and ease of use. The new software is called ELAS II. This paper discusses the design of ELAS II.
4MOST metrology system image processing
NASA Astrophysics Data System (ADS)
Winkler, Roland; Barden, Samuel C.; Saviauk, Allar
2016-08-01
The 4-meter Multi-Object Spectroscopic Telescope (4MOST) instrument uses 2400 individually positioned optical fibres to couple the light of targets into its spectrographs. The metrology system determines the position of the back-illuminated fibres on the focal surface of the telescope. It consists of four identical cameras that are mounted on the spider vanes of the secondary mirror of the VISTA telescope and look through the entire optical train, including M1, M2, and the WFC/ADC unit. Here, we describe the image and data processing steps of the metrology system and present results from our 1:10-scale lab prototype.
Estimation of age by epidermal image processing.
Tatsumi, S; Noda, H; Sugiyama, S
1999-12-01
Small pieces of human precordial skin were obtained from 266 individuals during autopsies performed in Osaka Prefecture. The area from the stratum corneum to the stratum basale within a unit area of the epidermal cross-section was extracted by image processing as a segmented white region. The number of pixels surrounding this area was measured in individuals of various ages, and the age-associated changes were evaluated. The number of pixels around this binary image of the epidermal cross-section showed a strong correlation with age. The number tended to decrease with increasing age in individuals aged 20 years and above, and this decrease could be closely approximated by an exponential function. A formula for estimating age was obtained as an inverse function relating the number of pixels to age, and the accuracy of estimation using this formula was examined by comparing the estimated age with the actual age. Such age-associated changes in the epidermis were considered to be closely related to increased roughening of the stratum basale, flattening of the dermal papillae, and a decreased percentage of the stratum granulosum per unit area of epidermis observed by light microscopy or scanning electron microscopy.
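The fit-then-invert procedure described above can be sketched in a few lines: fit an exponential decay N = a·exp(-b·age) to (age, pixel count) pairs by least squares in log space, then invert it to estimate age from a measured count. The calibration data and decay constant below are synthetic placeholders, not the paper's fitted coefficients.

```python
import numpy as np

# Hypothetical calibration data: boundary pixel counts at known ages.
ages = np.array([25.0, 35.0, 45.0, 55.0, 65.0, 75.0])
counts = 5000.0 * np.exp(-0.015 * ages)   # synthetic, for illustration only

# counts = a * exp(-b * age)  =>  log(counts) = log(a) - b * age,
# so an ordinary linear fit in log space recovers the parameters.
slope, intercept = np.polyfit(ages, np.log(counts), 1)   # slope = -b

def estimate_age(n_pixels):
    """Invert the fitted exponential: age = (log(a) - log(N)) / b."""
    return (intercept - np.log(n_pixels)) / (-slope)
```
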
Image processing to optimize wave energy converters
NASA Astrophysics Data System (ADS)
Bailey, Kyle Marc-Anthony
The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently have they become a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WEC. Wave power is produced by placing a mechanical device either onshore or offshore that captures the energy within ocean surface waves. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This is achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete-time Fourier transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which can subsequently be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by applications to simulated and real satellite images where the frequency is known.
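Although the paper uses a complex modulated lapped orthogonal transform filter bank, the core idea of reading off the horizontal and vertical wave frequencies from the location of the maximum spectral energy can be illustrated with a plain 2D FFT. This is a simplified stand-in, not the authors' filter bank:

```python
import numpy as np

def dominant_wave_frequency(img, dx=1.0):
    """Return (fy, fx), the dominant spatial frequency in cycles per unit
    length, from the peak of the 2D FFT magnitude (DC removed by mean
    subtraction). The sign is ambiguous because of conjugate symmetry."""
    F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0], d=dx))
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1], d=dx))
    iy, ix = np.unravel_index(np.argmax(np.abs(F)), F.shape)
    return fy[iy], fx[ix]
```

Given the horizontal and vertical components, the frequency along the WEC's heading follows from the trigonometric projection the abstract mentions.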
Using Image Processing to Determine Emphysema Severity
NASA Astrophysics Data System (ADS)
McKenzie, Alexander; Sadun, Alberto
2010-10-01
Currently, X-rays and computerized tomography (CT) scans are used to detect emphysema, but other tests are required to accurately quantify the amount of lung that has been affected by the disease. These images clearly show whether a patient has emphysema, but they cannot, by visual scan alone, quantify the degree of the disease, as it presents as subtle, dark spots on the lung. Our goal is to use these CT scans to accurately diagnose and determine emphysema severity levels in patients. This will be accomplished by performing several different analyses of CT scan images of several patients representing a wide range of severity of the disease. In addition to analyzing the original CT data, this process will convert the data to one- and two-bit images and will then examine the deviation from a normal distribution curve to determine skewness. Our preliminary results show that this method of assessment appears to be more accurate and robust than the currently utilized methods, which involve looking at percentages of radiodensities in the air passages of the lung.
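Measuring the deviation of a CT intensity distribution from a normal curve, as described above, comes down to computing the sample skewness (the third standardised moment), which is zero for symmetric data and nonzero when low-density emphysematous voxels pull the distribution to one side. A minimal sketch, independent of the authors' pipeline:

```python
import numpy as np

def skewness(values):
    """Sample skewness: the third standardised moment of the intensities.
    Zero for a symmetric distribution; sign indicates the direction of
    the asymmetry."""
    v = np.asarray(values, dtype=float)
    centred = v - v.mean()
    return (centred**3).mean() / v.std()**3
```
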
Saitou, Takashi; Imamura, Takeshi
2016-01-01
Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the discovery of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on technological developments that use mathematics to retrieve meaningful information from the data Fucci produces, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation.
Automatic draft reading based on image processing
NASA Astrophysics Data System (ADS)
Tsujii, Takahiro; Yoshida, Hiromi; Iiguni, Youji
2016-10-01
In marine transportation, a draft survey is a means of determining the quantity of bulk cargo. Automatic draft reading based on computer image processing has been proposed. However, conventional draft mark segmentation may fail when the video sequence contains many regions other than the draft marks and hull, and the estimated waterline is inherently higher than the true one. To solve these problems, we propose an automatic draft reading method that uses morphological operations to detect draft marks and estimates the waterline for every frame with Canny edge detection and robust estimation. Moreover, we emulate the surveyors' draft reading process so that both shipper and receiver can accept the result. In an experiment in a towing tank, the draft reading error of the proposed method was <1 cm, showing the advantage of the proposed method. Accurate draft reading was also achieved in a real-world scene.
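The waterline-estimation step can be caricatured in a few lines: find the image row where intensity changes most sharply from hull to water, taking a median across columns so isolated clutter (such as the draft marks themselves) does not dominate. This is a simplified stand-in for the Canny-plus-robust-estimation procedure the abstract describes, and the function name is an assumption:

```python
import numpy as np

def estimate_waterline_row(img):
    """Estimate the waterline as the row with the strongest vertical
    intensity change; the median across columns acts as a crude robust
    estimator against localized clutter."""
    grad = np.median(np.abs(np.diff(img.astype(float), axis=0)), axis=1)
    return int(np.argmax(grad)) + 1   # +1: first row below the transition
```
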
Imaging fault zones using 3D seismic image processing techniques
NASA Astrophysics Data System (ADS)
Iacopini, David; Butler, Rob; Purves, Steve
2013-04-01
Significant advances in the structural analysis of deep-water structures, salt tectonics, and extensional rift basins come from descriptions of fault system geometries imaged in 3D seismic data. However, even where seismic data are excellent, in most cases the trajectory of thrust faults is highly conjectural, and significant uncertainty remains as to the patterns of deformation that develop between the main fault segments, and even as to the fault architectures themselves. Moreover, structural interpretations that conventionally define faults by breaks and apparent offsets of seismic reflectors are commonly conditioned by a narrow range of theoretical models of fault behavior. For example, almost all interpretations of thrust geometries on seismic data rely on theoretical "end-member" behaviors in which concepts such as strain localization or multilayer mechanics are simply avoided. Yet analogue outcrop studies confirm that such descriptions are commonly unsatisfactory and incomplete. In order to fill these gaps and improve the 3D visualization of deformation in the subsurface, seismic attribute methods are developed here in conjunction with conventional mapping of reflector amplitudes (Marfurt & Chopra, 2007). These signal processing techniques, recently developed and applied especially by the oil industry, use variations in the amplitude and phase of the seismic wavelet. These seismic attributes improve signal interpretation and are calculated and applied to the entire 3D seismic dataset. In this contribution we show 3D seismic examples of fault structures from gravity-driven deep-water thrust structures and extensional basin systems to indicate how 3D seismic image processing methods can not only build better geometrical interpretations of the faults but also begin to map both strain and damage through the amplitude/phase properties of the seismic signal. This is done by quantifying and delineating short-range anomalies in the intensity of reflector amplitudes
MISR Browse Images: Cold Land Processes Experiment (CLPX)
Atmospheric Science Data Center
2013-04-02
These MISR Browse images provide a ... over the region observed during the NASA Cold Land Processes Experiment (CLPX). CLPX involved ground, airborne, and satellite measurements ...
Spatial Processing in Infancy Predicts Both Spatial and Mathematical Aptitude in Childhood.
Lauer, Jillian E; Lourenco, Stella F
2016-10-01
Despite considerable interest in the role of spatial intelligence in science, technology, engineering, and mathematics (STEM) achievement, little is known about the ontogenetic origins of individual differences in spatial aptitude or their relation to later accomplishments in STEM disciplines. The current study provides evidence that spatial processes present in infancy predict interindividual variation in both spatial and mathematical competence later in development. Using a longitudinal design, we found that children's performance on a brief visuospatial change-detection task administered between 6 and 13 months of age was related to their spatial aptitude (i.e., mental-transformation skill) and mastery of symbolic-math concepts at 4 years of age, even when we controlled for general cognitive abilities and spatial memory. These results suggest that nascent spatial processes present in the first year of life not only act as precursors to later spatial intelligence but also predict math achievement during childhood.
Mathematical modelling of the incomplete transformations in pseudoelastic processes in binary alloys
NASA Astrophysics Data System (ADS)
Vokoun, David; Kafka, Vratislav
1996-04-01
In two papers, Kafka (1994, 1994a) presented a new approach to the explanation and mathematical modelling of the shape memory effect and of pseudoelasticity. This approach was based on his general concept of modelling inelastic processes in heterogeneous media (Kafka 1987), and it was shown that this concept can successfully be applied even where the heterogeneity under study is on the atomic scale, i.e. in the case of binary alloys and their shape memory behaviour. In the second quoted paper (Kafka 1994a), quantitative comparisons with experimental data obtained with samples of NiTi alloy were shown, and it was demonstrated that the unified mathematical model is able to quantitatively describe the shape memory effect as well as pseudoelastic processes at different temperatures.
Kol'dyaev, V.I.; Svitashev, K.K.
1987-01-01
The authors review developments of mathematical modeling of electron processes in dielectrics in strong fields, and compare some results of the modeling obtained from the experimental data on MNOS structures. The authors study modeling of the volt-ampere characteristics; methods based on the study of the polarization kinetics of MDS structures (polarization methods); depolarization of MDS structures under isothermal conditions, and under variable temperature; methods using photoconductivity and photoinjection; and modeling of the degradation of MNOS structures.
A mathematical study of a syntrophic relationship of a model of anaerobic digestion process.
El Hajji, Miled; Mazenc, Frédéric; Harmand, Jérôme
2010-07-01
A mathematical model involving the syntrophic relationship of two major populations of bacteria (acetogens and methanogens), each responsible for a stage of the methane fermentation process is proposed. A detailed qualitative analysis is carried out. The local and global stability analyses of the equilibria are performed. We demonstrate, under general assumptions of monotonicity, relevant from an applied point of view, the global asymptotic stability of a positive equilibrium point which corresponds to the coexistence of acetogenic and methanogenic bacteria.
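A two-population syntrophic chemostat model of this kind can be sketched numerically: acetogens (x1) degrade substrate s1 into an intermediate s2 (e.g. acetate/hydrogen), which methanogens (x2) consume, all under a common dilution rate. The Monod growth functions and every parameter value below are illustrative assumptions, not the paper's; the point is only that the cascade settles to a coexistence equilibrium, consistent with the stability result the abstract reports.

```python
import numpy as np

def simulate(days=400.0, dt=0.01):
    """Forward-Euler integration of a minimal two-stage chemostat:
    s1 --(x1, acetogens)--> s2 --(x2, methanogens)--> methane (not tracked)."""
    D, s1_in = 0.1, 5.0                        # dilution rate, inlet substrate
    mu1 = lambda s: 0.5 * s / (1.0 + s)        # Monod growth rate, acetogens
    mu2 = lambda s: 0.4 * s / (0.5 + s)        # Monod growth rate, methanogens
    s1, x1, s2, x2 = s1_in, 0.1, 0.0, 0.1      # initial state
    for _ in range(int(days / dt)):
        g1, g2 = mu1(s1), mu2(s2)
        s1 += dt * (D * (s1_in - s1) - g1 * x1)    # inflow/outflow, consumption
        x1 += dt * (g1 - D) * x1                   # growth minus washout
        s2 += dt * (g1 * x1 - g2 * x2 - D * s2)    # produced by x1, eaten by x2
        x2 += dt * (g2 - D) * x2
    return s1, x1, s2, x2
```

At the positive equilibrium each growth rate balances the dilution rate (mu1(s1*) = mu2(s2*) = D), which fixes s1* = 0.25 for the parameters above.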
Digital image processing of cephalometric radiographs: a preliminary report.
Jackson, P H; Dickson, G C; Birnie, D J
1985-07-01
The principles of image capture, image storage and image processing in digital radiology are described. The enhancement of radiographic images using digital image processing techniques and its application to cephalometry is discussed. The results of a pilot study which compared some common cephalometric measurements made from manual point identification with those made by direct digitization of digital radiographic images from video monitors are presented. Although in an early stage of development, the results from the image processing system were comparable with those obtained by traditional methods.
Are poor mathematics skills associated with visual deficits in temporal processing?
Sigmundsson, H; Anholt, S K; Talcott, J B
2010-01-22
Developmental learning disabilities such as dyslexia and dyscalculia have a high rate of co-occurrence in pediatric populations, suggesting that they share underlying cognitive and neurophysiological mechanisms. Dyslexia and other developmental disorders with a strong heritable component have been associated with reduced sensitivity to coherent motion stimuli, an index of visual temporal processing on a millisecond time-scale. Here we examined whether deficits in sensitivity to visual motion are evident in children who have poor mathematics skills relative to other children of the same age. We obtained psychophysical thresholds for visual coherent motion and a control task from two groups of children who differed in their performance on a test of mathematics achievement. Children with math skills in the lowest 10% in their cohort were less sensitive than age-matched controls to coherent motion, but they had statistically equivalent thresholds to controls on a coherent form control measure. Children with mathematics difficulties therefore tend to present a similar pattern of visual processing deficit to those that have been reported previously in other developmental disorders. We speculate that reduced sensitivity to temporally defined stimuli such as coherent motion represents a common processing deficit apparent across a range of commonly co-occurring developmental disorders.
Moll, Kristina; Göbel, Silke M; Snowling, Margaret J
2015-01-01
As well as being the hallmark of mathematics disorders, deficits in number processing have also been reported for individuals with reading disorders. The aim of the present study was to investigate separately the components of numerical processing affected in reading and mathematical disorders within the framework of the Triple Code Model. Children with reading disorders (RD), mathematics disorders (MD), comorbid deficits (RD + MD), and typically developing children (TD) were tested on verbal, visual-verbal, and nonverbal number tasks. As expected, children with MD were impaired across a broad range of numerical tasks. In contrast, children with RD were impaired in (visual-)verbal number tasks but showed age-appropriate performance in nonverbal number skills, suggesting their impairments were domain specific and related to their reading difficulties. The comorbid group showed an additive profile of the impairments of the two single-deficit groups. Performance in speeded verbal number tasks was related to rapid automatized naming, a measure of visual-verbal access in the RD but not in the MD group. The results indicate that deficits in number skills are due to different underlying cognitive deficits in children with RD compared to children with MD: a phonological deficit in RD and a deficit in processing numerosities in MD.
Students, Computers and Mathematics the Golden Trilogy in the Teaching-Learning Process
ERIC Educational Resources Information Center
García-Santillán, Arturo; Escalera-Chávez, Milka Elena; López-Morales, José Satsumi; Córdova Rangel, Arturo
2014-01-01
In this paper we examine the relationships between students' attitudes towards mathematics and technology. To do so, we use Galbraith and Hines' scale (1998, 2000), which covers mathematics confidence, computer confidence, computer-mathematics interaction, mathematics motivation, computer motivation, and mathematics engagement. 164 questionnaires…
Exploring Metacognition in Preservice Teachers: Problem Solving Processes in Elementary Mathematics
ERIC Educational Resources Information Center
Sparkman, Dana; Harris, Kymberly
2009-01-01
In Principles and Standards for School Mathematics (2000), the (U.S.) National Council of Teachers of Mathematics recommended that students communicate their mathematical thinking in a logical manner, and use the language of mathematics to express their thinking accurately and logically. Students should not only learn mathematics content, but…
Effects of image processing on the detective quantum efficiency
NASA Astrophysics Data System (ADS)
Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na
2010-04-01
Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as these methodologies have not been standardized, the results of such studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) are affected by the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the radiographic techniques defined by the International Electrotechnical Commission (IEC 62220-1, RQA5). Computed radiography (CR) images of a hand in posterior-anterior (PA) projection for measuring the signal-to-noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. All of the modified images considerably influenced the evaluation of SNR, MTF, NPS, and DQE. Images modified by post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when image quality is evaluated. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality in a consistent way. The results of this study could serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring MTF, NPS, and DQE.
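The IEC-style figures of merit named in the abstract combine as DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident photon fluence of the RQA beam and NNPS is the NPS normalized by the squared mean signal. A minimal sketch with invented toy values (not the paper's measurements):

```python
def dqe(mtf, nnps, q):
    """Detective quantum efficiency in the IEC 62220-1 style:
    DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    where q is the photon fluence (photons/mm^2) and NNPS is the
    noise power spectrum normalized by the squared mean signal."""
    return [m * m / (q * w) for m, w in zip(mtf, nnps)]

# Toy spatial-frequency samples: MTF falls off, NNPS is flat.
mtf = [1.0, 0.8, 0.5, 0.3]
nnps = [4e-6, 4e-6, 4e-6, 4e-6]
q = 250000.0  # assumed fluence, for illustration only
print([round(v, 2) for v in dqe(mtf, nnps, q)])  # -> [1.0, 0.64, 0.25, 0.09]
```

A real measurement would estimate MTF from the slit image and NNPS from the white image before combining them this way.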
NASA Astrophysics Data System (ADS)
Lu, Lee-Jane W.; Nishino, Thomas K.; Johnson, Raleigh F.; Nayeem, Fatima; Brunder, Donald G.; Ju, Hyunsu; Leonard, Morton H., Jr.; Grady, James J.; Khamapirad, Tuenchit
2012-11-01
Women with mostly mammographically dense fibroglandular tissue (breast density, BD) have a four- to six-fold increased risk for breast cancer compared to women with little BD. BD is most frequently estimated from two-dimensional (2D) views of mammograms by a histogram segmentation approach (HSM) and more recently by a mathematical algorithm consisting of mammographic imaging parameters (MATH). Two non-invasive clinical magnetic resonance imaging (MRI) protocols, 3D gradient-echo (3DGRE) and short tau inversion recovery (STIR), were modified for 3D volumetric reconstruction of the breast for measuring fatty and fibroglandular tissue volumes by a Gaussian-distribution curve-fitting algorithm. Replicate breast exams (N = 2 to 7 replicates in six women) by 3DGRE and STIR were highly reproducible for all tissue-volume estimates (coefficients of variation <5%). Reliability studies compared measurements from the four methods, 3DGRE, STIR, HSM, and MATH (N = 95 women), by linear regression and intra-class correlation (ICC) analyses. R-squared values, regression slopes, and ICC, respectively, were (1) 0.76-0.86, 0.8-1.1, and 0.87-0.92 for %-gland tissue; (2) 0.72-0.82, 0.64-0.96, and 0.77-0.91 for glandular volume; (3) 0.87-0.98, 0.94-1.07, and 0.89-0.99 for fat volume; and (4) 0.89-0.98, 0.94-1.00, and 0.89-0.98 for total breast volume. For all estimated values, the correlation was stronger between the two MRI protocols than between each MRI and mammography, and stronger between each MRI and MATH than between each MRI and HSM. All ICC values were >0.75, indicating that all four methods were reliable for measuring BD and that the mathematical algorithm and the two complementary non-invasive MRI protocols could objectively and reliably estimate different types of breast tissues.
Skagerlund, Kenny; Träff, Ulf
2016-03-01
The current study investigated whether processing of number, space, and time contributes to mathematical abilities beyond previously known domain-general cognitive abilities in a sample of 8- to 10-year-old children (N=133). Multiple regression analyses revealed that executive functions and general intelligence predicted all aspects of mathematics and overall mathematical ability. Working memory capacity did not contribute significantly to our models, whereas spatial ability was a strong predictor of achievement. The study replicates earlier research showing that non-symbolic number processing seems to lose predictive power of mathematical abilities once the symbolic system is acquired. Novel findings include the fact that time discrimination ability was tied to calculation ability. Therefore, a conclusion is that magnitude processing in general contributes to mathematical achievement.
ERIC Educational Resources Information Center
Warwick, Jon
2007-01-01
The decline in the development of mathematical skills in students prior to university entrance has been a matter of concern to UK higher education staff for a number of years. This article describes a pilot study that uses the Analytic Hierarchy Process to quantify the mathematical experiences of computing students prior to the start of a first…
ERIC Educational Resources Information Center
Klein, M.
2002-01-01
Undertakes, from a poststructuralist perspective, a meta-analysis of two short episodes from a paper by Manouchehri and Goodman (2000). Explores how mathematical knowledge and identities are produced in teaching/learning interactions in the classroom and the wider practical implications of this productive power of process for mathematics education…
ERIC Educational Resources Information Center
Canturk-Gunhan, Berna; Bukova-Guzel, Esra; Ozgur, Zekiye
2012-01-01
The purpose of this study is to determine prospective mathematics teachers' views about using problem-based learning (PBL) in statistics teaching and to examine their thought processes. It is a qualitative study conducted with 15 prospective mathematics teachers from a state university in Turkey. The data were collected via participant observation…
Methods for processing and imaging marsh foraminifera
Dreher, Chandra A.; Flocks, James G.
2011-01-01
This study is part of a larger U.S. Geological Survey (USGS) project to characterize the physical conditions of wetlands in southwestern Louisiana. Within these wetlands, groups of benthic foraminifera (shelled amoeboid protists living near or on the sea floor) can be used as agents to measure land subsidence, relative sea-level rise, and storm impact. In the Mississippi River Delta region, intertidal-marsh foraminiferal assemblages and biofacies were established in studies that pre-date the 1970s, with a very limited number of more recent studies. This fact sheet outlines the project's improved processing methods, handling, and modified preparations for scanning electron microscope (SEM) imaging of these foraminifera. The objective is to identify marsh foraminifera to the taxonomic species level by using improved processing methods and SEM imaging for morphological characterization, in order to evaluate changes in distribution and frequency relative to other environmental variables. The majority of benthic marsh foraminifera are agglutinated forms, which can be more delicate than porcelaneous forms. Agglutinated tests (shells) are made of particles such as sand grains or silt and clay material, whereas porcelaneous tests consist of calcite.
Intelligent elevator management system using image processing
NASA Astrophysics Data System (ADS)
Narayanan, H. Sai; Karunamurthy, Vignesh; Kumar, R. Barath
2015-03-01
In the modern era, the increase in the number of shopping malls and industrial buildings has led to an exponential increase in the usage of elevator systems. Thus there is an increased need for an effective control system to manage elevators. This paper introduces an effective method to control the movement of elevators by considering various cases wherein the location of each person is found and the elevators are controlled based on conditions such as load and proximity. The method continuously monitors the weight limit of each elevator while also making use of image processing to determine the number of persons waiting for an elevator on each floor. The Canny edge detection technique is used to find the number of persons waiting for an elevator. The algorithm thus takes many cases into account and selects the correct elevator to serve the persons waiting on different floors.
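Full Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding; a minimal sketch of its core step, gradient-magnitude thresholding, on an invented 5x5 frame:

```python
def edge_map(img, thresh):
    """Simplified stand-in for Canny: central-difference gradient
    magnitude followed by a single threshold (no non-maximum
    suppression or hysteresis, which full Canny performs)."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            if (gx * gx + gy * gy) ** 0.5 >= thresh:
                edges[y][x] = 1
    return edges

# Toy frame: a bright square (a "person") on a dark background.
img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 9, 0],
       [0, 9, 9, 9, 0],
       [0, 9, 9, 9, 0],
       [0, 0, 0, 0, 0]]
e = edge_map(img, 5.0)
print(sum(map(sum, e)))  # -> 8 edge pixels around the square
```

Counting people would then group such edge pixels into connected contours per floor camera.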
Corn plant locating by image processing
NASA Astrophysics Data System (ADS)
Jia, Jiancheng; Krutz, Gary W.; Gibson, Harry W.
1991-02-01
The feasibility of using machine vision technology to locate corn plants is an important issue for field production automation in the agricultural industry. This paper presents an approach that was developed to locate the center of a corn plant using image processing techniques. Corn plants were first identified using a main-vein detection algorithm, which detects a local feature of corn leaves (the leaf main veins) based on the spectral difference between veins and leaves. The center of the plant could then be located using a center-locating algorithm that traces and extends each detected vein line and estimates the center of the plant from the intersection points of those lines. The experimental results show the usefulness of the algorithm for machine vision applications related to corn plant identification. Such a technique can be used for precise spraying of pesticides or biotech chemicals.
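The center-locating step described above reduces to intersecting the extended vein lines and averaging the intersection points; a sketch with hypothetical lines given in a*x + b*y = c form:

```python
def intersect(l1, l2):
    """Intersection of two lines (a, b, c) with a*x + b*y = c."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel vein lines never meet
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

def plant_center(lines):
    """Average of all pairwise intersections of the vein lines."""
    pts = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            p = intersect(lines[i], lines[j])
            if p:
                pts.append(p)
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# Three hypothetical vein lines, all passing through (2, 3).
print(plant_center([(1, 0, 2), (0, 1, 3), (1, 1, 5)]))  # -> (2.0, 3.0)
```

With noisy detections the intersections scatter, and the average (or a robust variant of it) estimates the plant center.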
Image processing and products for the Magellan mission to Venus
NASA Technical Reports Server (NTRS)
Clark, Jerry; Alexander, Doug; Andres, Paul; Lewicki, Scott; Mcauley, Myche
1992-01-01
The Magellan mission to Venus is providing planetary scientists with massive amounts of new data about the surface geology of Venus. Digital image processing is an integral part of the ground data system that provides data products to the investigators. The mosaicking of synthetic aperture radar (SAR) image data from the spacecraft is being performed at JPL's Multimission Image Processing Laboratory (MIPL). MIPL hosts and supports the Image Data Processing Subsystem (IDPS), which was developed in a VAXcluster environment of hardware and software that includes optical disk jukeboxes and the TAE-VICAR (Transportable Applications Executive-Video Image Communication and Retrieval) system. The IDPS is being used by processing analysts of the Image Data Processing Team to produce the Magellan image data products. Various aspects of the image processing procedure are discussed.
Cárdenas Sandoval, Rosy Paola; Garzón-Alvarado, Diego Alexander; Ramírez Martínez, Angélica Maria
2012-06-07
This article proposes a mathematical model that predicts the wound healing process of the ligament after a grade II sprain. The model describes the swelling, expression of the platelet-derived growth factor (PDGF), formation and migration of fibroblasts into the injury area, and the expression of collagen fibers. Additionally, the model can predict the effect of ice treatment in reducing inflammation and the action of mechanical stress in the process of remodeling of collagen fibers. The results obtained from computer simulation show high concordance with the clinical data previously reported by other authors.
Filters in 2D and 3D Cardiac SPECT Image Processing
Lyra, Maria; Ploussi, Agapi; Rouchota, Maritina; Synefia, Stella
2014-01-01
Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is a key for accurate diagnosis. Image filtering, a mathematical processing, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by filter back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast. PMID:24804144
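The Butterworth filter named above has the low-pass response H(f) = 1 / (1 + (f/fc)^(2n)) in the power-form convention common in nuclear medicine (some texts use the square root of this); a sketch:

```python
def butterworth(f, fc, n):
    """Low-pass Butterworth response used in SPECT filtering:
    H(f) = 1 / (1 + (f / fc)^(2n)),
    with critical (cutoff) frequency fc and order n."""
    return 1.0 / (1.0 + (f / fc) ** (2 * n))

# At the critical frequency the response is always 0.5;
# higher orders give a sharper roll-off beyond fc.
print(butterworth(0.25, 0.25, 5))  # -> 0.5
print(butterworth(0.5, 0.25, 5))   # -> ~0.00098
```

Varying fc and n, as the study does, trades noise suppression against loss of resolution in the reconstructed slices.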
Zhang, Yudong; Peng, Bo; Wang, Shuihua; Liang, Yu-Xiang; Yang, Jiquan; So, Kwok-Fai; Yuan, Ti-Fei
2016-02-18
Microglia are the mononuclear phagocytes with various functions in the central nervous system, and the morphologies of microglia imply the different stages and functions. In optical nerve transection model of the retina, the retrograde degeneration of retinal ganglion cells induces microglial activations to a unique morphology termed rod microglia. A few studies described the rod microglia in the cortex and retina; however, the spatial characteristic of rod microglia is not fully understood. In this study, we built a mathematical model to characterize the spatial trait of rod microglia. In addition, we developed a Matlab-based image processing pipeline that consists of log enhancement, image segmentation, mathematical morphology based cell detection, area calculation and angle analysis. This computer program provides researchers a powerful tool to quickly analyze the spatial trait of rod microglia.
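The log-enhancement stage of such a pipeline is typically the mapping s = c * log(1 + r), rescaled to the display range so that dim structures (such as thin microglial processes) are stretched; a minimal sketch, since the abstract does not give the pipeline's actual parameters:

```python
import math

def log_enhance(img, max_val=255):
    """Logarithmic intensity mapping s = c * log(1 + r), with c
    chosen so that max_val maps to itself; dark regions are
    stretched, bright regions compressed."""
    c = max_val / math.log(1 + max_val)
    return [[c * math.log(1 + p) for p in row] for row in img]

row = log_enhance([[0, 10, 255]])[0]
print([round(v, 1) for v in row])  # -> [0.0, 110.3, 255.0]
```

Segmentation and morphological cell detection would then operate on the enhanced image.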
Vanbinst, K; De Smedt, B
2016-01-01
This contribution reviewed the available evidence on the domain-specific and domain-general neurocognitive determinants of children's arithmetic development, other than nonsymbolic numerical magnitude processing, which might have been overemphasized as a core factor of individual differences in mathematics and dyscalculia. We focused on symbolic numerical magnitude processing, working memory, and phonological processing, as these determinants have been most researched and their roles in arithmetic can be predicted against the background of brain imaging data. Our review indicates that symbolic numerical magnitude processing is a major determinant of individual differences in arithmetic. Working memory, particularly the central executive, also plays a role in learning arithmetic, but its influence appears to be dependent on the learning stage and experience of children. The available evidence on phonological processing suggests that it plays a more subtle role in children's acquisition of arithmetic facts. Future longitudinal studies should investigate these factors in concert to understand their relative contribution as well as their mediating and moderating roles in children's arithmetic development.
Spot restoration for GPR image post-processing
Paglieroni, David W; Beer, N. Reginald
2014-05-20
A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
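The peak-identification step can be sketched as a local-maximum search over the post-processed energy frame (the threshold and frame values here are invented):

```python
def find_peaks(frame, min_energy):
    """Flag strict local maxima in an energy frame; each peak above
    min_energy suggests the presence of a subsurface object."""
    h, w = len(frame), len(frame[0])
    peaks = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = frame[y][x]
            if v < min_energy:
                continue
            neigh = [frame[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0)]
            if all(v > n for n in neigh):
                peaks.append((y, x))
    return peaks

frame = [[0, 1, 0, 0],
         [1, 9, 1, 0],
         [0, 1, 0, 0],
         [0, 0, 0, 0]]
print(find_peaks(frame, 5))  # -> [(1, 1)]
```

A production detector would add clutter suppression and track peaks across successive frames as the antenna array advances.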
Vision-sensing image analysis for GTAW process control
Long, D.D.
1994-11-01
Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial (GTAW) electrode holder. Video data was obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.
ERIC Educational Resources Information Center
Barak, Moshe; Asad, Khaled
2012-01-01
Background: This research focused on the development, implementation and evaluation of a course on image-processing principles aimed at middle-school students. Purpose: The overarching purpose of the study was that of integrating the learning of subjects in science, technology, engineering and mathematics (STEM), and linking the learning of these…
Low cost 3D scanning process using digital image processing
NASA Astrophysics Data System (ADS)
Aguilar, David; Romero, Carlos; Martínez, Fernando
2017-02-01
This paper presents the design and construction of a low-cost 3D scanner, able to digitize solid objects through contactless data acquisition, using active object reflection. 3D scanners are used in different applications such as science, engineering, and entertainment; they are classified into contact and contactless scanners, where the latter are the most used but are expensive. This low-cost prototype performs a vertical scan of the object using a fixed camera and a mobile horizontal laser line, which is deformed depending on the 3-dimensional surface of the solid. Using digital image processing, the deformation detected by the camera is analysed, which allows determining the 3D coordinates by triangulation. The obtained information is processed by a Matlab script, which gives the user a point cloud corresponding to each horizontal scan. The obtained results show acceptable quality and significant detail of the digitized objects, making this prototype (built on a LEGO Mindstorms NXT kit) a versatile and cheap tool that can be used for many applications, mainly by engineering students.
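The stripe-displacement-to-depth step of such a scanner reduces to triangulation. A sketch under the assumption of a camera viewing perpendicular to the scan plane and a laser inclined at a known angle; the calibration values (mm per pixel, laser angle) are hypothetical, not the prototype's:

```python
import math

def height_from_shift(pixel_shift, mm_per_pixel, laser_angle_deg):
    """Laser-triangulation depth: a stripe displaced by pixel_shift
    pixels corresponds to a surface height of
    h = (pixel_shift * mm_per_pixel) / tan(laser_angle),
    assuming a camera perpendicular to the scan plane and a laser
    sheet inclined at laser_angle_deg to it."""
    return (pixel_shift * mm_per_pixel) / math.tan(
        math.radians(laser_angle_deg))

# 12-pixel stripe displacement, 0.1 mm/pixel, laser at 45 degrees:
print(round(height_from_shift(12, 0.1, 45.0), 3))  # -> 1.2 (mm)
```

Repeating this for every stripe pixel at every vertical position yields the point cloud the abstract describes.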
NASA Astrophysics Data System (ADS)
Garcia, Arnaud; Vachier, Corinne; Vallée, Jean-Paul
2008-02-01
Multivariate images are now commonly produced in many applications. While their processing is feasible thanks to computing power and new programming languages, theoretical difficulties remain to be solved. Standard image analysis operators are defined for scalars rather than for vectors, and their extension is not immediate. Several solutions exist, but their pertinence is strongly linked to context. In the present paper we address the segmentation of vector images while also including a priori knowledge. The proposed strategy combines a decision procedure (where points are classified) and an automatic segmentation scheme (where regions are properly extracted). The classification is made using a Bayesian classifier. The segmentation is computed via a region-growing method: the morphological watershed transform. A direct computation of the watershed transform on vector images is not possible, since vector sets are not ordered. So the Bayesian classifier is used to compute a scalar distance map where regions are enhanced or attenuated depending on their similarity to a reference shape; the distance used is the Mahalanobis distance. This combination allows transferring the decision function from pixels to regions and preserves the advantages of the original watershed transform defined for scalar functions. The algorithm is applied to segmenting colour images (with a priori knowledge) and medical images, especially dermatology images where skin lesions have to be detected.
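The scalar distance map mentioned above is built from the Mahalanobis distance d^2 = (x - m)^T C^-1 (x - m) to a reference class; a minimal 2-D sketch (the feature vectors and covariance are invented):

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance for 2-D feature vectors
    (e.g. a colour pair per pixel): d^2 = (x - m)^T C^-1 (x - m)."""
    dx = (x[0] - mean[0], x[1] - mean[1])
    a, b = cov[0]
    c, d = cov[1]
    det = a * d - b * c
    # closed-form inverse of the 2x2 covariance matrix
    inv = ((d / det, -b / det), (-c / det, a / det))
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# With the identity covariance this reduces to squared Euclidean distance.
print(mahalanobis2((3.0, 4.0), (0.0, 0.0),
                   ((1.0, 0.0), (0.0, 1.0))))  # -> 25.0
```

Evaluating this per pixel against the reference class's mean and covariance produces the scalar map on which the watershed can then flood.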
Boix, Macarena; Cantó, Begoña
2013-04-01
Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with it. For that, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate the noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet, namely the one that yields a segmentation with the largest area in the cell. We study different wavelet families and conclude that the wavelet db1 is the best; it can serve for future work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on selected blood cell images.
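Wavelet-thresholding denoising of the kind described can be sketched with an unnormalized one-level Haar transform (pairwise averages and differences) plus soft thresholding of the detail coefficients; db1 is exactly the Haar wavelet. The signal and threshold below are invented:

```python
def haar_denoise(signal, thresh):
    """One-level unnormalized Haar transform, soft-threshold the
    detail coefficients, then invert (signal length must be even)."""
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(half)]
    # soft thresholding: shrink toward zero, clip small details to zero
    soft = [max(abs(d) - thresh, 0.0) * (1 if d >= 0 else -1)
            for d in detail]
    out = []
    for a, d in zip(approx, soft):
        out += [a + d, a - d]
    return out

# Small oscillations (noise) are removed; the step edge survives.
print(haar_denoise([10, 11, 10, 11, 50, 51, 50, 51], 1.0))
# -> [10.5, 10.5, 10.5, 10.5, 50.5, 50.5, 50.5, 50.5]
```

For images, the same thresholding is applied to the 2-D detail subbands before the morphological segmentation stage.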
Processing of Image Data by Integrated Circuits
NASA Technical Reports Server (NTRS)
Armstrong, R. W.
1985-01-01
Sensors combined with logic and memory circuitry. Cross-correlation of two inputs accomplished by transversal filter. Position of image taken to point where image and template data yield maximum value correlation function. Circuit used for controlling robots, medical-image analysis, automatic vehicle guidance, and precise pointing of scientific cameras.
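The cross-correlation of image and template data that the transversal filter performs can be sketched in one dimension; the offset of the correlation maximum is taken as the match position, as in the circuit described. The scan-line values are invented:

```python
def best_offset(image, template):
    """Slide the template along a 1-D signal (one scan line) and
    return the offset where the correlation is maximal, as a
    transversal (FIR) matched filter does."""
    best, best_c = 0, float('-inf')
    for off in range(len(image) - len(template) + 1):
        c = sum(image[off + i] * template[i]
                for i in range(len(template)))
        if c > best_c:
            best, best_c = off, c
    return best

line = [0, 1, 0, 3, 7, 3, 0, 1, 0]
print(best_offset(line, [3, 7, 3]))  # -> 3
```

The integrated circuit computes the same sums in hardware, one tap per template sample.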
Sasanguie, Delphine; Göbel, Silke M; Moll, Kristina; Smets, Karolien; Reynvoet, Bert
2013-03-01
In this study, the performance of typically developing 6- to 8-year-old children on an approximate number discrimination task, a symbolic comparison task, and a symbolic and nonsymbolic number line estimation task was examined. For the first time, children's performances on these basic cognitive number processing tasks were explicitly contrasted to investigate which of them is the best predictor of their future mathematical abilities. Math achievement was measured with a timed arithmetic test and with a general curriculum-based math test to address the additional question of whether the predictive association between the basic numerical abilities and mathematics achievement is dependent on which math test is used. Results revealed that performance on both mathematics achievement tests was best predicted by how well children compared digits. In addition, an association between performance on the symbolic number line estimation task and math achievement scores for the general curriculum-based math test measuring a broader spectrum of skills was found. Together, these results emphasize the importance of learning experiences with symbols for later math abilities.
NASA Astrophysics Data System (ADS)
Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.
2016-10-01
The structure of multi-variant physical and mathematical models of a control system is proposed, together with its application to the adjustment of automatic control systems (ACS) for production facilities, using the example of a coal processing plant.
Mathematical modelling to predict the roughness average in micro milling process
NASA Astrophysics Data System (ADS)
Burlacu, C.; Iordan, O.
2016-08-01
Surface roughness plays a very important role in micro milling, as in any machining process, because it indicates the state of the machined surface. Many surface roughness parameters can be used to analyse a surface, but the most common is the average roughness (Ra). This paper presents the experimental results obtained by micro milling C45W steel and the way the Ra parameter depends on the working conditions. The chemical characteristics of the material were determined from a spectral analysis; the chemical composition was measured at one and at two points and reported graphically and in tables. A Surtronic 3+ profilometer was used to examine the surface roughness profiles; the effect of the independent parameters can thus be investigated and a proper relationship obtained between the Ra parameter and the process variables. The mathematical model was developed using the multiple regression method with four independent variables D, v, ap, fz; the analysis was done using the statistical software SPSS. Analysis of variance (ANOVA) and the F-test were used to justify the accuracy of the mathematical model. The multiple regression method was used to determine the correlation between a criterion variable and the predictor variables. The prediction model can be used for micro milling process optimization.
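The regression idea behind such an Ra model can be sketched with ordinary least squares; for brevity this uses a single predictor and invented (fz, Ra) pairs rather than the paper's four-variable model:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for Ra = b0 + b1 * x with one
    predictor (e.g. feed per tooth fz); the paper's model extends
    the same normal-equation idea to four predictors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

# Hypothetical (fz [mm/tooth], Ra [um]) pairs lying on Ra = 0.1 + 20*fz.
fz = [0.01, 0.02, 0.03, 0.04]
ra = [0.3, 0.5, 0.7, 0.9]
print(tuple(round(c, 3) for c in fit_linear(fz, ra)))  # -> (0.1, 20.0)
```

With real measurements the residuals would feed the ANOVA and F-test that the paper uses to judge the model.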
Głuszcz, Paweł; Petera, Jerzy; Ledakowicz, Stanisław
2011-03-01
The mathematical model of the integrated process of mercury contaminated wastewater bioremediation in a fixed-bed industrial bioreactor is presented. An activated carbon packing in the bioreactor plays the role of an adsorbent for ionic mercury and at the same time of a carrier material for immobilization of mercury-reducing bacteria. The model includes three basic stages of the bioremediation process: mass transfer in the liquid phase, adsorption of mercury onto activated carbon and ionic mercury bioreduction to Hg(0) by immobilized microorganisms. Model calculations were verified using experimental data obtained during the process of industrial wastewater bioremediation in the bioreactor of 1 m³ volume. It was found that the presented model reflects the properties of the real system quite well. Numerical simulation of the bioremediation process confirmed the experimentally observed positive effect of the integration of ionic mercury adsorption and bioreduction in one apparatus.
Viewpoints on Medical Image Processing: From Science to Application
Deserno (né Lehmann), Thomas M.; Handels, Heinz; Maier-Hein (né Fritzsche), Klaus H.; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas
2013-01-01
Medical image processing provides core innovation for medical imaging. This paper focuses on recent developments from science to applications, analyzing the past fifteen years of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of view: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as a field of rapid development with clear trends toward integrated applications in diagnostics, treatment planning and treatment. PMID:24078804
ERIC Educational Resources Information Center
Klein, Pnina S.; Adi-Japha, Esther; Hakak-Benizri, Simcha
2010-01-01
The objective of this study was to examine gender differences in the relations between verbal, spatial, mathematics, and teacher-child mathematics interaction variables. Kindergarten children (N = 80) were videotaped playing games that require mathematical reasoning in the presence of their teachers. The children's mathematics, spatial, and verbal…
An image processing system for digital chest X-ray images.
Cocklin, M; Gourlay, A; Jackson, P; Kaye, G; Miessler, M; Kerr, I; Lams, P
1984-01-01
This paper investigates the requirements for image processing of digital chest X-ray images. These images are conventionally recorded on film and are characterised by large size, wide dynamic range and high resolution. X-ray detection systems are now becoming available for capturing these images directly in photoelectronic-digital form. In this report, the hardware and software facilities required for handling these images are described. These facilities include high resolution digital image displays, programmable video look up tables, image stores for image capture and processing and a full range of software tools for image manipulation. Examples are given of the application of digital image processing techniques to this class of image.
NASA Astrophysics Data System (ADS)
Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun
2016-05-01
In this paper, an integral design that combines the optical system with image processing is introduced to obtain high-resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, which prevents efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented that combines the merit function of optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals needed for processing, while the ultimate goal is to obtain high-resolution images from the final system. To optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. A Wiener filter is then applied in the image simulation, with the mean squared error (MSE) taken as the evaluation criterion. The results show that, although the optical figures of merit of the resulting imaging system are not the best, it provides image signals that are better suited for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimum that traditional design methods miss. Especially when designing complex optical systems, this integral design strategy offers clear advantages in simplifying structure and reducing cost while simultaneously obtaining high-resolution images, and it has a promising perspective for industrial application.
Image processing software for imaging spectrometry data analysis
NASA Technical Reports Server (NTRS)
Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.
1988-01-01
Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.
Image-processing pipelines: applications in magnetic resonance histology
NASA Astrophysics Data System (ADS)
Johnson, G. Allan; Anderson, Robert J.; Cook, James J.; Long, Christopher; Badea, Alexandra
2016-03-01
Image processing has become ubiquitous in imaging research, so ubiquitous that it is easy to lose track of how diverse this processing has become. The Duke Center for In Vivo Microscopy has pioneered the development of Magnetic Resonance Histology (MRH), which generates large multidimensional data sets that can easily reach into the tens of gigabytes. A series of dedicated image-processing workstations and associated software have been assembled to optimize each step of acquisition, reconstruction, post-processing, registration, visualization, and dissemination. This talk will describe the image-processing pipelines from acquisition to dissemination that have become critical to our everyday work.
The mathematical modeling of rapid solidification processing. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Gutierrez-Miravete, E.
1986-01-01
The detailed formulation of and the results obtained from a continuum mechanics-based mathematical model of the planar flow melt spinning (PFMS) rapid solidification system are presented and discussed. The numerical algorithm proposed is capable of computing the cooling and freezing rates as well as the fluid flow and capillary phenomena which take place inside the molten puddle formed in the PFMS process. The FORTRAN listings of some of the most useful computer programs and a collection of appendices describing the basic equations used for the modeling are included.
Improving night sky star image processing algorithm for star sensors.
Arbabmir, Mohammad Vali; Mohammadi, Seyyed Mohammad; Salahshour, Sadegh; Somayehee, Farshad
2014-04-01
In this paper, the night sky star image processing algorithm, consisting of image preprocessing, star pattern recognition, and centroiding steps, is improved. It is shown that the proposed noise reduction approach preserves more of the necessary information than other frequently used approaches. It is also shown that, unlike commonly used techniques, the proposed thresholding method can properly perform image binarization, especially in images with uneven illumination. Moreover, a higher recognition rate and a lower average centroiding estimation error of near 0.045 over 400 simulated images, compared to other algorithms, demonstrate the high capability of the proposed night sky star image processing algorithm.
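The thresholding-plus-centroiding idea can be sketched as follows. This uses a plain global threshold and an intensity-weighted mean, not the paper's improved uneven-illumination method; the image data and threshold value are illustrative.

```python
# Minimal sketch of star centroiding: binarize above a global threshold,
# then take the intensity-weighted mean position of the retained pixels.
# A real star sensor would use a locally adaptive threshold instead.

def star_centroid(image, threshold):
    """Return (x, y) sub-pixel centroid of pixels above threshold, or None."""
    mass = mx = my = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v > threshold:
                mass += v
                mx += v * x
                my += v * y
    if mass == 0:
        return None  # no star-like pixels found
    return (mx / mass, my / mass)

img = [
    [0,  0,  0, 0],
    [0, 10, 20, 0],
    [0, 10, 20, 0],
    [0,  0,  0, 0],
]
print(star_centroid(img, threshold=5))  # sub-pixel position of the blob
```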
DTV color and image processing: past, present, and future
NASA Astrophysics Data System (ADS)
Kim, Chang-Yeong; Lee, SeongDeok; Park, Du-Sik; Kwak, Youngshin
2006-01-01
The image processor in digital TV has come to play an important role due to customers' growing desire for higher image quality. Customers want more vivid and natural images without any visual artifacts. Image processing techniques aim to meet customers' needs in spite of the physical limitations of the panel. In this paper, developments in image processing techniques for DTV, in conjunction with developments in display technologies at Samsung R&D, are reviewed. The introduced algorithms cover techniques required to solve problems caused by the characteristics of the panel itself, as well as techniques for enhancing the quality of input signals optimized for the panel and human visual characteristics.
Image processing techniques for digital orthophotoquad production
Hood, Joy J.; Ladner, L. J.; Champion, Richard A.
1989-01-01
Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends towards digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.
Cardiovascular Imaging and Image Processing: Theory and Practice - 1975
NASA Technical Reports Server (NTRS)
Harrison, Donald C. (Editor); Sandler, Harold (Editor); Miller, Harry A. (Editor); Hood, Manley J. (Editor); Purser, Paul E. (Editor); Schmidt, Gene (Editor)
1975-01-01
Ultrasonography was examined with regard to the developmental highlights and present applications of cardiac ultrasound. Doppler ultrasonic techniques and the technology of miniature acoustic element arrays were reported. X-ray angiography was discussed with special consideration of quantitative three-dimensional dynamic imaging of the structure and function of the cardiopulmonary and circulatory systems in all regions of the body. Nuclear cardiography and scintigraphy, three-dimensional imaging of the myocardium with isotopes, and the commercialization of the echocardioscope were studied.
An Image Processing Algorithm Based On FMAT
NASA Technical Reports Server (NTRS)
Wang, Lui; Pal, Sankar K.
1995-01-01
Information deleted in ways minimizing adverse effects on reconstructed images. New grey-scale generalization of medial axis transformation (MAT), called FMAT (short for Fuzzy MAT), proposed. Formulated by making natural extension to fuzzy-set theory of all definitions and conditions (e.g., characteristic function of disk, subset condition of disk, and redundancy checking) used in defining MAT of crisp set. Does not need image to have any kind of a priori segmentation, and allows medial axis (and skeleton) to be fuzzy subset of input image. Resulting FMAT (consisting of maximal fuzzy disks) capable of reconstructing original image exactly.
Survey on Neural Networks Used for Medical Image Processing.
Shi, Zhenghao; He, Lifeng; Suzuki, Kenji; Nakamura, Tsuyoshi; Itoh, Hidenori
2009-02-01
This paper presents a review of neural networks used in medical image processing. We classify neural networks by their processing goals and the nature of the medical images. The main contributions, advantages, and drawbacks of the methods are discussed, as are problematic issues of applying neural networks to medical image processing and an outlook on future research. With this survey, we try to answer two important questions: (1) What are the major applications of neural networks in medical image processing now and in the near future? (2) What are the major strengths and weaknesses of applying neural networks to medical image processing tasks? We believe this survey will be very helpful to researchers who are involved in medical image processing with neural network techniques.
Medical image processing on the GPU - past, present and future.
Eklund, Anders; Dufort, Paul; Forsberg, Daniel; LaConte, Stephen M
2013-12-01
Graphics processing units (GPUs) are used today in a wide range of applications, mainly because they can dramatically accelerate parallel computing, are affordable and energy efficient. In the field of medical imaging, GPUs are in some cases crucial for enabling practical use of computationally demanding algorithms. This review presents the past and present work on GPU accelerated medical image processing, and is meant to serve as an overview and introduction to existing GPU implementations. The review covers GPU acceleration of basic image processing operations (filtering, interpolation, histogram estimation and distance transforms), the most commonly used algorithms in medical imaging (image registration, image segmentation and image denoising) and algorithms that are specific to individual modalities (CT, PET, SPECT, MRI, fMRI, DTI, ultrasound, optical imaging and microscopy). The review ends by highlighting some future possibilities and challenges.
Design of a distributed CORBA based image processing server.
Giess, C; Evers, H; Heid, V; Meinzer, H P
2000-01-01
This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion is done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and due to the use of CORBA are accessible via intra-/internet.
Principles of cryo-EM single-particle image processing
Sigworth, Fred J.
2016-01-01
Single-particle reconstruction is the process by which 3D density maps are obtained from a set of low-dose cryo-EM images of individual macromolecules. This review considers the fundamental principles of this process and the steps in the overall workflow for single-particle image processing. Also considered are the limits that image signal-to-noise ratio places on resolution and the distinguishing of heterogeneous particle populations. PMID:26705325
On digital image processing technology and application in geometric measure
NASA Astrophysics Data System (ADS)
Yuan, Jiugen; Xing, Ruonan; Liao, Na
2014-04-01
Digital image processing is a technique that has emerged alongside the development of semiconductor integrated circuit technology and computer science since the 1960s. The article introduces the principle of digital image processing in measurement and compares it with the traditional optical measurement method. Taking geometric measurement as an example, it discusses the development tendency of digital image processing technology from the perspective of technology application.
NASA Astrophysics Data System (ADS)
Shrinivas Balraj, U.
2015-12-01
In this paper, mathematical modeling of three performance characteristics, namely material removal rate, surface roughness and electrode wear rate, in rotary electrical discharge machining of RENE80 nickel superalloy is carried out using a regression approach. The parameters considered are peak current, pulse-on time, pulse-off time and electrode rotational speed. The regression approach is very effective in mathematical modeling when the performance characteristic is influenced by many variables. Modeling these characteristics helps in predicting performance under a given combination of input process parameters. The adequacy of the developed models is tested by the correlation coefficient and Analysis of Variance. It is observed that the developed models are adequate in establishing the relationship between the input parameters and the performance characteristics. Further, multi-criteria optimization of the process parameter levels is carried out using the grey-based Taguchi method. The experiments are planned based on Taguchi's L9 orthogonal array. The proposed method employs a single grey relational grade as a performance index to obtain the optimum levels of the parameters. It is found that peak current and electrode rotational speed are the most influential on these characteristics. Confirmation experiments are conducted to validate the optimal parameters and reveal improvements in material removal rate, surface roughness and electrode wear rate of 13.84%, 12.91% and 19.42%, respectively.
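The single grey relational grade used as a performance index over several responses can be sketched as below. The run data, the choice of which responses to maximize or minimize, and the distinguishing coefficient zeta = 0.5 are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch of the grey relational grade: normalize each response,
# measure each run's deviation from the ideal sequence, and average the
# grey relational coefficients into one grade per run.

def grey_relational_grade(runs, maximize, zeta=0.5):
    """runs: list of response tuples; maximize[j]: larger-the-better flag."""
    cols = list(zip(*runs))
    norm_cols = []
    for j, col in enumerate(cols):
        lo, hi = min(col), max(col)
        if maximize[j]:
            norm_cols.append([(v - lo) / (hi - lo) for v in col])
        else:
            norm_cols.append([(hi - v) / (hi - lo) for v in col])
    grades = []
    for run in zip(*norm_cols):
        deltas = [1.0 - v for v in run]   # deviation from the ideal (all ones)
        # grey relational coefficient with delta_min = 0, delta_max = 1
        coeffs = [(zeta * 1.0) / (d + zeta * 1.0) for d in deltas]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Made-up runs: (material removal rate, surface roughness, electrode wear)
runs = [(30.0, 2.1, 0.9), (42.0, 2.8, 1.2), (55.0, 3.5, 0.7)]
grades = grey_relational_grade(runs, maximize=[True, False, False])
print(grades)  # the highest grade marks the best compromise run
```

The run with the largest grade would be taken as the best parameter combination across all three responses at once, which is how the multi-criteria problem collapses to a single index.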
Gomez, Alice; Piazza, Manuela; Jobert, Antoinette; Dehaene-Lambertz, Ghislaine; Dehaene, Stanislas; Huron, Caroline
2015-01-01
At school, children with Developmental Coordination Disorder (DCD) struggle with mathematics. However, little attention has been paid to their numerical cognition abilities. The goal of this study was to better understand the cognitive basis for mathematical difficulties in children with DCD. Twenty 7- to 10-year-old children with DCD were compared to twenty age-matched typically developing children using dot and digit comparison tasks to assess symbolic and nonsymbolic number processing, and in a task of single-digit additions. Results showed that children with DCD had lower performance in nonsymbolic and symbolic number comparison tasks than typically developing children. They were also slower to solve simple addition problems. Moreover, correlational analyses showed that children with DCD who experienced greater impairments in the nonsymbolic task also performed more poorly in the symbolic tasks. These findings suggest that DCD impairs both nonsymbolic and symbolic number processing. A systematic assessment of numerical cognition in children with DCD could provide a more comprehensive picture of their deficits and help in proposing specific remediation.
Hoskinson, Anne-Marie
2010-01-01
Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments.
NASA Astrophysics Data System (ADS)
Sadrtdinov, Almaz R.; Esmagilova, Liliya M.; Saldaev, Vladimir A.; Sattarova, Zulfiya G.; Mokhovikov, Alexey A.
2016-08-01
The paper describes the process of thermochemical conversion of wood waste into dimethyl ether. The physical picture of the waste wood recycling process was compiled and studied, and on its basis a mathematical model was developed in the form of differential and algebraic equations with initial and boundary conditions. The mathematical model makes it possible to determine the optimum operating parameters of the synthesis gas production process suitable for the catalytic synthesis of dimethyl ether, and to calculate the basic design parameters of the equipment flowsheet.
A color image processing pipeline for digital microscope
NASA Astrophysics Data System (ADS)
Liu, Yan; Liu, Peng; Zhuang, Zhefeng; Chen, Enguo; Yu, Feihong
2012-10-01
Digital microscopes have found wide application in the fields of biology, medicine et al. A digital microscope differs from a traditional optical microscope in that there is no need to observe the sample through an eyepiece directly, because the optical image is projected directly onto the CCD/CMOS camera. However, because of the imaging differences between the human eye and the sensor, a color image processing pipeline is needed for the digital microscope electronic eyepiece to obtain a fine image. The color image pipeline for a digital microscope, comprising the procedures that convert the RAW image data captured by the sensor into a real color image, is of great importance to the quality of the microscopic image. The color pipeline for a digital microscope differs from that of digital still cameras and video cameras because of the specific requirements of microscopic images, which should have high dynamic range, color fidelity to the observed objects, and support for a variety of image post-processing. In this paper, a new color image processing pipeline is proposed to satisfy the requirements of digital microscope images. The algorithm of each step in the color image processing pipeline is designed and optimized with the purpose of obtaining high-quality images and accommodating diverse user preferences. With the proposed pipeline implemented on the digital microscope platform, the output color images meet the various image analysis requirements of the medicine and biology fields very well. The major steps of the proposed color imaging pipeline are: black level adjustment, defective pixel removal, noise reduction, linearization, white balance, RGB color correction, tone scale correction and gamma correction.
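Three of the listed pipeline stages (black level adjustment, white balance, gamma correction) can be sketched on a single RGB pixel as below. The black level, per-channel gains and gamma value are illustrative assumptions, not the paper's tuned parameters, and the remaining stages (defect removal, noise reduction, color correction, tone scale) are omitted.

```python
# Hedged sketch of part of a color imaging pipeline applied to one pixel.
# All numeric parameters are made-up illustrations.

def process_pixel(rgb, black=16, wb_gains=(1.8, 1.0, 1.4), gamma=2.2, white=255):
    """Black level -> white balance -> gamma, per channel, clamped to 0..255."""
    out = []
    for v, gain in zip(rgb, wb_gains):
        v = max(v - black, 0)             # black level adjustment
        v = min(v * gain, white)          # per-channel white balance gain
        v = (v / white) ** (1.0 / gamma)  # gamma correction to display space
        out.append(round(v * white))
    return tuple(out)

print(process_pixel((16, 140, 90)))  # raw sensor triple -> display RGB
```

In a real pipeline these steps run as vectorized operations over the whole RAW frame, in this same order, because gamma must come after the linear-domain corrections.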
Experiences with digital processing of images at INPE
NASA Technical Reports Server (NTRS)
Mascarenhas, N. D. A. (Principal Investigator)
1984-01-01
Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.
Quantitative high spatiotemporal imaging of biological processes
NASA Astrophysics Data System (ADS)
Borbely, Joe; Otterstrom, Jason; Mohan, Nitin; Manzo, Carlo; Lakadamyali, Melike
2015-08-01
Super-resolution microscopy has revolutionized fluorescence imaging providing access to length scales that are much below the diffraction limit. The super-resolution methods have the potential for novel discoveries in biology. However, certain technical limitations must be overcome for this potential to be fulfilled. One of the main challenges is the use of super-resolution to study dynamic events in living cells. In addition, the ability to extract quantitative information from the super-resolution images is confounded by the complex photophysics that the fluorescent probes exhibit during the imaging. Here, we will review recent developments we have been implementing to overcome these challenges and introduce new steps in automated data acquisition towards high-throughput imaging.
NASA Technical Reports Server (NTRS)
Masuoka, E.; Rose, J.; Quattromani, M.
1981-01-01
Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.
Research on non-destructive testing method of silkworm cocoons based on image processing technology
NASA Astrophysics Data System (ADS)
Gan, Yong; Kong, Qing-hua; Wei, Li-fu
2008-03-01
The major subject studied in this dissertation is a non-destructive method for testing silkworm cocoon quality, based on digital image processing and photoelectric technology. Through image collection and the analysis, processing and calculation of data from the tested silkworm cocoons with this non-destructive testing technology, the system automatically computes all items of the classification indexes. Finally, the classification result and the purchase price of the silkworm cocoons can be concluded. According to the domestic classification standard for silkworm cocoons, the author investigates the various testing methods for silkworm cocoons in use or under exploration at present, and devises a non-destructive testing scheme for silkworm cocoons based on digital image processing and photoelectric technology. The project design of the experiment is discussed, and the precision of all the instruments is demonstrated. Several mathematical models are established, compared with each other and analyzed for precision with database technology, in order to find the best mathematical model for computing the weight of the dried silkworm cocoon shells. The classification methods for all the complementary items are designed accurately. The testing method has a small error and reaches an advanced level among present domestic non-destructive testing technologies for silkworm cocoons.
The Image Processing of Droplet for Evaporation Experiment in SJ-10
NASA Astrophysics Data System (ADS)
Xue, Changbin; Feng, Yanhui; Yu, Qiang
2017-03-01
We have completed the image processing for a droplet evaporation experiment using Young-Laplace fitting, exponent fitting, polynomial fitting and ellipse fitting, which can be applied to multiple droplet shapes. The droplet evaporation experiment was an important science experiment in SJ-10. To obtain the change process of the physical parameters, such as the contact edges and the droplet evaporation rate, we acquired the contour edge image of the droplet and used mathematical methods for the fitting analysis. The accuracy of the physical parameters depends on the accuracy of the mathematical fitting. The original Young-Laplace fitting method could not process all the images of evaporation and the liquid interface from the SJ-10 space experiment facility, especially the smaller droplet images. We obtain more accurate contour fits and results using the new method described in this article. This article proposes a complete solution, including edge detection and contour fitting. In edge detection, the Canny detector is applied to extract the droplet edge. In contour fitting, Young-Laplace fitting, exponent fitting, polynomial fitting and ellipse fitting are designed to fit the contours of droplets, which makes the solution applicable to all the droplets in SJ-10.
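The polynomial-fitting option among the listed contour fits can be sketched as a least-squares quadratic through sampled edge points near the droplet apex. The sample points here are synthetic; in the real pipeline they would come from the Canny edge map, and the small linear solver stands in for a library routine.

```python
# Hedged sketch: least-squares quadratic fit y = a*x^2 + b*x + c to
# droplet-edge points, via normal equations and a tiny Gaussian solver.

def solve(a_mat, b_vec):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(a_mat)
    m = [row[:] + [bv] for row, bv in zip(a_mat, b_vec)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n + 1):
                m[r][c] -= f * m[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (m[i][n] - sum(m[i][c] * x[c] for c in range(i + 1, n))) / m[i][i]
    return x

def quad_fit(xs, ys):
    """Least-squares coefficients (a, b, c) for y = a*x^2 + b*x + c."""
    cols = [[x * x for x in xs], list(xs), [1.0] * len(xs)]
    ata = [[sum(u * v for u, v in zip(ci, cj)) for cj in cols] for ci in cols]
    atb = [sum(u * y for u, y in zip(ci, ys)) for ci in cols]
    return solve(ata, atb)

xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [0.5 * x * x + 3.0 for x in xs]  # synthetic points on y = 0.5*x^2 + 3
a, b, c = quad_fit(xs, ys)
print(a, b, c)  # recovers approximately 0.5, 0.0, 3.0
```

The same normal-equations machinery extends to the general conic needed for ellipse fitting, at the cost of a larger system and a constraint on the conic coefficients.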
A mathematical model for the admission process in intensive care units
NASA Astrophysics Data System (ADS)
Rokni Lamooki, Gholam Reza; Maleki, Farzaneh; Hajihosseini, Amirhossein
2014-01-01
A mathematical model is given for the admission process in Intensive Care Units (ICUs). It is shown that the model exhibits bistability for certain values of its parameters. In particular, it is observed that in a two-dimensional parameter space, two saddle-node bifurcation curves terminate at a single point of the cusp bifurcation, creating an enclosed region in which the model has one unstable and two stable states. It is shown that in the presence of bistability, variations in the value of parameters may lead to undesired outcomes in the admission process as the value of state variables abruptly changes. Using numerical simulations, it is also discussed how such outcomes can be avoided by appropriately adjusting the parameter values.
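The bistability described above can be illustrated numerically with the cusp normal form, a generic sketch rather than the paper's ICU model: for parameter values inside the cusp region, dx/dt = r + k*x - x**3 has two stable equilibria and one unstable one, so the state the system settles into depends on where it starts.

```python
# Illustrative sketch (not the paper's model): forward-Euler integration of
# the cusp normal form, showing two different steady states from two
# nearby initial conditions. Parameter values are assumptions.

def settle(x0, r=0.0, k=1.0, dt=0.01, steps=20000):
    """Integrate dx/dt = r + k*x - x**3 and return the long-run state."""
    x = x0
    for _ in range(steps):
        x += dt * (r + k * x - x ** 3)
    return x

lo = settle(-0.1)  # converges to the lower stable state, near -1
hi = settle(+0.1)  # converges to the upper stable state, near +1
print(round(lo, 3), round(hi, 3))
```

Sweeping `r` across the saddle-node values makes one of the two states disappear, which is the abrupt jump in outcomes that the abstract warns about.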
Ahn, Chi K; Lee, Min W; Lee, Dae S; Woo, Seung H; Park, Jong M
2008-12-15
The performances of various soil washing processes, including surfactant recovery by selective adsorption, were evaluated using a mathematical model for the partitioning of a target compound and surfactant in a water/sorbent system. Phenanthrene was selected as a representative hazardous organic compound and Triton X-100 as the surfactant. Two activated carbons that differed in size (Darco 20-40 mesh and >100 mesh) were used in adsorption experiments. The adsorption isotherms of the chemicals were used in model simulations of various washing scenarios. Optimal process conditions were suggested to minimize the dosage of activated carbon and surfactant and the number of washings. We estimated that the surfactant requirement could be reduced to 33% of the original (from 265 to 86.6 g) with a reuse step using 9.1 g of activated carbon (>100 mesh) to achieve 90% removal of phenanthrene (initially 100 mg kg-soil(-1)) with a water/soil ratio of 10.
NASA Astrophysics Data System (ADS)
Canelas, Ricardo; Heleno, Sandra; Pestana, Rita; Ferreira, Rui M. L.
2014-05-01
The objective of the present work is to devise a methodology to validate 2DH shallow-water models suitable for simulating flow hydrodynamics and channel morphology. For this purpose, a 2DH mathematical model, assembled at CEHIDRO, IST, is employed to model Tagus river floods over a 70 km reach, and Synthetic Aperture Radar (SAR) images are collected to retrieve planar inundation extents. The model is suited for highly unsteady discontinuous flows over complex, time-evolving geometries, employing a finite-volume discretization scheme based on a flux-splitting technique that incorporates a revised version of the Roe Riemann solver. Novel closure terms for the non-equilibrium sediment transport model are included. New boundary conditions are employed, based on the Riemann variables associated with the outgoing characteristic fields, coping with the provided hydrographs in a mathematically coherent manner. A high-resolution Digital Elevation Model (DEM) is used and levee structures are treated as fully erodible elements. Spatially heterogeneous roughness characteristics are derived from land-use databases such as CORINE Land Cover 2006. SAR satellite imagery of the floods is available and is used to validate the simulation results, with particular emphasis on the 2000/2001 flood. The delimited areas from the satellite and the simulations are superimposed. The quality of the adjustment depends on the calibration of the roughness coefficients and on the spatial discretization of small structures, with lengths on the order of the grid spacing. Flow depths and registered discharges are recovered from the simulation and compared with data from a measuring station in the domain, with the comparison revealing remarkably high accuracy, both in amplitude and in phase. Further inclusion of topographical detail should improve the comparison of flood extents against the satellite data. The validated model was then employed to simulate 100-year floods in the same reach.
Bauer, Rüdiger; Hekmat, Dariusch
2006-01-01
For the mathematical description of the semicontinuous two-stage repeated-fed-batch fermentation of dihydroxyacetone (DHA), a novel segregated model incorporating transient growth rates was developed. The fermentation process was carried out in two stages. A viable, not irreversibly product-inhibited culture was maintained in the first reactor stage until a predetermined DHA threshold value was reached. In the second reactor stage, high final product concentrations of up to 220 g L(-1) were reached while the culture was irreversibly product-inhibited. The experimentally observed changes of the physiological state of the culture due to product inhibition were taken into account by introducing a segregation into the mathematical model. It was shown that the state of the cells was dependent on the current environment and on the previous history. This phenomenon was considered in the model by utilizing delay time equations for the specific rates of growth on the primary and the secondary substrate. A comparison with reproducible measurements gave a good correlation between computation and experiment. The mathematical model was validated using independent own experimental data. A comparison with a stationary and nonsegregated model demonstrated the essential improvements of the novel model. It was deduced from the model calculations that high product formation rates of 3.3-3.5 g L(-1) h(-1) as well as high final DHA concentrations of 196-215 g L(-1) can be obtained with a residual broth volume in the first reactor stage of 2% and a DHA threshold value in the range of 100-110 g L(-1).
Breast image pre-processing for mammographic tissue segmentation.
He, Wenda; Hogg, Peter; Juette, Arne; Denton, Erika R E; Zwiggelaar, Reyer
2015-12-01
During mammographic image acquisition, a compression paddle is used to even the breast thickness in order to obtain optimal image quality. Clinical observation has indicated that some mammograms may exhibit abrupt intensity change and low visibility of tissue structures in the breast peripheral areas. Such appearance discrepancies can affect image interpretation and may not be desirable for computer aided mammography, leading to incorrect diagnosis and/or detection which can have a negative impact on sensitivity and specificity of screening mammography. This paper describes a novel mammographic image pre-processing method to improve image quality for analysis. An image selection process is incorporated to better target problematic images. The processed images show improved mammographic appearances not only in the breast periphery but also across the mammograms. Mammographic segmentation and risk/density classification were performed to facilitate a quantitative and qualitative evaluation. When using the processed images, the results indicated more anatomically correct segmentation in tissue specific areas, and subsequently better classification accuracies were achieved. Visual assessments were conducted in a clinical environment to determine the quality of the processed images and the resultant segmentation. The developed method has shown promising results. It is expected to be useful in early breast cancer detection, risk-stratified screening, and aiding radiologists in the process of decision making prior to surgery and/or treatment.
Using quantum filters to process images of diffuse axonal injury
NASA Astrophysics Data System (ADS)
Pineda Osorio, Mateo
2014-06-01
Some images corresponding to a diffuse axonal injury (DAI) are processed using several quantum filters, such as Hermite, Weibull, and Morse. Diffuse axonal injury is a particular, common, and severe case of traumatic brain injury (TBI). DAI involves global damage to brain tissue on a microscopic scale and causes serious neurologic abnormalities. New imaging techniques provide excellent images showing cellular damage related to DAI. These images can be processed with quantum filters, which achieve high resolution of dendritic and axonal structures in both normal and pathological states. Using the Laplacian operators from the new quantum filters, excellent edge detectors for neurofiber resolution are obtained. Quantum processing of the DAI images is performed using computer algebra, specifically Maple. The construction of quantum filter plugins, which could be incorporated into the ImageJ software package to simplify its use for medical personnel, is proposed as a future line of research.
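Laplacian-based edge detection of the kind referred to above can be sketched generically with a plain 3x3 discrete Laplacian on a toy image (this illustrates only the Laplacian edge response, not the quantum filters themselves):

```python
def laplacian(img):
    """3x3 discrete Laplacian: strong response at intensity edges."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                         - 4 * img[y][x])
    return out

# Toy image: dark left half, bright right half -- an ideal vertical edge.
img = [[0]*4 + [10]*4 for _ in range(5)]
edges = laplacian(img)
print(edges[2])  # [0, 0, 0, 10, -10, 0, 0, 0] -- the zero crossing marks the edge
```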
The Development of Sun-Tracking System Using Image Processing
Lee, Cheng-Dar; Huang, Hong-Cheng; Yeh, Hong-Yih
2013-01-01
This article presents the development of an image-based sun position sensor and an algorithm for aiming at the Sun precisely by using image processing. Four-quadrant light sensors and bar-shadow photo sensors have been used to detect the Sun's position in the past. Nevertheless, neither of them can maintain high accuracy under low irradiation conditions. An image-based Sun position sensor with image processing can address this drawback. To verify the performance of the Sun-tracking system, comprising an image-based Sun position sensor and a tracking controller with an embedded image processing algorithm, we established a Sun image tracking platform and performed testing in the laboratory; the results show that the proposed Sun-tracking system is able to overcome the problem of unstable tracking in cloudy weather and achieves a tracking accuracy of 0.04°. PMID:23615582
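A minimal sketch of image-based Sun sensing: locate the intensity-weighted centroid of the solar disc in a frame, then convert its pixel offset from the optical axis into an angular pointing error. The plate scale in degrees per pixel is a hypothetical placeholder:

```python
def sun_centroid(img):
    """Intensity-weighted centroid (col, row) of a grayscale image (list of rows)."""
    total = sx = sy = 0.0
    for r, row in enumerate(img):
        for c, v in enumerate(row):
            total += v
            sx += c * v
            sy += r * v
    return sx / total, sy / total

def pointing_error_deg(centroid, center, deg_per_px):
    """Angular offset of the sun image from the optical axis."""
    dx = centroid[0] - center[0]
    dy = centroid[1] - center[1]
    return (dx**2 + dy**2) ** 0.5 * deg_per_px

# Toy 5x5 frame with a bright spot at column 3, row 2.
img = [[0] * 5 for _ in range(5)]
img[2][3] = 100
cx, cy = sun_centroid(img)
print(cx, cy)                                           # 3.0 2.0
print(pointing_error_deg((cx, cy), (2.0, 2.0), 0.01))   # 0.01 (degrees)
```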
Visualization and processing of images in nano-resolution
NASA Astrophysics Data System (ADS)
Vozenilek, Vit; Pour, Tomas
2017-02-01
The paper aims to apply the methods of image processing that are widely used in Earth remote sensing to the processing and visualization of images in nano-resolution, because most of these images are currently analyzed only by an expert researcher without a proper statistical background. The nano-resolution level may range from a resolution in picometres up to the resolution of a light microscope, about 200 nanometers. Images in nano-resolution play an essential role in physics, medicine, and chemistry. Three case studies demonstrate different image visualization and image analysis approaches for different scales at the nano-resolution level. The results of the case studies prove the suitability and applicability of Earth remote sensing methods for image visualization and processing at the nano-resolution level. It even opens new dimensions for spatial analysis at such an extreme spatial detail.
Ojanguren, Aitziber; Ayo, Josune
2013-06-20
Industrial processes that apply high temperatures in the presence of oxygen may compromise the stability of conjugated linoleic acid (CLA) bioactive isomers. Statistical techniques are used in this study to model and predict, on a laboratory scale, the oxidative behaviour of oil with high CLA content, controlling the limiting factors of food processing. This modelling aims to estimate the impact of an industrial frying process (140 °C, 7 L/h air) on the oxidation of CLA oil used as a frying oil instead of sunflower oil. A factorial design was constructed within a temperature (80-200 °C) and air flow (7-20 L/h) range. The oil stability index (Rancimat method) was used as a measure of oxidation. A three-level full factorial design was used to obtain a quadratic model for CLA oil, enabling the oxidative behaviour to be predicted under predetermined process conditions (temperature and air flow). It is deduced that the temperatures applied in food processes affect the oxidation of CLA to a greater extent than air flow. As a result, CLA oil is estimated to be less resistant to industrial frying than sunflower oil. In conclusion, the mathematical model allows an appropriate industrial food process to be selected that avoids oxidation of the bioactive isomers of CLA, ensuring its functionality in novel applications.
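The three-level full factorial fit described above can be sketched as an ordinary least-squares quadratic response surface in temperature and air flow. The stability-index values below are synthetic (generated from an assumed quadratic law), not the study's measurements:

```python
import numpy as np

def fit_quadratic_surface(T, F, y):
    """Least-squares fit of y = b0 + b1*T + b2*F + b3*T^2 + b4*F^2 + b5*T*F."""
    X = np.column_stack([np.ones_like(T), T, F, T**2, F**2, T * F])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def predict(beta, T, F):
    return (beta[0] + beta[1]*T + beta[2]*F
            + beta[3]*T**2 + beta[4]*F**2 + beta[5]*T*F)

# Synthetic 3-level full factorial: temperature 80/140/200 C, air flow 7/13.5/20 L/h.
T, F = np.meshgrid([80.0, 140.0, 200.0], [7.0, 13.5, 20.0])
T, F = T.ravel(), F.ravel()
# Hypothetical stability index that falls faster with temperature than with air flow.
y = 20.0 - 0.08 * T - 0.2 * F + 1e-4 * T**2

beta = fit_quadratic_surface(T, F, y)
print(np.round(predict(beta, 140.0, 7.0), 3))   # 9.36 at the frying conditions
```

Because the synthetic data are exactly quadratic, the fitted surface reproduces them; with real Rancimat measurements the residuals would quantify lack of fit.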
NASA Astrophysics Data System (ADS)
Rauh, Cornelia; Delgado, Antonio
2011-03-01
High pressures up to several hundreds of MPa are utilised in a wide range of applications in chemical engineering, bioengineering, and food engineering, aiming at selective control of (bio-)chemical reactions. Non-uniformity of process conditions may threaten the safety and quality of the resulting products as the process conditions such as pressure, temperature, and treatment history are crucial for the course of (bio-)chemical reactions. Therefore, thermofluid dynamical phenomena during the high-pressure process have to be examined, and tools to predict process uniformity and to optimise the processes have to be developed. Recently, mathematical models and numerical simulations of laboratory and industrial scale high-pressure processes have been set up and validated by experimental results. This contribution deals with the assumption of the modelling that relevant (bio-)chemical compounds are ideally dissolved or diluted particles in a continuum flow. By considering the definition of the continuum hypothesis regarding the minimum particle population in a distinct volume, limitations of this modelling and simulation are addressed.
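The continuum-hypothesis limitation discussed above comes down to counting how many molecules of a dissolved compound actually sit inside a given control volume. A quick sketch, using a hypothetical 1 micromolar compound and a 1 micrometre control volume:

```python
AVOGADRO = 6.02214076e23  # molecules per mole

def molecules_in_volume(conc_mol_per_l, edge_m):
    """Number of solute molecules in a cube of side edge_m at the given
    molar concentration -- a continuum-hypothesis sanity check."""
    volume_l = (edge_m ** 3) * 1000.0        # 1 m^3 = 1000 L
    return conc_mol_per_l * volume_l * AVOGADRO

# A 1 micromolar compound in a (1 micrometre)^3 control volume:
n = molecules_in_volume(1e-6, 1e-6)
print(round(n, 1))   # ~602 molecules: marginal for a continuum treatment
```

With only a few hundred molecules per control volume, treating the compound as an ideally diluted continuum field becomes questionable, which is exactly the limitation the contribution addresses.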
Wavelet-Based Signal and Image Processing for Target Recognition
2002-01-01
in target recognition applications. Classical spatial and frequency domain image processing algorithms were generalized to process discrete wavelet transform (DWT) data. Results include adaptation of classical filtering, smoothing and interpolation techniques to DWT. From 2003 the research
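The kind of wavelet-domain smoothing referred to above can be illustrated with a one-level Haar DWT in which the detail band is zeroed before reconstruction. This is a generic sketch, not the report's filters:

```python
def haar_dwt_1d(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficients; length must be even."""
    s = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt_1d(approx, detail):
    """Inverse of haar_dwt_1d."""
    s = 2 ** -0.5
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

x = [4.0, 2.0, 5.0, 5.0]
a, d = haar_dwt_1d(x)
# Smoothing in the wavelet domain: zero the detail band, then reconstruct.
smoothed = haar_idwt_1d(a, [0.0] * len(d))
print(smoothed)   # approximately [3.0, 3.0, 5.0, 5.0]
```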
Image data processing of earth resources management. [technology transfer
NASA Technical Reports Server (NTRS)
Desio, A. W.
1974-01-01
Various image processing and information extraction systems are described along with the design and operation of an interactive multispectral information system, IMAGE 100. Analyses of ERTS data, using IMAGE 100, over a number of U.S. sites are presented. The following analyses are included: (1) investigations of crop inventory and management using remote sensing; and (2) land cover classification for environmental impact assessments. Results show that useful information is provided by IMAGE 100 analyses of ERTS data in digital form.
IPL Processing of the Viking Orbiter Images of Mars
NASA Technical Reports Server (NTRS)
Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.
1977-01-01
The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.
Pyramidal Image-Processing Code For Hexagonal Grid
NASA Technical Reports Server (NTRS)
Watson, Andrew B.; Ahumada, Albert J., Jr.
1990-01-01
Algorithm based on processing of information on intensities of picture elements arranged in regular hexagonal grid. Called "image pyramid" because image information at each processing level arranged in hexagonal grid having one-seventh number of picture elements of next lower processing level, each picture element derived from hexagonal set of seven nearest-neighbor picture elements in next lower level. At lowest level, elements of original image at fine resolution. Designed to have some properties of image-coding scheme of primate visual cortex.
Stochastic Process Underlying Emergent Recognition of Visual Objects Hidden in Degraded Images
Murata, Tsutomu; Hamada, Takashi; Shimokawa, Tetsuya; Tanifuji, Manabu; Yanagida, Toshio
2014-01-01
When a degraded two-tone image such as a “Mooney” image is seen for the first time, it is unrecognizable in the initial seconds. The recognition of such an image is facilitated by giving prior information on the object, which is known as top-down facilitation and has been intensively studied. Even in the absence of any prior information, however, we experience the sudden emergence of a salient object after continued observation of the image, a process that remains poorly understood. This emergent recognition is characterized by a comparatively long reaction time ranging from seconds to tens of seconds. In this study, to explore this time-consuming process of emergent recognition, we investigated the properties of the reaction times for recognition of degraded images of various objects. The results show that the time-consuming component of the reaction times follows a specific exponential function related to the level of image degradation and the subject's capability. Because an exponential time is generally required for multiple stochastic events to co-occur, we constructed a descriptive mathematical model inspired by the neurophysiological idea of combination coding of visual objects. Our model assumed that the coincidence of stochastic events complements the information loss of a degraded image, leading to the recognition of its hidden object, and it successfully explained the experimental results. Furthermore, to see whether the present results are specific to the task of emergent recognition, we also conducted a comparison experiment with a perceptual decision-making task on degraded images, which is well known to be modeled by the stochastic diffusion process. The results indicate that the exponential dependence on the level of image degradation is specific to emergent recognition. The present study suggests that emergent recognition is caused by an underlying stochastic process based on the coincidence of multiple stochastic events.
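The exponential dependence of the waiting time on the number of required coincidences can be reproduced with a toy simulation: if k independent events each occur with per-step probability p, the mean time until all k co-occur grows as (1/p)^k. The parameters below are illustrative only, not the study's fitted values:

```python
import random

def time_to_coincidence(k, p, rng):
    """Steps until k independent events (each with per-step probability p)
    all occur in the same step -- a toy analogue of emergent recognition."""
    t = 0
    while True:
        t += 1
        if all(rng.random() < p for _ in range(k)):
            return t

rng = random.Random(42)
p = 0.5
means = {}
for k in (2, 4, 6):   # a more degraded image -> more events must coincide
    trials = [time_to_coincidence(k, p, rng) for _ in range(2000)]
    means[k] = sum(trials) / len(trials)
    print(k, round(means[k], 1))   # theoretical means: (1/p)**k = 4, 16, 64
```

The simulated mean waiting times grow exponentially with k, mirroring the exponential reaction-time component reported above.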
Improving Primary School Prospective Teachers' Understanding of the Mathematics Modeling Process
ERIC Educational Resources Information Center
Bal, Aytgen Pinar; Doganay, Ahmet
2014-01-01
The development of mathematical thinking plays an important role on the solution of problems faced in daily life. Determining the relevant variables and necessary procedural steps in order to solve problems constitutes the essence of mathematical thinking. Mathematical modeling provides an opportunity for explaining thoughts in real life by making…
ERIC Educational Resources Information Center
Barlow, Angela T.; Huang, Rongjin; Law, Huk-Yuen; Chan, Yip Cheung; Zhang, Qiaoping; Baxter, Wesley A.; Gaddy, Angeline K.
2016-01-01
Mathematical disagreements occur when students challenge each other's ideas related to a mathematical concept. In this research, we examined Hong Kong and U.S. elementary teachers' perceptions of mathematical disagreements and their resolutions using a video-stimulated survey. Participants were directed to give particular attention to the…
ERIC Educational Resources Information Center
Gerken, Katheryn
The manual is intended to help school psychologists determine strengths and weaknesses, establish goals, and prescribe interventions for students with difficulties in mathematics, spelling, and written language. Research on sources of mathematical difficulties, theoretical bases, and hierarchies in mathematics is reviewed, procedures for formal…
ERIC Educational Resources Information Center
Seltman, Muriel; Seltman, P. E. J.
1978-01-01
The authors stress the importance of bringing together the causal logic of history and the formal logic of mathematics in order to humanize mathematics and make it more accessible. An example of such treatment is given in a discussion of the centrality of Euclid and the Euclidean system to mathematics development. (MN)
ERIC Educational Resources Information Center
Superfine, Alison Castro; Kelso, Catherine Randall; Beal, Susan
2010-01-01
The implementation of "research-based" mathematics curricula is increasingly becoming a central element of mathematics education reform policies. Given the recent focus on grounding mathematics curriculum policies in research, it is important to understand precisely what it means for a curriculum to be research-based. Using the Curriculum Research…
Protocols for Image Processing based Underwater Inspection of Infrastructure Elements
NASA Astrophysics Data System (ADS)
O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; Pakrashi, Vikram
2015-07-01
Image processing can be an important tool for inspecting underwater infrastructure elements like bridge piers and pile wharves. Underwater inspection often relies on visual descriptions by divers who are not necessarily trained in the specifics of structural degradation, and the information may often be vague, prone to error, or open to significant variation of interpretation. Underwater vehicles, on the other hand, can be quite expensive to deploy for such inspections. Additionally, there is now significant encouragement globally towards the deployment of more offshore wind turbines and wave energy devices, and the requirement for underwater inspection can be expected to increase significantly in the coming years. While the merit of image-processing-based assessment of the condition of underwater structures is understood to a certain degree, there is no existing protocol for such image-based methods. This paper discusses and describes an image processing protocol for the underwater inspection of structures. A stereo-imaging method is considered in this regard, and protocols are suggested for image storage, imaging, diving, and inspection. A combined underwater imaging protocol is finally presented which can be used in a variety of situations within a range of image scenes and environmental conditions affecting imaging. An example of detecting marine growth on a structure in Cork Harbour, Ireland, is presented.
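Underlying any stereo-imaging inspection protocol is the basic disparity-to-depth relation, which also lets pixel extents be converted to physical defect sizes. A minimal sketch with hypothetical calibration values (focal length, baseline, disparity):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (m) of a matched point from a calibrated, rectified stereo pair.
    focal_px: focal length in pixels; baseline_m: camera separation in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def feature_size_m(pixels, depth_m, focal_px):
    """Physical extent of a feature spanning `pixels` at the given depth."""
    return pixels * depth_m / focal_px

# Hypothetical rig: 1200 px focal length, 0.3 m baseline, 60 px disparity.
z = depth_from_disparity(focal_px=1200.0, baseline_m=0.3, disparity_px=60.0)
print(z)                                # distance to the surface (m)
print(feature_size_m(40, z, 1200.0))    # size of a 40 px marine-growth patch (m)
```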
Linear Mathematical Model for Seam Tracking with an Arc Sensor in P-GMAW Processes
Liu, Wenji; Li, Liangyu; Hong, Ying; Yue, Jianfeng
2017-01-01
Arc sensors have been used in seam tracking and widely studied since the 1980s, and commercial arc-sensing products for T- and V-shaped grooves have been developed. However, it is difficult to use these arc sensors in narrow-gap welding because the arc stability and sensing accuracy are not satisfactory. Pulsed gas metal arc welding (P-GMAW) has been successfully applied in narrow-gap welding and all-position welding processes, so it is worthwhile to research P-GMAW arc-sensing technology. In this paper, we derived a linear mathematical P-GMAW model for arc sensing, and the assumptions of the model are verified through experiments and finite element methods. Finally, the linear characteristics of the mathematical model were investigated. In torch-height-changing experiments, uphill experiments, and groove-angle-changing experiments, the P-GMAW arc signals all followed linear relationships. In addition, the faster the welding speed, the higher the arc signal sensitivity; the smaller the groove angle, the greater the arc sensitivity. The arc signal variation rate needs to be modified according to the welding power, groove angle, and weaving or rotation speed. PMID:28335425
Digital processing of radiographic images from PACS to publishing.
Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R
2001-03-01
Several studies have addressed the implications of filmless radiologic imaging for telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication in articles and textbooks. This study compares the quality of digital images directly exported from picture archive and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS and then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations found the filmless images to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images and provides the means to produce high-quality radiographic images in a digital environment.
Wood industrial application for quality control using image processing
NASA Astrophysics Data System (ADS)
Ferreira, M. J. O.; Neves, J. A. C.
1994-11-01
This paper describes an application of image processing for the furniture industry. It uses as input images acquired directly from wood planks on which defects were previously marked by an operator. A set of image processing algorithms separates and codes each defect and computes a polygonal approximation of the lines representing them. For this purpose we developed a pattern classification algorithm and a new technique for segmenting defects by carving the convex hull of the binary shape representing each isolated defect.
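Carving the convex hull of a binary defect shape presupposes computing that hull. A standard monotone-chain sketch, run on hypothetical pixel coordinates of a defect mask (this illustrates the hull step only, not the paper's carving technique):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; points are (x, y) tuples.
    Returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Pixel coordinates of a hypothetical defect mask (interior points included).
blob = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]
print(convex_hull(blob))   # [(0, 0), (4, 0), (4, 3), (0, 3)]
```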
Image Processing In Laser-Beam-Steering Subsystem
NASA Technical Reports Server (NTRS)
Lesh, James R.; Ansari, Homayoon; Chen, Chien-Chung; Russell, Donald W.
1996-01-01
Conceptual design of image-processing circuitry developed for proposed tracking apparatus described in "Beam-Steering Subsystem For Laser Communication" (NPO-19069). In proposed system, desired frame rate achieved by "windowed" readout scheme in which only pixels containing and surrounding two spots read out and others skipped without being read. Image data processed rapidly and efficiently to achieve high frequency response.
Digital image processing software system using an array processor
Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.
1981-03-10
A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table.
Restoration Of Faded Color Photographs By Digital Image Processing
NASA Astrophysics Data System (ADS)
Gschwind, Rudolf
1989-10-01
Color photographs possess poor stability towards light, chemicals, heat, and humidity. As a consequence, the colors of photographs deteriorate with time. Because of the complexity of the processes that cause the dyes to fade, it is impossible to restore the images by chemical means. It is therefore attempted to restore faded color films by means of digital image processing.
AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves
NASA Astrophysics Data System (ADS)
Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.
2017-02-01
ImageJ is a graphical user interface (GUI) driven, public domain, Java-based, software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.
NASA Technical Reports Server (NTRS)
Heydorn, R. P.
1984-01-01
The Mathematical Pattern Recognition and Image Analysis (MPRIA) Project is concerned with basic research problems related to the study of the Earth from remotely sensed measurements of its surface characteristics. The program goal is to better understand how to analyze the digital image that represents the spatial, spectral, and temporal arrangement of these measurements for the purpose of making selected inferences about the Earth. This report summarizes the progress that has been made toward this program goal by each of the principal investigators in the MPRIA Program.
From Image to Text: Using Images in the Writing Process
ERIC Educational Resources Information Center
Andrzejczak, Nancy; Trainin, Guy; Poldberg, Monique
2005-01-01
This study looks at the benefits of integrating visual art creation and the writing process. The qualitative inquiry uses student, parent, and teacher interviews coupled with field observation, and artifact analysis. Emergent coding based on grounded theory clearly shows that visual art creation enhances the writing process. Students used more…
Functional minimization problems in image processing
NASA Astrophysics Data System (ADS)
Kim, Yunho; Vese, Luminita A.
2008-02-01
In this work we wish to recover an unknown image from a blurry version. We solve this inverse problem by energy minimization and regularization. We seek a solution of the form u + v, where u is a function of bounded variation (cartoon component), while v is an oscillatory component (texture), modeled by a Sobolev function with negative degree of differentiability. Experimental results show that this cartoon + texture model better recovers textured details in natural images, by comparison with the more standard models where the unknown is restricted only to the space of functions of bounded variation.
Mathematical modelling of magnesium reduction in a novel vertical Pidgeon process
NASA Astrophysics Data System (ADS)
Yu, Alfred; Hu, Henry; Li, Naiyi
2002-07-01
A mathematical model has been developed to simulate the heat transfer occurring during a novel magnesium reduction process, the vertical retort technology. The model was based on the control-volume finite difference approach. Simulations were run to determine the effect of various parameters, such as the diameter and thickness of the compound and the slot angle, on the magnesium reduction cycle time. The model predicted the temperature distributions, the heating curves, and the total process time. The predictions were used to optimize the magnesium reduction process, including the dimensions of the retort, the shapes of the charged materials, and the reduction cycle time. The computed results show that the use of the optimized process parameters leads to a decrease in reduction time and energy consumption, and an increase in production capacity and recovery rate. Consequently, the magnesium thermal reduction process is significantly improved in the vertical retort. The model has been verified in a demo-plant operation with an annual production capacity of 1200 tons of magnesium.
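A control-volume finite-difference heat model of this kind can be sketched in one dimension: interior nodes are updated explicitly from their neighbours while the retort wall holds a fixed temperature. The diffusivity, grid, and wall temperature below are placeholders, not the retort model's parameters:

```python
def heat_1d_explicit(T, alpha, dx, dt, steps, t_wall):
    """Explicit finite-difference solution of 1D transient conduction
    with a fixed wall temperature at both ends (Dirichlet boundaries)."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    T = list(T)
    for _ in range(steps):
        T = ([t_wall]
             + [T[i] + r * (T[i+1] - 2*T[i] + T[i-1]) for i in range(1, len(T)-1)]
             + [t_wall])
    return T

# Charge initially at 25 C, wall held at 1200 C (hypothetical values).
T0 = [25.0] * 11
T = heat_1d_explicit(T0, alpha=1e-6, dx=0.01, dt=40.0, steps=500, t_wall=1200.0)
print(round(T[5], 1))   # centreline temperature after 500 steps
```

The stability bound r <= 0.5 is the usual constraint for the explicit scheme; an implicit or control-volume formulation on the real retort geometry would relax it.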
High-performance image processing on the desktop
NASA Astrophysics Data System (ADS)
Jordan, Stephen D.
1996-04-01
The suitability of computers to the task of medical image visualization for the purposes of primary diagnosis and treatment planning depends on three factors: speed, image quality, and price. To be widely accepted the technology must increase the efficiency of the diagnostic and planning processes. This requires processing and displaying medical images of various modalities in real-time, with accuracy and clarity, on an affordable system. Our approach to meeting this challenge began with market research to understand customer image processing needs. These needs were translated into system-level requirements, which in turn were used to determine which image processing functions should be implemented in hardware. The result is a computer architecture for 2D image processing that is both high-speed and cost-effective. The architectural solution is based on the high-performance PA-RISC workstation with an HCRX graphics accelerator. The image processing enhancements are incorporated into the image visualization accelerator (IVX) which attaches to the HCRX graphics subsystem. The IVX includes a custom VLSI chip which has a programmable convolver, a window/level mapper, and an interpolator supporting nearest-neighbor, bi-linear, and bi-cubic modes. This combination of features can be used to enable simultaneous convolution, pan, zoom, rotate, and window/level control into 1 k by 1 k by 16-bit medical images at 40 frames/second.
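Window/level mapping of the kind implemented in such a hardware LUT stage can be sketched in software: raw pixel values inside the window are mapped linearly onto the display range, and values outside are clamped. The window and level values below are hypothetical:

```python
def window_level_lut(window, level, in_bits=16, out_max=255):
    """Lookup table mapping raw pixel values to display intensities.
    Values below level - window/2 map to 0, above level + window/2 to out_max."""
    lo = level - window / 2.0
    lut = []
    for v in range(2 ** in_bits):
        x = (v - lo) / window                      # position within the window
        lut.append(int(round(max(0.0, min(1.0, x)) * out_max)))
    return lut

# Hypothetical 12-bit data with a 2000-wide window centred at 1324.
lut = window_level_lut(window=2000, level=1324, in_bits=12)
print(lut[0], lut[1324], lut[4095])   # 0 128 255
```

Applying the LUT per pixel is a single table lookup, which is why a hardware convolver/window-level pipeline can sustain the 40 frames/second rate described above.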
Lee, Kyungmin; Cho, Soohyun
2017-01-26
Mathematics anxiety (MA) refers to the experience of negative affect when engaging in mathematical activity. According to Ashcraft and Kirk (2001), MA selectively affects calculation with high working memory (WM) demand. On the other hand, Maloney, Ansari, and Fugelsang (2011) claim that MA affects all mathematical activities, including even the most basic ones such as magnitude comparison. The two theories make opposing predictions on the negative effect of MA on magnitude processing and simple calculation that make minimal demands on WM. We propose that MA has a selective impact on mathematical problem solving that likely involves processing of magnitude representations. Based on our hypothesis, MA will impinge upon magnitude processing even though it makes minimal demand on WM, but will spare retrieval-based, simple calculation, because it does not require magnitude processing. Our hypothesis can reconcile opposing predictions on the negative effect of MA on magnitude processing and simple calculation. In the present study, we observed a negative relationship between MA and performance on magnitude comparison and calculation with high but not low WM demand. These results demonstrate that MA has an impact on a wide range of mathematical performance, which depends on one's sense of magnitude, but spares over-practiced, retrieval-based calculation.
Image processing of globular clusters - Simulation for deconvolution tests (GlencoeSim)
NASA Astrophysics Data System (ADS)
Blazek, Martin; Pata, Petr
2016-10-01
This paper presents an algorithmic approach for efficiency tests of deconvolution algorithms in astronomic image processing. Due to the existence of noise in astronomical data there is no certainty that a mathematically exact result of stellar deconvolution exists and iterative or other methods such as aperture or PSF fitting photometry are commonly used. Iterative methods are important namely in the case of crowded fields (e.g., globular clusters). For tests of the efficiency of these iterative methods on various stellar fields, information about the real fluxes of the sources is essential. For this purpose a simulator of artificial images with crowded stellar fields provides initial information on source fluxes for a robust statistical comparison of various deconvolution methods. The "GlencoeSim" simulator and the algorithms presented in this paper consider various settings of Point-Spread Functions, noise types and spatial distributions, with the aim of producing as realistic an astronomical optical stellar image as possible.
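Generating an artificial crowded field with known source fluxes, as such a simulator does, can be sketched minimally: stars with Gaussian point-spread functions are added to a flat sky and shot noise is applied (here via a Gaussian approximation to Poisson noise). All positions, fluxes, and noise settings are hypothetical:

```python
import math
import random

def add_star(img, x0, y0, flux, sigma):
    """Add a star with a circular Gaussian PSF to a 2D image (list of rows)."""
    norm = flux / (2 * math.pi * sigma**2)
    for y in range(len(img)):
        for x in range(len(img[0])):
            r2 = (x - x0)**2 + (y - y0)**2
            img[y][x] += norm * math.exp(-r2 / (2 * sigma**2))

def add_shot_noise(img, rng):
    """Perturb each pixel with a Gaussian approximation to Poisson noise."""
    for y in range(len(img)):
        for x in range(len(img[0])):
            lam = img[y][x]
            img[y][x] = max(0.0, rng.gauss(lam, math.sqrt(lam)))

rng = random.Random(1)
img = [[100.0] * 32 for _ in range(32)]           # uniform sky background
true_fluxes = {(10.0, 12.0): 5000.0, (20.0, 18.0): 2000.0}   # known ground truth
for (x0, y0), f in true_fluxes.items():
    add_star(img, x0, y0, f, sigma=1.5)
add_shot_noise(img, rng)
print(round(img[12][10]))   # peak pixel of the brighter star
```

Because the injected fluxes are known exactly, the recovered photometry of any deconvolution or PSF-fitting method can be compared against them, which is the point of such a simulator.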
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
Assessment of vessel diameters for MR brain angiography processed images
NASA Astrophysics Data System (ADS)
Moraru, Luminita; Obreja, Cristian-Dragos; Moldovanu, Simona
2015-12-01
The motivation was to develop an assessment method to measure (in)visible differences between the original and the processed images in MR brain angiography, as a means of evaluating the status of vessel segments (i.e. the existence of occlusions or intracerebral vessels damaged by aneurysms). Generally, image quality is limited, so we improve the performance of the evaluation through digital image processing. The goal is to determine the processing method that allows the most accurate assessment of patients with cerebrovascular diseases. A total of 10 MR brain angiography images were processed by the following techniques: histogram equalization, Wiener filtering, linear contrast adjustment, contrast-limited adaptive histogram equalization, bias correction, and the Marr-Hildreth filter. Each original image and its processed images were analyzed with a stacking procedure so that the same vessel and its corresponding diameter were measured. Original and processed images were evaluated by measuring the vessel diameter (in pixels) along an established direction and at a precise anatomic location. The vessel diameter was calculated using ImageJ. Mean diameter measurements differ significantly across the same segment and across processing techniques. The best results are provided by the Wiener filter and linear contrast adjustment methods, and the worst by the Marr-Hildreth filter.
[Generation and processing of digital images in radiodiagnosis].
Bajla, I; Belan, V
1993-05-01
The paper describes universal principles of diagnostic imaging. The attention is focused particularly on digital image generation in medicine. The methodology of display visualization of measured data is discussed. The problems of spatial relation representation and visual perception of image brightness are mentioned. The methodological issues of digital image processing (DIP) are discussed, particularly the relation of DIP to the other related disciplines, fundamental tasks in DIP and classification of DIP operations from the computational viewpoint. The following examples of applying DIP operations in diagnostic radiology are overviewed: local contrast enhancement in digital image, spatial filtering, quantitative texture analysis, synthesis of the 3D pseudospatial image based on the 2D tomogram set, multimodal processing of medical images. New trends of application of DIP methods in diagnostic radiology are outlined: evaluation of the diagnostic efficiency of DIP operations by means of ROC analysis, construction of knowledge-based systems of DIP in medicine. (Fig. 12, Ref. 26.)
Dynamic feature analysis for Voyager at the Image Processing Laboratory
NASA Technical Reports Server (NTRS)
Yagi, G. M.; Lorre, J. J.; Jepsen, P. L.
1978-01-01
Voyager 1 and 2 were launched from Cape Kennedy to Jupiter, Saturn, and beyond on September 5, 1977 and August 20, 1977. The role of the Image Processing Laboratory is to provide the Voyager Imaging Team with the necessary support to identify atmospheric features (tiepoints) for Jupiter and Saturn data, and to analyze and display them in a suitable form. This support includes the software needed to acquire and store tiepoints, the hardware needed to interactively display images and tiepoints, and the general image processing environment necessary for decalibration and enhancement of the input images. The objective is an understanding of global circulation in the atmospheres of Jupiter and Saturn. Attention is given to the Voyager imaging subsystem, the Voyager imaging science objectives, hardware, software, display monitors, a dynamic feature study, decalibration, navigation, and data base.
Method of detecting meter base on image-processing
NASA Astrophysics Data System (ADS)
Wang, Hong-ping; Wang, Peng; Yu, Zheng-lin
2008-03-01
This paper proposes a new idea for meter verification using image arithmetic-logic operations and a high-precision raster sensor. The method regards the data measured by the precision raster as the reference value and the data obtained by digital image processing as the measured value, and verifies the meter by comparing the two. It exploits the dynamic movement of the meter pointer to perform image subtraction, realize image segmentation, and obtain the deviation of the image pointer at the boundary instants. This image-segmentation technique replaces the traditional method, whose accuracy and repeatability are limited by manual operation and visual reading. According to the national verification regulations, the precision of the method meets the required technical specification, and experiments indicate that it is reliable and accurate. The paper presents the overall scheme of the meter verification, the method for capturing the image of the pointer, and an analysis of the indication-value error.
Graphical user interface for image acquisition and processing
Goldberg, Kenneth A.
2002-01-01
An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity to apply IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.
Optical Processing of Speckle Images with Bacteriorhodopsin for Pattern Recognition
NASA Technical Reports Server (NTRS)
Downie, John D.; Tucker, Deanne (Technical Monitor)
1994-01-01
Logarithmic processing of images with multiplicative noise characteristics can be utilized to transform the image into one with an additive noise distribution. This simplifies subsequent image processing steps for applications such as image restoration or correlation for pattern recognition. One particularly common form of multiplicative noise is speckle, for which the logarithmic operation not only produces additive noise, but also makes it of constant variance (signal-independent). We examine the optical transmission properties of some bacteriorhodopsin films here and find them well suited to implement such a pointwise logarithmic transformation optically in a parallel fashion. We present experimental results of the optical conversion of speckle images into transformed images with additive, signal-independent noise statistics using the real-time photochromic properties of bacteriorhodopsin. We provide an example of improved correlation performance in terms of correlation peak signal-to-noise for such a transformed speckle image.
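The pointwise logarithm at the heart of this scheme is easy to verify numerically. Below is a minimal plain-Python sketch using synthetic log-normal speckle (it illustrates the statistics only, not the optical bacteriorhodopsin implementation the paper describes):

```python
import math
import random

def log_transform(image, eps=1e-12):
    """Pointwise logarithm: multiplicative speckle i = s * n becomes
    additive log(i) = log(s) + log(n)."""
    return [math.log(max(p, eps)) for p in image]

def var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

# Toy demonstration: the same relative (multiplicative) speckle on two
# signal levels has signal-dependent variance before the transform and
# identical, signal-independent variance after it.
random.seed(0)
noise = [random.lognormvariate(0.0, 0.2) for _ in range(10000)]
low = [10.0 * n for n in noise]    # dim region of a speckle image
high = [100.0 * n for n in noise]  # bright region, 10x the signal

print(var(high) / var(low))        # ~100: raw noise power scales with signal
v_low = var(log_transform(low))
v_high = var(log_transform(high))
print(abs(v_low - v_high) < 1e-6)  # True: equal variance after the log
```

After the transform, standard linear restoration or correlation filters designed for additive noise apply directly, which is the simplification the abstract exploits.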
NASA Astrophysics Data System (ADS)
Mekkaoui, Imen; Moulin, Kevin; Croisille, Pierre; Pousin, Jerome; Viallon, Magalie
2016-08-01
Cardiac motion presents a major challenge in diffusion weighted MRI, often leading to large signal losses that necessitate repeated measurements. The diffusion process in the myocardium is difficult to investigate because of the unqualified sensitivity of diffusion measurements to cardiac motion. A rigorous mathematical formalism is introduced to quantify the effect of tissue motion in diffusion imaging. The presented mathematical model, based on the Bloch-Torrey equations, takes into account deformations according to the laws of continuum mechanics. Approximating this mathematical model by using finite elements method, numerical simulations can predict the sensitivity of the diffusion signal to cardiac motion. Different diffusion encoding schemes are considered and the diffusion weighted MR signals, computed numerically, are compared to available results in literature. Our numerical model can identify the existence of two time points in the cardiac cycle, at which the diffusion is unaffected by myocardial strain and cardiac motion. Of course, these time points depend on the type of diffusion encoding scheme. Our numerical results also show that the motion sensitivity of the diffusion sequence can be reduced by using either spin echo technique with acceleration motion compensation diffusion gradients or stimulated echo acquisition mode with unipolar and bipolar diffusion gradients.
Acquisition and Post-Processing of Immunohistochemical Images.
Sedgewick, Jerry
2017-01-01
Augmentation of digital images is almost always a necessity in order to obtain a reproduction that matches the appearance of the original. However, that augmentation can mislead if it is done incorrectly and not within reasonable limits. When procedures are in place for ensuring that originals are archived, and image manipulation steps are reported, scientists not only follow good laboratory practice, but avoid ethical issues associated with post-processing and protect their labs from any future allegations of scientific misconduct. Also, when procedures are in place for correct acquisition of images, the extent of post-processing is minimized or eliminated. These procedures include white balancing (for brightfield images), keeping tonal values within the dynamic range of the detector, frame averaging to eliminate noise (typically in fluorescence imaging), use of the highest bit depth when a choice is available, flatfield correction, and archiving of the image in a non-lossy format (not JPEG). When post-processing is necessary, the commonly used applications for correction include Photoshop and ImageJ, but a free program (GIMP) can also be used. Corrections to images include scaling the bit depth to higher and lower ranges, removing color casts from brightfield images, setting brightness and contrast, reducing color noise, reducing "grainy" noise, conversion of pure colors to grayscale, conversion of grayscale to colors typically used in fluorescence imaging, correction of uneven illumination (flatfield correction), merging color images (fluorescence), and extending the depth of focus. These corrections are explained in step-by-step procedures in the chapter that follows.
Mathematical modeling of quartz particle melting process in plasma-chemical reactor
Volokitin, Oleg; Volokitin, Gennady; Skripnikova, Nelli; Shekhovtsov, Valentin; Vlasov, Viktor
2016-01-15
Among silica-based materials, vitreous silica has a special place. The paper presents the melting process of a quartz particle under conditions of low-temperature plasma. A mathematical model is designed for the stages of melting in the experimental plasma-chemical reactor. As the calculations show, quartz particles with radii 0.21 ≤ r_p ≤ 0.64 mm melt completely at a particle feed rate of W = 0.65 l/s, depending on the Nusselt number, while particles with 0.14 ≤ r_p ≤ 0.44 mm melt at W = 1.4 l/s. The calculations also showed that 2 mm and 0.4 mm quartz particles melted completely during and 0.1 s, respectively. Thus, the phase transformations occurring in silicon dioxide play an important part in its heating up to the melting temperature.
Mathematic modeling of the Earth's surface and the process of remote sensing
NASA Technical Reports Server (NTRS)
Balter, B. M.
1979-01-01
It is shown that real data from remote sensing of the Earth from outer space are not best suited to the search for optimal procedures with which to process such data. To work out the procedures, it was proposed that data synthesized with the help of mathematical modeling be used. A criterion for similarity to reality was formulated. The basic principles for constructing methods for modeling remote sensing data are recommended. A concrete method is formulated for modeling a complete cycle of radiation transformations in remote sensing. A computer program is described which realizes the proposed method. Some results from calculations are presented which show that the method satisfies the requirements imposed on it.
ERIC Educational Resources Information Center
Bugden, Stephanie; Ansari, Daniel
2011-01-01
In recent years, there has been an increasing focus on the role played by basic numerical magnitude processing in the typical and atypical development of mathematical skills. In this context, tasks measuring both the intentional and automatic processing of numerical magnitude have been employed to characterize how children's representation and processing of numerical magnitude changes over developmental time. To date, however, there has been little effort to differentiate between different measures of 'number sense'. The aim of the present study was to examine the relationship between automatic and intentional measures of magnitude processing as well as their relationships to individual differences in children's mathematical achievement. A group of 119 children in 1st and 2nd grade were tested on the physical size congruity paradigm (automatic processing) as well as the number comparison paradigm to measure the ratio effect (intentional processing). The results reveal that measures of intentional and automatic processing are uncorrelated with one another, suggesting that these tasks tap into different levels of numerical magnitude processing in children. Furthermore, while children's performance on the number comparison paradigm was found to correlate with their mathematical achievement scores, no such correlations could be obtained for any of the measures typically derived from the physical size congruity task. These findings therefore suggest that different tasks measuring 'number sense' tap into different levels of numerical magnitude representation that may be unrelated to one another and have differential predictive power for individual differences in mathematical achievement.
IR camera system with an advanced image processing technologies
NASA Astrophysics Data System (ADS)
Ohkubo, Syuichi; Tamura, Tetsuo
2016-05-01
We have developed image processing technologies that resolve issues caused by the inherent characteristics of UFPA (uncooled focal plane array) sensors, in order to broaden their applications. For example, the large time constant of an uncooled IR (infrared) sensor limits its field of application, because motion blur arises when monitoring an object moving at high speed. The developed image processing can eliminate the blur and retrieve nearly the image that would be observed with the object at rest. It is based on the idea that the output of the IR sensor is the convolution of the IR energy radiated by the object with the impulse response of the sensor. With knowledge of the impulse response and the speed of the object, the IR energy from the object can be deconvolved from the observed images. We have successfully retrieved a blur-free image using an IR sensor with a 15 ms time constant under conditions in which the object moves at a speed of about 10 pixels/60 Hz. Image processing for reducing FPN (fixed-pattern noise) has also been developed. A UFPA with responsivity in a narrow wavelength region, e.g., around 8 μm, is appropriate for measuring the surface of glass. However, it suffers from severe FPN due to its lower sensitivity compared with the 8-13 μm band. The developed image processing exploits images of the shutter itself and can reduce FPN significantly.
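The deconvolution idea can be sketched in 1D. Assuming a first-order-lag impulse response for the sensor (a plausible model for a thermal time constant; the abstract does not give the exact form the authors used), the blur is exactly invertible once the impulse response is known:

```python
import math

def sensor_blur(x, tau, dt):
    """First-order lag model of an uncooled IR pixel with time constant
    tau, sampled every dt: y[n] = a*y[n-1] + (1-a)*x[n]."""
    a = math.exp(-dt / tau)
    y, prev = [], 0.0
    for v in x:
        prev = a * prev + (1.0 - a) * v
        y.append(prev)
    return y

def deblur(y, tau, dt):
    """Invert the lag exactly: x[n] = (y[n] - a*y[n-1]) / (1-a)."""
    a = math.exp(-dt / tau)
    x, prev = [], 0.0
    for v in y:
        x.append((v - a * prev) / (1.0 - a))
        prev = v
    return x

# A hot object sweeping past one pixel: a step edge in time.
scene = [0.0] * 5 + [1.0] * 5
blurred = sensor_blur(scene, tau=0.015, dt=1 / 60)   # 15 ms, 60 Hz frames
restored = deblur(blurred, tau=0.015, dt=1 / 60)
print(max(abs(r - s) for r, s in zip(restored, scene)) < 1e-9)  # True
```

In practice the 2D case with a measured impulse response and noise would need a regularized deconvolution rather than this exact inverse.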
Image processing system to analyze droplet distributions in sprays
NASA Technical Reports Server (NTRS)
Bertollini, Gary P.; Oberdier, Larry M.; Lee, Yong H.
1987-01-01
An image processing system was developed which automatically analyzes the size distributions in fuel spray video images. Images are generated by using pulsed laser light to freeze droplet motion in the spray sample volume under study. This coherent illumination source produces images which contain droplet diffraction patterns representing the droplets' degree of focus. The analysis is performed by extracting feature data describing the droplet diffraction patterns in the images. This allows the system to distinguish droplets from image anomalies and measure only those droplets considered in focus. Unique features of the system are the totally automated analysis and droplet feature measurement from the grayscale image. The feature extraction and image restoration algorithms used in the system are described. Preliminary performance data are also given for two experiments. One experiment gives a comparison between a synthesized distribution measured manually and automatically. The second experiment compares a real spray distribution measured using current methods against the automatic system.
Image processing methods to obtain symmetrical distribution from projection image.
Asano, H; Takenaka, N; Fujii, T; Nakamatsu, E; Tagami, Y; Takeshima, K
2004-10-01
Flow visualization and measurement of the cross-sectional liquid distribution is very effective for clarifying the effects of obstacles in a conduit on the heat transfer and flow characteristics of gas-liquid two-phase flow. In this study, two methods to obtain the cross-sectional distribution of void fraction are applied to vertical upward air-water two-phase flow. These methods need a projection image from only one direction. Radial distributions of void fraction in a circular tube and in a circular-tube annulus with a spacer were calculated by Abel transform, based on the assumption of axial symmetry. On the other hand, cross-sectional distributions of void fraction in a circular tube with a wire coil, whose conduit configuration rotates periodically about the tube's central axis, were measured by a CT method based on the assumption that the relative distribution of the liquid phase against the wire is maintained along the flow direction.
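The Abel-transform route can be illustrated with a discrete "onion peeling" scheme, one standard way to invert axisymmetric projections; whether the authors used this particular discretization is not stated in the abstract:

```python
import math

def chord_lengths(n, h):
    """Onion-peeling geometry: L[i][j] is the chord length of the ray
    at offset y_i = (i + 0.5)*h inside the annular shell
    r in [j*h, (j+1)*h].  Upper triangular: rays miss inner shells."""
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        y = (i + 0.5) * h
        for j in range(i, n):
            outer = math.sqrt(((j + 1) * h) ** 2 - y * y)
            inner = math.sqrt((j * h) ** 2 - y * y) if j * h > y else 0.0
            L[i][j] = 2.0 * (outer - inner)
    return L

def abel_invert(proj, h):
    """Recover the radial profile f_j from one projection by solving
    the upper-triangular system P_i = sum_j L[i][j]*f_j, outside in."""
    n = len(proj)
    L = chord_lengths(n, h)
    f = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(L[i][j] * f[j] for j in range(i + 1, n))
        f[i] = (proj[i] - s) / L[i][i]
    return f

# Check: forward-project a known radial void-fraction profile, invert it.
n, h = 20, 0.05
true = [0.6 * (1.0 - ((j + 0.5) * h) ** 2) for j in range(n)]  # parabolic
L = chord_lengths(n, h)
proj = [sum(L[i][j] * true[j] for j in range(n)) for i in range(n)]
rec = abel_invert(proj, h)
print(max(abs(a - b) for a, b in zip(rec, true)) < 1e-8)  # True
```

The single-projection requirement the abstract highlights is visible here: one chord-integral profile suffices because axial symmetry makes the system triangular.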
Wavelet Transforms in Parallel Image Processing
1994-01-27
operation is performed to go either up or down a level on the pyramid. The algorithm can be extended to operate on higher-dimensional input signals … quantization to the multidimensional case. A block of pixels, for example 4 x 4 pixels, forming a vector of k (= 16) dimensions, is quantized together to … multiscale approach for representation and characterization of signals and images. One can select a suitable or an optimal wavelet and its associated
Processing of polarametric SAR images. Final report
Warrick, A.L.; Delaney, P.A.
1995-09-01
The objective of this work was to develop a systematic method of combining multifrequency polarized SAR images. It is shown that the traditional methods of correlation, hard targets, and template matching fail to produce acceptable results. Hence, a new algorithm was developed and tested. The new approach combines the three traditional methods with an interpolation method. An example is shown that demonstrates the new algorithm's performance. The results are summarized and suggestions for future research are presented.
Partial difference operators on weighted graphs for image processing on surfaces and point clouds.
Lozes, Francois; Elmoataz, Abderrahim; Lezoray, Olivier
2014-09-01
Partial difference equations (PDEs) and variational methods for image processing on Euclidean domains are very well established because they permit the solution of a large range of real computer vision problems. With the recent advent of many 3D sensors, there is a growing interest in transposing and solving PDEs on surfaces and point clouds. In this paper, we propose a simple method to solve such PDEs using the framework of PDEs on graphs. This approach enables us to transcribe, for surfaces and point clouds, many models and algorithms designed for image processing. To illustrate our proposal, three problems are considered: (1) p-Laplacian restoration and inpainting; (2) PDE-based mathematical morphology; and (3) active-contour segmentation.
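For the p = 2 case, the PDEs-on-graphs idea reduces to weighted-Laplacian smoothing, which is easy to sketch. The weights, neighborhood rule and solver below are illustrative choices, not the authors' implementation:

```python
import math

def graph_denoise(points, values, lam=1.0, sigma=0.1, iters=50):
    """Jacobi iterations for graph-Laplacian (p = 2) restoration
    min_u lam*||u - f||^2 + sum_ij w_ij (u_i - u_j)^2 on a point
    cloud, with Gaussian weights between nearby points.  A toy
    stand-in for the paper's framework (which also covers p != 2,
    morphology and active contours)."""
    n = len(points)
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d2 = sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
            if d2 < (3.0 * sigma) ** 2:  # sparse neighborhood graph
                w[i][j] = w[j][i] = math.exp(-d2 / (2.0 * sigma ** 2))
    u = list(values)
    for _ in range(iters):
        u = [(lam * values[i] + sum(w[i][j] * u[j] for j in range(n)))
             / (lam + sum(w[i][j] for j in range(n)))
             for i in range(n)]
    return u

# A 1D point cloud with one noisy spike: diffusion on the graph
# shrinks the spike and spreads a little mass to its neighbours.
pts = [(0.05 * i,) for i in range(11)]
f = [0.0] * 11
f[5] = 1.0
u = graph_denoise(pts, f)
print(0.0 < u[5] < 1.0 and u[4] > 0.0)  # True
```

Because the operator only uses pairwise weights, the same code runs unchanged whether the points sample a curve, a surface, or an unstructured cloud, which is exactly the transcription the paper advocates.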
Processing ISS Images of Titan's Surface
NASA Technical Reports Server (NTRS)
Perry, Jason; McEwen, Alfred; Fussner, Stephanie; Turtle, Elizabeth; West, Robert; Porco, Carolyn; Knowles, Ben; Dawson, Doug
2005-01-01
One of the primary goals of the Cassini-Huygens mission, in orbit around Saturn since July 2004, is to understand the surface and atmosphere of Titan. Surface investigations are primarily accomplished with RADAR, the Visual and Infrared Mapping Spectrometer (VIMS), and the Imaging Science Subsystem (ISS) [1]. The latter two use methane "windows", regions in Titan's reflectance spectrum where its atmosphere is most transparent, to observe the surface. For VIMS, this produces clear views of the surface near 2 and 5 microns [2]. ISS uses a narrow continuum band filter (CB3) at 938 nanometers. While these methane windows provide our best views of the surface, the images produced are not as crisp as ISS images of satellites like Dione and Iapetus [3] due to the atmosphere. Given a reasonable estimate of contrast (approx. 30%), the apparent resolution of features is approximately 5 pixels due to the effects of the atmosphere and the Modulation Transfer Function of the camera [1,4]. The atmospheric haze also reduces contrast, especially with increasing emission angles [5].
Diagnosis of skin cancer using image processing
NASA Astrophysics Data System (ADS)
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué; Coronel-Beltrán, Ángel
2014-10-01
In this paper a methodology for classifying skin cancer in images of dermatologic spots based on spectral analysis using the K-law Fourier non-linear technique is presented. The image is segmented and binarized to build the function that contains the area of interest. The image is divided into its RGB channels to obtain the spectral properties of each channel. The green channel contains the most information and is therefore always chosen. This information is multiplied point by point with a binary mask, and to the result a Fourier transform written in nonlinear form is applied. If the real part of this spectrum is positive, the spectral density takes unit values; otherwise it is zero. Finally, the ratio of the sum of the unit values of the spectral density to the sum of the values of the binary mask is calculated. This ratio is called the spectral index. When the calculated value is within the spectral index range, three types of cancer can be detected; values found outside this range indicate a benign lesion.
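The spectral-index computation can be sketched as follows. The pointwise k-th power stands in for the K-law nonlinearity (the abstract does not spell out its exact form), and all sizes are toy values:

```python
import cmath

def dft2(img):
    """Naive 2D DFT; adequate for the tiny demo image below."""
    n, m = len(img), len(img[0])
    out = [[0j] * m for _ in range(n)]
    for u in range(n):
        for v in range(m):
            s = 0j
            for x in range(n):
                for y in range(m):
                    s += img[x][y] * cmath.exp(
                        -2j * cmath.pi * (u * x / n + v * y / m))
            out[u][v] = s
    return out

def spectral_index(green, mask, k=0.3):
    """Sketch of the described pipeline: mask the green channel, apply
    a k-th-power (K-law-style) nonlinearity, take the spectrum, build
    a binary spectral density from the sign of the real part, and
    normalize the count of unit values by the mask area."""
    masked = [[(g * m) ** k for g, m in zip(grow, mrow)]
              for grow, mrow in zip(green, mask)]
    spec = dft2(masked)
    units = sum(1 for row in spec for c in row if c.real > 0)
    area = sum(v for row in mask for v in row)
    return units / area

# An 8x8 toy "green channel" with a 4x4 region of interest.
green = [[(x + y) / 14.0 for y in range(8)] for x in range(8)]
mask = [[1.0 if 2 <= x <= 5 and 2 <= y <= 5 else 0.0 for y in range(8)]
        for x in range(8)]
print(spectral_index(green, mask) > 0.0)  # True: the DC term is positive
```

The classification step then reduces to checking which calibrated interval the scalar index falls into; those thresholds come from the authors' data and are not reproduced here.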
Image processing of underwater multispectral imagery
Zawada, D. G.
2003-01-01
Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.
Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry
NASA Technical Reports Server (NTRS)
Hong, Yie-Ming
1973-01-01
Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.
Data management in pattern recognition and image processing systems
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Bryant, N. A.
1976-01-01
Data management considerations are important to any system which handles large volumes of data or where the manipulation of data is technically sophisticated. A particular problem is the introduction of image-formatted files into the mainstream of data processing application. This report describes a comprehensive system for the manipulation of image, tabular, and graphical data sets which involve conversions between the various data types. A key characteristic is the use of image processing technology to accomplish data management tasks. Because of this, the term 'image-based information system' has been adopted.
Performance evaluation of image processing algorithms in digital mammography
NASA Astrophysics Data System (ADS)
Zanca, Federica; Van Ongeval, Chantal; Jacobs, Jurgen; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde
2008-03-01
The purpose of the study is to evaluate the performance of different image processing algorithms in terms of the representation of microcalcification clusters in digital mammograms. Clusters were simulated in clinical raw ("for processing") images. The entire dataset consisted of 200 normal mammograms, selected from our clinical routine cases and acquired with a Siemens Novation DR system. In 100 of the normal images a total of 142 clusters were simulated; the remaining 100 normal mammograms served as true-negative input cases. Both abnormal and normal images were processed with 5 commercially available processing algorithms: Siemens OpView1 and Siemens OpView2, Agfa Musica1, Sectra Mamea AB Sigmoid and IMS Raffaello Mammo 1.2. Five observers were asked to locate and score the cluster(s) in each image by means of a dedicated software tool. Observer performance was assessed using the JAFROC Figure of Merit. FROC curves, fitted using the IDCA method, were also calculated. JAFROC analysis revealed significant differences among the image processing algorithms in the detection of microcalcification clusters (p=0.0000369). The calculated average Figures of Merit are: 0.758 for Siemens OpView2, 0.747 for IMS Processing 1.2, 0.736 for Agfa Musica1 processing, 0.706 for Sectra Mamea AB Sigmoid processing and 0.703 for Siemens OpView1. This study is a first step towards a quantitative assessment of image processing in terms of cluster detection in clinical mammograms. Although we showed a significant difference among the image processing algorithms, this method does not on its own allow a global performance ranking of the investigated algorithms.
Colorimetric Topography of Atherosclerotic Lesions by Television Image Processing
1979-06-15
thesis requires exposure of a grey scale. These exposures are bracketed to ±3 f-stops centered at the meter-indicated exposure. The processed … in atherogenesis. For this thesis, five specimens of … similar age, sex, and epidemiology were simulated and processed by the algorithms … 6.1. Conclusion. Employing the standard digitized image derived from the existing theory of image processing, this thesis documents the development
Multispectral image restoration of historical documents based on LAAMs and mathematical morphology
NASA Astrophysics Data System (ADS)
Lechuga-S., Edwin; Valdiviezo-N., Juan C.; Urcid, Gonzalo
2014-09-01
This research introduces an automatic technique designed for the digital restoration of the damaged parts of historical documents. For this purpose an imaging spectrometer is used to acquire a set of images in the wavelength interval from 400 to 1000 nm. Assuming the presence of linearly mixed spectral pixels registered from the multispectral image, our technique uses two lattice autoassociative memories to extract the set of pure pigments composing a given document. Through a spectral unmixing analysis, our method produces fractional abundance maps indicating the distribution of each pigment in the scene. These maps are then used to locate cracks and holes in the document under study. The restoration is performed by applying a region filling algorithm, based on morphological dilation, followed by a color interpolation to restore the original appearance of the filled areas. This procedure has been successfully applied to the analysis and restoration of three multispectral data sets: two corresponding to artificially superimposed scripts and one of real data acquired from a Mexican pre-Hispanic codex, whose restoration results are presented.
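The dilation-based region filling can be sketched on a binary grid. This is the textbook iteration; the paper's spectral-unmixing and colour-interpolation stages are omitted:

```python
def dilate(X, shape):
    """Binary dilation of a pixel set with a 3x3 cross (4-neighbour)
    structuring element."""
    h, w = shape
    out = set(X)
    for (i, j) in X:
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                out.add((ni, nj))
    return out

def region_fill(hole, seed, shape):
    """Classical morphological region filling: iterate
    X_k = dilate(X_{k-1}) ∩ hole until stable, so the seed grows to
    cover exactly the connected hole (or crack) it lies in."""
    X = {seed}
    while True:
        nxt = dilate(X, shape) & hole
        if nxt == X:
            return X
        X = nxt

# A 5x5 document patch with a 2x2 hole located from the abundance maps.
hole = {(1, 1), (1, 2), (2, 1), (2, 2)}
print(region_fill(hole, (1, 1), (5, 5)) == hole)  # True
```

Intersecting with the hole mask at every step is what confines the growth, so pigment pixels outside the damaged area are never touched.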
Fast image processing on chain board of inverted tooth chain
NASA Astrophysics Data System (ADS)
Liu, Qing-min; Li, Guo-fa
2007-12-01
This paper discusses standard image processing of inverted-tooth-chain plates, including noise reduction, image segmentation, edge detection and contour extraction, and puts forward a new sub-pixel algorithm for locating circular edges. The algorithm first applies Canny edge detection to improve the initial localization of the edge, then computes the gradient direction, interpolates the gradient image (obtained with the Sobel operator) along the gradient direction, and finally obtains the sub-pixel position of the edge. Two least-squares fitting methods were applied to straight edges; analysis and experiments show that the improved least-squares line fit reduces the localization error to one quarter of that of the ordinary fit at comparable run time. The sub-pixel localization of circles increases the effective resolution of the CCD by a factor of 42, greatly improving edge-localization precision. For fast on-line inspection, the whole pipeline (image preprocessing, Hough transform for lines, image position and orientation, sub-pixel localization of lines and circles, and output of results) was integrated and runs without operator intervention; the processing time for a single part is under 0.3 s. The proposed sub-pixel method is suited to precise image localization, the integrated pipeline meets the requirement of fast inspection, and this lays the foundation for on-line precision visual measurement.
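The sub-pixel principle can be illustrated in 1D: interpolating gradient samples around the pixel-level maximum recovers the edge position to a small fraction of a pixel. The Gaussian (log-parabolic) peak fit below is one common choice; the paper's interpolation along the 2D gradient direction follows the same idea:

```python
import math

def subpixel_peak(gm1, g0, gp1):
    """Gaussian (log-parabolic) interpolation of three positive
    gradient-magnitude samples around the pixel-level maximum g0;
    returns the sub-pixel offset of the peak."""
    a, b, c = math.log(gm1), math.log(g0), math.log(gp1)
    denom = a - 2.0 * b + c
    return 0.0 if denom == 0 else 0.5 * (a - c) / denom

def step(x, edge=10.3, s=1.0):
    """A blurred step edge (erf profile) whose true edge is at x=edge."""
    return 0.5 * (1.0 + math.erf((x - edge) / s))

# Sample the edge at integer pixels; central differences stand in for
# the gradient image evaluated along the gradient direction.
grad = [step(x + 0.5) - step(x - 0.5) for x in range(21)]
i = max(range(1, 20), key=lambda k: grad[k])
x_edge = i + subpixel_peak(grad[i - 1], grad[i], grad[i + 1])
print(abs(x_edge - 10.3) < 0.01)  # True: well under a hundredth of a pixel
```

The jump from integer-pixel to sub-pixel localization is what yields the large effective-resolution gain the abstract reports, though the exact factor depends on noise and edge sharpness.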
Nayeem, Fatima; Ju, Hyunsu; Brunder, Donald G; Nagamani, Manubai; Anderson, Karl E; Khamapirad, Tuenchit; Lu, Lee-Jane W
2014-01-01
Women with high breast density (BD) have a 4- to 6-fold greater risk for breast cancer than women with low BD. We found that BD can be easily computed from a mathematical algorithm using routine mammographic imaging data or by a curve-fitting algorithm using fat and nonfat suppression magnetic resonance imaging (MRI) data. These BD measures in a strictly defined group of premenopausal women providing both mammographic and breast MRI images were predicted as well by the same set of strong predictor variables as were measures from a published laborious histogram segmentation method and a full field digital mammographic unit in multivariate regression models. We also found that the number of completed pregnancies, C-reactive protein, aspartate aminotransferase, and progesterone were more strongly associated with amounts of glandular tissue than adipose tissue, while fat body mass, alanine aminotransferase, and insulin like growth factor-II appear to be more associated with the amount of breast adipose tissue. Our results show that methods of breast imaging and modalities for estimating the amount of glandular tissue have no effects on the strength of these predictors of BD. Thus, the more convenient mathematical algorithm and the safer MRI protocols may facilitate prospective measurements of BD.
Principles of image processing in digital chest radiography.
Prokop, Mathias; Neitzel, Ulrich; Schaefer-Prokop, Cornelia
2003-07-01
Image processing has a major impact on the image quality and diagnostic performance of digital chest radiographs. The goals of processing are to reduce the dynamic range of the image data so as to capture the full range of attenuation differences between lungs and mediastinum, to improve the modulation transfer function (MTF) to optimize spatial resolution, to enhance structural contrast, and to suppress image noise. Image processing comprises look-up table operations and spatial filtering. Look-up table operations allow for automated signal normalization and an arbitrary choice of image gradation. The simplest and still widely applied spatial filtering algorithms are based on unsharp masking; various modifications have been introduced for dynamic range reduction and MTF restoration. More elaborate and more effective are multi-scale frequency processing algorithms, which are based on the subdivision of an image into multiple frequency bands according to its structural composition. This allows a wide range of image manipulations, including a size-independent enhancement of low-contrast structures. The principles of the various algorithms are explained, and their impact on image appearance is illustrated with clinical examples. Optimum and sub-optimum parameter settings are discussed and pitfalls are explained.
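Unsharp masking, the simplest of the spatial filters mentioned above, amounts to adding a weighted high-pass residual (original minus low-pass) back to the image. A minimal numpy sketch, using a separable box blur as the low-pass stage in place of whatever kernel a vendor implementation would use (function names and parameters are illustrative):

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur: average over a (2*radius+1)^2 window,
    with edge-replicate padding so the output keeps the input shape."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    padded = np.pad(img.astype(float), radius, mode="edge")
    # horizontal pass, then vertical pass
    tmp = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, tmp)

def unsharp_mask(img, radius=2, amount=1.0):
    """Classic unsharp masking: out = img + amount * (img - lowpass)."""
    low = box_blur(img, radius)
    return img + amount * (img - low)
```

On a step edge the result overshoots on both sides of the transition, which is exactly the structural-contrast enhancement (and the potential edge-artifact pitfall) the abstract alludes to.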
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
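A digital trait of the kind mentioned above can be as simple as the projected plant area or the bounding-box dimensions of a segmented plant mask. A toy sketch of such a trait extractor (this is not IH's actual API; the function and trait names are illustrative):

```python
import numpy as np

def digital_traits(mask):
    """Toy digital-trait extraction from a binary plant mask (True = plant):
    projected area in pixels plus height and width of the bounding box."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return {"area": 0, "height": 0, "width": 0}
    return {"area": int(ys.size),
            "height": int(ys.max() - ys.min() + 1),
            "width": int(xs.max() - xs.min() + 1)}
```

Traits like these, computed per image across thousands of plants, are the phenotypic values that feed downstream genome-wide association mapping.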
Issakhov, Alibek
2014-01-01
This paper presents a mathematical model of the thermal discharge from a thermal power plant into the aquatic environment of its cooling reservoir, located in the Pavlodar region, 17 km to the north-east of the town of Ekibastuz. The thermal process in the cooling reservoir is considered under different hydrometeorological conditions and is solved with the three-dimensional Navier-Stokes equations and a temperature equation for incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion, and the intermediate velocity field is solved by the fractional-steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, the transfer is assumed to be due only to the pressure gradient. The numerical method captures the basic behaviour of the hydrothermal processes, qualitatively and quantitatively, under the different hydrometeorological conditions. PMID:24991644
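The tridiagonal matrix (Thomas) algorithm used in the second stage solves systems with sub-, main- and super-diagonals in O(n) by one forward-elimination sweep and one back-substitution sweep. A minimal sketch (illustrative, not the authors' implementation):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, main diagonal b,
    super-diagonal c and right-hand side d (all length n; a[0] and
    c[-1] are unused). Classic Thomas algorithm: forward elimination
    followed by back substitution."""
    n = len(d)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

In the Fourier/Thomas combination described in the abstract, a solver like this would be applied once per transformed Fourier mode, since each mode reduces the 3-D Poisson problem to an independent tridiagonal system.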
Suárez-Pellicioni, M; Núñez-Peña, M I; Colomé, A
2013-12-01
This study uses event-related brain potentials to investigate the difficulties that high math-anxious individuals face when processing dramatically incorrect solutions to simple arithmetic problems. To this end, thirteen high math-anxious (HMA) and thirteen low math-anxious (LMA) individuals were presented with simple addition problems in a verification task. The proposed solution could be correct, incorrect but very close to the correct one (small-split), or dramatically incorrect (large-split). The two groups did not differ in mathematical ability or trait anxiety. We reproduced previous results for flawed scores, suggesting HMA difficulties in processing large-split solutions. Moreover, large-split solutions elicited a late positive component (P600/P3b) that was more enhanced and delayed in the HMA group. Our study proposes that the pattern of flawed scores found in previous studies (and replicated here) is related to HMA individuals' difficulties in inhibiting extended processing of irrelevant information (large-split solutions).
Detecting jaundice by using digital image processing
NASA Astrophysics Data System (ADS)
Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.
2014-03-01
When strong jaundice is present, newborns or adults must undergo clinical tests such as serum bilirubin measurement, which can be traumatic for patients. Jaundice often accompanies liver diseases such as hepatitis or liver cancer. To avoid additional trauma, we propose detecting jaundice (icterus) in newborns and adults with a painless method. By acquiring digital color images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as parameters that characterize patients with and without jaundice, and we correlate these parameters with bilirubin levels. Applying a support vector machine, we distinguish between healthy and sick patients.
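The classification step can be illustrated with a minimal linear SVM trained by Pegasos-style sub-gradient descent on per-patient mean RGB features. This is only a sketch under assumed synthetic data: the feature values, labels, hyperparameters and function names below are illustrative, not the authors' measurements or code.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=2000, seed=0):
    """Minimal linear SVM via Pegasos-style sub-gradient descent.
    X: (n, d) feature matrix (e.g. mean R, G, B of a skin patch),
    y: labels in {-1, +1} (e.g. +1 = jaundice, -1 = healthy).
    The bias term is updated alongside w (a common, unregularized
    heuristic rather than the strict Pegasos formulation)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)       # decaying step size
            margin = y[i] * (X[i] @ w + b)
            w *= (1 - eta * lam)        # shrink from the regularizer
            if margin < 1:              # hinge-loss sub-gradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

def predict(w, b, X):
    """Sign of the decision function: +1 (jaundice) or -1 (healthy)."""
    return np.where(X @ w + b >= 0, 1, -1)
```

With yellowish (high red/green, low blue) patches labeled +1 and normal skin tones labeled -1, a linear separator in RGB space is easy to find; in practice the paper's spectral features would replace these toy RGB triplets.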
An Image Processing Approach to Linguistic Translation
NASA Astrophysics Data System (ADS)
Kubatur, Shruthi; Sreehari, Suhas; Hegde, Rajeshwari
2011-12-01
The art of translation is as old as written literature. Developments since the Industrial Revolution have influenced the practice of translation, nurturing schools, professional associations, and standards. In this paper, we propose a method for translating typed Kannada text (taken as an image) into its equivalent English text. The National Instruments (NI) Vision Assistant (version 8.5) has been used for Optical Character Recognition (OCR). We developed a new way of transliteration (which we call NIV transliteration) to simplify the training of characters. We also built a special type of dictionary for the purpose of translation.
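The dictionary stage of such a pipeline can be sketched as a word-by-word lookup mapping transliterated tokens to English, with unknown tokens passed through unchanged. The romanized Kannada entries below are illustrative examples, not the paper's actual NIV transliterations or dictionary:

```python
def translate(text, dictionary):
    """Word-by-word lookup translation: each transliterated token is
    replaced by its dictionary entry; tokens without an entry are
    passed through unchanged so the output stays readable."""
    return " ".join(dictionary.get(tok, tok) for tok in text.split())
```

A real system would insert this step after OCR and transliteration, and would still need reordering and morphology handling on top of plain lookup, which is why the paper treats the dictionary as only one component of the translation method.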