Optic flow detection is not influenced by visual-vestibular congruency.
Holten, Vivian; MacNeilage, Paul R
2018-01-01
Optic flow patterns generated by self-motion relative to the stationary environment result in congruent visual-vestibular self-motion signals. Incongruent signals can arise from object motion, vestibular dysfunction, or artificial stimulation, but these situations are less common. Hence, we are predominantly exposed to congruent rather than incongruent visual-vestibular stimulation. If the brain takes advantage of this probabilistic association, we expect observers to be more sensitive to visual optic flow that is congruent with ongoing vestibular stimulation. We tested this expectation by measuring the motion coherence threshold, which is the percentage of signal versus noise dots necessary to detect an optic flow pattern. Observers seated on a hexapod motion platform in front of a screen experienced two sequential intervals. One interval contained optic flow with a given motion coherence and the other contained noise dots only. Observers had to indicate which interval contained the optic flow pattern. The motion coherence threshold was measured for detection of laminar and radial optic flow during leftward/rightward and fore/aft linear self-motion, respectively. We observed no dependence of coherence thresholds on vestibular congruency for either radial or laminar optic flow. Prior studies using similar methods reported both decreases and increases in coherence thresholds in response to congruent vestibular stimulation; our results do not confirm either of these prior reports. While methodological differences may explain the diversity of results, another possibility is that motion coherence thresholds are mediated by neural populations that are either not modulated by vestibular stimulation or that are modulated in a manner that does not depend on congruency.
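For readers unfamiliar with the stimulus, the coherence manipulation can be sketched as follows; this is a generic illustration with assumed dot counts and speeds, not the authors' display code:

```python
import numpy as np

def coherence_stimulus_step(positions, coherence, signal_dir=0.0, speed=1.0, rng=None):
    """Advance a random-dot field one frame: a `coherence` fraction of dots
    (signal dots) steps in direction `signal_dir` (radians); the remaining
    noise dots step in random directions at the same speed."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(positions)
    n_signal = int(round(coherence * n))
    idx = rng.permutation(n)                      # which dots carry the signal
    dirs = rng.uniform(0.0, 2.0 * np.pi, size=n)  # noise directions
    dirs[idx[:n_signal]] = signal_dir             # overwrite signal dots
    step = speed * np.stack([np.cos(dirs), np.sin(dirs)], axis=1)
    return positions + step
```

At threshold, `coherence` is the smallest fraction at which observers can reliably tell this display from one with `coherence=0`.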
Stereoscopic advantages for vection induced by radial, circular, and spiral optic flows.
Palmisano, Stephen; Summersby, Stephanie; Davies, Rodney G; Kim, Juno
2016-11-01
Although observer motions project different patterns of optic flow to our left and right eyes, there has been surprisingly little research into potential stereoscopic contributions to self-motion perception. This study investigated whether visually induced illusory self-motion (i.e., vection) is influenced by the addition of consistent stereoscopic information to radial, circular, and spiral (i.e., combined radial + circular) patterns of optic flow. Stereoscopic vection advantages were found for radial and spiral (but not circular) flows when monocular motion signals were strong. Under these conditions, stereoscopic benefits were greater for spiral flow than for radial flow. These effects can be explained by differences in the motion aftereffects generated by these displays, which suggest that the circular motion component in spiral flow selectively reduced adaptation to stereoscopic motion-in-depth. Stereoscopic vection advantages were not observed for circular flow when monocular motion signals were strong, but emerged when monocular motion signals were weakened. These findings show that stereoscopic information can contribute to visual self-motion perception in multiple ways.
Breaking camouflage and detecting targets require optic flow and image structure information.
Pan, Jing Samantha; Bingham, Ned; Chen, Chang; Bingham, Geoffrey P
2017-08-01
Use of motion to break camouflage extends back to the Cambrian [In the Blink of an Eye: How Vision Sparked the Big Bang of Evolution (New York: Basic Books, 2003)]. We investigated the ability to break camouflage and continue to see camouflaged targets after motion stops, an ability that is crucial for the survival of hunting predators. With camouflage, visual targets and distracters cannot be distinguished using only static image structure (i.e., appearance). Motion generates another source of optical information, optic flow, which breaks camouflage and specifies target locations. Optic flow calibrates image structure with respect to spatial relations among targets and distracters, and calibrated image structure makes previously camouflaged targets perceptible in a temporally stable fashion after motion stops. We investigated this proposal in laboratory experiments that compared how many camouflaged targets were identified with optic flow information alone versus with combined optic flow and image structure information. Our results show that the combination of motion-generated optic flow and target-projected image structure information yielded efficient and stable perception of camouflaged targets.
Optical Flow Estimation for Flame Detection in Videos
Mueller, Martin; Karasev, Peter; Kolesov, Ivan; Tannenbaum, Allen
2014-01-01
Computational vision-based flame detection has drawn significant attention in the past decade with camera surveillance systems becoming ubiquitous. Whereas many discriminating features, such as color, shape, and texture, have been employed in the literature, this paper proposes a set of motion features based on motion estimators. The key idea consists of exploiting the difference between the turbulent, fast motion of fire and the structured, rigid motion of other objects. Since classical optical flow methods do not model the characteristics of fire motion (e.g., non-smoothness of motion, non-constancy of intensity), two optical flow methods are specifically designed for the fire detection task: optimal mass transport models fire with dynamic texture, while a data-driven optical flow scheme models saturated flames. Characteristic features related to the flow magnitudes and directions are then computed from the flow fields to discriminate between fire and non-fire motion. The proposed features are tested on a large video database to demonstrate their practical usefulness. Moreover, a novel evaluation method based on fire simulations is proposed, allowing for a controlled environment in which to analyze parameter influences such as flame saturation, spatial resolution, frame rate, and random noise. PMID:23613042
Pavan, Andrea; Marotti, Rosilari Bellacosa; Mather, George
2013-05-31
Motion and form encoding are closely coupled in the visual system. A number of physiological studies have shown that neurons in the striate and extrastriate cortex (e.g., V1 and MT) are selective for motion direction parallel to their preferred orientation, but some neurons also respond to motion orthogonal to their preferred spatial orientation. Recent psychophysical research (Mather, Pavan, Bellacosa, & Casco, 2012) has demonstrated that the strength of adaptation to two fields of transparently moving dots is modulated by simultaneously presented orientation signals, suggesting that the interaction occurs at the level of motion integrating receptive fields in the extrastriate cortex. In the present psychophysical study, we investigated whether motion-form interactions take place at a higher level of neural processing where optic flow components are extracted. In Experiment 1, we measured the duration of the motion aftereffect (MAE) generated by contracting or expanding dot fields in the presence of either radial (parallel) or concentric (orthogonal) counterphase pedestal gratings. To tap the stage at which optic flow is extracted, we measured the duration of the phantom MAE (Weisstein, Maguire, & Berbaum, 1977) in which we adapted and tested different parts of the visual field, with orientation signals presented either in the adapting (Experiment 2) or nonadapting (Experiments 3 and 4) sectors. Overall, the results showed that motion adaptation is suppressed most by orientation signals orthogonal to optic flow direction, suggesting that motion-form interactions also take place at the global motion level where optic flow is extracted.
Agathos, Catherine P; Bernardin, Delphine; Baranton, Konogan; Assaiante, Christine; Isableu, Brice
2017-04-07
Optic flow provides visual self-motion information and has been shown to modulate gait and provoke postural reactions. We have previously reported an increased reliance on the visual, as opposed to the somatosensory-based egocentric, frame of reference (FoR) for spatial orientation with age. In this study, we evaluated FoR reliance for self-motion perception with respect to the ground surface. We examined how effects of ground optic flow direction on posture may be enhanced by intermittent podal contact with the ground, reliance on the visual FoR, and aging. Young, middle-aged and old adults stood quietly (QS) or stepped in place (SIP) for 30 s under static stimulation, approaching and receding optic flow on the ground, and a control condition. We calculated center of pressure (COP) translation, and optic flow sensitivity was defined as the ratio of COP translation velocity over absolute optic flow velocity: the visual self-motion quotient (VSQ). COP translation was more influenced by receding flow during QS and by approaching flow during SIP. In addition, old adults drifted forward while SIP without any imposed visual stimulation. Approaching flow limited this natural drift and receding flow enhanced it, as indicated by the VSQ. The VSQ appears to be a motor index of reliance on the visual FoR during SIP and is associated with greater reliance on the visual and reduced reliance on the egocentric FoR. Exploitation of the egocentric FoR for self-motion perception with respect to the ground surface is compromised by age and associated with greater sensitivity to optic flow.
Control of a Quadcopter Aerial Robot Using Optic Flow Sensing
NASA Astrophysics Data System (ADS)
Hurd, Michael Brandon
This thesis focuses on the motion control of a custom-built quadcopter aerial robot using optic flow sensing. Optic flow sensing is a vision-based approach that can give a robot the ability to fly in global positioning system (GPS)-denied environments, such as indoor environments. In this work, optic flow sensors are used to stabilize the motion of the quadcopter robot: an optic flow algorithm provides odometry measurements to the quadcopter's central processing unit to monitor the flight heading. The optic flow sensor and algorithm are capable of gathering and processing images at 250 frames/sec, and the sensor package weighs 2.5 g with a footprint of 6 cm2. The odometry value from the optic flow sensor is then used as feedback in a simple proportional-integral-derivative (PID) controller on the quadcopter. Experimental results are presented to demonstrate the effectiveness of using optic flow for controlling the motion of the quadcopter aerial robot. The technique presented herein can be applied to other types of aerial robotic systems or unmanned aerial vehicles (UAVs), as well as unmanned ground vehicles (UGVs).
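The control loop described, optic-flow odometry fed into a PID controller, can be sketched generically. The gains, the `hold_position` wrapper, and the drift-nulling setpoint below are illustrative assumptions, not values from the thesis:

```python
class PID:
    """Minimal PID controller; gains here are illustrative only."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def hold_position(flow_displacements, pid):
    """Hold station: integrate per-frame optic-flow displacements into a
    drift estimate and command the controller to drive it back to zero."""
    position = 0.0
    commands = []
    for d in flow_displacements:          # per-frame displacement from the sensor
        position += d
        commands.append(pid.update(0.0 - position))  # setpoint = no drift
    return commands
```

At the reported 250 frames/sec, `dt` would be 0.004 s per update.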
Improved optical flow motion estimation for digital image stabilization
NASA Astrophysics Data System (ADS)
Lai, Lijun; Xu, Zhiyong; Zhang, Xuyao
2015-11-01
Optical flow is the instantaneous motion vector at each pixel of an image frame at a given time instant. The gradient-based approach to optical flow computation does not work well when the inter-frame motion is too large. To alleviate this problem, we incorporate the algorithm into a pyramid-based, multi-resolution, coarse-to-fine search strategy: a pyramid decomposition yields multi-resolution images; an iterative relationship from the highest (coarsest) level down to the lowest (finest) level yields the inter-frame affine parameters; and subsequent frames are compensated back to the first frame to obtain a stabilized sequence. The experimental results demonstrate that the proposed method performs well in global motion estimation.
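The pyramid coarse-to-fine strategy can be sketched as follows. Here `estimate_at_level` stands in for whatever per-level gradient-based estimator is used, and the 2x block-averaging pyramid is an assumed simplification:

```python
import numpy as np

def downsample(img):
    """Halve resolution by 2x2 block averaging (one pyramid step)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def build_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(downsample(pyr[-1]))
    return pyr  # pyr[0] finest, pyr[-1] coarsest

def coarse_to_fine(estimate_at_level, levels):
    """Propagate a motion estimate from coarsest to finest level:
    the coarser estimate is doubled (pixels are twice as large one
    level up) and refined by the residual estimated at each level."""
    motion = np.zeros(2)
    for level in reversed(range(levels)):
        if level < levels - 1:
            motion = 2.0 * motion
        motion += estimate_at_level(level, motion)  # residual at this level
    return motion
```

A large displacement that defeats the gradient method at full resolution shrinks by a factor of 2 per level, so it becomes small enough to estimate at the coarsest level.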
NASA Astrophysics Data System (ADS)
Khalifa, Intissar; Ejbali, Ridha; Zaied, Mourad
2018-04-01
To survive the competition, companies always seek the best employees. Selection depends on the answers to the interviewer's questions and on the behavior of the candidate during the interview session. The study of this behavior is typically based on a psychological analysis of the movements accompanying the answers and discussions. Few techniques have been proposed to date for automatically analyzing a candidate's non-verbal behavior. This paper is part of a work-psychology recognition system; it concentrates on spontaneous hand gestures, which according to psychologists are very significant in interviews. We propose a motion history representation of the hand based on a hybrid approach that merges optical flow and history motion images. The optical flow technique is first used to detect hand motions in each frame of a video sequence. Second, we use history motion images (HMI) to accumulate the output of the optical flow in order to obtain a good representation of the hand's local movement in a global temporal template.
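The HMI accumulation step has the flavor of the standard motion-history-image update; a minimal sketch, assuming a linear decay of one unit per frame (the actual decay schedule is not specified in the abstract):

```python
import numpy as np

def update_mhi(mhi, motion_mask, duration=30.0):
    """Motion-history update: pixels that moved in the current frame are set
    to `duration`; all others decay by one per frame toward zero, so recent
    motion leaves a fading temporal trail in a single template image."""
    return np.where(motion_mask, duration, np.maximum(mhi - 1.0, 0.0))
```

Applied frame by frame to a thresholded optical flow magnitude, this turns per-frame hand motions into one global temporal template.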
Effects of background stimulation upon eye-movement information.
Nakamura, S
1996-04-01
To investigate the effects of background stimulation upon eye-movement information (EMI), the perceived deceleration of target motion during pursuit eye movement (the Aubert-Fleischl paradox) was analyzed. In the experiment, a striped pattern with various brightness contrasts and spatial frequencies was used as the background stimulus, allowing its attributes to be manipulated systematically. Analysis showed that the retinal-image motion of the background stimulus (optic flow) affected eye-movement information and that the effects of optic flow became stronger when high-contrast, low-spatial-frequency stripes were presented as the background stimulus. In conclusion, optic flow is one source of eye-movement information in determining real object motion, and its effectiveness depends on the attributes of the background stimulus.
Dokka, Kalpana; DeAngelis, Gregory C.
2015-01-01
Humans and animals are fairly accurate in judging their direction of self-motion (i.e., heading) from optic flow when moving through a stationary environment. However, an object moving independently in the world alters the optic flow field and may bias heading perception if the visual system cannot dissociate object motion from self-motion. We investigated whether adding vestibular self-motion signals to optic flow enhances the accuracy of heading judgments in the presence of a moving object. Macaque monkeys were trained to report their heading (leftward or rightward relative to straight ahead) when self-motion was specified by vestibular, visual, or combined visual-vestibular signals, while viewing a display in which an object moved independently in the (virtual) world. The moving object induced significant biases in perceived heading when self-motion was signaled by either visual or vestibular cues alone. However, this bias was greatly reduced when visual and vestibular cues together signaled self-motion. In addition, multisensory heading discrimination thresholds measured in the presence of a moving object were largely consistent with the predictions of an optimal cue integration strategy. These findings demonstrate that multisensory cues facilitate the perceptual dissociation of self-motion and object motion, consistent with computational work suggesting that an appropriate decoding of multisensory visual-vestibular neurons can estimate heading while discounting the effects of object motion. SIGNIFICANCE STATEMENT Objects that move independently in the world alter the optic flow field and can induce errors in perceiving the direction of self-motion (heading). We show that adding vestibular (inertial) self-motion signals to optic flow almost completely eliminates the errors in perceived heading induced by an independently moving object. Furthermore, this increased accuracy occurs without a substantial loss in precision.
Our results thus demonstrate that vestibular signals play a critical role in dissociating self-motion from object motion. PMID:26446214
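The optimal cue integration prediction mentioned above is, in the standard maximum-likelihood model, a reliability-weighted average of the single-cue estimates; a minimal sketch (the model is standard, but its use here as a stand-in for the paper's exact analysis is an assumption):

```python
def integrate_cues(mu_vis, var_vis, mu_vest, var_vest):
    """Maximum-likelihood combination of two independent Gaussian cues.
    Each cue is weighted by its inverse variance (reliability); the
    combined variance is smaller than either cue's variance alone."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_vest)
    w_vest = 1.0 - w_vis
    mu = w_vis * mu_vis + w_vest * mu_vest
    var = (var_vis * var_vest) / (var_vis + var_vest)
    return mu, var
```

The "largely consistent thresholds" finding corresponds to measured combined thresholds tracking the predicted `var`.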
Quantitative Analysis of Intracellular Motility Based on Optical Flow Model
Li, Heng
2017-01-01
Analysis of cell mobility is a key issue for abnormality identification and classification in cell biology research. However, since cell deformation induced by various biological processes is random and cell protrusion is irregular, it is difficult to measure cell morphology and motility in microscopic images. To address this dilemma, we propose an improved variational optical flow model for quantitative analysis of intracellular motility, which not only extracts intracellular motion fields effectively but also handles the optical flow computation problem at the cell border by taking advantage of formulations based on the L1 and L2 norms, respectively. In the energy functional of our proposed optical flow model, the data term is in the form of the L2 norm; the smoothness term adapts to regional features through an adaptive parameter, using the L1 norm near the edge of the cell and the L2 norm away from the edge. We further extract histograms of oriented optical flow (HOOF) after the optical flow field of intracellular motion is computed. Distances between different HOOFs are then calculated as intracellular motion features to grade the intracellular motion. Experimental results show that the features extracted from HOOFs provide new insights into the relationship between cell motility and particular pathological conditions. PMID:29065574
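A simplified version of the HOOF descriptor can be sketched as a magnitude-weighted orientation histogram; note that the published HOOF additionally folds bins symmetrically about the vertical axis, which this sketch omits:

```python
import numpy as np

def hoof(flow_u, flow_v, n_bins=8):
    """Magnitude-weighted histogram of optical flow orientations,
    normalized to sum to 1 (a simplified variant of HOOF)."""
    mag = np.hypot(flow_u, flow_v)
    ang = np.arctan2(flow_v, flow_u)                      # in (-pi, pi]
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), mag.ravel())            # accumulate magnitudes
    total = hist.sum()
    return hist / total if total > 0 else hist
```

Distances between such normalized histograms (e.g., chi-square or earth mover's distance) then serve as the motion features described above.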
Insect-Inspired Optical-Flow Navigation Sensors
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven
2005-01-01
Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ("optical mouse chips") are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: The concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motions in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speeds and altitudes in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
Robust optical flow using adaptive Lorentzian filter for image reconstruction under noisy condition
NASA Astrophysics Data System (ADS)
Kesrarat, Darun; Patanavijit, Vorapoj
2017-02-01
In optical flow for motion allocation, obtaining a reliable motion vector (MV) is an important issue, and noisy conditions can make the results of optical flow algorithms unreliable. We find that many classical optical flow algorithms perform better under noisy conditions when combined with a modern optimization model. This paper introduces robust optical flow models that apply an adaptive Lorentzian-norm influence function to simple spatial-temporal optical flow algorithms. Experiments on the proposed models confirm better noise tolerance in the optical flow's MVs under noisy conditions when the models are applied over simple spatial-temporal optical flow algorithms as a filtering model in a simple frame-to-frame correlation technique. We illustrate the performance of our models in experiments on several typical sequences with different foreground and background movement speeds, where the test sequences are contaminated by additive white Gaussian noise (AWGN) at different noise levels in decibels (dB). The results show the high noise tolerance of the proposed models, as indicated by peak signal-to-noise ratio (PSNR).
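The Lorentzian norm and its influence function, which down-weight outlier residuals, can be written directly. Here `sigma` is a scale parameter to be tuned; this is the standard robust-statistics form, not necessarily the paper's exact parameterization:

```python
import numpy as np

def lorentzian_rho(x, sigma):
    """Lorentzian robust penalty: grows like x^2 for small residuals but
    only logarithmically for large ones, limiting the cost of outliers."""
    return np.log1p(0.5 * (x / sigma) ** 2)

def lorentzian_influence(x, sigma):
    """Influence function (derivative of rho): redescends toward zero for
    large residuals, so gross errors barely pull on the estimate."""
    return x / (sigma ** 2 + 0.5 * x ** 2)
```

By contrast, the quadratic penalty used in non-robust formulations has influence proportional to `x`, so a single noisy pixel can dominate the motion vector.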
Optic flow-based collision-free strategies: From insects to robots.
Serres, Julien R; Ruffier, Franck
2017-09-01
Flying insects are able to fly smartly in an unpredictable environment. It has been found that flying insects have smart neurons inside their tiny brains that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff or landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw, respectively (i.e., they cancel rotational optic flow), ensuring purely translational optic flow between two successive saccades. Our survey focuses on the feedback loops using translational optic flow that insects employ for collision-free navigation. Optic flow is likely, over the next decade, to be one of the most important visual cues for explaining flying insects' behaviors during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
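The speed/distance ratio property of translational optic flow can be made concrete; a minimal sketch of the flow magnitude at eccentricity theta from the direction of travel:

```python
import math

def translational_flow(speed, distance, eccentricity_deg):
    """Magnitude of translational optic flow (rad/s) for a point at angle
    `eccentricity_deg` from the direction of travel:
        omega = (v / d) * sin(theta)
    It depends only on the speed/distance ratio, not on either alone."""
    theta = math.radians(eccentricity_deg)
    return (speed / distance) * math.sin(theta)
```

This scale ambiguity is exactly why optic flow regulation yields collision-free behavior without rangefinders: keeping lateral flow constant automatically couples speed to clearance.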
NASA Astrophysics Data System (ADS)
Petrou, Zisis I.; Xian, Yang; Tian, YingLi
2018-04-01
Estimation of sea ice motion at fine scales is important for a number of regional and local level applications, including modeling of sea ice distribution, ocean-atmosphere and climate dynamics, as well as safe navigation and sea operations. In this study, we propose an optical flow and super-resolution approach to accurately estimate motion from remote sensing images at a higher spatial resolution than the original data. First, an external example learning-based super-resolution method is applied to the original images to generate higher resolution versions. Then, an optical flow approach is applied to the higher resolution images, identifying sparse correspondences and interpolating them to extract a dense motion vector field with continuous values and subpixel accuracy. Our proposed approach is successfully evaluated on passive microwave, optical, and Synthetic Aperture Radar data, proving appropriate for multi-sensor applications and different spatial resolutions. The approach estimates motion with accuracy similar to or higher than that obtained from the original data, while increasing the spatial resolution by up to eight times. In addition, the adopted optical flow component outperforms a state-of-the-art pattern matching method. Overall, the proposed approach yields accurate motion vectors with unprecedented spatial resolutions of up to 1.5 km for passive microwave data covering the entire Arctic and 20 m for radar data, and proves promising for numerous scientific and operational applications.
Local statistics of retinal optic flow for self-motion through natural sceneries.
Calow, Dirk; Lappe, Markus
2007-12-01
Image analysis in the visual system is well adapted to the statistics of natural scenes. Investigations of natural image statistics have so far mainly focused on static features. The present study is dedicated to the measurement and the analysis of the statistics of optic flow generated on the retina during locomotion through natural environments. Natural locomotion includes bouncing and swaying of the head and eye movement reflexes that stabilize gaze onto interesting objects in the scene while walking. We investigate the dependencies of the local statistics of optic flow on the depth structure of the natural environment and on the ego-motion parameters. To measure these dependencies we estimate the mutual information between correlated data sets. We analyze the results with respect to the variation of the dependencies over the visual field, since the visual motions in the optic flow vary depending on visual field position. We find that retinal flow direction and retinal speed show only minor statistical interdependencies. Retinal speed is statistically tightly connected to the depth structure of the scene. Retinal flow direction is statistically mostly driven by the relation between the direction of gaze and the direction of ego-motion. These dependencies differ at different visual field positions such that certain areas of the visual field provide more information about ego-motion and other areas provide more information about depth. The statistical properties of natural optic flow may be used to tune the performance of artificial vision systems based on human imitating behavior, and may be useful for analyzing properties of natural vision systems.
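Mutual information between paired flow quantities (e.g., retinal speed and scene depth) can be estimated crudely from a joint histogram; this plug-in estimator is a simplification of whatever estimator the authors used:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Mutual information (in bits) between two samples, estimated from a
    joint histogram. A crude plug-in estimator: biased for small samples,
    but enough to show how dependencies between flow variables are scored."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    nz = pxy > 0                                 # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

A value near zero indicates near-independence (as reported for flow direction vs. speed), while large values indicate tight statistical coupling (as reported for speed vs. depth).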
The role of optical flow in automated quality assessment of full-motion video
NASA Astrophysics Data System (ADS)
Harguess, Josh; Shafer, Scott; Marez, Diego
2017-09-01
In real-world video data, such as full-motion video (FMV) taken from unmanned vehicles, surveillance systems, and other sources, various corruptions of the raw data are inevitable. These can be due to the image acquisition process, noise, distortion, and compression artifacts, among other sources of error. However, we desire methods to analyze the quality of the video to determine whether, and to what extent, the underlying content of the corrupted video can be analyzed by humans or machines. Previous approaches have shown that motion estimation, or optical flow, can be an important cue in automating this video quality assessment. However, there are many different optical flow algorithms in the literature, each with its own advantages and disadvantages. We examine the effect of the choice of optical flow algorithm (including baseline and state-of-the-art methods) on motion-based automated video quality assessment algorithms.
Motion estimation under location uncertainty for turbulent fluid flows
NASA Astrophysics Data System (ADS)
Cai, Shengze; Mémin, Etienne; Dérian, Pierre; Xu, Chao
2018-01-01
In this paper, we propose a novel optical flow formulation for estimating two-dimensional velocity fields from an image sequence depicting the evolution of a passive scalar transported by a fluid flow. This motion estimator relies on a stochastic representation of the flow allowing to incorporate naturally a notion of uncertainty in the flow measurement. In this context, the Eulerian fluid flow velocity field is decomposed into two components: a large-scale motion field and a small-scale uncertainty component. We define the small-scale component as a random field. Subsequently, the data term of the optical flow formulation is based on a stochastic transport equation, derived from the formalism under location uncertainty proposed in Mémin (Geophys Astrophys Fluid Dyn 108(2):119-146, 2014) and Resseguier et al. (Geophys Astrophys Fluid Dyn 111(3):149-176, 2017a). In addition, a specific regularization term built from the assumption of constant kinetic energy involves the very same diffusion tensor as the one appearing in the data transport term. Opposite to the classical motion estimators, this enables us to devise an optical flow method dedicated to fluid flows in which the regularization parameter has now a clear physical interpretation and can be easily estimated. Experimental evaluations are presented on both synthetic and real world image sequences. Results and comparisons indicate very good performance of the proposed formulation for turbulent flow motion estimation.
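For contrast, the classical deterministic formulation that such estimators generalize is the Horn-Schunck energy, in which the regularization weight alpha is a free tuning parameter without physical meaning, which is the shortcoming the uncertainty-based formulation addresses:

```latex
E(\mathbf{w}) = \int_{\Omega} \big( \nabla I \cdot \mathbf{w} + \partial_t I \big)^2 \, d\mathbf{x}
\;+\; \alpha \int_{\Omega} \big( \lVert \nabla u \rVert^2 + \lVert \nabla v \rVert^2 \big) \, d\mathbf{x},
\qquad \mathbf{w} = (u, v)^{\top}
```

The first integral is the brightness-constancy data term, the second the smoothness regularizer; in the stochastic formulation above, both the data (transport) term and the regularizer instead share the same physically interpretable diffusion tensor.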
NASA Astrophysics Data System (ADS)
Huebner, Claudia S.
2016-10-01
As a consequence of fluctuations in the index of refraction of the air, atmospheric turbulence causes scintillation, spatial and temporal blurring as well as global and local image motion creating geometric distortions. To mitigate these effects many different methods have been proposed. Global as well as local motion compensation in some form or other constitutes an integral part of many software-based approaches. For the estimation of motion vectors between consecutive frames simple methods like block matching are preferable to more complex algorithms like optical flow, at least when challenged with near real-time requirements. However, the processing power of commercially available computers continues to increase rapidly and the more powerful optical flow methods have the potential to outperform standard block matching methods. Therefore, in this paper three standard optical flow algorithms, namely Horn-Schunck (HS), Lucas-Kanade (LK) and Farnebäck (FB), are tested for their suitability to be employed for local motion compensation as part of a turbulence mitigation system. Their qualitative performance is evaluated and compared with that of three standard block matching methods, namely Exhaustive Search (ES), Adaptive Rood Pattern Search (ARPS) and Correlation based Search (CS).
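Exhaustive-search block matching, the baseline against which the optical flow methods are compared, can be sketched in a few lines; the block size and search radius are illustrative choices:

```python
import numpy as np

def block_match(ref, cur, top, left, block=8, search=4):
    """Exhaustive search (ES): find the displacement (dy, dx) within
    +/-`search` pixels that minimizes the sum of absolute differences
    between a reference block and the current frame."""
    tmpl = ref[top:top + block, left:left + block]
    best, best_dydx = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > cur.shape[0] or x + block > cur.shape[1]:
                continue  # candidate window falls outside the frame
            sad = np.abs(cur[y:y + block, x:x + block] - tmpl).sum()
            if best is None or sad < best:
                best, best_dydx = sad, (dy, dx)
    return best_dydx
```

Faster variants such as ARPS prune this search pattern, while optical flow methods replace it with dense, subpixel estimates at a higher computational cost.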
Control of self-motion in dynamic fluids: fish do it differently from bees.
Scholtyssek, Christine; Dacke, Marie; Kröger, Ronald; Baird, Emily
2014-05-01
To detect and avoid collisions, animals need to perceive and control the distance and the speed with which they are moving relative to obstacles. This is especially challenging for swimming and flying animals that must control movement in a dynamic fluid without reference from physical contact with the ground. Flying animals primarily rely on optic flow to control flight speed and distance to obstacles. Here, we investigate whether swimming animals use self-motion control strategies similar to those of flying animals by directly comparing the trajectories of zebrafish (Danio rerio) and bumblebees (Bombus terrestris) moving through the same experimental tunnel. While the animals moved through the tunnel, black-and-white patterns produced (i) strong horizontal optic flow cues on both walls, (ii) weak horizontal optic flow cues on both walls, or (iii) strong optic flow cues on one wall and weak optic flow cues on the other. We find that the mean speed of zebrafish does not depend on the amount of optic flow perceived from the walls. We further show that zebrafish, unlike bumblebees, move closer to the wall that provides the strongest visual feedback. This unexpected preference for strong optic flow cues may reflect an adaptation for self-motion control in water or in environments where visibility is limited.
Bertrand, Olivier J. N.; Lindemann, Jens P.; Egelhaaf, Martin
2015-01-01
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, when searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of optic flow on a spherical eye experienced during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and, thus, do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors providing its input. Even then, the algorithm successfully avoided collisions and, in addition, replicated the characteristics of collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments.
The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects. PMID:26583771
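The depth cue exploited by such models follows from the geometry of translational optic flow: for pure translation at speed v, a point at distance d and angle theta from the heading direction induces flow of magnitude (v/d)·sin(theta), so relative nearness (1/d) can be read off the flow field up to the unknown speed. A minimal sketch of this relation (the function name and interface are ours, not the authors' model):

```python
import numpy as np

def relative_nearness(flow_mag, theta, v=1.0):
    """Relative nearness (1/distance, up to the unknown speed v) from
    translational optic flow, using |flow| = (v / d) * sin(theta), where
    theta is the angle between viewing direction and heading. Directions
    at or near the pole of expansion (sin(theta) ~ 0) carry no depth
    information and are returned as 0."""
    sin_t = np.sin(theta)
    return np.where(sin_t > 1e-6,
                    flow_mag / (v * np.maximum(sin_t, 1e-6)),
                    0.0)
```

A collision-avoidance rule can then steer away from the viewing directions with the highest nearness values.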
Grouping of optic flow stimuli during binocular rivalry is driven by monocular information.
Holten, Vivian; Stuit, Sjoerd M; Verstraten, Frans A J; van der Smagt, Maarten J
2016-10-01
During binocular rivalry, perception alternates between two dissimilar images, presented dichoptically. Although binocular rivalry is thought to result from competition at a local level, neighboring image parts with similar features tend to be perceived together for longer durations than image parts with dissimilar features. This simultaneous dominance of two image parts is called grouping during rivalry. Previous studies have shown that this grouping depends on a shared eye-of-origin to a much larger extent than on image content, irrespective of the complexity of a static image. In the current study, we examine whether grouping of dynamic optic flow patterns is also primarily driven by monocular (eye-of-origin) information. In addition, we examine whether image parameters, such as optic flow direction, and partial versus full visibility of the optic flow pattern, affect grouping durations during rivalry. The results show that grouping of optic flow is, as is known for static images, primarily affected by its eye-of-origin. Furthermore, global motion can affect grouping durations, but only under specific conditions, namely when the two full optic flow patterns were presented locally. These results suggest that grouping during rivalry is primarily driven by monocular information even for motion stimuli thought to rely on higher-level motion areas. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimal Filter Estimation for Lucas-Kanade Optical Flow
Sharmin, Nusrat; Brad, Remus
2012-01-01
Optical flow algorithms offer a way to estimate motion from a sequence of images. The computation of optical flow plays a key role in several computer vision applications, including motion detection and segmentation, frame interpolation, three-dimensional scene reconstruction, robot navigation and video compression. In the case of gradient based optical flow implementation, the pre-filtering step plays a vital role, not only for accurate computation of optical flow, but also for the improvement of performance. Generally, in optical flow computation, filtering is used at the initial level on original input images and afterwards, the images are resized. In this paper, we propose an image filtering approach as a pre-processing step for the Lucas-Kanade pyramidal optical flow algorithm. Based on a study of different types of filtering methods applied to the Iterative Refined Lucas-Kanade, we identified the best filtering practice. Having selected the Gaussian smoothing filter, we introduced an empirical approach for estimating the Gaussian variance. Testing on the Middlebury image sequences established a correlation between the image intensity value and the standard deviation of the Gaussian function. Finally, we found that our selection method offers better performance for the Lucas-Kanade optical flow algorithm.
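The pipeline the paper tunes, Gaussian pre-smoothing followed by a Lucas-Kanade least-squares solve, can be illustrated with a single-scale, single-window sketch. This is a simplified stand-in for the iterative, pyramidal algorithm actually studied; the function names and defaults are ours:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D Gaussian kernel, truncated at ~3 sigma and normalised to sum 1."""
    radius = radius or int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def smooth(img, sigma):
    """Separable Gaussian pre-filtering: the step whose sigma is tuned."""
    k = gaussian_kernel(sigma)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, img)

def lucas_kanade(im1, im2, y, x, win=7, sigma=1.0):
    """Single-window Lucas-Kanade: solve Ix*u + Iy*v = -It by least squares
    over a (2*win+1)^2 neighbourhood centred on (y, x)."""
    im1, im2 = smooth(im1, sigma), smooth(im2, sigma)
    Iy, Ix = np.gradient(im1)        # np.gradient returns (d/dy, d/dx)
    It = im2 - im1
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v                      # horizontal (x) and vertical (y) flow
```

Because the gradients are computed from the smoothed images, the choice of sigma directly trades noise suppression against blurring of fine motion structure, which is why an image-dependent sigma can help.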
Variational optical flow estimation for images with spectral and photometric sensor diversity
NASA Astrophysics Data System (ADS)
Bengtsson, Tomas; McKelvey, Tomas; Lindström, Konstantin
2015-03-01
Motion estimation of objects in image sequences is an essential computer vision task. To this end, optical flow methods compute pixel-level motion, with the purpose of providing low-level input to higher-level algorithms and applications. Robust flow estimation is crucial for the success of applications, which in turn depends on the quality of the captured image data. This work explores the use of sensor diversity in the image data within a framework for variational optical flow. In particular, a custom image sensor setup intended for vehicle applications is tested. Experimental results demonstrate the improved flow estimation performance when IR sensitivity or flash illumination is added to the system.
Tuning self-motion perception in virtual reality with visual illusions.
Bruder, Gerd; Steinicke, Frank; Wieland, Phil; Lappe, Markus
2012-07-01
Motion perception in immersive virtual environments significantly differs from the real world. For example, previous work has shown that users tend to underestimate travel distances in virtual environments (VEs). As a solution to this problem, researchers proposed to scale the mapped virtual camera motion relative to the tracked real-world movement of a user until real and virtual motion are perceived as equal, i.e., real-world movements could be mapped with a larger gain to the VE in order to compensate for the underestimation. However, introducing discrepancies between real and virtual motion can become a problem, in particular, due to misalignments of both worlds and distorted space cognition. In this paper, we describe a different approach that introduces apparent self-motion illusions by manipulating optic flow fields during movements in VEs. These manipulations can affect self-motion perception in VEs, but omit a quantitative discrepancy between real and virtual motions. In particular, we consider to which regions of the virtual view these apparent self-motion illusions can be applied, i.e., the ground plane or peripheral vision. Therefore, we introduce four illusions and show in experiments that optic flow manipulation can significantly affect users' self-motion judgments. Furthermore, we show that with such manipulations of optic flow fields the underestimation of travel distances can be compensated.
Saunders, Jeffrey A.
2014-01-01
Direction of self-motion during walking is indicated by multiple cues, including optic flow, nonvisual sensory cues, and motor prediction. I measured the reliability of perceived heading from visual and nonvisual cues during walking, and whether cues are weighted in an optimal manner. I used a heading alignment task to measure perceived heading during walking. Observers walked toward a target in a virtual environment with and without global optic flow. The target was simulated to be infinitely far away, so that it did not provide direct feedback about direction of self-motion. Variability in heading direction was low even without optic flow, with average RMS error of 2.4°. Global optic flow reduced variability to 1.9°–2.1°, depending on the structure of the environment. The small amount of variance reduction was consistent with optimal use of visual information. The relative contribution of visual and nonvisual information was also measured using cue conflict conditions. Optic flow specified a conflicting heading direction (±5°), and bias in walking direction was used to infer relative weighting. Visual feedback influenced heading direction by 16%–34% depending on scene structure, with more effect with dense motion parallax. The weighting of visual feedback was close to the predictions of an optimal integration model given the observed variability measures. PMID:24648194
Wada, Atsushi; Sakano, Yuichi; Ando, Hiroshi
2016-01-01
Vision is important for estimating self-motion, which is thought to involve optic-flow processing. Here, we investigated the fMRI response profiles in visual area V6, the precuneus motion area (PcM), and the cingulate sulcus visual area (CSv)—three medial brain regions recently shown to be sensitive to optic-flow. We used wide-view stereoscopic stimulation to induce robust self-motion processing. Stimuli included static, randomly moving, and coherently moving dots (simulating forward self-motion). We varied the stimulus size and the presence of stereoscopic information. A combination of univariate and multi-voxel pattern analyses (MVPA) revealed that fMRI responses in the three regions differed from each other. The univariate analysis identified optic-flow selectivity and an effect of stimulus size in V6, PcM, and CSv, among which only CSv showed a significantly lower response to random motion stimuli compared with static conditions. Furthermore, MVPA revealed an optic-flow specific multi-voxel pattern in the PcM and CSv, where the discrimination of coherent motion from both random motion and static conditions showed above-chance prediction accuracy, but that of random motion from static conditions did not. Additionally, while area V6 successfully classified different stimulus sizes regardless of motion pattern, this classification was only partial in PcM and was absent in CSv. This may reflect the known retinotopic representation in V6 and the absence of such clear visuospatial representation in CSv. We also found significant correlations between the strength of subjective self-motion and univariate activation in all examined regions except for primary visual cortex (V1). This neuro-perceptual correlation was significantly higher for V6, PcM, and CSv when compared with V1, and higher for CSv when compared with the visual motion area hMT+. 
Our convergent results suggest the significant involvement of CSv in self-motion processing, which may give rise to its percept. PMID:26973588
Selectivity to Translational Egomotion in Human Brain Motion Areas
Pitzalis, Sabrina; Sdoia, Stefano; Bultrini, Alessandro; Committeri, Giorgia; Di Russo, Francesco; Fattori, Patrizia; Galletti, Claudio; Galati, Gaspare
2013-01-01
The optic flow generated when a person moves through the environment can be locally decomposed into several basic components, including radial, circular, translational and spiral motion. Since their analysis plays an important part in the visual perception and control of locomotion and posture it is likely that some brain regions in the primate dorsal visual pathway are specialized to distinguish among them. The aim of this study is to explore the sensitivity to different types of egomotion-compatible visual stimulations in the human motion-sensitive regions of the brain. Event-related fMRI experiments, 3D motion and wide-field stimulation, functional localizers and brain mapping methods were used to study the sensitivity of six distinct motion areas (V6, MT, MST+, V3A, CSv and an Intra-Parietal Sulcus motion [IPSmot] region) to different types of optic flow stimuli. Results show that only areas V6, MST+ and IPSmot are specialized in distinguishing among the various types of flow patterns, with a high response for the translational flow which was maximum in V6 and IPSmot and less marked in MST+. Given that during egomotion the translational optic flow conveys differential information about the near and far external objects, areas V6 and IPSmot likely process visual egomotion signals to extract information about the relative distance of objects with respect to the observer. Since area V6 is also involved in distinguishing object-motion from self-motion, it could provide information about location in space of moving and static objects during self-motion, particularly in a dynamically unstable environment. PMID:23577096
Accuracy and Tuning of Flow Parsing for Visual Perception of Object Motion During Self-Motion
Niehorster, Diederick C.
2017-01-01
How do we perceive object motion during self-motion using visual information alone? Previous studies have reported that the visual system can use optic flow to identify and globally subtract the retinal motion component resulting from self-motion to recover scene-relative object motion, a process called flow parsing. In this article, we developed a retinal motion nulling method to directly measure and quantify the magnitude of flow parsing (i.e., flow parsing gain) in various scenarios to examine the accuracy and tuning of flow parsing for the visual perception of object motion during self-motion. We found that flow parsing gains were below unity for all displays in all experiments; and that increasing self-motion and object motion speed did not alter flow parsing gain. We conclude that visual information alone is not sufficient for the accurate perception of scene-relative motion during self-motion. Although flow parsing performs global subtraction, its accuracy also depends on local motion information in the retinal vicinity of the moving object. Furthermore, the flow parsing gain was constant across common self-motion or object motion speeds. These results can be used to inform and validate computational models of flow parsing. PMID:28567272
Acoustic facilitation of object movement detection during self-motion
Calabro, F. J.; Soto-Faraco, S.; Vaina, L. M.
2011-01-01
In humans, as well as most animal species, perception of object motion is critical to successful interaction with the surrounding environment. Yet, as the observer also moves, the retinal projections of the various motion components add to each other and extracting accurate object motion becomes computationally challenging. Recent psychophysical studies have demonstrated that observers use a flow-parsing mechanism to estimate and subtract self-motion from the optic flow field. We investigated whether concurrent acoustic cues for motion can facilitate visual flow parsing, thereby enhancing the detection of moving objects during simulated self-motion. Participants identified an object (the target) that moved either forward or backward within a visual scene containing nine identical textured objects simulating forward observer translation. We found that spatially co-localized, directionally congruent, moving auditory stimuli enhanced object motion detection. Interestingly, subjects who performed poorly on the visual-only task benefited more from the addition of moving auditory stimuli. When auditory stimuli were not co-localized to the visual target, improvements in detection rates were weak. Taken together, these results suggest that parsing object motion from self-motion-induced optic flow can operate on multisensory object representations. PMID:21307050
Processing of Egomotion-Consistent Optic Flow in the Rhesus Macaque Cortex
Cottereau, Benoit R.; Smith, Andrew T.; Rima, Samy; Fize, Denis; Héjja-Brichard, Yseult; Renaud, Luc; Lejards, Camille; Vayssière, Nathalie; Trotter, Yves; Durand, Jean-Baptiste
2017-01-01
The cortical network that processes visual cues to self-motion was characterized with functional magnetic resonance imaging in 3 awake behaving macaques. The experimental protocol was similar to previous human studies in which the responses to a single large optic flow patch were contrasted with responses to an array of 9 similar flow patches. This distinguishes cortical regions where neurons respond to flow in their receptive fields regardless of surrounding motion from those that are sensitive to whether the overall image arises from self-motion. In all 3 animals, significant selectivity for egomotion-consistent flow was found in several areas previously associated with optic flow processing, notably the dorsal middle superior temporal area, the ventral intra-parietal area, and VPS. It was also seen in areas 7a (Opt), STPm, FEFsem, FEFsac, and in a region of the cingulate sulcus that may be homologous with human area CSv. Selectivity for egomotion-compatible flow was never total but was particularly strong in VPS and putative macaque CSv. Direct comparison of results with the equivalent human studies reveals several commonalities but also some differences. PMID:28108489
Stability in Young Infants' Discrimination of Optic Flow
ERIC Educational Resources Information Center
Gilmore, Rick O.; Baker, Thomas J.; Grobman, K. H.
2004-01-01
Although considerable progress has been made in understanding how adults perceive their direction of self-motion, or heading, from optic flow, little is known about how these perceptual processes develop in infants. In 3 experiments, the authors explored how well 3- to 6-month-old infants could discriminate between optic flow patterns that…
Local motion adaptation enhances the representation of spatial structure at EMD arrays
Lindemann, Jens P.; Egelhaaf, Martin
2017-01-01
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion even holds under the dynamic flight conditions of insects. PMID:29281631
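The correlation-type elementary motion detector underlying such model arrays is, in its classic non-adaptive form, the Hassenstein-Reichardt correlator: each half multiplies a delayed photoreceptor signal with the undelayed signal of its neighbour, and the difference of the two mirror-symmetric halves yields a direction-selective output. A minimal discrete-time sketch (not the adaptive model of the paper; names are ours):

```python
import numpy as np

def reichardt(signal_a, signal_b, delay=1):
    """Correlation-type EMD (Hassenstein-Reichardt) on two photoreceptor
    time series. Each half-detector correlates the delayed signal of one
    input with the undelayed signal of the other; their difference is
    positive for motion from A towards B and negative for the reverse."""
    a = np.asarray(signal_a, dtype=float)
    b = np.asarray(signal_b, dtype=float)
    a_d = np.concatenate([np.zeros(delay), a[:-delay]])  # delayed copy of A
    b_d = np.concatenate([np.zeros(delay), b[:-delay]])  # delayed copy of B
    return a_d * b - a * b_d
```

As the abstract notes, this multiplication-based scheme responds to contrast and texture as well as velocity, which is exactly why the detector output is not a veridical velocity measurement.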
Blindsight modulation of motion perception.
Intriligator, James M; Xie, Ruiman; Barton, Jason J S
2002-11-15
Monkey data suggest that of all perceptual abilities, motion perception is the most likely to survive striate damage. The results of studies on motion blindsight in humans, though, are mixed. We used an indirect strategy to examine how responses to visible stimuli were modulated by blind-field stimuli. In a 26-year-old man with focal striate lesions, discrimination of visible optic flow was enhanced about 7% by blind-field flow, even though discrimination of optic flow in the blind field alone (the direct strategy) was at chance. Pursuit of an imagined target using peripheral cues showed reduced variance but not increased gain with blind-field cues. Preceding blind-field prompts shortened reaction times to visible targets by about 10 msec, but there was no attentional crowding of visible stimuli by blind-field distractors. A similar efficacy of indirect blind-field optic flow modulation was found in a second patient with residual vision after focal striate damage, but not in a third with more extensive medial occipito-temporal damage. We conclude that indirect modulatory strategies are more effective than direct forced-choice methods at revealing residual motion perception after focal striate lesions.
Do kinematic metrics of walking balance adapt to perturbed optical flow?
Thompson, Jessica D; Franz, Jason R
2017-08-01
Visual (i.e., optical flow) perturbations can be used to study balance control and balance deficits. However, it remains unclear whether walking balance control adapts to such perturbations over time. Our purpose was to investigate the propensity for visuomotor adaptation in walking balance control using prolonged exposure to optical flow perturbations. Ten subjects (age: 25.4 ± 3.8 years) walked on a treadmill while watching a speed-matched virtual hallway with and without continuous mediolateral optical flow perturbations of three different amplitudes. Each of the three perturbation trials consisted of 8 min of prolonged exposure followed by 1 min of unperturbed walking. Using 3D motion capture, we analyzed changes in foot placement kinematics and mediolateral sacrum motion. At their onset, perturbations elicited wider and shorter steps, suggesting a more cautious, general anticipatory balance control strategy. As perturbations continued, foot placement tended toward values seen during unperturbed walking, while step width variability and mediolateral sacrum motion concurrently increased. Our findings suggest that subjects progressively shifted from a general anticipatory balance control strategy to a reactive, task-specific strategy using step-to-step adjustments. Prolonged exposure to optical flow perturbations may have clinical utility to reinforce reactive, task-specific balance control through training. Copyright © 2017 Elsevier B.V. All rights reserved.
Vision System Measures Motions of Robot and External Objects
NASA Technical Reports Server (NTRS)
Talukder, Ashit; Matthies, Larry
2008-01-01
A prototype of an advanced robotic vision system both (1) measures its own motion with respect to a stationary background and (2) detects other moving objects and estimates their motions, all by use of visual cues. Like some prior robotic and other optoelectronic vision systems, this system is based partly on concepts of optical flow and visual odometry. Whereas prior optoelectronic visual-odometry systems have been limited to frame rates of no more than 1 Hz, a visual-odometry subsystem that is part of this system operates at a frame rate of 60 to 200 Hz, given optical-flow estimates. The overall system operates at an effective frame rate of 12 Hz. Moreover, unlike prior machine-vision systems for detecting motions of external objects, this system need not remain stationary: it can detect such motions while it is moving (even vibrating). The system includes a stereoscopic pair of cameras mounted on a moving robot. The outputs of the cameras are digitized, then processed to extract positions and velocities. The initial image-data-processing functions of this system are the same as those of some prior systems: Stereoscopy is used to compute three-dimensional (3D) positions for all pixels in the camera images. For each pixel of each image, optical flow between successive image frames is used to compute the two-dimensional (2D) apparent relative translational motion of the point transverse to the line of sight of the camera. The challenge in designing this system was to provide for utilization of the 3D information from stereoscopy in conjunction with the 2D information from optical flow to distinguish between motion of the camera pair and motions of external objects, compute the motion of the camera pair in all six degrees of translational and rotational freedom, and robustly estimate the motions of external objects, all in real time. 
To meet this challenge, the system is designed to perform the following image-data-processing functions: The visual-odometry subsystem (the subsystem that estimates the motion of the camera pair relative to the stationary background) utilizes the 3D information from stereoscopy and the 2D information from optical flow. It computes the relationship between the 3D and 2D motions and uses a least-mean-squares technique to estimate motion parameters. The least-mean-squares technique is suitable for real-time implementation when the number of external-moving-object pixels is smaller than the number of stationary-background pixels.
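The least-mean-squares step can be illustrated with a simplified, noiseless sketch: assuming a static background and small camera motion, a translation t and rotation omega relate the 3-D background points P to their observed displacements by dP ≈ -(t + omega × P), which is linear in the six motion parameters. The formulation below works purely in 3-D for clarity and is ours, not the system's actual combined 3-D/2-D pipeline:

```python
import numpy as np

def skew(p):
    """Cross-product matrix: skew(p) @ q == p x q."""
    x, y, z = p
    return np.array([[0.0, -z,  y],
                     [z,  0.0, -x],
                     [-y,  x, 0.0]])

def estimate_egomotion(P, dP):
    """Least-squares estimate of small camera motion (t, omega) from static
    background points P (N,3) and their observed displacements dP (N,3).
    Model: dP = -(t + omega x P) = -t + skew(P) @ omega, stacked into one
    linear system in the 6 unknowns [t; omega]."""
    A = np.vstack([np.hstack([-np.eye(3), skew(p)]) for p in P])
    b = dP.ravel()
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3], sol[3:]   # translation, rotation rate
```

With many more background pixels than moving-object pixels, outlier points on independently moving objects perturb the least-squares fit only slightly, which is the property the real-time design exploits.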
Cerebral palsy characterization by estimating ocular motion
NASA Astrophysics Data System (ADS)
González, Jully; Atehortúa, Angélica; Moncayo, Ricardo; Romero, Eduardo
2017-11-01
Cerebral palsy (CP) is a large group of motion and posture disorders caused during the fetal or infant brain development. Sensorial impairment is commonly found in children with CP, i.e., between 40-75 percent presents some form of vision problems or disabilities. An automatic characterization of the cerebral palsy is herein presented by estimating the ocular motion during a gaze pursuing task. Specifically, After automatically detecting the eye location, an optical flow algorithm tracks the eye motion following a pre-established visual assignment. Subsequently, the optical flow trajectories are characterized in the velocity-acceleration phase plane. Differences are quantified in a small set of patients between four to ten years.
Brightness-compensated 3-D optical flow algorithm for monitoring cochlear motion patterns
NASA Astrophysics Data System (ADS)
von Tiedemann, Miriam; Fridberger, Anders; Ulfendahl, Mats; de Monvel, Jacques Boutet
2010-09-01
A method for three-dimensional motion analysis designed for live cell imaging by fluorescence confocal microscopy is described. The approach is based on optical flow computation and takes into account brightness variations in the image scene that are not due to motion, such as photobleaching or fluorescence variations that may reflect changes in cellular physiology. The 3-D optical flow algorithm allowed almost perfect motion estimation on noise-free artificial sequences, and performed with a relative error of <10% on noisy images typical of real experiments. The method was applied to a series of 3-D confocal image stacks from an in vitro preparation of the guinea pig cochlea. The complex motions caused by slow pressure changes in the cochlear compartments were quantified. At the surface of the hearing organ, the largest motion component was the transverse one (normal to the surface), but significant radial and longitudinal displacements were also present. The outer hair cell displayed larger radial motion at their basolateral membrane than at their apical surface. These movements reflect mechanical interactions between different cellular structures, which may be important for communicating sound-evoked vibrations to the sensory cells. A better understanding of these interactions is important for testing realistic models of cochlear mechanics.
NASA Astrophysics Data System (ADS)
Lannutti, E.; Lenzano, M. G.; Toth, C.; Lenzano, L.; Rivera, A.
2016-06-01
In this work, we assessed the feasibility of using optical flow to estimate the motion of a glacier. Conventional approaches to detecting glacier changes require repeated observations, often based on extensive field work. Since glaciers are usually located in geographically complex and hard-to-access areas, optical flow applied to time-lapse imagery may provide an efficient solution with good spatial and temporal resolution for describing mass motion. Several studies in the computer vision and image processing communities have used this method to detect large displacements. We therefore tested the proposed Large Displacement Optical Flow method at the Viedma Glacier, located in the South Patagonia Icefield, Argentina. We collected monoscopic terrestrial time-lapse imagery, acquired by a calibrated camera every 24 hours from April 2014 until April 2015. A filter based on temporal correlation and RGB color discretization between the images was applied to minimize errors related to changes in lighting, shadows, clouds, and snow; this selection allowed discarding images that break the sequence of similarity. Our results show a flow field in the direction of the glacier movement, with acceleration at the terminus. We analyzed the errors between image pairs, and the matching generally appears adequate, although some areas show random gross errors related to changes in lighting. The proposed technique allowed the determination of glacier motion over one year, providing accurate and reliable motion data for subsequent analysis.
Experience affects the use of ego-motion signals during 3D shape perception.
Jain, Anshul; Backus, Benjamin T
2010-12-29
Experience has long-term effects on perceptual appearance (Q. Haijiang, J. A. Saunders, R. W. Stone, & B. T. Backus, 2006). We asked whether experience affects the appearance of structure-from-motion stimuli when the optic flow is caused by observer ego-motion. Optic flow is an ambiguous depth cue: a rotating object and its oppositely rotating, depth-inverted dual generate similar flow. However, the visual system exploits ego-motion signals to prefer the percept of an object that is stationary over one that rotates (M. Wexler, F. Panerai, I. Lamouret, & J. Droulez, 2001). We replicated this finding and asked whether this preference for stationarity, the "stationarity prior," is modulated by experience. During training, two groups of observers were exposed to objects with identical flow, but that were either stationary or moving as determined by other cues. The training caused identical test stimuli to be seen preferentially as stationary or moving by the two groups, respectively. We then asked whether different priors can exist independently at different locations in the visual field. Observers were trained to see objects either as stationary or as moving at two different locations. Observers' stationarity bias at the two respective locations was modulated in the directions consistent with training. Thus, the utilization of extraretinal ego-motion signals for disambiguating optic flow signals can be updated as the result of experience, consistent with the updating of a Bayesian prior for stationarity.
Scale Changes Provide an Alternative Cue For the Discrimination of Heading, But Not Object Motion
Calabro, Finnegan J.; Vaina, Lucia Maria
2016-01-01
Background Understanding the dynamics of our surrounding environments is a task usually attributed to the detection of motion based on changes in luminance across space. Yet a number of other cues, both dynamic and static, have been shown to provide useful information about how we are moving and how objects around us move. One such cue, based on changes in spatial frequency, or scale, over time has been shown to be useful in conveying motion in depth even in the absence of a coherent, motion-defined flow field (optic flow). Material/Methods 16 right-handed healthy observers (ages 18–28) participated in the behavioral experiments described in this study. Using analytical behavioral methods we investigated the functional specificity of this cue by measuring the ability of observers to perform tasks of heading (direction of self-motion) and 3D trajectory discrimination on the basis of scale changes and optic flow. Results Statistical analyses of performance on the test experiments in comparison to the control experiments suggest that while scale changes may be involved in the detection of heading, they are not correctly integrated with translational motion and, thus, do not support correct discrimination of 3D object trajectories. Conclusions These results have important implications for the type of visually guided navigation that can be performed by an observer blind to optic flow. Scale change is an important alternative cue for self-motion. PMID:27231114
Vaina, Lucia M.; Buonanno, Ferdinando; Rushton, Simon K.
2014-01-01
Background All contemporary models of perception of locomotor heading from optic flow (the characteristic patterns of retinal motion that result from self-movement) begin with relative motion. Therefore it would be expected that an impairment of the perception of relative motion should impact the ability to judge heading and other 3D motion tasks. Material/Methods We report two patients with occipital lobe lesions whom we tested on a battery of motion tasks. Patients were impaired on all tests that involved relative motion in the plane (motion discontinuity, form from differences in motion direction or speed). Despite this they retained the ability to judge their direction of heading relative to a target. A potential confound is that observers can derive information about heading from scale changes, bypassing the need to use optic flow. Therefore we ran further experiments in which we isolated optic flow and scale change. Results Patients' performance was in normal ranges on both tests. The finding that the ability to perceive heading can be retained despite an impairment of the ability to judge relative motion questions the assumption that heading perception proceeds from initial processing of relative motion. Furthermore, on a collision detection task, SS and SR's performance was significantly better for simulated forward movement of the observer in the 3D scene than for the static observer. This suggests that in spite of severe deficits in relative motion in the frontoparallel (xy) plane, information from self-motion helped in identifying objects moving along an intercepting 3D relative-motion trajectory. Conclusions This result suggests a potential use of a flow-parsing strategy to detect the trajectory of moving objects in a 3D world when the observer is moving forward. These results have implications for developing rehabilitation strategies for deficits in visually guided navigation. PMID:25183375
Egelhaaf, Martin; Kern, Roland; Lindemann, Jens Peter
2014-01-01
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate with highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information that is contained in the retinal motion patterns induced on the eyes while moving around ("optic flow") to accomplish their extraordinary performance. To do so, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases in which the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of a biological motion detection mechanism. In contrast, we conclude from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.
Harbor seals (Phoca vitulina) can perceive optic flow under water.
Gläser, Nele; Mauck, Björn; Kandil, Farid I; Lappe, Markus; Dehnhardt, Guido; Hanke, Frederike D
2014-01-01
Optic flow, the pattern of apparent motion elicited on the retina during movement, has been demonstrated to be widely used by animals living in the aerial habitat, whereas underwater optic flow has not been intensively studied so far. However, optic flow would also provide aquatic animals with valuable information about their own movement relative to the environment, even under conditions in which vision is generally thought to be drastically impaired, e.g. in turbid waters. Here, we tested underwater optic flow perception for the first time in a semi-aquatic mammal, the harbor seal, by simulating a forward movement on a straight path through a cloud of dots on an underwater projection. The translatory motion pattern expanded radially out of a singular point along the direction of heading, the focus of expansion. We assessed the seal's accuracy in determining the simulated heading in a task in which the seal had to judge whether a cross superimposed on the flow field was deviating from or congruent with the actual focus of expansion. The seal perceived optic flow and determined deviations from the simulated heading with a threshold of 0.6 deg of visual angle. Optic flow is thus a source of information that seals, fish, and most likely aquatic species in general may rely on for, e.g., controlling locomotion and orientation under water. This suggests that optic flow is a tool universally used by any moving organism possessing eyes.
Whitney, Susan L.; Sparto, Patrick J.; Cook, James R.; Redfern, Mark S.; Furman, Joseph M.
2016-01-01
Introduction People with vestibular disorders often experience space and motion discomfort when exposed to moving or highly textured visual scenes. The purpose of this study was to measure the type and severity of symptoms in people with vestibular dysfunction during coordinated head and eye movements in optic flow environments. Methods Seven subjects with vestibular disorders and 25 controls viewed four different full-field optic flow environments on six different visits. The optic flow environments consisted of textures with various contrasts and spatial frequencies. Subjects performed 8 gaze movement tasks, including eye saccades, gaze saccades, and gaze stabilization tasks. Subjects reported symptoms using Subjective Units of Discomfort (SUD) and the Simulator Sickness Questionnaire (SSQ). Self-reported dizziness handicap and space and motion discomfort were also measured. Results/Conclusion Subjects with vestibular disorders had greater discomfort and experienced greater oculomotor and disorientation symptoms. The magnitude of the symptoms increased during each visit, but did not depend on the optic flow condition. Subjects who reported greater dizziness handicap and space motion discomfort had greater severity of symptoms during the experiment. Symptoms of fatigue, difficulty focusing, and dizziness during the experiment were evident. Compared with controls, subjects with vestibular disorders had less head movement during the gaze saccade tasks. Overall, performance of gaze pursuit and gaze stabilization tasks in moving visual environments elicited greater symptoms in subjects with vestibular disorders compared with healthy subjects. PMID:23549055
Beck, Cornelia; Ognibeni, Thilo; Neumann, Heiko
2008-01-01
Background Optic flow is an important cue for object detection. Humans are able to perceive objects in a scene using only kinetic boundaries, and can perform the task even when other shape cues are not provided. These kinetic boundaries are characterized by the presence of motion discontinuities in a local neighbourhood. In addition, temporal occlusions appear along the boundaries as the object in front covers the background and the objects that are spatially behind it. Methodology/Principal Findings From a technical point of view, the detection of motion boundaries for segmentation based on optic flow is a difficult task. This is due to the problem that flow detected along such boundaries is generally not reliable. We propose a model derived from mechanisms found in visual areas V1, MT, and MSTl of human and primate cortex that achieves robust detection along motion boundaries. It includes two separate mechanisms for both the detection of motion discontinuities and of occlusion regions based on how neurons respond to spatial and temporal contrast, respectively. The mechanisms are embedded in a biologically inspired architecture that integrates information of different model components of the visual processing due to feedback connections. In particular, mutual interactions between the detection of motion discontinuities and temporal occlusions allow a considerable improvement of the kinetic boundary detection. Conclusions/Significance A new model is proposed that uses optic flow cues to detect motion discontinuities and object occlusion. We suggest that by combining these results for motion discontinuities and object occlusion, object segmentation within the model can be improved. This idea could also be applied in other models for object segmentation. In addition, we discuss how this model is related to neurophysiological findings. The model was successfully tested both with artificial and real sequences including self and object motion. PMID:19043613
Representation of vestibular and visual cues to self-motion in ventral intraparietal (VIP) cortex
Chen, Aihua; Deangelis, Gregory C.; Angelaki, Dora E.
2011-01-01
Convergence of vestibular and visual motion information is important for self-motion perception. One cortical area that combines vestibular and optic flow signals is the ventral intraparietal area (VIP). We characterized unisensory and multisensory responses of macaque VIP neurons to translations and rotations in three dimensions. Approximately half of VIP cells show significant directional selectivity in response to optic flow, half show tuning to vestibular stimuli, and one-third show multisensory responses. Visual and vestibular direction preferences of multisensory VIP neurons could be congruent or opposite. When visual and vestibular stimuli were combined, VIP responses could be dominated by either input, unlike medial superior temporal area (MSTd) where optic flow tuning typically dominates or the visual posterior sylvian area (VPS) where vestibular tuning dominates. Optic flow selectivity in VIP was weaker than in MSTd but stronger than in VPS. In contrast, vestibular tuning for translation was strongest in VPS, intermediate in VIP, and weakest in MSTd. To characterize response dynamics, direction-time data were fit with a spatiotemporal model in which temporal responses were modeled as weighted sums of velocity, acceleration, and position components. Vestibular responses in VIP reflected balanced contributions of velocity and acceleration, whereas visual responses were dominated by velocity. Timing of vestibular responses in VIP was significantly faster than in MSTd, whereas timing of optic flow responses did not differ significantly among areas. These findings suggest that VIP may be proximal to MSTd in terms of vestibular processing but hierarchically similar to MSTd in terms of optic flow processing. PMID:21849564
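The spatiotemporal fitting described above, modeling temporal responses as weighted sums of velocity and acceleration components, can be illustrated with an ordinary least-squares sketch (the profiles and weights below are made up for illustration, not taken from the study):

```python
import numpy as np

# Hypothetical profiles: a Gaussian stimulus velocity and its time
# derivative (acceleration), mixed with made-up weights to form a "response".
t = np.linspace(0.0, 2.0, 200)
velocity = np.exp(-((t - 1.0) ** 2) / 0.1)
acceleration = np.gradient(velocity, t)

true_weights = np.array([0.7, 0.3])
response = true_weights[0] * velocity + true_weights[1] * acceleration

# Recover the weights by ordinary least squares.
X = np.column_stack([velocity, acceleration])
weights, *_ = np.linalg.lstsq(X, response, rcond=None)
print(np.round(weights, 3))  # -> [0.7 0.3]
```

In the noiseless case the recovery is exact; fitting real neural responses would of course involve noise and additional position and baseline terms.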
Manipulating the content of dynamic natural scenes to characterize response in human MT/MST.
Durant, Szonya; Wall, Matthew B; Zanker, Johannes M
2011-09-09
Optic flow is one of the most important sources of information for enabling human navigation through the world. A striking finding from single-cell studies in monkeys is the rapid saturation of response of MT/MST areas with the density of optic flow type motion information. These results are reflected psychophysically in human perception in the saturation of motion aftereffects. We began by comparing responses to natural optic flow scenes in human visual brain areas to responses to the same scenes with inverted contrast (photo negative). This changes scene familiarity while preserving local motion signals. This manipulation had no effect; however, the response was only correlated with the density of local motion (calculated by a motion correlation model) in V1, not in MT/MST. To further investigate this, we manipulated the visible proportion of natural dynamic scenes and found that areas MT and MST did not increase in response over a 16-fold increase in the amount of information presented, i.e., response had saturated. This makes sense in light of the sparseness of motion information in natural scenes, suggesting that the human brain is well adapted to exploit a small amount of dynamic signal and extract information important for survival.
Direct Estimation of Structure and Motion from Multiple Frames
1990-03-01
sequential frames in an image sequence. As a consequence, the information that can be extracted from a single optical flow field is limited to a snapshot of... researchers have developed techniques that extract motion and structure information without computation of the optical flow. Best known are the "direct... operated iteratively on a sequence of images to recover structure. It required feature extraction and matching. Broida and Chellappa [9] suggested the use of
Optical Flow in a Smart Sensor Based on Hybrid Analog-Digital Architecture
Guzmán, Pablo; Díaz, Javier; Agís, Rodrigo; Ros, Eduardo
2010-01-01
The purpose of this study is to develop a motion sensor (delivering optical flow estimations) using a platform that includes the sensor itself, focal-plane processing resources, and co-processing resources on a general-purpose embedded processor, all implemented on a single device as a SoC (System-on-a-Chip). Optical flow is the 2-D projection onto the camera plane of the 3-D motion present in the world scene. This motion representation is widely known and applied in the scientific community to solve a wide variety of problems. Most applications based on motion estimation must run in real time; hence, this restriction must be taken into account. In this paper, we show an efficient approach to estimate the motion velocity vectors with an architecture based on a focal-plane processor combined on-chip with a 32-bit NIOS II processor. Our approach relies on the simplification of the original optical flow model and its efficient implementation in a platform that combines an analog (focal-plane) and digital (NIOS II) processor. The system is fully functional and is organized in different stages, where the early processing (focal-plane) stage mainly pre-processes the input image stream to reduce the computational cost of the post-processing (NIOS II) stage. We present the employed co-design techniques and analyze this novel architecture. We evaluate the system's performance and accuracy with respect to the different approaches described in the literature. We also discuss the advantages of the proposed approach as well as the degree of efficiency that can be obtained from the focal-plane processing capabilities of the system. The final outcome is a low-cost smart sensor for optical flow computation with real-time performance and reduced power consumption that can be used in very diverse application domains. PMID:22319283
NASA Astrophysics Data System (ADS)
Yokoi, Naomichi; Aizu, Yoshihisa
2018-01-01
Optical trapping and guiding using laser have been proven to be useful for non-contact and non-invasive manipulation of small objects such as biological cells, organelles within cells, and dielectric particles. We have numerically investigated so far the motion of a Brownian particle suspended in still water under the illumination of a speckle pattern generated by the interference of coherent light scattered by a rough object. In the present study, we investigate numerically the motion of a particle in a water flow under the illumination of a speckle pattern that is at rest or in motion. Trajectory of the particle is simulated in relation with its size, flow velocity, maximum irradiance, and moving velocity of the speckle pattern to confirm the feasibility of the present method for performing optical trapping and guiding of the particle in the flow.
Kubo, Fumi; Hablitzel, Bastian; Dal Maschio, Marco; Driever, Wolfgang; Baier, Herwig; Arrenberg, Aristides B
2014-03-19
Animals respond to whole-field visual motion with compensatory eye and body movements in order to stabilize both their gaze and position with respect to their surroundings. In zebrafish, rotational stimuli need to be distinguished from translational stimuli to drive the optokinetic and the optomotor responses, respectively. Here, we systematically characterize the neural circuits responsible for these operations using a combination of optogenetic manipulation and in vivo calcium imaging during optic flow stimulation. By recording the activity of thousands of neurons within the area pretectalis (APT), we find four bilateral pairs of clusters that process horizontal whole-field motion and functionally classify eleven prominent neuron types with highly selective response profiles. APT neurons are prevalently direction selective, either monocularly or binocularly driven, and hierarchically organized to distinguish between rotational and translational optic flow. Our data predict a wiring diagram of a neural circuit tailored to drive behavior that compensates for self-motion.
Baumberger, Bernard; Isableu, Brice; Flückiger, Michelangelo
2004-11-01
The aim of this research was to analyse the development of postural reactions to approaching (AOF) and receding (ROF) ground rectilinear optical flows. Optical flows were shaped by a pattern of circular spots of light projected on the ground surface by a texture flow generator. The geometrical structure of the projected scenes corresponded to the spatial organisation of visual flows encountered in open outdoor settings. Postural readjustments of 56 children, ranging from 7 to 11 years old, and 12 adults were recorded by the changes of the centre of foot pressure (CoP) on a force platform during 44-s exposures to the moving texture. Before and after the optical flows exposure, a 24-s motionless texture served as a reference condition. Effect of ground rectilinear optical flows on postural control development was assessed by analysing sway latencies (SL), stability performances and postural orientation. The main results that emerge from this experiment show that postural responses are directionally specific to optical flow pattern and that they vary as a function of the motion onset and offset. Results showed that greater developmental changes in postural control occurred in an AOF (both at the onset and offset of the optical flow) than in an ROF. Onset of an approaching flow induced postural instability, canonical shifts in postural orientation and long latencies in children which were stronger than in the receding flow. This pattern of responses evolved with age towards an improvement in stability performances and shorter SL. The backward decreasing shift of the CoP in children evolved in adults towards forward postural tilt, i.e. in the opposite direction of the texture's motion. Offset of an AOF motion induced very short SL in children (which became longer in adult subjects), strong postural instability, but weaker shift of orientation compared to the receding one.
Postural stability improved and orientation shift evolved to forward inclinations with age. SL remained almost constant across ages at both onset and offset of the receding flow. Critical developmental periods seem to occur around the ages of 8 and 10 years, as suggested by the children's transient 'neglect' of optical flows. Linear vection was felt by 90% of the 7-year-olds and decreased with age to reach 55% in adult subjects. The mature sensorimotor coordination subserving the postural organisation shown in adult subjects exemplifies a strategy aimed at reducing the postural effects induced by optical flows. The data are discussed in relation to the perceptual importance of mobile visual references on a ground support.
DeLucia, Patricia R; Tharanathan, Anand
2009-12-01
More than 25% of accidents are rear-end collisions. It is essential to identify the factors that contribute to such collisions. One such factor is a driver's ability to respond to the deceleration of the car ahead. In Experiment 1, we measured effects of optic flow information and discrete visual and auditory warnings (brake lights, tones) on responses to deceleration during car following. With computer simulations of car-following scenes, university students pressed a button when the lead car decelerated. Both classes of information affected responses. Observers relied on discrete warnings when optic flow information was relatively less effective as determined by the lead car's headway and deceleration rate. This is consistent with DeLucia's (2008) conceptual framework of space perception that emphasized the importance of viewing distance and motion (and task). In Experiment 2, we measured responses to deceleration after a visual interruption. Scenes were designed to tease apart the role of expectations and optic flow. Responses mostly were consistent with optic flow information presented after the interruption rather than with putative mental expectations that were set up by the lead car's motion prior to the interruption. The theoretical implication of the present results is that responses to deceleration are based on multiple sources of information, including optical size, optical expansion rate and tau, and discrete warnings that are independent of optic flow. The practical implication is that in-vehicle collision-avoidance warning systems may be more useful when optic flow is less effective (e.g., slow deceleration rates), implicating a role for adaptive collision-warning systems.
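The optical variables named here (optical size, optical expansion rate, and tau) are linked by the classic time-to-contact approximation tau ≈ θ/θ̇. A small numeric sketch, with made-up gap and speed values:

```python
import math

def optical_angle(width, distance):
    """Visual angle (rad) subtended by an object of a given width."""
    return 2 * math.atan(width / (2 * distance))

# A 2 m wide lead vehicle, 20 m ahead, closing at a constant 4 m/s.
dt = 0.01
z0, closing_speed, width = 20.0, 4.0, 2.0
theta = optical_angle(width, z0)
theta_next = optical_angle(width, z0 - closing_speed * dt)
expansion_rate = (theta_next - theta) / dt
tau = theta / expansion_rate
print(tau)  # close to the true time-to-contact, 20 m / 4 m/s = 5 s
```

The point of tau is that this estimate is available directly from the optical variables, without the observer needing to know the distance or speed separately.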
Fast instantaneous center of rotation estimation algorithm for a skid-steered robot
NASA Astrophysics Data System (ADS)
Kniaz, V. V.
2015-05-01
Skid-steered robots are widely used as mobile platforms for machine vision systems. However, it is hard to achieve stable motion of such robots along a desired trajectory due to unpredictable wheel slip. It is possible to compensate for the unpredictable wheel slip and stabilize the motion of the robot using visual odometry. This paper presents a fast optical-flow-based algorithm for estimation of the instantaneous center of rotation, angular speed, and longitudinal speed of the robot. The proposed algorithm is based on the Horn-Schunck variational optical flow estimation method. The instantaneous center of rotation and the motion of the robot are estimated by back-projection of the optical flow field onto the ground surface. The developed algorithm was tested using a skid-steered mobile robot. The robot is based on a mobile platform that includes two pairs of differentially driven motors and a motor controller. A monocular visual odometry system consisting of a single-board computer and a low-cost webcam is mounted on the mobile platform. A state-space model of the robot was derived using standard black-box system identification. The input (commands) and the output (motion) were recorded using a dedicated external motion capture system. The obtained model was used to control the robot without visual odometry data. The paper concludes with an assessment of the algorithm's quality, comparing the trajectories estimated by the algorithm with the data from the motion capture system.
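The Horn-Schunck method named above can be sketched in a few lines of NumPy (a minimal, unoptimized version with periodic boundary handling; the parameter values and synthetic test pair are illustrative, not those of the paper):

```python
import numpy as np

def horn_schunck(im1, im2, alpha=0.5, n_iter=300):
    """Minimal Horn-Schunck optical flow: returns dense (u, v) fields."""
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    # Spatial derivatives averaged over the pair; temporal difference It.
    Ix = (np.gradient(im1, axis=1) + np.gradient(im2, axis=1)) / 2
    Iy = (np.gradient(im1, axis=0) + np.gradient(im2, axis=0)) / 2
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    denom = alpha ** 2 + Ix ** 2 + Iy ** 2
    for _ in range(n_iter):
        # 4-neighbour averages implement the smoothness term (periodic borders).
        u_avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4
        v_avg = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4
        p = (Ix * u_avg + Iy * v_avg + It) / denom
        u = u_avg - Ix * p
        v = v_avg - Iy * p
    return u, v

# Synthetic pair: a smooth bump translated one pixel to the right.
yy, xx = np.mgrid[0:32, 0:32]
im1 = np.exp(-((xx - 16.0) ** 2 + (yy - 16.0) ** 2) / 20.0)
im2 = np.roll(im1, 1, axis=1)
u, v = horn_schunck(im1, im2)
print(u.mean())  # positive: the recovered flow points in the shift direction
```

The smoothness weight alpha trades data fidelity against flow regularity; a robot application like the one above would additionally back-project the recovered field onto the ground plane using the camera's extrinsic calibration.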
NASA Astrophysics Data System (ADS)
Ohyama, R.; Inoue, K.; Chang, J. S.
2007-01-01
A flow pattern characterization of electrohydrodynamically (EHD) induced flow phenomena of a stratified dielectric fluid situated in an ac corona discharge field is conducted with a Schlieren optical system. A high-voltage application to a needle-plate electrode arrangement in the gas phase normally initiates a conductive-type EHD gas flow. Although the EHD gas flow motion initiated from the corona discharge electrode is well known as corona wind, no comprehensive study has been conducted on the EHD fluid flow motion of a stratified dielectric liquid exposed to gas-phase ac corona discharge. The experimentally observed result clearly presents the liquid-phase EHD flow phenomenon induced by the gas-phase EHD flow via interfacial momentum transfer. The flow phenomenon is also discussed in terms of the gas-phase EHD number under reduced gas pressure (reduced interfacial momentum transfer) conditions.
Rueckauer, Bodo; Delbruck, Tobi
2016-01-01
In this study we compare nine optical flow algorithms that locally measure the flow normal to edges, in terms of accuracy and computational cost. In contrast to conventional, frame-based motion flow algorithms, our open-source implementations compute optical flow based on address-events from a neuromorphic Dynamic Vision Sensor (DVS). For this benchmark we created a dataset of two synthesized and three real samples recorded from a 240 × 180 pixel Dynamic and Active-pixel Vision Sensor (DAVIS). This dataset contains events from the DVS as well as conventional frames to support testing state-of-the-art frame-based methods. We introduce a new source of ground truth: in the special case that the perceived motion stems solely from a rotation of the vision sensor around its three camera axes, the true optical flow can be estimated using gyro data from the inertial measurement unit integrated with the DAVIS camera. This provides a ground truth against which we can compare algorithms that estimate optical flow from motion cues. An analysis of error sources led to the use of a refractory period, more accurate numerical derivatives and a Savitzky-Golay filter to achieve significant improvements in accuracy. Our pure Java implementations of two recently published algorithms reduce computational cost by up to 29% compared to the original implementations. Two of the algorithms introduced in this paper further speed up processing by a factor of 10 compared with the original implementations, at equal or better accuracy. On a desktop PC, they run in real-time on dense natural input recorded by a DAVIS camera. PMID:27199639
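The gyro-based ground truth rests on the fact that, for pure camera rotation, image-plane flow is independent of scene depth. A sketch under one common pinhole sign convention (function name and convention are assumptions, not the paper's code):

```python
def rotational_flow(x, y, f, wx, wy, wz):
    """Image-plane flow (u, v) at pixel (x, y) induced by pure camera
    rotation (wx, wy, wz) rad/s, pinhole model with focal length f.
    One common sign convention; no depth term appears for pure rotation."""
    u = (x * y / f) * wx - (f + x * x / f) * wy + y * wz
    v = (f + y * y / f) * wx - (x * y / f) * wy - x * wz
    return u, v
```

At the principal point, yaw rotation wy produces a purely horizontal flow of magnitude f*wy, while roll wz produces no flow at all, which matches the rotational flow field's familiar structure.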
Photo-actuation of liquids for light-driven microfluidics: state of the art and perspectives.
Baigl, Damien
2012-10-07
Using light to control liquid motion is a new paradigm for the actuation of microfluidic systems. We review here the different principles and strategies to induce or control liquid motion using light, which includes the use of radiation pressure, optical tweezers, light-induced wettability gradients, the thermocapillary effect, photosensitive surfactants, the chromocapillary effect, optoelectrowetting, photocontrolled electroosmotic flows and optical dielectrophoresis. We analyze the performance of these approaches to control using light many kinds of microfluidic operations involving discrete pL- to μL-sized droplets (generation, driving, mixing, reaction, sorting) or fluid flows in microchannels (valve operation, injection, pumping, flow rate control). We show that a complete toolbox is now available to control microfluidic systems by light. We finally discuss the perspectives of digital optofluidics as well as microfluidics based on all optical fluidic chips and optically reconfigurable devices.
A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera.
Ci, Wenyan; Huang, Yingping
2016-10-17
Visual odometry estimates the ego-motion of an agent (e.g., a vehicle or robot) from image information and is a key component of autonomous vehicles and robotics. This paper proposes a robust and precise method for estimating 6-DoF ego-motion using a stereo rig with optical flow analysis. An objective function fitted to a set of feature points is created by establishing the mathematical relationship between optical flow, depth, and the camera ego-motion parameters through the camera's 3-dimensional motion and planar imaging model. The six motion parameters are then computed by minimizing the objective function with the iterative Levenberg-Marquardt method. One of the key points of visual odometry is that the feature points selected for the computation should contain as many inliers as possible. In this work, the feature points and their optical flows are initially detected using the Kanade-Lucas-Tomasi (KLT) algorithm. Circle matching is then applied to remove outliers caused by mismatches of the KLT algorithm, and a space-position constraint filters out moving points from the detected point set. The Random Sample Consensus (RANSAC) algorithm is employed to further refine the feature point set, i.e., to eliminate the effects of outliers. The remaining points are tracked to estimate the ego-motion parameters in subsequent frames. The approach presented here is tested on real traffic videos and the results demonstrate the robustness and precision of the method. PMID:27763508
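As a simplified illustration of how flow, depth, and ego-motion relate, the sketch below solves the translation-only case in closed form via linear least squares (the paper estimates all six parameters with Levenberg-Marquardt; the projection convention used here is an assumption):

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            k = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= k * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def estimate_translation(points, flows, depths, f=1.0):
    """Least-squares camera translation (tx, ty, tz) from optical flow and
    depth, assuming zero rotation and the (hypothetical) convention
    u = (x*tz - f*tx)/Z,  v = (y*tz - f*ty)/Z."""
    AtA = [[0.0] * 3 for _ in range(3)]
    Atb = [0.0] * 3
    for (x, y), (u, v), Z in zip(points, flows, depths):
        # Each point contributes one equation per flow component
        for row, obs in (([-f / Z, 0.0, x / Z], u), ([0.0, -f / Z, y / Z], v)):
            for i in range(3):
                Atb[i] += row[i] * obs
                for j in range(3):
                    AtA[i][j] += row[i] * row[j]
    return solve3(AtA, Atb)
```

Extending this to the full 6-DoF case makes the objective nonlinear in the rotation parameters, which is why iterative schemes such as Levenberg-Marquardt are used.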
A neural model of motion processing and visual navigation by cortical area MST.
Grossberg, S; Mingolla, E; Pack, C
1999-12-01
Cells in the dorsal medial superior temporal cortex (MSTd) process optic flow generated by self-motion during visually guided navigation. A neural model shows how interactions between well-known neural mechanisms (log polar cortical magnification, Gaussian motion-sensitive receptive fields, spatial pooling of motion-sensitive signals and subtractive extraretinal eye movement signals) lead to emergent properties that quantitatively simulate neurophysiological data about MSTd cell properties and psychophysical data about human navigation. Model cells match MSTd neuron responses to optic flow stimuli placed in different parts of the visual field, including position invariance, tuning curves, preferred spiral directions, direction reversals, average response curves and preferred locations for stimulus motion centers. The model shows how the preferred motion direction of the most active MSTd cells can explain human judgments of self-motion direction (heading), without using complex heading templates. The model explains when extraretinal eye movement signals are needed for accurate heading perception, and when retinal input is sufficient, and how heading judgments depend on scene layouts and rotation rates.
Perception of object trajectory: parsing retinal motion into self and object movement components.
Warren, Paul A; Rushton, Simon K
2007-08-16
A moving observer needs to be able to estimate the trajectory of other objects moving in the scene. Without the ability to do so, it would be difficult to avoid obstacles or catch a ball. We hypothesized that neural mechanisms sensitive to the patterns of motion generated on the retina during self-movement (optic flow) play a key role in this process, "parsing" motion due to self-movement from that due to object movement. We investigated this "flow parsing" hypothesis by measuring the perceived trajectory of a moving probe placed within a flow field that was consistent with movement of the observer. In the first experiment, the flow field was consistent with an eye rotation; in the second experiment, it was consistent with a lateral translation of the eyes. We manipulated the distance of the probe in both experiments and assessed the consequences. As predicted by the flow parsing hypothesis, manipulating the distance of the probe had differing effects on the perceived trajectory of the probe in the two experiments. The results were consistent with the scene geometry and the type of simulated self-movement. In a third experiment, we explored the contribution of local and global motion processing to the results of the first two experiments. The data suggest that the parsing process involves global motion processing, not just local motion contrast. The findings of this study support a role for optic flow processing in the perception of object movement during self-movement.
Coupling reconstruction and motion estimation for dynamic MRI through optical flow constraint
NASA Astrophysics Data System (ADS)
Zhao, Ningning; O'Connor, Daniel; Gu, Wenbo; Ruan, Dan; Basarab, Adrian; Sheng, Ke
2018-03-01
This paper addresses the problem of dynamic magnetic resonance image (DMRI) reconstruction and motion estimation jointly. Because of the inherent anatomical movements in DMRI acquisition, reconstruction of DMRI using motion estimation/compensation (ME/MC) has been explored under the compressed sensing (CS) scheme. In this paper, by embedding the intensity based optical flow (OF) constraint into the traditional CS scheme, we are able to couple the DMRI reconstruction and motion vector estimation. Moreover, the OF constraint is employed in a specific coarse resolution scale in order to reduce the computational complexity. The resulting optimization problem is then solved using a primal-dual algorithm due to its efficiency when dealing with nondifferentiable problems. Experiments on highly accelerated dynamic cardiac MRI with multiple receiver coils validate the performance of the proposed algorithm.
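The coupled objective described above might take a form like the following (notation illustrative, not the authors' exact formulation): a data-fidelity term over undersampled k-space measurements, a sparsity regularizer, and an optical-flow term linking consecutive frames through the motion field:

```latex
\min_{x,\,w}\ \sum_{t} \tfrac{1}{2}\,\| A_t x_t - y_t \|_2^2
\;+\; \lambda\, \mathcal{R}(x)
\;+\; \mu \sum_{t} \| (\nabla x_t) \cdot w_t + x_{t+1} - x_t \|_1
```

Here $A_t$ is the undersampled Fourier/coil-sensitivity operator for frame $t$, $y_t$ the measured k-space data, $\mathcal{R}$ a sparsity prior, $w_t$ the motion field, and the last term is the linearized brightness-constancy (optical flow) constraint; $\lambda$ and $\mu$ trade off the terms. Nondifferentiable terms of this kind are what motivate the primal-dual solver.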
Egelhaaf, Martin; Kern, Roland; Lindemann, Jens Peter
2014-01-01
Despite their miniature brains, insects such as flies, bees and wasps are able to navigate through cluttered environments using highly aerobatic flight maneuvers. They rely on the spatial information contained in the retinal motion patterns induced on the eyes while moving around ("optic flow") to accomplish this extraordinary performance. To do so, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases during which gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is contained only in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of optic flow vectors; their responses also reflect textural information about the environment. This characteristic has often been regarded as a limitation of biological motion detection mechanisms. In contrast, from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments, we conclude that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates, in a computationally parsimonious way, the environment into behaviorally relevant nearby objects and, in many behavioral contexts, less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism. PMID:25389392
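The texture dependence noted above is characteristic of correlation-type detectors. A minimal Hassenstein-Reichardt correlator (a standard model of such detectors, not the authors' specific one) illustrates the scheme:

```python
def reichardt(a, b, delay=1):
    """Hassenstein-Reichardt correlator over the signals of two neighboring
    photoreceptors a and b (lists of samples). Each arm multiplies one
    delayed input with the other undelayed input; the arms are subtracted,
    so positive output signals motion from a toward b."""
    out = []
    for t in range(delay, len(a)):
        out.append(a[t - delay] * b[t] - b[t - delay] * a[t])
    return out
```

Because the output is a product of input contrasts rather than a ratio of displacements, it scales with pattern contrast and spatial structure, which is exactly the non-veridical, texture-dependent behavior the abstract describes.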
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to the frame-wise acquisition of whole images by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods over plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
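The plane-fitting idea can be illustrated with an exact three-event fit: the events of a moving edge lie on a plane in (x, y, t), and the gradient of that plane encodes the normal flow (a minimal sketch; real implementations fit many events robustly):

```python
def normal_flow_from_events(e1, e2, e3):
    """Normal flow (vx, vy) from the plane t = a*x + b*y + c fitted exactly
    through three events (x, y, t) generated by a moving edge."""
    (x1, y1, t1), (x2, y2, t2), (x3, y3, t3) = e1, e2, e3
    # Cramer's rule for the plane coefficients a = dt/dx, b = dt/dy
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    a = ((t2 - t1) * (y3 - y1) - (t3 - t1) * (y2 - y1)) / det
    b = ((x2 - x1) * (t3 - t1) - (x3 - x1) * (t2 - t1)) / det
    # Velocity points along the time-surface gradient: v = grad(t) / |grad(t)|^2
    g2 = a * a + b * b
    return a / g2, b / g2
```

An edge sweeping rightward at 2 px per unit time produces events satisfying t = x/2, and the fit recovers (2, 0); only the component of motion normal to the edge is observable, which is the aperture problem the plane-fitting class inherits.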
A retinal code for motion along the gravitational and body axes
Sabbah, Shai; Gemmer, John A.; Bhatia-Lin, Ananya; Manoff, Gabrielle; Castro, Gabriel; Siegel, Jesse K.; Jeffery, Nathan; Berson, David M.
2017-01-01
Self-motion triggers complementary visual and vestibular reflexes supporting image-stabilization and balance. Translation through space produces one global pattern of retinal image motion (optic flow), rotation another. We show that each subtype of direction-selective ganglion cell (DSGC) adjusts its direction preference topographically to align with specific translatory optic flow fields, creating a neural ensemble tuned for a specific direction of motion through space. Four cardinal translatory directions are represented, aligned with two axes of high adaptive relevance: the body and gravitational axes. One subtype maximizes its output when the mouse advances, others when it retreats, rises, or falls. ON-DSGCs and ON-OFF-DSGCs share the same spatial geometry but weight the four channels differently. Each subtype ensemble is also tuned for rotation. The relative activation of DSGC channels uniquely encodes every translation and rotation. Though retinal and vestibular systems both encode translatory and rotatory self-motion, their coordinate systems differ. PMID:28607486
Modeling heading and path perception from optic flow in the case of independently moving objects
Raudies, Florian; Neumann, Heiko
2013-01-01
Humans are usually accurate when estimating heading or path from optic flow, even in the presence of independently moving objects (IMOs) in an otherwise rigid scene. To invoke significant biases in perceived heading, IMOs have to be large and obscure the focus of expansion (FOE) in the image plane, which is the point of approach. For the estimation of path during curvilinear self-motion no significant biases were found in the presence of IMOs. What makes humans robust in their estimation of heading or path using optic flow? We derive analytical models of optic flow for linear and curvilinear self-motion using geometric scene models. Heading biases of a linear least squares method, which builds upon these analytical models, are large, larger than those reported for humans. This motivated us to study segmentation cues that are available from optic flow. We derive models of accretion/deletion, expansion/contraction, acceleration/deceleration, local spatial curvature, and local temporal curvature, to be used as cues to segment an IMO from the background. Integrating these segmentation cues into our method of estimating heading or path now explains human psychophysical data and extends, as well as unifies, previous investigations. Our analysis suggests that various cues available from optic flow help to segment IMOs and, thus, make humans' heading and path perception robust in the presence of such IMOs. PMID:23554589
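For the linear self-motion case, heading estimation from optic flow reduces to locating the focus of expansion (FOE). A least-squares sketch under pure translation through a rigid scene (illustrative only; the authors' models also handle curvilinear paths, rotation, and IMO segmentation):

```python
def estimate_foe(points, flows):
    """Least-squares focus of expansion for pure translation: every flow
    vector (u, v) at (x, y) must be radial from the FOE, i.e.
    u*(y - yf) - v*(x - xf) = 0, which is linear in (xf, yf)."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (u, v) in zip(points, flows):
        # Normal equations for rows [v, -u] . (xf, yf) = v*x - u*y
        r = v * x - u * y
        a11 += v * v
        a12 += -v * u
        a22 += u * u
        b1 += v * r
        b2 += -u * r
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

An independently moving object violates the radial constraint and biases this estimate, which is why segmentation cues of the kind derived in the paper are needed before pooling flow vectors.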
NASA Astrophysics Data System (ADS)
Murillo, Sergio; Pattichis, Marios; Soliz, Peter; Barriga, Simon; Loizou, C. P.; Pattichis, C. S.
2010-03-01
Motion estimation from digital video is an ill-posed problem that requires a regularization approach. Regularization introduces a smoothness constraint that can reduce the resolution of the velocity estimates. The problem is further complicated for ultrasound (US) videos, where speckle noise levels can be significant. Motion estimation using optical flow models requires the tuning of several parameters governing the optical flow constraint as well as the level of imposed smoothness. Furthermore, except in simulations or mostly unrealistic cases, there is no ground truth available for validating the velocity estimates. This problem is present in all real video sequences used as input to motion estimation algorithms. It is also an open problem in biomedical applications such as motion analysis of US videos of carotid artery (CA) plaques. In this paper, we study the problem of obtaining reliable ultrasound video motion estimates for atherosclerotic plaques for use in clinical diagnosis. A global optimization framework for motion parameter optimization is presented. This framework uses actual carotid artery motions to provide optimal parameter values for a variety of motions and is tested on ten different US videos using two different motion estimation techniques.
Accurate motion parameter estimation for colonoscopy tracking using a regression method
NASA Astrophysics Data System (ADS)
Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.
2010-03-01
Co-located optical and virtual colonoscopy images have the potential to provide important clinical information during routine colonoscopy procedures. In our earlier work, we presented an optical flow based algorithm to compute egomotion from live colonoscopy video, permitting navigation and visualization of the corresponding patient anatomy. In the original algorithm, motion parameters were estimated using the traditional least sum of squares (LS) procedure, which can be unstable in the presence of optical flow vectors with large errors. In the improved algorithm, we use the Least Median of Squares (LMS) method, a robust regression method, for motion parameter estimation. Using the LMS method, we iteratively analyze and converge toward the main distribution of the flow vectors while disregarding outliers. We show through three experiments the improvement in tracking results obtained using the LMS method in comparison to the LS estimator. The first experiment demonstrates better spatial accuracy in positioning the virtual camera in the sigmoid colon. The second and third experiments demonstrate the robustness of this estimator, resulting in longer tracked sequences: from 300 to 1310 frames in the ascending colon, and from 410 to 1316 frames in the transverse colon.
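The LS-vs-LMS distinction can be illustrated on a toy line-fitting problem: LMS minimizes the median rather than the sum of squared residuals, so up to roughly half the points can be gross outliers without corrupting the fit (a generic LMS sketch, not the paper's motion-parameter code):

```python
import random

def lms_line(points, n_trials=200, seed=0):
    """Least-median-of-squares fit of y = m*x + c: repeatedly fit a line
    through two random points and keep the fit that minimizes the median
    squared residual over all points (robust to ~50% outliers)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # skip degenerate vertical pairs
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        res = sorted((y - (m * x + c)) ** 2 for x, y in points)
        med = res[len(res) // 2]
        if best is None or med < best[0]:
            best = (med, m, c)
    return best[1], best[2]
```

An ordinary least-squares fit on the same data would be dragged toward the outliers, mirroring how large-error flow vectors destabilized the original LS tracker.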
Rapid encoding of relationships between spatially remote motion signals.
Maruya, Kazushi; Holcombe, Alex O; Nishida, Shin'ya
2013-02-06
For visual processing, the temporal correlation of remote local motion signals is a strong cue to detect meaningful large-scale structures in the retinal image, because related points are likely to move together regardless of their spatial separation. While the processing of multi-element motion patterns involved in biological motion and optic flow has been studied intensively, the encoding of simpler pairwise relationships between remote motion signals remains poorly understood. We investigated this process by measuring the temporal rate limit for perceiving the relationship of two motion directions presented at the same time at different spatial locations. Compared to luminance or orientation, motion comparison was more rapid. Performance remained very high even when interstimulus separation was increased up to 100°. Motion comparison also remained rapid regardless of whether the two motion directions were similar to or different from each other. The exception was a dramatic slowing when the elements formed an orthogonal "T," in which two motions do not perceptually group together. Motion presented at task-irrelevant positions did not reduce performance, suggesting that the rapid motion comparison could not be ascribed to global optic flow processing. Our findings reveal the existence and unique nature of specialized processing that encodes long-range relationships between motion signals for quick appreciation of global dynamic scene structure.
Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform
Giulioni, Massimiliano; Lagorce, Xavier; Galluppi, Francesco; Benosman, Ryad B.
2016-01-01
Estimating the speed and direction of moving objects is a crucial component of agents behaving in a dynamic world. Biological organisms perform this task by means of the neural connections originating from their retinal ganglion cells. In artificial systems the optic flow is usually extracted by comparing activity of two or more frames captured with a vision sensor. Designing artificial motion flow detectors which are as fast, robust, and efficient as the ones found in biological systems is however a challenging task. Inspired by the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina, we introduce an architecture for robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission, and the extraction of the optical flow is based on time lag in the activation of nearby retinal neurons. Mimicking ganglion cells our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. Hereby we describe the architectural aspects, discuss its latency, scalability, and robustness properties and demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene. This work shows how precise time of spike emission used as a computational basis, biological inspiration, and neuromorphic systems can be used together for solving specific tasks. PMID:26909015
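The Barlow-Levick scheme referenced above uses delayed inhibition from a neighboring receptor to veto responses to null-direction motion. A minimal discrete-time sketch (binary signals; the delay and veto rule are illustrative simplifications):

```python
def barlow_levick(a, b, delay=1):
    """Barlow-Levick direction-selective unit: fires when input a is active
    unless vetoed by delayed activity from neighboring input b
    (null-direction inhibition)."""
    out = []
    for t in range(len(a)):
        veto = b[t - delay] if t >= delay else 0
        out.append(1 if a[t] and not veto else 0)
    return out
```

A stimulus moving from a to b reaches a before b, so no veto arrives and the unit fires; motion in the null direction activates b first, and its delayed inhibition suppresses the response, producing direction selectivity from spike timing alone.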
Initial assessment of facial nerve paralysis based on motion analysis using an optical flow method.
Samsudin, Wan Syahirah W; Sundaraj, Kenneth; Ahmad, Amirozi; Salleh, Hasriah
2016-01-01
An initial assessment method is proposed that classifies facial paralysis and grades its severity into one of six levels of the House-Brackmann (HB) system, based on facial landmark motion measured with an Optical Flow (OF) algorithm. The desired landmarks were obtained from video recordings of 5 normal and 3 Bell's palsy subjects and tracked using the Kanade-Lucas-Tomasi (KLT) method. A new scoring system based on motion analysis using area measurement is proposed; it combines the individual scores from the facial exercises and grades the paralysis according to the HB system. The proposed method has obtained promising results and may play a pivotal role in improved rehabilitation programs for patients.
Perceived Surface Slant Is Systematically Biased in the Actively-Generated Optic Flow
Fantoni, Carlo; Caudek, Corrado; Domini, Fulvio
2012-01-01
Humans make systematic errors in the 3D interpretation of the optic flow in both passive and active vision. These systematic distortions can be predicted by a biologically-inspired model which disregards self-motion information resulting from head movements (Caudek, Fantoni, & Domini 2011). Here, we tested two predictions of this model: (1) A plane that is stationary in an earth-fixed reference frame will be perceived as changing its slant if the movement of the observer's head causes a variation of the optic flow; (2) a surface that rotates in an earth-fixed reference frame will be perceived to be stationary, if the surface rotation is appropriately yoked to the head movement so as to generate a variation of the surface slant but not of the optic flow. Both predictions were corroborated by two experiments in which observers judged the perceived slant of a random-dot planar surface during egomotion. We found qualitatively similar biases for monocular and binocular viewing of the simulated surfaces, although, in principle, the simultaneous presence of disparity and motion cues allows for a veridical recovery of surface slant. PMID:22479473
Nocturnal insects use optic flow for flight control
Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie
2011-01-01
To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta—like their day-active relatives—rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects. PMID:21307047
Optical measurement of blood flow in exercising skeletal muscle: a pilot study
NASA Astrophysics Data System (ADS)
Wang, Detian; Baker, Wesley B.; Parthasarathy, Ashwin B.; Zhu, Liguo; Li, Zeren; Yodh, Arjun G.
2017-07-01
Blood flow monitoring during rhythmic exercise is important for sports medicine and for the study of muscle diseases. Diffuse correlation spectroscopy (DCS) is a relatively new noninvasive way to monitor blood flow, but it suffers from artifacts caused by muscle fiber motion. In this study we focus on how to remove exercise-driven artifacts and obtain accurate estimates of the exercise-induced increase in blood flow. Using a novel fast software correlator, we measured blood flow in the forearm flexor muscles of N = 2 healthy adults during handgrip exercise, at a sampling rate of 20 Hz. Combining the blood flow and acceleration data, we resolved the motion artifact in the DCS signal induced by muscle fiber motion and isolated the blood flow component of the signal from the artifact. The results show that muscle fiber motion strongly affects the DCS signal and, if not accounted for, results in an overestimate of blood flow of more than 1000%. Our measurements indicate rapid dilation of arterioles following exercise onset, which enabled blood flow to increase to a plateau of 200% within 10 s. Blood flow also recovered rapidly to baseline within 10 s after exercise. Finally, preliminary results on the dependence of blood flow on exercise intensity are discussed.
Detection of obstacles on runway using Ego-Motion compensation and tracking of significant features
NASA Technical Reports Server (NTRS)
Kasturi, Rangachar (Principal Investigator); Camps, Octavia (Principal Investigator); Gandhi, Tarak; Devadiga, Sadashiva
1996-01-01
This report describes a method for detecting obstacles on a runway for autonomous navigation and landing of an aircraft. Detection is performed in the presence of extraneous features such as tire marks. Suitable features are extracted from the image, and warping using approximately known camera and plane parameters is performed to compensate for ego-motion as far as possible. Residual disparity after warping is estimated using an optical flow algorithm. Features are tracked from frame to frame to obtain more reliable estimates of their motion. Corrections are made to the motion parameters from the residual disparities using a robust method, and features with large residual disparities are flagged as obstacles. A sensitivity analysis of the procedure is also presented. Nelson's optical flow constraint is proposed to separate moving obstacles from stationary ones. A Bayesian framework is used at every stage so that the confidence in the estimates can be determined.
Blood flow velocity measurement by endovascular Doppler optical coherence tomography
NASA Astrophysics Data System (ADS)
Sun, Cuiru; Nolte, Felix; Vuong, Barry; Cheng, Kyle H. Y.; Lee, Kenneth K. C.; Standish, Beau A.; Courtney, Brian; Marotta, Tom R.; Yang, Victor X. D.
2013-03-01
Blood flow velocity and volumetric flow measurements are important parameters for assessing the severity of stenosis and the outcome of interventional therapy. However, intravascular flow measurement with a rotational-catheter-based phase-resolved Doppler optical coherence tomography (DOCT) system is difficult. Motion artifacts induced by the rotating optical imaging catheter and the radially dependent noise background of the measured Doppler signals are the main challenges. In this study, we developed a custom data acquisition system and algorithms that remove the phase-shift artifact induced by non-uniform rotational distortion (NURD) by tracking the phase shift observed on the catheter sheath. The flow velocity is calculated from the Doppler shift obtained by Kasai autocorrelation after motion artifact removal. Blood flow velocity profiles in porcine carotid arteries in vivo were obtained at 100 frames/s with 500 A-lines/frame, and DOCT images were taken at 20 frames/s with 2500 A-lines/frame. Time-varying velocity profiles were obtained at an artery branch. Furthermore, a vein adjacent to the catheterized vessel was identified based on the color Doppler signal. Absolute measurement of intravascular flow using a rotating fiber catheter can provide insight into the different stages of interventional treatment of carotid artery stenosis.
Reconstructing 3-D skin surface motion for the DIET breast cancer screening system.
Botterill, Tom; Lotz, Thomas; Kashif, Amer; Chase, J Geoffrey
2014-05-01
Digital image-based elasto-tomography (DIET) is a prototype system for breast cancer screening. A breast is imaged while being vibrated, and the observed surface motion is used to infer the internal stiffness of the breast, hence identifying tumors. This paper describes a computer vision system for accurately measuring 3-D surface motion. A model-based segmentation is used to identify the profile of the breast in each image, and the 3-D surface is reconstructed by fitting a model to the profiles. The surface motion is measured using a modern optical flow implementation customized to the application, then trajectories of points on the 3-D surface are given by fusing the optical flow with the reconstructed surfaces. On data from human trials, the system is shown to exceed the performance of an earlier marker-based system at tracking skin surface motion. We demonstrate that the system can detect a 10 mm tumor in a silicone phantom breast.
Optical flow versus retinal flow as sources of information for flight guidance
NASA Technical Reports Server (NTRS)
Cutting, James E.
1991-01-01
The appropriate description of visual information for flight guidance is considered: optical flow vs. retinal flow. Most descriptions in the psychological literature are based on optical flow. However, human eyes move, and this movement complicates the issues at stake, particularly when movement of the observer is involved. The question addressed is whether an observer, whose eyes register only retinal flow, can use information in optical flow. It is suggested that the observer cannot and does not reconstruct the image in optical flow; instead, the observer uses retinal flow. The retinal array is defined as the projection of a three-space onto a point and beyond to a movable, nearly hemispheric sensing device, like the retina. The optical array is defined as the projection of a three-space environment to a point within that space. Flow is defined as global motion as a field of vectors, best placed on a spherical projection surface. Specifically, flow is the mapping of the field of changes in position of corresponding points on objects in three-space onto a point, where that point has moved in position.
Optic Flow Dominates Visual Scene Polarity in Causing Adaptive Modification of Locomotor Trajectory
NASA Technical Reports Server (NTRS)
Nomura, Y.; Mulavara, A. P.; Richards, J. T.; Brady, R.; Bloomberg, Jacob J.
2005-01-01
Locomotion and posture are influenced and controlled by vestibular, visual and somatosensory information. Optic flow and scene polarity are two characteristics of a visual scene that have been identified as being critical in how they affect perceived body orientation and self-motion. The goal of this study was to determine the role of optic flow and visual scene polarity on adaptive modification in locomotor trajectory. Two computer-generated virtual reality scenes were shown to subjects during 20 minutes of treadmill walking. One scene was a highly polarized scene while the other was composed of objects displayed in a non-polarized fashion. Both virtual scenes depicted constant rate self-motion equivalent to walking counterclockwise around the perimeter of a room. Subjects performed Stepping Tests blindfolded before and after scene exposure to assess adaptive changes in locomotor trajectory. Subjects showed a significant difference in heading direction, between pre- and post-adaptation stepping tests, when exposed to either scene during treadmill walking. However, there was no significant difference in the subjects' heading direction between the two visual scene polarity conditions. Therefore, it was inferred from these data that optic flow has a greater role than visual polarity in influencing adaptive locomotor function.
Nocturnal insects use optic flow for flight control.
Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie
2011-08-23
To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta, like their day-active relatives, rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects.
A Variational Approach to Video Registration with Subspace Constraints.
Garg, Ravi; Roussos, Anastasios; Agapito, Lourdes
2013-01-01
This paper addresses the problem of non-rigid video registration, or the computation of optical flow from a reference frame to each of the subsequent images in a sequence, when the camera views deformable objects. We exploit the high correlation between 2D trajectories of different points on the same non-rigid surface by assuming that the displacement of any point throughout the sequence can be expressed in a compact way as a linear combination of a low-rank motion basis. This subspace constraint effectively acts as a trajectory regularization term leading to temporally consistent optical flow. We formulate it as a robust soft constraint within a variational framework by penalizing flow fields that lie outside the low-rank manifold. The resulting energy functional can be decoupled into the optimization of the brightness constancy and spatial regularization terms, leading to an efficient optimization scheme. Additionally, we propose a novel optimization scheme for the case of vector valued images, based on the dualization of the data term. This allows us to extend our approach to deal with colour images which results in significant improvements on the registration results. Finally, we provide a new benchmark dataset, based on motion capture data of a flag waving in the wind, with dense ground truth optical flow for evaluation of multi-frame optical flow algorithms for non-rigid surfaces. Our experiments show that our proposed approach outperforms state of the art optical flow and dense non-rigid registration algorithms.
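The low-rank trajectory subspace idea above can be made concrete with a minimal sketch: stack the 2-D point tracks into a matrix and measure how far they lie from a rank-r trajectory basis. This is a simplified stand-in for the paper's variational soft constraint (which penalizes flow fields outside the low-rank manifold during optimization); the rank value and synthetic data below are illustrative.

```python
import numpy as np

def project_to_trajectory_subspace(tracks, rank=3):
    """Project 2-D point tracks onto a low-rank trajectory basis.

    tracks: array of shape (2*F, P) stacking the x/y coordinates of
    P points over F frames. Returns the best rank-`rank` approximation;
    the distance between a track and its projection is the kind of
    residual a soft subspace constraint would penalize.
    """
    mean = tracks.mean(axis=1, keepdims=True)
    centered = tracks - mean
    # SVD gives the optimal low-rank trajectory basis (left singular vectors).
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    approx = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]
    return approx + mean

# Tracks generated from a rank-2 motion basis are reproduced exactly
# by a rank-2 projection, i.e. they incur zero subspace penalty.
rng = np.random.default_rng(1)
basis = rng.standard_normal((2 * 10, 2))   # 10 frames, 2 basis trajectories
coeffs = rng.standard_normal((2, 50))      # 50 tracked points
tracks = basis @ coeffs
proj = project_to_trajectory_subspace(tracks, rank=2)
```

Tracks that deviate from the subspace (e.g. due to drift in the flow estimate) would project to a nearby point on the manifold, which is what regularizes the flow temporally.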
NASA Astrophysics Data System (ADS)
Block, Stephan; Fast, Björn Johansson; Lundgren, Anders; Zhdanov, Vladimir P.; Höök, Fredrik
2016-09-01
Biological nanoparticles (BNPs) are of high interest due to their key role in various biological processes and use as biomarkers. BNP size and composition are decisive for their functions, but simultaneous determination of both properties with high accuracy remains challenging. Optical microscopy allows precise determination of fluorescence/scattering intensity, but not the size of individual BNPs. The latter is better determined by tracking their random motion in bulk, but the limited illumination volume for tracking this motion impedes reliable intensity determination. Here, we show that by attaching BNPs to a supported lipid bilayer, subjecting them to hydrodynamic flows and tracking their motion via surface-sensitive optical imaging enable determination of their diffusion coefficients and flow-induced drifts, from which accurate quantification of both BNP size and emission intensity can be made. For vesicles, the accuracy of this approach is demonstrated by resolving the expected radius-squared dependence of their fluorescence intensity for radii down to 15 nm.
NASA Astrophysics Data System (ADS)
Yokoi, Naomichi; Aizu, Yoshihisa
2017-04-01
Optical manipulation techniques proposed so far depend almost entirely on carefully fabricated setups and samples. Such conditions can be fixed in laboratories; however, it remains challenging to manipulate nanoparticles when the environment is not well controlled and is unknown in advance. Nonetheless, coherent light scattered by a rough object generates speckles, which are random interference patterns with well-defined statistical properties. In the present study, we numerically investigate the motion of a particle in a flow under the illumination of a speckle pattern that is at rest or in motion. The trajectory of the particle is simulated in relation to the flow velocity and the speckle contrast to confirm the feasibility of the present method for performing optical manipulation tasks such as trapping and guiding.
Synthetic perspective optical flow: Influence on pilot control tasks
NASA Technical Reports Server (NTRS)
Bennett, C. Thomas; Johnson, Walter W.; Perrone, John A.; Phatak, Anil V.
1989-01-01
One approach used to better understand the impact of visual flow on control tasks has been to use synthetic perspective flow patterns. Such patterns are the result of apparent motion across a grid or random dot display. Unfortunately, the optical flow so generated is based on a subset of the flow information that exists in the real world. The danger is that the resulting optical motions may not generate the visual flow patterns useful for actual flight control. Researchers conducted a series of studies directed at understanding the characteristics of synthetic perspective flow that support various pilot tasks. In the first of these, they examined the control of altitude over various perspective grid textures (Johnson et al., 1987). Another set of studies was directed at studying the head tracking of targets moving in a 3-D coordinate system. These studies, parametric in nature, utilized both impoverished and complex virtual worlds represented by simple perspective grids at one extreme, and computer-generated terrain at the other. These studies are part of an applied visual research program directed at understanding the design principles required for the development of instruments displaying spatial orientation information. The experiments also highlight the need for modeling the impact of spatial displays on pilot control tasks.
Effects of radial direction and eccentricity on acceleration perception.
Mueller, Alexandra S; Timney, Brian
2014-01-01
Radial optic flow can elicit impressions of self-motion (vection) or of objects moving relative to the observer, but there is disagreement as to whether humans have greater sensitivity to expanding or to contracting optic flow. Although most studies agree there is an anisotropy in sensitivity to radial optic flow, it is unclear whether this asymmetry is a function of eccentricity. The issue is further complicated by the fact that few studies have examined how acceleration sensitivity is affected, even though observers and objects in the environment seldom move at a constant speed. To address these issues, we investigated the effects of direction and eccentricity on the ability to detect acceleration in radial optic flow. Our results indicate that observers are better at detecting acceleration when viewing contraction compared with expansion and that eccentricity has no effect on the ability to detect accelerating radial optic flow. Ecological interpretations are discussed.
Terrain shape estimation from optical flow, using Kalman filtering
NASA Astrophysics Data System (ADS)
Hoff, William A.; Sklair, Cheryl W.
1990-01-01
As one moves through a static environment, the visual world as projected on the retina seems to flow past. This apparent motion, called optical flow, can be an important source of depth perception for autonomous robots. An important application is in planetary exploration -the landing vehicle must find a safe landing site in rugged terrain, and an autonomous rover must be able to navigate safely through this terrain. In this paper, we describe a solution to this problem. Image edge points are tracked between frames of a motion sequence, and the range to the points is calculated from the displacement of the edge points and the known motion of the camera. Kalman filtering is used to incrementally improve the range estimates to those points, and provide an estimate of the uncertainty in each range. Errors in camera motion and image point measurement can also be modelled with Kalman filtering. A surface is then interpolated to these points, providing a complete map from which hazards such as steeply sloping areas can be detected. Using the method of extended Kalman filtering, our approach allows arbitrary camera motion. Preliminary results of an implementation are presented, and show that the resulting range accuracy is on the order of 1-2% of the range.
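The incremental range refinement described above reduces, in its simplest scalar form, to the standard Kalman measurement update. The sketch below omits the paper's extended Kalman treatment of arbitrary camera motion, and all variances are assumed example values.

```python
import numpy as np

def kalman_update(z, r_meas, x, p):
    """One scalar Kalman measurement update of a range estimate.
    z: new range measurement, r_meas: its variance,
    x, p: prior estimate and variance. Returns the posterior (x, p)."""
    k = p / (p + r_meas)          # Kalman gain
    x = x + k * (z - x)           # corrected range estimate
    p = (1.0 - k) * p             # reduced uncertainty
    return x, p

# Fuse noisy per-frame range measurements of a static point
# (true range 100 m, measurement sigma 5 m, poor initial guess).
rng = np.random.default_rng(2)
x, p = 120.0, 400.0               # initial estimate and variance
for _ in range(50):
    z = 100.0 + rng.normal(0.0, 5.0)
    x, p = kalman_update(z, 25.0, x, p)
```

Each update shrinks the variance `p`, which is exactly the per-point uncertainty estimate the abstract mentions; hazard detection can then weight interpolated surface points by that uncertainty.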
NASA Technical Reports Server (NTRS)
Perrone, J. A.; Stone, L. S.
1998-01-01
We have proposed previously a computational neural-network model by which the complex patterns of retinal image motion generated during locomotion (optic flow) can be processed by specialized detectors acting as templates for specific instances of self-motion. The detectors in this template model respond to global optic flow by sampling image motion over a large portion of the visual field through networks of local motion sensors with properties similar to those of neurons found in the middle temporal (MT) area of primate extrastriate visual cortex. These detectors, arranged within cortical-like maps, were designed to extract self-translation (heading) and self-rotation, as well as the scene layout (relative distances) ahead of a moving observer. We then postulated that heading from optic flow is directly encoded by individual neurons acting as heading detectors within the medial superior temporal (MST) area. Others have questioned whether individual MST neurons can perform this function because some of their receptive-field properties seem inconsistent with this role. To resolve this issue, we systematically compared MST responses with those of detectors from two different configurations of the model under matched stimulus conditions. We found that the characteristic physiological properties of MST neurons can be explained by the template model. We conclude that MST neurons are well suited to support self-motion estimation via a direct encoding of heading and that the template model provides an explicit set of testable hypotheses that can guide future exploration of MST and adjacent areas within the superior temporal sulcus.
Braaf, Boy; Donner, Sabine; Nam, Ahhyun S.; Bouma, Brett E.; Vakoc, Benjamin J.
2018-01-01
Complex differential variance (CDV) provides phase-sensitive angiographic imaging for optical coherence tomography (OCT) with immunity to phase-instabilities of the imaging system and small-scale axial bulk motion. However, like all angiographic methods, measurement noise can result in erroneous indications of blood flow that confuse the interpretation of angiographic images. In this paper, a modified CDV algorithm that corrects for this noise-bias is presented. This is achieved by normalizing the CDV signal by analytically derived upper and lower limits. The noise-bias corrected CDV algorithm was implemented into an experimental 1 μm wavelength OCT system for retinal imaging that used an eye tracking scanner laser ophthalmoscope at 815 nm for compensation of lateral eye motions. The noise-bias correction improved the CDV imaging of the blood flow in tissue layers with a low signal-to-noise ratio and suppressed false indications of blood flow outside the tissue. In addition, the CDV signal normalization suppressed noise induced by galvanometer scanning errors and small-scale lateral motion. High quality cross-section and motion-corrected en face angiograms of the retina and choroid are presented.
Motion estimation of magnetic resonance cardiac images using the Wigner-Ville and hough transforms
NASA Astrophysics Data System (ADS)
Carranza, N.; Cristóbal, G.; Bayerl, P.; Neumann, H.
2007-12-01
Myocardial motion analysis and quantification are of utmost importance for analyzing contractile heart abnormalities, which can be a symptom of coronary artery disease. A fundamental problem in processing sequences of images is the computation of the optical flow, which is an approximation of the real image motion. This paper presents a new algorithm for optical flow estimation based on a spatiotemporal-frequency (STF) approach. More specifically it relies on the computation of the Wigner-Ville distribution (WVD) and the Hough Transform (HT) of the motion sequences. The latter is a well-known line and shape detection method that is highly robust against incomplete data and noise. The rationale for using the HT in this context is that it provides a value of the displacement field from the STF representation. In addition, a probabilistic approach based on Gaussian mixtures has been implemented in order to improve the accuracy of the motion detection. Experimental results in the case of synthetic sequences are compared with an implementation of the variational technique for local and global motion estimation, where it is shown that the results are accurate and robust to noise degradations. Results obtained with real cardiac magnetic resonance images are presented.
NASA Astrophysics Data System (ADS)
Carranza, N.; Cristóbal, G.; Sroubek, F.; Ledesma-Carbayo, M. J.; Santos, A.
2006-08-01
Myocardial motion analysis and quantification are of utmost importance for analyzing contractile heart abnormalities, which can be a symptom of coronary artery disease. A fundamental problem in processing sequences of images is the computation of the optical flow, which is an approximation to the real image motion. This paper presents a new algorithm for optical flow estimation based on a spatiotemporal-frequency (STF) approach, more specifically on the computation of the Wigner-Ville distribution (WVD) and the Hough Transform (HT) of the motion sequences. The latter is a well-known line and shape detection method that is highly robust against incomplete data and noise. The rationale for using the HT in this context is that it provides a value of the displacement field from the STF representation. In addition, a probabilistic approach based on Gaussian mixtures has been implemented in order to improve the accuracy of the motion detection. Experimental results with synthetic sequences are compared against an implementation of the variational technique for local and global motion estimation, where it is shown that the results obtained here are accurate and robust to noise degradations. The current method has also been tested and evaluated on real cardiac magnetic resonance images.
Detecting dominant motion patterns in crowds of pedestrians
NASA Astrophysics Data System (ADS)
Saqib, Muhammad; Khan, Sultan Daud; Blumenstein, Michael
2017-02-01
As the population of the world increases, urbanization generates crowding situations that pose challenges to public safety and security. Manual analysis of crowded situations is tedious and usually prone to errors. In this paper, we propose a novel technique of crowd analysis, the aim of which is to detect different dominant motion patterns in real-time videos. A motion field is generated by computing the dense optical flow. The motion field is then divided into blocks. For each block, we adopt an intra-clustering algorithm for detecting different flows within the block. Later on, we employ inter-clustering for clustering the flow vectors among different blocks. We evaluate the performance of our approach on different real-time videos. The experimental results show that our proposed method is capable of detecting distinct motion patterns in crowded videos. Moreover, our algorithm outperforms state-of-the-art methods.
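As a rough illustration of extracting dominant motion patterns from a flow field, the sketch below bins flow-vector orientations and keeps the heavily populated directions. It is a histogram-based stand-in for the intra-/inter-block clustering described in the abstract; the motion threshold, bin count, and dominance fraction are arbitrary choices.

```python
import numpy as np

def dominant_flow_directions(flow, n_bins=8, min_frac=0.2):
    """Find dominant motion directions in a dense flow field.

    flow: array (H, W, 2) of per-pixel (dx, dy) displacements.
    Bins flow orientation into n_bins sectors and returns the central
    angle (radians) of every bin holding at least min_frac of the
    moving pixels.
    """
    dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
    mag = np.hypot(dx, dy)
    moving = mag > 0.5                       # assumed motion threshold (px)
    ang = np.mod(np.arctan2(dy[moving], dx[moving]), 2 * np.pi)
    hist, edges = np.histogram(ang, bins=n_bins, range=(0, 2 * np.pi))
    centers = (edges[:-1] + edges[1:]) / 2
    keep = hist >= min_frac * max(moving.sum(), 1)
    return centers[keep]

# Two crowd streams: left half moves rightward, right half moves upward.
flow = np.zeros((40, 40, 2))
flow[:, :20, 0] = 2.0                        # rightward (angle 0)
flow[:, 20:, 1] = 2.0                        # upward (angle pi/2)
dirs = dominant_flow_directions(flow)
```

A real pipeline would compute `flow` with a dense optical flow method and cluster spatially as well as by angle, so that two streams moving the same way in different places are kept distinct.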
Extracting heading and temporal range from optic flow: Human performance issues
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Perrone, John A.; Stone, Leland; Banks, Martin S.; Crowell, James A.
1993-01-01
Pilots are able to extract information about their vehicle motion and environmental structure from dynamic transformations in the out-the-window scene. In this presentation, we focus on the information in the optic flow which specifies vehicle heading and distance to objects in the environment, scaled to a temporal metric. In particular, we are concerned with modeling how the human operators extract the necessary information, and what factors impact their ability to utilize the critical information. In general, the psychophysical data suggest that the human visual system is fairly robust to degradations in the visual display, e.g., reduced contrast and resolution or restricted field of view. However, extraneous motion flow, i.e., introduced by sensor rotation, greatly compromises human performance. The implications of these models and data for enhanced/synthetic vision systems are discussed.
Repurposing video recordings for structure motion estimations
NASA Astrophysics Data System (ADS)
Khaloo, Ali; Lattanzi, David
2016-04-01
Video monitoring of public spaces is becoming increasingly ubiquitous, particularly near essential structures and facilities. During any hazard event that dynamically excites a structure, such as an earthquake or hurricane, proximal video cameras may inadvertently capture the motion time-history of the structure during the event. If this dynamic time-history could be extracted from the repurposed video recording it would become a valuable forensic analysis tool for engineers performing post-disaster structural evaluations. The difficulty is that almost all potential video cameras are not installed to monitor structure motions, leading to camera perspective distortions and other associated challenges. This paper presents a method for extracting structure motions from videos using a combination of computer vision techniques. Images from a video recording are first reprojected into synthetic images that eliminate perspective distortion, using as-built knowledge of a structure for calibration. The motion of the camera itself during an event is also considered. Optical flow, a technique for tracking per-pixel motion, is then applied to these synthetic images to estimate the building motion. The developed method was validated using the experimental records of the NEESHub earthquake database. The results indicate that the technique is capable of estimating structural motions, particularly the frequency content of the response. Further work will evaluate variants and alternatives to the optical flow algorithm, as well as study the impact of video encoding artifacts on motion estimates.
FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision
Botella, Guillermo; Martín H., José Antonio; Santos, Matilde; Meyer-Baese, Uwe
2011-01-01
Motion estimation is a low-level vision task that is especially relevant due to its wide range of applications in the real world. Many of the best motion estimation algorithms include some of the features that are found in mammalians, which would demand huge computational resources and therefore are not usually available in real-time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system and an analysis of the computational resources and performance of the applied algorithms.
Bumblebee flight performance in environments of different proximity.
Linander, Nellie; Baird, Emily; Dacke, Marie
2016-02-01
Flying animals are capable of navigating through environments of different complexity with high precision. To control their flight when negotiating narrow tunnels, bees and birds use the magnitude of apparent image motion (known as optic flow) generated by the walls. In their natural habitat, however, these animals would encounter both cluttered and open environments. Here, we investigate how large changes in the proximity of nearby surfaces affect optic flow-based flight control strategies. We trained bumblebees to fly along a flight tunnel and recorded how the distance between the walls (from 60 cm to 240 cm) affected their flight control. Our results reveal that, as tunnel width increases, both lateral position and ground speed become increasingly variable. We also find that optic flow information from the ground has an increasing influence on flight control, suggesting that bumblebees measure optic flow flexibly over a large lateral and ventral field of view, depending on where the highest magnitude of optic flow occurs. A consequence of this strategy is that, when flying in narrow spaces, bumblebees use optic flow information from the nearby obstacles to control flight, while in more open spaces they rely primarily on optic flow cues from the ground.
NASA Technical Reports Server (NTRS)
Perrone, John A.; Stone, Leland S.
1997-01-01
We have previously proposed a computational neural-network model by which the complex patterns of retinal image motion generated during locomotion (optic flow) can be processed by specialized detectors acting as templates for specific instances of self-motion. The detectors in this template model respond to global optic flow by sampling image motion over a large portion of the visual field through networks of local motion sensors with properties similar to neurons found in the middle temporal (MT) area of primate extrastriate visual cortex. The model detectors were designed to extract self-translation (heading), self-rotation, as well as the scene layout (relative distances) ahead of a moving observer, and are arranged in cortical-like heading maps to perform this function. Heading estimation from optic flow has been postulated by some to be implemented within the medial superior temporal (MST) area. Others have questioned whether MST neurons can fulfill this role because some of their receptive-field properties appear inconsistent with a role in heading estimation. To resolve this issue, we systematically compared MST single-unit responses with the outputs of model detectors under matched stimulus conditions. We found that the basic physiological properties of MST neurons can be explained by the template model. We conclude that MST neurons are well suited to support heading estimation and that the template model provides an explicit set of testable hypotheses which can guide future exploration of MST and adjacent areas within the primate superior temporal sulcus.
NASA Astrophysics Data System (ADS)
Thurrell, Adrian; Pelah, Adar
2005-03-01
We report on recent experiments investigating the Arthrovisual Locomotor Effect (ALE), a mechanism based on non-visual signals postulated to discount or remove self-generated visual motion signals during locomotion. It is shown that perceptual matches made by standing subjects to a constant-motion optic flow stimulus that is viewed while walking on a treadmill are reduced linearly with walking speed, a measure of the reported ALE. The degree of reduction in perceived speed depends on the similarity of the motor activity to natural locomotion; thus, for the four activities tested, ALE strength is ranked as follows: Walking > Cycling > Hand Pedalling > Finger Tapping = 0. Other variations and important controls for the ALE are described.
The effect of virtual reality on gait variability.
Katsavelis, Dimitrios; Mukherjee, Mukul; Decker, Leslie; Stergiou, Nicholas
2010-07-01
Optic flow (OF) plays an important role in human locomotion, and manipulation of OF characteristics can cause changes in locomotion patterns. The purpose of the study was to investigate the effect of the velocity of optic flow on the amount and structure of gait variability. Each subject underwent four conditions of treadmill walking at their self-selected pace. In three conditions the subjects walked in an endless virtual corridor, while a fourth control condition was also included. The three virtual conditions differed in the speed of the optic flow displayed: the same speed as the treadmill (OFn), faster (OFf), and slower (OFs). Gait kinematics were tracked with an optical motion capture system. Gait variability measures of the hip, knee and ankle range of motion and stride interval were analyzed. The amount of variability was evaluated with a linear measure (coefficient of variation, CV), while the structure of variability, i.e., its organization over time, was measured with nonlinear measures (approximate entropy and detrended fluctuation analysis). The linear measure, CV, did not show significant differences between non-VR and VR conditions, while the nonlinear measures identified significant differences at the hip, ankle, and in stride interval. In response to manipulation of the optic flow, significant differences were observed between the three virtual conditions in the following order: OFn greater than OFf greater than OFs. Measures of the structure of variability are more sensitive to changes in gait due to manipulation of visual cues, whereas measures of the amount of variability may be concealed by adaptive mechanisms. Visual cues increase the complexity of gait variability and may increase the degrees of freedom available to the subject. Further exploration of the effects of optic flow manipulation on locomotion may provide an effective tool for rehabilitation of subjects with sensorimotor issues.
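The distinction between linear and nonlinear variability measures can be made concrete with a short sketch of the coefficient of variation and approximate entropy. The `m` and `r` parameters below are common defaults for ApEn, not necessarily those used in the study, and the synthetic stride-interval series are illustrative.

```python
import numpy as np

def coeff_variation(x):
    """Coefficient of variation (%), a linear measure of the *amount*
    of variability."""
    x = np.asarray(x, float)
    return 100.0 * x.std() / x.mean()

def approx_entropy(x, m=2, r_frac=0.2):
    """Approximate entropy (ApEn), a nonlinear measure of the
    *structure* of variability; the tolerance r is set to
    r_frac * SD, a common choice. Higher values indicate a less
    regular, more complex time series."""
    x = np.asarray(x, float)
    n, r = len(x), r_frac * x.std()

    def phi(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        c = np.mean(d <= r, axis=1)   # self-matches included, as in ApEn
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Two synthetic stride-interval series with similar amounts of variability
# but different structure: a regular oscillation vs. white noise.
rng = np.random.default_rng(3)
periodic = 1.0 + 0.05 * np.sin(np.arange(200))
noisy = 1.0 + 0.05 * rng.standard_normal(200)
cv_p = coeff_variation(periodic)
apen_p, apen_n = approx_entropy(periodic), approx_entropy(noisy)
```

This is the abstract's point in miniature: the two series have comparable CV, but ApEn separates the regular series from the random one.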
A hybrid approach to estimate the complex motions of clouds in sky images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Zhenzhou; Yu, Dantong; Huang, Dong
2016-09-14
Tracking the motion of clouds is essential to forecasting the weather and to predicting short-term solar energy generation. Existing techniques mainly fall into two categories: variational optical flow and block matching. In this article, we summarize recent advances in estimating cloud motion using ground-based sky imagers and quantitatively evaluate state-of-the-art approaches. We then propose a hybrid tracking framework that incorporates the strengths of both block matching and optical flow models. To validate the accuracy of the proposed approach, we introduce a series of synthetic images to simulate cloud movement and deformation, and thereafter comprehensively compare our hybrid approach with several representative tracking algorithms over both simulated and real images collected from various sites/imagers. The results show that our hybrid approach outperforms state-of-the-art models, reducing motion estimation errors relative to the ground-truth motions by at least 30% in most of the simulated image sequences. Furthermore, our hybrid model demonstrates its superior efficiency on several real cloud image datasets, lowering the Mean Absolute Error (MAE) between predicted and ground-truth images by at least 15%.
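A minimal sketch of the block-matching half of such a hybrid tracker: an exhaustive sum-of-absolute-differences (SAD) search over a small displacement window, assuming grayscale frames stored as nested lists. The paper's actual framework also blends in variational optical flow, which is not reproduced here:

```python
def block_match(prev, curr, top, left, size, radius):
    """Find the displacement (dy, dx) of the size x size block of `prev`
    anchored at (top, left) within `curr`, by exhaustive search over
    displacements up to `radius`, minimizing the SAD matching cost."""
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sad = 0
            for y in range(size):
                for x in range(size):
                    sad += abs(prev[top + y][left + x]
                               - curr[top + y + dy][left + x + dx])
            if best is None or sad < best[0]:
                best = (sad, dy, dx)
    return best[1], best[2]

# Synthetic "cloud": a bright square that moves 2 px right and 1 px down
W = 16
prev = [[0] * W for _ in range(W)]
curr = [[0] * W for _ in range(W)]
for y in range(5, 9):
    for x in range(5, 9):
        prev[y][x] = 255
        curr[y + 1][x + 2] = 255

print(block_match(prev, curr, 4, 4, 6, 3))  # → (1, 2)
```

Real trackers restrict the search or use coarse-to-fine pyramids; the O(size² · radius²) exhaustive search above is the conceptually simplest form.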
NASA Astrophysics Data System (ADS)
Menze, Moritz; Heipke, Christian; Geiger, Andreas
2018-06-01
This work investigates the estimation of dense three-dimensional motion fields, commonly referred to as scene flow. While great progress has been made in recent years, large displacements and adverse imaging conditions as observed in natural outdoor environments are still very challenging for current approaches to reconstruction and motion estimation. In this paper, we propose a unified random field model which reasons jointly about 3D scene flow as well as the location, shape and motion of vehicles in the observed scene. We formulate the problem as the task of decomposing the scene into a small number of rigidly moving objects sharing the same motion parameters. Thus, our formulation effectively introduces long-range spatial dependencies which commonly employed local rigidity priors are lacking. Our inference algorithm then estimates the association of image segments and object hypotheses together with their three-dimensional shape and motion. We demonstrate the potential of the proposed approach by introducing a novel challenging scene flow benchmark which allows for a thorough comparison of the proposed scene flow approach with respect to various baseline models. In contrast to previous benchmarks, our evaluation is the first to provide stereo and optical flow ground truth for dynamic real-world urban scenes at large scale. Our experiments reveal that rigid motion segmentation can be utilized as an effective regularizer for the scene flow problem, improving upon existing two-frame scene flow methods. At the same time, our method yields plausible object segmentations without requiring an explicitly trained recognition model for a specific object class.
Using optical flow for the detection of floating mines in IR image sequences
NASA Astrophysics Data System (ADS)
Borghgraef, Alexander; Acheroy, Marc
2006-09-01
In the first Gulf War, unmoored floating mines proved to be a real hazard for shipping traffic. An automated system capable of detecting these and other free-floating small objects, using readily available sensors such as infra-red cameras, would be a valuable mine-warfare asset, and could double as a collision-avoidance mechanism and a search-and-rescue aid. The noisy background provided by the sea surface, and occlusion by waves, make it difficult to detect small floating objects using only algorithms based upon the intensity, size, or shape of the target. This leads us to look to the image sequence for temporal detection characteristics. The target's apparent motion is one such discriminant, given the contrast between the bobbing motion of a floating object and the strong horizontal component of the propagating wavefronts. We applied the Proesmans optical flow algorithm to IR video footage of practice mines in order to extract this motion characteristic; a threshold on the vertical motion component is then imposed to detect the floating targets.
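The detection step described above reduces to thresholding precomputed flow vectors. The sketch below assumes a dense flow field has already been estimated (the paper uses the Proesmans algorithm; any dense flow routine would serve for illustration), represented here as a dict of per-pixel vectors:

```python
def detect_bobbing(flow, v_thresh):
    """flow maps pixel (row, col) -> optic-flow vector (u, v) in px/frame.
    Flag pixels whose vertical motion exceeds the threshold and dominates
    the horizontal component -- the signature of a bobbing float against
    predominantly horizontal wave propagation."""
    return {p for p, (u, v) in flow.items()
            if abs(v) > v_thresh and abs(v) > abs(u)}

# Wave pixels move mostly horizontally; the mine pixel bobs vertically:
flow = {(10, x): (2.0, 0.1) for x in range(5)}
flow[(4, 7)] = (0.1, 1.5)
print(detect_bobbing(flow, v_thresh=0.5))  # → {(4, 7)}
```

The threshold value and the dominance test are illustrative assumptions; the paper states only that a threshold on the vertical motion characteristic is imposed.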
Intracellular microrheology of motile Amoeba proteus.
Rogers, Salman S; Waigh, Thomas A; Lu, Jian R
2008-04-15
The motility of Amoeba proteus was examined using the technique of passive particle tracking microrheology, with the aid of newly developed particle tracking software, a fast digital camera, and an optical microscope. We tracked large numbers of endogeneous particles in the amoebae, which displayed subdiffusive motion at short timescales, corresponding to thermal motion in a viscoelastic medium, and superdiffusive motion at long timescales due to the convection of the cytoplasm. Subdiffusive motion was characterized by a rheological scaling exponent of 3/4 in the cortex, indicative of the semiflexible dynamics of the actin fibers. We observed shear-thinning in the flowing endoplasm, where exponents increased with increasing flow rate; i.e., the endoplasm became more fluid-like. The rheology of the cortex is found to be isotropic, reflecting an isotropic actin gel. A clear difference was seen between cortical and endoplasmic layers in terms of both viscoelasticity and flow velocity, where the profile of the latter is close to a Poiseuille flow for a Newtonian fluid.
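The subdiffusive/superdiffusive classification above rests on the scaling exponent of the mean squared displacement (MSD ~ lag^α, with α < 1 subdiffusive, α = 1 diffusive, α = 2 ballistic/convective). A minimal sketch in pure Python; the track format and lag choices are illustrative, not the authors' pipeline:

```python
import math

def msd(track, lag):
    """Mean squared displacement of a 2D track [(x, y), ...] at a given lag."""
    disp = [(track[i + lag][0] - track[i][0]) ** 2
            + (track[i + lag][1] - track[i][1]) ** 2
            for i in range(len(track) - lag)]
    return sum(disp) / len(disp)

def scaling_exponent(track, lags):
    """Slope of log MSD vs log lag, fitted by least squares:
    ~3/4 for beads in a semiflexible actin network, ~2 for convected beads."""
    xs = [math.log(l) for l in lags]
    ys = [math.log(msd(track, l)) for l in lags]
    n = len(lags)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Ballistic drift (like cytoplasmic streaming): x = 0.1*t, so MSD ~ lag^2
drift = [(0.1 * i, 0.0) for i in range(200)]
print(round(scaling_exponent(drift, [1, 2, 4, 8, 16]), 2))  # → 2.0
```

In practice the exponent is read off over a limited lag range, since thermal (subdiffusive) and convective (superdiffusive) regimes dominate at short and long timescales respectively, exactly as reported above.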
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borovetz, H.S.; Shaffer, F.; Schaub, R.
This paper discusses a series of experiments to visualize and measure flow fields in the Novacor left ventricular assist system (LVAS). The experiments utilize a multiple-exposure optical imaging technique called fluorescent image tracking velocimetry (FITV) to track the motion of small, neutrally buoyant particles in a flowing fluid.
Linander, Nellie; Dacke, Marie; Baird, Emily
2015-04-01
When flying through narrow spaces, insects control their position by balancing the magnitude of apparent image motion (optic flow) experienced in each eye and their speed by holding this value about a desired set point. Previously, it has been shown that when bumblebees encounter sudden changes in the proximity to nearby surfaces - as indicated by a change in the magnitude of optic flow on each side of the visual field - they adjust their flight speed well before the change, suggesting that they measure optic flow for speed control at low visual angles in the frontal visual field. Here, we investigated the effect that sudden changes in the magnitude of translational optic flow have on both position and speed control in bumblebees if these changes are asymmetrical; that is, if they occur only on one side of the visual field. Our results reveal that the visual region over which bumblebees respond to optic flow cues for flight control is not dictated by a set viewing angle. Instead, bumblebees appear to use the maximum magnitude of translational optic flow experienced in the frontal visual field. This strategy ensures that bumblebees use the translational optic flow generated by the nearest obstacles - that is, those with which they have the highest risk of colliding - to control flight. © 2015. Published by The Company of Biologists Ltd.
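The proposed strategy, holding the maximum translational optic flow in the frontal field at a set point, can be sketched with the standard flow relation ω = v·sin(a)/d for a surface at viewing angle a and distance d while flying at speed v. This is an illustrative model of the conclusion, not the bees' actual controller:

```python
import math

def commanded_speed(obstacles, set_point):
    """obstacles: list of (viewing_angle_rad, distance_m) in the frontal field.
    Translational optic flow from a surface at angle a and distance d at
    flight speed v is v*sin(a)/d (rad/s). Choose v so that the *maximum*
    flow across the frontal field equals the set point -- so the nearest
    surface (highest collision risk) governs flight speed."""
    max_flow_per_speed = max(math.sin(a) / d for a, d in obstacles)
    return set_point / max_flow_per_speed

# A wall 0.2 m away at 60 deg dominates one 1 m away at 90 deg,
# so the bee slows down for it even though it is not at the fixed set angle:
v = commanded_speed([(math.radians(60), 0.2), (math.radians(90), 1.0)],
                    set_point=3.0)
```

A fixed-viewing-angle strategy would instead read the flow only at one angle; taking the maximum is what makes the response depend on the nearest obstacle, matching the behaviour reported above.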
Fast left ventricle tracking in CMR images using localized anatomical affine optical flow
NASA Astrophysics Data System (ADS)
Queirós, Sandro; Vilaça, João. L.; Morais, Pedro; Fonseca, Jaime C.; D'hooge, Jan; Barbosa, Daniel
2015-03-01
In daily cardiology practice, assessment of left ventricular (LV) global function using non-invasive imaging remains central for the diagnosis and follow-up of patients with cardiovascular diseases. Despite the different methodologies currently accessible for LV segmentation in cardiac magnetic resonance (CMR) images, fast and complete LV delineation is still not widely available for routine use. In this study, a localized anatomically constrained affine optical flow method is proposed for fast and automatic LV tracking throughout the full cardiac cycle in short-axis CMR images. Starting from an automatically delineated LV in the end-diastolic frame, the endocardial and epicardial boundaries are propagated by estimating the motion between adjacent cardiac phases using optical flow. In order to reduce the computational burden, the motion is only estimated in an anatomical region of interest around the tracked boundaries and subsequently integrated into a local affine motion model. Such localized estimation enables the capture of complex motion patterns, while still being spatially consistent. The method was validated on 45 CMR datasets taken from the 2009 MICCAI LV segmentation challenge. The proposed approach proved to be robust and efficient, with an average distance error of 2.1 mm and a correlation with reference ejection fraction of 0.98 (1.9 +/- 4.5%). Moreover, it proved to be fast, taking 5 seconds to track a full 4D dataset (30 ms per image). Overall, a novel fast, robust, and accurate LV tracking methodology is proposed, enabling accurate assessment of relevant global cardiac function indices, such as volumes and ejection fraction.
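The local affine motion model at the heart of the tracker can be illustrated by a least-squares fit of affine parameters to flow vectors sampled around the boundary. A minimal sketch (normal equations solved with a small Gaussian elimination; an illustration of the idea, not the authors' implementation):

```python
def fit_affine(points, flows):
    """Least-squares affine motion model u = a0 + a1*x + a2*y (and likewise
    for v), fitted to flow vectors measured at the given points. Integrating
    local flow into such a model is what keeps a propagated contour
    spatially consistent while still allowing complex motion."""
    def solve3(A, b):  # Gauss-Jordan elimination with partial pivoting, 3x3
        M = [row[:] + [bi] for row, bi in zip(A, b)]
        for i in range(3):
            p = max(range(i, 3), key=lambda r: abs(M[r][i]))
            M[i], M[p] = M[p], M[i]
            for r in range(3):
                if r != i:
                    f = M[r][i] / M[i][i]
                    M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
        return [M[i][3] / M[i][i] for i in range(3)]

    # Normal equations G^T G a = G^T u with design rows (1, x, y)
    G = [(1.0, x, y) for x, y in points]
    GtG = [[sum(g[i] * g[j] for g in G) for j in range(3)] for i in range(3)]
    coeffs = []
    for comp in range(2):
        Gtu = [sum(g[i] * f[comp] for g, f in zip(G, flows)) for i in range(3)]
        coeffs.append(solve3(GtG, Gtu))
    return coeffs  # [[a0, a1, a2] for u, [b0, b1, b2] for v]

# Pure translation by (2, -1) is recovered exactly:
pts = [(0, 0), (4, 0), (0, 4), (4, 4), (2, 3)]
u, v = fit_affine(pts, [(2.0, -1.0)] * 5)
```

Fitting only six parameters over a region of interest also regularizes the noisy per-pixel flow, which is part of why the localized approach is both fast and robust.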
2017-04-01
Complementary fusion: a fourth-order Butterworth filter was used to high-pass the ocelli signal and low-pass the optic flow signal, and the normalized cutoff frequency had to be kept low. A high-frequency cutoff was added to reject the flickering noise introduced by luminance changes for indoor usage, while the function of the low-pass filter is to attenuate high-frequency noise; together the filtered signals pass through a band-pass transfer function.
The effect of external forces on discrete motion within holographic optical tweezers.
Eriksson, E; Keen, S; Leach, J; Goksör, M; Padgett, M J
2007-12-24
Holographic optical tweezers is a widely used technique to manipulate the individual positions of optically trapped micron-sized particles in a sample. The trap positions are changed by updating the holographic image displayed on a spatial light modulator. The updating process takes a finite time, resulting in a temporary decrease of the intensity, and thus the stiffness, of the optical trap. We have investigated this change in trap stiffness during the updating process by studying the motion of an optically trapped particle in a fluid flow. We found a highly nonlinear behavior of the change in trap stiffness vs. changes in step size. For step sizes up to approximately 300 nm the trap stiffness is decreasing. Above 300 nm the change in trap stiffness remains constant for all step sizes up to one particle radius. This information is crucial for optical force measurements using holographic optical tweezers.
Importance of perceptual representation in the visual control of action
NASA Astrophysics Data System (ADS)
Loomis, Jack M.; Beall, Andrew C.; Kelly, Jonathan W.; Macuga, Kristen L.
2005-03-01
In recent years, many experiments have demonstrated that optic flow is sufficient for visually controlled action, with the suggestion that perceptual representations of 3-D space are superfluous. In contrast, recent research in our lab indicates that some visually controlled actions, including some thought to be based on optic flow, are indeed mediated by perceptual representations. For example, we have demonstrated that people are able to perform complex spatial behaviors, like walking, driving, and object interception, in virtual environments which are rendered visible solely by cyclopean stimulation (random-dot cinematograms). In such situations, the absence of any retinal optic flow that is correlated with the objects and surfaces within the virtual environment means that people are using stereo-based perceptual representations to perform the behavior. The fact that people can perform such behaviors without training suggests that the perceptual representations are likely the same as those used when retinal optic flow is present. Other research indicates that optic flow, whether retinal or a more abstract property of the perceptual representation, is not the basis for postural control, because postural instability is related to perceived relative motion between self and the visual surroundings rather than to optic flow, even in the abstract sense.
Real-time optical flow estimation on a GPU for a skid-steered mobile robot
NASA Astrophysics Data System (ADS)
Kniaz, V. V.
2016-04-01
Accurate egomotion estimation is required for mobile robot navigation. Often the egomotion is estimated using optical flow algorithms. For accurate estimation of optical flow, most modern algorithms require high memory resources and processor speed. However, the simple single-board computers that control the motion of a robot usually do not provide such resources. On the other hand, most modern single-board computers are equipped with an embedded GPU that could be used in parallel with the CPU to improve the performance of the optical flow estimation algorithm. This paper presents a new Z-flow algorithm for efficient computation of optical flow using an embedded GPU. The algorithm is based on phase correlation optical flow estimation and provides real-time performance on a low-cost embedded GPU. A layered optical flow model is used. Layer segmentation is performed using a graph-cut algorithm with a time-derivative-based energy function. This approach makes the algorithm both fast and robust in low-light and low-texture conditions. The algorithm's implementation for a Raspberry Pi Model B computer is discussed. For evaluation of the algorithm, the computer was mounted on a Hercules skid-steered mobile robot equipped with a monocular camera. The evaluation was performed using hardware-in-the-loop simulation and experiments with the Hercules mobile robot. The algorithm was also evaluated using the KITTI Optical Flow 2015 dataset. The resulting endpoint error of the optical flow calculated with the developed algorithm was low enough for navigation of the robot along the desired trajectory.
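Phase-correlation shift estimation, the core of the Z-flow approach as described, can be sketched in 1D with a naive DFT. Pure Python for clarity; a real implementation would use an FFT and run per-layer on the GPU:

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def phase_correlation_shift(a, b):
    """Estimate the circular shift taking signal a to signal b.
    The cross-power spectrum is normalized to unit magnitude so that only
    phase (i.e., displacement) information remains; its inverse transform
    peaks at the shift."""
    A, B = dft(a), dft(b)
    R = []
    for Ak, Bk in zip(A, B):
        C = Bk * Ak.conjugate()
        R.append(C / abs(C) if abs(C) > 1e-12 else 0)
    r = idft(R)
    return max(range(len(r)), key=lambda n: r[n].real)

# A pattern circularly shifted right by 3 samples:
a = [0, 0, 1, 2, 3, 2, 1, 0] + [0] * 8
b = a[-3:] + a[:-3]
print(phase_correlation_shift(a, b))  # → 3
```

Because the normalization whitens the spectrum, the peak is sharp even under brightness changes, which is consistent with the robustness in low-light, low-texture conditions claimed above.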
Estimation of velocities via optical flow
NASA Astrophysics Data System (ADS)
Popov, A.; Miller, A.; Miller, B.; Stepanyan, K.
2017-02-01
This article presents an approach to using optic flow (OF) as a general navigation means, providing information about a vehicle's linear and angular velocities. The term "OF" comes from opto-electronic devices, where it corresponds to a video sequence of images related to the camera's motion over static surfaces or a set of objects. Even if the positions of these objects are unknown in advance, one can estimate the camera motion from the video sequence itself together with some metric information, such as the distance between the objects or the range to the surface. This approach is applicable to any passive observation system able to produce a sequence of images, such as a radio locator or sonar. Here the UAV application of the OF is considered since it is historically
Some uses of wavelets for imaging dynamic processes in live cochlear structures
NASA Astrophysics Data System (ADS)
Boutet de Monvel, J.
2007-09-01
A variety of image and signal processing algorithms based on wavelet filtering tools have been developed during the last few decades, that are well adapted to the experimental variability typically encountered in live biological microscopy. A number of processing tools are reviewed, that use wavelets for adaptive image restoration and for motion or brightness variation analysis by optical flow computation. The usefulness of these tools for biological imaging is illustrated in the context of the restoration of images of the inner ear and the analysis of cochlear motion patterns in two and three dimensions. I also report on recent work that aims at capturing fluorescence intensity changes associated with vesicle dynamics at synaptic zones of sensory hair cells. This latest application requires one to separate the intensity variations associated with the physiological process under study from the variations caused by motion of the observed structures. A wavelet optical flow algorithm for doing this is presented, and its effectiveness is demonstrated on artificial and experimental image sequences.
Calibration, Information, and Control Strategies for Braking to Avoid a Collision
ERIC Educational Resources Information Center
Fajen, Brett R.
2005-01-01
This study explored visual control strategies for braking to avoid collision by manipulating information about speed of self-motion. Participants watched computer-generated displays and used a brake to stop at an object in the path of motion. Global optic flow rate and edge rate were manipulated by adjusting eyeheight and ground-texture size.…
Schlieren System and method for moving objects
NASA Technical Reports Server (NTRS)
Weinstein, Leonard M. (Inventor)
1995-01-01
A system and method are provided for recording density changes in a flow field surrounding a moving object. A mask having an aperture for regulating the passage of images is placed in front of an image recording medium. An optical system is placed in front of the mask. A transition having a light field-of-view and a dark field-of-view is located beyond the test object. The optical system focuses an image of the transition at the mask such that the aperture causes a band of light to be defined on the image recording medium. The optical system further focuses an image of the object through the aperture of the mask so that the image of the object appears on the image recording medium. Relative motion is minimized between the mask and the transition. Relative motion is also minimized between the image recording medium and the image of the object. In this way, the image of the object and density changes in a flow field surrounding the object are recorded on the image recording medium when the object crosses the transition in front of the optical system.
Caudek, Corrado; Fantoni, Carlo; Domini, Fulvio
2011-01-01
We measured perceived depth from the optic flow (a) when showing a stationary physical or virtual object to observers who moved their head at a normal or slower speed, and (b) when simulating the same optic flow on a computer and presenting it to stationary observers. Our results show that perceived surface slant is systematically distorted, for both the active and the passive viewing of physical or virtual surfaces. These distortions are modulated by head translation speed, with perceived slant increasing directly with the local velocity gradient of the optic flow. This empirical result allows us to determine the relative merits of two alternative approaches aimed at explaining perceived surface slant in active vision: an “inverse optics” model that takes head motion information into account, and a probabilistic model that ignores extra-retinal signals. We compare these two approaches within the framework of the Bayesian theory. The “inverse optics” Bayesian model produces veridical slant estimates if the optic flow and the head translation velocity are measured with no error; because of the influence of a “prior” for flatness, the slant estimates become systematically biased as the measurement errors increase. The Bayesian model, which ignores the observer's motion, always produces distorted estimates of surface slant. Interestingly, the predictions of this second model, not those of the first one, are consistent with our empirical findings. The present results suggest that (a) in active vision perceived surface slant may be the product of probabilistic processes which do not guarantee the correct solution, and (b) extra-retinal signals may be mainly used for a better measurement of retinal information.
1 kHz 2D Visual Motion Sensor Using 20 × 20 Silicon Retina Optical Sensor and DSP Microcontroller.
Liu, Shih-Chii; Yang, MinHao; Steiner, Andreas; Moeckel, Rico; Delbruck, Tobi
2015-04-01
Optical flow sensors have been a long running theme in neuromorphic vision sensors which include circuits that implement the local background intensity adaptation mechanism seen in biological retinas. This paper reports a bio-inspired optical motion sensor aimed towards miniature robotic and aerial platforms. It combines a 20 × 20 continuous-time CMOS silicon retina vision sensor with a DSP microcontroller. The retina sensor has pixels that have local gain control and adapt to background lighting. The system allows the user to validate various motion algorithms without building dedicated custom solutions. Measurements are presented to show that the system can compute global 2D translational motion from complex natural scenes using one particular algorithm: the image interpolation algorithm (I2A). With this algorithm, the system can compute global translational motion vectors at a sample rate of 1 kHz, for speeds up to ±1000 pixels/s, using less than 5 k instruction cycles (12 instructions per pixel) per frame. At 1 kHz sample rate the DSP is 12% occupied with motion computation. The sensor is implemented as a 6 g PCB consuming 170 mW of power.
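The image interpolation algorithm (I2A) named above estimates a global shift by linearly interpolating between copies of a reference image displaced by ±1 pixel and solving the resulting least-squares problem in closed form. A 1D sketch (the sensor computes 2D translation; the derivation is analogous per axis):

```python
def i2a_shift(ref, img):
    """Image interpolation algorithm (I2A), 1D: estimate the global
    translation s (in pixels, rightward positive) of img relative to ref.
    Model: img(x) ~ ref(x) + s * h(x), where h(x) = (ref(x-1) - ref(x+1))/2
    is the interpolation basis built from +/-1-pixel shifted references.
    Minimizing the squared residual gives s in closed form."""
    num = den = 0.0
    for x in range(1, len(ref) - 1):
        h = 0.5 * (ref[x - 1] - ref[x + 1])   # interpolation basis
        num += (img[x] - ref[x]) * h
        den += h * h
    return num / den

# Smooth ramp-and-bump signal shifted right by one pixel:
ref = [0, 0, 1, 3, 6, 8, 6, 3, 1, 0, 0, 0]
img = [0] + ref[:-1]
print(round(i2a_shift(ref, img), 2))  # → 1.0
```

The closed-form solve is why the method fits in a few instructions per pixel, consistent with the 12-instructions-per-pixel budget reported above; the subpixel estimate is only valid for shifts of roughly a pixel or less per sample, hence the high 1 kHz sample rate.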
Warren, Paul A; Rushton, Simon K
2009-05-01
We have recently suggested that the brain uses its sensitivity to optic flow in order to parse retinal motion into components arising due to self and object movement (e.g. Rushton, S. K., & Warren, P. A. (2005). Moving observers, 3D relative motion and the detection of object movement. Current Biology, 15, R542-R543). Here, we explore whether stereo disparity is necessary for flow parsing or whether other sources of depth information, which could theoretically constrain flow-field interpretation, are sufficient. Stationary observers viewed large field of view stimuli containing textured cubes, moving in a manner that was consistent with a complex observer movement through a stationary scene. Observers made speeded responses to report the perceived direction of movement of a probe object presented at different depths in the scene. Across conditions we varied the presence or absence of different binocular and monocular cues to depth order. In line with previous studies, results consistent with flow parsing (in terms of both perceived direction and response time) were found in the condition in which motion parallax and stereoscopic disparity were present. Observers were poorer at judging object movement when depth order was specified by parallax alone. However, as more monocular depth cues were added to the stimulus the results approached those found when the scene contained stereoscopic cues. We conclude that both monocular and binocular static depth information contribute to flow parsing. These findings are discussed in the context of potential architectures for a model of the flow parsing mechanism.
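The flow-parsing hypothesis amounts to subtracting, at each retinal location, the motion component attributable to self-motion. A schematic sketch (vectors only; the real mechanism must scale the subtracted component by the probe's depth, which is exactly where disparity and monocular depth cues enter):

```python
def parse_object_motion(retinal_flow, self_motion_flow):
    """Flow parsing as vector subtraction: the scene-relative motion of a
    probe is its retinal motion minus the optic-flow component that the
    estimated self-motion would produce at the probe's location and depth."""
    return tuple(r - s for r, s in zip(retinal_flow, self_motion_flow))

# The probe drifts right on the retina, but observer motion alone would make
# a stationary point at that depth drift right faster, so the parsed
# (scene-relative) motion is leftward:
obj = parse_object_motion((3.0, 0.0), (5.0, 0.0))  # → (-2.0, 0.0)
```

Misjudging the probe's depth scales the subtracted vector wrongly, biasing the perceived direction, which is why impoverished depth-order cues degrade performance in the way reported above.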
Towards designing an optical-flow based colonoscopy tracking algorithm: a comparative study
NASA Astrophysics Data System (ADS)
Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.
2013-03-01
Automatic co-alignment of optical and virtual colonoscopy images can supplement traditional endoscopic procedures by providing more complete information of clinical value to the gastroenterologist. In this work, we present a comparative analysis of our optical-flow-based technique for colonoscopy tracking, in relation to current state-of-the-art methods, in terms of tracking accuracy, system stability, and computational efficiency. Our optical-flow-based colonoscopy tracking algorithm starts with computing multi-scale dense and sparse optical flow fields to measure image displacements. Camera motion parameters are then determined from the optical flow fields by employing a Focus of Expansion (FOE) constrained egomotion estimation scheme. We analyze the design choices involved in the three major components of our algorithm: dense optical flow, sparse optical flow, and egomotion estimation. Brox's optical flow method, due to its high accuracy, was used to compare and evaluate our multi-scale dense optical flow scheme. SIFT and Harris-affine features were used to assess the accuracy of the multi-scale sparse optical flow, because of their wide use in tracking applications; the FOE-constrained egomotion estimation was compared with collinear, image deformation, and image derivative based egomotion estimation methods, to understand the stability of our tracking system. Two virtual colonoscopy (VC) image sequences were used in the study, since the exact camera parameters for each frame were known. Dense optical flow results indicated that Brox's method was superior to multi-scale dense optical flow in estimating camera rotational velocities, but the final tracking errors were comparable, viz., 6 mm vs. 8 mm after the VC camera traveled 110 mm. Our approach was computationally more efficient, averaging 7.2 s vs. 38 s per frame. SIFT and Harris-affine features resulted in tracking errors of up to 70 mm, while our sparse optical flow error was 6 mm.
The comparison among egomotion estimation algorithms showed that our FOE-constrained egomotion estimation method achieved the optimal balance between tracking accuracy and robustness. The comparative study demonstrated that our optical-flow-based colonoscopy tracking algorithm maintains good accuracy and stability for routine use in clinical practice.
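FOE-constrained egomotion estimation relies on the fact that, under pure camera translation, every flow vector lies on a line through the focus of expansion. A minimal least-squares FOE estimator built from that constraint (illustrative only; the paper's scheme constrains the full egomotion estimate, not just the FOE):

```python
def estimate_foe(points, flows):
    """Least-squares focus of expansion for a radially expanding flow field.
    Under pure translation, the flow (u, v) at (x, y) is parallel to the
    direction from the FOE (ex, ey), giving one linear constraint per vector:
        u*(y - ey) - v*(x - ex) = 0   <=>   v*ex - u*ey = v*x - u*y."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (u, v) in zip(points, flows):
        c1, c2, rhs = v, -u, v * x - u * y
        a11 += c1 * c1; a12 += c1 * c2; a22 += c2 * c2
        b1 += c1 * rhs; b2 += c2 * rhs
    det = a11 * a22 - a12 * a12   # 2x2 normal equations
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Synthetic expansion about (10, 20): flow at p is 0.3 * (p - foe)
pts = [(0, 0), (15, 5), (30, 20), (10, 40), (-5, 25)]
flw = [(0.3 * (x - 10), 0.3 * (y - 20)) for x, y in pts]
print(estimate_foe(pts, flw))
```

Pinning the FOE in this way removes two degrees of freedom from the egomotion problem, which is the intuition behind the stability advantage the comparison above reports.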
Spatial attention is attracted in a sustained fashion toward singular points in the optic flow.
Wang, Shuo; Fukuchi, Masaki; Koch, Christof; Tsuchiya, Naotsugu
2012-01-01
While a single approaching object is known to attract spatial attention, it is unknown how attention is directed when the background looms towards the observer as s/he moves forward in a quasi-stationary environment. In Experiment 1, we used a cued speeded discrimination task to quantify where and how spatial attention is directed towards the target superimposed onto a cloud of moving dots. We found that when the motion was expansive, attention was attracted towards the singular point of the optic flow (the focus of expansion, FOE) in a sustained fashion. The effects were less pronounced when the motion was contractive. The more ecologically valid the motion features became (e.g., temporal expansion of each dot, spatial depth structure implied by distribution of the size of the dots), the stronger the attentional effects. Further, the attentional effects were sustained over 1000 ms. Experiment 2 quantified these attentional effects using a change detection paradigm by zooming into or out of photographs of natural scenes. Spatial attention was attracted in a sustained manner such that change detection was facilitated or delayed depending on the location of the FOE only when the motion was expansive. Our results suggest that focal attention is strongly attracted towards singular points that signal the direction of forward ego-motion.
Woo, Kevin L; Rieucau, Guillaume
2008-07-01
The increasing use of the video playback technique in behavioural ecology reveals a growing need to ensure better control of the visual stimuli that focal animals experience. Technological advances now allow researchers to develop computer-generated animations instead of using video sequences of live-acting demonstrators. However, care must be taken to match the motion characteristics (speed and velocity) of the animation to the original video source. Here, we present a tool based on an optic flow analysis program that measures how closely the motion characteristics of computer-generated animations resemble those of videos of live-acting animals. We examined three distinct displays (tail-flick (TF), push-up body rock (PUBR), and slow arm wave (SAW)) exhibited by animations of Jacky dragons (Amphibolurus muricatus) that were compared to the original video sequences of live lizards. We found no significant differences between the motion characteristics of videos and animations across all three displays. Our results showed that our animations are similar to the speed and velocity features of each display. Researchers need to ensure that animation and video stimuli share similar motion characteristics, a feature critical to the future success of the video playback technique.
Heading Tuning in Macaque Area V6.
Fan, Reuben H; Liu, Sheng; DeAngelis, Gregory C; Angelaki, Dora E
2015-12-16
Cortical areas, such as the dorsal subdivision of the medial superior temporal area (MSTd) and the ventral intraparietal area (VIP), have been shown to integrate visual and vestibular self-motion signals. Area V6 is interconnected with areas MSTd and VIP, allowing for the possibility that V6 also integrates visual and vestibular self-motion cues. An alternative hypothesis in the literature is that V6 does not use these sensory signals to compute heading but instead discounts self-motion signals to represent object motion. However, the responses of V6 neurons to visual and vestibular self-motion cues have never been studied, thus leaving the functional roles of V6 unclear. We used a virtual reality system to examine the 3D heading tuning of macaque V6 neurons in response to optic flow and inertial motion stimuli. We found that the majority of V6 neurons are selective for heading defined by optic flow. However, unlike areas MSTd and VIP, V6 neurons are almost universally unresponsive to inertial motion in the absence of optic flow. We also explored the spatial reference frames of heading signals in V6 by measuring heading tuning for different eye positions, and we found that the visual heading tuning of most V6 cells was eye-centered. Similar to areas MSTd and VIP, the population of V6 neurons was best able to discriminate small variations in heading around forward and backward headings. Our findings support the idea that V6 is involved primarily in processing visual motion signals and does not appear to play a role in visual-vestibular integration for self-motion perception. To understand how we successfully navigate our world, it is important to understand which parts of the brain process cues used to perceive our direction of self-motion (i.e., heading). Cortical area V6 has been implicated in heading computations based on human neuroimaging data, but direct measurements of heading selectivity in individual V6 neurons have been lacking. 
We provide the first demonstration that V6 neurons carry 3D visual heading signals, which are represented in an eye-centered reference frame. In contrast, we found almost no evidence for vestibular heading signals in V6, indicating that V6 is unlikely to contribute to multisensory integration of heading signals, unlike other cortical areas. These findings provide important constraints on the roles of V6 in self-motion perception.
Three-Dimensional High-Resolution Optical/X-Ray Stereoscopic Tracking Velocimetry
NASA Technical Reports Server (NTRS)
Cha, Soyoung S.; Ramachandran, Narayanan
2004-01-01
Measurement of three-dimensional (3-D) three-component velocity fields is of great importance in a variety of research and industrial applications for understanding materials processing, fluid physics, and strain/displacement measurements. The 3-D experiments in these fields most likely inhibit the use of conventional techniques, which are based only on planar and optically-transparent-field observation. Here, we briefly review the current status of 3-D diagnostics for motion/velocity detection, for both optical and x-ray systems. As an initial step for providing 3-D capabilities, we have developed stereoscopic tracking velocimetry (STV) to measure 3-D flow/deformation through optical observation. The STV is advantageous in system simplicity, for continually observing 3-D phenomena in near real-time. In an effort to enhance the data processing through automation and to avoid the confusion in tracking numerous markers or particles, artificial neural networks are employed to incorporate human intelligence. Our initial optical investigations have proven the STV to be a very viable candidate for reliably measuring 3-D flow motions. With previous activities focused on improving the processing efficiency, overall accuracy, and automation of the optical system, the current effort is directed toward concurrent expansion to the x-ray system for broader experimental applications.
Three-Dimensional High-Resolution Optical/X-Ray Stereoscopic Tracking Velocimetry
NASA Technical Reports Server (NTRS)
Cha, Soyoung S.; Ramachandran, Naryanan
2005-01-01
Measurement of three-dimensional (3-D) three-component velocity fields is of great importance in a variety of research and industrial applications for understanding materials processing, fluid physics, and strain/displacement measurements. The 3-D experiments in these fields most likely inhibit the use of conventional techniques, which are based only on planar and optically-transparent-field observation. Here, we briefly review the current status of 3-D diagnostics for motion/velocity detection, for both optical and x-ray systems. As an initial step for providing 3-D capabilities, we have developed stereoscopic tracking velocimetry (STV) to measure 3-D flow/deformation through optical observation. The STV is advantageous in system simplicity, for continually observing 3-D phenomena in near real-time. In an effort to enhance the data processing through automation and to avoid the confusion in tracking numerous markers or particles, artificial neural networks are employed to incorporate human intelligence. Our initial optical investigations have proven the STV to be a very viable candidate for reliably measuring 3-D flow motions. With previous activities focused on improving the processing efficiency, overall accuracy, and automation of the optical system, the current effort is directed toward concurrent expansion to the x-ray system for broader experimental applications.
Sudo, S; Ohtomo, T; Otsuka, K
2015-08-01
We achieved a highly sensitive method for observing the motion of colloidal particles in a flowing suspension using a self-mixing laser Doppler velocimeter (LDV) comprising a laser-diode-pumped thin-slice solid-state laser and a simple photodiode. We describe the measurement method and the optical system of the self-mixing LDV for real-time measurements of the motion of colloidal particles. For a condensed solution, when the light scattered from the particles is reinjected into the solid-state laser, the laser output is modulated in intensity by the reinjected laser light. Thus, we can capture the motion of colloidal particles from the spectrum of the modulated laser output. For a diluted solution, when the relaxation oscillation frequency coincides with the Doppler shift frequency, fd, which is related to the average velocity of the particles, the spectrum reflecting the motion of the colloidal particles is enhanced by the resonant excitation of relaxation oscillations. Then, the spectral peak reflecting the motion of colloidal particles appears at 2×fd. The spectrum reflecting the motion of colloidal particles in a flowing diluted solution can be measured with high sensitivity, owing to the enhancement of the spectrum by the thin-slice solid-state laser.
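The Doppler relations underlying the measurement above can be sketched with the standard laser-Doppler formulas (textbook relations, not code from the paper; the wavelength and velocity used below are illustrative):

```python
import math

# Standard laser-Doppler relations (assumed formulation, not from the paper).
def doppler_shift(velocity_m_s, wavelength_m, angle_rad=0.0):
    """Doppler shift f_d = 2 * v * cos(theta) / lambda for back-scattered light."""
    return 2.0 * velocity_m_s * math.cos(angle_rad) / wavelength_m

def self_mixing_peak(velocity_m_s, wavelength_m):
    """For the diluted solution, the spectral peak appears at 2 * f_d."""
    return 2.0 * doppler_shift(velocity_m_s, wavelength_m)
```

For example, a 1 m/s flow observed with a 1 μm laser gives a 2 MHz Doppler shift and a self-mixing spectral peak at 4 MHz.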
Minimum viewing angle for visually guided ground speed control in bumblebees.
Baird, Emily; Kornfeldt, Torill; Dacke, Marie
2010-05-01
To control flight, flying insects extract information from the pattern of visual motion generated during flight, known as optic flow. To regulate their ground speed, insects such as honeybees and Drosophila hold the rate of optic flow in the axial direction (front-to-back) constant. A consequence of this strategy is that its performance varies with the minimum viewing angle (the deviation from the frontal direction of the longitudinal axis of the insect) at which changes in axial optic flow are detected. The greater this angle, the later changes in the rate of optic flow, caused by changes in the density of the environment, will be detected. The aim of the present study is to examine the mechanisms of ground speed control in bumblebees and to identify the extent of the visual range over which optic flow for ground speed control is measured. Bumblebees were trained to fly through an experimental tunnel consisting of parallel vertical walls. Flights were recorded when (1) the distance between the tunnel walls was either 15 or 30 cm, (2) the visual texture on the tunnel walls provided either strong or weak optic flow cues and (3) the distance between the walls changed abruptly halfway along the tunnel's length. The results reveal that bumblebees regulate ground speed using optic flow cues and that changes in the rate of optic flow are detected at a minimum viewing angle of 23-30 deg., with a visual field that extends to approximately 155 deg. By measuring optic flow over a visual field that has a low minimum viewing angle, bumblebees are able to detect and respond to changes in the proximity of the environment well before they are encountered.
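The constant-flow-rate strategy described above can be sketched as a one-step proportional controller; the set-point, gain, and tunnel geometry below are illustrative assumptions, not values from the study:

```python
# Minimal sketch of ground-speed control by holding the axial optic flow rate
# at a set-point (hypothetical set-point and gain, not from the study).

def flow_rate(ground_speed, wall_distance):
    """Translational optic flow rate (rad/s) from a wall at lateral distance d."""
    return ground_speed / wall_distance

def speed_update(ground_speed, wall_distance, target_flow=2.0, gain=0.05):
    """One control step: slow down when the flow rate exceeds the set-point."""
    error = target_flow - flow_rate(ground_speed, wall_distance)
    return ground_speed + gain * error

# Halving the tunnel width doubles the flow rate, so the controller slows down.
v_narrow = speed_update(1.0, wall_distance=0.075)  # 15 cm tunnel -> d = 7.5 cm
v_wide = speed_update(1.0, wall_distance=0.15)     # 30 cm tunnel -> d = 15 cm
```

This reproduces the qualitative behaviour reported for bees: flight is slower in the narrower tunnel, without any explicit distance estimate.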
Ego-motion based on EM for bionic navigation
NASA Astrophysics Data System (ADS)
Yue, Xiaofeng; Wang, L. J.; Liu, J. G.
2015-12-01
Research has shown that flying insects such as bees achieve efficient and robust flight control, and biologists have explored some of the biomimetic principles underlying how they control flight. Based on those basic studies and principles acquired from flying insects, this paper proposes a different solution for recovering ego-motion for low-level navigation. Firstly, a new type of entropy flow is provided to calculate the motion parameters. Secondly, the extended Kalman filter (EKF), which has been used in navigation for some years to correct accumulated error, and Expectation-Maximization (EM), which is widely used to estimate parameters, are put together to determine the ego-motion estimation of aerial vehicles. Numerical simulation in MATLAB shows that this navigation system provides more accurate position estimates and smaller mean absolute error than pure optical flow navigation. This paper represents pioneering work in applying bionic mechanisms to space navigation.
Optical Flow for Flight and Wind Tunnel Background Oriented Schlieren Imaging
NASA Technical Reports Server (NTRS)
Smith, Nathanial T.; Heineck, James T.; Schairer, Edward T.
2017-01-01
Background oriented Schlieren images have historically been generated by calculating the observed pixel displacement between a wind-on and wind-off image pair using normalized cross-correlation. This work uses optical flow to solve for the displacement fields which generate the Schlieren images. A well-established method in the computer vision community, optical flow is the apparent motion in an image sequence due to brightness changes. The regularization method of Horn and Schunck is used to create Schlieren images using two data sets: a supersonic jet plume shock interaction from the NASA Ames Unitary Plan Wind Tunnel, and a transonic flight test of a T-38 aircraft using a naturally occurring background, performed in conjunction with NASA Ames and Armstrong Research Centers. Results are presented and contrasted with those using normalized cross-correlation. The optical flow Schlieren images are found to provide significantly more detail. We apply the method to historical data sets to demonstrate the broad applicability and limitations of the technique.
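The Horn and Schunck regularization named above can be sketched as the classic iterative scheme; the smoothness weight and iteration count below are illustrative, not the values used for the Schlieren data:

```python
import numpy as np

# Classic Horn-Schunck optical flow sketch (assumed standard formulation;
# alpha and n_iter are illustrative parameters).
def horn_schunck(im1, im2, alpha=0.5, n_iter=200):
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    Ix = np.gradient(im1, axis=1)   # spatial image gradients
    Iy = np.gradient(im1, axis=0)
    It = im2 - im1                  # temporal gradient
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)

    def avg(f):
        # 4-neighbour average approximates the smoothness (Laplacian) term
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0

    for _ in range(n_iter):
        u_bar, v_bar = avg(u), avg(v)
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v
```

Applied to a pattern shifted one pixel to the right, the recovered `u` field is positive where the pattern has texture, which is the displacement signal a Schlieren image visualizes.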
Inertial navigation sensor integrated obstacle detection system
NASA Technical Reports Server (NTRS)
Bhanu, Bir (Inventor); Roberts, Barry A. (Inventor)
1992-01-01
A system that incorporates inertial sensor information into optical flow computations to detect obstacles and to provide alternative navigational paths free from obstacles. The system is a maximally passive obstacle detection system that makes selective use of an active sensor, which typically utilizes a laser. The passive sensor suite includes binocular stereo, motion stereo, and variable fields-of-view. Optical flow computations involve extraction, derotation, and matching of interest points from sequential frames of imagery for range interpolation of the sensed scene, which in turn provides obstacle information for purposes of safe navigation.
Doppler imaging using spectrally-encoded endoscopy
Yelin, Dvir; Bouma, B. E.; Rosowsky, J. J.; Tearney, G. J.
2009-01-01
The capability to image tissue motion such as blood flow through an endoscope could have many applications in medicine. Spectrally encoded endoscopy (SEE) is a recently introduced technique that utilizes a single optical fiber and miniature diffractive optics to obtain endoscopic images through small diameter probes. Using spectral-domain interferometry, SEE is furthermore capable of three-dimensional volume imaging at video rates. Here we show that by measuring relative spectral phases, this technology can additionally measure Doppler shifts. Doppler SEE is demonstrated in flowing Intralipid phantoms and vibrating middle ear ossicles. PMID:18795020
Resolution experiments using the white light speckle method.
Conley, E; Cloud, G
1991-03-01
Noncoherent light speckle methods have been successfully applied to gauge the motion of glaciers and buildings. Resolution of the optical method was limited by the aberrating turbulent atmosphere through which the images were collected. Sensitivity limitations regarding this particular application of speckle interferometry are discussed and analyzed. Resolution limit experiments that were incidental to glacier flow studies are related to the basic theory of astronomical imaging. Optical resolution of the ice flow measurement technique is shown to be in substantial agreement with the sensitivity predictions of astronomy theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Kaiming; Teo, Peng; Kawalec, Philip
2016-08-15
Purpose: This work reports on the development of a mechanical slider system for counter-steering of tumor motion in adaptive radiation therapy (RT). The tumor motion was tracked using a weighted optical flow algorithm and its position was predicted with a neural network (NN). Methods: The components of the proposed mechanical counter-steering system include: (1) an actuator which provides the tumor motion, (2) motion detection using an optical flow algorithm, (3) motion prediction using a neural network, (4) a control module and (5) a mechanical slider to counter-steer the anticipated motion of the tumor phantom. An asymmetrical cosine function and five patient traces (P1–P5) were used to evaluate the tracking of a 3D-printed lung tumor. In the proposed mechanical counter-steering system, both the actuator (Zaber NA14D60) and the slider (Zaber A-BLQ0070-E01) were programmed to move independently with LabVIEW and their positions were recorded by two potentiometers (ETI LCP12S-25). The accuracy of this counter-steering system is given by the difference between the two potentiometers. Results: The inherent accuracy of the system, measured using the cosine function, is −0.15 ± 0.06 mm, while the error with tracking and prediction included is 0.04 ± 0.71 mm. Conclusion: A prototype tumor motion counter-steering system with tracking and prediction was implemented. The inherent errors are small in comparison to the tracking and prediction errors, which in turn are small in comparison to the magnitude of tumor motion. The results show that this system is suited for evaluating RT tracking and prediction.
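The counter-steering idea above can be sketched in a few lines: the slider is commanded to the negative of the predicted tumor position so the two motions cancel. A one-step linear extrapolator stands in for the paper's neural network (an assumption made purely for illustration):

```python
# Toy counter-steering sketch; the linear predictor is a stand-in for the NN.

def predict_next(history):
    """One-step linear extrapolation: x_next = 2*x[-1] - x[-2]."""
    return 2 * history[-1] - history[-2]

def counter_steer(tumor_trace):
    """Residual motion (what the difference of the two potentiometers records)."""
    residual = []
    for t in range(2, len(tumor_trace)):
        slider = -predict_next(tumor_trace[:t])   # commanded slider position
        residual.append(tumor_trace[t] + slider)  # tumor + slider = net motion
    return residual
```

For a perfectly linear trace the predictor is exact and the residual is zero; prediction error shows up directly as residual motion, mirroring the paper's accuracy metric.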
Dense depth maps from correspondences derived from perceived motion
NASA Astrophysics Data System (ADS)
Kirby, Richard; Whitaker, Ross
2017-01-01
Many computer vision applications require finding corresponding points between images and using the corresponding points to estimate disparity. Today's correspondence finding algorithms primarily use image features or pixel intensities common between image pairs. Some 3-D computer vision applications, however, do not produce the desired results using correspondences derived from image features or pixel intensities. Two examples are the multimodal camera rig and the center region of a coaxial camera rig. We present an image correspondence finding technique that aligns pairs of image sequences using optical flow fields. The optical flow fields provide information about the structure and motion of the scene, which are not available in still images but can be used in image alignment. We apply the technique to a dual focal length stereo camera rig consisting of a visible light-infrared camera pair and to a coaxial camera rig. We test our method on real image sequences and compare our results with the state-of-the-art multimodal and structure from motion (SfM) algorithms. Our method produces more accurate depth and scene velocity reconstruction estimates than the state-of-the-art multimodal and SfM algorithms.
Detection of linear ego-acceleration from optic flow.
Festl, Freya; Recktenwald, Fabian; Yuan, Chunrong; Mallot, Hanspeter A
2012-07-20
Human observers are able to estimate various ego-motion parameters from optic flow, including rotation, translational heading, time-to-collision (TTC), time-to-passage (TTP), etc. The perception of linear ego-acceleration or deceleration, i.e., changes of translational velocity, is less well understood. While time-to-passage experiments indicate that ego-acceleration is neglected, subjects are able to keep their (perceived) speed constant under changing conditions, indicating that some sense of ego-acceleration or velocity change must be present. In this paper, we analyze the relation of ego-acceleration estimates and geometrical parameters of the environment using simulated flights through cylindrical and conic (narrowing or widening) corridors. Theoretical analysis shows that a logarithmic ego-acceleration parameter, called the acceleration rate ρ, can be calculated from retinal acceleration measurements. This parameter is independent of the geometrical layout of the scene; if veridical ego-motion is known at some instant in time, acceleration rate allows updating of ego-motion without further depth-velocity calibration. Results indicate, however, that subjects systematically confuse ego-acceleration with corridor narrowing and ego-deceleration with corridor widening, while veridically judging ego-acceleration in straight corridors. We conclude that judgments of ego-acceleration are based on first-order retinal flow and do not make use of acceleration rate or retinal acceleration.
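One natural reading of the scale-free "acceleration rate" above (an assumption here, since the paper's exact definition is not reproduced) is the logarithmic derivative of speed, which is independent of scene scale:

```python
# rho = d/dt ln v(t) = a / v (assumed reading of the "acceleration rate").
# Scene scale cancels, so no depth-velocity calibration is needed.

def acceleration_rate(velocity, acceleration):
    return acceleration / velocity
```

Doubling both the speed and the acceleration, as when the same flight is viewed at twice the spatial scale, leaves rho unchanged, which is the scene-layout independence the abstract describes.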
NASA Astrophysics Data System (ADS)
Ohyama, Ryu-Ichiro; Fukumoto, Masaru
A DC corona-discharge-induced electrohydrodynamic (EHD) flow phenomenon was investigated for a multi-phase fluid containing a vapor-phase dielectric liquid in fresh air. The experimental electrode system was a simple needle-plate arrangement for the corona discharges, and high-resistivity silicone oil was used as the vapor-phase liquid. Qualitative observation of EHD flow patterns was conducted by optical processing based on computer tomography, and the time series of discharge current pulses at the corona discharge electrode was measured simultaneously. These experimental results were analyzed for synchronization between the EHD flow motions and the current pulse generation. The current pulses and the EHD flow motions from the corona discharge electrode presented a continuous mode, similar to the ionic wind in fresh air, and an intermittent mode. In the intermittent mode, the observed EHD flow motion was synchronized with the separated discharge pulse generation. From these experimental results, it appears that charges trapped by the silicone oil vapor give rise to the intermittent generation of the discharge pulses and the secondary EHD flow.
Visual-Cerebellar Pathways and Their Roles in the Control of Avian Flight.
Wylie, Douglas R; Gutiérrez-Ibáñez, Cristián; Gaede, Andrea H; Altshuler, Douglas L; Iwaniuk, Andrew N
2018-01-01
In this paper, we review the connections and physiology of visual pathways to the cerebellum in birds and consider their role in flight. We emphasize that there are two visual pathways to the cerebellum. One is to the vestibulocerebellum (folia IXcd and X) that originates from two retinal-recipient nuclei that process optic flow: the nucleus of the basal optic root (nBOR) and the pretectal nucleus lentiformis mesencephali (LM). The second is to the oculomotor cerebellum (folia VI-VIII), which receives optic flow information, mainly from LM, but also local visual motion information from the optic tectum, and other visual information from the ventral lateral geniculate nucleus (Glv). The tectum, LM and Glv are all intimately connected with the pontine nuclei, which also project to the oculomotor cerebellum. We believe this rich integration of visual information in the cerebellum is important for analyzing motion parallax that occurs during flight. Finally, we extend upon a suggestion by Ibbotson (2017) that the hypertrophy that is observed in LM in hummingbirds might be due to an increase in the processing demands associated with the pathway to the oculomotor cerebellum as they fly through a cluttered environment while feeding.
A Motion-Based Feature for Event-Based Pattern Recognition
Clady, Xavier; Maro, Jean-Matthieu; Barré, Sébastien; Benosman, Ryad B.
2017-01-01
This paper introduces an event-based luminance-free feature from the output of asynchronous event-based neuromorphic retinas. The feature consists in mapping the distribution of the optical flow along the contours of the moving objects in the visual scene into a matrix. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating “spiking” events that encode relative changes in pixels' illumination at high temporal resolutions. The optical flow is computed at each event, and is integrated locally or globally in a speed and direction coordinate frame based grid, using speed-tuned temporal kernels. The latter ensures that the resulting feature equitably represents the distribution of the normal motion along the current moving edges, whatever their respective dynamics. The usefulness and the generality of the proposed feature are demonstrated in pattern recognition applications: local corner detection and global gesture recognition. PMID:28101001
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
Effects of background motion on eye-movement information.
Nakamura, S
1997-02-01
The effect of a background stimulus on eye-movement information was investigated by analyzing the underestimation of target velocity during pursuit eye movement (the Aubert-Fleischl paradox). In the experiment, a striped pattern with various brightness contrasts and spatial frequencies was used as a background stimulus, which was moved at various velocities. Analysis showed that the perceived velocity of the pursuit target, which indicates the magnitude of eye-movement information, decreased when the background stripes moved in the same direction as the eye movement at higher velocities and increased when the background moved in the opposite direction. The results suggest that eye-movement information varies as a linear function of the velocity of the background retinal image motion (optic flow). In addition, the effectiveness of optic flow on eye-movement information was determined by attributes of the background stimulus such as the brightness contrast or the spatial frequency of the striped pattern.
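The linear relationship reported above can be sketched as a toy model; the gain and slope are illustrative placeholders, not fitted values from the study:

```python
# Toy linear model of the Aubert-Fleischl result (placeholder coefficients).

def perceived_target_velocity(eye_velocity, background_velocity,
                              gain=0.8, slope=0.3):
    """background_velocity > 0: background moves with the eye (lowers percept);
    background_velocity < 0: background moves against the eye (raises it)."""
    return gain * eye_velocity - slope * background_velocity
```

With the sign convention above, same-direction background motion lowers the perceived target velocity and opposite-direction motion raises it, matching the reported pattern.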
Estimating the Heading Direction Using Normal Flow
1994-01-01
Macaque Parieto-Insular Vestibular Cortex: Responses to self-motion and optic flow
Chen, Aihua; DeAngelis, Gregory C.; Angelaki, Dora E.
2011-01-01
The parieto-insular vestibular cortex (PIVC) is thought to contain an important representation of vestibular information. Here we describe responses of macaque PIVC neurons to three-dimensional (3D) vestibular and optic flow stimulation. We found robust vestibular responses to both translational and rotational stimuli in the retroinsular (Ri) and adjacent secondary somatosensory (S2) cortices. PIVC neurons did not respond to optic flow stimulation, and vestibular responses were similar in darkness and during visual fixation. Cells in the upper bank and tip of the lateral sulcus (Ri and S2) responded to sinusoidal vestibular stimuli with modulation at the first harmonic frequency, and were directionally tuned. Cells in the lower bank of the lateral sulcus (mostly Ri) often modulated at the second harmonic frequency, and showed either bimodal spatial tuning or no tuning at all. All directions of 3D motion were represented in PIVC, with direction preferences distributed roughly uniformly for translation, but showing a preference for roll rotation. Spatio-temporal profiles of responses to translation revealed that half of PIVC cells followed the linear velocity profile of the stimulus, one-quarter carried signals related to linear acceleration (in the form of two peaks of direction selectivity separated in time), and a few neurons followed the derivative of linear acceleration (jerk). In contrast, mainly velocity-coding cells were found in response to rotation. Thus, PIVC comprises a large functional region in macaque areas Ri and S2, with robust responses to 3D rotation and translation, but is unlikely to play a significant role in visual/vestibular integration for self-motion perception. PMID:20181599
Violent Interaction Detection in Video Based on Deep Learning
NASA Astrophysics Data System (ADS)
Zhou, Peipei; Ding, Qinghai; Luo, Haibo; Hou, Xinglin
2017-06-01
Violent interaction detection is of vital importance in some video surveillance scenarios such as railway stations, prisons, or psychiatric centres. Existing vision-based methods are mainly based on hand-crafted features such as statistical features between motion regions, leading to poor adaptability to other datasets. Inspired by the development of convolutional networks for common activity recognition, we construct a FightNet to represent complicated visual violence interactions. In this paper, a new input modality, the image acceleration field, is proposed to better extract motion attributes. Firstly, each video is framed as RGB images. Secondly, the optical flow field is computed from consecutive frames, and the acceleration field is obtained from the optical flow field. Thirdly, FightNet is trained with three kinds of input modalities, i.e., RGB images for spatial networks, and optical flow images and acceleration images for temporal networks. By fusing results from the different inputs, we conclude whether or not a video contains a violent event. To provide researchers a common ground for comparison, we have collected a violent interaction dataset (VID) containing 2314 videos, with 1077 fight videos and 1237 no-fight videos. By comparison with other algorithms, experimental results demonstrate that the proposed model for violent interaction detection shows higher accuracy and better robustness.
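The acceleration-field construction described above (a frame-to-frame difference of dense optical flow fields) can be sketched as follows, assuming the `(H, W, 2)` flow arrays come from any dense optical-flow routine:

```python
import numpy as np

# Sketch of the "image acceleration field" input modality: acceleration as the
# temporal derivative of dense optical flow (flow arrays assumed given).

def acceleration_field(flow_t0, flow_t1):
    """Per-pixel (du, dv) change between consecutive flow fields."""
    return flow_t1 - flow_t0

def acceleration_magnitude(flow_t0, flow_t1):
    """Scalar acceleration image, e.g. as a temporal-network input channel."""
    return np.linalg.norm(acceleration_field(flow_t0, flow_t1), axis=-1)
```

A pixel whose flow jumps from (0, 0) to (3, 4) between frames has acceleration magnitude 5, so sudden motion changes, characteristic of fights, stand out in this channel.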
Bio-inspired multi-mode optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Park, Seokjun; Choi, Jaehyuk; Cho, Jihyun; Yoon, Euisik
2013-06-01
Monitoring wide-field surrounding information is essential for vision-based autonomous navigation in micro air vehicles (MAVs). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide-field-of-view optic flow estimation (μ-Compound eyes) and to attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor which generates both optic flows and normal images in different modes for μ-Compound eyes and μ-Ocelli applications. The sensor employs a time-stamp-based optic flow algorithm, modified from the conventional EMD (Elementary Motion Detector) algorithm, to give an optimum partitioning of hardware blocks in the analog and digital domains as well as adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which may require huge hardware resources if implemented in the digital domain, remains in a pixel-level analog processing unit. The rest of the blocks, including feature detection and timestamp latching, are implemented using digital circuits in a column-parallel processing unit. Finally, timestamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, thus significantly reducing core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows from the integrated mixed-mode algorithm core and 2-D optic flows with external timestamp processing, respectively.
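The time-stamp principle above can be sketched in software: each pixel latches the time a feature crosses it, and 1-D velocity is the pixel pitch divided by the timestamp difference between neighbouring pixels. The pitch value below is an illustrative assumption, not a parameter of the reported chip:

```python
# Software sketch of time-stamp optic flow (illustrative pixel pitch).

def timestamp_velocity(t_pixel_a, t_pixel_b, pixel_pitch_m=10e-6):
    """Velocity (m/s on the focal plane) from two neighbours' timestamps (s)."""
    dt = t_pixel_b - t_pixel_a
    if dt == 0:
        return float('inf')   # simultaneous detection: velocity unresolved
    return pixel_pitch_m / dt
```

A feature crossing two 10 μm-pitch pixels 1 ms apart corresponds to 0.01 m/s of image motion; on the chip this division is replaced by look-up tables to save power.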
Mixed-mode VLSI optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Barrows, Geoffrey Louis
We develop practical, compact optic flow sensors. To achieve the desired weight of 1-2 grams, mixed-mode and mixed-signal VLSI techniques are used to develop compact circuits that directly perform the computations necessary to measure optic flow. We discuss several implementations, including a version fully integrated in VLSI, and several "hybrid sensors" in which the front-end processing is performed with an analog chip and the back-end processing is performed with a microcontroller. We extensively discuss one-dimensional optic flow sensors based on the linear competitive feature tracker (LCFT) algorithm. Hardware implementations of this algorithm are shown to be able to measure visual motion at contrast levels on the order of several percent. We argue that the development of one-dimensional optic flow sensors is therefore reduced to a problem of engineering. We also introduce two related two-dimensional optic flow algorithms that are amenable to implementation in VLSI: the planar competitive feature tracker (PCFT) algorithm and the trajectory method. These sensors are being developed to solve small-scale navigation problems in micro air vehicles, which are autonomous aircraft whose maximum dimension is on the order of 15 cm. We obtain a proof of principle of small-scale navigation by mounting a prototype sensor onto a toy glider and programming the sensor to control a rudder or an elevator to affect the glider's path during flight. We demonstrate the determination of altitude by measuring optic flow in the downward direction. We also demonstrate steering to avoid a collision with a wall, when the glider is tossed towards the wall at a shallow angle, by measuring the optic flow in the direction of the glider's left and right side.
NASA Astrophysics Data System (ADS)
Justham, T.; Jarvis, S.; Clarke, A.; Garner, C. P.; Hargrave, G. K.; Halliwell, N. A.
2006-07-01
Simultaneous intake and in-cylinder digital particle image velocimetry (DPIV) experimental data are presented for a motored spark ignition (SI) optical internal combustion (IC) engine. Two individual DPIV systems were employed to study the inter-relationship between the intake and in-cylinder flow fields at an engine speed of 1500 rpm. Results for the intake runner velocity field at the time of maximum intake valve lift are compared to in-cylinder velocity fields later in the same engine cycle. Relationships between flow structures within the runner and cylinder were seen to be strong during the intake stroke but less significant during compression. Cyclic variations within the intake runner were seen to affect the large-scale bulk flow motion. The subsequent decay of the large-scale motions into smaller-scale turbulent structures during the compression stroke appears to reduce the relationship with the intake flow variations.
Time-to-Passage Judgments in Nonconstant Optical Flow Fields
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Hecht, Heiko
1995-01-01
The time until an approaching object will pass an observer (time to passage, or TTP) is optically specified by a global flow field even in the absence of local expansion or size cues. Kaiser and Mowafy have demonstrated that observers are in fact sensitive to this global flow information. The present studies investigate two factors that are usually ignored in work related to TTP: (1) non-constant motion functions and (2) concomitant eye rotation. Non-constant velocities violate an assumption of some TTP derivations, and eye rotations may complicate heading extraction. Such factors have practical significance, for example, in the case of a pilot accelerating an aircraft or executing a roll. In our studies, a flow field of constant-sized stars was presented monocularly on a large screen. TTP judgments had to be made on the basis of one target star. The flow field varied in its acceleration pattern and its roll component. Observers did not appear to utilize acceleration information. In particular, TTP for decelerating motion was consistently underestimated. TTP judgments were fairly robust with respect to roll, even when roll axis and track vector were decoupled. However, substantial decoupling between heading and track vector led to a decrement in performance, in both the presence and the absence of roll.
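The claim that TTP is optically specified without expansion or size cues can be illustrated with a worked example: for an observer approaching at constant velocity, the ratio of a star's image position to its image velocity equals its time to passage. A hedged numerical sketch (illustrative geometry, not the authors' stimulus):

```python
import numpy as np

# A star at lateral offset X and depth Z0, observer approaching at speed v.
# Pinhole projection x = X / Z. For constant v, the optical ratio
# x / (dx/dt) equals Z / v, the true time to passage (TTP),
# with no need for the star's size or local expansion.
X, Z0, v = 2.0, 10.0, 1.0               # arbitrary units
t = np.array([0.0, 0.1])                 # two nearby time samples
Z = Z0 - v * t
x = X / Z                                # image positions
x_dot = (x[1] - x[0]) / (t[1] - t[0])    # finite-difference image velocity
ttp_optical = x[0] / x_dot
ttp_true = Z0 / v
print(ttp_optical, ttp_true)             # ~9.9 (finite-difference bias) vs 10.0
```

The small discrepancy comes only from the forward finite difference; for non-constant velocity this ratio is no longer exact, which is the assumption violation the studies above probe.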
Exploring the use of optical flow for the study of functional NIRS signals
NASA Astrophysics Data System (ADS)
Fernandez Rojas, Raul; Huang, Xu; Ou, Keng-Liang; Hernandez-Juarez, Jesus
2017-03-01
Near infrared spectroscopy (NIRS) is an optical imaging technique that allows real-time measurement of oxy- and deoxy-hemoglobin concentrations in human body tissue. In functional NIRS (fNIRS), this technique is used to study cortical activation in response to changes in neural activity. However, analysis of activation regions using NIRS is a challenging task in the field of medical image analysis, and despite existing solutions, no standard analysis method has yet been established. For that reason, the aim of the present study is to report the use of an optical flow method for the analysis of cortical activation using near-infrared spectroscopy signals. We used real fNIRS data recorded from a noxious stimulation experiment as the basis of our implementation. To compute the optical flow, we first arrange the NIRS signals (oxy-hemoglobin) according to our 24-channel (12 channels per hemisphere) head-probe configuration to create image-like samples. We then use two consecutive fNIRS samples per hemisphere as input frames for the optical flow algorithm, making one computation per hemisphere. The output of these two computations is the velocity field representing cortical activation in each hemisphere. The experimental results showed that the radial structure of the flow vectors reveals the origin of cortical activity, that expansion or contraction of the flow vectors tracks the development of stimulation, and that the flow of activation patterns may support prediction of cortical activity. The present study demonstrates that optical flow provides a powerful tool for the analysis of NIRS signals. Finally, we suggest a novel idea for identifying pain status in nonverbal patients using optical flow motion vectors; this idea will be studied further in our future research.
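As a rough illustration of the pipeline above, a single global velocity vector can be estimated between two consecutive channel maps from the brightness-constancy equation. This is a minimal least-squares stand-in for the authors' (unspecified) optical flow algorithm; the grid layout and ramp pattern are assumptions:

```python
import numpy as np

def global_flow(f0, f1):
    """Least-squares estimate of one (u, v) velocity between two frames,
    from the brightness-constancy equation Ix*u + Iy*v + It = 0."""
    Iy, Ix = np.gradient(f0)
    It = f1 - f0
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Toy "hemisphere map": a 3x4 channel grid with a linear activation ramp.
base = np.arange(4, dtype=float)
f0 = np.tile(base, (3, 1))
f1 = f0 - 1.0        # the same ramp shifted one channel to the right
u, v = global_flow(f0, f1)
print(round(u, 3), round(v, 3))   # 1.0 0.0 -> rightward activation drift
```

A dense per-channel field, as used in the paper, would follow from solving the same equation locally (e.g., Lucas-Kanade windows) rather than globally.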
NASA Technical Reports Server (NTRS)
Kadlec, R.
1979-01-01
The use of self-synchronizing stroboscopic Schlieren and laser interferometer systems to obtain quantitative space-time measurements of distinguished flow surfaces, streakline patterns, and the density field of two-dimensional flows that exhibit a periodic content was investigated. A large-field, single-path stroboscopic Schlieren system was designed, constructed, and successfully applied to visualize four periodic flows: the near wake behind an oscillating airfoil, edge-tone sound generation, a 2-D planar wall jet, and an axisymmetric pulsed sonic jet. This visualization technique provides an effective means of studying quasi-periodic flows in real time. Because the image on the viewing screen is a spatial signal average of the coherent periodic motion rather than a single realization, the high-speed motion of a quasi-periodic flow can be reconstructed by recording photographs of the flow at different fixed time delays in one cycle. The preliminary design and construction of a self-synchronizing stroboscopic laser interferometer with a modified Mach-Zehnder optical system is also reported.
Moving object localization using optical flow for pedestrian detection from a moving vehicle.
Hariyono, Joko; Hoang, Van-Dung; Jo, Kang-Hyun
2014-01-01
This paper presents a pedestrian detection method from a moving vehicle using optical flow and histograms of oriented gradients (HOG). A moving object is extracted from the relative motion by segmenting the region showing the same optical flow after compensating for the egomotion of the camera. To obtain the optical flow, two consecutive images are divided into 14 × 14-pixel grid cells; each cell in the current frame is then tracked to find the corresponding cell in the next frame. Using at least three corresponding cells, an affine transformation is fitted to the corresponding cells in the consecutive images, so that consistent optical flows are extracted. Regions of moving objects are detected as transformed regions that differ from the previously registered background. Morphological processing is applied to obtain candidate human regions. To recognize the object, HOG features are extracted from each candidate region and classified using a linear support vector machine (SVM); the HOG feature vectors serve as the SVM input for classifying the given input as pedestrian or non-pedestrian. The proposed method was tested in a moving vehicle and also confirmed through experiments using a pedestrian dataset. It shows a significant improvement over the original HOG on the ETHZ pedestrian dataset.
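The affine-transformation step can be sketched as a least-squares fit from at least three matched cell centres. The cell coordinates below are hypothetical, and production code would typically use a robust estimator (e.g., RANSAC) rather than a plain least-squares fit:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src -> dst.

    src, dst: (N, 2) arrays of matched cell centres (N >= 3).
    Returns a 2x3 matrix M such that dst ~= [x, y, 1] @ M.T.
    """
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])        # (N, 3) homogeneous points
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)  # (3, 2) solution
    return M.T                                   # (2, 3) affine matrix

# Three matched cell centres under a pure 2-pixel rightward ego-motion shift.
src = np.array([[0.0, 0.0], [14.0, 0.0], [0.0, 14.0]])
dst = src + np.array([2.0, 0.0])
M = fit_affine(src, dst)
print(np.round(M, 3))   # identity rotation/scale with tx = 2
```

Flow vectors that deviate from this fitted background transform would then mark candidate moving-object regions.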
A bio-inspired flying robot sheds light on insect piloting abilities.
Franceschini, Nicolas; Ruffier, Franck; Serres, Julien
2007-02-20
When insects are flying forward, the image of the ground sweeps backward across their ventral viewfield and forms an "optic flow," which depends on both the groundspeed and the groundheight. To explain how these animals manage to avoid the ground by using this visual motion cue, we suggest that insect navigation hinges on a visual-feedback loop we have called the optic-flow regulator, which controls the vertical lift. To test this idea, we built a micro-helicopter equipped with an optic-flow regulator and a bio-inspired optic-flow sensor. This fly-by-sight micro-robot can perform exacting tasks such as take-off, level flight, and landing. Our control scheme accounts for many hitherto unexplained findings published during the last 70 years on insects' visually guided performances; for example, it accounts for the fact that honeybees descend in a headwind, land with a constant slope, and drown when travelling over mirror-smooth water. Our control scheme explains how insects manage to fly safely without any of the instruments used onboard aircraft to measure the groundheight, groundspeed, and descent speed. An optic-flow regulator is quite simple in terms of its neural implementation and just as appropriate for insects as it would be for aircraft.
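The optic-flow regulator concept can be illustrated with a toy feedback loop: ventral optic flow is groundspeed divided by groundheight, and the controller adjusts height to hold that ratio at a setpoint. Gains and dynamics below are illustrative, not the authors' controller:

```python
# Minimal discrete-time sketch of an optic-flow regulator.
# Assumptions: first-order height dynamics, proportional control only.
v, h = 2.0, 1.0            # groundspeed (m/s), groundheight (m)
omega_set = 1.0            # optic-flow setpoint (rad/s), i.e. target v / h
k = 0.5                    # proportional gain on vertical speed
dt = 0.05
for _ in range(400):
    omega = v / h                       # ventral optic flow
    h += k * (omega - omega_set) * dt   # flow too high -> climb; too low -> descend
print(round(v / h, 3))                  # converges toward the setpoint
```

Note how the equilibrium height scales with groundspeed (h -> v / omega_set), which is exactly why such a regulator yields descent in a headwind and a constant-slope landing as speed decreases.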
Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot
Vanhoutte, Erik; Mafrica, Stefano; Ruffier, Franck; Bootsma, Reinoud J.; Serres, Julien
2017-01-01
For use in autonomous micro air vehicles, visual sensors must not only be small, lightweight and insensitive to light variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. Using an auto-adaptive bio-inspired Michaelis–Menten Auto-adaptive Pixel (M2APix) analog silicon retina, in this article, we present comparative tests of two optical flow calculation algorithms operating under lighting conditions from 6×10⁻⁷ to 1.6×10⁻² W·cm⁻² (i.e., from 0.2 to 12,000 lux for human vision). Contrast "time of travel" between two adjacent light-sensitive pixels was determined by thresholding and by cross-correlating the two pixels' signals, with a measurement frequency of up to 5 kHz for the 10 local motion sensors of the M2APix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding gave rise to a lower precision, especially due to a larger number of outliers at higher speeds. Compared to thresholding, cross-correlation also allowed for a higher rate of optical flow output (1195 Hz versus 99 Hz) but required substantially more computational resources. PMID:28287484
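The cross-correlation variant of the "time of travel" measurement can be sketched in a few lines: the lag that maximizes the cross-correlation of two adjacent pixel signals, divided by the sampling rate, gives the travel time, and the inter-pixel angular pitch divided by that time gives the optical flow. The pitch value and signal shape below are assumptions for illustration:

```python
import numpy as np

# Two adjacent photoreceptor signals: the second sees the same contrast
# feature a few samples later. Cross-correlation recovers the time of travel.
fs = 5000.0                      # sampling rate (Hz), as in the sensor
pitch_deg = 1.0                  # assumed angular pitch between pixels (deg)
true_lag = 12                    # delay in samples
t = np.arange(500)
sig = np.exp(-((t - 200) / 20.0) ** 2)        # a passing contrast "bump"
px1 = sig
px2 = np.roll(sig, true_lag)                  # same bump, delayed
xcorr = np.correlate(px2, px1, mode="full")
lag = int(np.argmax(xcorr)) - (len(px1) - 1)  # samples of delay
time_of_travel = lag / fs                     # seconds
optic_flow = pitch_deg / time_of_travel       # deg/s
print(lag, round(optic_flow, 1))              # 12 416.7
```

The thresholding variant would instead compare the two times at which each signal crosses a fixed contrast threshold, which is cheaper but, as the study found, noisier.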
Multisensor data fusion across time and space
NASA Astrophysics Data System (ADS)
Villeneuve, Pierre V.; Beaven, Scott G.; Reed, Robert A.
2014-06-01
Field measurement campaigns typically deploy numerous sensors having different sampling characteristics in the spatial, temporal, and spectral domains. Data analysis and exploitation are made more difficult and time consuming because the sample data grids of the different sensors do not align. This report summarizes our recent effort to demonstrate the feasibility of a processing chain capable of "fusing" image data from multiple independent and asynchronous sensors into a form amenable to analysis and exploitation using commercially available tools. Two important technical issues were addressed in this work: 1) image spatial registration onto a common pixel grid, and 2) image temporal interpolation onto a common time base. The first step leverages existing image matching and registration algorithms. The second step relies upon a new and innovative use of optical flow algorithms to perform accurate temporal upsampling of slower-frame-rate imagery. Optical flow field vectors were first derived from high-frame-rate, high-resolution imagery and then used as a basis for temporal upsampling of the slower-frame-rate sensor's imagery. Optical flow field values are computed using a multi-scale image pyramid, thus allowing for more extreme object motion. This involves preprocessing imagery at varying resolution scales and initializing new vector flow estimates from those of the previous coarser-resolution image. Overall performance of this processing chain is demonstrated using sample data involving complex motion observed by multiple sensors mounted to the same base, ranging from a high-speed visible camera to a coarser-resolution LWIR camera.
Correcting for motion artifact in handheld laser speckle images.
Lertsakdadet, Ben; Yang, Bruce Y; Dunn, Cody E; Ponticorvo, Adrien; Crouzet, Christian; Bernal, Nicole; Durkin, Anthony J; Choi, Bernard
2018-03-01
Laser speckle imaging (LSI) is a wide-field optical technique that enables superficial blood flow quantification. LSI is normally performed in a mounted configuration to decrease the likelihood of motion artifact. However, mounted LSI systems are cumbersome and difficult to transport quickly in clinical settings where portability is essential for bedside patient care. To address this issue, we created a handheld LSI device using scientific-grade components. To account for motion artifact of the LSI device used in a handheld setup, we incorporated a fiducial marker (FM) into our imaging protocol and determined the difference between the highest and lowest speckle contrast values for the FM within each data set (Kbest and Kworst). The difference between Kbest and Kworst in mounted and handheld setups was 8% and 52%, respectively, reinforcing the need for motion artifact quantification. When using a threshold FM speckle contrast value (KFM) to identify a subset of images with an acceptable level of motion artifact, mounted and handheld LSI measurements of the speckle contrast of a flow region (KFLOW) in in vitro flow phantom experiments differed by 8%. Without the use of the FM, mounted and handheld KFLOW values differed by 20%. To further validate our handheld LSI device, we compared mounted and handheld data from an in vivo porcine burn model of superficial and full-thickness burns. The speckle contrast within the burn region (KBURN) of the mounted and handheld LSI data differed by <4% when accounting for motion artifact using the FM, which is less than the speckle contrast difference between superficial and full-thickness burns. Collectively, our results suggest the potential of handheld LSI with an FM as a suitable alternative to mounted LSI, especially in challenging clinical settings with space limitations such as the intensive care unit.
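Speckle contrast itself is conventionally defined as the ratio of the standard deviation to the mean intensity over a local window. A minimal sketch using non-overlapping blocks and synthetic speckle (clinical pipelines typically use a sliding window; the exponential intensity model below is the textbook idealization, not the paper's data):

```python
import numpy as np

def speckle_contrast(img, win=7):
    """K = sigma / mean over non-overlapping win x win blocks."""
    h, w = (np.array(img.shape) // win) * win
    blocks = img[:h, :w].reshape(h // win, win, w // win, win)
    mu = blocks.mean(axis=(1, 3))
    sd = blocks.std(axis=(1, 3))
    return sd / mu

rng = np.random.default_rng(0)
# Static region: raw, fully developed speckle (K near 1 for exponential stats).
static = rng.exponential(1.0, (70, 70))
# Flow region: temporal blurring averages independent speckle realizations,
# lowering the contrast; here we average two realizations.
flowing = (static + rng.exponential(1.0, (70, 70))) / 2.0
K_static = speckle_contrast(static).mean()
K_flow = speckle_contrast(flowing).mean()
print(round(K_static, 2), round(K_flow, 2))   # flow region has lower K
```

The Kbest/Kworst comparison in the abstract applies exactly this statistic to the fiducial marker, whose true contrast should be constant, so any spread measures motion artifact.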
A nowcasting technique based on application of the particle filter blending algorithm
NASA Astrophysics Data System (ADS)
Chen, Yuanzhao; Lan, Hongping; Chen, Xunlai; Zhang, Wenhai
2017-10-01
To improve the accuracy of nowcasting, a new extrapolation technique called particle filter blending was configured in this study and applied to experimental nowcasting. Radar echo extrapolation was performed using the radar mosaic at an altitude of 2.5 km obtained from the radar images of 12 S-band radars in Guangdong Province, China. First, a bilateral filter was applied for quality control of the radar data; an optical flow method based on the Lucas-Kanade algorithm and the Harris corner detection algorithm was used to track radar echoes and retrieve the echo motion vectors; the motion vectors were then blended with the particle filter blending algorithm to estimate the optimal motion vector of the true echo motions; finally, semi-Lagrangian extrapolation was used for radar echo extrapolation based on the obtained motion vector field. A comparative study of the extrapolated forecasts of four precipitation events in 2016 in Guangdong was conducted. The results indicate that the particle filter blending algorithm could realistically reproduce the spatial pattern, echo intensity, and echo location at 30- and 60-min forecast lead times. The forecasts agreed well with observations, and the results were of operational significance. Quantitative evaluation indicates that the particle filter blending algorithm performed better than the cross-correlation method and the optical flow method. The particle filter blending method is therefore superior to the traditional forecasting methods and can be used to enhance nowcasting ability in operational weather forecasts.
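The semi-Lagrangian extrapolation step can be sketched as backward advection: each grid point takes the echo value from its upstream departure point under the retrieved motion vectors. A nearest-neighbour toy version (operational schemes use higher-order interpolation and multi-step trajectories):

```python
import numpy as np

def semi_lagrangian_step(field, u, v, dt=1.0):
    """Advect a 2-D echo field one step along motion vectors (u, v):
    the value arriving at each grid point is taken from the upstream
    departure point (x - u*dt, y - v*dt), nearest-neighbour sampled."""
    h, w = field.shape
    yy, xx = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xx - u * dt).astype(int), 0, w - 1)
    src_y = np.clip(np.round(yy - v * dt).astype(int), 0, h - 1)
    return field[src_y, src_x]

echo = np.zeros((8, 8)); echo[3, 2] = 50.0        # one radar echo cell (dBZ)
u = np.full((8, 8), 2.0)                          # uniform eastward motion
v = np.zeros((8, 8))
moved = semi_lagrangian_step(echo, u, v)
print(np.argwhere(moved == 50.0))                 # echo advected two columns east
```

In the paper's pipeline, the (u, v) field here would be the particle-filter-blended motion vector field rather than a uniform assumption.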
Joint estimation of motion and illumination change in a sequence of images
NASA Astrophysics Data System (ADS)
Koo, Ja-Keoung; Kim, Hyo-Hun; Hong, Byung-Woo
2015-09-01
We present an algorithm that simultaneously computes optical flow and estimates illumination change from an image sequence in a unified framework. We propose an energy functional consisting of the conventional optical flow energy based on the Horn-Schunck method and an additional constraint designed to compensate for illumination changes. Any undesirable illumination change that occurs during imaging of a sequence while the optical flow is being computed is treated as a nuisance factor. In contrast to the conventional optical flow algorithm based on the Horn-Schunck functional, which assumes the brightness constancy constraint, our algorithm is shown to be robust to temporal illumination changes in the computation of optical flow. An efficient conjugate gradient descent technique is used as the numerical scheme in the optimization procedure. Experimental results obtained from the Middlebury benchmark dataset demonstrate the robustness and effectiveness of our algorithm. In addition, a comparative analysis of our algorithm and the Horn-Schunck algorithm is performed on an additional test dataset, constructed by applying a variety of synthetic bias fields to the original image sequences of the Middlebury benchmark, to demonstrate that our algorithm outperforms the Horn-Schunck algorithm. The superior performance of the proposed method is observed in terms of both qualitative visualizations and quantitative accuracy when compared to the Horn-Schunck optical flow algorithm, which readily yields poor results in the presence of even small illumination changes that violate the brightness constancy constraint.
NASA Astrophysics Data System (ADS)
Title, A. M.; Tarbell, T. D.; Topka, K. P.; Shine, R. A.; Simon, G. W.; Zirin, H.; SOUP Team
The SOUP flow fields have been compared with carefully aligned magnetograms taken at the BBSO before, during, and after the SOUP images. The magnetic field is observed to exist in locations where either the flow is convergent or on the boundaries of the outflow from a flow cell center. Streamlines calculated from the flow field agree very well with the observed motions of the magnetic field in the BBSO magnetogram movies.
NASA Astrophysics Data System (ADS)
Radhakrishnan, Harsha; Srinivasan, Vivek J.
2013-08-01
The hemodynamic response to neuronal activation is a well-studied phenomenon in the brain, due to the prevalence of functional magnetic resonance imaging. The retina represents an optically accessible platform for studying lamina-specific neurovascular coupling in the central nervous system; however, due to methodological limitations, this has been challenging to date. We demonstrate techniques for the imaging of visual stimulus-evoked hyperemia in the rat inner retina using Doppler optical coherence tomography (OCT) and OCT angiography. Volumetric imaging with three-dimensional motion correction, en face flow calculation, and normalization of dynamic signal to static signal are techniques that reduce spurious changes caused by motion. We anticipate that OCT imaging of retinal functional hyperemia may yield viable biomarkers in diseases, such as diabetic retinopathy, where the neurovascular unit may be impaired.
Lung tumor tracking in fluoroscopic video based on optical flow
Xu, Qianyi; Hamilton, Russell J.; Schowengerdt, Robert A.; Alexander, Brian; Jiang, Steve B.
2008-01-01
Respiratory gating and tumor tracking for dynamic multileaf collimator delivery require accurate and real-time localization of the lung tumor position during treatment. Deriving tumor position from external surrogates such as abdominal surface motion may have large uncertainties due to the intra- and interfraction variations of the correlation between the external surrogates and internal tumor motion. Implanted fiducial markers can be used to track tumors fluoroscopically in real time with sufficient accuracy. However, it may not be a practical procedure when implanting fiducials bronchoscopically. In this work, a method is presented to track the lung tumor mass or relevant anatomic features projected in fluoroscopic images without implanted fiducial markers based on an optical flow algorithm. The algorithm generates the centroid position of the tracked target and ignores shape changes of the tumor mass shadow. The tracking starts with a segmented tumor projection in an initial image frame. Then, the optical flow between this and all incoming frames acquired during treatment delivery is computed as initial estimations of tumor centroid displacements. The tumor contour in the initial frame is transferred to the incoming frames based on the average of the motion vectors, and its positions in the incoming frames are determined by fine-tuning the contour positions using a template matching algorithm with a small search range. The tracking results were validated by comparing with clinician determined contours on each frame. The position difference in 95% of the frames was found to be less than 1.4 pixels (∼0.7 mm) in the best case and 2.8 pixels (∼1.4 mm) in the worst case for the five patients studied. PMID:19175094
Borst, Alexander; Weber, Franz
2011-01-01
Optic flow based navigation is a fundamental way of visual course control described in many different species including man. In the fly, an essential part of optic flow analysis is performed in the lobula plate, a retinotopic map of motion in the environment. There, the so-called lobula plate tangential cells possess large receptive fields with different preferred directions in different parts of the visual field. Previous studies demonstrated an extensive connectivity between different tangential cells, providing, in principle, the structural basis for their large and complex receptive fields. We present a network simulation of the tangential cells, comprising most of the neurons studied so far (22 on each hemisphere) with all the known connectivity between them. On their dendrite, model neurons receive input from a retinotopic array of Reichardt-type motion detectors. Model neurons exhibit receptive fields much like their natural counterparts, demonstrating that the connectivity between the lobula plate tangential cells indeed can account for their complex receptive field structure. We describe the tuning of a model neuron to particular types of ego-motion (rotation as well as translation around/along a given body axis) by its ‘action field’. As we show for model neurons of the vertical system (VS-cells), each of them displays a different type of action field, i.e., responds maximally when the fly is rotating around a particular body axis. However, the tuning width of the rotational action fields is relatively broad, comparable to the one with dendritic input only. The additional intra-lobula-plate connectivity mainly reduces their translational action field amplitude, i.e., their sensitivity to translational movements along any body axis of the fly. PMID:21305019
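The Reichardt-type motion detectors feeding the model dendrites can be sketched as a correlation detector: each input is delayed and multiplied with its undelayed neighbour, and the two mirror-symmetric half-detectors are subtracted, yielding a signed, direction-selective output. The one-sample delay below stands in for the usual low-pass filter:

```python
import numpy as np

def reichardt(sig_a, sig_b, dt=1):
    """Correlation-type elementary motion detector for two adjacent
    photoreceptor signals. Positive output for motion from A toward B."""
    delay = lambda s: np.concatenate([np.zeros(dt), s[:-dt]])
    return delay(sig_a) * sig_b - sig_a * delay(sig_b)

t = np.arange(100)
stim = np.sin(2 * np.pi * t / 25)            # drifting grating at one point
rightward = reichardt(stim, np.roll(stim, 1))  # B sees the pattern later
leftward = reichardt(np.roll(stim, 1), stim)   # A sees the pattern later
print(rightward.sum() > 0, leftward.sum() < 0)
```

A retinotopic array of such detectors, summed on a model dendrite, is what gives the simulated tangential cells their local preferred directions.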
Chahl, J S
2014-01-20
This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit some complementary characteristics with respect to conventional perspective projection or angular projection imaging devices. Conventional imaging devices measure rotational egomotion directly by measuring the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices because the induced image motion depends on the unknown range of the viewed object. On the other hand, a known translational motion generates image velocities which can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. An approximate parallel projection can be used to measure translational egomotion in terms of the velocity of the image. On the other hand, a known rotational motion of the paraxial sensor array generates image velocities, which can be used to recover the 3D structure of the environment. Results of tests of an experimental array confirm these properties.
A Unified Model of Heading and Path Perception in Primate MSTd
Layton, Oliver W.; Browning, N. Andrew
2014-01-01
Self-motion, steering, and obstacle avoidance during navigation in the real world require humans to travel along curved paths. Many perceptual models have been proposed that focus on heading, which specifies the direction of travel along straight paths, but not on path curvature, which humans accurately perceive and which is critical to everyday locomotion. In primates, including humans, the dorsal medial superior temporal area (MSTd) has been implicated in heading perception. However, the majority of MSTd neurons respond optimally to spiral patterns, rather than to the radial expansion patterns associated with heading. No existing theory of curved path perception explains the neural mechanisms by which humans accurately assess path curvature, and no functional role for spiral-tuned cells has yet been proposed. Here we present a computational model that demonstrates how the continuum of observed cells (radial to circular) in MSTd can simultaneously code curvature and heading across the neural population. Curvature is encoded through the spirality of the most active cell, and heading is encoded through the visuotopic location of the center of the most active cell's receptive field. Model curvature and heading errors fit those made by humans. Our model challenges the view that the function of MSTd is heading estimation; based on our analysis, we claim that it is primarily concerned with trajectory estimation and the simultaneous representation of both curvature and heading. In our model, temporal dynamics afford time-history in the neural representation of optic flow, which may modulate its structure. This has far-reaching implications for the interpretation of studies that assume that optic flow is, and should be, represented as an instantaneous vector field.
Our results suggest that spiral motion patterns that emerge in spatio-temporal optic flow are essential for guiding self-motion along complex trajectories, and that cells in MSTd are specifically tuned to extract complex trajectory estimation from flow. PMID:24586130
Anthony Eikema, Diderik Jan A.; Chien, Jung Hung; Stergiou, Nicholas; Myers, Sara A.; Scott-Pandorf, Melissa M.; Bloomberg, Jacob J.; Mukherjee, Mukul
2015-01-01
Human locomotor adaptation requires feedback and feed-forward control processes to maintain an appropriate walking pattern. Adaptation may require the use of visual and proprioceptive input to decode altered movement dynamics and generate an appropriate response. After a person transfers from an extreme sensory environment and back, as astronauts do when they return from spaceflight, the prolonged period required for re-adaptation can pose a significant burden. In our previous paper, we showed that plantar tactile vibration during a split-belt adaptation task did not interfere with treadmill adaptation; however, larger overground transfer effects with a slower decay resulted. Such effects, in the absence of visual feedback (of motion) and with perturbation of tactile feedback, are believed to be due to a higher proprioceptive gain because, in the absence of relevant external dynamic cues such as optic flow, reliance on body-based cues is enhanced during gait tasks through multisensory integration. In this study we therefore investigated the effect of optic flow on tactile-stimulated split-belt adaptation as a paradigm to facilitate the sensorimotor adaptation process. Twenty healthy young adults, separated into two matched groups, participated in the study. All participants performed an overground walking trial followed by a split-belt treadmill adaptation protocol. The tactile group (TC) received vibratory plantar tactile stimulation only, whereas the virtual reality and tactile group (VRT) received additional concurrent visual stimulation: a moving virtual corridor inducing perceived self-motion. A post-treadmill overground trial was performed to determine adaptation transfer. Interlimb coordination of spatiotemporal and kinetic variables was quantified using symmetry indices and analyzed using repeated-measures ANOVA. Marked changes in step length characteristics were observed in both groups during split-belt adaptation.
Stance and swing time symmetry were similar in the two groups, suggesting that temporal parameters are not modified by optic flow. However, whereas the TC group displayed significant stance time asymmetries during the post-treadmill session, such aftereffects were absent in the VRT group. The results indicated that the enhanced transfer resulting from exposure to plantar cutaneous vibration during adaptation was alleviated by optic flow information. The presence of visual self-motion information may have reduced proprioceptive gain during learning. Thus, during overground walking, the learned proprioceptive split-belt pattern is more rapidly overridden by visual input due to its increased relative gain. The results suggest that when visual stimulation is provided during adaptive training, the system acquires the novel movement dynamics while maintaining the ability to flexibly adapt to different environments. PMID:26525712
A Dynamic Bayesian Observer Model Reveals Origins of Bias in Visual Path Integration.
Lakshminarasimhan, Kaushik J; Petsalis, Marina; Park, Hyeshin; DeAngelis, Gregory C; Pitkow, Xaq; Angelaki, Dora E
2018-06-20
Path integration is a strategy by which animals track their position by integrating their self-motion velocity. To identify the computational origins of bias in visual path integration, we asked human subjects to navigate in a virtual environment using optic flow and found that they generally traveled beyond the goal location. Such a behavior could stem from leaky integration of unbiased self-motion velocity estimates or from a prior expectation favoring slower speeds that causes velocity underestimation. Testing both alternatives using a probabilistic framework that maximizes expected reward, we found that subjects' biases were better explained by a slow-speed prior than imperfect integration. When subjects integrate paths over long periods, this framework intriguingly predicts a distance-dependent bias reversal due to buildup of uncertainty, which we also confirmed experimentally. These results suggest that visual path integration in noisy environments is limited largely by biases in processing optic flow rather than by leaky integration. Copyright © 2018 Elsevier Inc. All rights reserved.
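The two bias accounts contrasted in this abstract can be sketched numerically: a leaky integrator that forgets accumulated distance with time constant tau, versus perfect integration of velocity shrunk by a slow-speed prior. This is an illustrative toy model with arbitrary parameter values, not the paper's full probabilistic, reward-maximizing framework.

```python
def perceived_distance_leaky(v, dt, tau):
    """Leaky integration of unbiased velocity: dx/dt = -x/tau + v."""
    x = 0.0
    for vi in v:
        x += dt * (-x / tau + vi)
    return x

def perceived_distance_prior(v, dt, alpha):
    """Slow-speed prior: each velocity estimate is shrunk by alpha < 1,
    then integrated without leak."""
    return sum(alpha * vi * dt for vi in v)

# Both models under-report traveled distance, so subjects overshoot the goal,
# but the leaky model's undershoot fraction grows with travel duration while
# the prior model's stays constant.
dt, tau, alpha = 0.01, 5.0, 0.8
for n in (200, 1000):                      # 2 s and 10 s of unit-speed travel
    v = [1.0] * n
    actual = n * dt
    assert perceived_distance_leaky(v, dt, tau) < actual
    assert perceived_distance_prior(v, dt, alpha) < actual
```

Because the leaky model's bias saturates with duration while the prior model's remains a fixed fraction, the two accounts make distinguishable predictions over long paths, which is the kind of distance-dependent signature the authors exploit.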
An Optical Flow-Based Full Reference Video Quality Assessment Algorithm.
K, Manasa; Channappayya, Sumohana S
2016-06-01
We present a simple yet effective optical flow-based full-reference video quality assessment (FR-VQA) algorithm for assessing the perceptual quality of natural videos. Our algorithm is based on the premise that local optical flow statistics are affected by distortions, and that the deviation from pristine flow statistics is proportional to the amount of distortion. We characterize the local flow statistics using the mean, the standard deviation, the coefficient of variation (CV), and the minimum eigenvalue (λ_min) of the local flow patches. Temporal distortion is estimated as the change in the CV of the distorted flow with respect to the reference flow, and the correlation between λ_min of the reference and of the distorted patches. We rely on the robust multi-scale structural similarity index for spatial quality estimation. The computed temporal and spatial distortions are then pooled using a perceptually motivated heuristic to generate a spatio-temporal quality score. The proposed method is shown to be competitive with the state-of-the-art when evaluated on the LIVE SD database, the EPFL Polimi SD database, and the LIVE Mobile HD database. The distortions considered in these databases include those due to compression, packet loss, wireless channel errors, and rate adaptation. Our algorithm is flexible enough to allow for any robust FR spatial distortion metric for spatial distortion estimation. In addition, the proposed method is not only parameter-free but also independent of the choice of the optical flow algorithm. Finally, we show that replacing the optical flow vectors in our method with the much coarser block motion vectors also results in an acceptable FR-VQA algorithm. Our algorithm is called the flow similarity index.
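As a rough sketch of the named statistics, the following computes the mean, standard deviation, CV, and a minimum eigenvalue per flow patch. Reading λ_min as the smallest eigenvalue of the patch's 2×2 flow covariance is our assumption; the paper's patch sizes, MS-SSIM spatial term, and pooling heuristic are omitted.

```python
import numpy as np

def flow_patch_stats(u, v):
    """Local statistics of one optical-flow patch.

    u, v: 2D arrays of horizontal/vertical flow components.
    Returns (mean, std, CV) of the flow magnitude and lam_min, here taken
    as the smallest eigenvalue of the 2x2 covariance of the flow vectors.
    """
    mag = np.hypot(u, v)
    mean, std = mag.mean(), mag.std()
    cv = std / (mean + 1e-12)                  # guard against zero-flow patches
    cov = np.cov(np.stack([u.ravel(), v.ravel()]))
    lam_min = np.linalg.eigvalsh(cov).min()    # eigvalsh returns ascending order
    return mean, std, cv, lam_min

def temporal_distortion(cv_ref, cv_dist):
    """Premise of the method: distortion grows with the deviation of the
    distorted flow's CV from the pristine (reference) flow's CV."""
    return abs(cv_dist - cv_ref)
```

A full implementation would evaluate these statistics over a grid of patches at each frame and combine the temporal term with a spatial similarity index, as the abstract describes.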
NASA Astrophysics Data System (ADS)
Chen, H.; Ye, Sh.; Nedzvedz, O. V.; Ablameyko, S. V.
2018-03-01
Study of crowd movement is an important practical problem, and its solution is used in video surveillance systems for preventing various emergency situations. In the general case, a group of fast-moving people is of more interest than a group of stationary or slow-moving people. We propose a new method for crowd movement analysis using a video sequence, based on integral optical flow. We have determined several characteristics of a moving crowd such as density, speed, direction of motion, symmetry, and in/out index. These characteristics are used for further analysis of a video scene.
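The descriptors named above can be sketched from an integral optical flow field, i.e., the per-pixel sum of frame-to-frame flow over a window. The specific definitions below (mean divergence as an in/out index, vector-mean direction) are our illustrative assumptions, not necessarily the authors' formulas.

```python
import numpy as np

def integral_flow_descriptors(flows):
    """Crowd-movement descriptors from a list of (u, v) per-frame flow fields.

    The integral flow is the per-pixel sum over the window. Returns the mean
    speed per frame, the dominant direction (radians), and an in/out index:
    the mean divergence, positive when motion spreads outward from the scene,
    negative when it converges inward.
    """
    U = np.sum([f[0] for f in flows], axis=0)
    V = np.sum([f[1] for f in flows], axis=0)
    speed = np.hypot(U, V).mean() / len(flows)
    direction = np.arctan2(V.mean(), U.mean())
    divergence = np.gradient(U, axis=1) + np.gradient(V, axis=0)
    return speed, direction, divergence.mean()
```

On a purely radial (expanding) flow field the in/out index is positive, matching the intuition that a dispersing crowd "flows out" of the scene.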
Evidence of a rolling motion of a microparticle on a silicon wafer in a liquid environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiwek, Simon; Stark, Robert W., E-mail: stark@csi.tu-darmstadt.de; Dietz, Christian, E-mail: dietz@csi.tu-darmstadt.de
2016-05-21
The interaction of micro- and nanometer-sized particles with surfaces plays a crucial role when small-scale structures are built in a bottom-up approach or structured surfaces are cleaned in the semiconductor industry. For a reliable quantification of the interaction between individual particles and a specific surface, however, the motion type of the particle must be known. We developed an approach to unambiguously distinguish between sliding and rolling particles. To this end, fluorescent particles were partially bleached in a confocal laser scanning microscope to tailor an optical inhomogeneity, which allowed for the identification of the characteristic motion pattern. For the manipulation, the water flow generated by a fast-moving cantilever tip of an atomic force microscope enabled the contactless pushing of the particle. We thus experimentally evidenced a rolling motion of a micrometer-sized particle directly with a fluorescence microscope. A similar approach could help to discriminate between rolling and sliding particles in liquid flows of microfluidic systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhou, S; Williams, C; Ionascu, D
2016-06-15
Purpose: To study the variability of patient-specific motion models derived from 4-dimensional CT (4DCT) images using different deformable image registration (DIR) algorithms for lung cancer stereotactic body radiotherapy (SBRT) patients. Methods: Motion models are derived by 1) applying DIR between each 4DCT image and a reference image, resulting in a set of displacement vector fields (DVFs), and 2) performing principal component analysis (PCA) on the DVFs, resulting in a motion model (a set of eigenvectors capturing the variations in the DVFs). Three DIR algorithms were used: 1) Demons, 2) Horn-Schunck, and 3) iterative optical flow. The motion models derived were compared using patient 4DCT scans. Results: Motion models were derived and the variations were evaluated according to three criteria: 1) the average root mean square (RMS) difference, which measures the absolute difference between the components of the eigenvectors, 2) the dot product between the eigenvectors, which measures the angular difference between the eigenvectors in space, and 3) the Euclidean Model Norm (EMN), which is calculated by summing the dot products of an eigenvector with the first three eigenvectors from the reference motion model in quadrature. EMN measures how well an eigenvector can be reconstructed using another motion model derived using a different DIR algorithm. Results showed that, compared to a reference motion model (derived using the Demons algorithm), the eigenvectors of the motion model derived using the iterative optical flow algorithm have smaller RMS, larger dot product, and larger EMN values than those of the motion model derived using the Horn-Schunck algorithm. Conclusion: The study showed that motion models vary depending on which DIR algorithm was used to derive them. The choice of a DIR algorithm may affect the accuracy of the resulting model, and it is important to assess the suitability of the algorithm chosen for a particular application.
This project was supported, in part, through a Master Research Agreement with Varian Medical Systems, Inc, Palo Alto, CA.
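The model-building and comparison steps described in this record can be sketched as follows. This is a minimal illustration: the DVFs are assumed to come from a DIR algorithm already, and the array shapes and function names are ours.

```python
import numpy as np

def motion_model(dvfs, k=3):
    """PCA motion model from displacement vector fields.

    dvfs: (n_phases, n_voxels * 3) array, one flattened DVF per 4DCT phase.
    Returns the top-k eigenvectors (unit-norm rows) of the centered DVFs.
    """
    X = dvfs - dvfs.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:k]

def emn(e, ref_model):
    """Euclidean Model Norm: dot products of eigenvector e with the first
    three eigenvectors of a reference model, summed in quadrature. Near 1
    when e is well reconstructed by the reference model's subspace."""
    return float(np.sqrt(sum(np.dot(e, r) ** 2 for r in ref_model[:3])))
```

An eigenvector scored against its own model gives EMN = 1 exactly, which makes a convenient sanity check when comparing models built from different DIR algorithms.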
From video to computation of biological fluid-structure interaction problems
NASA Astrophysics Data System (ADS)
Dillard, Seth I.; Buchholz, James H. J.; Udaykumar, H. S.
2016-04-01
This work deals with the techniques necessary to obtain a purely Eulerian procedure to conduct CFD simulations of biological systems with moving boundary flow phenomena. Eulerian approaches obviate difficulties associated with mesh generation to describe or fit flow meshes to body surfaces. The challenges associated with constructing embedded boundary information, body motions and applying boundary conditions on the moving bodies for flow computation are addressed in the work. The overall approach is applied to the study of a fluid-structure interaction problem, i.e., the hydrodynamics of swimming of an American eel, where the motion of the eel is derived from video imaging. It is shown that some first-blush approaches do not work, and therefore, careful consideration of appropriate techniques to connect moving images to flow simulations is necessary and forms the main contribution of the paper. A combination of level set-based active contour segmentation with optical flow and image morphing is shown to enable the image-to-computation process.
NASA Technical Reports Server (NTRS)
Duff, Michael J. B. (Editor); Siegel, Howard J. (Editor); Corbett, Francis J. (Editor)
1986-01-01
The conference presents papers on the architectures, algorithms, and applications of image processing. Particular attention is given to a very large scale integration system for image reconstruction from projections, a prebuffer algorithm for instant display of volume data, and an adaptive image sequence filtering scheme based on motion detection. Papers are also presented on a simple, direct practical method of sensing local motion and analyzing local optical flow, image matching techniques, and an automated biological dosimetry system.
Dense motion estimation using regularization constraints on local parametric models.
Patras, Ioannis; Worring, Marcel; van den Boomgaard, Rein
2004-11-01
This paper presents a method for dense optical flow estimation in which the motion field within patches that result from an initial intensity segmentation is parametrized with models of different order. We propose a novel formulation which introduces regularization constraints between the model parameters of neighboring patches. In this way, we provide additional constraints for very small patches and for patches whose intensity variation cannot sufficiently constrain the estimation of their motion parameters. In order to preserve motion discontinuities, we use robust functions as a regularization means. We adopt a three-frame approach and control the balance between the backward and forward constraints by a real-valued direction field on which regularization constraints are applied. An iterative deterministic relaxation method is employed in order to solve the corresponding optimization problem. Experimental results show that the proposed method deals successfully with motions large in magnitude and motion discontinuities, and produces accurate piecewise-smooth motion fields.
A new look at Op art: towards a simple explanation of illusory motion.
Zanker, Johannes M; Walker, Robin
2004-04-01
Vivid motion illusions created by some Op art paintings are at the centre of a lively scientific debate about possible mechanisms that might underlie these phenomena. Here we review emerging evidence from a new approach that combines perceptual judgements of the illusion and observations of eye movements with simulations of the induced optic flow. This work suggests that the small involuntary saccades which participants make when viewing such Op art patterns would generate an incoherent distribution of motion signals that resemble the perceptual effects experienced by the observers. The combined experimental and computational evidence supports the view that the illusion is indeed caused by involuntary image displacements picked up by low-level motion detectors, and further suggests that coherent motion signals are crucial to perceive a stable world.
Effective star tracking method based on optical flow analysis for star trackers.
Sun, Ting; Xing, Fei; Wang, Xiaochu; Li, Jin; Wei, Minsong; You, Zheng
2016-12-20
Benefiting from the rapid development of imaging sensor technology, modern optical technology, and high-speed computing chips, the star tracker's accuracy, dynamic performance, and update rate have been greatly improved with low power consumption and miniature size. The star tracker is currently one of the most competitive attitude measurement sensors. However, due to restrictions of the optical imaging system, difficulties still exist in moving star spot detection and star tracking under special motion conditions. An effective star tracking method based on optical flow analysis for star trackers is proposed in this paper. Spot-based optical flow, based on the gray gradient between two adjacent star images, is analyzed to distinguish the star spot region and obtain an accurate star spot position, so that star tracking can remain continuous under highly dynamic conditions. The obtained star vectors and an extended Kalman filter (EKF) are then combined to estimate the angular velocity and predict the star spot region; this prediction can be combined with the optical flow analysis result. Experimental results show that the method proposed in this paper has advantages under conditions of large angular velocity and large angular acceleration, despite the presence of noise. Higher functional density and better performance can be achieved; thus, the star tracker can be more widely applied in small satellites, remote sensing, and other complex space missions.
Ambiguities of a Motion Field.
1987-01-01
solutions in the case of planar surfaces has since been reported by Tsai et al. [1982], Waxman & Ullman [1985], Longuet-Higgins [1984], Maybank [1984], and... Maybank, S.J. (1984) "The Angular Velocity Associated with the Optical Flow Field due to a Single Moving Rigid Plane," Proceedings of the Sixth European
NASA Astrophysics Data System (ADS)
Tang, Jianbo; Erdener, Sefik Evren; Li, Baoqiang; Fu, Buyin; Sakadzic, Sava; Carp, Stefan A.; Lee, Jonghwan; Boas, David A.
2018-02-01
Dynamic Light Scattering-Optical Coherence Tomography (DLS-OCT) takes advantage of DLS to measure particle flow and diffusion within an OCT resolution-constrained 3D volume, enabling simultaneous measurements of absolute RBC velocity and diffusion coefficient with high spatial resolution. In this work, we applied DLS-OCT to measure both RBC velocity and the shear-induced diffusion coefficient within penetrating venules of the somatosensory cortex of anesthetized mice. Laminar flow profile measurements indicate a blunted profile, and the degree of blunting decreases with increasing vessel diameter. The measured shear-induced diffusion coefficient was proportional to the flow shear rate, with a magnitude of 0.1 to 0.5 × 10⁻⁶ mm²/s. These results provide important experimental support for the recent theoretical explanation of why DCS is dominantly sensitive to RBC diffusive motion.
NASA Astrophysics Data System (ADS)
Jones, Philip H.; Smart, Thomas J.; Richards, Christopher J.; Cubero, David
2016-09-01
The Kapitza pendulum is the paradigm for the phenomenon of dynamical stabilization, whereby an otherwise unstable system achieves stability induced by fast modulation of a control parameter. In the classic, macroscopic Kapitza pendulum, a rigid pendulum is stabilized in the upright, inverted position by fast oscillation of its pivot. Here, an overdamped analog of the pendulum is realized using a particle confined in a ring-shaped optical trap, subject to a drag force via fluid flow and driven by oscillating the potential in a direction parallel to the fluid flow. In the regime of vanishing Reynolds number with high-frequency driving, the inverted pendulum is no longer stable, but new equilibrium positions appear that depend on the amplitude of driving. As the driving frequency is decreased, a yet different behavior emerges where stability of the pendulum depends also on the details of the pendulum hydrodynamics. We present a theory for the observed induced stability of the overdamped pendulum based on the separation of timescales in the pendulum motion as formulated by Kapitza, but with the addition of a viscous drag. Excellent agreement is found between the behavior predicted by the analytical theory and the experimental results across the range of pendulum driving frequencies. We complement these results with Brownian motion simulations, and we characterize the stabilized pendulum by both time- and frequency-domain analyses of the pendulum's Brownian motion.
On-chip photonic tweezers for photonics, microfluidics, and biology
NASA Astrophysics Data System (ADS)
Pin, Christophe; Renaut, Claude; Tardif, Manon; Jager, Jean-Baptiste; Delamadeleine, Eric; Picard, Emmanuel; Peyrade, David; Hadji, Emmanuel; de Fornel, Frédérique; Cluzel, Benoît
2017-04-01
Near-field optical forces arise from evanescent electromagnetic fields and can be advantageously used for on-chip optical trapping. In this work, we investigate how evanescent fields at the surface of photonic cavities can efficiently trap micro-objects such as polystyrene particles and bacteria. We study first the influence of trapped particle's size on the trapping potential and introduce an original optofluidic near-field optical microscopy technique. Then we analyze the rotational motion of trapped clusters of microparticles and investigate their possible use as microfluidic micro-tools such as integrated micro-flow vane. Eventually, we demonstrate efficient on-chip optical trapping of various kinds of bacteria.
Elastohydrodynamic Lift at a Soft Wall
NASA Astrophysics Data System (ADS)
Davies, Heather S.; Débarre, Delphine; El Amri, Nouha; Verdier, Claude; Richter, Ralf P.; Bureau, Lionel
2018-05-01
We study experimentally the motion of nondeformable microbeads in a linear shear flow close to a wall bearing a thin and soft polymer layer. Combining microfluidics and 3D optical tracking, we demonstrate that the steady-state bead-to-surface distance increases with the flow strength. Moreover, such lift is shown to result from flow-induced deformations of the layer, in quantitative agreement with theoretical predictions from elastohydrodynamics. This study thus provides the first experimental evidence of "soft lubrication" at play at small scale, in a system relevant, for example, to the physics of blood microcirculation.
Vestibular signals in primate cortex for self-motion perception.
Gu, Yong
2018-04-21
The vestibular peripheral organs in our inner ears detect transient motion of the head in everyday life. This information is sent to the central nervous system for automatic processes such as vestibulo-ocular reflexes, balance and postural control, and higher cognitive functions including perception of self-motion and spatial orientation. Recent neurophysiological studies have discovered a prominent vestibular network in the primate cerebral cortex. Many of the areas involved are multisensory: their neurons are modulated by both vestibular signals and visual optic flow, potentially facilitating more robust heading estimation through cue integration. Combining psychophysics, computation, physiological recording and causal manipulation techniques, recent work has addressed both the encoding and decoding of vestibular signals for self-motion perception. Copyright © 2018. Published by Elsevier Ltd.
Handheld Fluorescence Microscopy based Flow Analyzer.
Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva
2016-03-01
Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time consuming procedure. This article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy-based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating the high-throughput characteristics of our flow analyzer in comparison to conventional fluorescence microscopy. Motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address this issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and the flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.
The Interaction of Focused Attention with Flow-field Sensitivity
NASA Technical Reports Server (NTRS)
Stoffregen, T.
1984-01-01
Two studies were performed to determine whether a subject's response to naturalistic optical flow specifying egomotion would be affected by a concurrent attention task. In the first study subjects stood in a moving room in which various areas of the optical flow generated by room movement were visible. Subjects responded to room motion with strong compensatory sway when the entire room was visible. When the side walls of the room were completely obscured by stationary screens, leaving only the front wall visible, sway was significantly reduced, though it remained greater than in an eyes-closed control. In Exp. 2 subjects were presented with either the full room (large sway response) or the room with only the front wall visible (moderate response), each in combination with either a hard or easy verbal addition task. Preliminary results show that swaying in the fully visible room and in the room with only the front wall visible increased when combined with either the hard or easy tasks. These preliminary results suggest that at the least the pick-up of optical flow specifying egomotion is not affected by concurrent attentional activity.
Real-time detection of moving objects from moving vehicles using dense stereo and optical flow
NASA Technical Reports Server (NTRS)
Talukder, Ashit; Matthies, Larry
2004-01-01
Dynamic scene perception is very important for autonomous vehicles operating around other moving vehicles and humans. Most work on real-time object tracking from moving platforms has used sparse features or assumed flat scene structures. We have recently extended a real-time, dense stereo system to include real-time, dense optical flow, enabling more comprehensive dynamic scene analysis. We describe algorithms to robustly estimate 6-DOF robot egomotion in the presence of moving objects using dense flow and dense stereo. We then use dense stereo and egomotion estimates to identify other moving objects while the robot itself is moving. We present results showing accurate egomotion estimation and detection of moving people and vehicles under general 6-DOF motion of the robot and independently moving objects. The system runs at 18.3 Hz on a 1.4 GHz Pentium M laptop, computing 160x120 disparity maps and optical flow fields, egomotion, and moving object segmentation. We believe this is a significant step toward general unconstrained dynamic scene analysis for mobile robots, as well as for improved position estimation where GPS is unavailable.
Optic flow odometry operates independently of stride integration in carried ants.
Pfeffer, Sarah E; Wittlinger, Matthias
2016-09-09
Cataglyphis desert ants are impressive navigators. When the foragers roam the desert, they employ path integration. For these ants, distance estimation is one key challenge. Distance information was thought to be provided by optic flow (OF)-that is, image motion experienced during travel-but this idea was abandoned when stride integration was discovered as an odometer mechanism in ants. We show that ants transported by nest mates are capable of measuring travel distance exclusively by the use of OF cues. Furthermore, we demonstrate that the information gained from the optic flowmeter cannot be transferred to the stride integrator. Our results suggest a dual information channel that allows the ants to measure distances by strides and OF cues, although both systems operate independently and in a redundant manner. Copyright © 2016, American Association for the Advancement of Science.
Mechanism of magnetic liquid flowing in the magnetic liquid seal gap of reciprocating shaft
NASA Astrophysics Data System (ADS)
Li, Decai; Chui, Haichun; Yang, Qingxin
2003-04-01
In order to solve problems that exist in the magnetic liquid seal of a reciprocating shaft, we set up an experimental facility comprising a camera, a microscope, a step-by-step motor, a pin roller screw, a reciprocating motion shaft, pole pieces, a permanent magnet, and the magnetic liquid in the seal gap. Through optical imaging and image processing with this facility, we studied the magnetic liquid flow in the seal gap as the reciprocating shaft moves with different velocities and strokes. The study especially concentrates on 1) the regular pattern of such flow; 2) the quantity of magnetic liquid lost due to the motion of the reciprocating shaft; 3) the failure causes of this magnetic liquid seal; and 4) the design of a new structure for the magnetic liquid seal of a reciprocating shaft. Application indicates that the new structure is very effective in some situations.
Peripheral Visual Cues Contribute to the Perception of Object Movement During Self-Movement
Rogers, Cassandra; Warren, Paul A.
2017-01-01
Safe movement through the environment requires us to monitor our surroundings for moving objects or people. However, identification of moving objects in the scene is complicated by self-movement, which adds motion across the retina. To identify world-relative object movement, the brain thus has to ‘compensate for’ or ‘parse out’ the components of retinal motion that are due to self-movement. We have previously demonstrated that retinal cues arising from central vision contribute to solving this problem. Here, we investigate the contribution of peripheral vision, commonly thought to provide strong cues to self-movement. Stationary participants viewed a large field of view display, with radial flow patterns presented in the periphery, and judged the trajectory of a centrally presented probe. Across two experiments, we demonstrate and quantify the contribution of peripheral optic flow to flow parsing during forward and backward movement. PMID:29201335
Pupil Tracking for Real-Time Motion Corrected Anterior Segment Optical Coherence Tomography
Carrasco-Zevallos, Oscar M.; Nankivil, Derek; Viehland, Christian; Keller, Brenton; Izatt, Joseph A.
2016-01-01
Volumetric acquisition with anterior segment optical coherence tomography (ASOCT) is necessary to obtain accurate representations of the tissue structure and to account for asymmetries of the anterior eye anatomy. Additionally, recent interest in imaging of anterior segment vasculature and aqueous humor flow resulted in application of OCT angiography techniques to generate en face and 3D micro-vasculature maps of the anterior segment. Unfortunately, ASOCT structural and vasculature imaging systems do not capture volumes instantaneously and are subject to motion artifacts due to involuntary eye motion that may hinder their accuracy and repeatability. Several groups have demonstrated real-time tracking for motion-compensated in vivo OCT retinal imaging, but these techniques are not applicable in the anterior segment. In this work, we demonstrate a simple and low-cost pupil tracking system integrated into a custom swept-source OCT system for real-time motion-compensated anterior segment volumetric imaging. Pupil oculography hardware coaxial with the swept-source OCT system enabled fast detection and tracking of the pupil centroid. The pupil tracking ASOCT system with a field of view of 15 x 15 mm achieved diffraction-limited imaging over a lateral tracking range of +/- 2.5 mm and was able to correct eye motion at up to 22 Hz. Pupil tracking ASOCT offers a novel real-time motion compensation approach that may facilitate accurate and reproducible anterior segment imaging. PMID:27574800
Multimedia Thermofluid Dynamics, an Undergraduate Education Project
NASA Astrophysics Data System (ADS)
Settles, G. S.; Dreibelbis, L. J.; Miller, J. D.; Smith, B. P.
2002-11-01
New multimedia materials are being developed for undergraduate instruction in thermofluid dynamics (e.g. convective heat transfer, thermodynamics, and gas dynamics), with strong emphasis on experimental and optical flow visualization. Since textbooks often show only simple line diagrams, our emphasis is on real flow images as in Van Dyke's classic "Album of Fluid Motion." Here, however, digital video clips illustrate the pertinent phenomena in motion, with voice-over explanations and occasional musical accompaniment. Beyond that, no attempt is made to duplicate traditional textbook material, but rather to provide a visual "window" into the laboratory experience. The results will be produced and distributed in DVD form for instructors and students as a visual supplement to the standard textbooks on these topics. The suitability of such materials for national dissemination has already been demonstrated. This approach is believed to be especially important for small and minority universities that sometimes lack laboratory facilities. Several examples will be shown, including transitional flow, hydraulic jumps, nucleate boiling, convective heat transfer, and supersonic flow. (Supported by NSF DUE Grant.)
Wei, Xiang; Camino, Acner; Pi, Shaohua; Cepurna, William; Huang, David; Morrison, John C; Jia, Yali
2018-05-01
Phase-based optical coherence tomography (OCT), such as OCT angiography (OCTA) and Doppler OCT, is sensitive to the confounding phase shift introduced by subject bulk motion. Traditional bulk motion compensation methods are limited in accuracy and computational efficiency. In this Letter, we present what is, to the best of our knowledge, a novel bulk motion compensation method for phase-based functional OCT. The bulk-motion-associated phase shift can be derived directly by solving its equation using the standard deviation of phase-based OCTA and Doppler OCT flow signals. The method was evaluated on rodent retinal images acquired by a prototype visible-light OCT system and on human retinal images acquired by a commercial system. Image quality and computational speed were significantly improved compared to two conventional phase compensation methods.
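For context, the conventional baseline that such methods are compared against can be sketched as a per-A-line, intensity-weighted circular-mean phase subtraction. The abstract does not give the authors' standard-deviation equation, so the sketch below implements only the conventional idea, on synthetic data:

```python
import numpy as np

def compensate_bulk_phase(dphi, intensity):
    """Subtract a per-A-line bulk phase from a phase-difference B-scan
    (depth x A-lines) using the intensity-weighted circular mean.
    This is the conventional compensation the Letter compares against,
    not the authors' standard-deviation-based method."""
    w = intensity / intensity.sum(axis=0, keepdims=True)
    bulk = np.angle((w * np.exp(1j * dphi)).sum(axis=0))
    return np.angle(np.exp(1j * (dphi - bulk[None, :])))

# Synthetic B-scan: a random bulk phase offset per A-line, plus a
# +0.5 rad flow signal confined to rows 40-49.
rng = np.random.default_rng(2)
depth, lines = 100, 50
bulk_true = rng.uniform(-1.0, 1.0, lines)
dphi = bulk_true[None, :] + rng.normal(0.0, 0.01, (depth, lines))
dphi[40:50, :] += 0.5
out = compensate_bulk_phase(dphi, np.ones((depth, lines)))
```

After compensation the static background sits near zero phase while the flow region retains most of its signal, which is exactly what the bulk correction is meant to achieve.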
Applications of Phase-Based Motion Processing
NASA Technical Reports Server (NTRS)
Branch, Nicholas A.; Stewart, Eric C.
2018-01-01
Image pyramids provide useful information in determining structural response at low cost using commercially available cameras. The current effort applies previous work on the complex steerable pyramid to analyze and identify imperceptible linear motions in video. Instead of implicitly computing motion spectra through phase analysis of the complex steerable pyramid and magnifying the associated motions, we present a visual technique, and the necessary software, to display the phase changes of high-frequency signals within video. The present technique quickly identifies the regions of largest motion within a video with a single phase visualization and without the artifacts of motion magnification, but requires the computationally intensive Fourier transform. While Riesz pyramids offer an alternative to the computationally intensive complex steerable pyramid for motion magnification, the Riesz formulation contains significant noise, and motion magnification still produces large amounts of data that cannot be quickly assessed by the human eye. Thus, user-friendly software is presented, in both Python and MATLAB, for quickly identifying structural response through optical flow and phase visualization.
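A single band's inter-frame phase change can be computed directly in the Fourier domain. The sketch below is a one-level, one-orientation stand-in for the complex steerable pyramid (the band centre and width are illustrative assumptions), showing why the band phase shift is proportional to sub-pixel motion:

```python
import numpy as np

def phase_change_map(f0, f1, fx, sigma=0.02):
    """Phase difference of one horizontal spatial-frequency band
    between two frames. An FFT-domain Gaussian mask around the
    positive frequency +fx yields a complex (analytic) band signal;
    the inter-frame phase change of that band is proportional to
    sub-pixel motion along x."""
    u = np.fft.fftfreq(f0.shape[1])
    mask = np.exp(-((u - fx) ** 2) / (2 * sigma ** 2))
    band = lambda f: np.fft.ifft(np.fft.fft(f, axis=1) * mask, axis=1)
    return np.angle(band(f1) * np.conj(band(f0)))

# Two frames of a 12-cycle sinusoid, the second delayed by 0.5 pixel.
fx = 12 / 128
x = np.arange(128)
f0 = np.tile(np.cos(2 * np.pi * fx * x), (64, 1))
f1 = np.tile(np.cos(2 * np.pi * fx * (x - 0.5)), (64, 1))
dphi = phase_change_map(f0, f1, fx)   # ~ -2*pi*fx*0.5 everywhere
```

The sign convention here makes a rightward shift appear as a negative phase change; a pyramid implementation repeats this per scale and orientation.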
Kim, Woojae; Han, Tae Hwa; Kim, Hyun Jun; Park, Man Young; Kim, Ku Sang; Park, Rae Woong
2011-06-01
The mucociliary transport system is a major defense mechanism of the respiratory tract. The performance of mucous transportation in the nasal cavity can be represented by the ciliary beating frequency (CBF). This study proposes a novel method to measure CBF by using optical flow. To obtain objective estimates of CBF from video images, an automated computer-based image-processing technique is developed. This study proposes a new method based on optical flow for image processing and peak detection for signal processing. We compare the measuring accuracy of the method across combinations of image processing (optical flow versus difference image) and signal processing (fast Fourier transform [FFT] versus peak detection [PD]). The digital high-speed video method, with a manual count of CBF during slow-motion video playback, is the gold standard in CBF measurement. We obtained a total of fifty recorded ciliated sinonasal epithelium images for CBF measurement from the Department of Otolaryngology. The ciliated sinonasal epithelium images were recorded at 50-100 frames per second using a charge coupled device camera with an inverted microscope at a magnification of ×1,000. The mean square errors and variances for each method were 1.24, 0.84 Hz; 11.8, 2.63 Hz; 3.22, 1.46 Hz; and 3.82, 1.53 Hz for optical flow (OF) + PD, OF + FFT, difference image (DI) + PD, and DI + FFT, respectively. Of the four methods, PD using optical flow showed the best performance for measuring the CBF of nasal mucosa. The proposed method was able to measure CBF more objectively and efficiently than is currently possible.
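The PD step reduces to counting peaks in a one-dimensional per-frame motion signal. A minimal sketch, using a synthetic signal in place of the per-frame optical-flow magnitude the study derives from video:

```python
import numpy as np

def cbf_peak_detection(signal, fps):
    """Estimate ciliary beat frequency from a per-frame motion signal
    by peak detection (the PD step; in the study the signal would be
    the optical-flow magnitude per frame)."""
    s = signal - signal.mean()
    # A sample is a peak if it exceeds both neighbours and the mean.
    is_peak = (s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]) & (s[1:-1] > 0)
    return is_peak.sum() / (len(signal) / fps)   # beats per second, Hz

# Synthetic 10 Hz beating recorded at 100 frames per second for 2 s.
fps = 100
t = np.arange(200) / fps
motion = 1.0 + np.sin(2 * np.pi * 10.0 * t + 0.3)
cbf = cbf_peak_detection(motion, fps)
```

The FFT variant would instead take the frequency of the largest spectral peak of the same signal; the study found the PD variant more accurate on real cilia video.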
The Complex Action Recognition via the Correlated Topic Model
Tu, Hong-bin; Xia, Li-min; Wang, Zheng-wu
2014-01-01
Human complex action recognition is an important research area within action recognition. Among the various obstacles to human complex action recognition, one of the most challenging is self-occlusion, where one body part occludes another. This paper presents a new method for human complex action recognition based on optical flow and the correlated topic model (CTM). First, a Markov random field is used to represent the occlusion relationship between human body parts in terms of an occlusion state variable. Second, structure from motion (SFM) is used to reconstruct the missing data of point trajectories. Then, key frames are extracted based on motion features from optical flow, and width-to-height ratios are extracted from the human silhouette. Finally, the correlated topic model (CTM) is used to classify actions. Experiments were performed on the KTH, Weizmann, and UIUC action datasets to test and evaluate the proposed method. The results showed that the proposed method was more effective than the compared methods. PMID:24574920
Optical flows method for lightweight agile remote sensor design and instrumentation
NASA Astrophysics Data System (ADS)
Wang, Chong; Xing, Fei; Wang, Hongjian; You, Zheng
2013-08-01
Lightweight agile remote sensors have become one of the most important classes of payloads and are widely used in space reconnaissance and resource surveys. These imaging sensors are designed to obtain imagery of high spatial, temporal, and spectral resolution. Key techniques in their instrumentation include flexible maneuvering, advanced imaging control algorithms, and integrative measuring techniques, which are closely interrelated and can even act as bottlenecks for one another. These mutually constraining problems must therefore be solved and optimized together. Optical flow is the critical model through which both information transfer and radiative energy flow are represented in dynamic imaging. For agile sensors, especially those with a wide field of view, imaging optical flows may distort and deviate severely during large-angle attitude-maneuver imaging. This phenomenon is mainly attributed to the geometry of the three-dimensional Earth surface and to coupling effects arising from the complicated relative motion between sensor and scene. Under these circumstances the velocity field is distributed nonlinearly, and the imagery may be badly smeared, or its geometrical structure altered, if image-velocity matching errors are not eliminated. In this paper, a precise imaging optical flow model is established for agile remote sensors, in which the evolution of the optical flow is factorized into two components: one due to translational movement and one due to image shape change. On this basis, agile remote sensor instrumentation is investigated. The main techniques involving optical flow modeling include integrative design with lightweight star sensors and micro inertial measurement units together with the corresponding data fusion, focal-plane layout and control, and post-processing of imagery for agile remote sensors.
Experiments show that the optical flow analysis method is effective in overcoming the limitations on the performance indexes and has been successfully applied to integrative system design. Finally, a principle prototype of an agile remote sensor designed with this method is discussed.
1988-04-01
There is thus a biological motivation for investigating a solution to a specific, practically motivated problem, e.g., visual obstacle avoidance, rather than the general problem. The apparent motion of the image, known as the optical flow, does not necessarily correspond to the 2-D motion field. The obvious motivation stems from the fact that an obstacle in relative motion…
Computing motion using resistive networks
NASA Technical Reports Server (NTRS)
Koch, Christof; Luo, Jin; Mead, Carver; Hutchinson, James
1988-01-01
Recent developments in the theory of early vision are described which lead from the formulation of the motion problem as an ill-posed one to its solution by minimizing certain 'cost' functions. These cost or energy functions can be mapped onto simple analog and digital resistive networks. It is shown how the optical flow can be computed by injecting currents into resistive networks and recording the resulting stationary voltage distribution at each node. These networks can be implemented in CMOS VLSI circuits and represent plausible candidates for biological vision systems.
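The cost-function formulation described here is essentially the Horn and Schunck energy, and its solution by relaxation is the digital counterpart of voltages settling on a resistive grid. A minimal numerical sketch (not the analog VLSI circuit itself; all parameter values are illustrative):

```python
import numpy as np

def horn_schunck(I1, I2, alpha=0.5, n_iter=200):
    """Optic flow by relaxing a Horn-Schunck-style energy.

    Each iteration pulls the flow toward its neighbourhood average
    (the smoothness term, i.e. the 'resistive grid') while the data
    term enforces brightness constancy Ix*u + Iy*v + It = 0."""
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    avg = lambda f: (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
    den = alpha ** 2 + Ix ** 2 + Iy ** 2
    for _ in range(n_iter):
        ub, vb = avg(u), avg(v)
        t = (Ix * ub + Iy * vb + It) / den
        u = ub - Ix * t
        v = vb - Iy * t
    return u, v

# A horizontal sinusoid translated 1 pixel to the right between frames:
# the recovered flow should be u ~ 1, v ~ 0 everywhere.
x = np.arange(64, dtype=float)
I1 = np.tile(np.sin(2 * np.pi * x / 32), (64, 1))
I2 = np.roll(I1, 1, axis=1)
u, v = horn_schunck(I1, I2)
```

In the chip, the iteration loop disappears: the network settles to the same minimum physically, with currents playing the role of the data term.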
Visual perception of axes of head rotation
Arnoldussen, D. M.; Goossens, J.; van den Berg, A. V.
2013-01-01
Registration of ego-motion is important to accurately navigate through space. Movements of the head and eye relative to space are registered through the vestibular system and optical flow, respectively. Here, we address three questions concerning the visual registration of self-rotation. (1) Eye-in-head movements provide a link between the motion signals received by sensors in the moving eye and sensors in the moving head. How are these signals combined into an ego-rotation percept? We combined optic flow of simulated forward and rotational motion of the eye with different levels of eye-in-head rotation for a stationary head. We dissociated simulated gaze rotation and head rotation by different levels of eye-in-head pursuit. We found that perceived rotation matches simulated head rotation, not gaze rotation. This rejects a model for perceived self-rotation that relies on the rotation of the gaze line. Rather, eye-in-head signals serve to transform the optic flow's rotation information, which specifies rotation of the scene relative to the eye, into a rotation relative to the head. This suggests that transformed visual self-rotation signals may combine with vestibular signals. (2) Do transformed visual self-rotation signals reflect the arrangement of the semi-circular canals (SCC)? Previously, we found sub-regions within MST and V6+ that respond to the speed of the simulated head rotation. Here, we re-analyzed those blood oxygenation level-dependent (BOLD) signals for the presence of a spatial dissociation related to the axes of visually simulated head rotation, such as has been found in sub-cortical regions of various animals. On the contrary, we found a rather uniform BOLD response to simulated rotation along the three SCC axes. (3) We investigated whether subjects' sensitivity to the direction of the head rotation axis shows SCC-axis specificity.
We found that sensitivity to head rotation is rather uniformly distributed, suggesting that in human cortex, visuo-vestibular integration is not arranged into the SCC frame. PMID:23919087
NASA Astrophysics Data System (ADS)
Gliß, Jonas; Stebel, Kerstin; Kylling, Arve; Sudbø, Aasmund
2018-02-01
Accurate gas velocity measurements in emission plumes are highly desirable for various atmospheric remote sensing applications. The imaging technique of UV SO2 cameras is commonly used to monitor SO2 emissions from volcanoes and anthropogenic sources (e.g. power plants, ships). The camera systems capture the emission plumes at high spatial and temporal resolution. This allows the gas velocities in the plume to be retrieved directly from the images. The latter can be measured at a pixel level using optical flow (OF) algorithms. This is particularly advantageous under turbulent plume conditions. However, OF algorithms intrinsically rely on contrast in the images and often fail to detect motion in low-contrast image areas. We present a new method to identify ill-constrained OF motion vectors and replace them using the local average velocity vector. The latter is derived based on histograms of the retrieved OF motion fields. The new method is applied to two example data sets recorded at Mt Etna (Italy) and Guallatiri (Chile). We show that in many cases, the uncorrected OF yields significantly underestimated SO2 emission rates. We further show that our proposed correction can account for this and that it significantly improves the reliability of optical-flow-based gas velocity retrievals. In the case of Mt Etna, the SO2 emissions of the north-eastern crater are investigated. The corrected SO2 emission rates range between 4.8 and 10.7 kg s-1 (average of 7.1 ± 1.3 kg s-1) and are in good agreement with previously reported values. For the Guallatiri data, the emissions of the central crater and a fumarolic field are investigated. The retrieved SO2 emission rates are between 0.5 and 2.9 kg s-1 (average of 1.3 ± 0.5 kg s-1) and provide the first report of SO2 emissions from this remotely located and inaccessible volcano.
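The correction idea, identifying unreliable flow vectors and replacing them with the predominant displacement derived from histograms of the reliable retrievals, can be sketched as follows. Here the reliability mask and all parameters are illustrative assumptions; in the actual method the histograms are of the retrieved OF motion fields and the replacement is a local average:

```python
import numpy as np

def correct_flow(u, v, reliable):
    """Replace ill-constrained optical-flow vectors with the average
    vector derived from histograms of the reliable retrievals.
    A sketch of the correction idea only; in practice 'reliable'
    would come from an OF quality measure in low-contrast areas."""
    def hist_peak(x, bins=36):
        counts, edges = np.histogram(x, bins=bins)
        i = np.argmax(counts)
        return 0.5 * (edges[i] + edges[i + 1])
    mag = np.hypot(u, v)[reliable]
    ang = np.arctan2(v, u)[reliable]
    m0, a0 = hist_peak(mag), hist_peak(ang)
    u2, v2 = u.copy(), v.copy()
    u2[~reliable] = m0 * np.cos(a0)
    v2[~reliable] = m0 * np.sin(a0)
    return u2, v2

# Synthetic plume field: true motion ~(3, 0) px/frame, but 30% of
# vectors are lost (zeroed) in low-contrast image regions.
rng = np.random.default_rng(0)
u = 3.0 + rng.normal(0.0, 0.1, (50, 50))
v = rng.normal(0.0, 0.1, (50, 50))
reliable = rng.random((50, 50)) > 0.3
u[~reliable] = 0.0
v[~reliable] = 0.0
u2, v2 = correct_flow(u, v, reliable)
```

Leaving the zeroed vectors in place is what biases emission rates low; filling them from the histogram peaks restores the mean transport velocity.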
Agyei, Seth B.; van der Weel, F. R. (Ruud); van der Meer, Audrey L. H.
2016-01-01
During infancy, smart perceptual mechanisms develop allowing infants to judge time-space motion dynamics more efficiently with age and locomotor experience. This emerging capacity may be vital to enable preparedness for upcoming events and to be able to navigate in a changing environment. Little is known about brain changes that support the development of prospective control and about processes, such as preterm birth, that may compromise it. As a function of perception of visual motion, this paper will describe behavioral and brain studies with young infants investigating the development of visual perception for prospective control. By means of the three visual motion paradigms of occlusion, looming, and optic flow, our research shows the importance of including behavioral data when studying the neural correlates of prospective control. PMID:26903908
Dikbas, Salih; Altunbasak, Yucel
2013-08-01
In this paper, a new low-complexity true-motion estimation (TME) algorithm is proposed for video processing applications, such as motion-compensated temporal frame interpolation (MCTFI) or motion-compensated frame rate up-conversion (MCFRUC). Regular motion estimation, which is often used in video coding, aims to find the motion vectors (MVs) to reduce the temporal redundancy, whereas TME aims to track the projected object motion as closely as possible. TME is obtained by imposing implicit and/or explicit smoothness constraints on the block-matching algorithm. To produce better quality-interpolated frames, the dense motion field at interpolation time is obtained for both forward and backward MVs; then, bidirectional motion compensation using forward and backward MVs is applied by mixing both elegantly. Finally, the performance of the proposed algorithm for MCTFI is demonstrated against recently proposed methods and smoothness constraint optical flow employed by a professional video production suite. Experimental results show that the quality of the interpolated frames using the proposed method is better when compared with the MCFRUC techniques.
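The bidirectional compensation step can be sketched with a dense motion field: each interpolated pixel averages the previous frame sampled half a vector backwards and the next frame sampled half a vector forwards. This is a simplified, dense-field illustration of the forward/backward mixing, not the paper's block-based algorithm:

```python
import numpy as np

def interpolate_midframe(f0, f1, u, v):
    """Bidirectional motion-compensated interpolation of the frame
    halfway between f0 and f1, given a dense motion field (u, v) in
    pixels/frame. Nearest-neighbour sampling keeps the sketch short."""
    h, w = f0.shape
    ys, xs = np.mgrid[:h, :w]
    x_b = np.clip(np.round(xs - u / 2).astype(int), 0, w - 1)
    y_b = np.clip(np.round(ys - v / 2).astype(int), 0, h - 1)
    x_f = np.clip(np.round(xs + u / 2).astype(int), 0, w - 1)
    y_f = np.clip(np.round(ys + v / 2).astype(int), 0, h - 1)
    return 0.5 * (f0[y_b, x_b] + f1[y_f, x_f])

# A bar moving 2 px/frame to the right; the true mid frame is the
# first frame shifted by 1 px.
f0 = np.zeros((16, 32))
f0[:, 4:8] = 1.0
f1 = np.roll(f0, 2, axis=1)
u = np.full(f0.shape, 2.0)
v = np.zeros(f0.shape)
mid = interpolate_midframe(f0, f1, u, v)
```

With true motion vectors both samples agree and the average is exact; with coding-oriented (non-true) vectors the two samples disagree, which is why TME matters for interpolation quality.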
NASA Astrophysics Data System (ADS)
Dmitriyenko, Margarita A.; Nyashina, Galina S.; Zhdanova, Alena O.; Vysokomornaya, Olga V.
2016-02-01
The evaporation behavior of an atomized flow of water-based suspension droplets containing ground admixtures in the high-temperature combustion products of a flammable liquid (acetone) was investigated experimentally by optical gas-flow diagnostic methods and high-speed video recording. The extent to which the clay and silt concentration in the droplets of the atomized flow influences the intensity of its evaporation was determined. Approximation dependences describing the decrease in the typical droplet size of the suspension at various ground-admixture concentrations were obtained.
Operational tracking of lava lake surface motion at Kīlauea Volcano, Hawai‘i
Patrick, Matthew R.; Orr, Tim R.
2018-03-08
Surface motion is an important component of lava lake behavior, but previous studies of lake motion have been focused on short time intervals. In this study, we implement the first continuous, real-time operational routine for tracking lava lake surface motion, applying the technique to the persistent lava lake in Halema‘uma‘u Crater at the summit of Kīlauea Volcano, Hawai‘i. We measure lake motion by using images from a fixed thermal camera positioned on the crater rim, transmitting images to the Hawaiian Volcano Observatory (HVO) in real time. We use an existing optical flow toolbox in Matlab to calculate motion vectors, and we track the position of lava upwelling in the lake, as well as the intensity of spattering on the lake surface. Over the past 2 years, real-time tracking of lava lake surface motion at Halema‘uma‘u has been an important part of monitoring the lake’s activity, serving as another valuable tool in the volcano monitoring suite at HVO.
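The study uses an existing Matlab optical flow toolbox; as a self-contained stand-in, bulk displacement between two successive thermal frames can be measured by phase correlation, which is enough to illustrate how frame-to-frame surface motion is extracted from a fixed camera:

```python
import numpy as np

def phase_correlation_shift(f0, f1):
    """Global (bulk) displacement between two frames by phase
    correlation. A stand-in for the per-pixel optical-flow toolbox
    used operationally; returns the (dy, dx) shift of f1 w.r.t. f0."""
    F0, F1 = np.fft.fft2(f0), np.fft.fft2(f1)
    R = F1 * np.conj(F0)
    R /= np.abs(R) + 1e-12            # keep phase, discard amplitude
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = f0.shape                   # wrap to signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

# Synthetic 'thermal' frame and a copy shifted 3 px down, 5 px left.
rng = np.random.default_rng(1)
f0 = rng.random((64, 64))
f1 = np.roll(f0, (3, -5), axis=(0, 1))
dy, dx = phase_correlation_shift(f0, f1)
```

Applied to sub-windows, the same estimator yields a coarse vector field from which upwelling locations and spattering intensity could be tracked over time.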
NASA Astrophysics Data System (ADS)
Stovall, Stephanie; Midgett, Madeline; Thornburg, Kent; Rugonyi, Sandra
2016-11-01
Abnormal blood flow during early cardiovascular development has been identified as a key factor in the pathogenesis of congenital heart disease; however, the mechanisms by which altered hemodynamics induce cardiac malformations are poorly understood. This study used outflow tract (OFT) banding to model increased afterload, pressure, and blood flow velocities at tubular stages of heart development and characterized the immediate changes in cardiac wall motion due to banding in chicken embryo models with light microscopy-based video densitometry. Optical videos were used to acquire two-dimensional heart image sequences over the cardiac cycle, from which intensity data were extracted along the heart centerline at several locations in the heart ventricle and OFT. While no changes were observed in the synchronous contraction of the ventricle with banding, the peristaltic-like wall motion in the OFT was significantly affected. Our data provide valuable insight into early cardiac biomechanics and its characterization using a simple light microscopy-based imaging modality.
Storm-time Convection Dynamics Viewed from Optical Auroras: from Streamer to Patchy Pulsating Aurora
NASA Astrophysics Data System (ADS)
Yang, B.; Donovan, E.; Liang, J.; Grono, E.
2016-12-01
In a series of statistical and event studies we have demonstrated that the motion of patches in regions of Patchy Pulsating Aurora (PPA) is very close to, if not exactly, convection. Thus, 2D maps of PPA motion provide the opportunity to remotely sense magnetospheric convection with relatively high space and time resolution, subject to uncertainties associated with mapping between the ionosphere and magnetosphere. In this study, we use THEMIS ASI aurora observations (streamers and patchy pulsating aurora) combined with SuperDARN convection measurements, Swarm ion drift velocity measurements, and RBSP electric field measurements to explore the convection dynamics in storm time. From 0500 UT to 0600 UT on March 19, 2015, convection observations across 5 magnetic local times (MLT) inferred from the motion of PPA patches and SuperDARN measurements show that a westward SAPS (Subauroral Polarized Streams) enhancement occurs after an auroral streamer. This suggests that plasma sheet fast flows can affect the inner magnetospheric convection, and possibly trigger very fast flows in the inner magnetosphere.
Curvilinear approach to an intersection and visual detection of a collision.
Berthelon, C; Mestre, D
1993-09-01
Visual motion perception plays a fundamental role in vehicle control. Recent studies have shown that the pattern of optical flow resulting from the observer's self-motion through a stable environment is used by the observer to accurately control his or her movements. However, little is known about the perception of another vehicle during self-motion--for instance, when a car driver approaches an intersection with traffic. In a series of experiments using visual simulations of car driving, we show that observers are able to detect the presence of a moving object during self-motion. However, the perception of the other car's trajectory appears to be strongly dependent on environmental factors, such as the presence of a road sign near the intersection or the shape of the road. These results suggest that local and global visual factors determine the perception of a car's trajectory during self-motion.
Multisensory effects on somatosensation: a trimodal visuo-vestibular-tactile interaction
Kaliuzhna, Mariia; Ferrè, Elisa Raffaella; Herbelin, Bruno; Blanke, Olaf; Haggard, Patrick
2016-01-01
Vestibular information about self-motion is combined with other sensory signals. Previous research described both visuo-vestibular and vestibular-tactile bilateral interactions, but the simultaneous interaction between all three sensory modalities has not been explored. Here we exploit a previously reported visuo-vestibular integration to investigate multisensory effects on tactile sensitivity in humans. Tactile sensitivity was measured during passive whole body rotations alone or in conjunction with optic flow, creating either purely vestibular or visuo-vestibular sensations of self-motion. Our results demonstrate that tactile sensitivity is modulated by perceived self-motion, as provided by a combined visuo-vestibular percept, and not by the visual and vestibular cues independently. We propose a hierarchical multisensory interaction that underpins somatosensory modulation: visual and vestibular cues are first combined to produce a multisensory self-motion percept. Somatosensory processing is then enhanced according to the degree of perceived self-motion. PMID:27198907
Perception and Control of Simulated Self Motion. Final Report for the Period April 1983-March 1987.
ERIC Educational Resources Information Center
Owen, Dean H.; And Others
This report includes three experiment sections. The first experiment tested sensitivity to loss in altitude and demonstrated that: (1) preview effects led to adaptation; (2) sensitivity decreased with higher flow rates; and (3) sensitivity increased with higher optical texture densities and fractional loss. The second and third experiments…
Relativistic and Slowing Down: The Flow in the Hotspots of Powerful Radio Galaxies and Quasars
NASA Technical Reports Server (NTRS)
Kazanas, D.
2003-01-01
The 'hotspots' of powerful radio galaxies (the compact, high brightness regions, where the jet flow collides with the intergalactic medium (IGM)) have been imaged in radio, optical and recently in X-ray frequencies. We propose a scheme that unifies their, at first sight, disparate broad-band (radio to X-ray) spectral properties. This scheme involves a relativistic flow upstream of the hotspot that decelerates to the sub-relativistic speed of its inferred advance through the IGM and is viewed at different angles to its direction of motion, as suggested by two independent orientation estimators (the presence or not of broad emission lines in their optical spectra and the core-to-extended radio luminosity). This scheme, besides providing an account of the hotspot spectral properties with jet orientation, also suggests that the large-scale jets remain relativistic all the way to the hotspots.
Gerencser, Akos A.; Nicholls, David G.
2008-01-01
Impaired transport of mitochondria, in dendrites and axons of neurons, and bioenergetic deficit are increasingly recognized to be of pathological importance in neurodegenerative diseases. To study the relationship between transport and bioenergetics, we have developed what to our knowledge is a novel technique to quantify organelle velocity in cultured cells. The aim was to combine measurement of motion and bioenergetic parameters while minimizing photodynamic oxidative artifacts evoked by fluorescence excitation. Velocity determination from sequential fluorescence images is not trivial, and here we describe an application of "optical flow", the flow of gray values in grayscale images, to this problem. Based on the principles of photon shot noise occurring in low light level fluorescence microscopy, we describe and validate here an optical flow-based, robust method to measure velocity vectors for organelles expressing fluorescent proteins. This method features instantaneous velocity determination from a pair of images by detecting motion of edges, with no assumptions about the separation or shapes of the objects in the image. Optical flow was used in combination with a single-mitochondrion assay of mitochondrial thiol redox status by mitochondrially targeted redox-sensitive green fluorescent protein and measurement of mitochondrial membrane potential by tetramethylrhodamine methyl ester. Mitochondrial populations of resting cultured hippocampal neurons were analyzed. It was found that mitochondria with more oxidized thiol redox status have lower membrane potentials and are smaller in size. These mitochondria are more motile than average; however, mitochondrial motility is only slightly dependent on the observed bioenergetic parameters and correlates best with the size of the mitochondria. PMID:18757564
A high-speed photographic system for flow visualization in a steam turbine
NASA Technical Reports Server (NTRS)
Barna, G. J.
1973-01-01
A photographic system was designed to visualize the moisture flow in a steam turbine. Good performance of the system was verified using dry turbine mockups in which an aerosol spray simulated, in a rough way, the moisture flow in the turbine. Borescopes and fiber-optic light tubes were selected as the general instrumentation approach. High speed motion-picture photographs of the liquid flow over the stator blade surfaces were taken using stroboscopic lighting. Good visualization of the liquid flow was obtained. Still photographs of drops in flight were made using short duration flash sources. Drops with diameters as small as 30 micrometers (0.0012 in.) could be resolved. In addition, motion pictures of a spray of water simulating the spray off the rotor blades and shrouds were taken at normal framing rates. Specially constructed light tubes containing small tungsten-halogen lamps were used. Sixteen millimeter photography was used in all cases. Two potential problems resulting from the two-phase turbine flow (attenuation and scattering of light by the fog present and liquid accumulation on the borescope mirrors) were taken into account in the photographic system design but not evaluated experimentally.
FUSE spectra of Lyman series emissions from the interplanetary medium
NASA Astrophysics Data System (ADS)
Clarke, John
Neutral atoms from the local ISM flow into the solar system producing diffuse emissions through resonant scattering of solar emissions. This wind contains the velocity distribution of the local ISM, plus modifications by solar gravity and radiation pressure near the Sun. In addition, the H atom motions are modified by charge exchange collisions with fast protons in the heliospheric interface region, while He atoms are little affected by charge exchange. Recent observations of the He and H flows in the solar system suggest that the He velocity of 26 km s-1 is that of the local ISM cloud, while the lower H velocity of 18-21 km s-1 and greatly increased velocity dispersion in the flow direction are due to an interface modification of the H flow. Remote observations of the H flow thereby provide a method to remotely study the heliospheric interface. The H flow has been studied from H Lyα line profiles at high spectral resolution observed by Copernicus, IUE, and HST, using the Earth orbital motion to Doppler shift the ISM from the geocoronal emission. One serious ambiguity in the interpretation of these data results from the optically thick Lyα emission, leading to uncertainties in derived values of the H density. Using FUSE to observe the brightness and line profile of the optically thin H Lyβ line, close in time to SOHO observations of the Lyα emission, we can determine accurately the optical depth and density n(H) along lines of sight upwind, downwind, and cross-flow. Comparing n(H) with the heliospheric helium density, and with the interstellar cloud HI/HeI ratio measured recently by the EUVE, will give the fraction of H atoms removed by charge exchange at the entrance to the heliosphere, and then the Local Cloud (or ambient ISM) electron density which governs the size of the heliosphere. 
We request FUSE sky aperture spectra in the two narrow science apertures obtained during other pointed observations, through cooperation in scheduling pointed observations in the correct look directions at the proper times of year.
Single-Camera Stereoscopy Setup to Visualize 3D Dusty Plasma Flows
NASA Astrophysics Data System (ADS)
Romero-Talamas, C. A.; Lemma, T.; Bates, E. M.; Birmingham, W. J.; Rivera, W. F.
2016-10-01
A setup to visualize and track individual particles in multi-layered dusty plasma flows is presented. The setup consists of a single camera with variable frame rate, and a pair of adjustable mirrors that project the same field of view from two different angles to the camera, allowing for three-dimensional tracking of particles. Flows are generated by inclining the plane in which the dust is levitated using a specially designed setup that allows for external motion control without compromising vacuum. Dust illumination is achieved with an optics arrangement that includes a Powell lens that creates a laser fan with adjustable thickness and with approximately constant intensity everywhere. Both the illumination and the stereoscopy setup allow for the camera to be placed at right angles with respect to the levitation plane, in preparation for magnetized dusty plasma experiments in which there will be no direct optical access to the levitation plane. Image data and analysis of unmagnetized dusty plasma flows acquired with this setup are presented.
Iwasaki, Wataru; Nogami, Hirofumi; Takeuchi, Satoshi; Furue, Masutaka; Higurashi, Eiji; Sawada, Renshi
2015-10-05
Wearable wireless physiological sensors are helpful for monitoring and maintaining human health. Blood flow contains abundant physiological information, but it is hard to measure blood flow during exercise using conventional blood flowmeters because of their size, weight, and use of optical fibers. To resolve these disadvantages, we previously developed a micro integrated laser Doppler blood flowmeter using microelectromechanical systems technology. This micro blood flowmeter is wearable and capable of stable measurement signals even during movement. Therefore, we attempted to measure skin blood flow at the forehead, fingertip, and earlobe of seven young men while running, as a pilot experiment to extend the utility of the micro blood flowmeter. We measured blood flow in each subject at velocities of 6, 8, and 10 km/h. We succeeded in obtaining stable measurements of blood flow, with few motion artifacts, using the micro blood flowmeter, and the pulse wave signal and motion artifacts were clearly separated by frequency analysis. Furthermore, the results showed that the extent of the changes in blood flow depended on the intensity of exercise, consistent with previous work using an ergometer. Thus, we demonstrated the capability of this wearable blood flow sensor for measurement during exercise.
Rice, Tyler B; Kwan, Elliott; Hayakawa, Carole K; Durkin, Anthony J; Choi, Bernard; Tromberg, Bruce J
2013-01-01
Laser Speckle Imaging (LSI) is a simple, noninvasive technique for rapid imaging of particle motion in scattering media such as biological tissue. LSI is generally used to derive a qualitative index of relative blood flow due to unknown impact from several variables that affect speckle contrast. These variables may include optical absorption and scattering coefficients, multi-layer dynamics including static, non-ergodic regions, and systematic effects such as laser coherence length. In order to account for these effects and move toward quantitative, depth-resolved LSI, we have developed a method that combines Monte Carlo modeling, multi-exposure speckle imaging (MESI), spatial frequency domain imaging (SFDI), and careful instrument calibration. Monte Carlo models were used to generate total and layer-specific fractional momentum transfer distributions. This information was used to predict speckle contrast as a function of exposure time, spatial frequency, layer thickness, and layer dynamics. To verify with experimental data, controlled phantom experiments with characteristic tissue optical properties were performed using a structured light speckle imaging system. Three main geometries were explored: 1) diffusive dynamic layer beneath a static layer, 2) static layer beneath a diffuse dynamic layer, and 3) directed flow (tube) submerged in a dynamic scattering layer. Data fits were performed using the Monte Carlo model, which accurately reconstructed the type of particle flow (diffusive or directed) in each layer, the layer thickness, and absolute flow speeds to within 15% or better.
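The speckle contrast statistic underlying LSI, and its exposure-time dependence that MESI exploits, can be sketched in a few lines. A minimal illustration, assuming the standard definition K = std/mean over local windows and a single-exponential field decorrelation model (a common MESI form, much simpler than the paper's Monte Carlo treatment):

```python
import numpy as np

def speckle_contrast(image, window=7):
    """Local speckle contrast K = std/mean over non-overlapping windows."""
    h, w = (s - s % window for s in image.shape)
    blocks = (image[:h, :w].astype(float)
              .reshape(h // window, window, w // window, window)
              .swapaxes(1, 2))
    return blocks.std(axis=(2, 3)) / blocks.mean(axis=(2, 3))

def mesi_contrast(T, tau_c, beta=1.0):
    """Speckle contrast vs exposure time T for a single-exponential
    field decorrelation with correlation time tau_c."""
    x = T / tau_c
    return np.sqrt(beta * (np.exp(-2 * x) - 1 + 2 * x) / (2 * x**2))
```

Fast dynamics (large T/tau_c) blur the speckle and drive K toward zero, which is why fitting K across several exposure times constrains the decorrelation time and hence flow.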
Origin and dynamics of emission line clouds in cooling flow environments
NASA Technical Reports Server (NTRS)
Loewenstein, Michael
1990-01-01
The author suggests that since clouds are born co-moving in a turbulent intra-cluster medium (ICM), the allowed parameter space can now be opened up to a more acceptable range. Large-scale motions can be driven in the central parts of cooling flows by a number of mechanisms including the motion of the central and other galaxies, and the dissipation of advected, focussed rotational and magnetic energy. In addition to the velocity width paradox, two other paradoxes (Heckman et al. 1989) can be solved if the ICM is turbulent. Firstly, the heating source for the emission line regions has always been puzzling - line luminosities are extremely high for a given (optical or radio) galaxy luminosity compared to those in non-cooling flow galaxies, therefore a mechanism peculiar to cooling flows must be at work. However most, if not all, previously suggested heating mechanisms either fail to provide enough ionization or give the wrong line ratios, or both. The kinetic energy in the turbulence provides a natural energy source if it can be efficiently converted to cloud heat. Researchers suggest that this can be done via magneto-hydrodynamic waves through plasma slip. Secondly, while the x ray observations indicate extended mass deposition, the optical line emission is more centrally concentrated. Since many of the turbulence-inducing mechanisms are strongest in the central regions of the ICM, so is the method of heating. In other words material is dropping out everywhere but only being lit up in the center.
Visual and Non-Visual Contributions to the Perception of Object Motion during Self-Motion
Fajen, Brett R.; Matthis, Jonathan S.
2013-01-01
Many locomotor tasks involve interactions with moving objects. When observer (i.e., self-)motion is accompanied by object motion, the optic flow field includes a component due to self-motion and a component due to object motion. For moving observers to perceive the movement of other objects relative to the stationary environment, the visual system could recover the object-motion component – that is, it could factor out the influence of self-motion. In principle, this could be achieved using visual self-motion information, non-visual self-motion information, or a combination of both. In this study, we report evidence that visual information about the speed (Experiment 1) and direction (Experiment 2) of self-motion plays a role in recovering the object-motion component even when non-visual self-motion information is also available. However, the magnitude of the effect was less than one would expect if subjects relied entirely on visual self-motion information. Taken together with previous studies, we conclude that when self-motion is real and actively generated, both visual and non-visual self-motion information contribute to the perception of object motion. We also consider the possible role of this process in visually guided interception and avoidance of moving objects. PMID:23408983
A Novel Approach to Measuring Glacier Motion Remotely using Aerial LiDAR
NASA Astrophysics Data System (ADS)
Telling, J. W.; Fountain, A. G.; Glennie, C. L.; Obryk, M.
2016-12-01
Glaciers play an important role in the Earth's climate system, affecting climate and ocean circulation at the largest scales, and contributing to runoff and sea level rise at local scales. A key variable is glacier motion, and tracking motion is critical to understanding how flow responds to changes in boundary conditions and to testing predictive models of glacier behavior. Although field measurements of glacier motion have been collected since the 19th century, field operations remain a slow, laborious, sometimes dangerous task yielding only a few data points per glacier. In recent decades satellite imaging of glacier motion has proved very fruitful, but the spatial resolution of the imagery restricts applications to regional-scale analyses. Here we assess the utility of using aerial LiDAR surveys and particle image velocimetry (PIV) as a method for tracking glacier motion over relatively small regions (<50 km²). Five glaciers in Taylor Valley, Antarctica, were surveyed twice; the first LiDAR survey was conducted in 2001 and the second in 2014. The cold-dry climate conditions of Taylor Valley and the relatively slow motion of its polar glaciers (≤8 m yr⁻¹) preserve the surface roughness and limit the advected distance of the features, making the 13-year interval between surveys sufficient for monitoring glacier motion. Initial results yield reasonable flow fields and show great promise. The range of flow speeds, surface roughness, and transient snow patches found on these glaciers provides a robust test of PIV methods. Results will be compared to field measurements of glacier velocity and to results from feature tracking, a common technique based on paired optical images. The merits of using this technique to measure glacier motion will be discussed in the context of these results. Applying PIV to LiDAR point clouds may offer a higher-resolution data set of glacier velocity than satellite images or field measurements.
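The core of PIV applied to a pair of surveys is a windowed cross-correlation between interrogation windows. A minimal sketch of the displacement-peak search, assuming FFT-based circular correlation on gridded surface data (real PIV codes add sub-pixel peak fitting, window overlap, and outlier rejection):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel displacement of the pattern in win_b relative to win_a,
    found as the peak of the FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peaks in the upper half of each axis around to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# synthetic check: a rough surface advected by (3, -2) grid cells between surveys
rng = np.random.default_rng(0)
surface_a = rng.random((64, 64))
surface_b = np.roll(surface_a, (3, -2), axis=(0, 1))
```

Dividing the recovered displacement by the 13-year survey interval would give a velocity per interrogation window, which is how a PIV pass over two LiDAR epochs yields a flow field.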
Peripheral Processing Facilitates Optic Flow-Based Depth Perception
Li, Jinglin; Lindemann, Jens P.; Egelhaaf, Martin
2016-01-01
Flying insects, such as flies or bees, rely on consistent information regarding the depth structure of the environment when performing their flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets or spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements (“optic flow”) during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the responses of the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. This question will be addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble that of a band-pass filter, which reduces the contrast dependency of EMDs strongly, effectively enhancing the representation of the nearness of objects and, especially, of their contours. We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions. 
PMID:27818631
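The correlation-type EMD that the modeled peripheral signals feed into can itself be sketched compactly. A minimal Hassenstein-Reichardt correlator, assuming a first-order low-pass as the delay filter (time constant and stimulus parameters are illustrative, not fitted to the fly):

```python
import numpy as np

def emd_response(left, right, dt, tau):
    """Hassenstein-Reichardt correlator: each input is low-pass filtered
    (the 'delay' arm) and multiplied with the neighbor's undelayed signal;
    subtracting the mirror-symmetric half-detector gives direction selectivity."""
    def lowpass(x):
        # first-order IIR low-pass with time constant tau
        y = np.zeros_like(x, dtype=float)
        alpha = dt / (tau + dt)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y
    return lowpass(left) * right - lowpass(right) * left

# a sinusoidal grating drifting in the preferred direction: the right-hand
# input lags the left-hand one by a quarter cycle
t = np.arange(0.0, 2.0, 0.001)
left = np.sin(2 * np.pi * 4 * t)
right = np.sin(2 * np.pi * 4 * t - np.pi / 2)
response = emd_response(left, right, dt=0.001, tau=0.05)
```

Because the output is a product of input signals, it inherits their contrast and pattern dependence, which is exactly the problem the peripheral band-pass and adaptation stages described above mitigate.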
A stingless bee can use visual odometry to estimate both height and distance.
Eckles, M A; Roubik, D W; Nieh, J C
2012-09-15
Bees move and forage within three dimensions and rely heavily on vision for navigation. The use of vision-based odometry has been studied extensively in horizontal distance measurement, but not vertical distance measurement. The honey bee Apis mellifera and the stingless bee Melipona seminigra measure distance visually using optic flow-movement of images as they pass across the retina. The honey bees gauge height using image motion in the ventral visual field. The stingless bees forage at different tropical forest canopy levels, ranging up to 40 m at our site. Thus, estimating height would be advantageous. We provide the first evidence that the stingless bee Melipona panamica utilizes optic flow information to gauge not only distance traveled but also height above ground, by processing information primarily from the lateral visual field. After training bees to forage at a set height in a vertical tunnel lined with black and white stripes, we observed foragers that explored a new tunnel with no feeder. In a new tunnel, bees searched at the same height they were trained to. In a narrower tunnel, bees experienced more image motion and significantly lowered their search height. In a wider tunnel, bees experienced less image motion and searched at significantly greater heights. In a tunnel without optic cues, bees were disoriented and searched at random heights. A horizontal tunnel testing these variables similarly affected foraging, but bees exhibited less precision (greater variance in search positions). Accurately gauging flight height above ground may be crucial for this species and others that compete for resources located at heights ranging from ground level to the high tropical forest canopies.
Patch-based frame interpolation for old films via the guidance of motion paths
NASA Astrophysics Data System (ADS)
Xia, Tianran; Ding, Youdong; Yu, Bing; Huang, Xi
2018-04-01
Due to improper preservation, traditional films often exhibit frame loss after digitization. To deal with this problem, this paper presents a new adaptive patch-based method of frame interpolation guided by motion paths. Our method has three steps. First, we compute motion paths between two reference frames using optical flow estimation. Then, adaptive bidirectional interpolation with hole filling is applied to generate pre-intermediate frames. Finally, patch matching is used to interpolate the intermediate frames from the most similar patches. Because the patch matching is based on pre-intermediate frames that embed the motion-path constraint, the resulting interpolation looks natural and unforced. We tested different types of old film sequences and compared against other methods; the results show that our method performs as desired, without hole or ghost artifacts.
Detection of Brownian Torque in a Magnetically-Driven Rotating Microsystem
Romodina, Maria N.; Lyubin, Evgeny V.; Fedyanin, Andrey A.
2016-01-01
Thermal fluctuations significantly affect the behavior of microscale systems rotating in shear flow, such as microvortexes, microbubbles, rotating micromotors, microactuators and other elements of lab-on-a-chip devices. The influence of Brownian torque on the motion of individual magnetic microparticles in a rotating magnetic field is experimentally determined using optical tweezers. Rotational Brownian motion induces the flattening of the breakdown transition between the synchronous and asynchronous modes of microparticle rotation. The experimental findings regarding microparticle rotation in the presence of Brownian torque are compared with the results of numerical Brownian dynamics simulations. PMID:26876334
Neural Circuit to Integrate Opposing Motions in the Visual Field.
Mauss, Alex S; Pankova, Katarina; Arenz, Alexander; Nern, Aljoscha; Rubin, Gerald M; Borst, Alexander
2015-07-16
When navigating in their environment, animals use visual motion cues as feedback signals that are elicited by their own motion. Such signals are provided by wide-field neurons sampling motion directions at multiple image points as the animal maneuvers. Each one of these neurons responds selectively to a specific optic flow-field representing the spatial distribution of motion vectors on the retina. Here, we describe the discovery of a group of local, inhibitory interneurons in the fruit fly Drosophila key for filtering these cues. Using anatomy, molecular characterization, activity manipulation, and physiological recordings, we demonstrate that these interneurons convey direction-selective inhibition to wide-field neurons with opposite preferred direction and provide evidence for how their connectivity enables the computation required for integrating opposing motions. Our results indicate that, rather than sharpening directional selectivity per se, these circuit elements reduce noise by eliminating non-specific responses to complex visual information. Copyright © 2015 Elsevier Inc. All rights reserved.
Perceived change in orientation from optic flow in the central visual field
NASA Technical Reports Server (NTRS)
Dyre, Brian P.; Andersen, George J.
1988-01-01
The effects of internal depth within a simulation display on perceived changes in orientation have been studied. Subjects monocularly viewed displays simulating observer motion within a volume of randomly positioned points through a window which limited the field of view to 15 deg. Changes in perceived spatial orientation were measured by changes in posture. The extent of internal depth within the display, the presence or absence of visual information specifying change in orientation, and the frequency of motion supplied by the display were examined. It was found that increased sway occurred at frequencies equal to or below 0.375 Hz when motion at these frequencies was displayed. The extent of internal depth had no effect on the perception of changing orientation.
Spectral domain phase microscopy: a new tool for measuring cellular dynamics and cytoplasmic flow
NASA Astrophysics Data System (ADS)
McDowell, Emily J.; Choma, Michael A.; Ellerbee, Audrey K.; Izatt, Joseph A.
2005-03-01
Broadband interferometry is an attractive technique for the detection of cellular motions because it provides depth-resolved interferometric phase information via coherence gating. Here a phase sensitive technique called spectral domain phase microscopy (SDPM) is presented. SDPM is a functional extension of spectral domain optical coherence tomography that allows for the detection of cellular motions and dynamics with nanometer-scale sensitivity. This sensitivity is made possible by the inherent phase stability of spectral domain OCT combined with common-path interferometry. The theory that underlies this technique is presented, the sensitivity of the technique is demonstrated by the measurement of the thermal expansion coefficient of borosilicate glass, and the response of an Amoeba proteus to puncture of its cell membrane is measured. We also exploit the phase stability of SDPM to perform Doppler flow imaging of cytoplasmic streaming in A. proteus. We show reversal of cytoplasmic flow in response to stimuli, and we show that the cytoplasmic flow is laminar (i.e. parabolic) in nature. We are currently investigating the use of SDPM in a variety of different cell types.
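The nanometer-scale sensitivity rests on converting interferometric phase to axial displacement. A minimal sketch of that conversion, assuming a common-path double-pass geometry (the wavelength and refractive index below are illustrative values, not taken from the paper):

```python
import math

def phase_to_displacement(dphi, wavelength, n=1.38):
    """Axial displacement from an interferometric phase change (radians).
    The 4*pi accounts for the double pass of light to the reflector and back;
    n is an assumed sample refractive index."""
    return wavelength * dphi / (4 * math.pi * n)

# one full 2*pi phase wrap at an assumed 840 nm center wavelength
dz = phase_to_displacement(2 * math.pi, 840e-9)
```

Dividing such a displacement by the time between successive depth profiles gives the Doppler velocity used for flow imaging, which is how phase-stable spectral domain OCT resolves cytoplasmic streaming.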
Motion of particles with inertia in a compressible free shear layer
NASA Technical Reports Server (NTRS)
Samimy, M.; Lele, S. K.
1991-01-01
The effects of the inertia of a particle on its flow-tracking accuracy and particle dispersion are studied using direct numerical simulations of 2D compressible free shear layers in convective Mach number (Mc) range of 0.2 to 0.6. The results show that particle response is well characterized by tau, the ratio of particle response time to the flow time scales (Stokes' number). The slip between particle and fluid imposes a fundamental limit on the accuracy of optical measurements such as LDV and PIV. The error is found to grow like tau up to tau = 1 and taper off at higher tau. For tau = 0.2 the error is about 2 percent. In the flow visualizations based on Mie scattering, particles with tau more than 0.05 are found to grossly misrepresent the flow features. These errors are quantified by calculating the dispersion of particles relative to the fluid. Overall, the effect of compressibility does not seem to be significant on the motion of particles in the range of Mc considered here.
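The first-order particle response that the Stokes number characterizes can be checked analytically. A small sketch of the frequency response of a particle obeying Stokes drag, with the tau = 0.2 case reproducing roughly the 2 percent tracking error quoted above (the unit flow frequency is an assumption for illustration):

```python
import numpy as np

def particle_response(tau_p, omega):
    """Frequency response of a particle obeying Stokes drag,
    dv_p/dt = (u - v_p) / tau_p, to fluid velocity u = U*exp(i*omega*t).
    Returns (amplitude ratio, phase lag in radians)."""
    H = 1.0 / (1.0 + 1j * omega * tau_p)
    return abs(H), -np.angle(H)

# tau = 0.2, i.e. particle response time = 0.2 of the flow time scale
amp, lag = particle_response(0.2, 1.0)
tracking_error = 1.0 - amp
```

As tau grows the response rolls off like a low-pass filter, consistent with the error growing with tau before tapering off for very sluggish particles.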
Tyler, Mitchell E.; Danilov, Yuri P.; Kaczmarek, Kurt A.; Meyerand, Mary E.
2013-01-01
Some individuals with balance impairment have hypersensitivity of the motion-sensitive visual cortices (hMT+) compared to healthy controls. Previous work showed that electrical tongue stimulation can reduce the exaggerated postural sway induced by optic flow in this subject population and decrease the hypersensitive response of hMT+. Additionally, a region within the brainstem (BS), likely containing the vestibular and trigeminal nuclei, showed increased optic flow-induced activity after tongue stimulation. The aim of this study was to understand how the modulation induced by tongue stimulation affects the balance-processing network as a whole and how modulation of BS structures can influence cortical activity. Four volumes of interest, discovered in a general linear model analysis, constitute major contributors to the balance-processing network. These regions were entered into a dynamic causal modeling analysis to map the network and measure any connection or topology changes due to the stimulation. Balance-impaired individuals had downregulated response of the primary visual cortex (V1) to visual stimuli but upregulated modulation of the connection between V1 and hMT+ by visual motion compared to healthy controls (p ≤ 1e-5). This upregulation was decreased to near-normal levels after stimulation. Additionally, the region within the BS showed increased response to visual motion after stimulation compared to both prestimulation and controls. Stimulation to the tongue enters the central nervous system at the BS but likely propagates to the cortex through supramodal information transfer. We present a model to explain these brain responses that utilizes an anatomically present, but functionally dormant pathway of information flow within the processing network. PMID:23216162
Mechanism of magnetic liquid flowing in the magnetic liquid seal gap of reciprocating shaft
NASA Astrophysics Data System (ADS)
Li, Decai; Xu, Haiping; He, Xinzhi; Lan, Huiqing
2005-03-01
In order to solve the problems that exist in magnetic liquid seals for reciprocating shafts, we set up an experimental facility comprising a camera, a microscope, a stepper motor, a pin roller screw, a reciprocating shaft, pole pieces, a permanent magnet, and the magnetic liquid in the seal gap. Using the optical imaging and image processing of this facility, we studied the magnetic liquid flow in the seal gap as the reciprocating shaft moves with different velocities and strokes. The study concentrates on: (1) the regular pattern of this flow; (2) the quantity of magnetic liquid lost due to the motion of the reciprocating shaft; (3) the reasons this magnetic liquid seal fails; and (4) the design of a new seal structure for reciprocating shafts. Applications indicate that the new structure is very effective in some settings. The new structure was granted a state patent in 2001 and certified as a scientific research achievement in 2002.
Postural Responses to a Moving Room in Children with and without Developmental Coordination Disorder
ERIC Educational Resources Information Center
Chung, Hyun Chae; Stoffregen, Thomas A.
2011-01-01
Children (10 or 11 years old) with and without developmental coordination disorder (DCD) were exposed to imposed optic flow in a moving room. We manipulated the amplitude and frequency of oscillatory room motion, and we evaluated the coupling of standing body sway with room oscillations. The results revealed that standing sway of both children…
A Neural Model of Visually Guided Steering, Obstacle Avoidance, and Route Selection
ERIC Educational Resources Information Center
Elder, David M.; Grossberg, Stephen; Mingolla, Ennio
2009-01-01
A neural model is developed to explain how humans can approach a goal object on foot while steering around obstacles to avoid collisions in a cluttered environment. The model uses optic flow from a 3-dimensional virtual reality environment to determine the position of objects on the basis of motion discontinuities and computes heading direction,…
Egomotion Estimation with Optic Flow and Air Velocity Sensors
2012-01-22
RUMMELT; ADAM J. RUTKOWSKI, Acting Technical Advisor, RWW Program Manager. This report is...method of distance and groundspeed estimation using an omnidirectional camera, but knowledge of the average scene distance is required. Flight height...varying wind and even over sloped terrain. Our method also does not require any prior knowledge of the environment or the flyer motion states. This...
Motion-sensitized SPRITE measurements of hydrodynamic cavitation in fast pipe flow.
Adair, Alexander; Mastikhin, Igor V; Newling, Benedict
2018-06-01
The pressure variations experienced by a liquid flowing through a pipe constriction can, in some cases, result in the formation of a bubble cloud (i.e., hydrodynamic cavitation). Due to the nature of the bubble cloud, it is ideally measured through the use of non-optical and non-invasive techniques; therefore, it is well-suited for study by magnetic resonance imaging. This paper demonstrates the use of Conical SPRITE (a 3D, centric-scan, pure phase-encoding pulse sequence) to acquire time-averaged void fraction and velocity information about hydrodynamic cavitation for water flowing through a pipe constriction. Copyright © 2018 Elsevier Inc. All rights reserved.
A simple microfluidic Coriolis effect flowmeter for operation at high pressure and high temperature.
Harrison, Christopher; Jundt, Jacques
2016-08-01
We describe a microfluidic Coriolis effect flowmeter that is simple to assemble, operates at elevated temperature and pressure, and can be operated with a lock-in amplifier. The sensor has a flow rate sensitivity greater than 2° of phase shift per 1 g/min of mass flow and is benchmarked with flow rates ranging from 0.05 to 2.0 g/min. The internal volume is 15 μl and uses off-the-shelf optical components to measure the tube motion. We demonstrate that fluid density can be calculated from the frequency of the resonating element with proper calibration.
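The two reported conversions (lock-in phase shift to mass flow, resonant frequency to fluid density) are both simple inversions. A hedged sketch, with every constant illustrative rather than the device's actual calibration:

```python
import math

# Linear phase-to-flow calibration: the reported sensitivity is over 2 degrees
# of phase shift per 1 g/min of mass flow; 2.0 deg/(g/min) is assumed here.
SENSITIVITY_DEG_PER_G_MIN = 2.0

def mass_flow_g_per_min(phase_shift_deg, sensitivity=SENSITIVITY_DEG_PER_G_MIN):
    """Convert the lock-in phase shift between pickoff signals to mass flow."""
    return phase_shift_deg / sensitivity

def fluid_density(freq_hz, stiffness, tube_mass, tube_volume):
    """Invert the resonator model f = (1/2*pi)*sqrt(k / (m_tube + rho*V))
    for fluid density; stiffness, tube mass, and volume must come from a
    calibration with fluids of known density."""
    total_mass = stiffness / (2 * math.pi * freq_hz) ** 2
    return (total_mass - tube_mass) / tube_volume
```

The density inversion is why the same vibrating element serves as both a flowmeter and a densitometer once calibrated.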
Correcting for motion artifact in handheld laser speckle images
NASA Astrophysics Data System (ADS)
Lertsakdadet, Ben; Yang, Bruce Y.; Dunn, Cody E.; Ponticorvo, Adrien; Crouzet, Christian; Bernal, Nicole; Durkin, Anthony J.; Choi, Bernard
2018-03-01
Laser speckle imaging (LSI) is a wide-field optical technique that enables superficial blood flow quantification. LSI is normally performed in a mounted configuration to decrease the likelihood of motion artifact. However, mounted LSI systems are cumbersome and difficult to transport quickly in a clinical setting for which portability is essential in providing bedside patient care. To address this issue, we created a handheld LSI device using scientific grade components. To account for motion artifact of the LSI device used in a handheld setup, we incorporated a fiducial marker (FM) into our imaging protocol and determined the difference between highest and lowest speckle contrast values for the FM within each data set (Kbest and Kworst). The difference between Kbest and Kworst in mounted and handheld setups was 8% and 52%, respectively, thereby reinforcing the need for motion artifact quantification. When using a threshold FM speckle contrast value (KFM) to identify a subset of images with an acceptable level of motion artifact, mounted and handheld LSI measurements of speckle contrast of a flow region (KFLOW) in in vitro flow phantom experiments differed by 8%. Without the use of the FM, mounted and handheld KFLOW values differed by 20%. To further validate our handheld LSI device, we compared mounted and handheld data from an in vivo porcine burn model of superficial and full thickness burns. The speckle contrast within the burn region (KBURN) of the mounted and handheld LSI data differed by <4 % when accounting for motion artifact using the FM, which is less than the speckle contrast difference between superficial and full thickness burns. Collectively, our results suggest the potential of handheld LSI with an FM as a suitable alternative to mounted LSI, especially in challenging clinical settings with space limitations such as the intensive care unit.
Temporal and spatial adaptation of transient responses to local features
O'Carroll, David C.; Barnett, Paul D.; Nordström, Karin
2012-01-01
Interpreting visual motion within the natural environment is a challenging task, particularly considering that natural scenes vary enormously in brightness, contrast and spatial structure. The performance of current models for the detection of self-generated optic flow depends critically on these very parameters, but despite this, animals manage to successfully navigate within a broad range of scenes. Within global scenes local areas with more salient features are common. Recent work has highlighted the influence that local, salient features have on the encoding of optic flow, but it has been difficult to quantify how local transient responses affect responses to subsequent features and thus contribute to the global neural response. To investigate this in more detail we used experimenter-designed stimuli and recorded intracellularly from motion-sensitive neurons. We limited the stimulus to a small vertically elongated strip, to investigate local and global neural responses to pairs of local “doublet” features that were designed to interact with each other in the temporal and spatial domain. We show that the passage of a high-contrast doublet feature produces a complex transient response from local motion detectors consistent with predictions of a simple computational model. In the neuron, the passage of a high-contrast feature induces a local reduction in responses to subsequent low-contrast features. However, this neural contrast gain reduction appears to be recruited only when features stretch vertically (i.e., orthogonal to the direction of motion) across at least several aligned neighboring ommatidia. Horizontal displacement of the components of elongated features abolishes the local adaptation effect. It is thus likely that features in natural scenes with vertically aligned edges, such as tree trunks, recruit the greatest amount of response suppression. This property could emphasize the local responses to such features vs. those in nearby texture within the scene. PMID:23087617
Westerdale, John; Belohlavek, Marek; McMahon, Eileen M; Jiamsripong, Panupong; Heys, Jeffrey J; Milano, Michele
2011-02-01
We performed an in vitro study to assess the precision and accuracy of particle imaging velocimetry (PIV) data acquired using a clinically available portable ultrasound system via comparison with stereo optical PIV. The performance of ultrasound PIV was compared with optical PIV on a benchmark problem involving vortical flow with a substantial out-of-plane velocity component. Optical PIV is capable of stereo image acquisition, thus measuring out-of-plane velocity components. This allowed us to quantify the accuracy of ultrasound PIV, which is limited to in-plane acquisition. The system performance was assessed by considering the instantaneous velocity fields without extracting velocity profiles by spatial averaging. Within the 2-dimensional correlation window, using 7 time-averaged frames, the vector fields were found to have correlations of 0.867 in the direction along the ultrasound beam and 0.738 in the perpendicular direction. Out-of-plane motion of greater than 20% of the in-plane vector magnitude was found to increase the SD by 11% for the vectors parallel to the ultrasound beam direction and 8.6% for the vectors perpendicular to the beam. The results show a close correlation and agreement of individual velocity vectors generated by ultrasound PIV compared with optical PIV. Most of the measurement distortions were caused by out-of-plane velocity components.
Coherent modulation of stimulus colour can affect visually induced self-motion perception.
Nakamura, Shinji; Seno, Takeharu; Ito, Hiroyuki; Sunaga, Shoji
2010-01-01
The effects of dynamic colour modulation on vection were investigated to examine whether perceived variation of illumination affects self-motion perception. Participants observed expanding optic flow that simulated forward self-motion. Onset latency, accumulated duration, and estimated magnitude of the self-motion were measured as indices of vection strength. The colour of the dots in the visual stimulus was modulated between white and red (experiment 1), white and grey (experiment 2), and grey and red (experiment 3). The results indicated that coherent colour oscillation in the visual stimulus significantly suppressed vection strength, whereas incoherent or static colour modulation did not affect vection. The type of colour modulation had no effect; both achromatic and chromatic modulations proved effective in inhibiting self-motion perception. Moreover, in a situation where the simulated direction of a spotlight was manipulated dynamically, vection strength was also suppressed (experiment 4). These results suggest that the observer's perception of illumination is critical for self-motion perception, and that rapid variation of perceived illumination impairs the reliability of visual information for determining self-motion.
Object Recognition in Flight: How Do Bees Distinguish between 3D Shapes?
Werner, Annette; Stürzl, Wolfgang; Zanker, Johannes
2016-01-01
Honeybees (Apis mellifera) discriminate multiple object features such as colour, pattern and 2D shape, but it remains unknown whether and how bees recover three-dimensional shape. Here we show that bees can recognize objects by their three-dimensional form, whereby they employ an active strategy to uncover the depth profiles. We trained individual, free flying honeybees to collect sugar water from small three-dimensional objects made of styrofoam (sphere, cylinder, cuboids) or folded paper (convex, concave, planar) and found that bees can easily discriminate between these stimuli. We also tested possible strategies employed by the bees to uncover the depth profiles. For the card stimuli, we excluded overall shape and pictorial features (shading, texture gradients) as cues for discrimination. Lacking sufficient stereo vision, bees are known to use speed gradients in optic flow to detect edges; could the bees apply this strategy also to recover the fine details of a surface depth profile? Analysing the bees’ flight tracks in front of the stimuli revealed specific combinations of flight maneuvers (lateral translations in combination with yaw rotations), which are particularly suitable to extract depth cues from motion parallax. We modelled the generated optic flow and found characteristic patterns of angular displacement corresponding to the depth profiles of our stimuli: optic flow patterns from pure translations successfully recovered depth relations from the magnitude of angular displacements, additional rotation provided robust depth information based on the direction of the displacements; thus, the bees flight maneuvers may reflect an optimized visuo-motor strategy to extract depth structure from motion signals. The robustness and simplicity of this strategy offers an efficient solution for 3D-object-recognition without stereo vision, and could be employed by other flying insects, or mobile robots. PMID:26886006
Competitive Dynamics in MSTd: A Mechanism for Robust Heading Perception Based on Optic Flow
Layton, Oliver W.; Fajen, Brett R.
2016-01-01
Human heading perception based on optic flow is not only accurate but also remarkably robust and stable. These qualities are especially apparent when observers move through environments containing other moving objects, which introduce optic flow that is inconsistent with observer self-motion and therefore uninformative about heading direction. Moving objects may also occupy large portions of the visual field and occlude regions of the background optic flow that are most informative about heading perception. The fact that heading perception is biased by no more than a few degrees under such conditions attests to the robustness of the visual system and warrants further investigation. The aim of the present study was to investigate whether recurrent, competitive dynamics among MSTd neurons that serve to reduce uncertainty about heading over time offer a plausible mechanism for capturing the robustness of human heading perception. Simulations of existing heading models that do not contain competitive dynamics yield heading estimates that are far more erratic and unstable than human judgments. We present a dynamical model of primate visual areas V1, MT, and MSTd based on that of Layton, Mingolla, and Browning; it is similar to the other models except that it includes recurrent interactions among model MSTd neurons. Competitive dynamics stabilize the model's heading estimate over time, even when a moving object crosses the future path. Soft winner-take-all dynamics enhance units that code a heading direction consistent with the time history and suppress responses to transient changes to the optic flow field. Our findings support recurrent competitive temporal dynamics as a crucial mechanism underlying the robustness and stability of heading perception. PMID:27341686
On-the-fly cross flow laser guided separation of aerosol particles
NASA Astrophysics Data System (ADS)
Lall, A. A.; Terray, A.; Hart, S. J.
2010-08-01
Laser separation of particles is achieved using forces resulting from the momentum exchange between particles and photons constituting the laser radiation. Particles can experience different optical forces depending on their size and/or optical properties, such as refractive index. Thus, particles can move at different speeds in the presence of an optical force, leading to spatial separations. Several studies for aqueous suspension of particles have been reported in the past. In this paper, we present extensive analysis for optical forces on non-absorbing aerosol particles. We used a loosely focused Gaussian 1064 nm laser to simultaneously hold and deflect particles entrained in flow perpendicular to their direction of travel. The gradient force is used to hold the particles against the viscous drag for a short period of time. The scattering force simultaneously pushes the particles during this period. Theoretical calculations are used to simulate particle trajectories and to determine the net deflection: a measure of the ability to separate. We invented a novel method for aerosol generation and delivery to the flow cell. Particle motion was imaged using a high speed camera working at 3000+ frames per second with a viewing area up to a few millimeters. An 8W near-infrared 1064 nm laser was used to provide the optical force to the particles. Theoretical predictions were corroborated with measurements using polystyrene latex particles of 20 micron diameter. We measured particle deflections up to about 1500 microns. Such large deflections represent a new milestone for optical chromatography in the gas phase.
A characterization of Parkinson's disease by describing the visual field motion during gait
NASA Astrophysics Data System (ADS)
Trujillo, David; Martínez, Fabio; Atehortúa, Angélica; Alvarez, Charlens; Romero, Eduardo
2015-12-01
An early diagnosis of Parkinson's disease (PD) is crucial for devising successful rehabilitation programs. Typically, PD diagnosis is performed by characterizing typical symptoms, namely bradykinesia, rigidity, tremor, postural instability, or freezing of gait. However, traditional examination tests are usually incapable of detecting slight motor changes, especially in the early stages of the pathology. Recently, eye movement abnormalities have been correlated with the early onset of some neurodegenerative disorders. This work introduces a new characterization of Parkinson's disease by describing ocular motion during a common daily activity such as gait. This paper proposes a fully automatic eye motion analysis using dense optical flow to track the ocular direction. The eye motion is then summarized using orientation histograms constructed over a whole gait cycle. The proposed approach was evaluated by measuring the χ2 distance between the orientation histograms, showing substantial differences between control subjects and PD patients.
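The histogram summary described above (flow directions binned over a gait cycle and compared with the χ2 distance) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the bin count and the magnitude weighting are assumptions.

```python
import numpy as np

def orientation_histogram(flow_u, flow_v, n_bins=8):
    """Summarize a dense optical-flow field as a normalized histogram of motion directions."""
    angles = np.arctan2(flow_v, flow_u)        # direction of each flow vector, in [-pi, pi]
    magnitudes = np.hypot(flow_u, flow_v)      # weight each vector's bin by its motion magnitude
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi),
                           weights=magnitudes)
    total = hist.sum()
    return hist / total if total > 0 else hist  # normalize so histograms are comparable

def chi2_distance(h1, h2, eps=1e-12):
    """Chi-squared distance between two normalized histograms (0 = identical)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
```

Two uniform flow fields moving in different directions fall into disjoint bins, so their χ2 distance is maximal (1.0 for normalized histograms), while identical fields give 0.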
A marker-free system for the analysis of movement disabilities.
Legrand, L; Marzani, F; Dusserre, L
1998-01-01
A major step toward improving the treatment of disabled persons may be achieved by using motion analysis equipment. We are developing such a system. It allows the analysis of planar human motion (e.g. gait) without using marker tracking. The system is composed of one fixed camera which acquires an image sequence of a human in motion. Processing is then divided into two steps: first, a large number of pixels belonging to the boundaries of the human body are extracted at each acquisition time. Second, a two-dimensional model of the human body, based on tapered superquadrics, is successively matched with the sets of pixels previously extracted; a specific fuzzy clustering process is used for this purpose. Moreover, an optical flow procedure gives a prediction of the model location at each acquisition time from its location at the previous time. Finally, we present some results of this process applied to a leg in motion.
NASA Astrophysics Data System (ADS)
Wu, Jianping; Lu, Fei; Zou, Kai; Yan, Hong; Wan, Min; Kuang, Yan; Zhou, Yanqing
2018-03-01
This paper presents a high-precision stabilization control technique for active-optics image-motion compensation at ultra-high angular velocities in a small-aperture system. The image-blur problem caused by relative motion of several hundred degrees per second between the imaging system and the target is analyzed theoretically. A velocity-match model of the detection system and the active-optics compensation system is built, and the parameters of an active-optics image-motion compensation platform experiment are designed. High-precision control of optics compensation at these velocities is studied and implemented. At a relative motion velocity of up to 250°/s, the image-motion amplitude exceeds 20 pixels; after active-optics compensation, motion blur is less than one pixel. This overcomes the bottleneck of combining ultra-high angular velocity with long exposure times in search and infrared detection systems.
Translation and Rotation Trade Off in Human Visual Heading Estimation
NASA Technical Reports Server (NTRS)
Stone, Leland S.; Perrone, John A.; Null, Cynthia H. (Technical Monitor)
1996-01-01
We have previously shown that, during simulated curvilinear motion, humans can make reasonably accurate and precise heading judgments from optic flow without either oculomotor or static-depth cues about rotation. We now systematically investigate the effect of varying the parameters of self-motion. We visually simulated 400 ms of self-motion along curved paths (constant rotation and translation rates, fixed retinocentric heading) towards two planes of random dots at 10.3 m and 22.3 m at mid-trial. Retinocentric heading judgments of 4 observers (2 naive) were measured for 12 different combinations of translation (T between 4 and 16 m/s) and rotation (R either 8 or 16 deg/s). In the range tested, heading bias and uncertainty decrease quasilinearly with T/R, but the bias also appears to depend on R. If depth is held constant, the ratio T/R can account for much of the variation in the accuracy and precision of human visual heading estimation, although further experiments are needed to resolve whether absolute rotation rate, total flow rate, or some other factor can account for the observed -2 deg shift between the bias curves.
Sims, J A; Giorgi, M C; Oliveira, M A; Meneghetti, J C; Gutierrez, M A
2018-04-01
We extract directional information related to left ventricular (LV) rotation and torsion from a 4D PET motion field using the Discrete Helmholtz-Hodge Decomposition (DHHD). Synthetic motion fields were created by superposing rotational and radial field components, and cardiac fields were produced using optical flow from control and patient images. These were decomposed into curl-free (CF) and divergence-free (DF) components using the DHHD. Synthetic radial components were present in the CF field and synthetic rotational components in the DF field, with each retaining its center position, direction of motion, and diameter after decomposition. The directions of rotation at the apex and base of the control field were opposite during systole, reversing during diastole. The patient DF field had little overall rotation, with several small rotators. The decomposition of the LV motion field into directional components could assist quantification of LV torsion, but further processing stages seem necessary. Copyright © 2017 Elsevier Ltd. All rights reserved.
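The separation that the DHHD targets can be illustrated on a regular grid with finite-difference divergence and curl: a radial (curl-free) field registers only in the divergence, a rotational (divergence-free) field only in the curl. A minimal sketch under those assumptions, not the authors' discrete decomposition:

```python
import numpy as np

def divergence_curl(u, v, dx=1.0, dy=1.0):
    """Pointwise divergence and scalar curl of a 2-D vector field (u, v) on a grid.
    Rows are y, columns are x; np.gradient returns the axis-0 derivative first."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    div = du_dx + dv_dy    # nonzero for the curl-free (radial) component
    curl = dv_dx - du_dy   # nonzero for the divergence-free (rotational) component
    return div, curl
```

For the linear test fields below the finite differences are exact: the radial field (x, y) has divergence 2 and curl 0, while the rotational field (-y, x) has divergence 0 and curl 2.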
Broadband boundary effects on Brownian motion.
Mo, Jianyong; Simha, Akarsh; Raizen, Mark G
2015-12-01
Brownian motion of particles in confined fluids is important for many applications, yet the effects of the boundary over a wide range of time scales are still not well understood. We report high-bandwidth, comprehensive measurements of Brownian motion of an optically trapped micrometer-sized silica sphere in water near an approximately flat wall. At short distances we observe anisotropic Brownian motion with respect to the wall. We find that surface confinement not only occurs in the long time scale diffusive regime but also in the short time scale ballistic regime, and the velocity autocorrelation function of the Brownian particle decays faster than that of a particle in bulk fluid. Furthermore, at low frequencies the thermal force loses its color due to the reflected flow from the no-slip boundary. The power spectrum of the thermal force on the particle near a no-slip boundary becomes flat at low frequencies. This detailed understanding of boundary effects on Brownian motion opens a door to developing a 3D microscope using particles as remote sensors.
Oscillatory flow in the cochlea visualized by a magnetic resonance imaging technique.
Denk, W; Keolian, R M; Ogawa, S; Jelinski, L W
1993-02-15
We report a magnetic resonance imaging technique that directly measures motion of cochlear fluids. It uses oscillating magnetic field gradients phase-locked to an external stimulus to selectively visualize and quantify oscillatory fluid motion. It is not invasive, and it does not require optical line-of-sight access to the inner ear. It permits the detection of displacements far smaller than the spatial resolution. The method is demonstrated on a phantom and on living rats. It is projected to have applications for auditory research, for the visualization of vocal tract dynamics during speech and singing, and for determination of the spatial distribution of mechanical relaxations in materials.
Shaking video stabilization with content completion
NASA Astrophysics Data System (ADS)
Peng, Yi; Ye, Qixiang; Liu, Yanmei; Jiao, Jianbin
2009-01-01
A new stabilization algorithm to counterbalance shaking motion in a video, based on the classical Kanade-Lucas-Tomasi (KLT) method, is presented in this paper. Feature points are evaluated with the law of large numbers and a clustering algorithm to reduce the side effects of the moving foreground. An analysis of changes in motion direction is also carried out to detect the presence of shaking. For video clips with detected shaking, an affine transformation is performed to warp the current frame to the reference one. In addition, frame content that goes missing during stabilization is completed with optical flow analysis and a mosaicking operation. Experiments on video clips demonstrate the effectiveness of the proposed algorithm.
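The affine warp to a reference frame described above amounts to a least-squares fit over tracked feature correspondences. A minimal sketch (the function name and the plain least-squares formulation are illustrative, not the paper's implementation):

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2x3 affine transform mapping src points (N, 2) onto dst points (N, 2)."""
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src; A[0::2, 2] = 1.0   # rows for x' = a*x + b*y + tx
    A[1::2, 3:5] = src; A[1::2, 5] = 1.0   # rows for y' = c*x + d*y + ty
    b = dst.reshape(-1)                     # interleaved [x'0, y'0, x'1, y'1, ...]
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)             # [[a, b, tx], [c, d, ty]], usable as a warp matrix
```

Given point pairs generated by a known small rotation plus translation, the fit recovers that transform exactly; with noisy KLT tracks it returns the least-squares best fit instead.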
NASA Astrophysics Data System (ADS)
Karam, Pascal; Pennathur, Sumita
2016-11-01
Characterization of the electrophoretic mobility and zeta potential of micro and nanoparticles is important for assessing properties such as stability, charge and size. In electrophoretic techniques for such characterization, the bulk fluid motion due to the interaction between the fluid and the charged surface must be accounted for. Unlike current industrial systems, which rely on DLS and oscillating potentials to mitigate electroosmotic flow (EOF), we propose a simple alternative electrophoretic method for optically determining electrophoretic mobility using a DC electric field. Specifically, we create a system where an adverse pressure gradient counters EOF, and design the geometry of the channel so that the flow profile of the pressure-driven flow matches that of the EOF in large regions of the channel (i.e., where we observe particle flow). Our specific COMSOL-optimized geometry is two large cross-sectional areas adjacent to a central, high-aspect-ratio channel. We show that this effectively removes EOF from a large region of the channel and allows for the accurate optical characterization of electrophoretic particle mobility, regardless of the wall charge or particle size.
Effect of travel speed on the visual control of steering toward a goal.
Chen, Rongrong; Niehorster, Diederick C; Li, Li
2018-03-01
Previous studies have proposed that people can use visual cues such as the instantaneous direction (i.e., heading) or future path trajectory of travel specified by optic flow or target visual direction in egocentric space to steer or walk toward a goal. In the current study, we examined what visual cues people use to guide their goal-oriented locomotion and whether their reliance on such visual cues changes as travel speed increases. We presented participants with optic flow displays that simulated their self-motion toward a target at various travel speeds under two viewing conditions in which we made target egocentric direction available or unavailable for steering. We found that for both viewing conditions, participants did not steer along a curved path toward the target such that the actual and the required path curvature to reach the target would converge when approaching the target. At higher travel speeds, participants showed a faster and larger reduction in target-heading angle and more accurate and precise steady-state control of aligning their heading specified by optic flow with the target. These findings support the claim that people use heading and target egocentric direction but not path for goal-oriented locomotion control, and their reliance on heading increases at higher travel speeds. The increased reliance on heading for goal-oriented locomotion control could be due to an increased reliability in perceiving heading from optic flow as the magnitude of flow increases with travel speed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Mechanistic basis of otolith formation during teleost inner ear development
Wu, David; Freund, Jonathan B.; Fraser, Scott E.; Vermot, Julien
2011-01-01
Otoliths, which are connected to stereociliary bundles in the inner ear, serve as inertial sensors for balance. In teleosts, otolith development is critically dependent on flow forces generated by beating cilia; however, the mechanism by which flow controls otolith formation remains unclear. Here, we have developed a non-invasive flow probe using optical tweezers and a viscous flow model in order to demonstrate how the observed hydrodynamics influence otolith assembly. We show that rotational flow stirs and suppresses precursor agglomeration in the core of the cilia-driven vortex. The velocity field correlates with the shape of the otolith, and we provide evidence that hydrodynamics is actively involved in controlling otolith morphogenesis. An implication of this hydrodynamic effect is that otolith self-assembly is mediated by the balance between Brownian motion and cilia-driven flow. More generally, this flow feature highlights an alternative biological strategy for controlling particle localization in solution. PMID:21316594
Active contour-based visual tracking by integrating colors, shapes, and motions.
Hu, Weiming; Zhou, Xue; Li, Wei; Luo, Wenhan; Zhang, Xiaoqin; Maybank, Stephen
2013-05-01
In this paper, we present a framework for active contour-based visual tracking using level sets. The main components of our framework include contour-based tracking initialization, color-based contour evolution, adaptive shape-based contour evolution for non-periodic motions, dynamic shape-based contour evolution for periodic motions, and the handling of abrupt motions. For the initialization of contour-based tracking, we develop an optical flow-based algorithm for automatically initializing contours at the first frame. For the color-based contour evolution, Markov random field theory is used to measure correlations between values of neighboring pixels for posterior probability estimation. For adaptive shape-based contour evolution, the global shape information and the local color information are combined to hierarchically evolve the contour, and a flexible shape updating model is constructed. For the dynamic shape-based contour evolution, a shape mode transition matrix is learnt to characterize the temporal correlations of object shapes. For the handling of abrupt motions, particle swarm optimization is adopted to capture the global motion which is applied to the contour in the current frame to produce an initial contour in the next frame.
FPGA-based architecture for motion recovering in real-time
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Maya-Rueda, Selene E.; Torres-Huitzil, Cesar
2002-03-01
A key problem in the computer vision field is the measurement of object motion in a scene. The main goal is to compute an approximation of the 3D motion from the analysis of an image sequence. Once computed, this information can be used as a basis to reach higher level goals in different applications. Motion estimation algorithms pose a significant computational load for sequential processors, limiting their use in practical applications. In this work we propose a hardware architecture for real-time motion estimation based on FPGA technology. The technique used for motion estimation is optical flow, due to its accuracy and the density of its velocity estimates; however, other techniques are being explored. The architecture is composed of parallel modules working in a pipeline scheme to reach high throughput rates near gigaflops. The modules are organized in a regular structure to provide a high degree of flexibility to cover different applications. Some results are presented, and the real-time performance is discussed and analyzed. The architecture is prototyped on an FPGA board with a Virtex device interfaced to a digital imager.
Optical flow estimation on image sequences with differently exposed frames
NASA Astrophysics Data System (ADS)
Bengtsson, Tomas; McKelvey, Tomas; Lindström, Konstantin
2015-09-01
Optical flow (OF) methods are used to estimate dense motion information between consecutive frames in image sequences. In addition to the specific OF estimation method itself, the quality of the input image sequence is of crucial importance to the quality of the resulting flow estimates. For instance, lack of texture in image frames caused by saturation of the camera sensor during exposure can significantly deteriorate the performance. An approach to avoid this negative effect is to use different camera settings when capturing the individual frames. We provide a framework for OF estimation on such sequences that contain differently exposed frames. Information from multiple frames is combined into a total cost functional such that the lack of an active data term for saturated image areas is avoided. Experimental results demonstrate that using alternate camera settings to capture the full dynamic range of an underlying scene can clearly improve the quality of flow estimates. When saturation of image data is significant, the proposed methods show superior performance in terms of lower endpoint errors of the flow vectors compared to a set of baseline methods. Furthermore, we provide some qualitative examples of how and when our method should be used.
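The idea of deactivating the data term in saturated image areas can be illustrated with a per-pixel validity mask; the intensity thresholds below are placeholders, not values from the paper.

```python
import numpy as np

def exposure_weights(frame, low=0.02, high=0.98):
    """1.0 where a pixel (intensity in [0, 1]) is well exposed, 0.0 where it is
    saturated near black or near white; saturated pixels contribute no data term."""
    return ((frame > low) & (frame < high)).astype(float)

def combined_weights(frames, low=0.02, high=0.98):
    """Per-pixel count of differently exposed frames that are usable at that location;
    a data term weighted this way stays active wherever any exposure is valid."""
    return sum(exposure_weights(f, low, high) for f in frames)
```

A dark exposure and a bright exposure of the same scene typically saturate in different regions, so their combined weight map has no holes where both fail simultaneously.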
Magnetomotive laser speckle imaging
Kim, Jeehyun; Oh, Junghwan; Choi, Bernard
2010-01-01
Laser speckle imaging (LSI) involves analysis of reflectance images collected during coherent optical excitation of an object to compute wide-field maps of tissue blood flow. An intrinsic limitation of LSI for resolving microvascular architecture is that its signal depends on relative motion of interrogated red blood cells. Hence, with LSI, small-diameter arterioles, venules, and capillaries are difficult to resolve due to the slow flow speeds associated with such vasculature. Furthermore, LSI characterization of subsurface blood flow is subject to blurring due to scattering, further limiting the ability of LSI to resolve or quantify blood flow in small vessels. Here, we show that magnetic activation of superparamagnetic iron oxide (SPIO) nanoparticles modulates the speckle flow index (SFI) values estimated from speckle contrast analysis of collected images. With application of an ac magnetic field to a solution of stagnant SPIO particles, an apparent increase in SFI is induced. Furthermore, with application of a focused dc magnetic field, a focal decrease in SFI values is induced. Magnetomotive LSI may enable wide-field mapping of suspicious tissue regions, enabling subsequent high-resolution optical interrogation of these regions. Similarly, subsequent photoactivation of intravascular SPIO nanoparticles could then be performed to induce selective photothermal destruction of unwanted vasculature. PMID:20210436
Monitoring of fluid motion in a micromixer by dynamic NMR microscopy.
Ahola, Susanna; Casanova, Federico; Perlo, Juan; Münnemann, Kerstin; Blümich, Bernhard; Stapf, Siegfried
2006-01-01
The velocity distribution of liquid flowing in a commercial micromixer has been determined directly by using pulsed-field gradient NMR. Velocity maps with a spatial resolution of 29 microm x 43 microm were obtained by combining standard imaging gradient units with a homebuilt rectangular surface coil matching the mixer geometry. The technique provides access to mixers and reactors of arbitrary shape regardless of optical transparency. Local heterogeneities in the signal intensity and the velocity pattern were found and serve to investigate the quality and functionality of a micromixer, revealing clogging and inhomogeneous flow distributions.
Studying Turbulence Using Numerical Simulation Databases, 2. Proceedings of the 1988 Summer Program
NASA Technical Reports Server (NTRS)
1988-01-01
The focus of the program was on the use of direct numerical simulations of turbulent flow for study of turbulence physics and modeling. A special interest was placed on turbulent mixing layers. The required data for these investigations were generated from four newly developed codes for simulation of time and spatially developing incompressible and compressible mixing layers. Also of interest were the structure of wall bounded turbulent and transitional flows, evaluation of diagnostic techniques for detection of organized motions, energy transfer in isotropic turbulence, optical propagation through turbulent media, and detailed analysis of the interaction of vortical structures.
Real-time eye motion correction in phase-resolved OCT angiography with tracking SLO
Braaf, Boy; Vienola, Kari V.; Sheehy, Christy K.; Yang, Qiang; Vermeer, Koenraad A.; Tiruveedhula, Pavan; Arathorn, David W.; Roorda, Austin; de Boer, Johannes F.
2012-01-01
In phase-resolved OCT angiography blood flow is detected from phase changes in between A-scans that are obtained from the same location. In ophthalmology, this technique is vulnerable to eye motion. We address this problem by combining inter-B-scan phase-resolved OCT angiography with real-time eye tracking. A tracking scanning laser ophthalmoscope (TSLO) at 840 nm provided eye tracking functionality and was combined with a phase-stabilized optical frequency domain imaging (OFDI) system at 1040 nm. Real-time eye tracking corrected eye drift and prevented discontinuity artifacts from (micro)saccadic eye motion in OCT angiograms. This improved the OCT spot stability on the retina and consequently reduced the phase-noise, thereby enabling the detection of slower blood flows by extending the inter-B-scan time interval. In addition, eye tracking enabled the easy compounding of multiple data sets from the fovea of a healthy volunteer to create high-quality eye motion artifact-free angiograms. High-quality images are presented of two distinct layers of vasculature in the retina and the dense vasculature of the choroid. Additionally we present, for the first time, a phase-resolved OCT angiogram of the mesh-like network of the choriocapillaris containing typical pore openings. PMID:23304647
A novel rheo-optical device for studying complex fluids in a double shear plate geometry.
Boitte, Jean-Baptiste; Vizcaïno, Claude; Benyahia, Lazhar; Herry, Jean-Marie; Michon, Camille; Hayert, Murielle
2013-01-01
A new rheo-optical shearing device was designed to investigate the structural evolution of complex materials under shear flow. Seeking to keep the area under study constantly within the field of vision, it was conceived to produce shear flow by relying on the uniaxial translation of two parallel plates. The device features three modes of translation motion: step strain (0.02-320), constant shear rate (0.01-400 s-1), and oscillation (0.01-20 Hz) flow. Because the temperature is controlled by using a Peltier module coupled with a water cooling system, temperatures can range from 10 to 80 °C. The sample is loaded onto a user-friendly plate on which standard glasses can be attached with a depression vacuum pump. The principal innovation of the proposed rheo-optical shearing device lies in the fact that this suction system renders the microscopy glasses one with the plates, thereby ensuring their perfect planarity and parallelism. The gap width between the two plates can range from 0 to 5 mm. The device was designed to fit on any inverted confocal laser scanning microscope. In terms of controlled deformation, the conception and technical solutions achieve a high level of accuracy. Moreover, user-friendly software has been developed to control both shear flow parameters and temperature. The validation of specifications as well as the three modes of motion was carried out, first without a sample, and then by tracking fluorescent particles in a model system, in our case a micro-gel. Real values agreed well with those we targeted. In addition, an experiment with bread dough deformation under shear flow was initiated to gain some insight into the potential use of our device. These results show that the RheOptiCAD® promises to be a useful tool to better understand, from both a fundamental and an industrial point of view, the rheological behavior of the microstructure of complex fluids under controlled thermo-mechanical parameters in the case of food and non-food systems.
NASA Astrophysics Data System (ADS)
Minchew, B. M.; Simons, M.; Riel, B.; Milillo, P.
2017-01-01
To better understand the influence of stress changes over floating ice shelves on grounded ice streams, we develop a Bayesian method for inferring time-dependent 3-D surface velocity fields from synthetic aperture radar (SAR) and optical remote sensing data. Our specific goal is to observe ocean tide-induced variability in vertical ice shelf position and horizontal ice stream flow. Thus, we consider the special case where observed surface displacement at a given location can be defined by a 3-D secular velocity vector, a family of 3-D sinusoidal functions, and a correction to the digital elevation model used to process the SAR data. Using nearly 9 months of SAR data collected from multiple satellite viewing geometries with the COSMO-SkyMed 4-satellite constellation, we infer the spatiotemporal response of Rutford Ice Stream, West Antarctica, to ocean tidal forcing. Consistent with expected tidal uplift, inferred vertical motion over the ice shelf is dominated by semidiurnal and diurnal tidal constituents. Horizontal ice flow variability, on the other hand, occurs primarily at the fortnightly spring-neap tidal period (Msf). We propose that periodic grounding of the ice shelf is the primary mechanism for translating vertical tidal motion into horizontal flow variability, causing ice flow to accelerate first and most strongly over the ice shelf. Flow variations then propagate through the grounded ice stream at a mean rate of ∼29 km/d and decay quasi-linearly with distance over ∼85 km upstream of the grounding zone.
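The displacement model described above (a secular velocity plus sinusoids at known tidal frequencies) is linear in its unknowns once the constituent frequencies are fixed, so its core can be sketched as an ordinary least-squares fit. This is a 1-D illustrative sketch only, not the paper's actual method (which is Bayesian, fully 3-D, and includes a DEM correction); the M2 and Msf periods below are standard approximate values:

```python
import numpy as np

# Approximate tidal angular frequencies (rad/day): M2 (semidiurnal, ~12.42 h)
# and Msf (fortnightly, ~14.77 d); illustrative, not the paper's full set.
OMEGAS = [2 * np.pi / 0.5175, 2 * np.pi / 14.77]

def fit_secular_plus_tides(t, d):
    """Least-squares fit of d(t) = v*t + c + sum_i (a_i cos w_i t + b_i sin w_i t)."""
    cols = [t, np.ones_like(t)]
    for w in OMEGAS:
        cols += [np.cos(w * t), np.sin(w * t)]
    G = np.column_stack(cols)
    m, *_ = np.linalg.lstsq(G, d, rcond=None)
    return m  # [v, c, a1, b1, a2, b2]

# Synthetic check: 1.2 m/day secular rate plus a 0.3 m Msf oscillation
t = np.linspace(0, 60, 2000)
d = 1.2 * t + 0.3 * np.sin(OMEGAS[1] * t)
m = fit_secular_plus_tides(t, d)
```

Because the synthetic series lies exactly in the model's column space, the fit recovers the secular rate and the Msf sine amplitude essentially exactly.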
Target detection in insects: optical, neural and behavioral optimizations.
Gonzalez-Bellido, Paloma T; Fabian, Samuel T; Nordström, Karin
2016-12-01
Motion vision provides important cues for many tasks. Flying insects, for example, may pursue small, fast-moving targets for mating or feeding purposes, even when these are detected against self-generated optic flow. Since insects are small, with size-constrained eyes and brains, they have evolved optimized optical, neural and behavioral solutions for target visualization. Indeed, even though evolutionarily distant insects display different pursuit strategies, their target neuron physiology is strikingly similar. Furthermore, the coarse spatial resolution of the insect compound eye might actually be beneficial when it comes to detection of moving targets. In conclusion, tiny insects show higher than expected performance in target visualization tasks. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Sensing of minute airflow motions near walls using pappus-type nature-inspired sensors
Mikulich, Vladimir
2017-01-01
This work describes the development and use of pappus-like structures as sensitive sensors to detect minute airflow motions. We made such sensors from pappi taken from nature-grown seeds, whose filiform hairs' length scale is suitable for the study of large-scale turbulent convection flows. The stem with the pappus on top is fixed on an elastic membrane on the wall and tilts under wind load in proportion to the velocity magnitude in the direction of the wind, similar to the biological sensory hairs found in spiders; here, however, the sensory hair has multiple filiform protrusions at the tip. Because the sensor response is proportional to the drag on the tip and a low mass ensures a larger bandwidth, lightweight pappus structures similar to those found in nature, with documented large drag, are useful for improving the response of artificial sensors. The pappus of a dandelion represents such a structure, which has evolved to maximize wind-driven dispersion; it is therefore used here as the head of our sensor. Because of its multiple hairs arranged radially around the stem, it generates uniform drag for all wind directions. While still being permeable to the flow, the hundreds of individual hairs on the tip of the sensor head maximize the drag and minimize the influence of pressure gradients or shear-induced lift forces on the sensor response, as occur with non-permeable protrusions. In addition, the flow disturbance caused by the sensor itself is limited. Optical recording of the head motion allows continuous remote monitoring of the flow fluctuations in direction and magnitude. Application is shown for the measurement of a reference flow under isothermal conditions to detect the early occurrence of instabilities. PMID:28658272
Electro-Optical Platform for the Manipulation of Live Cells
2002-10-02
system, other physical forces may play a significant role. In particular, electroosmotic forces that cause fluid movement relative to a surface can...occur due to the mobility of ions in solution. Electroosmotic forces are commonly utilized in capillary electrophoretic separation, where the capillary...fluid motion that acts to entrain particles to be separated.46 Thus, in the chamber presented here, the patterned anode can induce electroosmotic flow
Video to Text (V2T) in Wide Area Motion Imagery
2015-09-01
microtext) or a document (e.g., using Sphinx or Apache NLP) as an automated approach [102]. Previous work in natural language full-text searching...language processing (NLP)-based module. The heart of the structured text processing module includes the following seven key word banks...Features Tracker MHT Multiple Hypothesis Tracking MIL Multiple Instance Learning NLP Natural Language Processing OAB Online AdaBoost OF Optic Flow
Dynamic analysis of trapping and escaping in dual beam optical trap
NASA Astrophysics Data System (ADS)
Li, Wenqiang; Hu, Huizhu; Su, Heming; Li, Zhenggang; Shen, Yu
2016-10-01
In this paper, we simulate the dynamic movement of a dielectric sphere in an optical trap. This dynamic analysis can be used to calibrate optical forces, increase trapping efficiency and measure the viscous coefficient of the surrounding medium. Since an accurate dynamic analysis is based on a detailed force calculation, we calculate all the forces the sphere experiences. We compute the dual-beam gradient radiation pressure forces on a micron-sized dielectric sphere in the ray optics regime and use the Einstein-Ornstein-Uhlenbeck theory to model its Brownian motion forces. A hydrodynamic viscous force also acts when the sphere moves in liquid. Buoyancy and gravity are taken into consideration as well. We then simulate the trajectory of a sphere subject to all these forces in a dual optical trap. From our dynamic analysis, the sphere can be trapped at an equilibrium point in static water, although it permanently fluctuates around the equilibrium point due to thermal effects. We go a step further to analyze the effects of misalignment of the two optical traps. Trapping and escaping of the sphere in flowing water are also simulated. In flowing water, the sphere is dragged away from the equilibrium point; this dragging distance increases as the optical power decreases, so the sphere escapes when the optical power falls below a threshold. In both the trapping and escaping processes we calculate the forces on and position of the sphere. Finally, we analyze the trapping region in dual optical tweezers.
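The force balance described here (harmonic trap restoring force, Stokes drag, thermal kicks, optional uniform flow) is commonly integrated as an overdamped Langevin equation. A minimal 1-D sketch under assumed parameter values, not the paper's actual simulation; it reproduces the qualitative behavior of fluctuation about the equilibrium point in still water:

```python
import numpy as np

def simulate_trapped_sphere(k=1e-6, radius=1e-6, eta=1e-3, T=300.0,
                            v_flow=0.0, dt=1e-4, steps=20000, seed=0):
    """Overdamped 1-D Langevin simulation of a sphere in a harmonic optical
    trap with stiffness k, Stokes drag, and thermal noise (Euler-Maruyama).
    All parameter values are illustrative, not taken from the paper."""
    kB = 1.380649e-23
    gamma = 6 * np.pi * eta * radius   # Stokes drag coefficient
    D = kB * T / gamma                 # Einstein diffusion coefficient
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 0.0
    for i in range(1, steps):
        drift = (-k * x[i - 1] / gamma + v_flow) * dt
        x[i] = x[i - 1] + drift + np.sqrt(2 * D * dt) * rng.standard_normal()
    return x

x = simulate_trapped_sphere()
```

With flow, the steady-state displacement from the trap center is gamma * v_flow / k, which grows as the trap stiffness (and hence optical power) decreases, consistent with the escape behavior the abstract describes; with v_flow = 0 the sphere simply fluctuates about zero with spread set by the equipartition value sqrt(kB*T/k).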
Ubiquitous and Continuous Propagating Disturbances in the Solar Corona
NASA Astrophysics Data System (ADS)
Morgan, Huw; Hutton, Joseph
2018-02-01
A new processing method applied to Atmospheric Imaging Assembly/Solar Dynamic Observatory observations reveals continuous propagating faint motions throughout the corona. The amplitudes are small, typically 2% of the background intensity. An hour’s data are processed from four AIA channels for a region near disk center, and the motions are characterized using an optical flow method. The motions trace the underlying large-scale magnetic field. The motion vector field describes large-scale coherent regions that tend to converge at narrow corridors. Large-scale vortices can also be seen. The hotter channels have larger-scale regions of coherent motion compared to the cooler channels, interpreted as the typical length of magnetic loops at different heights. Regions of low mean and high time variance in velocity are where the dominant motion component is along the line of sight as a result of a largely vertical magnetic field. The mean apparent magnitude of the optical velocities is a few tens of km s‑1, with different distributions in different channels. Over time, the velocities vary smoothly between a few km s‑1 and 100 km s‑1 or higher, varying on timescales of minutes. A clear bias of a few km s‑1 toward positive x-velocities is due to solar rotation and may be used as calibration in future work. All regions of the low corona thus experience a continuous stream of propagating disturbances at the limit of both spatial resolution and signal level. The method provides a powerful new diagnostic tool for tracing the magnetic field and probing motions at sub-pixel scales, with important implications for models of heating and of the magnetic field.
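An optical flow method of the kind used to characterize these motions can be sketched, in its simplest form, as a single-window Lucas-Kanade estimator of one displacement between consecutive frames. The paper does not specify its exact algorithm, so this is an illustrative stand-in; real pipelines use local windows and pyramids to recover a full velocity field:

```python
import numpy as np

def lucas_kanade_uniform(f0, f1):
    """Least-squares estimate of a single (vx, vy) that best explains the
    brightness change between two frames (one-window Lucas-Kanade sketch)."""
    # Central-difference spatial gradients of f0 and the temporal difference
    Ix = 0.5 * (np.roll(f0, -1, axis=1) - np.roll(f0, 1, axis=1))
    Iy = 0.5 * (np.roll(f0, -1, axis=0) - np.roll(f0, 1, axis=0))
    It = f1 - f0
    # Normal equations of  Ix*vx + Iy*vy + It = 0  over all pixels
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)  # (vx, vy) in pixels per frame

# Smooth synthetic frame pair with a known subpixel shift of (0.3, 0.2) px
y, x = np.mgrid[0:64, 0:64]
f0 = np.sin(2 * np.pi * x / 16.0) + np.cos(2 * np.pi * y / 16.0)
f1 = np.sin(2 * np.pi * (x - 0.3) / 16.0) + np.cos(2 * np.pi * (y - 0.2) / 16.0)
v = lucas_kanade_uniform(f0, f1)
```

On this smooth pattern the estimator recovers the imposed subpixel shift to within a few percent, which is why such methods can probe motions below the pixel scale.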
Civil infrastructure monitoring for IVHS using optical fiber sensors
NASA Astrophysics Data System (ADS)
de Vries, Marten J.; Arya, Vivek; Grinder, C. R.; Murphy, Kent A.; Claus, Richard O.
1995-01-01
Early deployment of Intelligent Vehicle Highway Systems would necessitate the internal instrumentation of infrastructure for emergency preparedness. Existing quantitative analysis and visual analysis techniques are time consuming, cost prohibitive, and are often unreliable. Fiber optic sensors are rapidly replacing conventional instrumentation because of their small size, light weight, immunity to electromagnetic interference, and extremely high information carrying capability. In this paper research on novel optical fiber sensing techniques for health monitoring of civil infrastructure such as highways and bridges is reported. Design, fabrication, and implementation of fiber optic sensor configurations used for measurements of strain are discussed. Results from field tests conducted to demonstrate the effectiveness of fiber sensors at determining quantitative strain vector components near crack locations in bridges are presented. Emerging applications of fiber sensors for vehicle flow, vehicle speed, and weigh-in-motion measurements are also discussed.
NASA Astrophysics Data System (ADS)
Larsen, C. F.; Bartholomaus, T. C.; O'Neel, S.; West, M. E.
2010-12-01
We observe ice motion, calving and seismicity simultaneously and at high resolution on an advancing tidewater glacier in Icy Bay, Alaska. Icy Bay’s tidewater glaciers dominate regional glacier-generated seismicity in Alaska. Yahtse Glacier emanates from the St. Elias Range near the Bering-Bagley-Seward-Malaspina Icefield system, the most extensive glacier cover outside the polar regions. Rapid rates of change and fast flow (>16 m/d near the terminus) at Yahtse Glacier provide a direct analog to the disintegrating outlet systems in Greenland. Our field experiment co-locates GPS receivers and seismometers on the surface of the glacier, with a greater network of bedrock seismometers surrounding the glacier. Time-lapse photogrammetry, fjord wave height sensors, and optical survey methods monitor iceberg calving and ice velocity near the terminus. This suite of geophysical instrumentation enables us to characterize glacier motion and geometry changes while concurrently listening for seismic energy release. We are performing a close examination of calving as a seismic source, and the associated mechanisms of energy transfer to seismic waves. Detailed observations of ice motion (GPS and optical surveying), glacier geometry and iceberg calving (direct observations and time-lapse photogrammetry) have been made in concert with a passive seismic network. Combined, the observations form the basis of a rigorous analysis exploring the relationship between glacier-generated seismic events and motion, glacier-fjord interactions, calving and hydraulics. Our work is designed to demonstrate the applicability and utility of seismology for studying the impact of climate forcing on calving glaciers.
Arterial Mechanical Motion Estimation Based on a Semi-Rigid Body Deformation Approach
Guzman, Pablo; Hamarneh, Ghassan; Ros, Rafael; Ros, Eduardo
2014-01-01
Arterial motion estimation in ultrasound (US) sequences is a hard task due to noise and discontinuities in the signal caused by US artifacts. Characterizing the mechanical properties of the artery is a promising novel imaging technique for diagnosing various cardiovascular pathologies and a new way of obtaining relevant clinical information, such as determining the absence of the dicrotic peak, or estimating the Augmentation Index (AIx), the arterial pressure or the arterial stiffness. One advantage of US imaging is the non-invasive nature of the technique, unlike invasive techniques such as Intravascular Ultrasound (IVUS) or angiography, plus the relatively low cost of US units. In this paper, we propose a semi-rigid deformable method based on soft-body dynamics, realized by a hybrid motion approach combining cross-correlation and optical flow methods, to quantify the elasticity of the artery. We evaluate and compare the different techniques (for instance, optical flow methods) on which our approach is based. The goal of this comparative study is to identify the best model to use and the impact of the accuracy of these different stages on the proposed method. To this end, an exhaustive assessment was conducted to decide which model is the most appropriate for registering the variation of the arterial diameter over time. Our experiments involved a total of 1620 evaluations within nine simulated sequences of 84 frames each and the estimation of four error metrics. We conclude that our proposed approach obtains approximately 2.5 times higher accuracy than conventional state-of-the-art techniques. PMID:24871987
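The cross-correlation stage of such a hybrid motion approach can be sketched as a 1-D normalized cross-correlation search for the displacement of a wall echo between frames. This is a simplified, integer-sample sketch of block matching only; the paper's method additionally uses optical flow and a soft-body model, and the signal here is synthetic:

```python
import numpy as np

def track_shift_ncc(ref, cur, max_shift=10):
    """Integer-sample displacement of `cur` relative to `ref` chosen to
    maximize normalized cross-correlation over a small search range."""
    best, best_score = 0, -np.inf
    n = len(ref)
    for s in range(-max_shift, max_shift + 1):
        a = ref[max_shift:n - max_shift]
        b = cur[max_shift + s:n - max_shift + s]
        a0, b0 = a - a.mean(), b - b.mean()
        score = np.dot(a0, b0) / (np.linalg.norm(a0) * np.linalg.norm(b0))
        if score > best_score:
            best, best_score = s, score
    return best

rng = np.random.default_rng(1)
ref = rng.standard_normal(200)     # synthetic A-line segment around the wall
cur = np.roll(ref, 3)              # simulate the wall echo moving 3 samples
shift = track_shift_ncc(ref, cur)
```

Tracking this shift frame by frame yields the wall position over time, from which the variation of arterial diameter can be registered.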
NASA Astrophysics Data System (ADS)
Gulyaev, P.; Jordan, V.; Gulyaev, I.; Dolmatov, A.
2017-05-01
The paper presents an analysis of recorded tracks of high-velocity emission in an air-argon plasma flow during the breakup of tungsten microdroplets. This new optical emission effect involves two stages. The first includes thermionic emission of electrons from the surface of a melted tungsten droplet of 100-200 μm size and the formation of a charged sphere of 3-5 mm diameter; after the sphere reaches the breakdown electric potential, it collapses, producing a spherical shock wave and luminous radiation. The second stage is a previously unknown physical phenomenon: a narrowly directed energy jet, with velocity exceeding 4000 m/s, from the surface of the tungsten droplet. The luminous spherical collapse and high-velocity jets were recorded using a CMOS photo-array operating in global-shutter charge-storage mode. Features of the CMOS array scanning algorithm shape distinctive signatures of the recorded tracks, which remain invariant under the trace transform (TT) with a specific functional. Series of concentric circles were adopted as the primitive object models (patterns) used in the TT at the spherical collapse stage, and linear segments of fixed thickness at the high-velocity emission stage. Two invariants of the physical object, motion velocity and the optical brightness distribution in the motion front, were adopted as the identification features of tracks. Analytical expressions relating the 2D TT parameters to the physical object's motion invariants were obtained. The equations for the spherical collapse stage correspond to the Radon-Nikodym transform.
Fetsch, Christopher R; Wang, Sentao; Gu, Yong; Deangelis, Gregory C; Angelaki, Dora E
2007-01-17
Heading perception is a complex task that generally requires the integration of visual and vestibular cues. This sensory integration is complicated by the fact that these two modalities encode motion in distinct spatial reference frames (visual, eye-centered; vestibular, head-centered). Visual and vestibular heading signals converge in the primate dorsal subdivision of the medial superior temporal area (MSTd), a region thought to contribute to heading perception, but the reference frames of these signals remain unknown. We measured the heading tuning of MSTd neurons by presenting optic flow (visual condition), inertial motion (vestibular condition), or a congruent combination of both cues (combined condition). Static eye position was varied from trial to trial to determine the reference frame of tuning (eye-centered, head-centered, or intermediate). We found that tuning for optic flow was predominantly eye-centered, whereas tuning for inertial motion was intermediate but closer to head-centered. Reference frames in the two unimodal conditions were rarely matched in single neurons and uncorrelated across the population. Notably, reference frames in the combined condition varied as a function of the relative strength and spatial congruency of visual and vestibular tuning. This represents the first investigation of spatial reference frames in a naturalistic, multimodal condition in which cues may be integrated to improve perceptual performance. Our results compare favorably with the predictions of a recent neural network model that uses a recurrent architecture to perform optimal cue integration, suggesting that the brain could use a similar computational strategy to integrate sensory signals expressed in distinct frames of reference.
Noninvasive detection of cardiovascular pulsations by optical Doppler techniques
NASA Astrophysics Data System (ADS)
Hong, HyunDae; Fox, Martin D.
1997-10-01
A system has been developed based on the measurement of skin surface vibration that can be used to detect the underlying vascular wall motion of superficial arteries and the chest wall. Data obtained from tissue phantoms suggested that the detected signals were related to intravascular pressure, an important clinical and physiological parameter. Unlike the conventional optical Doppler techniques that have been used to measure blood perfusion in skin layers and blood flow within superficial arteries, the present system was optimized to pick up skin vibrations. An optical interferometer with a 633-nm He-Ne laser was utilized to detect micrometer displacements of the skin surface. Motion velocity profiles of the skin surface near each superficial artery and at auscultation points on the chest for the two heart valve sounds exhibited distinctive profiles. The theoretical and experimental results demonstrated that the system detected the velocity of skin movement, which is related to the time derivative of the pressure. The system also reduces the loading effect on the pulsation signals and heart sounds produced by the conventional piezoelectric vibration sensors. The system's sensitivity, which could be optimized further, was 366.2 μm/s for the present research. Overall, optical cardiovascular vibrometry has the potential to become a simple noninvasive approach to cardiovascular screening.
Noise induced chaos in optically driven colloidal rings.
NASA Astrophysics Data System (ADS)
Roichman, Yael; Zaslavsky, George; Grier, David G.
2007-03-01
Given a constant flux of energy, many driven dissipative systems rapidly organize themselves into configurations that support steady-state motion. Examples include swarming of bacterial colonies, convection in shaken sandpiles, and synchronization in flowing traffic. How simple objects interacting in simple ways self-organize is generally not understood, mainly because so few of the available experimental systems afford the necessary access to their microscopic degrees of freedom. This talk introduces a new class of model driven dissipative systems typified by three colloidal spheres circulating around a ring-like optical trap known as an optical vortex. By controlling the interplay between hydrodynamic interactions and fixed disorder we are able to drive a transition from a previously predicted periodic steady state to fully developed chaos. In addition, by tracking both microscopic trajectories and macroscopic collective fluctuations, the relation between the onset of microscopic weak chaos and the evolution of space-time self-similarity in macroscopic transport properties is revealed. In a broader scope, several optical vortices can be coupled to create a large dissipative system in which each building block has internal degrees of freedom. In such systems the little-understood dynamics of processes like frustration and jamming, fluctuation-dissipation relations, and the propagation of collective motion can be tracked microscopically.
NASA Technical Reports Server (NTRS)
Ruttley, T; Marshburn, A.; Bloomberg, J. J.; Mulavara, A. P.; Richards, J. T.; Nomura, Y.
2005-01-01
The goal of the present study was to investigate the adaptive effects of variation in the direction of optic flow, experienced during linear treadmill walking, on modifying locomotor trajectory. Subjects (n = 30) walked on a motorized linear treadmill at 4.0 kilometers per hour for 24 minutes while viewing the interior of a 3D virtual scene projected onto a screen 1.5 m in front of them. The virtual scene depicted constant self-motion equivalent to either 1) walking around the perimeter of a room to one's left (Rotating Room group) or 2) walking down the center of a hallway (Infinite Hallway group). The scene was static for the first 4 minutes, and then constant-rate self-motion was simulated for the remaining 20 minutes. Before and after the treadmill locomotion adaptation period, subjects performed five stepping trials in which they marched in place to the beat of a metronome at 90 steps/min while blindfolded in a quiet room. The subject's final heading direction (deg), final X (fore-aft, cm) and final Y (medio-lateral, cm) positions were measured for each trial. During the treadmill locomotion adaptation period the subject's 3D torso position was measured. We found that subjects in the Rotating Room group, as compared to the Infinite Hallway group: 1) showed significantly greater deviation during post-exposure testing in the heading direction and Y position opposite to the direction of optic flow experienced during treadmill walking; 2) showed a significant, monotonically increasing torso yaw angular rotation bias in the direction of optic flow during the treadmill adaptation exposure period. Subjects in both groups showed greater forward translation (in the +X direction) during the post-treadmill stepping task that differed significantly from their pre-exposure performance. Subjects in both groups reported no perceptual deviation in position during the stepping tasks.
We infer that viewing simulated rotary self-motion during treadmill locomotion causes adaptive modification of sensory-motor integration in the control of position and trajectory during locomotion, functionally reflecting adaptive changes in the integration of visual, vestibular, and proprioceptive cues. Such an adaptation in the control of position and heading direction during locomotion, due to the congruence of sensory information, demonstrates the potential for adaptive transfer between sensorimotor systems and suggests a common neural site for the processing of self-motion perception and concurrent adaptation in motor output. This would explain the subjects' lack of perceived deviation in position and trajectory during the post-treadmill stepping test while blindfolded.
NASA Technical Reports Server (NTRS)
Mulavara, A. P.; Richards, J. T.; Marshburn, A.; Nomura, Y.; Bloomberg, J. J.
2005-01-01
The goal of the present study was to investigate the adaptive effects of variation in the direction of optic flow, experienced during linear treadmill walking, on modifying locomotor trajectory. Subjects (n = 30) walked on a motorized linear treadmill at 4.0 km/h for 24 minutes while viewing the interior of a 3D virtual scene projected onto a screen 1.5 m in front of them. The virtual scene depicted constant self-motion equivalent to either 1) walking around the perimeter of a room to one's left (Rotating Room group) or 2) walking down the center of a hallway (Infinite Hallway group). The scene was static for the first 4 minutes, and then constant-rate self-motion was simulated for the remaining 20 minutes. Before and after the treadmill locomotion adaptation period, subjects performed five stepping trials in which they marched in place to the beat of a metronome at 90 steps/min while blindfolded in a quiet room. The subject's final heading direction (deg), final X (fore-aft, cm) and final Y (medio-lateral, cm) positions were measured for each trial. During the treadmill locomotion adaptation period the subject's 3D torso position was measured. We found that subjects in the Rotating Room group, as compared to the Infinite Hallway group: 1) showed significantly greater deviation during post-exposure testing in the heading direction and Y position opposite to the direction of optic flow experienced during treadmill walking; 2) showed a significant, monotonically increasing torso yaw angular rotation bias in the direction of optic flow during the treadmill adaptation exposure period. Subjects in both groups showed greater forward translation (in the +X direction) during the post-treadmill stepping task that differed significantly from their pre-exposure performance. Subjects in both groups reported no perceptual deviation in position during the stepping tasks.
We infer that viewing simulated rotary self-motion during treadmill locomotion causes adaptive modification of sensory-motor integration in the control of position and trajectory during locomotion, functionally reflecting adaptive changes in the integration of visual, vestibular, and proprioceptive cues. Such an adaptation in the control of position and heading direction during locomotion, due to the congruence of sensory information, demonstrates the potential for adaptive transfer between sensorimotor systems and suggests a common neural site for the processing of self-motion perception and concurrent adaptation in motor output. This would explain the subjects' lack of perceived deviation in position and trajectory during the post-treadmill stepping test while blindfolded.
A Motion Detection Algorithm Using Local Phase Information
Lazar, Aurel A.; Ukani, Nikul H.; Zhou, Yiyin
2016-01-01
Previous research demonstrated that global phase alone can be used to faithfully represent visual scenes. Here we provide a reconstruction algorithm that uses only local phase information. We also demonstrate that local phase alone can be effectively used to detect local motion. The local phase-based motion detector is akin to models employed to detect motion in biological vision, for example the Reichardt detector. The local phase-based motion detection algorithm introduced here consists of two building blocks. The first building block measures the temporal change of the local phase. The temporal derivative of the local phase is shown to exhibit the structure of a second-order Volterra kernel with two normalized inputs. We provide an efficient, FFT-based algorithm for computing the change of the local phase. The second building block implements the detector; it compares the maximum of the Radon transform of the local phase derivative with a chosen threshold. We demonstrate examples of applying the local phase-based motion detection algorithm to several video sequences. We also show how the locally detected motion can be used for segmenting moving objects in video scenes and compare our local phase-based algorithm to segmentation achieved with a widely used optic flow algorithm. PMID:26880882
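A much-simplified version of the two building blocks can be sketched as follows: compute a local phase per block from the dominant FFT coefficient, then threshold the temporal change of that phase. This sketch substitutes a plain per-block threshold for the paper's Radon transform step and its windowed local phase, so it illustrates the idea rather than reproducing the published algorithm:

```python
import numpy as np

def local_phase(frame, block=8):
    """Phase of the dominant non-DC FFT coefficient of each block; a
    simplified stand-in for the paper's local phase representation."""
    h, w = frame.shape
    ph = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            F = np.fft.fft2(frame[i*block:(i+1)*block, j*block:(j+1)*block])
            F[0, 0] = 0  # discard the DC (mean) term
            k = np.unravel_index(np.argmax(np.abs(F)), F.shape)
            ph[i, j] = np.angle(F[k])
    return ph

def detect_motion(f0, f1, threshold=0.1, block=8):
    """Flag blocks whose local phase changed between frames (wrapped to [-pi, pi])."""
    dphi = np.angle(np.exp(1j * (local_phase(f1, block) - local_phase(f0, block))))
    return np.abs(dphi) > threshold

y, x = np.mgrid[0:64, 0:64]
f0 = np.sin(2 * np.pi * x / 8.0)   # one period per 8-pixel block
f1 = np.roll(f0, 1, axis=1)        # translate the pattern by one pixel
moving = detect_motion(f0, f1)
```

A one-pixel translation of this pattern shifts every block's local phase by pi/4, so all blocks are flagged as moving, while a static pair produces no detections.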
Compiling Techniques for East Antarctic Ice Velocity Mapping Based on Historical Optical Imagery
NASA Astrophysics Data System (ADS)
Li, X.; Li, R.; Qiao, G.; Cheng, Y.; Ye, W.; Gao, T.; Huang, Y.; Tian, Y.; Tong, X.
2018-05-01
Ice flow velocity over long time series in East Antarctica plays a vital role in estimating and predicting the mass balance of the Antarctic Ice Sheet and its contribution to global sea level rise. However, no large-scale Antarctic ice velocity product is available that shows the East Antarctic ice flow velocity pattern before the 1990s. We proposed three methods, including parallax decomposition, grid-based NCC image matching, and feature- and grid-based image matching with constraints, for estimating surface velocity in East Antarctica from ARGON KH-5 and LANDSAT imagery, showing the feasibility of using historical optical imagery to recover Antarctic ice motion. Building on these studies, in this paper we present a systematic set of methods for developing an ice surface velocity product for the entire East Antarctica from the 1960s to the 1980s.
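Grid-based NCC image matching of the kind named above reduces, per grid cell, to finding the displacement that maximizes the cross-correlation of two co-registered patches; dividing that displacement by the time between acquisitions gives velocity. A minimal sketch using FFT-based correlation of mean-removed patches (an assumption on our part; the paper's pipeline details, including subpixel refinement and matching constraints, are not reproduced here):

```python
import numpy as np

def displacement_fft_ncc(img0, img1):
    """Integer-pixel displacement between two co-registered patches via
    FFT cross-correlation of mean-removed patches (periodic boundaries)."""
    a = img0 - img0.mean()
    b = img1 - img1.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak indices to signed shifts
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

rng = np.random.default_rng(0)
img0 = rng.standard_normal((64, 64))                    # textured surface patch
img1 = np.roll(np.roll(img0, 5, axis=0), -3, axis=1)    # surface moved (5, -3) px
dy, dx = displacement_fft_ncc(img0, img1)
```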
Diffusion tensor optical coherence tomography
NASA Astrophysics Data System (ADS)
Marks, Daniel L.; Blackmon, Richard L.; Oldenburg, Amy L.
2018-01-01
In situ measurements of diffusive particle transport provide insight into tissue architecture, drug delivery, and cellular function. Analogous to diffusion-tensor magnetic resonance imaging (DT-MRI), where the anisotropic diffusion of water molecules is mapped on the millimeter scale to elucidate the fibrous structure of tissue, here we propose diffusion-tensor optical coherence tomography (DT-OCT) for measuring directional diffusivity and flow of optically scattering particles within tissue. Because DT-OCT is sensitive to the sub-resolution motion of Brownian particles as they are constrained by tissue macromolecules, it has the potential to quantify nanoporous anisotropic tissue structure at micrometer resolution as relevant to extracellular matrices, neurons, and capillaries. Here we derive the principles of DT-OCT, relating the detected optical signal from a minimum of six probe beams to the six unique diffusion tensor components and three flow vector components. The optimal geometry of the probe beams is determined given a finite numerical aperture, and a high-speed hardware implementation is proposed. Finally, Monte Carlo simulations are employed to assess the ability of the proposed DT-OCT system to quantify anisotropic diffusion of nanoparticles in a collagen matrix, an extracellular constituent that is known to become highly aligned during tumor development.
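The counting argument above (a minimum of six beams for the six unique components of a symmetric tensor) can be illustrated with the standard linear inversion of per-direction apparent diffusivities. This sketch handles only the diffusion part and ignores the three flow components; the measurement model d_i = u_i' D u_i, the beam directions, and all numbers are illustrative assumptions, not the paper's signal model:

```python
import numpy as np

def tensor_from_beam_diffusivities(dirs, d_meas):
    """Recover a symmetric diffusion tensor D from apparent diffusivities
    d_i = u_i . D . u_i measured along unit directions u_i (least squares)."""
    rows = []
    for ux, uy, uz in dirs:
        # coefficients of the unknowns [Dxx, Dyy, Dzz, Dxy, Dxz, Dyz]
        rows.append([ux*ux, uy*uy, uz*uz, 2*ux*uy, 2*ux*uz, 2*uy*uz])
    m, *_ = np.linalg.lstsq(np.array(rows), np.array(d_meas), rcond=None)
    Dxx, Dyy, Dzz, Dxy, Dxz, Dyz = m
    return np.array([[Dxx, Dxy, Dxz], [Dxy, Dyy, Dyz], [Dxz, Dyz, Dzz]])

# Six non-coplanar unit directions and a synthetic anisotropic tensor
dirs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                 [1, 1, 0], [1, 0, 1], [0, 1, 1]], float)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
D_true = np.diag([3.0, 1.0, 0.5])
D_true[0, 1] = D_true[1, 0] = 0.4
d_meas = [u @ D_true @ u for u in dirs]
D_est = tensor_from_beam_diffusivities(dirs, d_meas)
```

With exactly six well-chosen directions the linear system is full rank and the tensor is recovered uniquely, which is why six probe beams suffice in principle.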
Shared sensory estimates for human motion perception and pursuit eye movements.
Mukherjee, Trishna; Battifarano, Matthew; Simoncini, Claudio; Osborne, Leslie C
2015-06-03
Are sensory estimates formed centrally in the brain and then shared between perceptual and motor pathways or is centrally represented sensory activity decoded independently to drive awareness and action? Questions about the brain's information flow pose a challenge because systems-level estimates of environmental signals are only accessible indirectly as behavior. Assessing whether sensory estimates are shared between perceptual and motor circuits requires comparing perceptual reports with motor behavior arising from the same sensory activity. Extrastriate visual cortex both mediates the perception of visual motion and provides the visual inputs for behaviors such as smooth pursuit eye movements. Pursuit has been a valuable testing ground for theories of sensory information processing because the neural circuits and physiological response properties of motion-responsive cortical areas are well studied, sensory estimates of visual motion signals are formed quickly, and the initiation of pursuit is closely coupled to sensory estimates of target motion. Here, we analyzed variability in visually driven smooth pursuit and perceptual reports of target direction and speed in human subjects while we manipulated the signal-to-noise level of motion estimates. Comparable levels of variability throughout viewing time and across conditions provide evidence for shared noise sources in the perception and action pathways arising from a common sensory estimate. We found that conditions that create poor, low-gain pursuit create a discrepancy between the precision of perception and that of pursuit. Differences in pursuit gain arising from differences in optic flow strength in the stimulus reconcile much of the controversy on this topic. Copyright © 2015 the Authors.
Video stereolization: combining motion analysis with user interaction.
Liao, Miao; Gao, Jizhou; Yang, Ruigang; Gong, Minglun
2012-07-01
We present a semiautomatic system that converts conventional videos into stereoscopic videos by combining motion analysis with user interaction, aiming to transfer as much labeling work as possible from the user to the computer. In addition to the widely used structure-from-motion (SFM) techniques, we develop two new methods that analyze the optical flow to provide additional qualitative depth constraints. They remove the camera-movement restriction imposed by SFM so that general motions can be used in scene depth estimation, the central problem in mono-to-stereo conversion. With these algorithms, the user's labeling task is significantly simplified. We further develop a quadratic programming approach to incorporate both quantitative depth and qualitative depth (such as that from user scribbling) to recover dense depth maps for all frames, from which stereoscopic views can be synthesized. In addition to visual results, we present user study results showing that our approach is more intuitive and less labor intensive, while producing a 3D effect comparable to that of current state-of-the-art interactive algorithms.
Dynamics of topological solitons, knotted streamlines, and transport of cargo in liquid crystals
NASA Astrophysics Data System (ADS)
Sohn, Hayley R. O.; Ackerman, Paul J.; Boyle, Timothy J.; Sheetah, Ghadah H.; Fornberg, Bengt; Smalyukh, Ivan I.
2018-05-01
Active colloids and liquid crystals are capable of locally converting the macroscopically supplied energy into directional motion and promise a host of new applications, ranging from drug delivery to cargo transport at the mesoscale. Here we uncover how topological solitons in liquid crystals can locally transform electric energy to translational motion and allow for the transport of cargo along directions dependent on frequency of the applied electric field. By combining polarized optical video microscopy and numerical modeling that reproduces both the equilibrium structures of solitons and their temporal evolution in applied fields, we uncover the physical underpinnings behind this reconfigurable motion and study how it depends on the structure and topology of solitons. We show that, unexpectedly, the directional motion of solitons with and without the cargo arises mainly from the asymmetry in rotational dynamics of molecular ordering in liquid crystal rather than from the asymmetry of fluid flows, as in conventional active soft matter systems.
Embodied memory allows accurate and stable perception of hidden objects despite orientation change.
Pan, Jing Samantha; Bingham, Ned; Bingham, Geoffrey P
2017-07-01
Rotating a scene in a frontoparallel plane (rolling) yields a change in orientation of constituent images. When using only information provided by static images to perceive a scene after orientation change, identification performance typically decreases (Rock & Heimer, 1957). However, rolling generates optic flow information that relates the discrete, static images (before and after the change) and forms an embodied memory that aids recognition. The embodied memory hypothesis predicts that upon detecting a continuous spatial transformation of image structure (in other words, seeing the continuous rolling process and the objects undergoing rolling), observers should accurately perceive objects during and after motion. Thus, in this case, orientation change should not affect performance. We tested this hypothesis in three experiments and found that (a) using combined optic flow and image structure, participants identified locations of previously perceived but currently occluded targets with great accuracy and stability (Experiment 1); (b) using combined optic flow and image structure information, participants identified hidden targets equally well with or without 30° orientation changes (Experiment 2); and (c) when the rolling was unseen, identification of hidden targets after orientation change became worse (Experiment 3). Furthermore, when rolling was unseen, although target identification was better when participants were told about the orientation change than when they were not told, performance was still worse than when there was no orientation change. Therefore, combined optic flow and image structure information, not mere knowledge about the rolling, enables accurate and stable perception despite orientation change.
Indovina, Iole; Maffei, Vincenzo; Pauwels, Karl; Macaluso, Emiliano; Orban, Guy A; Lacquaniti, Francesco
2013-05-01
Multiple visual signals are relevant to perception of heading direction. While the role of optic flow and depth cues has been studied extensively, little is known about the visual effects of gravity on heading perception. We used fMRI to investigate the contribution of gravity-related visual cues on the processing of vertical versus horizontal apparent self-motion. Participants experienced virtual roller-coaster rides in different scenarios, at constant speed or 1g-acceleration/deceleration. Imaging results showed that vertical self-motion coherent with gravity engaged the posterior insula and other brain regions that have been previously associated with vertical object motion under gravity. This selective pattern of activation was also found in a second experiment that included rectilinear motion in tunnels, whose direction was cued by the preceding open-air curves only. We argue that the posterior insula might perform high-order computations on visual motion patterns, combining different sensory cues and prior information about the effects of gravity. Medial-temporal regions including para-hippocampus and hippocampus were more activated by horizontal motion, preferably at constant speed, consistent with a role in inertial navigation. Overall, the results suggest partially distinct neural representations of the cardinal axes of self-motion (horizontal and vertical).
Resonance-inclined optical nuclear spin polarization of liquids in diamond structures
NASA Astrophysics Data System (ADS)
Chen, Q.; Schwarz, I.; Jelezko, F.; Retzker, A.; Plenio, M. B.
2016-02-01
Dynamic nuclear polarization (DNP) of molecules in a solution at room temperature has the potential to revolutionize nuclear magnetic resonance spectroscopy and imaging. The prevalent methods for achieving DNP in solutions are typically most effective in the regime of small interaction correlation times between the electron and nuclear spins, limiting the size of accessible molecules. To overcome this limitation, we design a mechanism for DNP in the liquid phase that is applicable for large interaction correlation times. Importantly, while this mechanism makes use of a resonance condition similar to solid-state DNP, the polarization transfer is robust to a relatively large detuning from the resonance due to molecular motion. We combine this scheme with optically polarized nitrogen-vacancy (NV) center spins in nanodiamonds to design a setup that employs optical pumping and is therefore not limited by room temperature electron thermal polarization. We illustrate numerically the effectiveness of the model in a flow cell containing nanodiamonds immobilized in a hydrogel, polarizing flowing water molecules 4700-fold above thermal polarization in a magnetic field of 0.35 T, in volumes detectable by current NMR scanners.
Electroosmotic flow analysis of a branched U-turn nanofluidic device.
Parikesit, Gea O F; Markesteijn, Anton P; Kutchoukov, Vladimir G; Piciu, Oana; Bossche, Andre; Westerweel, Jerry; Garini, Yuval; Young, Ian T
2005-10-01
In this paper, we present the analysis of electroosmotic flow in a branched U-turn nanofluidic device, which we developed for detection and sorting of single molecules. The device, where the channel depth is only 150 nm, is designed to optically detect fluorescence from a volume as small as 270 attolitres (al) with a common wide-field fluorescent setup. We use distilled water as the liquid, in which we dilute 110 nm fluorescent beads employed as tracer-particles. Quantitative imaging is used to characterize the pathlines and velocity distribution of the electroosmotic flow in the device. Due to the device's complex geometry, the electroosmotic flow cannot be solved analytically. Therefore we use numerical flow simulation to model our device. Our results show that the deviation between measured and simulated data can be explained by the measured Brownian motion of the tracer-particles, which was not incorporated in the simulation.
Maragó, Onofrio M; Bonaccorso, Francesco; Saija, Rosalba; Privitera, Giulia; Gucciardi, Pietro G; Iatì, Maria Antonia; Calogero, Giuseppe; Jones, Philip H; Borghese, Ferdinando; Denti, Paolo; Nicolosi, Valeria; Ferrari, Andrea C
2010-12-28
Brownian motion is a manifestation of the fluctuation-dissipation theorem of statistical mechanics. It regulates systems in physics, biology, chemistry, and finance. We use graphene as prototype material to unravel the consequences of the fluctuation-dissipation theorem in two dimensions, by studying the Brownian motion of optically trapped graphene flakes. These orient orthogonal to the light polarization, due to the optical constants anisotropy. We explain the flake dynamics in the optical trap and measure force and torque constants from the correlation functions of the tracking signals, as well as comparing experiments with a full electromagnetic theory of optical trapping. The understanding of optical trapping of two-dimensional nanostructures gained through our Brownian motion analysis paves the way to light-controlled manipulation and all-optical sorting of biological membranes and anisotropic macromolecules.
Longden, Kit D.; Krapp, Holger G.
2010-01-01
Flying generates predictably different patterns of optic flow compared with other locomotor states. A sensorimotor system tuned to rapid responses and a high bandwidth of optic flow would help the animal to avoid wasting energy through imprecise motor action. However, neural processing that covers a higher input bandwidth itself comes at higher energetic costs, which would be a poor investment when the animal was not flying. How does the blowfly adjust the dynamic range of its optic flow-processing neurons to the locomotor state? Octopamine (OA) is a biogenic amine central to the initiation and maintenance of flight in insects. We used an OA agonist chlordimeform (CDM) to simulate the widespread OA release during flight and recorded the effects on the temporal frequency coding of the H2 cell. This cell is a visual interneuron known to be involved in flight stabilization reflexes. The application of CDM resulted in (i) an increase in the cell's spontaneous activity, expanding the inhibitory signaling range; (ii) an initial response gain to moving gratings (20–60 ms post-stimulus) that depended on the temporal frequency of the grating; and (iii) a reduction in the rate and magnitude of motion adaptation that was also temporal frequency-dependent. To our knowledge, this is the first demonstration that the application of a neuromodulator can induce velocity-dependent alterations in the gain of a wide-field optic flow-processing neuron. The observed changes in the cell's response properties resulted in a 33% increase of the cell's information rate when encoding random changes in temporal frequency of the stimulus. The increased signaling range and more rapid, longer lasting responses employed more spikes to encode each bit, and so consumed a greater amount of energy. It appears that for the fly investing more energy in sensory processing during flight is more efficient than wasting energy on under-performing motor control. PMID:21152339
NASA Astrophysics Data System (ADS)
Burger, Martin; Dirks, Hendrik; Frerking, Lena; Hauptmann, Andreas; Helin, Tapio; Siltanen, Samuli
2017-12-01
In this paper we study the reconstruction of moving object densities from undersampled dynamic x-ray tomography in two dimensions. A particular motivation of this study is to use realistic measurement protocols for practical applications, i.e. we do not assume to have a full Radon transform in each time step, but only projections in few angular directions. This restriction enforces a space-time reconstruction, which we perform by incorporating physical motion models and regularization of motion vectors in a variational framework. The methodology of optical flow, which is one of the most common methods to estimate motion between two images, is utilized to formulate a joint variational model for reconstruction and motion estimation. We provide a basic mathematical analysis of the forward model and the variational model for the image reconstruction. Moreover, we discuss the efficient numerical minimization based on alternating minimization between images and motion vectors. A variety of results are presented for simulated and real measurement data with different sampling strategies. A key observation is that random sampling combined with our model allows reconstructions of similar quality, from a similar number of measurements, as a single static reconstruction.
Ice motion of the Patagonian Icefields of South America: 1984-2014
NASA Astrophysics Data System (ADS)
Mouginot, J.; Rignot, E.
2015-03-01
We present the first comprehensive high-resolution mosaic of ice velocity of the Northern (NPI) and Southern Patagonian Icefields (SPI), from multiple synthetic aperture radar and optical data collected between 1984 and 2014. The results reveal that many of the outlet glaciers extend far into the central ice plateaus, which implies that changes in ice dynamics propagate far inside the accumulation area. We report pronounced seasonal to interannual variability of ice motion on Pío XI and Jorge Montt, a doubling in speed of Jorge Montt, a major slow down of O'Higgins, significant fluctuations of Upsala and a deceleration of San Rafael, which illustrate the need for sustained, continuous time series of ice motion to understand the long-term evolution of the rapidly thinning icefields. The velocity product also resolves major ambiguities in glacier drainage in areas of relatively flat topography illustrating the need to combine topography and flow direction to map drainage basins.
NASA Astrophysics Data System (ADS)
Hil'kevics, Sergej
The book concerns problems from all chapters of General Physics that have not been examined before and that do not require higher mathematics. It covers Mechanics, Oscillation Theory, Molecular-Kinetic Theory, Thermodynamics, Electricity, and Optics. In particular, it examines the motion of railways and the shape of railway tracks, the deceleration of vehicles, the motion and riding of a bicycle, the rotation of gyroscopes, the appearance of low-altitude hills in some geographical areas (Russia, the Baltic countries, Belarus), the motion of the Earth around the Sun, the construction of a sorting machine for potatoes, the flow of water from a bucket, the impact of a drop with a wall, problems of the strength of certain solid bodies, absorption in a swamp (non-Newtonian (Bingham) liquids), winter fishing, the molecular structure of gudrons, the behaviour of birds during winter, the thermodynamics of whales, why the eyes of cats shine, and the temperature of a sunlight reflection. The author does not use integrals or derivatives anywhere in the book.
Visualizing Cochlear Mechanics Using Confocal Microscopy
NASA Astrophysics Data System (ADS)
Ulfendahl, M.; Boutet de Monvel, J.; Fridberger, A.
2003-02-01
The sound-evoked vibration pattern of the hearing organ is based on complex mechanical interactions between different cellular structures. To explore the structural changes occurring within the organ of Corti during basilar-membrane motion, stepwise alterations of the scala tympani pressure were applied in an in vitro preparation of the guinea-pig temporal bone. Confocal images were acquired at each pressure level. In this way, the motion of several structures could be simultaneously observed with high resolution in a nearly intact system. Images were analyzed using a novel wavelet-based optical-flow estimation algorithm. Under the present experimental conditions, the reticular lamina moved as a stiff plate with a center of rotation in the region of the inner hair cells. The outer hair cells appeared non-rigid and the basal, synaptic regions of these cells displayed significant radial motion indicative of cellular bending and internal shearing.
Microbial alignment in flow changes ocean light climate.
Marcos; Seymour, Justin R; Luhar, Mitul; Durham, William M; Mitchell, James G; Macke, Andreas; Stocker, Roman
2011-03-08
The growth of microbial cultures in the laboratory often is assessed informally with a quick flick of the wrist: dense suspensions of microorganisms produce translucent "swirls" when agitated. Here, we rationalize the mechanism behind this phenomenon and show that the same process may affect the propagation of light through the upper ocean. Analogous to the shaken test tubes, the ocean can be characterized by intense fluid motion and abundant microorganisms. We demonstrate that the swirl patterns arise when elongated microorganisms align preferentially in the direction of fluid flow and alter light scattering. Using a combination of experiments and mathematical modeling, we find that this phenomenon can be recurrent under typical marine conditions. Moderate shear rates (0.1 s(-1)) can increase optical backscattering of natural microbial assemblages by more than 20%, and even small shear rates (0.001 s(-1)) can increase backscattering from blooms of large phytoplankton by more than 30%. These results imply that fluid flow, currently neglected in models of marine optics, may exert an important control on light propagation, influencing rates of global carbon fixation and how we estimate these rates via remote sensing.
Visualization of Sliding and Deformation of Orbital Fat During Eye Rotation
Hötte, Gijsbert J.; Schaafsma, Peter J.; Botha, Charl P.; Wielopolski, Piotr A.; Simonsz, Huibert J.
2016-01-01
Purpose: Little is known about the way orbital fat slides and/or deforms during eye movements. We compared two deformation algorithms from a sequence of MRI volumes to visualize this complex behavior. Methods: Time-dependent deformation data were derived from motion-MRI volumes using Lucas and Kanade Optical Flow (LK3D) and nonrigid registration (B-splines) deformation algorithms. We compared how these two algorithms performed regarding sliding and deformation in three critical areas: the sclera-fat interface, how the optic nerve moves through the fat, and how the fat is squeezed out under the tendon of a relaxing rectus muscle. The efficacy was validated using identified tissue markers such as the lens and blood vessels in the fat. Results: Fat immediately behind the eye followed eye rotation by approximately one-half. This was best visualized using the B-splines technique as it showed less ripping of tissue and less distortion. Orbital fat flowed around the optic nerve during eye rotation. In this case, LK3D provided better visualization as it allowed orbital fat tissue to split. The resolution was insufficient to visualize fat being squeezed out between tendon and sclera. Conclusion: B-splines performs better in tracking structures such as the lens, while LK3D allows fat tissue to split as should happen as the optic nerve slides through the fat. Orbital fat follows eye rotation by one-half and flows around the optic nerve during eye rotation. Translational Relevance: Visualizing orbital fat deformation and sliding offers the opportunity to accurately locate a region of cicatrization and permit an individualized surgical plan. PMID:27540495
Fath, Aaron J; Lind, Mats; Bingham, Geoffrey P
2018-04-17
The role of the monocular-flow-based optical variable τ in the perception of the time to contact of approaching objects has been well-studied. There are additional contributions from binocular sources of information, such as changes in disparity over time (CDOT), but these are less understood. We conducted an experiment to determine whether an object's velocity affects which source is most effective for perceiving time to contact. We presented participants with stimuli that simulated two approaching squares. During approach the squares disappeared, and participants indicated which square would have contacted them first. Approach was specified by (a) only disparity-based information, (b) only monocular flow, or (c) all sources of information in normal viewing conditions. As expected, participants were more accurate at judging fast objects when only monocular flow was available than when only CDOT was. In contrast, participants were more accurate judging slow objects with only CDOT than with only monocular flow. For both ranges of velocity, the condition with both information sources yielded performance equivalent to the better of the single-source conditions. These results show that different sources of motion information are used to perceive time to contact and play different roles in allowing for stable perception across a variety of conditions.
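The monocular variable τ referenced above has a compact closed form, which a small worked example makes concrete (the numbers are made up, not the study's stimuli): for an object of physical size S at distance Z approaching at speed v, the optical angle θ ≈ S/Z expands at rate θ̇ = Sv/Z², so τ = θ/θ̇ = Z/v, the remaining time to contact, recoverable without knowing S, Z, or v individually.

```python
# Time to contact from the monocular variable tau = theta / theta_dot.
def tau(theta, theta_dot):
    return theta / theta_dot

S, Z, v = 0.2, 10.0, 2.0      # size (m), distance (m), approach speed (m/s)
theta = S / Z                 # small-angle optical size (rad)
theta_dot = S * v / Z**2      # d/dt (S/Z) with dZ/dt = -v
ttc = tau(theta, theta_dot)   # equals Z / v = 5 s, independent of S
assert abs(ttc - Z / v) < 1e-12
```

A disparity-based (CDOT) estimate would instead track the rate of change of binocular disparity over time, which is why the two sources can trade off across object velocities as the experiment found.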
High-performance holographic technologies for fluid-dynamics experiments
Orlov, Sergei S.; Abarzhi, Snezhana I.; Oh, Se Baek; Barbastathis, George; Sreenivasan, Katepalli R.
2010-01-01
Modern technologies offer new opportunities for experimentalists in a variety of research areas of fluid dynamics. Improvements are now possible in the state-of-the-art in precision, dynamic range, reproducibility, motion-control accuracy, data-acquisition rate and information capacity. These improvements are required for understanding complex turbulent flows under realistic conditions, and for allowing unambiguous comparisons to be made with new theoretical approaches and large-scale numerical simulations. One of the new technologies is high-performance digital holography. State-of-the-art motion control, electronics and optical imaging allow for the realization of turbulent flows with very high Reynolds number (more than 10⁷) on a relatively small laboratory scale, and quantification of their properties with high space–time resolutions and bandwidth. In-line digital holographic technology can provide complete three-dimensional mapping of the flow velocity and density fields at high data rates (over 1000 frames per second) over a relatively large spatial area with high spatial (1–10 μm) and temporal (better than a few nanoseconds) resolution, and can give accurate quantitative description of the fluid flows, including those of multi-phase and unsteady conditions. This technology can be applied in a variety of problems to study fundamental properties of flow–particle interactions, rotating flows, non-canonical boundary layers and Rayleigh–Taylor mixing. Some of these examples are discussed briefly. PMID:20211881
MotionFlow: Visual Abstraction and Aggregation of Sequential Patterns in Human Motion Tracking Data.
Jang, Sujin; Elmqvist, Niklas; Ramani, Karthik
2016-01-01
Pattern analysis of human motions, which is useful in many research areas, requires understanding and comparison of different styles of motion patterns. However, working with human motion tracking data to support such analysis poses great challenges. In this paper, we propose MotionFlow, a visual analytics system that provides an effective overview of various motion patterns based on an interactive flow visualization. This visualization formulates a motion sequence as transitions between static poses, and aggregates these sequences into a tree diagram to construct a set of motion patterns. The system also allows the users to directly reflect the context of data and their perception of pose similarities in generating representative pose states. We provide local and global controls over the partition-based clustering process. To support the users in organizing unstructured motion data into pattern groups, we designed a set of interactions that enables searching for similar motion sequences from the data, detailed exploration of data subsets, and creating and modifying the group of motion patterns. To evaluate the usability of MotionFlow, we conducted a user study with six researchers with expertise in gesture-based interaction design. They used MotionFlow to explore and organize unstructured motion tracking data. Results show that the researchers were able to easily learn how to use MotionFlow, and the system effectively supported their pattern analysis activities, including leveraging their perception and domain knowledge.
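The core aggregation idea, merging pose sequences by their shared prefixes into a branching tree of transitions, can be sketched compactly. The pose labels and prefix-count weighting below are hypothetical illustrations, not taken from the MotionFlow system itself:

```python
from collections import Counter

def aggregate(sequences):
    """Count every prefix path; shared prefixes accumulate weight,
    yielding the branching structure a flow/tree view would render."""
    tree = Counter()
    for seq in sequences:
        for i in range(1, len(seq) + 1):
            tree[tuple(seq[:i])] += 1
    return tree

# Three hypothetical motion sequences as lists of discrete pose states
seqs = [["stand", "reach", "grab"],
        ["stand", "reach", "wave"],
        ["stand", "sit"]]
tree = aggregate(seqs)
assert tree[("stand",)] == 3             # all sequences share this root
assert tree[("stand", "reach")] == 2     # two sequences branch here
assert tree[("stand", "reach", "grab")] == 1
```

In the actual system, the pose states themselves come from a user-steerable clustering of tracking frames rather than from predefined labels.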
Schindler, Andreas; Bartels, Andreas
2017-05-01
Superimposed on the visual feed-forward pathway, feedback connections convey higher level information to cortical areas lower in the hierarchy. A prominent framework for these connections is the theory of predictive coding where high-level areas send stimulus interpretations to lower level areas that compare them with sensory input. Along these lines, a growing body of neuroimaging studies shows that predictable stimuli lead to reduced blood oxygen level-dependent (BOLD) responses compared with matched nonpredictable counterparts, especially in early visual cortex (EVC) including areas V1-V3. The sources of these modulatory feedback signals are largely unknown. Here, we re-examined the robust finding of relative BOLD suppression in EVC evident during processing of coherent compared with random motion. Using functional connectivity analysis, we show an optic flow-dependent increase of functional connectivity between BOLD suppressed EVC and a network of visual motion areas including MST, V3A, V6, the cingulate sulcus visual area (CSv), and precuneus (Pc). Connectivity decreased between EVC and 2 areas known to encode heading direction: entorhinal cortex (EC) and retrosplenial cortex (RSC). Our results provide first evidence that BOLD suppression in EVC for predictable stimuli is indeed mediated by specific high-level areas, in accord with the theory of predictive coding.
NASA Technical Reports Server (NTRS)
Domini, F.; Caudek, C.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)
1997-01-01
Accuracy in discriminating rigid from nonrigid motion was investigated for orthographic projections of three-dimensional rotating objects. In 3 experiments the hypothesis that magnitudes of angular velocity are misperceived in the kinetic depth effect was tested, and in 4 other experiments the hypothesis that misperceiving angular velocities leads to misperceiving rigidity was tested. The principal findings were (a) the magnitude of perceived angular velocity is derived heuristically as a function of a property of the first-order optic flow called deformation and (b) perceptual performance in discriminating rigid from nonrigid motion is accurate in cases when the variability of the deformations of the individual triplets of points of the stimulus displays favors this interpretation and not accurate in other cases.
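The "deformation" of the first-order optic flow invoked here belongs to the standard decomposition of a 2-D velocity-gradient field into divergence, curl, and two shear components; the shear magnitude is the deformation. The sketch below illustrates that standard decomposition for linear flow fields, not the authors' exact stimulus computation:

```python
import math

def flow_decomposition(ux, uy, vx, vy):
    """Decompose the velocity gradient of a linear flow (u, v), where
    ux = du/dx, uy = du/dy, etc., into divergence, curl, and deformation."""
    div  = ux + vy               # expansion/contraction
    curl = vx - uy               # rotation
    d1   = ux - vy               # shear component 1
    d2   = uy + vx               # shear component 2
    deform = math.hypot(d1, d2)  # deformation magnitude
    return div, curl, deform

# Pure rotation (u = -y, v = x) carries no deformation; pure shear does.
assert flow_decomposition(0.0, -1.0, 1.0, 0.0)[2] == 0.0
assert flow_decomposition(1.0, 0.0, 0.0, -1.0)[2] == 2.0
```

The heuristic reported in the findings is that perceived angular velocity tracks this deformation term rather than the true 3-D rotation rate.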
Schwegmann, Alexander; Lindemann, Jens P.; Egelhaaf, Martin
2014-01-01
Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information, we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way. PMID:25136314
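A correlation-type EMD of the kind used as the model's core element can be sketched in a few lines in the Hassenstein-Reichardt style: two neighbouring receptor signals are multiplied after a temporal delay on one arm, and the mirror-symmetric product is subtracted, giving an opponent, direction-selective output. The delay and the phase offsets below are illustrative parameters, not those of the validated insect model:

```python
import numpy as np

def emd_response(left, right, delay=5):
    """Delayed-left x right minus left x delayed-right (opponent EMD)."""
    dl = np.concatenate([np.zeros(delay), left[:-delay]])
    dr = np.concatenate([np.zeros(delay), right[:-delay]])
    return dl * right - left * dr

t = np.arange(200)
grating = lambda phase: np.sin(2 * np.pi * 0.05 * t + phase)

# Rightward motion: the right receptor sees the pattern later (phase lag);
# leftward motion is the mirror case.
resp_right = emd_response(grating(0.0), grating(-0.5)).mean()
resp_left  = emd_response(grating(0.0), grating(+0.5)).mean()
assert resp_right > 0 > resp_left
```

Because the multiplication makes the output depend on both stimulus velocity and local contrast, the time-averaged (motion energy) response naturally weights nearness by contrast, which is the property the abstract highlights.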
Optical Flow Experiments for Small-Body Navigation
NASA Astrophysics Data System (ADS)
Schmidt, A.; Kueppers, M.
2012-09-01
Optical Flow algorithms [1, 2] have been used successfully and implemented robustly in many application domains, from motion estimation to video compression. We argue that they also show potential for autonomous spacecraft payload operation around small solar system bodies, such as comets or asteroids. Operating spacecraft around small bodies at close distance poses numerous challenges, many of which are related to uncertainties in spacecraft position and velocity relative to the body. To make the best use of usually scarce resources, it would be good to grant a certain amount of autonomy to a spacecraft, for example, to make time-critical decisions about when to operate the payload. The Optical Flow describes the apparent velocities of common, usually brightness-related features in at least two images. From it, one can estimate the spacecraft velocity and direction relative to the last manoeuvre or known state. The authors have conducted experiments with readily available optical imagery using the relatively robust and well-known Lucas-Kanade method [3]; it was found to be applicable in a large number of cases. Since one of the assumptions is that the brightness of corresponding points in subsequent images does not change greatly, it is important that imagery is acquired at sensible intervals, during which illumination conditions can be assumed constant and the spacecraft does not move too far, so that there is significant overlap. Full-frame Optical Flow can be computationally more expensive than image compression and usually focuses on movements of regions with significant brightness gradients. However, given that missions which explore small bodies move at low relative velocities, computation time is not expected to be a limiting resource.
Since there are now several missions which either have flown to small bodies or are planned to visit small bodies and stay there for some time, it shows potential to explore how instrument operations can benefit from the additional knowledge that is gained from analysing readily available data on-board. The algorithms for Optical Flow show the maturity that is necessary to be considered in safety-critical systems; their use can be complemented with shape models, pattern matching, housekeeping data and navigation techniques to obtain even more accurate information.
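The least-squares core of the Lucas-Kanade method [3] can be sketched as follows. This is a minimal single-scale, single-point numpy version on synthetic data; the practical method adds image pyramids and iterative refinement.

```python
import numpy as np

def lucas_kanade(im1, im2, x, y, win=7):
    """Minimal single-scale Lucas-Kanade flow estimate at one image point.

    Solves the least-squares system built from spatial gradients (Ix, Iy)
    and the temporal difference (It) inside a (win x win) window around (x, y).
    """
    Iy, Ix = np.gradient(im1)            # gradients along rows (y) and columns (x)
    It = im2 - im1                       # temporal gradient
    h = win // 2
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    # Brightness constancy: Ix*u + Iy*v + It = 0, solved in least squares.
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                             # (u, v) displacement in pixels

# Synthetic pair: a smooth blob shifted one pixel to the right.
yy, xx = np.mgrid[0:64, 0:64]
blob = np.exp(-((xx - 30) ** 2 + (yy - 32) ** 2) / 50.0)
blob_shifted = np.exp(-((xx - 31) ** 2 + (yy - 32) ** 2) / 50.0)
u, v = lucas_kanade(blob, blob_shifted, 30, 32)  # u close to 1, v close to 0
```

The recovered horizontal displacement is close to the true one-pixel shift, which is the per-feature quantity a spacecraft would aggregate into a velocity-direction estimate.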
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teo, P; Guo, K; Alayoubi, N
Purpose: Accounting for tumor motion during radiation therapy is important to ensure that the tumor receives the prescribed dose. Increasing the field size to account for this motion exposes the surrounding healthy tissues to unnecessary radiation. In contrast to using motion-encompassing techniques to treat moving tumors, conformal radiation therapy (RT) uses a smaller field to track the tumor and adapts the beam aperture according to the motion detected. This work investigates and compares the performance of three markerless, EPID-based optical flow methods to track tumor motion with conformal RT. Methods: Three techniques were used to track the motion of a 3D-printed lung tumor programmed to move according to the tumors of seven lung cancer patients. These techniques utilized a multi-resolution optical flow algorithm as the core computation for image registration. The first method (DIR) registers the incoming images with an initial reference frame, while the second method (RFSF) uses an adaptive reference frame and the third method (CU) uses preceding image frames for registration. The patient traces and errors were evaluated for the seven patients. Results: The average position errors for all patient traces were 0.12 ± 0.33 mm, −0.05 ± 0.04 mm and −0.28 ± 0.44 mm for the CU, DIR and RFSF methods, respectively. The position errors within one standard deviation were 0.74 mm, 0.37 mm and 0.96 mm, respectively. The CU and RFSF algorithms are sensitive to the characteristics of the patient trace and produce a wider distribution of errors amongst patients. Although the mean error for the DIR method is negatively biased (−0.05 mm) for all patients, it has the narrowest distribution of position error, which can be corrected using an offset calibration. Conclusion: Three techniques of image registration and position update were studied. Direct comparison with an initial frame yields the best performance.
The authors would like to thank Dr. YeLin Suh for making the Cyberknife dataset available to us. Scholarship funding from the Natural Sciences and Engineering Research Council of Canada (NSERC) and CancerCare Manitoba Foundation is acknowledged.
Vision based obstacle detection and grouping for helicopter guidance
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Chatterji, Gano
1993-01-01
Electro-optical sensors can be used to compute range to objects in the flight path of a helicopter. The computation is based on the optical flow/motion at different points in the image. The motion algorithms provide a sparse set of ranges to discrete features in the image sequence as a function of azimuth and elevation. For obstacle avoidance guidance and display purposes, this discrete set of ranges, varying from a few hundred to several thousand points, needs to be grouped into sets which correspond to objects in the real world. This paper presents a new method for object segmentation based on clustering the sparse range information provided by motion algorithms together with the spatial relation provided by the static image. The range values are initially grouped into clusters based on depth. Subsequently, the clusters are modified by using the K-means algorithm in the inertial horizontal plane and the minimum spanning tree algorithm in the image plane. The object grouping allows interpolation within a group and enables the creation of dense range maps. Researchers in robotics have used densely scanned sequences of laser range images to build three-dimensional representations of the outside world. Thus, modeling techniques developed for dense range images can be extended to sparse range images. The paper presents object segmentation results for a sequence of flight images.
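The initial depth-based grouping step can be illustrated with a one-dimensional K-means pass (Lloyd's algorithm) over sparse range estimates. The cluster count and the synthetic ranges below are illustrative, not from the paper's flight data.

```python
import numpy as np

def kmeans_1d(ranges, k, iters=20, seed=0):
    """Group sparse range estimates into k depth clusters (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(ranges, size=k, replace=False)
    for _ in range(iters):
        # Assign each range value to its nearest cluster centre.
        labels = np.argmin(np.abs(ranges[:, None] - centers[None, :]), axis=1)
        # Move each centre to the mean of its members.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = ranges[labels == j].mean()
    return centers, labels

# Sparse ranges from two objects: one near (~100 m), one far (~900 m).
ranges = np.concatenate([100 + 5 * np.random.default_rng(1).standard_normal(40),
                         900 + 20 * np.random.default_rng(2).standard_normal(40)])
centers, labels = kmeans_1d(ranges, k=2)
```

Each recovered centre sits near one object's depth; the paper then refines such clusters with K-means in the horizontal plane and a minimum spanning tree in the image plane.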
NASA Astrophysics Data System (ADS)
Beigi, Parmida; Salcudean, Tim; Rohling, Robert; Lessoway, Victoria A.; Ng, Gary C.
2015-03-01
This paper presents a new needle detection technique for ultrasound guided interventions based on the spectral properties of small displacements arising from hand tremor or intentional motion. In a block-based approach, the displacement map is computed for each block of interest versus a reference frame, using an optical flow technique. To compute the flow parameters, the Lucas-Kanade approach is used in a multiresolution and regularized form. A least-squares fit is used to estimate the flow parameters from the overdetermined system of spatial and temporal gradients. Lateral and axial components of the displacement are obtained for each block of interest at consecutive frames. Magnitude-squared spectral coherency is derived between the median displacements of the reference block and each block of interest, to determine the spectral correlation. In vivo images were obtained from the tissue near the abdominal aorta to capture the extreme intrinsic body motion, and insertion images were captured from a tissue-mimicking agar phantom. According to the analysis, both the involuntary and intentional movement of the needle produces coherent displacement with respect to a reference window near the insertion site. Intrinsic body motion also produces coherent displacement with respect to a reference window in the tissue; however, the coherency spectra of intrinsic and needle motion are spectrally distinguishable. Blocks with high spectral coherency at high frequencies are selected, providing an initial estimate of a channel for the needle trajectory. The needle trajectory is then detected from the locally thresholded absolute displacement map within the initial estimate. Experimental results show an RMS localization accuracy of 1.0 mm, 0.7 mm, and 0.5 mm for hand tremor, vibrational and rotational needle movements, respectively.
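The magnitude-squared coherency computation at the heart of this detection scheme can be sketched with `scipy.signal.coherence`. The frame rate, tremor frequency, and amplitudes below are assumed values for illustration, not the paper's acquisition parameters.

```python
import numpy as np
from scipy import signal

fs = 30.0                       # assumed ultrasound frame rate, Hz
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)

# Reference block near the insertion site: tremor-driven displacement (~8 Hz).
tremor = 0.1 * np.sin(2 * np.pi * 8 * t)
ref = tremor + 0.02 * rng.standard_normal(t.size)
# A block on the needle shares the tremor; a tissue block does not.
needle = 0.8 * tremor + 0.02 * rng.standard_normal(t.size)
tissue = 0.02 * rng.standard_normal(t.size)

f, c_needle = signal.coherence(ref, needle, fs=fs, nperseg=128)
f, c_tissue = signal.coherence(ref, tissue, fs=fs, nperseg=128)
peak = np.argmin(np.abs(f - 8))  # coherence at the tremor frequency
```

Blocks that share the tremor show coherence near one at the tremor frequency, while unrelated tissue blocks do not, which is the criterion used to select candidate needle blocks.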
In-motion optical sensing for assessment of animal well-being
NASA Astrophysics Data System (ADS)
Atkins, Colton A.; Pond, Kevin R.; Madsen, Christi K.
2017-05-01
The application of in-motion optical sensor measurements was investigated for inspecting livestock soundness as a means of animal well-being. An optical sensor-based platform was used to collect in-motion, weight-related information. Eight steers, weighing between 680 and 1134 kg, were evaluated twice. Six of the 8 steers were used for further evaluation and analysis. Hoof impacts caused plate flexion that was optically sensed. Observed kinetic differences between animals' strides at a walking or running/trotting gait with significant force distributions of animals' hoof impacts allowed for observation of real-time, biometric patterns. Overall, optical sensor-based measurements identified hoof differences between and within animals in motion that may allow for diagnosis of musculoskeletal unsoundness without visual evaluation.
Independent motion detection with a rival penalized adaptive particle filter
NASA Astrophysics Data System (ADS)
Becker, Stefan; Hübner, Wolfgang; Arens, Michael
2014-10-01
Aggregation of pixel based motion detection into regions of interest, which include views of single moving objects in a scene is an essential pre-processing step in many vision systems. Motion events of this type provide significant information about the object type or build the basis for action recognition. Further, motion is an essential saliency measure, which is able to effectively support high level image analysis. When applied to static cameras, background subtraction methods achieve good results. On the other hand, motion aggregation on freely moving cameras is still a widely unsolved problem. The image flow measured on a freely moving camera results from two major motion types: first, the ego-motion of the camera, and second, object motion that is independent of the camera motion. When capturing a scene with a camera, these two motion types are inseparably blended together. In this paper, we propose an approach to detect multiple moving objects from a mobile monocular camera system in an outdoor environment. The overall processing pipeline consists of a fast ego-motion compensation algorithm in the preprocessing stage. Real-time performance is achieved by using a sparse optical flow algorithm as an initial processing stage and a densely applied probabilistic filter in the post-processing stage. Thereby, we follow the idea proposed by Jung and Sukhatme. Normalized intensity differences originating from a sequence of ego-motion compensated difference images represent the probability of moving objects. Noise and registration artefacts are filtered out, using a Bayesian formulation. The resulting a posteriori distribution is located on image regions, showing strong amplitudes in the difference image which are in accordance with the motion prediction. In order to effectively estimate the a posteriori distribution, a particle filter is used.
In addition to the fast ego-motion compensation, the main contribution of this paper is the design of the probabilistic filter for real-time detection and tracking of independently moving objects. The proposed approach introduces a competition scheme between particles in order to ensure an improved multi-modality. Further, the filter design helps to generate a particle distribution which is homogenous even in the presence of multiple targets showing non-rigid motion patterns. The effectiveness of the method is shown on exemplary outdoor sequences.
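A bootstrap particle filter of the general kind described (predict, likelihood weighting, resample) can be sketched as follows. This 1-D toy tracks a single drifting target and omits the paper's rival-penalization scheme and image-based likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=2.0, meas_std=3.0):
    """One predict/update/resample cycle of a bootstrap particle filter.

    particles: (N,) candidate 1-D object positions.
    measurement: noisy observed position (stand-in for a difference-image peak).
    """
    # Predict: diffuse particles with the motion model.
    particles = particles + rng.normal(0, motion_std, particles.size)
    # Update: weight by the Gaussian measurement likelihood.
    weights = weights * np.exp(-0.5 * ((particles - measurement) / meas_std) ** 2)
    weights /= weights.sum()
    # Resample: draw particles proportional to weight.
    idx = rng.choice(particles.size, particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

# Track a target drifting at +1 px/frame, observed with noise.
particles = rng.uniform(0, 100, 500)
weights = np.full(500, 1 / 500)
true_pos = 20.0
for step in range(30):
    true_pos += 1.0
    z = true_pos + rng.normal(0, 3.0)
    particles, weights = particle_filter_step(particles, weights, z)
estimate = particles.mean()
```

The particle cloud concentrates on the target despite the unmodeled drift; maintaining separate modes for multiple non-rigid targets is exactly what the proposed competition scheme between particles addresses.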
Dynamics analysis of microsphere in a dual-beam fiber-optic trap with transverse offset.
Chen, Xinlin; Xiao, Guangzong; Luo, Hui; Xiong, Wei; Yang, Kaiyong
2016-04-04
A comprehensive dynamics analysis of a microsphere in a dual-beam fiber-optic trap with transverse offset is presented. As the offset distance between the two counterpropagating beams increases, the motion type of the microsphere starts with capture, then spiral motion, then orbital rotation, and ends with escape. We analyze the transformation process and mechanism of the four motion types based on the ray optics approximation. Dynamic simulations show the existence of critical offset distances at which the different motion types transform into one another. The result is an important step toward explaining physical phenomena in a dual-beam fiber-optic trap with transverse offset, and is generally applicable to achieving controllable motion of microspheres in integrated systems, such as microfluidic systems and lab-on-a-chip systems.
Indovina, Iole; Maffei, Vincenzo; Lacquaniti, Francesco
2013-09-01
By simulating self-motion on a virtual rollercoaster, we investigated whether acceleration cued by the optic flow affected the estimate of time-to-passage (TTP) to a target. In particular, we studied the role of a visual acceleration (1 g = 9.8 m/s^2) simulating the effects of gravity in the scene, by manipulating motion law (accelerated or decelerated at 1 g, constant speed) and motion orientation (vertical, horizontal). Thus, 1-g-accelerated motion in the downward direction or decelerated motion in the upward direction was congruent with the effects of visual gravity. We found that acceleration (positive or negative) is taken into account but is overestimated in magnitude in the calculation of TTP, independently of orientation. In addition, participants signaled TTP earlier when the rollercoaster accelerated downward at 1 g (as during free fall) than when the same acceleration occurred along the horizontal orientation. This time shift indicates an influence of the orientation relative to visual gravity on response timing that could be attributed to the anticipation of the effects of visual gravity on self-motion along the vertical, but not the horizontal, orientation. Finally, precision in TTP estimates was higher during vertical fall than when traveling at constant speed along the vertical orientation, consistent with higher noise in TTP estimates when the motion violates gravity constraints.
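The role of acceleration in TTP can be made concrete with the constant-acceleration kinematics the simulated self-motion follows. The distance and speed below are illustrative, not the study's stimulus parameters.

```python
import math

def time_to_passage(d, v, a):
    """Time to cover distance d starting at speed v with constant acceleration a.

    Solves 0.5*a*t**2 + v*t - d = 0 for the positive root; a = 9.8 m/s**2
    corresponds to the 1 g visual-gravity condition.
    """
    if abs(a) < 1e-12:
        return d / v
    disc = v * v + 2 * a * d
    return (-v + math.sqrt(disc)) / a

# A first-order estimate (d/v) ignores acceleration and therefore
# overestimates TTP whenever the self-motion accelerates at 1 g.
ttp_const = time_to_passage(50.0, 10.0, 0.0)   # 5.0 s
ttp_accel = time_to_passage(50.0, 10.0, 9.8)   # about 2.33 s
```

An observer who takes the 1 g acceleration into account thus predicts passage much earlier than one using the constant-speed estimate, consistent with the earlier responses reported for downward 1 g motion.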
Gait Recognition Based on Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Sokolova, A.; Konushin, A.
2017-05-01
In this work we investigate the problem of people recognition by their gait. For this task, we implement deep learning approach using the optical flow as the main source of motion information and combine neural feature extraction with the additional embedding of descriptors for representation improvement. In order to find the best heuristics, we compare several deep neural network architectures, learning and classification strategies. The experiments were made on two popular datasets for gait recognition, so we investigate their advantages and disadvantages and the transferability of considered methods.
Anti-Le-Chatelet behavior driven by strong natural light
NASA Astrophysics Data System (ADS)
Antonyuk, B. P.
2007-01-01
We show that strong incoherent broad-band light causes positive feedback in response to a static electric field in random media: electric current flows in the direction opposite to the voltage drop, and static polarization is induced in opposition to the applied electric field. This type of electron motion amplifies the external action, revealing anti-Le-Chatelet behavior. The applied static electric field is amplified up to the optical-damage domain of silica glass, ≈10^7 V/cm.
Transformation-aware perceptual image metric
NASA Astrophysics Data System (ADS)
Kellnhofer, Petr; Ritschel, Tobias; Myszkowski, Karol; Seidel, Hans-Peter
2016-09-01
Predicting human visual perception has several applications such as compression, rendering, editing, and retargeting. Current approaches, however, ignore the fact that the human visual system compensates for geometric transformations, e.g., we see that an image and a rotated copy are identical. Instead, they will report a large, false-positive difference. At the same time, if the transformations become too strong or too spatially incoherent, comparing two images gets increasingly difficult. Between these two extrema, we propose a system to quantify the effect of transformations, not only on the perception of image differences but also on saliency and motion parallax. To this end, we first fit local homographies to a given optical flow field, and then convert this field into a field of elementary transformations, such as translation, rotation, scaling, and perspective. We conduct a perceptual experiment quantifying the increase of difficulty when compensating for elementary transformations. Transformation entropy is proposed as a measure of complexity in a flow field. This representation is then used for applications, such as comparison of nonaligned images, where transformations cause threshold elevation, detection of salient transformations, and a model of perceived motion parallax. Applications of our approach are a perceptual level-of-detail for real-time rendering and viewpoint selection based on perceived motion parallax.
Multirate and event-driven Kalman filters for helicopter flight
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Smith, Phillip; Suorsa, Raymond E.; Hussien, Bassam
1993-01-01
A vision-based obstacle detection system that provides information about objects as a function of azimuth and elevation is discussed. The range map is computed using a sequence of images from a passive sensor, and an extended Kalman filter is used to estimate range to obstacles. The magnitude of the optical flow that provides measurements for each Kalman filter varies significantly over the image depending on the helicopter motion and object location. In a standard Kalman filter, the measurement update takes place at fixed intervals. It may be necessary to use a different measurement update rate in different parts of the image in order to maintain the same signal to noise ratio in the optical flow calculations. A range estimation scheme that accepts the measurement only under certain conditions is presented. The estimation results from the standard Kalman filter are compared with results from a multirate Kalman filter and an event-driven Kalman filter for a sequence of helicopter flight images.
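The event-driven measurement update can be sketched with a scalar filter that simply skips the update whenever a measurement is rejected. The noise levels and the acceptance rule below are illustrative, not the paper's flight configuration.

```python
import numpy as np

def kalman_1d(z_seq, accept, q=0.001, r=0.25, x0=0.0, p0=1.0):
    """1-D Kalman filter whose measurement update is event-driven.

    z_seq:  noisy range measurements (e.g. derived from optical flow magnitude).
    accept: boolean per sample; update only when the measurement is accepted
            (e.g. when flow signal-to-noise in that image region is sufficient).
    """
    x, p, out = x0, p0, []
    for z, ok in zip(z_seq, accept):
        p = p + q                      # predict (constant-state model)
        if ok:                         # event-driven measurement update
            k = p / (p + r)            # Kalman gain
            x = x + k * (z - x)
            p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
truth = 5.0
z = truth + 0.5 * rng.standard_normal(200)
# Accept only every fourth measurement, as in a multirate scheme.
accept = (np.arange(200) % 4) == 0
est = kalman_1d(z, accept)
```

Between accepted measurements the state covariance grows under the process noise, so the filter naturally weights the next accepted measurement more heavily, which is what allows different update rates in different parts of the image.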
A Height Estimation Approach for Terrain Following Flights from Monocular Vision.
Campos, Igor S G; Nascimento, Erickson R; Freitas, Gustavo M; Chaimowicz, Luiz
2016-12-06
In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80 % for positives and 90 % for negatives, while the height estimation algorithm presented good accuracy.
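The geometric core of flow-based height estimation can be sketched for the simplest case. The pinhole relation below, for a nadir-looking camera translating over flat ground, is a simplified illustrative model, not the paper's full method (which adds feature tracking and a decision-tree reliability classifier).

```python
def height_from_flow(flow_px_per_s, speed_mps, focal_px):
    """Flying height above flat terrain for a nadir-looking camera.

    For pure horizontal translation at speed v over flat ground, a ground
    feature crosses the image at f*v/h pixels per second, so h = f*v/flow.
    All parameter values are hypothetical.
    """
    return focal_px * speed_mps / flow_px_per_s

# A UAV flying at 5 m/s with an 800 px focal length observes ground
# features streaming past at 100 px/s.
h = height_from_flow(100.0, 5.0, 800.0)  # 40 m
```

Faster apparent feature motion for the same airspeed implies lower altitude, which is the signal a terrain-following controller acts on.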
Short-latency primate vestibuloocular responses during translation
NASA Technical Reports Server (NTRS)
Angelaki, D. E.; McHenry, M. Q.
1999-01-01
Short-lasting, transient head displacements and near target fixation were used to measure the latency and early response gain of vestibularly evoked eye movements during lateral and fore-aft translations in rhesus monkeys. The latency of the horizontal eye movements elicited during lateral motion was 11.9 +/- 5.4 ms. Viewing distance-dependent behavior was seen as early as the beginning of the response profile. For fore-aft motion, latencies were different for forward and backward displacements. Latency averaged 7.1 +/- 9.3 ms during forward motion (same for both eyes) and 12.5 +/- 6.3 ms for the adducting eye (e.g., left eye during right fixation) during backward motion. Latencies during backward motion were significantly longer for the abducting eye (18.9 +/- 9.8 ms). Initial acceleration gains of the two eyes were generally larger than unity but asymmetric. Specifically, gains were consistently larger for abducting than adducting eye movements. The large initial acceleration gains tended to compensate for the response latencies such that the early eye movement response approached, albeit consistently incompletely, that required for maintaining visual acuity during the movement. These short-latency vestibuloocular responses could complement the visually generated optic flow responses that have been shown to exhibit much longer latencies.
Maser mechanism of optical pulsations from anomalous X-ray pulsar 4U 0142+61
NASA Astrophysics Data System (ADS)
Lu, Y.; Zhang, S. N.
2004-11-01
Based on the work of Luo & Melrose from the early 1990s, a maser curvature emission mechanism in the presence of curvature drift is used to explain the optical pulsations from anomalous X-ray pulsars (AXPs). The model comprises a rotating neutron star with a strong surface magnetic field, i.e. a magnetar. Assuming the space-charge-limited flow acceleration mechanism, in which the strongly magnetized neutron star induces strong electric fields that pull the charges from its surface and flow along the open field lines, the neutron star generates a dense flow of electrons and positrons (relativistic pair plasma) by either two-photon pair production or one-photon pair creation resulting from inverse Compton scattering of the thermal photons above the pulsar polar cap (PC). The motion of the pair plasma is essentially one-dimensional along the field lines. We propose that optical pulsations from AXPs are generated by a curvature-drift-induced maser developing in the PC of magnetars. Pair plasma is considered as an active medium that can amplify its normal modes. The curvature drift, which is energy-dependent, is another essential ingredient in allowing negative absorption (maser action) to occur. For the source AXP 4U 0142+61, we find that the optical pulsation triggered by curvature-drift maser radiation occurs at the radial distance R(νM) ~ 4.75 × 10^9 cm from the neutron star. The corresponding curvature maser frequency is about νM ~ 1.39 × 10^14 Hz, and the pulse component from the maser amplification is about 27 per cent. The result is consistent with the observation of the optical pulsations from AXP 4U 0142+61.
Yang, Yang; Saleemi, Imran; Shah, Mubarak
2013-07-01
This paper proposes a novel representation of articulated human actions, gestures, and facial expressions. The main goals of the proposed approach are: 1) to enable recognition using very few examples, i.e., one or k-shot learning, and 2) meaningful organization of unlabeled datasets by unsupervised clustering. Our proposed representation is obtained by automatically discovering high-level subactions or motion primitives, by hierarchical clustering of observed optical flow in four-dimensional, spatial, and motion flow space. The proposed method is completely unsupervised and, in contrast to state-of-the-art representations like bag of video words, provides a meaningful representation conducive to visual interpretation and textual labeling. Each primitive action depicts an atomic subaction, like directional motion of limb or torso, and is represented by a mixture of four-dimensional Gaussian distributions. For one-shot and k-shot learning, the sequence of primitive labels discovered in a test video are labeled using KL divergence, and can then be represented as a string and matched against similar strings of training videos. The same sequence can also be collapsed into a histogram of primitives or be used to learn a Hidden Markov model to represent classes. We have performed extensive experiments on recognition by one and k-shot learning as well as unsupervised action clustering on six human actions and gesture datasets, a composite dataset, and a database of facial expressions. These experiments confirm the validity and discriminative nature of the proposed representation.
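Labeling a test primitive by KL divergence against learned Gaussian primitives uses the closed form for multivariate normals. The 4-D parameters below (position x, y and flow u, v, matching the paper's four-dimensional space) are illustrative.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL divergence D(N0 || N1) between multivariate Gaussians (closed form)."""
    d = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0)
                  + diff @ inv1 @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Two hypothetical 4-D motion primitives: same distribution vs. a primitive
# displaced in x and flow u.
mu_a, cov_a = np.zeros(4), np.eye(4)
mu_b, cov_b = np.array([2.0, 0.0, 1.0, 0.0]), np.eye(4)
```

With identity covariances the divergence reduces to half the squared mean distance, so a test primitive is simply assigned to the nearest learned primitive in that metric.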
3D motion picture of transparent gas flow by parallel phase-shifting digital holography
NASA Astrophysics Data System (ADS)
Awatsuji, Yasuhiro; Fukuda, Takahito; Wang, Yexin; Xia, Peng; Kakue, Takashi; Nishio, Kenzo; Matoba, Osamu
2018-03-01
Parallel phase-shifting digital holography is a technique capable of quantitatively recording a three-dimensional (3D) motion picture of a dynamic object. This technique records a single hologram of an object with an image sensor having a phase-shift array device and reconstructs the instantaneous 3D image of the object with a computer. In this technique, a single hologram is recorded in which the multiple holograms required for phase-shifting digital holography are multiplexed pixel by pixel using a space-division multiplexing technique. We demonstrate a 3D motion picture of a dynamic, transparent gas flow recorded and reconstructed by this technique. A compressed air duster was used to generate the gas flow. A motion picture of the hologram of the gas flow was recorded at 180,000 frames/s by parallel phase-shifting digital holography. The phase motion picture of the gas flow was reconstructed from the motion picture of the hologram. The Abel inversion was applied to the phase motion picture, and the 3D motion picture of the gas flow was thereby obtained.
Yang, Qiang; Zhang, Jie; Nozato, Koji; Saito, Kenichi; Williams, David R.; Roorda, Austin; Rossi, Ethan A.
2014-01-01
Eye motion is a major impediment to the efficient acquisition of high resolution retinal images with the adaptive optics (AO) scanning light ophthalmoscope (AOSLO). Here we demonstrate a solution to this problem by implementing both optical stabilization and digital image registration in an AOSLO. We replaced the slow scanning mirror with a two-axis tip/tilt mirror for the dual functions of slow scanning and optical stabilization. Closed-loop optical stabilization reduced the amplitude of eye-movement related-image motion by a factor of 10–15. The residual RMS error after optical stabilization alone was on the order of the size of foveal cones: ~1.66–2.56 μm or ~0.34–0.53 arcmin with typical fixational eye motion for normal observers. The full implementation, with real-time digital image registration, corrected the residual eye motion after optical stabilization with an accuracy of ~0.20–0.25 μm or ~0.04–0.05 arcmin RMS, which to our knowledge is more accurate than any method previously reported. PMID:25401030
Wang, Ao; Song, Qiang; Ji, Bingqiang; Yao, Qiang
2015-12-01
As a key mechanism of submicron particle capture in wet deposition and wet scrubbing processes, thermophoresis is influenced by the flow and temperature fields. Three-dimensional direct numerical simulations were conducted to quantify the characteristics of the flow and temperature fields around a droplet at three droplet Reynolds numbers (Re) that correspond to three typical boundary-layer-separation flows (steady axisymmetric, steady plane-symmetric, and unsteady plane-symmetric flows). The thermophoretic motion of submicron particles was simulated in these cases. Numerical results show that the motion of submicron particles around the droplet and the deposition distribution exhibit different characteristics under three typical flow forms. The motion patterns of particles are dependent on their initial positions in the upstream and flow forms. The patterns of particle motion and deposition are diversified as Re increases. The particle motion pattern, initial position of captured particles, and capture efficiency change periodically, especially during periodic vortex shedding. The key effects of flow forms on particle motion are the shape and stability of the wake behind the droplet. The drag force of fluid and the thermophoretic force in the wake contribute jointly to the deposition of submicron particles after the boundary-layer separation around a droplet.
Two-component wind fields over ocean waves using atmospheric lidar and motion estimation algorithms
NASA Astrophysics Data System (ADS)
Mayor, S. D.
2016-02-01
Numerical models, such as large eddy simulations, are capable of providing stunning visualizations of the air-sea interface. One reason for this is the inherent spatial nature of such models. As compute power grows, models are able to provide higher resolution visualizations over larger domains revealing intricate details of the interactions of ocean waves and the airflow over them. Spatial observations, on the other hand, which are necessary to validate the simulations, appear to lag behind models. The rough ocean environment of the real world is an additional challenge. One method of providing spatial observations of fluid flow is that of particle image velocimetry (PIV). PIV has been successfully applied to many problems in engineering and the geosciences. This presentation will show recent research results demonstrating that a PIV-style approach using pulsed-fiber atmospheric elastic backscatter lidar hardware and wavelet-based optical flow motion estimation software can reveal two-component wind fields over rough ocean surfaces. Namely, a recently-developed compact lidar was deployed for 10 days in March of 2015 in the Eureka, California area. It scanned over the ocean. The imagery reveals that breaking ocean waves provide copious amounts of particulate matter for the lidar to detect and for the motion estimation algorithms to retrieve wind vectors from. The image below shows two examples of results from the experiment. The left panel shows the elastic backscatter intensity (copper shades) under a field of vectors that was retrieved by the wavelet-based optical flow algorithm from two scans that took about 15 s each to acquire. The vectors, which reveal offshore flow toward the NW, were decimated for clarity. The bright aerosol features along the right edge of the sector scan were caused by ocean waves breaking on the beach.
The right panel is the result of scanning over the ocean on a day when wave amplitudes ranged from 8-12 feet and whitecaps offshore beyond the surf zone appeared to be rare and fleeting. Nonetheless, faint coherent aerosol structures are observable in the backscatter field as long, streaky, wind-parallel filaments and a wind field was retrieved. During the 10-day deployment, the seas were not as rough as expected. A current goal is to find collaborators and return to map airflow in rougher conditions.
Design of a haptic device with grasp and push-pull force feedback for a master-slave surgical robot.
Hu, Zhenkai; Yoon, Chae-Hyun; Park, Samuel Byeongjun; Jo, Yung-Ho
2016-07-01
We propose a portable haptic device providing grasp (kinesthetic) and push-pull (cutaneous) sensations for optical-motion-capture master interfaces. Although optical-motion-capture master interfaces for surgical robot systems can overcome the stiffness, friction, and coupling problems of mechanical master interfaces, it is difficult to add haptic feedback to an optical-motion-capture master interface without constraining the free motion of the operator's hands. Therefore, we utilized a Bowden cable-driven mechanism to provide the grasp and push-pull sensations while retaining the free hand motion of the optical-motion-capture master interface. To evaluate the haptic device, we constructed a 2-DOF force sensing/force feedback system and compared the sensed force with the force reproduced by the haptic device. Finally, a needle insertion test was performed to evaluate the performance of the haptic interface in the master-slave system. The results demonstrate that both the grasp force feedback and the push-pull force feedback provided by the haptic interface closely matched the sensed forces of the slave robot. We successfully applied our haptic interface in the optical-motion-capture master-slave system. The results of the needle insertion test showed that our haptic feedback can provide more safety than visual observation alone. We developed a suitable haptic device to produce both kinesthetic grasp force feedback and cutaneous push-pull force feedback. Our future research will include further objective performance evaluations of the optical-motion-capture master-slave robot system with our haptic interface in surgical scenarios.
Ice flood velocity calculating approach based on single view metrology
NASA Astrophysics Data System (ADS)
Wu, X.; Xu, L.
2017-02-01
Yellow River is the river in which ice floods occur most frequently in China; hence, ice flood forecasting has great significance for flood prevention work on the river. In various ice flood forecast models, the flow velocity is one of the most important parameters. In spite of this significance, the flow velocity is still acquired mainly by manual observation or derived from empirical formulas. In recent years, with the rapid development of video surveillance technology and wireless transmission networks, the Yellow River Conservancy Commission set up an ice situation monitoring system, in which live videos are transmitted to the monitoring center through 3G mobile networks. In this paper, an approach to obtain the ice velocity based on single view metrology and motion tracking, using monitoring videos as input data, is proposed. First, the river surface can be approximated as a plane. Under this condition, we analyze the geometric relation between the object side and the image side, and present the principle for measuring object-side lengths from the image. Second, we use pyramidal LK optical flow to track the ice in motion. Combining the results of camera calibration and single view metrology, we propose a workflow to calculate the real-world velocity of the ice flood. Finally, we implement a prototype system and use it to test the reliability and rationality of the whole solution.
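The plane-based measurement step can be sketched as a homography mapping from pixel coordinates to metric coordinates on the river plane. The homography entries and tracked points below are hypothetical illustrations, not values from a real calibration.

```python
import numpy as np

def image_to_ground(H, pts):
    """Map pixel coordinates to ground-plane metres via a homography H."""
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    world = pts_h @ H.T
    return world[:, :2] / world[:, 2:3]   # divide out the projective scale

# Hypothetical homography from camera calibration (pixels -> metres on the
# river plane); purely illustrative values.
H = np.array([[0.02,   0.001,  -5.0],
              [0.0005, 0.05,  -20.0],
              [0.0,    0.0008,  1.0]])

# An ice floe tracked by LK optical flow across two frames 0.5 s apart.
p0 = np.array([[320.0, 240.0]])
p1 = np.array([[328.0, 236.0]])
dt = 0.5
v = np.linalg.norm(image_to_ground(H, p1) - image_to_ground(H, p0)) / dt
```

Projecting both tracked positions onto the river plane before differencing removes the perspective foreshortening that would otherwise bias a velocity computed from raw pixel displacements.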
Ruhlandt, A; Töpperwien, M; Krenkel, M; Mokso, R; Salditt, T
2017-07-26
We present an approach towards four dimensional (4d) movies of materials, showing dynamic processes within the entire 3d structure. The method is based on tomographic reconstruction on dynamically curved paths using a motion model estimated by optical flow techniques, considerably reducing the typical motion artefacts of dynamic tomography. At the same time we exploit x-ray phase contrast based on free propagation to enhance the signal from micron scale structure recorded with illumination times down to a millisecond (ms). The concept is demonstrated by observing the burning process of a match stick in 4d, using high speed synchrotron phase contrast x-ray tomography recordings. The resulting movies reveal the structural changes of the wood cells during the combustion.
Watching Mobility Engendered by Actin Polymerization
NASA Astrophysics Data System (ADS)
Jee, Ah-Young; Granick, Steve; Tlusty, Tsvi
We have been investigating hydrodynamic flows engendered in molecular systems by active motion. Active directed motion is ubiquitous as a transport mechanism within cells and other systems, sometimes driven by molecular motors moving along cytoskeletal filaments, sometimes by the polymerization and depolymerization of the filaments themselves. To probe this situation, we employed fluorescence correlation spectroscopy (FCS) in the STED (stimulated emission depletion) mode, a super-resolution approach that allows us to investigate molecular mobility averaged over a spectrum of length scales: from areas at or above the optical diffraction limit down to regions as small as 30-40 nm. This comparison of FCS-STED measurements, in which the projected area investigated varies by a factor of >10, reveals a remarkable scale dependence of the inferred mobility.
Polymer stabilization of electrohydrodynamic instability in non-iridescent cholesteric thin films.
Hsiao, Yu-Cheng; Lee, Wei
2015-08-24
A non-iridescent cholesteric liquid crystal (CLC) thin film is demonstrated using the polymer-stabilized electrohydrodynamic (PSEHD) method. The photopolymerized cell made from a CLC/monomer mixture exhibits an optically stable gridlike pattern. The helical axis of the resulting CLC is aligned with the hydrodynamic flow induced by space-charge motion, and the arrayed CLC grid configuration provides a wide viewing angle thanks to the limited color shift at various lines of sight. The formation of the PSEHD structure was verified with polarized optical microscopy, ascertaining that the electrohydrodynamic pattern can be photo-cured, or stabilized. The PSEHD CLC is simple to fabricate and potentially suitable for applications in wide-viewing-angle or non-iridescent devices.
Visualization and quantification of two-phase flow in transparent miniature packed beds
NASA Astrophysics Data System (ADS)
Zhu, Peixi; Papadopoulos, Kyriakos D.
2012-10-01
Optical microscopy was used to visualize the flow of two phases [British Petroleum (BP) oil and an aqueous surfactant phase] in confined space, three-dimensional, transparent, natural porous media. The porous media consisted of water-wet cryolite grains packed inside cylindrical, glass microchannels, thus producing microscopic packed beds. Primary drainage of BP oil displacing an aqueous surfactant phase was studied at capillary numbers that varied between 10⁻⁶ and 10⁻². The confinement space had a significant effect on the flow behavior. Phenomena of burst motion and capillary fingering were observed for low capillary numbers due to the domination of capillary forces. It was discovered that breakthrough time and capillary number bear a log-log scale linear relationship, based on which a generalized correlation between oil travel distance x and time t was found empirically.
A cross correlation PIV technique using electro-optical image separation
NASA Astrophysics Data System (ADS)
Wirth, M.; Baritaud, T. A.
1996-11-01
A new approach to two-dimensional flow-field investigation by PIV has been developed for measurements with high spatial resolution and without the well-known directional ambiguity. This feature is especially important for measurements in flows with reversal regions or strong turbulent motion, such as in-cylinder engine measurements. The major aim of the work was to obtain the benefits of cross-correlation PIV image evaluation at reasonable cost, using the common single-wavelength, double-pulsed laser systems typically employed for PIV experiments. The technique is based on rotating the polarization of the light scattered by the seeding particles by means of a ferroelectric liquid crystal (FLC) half-wave plate. Measurement samples from low-turbulence jets and from the flow in the wake of a cylinder are presented.
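The gain of cross-correlation over the older autocorrelation evaluation is the unambiguous sign of the displacement: correlating the two separated exposures yields a single peak whose offset gives direction as well as magnitude. A minimal FFT-based sketch of evaluating one interrogation-window pair on synthetic data (illustrative only, not the authors' implementation):

```python
import numpy as np

def displacement(win0, win1):
    """Peak of the circular cross-correlation between two interrogation
    windows gives the particle-image displacement, sign included."""
    corr = np.fft.ifft2(np.fft.fft2(win0).conj() * np.fft.fft2(win1)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n = corr.shape[0]
    wrap = lambda d: d - n if d > n // 2 else d   # signed wrap-around
    return wrap(dy), wrap(dx)

rng = np.random.default_rng(4)
w0 = rng.random((32, 32))                 # first exposure (synthetic speckle)
w1 = np.roll(w0, (3, -2), axis=(0, 1))    # second exposure, shifted by (3, -2)
print(displacement(w0, w1))
```

An autocorrelation of a doubly exposed frame would show symmetric peaks at both (3, -2) and (-3, 2); the cross-correlation of separated exposures resolves that ambiguity.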
Mody, Nipa A; King, Michael R
2007-05-22
We used the platelet adhesive dynamics computational method to study the influence of Brownian motion of a platelet on its flow characteristics near a surface in the creeping flow regime. Two important characterizations were made: (1) quantification of the platelet's ability to contact the surface by virtue of the Brownian forces and torques acting on it, and (2) determination of the relative importance of Brownian motion in promoting surface encounters in the presence of shear flow. We determined the Peclet number for a platelet undergoing Brownian motion in shear flow, which could be expressed as a simple linear function of the height H of the platelet centroid above the surface, Pe(platelet) = γ̇(1.56H + 0.66) for H > 0.3 μm, where γ̇ is the shear rate. Our results demonstrate that at timescales relevant to shear flow in blood, Brownian motion plays an insignificant role in influencing platelet motion or creating further opportunities for platelet-surface contact. The platelet Peclet number at shear rates >100 s⁻¹ is large enough (>200) to neglect platelet Brownian motion in computational modeling of flow in arteries and arterioles for most practical purposes, even at very close distances from the surface. We also conducted adhesive dynamics simulations to determine the effects of platelet Brownian motion on GPIbalpha-vWF-A1 single-bond dissociation dynamics. Brownian motion was found to have little effect on bond lifetime and caused minimal bond stressing, as bond rupture forces were calculated to be less than 0.005 pN. We conclude that, for the case of platelet-shaped cells, Brownian motion is not expected to play an important role in influencing flow characteristics, platelet-surface contact frequency, or dissociative binding phenomena under flow at physiological shear rates (>50 s⁻¹).
Flow Mapping Based on the Motion-Integration Errors of Autonomous Underwater Vehicles
NASA Astrophysics Data System (ADS)
Chang, D.; Edwards, C. R.; Zhang, F.
2016-02-01
Knowledge of a flow field is crucial to the navigation of autonomous underwater vehicles (AUVs), since their motion is affected by ambient flow. Because knowledge of the flow field is imperfect, a difference is typically observed between the actual and predicted trajectories of an AUV, referred to as a motion-integration error (or a dead-reckoning error if the AUV navigates by dead reckoning). The motion-integration error has long allowed an underwater glider to compute a flow estimate from the travel information of the last leg and to improve navigation performance by using that estimate for the next leg. However, the estimate by nature exhibits a phase lag relative to the ambient flow experienced by the glider, prohibiting its application in flow fields with strong temporal and spatial gradients. To mitigate this phase problem, we developed a local ocean model that combines the flow estimate based on the motion-integration error with flow predictions from a tidal ocean model. Our model has been used to create desired glider trajectories for guidance, and the method was validated in the Long Bay experiments of 2012 and 2013, in which we deployed multiple gliders on the shelf of the South Atlantic Bight and near the edge of the Gulf Stream. In our recent work, the application of the motion-integration error is further extended to create a spatial flow map. Because the motion-integration errors of AUVs accumulate along their trajectories, the motion-integration error can be formulated as a line integral of the ambient flow, which is then discretized into algebraic equations. By solving the inverse problem for these algebraic equations, we recover the ambient flow in near real time, allowing more effective and precise guidance of AUVs in a dynamic environment. This method is referred to as motion tomography. We provide results of non-parametric and parametric flow mapping from both simulated and experimental data.
Nelson, Jonathan M.; Kinzel, Paul J.; McDonald, Richard R.; Schmeeckle, Mark
2016-01-01
Recently developed optical and videographic methods for measuring water-surface properties in a noninvasive manner hold great promise for extracting river hydraulic and bathymetric information. This paper describes such a technique, concentrating on the method of infrared videography for measuring surface velocities and both acoustic (laboratory-based) and laser-scanning (field-based) techniques for measuring water-surface elevations. In ideal laboratory situations with simple flows, appropriate spatial and temporal averaging results in accurate water-surface elevations and water-surface velocities. In test cases, this accuracy is sufficient to allow direct inversion of the governing equations of motion to produce estimates of depth and discharge. Unlike other optical techniques for determining local depth that rely on transmissivity of the water column (bathymetric lidar, multi/hyperspectral correlation), this method uses only water-surface information, so even deep and/or turbid flows can be investigated. However, significant errors arise in areas of nonhydrostatic spatial accelerations, such as those associated with flow over bedforms or other relatively steep obstacles. Using laboratory measurements for test cases, the cause of these errors is examined and both a simple semi-empirical method and computational results are presented that can potentially reduce bathymetric inversion errors.
NASA Astrophysics Data System (ADS)
Wiederrecht, Gary
2014-03-01
Collective hybrid excitations resulting from the coupling of metal nanostructures with organic molecules present unique opportunities for manipulating light-matter interactions at the nanoscale. In this talk, I discuss recent studies that are examples of the breadth of phenomena that are possible. First, the interactions of coupled plasmonic nanostructures with azobenzene-based polymers are described, in which the spatial features of the plasmonic near-field can be used to manipulate molecular motion. The directional molecular transport that results is shown to be useful for imaging the spatial and polarization features of the optical near-field. The modeling of this effect is described. Second, the coupling of excitonic molecular aggregates to metal nanostructures produces coherent coupling that provides added structure to the optical extinction spectra of metal nanoparticles, thereby providing a photonic handle with which to manipulate energy flow on an ultrafast timescale. Monitoring the rate of energy flow as a function of photon energy reveals important information about the energy dissipation channels and the structural interactions between molecule and metal. Third, the strongly enhanced optical nonlinearity resulting from coupled plasmonic nanorods is described. The closely spaced nanorod material exhibits nonlocality of the optical response that has an unusually strong nonlinear dependence on incident light intensity. Electromagnetic modeling confirms the nonlocal response of the plasmonic metamaterial. The broader impact of collective hybrid excitations on nanophotonics applications is described. Use of the Center for Nanoscale Materials was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences under Contract No. DE-AC02-06CH11357.
Speckle variance optical coherence tomography of blood flow in the beating mouse embryonic heart.
Grishina, Olga A; Wang, Shang; Larina, Irina V
2017-05-01
Efficient separation of blood and cardiac wall in the beating embryonic heart is essential for experiment-based computational modelling and analysis of early-stage cardiac biomechanics. Although speckle variance optical coherence tomography (SV-OCT), which relies on calculating the intensity variance over consecutively acquired frames, is a powerful approach for segmenting fluid flow from static tissue, applying it in the beating embryonic heart remains challenging because moving structures generate SV signal indistinguishable from that of blood. Here, we demonstrate a modified four-dimensional SV-OCT approach that effectively separates the blood flow from the dynamic heart wall in the beating mouse embryonic heart. The method takes advantage of the periodic motion of the cardiac wall and is based on calculating the SV signal over the frames corresponding to the same phase of the heartbeat cycle. Through comparison with Doppler OCT imaging, we validate this speckle-based approach and show its advantages: insensitivity to flow direction and velocity, and reduced influence from heart-wall movement. This approach has potential in a variety of applications relying on visualization and segmentation of blood flow in periodically moving structures, such as mechanical simulation studies and finite-element modelling. Picture: Four-dimensional speckle variance OCT imaging shows the blood flow inside the beating heart of an E8.5 mouse embryo. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
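The phase-gating idea can be sketched directly: instead of computing the variance over consecutive frames, group frames by cardiac phase and compute the per-pixel variance within each group, so periodic wall motion no longer contributes. A minimal synthetic sketch (hypothetical phase values; in the real method the phases come from gating the acquired image series):

```python
import numpy as np

def gated_speckle_variance(frames, phases, n_bins):
    """Phase-gated speckle variance: for each cardiac-phase bin, compute the
    per-pixel intensity variance over only the frames in that bin."""
    frames = np.asarray(frames, dtype=float)          # (n_frames, H, W)
    bins = (np.asarray(phases) * n_bins).astype(int) % n_bins
    sv = np.empty((n_bins,) + frames.shape[1:])
    for b in range(n_bins):
        sv[b] = frames[bins == b].var(axis=0)         # variance across same-phase frames
    return sv

rng = np.random.default_rng(0)
static = np.ones((8, 8))  # tissue at one phase: identical in every such frame
# Even frames: pure static tissue; odd frames: fluctuating (blood-like) signal.
frames = [static + (0.5 * rng.random((8, 8)) if i % 2 else 0.0) for i in range(20)]
phases = [0.6 if i % 2 else 0.1 for i in range(20)]   # hypothetical cardiac phases
sv = gated_speckle_variance(frames, phases, n_bins=2)
```

Within a phase bin the periodically moving wall looks static (zero variance here), while genuinely fluctuating blood signal retains nonzero variance.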
Minimum-variance Brownian motion control of an optically trapped probe.
Huang, Yanan; Zhang, Zhipeng; Menq, Chia-Hsiang
2009-10-20
This paper presents a theoretical and experimental investigation of the Brownian motion control of an optically trapped probe. The Langevin equation is employed to describe the motion of the probe experiencing random thermal force and optical trapping force. Since active feedback control is applied to suppress the probe's Brownian motion, actuator dynamics and measurement delay are included in the equation. The equation of motion is simplified to a first-order linear differential equation and transformed to a discrete model for the purposes of controller design and data analysis. The derived model is experimentally verified by comparing the model prediction to the measured response of a trapped 1.87 μm probe subject to proportional control. It is then employed to design the optimal controller that minimizes the variance of the probe's Brownian motion. Theoretical analysis is derived to evaluate the control performance of a specific optical trap. Both experiment and simulation are used to validate the design and theoretical analysis, and to illustrate the performance envelope of the active control. Moreover, adaptive minimum-variance control is implemented to maintain optimal performance when the system is time-varying, as when operating the actively controlled optical trap in a complex environment.
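The reduction to a first-order discrete model makes the effect of feedback easy to simulate: proportional control stiffens the effective trap and shrinks the probe's positional variance. A toy sketch in arbitrary units (illustrative parameters, not the paper's measured trap constants, and without the actuator-delay terms the paper includes):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(gain, a=0.95, sigma=1.0, n=100_000):
    """First-order discrete trap model x[k+1] = (a - gain)*x[k] + w[k]:
    thermal kicks w[k] drive the probe, proportional feedback (gain)
    increases the effective restoring strength."""
    x, xs = 0.0, np.empty(n)
    w = rng.normal(0.0, sigma, n)
    for k in range(n):
        x = (a - gain) * x + w[k]
        xs[k] = x
    return xs.var()

v_open = simulate(0.0)   # no feedback: variance ~ sigma^2 / (1 - a^2)
v_ctrl = simulate(0.5)   # proportional feedback suppresses the variance
print(v_open, v_ctrl)
```

For this linear model the steady-state variance is sigma^2 / (1 - (a - gain)^2), so increasing the gain toward a drives the variance down toward sigma^2, the floor set by the measurement-to-actuation cycle.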
Richards, Lisa M.; Towle, Erica L.; Fox, Douglas J.; Dunn, Andrew K.
2014-01-01
Abstract. Although multiple intraoperative cerebral blood flow (CBF) monitoring techniques are currently available, a quantitative method that allows for continuous monitoring and that can be easily integrated into the surgical workflow is still needed. Laser speckle contrast imaging (LSCI) is an optical imaging technique with a high spatiotemporal resolution that has been recently demonstrated as feasible and effective for intraoperative monitoring of CBF during neurosurgical procedures. This study demonstrates the impact of retrospective motion correction on the quantitative analysis of intraoperatively acquired LSCI images. LSCI images were acquired through a surgical microscope during brain tumor resection procedures from 10 patients under baseline conditions and after a cortical stimulation in three of those patients. The patient’s electrocardiogram (ECG) was recorded during acquisition for postprocess correction of pulsatile artifacts. Automatic image registration was retrospectively performed to correct for tissue motion artifacts, and the performance of rigid and nonrigid transformations was compared. In baseline cases, the original images had 25%±27% noise across 16 regions of interest (ROIs). ECG filtering moderately reduced the noise to 20%±21%, while image registration resulted in a further noise reduction of 15%±4%. Combined ECG filtering and image registration significantly reduced the noise to 6.2%±2.6% (p<0.05). Using the combined motion correction, accuracy and sensitivity to small changes in CBF were improved in cortical stimulation cases. There was also excellent agreement between rigid and nonrigid registration methods (15/16 ROIs with <3% difference). Results from this study demonstrate the importance of motion correction for improved visualization of CBF changes in clinical LSCI images. PMID:26157974
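For reference, the quantity underlying LSCI is the local speckle contrast K = σ/μ, the ratio of the standard deviation to the mean intensity over a small sliding window; flow blurs the speckle pattern during the exposure and lowers K. A minimal numpy sketch on synthetic data (the DC offset below is only a crude stand-in for the contrast reduction that flow-induced blurring produces):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(img, w=7):
    """Local speckle contrast K = sigma/mu over a w x w sliding window."""
    win = sliding_window_view(np.asarray(img, float), (w, w))
    return win.std(axis=(-2, -1)) / win.mean(axis=(-2, -1))

rng = np.random.default_rng(2)
static = rng.exponential(1.0, (64, 64))   # fully developed speckle: K near 1
blurred = static + 10.0                   # raised mean, same sigma: K drops,
                                          # mimicking flow-blurred speckle
K_static = speckle_contrast(static).mean()
K_blur = speckle_contrast(blurred).mean()
print(K_static, K_blur)
```

The motion-correction steps in the study (ECG filtering, image registration) matter precisely because tissue motion, like flow, reduces K and is otherwise indistinguishable from a perfusion change.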
Non-classical light generated by quantum-noise-driven cavity optomechanics.
Brooks, Daniel W C; Botter, Thierry; Schreppler, Sydney; Purdy, Thomas P; Brahms, Nathan; Stamper-Kurn, Dan M
2012-08-23
Optomechanical systems, in which light drives and is affected by the motion of a massive object, will comprise a new framework for nonlinear quantum optics, with applications ranging from the storage and transduction of quantum information to enhanced detection sensitivity in gravitational wave detectors. However, quantum optical effects in optomechanical systems have remained obscure, because their detection requires the object’s motion to be dominated by vacuum fluctuations in the optical radiation pressure; so far, direct observations have been stymied by technical and thermal noise. Here we report an implementation of cavity optomechanics using ultracold atoms in which the collective atomic motion is dominantly driven by quantum fluctuations in radiation pressure. The back-action of this motion onto the cavity light field produces ponderomotive squeezing. We detect this quantum phenomenon by measuring sub-shot-noise optical squeezing. Furthermore, the system acts as a low-power, high-gain, nonlinear parametric amplifier for optical fluctuations, demonstrating a gain of 20 dB with a pump corresponding to an average of only seven intracavity photons. These findings may pave the way for low-power quantum optical devices, surpassing quantum limits on position and force sensing, and the control and measurement of motion in quantum gases.
Two-phase SLIPI for instantaneous LIF and Mie imaging of transient fuel sprays.
Storch, Michael; Mishra, Yogeshwar Nath; Koegl, Matthias; Kristensson, Elias; Will, Stefan; Zigan, Lars; Berrocal, Edouard
2016-12-01
We report in this Letter a two-pulse structured laser illumination planar imaging (2p-SLIPI) optical setup in which the line structure is spatially shifted by exploiting the birefringence of a calcite crystal. Using this optical component and two cross-polarized laser pulses, the shift of the modulated pattern is no longer time-limited. Consequently, two sub-images with spatially mismatched phases can be recorded within only a few hundred nanoseconds, freezing the motion of the illuminated transient flow. In comparison with previous setups for instantaneous imaging based on structured illumination, the current optical design has the advantage of a single optical path, greatly reducing its complexity. Owing to its ability to suppress the effects of multiple light scattering, the 2p-SLIPI technique is applied here to an optically dense multi-jet direct-injection spark-ignition (DISI) ethanol spray. The fast formation of polydisperse droplets and the appearance of voids after fuel injection are investigated by simultaneous detection of Mie scattering and liquid laser-induced fluorescence. The results allow significantly improved analysis of the spray structure.
Flow dichroism as a reliable method to measure the hydrodynamic aspect ratio of gold nanoparticles.
Reddy, Naveen Krishna; Pérez-Juste, Jorge; Pastoriza-Santos, Isabel; Lang, Peter R; Dhont, Jan K G; Liz-Marzán, Luis M; Vermant, Jan
2011-06-28
Particle shape plays an important role in controlling the optical, magnetic, and mechanical properties of nanoparticle suspensions as well as nanocomposites. However, characterizing the size, shape, and the associated polydispersity of nanoparticles is not straightforward. Electron microscopy provides an accurate measurement of the geometric properties, but sample preparation can be laborious, and to obtain statistically relevant data many particles need to be analyzed separately. Moreover, when the particles are suspended in a fluid, it is important to measure their hydrodynamic properties, as they determine aspects such as diffusion and the rheological behavior of suspensions. Methods that evaluate the dynamics of nanoparticles, such as light scattering and rheo-optical methods, accurately provide these hydrodynamic properties, but they do require a sufficient optical response. In the present work, three different methods for characterizing nonspherical gold nanoparticles are critically compared, especially taking into account the complex optical response of these particles. The different methods are evaluated in terms of their versatility in assessing size, shape, and polydispersity. Among these, the rheo-optical technique is shown to be the most reliable method for obtaining the hydrodynamic aspect ratio and polydispersity of nonspherical gold nanoparticles, for two reasons. First, the use of the evolution of the orientation angle makes effects of polydispersity less important. Second, the use of an external flow field gives a mathematically more robust relation between particle motion and aspect ratio, especially for particles with relatively small aspect ratios.
Apthorp, Deborah; Palmisano, Stephen
2014-01-01
Illusory self-motion (‘vection’) in depth is strongly enhanced when horizontal/vertical simulated viewpoint oscillation is added to optic flow inducing displays; a similar effect is found for simulated viewpoint jitter. The underlying cause of these oscillation and jitter advantages for vection is still unknown. Here we investigate the possibility that perceived speed of motion in depth (MID) plays a role. First, in a 2AFC procedure, we obtained MID speed PSEs for briefly presented (vertically oscillating and smooth) radial flow displays. Then we examined the strength, duration and onset latency of vection induced by oscillating and smooth radial flow displays matched either for simulated or perceived MID speed. The oscillation advantage was eliminated when displays were matched for perceived MID speed. However, when we tested the jitter advantage in the same manner, jittering displays were found to produce greater vection in depth than speed-matched controls. In summary, jitter and oscillation advantages were the same across experiments, but slower MID speed was required to match jittering than oscillating stimuli. Thus, to the extent that vection is driven by perceived speed of MID, this effect is greater for oscillating than for jittering stimuli, which suggests that the two effects may arise from separate mechanisms. PMID:24651861
Estimating nonrigid motion from inconsistent intensity with robust shape features
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Wenyang; Ruan, Dan, E-mail: druan@mednet.ucla.edu; Department of Radiation Oncology, University of California, Los Angeles, California 90095
2013-12-15
Purpose: To develop a nonrigid motion estimation method that is robust to heterogeneous intensity inconsistencies amongst the image pairs or image sequence. Methods: Intensity and contrast variations, as in dynamic contrast enhanced magnetic resonance imaging, present a considerable challenge to registration methods based on general discrepancy metrics. In this study, the authors propose and validate a novel method that is robust to such variations by utilizing shape features. The geometry of interest (GOI) is represented with a flexible zero level set, segmented via well-behaved regularized optimization. The optimization energy drives the zero level set to high image gradient regions, and regularizes it with area and curvature priors. The resulting shape exhibits high consistency even in the presence of intensity or contrast variations. Subsequently, a multiscale nonrigid registration is performed to seek a regular deformation field that minimizes shape discrepancy in the vicinity of GOIs. Results: To establish the working principle, realistic 2D and 3D images were subject to simulated nonrigid motion and synthetic intensity variations, so as to enable quantitative evaluation of registration performance. The proposed method was benchmarked against three alternative registration approaches, specifically, optical flow, B-spline based mutual information, and multimodality demons. When intensity consistency was satisfied, all methods had comparable registration accuracy for the GOIs. When intensities among registration pairs were inconsistent, however, the proposed method yielded pronounced improvement in registration accuracy, with an approximate fivefold reduction in mean absolute error (MAE = 2.25 mm, SD = 0.98 mm), compared to optical flow (MAE = 9.23 mm, SD = 5.36 mm), B-spline based mutual information (MAE = 9.57 mm, SD = 8.74 mm) and multimodality demons (MAE = 10.07 mm, SD = 4.03 mm).
Applying the proposed method on a real MR image sequence also provided qualitatively appealing results, demonstrating good feasibility and applicability of the proposed method. Conclusions: The authors have developed a novel method to estimate the nonrigid motion of GOIs in the presence of spatial intensity and contrast variations, taking advantage of robust shape features. Quantitative analysis and qualitative evaluation demonstrated good promise of the proposed method. Further clinical assessment and validation is being performed.
Scanning two-photon continuous flow lithography for synthesis of high-resolution 3D microparticles.
Shaw, Lucas A; Chizari, Samira; Shusteff, Maxim; Naghsh-Nilchi, Hamed; Di Carlo, Dino; Hopkins, Jonathan B
2018-05-14
Demand continues to rise for custom-fabricated and engineered colloidal microparticles across a breadth of application areas. This paper demonstrates an improvement in the fabrication rate of high-resolution 3D colloidal particles by using two-photon scanning lithography within a microfluidic channel. To accomplish this, we present (1) an experimental setup that supports fast, 3D scanning by synchronizing a galvanometer, piezoelectric stage, and an acousto-optic switch, and (2) a new technique for modifying the laser's scan path to compensate for the relative motion of the rapidly-flowing photopolymer medium. The result is an instrument that allows for rapid conveyor-belt-like fabrication of colloidal objects with arbitrary 3D shapes and micron-resolution features.
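The scan-path compensation can be sketched as follows: under an assumed constant (plug) flow, each laser waypoint is shifted downstream by the distance the photopolymer has advected by the time that waypoint is written, so the pattern stays registered to the moving particle. A simplified constant-velocity sketch (illustrative geometry and flow speed, not the authors' trajectory planner):

```python
import numpy as np

def compensate_path(waypoints_xy, times, v_flow):
    """Shift each laser waypoint downstream by the distance the particle has
    advected by the time that waypoint is written (constant-velocity model)."""
    waypoints_xy = np.asarray(waypoints_xy, float)
    drift = np.outer(np.asarray(times, float), np.asarray(v_flow, float))
    return waypoints_xy + drift

# Square outline in the particle's frame (micrometres), written over 3 ms
# while the channel flows at 1000 um/s in +x.
pts = [[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]]
t = [0.0, 1e-3, 2e-3, 3e-3]              # seconds at which each point is written
lab = compensate_path(pts, t, v_flow=[1000.0, 0.0])
print(lab)
```

In the lab frame the path is sheared downstream, but in the co-moving frame of the flowing particle the intended square is recovered; the real system additionally synchronizes the galvanometer, stage, and acousto-optic switch against the measured flow profile.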
Deblurring for spatial and temporal varying motion with optical computing
NASA Astrophysics Data System (ADS)
Xiao, Xiao; Xue, Dongfeng; Hui, Zhao
2016-05-01
A method to estimate and remove spatially and temporally varying motion blur is proposed, based on an optical computing system. Translation and rotation motion can be estimated independently with a joint transform correlator (JTC) system, without iterative optimization. The inspiration comes from the fact that the JTC system is immune to rotation motion in a Cartesian coordinate system. The JTC system is designed to keep switching between the Cartesian and polar coordinate systems in alternating time intervals, in a ping-pong handover. In the ping interval, the JTC system works in the Cartesian coordinate system to obtain the translation motion vector at optical computing speed. In the pong interval, the JTC system works in the polar coordinate system, where rotation is transformed into translation through the coordinate change, so the rotation motion vector can likewise be obtained from the JTC instantaneously. To deal with continuously spatially variant motion blur, sub-motion vectors based on the projective motion-path blur model are proposed; this sub-motion-vector model is more effective and accurate at modeling spatially variant motion blur than conventional methods. Simulation and real-experiment results demonstrate the overall effectiveness of the approach.
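The coordinate-switching idea rests on a standard fact: resampling an image onto a polar grid turns a rotation about the centre into a circular shift along the angular axis, which a correlator can then estimate like any translation. A nearest-neighbour numpy sketch of this transform (a digital illustration only, not the optical JTC itself):

```python
import numpy as np

def to_polar(img, n_r=64, n_th=360):
    """Nearest-neighbour resampling of an image onto a polar (r, theta) grid
    centred on the image centre; one theta bin per degree."""
    h, w = img.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r = np.linspace(0, min(cy, cx), n_r)
    th = np.linspace(0, 2 * np.pi, n_th, endpoint=False)
    R, TH = np.meshgrid(r, th, indexing="ij")
    ys = np.clip(np.round(cy + R * np.sin(TH)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + R * np.cos(TH)).astype(int), 0, w - 1)
    return img[ys, xs]

rng = np.random.default_rng(3)
img = rng.random((65, 65))
rot = np.rot90(img)                       # rotate 90 degrees about the centre
p0, p1 = to_polar(img), to_polar(rot)
# The rotation shows up as a circular shift along theta; a correlator finds it
# as the peak of the circular cross-correlation over angular shifts.
shift = int(np.argmax([np.sum(p0 * np.roll(p1, s, axis=1))
                       for s in range(360)]))
print(shift)
```

Here the correlation peak lands at the 90-degree bin, i.e. the rotation has been read out as a translation, which is exactly what lets the JTC handle it in its pong interval.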
A simple 5-DoF MR-compatible motion signal measurement system.
Chung, Soon-Cheol; Kim, Hyung-Sik; Yang, Jae-Woong; Lee, Su-Jeong; Choi, Mi-Hyun; Kim, Ji-Hye; Yeon, Hong-Won; Park, Jang-Yeon; Yi, Jeong-Han; Tack, Gye-Rae
2011-09-01
The purpose of this study was to develop a simple motion measurement system with magnetic resonance (MR) compatibility and safety. The motion measurement system proposed here can measure 5-DoF motion signals without deteriorating the MR images, and it has no effect on the intense and homogeneous main magnetic field, the temporal-gradient magnetic field (which varies rapidly with time), the transceiver radio frequency (RF) coil, and the RF pulse during MR data acquisition. A three-axis accelerometer and a two-axis gyroscope were used to measure 5-DoF motion signals, and Velcro was used to attach a sensor module to a finger or wrist. To minimize the interference between the MR imaging system and the motion measurement system, nonmagnetic materials were used for all electric circuit components in an MR shield room. To remove the effect of RF pulse, an amplifier, modulation circuit, and power supply were located in a shielded case, which was made of copper and aluminum. The motion signal was modulated to an optic signal using pulse width modulation, and the modulated optic signal was transmitted outside the MR shield room using a high-intensity light-emitting diode and an optic cable. The motion signal was recorded on a PC by demodulating the transmitted optic signal into an electric signal. Various kinematic variables, such as angle, acceleration, velocity, and jerk, can be measured or calculated by using the motion measurement system developed here. This system also enables motion tracking by extracting the position information from the motion signals. It was verified that MR images and motion signals could reliably be measured simultaneously.
NASA Astrophysics Data System (ADS)
Gurley, Katelyn; Shang, Yu; Yu, Guoqiang
2012-07-01
This study investigates a method using novel hybrid diffuse optical spectroscopies [near-infrared spectroscopy (NIRS) and diffuse correlation spectroscopy (DCS)] to obtain continuous, noninvasive measurement of absolute blood flow (BF), blood oxygenation, and oxygen consumption rate (V̇O2) in exercising skeletal muscle. Healthy subjects (n=9) performed a handgrip exercise to increase BF and V̇O2 in forearm flexor muscles, while a hybrid optical probe on the skin surface directly monitored oxy-, deoxy-, and total hemoglobin concentrations ([HbO2], [Hb], and THC), tissue oxygen saturation (StO2), relative BF (rBF), and relative oxygen consumption rate (rV̇O2). The rBF and rV̇O2 signals were calibrated with absolute baseline BF and V̇O2 obtained through venous and arterial occlusions, respectively. Known problems with muscle-fiber motion artifacts in optical measurements during exercise were mitigated using a novel gating algorithm that determined muscle contraction status based on control signals from a dynamometer. Results were consistent with previous findings in the literature. This study supports the application of NIRS/DCS technology to quantitatively evaluate hemodynamic and metabolic parameters in exercising skeletal muscle and holds promise for improving diagnosis and treatment evaluation for patients suffering from diseases affecting skeletal muscle and advancing fundamental understanding of muscle and exercise physiology.
Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A.
2015-06-15
This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered. In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after introduction of spatial disturbances. The results highlight the importance of accounting for spatial effects in the numerical computations when optical analyses of plasma lenses are pursued in this range of operating conditions.
Magnetic Fields in Blazar Jets: Jet-Alignment of Radio and Optical Polarization over 20-30 Years
NASA Astrophysics Data System (ADS)
Wills, Beverley J.; Aller, M. F.; Caldwell, C.; Aller, H. D.
2012-01-01
Blazars are highly active nuclei of distant galaxies. They produce synchrotron-emitting relativistic jets on scales of less than a parsec to many kpc. When viewed head-on, as opposed to in the plane of the sky, the jet motion appears superluminal, and the emission is Doppler boosted. Blazars show rapid radio and optical variability in flux density and polarization. There are two types of blazars that can have strong synchrotron continua: some quasars with strong broad emission lines, and BL Lac objects with weak or undetected broad lines. We have compiled optical linear polarization measurements of more than 100 blazars, including archival data from McDonald Observatory. While the optical data are somewhat sparsely sampled, the University of Michigan Radio Astronomy Observatory observed many blazars over 20-30 years, often well-sampled over days to weeks, enabling quasi-simultaneous comparison of optical and radio polarization position angles (EVPAs). We also collected data on jet direction -- position angles of the jet component nearest the radio core. The project is unique in examining the polarization and jet behavior over many years. BL Lac objects tend to have stable optically thin EVPA in the jet direction, meaning the magnetic field is perpendicular to the jet flow, often interpreted as magnetic field compressed by shocks. In quasar-blazars the optical and radio EVPA often changes between parallel and perpendicular to the jet direction, even in the same object. The underlying B field of the jet is parallel to the flow, with approximately 90 degree changes resulting from shocks. For both BL Lac objects and quasars, the scatter in EVPA usually increases from low frequencies (4.8 GHz) through 14.5 GHz to optical. The wide optical-radio frequency range allows us to investigate optical depth effects and the spatial origin of radio and optical emission.
Analysis of secondary motions in square duct flow
NASA Astrophysics Data System (ADS)
Modesti, Davide; Pirozzoli, Sergio; Orlandi, Paolo; Grasso, Francesco
2018-04-01
We carry out direct numerical simulations (DNS) of square duct flow spanning the friction Reynolds number range Re_τ* = 150-1055, to study the nature and the role of secondary motions. We preliminarily find that secondary motions are not the mere result of the time-averaging procedure; rather, they are present in the instantaneous flow realizations, corresponding to large eddies persistent in both space and time. Numerical experiments have also been carried out in which the secondary motions are suppressed, allowing us to quantify their effect on the mean flow field. At sufficiently high Reynolds number, secondary motions are found to increase the friction coefficient by about 3%, hence proportionally to their relative strength with respect to the bulk flow. Simulations without secondary motions are found to yield larger deviations of the mean velocity profiles from the standard law of the wall, revealing that secondary motions act as a self-regulating mechanism of turbulence whereby the effect of the corners is mitigated.
Laboratory modeling of multiple zonal jets on the polar beta-plane
NASA Astrophysics Data System (ADS)
Afanasyev, Y.
2011-12-01
Zonal jets observed in the oceans and atmospheres of planets are studied in a laboratory rotating tank. The fluid layer in the rotating tank has a parabolic free surface and dynamically simulates the polar beta-plane, where the Coriolis parameter varies quadratically with distance from the pole. Velocity and surface elevation fields are measured with an optical altimetry method (Afanasyev et al., Exps Fluids 2009). The flows are induced by a localized buoyancy source aligned along the radial direction. The baroclinic flow, consisting of a field of eddies, propagates away from the source due west and forms zonal jets (Fig. 1). Barotropic jets ahead of the baroclinic flow are formed by radiation of beta plumes. Inside the baroclinic flow the jets flow between the chains of eddies. Experimental evidence of so-called noodles (a baroclinic instability mode with motions in the radial, north-south direction), theoretically predicted by Berloff et al. (JFM, JPO 2009), was found in our experiments. The beta-plume radiation mechanism and the mechanism associated with the instability of noodles are both likely to contribute to the formation of jets in the baroclinic flow.
Optically gated beating-heart imaging
Taylor, Jonathan M.
2014-01-01
The constant motion of the beating heart presents an obstacle to clear optical imaging, especially 3D imaging, in small animals where direct optical imaging would otherwise be possible. Gating techniques exploit the periodic motion of the heart to computationally “freeze” this movement and overcome motion artifacts. Optically gated imaging represents a recent development of this approach, in which image analysis is used to synchronize acquisition with the heartbeat in a completely non-invasive manner. This article will explain the concept of optical gating and discuss a range of different implementation strategies and their strengths and weaknesses. Finally, we will illustrate the usefulness of the technique by discussing applications where optical gating has facilitated novel biological findings by allowing 3D in vivo imaging of cardiac myocytes in their natural environment of the beating heart. PMID:25566083
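The matching step at the heart of optical gating — assigning each live frame a cardiac phase by image similarity — can be sketched minimally. This assumes a pre-recorded reference sequence spanning exactly one heartbeat; real prospective-gating systems are considerably more sophisticated:

```python
import numpy as np

def heart_phase(frame, reference_cycle):
    """Assign a cardiac phase in [0, 1) to a live frame by finding the
    most similar frame (sum of absolute differences) in a reference
    sequence spanning one heartbeat. Acquisition can then be triggered
    whenever the phase crosses a chosen target value, 'freezing' the
    heart computationally without any electrical or mechanical probe."""
    sads = [np.abs(frame - ref).sum() for ref in reference_cycle]
    return int(np.argmin(sads)) / len(reference_cycle)
```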
Insect-Inspired Flight Control for Unmanned Aerial Vehicles
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Stange, G.; Srinivasan, M.; Chahl, Javaan; Hine, Butler; Zornetzer, Steven
2005-01-01
Flight-control and navigation systems inspired by the structure and function of the visual system and brain of insects have been proposed for a class of developmental miniature robotic aircraft called "biomorphic flyers", described earlier in "Development of Biomorphic Flyers" (NPO-30554), NASA Tech Briefs, Vol. 28, No. 11 (November 2004), page 54. These form a subset of biomorphic explorers, which, as reported in several articles in past issues of NASA Tech Briefs ["Biomorphic Explorers" (NPO-20142), Vol. 22, No. 9 (September 1998), page 71; "Bio-Inspired Engineering of Exploration Systems" (NPO-21142), Vol. 27, No. 5 (May 2003), page 54; and "Cooperative Lander-Surface/Aerial Microflyer Missions for Mars Exploration" (NPO-30286), Vol. 28, No. 5 (May 2004), page 36], are proposed small robots, equipped with microsensors and communication systems, that would incorporate crucial functions of mobility, adaptability, and even cooperative behavior. These functions are inherent to biological organisms but are challenging frontiers for technical systems. Biomorphic flyers could be used on Earth or remote planets to explore sites that are otherwise difficult or impossible to reach. An example of an exploratory search/surveillance task currently being tested is obtaining high-resolution aerial imagery, using a variety of miniaturized electronic cameras. The control functions to be implemented by the systems in development include holding altitude, avoiding hazards, following terrain, navigating by reference to recognizable terrain features, stabilizing flight, and landing smoothly. Flying insects perform these and other functions remarkably well, even though insect brains contain fewer than 10^-4 as many neurons as does the human brain. Although most insects have immobile, fixed-focus eyes and lack stereoscopy (and hence cannot perceive depth directly), they utilize a number of ingenious strategies for perceiving, and navigating in, three dimensions.
Despite their lack of stereoscopy, insects infer distances to potential obstacles and other objects from image motion cues that result from their own motion through the environment. The motion of texture in images as a source of motion cues is denoted generally as optic (or optical) flow. Computationally, a strategy based on optical flow is simpler than stereoscopy for avoiding hazards and following terrain. Hence, this strategy offers the potential to design vision-based control computing subsystems that would be more compact, weigh less, and demand less power than subsystems of equivalent capability based on a conventional stereoscopic approach.
Computer Controlled Optical Surfacing With Orbital Tool Motion
NASA Astrophysics Data System (ADS)
Jones, Robert A.
1985-11-01
Asymmetric aspheric optical surfaces are very difficult to fabricate using classical techniques with laps the same size as the workpiece. Opticians can produce such surfaces by hand grinding and polishing with small laps moved in an orbital tool motion. However, this is a time-consuming process unsuitable for large optical elements.
1985-04-01
from any angle of approach. An angular impetus is imparted to the particle motion via the eight evenly spaced entrance vanes. As the particles ... measurement cycle. [The remainder of this fragment is figure-label residue from diagrams of the vane assembly (base, insect screen tube, protective housing, aerodynamic flow deflector, tapered outer tube; Figure 3) and the Wedding PM10 inlet.]
NASA Astrophysics Data System (ADS)
Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki
2011-12-01
This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
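The ROI step above checks which parts of the dense flow field are consistent with the robot's ego-motion. A crude stand-in (an assumption for illustration, not the authors' GPU method) is to take the median flow vector as the ego-motion estimate and flag pixels that deviate from it as candidate independently-moving objects:

```python
import numpy as np

def egomotion_outliers(flow, thresh=2.0):
    """Flag pixels whose optical flow deviates from the globally dominant
    (ego-motion) flow. 'flow' is an (H, W, 2) array of per-pixel (dx, dy)
    displacements. The median flow vector stands in for the robot's
    ego-motion estimate; pixels departing from it by more than 'thresh'
    pixels are candidate ROIs (e.g., moving people) to pass on to a
    shape-based classifier."""
    ego = np.median(flow.reshape(-1, 2), axis=0)
    residual = np.linalg.norm(flow - ego, axis=2)
    return residual > thresh
```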
Transonic applications of the Wake Imaging System
NASA Astrophysics Data System (ADS)
Crowder, J. P.
1982-09-01
The extension of a rapid flow field survey method (the wake imaging system), originally developed for low-speed wind tunnel operation, to transonic wind tunnel applications is discussed. The advantage of the system, besides the simplicity and low cost of the data acquisition system, is that the probe position data are recorded as an optical image of the actual sensor and thus are unaffected by the inevitable deflections of the probe support. This permits traversing systems that are deliberately flexible and have unusual motions. Two transverse drive systems are described and several typical data images are given.
Mody, Nipa A.; King, Michael R.
2008-01-01
We used the Platelet Adhesive Dynamics computational method to study the influence of Brownian motion of a platelet on its flow characteristics near a surface in the creeping flow regime. Two important characterizations were done in this regard: (1) quantification of the platelet's ability to contact the surface by virtue of the Brownian forces and torques acting on it, and (2) determination of the relative importance of Brownian motion in promoting surface encounters in the presence of shear flow. We determined the Peclet number for a platelet undergoing Brownian motion in shear flow, which could be expressed as a simple linear function of the height H of the platelet centroid from the surface: Pe(platelet) = γ̇ · (1.56H + 0.66) for H > 0.3 μm. Our results demonstrate that at timescales relevant to shear flow in blood, Brownian motion plays an insignificant role in influencing platelet motion or creating further opportunities for platelet-surface contact. The platelet Peclet number at shear rates > 100 s⁻¹ is large enough (> 200) to neglect platelet Brownian motion in computational modeling of flow in arteries and arterioles for most practical purposes, even at very close distances from the surface. We also conducted adhesive dynamics simulations to determine the effects of platelet Brownian motion on GPIbα-vWF-A1 single-bond dissociation dynamics. Brownian motion was found to have little effect on bond lifetime and caused minimal bond stressing, as bond rupture forces were calculated to be less than 0.005 pN. We conclude from our results that for the case of platelet-shaped cells, Brownian motion is not expected to play an important role in influencing flow characteristics, platelet-surface contact frequency, and dissociative binding phenomena under flow at physiological shear rates (> 50 s⁻¹). PMID:17417890
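The reported linear fit turns the "can I neglect Brownian motion?" question into a one-line check. A sketch assuming the abstract's fit Pe = γ̇·(1.56H + 0.66), with H in μm, γ̇ in s⁻¹, and the Pe > ~200 heuristic; the function names are illustrative:

```python
def platelet_peclet(shear_rate, height_um):
    """Peclet number for a platelet in shear flow, using the linear fit
    reported above: Pe = gamma_dot * (1.56*H + 0.66), stated only for
    centroid heights H > 0.3 micrometers."""
    if height_um <= 0.3:
        raise ValueError("fit reported only for H > 0.3 um")
    return shear_rate * (1.56 * height_um + 0.66)

def brownian_negligible(shear_rate, height_um, pe_threshold=200.0):
    """Heuristic from the abstract: Pe above ~200 (shear rates above
    ~100 1/s) lets one neglect platelet Brownian motion in flow models."""
    return platelet_peclet(shear_rate, height_um) > pe_threshold
```

For example, at γ̇ = 100 s⁻¹ and H = 1 μm, Pe = 222, comfortably in the advection-dominated regime.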
Binocular Interactions Underlying the Classic Optomotor Responses of Flying Flies
Duistermars, Brian J.; Care, Rachel A.; Frye, Mark A.
2012-01-01
In response to imposed course deviations, the optomotor reactions of animals reduce motion blur and facilitate the maintenance of stable body posture. In flies, many anatomical and electrophysiological studies suggest that disparate motion cues stimulating the left and right eyes are not processed in isolation but rather are integrated in the brain to produce a cohesive panoramic percept. To investigate the strength of such inter-ocular interactions and their role in compensatory sensory–motor transformations, we utilize a virtual reality flight simulator to record wing and head optomotor reactions by tethered flying flies in response to imposed binocular rotation and monocular front-to-back and back-to-front motion. Within a narrow range of stimulus parameters that generates large contrast insensitive optomotor responses to binocular rotation, we find that responses to monocular front-to-back motion are larger than those to panoramic rotation, but are contrast sensitive. Conversely, responses to monocular back-to-front motion are slower than those to rotation and peak at the lowest tested contrast. Together our results suggest that optomotor responses to binocular rotation result from the influence of non-additive contralateral inhibitory as well as excitatory circuit interactions that serve to confer contrast insensitivity to flight behaviors influenced by rotatory optic flow. PMID:22375108
Min, Yugang; Santhanam, Anand; Neelakkantan, Harini; Ruddy, Bari H; Meeks, Sanford L; Kupelian, Patrick A
2010-09-07
In this paper, we present a graphics processing unit (GPU)-based simulation framework to calculate the delivered dose to a 3D moving lung tumor and its surrounding normal tissues, which are undergoing subject-specific lung deformations. The GPU-based simulation framework models the motion of the 3D volumetric lung tumor and its surrounding tissues, simulates the dose delivery using the dose extracted from a treatment plan generated with the Pinnacle Treatment Planning System (Philips) for one of the 3DCT phases of the 4DCT, and predicts the amount and location of radiation doses deposited inside the lung. The 4DCT lung datasets were registered with each other using a modified optical flow algorithm. The motion of the tumor and the motion of the surrounding tissues were simulated by measuring the changes in lung volume during the radiotherapy treatment using spirometry. The real-time dose delivered to the tumor for each beam is generated by summing the dose delivered to the target volume at each increase in lung volume during the beam delivery time period. The simulation results showed the real-time capability of the framework at 20 discrete tumor motion steps per breath, which is higher than the number of 4DCT steps (approximately 12) reconstructed during multiple breathing cycles.
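The dose-summation step — accumulating dose over discrete motion states weighted by beam-on dwell time — can be sketched in a few lines. This is a strong simplification (it ignores the deformable registration and GPU machinery described above), with illustrative names:

```python
import numpy as np

def accumulated_dose(dose_per_state, dwell_fractions):
    """Total dose on a voxel grid delivered across discrete tumor-motion
    states: each state contributes its planned dose field weighted by
    the fraction of beam-on time spent in that state. 'dose_per_state'
    is an (S, ...) stack of per-state dose grids; 'dwell_fractions' are
    S nonnegative dwell weights (normalized internally)."""
    dose_per_state = np.asarray(dose_per_state, float)
    w = np.asarray(dwell_fractions, float)
    w = w / w.sum()
    # Contract the state axis: sum_s w[s] * dose_per_state[s]
    return np.tensordot(w, dose_per_state, axes=1)
```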
Pitch body orientation influences the perception of self-motion direction induced by optic flow.
Bourrelly, A; Vercher, J-L; Bringoux, L
2010-10-04
We studied the effect of static pitch body tilts on the perception of self-motion direction induced by a visual stimulus. Subjects were seated in front of a screen on which was projected a 3D cluster of moving dots visually simulating a forward motion of the observer with upward or downward directional biases (relative to a true earth-horizontal direction). The subjects were tilted at various angles relative to gravity and were asked to estimate the direction of the perceived motion (nose-up, as during take-off, or nose-down, as during landing). The data showed that body orientation proportionally affected the amount of error in the reported perceived direction (by 40% of body tilt magnitude in a range of +/-20 degrees), and these errors were systematically in the direction of body tilt. As a consequence, the same visual stimulus was interpreted differently depending on body orientation. While the subjects were required to perform the task in a geocentric reference frame (i.e., relative to a gravity-related direction), they were clearly influenced by egocentric references. These results suggest that the perception of self-motion is not elaborated within an exclusive reference frame (either egocentric or geocentric) but rather results from the combined influence of both.
Spatio-Temporal Dynamics of Impulse Responses to Figure Motion in Optic Flow Neurons
Lee, Yu-Jen; Jönsson, H. Olof; Nordström, Karin
2015-01-01
White noise techniques have been used widely to investigate sensory systems in both vertebrates and invertebrates. White noise stimuli are powerful in their ability to rapidly generate data that help the experimenter decipher the spatio-temporal dynamics of neural and behavioral responses. One type of white noise stimuli, maximal length shift register sequences (m-sequences), have recently become particularly popular for extracting response kernels in insect motion vision. We here use such m-sequences to extract the impulse responses to figure motion in hoverfly lobula plate tangential cells (LPTCs). Figure motion is behaviorally important and many visually guided animals orient towards salient features in the surround. We show that LPTCs respond robustly to figure motion in the receptive field. The impulse response is scaled down in amplitude when the figure size is reduced, but its time course remains unaltered. However, a low contrast stimulus generates a slower response with a significantly longer time-to-peak and half-width. Impulse responses in females have a slower time-to-peak than males, but are otherwise similar. Finally we show that the shapes of the impulse response to a figure and a widefield stimulus are very similar, suggesting that the figure response could be coded by the same input as the widefield response. PMID:25955416
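M-sequence kernel extraction rests on the near-delta autocorrelation of maximal-length sequences: cross-correlating the recorded response with the ±1-coded stimulus approximately recovers the first-order kernel. A compact sketch (the tap choice, sequence length, and function names are illustrative, not the authors' stimulus parameters):

```python
import numpy as np

def msequence(n=7, taps=(7, 6)):
    """Binary maximal-length shift-register sequence of period 2**n - 1,
    generated by a Fibonacci LFSR with XOR feedback from the given
    (1-indexed) tap positions; (7, 6) corresponds to the primitive
    polynomial x^7 + x^6 + 1."""
    state = [1] * n
    out = []
    for _ in range(2 ** n - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(out)

def first_order_kernel(stimulus01, response):
    """Impulse-response estimate by circular cross-correlation of the
    response with the ±1-coded m-sequence stimulus; the sequence's
    nearly flat power spectrum makes this approximately a deconvolution."""
    s = 2.0 * stimulus01 - 1.0
    n = len(s)
    return np.array([np.dot(np.roll(s, k), response) for k in range(n)]) / n
```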
Age Differences in Visual-Auditory Self-Motion Perception during a Simulated Driving Task
Ramkhalawansingh, Robert; Keshavarz, Behrang; Haycock, Bruce; Shahab, Saba; Campos, Jennifer L.
2016-01-01
Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion. PMID:27199829
A new dimension in retrograde flow: centripetal movement of engulfed particles.
Caspi, A; Yeger, O; Grosheva, I; Bershadsky, A D; Elbaum, M
2001-01-01
Centripetal motion of surface-adherent particles is a classic experimental system for studying surface dynamics on a eukaryotic cell. To investigate bead migration over the entire cell surface, we have developed an experimental assay using multinuclear giant fibroblasts, which provide expanded length scales and an unambiguous frame of reference. Beads coated with the adhesion ligands concanavalin A or fibronectin are placed in specific locations on the cell using optical tweezers, and their subsequent motion is tracked over time. The adhesion of the beads, as well as the velocity and directionality of their movement, exposes distinct regions of the cytoplasm and membrane. Beads placed on the peripheral lamella initiate centripetal motion, whereas beads placed on the central part of the cell attach to a stationary cortex and do not move. Careful examination by complementary three-dimensional methods shows that the motion of a bead placed on the cell periphery takes place after engulfment into the cytoplasm, whereas stationary beads, placed near the cell center, are not engulfed. These results demonstrate that centripetal motion of adhering particles may occur inside as well as outside the cell. Inhibition of actomyosin activity is used to explore requirements for engulfment and aspects of the bead movement. Centripetal movement of adherent particles seems to depend on mechanisms distinct from those driving overall cell contractility. PMID:11566772
Vision and air flow combine to streamline flying honeybees
Taylor, Gavin J.; Luu, Tien; Ball, David; Srinivasan, Mandyam V.
2013-01-01
Insects face the challenge of integrating multi-sensory information to control their flight. Here we study a 'streamlining' response in honeybees, whereby honeybees raise their abdomen to reduce drag. We find that this response, which was recently reported to be mediated by optic flow, is also strongly modulated by the presence of air flow simulating a head wind. The Johnston's organs in the antennae were found to play a role in the measurement of the air speed that is used to control the streamlining response. The response to a combination of visual motion and wind is complex and can be explained by a model that incorporates a non-linear combination of the two stimuli. The use of visual and mechanosensory cues increases the strength of the streamlining response when the stimuli are present concurrently. We propose that this multisensory integration will make the response more robust to transient disturbances in either modality. PMID:24019053
The temporal dynamics of heading perception in the presence of moving objects
Fajen, Brett R.
2015-01-01
Many forms of locomotion rely on the ability to accurately perceive one's direction of locomotion (i.e., heading) based on optic flow. Although accurate in rigid environments, heading judgments may be biased when independently moving objects are present. The aim of this study was to systematically investigate the conditions in which moving objects influence heading perception, with a focus on the temporal dynamics and the mechanisms underlying this bias. Subjects viewed stimuli simulating linear self-motion in the presence of a moving object and judged their direction of heading. Experiments 1 and 2 revealed that heading perception is biased when the object crosses or almost crosses the observer's future path toward the end of the trial, but not when the object crosses earlier in the trial. Nonetheless, heading perception is not based entirely on the instantaneous optic flow toward the end of the trial. This was demonstrated in Experiment 3 by varying the portion of the earlier part of the trial leading up to the last frame that was presented to subjects. When the stimulus duration was long enough to include the part of the trial before the moving object crossed the observer's path, heading judgments were less biased. The findings suggest that heading perception is affected by the temporal evolution of optic flow. The time course of dorsal medial superior temporal area (MSTd) neuron responses may play a crucial role in perceiving heading in the presence of moving objects, a property not captured by many existing models. PMID:26510765
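Heading from radial optic flow is classically recovered as the focus of expansion (FOE): under pure observer translation, every flow vector lies on a line through the FOE. This is not the MSTd population model the abstract discusses, just the standard geometric construction, sketched here as a least-squares fit:

```python
import numpy as np

def focus_of_expansion(points, flows):
    """Least-squares focus of expansion (heading point) from image
    points and their optic-flow vectors. For pure translation, flow at
    point (x, y) is parallel to (x - fx, y - fy), giving one linear
    constraint per vector:  vy*fx - vx*fy = x*vy - y*vx."""
    points = np.asarray(points, float)
    flows = np.asarray(flows, float)
    A = np.column_stack([flows[:, 1], -flows[:, 0]])
    b = points[:, 0] * flows[:, 1] - points[:, 1] * flows[:, 0]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe
```

A moving object injects flow vectors that violate the single-FOE assumption, which is one way to see why it can bias heading estimates of this kind.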
Automated content and quality assessment of full-motion-video for the generation of meta data
NASA Astrophysics Data System (ADS)
Harguess, Josh
2015-05-01
Virtually all of the video data (and full-motion-video (FMV)) that is currently collected and stored in support of missions has been corrupted to various extents by image acquisition and compression artifacts. Additionally, video collected by wide-area motion imagery (WAMI) surveillance systems and unmanned aerial vehicles (UAVs) and similar sources is often of low quality or in other ways corrupted so that it is not worth storing or analyzing. In order to make progress in automatic video analysis, the first question that should be answered is whether the content of the video is worth analyzing at all. We present a work in progress to address three types of scenes which are typically found in real-world data stored in support of Department of Defense (DoD) missions: no or very little motion in the scene, large occlusions in the scene, and fast camera motion. Each of these produces video that is generally not usable to an analyst or automated algorithm for mission support and therefore should be removed or flagged to the user as such. We utilize recent computer vision advances in motion detection and optical flow to automatically assess FMV for the identification and generation of meta-data (or tagging) of video segments which exhibit unwanted scenarios as described above. Results are shown on representative real-world video data.
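As an illustrative sketch of the flagging step described above (not the authors' implementation), the example below scores frame-to-frame change and tags each segment as static, usable, or fast. The mean absolute frame difference stands in for an optical-flow magnitude, and the thresholds are arbitrary placeholders.

```python
import numpy as np

def motion_score(prev_frame, curr_frame):
    """Mean absolute frame difference: a cheap stand-in for the optical-flow
    magnitude used in the paper, sufficient to rank scene activity."""
    return float(np.mean(np.abs(curr_frame.astype(float) - prev_frame.astype(float))))

def tag_frame_pairs(frames, low=1.0, high=40.0):
    """Tag each consecutive frame pair as 'static', 'usable', or 'fast'.
    The thresholds are arbitrary placeholders, not values from the paper."""
    tags = []
    for prev_frame, curr_frame in zip(frames, frames[1:]):
        score = motion_score(prev_frame, curr_frame)
        if score < low:
            tags.append("static")   # little or no motion in the scene
        elif score > high:
            tags.append("fast")     # fast camera motion or a large scene change
        else:
            tags.append("usable")
    return tags
```

A real system would replace the frame difference with dense optical flow and add an occlusion detector, but the tagging logic would keep this shape.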
Eye Movements in Darkness Modulate Self-Motion Perception.
Clemens, Ivar Adrianus H; Selen, Luc P J; Pomante, Antonella; MacNeilage, Paul R; Medendorp, W Pieter
2017-01-01
During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first (n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment (n = 10), gaze was free during both translation intervals. Results show that the translation with the larger eye-movement excursion was judged more often to be larger than chance, based on an oculomotor choice probability analysis. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation. PMID:28144623
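The ∼25% eye-movement weight reported above lends itself to a one-line linear cue-combination sketch. The weighted-sum form and the units here are illustrative assumptions, not the authors' fitted model.

```python
def perceived_translation(vestibular_cm, eye_excursion_cm, w_eye=0.25):
    """Linear combination of vestibular and eye-movement displacement cues.

    w_eye ~ 0.25 echoes the weight reported in the abstract; the choice of
    a simple weighted sum and the centimeter units are illustrative.
    """
    return w_eye * eye_excursion_cm + (1.0 - w_eye) * vestibular_cm
```

With a body-fixed fixation point the eyes stay still (excursion 0), so the predicted percept shrinks relative to a world-fixed point where the eye excursion matches the translation, consistent with the reported result.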
Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex
2016-06-15
The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
Diffusion in translucent media.
Shi, Zhou; Genack, Azriel Z
2018-05-10
Diffusion is the result of repeated random scattering. It governs a wide range of phenomena from Brownian motion, to heat flow through window panes, neutron flux in fuel rods, dispersion of light in human tissue, and electronic conduction. It is universally acknowledged that the diffusion approach to describing wave transport fails in translucent samples thinner than the distance between scattering events such as are encountered in meteorology, astronomy, biomedicine, and communications. Here we show in optical measurements and numerical simulations that the scaling of transmission and the intensity profiles of transmission eigenchannels have the same form in translucent as in opaque media. Paradoxically, the similarities in transport across translucent and opaque samples explain the puzzling observations of suppressed optical and ultrasonic delay times relative to predictions of diffusion theory well into the diffusive regime.
NASA Astrophysics Data System (ADS)
Davis, Anjul M.; Rothenberg, Florence G.; Law, Tzuo H.; Taber, Larry A.; Izatt, Joseph A.
2007-02-01
The onset of congenital heart disease (CHD) is believed to occur at very early stages of development. Investigation into the initiation and development of CHD has been hampered by the inability to image early-stage heart structure and function in vivo. Imaging small animals using optical coherence tomography (OCT) has filled a niche between the limited penetration depth of confocal microscopy and the insufficient resolution of ultrasound. Previous demonstrations of chick heart imaging using OCT have entailed excising or arresting the heart to prevent motion artifacts. In this summary, we introduce SDOCT Doppler velocimetry as an enhancement of Doppler OCT for in vivo measurement of localized temporal blood flow dynamics. With this technique, dynamic velocity waveforms were measured in the outflow tract of the heart tube. These flow dynamics correlate with a finite element model of pulsatile flow and may lead to a further understanding of morphological influences on early heart development.
Motion compensation for in vivo subcellular optical microscopy.
Lucotte, B; Balaban, R S
2014-04-01
In this review, we focus on the impact of tissue motion on attempts to conduct subcellular-resolution optical microscopy in vivo. Our position is that tissue motion is one of the major barriers to these studies, along with light-induced damage, optical probe loading, and absorbing and scattering effects on the excitation point spread function and the collection of emitted light. Recent developments in the speed of image acquisition have, in most cases, reached the limit where the signal from a subcellular voxel, not the scanning rate of the microscope, limits the speed. Different schemes for compensating for tissue displacements due to rigid-body motion and deformation are presented, ranging from tissue restriction and gating to adaptive gating and active tissue tracking. We argue that methods that minimally impact the natural physiological motion of the tissue are desirable, because the major reason to perform in vivo studies is to evaluate normal physiological functions. Toward this goal, active tracking that uses the optical imaging data itself to monitor tissue displacement, and that either prospectively or retrospectively corrects for the motion without affecting physiological processes, is desirable. Critical for this development was the implementation of near-real-time image processing in conjunction with control of the microscope imaging parameters. Clearly, the continuing development of motion-compensation methods, as well as significant technological solutions to the other barriers to subcellular optical imaging of tissue in vivo, including optical aberrations and overall signal-to-noise ratio, will make major contributions to the understanding of cell biology within the body.
Imaging of optically diffusive media by use of opto-elastography
NASA Astrophysics Data System (ADS)
Bossy, Emmanuel; Funke, Arik R.; Daoudi, Khalid; Tanter, Mickael; Fink, Mathias; Boccara, Claude
2007-02-01
We present a camera-based optical detection scheme designed to detect the transient motion created by the acoustic radiation force in elastic media. An optically diffusive tissue mimicking phantom was illuminated with coherent laser light, and a high speed camera (2 kHz frame rate) was used to acquire and cross-correlate consecutive speckle patterns. Time-resolved transient decorrelations of the optical speckle were measured as the results of localised motion induced in the medium by the radiation force and subsequent propagating shear waves. As opposed to classical acousto-optic techniques which are sensitive to vibrations induced by compressional waves at ultrasonic frequencies, the proposed technique is sensitive only to the low frequency transient motion induced in the medium by the radiation force. It therefore provides a way to assess both optical and shear mechanical properties.
Characteristics of Atmospheric Pressure Rotating Gliding Arc Plasmas
NASA Astrophysics Data System (ADS)
Zhang, Hao; Zhu, Fengsen; Tu, Xin; Bo, Zheng; Cen, Kefa; Li, Xiaodong
2016-05-01
In this work, a novel direct current (DC) atmospheric pressure rotating gliding arc (RGA) plasma reactor has been developed for plasma-assisted chemical reactions. The influence of the gas composition and the gas flow rate on the arc dynamic behaviour and the formation of reactive species in the N2 and air gliding arc plasmas has been investigated by means of electrical signals, high-speed photography, and optical emission spectroscopic diagnostics. Compared to conventional gliding arc reactors with knife-shaped electrodes, which generally require a high flow rate (e.g., 10-20 L/min) to maintain a long arc length and a reasonable plasma discharge zone, in this RGA system a lower gas flow rate (e.g., 2 L/min) can also generate a larger effective plasma reaction zone with a longer arc length for chemical reactions. Two different motion patterns can be clearly observed in the N2 and air RGA plasmas. The time-resolved arc voltage signals show that three different arc dynamic modes, the arc restrike mode, the takeover mode, and the combined mode, can be clearly identified in the RGA plasmas. The occurrence of the different motion and arc dynamic modes is strongly dependent on the composition of the working gas and the gas flow rate. This work was supported by National Natural Science Foundation of China (No. 51576174), the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 20120101110099) and the Fundamental Research Funds for the Central Universities (No. 2015FZA4011)
A Method for Calculating the Amount of Movements to Estimate the Self-position of Manta Robots
NASA Astrophysics Data System (ADS)
Imahama, Takuya; Watanabe, Keigo; Mikuriya, Kota; Nagai, Isaku
2018-02-01
In recent years, demand for underwater investigation has been increasing, for example around dams and for environmental surveys of shallow waters that ships cannot easily approach. Underwater work is largely hazardous for humans, however, and prolonged diving harms the body, so the development of underwater exploration robots that can carry out such investigations in place of human divers is expected. Among underwater exploration robots, those imitating aquatic organisms are known to have little influence on the underwater environment. Therefore, our laboratory developed a Manta robot propelled by pectoral-fin mechanisms that imitate the pectoral fins of a manta ray. Underwater environmental research requires a self-position estimation capability, which this Manta robot lacks. This paper explains the estimation of the amount of movement using optical flow. In particular, a gimbal mechanism is introduced to reduce the influence of the Manta robot's pitch motion on the optical flow calculation. Several experiments are conducted to demonstrate the usefulness of the proposed method.
Egomotion estimation with optic flow and air velocity sensors.
Rutkowski, Adam J; Miller, Mikel M; Quinn, Roger D; Willis, Mark A
2011-06-01
We develop a method that allows a flyer to estimate its own motion (egomotion), the wind velocity, ground slope, and flight height using only inputs from onboard optic flow and air velocity sensors. Our artificial algorithm demonstrates how it could be possible for flying insects to determine their absolute egomotion using their available sensors, namely their eyes and wind sensitive hairs and antennae. Although many behaviors can be performed by only knowing the direction of travel, behavioral experiments indicate that odor tracking insects are able to estimate the wind direction and control their absolute egomotion (i.e., groundspeed). The egomotion estimation method that we have developed, which we call the opto-aeronautic algorithm, is tested in a variety of wind and ground slope conditions using a video recorded flight of a moth tracking a pheromone plume. Over all test cases that we examined, the algorithm achieved a mean absolute error in height of 7% or less. Furthermore, our algorithm is suitable for the navigation of aerial vehicles in environments where signals from the Global Positioning System are unavailable.
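The core geometric relations behind such an algorithm can be sketched as follows. This is a simplified stand-in for the opto-aeronautic algorithm, assuming level flight over flat ground and a sign convention chosen here for illustration.

```python
def groundspeed_from_flow(flow_rate_rad_s, height_m):
    """For pure translation over flat ground, the ventral optic flow rate
    is omega = v / h, so groundspeed follows from flow rate and height."""
    return flow_rate_rad_s * height_m

def headwind(airspeed_m_s, groundspeed_m_s):
    """Along-track wind component sensed as the gap between airspeed and
    groundspeed. Positive means a head wind (a convention assumed here,
    not taken from the paper)."""
    return airspeed_m_s - groundspeed_m_s
```

In the full algorithm, height, ground slope, and wind are estimated jointly over time; the sketch only shows how the optic flow and air velocity sensor readings constrain each other at a single instant.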
A Height Estimation Approach for Terrain Following Flights from Monocular Vision
Campos, Igor S. G.; Nascimento, Erickson R.; Freitas, Gustavo M.; Chaimowicz, Luiz
2016-01-01
In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm presented good accuracy. PMID:27929424
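Inverting the flat-ground flow relation gives a minimal height estimator with a crude reliability flag. The fixed flow threshold below is a purely illustrative stand-in for the trained decision tree described in the abstract.

```python
def estimate_height(groundspeed_m_s, flow_rate_rad_s, min_flow_rad_s=1e-3):
    """h = v / omega for a downward-looking camera over locally flat terrain.

    Returns (height_m, reliable). Tiny flow rates make the division
    ill-conditioned, so they are flagged as unreliable; the threshold is
    an arbitrary placeholder for the paper's learned classifier.
    """
    if abs(flow_rate_rad_s) < min_flow_rad_s:
        return float("inf"), False
    return groundspeed_m_s / flow_rate_rad_s, True
```

The paper's classifier instead learns from optical-flow features when the estimate is trustworthy, which handles failure modes (texture-poor terrain, tracking loss) that a single threshold cannot.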
NASA Technical Reports Server (NTRS)
Title, A. M.; Tarbell, T. D.; Acton, L; Duncan, D.; Simon, G. W.
1986-01-01
Initial results are presented on solar granulation, pores and sunspots from the white-light films obtained by the Solar Optical Universal Polarimeter (SOUP) instrument in Spacelab 2. Several hours of movies were taken at various disk and limb positions in quiet and active regions. The images are diffraction-limited at 0.5 arcsec resolution and are, of course, free of atmospheric seeing and distortion. Properties of the granulation in magnetic and nonmagnetic regions are compared and are found to differ significantly in size, rate of intensity variation, and lifetime. In the quiet sun, on the order of fifty percent of the area has at least one 'exploding granule' occurring in it during a 25-min period. Local correlation tracking has detected several types of transverse flows, including systematic outflow from the penumbral boundary of a spot, motion of penumbral filaments, and cellular flow patterns of supergranular and mesogranular size. Feature tracking has shown that, in the quiet sun, the average granule fragment has a velocity of about one kilometer per second.
Hydrodynamic interaction of two deformable drops in confined shear flow.
Chen, Yongping; Wang, Chengyao
2014-09-01
We investigate hydrodynamic interaction between two neutrally buoyant circular drops in a confined shear flow based on a computational fluid dynamics simulation using the volume-of-fluid method. The rheological behaviors of interactive drops and the flow regimes are explored with a focus on elucidation of underlying physical mechanisms. We find that two types of drop behaviors during interaction occur, including passing-over motion and reversing motion, which are governed by the competition between the drag of passing flow and the entrainment of reversing flow in matrix fluid. With the increasing confinement, the drop behavior transits from the passing-over motion to reversing motion, because the entrainment of the reversing-flow matrix fluid turns to play the dominant role. The drag of the ambient passing flow is increased by enlarging the initial lateral separation due to the departure of the drop from the reversing flow in matrix fluid, resulting in the emergence of passing-over motion. In particular, a corresponding phase diagram is plotted to quantitatively illustrate the dependence of drop morphologies during interaction on confinement and initial lateral separation.
The Prominent Role of the Upstream Conditions on the Large-scale Motions of a Turbulent Channel Flow
NASA Astrophysics Data System (ADS)
Castillo, Luciano; Dharmarathne, Suranga; Tutkun, Murat; Hutchins, Nicholas
2017-11-01
In this study we investigate how upstream perturbations in a turbulent channel flow impact the downstream flow evolution, especially the large-scale motions. Direct numerical simulations were carried out at a friction Reynolds number Reτ = 394. Spanwise-varying inlet blowing perturbations were imposed at 1 πh from the inlet. The flow field is decomposed into its constituent scales using proper orthogonal decomposition. The large-scale motions and the small-scale motions of the flow field are separated at a cut-off mode number, Mc, defined as the mode number at which the fraction of energy recovered is 55%. It is found that the Reynolds stresses are increased by the blowing perturbations and that large-scale motions are responsible for more than 70% of the increase in the streamwise component of the Reynolds normal stress. Surprisingly, 90% of the Reynolds shear stress is due to the energy augmentation of large-scale motions. It is shown that inlet perturbations impact the downstream flow by means of the large-scale motions.
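The 55% cut-off mode number can be computed directly from the singular values of a snapshot matrix; squared singular values are the POD mode energies. A minimal sketch, with the snapshot layout (points by snapshots) assumed for illustration:

```python
import numpy as np

def cutoff_mode(snapshots, energy_fraction=0.55):
    """Smallest number of POD modes whose cumulative energy reaches the
    requested fraction.

    snapshots: (n_points, n_snapshots) array of fluctuation data.
    """
    s = np.linalg.svd(np.asarray(snapshots, dtype=float), compute_uv=False)
    energy = np.cumsum(s**2) / np.sum(s**2)        # cumulative energy fraction
    return int(np.searchsorted(energy, energy_fraction) + 1)
```

Filtering the flow into large and small scales then amounts to reconstructing with modes up to and beyond this cut-off, respectively.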
Motion perception: behavior and neural substrate.
Mather, George
2011-05-01
Visual motion perception is vital for survival. Single-unit recordings in primate primary visual cortex (V1) have revealed the existence of specialized motion sensing neurons; perceptual effects such as the motion after-effect demonstrate their importance for motion perception. Human psychophysical data on motion detection can be explained by a computational model of cortical motion sensors. Both psychophysical and physiological data reveal at least two classes of motion sensor capable of sensing motion in luminance-defined and texture-defined patterns, respectively. Psychophysical experiments also reveal that motion can be seen independently of motion sensor output, based on attentive tracking of visual features. Sensor outputs are inherently ambiguous, due to the problem of univariance in neural responses. In order to compute stimulus direction and speed, the visual system must compare the responses of many different sensors sensitive to different directions and speeds. Physiological data show that this computation occurs in the visual middle temporal (MT) area. Recent psychophysical studies indicate that information about spatial form may also play a role in motion computations. Adaptation studies show that the human visual system is selectively sensitive to large-scale optic flow patterns, and physiological studies indicate that cells in the middle superior temporal (MST) area derive this sensitivity from the combined responses of many MT cells. Extraretinal signals used to control eye movements are an important source of signals to cancel out the retinal motion responses generated by eye movements, though visual information also plays a role. A number of issues remain to be resolved at all levels of the motion-processing hierarchy. 
WIREs Cogn Sci 2011, 2, 305-314. DOI: 10.1002/wcs.110
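The computational model of cortical motion sensors mentioned in the review is typically built from correlation-type detectors; a minimal Reichardt correlator sketch follows. The discrete one-sample delay and the sign convention (positive output for left-to-right motion) are illustrative choices, not details from the review.

```python
import numpy as np

def reichardt_output(left, right, delay=1):
    """Minimal Reichardt correlator over two sampling points.

    Each half-detector multiplies the delayed signal from one point with
    the undelayed signal from its neighbour; the two mirror-symmetric
    halves are subtracted, giving a direction-selective output.
    """
    l = np.asarray(left, dtype=float)
    r = np.asarray(right, dtype=float)
    ld = np.roll(l, delay); ld[:delay] = 0.0   # delayed copy of the left input
    rd = np.roll(r, delay); rd[:delay] = 0.0   # delayed copy of the right input
    return float(np.sum(ld * r) - np.sum(rd * l))
```

The output illustrates the univariance problem noted in the review: a single detector's response confounds direction, speed, and contrast, so downstream areas such as MT must compare populations of detectors.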
Phantom motion after effects--evidence of detectors for the analysis of optic flow.
Snowden, R J; Milne, A B
1997-10-01
Electrophysiological recording from the extrastriate cortex of non-human primates has revealed neurons that have large receptive fields and are sensitive to various components of object or self movement, such as translations, rotations and expansion/contractions. If these mechanisms exist in human vision, they might be susceptible to adaptation that generates motion aftereffects (MAEs). Indeed, it might be possible to adapt the mechanism in one part of the visual field and reveal what we term a 'phantom MAE' in another part. The existence of phantom MAEs was probed by adapting to a pattern that contained motion in only two non-adjacent 'quarter' segments and then testing using patterns that had elements in only the other two segments. We also tested for the more conventional 'concrete' MAE by testing in the same two segments that had adapted. The strength of each MAE was quantified by measuring the percentage of dots that had to be moved in the opposite direction to the MAE in order to nullify it. Four experiments tested rotational motion, expansion/contraction motion, translational motion and a 'rotation' that consisted simply of the two segments that contained only translational motions of opposing direction. Compared to a baseline measurement where no adaptation took place, all subjects in all experiments exhibited both concrete and phantom MAEs, with the size of the latter approximately half that of the former. Adaptation to two segments that contained upward and downward motion induced the perception of leftward and rightward motion in another part of the visual field. This strongly suggests there are mechanisms in human vision that are sensitive to complex motions such as rotations.
Griffing, Lawrence R
2018-01-01
In this chapter, approaches to the image analysis of the choreography of the plant endoplasmic reticulum (ER) labeled with fluorescent fusion proteins ("stars," if you wish) are presented. The approaches include the analyses of those parts of the ER that are attached through membrane contact sites to moving or nonmoving partners (other "stars"). Image analysis is also used to understand the nature of the tubular polygonal network, the hallmark of this organelle, and how the polygons change over time due to tubule sliding or motion. Furthermore, the remodeling polygons of the ER interact with regions of fundamentally different topology, the ER cisternae, and image analysis can be used to separate the tubules from the cisternae. ER cisternae, like polygons and tubules, can be motile or stationary. To study which parts are attached to nonmoving partners, such as domains of the ER that form membrane contact sites with the plasma membrane/cell wall, an image analysis approach called persistency mapping has been used. To study the domains of the ER that are moving rapidly and streaming through the cell, the image analysis of optic flow has been used. However, optic flow approaches confuse the movement of the ER itself with the movement of proteins within the ER. As an overall measure of ER dynamics, optic flow approaches are of value, but their limitation as to what exactly is "flowing" needs to be specified. Finally, there are important imaging approaches that directly address the movement of fluorescent proteins within the ER lumen or in the membrane of the ER. Of these, fluorescence recovery after photobleaching (FRAP), inverse FRAP (iFRAP), and single particle tracking approaches are described.
Analysis of Motorcycle Weave Mode by using Energy Flow Method
NASA Astrophysics Data System (ADS)
Marumo, Yoshitaka; Katayama, Tsuyoshi
The activation mechanism of the motorcycle weave mode is clarified within the framework of the energy flow method, which calculates the energy flow of the mechanical forces in each motion. It is demonstrated that only a few of the roughly 40 mechanical forces affect the stability of the weave mode. Activation of the lateral, yawing, and rolling motions destabilizes the weave mode, while activation of the steering motion stabilizes it. A detailed investigation of the energy flow of the steering motion reveals that the steering motion plays an important role in clarifying the characteristics of the weave mode: its activation advances the phase of the front tire side force, and the weave mode is consequently stabilized. This paper provides a design guide for stabilizing the weave mode while maintaining compatibility with the wobble mode.
Xu, Jing; Wong, Kevin; Jian, Yifan; Sarunic, Marinko V
2014-02-01
In this report, we describe a graphics processing unit (GPU)-accelerated processing platform for real-time acquisition and display of flow contrast images with Fourier domain optical coherence tomography (FDOCT) in mouse and human eyes in vivo. Motion contrast from blood flow is processed using the speckle variance OCT (svOCT) technique, which relies on the acquisition of multiple B-scan frames at the same location and tracking the change of the speckle pattern. Real-time mouse and human retinal imaging using two different custom-built OCT systems with processing and display performed on GPU are presented with an in-depth analysis of performance metrics. The display output included structural OCT data, en face projections of the intensity data, and the svOCT en face projections of retinal microvasculature; these results compare projections with and without speckle variance in the different retinal layers to reveal significant contrast improvements. As a demonstration, videos of real-time svOCT for in vivo human and mouse retinal imaging are included in our results. The capability of performing real-time svOCT imaging of the retinal vasculature may be a useful tool in a clinical environment for monitoring disease-related pathological changes in the microcirculation such as diabetic retinopathy.
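The svOCT contrast described above reduces to a per-pixel variance across repeated B-scans acquired at the same location; the GPU implementation parallelizes exactly this computation. A minimal CPU sketch, with the (frames, height, width) array layout assumed:

```python
import numpy as np

def speckle_variance(bscans):
    """Inter-frame speckle variance.

    bscans: (N, H, W) array of N B-scans from the same location.
    Flowing blood decorrelates the speckle between frames and yields high
    variance; static tissue keeps a stable pattern and yields low variance.
    """
    return np.asarray(bscans, dtype=float).var(axis=0)
```

Thresholding or projecting this variance map along depth gives the en face microvasculature images described in the report.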
Analysis of free breathing motion using artifact reduced 4D CT image data
NASA Astrophysics Data System (ADS)
Ehrhardt, Jan; Werner, Rene; Frenzel, Thorsten; Lu, Wei; Low, Daniel; Handels, Heinz
2007-03-01
The mobility of lung tumors during the respiratory cycle is a source of error in radiotherapy treatment planning. Spatiotemporal CT data sets can be used for studying the motion of lung tumors and inner organs during the breathing cycle. We present methods for the analysis of respiratory motion using 4D CT data in high temporal resolution. An optical flow based reconstruction method was used to generate artifact-reduced 4D CT data sets of lung cancer patients. The reconstructed 4D CT data sets were segmented and the respiratory motion of tumors and inner organs was analyzed. A non-linear registration algorithm is used to calculate the velocity field between consecutive time frames of the 4D data. The resulting velocity field is used to analyze trajectories of landmarks and surface points. By this technique, the maximum displacement of any surface point is calculated, and regions with large respiratory motion are marked. To describe the tumor mobility the motion of the lung tumor center in three orthogonal directions is displayed. Estimated 3D appearance probabilities visualize the movement of the tumor during the respiratory cycle in one static image. Furthermore, correlations between trajectories of the skin surface and the trajectory of the tumor center are determined and skin regions are identified which are suitable for prediction of the internal tumor motion. The results of the motion analysis indicate that the described methods are suitable to gain insight into the spatiotemporal behavior of anatomical and pathological structures during the respiratory cycle.
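The trajectory analysis described above, which advects landmarks through the velocity fields between consecutive phases and reports the peak excursion, can be sketched minimally. The nearest-neighbour field lookup and unit time step are simplifying assumptions, not the paper's non-linear registration.

```python
import numpy as np

def track_landmark(start, velocity_fields, dt=1.0):
    """Advect a landmark through a sequence of sampled 2D velocity fields.

    velocity_fields: iterable of (H, W, 2) arrays, one per phase interval.
    Uses nearest-neighbour lookup, a deliberately crude stand-in for the
    registration-derived fields in the paper.
    """
    pos = np.asarray(start, dtype=float)
    trajectory = [pos.copy()]
    for field in velocity_fields:
        i, j = np.clip(np.round(pos).astype(int), 0, np.array(field.shape[:2]) - 1)
        pos = pos + dt * field[i, j]
        trajectory.append(pos.copy())
    return np.array(trajectory)

def max_displacement(trajectory):
    """Peak distance from the starting position along the trajectory."""
    return float(np.max(np.linalg.norm(trajectory - trajectory[0], axis=1)))
```

Applying `max_displacement` to every surface point is what lets regions of large respiratory motion be marked, as in the analysis above.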
NASA Astrophysics Data System (ADS)
Gao, Bin; Liu, Wanyu; Wang, Liang; Liu, Zhengjun; Croisille, Pierre; Delachartre, Philippe; Clarysse, Patrick
2016-12-01
Cine-MRI is widely used for the analysis of cardiac function in clinical routine because of its high soft-tissue contrast and relatively short acquisition time in comparison with other cardiac MRI techniques. The gray-level distribution in cardiac cine-MRI is relatively homogeneous within the myocardium, which can make motion quantification difficult. To ensure that the motion estimation problem is well posed, more image features have to be considered. This work is inspired by a method previously developed for color image processing. The monogenic signal provides a framework to estimate the local phase, orientation, and amplitude of an image, three features that locally characterize the 2D intensity profile. The independent monogenic features are combined into a 3D matrix for motion estimation. To improve motion estimation accuracy, we chose the zero-mean normalized cross-correlation as a matching measure and implemented a bilateral filter for denoising and edge preservation, with the monogenic feature distance used in lieu of the color-space distance. Results obtained from four realistic simulated sequences outperformed two other state-of-the-art methods, even in the presence of noise. The motion estimation errors (end-point error) using our proposed method were reduced by about 20% in comparison with those obtained by the other tested methods. The new methodology was evaluated on four clinical sequences from patients presenting with cardiac motion dysfunctions and one healthy volunteer. The derived strain fields compared favorably in their ability to identify myocardial regions with impaired motion.
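The zero-mean normalized cross-correlation used as the matching measure has a compact closed form; a minimal sketch, where flattening the patches and the small eps guard are illustrative choices:

```python
import numpy as np

def zncc(patch_a, patch_b, eps=1e-12):
    """Zero-mean normalized cross-correlation between two equal-size patches.

    Returns a value in [-1, 1]; 1 means the patches are identical up to a
    positive gain and an offset, which is what makes the measure robust to
    the homogeneous gray levels of the myocardium.
    """
    a = np.asarray(patch_a, dtype=float).ravel()
    b = np.asarray(patch_b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps))
```

In a block-matching motion estimator, the displacement assigned to a patch is the offset maximizing this score over a search window; the paper applies the same idea to the stacked monogenic features rather than raw intensities.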
NASA Astrophysics Data System (ADS)
An, Lin; Shen, Tueng T.; Wang, Ruikang K.
2011-10-01
This paper presents comprehensive and depth-resolved retinal microvasculature images within the human retina achieved by a newly developed ultrahigh-sensitive optical microangiography (UHS-OMAG) system. Due to its high flow sensitivity, UHS-OMAG is much more sensitive than the traditional OMAG system to tissue motion caused by involuntary movement of the human eye and head. To mitigate these motion artifacts in the final images, we propose a new phase-compensation algorithm in which the traditional phase-compensation algorithm is applied repeatedly to efficiently minimize the motion artifacts. Comparatively, this new algorithm demonstrates at least 8 to 25 times higher motion tolerance, critical for the UHS-OMAG system to achieve retinal microvasculature images of high quality. Furthermore, the new UHS-OMAG system employs a high-speed line-scan CMOS camera (240 kHz A-line scan rate) to capture 500 A-lines for one B-frame at a 400 Hz frame rate. With this system, we performed a series of in vivo experiments to visualize the retinal microvasculature in humans. Two featured imaging protocols are utilized. The first offers low lateral resolution (16 μm) and a wide field of view (4 × 3 mm2 with a single scan and 7 × 8 mm2 for multiple scans), while the second offers high lateral resolution (5 μm) and a narrow field of view (1.5 × 1.2 mm2 with a single scan). The imaging performance delivered by our system suggests that UHS-OMAG can be a promising noninvasive alternative to current clinical retinal microvasculature imaging techniques for the diagnosis of eye diseases with significant vascular involvement, such as diabetic retinopathy and age-related macular degeneration.
Accurate band-to-band registration of AOTF imaging spectrometer using motion detection technology
NASA Astrophysics Data System (ADS)
Zhou, Pengwei; Zhao, Huijie; Jin, Shangzhong; Li, Ningchuan
2016-05-01
This paper concerns the problem of platform-vibration-induced band-to-band misregistration in an acousto-optic imaging spectrometer for spaceborne application. Registering images of different bands formed at different times or positions is difficult, especially for hyperspectral images from an acousto-optic tunable filter (AOTF) imaging spectrometer. In this study, a motion detection method is presented that uses the polychromatic undiffracted beam of the AOTF. The factors affecting motion detection accuracy are analyzed theoretically, and calculations show that optical distortion is an easily overlooked factor in achieving accurate band-to-band registration. Hence, a reflective dual-path optical system is proposed for the first time, with reduced distortion and chromatic aberration, indicating the potential for higher registration accuracy. Finally, a spectrum restoration experiment using an additional motion detection channel is presented for the first time, demonstrating the accurate spectral image registration capability of this technique.
Magnetic Tethering of Microswimmers in Microfluidic Devices
NASA Astrophysics Data System (ADS)
Chawan, Aschvin; Jana, Saikat; Ghosh, Suvojit; Jung, Sunghwan; Puri, Ishwar
2013-03-01
Exercising control over animal locomotion is well known in the macro world. In the micro-scale world, such methods require more sophistication. We magnetize Paramecium multimicronucleatum by internalization of magnetite nanoparticles coated with bovine serum albumin (BSA). This enables control of their motion in a microfluidic device using a magnetic field. Miniature permanent magnets embedded within the device are used to tether the magnetized organisms to specific locations along a micro-channel. Ciliary beatings of the microswimmer generate shear flows nearby. We apply this setup to enhance cross-stream mixing in a microfluidic device by supplementing molecular diffusion. The device is similar to an active micromixer but requires no external power sources or artificial actuators. We optically characterize the effectiveness of the mechanism in a variety of flow situations.
Lee, Benjamin C; Moody, Jonathan B; Poitrasson-Rivière, Alexis; Melvin, Amanda C; Weinberg, Richard L; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L
2018-03-23
Patient motion can lead to misalignment of left ventricular volumes of interest and subsequently inaccurate quantification of myocardial blood flow (MBF) and flow reserve (MFR) from dynamic PET myocardial perfusion images. We aimed to identify the prevalence of patient motion in both blood and tissue phases and to analyze the effects of this motion on MBF and MFR estimates. We selected 225 consecutive patients who underwent dynamic stress/rest rubidium-82 chloride (82Rb) PET imaging. Dynamic image series were iteratively reconstructed with 5- to 10-second frame durations over the first 2 minutes for the blood phase and 10- to 80-second frame durations for the tissue phase. Motion shifts were assessed by 3 physician readers from the dynamic series and analyzed for frequency, magnitude, time, and direction of motion. The effects of this motion, isolated in time, direction, and magnitude, on global and regional MBF and MFR estimates were evaluated. Flow estimates derived from the motion-corrected images were used as the error references. Mild to moderate motion (5-15 mm) was most prominent in the blood phase, in 63% and 44% of the stress and rest studies, respectively. This motion was observed with frequencies of 75% in the septal and inferior directions for stress and 44% in the septal direction for rest. Images with motion isolated to the blood phase had mean global MBF and MFR errors of 2%-5%. Isolating blood-phase motion in the inferior direction resulted in mean MBF and MFR errors of 29%-44% in the RCA territory. Flow errors due to motion isolated to the tissue phase were within 1%. Patient motion was most prevalent in the blood phase, and MBF and MFR errors increased most substantially with motion in the inferior direction. Motion correction focused on these motions is needed to reduce MBF and MFR errors.
Optical pseudomotors for soft x-ray beamlines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedreira, P., E-mail: ppedreira@cells.es; Sics, I.; Sorrentino, A.
2016-05-15
Optical elements of soft x-ray beamlines usually have motorized translations and rotations that allow fine alignment of the beamline: steering the photon beam to given positions and correcting the focus on slits or on the sample. Generally, each degree of freedom of a mirror induces a change in several parameters of the beam. Inversely, several motions are required to actuate a single optical parameter while keeping the others unchanged. We define optical pseudomotors as combinations of physical motions of the optical elements of a beamline, which allow modifying one optical parameter without affecting the others. We describe a method to obtain analytic relationships between physical motions of mirrors and the corresponding variations of the beam parameters. This method has been implemented and tested at two beamlines at ALBA, where it is used to control the focus of the photon beam and its position independently.
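The pseudomotor construction can be illustrated with a linearized model: if a response matrix maps physical motor moves to beam-parameter changes, then a pseudomotor move is the inverse image of a unit change in exactly one parameter. A minimal sketch, in which the matrix values are illustrative assumptions and not ALBA's measured responses:

```python
import numpy as np

# Hypothetical linearized response around the working point:
#   beam_params = A @ motor_moves
# Rows: beam position and angle at the sample; columns: two mirror motions.
A = np.array([[1.0, 0.4],
              [0.0, 1.0]])

def pseudomotor_move(A, k, delta):
    """Combination of physical motions that changes beam parameter k by
    `delta` while leaving all other beam parameters unchanged
    (assumes A is square and invertible)."""
    e = np.zeros(A.shape[0])
    e[k] = delta
    return np.linalg.solve(A, e)
```

Applying `A @ pseudomotor_move(A, 0, 0.5)` yields a change of 0.5 in the first beam parameter and zero in the second, which is the defining property of a pseudomotor.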
Neural basis of forward flight control and landing in honeybees.
Ibbotson, M R; Hung, Y-S; Meffin, H; Boeddeker, N; Srinivasan, M V
2017-11-06
The impressive repertoire of visually guided behaviors in honeybees, and their ability to learn, have made them an important tool for elucidating the visual basis of behavior. Like other insects, bees perform optomotor course correction in response to optic flow, a response that is dependent on the spatial structure of the visual environment. However, bees can also discriminate the speed of image motion during forward flight and landing, as well as estimate flight distances (odometry), irrespective of the visual scene. The neural pathways underlying these abilities are unknown. Here we report on a cluster of descending neurons (DNIIIs) that are shown to have the directional tuning properties necessary for detecting image motion during forward flight and landing on vertical surfaces. They have stable firing rates during prolonged periods of stimulation and respond to a wide range of image speeds, making them suitable to detect image flow during flight behaviors. While their responses are not strictly speed tuned, the shape and amplitudes of their speed tuning functions are resistant to large changes in spatial frequency. These cells are prime candidates not only for the control of flight speed and landing, but also for the neural 'front end' of the honeybee's visual odometer.
NASA Astrophysics Data System (ADS)
Nasir, Saleem; Islam, Saeed; Gul, Taza; Shah, Zahir; Khan, Muhammad Altaf; Khan, Waris; Khan, Aurang Zeb; Khan, Saima
2018-05-01
This article presents the modeling and computation of the MHD three-dimensional rotating flow of a nanofluid over a stretching sheet. Single-wall carbon nanotubes (SWCNTs) are utilized as the nano-sized material, with water as the base liquid. SWCNTs exhibit unique properties owing to their distinctive structure, which confers notable optical and electronic features, exceptional strength and elasticity, and high thermal and chemical stability. The heat exchange phenomena are considered subject to thermal radiation, and the effects of nanoparticle Brownian motion and thermophoresis are included in the present investigation. For the nanofluid transport mechanism, we implemented the Xue model (Xue, Phys B Condens Matter 368:302-307, 2005). The governing nonlinear equations, based on conservation of mass, momentum, energy, and nanoparticle concentration, are first modeled and then solved by the homotopy analysis method (HAM). Graphical results are presented to investigate how the velocity, temperature, and nanomaterial concentration distributions are affected by the influential parameters. Numerical data for the skin friction, Nusselt number, and Sherwood number are presented for SWCNTs.
Crowd motion segmentation and behavior recognition fusing streak flow and collectiveness
NASA Astrophysics Data System (ADS)
Gao, Mingliang; Jiang, Jun; Shen, Jin; Zou, Guofeng; Fu, Guixia
2018-04-01
Crowd motion segmentation and crowd behavior recognition are two hot issues in computer vision, and a number of methods have been proposed to tackle them. Among these methods, flow dynamics is used to model crowd motion, but with little consideration of collective properties. Moreover, traditional crowd behavior recognition methods treat local features and dynamic features separately and overlook the interconnection of topological and dynamical heterogeneity in complex crowd processes. Here, a crowd motion segmentation method and a crowd behavior recognition method are proposed based on streak flow and crowd collectiveness. The streak flow is adopted to reveal the dynamical property of crowd motion, and the collectiveness is incorporated to reveal the structural property. Experimental results show that the proposed methods improve crowd motion segmentation accuracy and crowd behavior recognition rates compared with state-of-the-art methods.
2005-03-01
Guided Technologies, Boulder, CO; motion path built from three orthogonal sinusoidal paths ... Optotrak (Northern Digital, Waterloo, ON) optical tracking ... Johns Hopkins University using an Optotrak to evaluate the simulated motions. The Optotrak (Northern Digital, Inc.) is an optical high-precision 3-D motion ... To verify the accuracy of the RMS, tests were carried out using the Optotrak, which was placed about 2 m from the simulator. For each test, two sets of data
The Stability and Interfacial Motion of Multi-layer Radial Porous Media and Hele-Shaw Flows
NASA Astrophysics Data System (ADS)
Gin, Craig; Daripa, Prabir
2017-11-01
In this talk, we will discuss viscous fingering instabilities of multi-layer immiscible porous media flows within the Hele-Shaw model in a radial flow geometry. We study the motion of the interfaces for flows with both constant and variable viscosity fluids. We consider the effects of using a variable injection rate on multi-layer flows. We also present a numerical approach to simulating the interface motion within linear theory using the method of eigenfunction expansion. We compare these results with fully non-linear simulations.
A motion detection system for AXAF X-ray ground testing
NASA Technical Reports Server (NTRS)
Arenberg, Jonathan W.; Texter, Scott C.
1993-01-01
The concept, implementation, and performance of the motion detection system (MDS) designed as a diagnostic for X-ray ground testing for AXAF are described. The purpose of the MDS is to measure the magnitude of a relative rigid body motion among the AXAF test optic, the X-ray source, and X-ray focal plane detector. The MDS consists of a point source, lens, centroid detector, transimpedance amplifier, and computer system. Measurement of the centroid position of the image of the optical point source provides a direct measure of the motions of the X-ray optical system. The outputs from the detector and filter/amplifier are digitized and processed using the calibration with a 50 Hz bandwidth to give the centroid's location on the detector. Resolution of 0.008 arcsec has been achieved by this system. Data illustrating the performance of the motion detection system are also presented.
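The core computation, locating the centroid of the optical point-source image, can be sketched as an intensity-weighted mean. This is a minimal numerical sketch; the actual MDS uses a hardware centroid detector plus a calibration to convert detector position into angular motion:

```python
import numpy as np

def spot_centroid(img):
    """Intensity-weighted centroid (row, column) of a point-source image;
    drifts of this centroid indicate rigid-body motion of the optical system."""
    img = np.asarray(img, dtype=float)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (ys * img).sum() / total, (xs * img).sum() / total
```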
Active eye-tracking for an adaptive optics scanning laser ophthalmoscope
Sheehy, Christy K.; Tiruveedhula, Pavan; Sabesan, Ramkumar; Roorda, Austin
2015-01-01
We demonstrate a system that combines a tracking scanning laser ophthalmoscope (TSLO) and an adaptive optics scanning laser ophthalmoscope (AOSLO) system resulting in both optical (hardware) and digital (software) eye-tracking capabilities. The hybrid system employs the TSLO for active eye-tracking at a rate up to 960 Hz for real-time stabilization of the AOSLO system. AOSLO videos with active eye-tracking signals showed, at most, an amplitude of motion of 0.20 arcminutes for horizontal motion and 0.14 arcminutes for vertical motion. Subsequent real-time digital stabilization limited residual motion to an average of only 0.06 arcminutes (a 95% reduction). By correcting for high amplitude, low frequency drifts of the eye, the active TSLO eye-tracking system enabled the AOSLO system to capture high-resolution retinal images over a larger range of motion than previously possible with just the AOSLO imaging system alone.
Abouei, Elham; Lee, Anthony M D; Pahlevaninezhad, Hamid; Hohert, Geoffrey; Cua, Michelle; Lane, Pierre; Lam, Stephen; MacAulay, Calum
2018-01-01
We present a method for the correction of motion artifacts present in two- and three-dimensional in vivo endoscopic images produced by rotary-pullback catheters. This method can correct for cardiac/breathing-based motion artifacts and catheter-based motion artifacts such as nonuniform rotational distortion (NURD). This method assumes that en face tissue imaging contains slowly varying structures that are roughly parallel to the pullback axis. The method reduces motion artifacts using a dynamic time warping solution through a cost matrix that measures similarities between adjacent frames in en face images. We optimize and demonstrate the suitability of this method using a real and simulated NURD phantom and in vivo endoscopic pulmonary optical coherence tomography and autofluorescence images. Qualitative and quantitative evaluations of the method show an enhancement of the image quality.
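The dynamic time warping step can be sketched generically: given a cost matrix measuring dissimilarity between A-line positions in adjacent en face frames, DTW finds the minimal-cost monotonic alignment. This is a textbook DTW sketch; the authors' specific cost definition and boundary handling are not reproduced here:

```python
import numpy as np

def dtw_path(cost):
    """Minimal-cost monotonic alignment through a cost matrix, returned as a
    list of (i, j) index pairs from (0, 0) to (n-1, m-1)."""
    n, m = cost.shape
    D = np.full((n + 1, m + 1), np.inf)   # accumulated cost, 1-based border
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = cost[i - 1, j - 1] + min(D[i - 1, j],
                                               D[i, j - 1],
                                               D[i - 1, j - 1])
    # backtrack from the end, always stepping to the cheapest predecessor
    path, i, j = [], n, m
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j - 1), (i - 1, j), (i, j - 1)], key=lambda p: D[p])
    return path[::-1]
```

In the NURD-correction setting the path's deviation from the diagonal gives, per A-line, the angular resampling needed to align one frame with its neighbour.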
Shock waves in aviation security and safety
NASA Astrophysics Data System (ADS)
Settles, G. S.; Keane, B. T.; Anderson, B. W.; Gatto, J. A.
Accident investigations such as of Pan Am 103 and TWA 800 reveal the key role of shock-wave propagation in destroying the aircraft when an on-board explosion occurs. This paper surveys shock wave propagation inside an aircraft fuselage, caused either by a terrorist device or by accident, and provides some new experimental results. While aircraft-hardening research has been under way for more than a decade, no such experiments to date have used the crucial tool of high-speed optical imaging to visualize shock motion. Here, Penn State's Full-Scale Schlieren flow visualization facility yields the first shock-motion images in aviation security scenarios: 1) Explosions beneath full-size aircraft seats occupied by mannequins, 2) Explosions inside partially-filled luggage containers, and 3) Luggage-container explosions resulting in hull-holing. Both single-frame images and drum-camera movies are obtained. The implications of these results are discussed, though the overall topic must still be considered in its infancy.
Reynolds number scaling of straining motions in turbulence
NASA Astrophysics Data System (ADS)
Elsinga, Gerrit; Ishihara, T.; Goudar, M. V.; da Silva, C. B.; Hunt, J. C. R.
2017-11-01
Strain is an important fluid motion in turbulence as it is associated with the kinetic energy dissipation rate, vorticity stretching, and the dispersion of passive scalars. The present study investigates the scaling of the turbulent straining motions by evaluating the flow in the eigenframe of the local strain-rate tensor. The analysis is based on DNS of homogeneous isotropic turbulence covering a Reynolds number range Reλ = 34.6-1131. The resulting flow pattern reveals a shear layer containing tube-like vortices and a dissipation sheet, which both scale on the Kolmogorov length scale, η. The vorticity stretching motions scale on the Taylor length scale, while the flow outside the shear layer scales on the integral length scale. These scaling results are consistent with those in wall-bounded flow, which suggests a quantitative universality between the different flows. The overall coherence length of the vorticity is 120 η in all directions, which is considerably larger than the typical size of individual vortices, and reflects the importance of spatial organization at the small scales. Transitions in flow structure are identified at Reλ ≈ 45 and 250. Below these respective Reynolds numbers, the small-scale motions and the vorticity stretching motions appear underdeveloped.
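Evaluating the flow in the eigenframe of the local strain-rate tensor amounts to diagonalizing the symmetric part of the velocity-gradient tensor at each point. A minimal sketch of that per-point step:

```python
import numpy as np

def strain_eigenframe(grad_u):
    """Eigenvalues (ascending: most compressive to most extensive) and
    eigenvectors of the strain-rate tensor S, the symmetric part of the
    velocity-gradient tensor grad_u[i, j] = du_i/dx_j at a single point."""
    S = 0.5 * (grad_u + grad_u.T)
    return np.linalg.eigh(S)   # eigh: for symmetric matrices, sorted ascending
```

For a pure shear du/dy = 1, for example, the principal strain rates are ±0.5 with a zero intermediate eigenvalue, and the principal axes lie at 45° to the shear plane.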
An improved optical flow tracking technique for real-time MR-guided beam therapies in moving organs
NASA Astrophysics Data System (ADS)
Zachiu, C.; Papadakis, N.; Ries, M.; Moonen, C.; de Senneville, B. Denis
2015-12-01
Magnetic resonance (MR) guided high intensity focused ultrasound and external beam radiotherapy interventions, which we shall refer to as beam therapies/interventions, are promising techniques for the non-invasive ablation of tumours in abdominal organs. However, therapeutic energy delivery in these areas becomes challenging due to the continuous displacement of the organs with respiration. Previous studies have addressed this problem by coupling high-framerate MR-imaging with a tracking technique based on the algorithm proposed by Horn and Schunck (H and S), which was chosen due to its fast convergence rate and highly parallelisable numerical scheme. Such characteristics were shown to be indispensable for the real-time guidance of beam therapies. In its original form, however, the algorithm is sensitive to local grey-level intensity variations not attributed to motion, such as those that occur, for example, in the proximity of pulsating arteries. In this study, an improved motion estimation strategy which reduces the impact of such effects is proposed. Displacements are estimated through the minimisation of a variation of the H and S functional in which the quadratic data fidelity term is replaced with a term based on the linear L1 norm, resulting in what we have called an L2-L1 functional. The proposed method was tested in the livers and kidneys of two healthy volunteers under free-breathing conditions, on a data set comprising 3000 images equally divided between the volunteers. The results show that, compared to existing approaches, our method demonstrates greater robustness to local grey-level intensity variations introduced by arterial pulsations. Additionally, the computational time required by our implementation makes it compatible with the workflow of real-time MR-guided beam interventions.
To the best of our knowledge this study was the first to analyse the behaviour of an L1-based optical flow functional in an applicative context: real-time MR-guidance of beam therapies in moving organs.
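For reference, the original Horn and Schunck scheme that the study modifies can be sketched as the classic Jacobi-style iteration below. Note that this sketch keeps the quadratic (L2) data term; the paper's contribution is precisely to replace that term with an L1-based one. Periodic boundary handling via `np.roll` is a simplifying assumption for brevity:

```python
import numpy as np

def horn_schunck(I1, I2, alpha=1.0, n_iter=100):
    """Classic Horn-Schunck optical flow with a quadratic data term.
    Returns the flow components (u, v) between images I1 and I2."""
    I1 = np.asarray(I1, dtype=float)
    I2 = np.asarray(I2, dtype=float)
    Iy, Ix = np.gradient(I1)          # spatial derivatives (rows, cols)
    It = I2 - I1                      # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iter):
        # neighbourhood averages of the current flow estimate
        ubar = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                       + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        vbar = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                       + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        # closed-form update from the Euler-Lagrange equations
        t = (Ix * ubar + Iy * vbar + It) / (alpha ** 2 + Ix ** 2 + Iy ** 2)
        u = ubar - Ix * t
        v = vbar - Iy * t
    return u, v
```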
Note: A resonating reflector-based optical system for motion measurement in micro-cantilever arrays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathishkumar, P.; Punyabrahma, P.; Sri Muthu Mrinalini, R.
A robust, compact optical measurement unit for motion measurement in micro-cantilever arrays enables development of portable micro-cantilever sensors. This paper reports on an optical beam deflection-based system to measure the deflection of micro-cantilevers in an array that employs a single laser source, a single detector, and a resonating reflector to scan the measurement laser across the array. A strategy is also proposed to extract the deflection of individual cantilevers from the acquired data. The proposed system and measurement strategy are experimentally evaluated and demonstrated to measure motion of multiple cantilevers in an array.
Images from Galileo of the Venus cloud deck
Belton, M.J.S.; Gierasch, P.J.; Smith, M.D.; Helfenstein, P.; Schinder, P.J.; Pollack, James B.; Rages, K.A.; Ingersoll, A.P.; Klaasen, K.P.; Veverka, J.; Anger, C.D.; Carr, M.H.; Chapman, C.R.; Davies, M.E.; Fanale, F.P.; Greeley, R.; Greenberg, R.; Head, J. W.; Morrison, D.; Neukum, G.; Pilcher, C.B.
1991-01-01
Images of Venus taken at 418 (violet) and 986 [near-infrared (NIR)] nanometers show that the morphology and motions of large-scale features change with depth in the cloud deck. Poleward meridional velocities, seen in both spectral regions, are much reduced in the NIR. In the south polar region the markings in the two wavelength bands are strongly anticorrelated. The images follow the changing state of the upper cloud layer downwind of the subsolar point, and the zonal flow field shows a longitudinal periodicity that may be coupled to the formation of large-scale planetary waves. No optical lightning was detected.
Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.
Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee
2013-09-01
Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.
Prometheus: Io's wandering plume.
Kieffer, S W; Lopes-Gautier, R; McEwen, A; Smythe, W; Keszthelyi, L; Carlson, R
2000-05-19
Unlike any volcanic behavior ever observed on Earth, the plume from Prometheus on Io has wandered 75 to 95 kilometers west over the last 20 years since it was first discovered by Voyager and more recently observed by Galileo. Despite the source motion, the geometric and optical properties of the plume have remained constant. We propose that this can be explained by vaporization of a sulfur dioxide and/or sulfur "snowfield" over which a lava flow is moving. Eruption of a boundary-layer slurry through a rootless conduit with sonic conditions at the intake of the melted snow can account for the constancy of plume properties.
Near-Field, On-Chip Optical Brownian Ratchets.
Wu, Shao-Hua; Huang, Ningfeng; Jaquay, Eric; Povinelli, Michelle L
2016-08-10
Nanoparticles in aqueous solution are subject to collisions with solvent molecules, resulting in random, Brownian motion. By breaking the spatiotemporal symmetry of the system, the motion can be rectified. In nature, Brownian ratchets leverage thermal fluctuations to provide directional motion of proteins and enzymes. In man-made systems, Brownian ratchets have been used for nanoparticle sorting and manipulation. Implementations based on optical traps provide a high degree of tunability along with precise spatiotemporal control. Here, we demonstrate an optical Brownian ratchet based on the near-field traps of an asymmetrically patterned photonic crystal. The system yields over 25 times greater trap stiffness than conventional optical tweezers. Our technique opens up new possibilities for particle manipulation in a microfluidic, lab-on-chip environment.
NASA Astrophysics Data System (ADS)
Qian, Jie; Cheng, Wei; Cao, Zhaoyuan; Chen, Xinjian; Mo, Jianhua
2017-02-01
Phase-resolved Doppler optical coherence tomography (PR-D-OCT) is a functional OCT imaging technique that can provide high-speed and high-resolution depth-resolved measurement on flow in biological materials. However, a common problem with conventional PR-D-OCT is that this technique often measures the flow motion projected onto the OCT beam path. In other words, it needs the projection angle to extract the absolute velocity from PR-D-OCT measurement. In this paper, we proposed a novel dual-beam PR-D-OCT method to measure absolute flow velocity without separate measurement on the projection angle. Two parallel light beams are created in sample arm and focused into the sample at two different incident angles. The images produced by these two beams are encoded to different depths in single B-scan. Then the Doppler signals picked up by the two beams together with the incident angle difference can be used to calculate the absolute velocity. We validated our approach in vitro on an artificial flow phantom with our home-built 1060 nm swept source OCT. Experimental results demonstrated that our method can provide an accurate measurement of absolute flow velocity with independency on the projection angle.
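The angle-independent recovery rests on simple trigonometry: two beams separated by a known angle Δ measure the projections v1 = v·cos θ and v2 = v·cos(θ + Δ), which together determine both the speed v and the Doppler angle θ. A sketch of that inversion (the function name is mine, not from the paper):

```python
import numpy as np

def absolute_velocity(v1, v2, delta):
    """Recover flow speed v and Doppler angle theta from two projected
    velocities v1 = v*cos(theta) and v2 = v*cos(theta + delta), where delta
    is the known angular separation between the two incident beams."""
    v_cos = v1                                        # v * cos(theta)
    v_sin = (v1 * np.cos(delta) - v2) / np.sin(delta)  # v * sin(theta)
    return np.hypot(v_cos, v_sin), np.arctan2(v_sin, v_cos)
```

The identity behind the second line is cos(θ + Δ) = cos θ cos Δ − sin θ sin Δ, so v1·cos Δ − v2 = v·sin θ·sin Δ.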
NASA Astrophysics Data System (ADS)
Zhu, Xinjian; Wu, Ruoyu; Li, Tao; Zhao, Dawei; Shan, Xin; Wang, Puling; Peng, Song; Li, Faqi; Wu, Baoming
2016-12-01
The time-intensity curve (TIC) from a contrast-enhanced ultrasound (CEUS) image sequence of uterine fibroids provides important parametric information for qualitative and quantitative evaluation of the efficacy of treatments such as high-intensity focused ultrasound surgery. However, respiration and other physiological movements inevitably affect the CEUS imaging process, and this reduces the accuracy of TIC calculation. In this study, a method of TIC calculation for vascular perfusion of uterine fibroids based on subtraction imaging with motion correction is proposed. First, the fibroid CEUS video recording was decoded into frames based on the recording frame rate. Next, the Brox optical flow algorithm was used to estimate the displacement field and correct the motion between frames using a warping technique. Then, subtraction imaging was performed to extract the positional distribution of vascular perfusion (PDOVP). Finally, the average gray level of all pixels in the PDOVP of each image was determined, and this was taken as the TIC of the CEUS image sequence. Both the correlation coefficient and the mutual information of the results obtained with the proposed method were larger than those determined using the original method. PDOVP extraction results improved significantly after motion correction. The variance reduction rates were all positive, indicating that the fluctuations of the TIC became less pronounced and that calculation accuracy improved after motion correction. The proposed method can effectively overcome the influence of motion, mainly caused by respiration, and allows precise calculation of the TIC.
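The final TIC step, one average gray level over the PDOVP per frame, reduces to a masked mean. A minimal sketch, assuming motion-corrected frames and a boolean PDOVP mask:

```python
import numpy as np

def time_intensity_curve(frames, mask):
    """One TIC sample per frame: the mean gray level of the pixels inside
    the perfusion mask (PDOVP) of a motion-corrected CEUS sequence."""
    return np.array([np.asarray(frame, dtype=float)[mask].mean()
                     for frame in frames])
```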
3D surface perception from motion involves a temporal–parietal network
Beer, Anton L.; Watanabe, Takeo; Ni, Rui; Sasaki, Yuka; Andersen, George J.
2010-01-01
Previous research has suggested that three-dimensional (3D) structure-from-motion (SFM) perception in humans involves several motion-sensitive occipital and parietal brain areas. By contrast, SFM perception in nonhuman primates seems to involve the temporal lobe including areas MT, MST and FST. The present functional magnetic resonance imaging study compared several motion-sensitive regions of interest including the superior temporal sulcus (STS) while human observers viewed horizontally moving dots that defined either a 3D corrugated surface or a 3D random volume. Low-level stimulus features such as dot density and velocity vectors as well as attention were tightly controlled. Consistent with previous research we found that 3D corrugated surfaces elicited stronger responses than random motion in occipital and parietal brain areas including area V3A, the ventral and dorsal intraparietal sulcus, the lateral occipital sulcus and the fusiform gyrus. Additionally, 3D corrugated surfaces elicited stronger activity in area MT and the STS but not in area MST. Brain activity in the STS but not in area MT correlated with interindividual differences in 3D surface perception. Our findings suggest that area MT is involved in the analysis of optic flow patterns such as speed gradients and that the STS in humans plays a greater role in the analysis of 3D SFM than previously thought.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaskowiak, J; Ahmad, S; Ali, I
Purpose: To investigate the correlation of displacement vector fields (DVFs) calculated by deformable image registration algorithms with motion parameters in helical, axial and cone-beam CT images with motion artifacts. Methods: A mobile thorax phantom was used, with well-known targets of different sizes made from water-equivalent material and inserted in foam to simulate lung lesions. The thorax phantom was imaged with helical, axial and cone-beam CT. The phantom was moved with a cyclic motion of different amplitudes and frequencies along the superior-inferior direction. Different deformable image registration algorithms, including demons, fast demons, Horn-Schunck and iterative optical flow from the DIRART software, were used to deform the CT images of the phantom with different motion patterns. The CT images of the mobile phantom were deformed to CT images of the stationary phantom. Results: The values of the displacement vectors calculated by the deformable image registration algorithms correlated strongly with motion amplitude: large displacement vectors were calculated for CT images with large motion amplitudes. For example, the maximal displacement vectors were nearly equal to the motion amplitudes (5 mm, 10 mm or 20 mm) at interfaces between the mobile targets and lung tissue, while the minimal displacement vectors were nearly equal to the negative of the motion amplitudes. The maximal and minimal displacement vectors matched the edges of the blurred targets along the Z-axis (motion direction), while DVFs were small in the other directions. This indicates that the edges blurred by phantom motion were shifted to match the actual target edges, with shifts nearly equal to the motion amplitude. Conclusions: The DVFs from deformable image registration algorithms correlated well with the motion amplitude of well-defined mobile targets. This can be used to extract motion parameters such as amplitude.
However, as motion amplitudes increased, image artifacts increased significantly, which limited image quality and led to poor correlation between the motion amplitude and the DVF.
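The reported relationship, maximal DVF values near the positive motion amplitude and minimal values near its negative at the target edges, suggests a simple amplitude estimator. An illustrative sketch, not part of the DIRART workflow:

```python
import numpy as np

def motion_amplitude_from_dvf(dvf_z):
    """Estimate cyclic motion amplitude from the superior-inferior (Z)
    component of a displacement vector field, using the observation that
    extreme displacements at blurred target edges approach +/- amplitude."""
    dvf_z = np.asarray(dvf_z, dtype=float)
    return 0.5 * (dvf_z.max() - dvf_z.min())
```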
Chang, Angela T; Palmer, Kerry R; McNaught, Jessie; Thomas, Peter J
2010-08-01
This study investigated the effect of flow rates and spirometer type on chest wall motion in healthy individuals. Twenty-one healthy volunteers completed breathing trials to either two times tidal volume (2xV(T)) or inspiratory capacity (IC) at high, low, or natural flow rates, using a volume- or flow-oriented spirometer. The proportions of rib cage movement to tidal volume (%RC/V(T)), chest wall diameters, and perceived level of exertion (RPE) were compared. Low and natural flow rates resulted in significantly lower %RC/V(T) compared to high flow rate trials (p=0.001) at 2xV(T). Low flow trials also resulted in significantly less chest wall motion in the upper anteroposterior direction than high and natural flow rates (p<0.001). At IC, significantly greater movement occurred in the abdominal lateral direction during low flow compared to high and natural flow trials (both p<0.003). RPE was lower for the low flow trials compared to high flow trials at IC and 2xV(T) (p<0.01). In healthy individuals, inspiratory flow (not device type) during incentive spirometry determines the resultant breathing pattern. High flow rates result in greater chest wall motion than low flow rates.
Anisotropic Janus Si nanopillar arrays as a microfluidic one-way valve for gas-liquid separation
NASA Astrophysics Data System (ADS)
Wang, Tieqiang; Chen, Hongxu; Liu, Kun; Li, Yang; Xue, Peihong; Yu, Ye; Wang, Shuli; Zhang, Junhu; Kumacheva, Eugenia; Yang, Bai
2014-03-01
In this paper, we demonstrate a facile strategy for the fabrication of a one-way valve for microfluidic (MF) systems. The micro-valve was fabricated by embedding arrays of Janus Si elliptical pillars (Si-EPAs) with anisotropic wettability into a MF channel fabricated in poly(dimethylsiloxane) (PDMS). Two sides of the Janus pillar are functionalized with molecules with distinct surface energies. The ability of the Janus pillar array to act as a valve was proved by investigating the flow behaviour of water in a T-shaped microchannel at different flow rates and pressures. In addition, the one-way valve was used to achieve gas-liquid separation. We believe that the Janus Si-EPAs modified by specific surface functionalization provide a new strategy to control the flow and motion of fluids in MF channels.
Electronic supplementary information (ESI) available: The XPS spectrum of the as-prepared Janus arrays after the MHA modification; the SEM images of the PFS-MHA Janus Si pillar arrays fabricated through oblique evaporation of gold along the short axis of the elliptical pillars; images of the cross-shaped MF channel and Rhodamine aqueous solution injecting in a cross-shaped MF channel taken at different times; the plot data of DPFS/DMHA against the flow rate of the aqueous solution; the plot data of failure pressure against the bottom size of the channel; optical microscopy images of the Janus pillar array with less density of pillars; optical microscopy images of the T junction with higher magnification; the video of Rhodamine solution running in the T-shaped microchannel integrated with the Janus Si-EPAs; the video of the entire gas-liquid separation process. See DOI: 10.1039/c3nr05865d
Suppression of extraneous thermal noise in cavity optomechanics.
Zhao, Yi; Wilson, Dalziel J; Ni, K-K; Kimble, H J
2012-02-13
Extraneous thermal motion can limit displacement sensitivity and radiation pressure effects, such as optical cooling, in a cavity-optomechanical system. Here we present an active noise suppression scheme and its experimental implementation. The main challenge is to selectively sense and suppress extraneous thermal noise without affecting motion of the oscillator. Our solution is to monitor two modes of the optical cavity, each with different sensitivity to the oscillator's motion but similar sensitivity to the extraneous thermal motion. This information is used to imprint "anti-noise" onto the frequency of the incident laser field. In our system, based on a nano-mechanical membrane coupled to a Fabry-Pérot cavity, simulation and experiment demonstrate that extraneous thermal noise can be selectively suppressed and that the associated limit on optical cooling can be reduced.
Quantum correlations from a room-temperature optomechanical cavity
NASA Astrophysics Data System (ADS)
Purdy, T. P.; Grutter, K. E.; Srinivasan, K.; Taylor, J. M.
2017-06-01
The act of position measurement alters the motion of an object being measured. This quantum measurement backaction is typically much smaller than the thermal motion of a room-temperature object and thus difficult to observe. By shining laser light through a nanomechanical beam, we measure the beam’s thermally driven vibrations and perturb its motion with optical force fluctuations at a level dictated by the Heisenberg measurement-disturbance uncertainty relation. We demonstrate a cross-correlation technique to distinguish optically driven motion from thermally driven motion, observing this quantum backaction signature up to room temperature. We use the scale of the quantum correlations, which is determined by fundamental constants, to gauge the size of thermal motion, demonstrating a path toward absolute thermometry with quantum mechanically calibrated ticks.
Measures and Relative Motions of Some Mostly F. G. W. Struve Doubles
NASA Astrophysics Data System (ADS)
Wiley, E. O.
2012-04-01
Measures of 59 pairs of double stars with long observational histories, obtained using "lucky imaging" techniques, are reported. The relative motions of the 59 pairs are investigated using their observational histories, scatter plots of relative motion, and ordinary least-squares (OLS) and total proper motion analyses performed in "R," an open-source programming language. A scatter plot of the coefficients of determination derived from the OLS y|epoch and OLS x|epoch fits clearly separates common proper motion pairs from optical pairs and what are termed "long-period binary candidates." Differences in proper motion separate optical pairs from long-period binary candidates. An Appendix details how to use known rectilinear pairs as calibration pairs for the program REDUC.
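The OLS-versus-epoch statistic described above reduces to a coefficient of determination per coordinate. The following is a minimal numpy sketch of that computation; the synthetic drift, scatter, and epochs are illustrative assumptions, not values from the paper:

```python
import numpy as np

def r_squared(epoch, coord):
    """Coefficient of determination for an OLS fit of a coordinate vs. epoch."""
    slope, intercept = np.polyfit(epoch, coord, 1)
    pred = slope * epoch + intercept
    ss_res = np.sum((coord - pred) ** 2)
    ss_tot = np.sum((coord - np.mean(coord)) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic example: a pair drifting linearly in x with small measurement scatter.
# Steady rectilinear relative motion yields R^2 near 1; a bound orbit would not.
epoch = np.linspace(1900.0, 2010.0, 40)
rng = np.random.default_rng(0)
x = 0.05 * (epoch - 1900.0) + rng.normal(0.0, 0.1, epoch.size)
r2_x = r_squared(epoch, x)
```

A scatter plot of such R² values for x and y is what separates the motion classes in the analysis above.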
NASA Astrophysics Data System (ADS)
Barati, H.; Wu, M.; Kharicha, A.; Ludwig, A.
2016-07-01
Turbulent fluid flow due to electromagnetic forces in an induction crucible furnace (ICF) is modeled using the k-ɛ, k-ω SST and Large Eddy Simulation (LES) turbulence models. The flow patterns calculated by the different turbulence models, and their effects on the motion of non-metallic inclusions (NMI) in the bulk melt, have been investigated. Results show that the conventional k-ɛ model cannot resolve the transient flow in the ICF properly. With the k-ω SST model, the transient flow and the oscillation behavior of the flow pattern can be resolved, and the motion of NMI can be tracked fairly well. The LES model delivers the best result for both the details of the transient flow pattern and the motion trajectories of NMI, without limitation on NMI size. The drawback of the LES model is its long calculation time. Therefore, for the general purpose of estimating the dynamic behavior of NMI in an ICF, both k-ω SST and LES are recommended. For the precise calculation of the motion of NMI smaller than 10 μm, only the LES model is appropriate.
Crossed beam roof target for motion tracking
NASA Technical Reports Server (NTRS)
Olczak, Eugene (Inventor)
2009-01-01
A system for detecting motion between a first body and a second body includes first and second detector-emitter pairs, disposed on the first body, and configured to transmit and receive first and second optical beams, respectively. At least a first optical rotator is disposed on the second body and configured to receive and reflect at least one of the first and second optical beams. First and second detectors of the detector-emitter pairs are configured to detect the first and second optical beams, respectively. Each of the first and second detectors is configured to detect motion between the first and second bodies in multiple degrees of freedom (DOFs). The first optical rotator includes a V-notch oriented to form an apex of an isosceles triangle with respect to a base of the isosceles triangle formed by the first and second detector-emitter pairs. The V-notch is configured to receive the first optical beam and reflect the first optical beam to both the first and second detectors. The V-notch is also configured to receive the second optical beam and reflect the second optical beam to both the first and second detectors.
George, David L.; Iverson, Richard M.
2011-01-01
Pore-fluid pressure plays a crucial role in debris flows because it counteracts normal stresses at grain contacts and thereby reduces intergranular friction. Pore-pressure feedback accompanying debris deformation is particularly important during the onset of debris-flow motion, when it can dramatically influence the balance of forces governing downslope acceleration. We consider further effects of this feedback by formulating a new, depth-averaged mathematical model that simulates coupled evolution of granular dilatancy, solid and fluid volume fractions, pore-fluid pressure, and flow depth and velocity during all stages of debris-flow motion. To illustrate implications of the model, we use a finite-volume method to compute one-dimensional motion of a debris flow descending a rigid, uniformly inclined slope, and we compare model predictions with data obtained in large-scale experiments at the USGS debris-flow flume. Predictions for the first 1 s of motion show that increasing pore pressures (due to debris contraction) cause liquefaction that enhances flow acceleration. As acceleration continues, however, debris dilation causes dissipation of pore pressures, and this dissipation helps stabilize debris-flow motion. Our numerical predictions of this process match experimental data reasonably well, but predictions might be improved by accounting for the effects of grain-size segregation.
Eye-motion-corrected optical coherence tomography angiography using Lissajous scanning.
Chen, Yiwei; Hong, Young-Joo; Makita, Shuichi; Yasuno, Yoshiaki
2018-03-01
To correct eye motion artifacts in en face optical coherence tomography angiography (OCT-A) images, a Lissajous scanning method with subsequent software-based motion correction is proposed. The standard Lissajous scanning pattern is modified to be compatible with OCT-A and a corresponding motion correction algorithm is designed. The effectiveness of our method was demonstrated by comparing en face OCT-A images with and without motion correction. The method was further validated by comparing motion-corrected images with scanning laser ophthalmoscopy images, and the repeatability of the method was evaluated using a checkerboard image. A motion-corrected en face OCT-A image from a blinking case is presented to demonstrate the ability of the method to deal with eye blinking. Results show that the method can produce accurate motion-free en face OCT-A images of the posterior segment of the eye in vivo.
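The standard Lissajous trajectory underlying this scanning scheme is simple to generate. The sketch below shows only the textbook pattern; the OCT-A-compatible modification and the motion-correction algorithm from the paper are not reproduced, and the frequency ratio chosen here is an arbitrary illustration:

```python
import numpy as np

def lissajous_scan(fx, fy, n_samples, duration):
    """Sample a standard Lissajous trajectory:
    x = sin(2*pi*fx*t), y = sin(2*pi*fy*t)."""
    t = np.linspace(0.0, duration, n_samples, endpoint=False)
    x = np.sin(2.0 * np.pi * fx * t)
    y = np.sin(2.0 * np.pi * fy * t)
    return x, y

# A near-unity frequency ratio produces a slowly precessing pattern whose
# repeated crossings of the same retinal locations enable motion estimation.
x, y = lissajous_scan(fx=11.0, fy=10.0, n_samples=5000, duration=1.0)
```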
Robust Models for Optic Flow Coding in Natural Scenes Inspired by Insect Biology
Brinkworth, Russell S. A.; O'Carroll, David C.
2009-01-01
The extraction of accurate self-motion information from the visual world is a difficult problem that has been solved very efficiently by biological organisms utilizing non-linear processing. Previous bio-inspired models for motion detection based on a correlation mechanism have been dogged by their sensitivity to undesired properties of the image, such as contrast, which vary widely between images. Here we present a model with multiple levels of non-linear dynamic adaptive components based directly on the known or suspected responses of neurons within the visual motion pathway of the fly brain. By testing the model under realistic high-dynamic-range conditions we show that the addition of these elements makes the motion detection model robust across a large variety of images, velocities and accelerations. Furthermore, the performance of the entire system exceeds the incremental improvements offered by the individual components, indicating beneficial non-linear interactions between processing stages. The algorithms underlying the model can be implemented in either digital or analog hardware, including neuromorphic analog VLSI, but defy an analytical solution due to their dynamic non-linear operation. This algorithm has applications in the development of miniature autonomous systems in defense and civilian roles, including robotics, miniature unmanned aerial vehicles and collision avoidance sensors. PMID:19893631
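The correlation mechanism this model builds on is the classic Hassenstein-Reichardt elementary motion detector. Below is a minimal opponent-correlator sketch with synthetic sinusoidal inputs; the adaptive non-linear stages that are the paper's actual contribution are not shown, and all signal parameters are illustrative:

```python
import numpy as np

def reichardt_output(signal_a, signal_b, delay):
    """Opponent correlation of two neighbouring photoreceptor signals.
    A positive mean output indicates motion from A towards B."""
    a_delayed = np.roll(signal_a, delay)
    b_delayed = np.roll(signal_b, delay)
    return float(np.mean(a_delayed * signal_b - b_delayed * signal_a))

# A 5 Hz grating drifting so that B sees the stimulus 50 ms after A
# (periodic signals, so the circular shift in np.roll is exact here).
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
phase_lag = 0.05
a = np.sin(2.0 * np.pi * 5.0 * t)
b = np.sin(2.0 * np.pi * 5.0 * (t - phase_lag))
response_fwd = reichardt_output(a, b, delay=20)   # motion A -> B: positive
response_rev = reichardt_output(b, a, delay=20)   # motion B -> A: negative
```

The sign of the opponent output encodes direction, which is the property the fly-brain motion pathway exploits.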
Imaging of conformational changes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michl, Josef
2016-03-13
Control of intramolecular conformational change in a small number of molecules or even a single one by an application of an outside electric field defined by potentials on nearby metal or dielectric surfaces has potential applications in both 3-D and 2-D nanotechnology. Specifically, the synthesis, characterization, and understanding of designed solids with controlled built-in internal rotational motion of a dipole promises a new class of materials with intrinsic dielectric, ferroelectric, optical and optoelectronic properties not found in nature. Controlled rotational motion is of great interest due to its expected utility in phenomena as diverse as transport, current flow in molecular junctions, diffusion in microfluidic channels, and rotary motion in molecular machines. A direct time-resolved observation of the dynamics of motion on ps or ns time scale in a single molecule would be highly interesting but is also very difficult and has yet to be accomplished. Much can be learned from an easier but still challenging comparison of directly observed initial and final orientational states of a single molecule, which is the basis of this project. The project also impacts the understanding of surface-enhanced Raman spectroscopy (SERS) and single-molecule spectroscopic detection, as well as the synthesis of solid-state materials with tailored properties from designed precursors.
4D-CT motion estimation using deformable image registration and 5D respiratory motion modeling.
Yang, Deshan; Lu, Wei; Low, Daniel A; Deasy, Joseph O; Hope, Andrew J; El Naqa, Issam
2008-10-01
Four-dimensional computed tomography (4D-CT) imaging technology has been developed for radiation therapy to provide tumor and organ images at the different breathing phases. In this work, a procedure is proposed for estimating and modeling the respiratory motion field from acquired 4D-CT imaging data and predicting tissue motion at the different breathing phases. The 4D-CT image data consist of a series of multislice CT volume segments acquired in ciné mode. A modified optical flow deformable image registration algorithm is used to compute the image motion from the CT segments to a common full-volume 3D-CT reference. This reference volume is reconstructed using the acquired 4D-CT data at the end-of-exhalation phase. The segments are optimally aligned to the reference volume according to a proposed a priori alignment procedure. The registration is applied using a multigrid approach and a feature-preserving image downsampling maxfilter to achieve better computational speed and higher registration accuracy. The registration accuracy is about 1.1 ± 0.8 mm for the lung region according to our verification using manually selected landmarks and artificially deformed CT volumes. The estimated motion fields are fitted to two 5D (spatial 3D + tidal volume + airflow rate) motion models: a forward model and an inverse model. The forward model predicts tissue movements and the inverse model predicts CT density changes as a function of tidal volume and airflow rate. A leave-one-out procedure is used to validate these motion models. The estimated modeling prediction errors are about 0.3 mm for the forward model and 0.4 mm for the inverse model.
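The forward 5D model amounts to expressing each voxel's displacement as a linear function of tidal volume and airflow rate and fitting the two coefficients by least squares. The sketch below does this for one voxel with a synthetic breathing cycle; the coefficient values, cycle shape, and noise level are assumptions for illustration only:

```python
import numpy as np

# Synthetic 4 s breathing cycle sampled at 20 phases
t = np.linspace(0.0, 4.0, 20, endpoint=False)
tidal_volume = 0.25 * (1.0 - np.cos(2.0 * np.pi * t / 4.0))   # litres
airflow = np.gradient(tidal_volume, t)                        # L/s

# Hypothetical ground-truth coefficients for one lung voxel
alpha_true, beta_true = 8.0, 1.5                              # mm/L, mm/(L/s)
rng = np.random.default_rng(1)
displacement = (alpha_true * tidal_volume + beta_true * airflow
                + rng.normal(0.0, 0.02, t.size))              # observed, mm

# Least-squares fit of d = alpha * v + beta * f for this voxel
design = np.column_stack([tidal_volume, airflow])
(alpha_fit, beta_fit), *_ = np.linalg.lstsq(design, displacement, rcond=None)
```

In the paper this fit is performed per voxel over the registered motion fields; here a single voxel stands in for the whole field.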
Real-Time Observation of Internal Motion within Ultrafast Dissipative Optical Soliton Molecules
NASA Astrophysics Data System (ADS)
Krupa, Katarzyna; Nithyanandan, K.; Andral, Ugo; Tchofo-Dinda, Patrice; Grelu, Philippe
2017-06-01
Real-time access to the internal ultrafast dynamics of complex dissipative optical systems opens new explorations of pulse-pulse interactions and dynamic patterns. We present the first direct experimental evidence of the internal motion of a dissipative optical soliton molecule generated in a passively mode-locked erbium-doped fiber laser. We map the internal motion of a soliton pair molecule by using a dispersive Fourier-transform imaging technique, revealing different categories of internal pulsations, including vibrationlike and phase drifting dynamics. Our experiments agree well with numerical predictions and bring insights to the analogy between self-organized states of lights and states of the matter.
Javidi, Bahram; Markman, Adam; Rawat, Siddharth; O'Connor, Timothy; Anand, Arun; Andemariam, Biree
2018-05-14
We present a spatio-temporal analysis of cell membrane fluctuations to distinguish healthy patients from patients with sickle cell disease. A video hologram containing either healthy red blood cells (h-RBCs) or sickle cell disease red blood cells (SCD-RBCs) was recorded using a low-cost, compact, 3D printed shearing interferometer. Reconstructions were created for each hologram frame (time steps), forming a spatio-temporal data cube. Features were extracted by computing the standard deviations and the mean of the height fluctuations over time and for every location on the cell membrane, resulting in two-dimensional standard deviation and mean maps, followed by taking the standard deviations of these maps. The optical flow algorithm was used to estimate the apparent motion fields between subsequent frames (reconstructions). The standard deviation of the magnitude of the optical flow vectors across all frames was then computed. In addition, seven morphological cell (spatial) features based on optical path length were extracted from the cells to further improve the classification accuracy. A random forest classifier was trained to perform cell identification to distinguish between SCD-RBCs and h-RBCs. To the best of our knowledge, this is the first report of machine learning assisted cell identification and diagnosis of sickle cell disease based on cell membrane fluctuations and morphology using both spatio-temporal and spatial analysis.
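The core spatio-temporal features described above are the standard deviations of per-pixel temporal standard-deviation and mean maps. A minimal numpy sketch on synthetic height cubes follows; the cube shapes, fluctuation amplitudes, and the premise that diseased cells fluctuate less are illustrative assumptions, and the random-forest classification and optical-flow features are not reproduced:

```python
import numpy as np

def membrane_fluctuation_features(height_cube):
    """Features from a (time, y, x) cube of reconstructed membrane heights:
    the std of the temporal-std map and the std of the temporal-mean map."""
    std_map = np.std(height_cube, axis=0)    # temporal std at each point
    mean_map = np.mean(height_cube, axis=0)  # temporal mean at each point
    return float(np.std(std_map)), float(np.std(mean_map))

# Synthetic cubes: one cell with larger membrane fluctuations than the other
rng = np.random.default_rng(2)
cell_large = rng.normal(0.0, 1.0, (50, 16, 16))
cell_small = rng.normal(0.0, 0.2, (50, 16, 16))
f_large = membrane_fluctuation_features(cell_large)
f_small = membrane_fluctuation_features(cell_small)
```

Feature vectors like these (plus morphological features) would then be fed to the classifier.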
The Dynamics of Flow and Three-dimensional Motion Around a Morphologically Complex Aquatic Plant
NASA Astrophysics Data System (ADS)
Boothroyd, R.; Hardy, R. J.; Warburton, J.; Marjoribanks, T.
2016-12-01
Aquatic vegetation has a significant impact on the hydraulic functioning of river systems. The morphology of an individual plant can influence the mean and turbulent properties of the flow, and the plant posture reconfigures to minimise drag. We report findings from a flume and numerical experiment investigating the dynamics of motion and three-dimensional flow around an isolated Hebe odora plant over a range of flow conditions. In the flume experiment, a high definition video camera recorded plant motion dynamics and three-dimensional velocity profiles were measured using an acoustic Doppler velocimeter. By producing a binary image of the plant in each frame, the plant dynamics can be quantified. Zones of greatest plant motion are on the upper and leeward sides of the plant. With increasing flow the plant is compressed and deflected downwards by up to 18% of the unstressed height. Plant tip motions are tracked and shown to lengthen with increasing flow, transitioning from horizontally dominated to vertically dominated motion. The plant acts as a porous blockage to flow, producing spatially heterogeneous downstream velocity fields with the measured wake length decreasing by 20% with increasing flow. These measurements are then used as boundary conditions and to validate a computational fluid dynamics (CFD) model. By explicitly accounting for the time-averaged plant posture, good agreement is found between flume measurements and model predictions. The flow structures demonstrate characteristics of a junction vortex system, with plant shear layer turbulence dominated by Kelvin-Helmholtz and Görtler-type vortices generated through shear instability. With increasing flow, drag coefficients decrease by up to 8%, from 1.45 to 1.34. This is equivalent to a change in the Manning's n term from 0.086 to 0.078.
Electro-Optic Segment-Segment Sensors for Radio and Optical Telescopes
NASA Technical Reports Server (NTRS)
Abramovici, Alex
2012-01-01
A document discusses an electro-optic sensor that consists of a collimator, attached to one segment, and a quad diode, attached to an adjacent segment. Relative segment-segment motion causes the beam from the collimator to move across the quad diode, thus generating a measurable electric signal. This sensor type, which is relatively inexpensive, can be configured as an edge sensor, or as a remote segment-segment motion sensor.
A photogrammetric technique for generation of an accurate multispectral optical flow dataset
NASA Astrophysics Data System (ADS)
Kniaz, V. V.
2017-06-01
The presence of an accurate dataset is the key requirement for the successful development of an optical flow estimation algorithm. A large number of freely available optical flow datasets were developed in recent years and gave rise to many powerful algorithms. However, most of these datasets include only images captured in the visible spectrum. This paper is focused on the creation of a multispectral optical flow dataset with accurate ground truth. The generation of accurate ground truth optical flow is a rather complex problem, as no device for error-free optical flow measurement has been developed to date. Existing methods for ground truth optical flow estimation are based on hidden textures, 3D modelling or laser scanning. Such techniques either work only with synthetic optical flow or provide only a sparse ground truth. In this paper a new photogrammetric method for the generation of accurate ground truth optical flow is proposed. The method combines the accuracy and density of synthetic optical flow datasets with the flexibility of laser-scanning-based techniques. A multispectral dataset including various image sequences was generated using the developed method. The dataset is freely available on the accompanying web site.
Measuring flow velocity and flow direction by spatial and temporal analysis of flow fluctuations.
Chagnaud, Boris P; Brücker, Christoph; Hofmann, Michael H; Bleckmann, Horst
2008-04-23
If exposed to bulk water flow, fish lateral line afferents respond only to flow fluctuations (AC) and not to the steady (DC) component of the flow. Consequently, a single lateral line afferent can encode neither bulk flow direction nor velocity. It is possible, however, for a fish to obtain bulk flow information using multiple afferents that respond only to flow fluctuations. We show by means of particle image velocimetry that, if a flow contains fluctuations, these fluctuations propagate with the flow. A cross-correlation of water motion measured at an upstream point with that at a downstream point can then provide information about flow velocity and flow direction. In this study, we recorded from pairs of primary lateral line afferents while a fish was exposed to either bulk water flow, or to the water motion caused by a moving object. We confirm that lateral line afferents responded to the flow fluctuations and not to the DC component of the flow, and that responses of many fiber pairs were highly correlated, if they were time-shifted to correct for gross flow velocity and gross flow direction. To prove that a cross-correlation mechanism can be used to retrieve the information about gross flow velocity and direction, we measured the flow-induced bending motions of two flexible micropillars separated in a downstream direction. A cross-correlation of the bending motions of these micropillars did indeed produce an accurate estimate of the velocity vector along the direction of the micropillars.
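The cross-correlation mechanism described above, in which the transit time of advected fluctuations between two downstream-separated sensors yields the flow velocity, can be sketched directly. The sensor spacing, sampling rate, and advection lag below are synthetic illustrations, not values from the study:

```python
import numpy as np

def flow_velocity(upstream, downstream, dt, sensor_spacing):
    """Estimate flow speed from the lag that maximises the cross-correlation
    between an upstream and a downstream fluctuation signal."""
    n = len(upstream)
    lags = np.arange(-(n - 1), n)
    xcorr = np.correlate(downstream - downstream.mean(),
                         upstream - upstream.mean(), mode="full")
    best_lag = lags[np.argmax(xcorr)]
    return sensor_spacing / (best_lag * dt)   # positive: upstream -> downstream

# A random fluctuation signal advected 25 samples downstream
rng = np.random.default_rng(3)
base = rng.normal(0.0, 1.0, 500)
up = base
down = np.roll(base, 25)                      # downstream sees it 25 samples later
v = flow_velocity(up, down, dt=0.001, sensor_spacing=0.01)   # s, m -> 0.4 m/s
```

The sign of the best lag also recovers the flow direction, as the afferent-pair and micropillar experiments demonstrate.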
Postural control and perceptive configuration: influence of expertise in gymnastics.
Gautier, Geoffroy; Thouvarecq, Régis; Vuillerme, Nicolas
2008-07-01
The purpose of the present experiment was to investigate how postural adaptations to the perceptive configuration are modified by specific gymnastics experience. Two groups, one expert in gymnastics and the other non-expert, had to maintain an erect posture while optical flow was imposed as follows: 20 s motionless, 30 s of approaching motion, and 20 s motionless. Centre-of-pressure and head displacements were analysed. Postural adaptations were characterised by the variability of movements for the flow conditions and by the postural latencies at the flow transitions. The results showed that the gymnasts tended to minimise their body movements and were more stationary (head) but not more stable (COP) than the non-gymnasts. These results suggest that gymnastics experience develops a specific postural adaptability relative to the perceptive configuration. We conclude that specific postural experience can be considered an intrinsic constraint, which leads to modification of the patterns of functional adaptation in the perceptive motor space.
Equations of motion for the variable mass flow-variable exhaust velocity rocket
NASA Technical Reports Server (NTRS)
Tempelman, W. H.
1972-01-01
An equation of motion for a one-dimensional rocket is derived as a function of the mass flow rate into the acceleration chamber and the velocity distribution along the chamber, thereby including the transient flow changes in the chamber. The derivation of the mass density requires the introduction of a special time coordinate. The equation of motion is derived from both classical force and momentum approaches and is shown to be consistent with the standard equation expressed in terms of flow parameters at the exit of the acceleration chamber.
NASA Astrophysics Data System (ADS)
Ornelas, Danielle; Hasan, Md.; Gonzalez, Oscar; Krishnan, Giri; Szu, Jenny I.; Myers, Timothy; Hirota, Koji; Bazhenov, Maxim; Binder, Devin K.; Park, Boris H.
2017-02-01
Epilepsy is a chronic neurological disorder characterized by recurrent and unpredictable seizures. Electrophysiology has remained the gold standard of neural activity detection but its resolution and high susceptibility to noise and motion artifact limit its efficiency. Optical imaging techniques, including fMRI, intrinsic optical imaging, and diffuse optical imaging, have also been used to detect neural activity yet these techniques rely on the indirect measurement of changes in blood flow. A more direct optical imaging technique is optical coherence tomography (OCT), a label-free, high resolution, and minimally invasive imaging technique that can produce depth-resolved cross-sectional and 3D images. In this study, OCT was used to detect non-vascular depth-dependent optical changes in cortical tissue during 4-aminopyridine (4-AP) induced seizure onset. Calculations of localized optical attenuation coefficient (µ) allow for the assessment of depth-resolved volumetric optical changes in seizure induced cortical tissue. By utilizing the depth-dependency of the attenuation coefficient, we demonstrate the ability to locate and remove the optical effects of vasculature within the upper regions of the cortex on the attenuation calculations of cortical tissue in vivo. The results of this study reveal a significant depth-dependent decrease in attenuation coefficient of nonvascular cortical tissue both ex vivo and in vivo. Regions exhibiting decreased attenuation coefficient show significant temporal correlation to regions of increased electrical activity during seizure onset and progression. This study allows for a more thorough and biologically relevant analysis of the optical signature of seizure activity in vivo using OCT.
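A localized attenuation coefficient like the µ used above is commonly estimated by fitting an exponential decay to the depth-resolved OCT signal. The sketch below uses a log-linear least-squares fit under a single-scattering model I(z) ∝ exp(-2µz); the round-trip factor of 2, the tissue value, and the noise model are assumptions for illustration and may differ from the paper's estimator:

```python
import numpy as np

def attenuation_coefficient(depth_mm, intensity):
    """Estimate mu (1/mm) from a single-scattering model I(z) ~ exp(-2*mu*z)
    via a log-linear least-squares fit over depth."""
    slope, _ = np.polyfit(depth_mm, np.log(intensity), 1)
    return -slope / 2.0

# Synthetic A-scan with multiplicative speckle-like noise
z = np.linspace(0.1, 1.0, 50)                 # depth, mm
mu_true = 1.8                                  # 1/mm (hypothetical tissue value)
rng = np.random.default_rng(4)
signal = np.exp(-2.0 * mu_true * z) * rng.lognormal(0.0, 0.05, z.size)
mu_est = attenuation_coefficient(z, signal)
```

A decrease in µ estimated this way over a cortical region is the kind of optical signature the study correlates with seizure activity.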
Microsystem enabled photovoltaic modules and systems
Nielson, Gregory N; Sweatt, William C; Okandan, Murat
2015-05-12
A microsystem enabled photovoltaic (MEPV) module including: an absorber layer; a fixed optic layer coupled to the absorber layer; a translatable optic layer; a translation stage coupled between the fixed and translatable optic layers; and a motion processor electrically coupled to the translation stage that controls motion of the translatable optic layer relative to the fixed optic layer. The absorber layer includes an array of photovoltaic (PV) elements. The fixed optic layer includes an array of quasi-collimating (QC) micro-optical elements designed and arranged to couple incident radiation from an intermediate image formed by the translatable optic layer into one of the PV elements such that it is quasi-collimated. The translatable optic layer includes an array of focusing micro-optical elements corresponding to the QC micro-optical element array. Each focusing micro-optical element is designed to produce a quasi-telecentric intermediate image from substantially collimated radiation incident within a predetermined field of view.
Human pose tracking from monocular video by traversing an image motion mapped body pose manifold
NASA Astrophysics Data System (ADS)
Basu, Saurav; Poulin, Joshua; Acton, Scott T.
2010-01-01
Tracking human pose from monocular video sequences is a challenging problem due to the large number of independent parameters affecting image appearance and nonlinear relationships between generating parameters and the resultant images. Unlike the current practice of fitting interpolation functions to point correspondences between underlying pose parameters and image appearance, we exploit the relationship between pose parameters and image motion flow vectors in a physically meaningful way. Change in image appearance due to pose change is realized as navigating a low dimensional submanifold of the infinite dimensional Lie group of diffeomorphisms of the two dimensional sphere S2. For small changes in pose, image motion flow vectors lie on the tangent space of the submanifold. Any observed image motion flow vector field is decomposed into the basis motion vector flow fields on the tangent space and combination weights are used to update corresponding pose changes in the different dimensions of the pose parameter space. Image motion flow vectors are largely invariant to style changes in experiments with synthetic and real data where the subjects exhibit variation in appearance and clothing. The experiments demonstrate the robustness of our method (within +/-4° of ground truth) to style variance.
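The key step above, decomposing an observed image motion flow field into basis flow fields on the tangent space and using the weights to update pose, is a least-squares projection. Here is a minimal sketch on a toy grid; the basis fields and observed combination are synthetic stand-ins for the manifold tangent vectors of the paper:

```python
import numpy as np

def decompose_flow(observed, basis_fields):
    """Least-squares weights expressing a flattened flow field as a linear
    combination of basis motion fields."""
    design = np.stack([b.ravel() for b in basis_fields], axis=1)
    weights, *_ = np.linalg.lstsq(design, observed.ravel(), rcond=None)
    return weights

# Two synthetic basis flow fields on an 8x8 grid (horizontal/vertical ramps)
yy, xx = np.mgrid[0:8, 0:8].astype(float)
basis = [xx, yy]
observed = 0.7 * basis[0] - 0.3 * basis[1]   # a known combination of the bases
w = decompose_flow(observed, basis)           # recovers [0.7, -0.3]
```

In the tracker, weights like `w` would drive the corresponding updates in the pose parameter space.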
Unsteady flow motions in the supraglottal region during phonation
NASA Astrophysics Data System (ADS)
Luo, Haoxiang; Dai, Hu
2008-11-01
The highly unsteady flow motions in the larynx are not only responsible for producing the fundamental-frequency tone in phonation, but also contribute significantly to the broadband noise in the human voice. In this work, the laryngeal flow is modeled either as an incompressible pulsatile jet confined in a two-dimensional channel, or as a pressure-driven flow modulated by a pair of viscoelastic vocal folds through flow-structure interaction. The flow in the supraglottal region is found to be dominated by large-scale vortices whose unsteady motions significantly deflect the glottal jet. For the flow-structure interaction, a hybrid model based on the immersed-boundary method is developed to simulate the flow-induced vocal fold vibration, which involves a three-dimensional vocal fold prototype and a two-dimensional viscous flow. Both the flow behavior and the vibratory characteristics of the vocal folds will be presented.
Ultrafast large-amplitude relocation of electronic charge in ionic crystals
Zamponi, Flavio; Rothhardt, Philip; Stingl, Johannes; Woerner, Michael; Elsaesser, Thomas
2012-01-01
The interplay of vibrational motion and electronic charge relocation in an ionic hydrogen-bonded crystal is mapped by X-ray powder diffraction with a 100 fs time resolution. Photoexcitation of the prototype material KH2PO4 induces coherent low-frequency motions of the PO4 tetrahedra in the electronically excited state of the crystal while the average atomic positions remain unchanged. Time-dependent maps of electron density derived from the diffraction data demonstrate an oscillatory relocation of electronic charge with a spatial amplitude two orders of magnitude larger than the underlying vibrational lattice motions. Coherent longitudinal optical and transverse optical phonon motions that dephase on a time scale of several picoseconds drive the charge relocation, similar to a soft (transverse optical) mode driven phase transition between the ferro- and paraelectric phases of KH2PO4. PMID:22431621
NASA Technical Reports Server (NTRS)
Morino, Luigi; Bharadvaj, Bala K.; Freedman, Marvin I.; Tseng, Kadin
1988-01-01
The wave equation for an object in arbitrary motion is investigated analytically using a BEM approach, and practical applications to potential flows of compressible fluids around aircraft wings and helicopter rotors are considered. The treatment accounts for arbitrary combined rotational and translational motion of the reference frame and for the wake motion. The numerical implementation as a computer algorithm is demonstrated on problems with prescribed and free wakes, the former in compressible flows and the latter for incompressible flows; results are presented graphically and briefly characterized.
Relative-Motion Sensors and Actuators for Two Optical Tables
NASA Technical Reports Server (NTRS)
Gursel, Yekta; McKenney, Elizabeth
2004-01-01
Optoelectronic sensors and magnetic actuators have been developed as parts of a system for controlling the relative position and attitude of two massive optical tables that float on separate standard air suspensions that attenuate ground vibrations. In the specific application for which these sensors and actuators were developed, one of the optical tables holds an optical system that mimics distant stars, while the other optical table holds a test article that simulates a spaceborne stellar interferometer that would be used to observe the stars. The control system is designed to suppress relative motion of the tables or, on demand, to impose controlled relative motion between the tables. The control system includes a sensor system that detects relative motion of the tables in six independent degrees of freedom and a drive system that can apply force to the star-simulator table in the six degrees of freedom. The sensor system includes (1) a set of laser heterodyne gauges and (2) a set of four diode lasers on the star-simulator table, each aimed at one of four quadrant photodiodes at nominal corresponding positions on the test-article table. The heterodyne gauges are used to measure relative displacements along the x axis.
NASA Astrophysics Data System (ADS)
Zboray, Robert; Dangendorf, Volker; Mor, Ilan; Bromberger, Benjamin; Tittelmeier, Kai
2015-07-01
In a previous work, we demonstrated the feasibility of high-frame-rate, fast-neutron radiography of generic air-water two-phase flows in a 1.5 cm thick, rectangular flow channel. The experiments were carried out at the high-intensity, white-beam facility of the Physikalisch-Technische Bundesanstalt, Germany, using a multi-frame, time-resolved detector developed for fast-neutron resonance radiography. The results were not fully optimal, however, so we modified the detector and optimized it for the given application, as described in the present work. We also improved the image post-processing methodology and the noise suppression. Using the tailored detector and the improved post-processing, a significant increase in image quality and an order-of-magnitude reduction in exposure time, down to 3.33 ms, have been achieved with minimized motion artifacts. As in the previous study, different two-phase flow regimes such as bubbly, slug, and churn flows were examined. The enhanced imaging quality enables improved prediction of two-phase flow parameters such as the instantaneous volumetric gas fraction, bubble size, and bubble velocities. Instantaneous velocity fields around the gas enclosures can also be predicted more robustly using optical flow methods, as before.
Li, Yixian; Qi, Lehua; Song, Yongshan; Chao, Xujiang
2017-06-01
The components of carbon/carbon (C/C) composites have a significant influence on their thermal and mechanical properties, so quantitative characterization of the components is necessary to study the microstructure of C/C composites and, further, to improve their macroscopic properties. Because the extinction crosses of the pyrocarbon matrix exhibit significant motion features, polarized light microscope (PLM) video is used to characterize C/C composites quantitatively, as it contains sufficient dynamic and structural information. The optical flow method is introduced to compute the optical flow field between adjacent frames and to segment the components of C/C composites from the PLM images by image processing. The matrix with different textures is re-segmented by the length difference of the motion vectors, and the component fraction of each component and the extinction angle of the pyrocarbon matrix are then calculated directly. Finally, the C/C composites are successfully characterized in terms of carbon fiber, pyrocarbon, and pores by a series of image-processing operators based on PLM video, with errors in the component fractions of less than 15%. © 2017 Wiley Periodicals, Inc.
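The abstract does not give implementation details of its optical flow step; a minimal numpy-only sketch of dense flow between adjacent frames (classic windowed Lucas-Kanade, not necessarily the variant the authors used), followed by the magnitude-based segmentation the abstract describes:

```python
import numpy as np

def lucas_kanade_flow(f1, f2, win=7):
    """Per-pixel optical flow between two frames via windowed Lucas-Kanade."""
    f1, f2 = f1.astype(float), f2.astype(float)
    Iy, Ix = np.gradient(f1)              # spatial gradients (rows, cols)
    It = f2 - f1                          # temporal gradient
    half = win // 2
    u = np.zeros_like(f1)                 # column (x) component of flow
    v = np.zeros_like(f1)                 # row (y) component of flow
    for r in range(half, f1.shape[0] - half):
        for c in range(half, f1.shape[1] - half):
            sl = (slice(r - half, r + half + 1), slice(c - half, c + half + 1))
            A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
            b = It[sl].ravel()
            ATA = A.T @ A
            if np.linalg.cond(ATA) < 1e6:  # skip ill-conditioned (flat) windows
                u[r, c], v[r, c] = -np.linalg.solve(ATA, A.T @ b)
    return u, v

def segment_moving(u, v, thresh):
    """Binary mask of pixels whose motion-vector length exceeds thresh,
    mirroring the abstract's segmentation by motion-vector length."""
    return np.hypot(u, v) > thresh
```

A production pipeline would add pyramids and smoothing; this sketch only illustrates the core least-squares step.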
Clinical measurement of the dart throwing motion of the wrist: variability, accuracy and correction.
Vardakastani, Vasiliki; Bell, Hannah; Mee, Sarah; Brigstocke, Gavin; Kedgley, Angela E
2018-01-01
Despite being functionally important, the dart throwing motion is difficult to assess accurately through goniometry. The objectives of this study were to describe a method for reliably quantifying the dart throwing motion using goniometric measurements within a healthy population. Wrist kinematics of 24 healthy participants were assessed using goniometry and optical motion tracking. Three wrist angles were measured at the starting and ending points of the motion: flexion-extension, radial-ulnar deviation and dart throwing motion angle. The orientation of the dart throwing motion plane relative to the flexion-extension axis ranged between 28° and 57° among the tested population. Plane orientations derived from optical motion capture differed from those calculated through goniometry by 25°. An equation to correct the estimation of the plane from goniometry measurements was derived. This was applied and differences in the orientation of the plane were reduced to non-significant levels, enabling the dart throwing motion to be measured using goniometry alone.
Modification of equation of motion of fluid-conveying pipe for laminar and turbulent flow profiles
NASA Astrophysics Data System (ADS)
Guo, C. Q.; Zhang, C. H.; Païdoussis, M. P.
2010-07-01
Considering the non-uniformity of the flow velocity distribution in fluid-conveying pipes caused by the viscosity of real fluids, the centrifugal force term in the equation of motion of the pipe is modified for laminar and turbulent flow profiles. The flow-profile-modification factors are found to be 1.333, 1.015-1.040 and 1.035-1.055 for laminar flow in circular pipes, turbulent flow in smooth-wall circular pipes and turbulent flow in rough-wall circular pipes, respectively. The critical flow velocities for divergence in the above-mentioned three cases are found to be 13.4%, 0.74-1.9% and 1.7-2.6%, respectively, lower than that with plug flow, while those for flutter are even lower, which could reach 36% for the laminar flow profile. By introducing two new concepts of equivalent flow velocity and equivalent mass, fluid-conveying pipe problems with different flow profiles can be solved with the equation of motion for plug flow.
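The modification described above can be sketched against the standard plug-flow equation of motion for a fluid-conveying pipe (a textbook form, not necessarily the authors' exact notation): with EI the flexural rigidity, M and m the fluid and pipe masses per unit length, U the mean flow velocity, and w(x,t) the lateral deflection, only the centrifugal term is scaled by the flow-profile factor β, with β = 1 recovering plug flow:

```latex
EI\,\frac{\partial^4 w}{\partial x^4}
  + \beta\, M U^2\,\frac{\partial^2 w}{\partial x^2}
  + 2 M U\,\frac{\partial^2 w}{\partial x\,\partial t}
  + (M+m)\,\frac{\partial^2 w}{\partial t^2} = 0,
\qquad
\beta \approx
\begin{cases}
1.333 & \text{laminar, circular pipe}\\
1.015\text{--}1.040 & \text{turbulent, smooth wall}\\
1.035\text{--}1.055 & \text{turbulent, rough wall}
\end{cases}
```

The β values are those quoted in the abstract; the factor exceeds unity because a non-uniform velocity profile carries more momentum flux than plug flow at the same mean velocity.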
Measurement of six-degree-of-freedom planar motions by using a multiprobe surface encoder
NASA Astrophysics Data System (ADS)
Li, Xinghui; Shimizu, Yuki; Ito, Takeshi; Cai, Yindi; Ito, So; Gao, Wei
2014-12-01
A multiprobe surface encoder for optical metrology of six-degree-of-freedom (six-DOF) planar motions is presented. The surface encoder is composed of an XY planar scale grating with identical microstructures in the X- and Y-axes and an optical sensor head. In the optical sensor head, three parallel laser beams serve as laser probes. After being divided by a beam splitter, the three laser probes are projected onto the scale grating and onto a reference grating with identical microstructures. For each probe, the first-order positive and negative diffraction beams along the X- and Y-directions from the scale grating and from the reference grating superimpose on each other, generating four interference signals. Three-DOF translational motions of the scale grating, Δx, Δy, and Δz, can be obtained simultaneously from the interference signals of each probe. Three-DOF angular error motions θX, θY, and θZ can also be calculated simultaneously from the differences among the displacement outputs and the geometric relationship among the three probes. A prototype optical sensor head was designed, constructed, and evaluated. Experimental results verified that this surface encoder provides sub-nanometer measurement resolution for the three-DOF translational motions and better than 0.1 arc sec for the three-DOF angular error motions.
Disentangling Random Motion and Flow in a Complex Medium
Koslover, Elena F.; Chan, Caleb K.; Theriot, Julie A.
2016-01-01
We describe a technique for deconvolving the stochastic motion of particles from large-scale fluid flow in a dynamic environment such as that found in living cells. The method leverages the separation of timescales to subtract out the persistent component of motion from single-particle trajectories. The mean-squared displacement of the resulting trajectories is rescaled so as to enable robust extraction of the diffusion coefficient and subdiffusive scaling exponent of the stochastic motion. We demonstrate the applicability of the method for characterizing both diffusive and fractional Brownian motion overlaid by flow and analytically calculate the accuracy of the method in different parameter regimes. This technique is employed to analyze the motion of lysosomes in motile neutrophil-like cells, showing that the cytoplasm of these cells behaves as a viscous fluid at the timescales examined. PMID:26840734
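The core of the technique — subtract the persistent component, then extract the diffusion coefficient and scaling exponent from the rescaled MSD — can be illustrated with a minimal numpy sketch. The synthetic trajectory, parameter values, and the simple least-squares drift model are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def time_averaged_msd(traj, max_lag):
    """Time-averaged mean-squared displacement of a 1D trajectory."""
    lags = np.arange(1, max_lag)
    return lags, np.array([np.mean((traj[l:] - traj[:-l]) ** 2) for l in lags])

rng = np.random.default_rng(0)
dt, D, v_flow = 0.1, 0.5, 2.0            # s, um^2/s, um/s (illustrative)
t = np.arange(4000) * dt
# diffusive motion overlaid by a persistent flow component
x = v_flow * t + np.cumsum(rng.normal(0.0, np.sqrt(2 * D * dt), t.size))

# subtract the persistent (slow) component -- here a least-squares line
drift = np.polyval(np.polyfit(t, x, 1), t)
lags, msd = time_averaged_msd(x - drift, max_lag=200)

# MSD ~ 2 K tau^alpha: the log-log slope is the subdiffusive scaling exponent
alpha, intercept = np.polyfit(np.log(lags * dt), np.log(msd), 1)
K = np.exp(intercept) / 2                # generalized diffusion coefficient
```

For pure diffusion the fit recovers alpha near 1 and K near the input D; in a viscoelastic cytoplasm, alpha below 1 would signal subdiffusion.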
Programmable micrometer-sized motor array based on live cells.
Xie, Shuangxi; Wang, Xiaodong; Jiao, Niandong; Tung, Steve; Liu, Lianqing
2017-06-13
Trapping and transporting microorganisms with intrinsic motility are important tasks for biological, physical, and biomedical applications. However, their fast swimming speed makes the manipulation of these organisms an inherently challenging task. In this study, we demonstrated that an optoelectrical technique, namely optically induced dielectrophoresis (ODEP), could effectively trap and manipulate Chlamydomonas reinhardtii (C. reinhardtii) cells swimming at velocities faster than 100 μm s⁻¹. Furthermore, live C. reinhardtii cells trapped by ODEP can form a micrometer-sized motor array. The rotating frequency of the cells ranges from 50 to 120 rpm and can be reversibly adjusted, with a fast response, by varying the optical intensity. Functional flagella were shown to play a decisive role in the rotation. The programmable cell array with rotating motion can be used as a bio-micropump to drive liquid flow in microfluidic chips and may shed new light on bio-actuation.
NASA Astrophysics Data System (ADS)
Kuruliuk, K. A.; Kulesh, V. P.
2016-10-01
An optical videogrammetry method using one digital camera was developed for non-contact measurement of geometric shape parameters, position, and motion of models and structural elements of aircraft in experimental aerodynamics. Tests using this method to measure the six components (three linear and three angular) of the actual position of a helicopter device in a wind-tunnel flow were conducted, with a distance of 15 m between the camera and the test object. It was shown in practice that, under the conditions of an aerodynamic experiment, the instrumental measurement error (standard deviation) for angular and linear displacements of the helicopter device does not exceed 0.02° and 0.3 mm, respectively. Analysis of the results shows that at minimum rotor thrust the deviations are systematic and generally lie within ±0.2°. Deviations of the angle values grow with increasing rotor thrust.
Surface flow observations from a gauge-cam station on the Tiber river
NASA Astrophysics Data System (ADS)
Tauro, Flavia; Porfiri, Maurizio; Petroselli, Andrea; Grimaldi, Salvatore
2016-04-01
Understanding the kinematic organization of natural water bodies is central to hydrology and environmental engineering practice. Reliable and continuous flow observations are essential to comprehend flood generation and propagation mechanisms, erosion dynamics, sediment transport, and drainage network evolution. In engineering practice, flood warning systems largely rely on real-time discharge measurements, and flow velocity monitoring is important for the design and management of hydraulic structures, such as reservoirs and hydropower plants. Traditionally, gauging stations have been equipped with water level meters, and stage-discharge relationships (rating curves) have been established through a few direct discharge measurements. Only in rare instances have monitoring stations integrated radar technology for local measurement of surface flow velocity. Establishing accurate rating curves depends on the availability of a comprehensive range of discharge values, including measurements recorded during extreme events. However, discharge values during high-flow events are often difficult or even impossible to obtain, thereby hampering the reliability of discharge predictions. Fully remote observations have been enabled in the past ten years through optics-based velocimetry techniques. Such methodologies enable the estimation of the surface flow velocity field over extended regions from the motion of naturally occurring debris or floaters dragged by the current. Building on the potential demonstrated by such approaches, here we present a novel permanent gauge-cam station for the observation of the flow velocity field in the Tiber river. This new station captures one-minute videos every 10 minutes over an area of up to 20.6 × 15.5 m². In a feasibility study, we demonstrate that experimental images analyzed via particle tracking velocimetry and particle image velocimetry can be used to obtain accurate surface flow velocity estimates in close agreement with radar records.
Future efforts will be devoted to the development of a comprehensive testbed infrastructure for investigating the potential of multiple optics-based approaches for surface hydrology.
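As a rough illustration of the PIV principle mentioned above (not the station's actual processing chain), the bulk displacement of a tracer pattern between two frames can be recovered from the peak of their FFT-based cross-correlation:

```python
import numpy as np

def xcorr_displacement(f1, f2):
    """Integer-pixel shift of frame f2 relative to f1 via circular
    cross-correlation computed in the Fourier domain."""
    F1 = np.fft.fft2(f1 - f1.mean())
    F2 = np.fft.fft2(f2 - f2.mean())
    corr = np.fft.ifft2(F1.conj() * F2).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # unwrap indices above N/2 to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

Dividing the pixel shift by the inter-frame time and scaling by the ground sampling distance yields a surface velocity estimate; real PIV refines this with interrogation windows and sub-pixel peak fitting.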
Reconciling surface plate motions with rapid three-dimensional mantle flow around a slab edge.
Jadamec, Margarete A; Billen, Magali I
2010-05-20
The direction of tectonic plate motion at the Earth's surface and the flow field of the mantle inferred from seismic anisotropy are well correlated globally, suggesting large-scale coupling between the mantle and the surface plates. The fit is typically poor at subduction zones, however, where regional observations of seismic anisotropy suggest that the direction of mantle flow is not parallel to and may be several times faster than plate motions. Here we present three-dimensional numerical models of buoyancy-driven deformation with realistic slab geometry for the Alaska subduction-transform system and use them to determine the origin of this regional decoupling of flow. We find that near a subduction zone edge, mantle flow velocities can have magnitudes of more than ten times the surface plate motions, whereas surface plate velocities are consistent with plate motions and the complex mantle flow field is consistent with observations from seismic anisotropy. The seismic anisotropy observations constrain the shape of the eastern slab edge and require non-Newtonian mantle rheology. The incorporation of the non-Newtonian viscosity results in mantle viscosities of 10¹⁷ to 10¹⁸ Pa s in regions of high strain rate (10⁻¹² s⁻¹), and this low viscosity enables the mantle flow field to decouple partially from the motion of the surface plates. These results imply local rapid transport of geochemical signatures through subduction zones and that the internal deformation of slabs decreases the slab-pull force available to drive subducting plates.
Hydraulic modeling of unsteady debris-flow surges with solid-fluid interactions
Iverson, Richard M.
1997-01-01
Interactions of solid and fluid constituents produce the unique style of motion that typifies debris flows. To simulate this motion, a new hydraulic model represents debris flows as deforming masses of granular solids variably liquefied by viscous pore fluid. The momentum equation of the model describes how internal and boundary forces change as coarse-grained surge heads dominated by grain-contact friction grade into muddy debris-flow bodies more strongly influenced by fluid viscosity and pressure. Scaling analysis reveals that pore-pressure variations can cause flow resistance in surge heads to surpass that in debris-flow bodies by orders of magnitude. Numerical solutions of the coupled momentum and continuity equations provide good predictions of unsteady, nonuniform motion of experimental debris flows from initiation through deposition.
Differential responses in dorsal visual cortex to motion and disparity depth cues
Arnoldussen, David M.; Goossens, Jeroen; van den Berg, Albert V.
2013-01-01
We investigated how interactions between monocular motion parallax and binocular cues to depth vary in human motion areas for wide-field visual motion stimuli (110 × 100°). We used fMRI with an extensive 2 × 3 × 2 factorial blocked design in which we combined two types of self-motion (translational motion and translational + rotational motion), with three categories of motion inflicted by the degree of noise (self-motion, distorted self-motion, and multiple object-motion), and two different view modes of the flow patterns (stereo and synoptic viewing). Interactions between disparity and motion category revealed distinct contributions to self- and object-motion processing in 3D. For cortical areas V6 and CSv, but not the anterior part of MT+ with bilateral visual responsiveness (MT+/b), we found a disparity-dependent effect of rotational flow and noise: When self-motion perception was degraded by adding rotational flow and moderate levels of noise, the BOLD responses were reduced compared with translational self-motion alone, but this reduction was cancelled by adding stereo information which also rescued the subject's self-motion percept. At high noise levels, when the self-motion percept gave way to a swarm of moving objects, the BOLD signal strongly increased compared to self-motion in areas MT+/b and V6, but only for stereo in the latter. BOLD response did not increase for either view mode in CSv. These different response patterns indicate different contributions of areas V6, MT+/b, and CSv to the processing of self-motion perception and the processing of multiple independent motions. PMID:24339808
Mantle flow through a tear in the Nazca slab inferred from shear wave splitting
NASA Astrophysics Data System (ADS)
Lynner, Colton; Anderson, Megan L.; Portner, Daniel E.; Beck, Susan L.; Gilbert, Hersh
2017-07-01
A tear in the subducting Nazca slab is located between the end of the Pampean flat slab and normally subducting oceanic lithosphere. Tomographic studies suggest mantle material flows through this opening. The best way to probe this hypothesis is through observations of seismic anisotropy, such as shear wave splitting. We examine patterns of shear wave splitting using data from two seismic deployments in Argentina that lie updip of the slab tear. We observe a simple pattern of plate-motion-parallel fast splitting directions, indicative of plate-motion-parallel mantle flow, beneath the majority of the stations. Our observed splitting contrasts with previous observations to the north and south of the flat slab region. Since plate-motion-parallel splitting occurs only coincidentally with the slab tear, we propose that mantle material flows through the opening, resulting in Nazca plate-motion-parallel flow in both the subslab mantle and the mantle wedge.
Optical tweezers with 2.5 kHz bandwidth video detection for single-colloid electrophoresis
NASA Astrophysics Data System (ADS)
Otto, Oliver; Gutsche, Christof; Kremer, Friedrich; Keyser, Ulrich F.
2008-02-01
We developed an optical tweezers setup to study the electrophoretic motion of colloids in an external electric field. The setup is based on standard components for illumination and video detection. Our video-based optical tracking of the colloid motion has a time resolution of 0.2 ms, resulting in a bandwidth of 2.5 kHz. This enables calibration of the optical tweezers by Brownian motion without requiring a quadrant photodetector. We demonstrate that our system has a spatial resolution of 0.5 nm and a force sensitivity of 20 fN, using a Fourier algorithm to detect periodic oscillations of the trapped colloid caused by an external ac field. The electrophoretic mobility and zeta potential of a single colloid can be extracted in aqueous solution, avoiding the screening effects common to usual bulk measurements.
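A toy numpy illustration of the Fourier-detection idea (synthetic numbers, not the authors' data): a weak periodic oscillation buried in position noise shows up as a sharp spectral line at the drive frequency, from which its amplitude can be read off. For simplicity the noise here is white, whereas a real trapped bead has a Lorentzian spectrum:

```python
import numpy as np

fs, n = 2500.0, 5000              # sampling rate (Hz) and record length
f_drive, amp = 40.0, 1.0          # ac drive frequency (Hz), amplitude (nm)
rng = np.random.default_rng(1)
t = np.arange(n) / fs

# position trace: broadband noise plus the small driven oscillation
x = rng.normal(0.0, 5.0, n) + amp * np.sin(2 * np.pi * f_drive * t)

spec = np.fft.rfft(x)
freqs = np.fft.rfftfreq(n, d=1 / fs)
k = np.argmin(np.abs(freqs - f_drive))    # bin of the drive frequency
amp_est = 2 * np.abs(spec[k]) / n         # single-bin amplitude estimate
```

The record length is chosen so the drive frequency falls exactly on an FFT bin; otherwise windowing would be needed to control spectral leakage.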
Electrostatic micromembrane actuator arrays as motion generator
NASA Astrophysics Data System (ADS)
Wu, X. T.; Hui, J.; Young, M.; Kayatta, P.; Wong, J.; Kennith, D.; Zhe, J.; Warde, C.
2004-05-01
A rigid-body motion generator based on an array of micromembrane actuators is described. Unlike previous microelectromechanical systems (MEMS) techniques, the architecture employs a large number (typically greater than 1000) of micron-sized (10-200 μm) membrane actuators to simultaneously generate the displacement of a large rigid body, such as a conventional optical mirror. For optical applications, the approach provides optical design freedom of MEMS mirrors by enabling large-aperture mirrors to be driven electrostatically by MEMS actuators. The micromembrane actuator arrays have been built using a stacked architecture similar to that employed in the Multiuser MEMS Process (MUMPS), and the motion transfer from the arrayed micron-sized actuators to macro-sized components was demonstrated.
Motion detection using extended fractional Fourier transform and digital speckle photography.
Bhaduri, Basanta; Tay, C J; Quan, C; Sheppard, Colin J R
2010-05-24
Digital speckle photography is a useful tool for measuring the motion of optically rough surfaces from the speckle shift that takes place at the recording plane. A simple correlation-based digital speckle photographic system is proposed that implements two simultaneous optical extended fractional Fourier transforms (EFRTs) of different orders using only a single lens and detector, to detect both the magnitude and direction of translation and tilt by capturing only two frames: one before and one after the object motion. The dynamic range and sensitivity of the measurement can be varied readily by altering the position of the mirror(s) used in the optical setup. Theoretical analysis and experimental results are presented.
Shape matters: Near-field fluid mechanics dominate the collective motions of ellipsoidal squirmers.
Kyoya, K; Matsunaga, D; Imai, Y; Omori, T; Ishikawa, T
2015-12-01
Microswimmers show a variety of collective motions. Despite extensive study, questions remain regarding the role of near-field fluid mechanics in collective motion. In this paper, we describe precisely the Stokes flow around hydrodynamically interacting ellipsoidal squirmers in a monolayer suspension. The results showed that various collective motions, such as ordering, aggregation, and whirls, are dominated by the swimming mode and the aspect ratio. The collective motions are mainly induced by near-field fluid mechanics, despite Stokes flow propagation over a long range. These results emphasize the importance of particle shape in collective motion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, P; Cheng, S; Chao, C
Purpose: Respiratory motion artifacts are commonly seen in abdominal and thoracic CT images. A Real-time Position Management (RPM) system is integrated with the CT simulator, using the abdominal surface as a surrogate for tracking the patient's respiratory motion. The respiratory-correlated four-dimensional computed tomography (4DCT) is then reconstructed by GE Advantage software. However, artifacts remain due to inaccurate respiratory motion detection and sorting methods. We developed an Ultrasonography Respiration Monitoring (URM) system that directly monitors diaphragm motion to detect respiratory cycles. We also developed a new 4DCT sorting and motion-estimation method to reduce respiratory motion artifacts. The new 4DCT system was compared with the RPM and GE 4DCT systems. Methods: Imaging from a GE CT scanner was simultaneously correlated with both the RPM and the URM to detect respiratory motion. A radiation detector, Blackcat GM-10, recorded the X-ray on/off state and was synchronized with the URM. The diaphragm images were acquired with an Ultrasonix RP system. The respiratory wave was derived from the diaphragm images and synchronized with the CT scanner. A more precise peak-and-valley detection tool was developed and compared with the RPM. Motion is estimated for slices that are not in the predefined respiratory phases by using block-matching and optical flow methods. The CT slices were then sorted into different phases and reconstructed, and compared with the images reconstructed by the GE Advantage software using the respiratory wave produced by the RPM system. Results: The 4DCT images were reconstructed for eight patients. The discontinuity at the diaphragm level due to inaccurate identification of phases by the RPM was significantly improved by the URM system. Conclusion: Our URM 4DCT system was evaluated and compared with the RPM and GE 4DCT systems. The new system is user friendly and able to reduce motion artifacts.
It also has the potential to monitor organ motion during therapy.
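The phase-sorting step hinges on locating respiratory peaks and valleys. A minimal stand-in detector (not the authors' tool) on a clean synthetic breathing trace; real traces would need smoothing before this step:

```python
import numpy as np

def find_extrema(wave, min_sep):
    """Indices of local maxima (peaks) and minima (valleys), suppressing
    detections closer than min_sep samples to the previous one."""
    peaks, valleys = [], []
    for i in range(1, len(wave) - 1):
        if wave[i - 1] < wave[i] >= wave[i + 1]:
            if not peaks or i - peaks[-1] >= min_sep:
                peaks.append(i)
        elif wave[i - 1] > wave[i] <= wave[i + 1]:
            if not valleys or i - valleys[-1] >= min_sep:
                valleys.append(i)
    return peaks, valleys

fs = 10.0                                # Hz, illustrative sampling rate
t = np.arange(600) / fs                  # 60 s trace
wave = np.sin(2 * np.pi * 0.25 * t)      # 15 breathing cycles at 0.25 Hz
peaks, valleys = find_extrema(wave, min_sep=20)
```

Once peaks and valleys are known, each CT slice's acquisition time can be mapped to a respiratory phase by its fractional position between consecutive peaks.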
Teramoto, Wataru; Watanabe, Hiroshi; Umemura, Hiroyuki
2008-01-01
The perceived temporal order of external successive events does not always follow their physical temporal order. We examined the contribution of self-motion mechanisms in the perception of temporal order in the auditory modality. We measured perceptual biases in the judgment of the temporal order of two short sounds presented successively, while participants experienced visually induced self-motion (yaw-axis circular vection) elicited by viewing long-lasting large-field visual motion. In experiment 1, a pair of white-noise patterns was presented to participants at various stimulus-onset asynchronies through headphones, while they experienced visually induced self-motion. Perceived temporal order of auditory events was modulated by the direction of the visual motion (or self-motion). Specifically, the sound presented to the ear in the direction opposite to the visual motion (ie heading direction) was perceived prior to the sound presented to the ear in the same direction. Experiments 2A and 2B were designed to reduce the contributions of decisional and/or response processes. In experiment 2A, the directional cueing of the background (left or right) and the response dimension (high pitch or low pitch) were not spatially associated. In experiment 2B, participants were additionally asked to report which of the two sounds was perceived 'second'. Almost the same results as in experiment 1 were observed, suggesting that the change in temporal order of auditory events during large-field visual motion reflects a change in perceptual processing. Experiment 3 showed that the biases in the temporal-order judgments of auditory events were caused by concurrent actual self-motion with a rotatory chair. In experiment 4, using a small display, we showed that 'pure' long exposure to visual motion without the sensation of self-motion was not responsible for this phenomenon. 
These results are consistent with previous studies reporting a change in the perceived temporal order of visual or tactile events depending on the direction of self-motion. Hence, large-field induced (ie optic flow) self-motion can affect the temporal order of successive external events across various modalities.
NASA Technical Reports Server (NTRS)
Ye, B.; DelGenio, A. D.
1999-01-01
Areally extensive, optically thick anvil clouds associated with mesoscale convective clusters dominate the shortwave cloud forcing in the tropics and provide longwave forcing comparable to that of thin cirrus. Changes in the cover and optical thickness of tropical anvils as climate warms can regulate the sign of cloud feedback. As a prelude to the study of MMCR data from the ARM TWP sites, we analyze ISCCP-derived radiative characteristics of anvils observed in the tropical west Pacific during the TOGA-COARE IOP. Anvils with radius greater than 100 km were identified and tracked from inception to decay using the Machado-Rossow algorithm. Corresponding environmental conditions just prior to the start of the convective event were diagnosed using the Lin-Johnson objective analysis product. Small clusters (100-200 km radius) are observed to have a broad range of optical thicknesses (10-50), while intermediate optical thickness clusters are observed to range in size from 100 km to almost 1000 km. Large-size clusters appear to be favored by strong pre-storm large-scale upward motion throughout the troposphere, moist low-to-midlevel relative humidities, environments with slightly higher CAPE than those for smaller clusters, and strong front-to-rear flow. Optically thick anvils are favored in situations of strong low-level moisture convergence and strong upper-level shear.
NASA Astrophysics Data System (ADS)
Wiley, E. O.
2010-07-01
Relative motion studies of visual double stars can be investigated using least squares regression techniques and readily accessible programs such as Microsoft Excel and a calculator. Optical pairs differ from physical pairs under most geometries in both their simple scatter plots and their regression models. A step-by-step protocol for estimating the rectilinear elements of an optical pair is presented. The characteristics of physical pairs using these techniques are discussed.
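The protocol's core step can be sketched with numpy on hypothetical measurements (the epochs, separations ρ, and position angles θ below are invented for illustration, using one common convention for the conversion to rectangular offsets): fit least-squares lines to x(t) and y(t), then derive the epoch and separation of closest approach:

```python
import numpy as np

# hypothetical measurements: epoch (yr), separation rho ("), position angle theta (deg)
epochs = np.array([1990.0, 1995.0, 2000.0, 2005.0, 2010.0])
rho    = np.array([2.010, 2.002, 2.000, 2.002, 2.010])
theta  = np.array([354.29, 357.14, 0.00, 2.86, 5.71])

# rectangular offsets (x toward north, y toward east)
x = rho * np.cos(np.radians(theta))
y = rho * np.sin(np.radians(theta))

# least-squares lines x(t) = x0 + vx*t and y(t) = y0 + vy*t
vx, x0 = np.polyfit(epochs, x, 1)
vy, y0 = np.polyfit(epochs, y, 1)

# minimize |r(t)|^2 to get the epoch and separation of closest approach
t0 = -(x0 * vx + y0 * vy) / (vx ** 2 + vy ** 2)
rho_min = np.hypot(x0 + vx * t0, y0 + vy * t0)
```

A physical (orbiting) pair would show systematic curvature in the (x, y) residuals of these linear fits, which is one way the scatter plots distinguish the two cases.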
Large-Amplitude, High-Rate Roll Oscillations of a 65 deg Delta Wing at High Incidence
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.; Schiff, Lewis B.
2000-01-01
The IAR/WL 65 deg delta wing experimental results provide both detailed pressure measurements and a wide range of flow conditions, from simple attached flow, through fully developed vortex and vortex-burst flow, up to fully stalled flow at very high incidence. Computational unsteady aerodynamics researchers can therefore use them at different levels to validate the corresponding codes. In this section a range of CFD results are provided for the 65 deg delta wing at selected flow conditions. The time-dependent, three-dimensional, Reynolds-averaged Navier-Stokes (RANS) equations are used to numerically simulate the unsteady vortical flow. Two sting angles, two large-amplitude, high-rate forced-roll motions, and a damped free-to-roll motion are presented. The free-to-roll motion is computed by coupling the time-dependent RANS equations to the flight-dynamic equation of motion. The computed results are compared with experimental pressures, forces, moments, and roll-angle time histories. In addition, surface and off-surface flow particle streaks are also presented.
Wing Rock Motion and its Flow Mechanism over a Chined-Body Configuration
NASA Astrophysics Data System (ADS)
Wang, Yankui; Li, Qian; Shi, Wei
2015-11-01
Wing rock is an uncommanded oscillation about the body axis that occurs on most aircraft at sufficiently high angles of attack and poses a serious threat to flight safety. The purpose of this paper is to investigate wing rock motion over a typical body-wing configuration with a chined fuselage at fixed angle of attack; four kinds of wing rock motion are revealed based on the flow phenomena, namely non-oscillation, lateral deflection, limit-cycle oscillation, and irregular oscillation. Relationships between the wing rock motion and the flow over the chined-body configuration are also discussed. In addition, the evolution of the wing rock motion and the corresponding flows as the model pitches up is presented. All experiments were conducted in a low-speed wind tunnel at a Reynolds number of 1.87 × 10⁵ and angles of attack from 0° to 65°. Supported by the National Natural Science Foundation of China (11472028) and an open fund from the State Key Laboratory of Aerodynamics.
Measurement of the near-wall velocity profile for a nanofluid flow inside a microchannel
NASA Astrophysics Data System (ADS)
Kanjirakat, Anoop; Sadr, Reza
2015-11-01
Hydrodynamic and anomalous heat transfer enhancements have been reported in the past for colloidal suspensions of nano-sized particles dispersed in a fluid (nanofluids). Such augmentations may manifest themselves in the fluid flow characteristics of the near-wall region. The present experimental study reports near-wall velocity profiles for a nanofluid (silicon dioxide nanoparticles in water) measured inside a microchannel. An objective-based nano-particle image velocimetry (nPIV) technique is used to measure fluid velocity within three visible depths, O(100 nm), from the wall. The near-wall fluid velocity profile is estimated after implementing the required corrections for optical properties and for effects on the measurement technique caused by hindered Brownian motion, wall-particle interactions, and non-uniform exponential illumination. The fluid velocities of the nanofluid at each of the three visible depths are observed to be higher than those of the base fluid, resulting in a higher shear rate in this region. The relative increase in shear rate for the nanofluid is believed to result from near-wall shear-induced particle migration along with the Brownian motion of the nanoparticles. This research is funded by NPRP grant # 08-574-2-239 from the Qatar National Research Fund (a member of Qatar Foundation).
Optics derotator servo control system for SONG Telescope
NASA Astrophysics Data System (ADS)
Xu, Jin; Ren, Changzhi; Ye, Yu
2012-09-01
The Stellar Oscillations Network Group (SONG) is an initiative which aims at designing and building a ground-based network of 1 m telescopes dedicated to the study of phenomena occurring in the time domain. The Chinese standard node of SONG is a 1 m diameter Alt-Az telescope with an F/37 focal ratio. The optics derotator control system of the SONG telescope adopts the development model of "Industrial Computer + UMAC Motion Controller + Servo Motor". The industrial computer is the core processing part of the motion control; the motion control card (UMAC) is in charge of the details of the motion control; and the servo amplifier accepts control commands from the UMAC and drives the servo motor. Position feedback from the encoder closes the control loop. This paper describes in detail the hardware and software design of the optics derotator servo control system. In terms of hardware design, the principle, structure, and control algorithm of the derotator servo system are analyzed and explored. In terms of software design, the paper proposes a system software architecture based on object-oriented programming.
Lindemann, J P; Kern, R; Michaelis, C; Meyer, P; van Hateren, J H; Egelhaaf, M
2003-03-01
A high-speed panoramic visual stimulation device is introduced which is suitable for analysing visual interneurons during stimulation with rapid image displacements such as those experienced by fast-moving animals. The responses of an identified motion-sensitive neuron in the visual system of the blowfly to behaviourally generated image sequences are very complex and hard to predict from the established input circuitry of the neuron. This finding suggests that the computational significance of visual interneurons can only be assessed if they are characterised not only by the conventional stimuli often used for systems analysis, but also by behaviourally relevant input.
Boutin, Henri; Smith, John; Wolfe, Joe
2015-07-01
Analysis of published depth-kymography data [George, de Mul, Qiu, Rakhorst, and Schutte (2008). Phys. Med. Biol. 53, 2667-2675] shows that, for the subject studied, the flow due to the longitudinal sweeping motion of the vocal folds contributes several percent of a typical acoustic flow at the larynx. This sweeping flow is a maximum when the glottis is closed. This observation suggests that assumption of zero laryngeal flow during the closed phase as a criterion when determining parameters in inverse filtering should be used with caution. Further, these data suggest that the swinging motion contributes work to overcome mechanical losses and thus to assist auto-oscillation.
On the Motion of an Annular Film in Microgravity Gas-Liquid Flow
NASA Technical Reports Server (NTRS)
McQuillen, John B.
2002-01-01
Three flow regimes have been identified for gas-liquid flow in a microgravity environment: Bubble, Slug, and Annular. For the slug and annular flow regimes, the behavior observed in vertical upflow in normal gravity is similar to microgravity flow with a thin, symmetrical annular film wetting the tube wall. However, the motion and behavior of this film is significantly different between the normal and low gravity cases. Specifically, the liquid film will slow and come to a stop during low frequency wave motion or slugging. In normal gravity vertical upflow, the film has been observed to slow, stop, and actually reverse direction until it meets the next slug or wave.
The mantle flow field beneath western North America.
Silver, P G; Holt, W E
2002-02-08
Although motions at the surface of tectonic plates are well determined, the accompanying horizontal mantle flow is not. We have combined observations of surface deformation and upper mantle seismic anisotropy to estimate this flow field for western North America. We find that the mantle velocity is 5.5 +/- 1.5 centimeters per year due east in a hot spot reference frame, nearly opposite to the direction of North American plate motion (west-southwest). The flow is only weakly coupled to the motion of the surface plate, producing a small drag force. This flow field is probably due to heterogeneity in mantle density associated with the former Farallon oceanic plate beneath North America.
Toward automated formation of microsphere arrangements using multiplexed optical tweezers
NASA Astrophysics Data System (ADS)
Rajasekaran, Keshav; Bollavaram, Manasa; Banerjee, Ashis G.
2016-09-01
Optical tweezers offer certain advantages such as multiplexing using a programmable spatial light modulator, flexibility in the choice of the manipulated object and the manipulation medium, precise control, easy object release, and minimal object damage. However, automated manipulation of multiple objects in parallel, which is essential for efficient and reliable formation of micro-scale assembly structures, poses a difficult challenge. There are two primary research issues in addressing this challenge. First, the stochastic Langevin force giving rise to Brownian motion requires motion control of all the manipulated objects at fast rates of several Hz. Second, the object dynamics are non-linear and difficult to represent analytically owing to the interaction of the multiple optical traps that manipulate neighboring objects. As a result, automated controllers have not been realized for tens of objects, particularly for three-dimensional motions with guaranteed collision avoidance. In this paper, we model the effect of interacting optical traps on microspheres with significant Brownian motion in stationary fluid media, and develop simplified state-space representations. These representations are used to design a model predictive controller that coordinates the motions of several spheres in real time. Preliminary experiments demonstrate the utility of the controller in automatically forming desired arrangements of varying configurations starting from randomly dispersed microspheres.
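A minimal one-dimensional sketch of the control idea, not the authors' multi-trap controller: an overdamped Langevin model of a trapped bead gives a linear state-space map x[k+1] = a·x[k] + b·u[k] + w[k], and a one-step receding-horizon input steers the bead to a target position despite Brownian kicks. All parameter values (bead drag, trap stiffness, actuator limit) are assumed for illustration:

```python
import numpy as np

kB_T = 4.14e-21     # thermal energy near 300 K, J
gamma = 9.42e-9     # Stokes drag of a ~0.5 um bead in water, kg/s (assumed)
k_trap = 1e-5       # trap stiffness, N/m (assumed)
dt = 1e-5           # control period, s

a = 1.0 - k_trap * dt / gamma               # state transition
b = k_trap * dt / gamma                     # gain on trap-center position
sigma_w = np.sqrt(2.0 * kB_T / gamma * dt)  # Brownian kick per step
u_max = 5e-6        # trap-center excursion limit, m (assumed actuator range)

def step(x, u, rng):
    """Discretized overdamped Langevin dynamics of the trapped bead."""
    return a * x + b * u + sigma_w * rng.standard_normal()

def mpc_input(x, target):
    """One-step-ahead predictive input: pick u so the predicted next
    state equals the target, then saturate to the actuator limit."""
    u = (target - a * x) / b
    return np.clip(u, -u_max, u_max)

rng = np.random.default_rng(0)
x, target = 0.0, 2e-6       # steer the bead from the origin to 2 um
traj = []
for _ in range(5000):
    u = mpc_input(x, target)
    x = step(x, u, rng)
    traj.append(x)
traj = np.array(traj)
print(abs(traj[-1000:].mean() - target))
```

The real problem adds coupling between neighboring traps and collision constraints, which is what makes a genuine multi-object model predictive controller necessary; this sketch only shows the single-bead prediction-and-saturate step.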
Optical identification of two nearby isolated neutron stars through proper motion measurement.
NASA Astrophysics Data System (ADS)
Zane, Silvia
2004-07-01
The aim of this proposal is to perform high-resolution imaging of the proposed optical counterparts of the two radio-silent isolated neutron stars RX J1308.6+2127 and RX J1605.3+3249 with the STIS/50CCD. Imaging both fields with the same instrumental configuration used in mid-2001 by Kaplan et al. {2002; 2003} will allow us to measure the objects' positions and to determine their proper motions over a time base of nearly four years. The measurement of proper motions at the level of at least a few tens of mas/yr, expected for relatively nearby neutron stars, would unambiguously secure the proposed optical identifications, which are not achievable otherwise. In addition, knowledge of the proper motion will provide useful indications of the space velocity and distance of these neutron stars, as well as of the radius. Constraining these parameters is of paramount importance to discriminate between the variety of emission mechanisms invoked to explain their observed thermal X-ray spectra and to probe the neutron star equation of state {EOS}. The determination of the proper motion is a decisive step toward a dedicated follow-up program, again to be performed with the STIS/50CCD, aimed at measuring the objects' optical parallax and thus providing much firmer constraints on the star properties.
A Surface-Coupled Optical Trap with 1-bp Precision via Active Stabilization
Okoniewski, Stephen R.; Carter, Ashley R.; Perkins, Thomas T.
2017-01-01
Optical traps can measure bead motions with Å-scale precision. However, using this level of precision to infer 1-bp motion of molecular motors along DNA is difficult, since a variety of noise sources degrade instrumental stability. In this chapter, we detail how to improve instrumental stability by (i) minimizing laser pointing, mode, polarization, and intensity noise using an acousto-optical-modulator mediated feedback loop and (ii) minimizing sample motion relative to the optical trap using a 3-axis piezo-electric-stage mediated feedback loop. These active techniques play a critical role in achieving a surface stability of 1 Å in 3D over tens of seconds and a 1-bp stability and precision in a surface-coupled optical trap over a broad bandwidth (Δf = 0.03–2 Hz) at low force (6 pN). These active stabilization techniques can also aid other biophysical assays that would benefit from improved laser stability and/or Å-scale sample stability, such as atomic force microscopy and super-resolution imaging. PMID:27844426
A high bandwidth three-axis out-of-plane motion measurement system based on optical beam deflection
NASA Astrophysics Data System (ADS)
Piyush, P.; Giridhar, M. S.; Jayanth, G. R.
2018-03-01
Multi-axis measurement of motion is indispensable for characterization of dynamic systems and control of motion stages. This paper presents an optical beam deflection-based measurement system to simultaneously measure three-axis out-of-plane motion of both micro- and macro-scale targets. Novel strategies are proposed to calibrate the sensitivities of the measurement system. Subsequently the measurement system is experimentally realized and calibrated. The system is employed to characterize coupled linear and angular motion of a piezo-actuated stage. The measured motion is shown to be in agreement with theoretical expectation. Next, the high bandwidth of the measurement system has been showcased by utilizing it to measure coupled two-axis transient motion of a Radio Frequency Micro-Electro-Mechanical System switch with a rise time of about 60 μs. Finally, the ability of the system to measure out-of-plane angular motion about the second axis has been demonstrated by measuring the deformation of a micro-cantilever beam.
Interrelationship of mid-diastolic mitral valve motion, pulmonary venous flow, and transmitral flow.
Keren, G; Meisner, J S; Sherez, J; Yellin, E L; Laniado, S
1986-07-01
This study offers a unifying mechanism of left ventricular filling dynamics to link the unexplained mid-diastolic motion of the mitral valve with an associated increase in transmitral flow, with the phasic character of pulmonary vein flow, and with changes in the atrioventricular pressure difference. M mode echograms of mitral valve motion and Doppler echocardiograms of mitral and pulmonary vein flow velocities were recorded in 12 healthy volunteers (heart rate = 60 +/- 9 beats/min). All echocardiograms showed an undulation in the mitral valve (L motion) at a relatively constant delay from the peak of the diastolic phase of pulmonary vein flow (K phase). In six subjects, the L motion was also associated with a distinct wave of mitral flow (L wave). Measured from the onset of the QRS complex, Q-K was 577 +/- 39 msec; Q-L was 703 +/- 42 msec, and K-L was 125 +/- 16 msec. Multiple measurements within each subject during respiratory variations in RR interval indicated exceptionally small differences in the temporal relationships (mean coefficient of variation 2%). Early rapid flow deceleration is caused by a reversal of the atrioventricular pressure gradient, and the L wave arises from the subsequent reestablishment of a positive gradient due to left atrial filling via the pulmonary veins. The mitral valve moves passively in response to the flowing blood and the associated pressure difference. This interpretation is confirmed by (1) a computational model, and (2) a retrospective analysis of data from patients with mitral stenosis and from conscious dogs instrumented to measure transmitral pressure-flow relationships.
Self-sustaining processes at all scales in wall-bounded turbulent shear flows
NASA Astrophysics Data System (ADS)
Cossu, Carlo; Hwang, Yongyun
2017-03-01
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend's attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier-Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions.
Yücel, Meryem A.; Selb, Juliette; Boas, David A.; Cash, Sydney S.; Cooper, Robert J.
2013-01-01
As the applications of near-infrared spectroscopy (NIRS) continue to broaden and long-term clinical monitoring becomes more common, minimizing signal artifacts due to patient movement becomes more pressing. This is particularly true in applications where clinically and physiologically interesting events are intrinsically linked to patient movement, as is the case in the study of epileptic seizures. In this study, we apply an approach common in the application of EEG electrodes to the application of specialized NIRS optical fibers. The method provides improved optode-scalp coupling through the use of miniaturized optical fiber tips fixed to the scalp using collodion, a clinical adhesive. We investigate and quantify the performance of this new method in minimizing motion artifacts in healthy subjects, and apply the technique to allow continuous NIRS monitoring throughout epileptic seizures in two epileptic in-patients. Using collodion-fixed fibers reduces the percent signal change of motion artifacts by 90 % and increases the SNR by 6 and 3 fold at 690 and 830 nm wavelengths respectively when compared to a standard Velcro-based array of optical fibers. The change in both HbO and HbR during motion artifacts is found to be statistically lower for the collodion-fixed fiber probe. The collodion-fixed optical fiber approach has also allowed us to obtain good quality NIRS recording of three epileptic seizures in two patients despite excessive motion in each case. PMID:23796546
Optical Trapping of Ion Coulomb Crystals
NASA Astrophysics Data System (ADS)
Schmidt, Julian; Lambrecht, Alexander; Weckesser, Pascal; Debatin, Markus; Karpa, Leon; Schaetz, Tobias
2018-04-01
The electronic and motional degrees of freedom of trapped ions can be controlled and coherently coupled on the level of individual quanta. Assembling complex quantum systems ion by ion while keeping this unique level of control remains a challenging task. For many applications, linear chains of ions in conventional traps are ideally suited to address this problem. However, driven motion due to the magnetic or radio-frequency electric trapping fields sometimes limits the performance in one dimension and severely affects the extension to higher-dimensional systems. Here, we report on the trapping of multiple barium ions in a single-beam optical dipole trap without radio-frequency or additional magnetic fields. We study the persistence of order in ensembles of up to six ions within the optical trap, measure their temperature, and conclude that the ions form a linear chain, commonly called a one-dimensional Coulomb crystal. As a proof-of-concept demonstration, we access the collective motion and perform spectrometry of the normal modes in the optical trap. Our system provides a platform that is free of driven motion and combines advantages of optical trapping, such as state-dependent confinement and nanoscale potentials, with the desirable properties of crystals of trapped ions, such as long-range interactions featuring collective motion. Starting with small numbers of ions, it has been proposed that these properties would allow the experimental study of many-body physics and the onset of structural quantum phase transitions between one- and two-dimensional crystals.
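The normal-mode spectrometry mentioned above has a standard small-crystal limit that can be sketched numerically: the axial mode frequencies of a linear Coulomb crystal (in units of the single-ion trap frequency) follow from the Hessian of the trap-plus-Coulomb potential at the equilibrium positions. This sketch uses the known dimensionless equilibrium for N = 3 ions, a textbook result rather than anything specific to this optical-trap experiment:

```python
import numpy as np

# Dimensionless equilibrium positions of 3 ions in a harmonic axial well
# (outer ions at +/-(5/4)^(1/3); see e.g. James, Appl. Phys. B 66, 1998).
u = np.array([-(5 / 4) ** (1 / 3), 0.0, (5 / 4) ** (1 / 3)])
N = u.size

# Hessian of the potential in the same dimensionless units.
A = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        if i == j:
            A[i, i] = 1 + 2 * sum(1 / abs(u[i] - u[k]) ** 3
                                  for k in range(N) if k != i)
        else:
            A[i, j] = -2 / abs(u[i] - u[j]) ** 3

# Mode frequencies in units of the single-ion trap frequency:
# centre-of-mass (1), breathing (sqrt(3)), and "Egyptian" mode.
mode_freqs = np.sqrt(np.linalg.eigvalsh(A))
print(mode_freqs)
```

The centre-of-mass mode at the bare trap frequency and the breathing mode at sqrt(3) times it are independent of ion number, which is why measuring them is a convenient consistency check in normal-mode spectrometry.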
NASA Astrophysics Data System (ADS)
Go, Gi-Hyun; Heo, Seungjin; Cho, Jong-Hoi; Yoo, Yang-Seok; Kim, Minkwan; Park, Chung-Hyun; Cho, Yong-Hoon
2017-03-01
As interest in anisotropic particles has increased in various research fields, methods of tracking such particles have become increasingly desirable. Here, we present a new and intuitive method to monitor the Brownian motion of a nanowire, which can construct and visualize multi-dimensional motion of a nanowire confined in an optical trap, using a dual particle tracking system. We measured the isolated angular fluctuations and translational motion of the nanowire in the optical trap, and determined its physical properties, such as stiffness and torque constants, depending on laser power and polarization direction. This has wide implications in nanoscience and nanotechnology with levitated anisotropic nanoparticles.
Yücel, Meryem A; Selb, Juliette; Boas, David A; Cash, Sydney S; Cooper, Robert J
2014-01-15
As the applications of near-infrared spectroscopy (NIRS) continue to broaden and long-term clinical monitoring becomes more common, minimizing signal artifacts due to patient movement becomes more pressing. This is particularly true in applications where clinically and physiologically interesting events are intrinsically linked to patient movement, as is the case in the study of epileptic seizures. In this study, we apply an approach common in the application of EEG electrodes to the application of specialized NIRS optical fibers. The method provides improved optode-scalp coupling through the use of miniaturized optical fiber tips fixed to the scalp using collodion, a clinical adhesive. We investigate and quantify the performance of this new method in minimizing motion artifacts in healthy subjects, and apply the technique to allow continuous NIRS monitoring throughout epileptic seizures in two epileptic in-patients. Using collodion-fixed fibers reduces the percent signal change of motion artifacts by 90% and increases the SNR by 6 and 3 fold at 690 and 830 nm wavelengths respectively when compared to a standard Velcro-based array of optical fibers. The SNR has also increased by 2 fold during rest conditions without motion with the new probe design because of better light coupling between the fiber and scalp. The change in both HbO and HbR during motion artifacts is found to be statistically lower for the collodion-fixed fiber probe. The collodion-fixed optical fiber approach has also allowed us to obtain good quality NIRS recording of three epileptic seizures in two patients despite excessive motion in each case. Copyright © 2013 Elsevier Inc. All rights reserved.
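The two comparison metrics used above, percent signal change of a motion artifact and SNR, can be illustrated on synthetic intensity traces. The trace shapes, noise levels, and artifact amplitudes below are invented for illustration and are not the study's data:

```python
import numpy as np

def percent_signal_change(trace, artifact_mask):
    """Peak artifact excursion as a percentage of the artifact-free baseline."""
    baseline = trace[~artifact_mask].mean()
    return 100 * np.abs(trace[artifact_mask] - baseline).max() / baseline

def snr(trace):
    """Simple mean-over-std signal-to-noise ratio of an intensity trace."""
    return trace.mean() / trace.std()

rng = np.random.default_rng(1)
t = np.arange(2000)
artifact = (t > 900) & (t < 1100)          # synthetic motion epoch

# Hypothetical traces: a looser coupling with large artifacts and noise,
# and a tighter coupling with both reduced.
loose = 1.0 + 0.010 * rng.standard_normal(t.size) + 0.30 * artifact
tight = 1.0 + 0.005 * rng.standard_normal(t.size) + 0.03 * artifact

print(percent_signal_change(loose, artifact),
      percent_signal_change(tight, artifact))
print(snr(tight) / snr(loose))
```

On these made-up traces the tighter coupling shows both a much smaller artifact percent signal change and a severalfold SNR gain, the same pair of metrics the study uses to compare collodion-fixed and Velcro-based probes.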
Stereo-motion cooperation and the use of motion disparity in the visual perception of 3-D structure.
Cornilleau-Pérès, V; Droulez, J
1993-08-01
When an observer views a moving scene binocularly, both motion parallax and binocular disparity provide depth information. In Experiments 1A-1C, we measured sensitivity to surface curvature when these depth cues were available either individually or simultaneously. When the depth cues yielded comparable sensitivity to surface curvature, we found that curvature detection was easier with the cues present simultaneously, rather than individually. For 2 of the 6 subjects, this effect was stronger when the component of frontal translation of the surface was vertical, rather than horizontal. No such anisotropy was found for the 4 other subjects. If a moving object is observed binocularly, the patterns of optic flow are different on the left and right retinae. We have suggested elsewhere (Cornilleau-Pérès & Droulez, in press) that this motion disparity might be used as a visual cue for the perception of a 3-D structure. Our model consisted in deriving binocular disparity from the left and right distributions of vertical velocities, rather than from luminous intensities, as has been done in classical studies on stereoscopic vision. The model led to some predictions concerning the detection of surface curvature from motion disparity in the presence or absence of intensity-based disparity (classically termed binocular disparity). In a second set of experiments, we attempted to test these predictions, and we failed to validate our theoretical scheme from a physiological point of view.
NASA Astrophysics Data System (ADS)
Ornelas, Danielle; Hasan, Md.; Gonzalez, Oscar; Krishnan, Giri; Szu, Jenny I.; Myers, Timothy; Hirota, Koji; Bazhenov, Maxim; Binder, Devin K.; Park, Boris H.
2017-02-01
Electrophysiology has remained the gold standard of neural activity detection but its resolution and high susceptibility to noise and motion artifact limit its efficiency. Imaging techniques, including fMRI, intrinsic optical imaging, and diffuse optical imaging, have been used to detect neural activity, but rely on indirect measurements such as changes in blood flow. Fluorescence-based techniques, including genetically encoded indicators, are powerful techniques, but require introduction of an exogenous fluorophore. A more direct optical imaging technique is optical coherence tomography (OCT), a label-free, high resolution, and minimally invasive imaging technique that can produce depth-resolved cross-sectional and 3D images. In this study, we sought to examine non-vascular depth-dependent optical changes directly related to neural activity. We used an OCT system centered at 1310 nm to search for changes in an ex vivo brain slice preparation and an in vivo model during 4-AP induced seizure onset and propagation with respect to electrical recording. By utilizing Doppler OCT and the depth-dependency of the attenuation coefficient, we demonstrate the ability to locate and remove the optical effects of vasculature within the upper regions of the cortex from in vivo attenuation calculations. The results of this study show a non-vascular decrease in intensity and attenuation in ex vivo and in vivo seizure models, respectively. Regions exhibiting decreased optical changes show significant temporal correlation to regions of increased electrical activity during seizure. This study allows for a thorough and biologically relevant analysis of the optical signature of seizure activity both ex vivo and in vivo using OCT.
Optic flow cues guide flight in birds.
Bhagavatula, Partha S; Claudianos, Charles; Ibbotson, Michael R; Srinivasan, Mandyam V
2011-11-08
Although considerable effort has been devoted to investigating how birds migrate over large distances, surprisingly little is known about how they tackle so successfully the moment-to-moment challenges of rapid flight through cluttered environments [1]. It has been suggested that birds detect and avoid obstacles [2] and control landing maneuvers [3-5] by using cues derived from the image motion that is generated in the eyes during flight. Here we investigate the ability of budgerigars to fly through narrow passages in a collision-free manner, by filming their trajectories during flight in a corridor where the walls are decorated with various visual patterns. The results demonstrate, unequivocally and for the first time, that birds negotiate narrow gaps safely by balancing the speeds of image motion that are experienced by the two eyes and that the speed of flight is regulated by monitoring the speed of image motion that is experienced by the two eyes. These findings have close parallels with those previously reported for flying insects [6-13], suggesting that some principles of visual guidance may be shared by all diurnal, flying animals. Copyright © 2011 Elsevier Ltd. All rights reserved.
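The two guidance rules the study reports can be sketched as a toy simulation: steer to balance the image speeds seen by the two eyes, and regulate forward speed to hold the experienced image speed at a set-point. The corridor geometry, gains, and set-point are assumptions for illustration, not the birds' measured parameters:

```python
import numpy as np

W = 1.0           # corridor half-width, m (assumed)
omega_set = 2.0   # desired image angular speed, rad/s (assumed)
k_steer = 0.5     # lateral gain (assumed)
k_speed = 0.5     # speed-regulation gain (assumed)

def image_speeds(y, v):
    """Translational optic flow magnitude on each eye for a bird at
    lateral offset y flying forward at speed v; the wall at +W is
    closer when y > 0."""
    return v / (W - y), v / (W + y)

y, v, dt = 0.6, 1.0, 0.01     # start off-centre at 1 m/s
for _ in range(3000):
    w_near, w_far = image_speeds(y, v)
    y += -k_steer * (w_near - w_far) * dt              # steer away from faster flow
    v += k_speed * (omega_set - 0.5 * (w_near + w_far)) * v * dt
print(round(y, 3), round(v, 3))
```

Balancing the flows drives the agent to the corridor midline, and the speed rule settles where the image speed equals the set-point, i.e. flight slows in narrower passages, consistent with the behaviour described for budgerigars and for flying insects.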
Quantifying cortical surface harmonic deformation with stereovision during open cranial neurosurgery
NASA Astrophysics Data System (ADS)
Ji, Songbai; Fan, Xiaoyao; Roberts, David W.; Paulsen, Keith D.
2012-02-01
Cortical surface harmonic motion during open cranial neurosurgery is well observed in image-guided neurosurgery. Recently, we quantified cortical surface deformation noninvasively with synchronized blood pressure pulsation (BPP) from a sequence of stereo image pairs using optical flow motion tracking. With three subjects, we found the average cortical surface displacement can reach more than 1 mm and in-plane principal strains of up to 7% relative to the first image pair. In addition, the temporal changes in deformation and strain were in concert with BPP and patient respiration [1]. However, because deformation was essentially computed relative to an arbitrary reference, comparing cortical surface deformation at different times was not possible. In this study, we extend the technique developed earlier by establishing a more reliable reference profile of the cortical surface for each sequence of stereo image acquisitions. Specifically, fast Fourier transform (FFT) was applied to the dynamic cortical surface deformation, and the fundamental frequencies corresponding to patient respiration and BPP were identified, which were used to determine the number of image acquisitions for use in averaging cortical surface images. This technique is important because it potentially allows in vivo characterization of soft tissue biomechanical properties using intraoperative stereovision and motion tracking.
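The frequency-identification step described above can be sketched with an FFT of a displacement trace: locate the respiration and BPP fundamentals as the dominant spectral peaks, then size the averaging window to a whole number of respiration cycles. The sampling rate, frequencies, and amplitudes below are assumed for illustration:

```python
import numpy as np

fs = 10.0                         # stereo acquisitions per second (assumed)
t = np.arange(0, 60, 1 / fs)      # 60 s of acquisitions
resp_f, bpp_f = 0.25, 1.2         # respiration and BPP fundamentals, Hz (assumed)

# Synthetic cortical displacement trace with both periodic components.
signal = 0.8 * np.sin(2 * np.pi * resp_f * t) \
       + 0.3 * np.sin(2 * np.pi * bpp_f * t)

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

# The two largest non-DC peaks give the two fundamentals.
order = np.argsort(spec[1:])[::-1] + 1
f1, f2 = sorted(freqs[order[:2]])

# Average over an integer number of respiration cycles so that both
# periodic components largely cancel in the reference surface.
frames_per_resp_cycle = int(round(fs / f1))
print(f1, f2, frames_per_resp_cycle)
```

Averaging stereo acquisitions over such a window yields a reference cortical surface profile that is not tied to an arbitrary phase of the respiration or BPP cycle, which is the prerequisite for comparing deformation across acquisition times.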