Sample records for fly motion vision

  1. Flying Drosophila stabilize their vision-based velocity controller by sensing wind with their antennae

    PubMed Central

    Fuller, Sawyer Buckminster; Straw, Andrew D.; Peek, Martin Y.; Murray, Richard M.; Dickinson, Michael H.

    2014-01-01

    Flies and other insects use vision to regulate their groundspeed in flight, enabling them to fly in varying wind conditions. Compared with mechanosensory modalities, however, vision requires a long processing delay (~100 ms) that might introduce instability if operated at high gain. Flies also sense air motion with their antennae, but how this is used in flight control is unknown. We manipulated the antennal function of fruit flies by ablating their aristae, forcing them to rely on vision alone to regulate groundspeed. Arista-ablated flies in flight exhibited significantly greater groundspeed variability than intact flies. We then subjected them to a series of controlled impulsive wind gusts delivered by an air piston and experimentally manipulated antennae and visual feedback. The results show that an antenna-mediated response alters wing motion to cause flies to accelerate in the same direction as the gust. This response opposes flying into a headwind, but flies regularly fly upwind. To resolve this discrepancy, we obtained a dynamic model of the fly’s velocity regulator by fitting parameters of candidate models to our experimental data. The model suggests that the groundspeed variability of arista-ablated flies is the result of unstable feedback oscillations caused by the delay and high gain of visual feedback. The antenna response drives active damping with a shorter delay (~20 ms) to stabilize this regulator, in exchange for increasing the effect of rapid wind disturbances. This provides insight into flies’ multimodal sensory feedback architecture and constitutes a previously unknown role for the antennae. PMID:24639532
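    The paper's control-theoretic claim, that slow high-gain visual feedback is destabilizing on its own while a fast antennal pathway restores stability, can be illustrated with a toy simulation. All gains, delays, and the first-order dynamics below are illustrative assumptions, not the paper's fitted parameters, and wind is omitted so airspeed equals groundspeed:

```python
import numpy as np

def simulate(k_vis, k_ant, d_vis=0.100, d_ant=0.020, dt=0.001, T=4.0, v0=1.0):
    """Euler-integrate dv/dt = -k_vis*v(t - d_vis) - k_ant*v(t - d_ant),
    starting from a groundspeed perturbation v0 (e.g. a gust)."""
    n = int(T / dt)
    nv, na = int(d_vis / dt), int(d_ant / dt)
    v = np.zeros(n)
    v[0] = v0
    for i in range(n - 1):
        v_vis = v[i - nv] if i >= nv else 0.0  # delayed visual estimate
        v_ant = v[i - na] if i >= na else 0.0  # delayed antennal estimate
        v[i + 1] = v[i] + dt * (-k_vis * v_vis - k_ant * v_ant)
    return v

vision_only = simulate(k_vis=20.0, k_ant=0.0)    # slow, high-gain loop alone
with_antenna = simulate(k_vis=20.0, k_ant=30.0)  # plus fast antennal damping
```

    With these numbers the vision-only loop oscillates with growing amplitude (its gain-delay product exceeds the pi/2 stability bound for delayed proportional feedback), while adding the short-delay antennal term makes the same perturbation decay.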

  2. Can Humans Fly? Action Understanding with Multiple Classes of Actors

    DTIC Science & Technology

    2015-06-08

    ...recognition using structure from motion point clouds. In European Conference on Computer Vision, 2008. [5] R. Caruana. Multitask learning. Machine Learning...autonomous driving? The KITTI vision benchmark suite. In IEEE Conference on Computer Vision and Pattern Recognition, 2012. [12] L. Gorelick, M. Blank

  3. Contributions of the 12 neuron classes in the fly lamina to motion vision

    PubMed Central

    Tuthill, John C.; Nern, Aljoscha; Holtz, Stephen L.; Rubin, Gerald M.; Reiser, Michael B.

    2013-01-01

    Motion detection is a fundamental neural computation performed by many sensory systems. In the fly, local motion computation is thought to occur within the first two layers of the visual system, the lamina and medulla. We constructed specific genetic driver lines for each of the 12 neuron classes in the lamina. We then depolarized and hyperpolarized each neuron type, and quantified fly behavioral responses to a diverse set of motion stimuli. We found that only a small number of lamina output neurons are essential for motion detection, while most neurons serve to sculpt and enhance these feedforward pathways. Two classes of feedback neurons (C2 and C3), and lamina output neurons (L2 and L4), are required for normal detection of directional motion stimuli. Our results reveal a prominent role for feedback and lateral interactions in motion processing, and demonstrate that motion-dependent behaviors rely on contributions from nearly all lamina neuron classes. PMID:23849200

  4. Contributions of the 12 neuron classes in the fly lamina to motion vision.

    PubMed

    Tuthill, John C; Nern, Aljoscha; Holtz, Stephen L; Rubin, Gerald M; Reiser, Michael B

    2013-07-10

    Motion detection is a fundamental neural computation performed by many sensory systems. In the fly, local motion computation is thought to occur within the first two layers of the visual system, the lamina and medulla. We constructed specific genetic driver lines for each of the 12 neuron classes in the lamina. We then depolarized and hyperpolarized each neuron type and quantified fly behavioral responses to a diverse set of motion stimuli. We found that only a small number of lamina output neurons are essential for motion detection, while most neurons serve to sculpt and enhance these feedforward pathways. Two classes of feedback neurons (C2 and C3), and lamina output neurons (L2 and L4), are required for normal detection of directional motion stimuli. Our results reveal a prominent role for feedback and lateral interactions in motion processing and demonstrate that motion-dependent behaviors rely on contributions from nearly all lamina neuron classes. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Motion vision is independent of color in Drosophila

    PubMed Central

    Yamaguchi, Satoko; Wolf, Reinhard; Desplan, Claude; Heisenberg, Martin

    2008-01-01

    Whether motion vision uses color contrast is a controversial issue that has been investigated in several species, from insects to humans. We used Drosophila to answer this question, monitoring the optomotor response to moving color stimuli in WT and genetic variants. In the fly eye, a motion channel (outer photoreceptors R1–R6) and a color channel (inner photoreceptors R7 and R8) have been distinguished. With moving bars of alternating colors and high color contrast, a brightness ratio of the two colors can be found, at which the optomotor response is largely missing (point of equiluminance). Under these conditions, mutant flies lacking functional rhodopsin in R1–R6 cells do not respond at all. Furthermore, genetically eliminating the function of photoreceptors R7 and R8 neither alters the strength of the optomotor response nor shifts the point of equiluminance. We conclude that the color channel (R7/R8) does not contribute to motion detection as monitored by the optomotor response. PMID:18353989
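    The "point of equiluminance" logic can be sketched numerically: if only the motion channel drives the optomotor response, the response null falls at the brightness ratio where that channel's luminance contrast between the two bars vanishes. The spectral sensitivities below are made-up numbers for illustration, not measured R1-R6 values:

```python
import numpy as np

# Hypothetical sensitivities of the motion channel (R1-R6) to the two colors;
# only their ratio matters for where the null falls.
sens_green, sens_blue = 1.0, 0.4

def motion_channel_contrast(blue_intensity, green_intensity=1.0):
    """Michelson contrast the motion channel sees between the two bars."""
    lg = sens_green * green_intensity
    lb = sens_blue * blue_intensity
    return abs(lg - lb) / (lg + lb)

# Sweep the brightness ratio; the predicted point of equiluminance is the
# ratio at which the motion channel's contrast, and with it the optomotor
# response, disappears.
ratios = np.linspace(0.1, 5.0, 491)
contrasts = [motion_channel_contrast(r) for r in ratios]
null_ratio = ratios[int(np.argmin(contrasts))]  # here: 1.0 / 0.4 = 2.5
```

    Shifting the assumed sensitivities shifts the null, which is why the abstract's finding that eliminating R7/R8 does not move the measured null localizes the response to R1-R6.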

  6. Motion vision is independent of color in Drosophila.

    PubMed

    Yamaguchi, Satoko; Wolf, Reinhard; Desplan, Claude; Heisenberg, Martin

    2008-03-25

    Whether motion vision uses color contrast is a controversial issue that has been investigated in several species, from insects to humans. We used Drosophila to answer this question, monitoring the optomotor response to moving color stimuli in WT and genetic variants. In the fly eye, a motion channel (outer photoreceptors R1-R6) and a color channel (inner photoreceptors R7 and R8) have been distinguished. With moving bars of alternating colors and high color contrast, a brightness ratio of the two colors can be found, at which the optomotor response is largely missing (point of equiluminance). Under these conditions, mutant flies lacking functional rhodopsin in R1-R6 cells do not respond at all. Furthermore, genetically eliminating the function of photoreceptors R7 and R8 neither alters the strength of the optomotor response nor shifts the point of equiluminance. We conclude that the color channel (R7/R8) does not contribute to motion detection as monitored by the optomotor response.

  7. Biomimetic machine vision system.

    PubMed

    Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael

    2005-01-01

    Real-time digital imaging has proven prohibitive in machine vision systems built around low-power single processors unless the scope of vision or the resolution of captured images is compromised. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. A single sensor has been developed, representing a single facet of the fly's eye. This sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. The system "preprocesses" incoming image data, so that minimal data processing is needed to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating the resolution issues found in digital vision systems. In this paper, we discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We also discuss the process of developing an analog sensor that mimics the characteristics of interest in the biological vision system. The paper concludes with a discussion of how an array of these sensors can be applied to real-world machine vision problems.

  8. Vision in flying insects.

    PubMed

    Egelhaaf, Martin; Kern, Roland

    2002-12-01

    Vision guides flight behaviour in numerous insects. Despite their small brain, insects easily outperform current man-made autonomous vehicles in many respects. Examples are the virtuosic chasing manoeuvres male flies perform as part of their mating behaviour and the ability of bees to assess, on the basis of visual motion cues, the distance travelled in a novel environment. Analyses at both the behavioural and neuronal levels are beginning to unveil reasons for such extraordinary capabilities of insects. One recipe for their success is the adaptation of visual information processing to the specific requirements of the behavioural tasks and to the specific spatiotemporal properties of the natural input.

  9. Hi-Vision telecine system using pickup tube

    NASA Astrophysics Data System (ADS)

    Iijima, Goro

    1992-08-01

    Hi-Vision broadcasting, offering far more lifelike pictures than those produced by existing television broadcasting systems, has enormous potential in both industrial and commercial fields. The dissemination of the Hi-Vision system will enable vivid, movie theater quality pictures to be readily enjoyed in homes in the near future. To convert motion film pictures into Hi-Vision signals, a telecine system is needed. The Hi-Vision telecine systems currently under development are the "laser telecine," "flying-spot telecine," and "Saticon telecine" systems. This paper provides an overview of the pickup tube type Hi-Vision telecine system (referred to herein as the Saticon telecine system) developed and marketed by Ikegami Tsushinki Co., Ltd.

  10. Adaptation without parameter change: Dynamic gain control in motion detection

    PubMed Central

    Borst, Alexander; Flanagin, Virginia L.; Sompolinsky, Haim

    2005-01-01

    Many sensory systems adapt their input-output relationship to changes in the statistics of the ambient stimulus. Such adaptive behavior has been measured in a motion detection sensitive neuron of the fly visual system, H1. The rapid adaptation of the velocity response gain has been interpreted as evidence of optimal matching of the H1 response to the dynamic range of the stimulus, thereby maximizing its information transmission. Here, we show that correlation-type motion detectors, which are commonly thought to underlie fly motion vision, intrinsically possess adaptive properties. Increasing the amplitude of the velocity fluctuations leads to a decrease of the effective gain and the time constant of the velocity response without any change in the parameters of these detectors. The seemingly complex property of this adaptation turns out to be a straightforward consequence of the multidimensionality of the stimulus and the nonlinear nature of the system. PMID:15833815
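    The correlation-type (Hassenstein-Reichardt) detector the authors analyze can be sketched in a few lines: each arm low-pass filters one photoreceptor signal and multiplies it with the undelayed neighbor, and the two mirror-symmetric arms are subtracted. The parameter values below (time constant, spatial phase, frequency) are arbitrary illustrations:

```python
import numpy as np

def reichardt_mean(direction, freq=2.0, tau=0.05, phase=np.pi / 4,
                   dt=0.001, T=5.0):
    """Mean opponent output of one correlator for a drifting sinusoid.

    direction = +1 (preferred) or -1 (null). `phase` is the temporal phase
    offset between the two photoreceptor signals set by their spacing.
    """
    t = np.arange(0.0, T, dt)
    w = 2.0 * np.pi * freq
    s1 = np.sin(w * t)
    s2 = np.sin(w * t - direction * phase)

    def lowpass(x, alpha=dt / (tau + dt)):
        """First-order low-pass filter: the correlator's 'delay' line."""
        y = np.zeros_like(x)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    # Opponent subtraction of the two mirror-symmetric multiply arms.
    r = s2 * lowpass(s1) - s1 * lowpass(s2)
    return r[int(1.0 / dt):].mean()  # discard the filter transient

r_pref = reichardt_mean(+1)  # pattern drifting in the preferred direction
r_null = reichardt_mean(-1)  # same pattern drifting the opposite way
```

    The mean output is positive for the preferred direction and negative for the null direction; because the output depends multiplicatively on the stimulus, its effective gain and dynamics change with stimulus statistics even though tau and the other parameters stay fixed, which is the abstract's point.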

  11. A Sensory-Motor Control Model of Animal Flight Explains Why Bats Fly Differently in Light Versus Dark

    PubMed Central

    Bar, Nadav S.; Skogestad, Sigurd; Marçal, Jose M.; Ulanovsky, Nachum; Yovel, Yossi

    2015-01-01

    Animal flight requires fine motor control. However, it is unknown how flying animals rapidly transform noisy sensory information into adequate motor commands. Here we developed a sensorimotor control model that explains vertebrate flight guidance with high fidelity. This simple model accurately reconstructed complex trajectories of bats flying in the dark. The model implies that in order to apply appropriate motor commands, bats have to estimate not only the angle-to-target, as was previously assumed, but also the angular velocity (“proportional-derivative” controller). Next, we conducted experiments in which bats flew in light conditions. When using vision, bats altered their movements, reducing the flight curvature. This change was explained by the model via reduction in sensory noise under vision versus pure echolocation. These results imply a surprising link between sensory noise and movement dynamics. We propose that this sensory-motor link is fundamental to motion control in rapidly moving animals under different sensory conditions, on land, sea, or air. PMID:25629809
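    Why angular velocity must enter the controller can be seen in a minimal sketch: for a body with rotational inertia, feedback on the angle-to-target alone oscillates forever, while adding the derivative (angular-velocity) term damps the turn. This is a generic PD illustration with arbitrary gains, not the paper's fitted bat model:

```python
import numpy as np

def steer(kp, kd, theta0=1.0, dt=0.001, T=6.0):
    """Heading with rotational inertia (theta'' = u), steered to theta* = 0
    by a PD law on the angle error and the angular velocity."""
    n = int(T / dt)
    theta, omega = theta0, 0.0
    trace = np.empty(n)
    for i in range(n):
        u = -kp * theta - kd * omega  # torque command
        omega += u * dt
        theta += omega * dt
        trace[i] = theta
    return trace

p_only = steer(kp=4.0, kd=0.0)  # angle feedback alone: undamped oscillation
pd = steer(kp=4.0, kd=2.0)      # adding angular-velocity feedback damps it
```

    In the paper's setting, noisier sensory estimates feed this loop under echolocation than under vision, which is how the same controller can produce higher flight curvature in the dark.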

  12. Bio-inspired motion detection in an FPGA-based smart camera module.

    PubMed

    Köhler, T; Röchter, F; Lindemann, J P; Möller, R

    2009-03-01

    Flying insects, despite their relatively coarse vision and tiny nervous system, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by the integration of signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10,000 single EMDs, which is closely modelled after insect motion-detection circuits with respect to overall architecture, resolution and inter-receptor spacing. Input to the EMD array is provided by a CMOS camera with a high frame rate. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the EMD detector type and parameters such as the EMD time constants, the motion-detection directions and the angle between correlated receptors are reconfigurable online. This allows a flexible and simultaneous detection of complex motion fields such as translation, rotation and looming, such that various tasks, e.g., obstacle avoidance, height/distance control or speed regulation can be performed by the same compact device.

  13. A lightweight, inexpensive robotic system for insect vision.

    PubMed

    Sabo, Chelsea; Chisholm, Robert; Petterson, Adam; Cope, Alex

    2017-09-01

    Designing hardware for miniaturized robotics that mimics the capabilities of flying insects is of interest, because the two share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects' impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware available. Suitable hardware is either prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress in understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and the camera and insect vision models are then evaluated. We analyse the potential of the system for use in embodiment of higher-level visual processes (i.e. motion detection) and also for development of vision-based navigation for robotics in general. Optic flow from sample camera data is calculated and compared to a perfect, simulated bee world, showing an excellent resemblance. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Vision System for Coarsely Estimating Motion Parameters for Unknown Fast Moving Objects in Space

    PubMed Central

    Chen, Min; Hashimoto, Koichi

    2017-01-01

    Motivated by biological interest in analyzing the navigation behaviors of flying animals, we attempt to build a system for measuring their motion states. To do this, in this paper we build a vision system that detects unknown fast-moving objects within a given space and calculates their motion parameters, represented by positions and poses. We propose a novel method to detect reliable interest points in images of moving objects, which can hardly be detected by general-purpose interest point detectors. 3D points reconstructed using these interest points are then grouped and maintained for detected objects, according to a careful schedule that considers appearance and perspective changes. In the estimation step, a method is introduced to adapt the robust estimation procedure used for dense point sets to the case of sparse sets, reducing the potential risk of greatly biased estimation. Experiments are conducted on real scenes, showing the capability of the system to detect multiple unknown moving objects and estimate their positions and poses. PMID:29206189
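    The pose-estimation step rests on aligning reconstructed 3D point sets. A standard least-squares building block for this, the Kabsch algorithm via SVD, can be sketched as below (the paper's robust estimator for sparse sets is more elaborate; this is the noise-free core):

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.standard_normal((8, 3))  # sparse 3D points on the object (body frame)

# Ground-truth motion: 30-degree rotation about z plus a translation.
ang = np.deg2rad(30.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.5, -0.2, 1.0])
obs = pts @ R_true.T + t_true      # the same points after the motion

def kabsch(p, q):
    """Least-squares rigid transform (R, t) with q_i ~ R p_i + t."""
    pc, qc = p - p.mean(axis=0), q - q.mean(axis=0)
    u, _, vt = np.linalg.svd(pc.T @ qc)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    R = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = q.mean(axis=0) - R @ p.mean(axis=0)
    return R, t

R_est, t_est = kabsch(pts, obs)
```

    With noisy or partially mismatched points, this closed-form solve is typically wrapped in an outlier-rejection loop, which is where the paper's careful grouping and robust procedure come in.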

  15. Proceedings from the 2nd International Symposium on Formation Flying Missions and Technologies

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics discussed include: The Stellar Imager (SI) "Vision Mission"; First Formation Flying Demonstration Mission Including on Flight Nulling; Formation Flying X-ray Telescope in L2 Orbit; SPECS: The Kilometer-baseline Far-IR Interferometer in NASA's Space Science Roadmap Presentation; A Tight Formation for Along-track SAR Interferometry; Realization of the Solar Power Satellite using the Formation Flying Solar Reflector; SIMBOL-X: Formation Flying for High-Energy Astrophysics; High Precision Optical Metrology for DARWIN; Close Formation Flight of Micro-Satellites for SAR Interferometry; Station-Keeping Requirements for Astronomical Imaging with Constellations of Free-Flying Collectors; Closed-Loop Control of Formation Flying Satellites; Formation Control for the MAXIM Mission; Precision Formation Keeping at L2 Using the Autonomous Formation Flying Sensor; Robust Control of Multiple Spacecraft Formation Flying; Virtual Rigid Body (VRB) Satellite Formation Control: Stable Mode-Switching and Cross-Coupling; Electromagnetic Formation Flight (EMFF) System Design, Mission Capabilities, and Testbed Development; Navigation Algorithms for Formation Flying Missions; Use of Formation Flying Small Satellites Incorporating OISL's in a Tandem Cluster Mission; Semimajor Axis Estimation Strategies; Relative Attitude Determination of Earth Orbiting Formations Using GPS Receivers; Analysis of Formation Flying in Eccentric Orbits Using Linearized Equations of Relative Motion; Conservative Analytical Collision Probabilities for Orbital Formation Flying; Equations of Motion and Stability of Two Spacecraft in Formation at the Earth/Moon Triangular Libration Points; Formations Near the Libration Points: Design Strategies Using Natural and Non-Natural Arcs; An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer; GVE-Based Dynamics and Control for Formation Flying Spacecraft; GNC System Design for a New Concept of X-Ray Distributed Telescope; GNC System for the Deployment and Fine Control of the DARWIN Free-Flying Interferometer; Formation Algorithm and Simulation Testbed; and PLATFORM: A Formation Flying, RvD and Robotic Validation Test-bench.

  16. Adaptive-Repetitive Visual-Servo Control of Low-Flying Aerial Robots via Uncalibrated High-Flying Cameras

    NASA Astrophysics Data System (ADS)

    Guo, Dejun; Bourne, Joseph R.; Wang, Hesheng; Yim, Woosoon; Leang, Kam K.

    2017-08-01

    This paper presents the design and implementation of an adaptive-repetitive visual-servo control system for a moving high-flying vehicle (HFV) with an uncalibrated camera to monitor, track, and precisely control the movements of a low-flying vehicle (LFV) or mobile ground robot. Applications of this control strategy include the use of high-flying unmanned aerial vehicles (UAVs) with computer vision for monitoring, controlling, and coordinating the movements of lower altitude agents in areas, for example, where GPS signals may be unreliable or nonexistent. When deployed, a remote operator of the HFV defines the desired trajectory for the LFV in the HFV's camera frame. Due to the circular motion of the HFV, the resulting motion trajectory of the LFV in the image frame can be periodic in time, thus an adaptive-repetitive control system is exploited for regulation and/or trajectory tracking. The adaptive control law is able to handle uncertainties in the camera's intrinsic and extrinsic parameters. The design and stability analysis of the closed-loop control system are presented, where Lyapunov stability is shown. Simulation and experimental results are presented to demonstrate the effectiveness of the method for controlling the movement of a low-flying quadcopter, demonstrating the capabilities of the visual-servo control system for localization (i.e., motion capture) and trajectory tracking control. In fact, results show that the LFV can be commanded to hover in place as well as track a user-defined flower-shaped closed trajectory, while the HFV and camera system circles above with constant angular velocity. On average, the proposed adaptive-repetitive visual-servo control system reduces the RMS tracking error by over 77% in the image plane and over 71% in the world frame compared to using just the adaptive visual-servo control law.

  17. How Lovebirds Maneuver Rapidly Using Super-Fast Head Saccades and Image Feature Stabilization

    PubMed Central

    Kress, Daniel; van Bokhorst, Evelien; Lentink, David

    2015-01-01

    Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take-off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkable stereotypical gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones. PMID:26107413

  18. Neural Summation in the Hawkmoth Visual System Extends the Limits of Vision in Dim Light.

    PubMed

    Stöckl, Anna Lisa; O'Carroll, David Charles; Warrant, Eric James

    2016-03-21

    Most of the world's animals are active in dim light and depend on good vision for the tasks of daily life. Many have evolved visual adaptations that permit a performance superior to that of manmade imaging devices [1]. In insects, a major model visual system, nocturnal species show impressive visual abilities ranging from flight control [2, 3], to color discrimination [4, 5], to navigation using visual landmarks [6-8] or dim celestial compass cues [9, 10]. In addition to optical adaptations that improve their sensitivity in dim light [11], neural summation of light in space and time, which enhances the coarser and slower features of the scene at the expense of noisier finer and faster features, has been suggested to improve sensitivity in theoretical [12-14], anatomical [15-17], and behavioral [18-20] studies. How these summation strategies function neurally is, however, presently unknown. Here, we quantified spatial and temporal summation in the motion vision pathway of a nocturnal hawkmoth. We show that spatial and temporal summation combine supralinearly to substantially increase contrast sensitivity and visual information rate over four decades of light intensity, enabling hawkmoths to see at light levels 100 times dimmer than without summation. Our results reveal how visual motion is calculated neurally in dim light and how spatial and temporal summation improve sensitivity while simultaneously maximizing spatial and temporal resolution, thus extending models of insect motion vision derived predominantly from diurnal flies. Moreover, the summation strategies we have revealed may benefit manmade vision systems optimized for variable light levels [21]. Copyright © 2016 Elsevier Ltd. All rights reserved.
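    The statistical core of spatial and temporal summation, namely that pooling N noisy samples improves the signal-to-noise ratio by a factor of sqrt(N), can be checked with a quick simulation. Gaussian noise here stands in for the Poisson photon noise of real dim-light vision, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
signal = 1.0    # mean response per receptor per integration window
noise_sd = 2.0  # dim light: the noise exceeds the signal itself

def snr(n_pool, trials=200_000):
    """Signal-to-noise ratio after pooling n_pool independent samples."""
    samples = signal + noise_sd * rng.standard_normal((trials, n_pool))
    pooled = samples.mean(axis=1)
    return pooled.mean() / pooled.std()

snr_single, snr_pooled = snr(1), snr(16)  # pooling 16 samples: ~4x the SNR
```

    Pooling 16 samples roughly quadruples the SNR; the cost, as the abstract notes, is spatial and temporal resolution, which is why the supralinear combination the authors measure is notable.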

  19. Microsaccadic sampling of moving image information provides Drosophila hyperacute vision

    PubMed Central

    Solanki, Narendra; Rien, Diana; Jaciuch, David; Dongre, Sidhartha Anil; Blanchard, Florence; de Polavieja, Gonzalo G; Hardie, Roger C; Takalo, Jouni

    2017-01-01

    Small fly eyes should not see fine image details. Because flies exhibit saccadic visual behaviors and their compound eyes have relatively few ommatidia (sampling points), their photoreceptors would be expected to generate blurry and coarse retinal images of the world. Here we demonstrate that Drosophila see the world far better than predicted from the classic theories. By using electrophysiological, optical and behavioral assays, we found that R1-R6 photoreceptors' encoding capacity in time is maximized to fast high-contrast bursts, which resemble their light input during saccadic behaviors. Over space, meanwhile, R1-R6s resolve moving objects at saccadic speeds beyond the predicted motion-blur limit. Our results show how refractory phototransduction and rapid photomechanical photoreceptor contractions jointly sharpen retinal images of moving objects in space-time, enabling hyperacute vision, and explain how such microsaccadic information sampling exceeds the compound eyes' optical limits. These discoveries elucidate how acuity depends upon photoreceptor function and eye movements. PMID:28870284

  20. A screen for constituents of motor control and decision making in Drosophila reveals visual distance-estimation neurons

    PubMed Central

    Triphan, Tilman; Nern, Aljoscha; Roberts, Sonia F.; Korff, Wyatt; Naiman, Daniel Q.; Strauss, Roland

    2016-01-01

    Climbing over chasms larger than step size is vital to fruit flies, since foraging and mating are achieved while walking. Flies avoid futile climbing attempts by processing parallax-motion vision to estimate gap width. To identify neuronal substrates of climbing control, we screened a large collection of fly lines with temporarily inactivated neuronal populations in a novel high-throughput assay described here. The observed climbing phenotypes were classified; lines in each group are reported. Selected lines were further analysed by high-resolution video cinematography. One striking class of flies attempts to climb chasms of unsurmountable width; expression analysis guided us to C2 optic-lobe interneurons. Inactivation of C2 or the closely related C3 neurons with highly specific intersectional driver lines consistently reproduced hyperactive climbing whereas strong or weak artificial depolarization of C2/C3 neurons strongly or mildly decreased climbing frequency. Contrast-manipulation experiments support our conclusion that C2/C3 neurons are part of the distance-evaluation system. PMID:27255169

  21. Pilot Errors Involving Head-Up Displays (HUDs), Helmet-Mounted Displays (HMDs), and Night Vision Goggles (NVGs)

    DTIC Science & Technology

    1992-01-01

    results in stimulation of spatial-motion-location visual processes, which are known to take precedence over any other sensory or cognitive stimuli. In...or version he is flying. This was initially an observation that stimulated the birth of the human-factors engineering discipline during World War II...collisions with the surface, the pilot needs inputs to sensory channels other than the focal visual system. Properly designed auditory and

  22. Insect-Inspired Optical-Flow Navigation Sensors

    NASA Technical Reports Server (NTRS)

    Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven

    2005-01-01

    Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ("optical mouse chips") are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: The concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motions in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speeds and altitudes in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
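    The displacement-from-texture computation at the heart of such optical-flow sensors can be sketched in 1-D: estimate image motion as the lag that maximizes the cross-correlation between two frames. This is a toy version; real optical mouse chips work on small 2-D patches in dedicated hardware:

```python
import numpy as np

rng = np.random.default_rng(7)
frame1 = rng.standard_normal(256)    # a textured 1-D image row
true_shift = 3                       # image motion between frames, in pixels
frame2 = np.roll(frame1, true_shift)

# Estimate the motion as the lag maximizing the circular cross-correlation,
# computed efficiently with the FFT.
xcorr = np.fft.ifft(np.conj(np.fft.fft(frame1)) * np.fft.fft(frame2)).real
estimated_shift = int(np.argmax(xcorr))
```

    Note that the estimate depends entirely on the texture: a uniform (textureless) row gives a flat correlation with no usable peak, which is why optic-flow navigation degrades over featureless surfaces.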

  23. Pilot response to peripheral vision cues during instrument flying tasks.

    DOT National Transportation Integrated Search

    1968-02-01

    In an attempt to more closely associate the visual aspects of instrument flying with that of contact flight, a study was made of human response to peripheral vision cues relating to aircraft roll attitude. Pilots, ranging from 52 to 12,000 flying hou...

  24. Nocturnal insects use optic flow for flight control

    PubMed Central

    Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie

    2011-01-01

    To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta—like their day-active relatives—rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects. PMID:21307047
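    The behavioral result is consistent with a controller that regulates flight speed so that perceived translational optic flow stays at a set-point: halving the available flow signal (texture on one wall) doubles the equilibrium groundspeed, and removing it entirely lets speed climb unchecked. The sketch below is a generic illustration with made-up constants, not the authors' model; the mapping of `flow_gain` to the tunnel conditions is likewise an assumption:

```python
def groundspeed(flow_setpoint=2.0, wall_distance=0.5, flow_gain=1.0,
                k=0.5, dt=0.01, T=20.0):
    """Integrate a speed controller that nulls the error between measured
    translational optic flow and a set-point. `flow_gain` scales how much
    flow signal the walls provide (1.0 = both walls textured, 0.5 = one
    wall, 0.0 = no horizontal texture at all)."""
    v = 0.0
    for _ in range(int(T / dt)):
        measured_flow = flow_gain * v / wall_distance
        v += k * (flow_setpoint - measured_flow) * dt
        v = max(v, 0.0)  # no flying backwards
    return v

v_both = groundspeed(flow_gain=1.0)  # both walls textured
v_one = groundspeed(flow_gain=0.5)   # flow cues halved: speed doubles
v_none = groundspeed(flow_gain=0.0)  # no flow feedback: speed climbs
```

    The equilibrium is v = setpoint * distance / gain, so weaker flow input yields faster flight, matching the increased groundspeed observed in conditions (ii) and (iii).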

  5. FlyCap: Markerless Motion Capture Using Multiple Autonomous Flying Cameras.

    PubMed

    Xu, Lan; Liu, Yebin; Cheng, Wei; Guo, Kaiwen; Zhou, Guyue; Dai, Qionghai; Fang, Lu

    2017-07-18

    Aiming at automatic, convenient and non-intrusive motion capture, this paper presents a new-generation markerless motion capture technique, the FlyCap system, to capture surface motions of moving characters using multiple autonomous flying cameras (autonomous unmanned aerial vehicles (UAVs), each integrated with an RGBD video camera). During data capture, three cooperative flying cameras automatically track and follow the moving target, who performs large-scale motions in a wide space. We propose a novel non-rigid surface registration method to track and fuse the depth streams of the three flying cameras for surface motion tracking of the moving target, while simultaneously calculating the pose of each flying camera. We leverage the visual-odometry information provided by the UAV platform and formulate the surface tracking problem as a non-linear objective function that can be linearized and effectively minimized with a Gauss-Newton method. Quantitative and qualitative experimental results demonstrate plausible surface and motion reconstruction results.
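
    The Gauss-Newton step at the heart of such a tracker can be illustrated on a toy nonlinear least-squares problem. This is a minimal sketch, not the FlyCap objective; the exponential model, the data, and the starting guess are invented for illustration.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Minimize 0.5*||r(x)||^2 by repeatedly linearizing r(x) and
    solving the normal equations for the parameter update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        # Solve J^T J dx = -J^T r for the Gauss-Newton update.
        x = x + np.linalg.solve(J.T @ J, -J.T @ r)
    return x

# Toy problem: recover the rate a in y = exp(a*t) from noiseless
# samples generated with a = 0.5, starting far from the answer.
t = np.linspace(0.0, 2.0, 50)
y = np.exp(0.5 * t)
residual = lambda x: np.exp(x[0] * t) - y
jacobian = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
a_hat = gauss_newton(residual, jacobian, [0.1])   # converges to 0.5
```

In the paper's setting the unknowns are surface deformation and camera pose parameters rather than a single scalar, but the linearize-and-solve loop is the same.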

  6. Motion transparency: making models of motion perception transparent.

    PubMed

    Snowden; Verstraten

    1999-10-01

    In daily life our visual system is bombarded with motion information. We see cars driving by, flocks of birds flying in the sky, clouds passing behind trees that are dancing in the wind. Vision science has a good understanding of the first stage of visual motion processing, that is, the mechanism underlying the detection of local motions. Currently, research is focused on the processes that occur beyond this first stage. At this level, local motions have to be integrated to form objects, define the boundaries between them, construct surfaces and so on. An interesting, if complicated, case is known as motion transparency: the situation in which two overlapping surfaces move transparently over each other. In that case, two motions have to be assigned to the same retinal location. Several researchers have tried to solve this problem from a computational point of view, using physiological and psychophysical results as a guideline. We discuss two models: one uses the traditional idea known as 'filter selection', and the other takes a relatively new approach based on Bayesian inference. Predictions from these models are compared with our own visual behaviour and with that of the neural substrates that are presumed to underlie these perceptions.

  7. Walking modulates speed sensitivity in Drosophila motion vision.

    PubMed

    Chiappe, M Eugenia; Seelig, Johannes D; Reiser, Michael B; Jayaraman, Vivek

    2010-08-24

    Changes in behavioral state modify neural activity in many systems. In some vertebrates such modulation has been observed and interpreted in the context of attention and sensorimotor coordinate transformations. Here we report state-dependent activity modulations during walking in a visual-motor pathway of Drosophila. We used two-photon imaging to monitor intracellular calcium activity in motion-sensitive lobula plate tangential cells (LPTCs) in head-fixed Drosophila walking on an air-supported ball. Cells of the horizontal system (HS)--a subgroup of LPTCs--showed stronger calcium transients in response to visual motion when flies were walking rather than resting. The amplified responses were also correlated with walking speed. Moreover, HS neurons showed a relatively higher gain in response strength at higher temporal frequencies, and their optimum temporal frequency was shifted toward higher motion speeds. Walking-dependent modulation of HS neurons in the Drosophila visual system may constitute a mechanism to facilitate processing of higher image speeds in behavioral contexts where these speeds of visual motion are relevant for course stabilization. Copyright 2010 Elsevier Ltd. All rights reserved.

  8. Nocturnal insects use optic flow for flight control.

    PubMed

    Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie

    2011-08-23

    To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta, like their day-active relatives, rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects. This journal is © 2011 The Royal Society.

  9. Software model of a machine vision system based on the common house fly.

    PubMed

    Madsen, Robert; Barrett, Steven; Wilcox, Michael

    2005-01-01

    The vision system of the common house fly has many properties, such as hyperacuity and a parallel structure, which would be advantageous in a machine vision system. A software model has been developed that is ultimately intended to be a tool to guide the design of an analog real-time vision system. The model starts by laying out cartridges over an image. The cartridges are analogous to the ommatidia of the fly's eye and contain seven photoreceptors, each with a Gaussian acceptance profile. The spacing between photoreceptors is variable, providing for more or less detail as needed. The cartridges report what type of features they see, and neighboring cartridges share information to construct a feature map.
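
    A minimal sketch of the sampling scheme described above. The hexagonal layout, spacing, and profile width are hypothetical parameters, not values from the paper: each cartridge pools the image through seven Gaussian-weighted photoreceptors (a central receptor plus a ring of six).

```python
import numpy as np

def gaussian_photoreceptor(image, cx, cy, sigma=2.0):
    """Response of one photoreceptor: a Gaussian-weighted average of
    image intensity around its optical axis (cx, cy)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    weights = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return float((weights * image).sum() / weights.sum())

def cartridge_responses(image, cx, cy, spacing=3.0, sigma=2.0):
    """One 'cartridge': a central photoreceptor plus six neighbours on
    a hexagon, mimicking the seven receptors per sampling unit."""
    angles = np.deg2rad(np.arange(0, 360, 60))
    centres = [(cx, cy)] + [(cx + spacing * np.cos(a),
                             cy + spacing * np.sin(a)) for a in angles]
    return [gaussian_photoreceptor(image, x, y, sigma) for x, y in centres]

# A vertical bright edge: receptors on the bright side respond more,
# so comparing receptor outputs localizes the edge within a cartridge.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
resp = cartridge_responses(img, 16, 16)
```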

  10. Peripheral Processing Facilitates Optic Flow-Based Depth Perception

    PubMed Central

    Li, Jinglin; Lindemann, Jens P.; Egelhaaf, Martin

    2016-01-01

    Flying insects, such as flies or bees, rely on consistent information regarding the depth structure of the environment when performing their flight maneuvers in cluttered natural environments. These behaviors include avoiding collisions, approaching targets and spatial navigation. Insects are thought to obtain depth information visually from the retinal image displacements (“optic flow”) during translational ego-motion. Optic flow in the insect visual system is processed by a mechanism that can be modeled by correlation-type elementary motion detectors (EMDs). However, it is still an open question how spatial information can be extracted reliably from the highly contrast- and pattern-dependent EMD responses, especially if the vast range of light intensities encountered in natural environments is taken into account. This question will be addressed here by systematically modeling the peripheral visual system of flies, including various adaptive mechanisms. Different model variants of the peripheral visual system were stimulated with image sequences that mimic the panoramic visual input during translational ego-motion in various natural environments, and the resulting peripheral signals were fed into an array of EMDs. We characterized the influence of each peripheral computational unit on the representation of spatial information in the EMD responses. Our model simulations reveal that information about the overall light level needs to be eliminated from the EMD input, as is accomplished under light-adapted conditions in the insect peripheral visual system. The response characteristics of large monopolar cells (LMCs) resemble those of a band-pass filter, which strongly reduces the contrast dependency of EMDs, effectively enhancing the representation of the nearness of objects and, especially, of their contours.
We furthermore show that local brightness adaptation of photoreceptors allows for spatial vision under a wide range of dynamic light conditions. PMID:27818631
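
    The correlation-type EMD invoked above can be sketched in a few lines. This is a textbook Hassenstein-Reichardt detector, not the authors' full peripheral model; the filter time constant, grating frequency, and phase offset are arbitrary illustrative choices.

```python
import numpy as np

def emd_response(left, right, tau=5.0, dt=1.0):
    """Hassenstein-Reichardt detector: each arm low-pass filters one
    photoreceptor signal and multiplies it with the undelayed
    neighbour; subtracting the two arms yields a direction-selective
    output (positive here for left-to-right motion)."""
    alpha = dt / (tau + dt)              # first-order low-pass coefficient
    lp_l, lp_r = np.zeros_like(left), np.zeros_like(right)
    for i in range(1, len(left)):
        lp_l[i] = lp_l[i - 1] + alpha * (left[i] - lp_l[i - 1])
        lp_r[i] = lp_r[i - 1] + alpha * (right[i] - lp_r[i - 1])
    return lp_l * right - lp_r * left

# A sinusoidal grating drifting from the 'left' to the 'right'
# photoreceptor (the right input lags by a fixed spatial phase):
t = np.arange(500.0)
left = np.sin(0.1 * t)
right = np.sin(0.1 * t - 0.3)
out = emd_response(left, right)   # time-averaged output is positive
```

Reversing the motion direction (swapping the inputs) flips the sign of the output, which is the direction selectivity the abstract relies on; the same multiplication stage is what makes the response contrast- and pattern-dependent.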

  11. 78 FR 23786 - Settlement Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... convening a settlement conference between GameFly, Inc. and the Postal Service. This notice informs the.... GameFly Motion, Postal Service Reply, and GameFly Response IV. Analysis V. Settlement Procedures I..., 2013, GameFly, Inc. (GameFly) filed a motion requesting the Commission to establish standards and...

  12. Anatomical Reconstruction and Functional Imaging Reveal an Ordered Array of Skylight Polarization Detectors in Drosophila

    PubMed Central

    Bleul, Christiane; Baumann-Klausener, Franziska; Labhart, Thomas; Dickinson, Michael H.

    2016-01-01

    Many insects exploit skylight polarization as a compass cue for orientation and navigation. In the fruit fly, Drosophila melanogaster, photoreceptors R7 and R8 in the dorsal rim area (DRA) of the compound eye are specialized to detect the electric vector (e-vector) of linearly polarized light. These photoreceptors are arranged in stacked pairs with identical fields of view and spectral sensitivities, but mutually orthogonal microvillar orientations. As in larger flies, we found that the microvillar orientation of the distal photoreceptor R7 changes in a fan-like fashion along the DRA. This anatomical arrangement suggests that the DRA constitutes a detector for skylight polarization, in which different e-vectors maximally excite different positions in the array. To test our hypothesis, we measured responses to polarized light of varying e-vector angles in the terminals of R7/8 cells using genetically encoded calcium indicators. Our data confirm a progression of preferred e-vector angles from anterior to posterior in the DRA, and a strict orthogonality between the e-vector preferences of paired R7/8 cells. We observed decreased activity in photoreceptors in response to flashes of light polarized orthogonally to their preferred e-vector angle, suggesting reciprocal inhibition between photoreceptors in the same medullar column, which may serve to increase polarization contrast. Together, our results indicate that the polarization-vision system relies on a spatial map of preferred e-vector angles at the earliest stage of sensory processing. SIGNIFICANCE STATEMENT The fly's visual system is an influential model system for studying neural computation, and much is known about its anatomy, physiology, and development. The circuits underlying motion processing have received the most attention, but researchers are increasingly investigating other functions, such as color perception and object recognition. 
In this work, we investigate the early neural processing of a somewhat exotic sense, called polarization vision. Because skylight is polarized in an orientation that is rigidly determined by the position of the sun, this cue provides compass information. Behavioral experiments have shown that many species use the polarization pattern in the sky to direct locomotion. Here we describe the input stage of the fly's polarization-vision system. PMID:27170135
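
    The push-pull arrangement described above can be captured with a Malus-law toy model. This is a hedged illustration, not the authors' calcium-imaging analysis: the polarization-sensitivity value is invented, and plain subtraction stands in for the reciprocal inhibition the paper infers.

```python
import numpy as np

def pr_response(e_vector_deg, pref_deg, pol_sensitivity=5.0):
    """Photoreceptor response to linearly polarized light: a
    Malus-law-like cos^2 modulation around the microvillar
    orientation, with a finite polarization sensitivity."""
    theta = np.deg2rad(e_vector_deg - pref_deg)
    return (1.0 + (pol_sensitivity - 1.0) * np.cos(theta) ** 2) / pol_sensitivity

def opponent_signal(e_vector_deg, pref_deg):
    """An R7/R8 pair with mutually orthogonal e-vector preferences;
    subtracting the paired responses enhances polarization contrast."""
    return (pr_response(e_vector_deg, pref_deg)
            - pr_response(e_vector_deg, pref_deg + 90.0))

# The opponent signal is maximal at the preferred angle, zero at
# 45 degrees off, and most negative at the orthogonal angle:
peak = opponent_signal(30.0, 30.0)      # +0.8
null = opponent_signal(75.0, 30.0)      #  0.0
trough = opponent_signal(120.0, 30.0)   # -0.8
```

An array of such pairs with preferred angles fanning from anterior to posterior gives the spatial map of e-vectors described in the record.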

  13. Object preference by walking fruit flies, Drosophila melanogaster, is mediated by vision and graviperception

    PubMed Central

    Robie, Alice A.; Straw, Andrew D.; Dickinson, Michael H.

    2010-01-01

    Walking fruit flies, Drosophila melanogaster, use visual information to orient towards salient objects in their environment, presumably as a search strategy for finding food, shelter or other resources. Less is known, however, about the role of vision or other sensory modalities such as mechanoreception in the evaluation of objects once they have been reached. To study the role of vision and mechanoreception in exploration behavior, we developed a large arena in which we could track individual fruit flies as they walked through either simple or more topologically complex landscapes. When exploring a simple, flat environment lacking three-dimensional objects, flies used visual cues from the distant background to stabilize their walking trajectories. When exploring an arena containing an array of cones differing in geometry, flies actively oriented towards, climbed onto, and explored the objects, spending most of their time on the tallest, steepest object. A fly's behavioral response to the geometry of an object depended upon the intrinsic properties of each object and not on a comparison with other nearby objects. Furthermore, the preference was not due to a greater attraction towards tall, steep objects, but rather to a change in locomotor behavior once a fly reached and explored the surface. Specifically, flies are much more likely to stop walking for long periods when they are perched on tall, steep objects. Both the visual system and the antennal chordotonal organs (Johnston's organs) provide sufficient information about the geometry of an object to elicit the observed change in locomotor behavior. Only when both of these sensory systems were impaired did flies fail to show the behavioral preference for the tall, steep objects. PMID:20581279

  14. Vision-based flight control in the hawkmoth Hyles lineata

    PubMed Central

    Windsor, Shane P.; Bomphrey, Richard J.; Taylor, Graham K.

    2014-01-01

    Vision is a key sensory modality for flying insects, playing an important role in guidance, navigation and control. Here, we use a virtual-reality flight simulator to measure the optomotor responses of the hawkmoth Hyles lineata, and use a published linear time-invariant model of the flight dynamics to interpret the function of the measured responses in flight stabilization and control. We recorded the forces and moments produced during oscillation of the visual field in roll, pitch and yaw, varying the temporal frequency, amplitude or spatial frequency of the stimulus. The moths’ responses were strongly dependent upon contrast frequency, as expected if the optomotor system uses correlation-type motion detectors to sense self-motion. The flight dynamics model predicts that roll angle feedback is needed to stabilize the lateral dynamics, and that a combination of pitch angle and pitch rate feedback is most effective in stabilizing the longitudinal dynamics. The moths’ responses to roll and pitch stimuli coincided qualitatively with these functional predictions. The moths produced coupled roll and yaw moments in response to yaw stimuli, which could help to reduce the energetic cost of correcting heading. Our results emphasize the close relationship between physics and physiology in the stabilization of insect flight. PMID:24335557

  15. Vision-based flight control in the hawkmoth Hyles lineata.

    PubMed

    Windsor, Shane P; Bomphrey, Richard J; Taylor, Graham K

    2014-02-06

    Vision is a key sensory modality for flying insects, playing an important role in guidance, navigation and control. Here, we use a virtual-reality flight simulator to measure the optomotor responses of the hawkmoth Hyles lineata, and use a published linear time-invariant model of the flight dynamics to interpret the function of the measured responses in flight stabilization and control. We recorded the forces and moments produced during oscillation of the visual field in roll, pitch and yaw, varying the temporal frequency, amplitude or spatial frequency of the stimulus. The moths' responses were strongly dependent upon contrast frequency, as expected if the optomotor system uses correlation-type motion detectors to sense self-motion. The flight dynamics model predicts that roll angle feedback is needed to stabilize the lateral dynamics, and that a combination of pitch angle and pitch rate feedback is most effective in stabilizing the longitudinal dynamics. The moths' responses to roll and pitch stimuli coincided qualitatively with these functional predictions. The moths produced coupled roll and yaw moments in response to yaw stimuli, which could help to reduce the energetic cost of correcting heading. Our results emphasize the close relationship between physics and physiology in the stabilization of insect flight.

  16. Object Recognition in Flight: How Do Bees Distinguish between 3D Shapes?

    PubMed Central

    Werner, Annette; Stürzl, Wolfgang; Zanker, Johannes

    2016-01-01

    Honeybees (Apis mellifera) discriminate multiple object features such as colour, pattern and 2D shape, but it remains unknown whether and how bees recover three-dimensional shape. Here we show that bees can recognize objects by their three-dimensional form, whereby they employ an active strategy to uncover the depth profiles. We trained individual, free flying honeybees to collect sugar water from small three-dimensional objects made of styrofoam (sphere, cylinder, cuboids) or folded paper (convex, concave, planar) and found that bees can easily discriminate between these stimuli. We also tested possible strategies employed by the bees to uncover the depth profiles. For the card stimuli, we excluded overall shape and pictorial features (shading, texture gradients) as cues for discrimination. Lacking sufficient stereo vision, bees are known to use speed gradients in optic flow to detect edges; could the bees apply this strategy also to recover the fine details of a surface depth profile? Analysing the bees’ flight tracks in front of the stimuli revealed specific combinations of flight maneuvers (lateral translations in combination with yaw rotations), which are particularly suitable to extract depth cues from motion parallax. We modelled the generated optic flow and found characteristic patterns of angular displacement corresponding to the depth profiles of our stimuli: optic flow patterns from pure translations successfully recovered depth relations from the magnitude of angular displacements, while additional rotation provided robust depth information based on the direction of the displacements; thus, the bees’ flight maneuvers may reflect an optimized visuo-motor strategy to extract depth structure from motion signals. The robustness and simplicity of this strategy offers an efficient solution for 3D object recognition without stereo vision, and could be employed by other flying insects or mobile robots. PMID:26886006
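
    The translational optic-flow geometry underlying this strategy reduces to a one-line relation. The numbers below (flight speed, viewing angle, flow rate) are hypothetical, not measurements from the study.

```python
import math

def depth_from_parallax(v, theta_deg, omega):
    """Pure translation at speed v: a feature at viewing angle theta
    (measured from the direction of motion) and distance d produces
    retinal image motion omega = v*sin(theta)/d, so measured flow
    gives depth directly when v is known."""
    return v * math.sin(math.radians(theta_deg)) / omega

# A bee translating at 0.2 m/s measures 2 rad/s of image motion for
# a surface feature directly abeam (theta = 90 deg):
d = depth_from_parallax(0.2, 90.0, 2.0)   # 0.1 m: fast flow means near
```

Rotational flow carries no such depth signal, which is why the combination of lateral translation with yaw rotation observed in the tracks has to be disentangled before depth can be read off.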

  17. Object Recognition in Flight: How Do Bees Distinguish between 3D Shapes?

    PubMed

    Werner, Annette; Stürzl, Wolfgang; Zanker, Johannes

    2016-01-01

    Honeybees (Apis mellifera) discriminate multiple object features such as colour, pattern and 2D shape, but it remains unknown whether and how bees recover three-dimensional shape. Here we show that bees can recognize objects by their three-dimensional form, whereby they employ an active strategy to uncover the depth profiles. We trained individual, free flying honeybees to collect sugar water from small three-dimensional objects made of styrofoam (sphere, cylinder, cuboids) or folded paper (convex, concave, planar) and found that bees can easily discriminate between these stimuli. We also tested possible strategies employed by the bees to uncover the depth profiles. For the card stimuli, we excluded overall shape and pictorial features (shading, texture gradients) as cues for discrimination. Lacking sufficient stereo vision, bees are known to use speed gradients in optic flow to detect edges; could the bees apply this strategy also to recover the fine details of a surface depth profile? Analysing the bees' flight tracks in front of the stimuli revealed specific combinations of flight maneuvers (lateral translations in combination with yaw rotations), which are particularly suitable to extract depth cues from motion parallax. We modelled the generated optic flow and found characteristic patterns of angular displacement corresponding to the depth profiles of our stimuli: optic flow patterns from pure translations successfully recovered depth relations from the magnitude of angular displacements, while additional rotation provided robust depth information based on the direction of the displacements; thus, the bees' flight maneuvers may reflect an optimized visuo-motor strategy to extract depth structure from motion signals. The robustness and simplicity of this strategy offers an efficient solution for 3D object recognition without stereo vision, and could be employed by other flying insects or mobile robots.

  18. Peripheral Vision of Youths with Low Vision: Motion Perception, Crowding, and Visual Search

    PubMed Central

    Tadin, Duje; Nyquist, Jeffrey B.; Lusk, Kelly E.; Corn, Anne L.; Lappin, Joseph S.

    2012-01-01

    Purpose. Effects of low vision on peripheral visual function are poorly understood, especially in children whose visual skills are still developing. The aim of this study was to measure both central and peripheral visual functions in youths with typical and low vision. Of specific interest was the extent to which measures of foveal function predict performance of peripheral tasks. Methods. We assessed central and peripheral visual functions in youths with typical vision (n = 7, ages 10–17) and low vision (n = 24, ages 9–18). Experimental measures used both static and moving stimuli and included visual crowding, visual search, motion acuity, motion direction discrimination, and multitarget motion comparison. Results. In most tasks, visual function was impaired in youths with low vision. Substantial differences, however, were found both between participant groups and, importantly, across different tasks within participant groups. Foveal visual acuity was a modest predictor of peripheral form vision and motion sensitivity in either the central or peripheral field. Despite exhibiting normal motion discrimination in the fovea, motion sensitivity of youths with low vision deteriorated in the periphery. This contrasted with typically sighted participants, who showed improved motion sensitivity with increasing eccentricity. Visual search was greatly impaired in youths with low vision. Conclusions. Our results reveal a complex pattern of visual deficits in peripheral vision and indicate a significant role of attentional mechanisms in observed impairments. These deficits were not adequately captured by measures of foveal function, arguing for the importance of independently assessing peripheral visual function. PMID:22836766

  19. Peripheral vision of youths with low vision: motion perception, crowding, and visual search.

    PubMed

    Tadin, Duje; Nyquist, Jeffrey B; Lusk, Kelly E; Corn, Anne L; Lappin, Joseph S

    2012-08-24

    Effects of low vision on peripheral visual function are poorly understood, especially in children whose visual skills are still developing. The aim of this study was to measure both central and peripheral visual functions in youths with typical and low vision. Of specific interest was the extent to which measures of foveal function predict performance of peripheral tasks. We assessed central and peripheral visual functions in youths with typical vision (n = 7, ages 10-17) and low vision (n = 24, ages 9-18). Experimental measures used both static and moving stimuli and included visual crowding, visual search, motion acuity, motion direction discrimination, and multitarget motion comparison. In most tasks, visual function was impaired in youths with low vision. Substantial differences, however, were found both between participant groups and, importantly, across different tasks within participant groups. Foveal visual acuity was a modest predictor of peripheral form vision and motion sensitivity in either the central or peripheral field. Despite exhibiting normal motion discrimination in the fovea, motion sensitivity of youths with low vision deteriorated in the periphery. This contrasted with typically sighted participants, who showed improved motion sensitivity with increasing eccentricity. Visual search was greatly impaired in youths with low vision. Our results reveal a complex pattern of visual deficits in peripheral vision and indicate a significant role of attentional mechanisms in observed impairments. These deficits were not adequately captured by measures of foveal function, arguing for the importance of independently assessing peripheral visual function.

  20. Independently Controlled Wing Stroke Patterns in the Fruit Fly Drosophila melanogaster

    PubMed Central

    Chakraborty, Soma; Bartussek, Jan; Fry, Steven N.; Zapotocky, Martin

    2015-01-01

    Flies achieve supreme flight maneuverability through a small set of minuscule steering muscles attached to the wing base. The fast flight maneuvers arise from precisely timed activation of the steering muscles and the resulting subtle modulation of the wing stroke. In addition, slower modulation of wing kinematics arises from changes in the activity of indirect flight muscles in the thorax. We investigated whether these modulations can be described as a superposition of a limited number of elementary deformations of the wing stroke that are under independent physiological control. Using a high-speed computer vision system, we recorded the wing motion of tethered flying fruit flies for up to 12,000 consecutive wing strokes at a sampling rate of 6250 Hz. We then decomposed the joint motion pattern of both wings into components that had the minimal mutual information (a measure of statistical dependence). In 100 flight segments measured from 10 individual flies, we identified 7 distinct types of frequently occurring least-dependent components, each defining a kinematic pattern (a specific deformation of the wing stroke and the sequence of its activation from cycle to cycle). Two of these stroke deformations can be associated with the control of yaw torque and total flight force, respectively. A third deformation involves a change in the downstroke-to-upstroke duration ratio, which is expected to alter the pitch torque. A fourth kinematic pattern consists of an alternation of stroke amplitude with a period of 2 wingbeat cycles, extending for dozens of cycles. Our analysis indicates that these four elementary kinematic patterns can be activated mutually independently, and occur both in isolation and in linear superposition. The results strengthen the available evidence for independent control of yaw torque, pitch torque, and total flight force. Our computational method facilitates systematic identification of novel patterns in large kinematic datasets. PMID:25710715

  1. Independently controlled wing stroke patterns in the fruit fly Drosophila melanogaster.

    PubMed

    Chakraborty, Soma; Bartussek, Jan; Fry, Steven N; Zapotocky, Martin

    2015-01-01

    Flies achieve supreme flight maneuverability through a small set of minuscule steering muscles attached to the wing base. The fast flight maneuvers arise from precisely timed activation of the steering muscles and the resulting subtle modulation of the wing stroke. In addition, slower modulation of wing kinematics arises from changes in the activity of indirect flight muscles in the thorax. We investigated whether these modulations can be described as a superposition of a limited number of elementary deformations of the wing stroke that are under independent physiological control. Using a high-speed computer vision system, we recorded the wing motion of tethered flying fruit flies for up to 12,000 consecutive wing strokes at a sampling rate of 6250 Hz. We then decomposed the joint motion pattern of both wings into components that had the minimal mutual information (a measure of statistical dependence). In 100 flight segments measured from 10 individual flies, we identified 7 distinct types of frequently occurring least-dependent components, each defining a kinematic pattern (a specific deformation of the wing stroke and the sequence of its activation from cycle to cycle). Two of these stroke deformations can be associated with the control of yaw torque and total flight force, respectively. A third deformation involves a change in the downstroke-to-upstroke duration ratio, which is expected to alter the pitch torque. A fourth kinematic pattern consists of an alternation of stroke amplitude with a period of 2 wingbeat cycles, extending for dozens of cycles. Our analysis indicates that these four elementary kinematic patterns can be activated mutually independently, and occur both in isolation and in linear superposition. The results strengthen the available evidence for independent control of yaw torque, pitch torque, and total flight force. Our computational method facilitates systematic identification of novel patterns in large kinematic datasets.
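
    Decomposing signals into least-dependent components is, in spirit, independent component analysis (ICA). Below is a minimal two-channel sketch, not the authors' method: it assumes a linear mixture of non-Gaussian sources and uses the kurtosis of whitened, rotated outputs as a crude proxy for minimal mutual information.

```python
import numpy as np

def kurt(x):
    """Excess kurtosis, a simple proxy for non-Gaussianity."""
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

def separate_two(mixed):
    """Split two linearly mixed signals into least-dependent
    components: whiten to remove correlations, then rotate to the
    angle whose outputs are maximally non-Gaussian (independent
    sources are the least Gaussian-looking rotation)."""
    x = mixed - mixed.mean(axis=1, keepdims=True)
    vals, vecs = np.linalg.eigh(np.cov(x))
    white = np.diag(vals ** -0.5) @ vecs.T @ x      # unit covariance
    best, best_score = white, -np.inf
    for a in np.linspace(0.0, np.pi / 2, 180):
        R = np.array([[np.cos(a), -np.sin(a)],
                      [np.sin(a),  np.cos(a)]])
        y = R @ white
        score = abs(kurt(y[0])) + abs(kurt(y[1]))
        if score > best_score:
            best, best_score = y, score
    return best

# Two independent non-Gaussian sources and a linear mixture of them:
rng = np.random.default_rng(0)
s = np.vstack([np.sign(rng.standard_normal(4000)),   # sub-Gaussian
               rng.standard_normal(4000) ** 3])      # super-Gaussian
rec = separate_two(np.array([[1.0, 0.6], [0.4, 1.0]]) @ s)
```

The recovered components match the original sources up to order and sign, which is the usual ICA ambiguity; the paper's analysis operates on high-dimensional wingbeat kinematics rather than two toy channels.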

  2. Line width determination using a biomimetic fly eye vision system.

    PubMed

    Benson, John B; Wright, Cameron H G; Barrett, Steven F

    2007-01-01

    Developing a new vision system based on the vision of the common house fly, Musca domestica, has created many interesting design challenges. One of those problems is line width determination, which is the topic of this paper. It has been discovered that line width can be determined with a single sensor as long as either the sensor or the object in question has a constant, known velocity. This is an important first step toward determining the width of an arbitrary object with unknown velocity.
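
    The single-sensor measurement reduces to timing a threshold crossing at known velocity. A minimal sketch; the threshold, sampling rate, and velocity below are hypothetical values, not parameters from the paper.

```python
def line_width(samples, threshold, velocity, dt):
    """Width of a dark line crossing a single photosensor at a known,
    constant velocity: width = velocity * (time the signal stays
    below threshold)."""
    below = sum(1 for s in samples if s < threshold)
    return velocity * below * dt

# Sensor sampled at 10 kHz; a line darkens it for 25 samples while
# the target moves past at 0.8 m/s:
signal = [1.0] * 40 + [0.2] * 25 + [1.0] * 40
width = line_width(signal, 0.5, 0.8, 1e-4)   # 0.002 m, i.e. 2 mm
```

Without the known velocity the dwell time alone is ambiguous (a wide slow line and a narrow fast line look identical), which is exactly the limitation the abstract notes.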

  3. Micro air vehicle autonomous obstacle avoidance from stereo-vision

    NASA Astrophysics Data System (ADS)

    Brockers, Roland; Kuwata, Yoshiaki; Weiss, Stephan; Matthies, Lawrence

    2014-06-01

    We introduce a new approach for on-board autonomous obstacle avoidance for micro air vehicles flying outdoors in close proximity to structure. Our approach uses inverse-range, polar-perspective stereo-disparity maps for obstacle detection and representation, and deploys a closed-loop RRT planner that considers flight dynamics for trajectory generation. While motion planning is executed in 3D space, we reduce collision checking to a fast z-buffer-like operation in disparity space, which allows for a significant speed-up compared to full 3D methods. Evaluations in simulation illustrate the robustness of our approach, while real-world flights under tree canopy demonstrate its potential.
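
    The z-buffer-like check can be sketched as a comparison in disparity space, using the pinhole relation disparity = fx * baseline / Z. This is an illustrative reconstruction under assumed conventions, not the authors' implementation; the camera intrinsics, margin parameter, and waypoint format are all invented.

```python
import numpy as np

def trajectory_is_safe(traj_points, disparity_map, fx, fy, cx, cy,
                       baseline, margin=0.0):
    """Z-buffer-like collision check in disparity space: a candidate
    waypoint is blocked if the stereo disparity measured along its
    pixel ray is at least the disparity the waypoint itself would
    have, i.e. some structure lies at or in front of it."""
    h, w = disparity_map.shape
    for X, Y, Z in traj_points:
        if Z <= 0:
            continue                      # behind the camera
        u, v = int(fx * X / Z + cx), int(fy * Y / Z + cy)
        if not (0 <= u < w and 0 <= v < h):
            continue                      # outside the field of view
        expected = fx * baseline / Z      # disparity of the waypoint
        if disparity_map[v, u] + margin >= expected:
            return False                  # obstacle at or in front
    return True

# A wall 2 m ahead filling the image; waypoints at 1 m and 3 m depth:
f, B = 300.0, 0.1
wall = np.full((120, 160), f * B / 2.0)   # disparity of a 2 m wall
safe_near = trajectory_is_safe([(0.0, 0.0, 1.0)], wall, f, f, 80.0, 60.0, B)  # True
safe_far = trajectory_is_safe([(0.0, 0.0, 3.0)], wall, f, f, 80.0, 60.0, B)   # False
```

Because the comparison is a per-pixel lookup rather than a 3D intersection test, checking a whole candidate trajectory costs one projection and one compare per waypoint, which is the source of the claimed speed-up.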

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodos, W.

    Collisions with wind turbines can be a problem for many species of birds. Of particular concern are collisions by eagles and other protected species. This research study used the laboratory methods of physiological optics, animal psychophysics, and retinal electrophysiology to analyze the causes of collisions and to evaluate visual deterrents based on the results of this analysis. Bird collisions with the seemingly slow-moving turbines seem paradoxical given the superb vision that most birds, especially raptors, possess. However, our optical analysis indicated that as the eye approaches the rotating blades, the retinal image of the blade (which is the information that is transmitted to the animal's brain) increases in velocity until it is moving so fast that the retina cannot keep up with it. At this point, the retinal image becomes a transparent blur that the bird probably interprets as a safe area to fly through, with disastrous consequences. This phenomenon is called "motion smear" or "motion blur."
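
    The geometry behind motion smear can be made concrete: for an eye closing head-on at speed v, a blade element at lateral offset r and distance d subtends theta = atan(r/d), so its retinal image moves at dtheta/dt = r*v/(d^2 + r^2), which grows steeply as d shrinks. The numbers below are purely illustrative, not from the study.

```python
import math

def retinal_angular_velocity(r, d, v):
    """Angular velocity (rad/s) of a blade element at lateral offset r
    viewed from distance d while closing head-on at speed v:
    theta = atan(r/d), so dtheta/dt = r*v / (d**2 + r**2)."""
    return r * v / (d ** 2 + r ** 2)

# A blade element 2 m off-axis, approached at 15 m/s: image motion
# stays slow until the final approach, then grows explosively.
for d in (50.0, 10.0, 2.0):
    print(round(math.degrees(retinal_angular_velocity(2.0, d, 15.0)), 1))
# prints 0.7, then 16.5, then 214.9 (deg/s)
```

The last value far exceeds the retinal image velocities birds can resolve, consistent with the blade dissolving into a transparent blur just when avoidance matters most.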

  5. Anatomical Reconstruction and Functional Imaging Reveal an Ordered Array of Skylight Polarization Detectors in Drosophila.

    PubMed

    Weir, Peter T; Henze, Miriam J; Bleul, Christiane; Baumann-Klausener, Franziska; Labhart, Thomas; Dickinson, Michael H

    2016-05-11

    Many insects exploit skylight polarization as a compass cue for orientation and navigation. In the fruit fly, Drosophila melanogaster, photoreceptors R7 and R8 in the dorsal rim area (DRA) of the compound eye are specialized to detect the electric vector (e-vector) of linearly polarized light. These photoreceptors are arranged in stacked pairs with identical fields of view and spectral sensitivities, but mutually orthogonal microvillar orientations. As in larger flies, we found that the microvillar orientation of the distal photoreceptor R7 changes in a fan-like fashion along the DRA. This anatomical arrangement suggests that the DRA constitutes a detector for skylight polarization, in which different e-vectors maximally excite different positions in the array. To test our hypothesis, we measured responses to polarized light of varying e-vector angles in the terminals of R7/8 cells using genetically encoded calcium indicators. Our data confirm a progression of preferred e-vector angles from anterior to posterior in the DRA, and a strict orthogonality between the e-vector preferences of paired R7/8 cells. We observed decreased activity in photoreceptors in response to flashes of light polarized orthogonally to their preferred e-vector angle, suggesting reciprocal inhibition between photoreceptors in the same medullar column, which may serve to increase polarization contrast. Together, our results indicate that the polarization-vision system relies on a spatial map of preferred e-vector angles at the earliest stage of sensory processing. The fly's visual system is an influential model system for studying neural computation, and much is known about its anatomy, physiology, and development. The circuits underlying motion processing have received the most attention, but researchers are increasingly investigating other functions, such as color perception and object recognition. 
In this work, we investigate the early neural processing of a somewhat exotic sense, called polarization vision. Because skylight is polarized in an orientation that is rigidly determined by the position of the sun, this cue provides compass information. Behavioral experiments have shown that many species use the polarization pattern in the sky to direct locomotion. Here we describe the input stage of the fly's polarization-vision system. Copyright © 2016 the authors.

  6. Commercial Flight Crew Decision-Making during Low-Visibility Approach Operations Using Fused Synthetic/Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Prinzel, Lawrence J., III

    2007-01-01

NASA is investigating revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next-generation air transportation system. A fixed-base piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck on the crew's decision-making process during low-visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions was neither improved nor adversely impacted by the display concepts. The addition of Enhanced Vision may not, in itself, provide an improvement in runway incursion detection without being specifically tailored for this application. Existing enhanced vision system procedures were effectively used in the crew decision-making process during approach and missed approach operations, but having to forcibly transition from an excellent FLIR image to natural vision by 100 ft above field level was awkward for the pilot-flying.

  7. Vision based control of unmanned aerial vehicles with applications to an autonomous four-rotor helicopter, quadrotor

    NASA Astrophysics Data System (ADS)

    Altug, Erdinc

Our work proposes a vision-based stabilization and output tracking control method for a model helicopter. This is a part of our effort to produce a rotorcraft-based autonomous Unmanned Aerial Vehicle (UAV). Due to the desired maneuvering ability, a four-rotor helicopter has been chosen as the testbed. In previous research on flying vehicles, vision is usually used as a secondary sensor. Unlike previous research, our goal is to use visual feedback as the main sensor, which is responsible not only for detecting where the ground objects are but also for helicopter localization. A novel two-camera method has been introduced for estimating the full six degrees of freedom (DOF) pose of the helicopter. This two-camera system consists of a pan-tilt ground camera and an onboard camera. The pose estimation algorithm is compared through simulation to other methods, such as the four-point and stereo methods, and is shown to be less sensitive to feature detection errors. Helicopters are highly unstable flying vehicles; although this instability is good for agility, it makes control harder. To build an autonomous helicopter, two methods of control are studied: one using a series of mode-based, feedback-linearizing controllers and the other using a back-stepping control law. Various simulations with 2D and 3D models demonstrate the implementation of these controllers. We also show global convergence of the 3D quadrotor controller even with large calibration errors or the presence of large errors on the image plane. Finally, we present initial flight experiments where the proposed pose estimation algorithm and non-linear control techniques have been implemented on a remote-controlled helicopter. The helicopter was restricted by a tether to vertical and yaw motions and limited x and y translations.

  8. Computer-aided system for detecting runway incursions

    NASA Astrophysics Data System (ADS)

    Sridhar, Banavar; Chatterji, Gano B.

    1994-07-01

    A synthetic vision system for enhancing the pilot's ability to navigate and control the aircraft on the ground is described. The system uses the onboard airport database and images acquired by external sensors. Additional navigation information needed by the system is provided by the Inertial Navigation System and the Global Positioning System. The various functions of the system, such as image enhancement, map generation, obstacle detection, collision avoidance, guidance, etc., are identified. The available technologies, some of which were developed at NASA, that are applicable to the aircraft ground navigation problem are noted. Example images of a truck crossing the runway while the aircraft flies close to the runway centerline are described. These images are from a sequence of images acquired during one of the several flight experiments conducted by NASA to acquire data to be used for the development and verification of the synthetic vision concepts. These experiments provide a realistic database including video and infrared images, motion states from the Inertial Navigation System and the Global Positioning System, and camera parameters.

  9. A Height Estimation Approach for Terrain Following Flights from Monocular Vision.

    PubMed

    Campos, Igor S G; Nascimento, Erickson R; Freitas, Gustavo M; Chaimowicz, Luiz

    2016-12-06

In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it remains unresolved for consumer-available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information, to estimate the flying height. To determine whether the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm showed good accuracy.
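The geometric core of height-from-flow can be stated in one line: for a nadir-pointing camera moving parallel to flat ground at speed v, ground features sweep past at angular rate omega = v/h, so h = v/omega. A minimal sketch (the function name and numbers are illustrative; the paper's full method tracks features with optical flow and gates estimates with a decision tree):

```python
def height_from_flow(ground_speed_mps, flow_rad_s):
    """Height above flat terrain from translational optic flow:
    features directly below a nadir camera moving at v sweep past
    at omega = v / h, so h = v / omega (sketch of the geometry only)."""
    if flow_rad_s <= 0:
        raise ValueError("flow must be positive")
    return ground_speed_mps / flow_rad_s

# e.g. flying at 10 m/s with ground flow of 0.5 rad/s -> 20 m height
```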

  10. Assessing Impact of Dual Sensor Enhanced Flight Vision Systems on Departure Performance

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Timothy J.; Severance, Kurt; Bailey, Randall E.

    2016-01-01

Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS) may serve as game-changing technologies to meet the challenges of the Next Generation Air Transportation System and the envisioned Equivalent Visual Operations (EVO) concept - that is, the ability to achieve the safety and operational tempos of current-day Visual Flight Rules operations irrespective of the weather and visibility conditions. One significant obstacle lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A motion-base simulator experiment was conducted to evaluate the operational feasibility and pilot workload of conducting departures and approaches on runways without centerline lighting in visibility as low as 300 feet runway visual range (RVR) by use of onboard vision system technologies on a Head-Up Display (HUD) without need or reliance on natural vision. Twelve crews evaluated two methods of combining dual sensor (millimeter wave radar and forward looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs. In addition, the impact of adding SV to the dual sensor EFVS imagery on crew flight performance and workload was assessed. Using EFVS concepts during 300 RVR terminal operations on runways without centerline lighting appears feasible, as all EFVS concepts yielded departure and landing rollout performance equivalent to, or better than, that of operations flown with a conventional HUD to runways having centerline lighting, without any workload penalty. Adding SV imagery to EFVS concepts provided situation awareness improvements but no discernible improvements in flight path maintenance.

  11. FlyMAD: rapid thermogenetic control of neuronal activity in freely walking Drosophila.

    PubMed

    Bath, Daniel E; Stowers, John R; Hörmann, Dorothea; Poehlmann, Andreas; Dickson, Barry J; Straw, Andrew D

    2014-07-01

    Rapidly and selectively modulating the activity of defined neurons in unrestrained animals is a powerful approach in investigating the circuit mechanisms that shape behavior. In Drosophila melanogaster, temperature-sensitive silencers and activators are widely used to control the activities of genetically defined neuronal cell types. A limitation of these thermogenetic approaches, however, has been their poor temporal resolution. Here we introduce FlyMAD (the fly mind-altering device), which allows thermogenetic silencing or activation within seconds or even fractions of a second. Using computer vision, FlyMAD targets an infrared laser to freely walking flies. As a proof of principle, we demonstrated the rapid silencing and activation of neurons involved in locomotion, vision and courtship. The spatial resolution of the focused beam enabled preferential targeting of neurons in the brain or ventral nerve cord. Moreover, the high temporal resolution of FlyMAD allowed us to discover distinct timing relationships for two neuronal cell types previously linked to courtship song.

  12. A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena.

    PubMed

    Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu

    2015-01-01

The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for developing tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work makes contributions in three further aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis on the kinematics, the spatial structure, and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered evidence of a repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment.
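Frame-to-frame data association, the core step any such multi-target tracker must solve, can be caricatured with a greedy nearest-neighbour linker (a toy stand-in under simplifying assumptions; the paper's tracker handles occlusion, motion models, and many more targets):

```python
import numpy as np

def link_detections(prev_pts, curr_pts, max_dist):
    """Greedy nearest-neighbour association of 3D detections across two
    consecutive frames: accept the globally closest pairs first, reject
    links longer than max_dist. Returns {prev index: curr index}."""
    prev = np.asarray(prev_pts, float)
    curr = np.asarray(curr_pts, float)
    d = np.linalg.norm(prev[:, None, :] - curr[None, :, :], axis=2)
    links, used = {}, set()
    for i, j in sorted(((i, j) for i in range(len(prev))
                        for j in range(len(curr))), key=lambda ij: d[ij]):
        if i in links or j in used or d[i, j] > max_dist:
            continue  # already linked, or implausibly large jump
        links[i] = j
        used.add(j)
    return links

# Two flies whose detection order is swapped between frames:
pts0 = [(0, 0, 0), (5, 5, 5)]
pts1 = [(5.1, 5.0, 5.0), (0.2, 0.0, 0.1)]
# expected links: {0: 1, 1: 0}
```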

  13. Visual summation in night-flying sweat bees: a theoretical study.

    PubMed

    Theobald, Jamie Carroll; Greiner, Birgit; Wcislo, William T; Warrant, Eric J

    2006-07-01

Bees are predominantly diurnal; only a few groups fly at night. An evolutionary limitation that bees must overcome to inhabit dim environments is their eye type: bees possess apposition compound eyes, which are poorly suited to vision in dim light. Here, we theoretically examine how nocturnal bees Megalopta genalis fly at light levels usually reserved for insects bearing more sensitive superposition eyes. We find that neural summation should greatly increase M. genalis's visual reliability. Predicted spatial summation closely matches the morphology of lamina neurons believed to mediate such summation. Improved reliability costs acuity, but dark-adapted bees already suffer optical blurring, and summation further degrades vision only slightly.
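The summation trade-off has a simple statistical core: averaging N photoreceptor signals with independent noise improves signal-to-noise by roughly sqrt(N), at the cost of pooling over a larger visual angle (lower acuity). An illustrative simulation (a generic noise model, not the authors' photoreceptor model):

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(signal, noise_sd, n_pooled, trials=20000):
    """Empirical SNR of the mean of n_pooled receptors, each seeing
    `signal` plus independent Gaussian noise of sd noise_sd."""
    samples = signal + noise_sd * rng.standard_normal((trials, n_pooled))
    pooled = samples.mean(axis=1)
    return pooled.mean() / pooled.std()

# Pooling 4 receptors roughly doubles SNR (sqrt(4) = 2), which is the
# reliability-for-acuity exchange the abstract describes.
```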

  14. From wheels to wings with evolutionary spiking circuits.

    PubMed

    Floreano, Dario; Zufferey, Jean-Christophe; Nicoud, Jean-Daniel

    2005-01-01

    We give an overview of the EPFL indoor flying project, whose goal is to evolve neural controllers for autonomous, adaptive, indoor micro-flyers. Indoor flight is still a challenge because it requires miniaturization, energy efficiency, and control of nonlinear flight dynamics. This ongoing project consists of developing a flying, vision-based micro-robot, a bio-inspired controller composed of adaptive spiking neurons directly mapped into digital microcontrollers, and a method to evolve such a neural controller without human intervention. This article describes the motivation and methodology used to reach our goal as well as the results of a number of preliminary experiments on vision-based wheeled and flying robots.

  15. Nonlinear circuits for naturalistic visual motion estimation

    PubMed Central

    Fitzgerald, James E; Clark, Damon A

    2015-01-01

    Many animals use visual signals to estimate motion. Canonical models suppose that animals estimate motion by cross-correlating pairs of spatiotemporally separated visual signals, but recent experiments indicate that humans and flies perceive motion from higher-order correlations that signify motion in natural environments. Here we show how biologically plausible processing motifs in neural circuits could be tuned to extract this information. We emphasize how known aspects of Drosophila's visual circuitry could embody this tuning and predict fly behavior. We find that segregating motion signals into ON/OFF channels can enhance estimation accuracy by accounting for natural light/dark asymmetries. Furthermore, a diversity of inputs to motion detecting neurons can provide access to more complex higher-order correlations. Collectively, these results illustrate how non-canonical computations improve motion estimation with naturalistic inputs. This argues that the complexity of the fly's motion computations, implemented in its elaborate circuits, represents a valuable feature of its visual motion estimator. DOI: http://dx.doi.org/10.7554/eLife.09123.001 PMID:26499494
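For context, the canonical pairwise cross-correlation model that the abstract contrasts with higher-order-correlation detectors can be sketched as a minimal Hassenstein-Reichardt correlator (a textbook construction, not code from this paper):

```python
import numpy as np

def hassenstein_reichardt(stimulus, delay=1):
    """Minimal Hassenstein-Reichardt correlator: each adjacent pixel
    pair correlates one input with the delayed signal of its neighbour
    and subtracts the mirror-symmetric term.
    stimulus: 2D array indexed (time, space); positive output
    indicates rightward motion."""
    s = np.asarray(stimulus, float)
    delayed = np.roll(s, delay, axis=0)
    delayed[:delay] = 0.0  # no signal before stimulus onset
    return delayed[:, :-1] * s[:, 1:] - s[:, :-1] * delayed[:, 1:]

# A bright point stepping rightward one pixel per frame drives a net
# positive response:
t_steps, width = 6, 6
stim = np.zeros((t_steps, width))
for i in range(t_steps):
    stim[i, min(i, width - 1)] = 1.0
resp = hassenstein_reichardt(stim).sum()
```

The paper's point is that detectors of this pairwise form ignore the higher-order correlations present in natural scenes that flies and humans demonstrably exploit.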

  16. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture.

    PubMed

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-05-09

Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects is labor intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse counting method based on You Only Look Once (YOLO) object detection, and the classification and fine counting method based on Support Vector Machines (SVM) using global features, are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects including bee, fly, mosquito, moth, chafer and fruit fly are selected to assess the effectiveness of the system. Compared with the conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and the average classifying accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used for intelligent agriculture applications.
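The coarse-to-fine structure of the pipeline, detect and count everything first, then classify each detection from global features, can be sketched with toy stand-ins for the two stages (`detect` and `classify` below are hypothetical placeholders, not the paper's YOLO and SVM models):

```python
def detect(image):
    """Toy detector standing in for the YOLO stage: every nonzero
    cell of a 2D intensity grid counts as one insect detection."""
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > 0]

def classify(image, box):
    """Toy classifier standing in for the SVM stage: thresholds a
    single 'global feature' (here, raw intensity)."""
    return "bee" if image[box[0]][box[1]] >= 0.5 else "fruit fly"

def count_and_recognize(image):
    boxes = detect(image)               # coarse count = len(boxes)
    counts = {}
    for b in boxes:                     # fine count, per species
        label = classify(image, b)
        counts[label] = counts.get(label, 0) + 1
    return len(boxes), counts

# Toy sticky-trap image: three detections, two species
trap = [[0.0, 0.9, 0.0],
        [0.3, 0.0, 0.7]]
```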

  17. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture

    PubMed Central

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-01-01

    Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects is labor intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse counting method based on You Only Look Once (YOLO) object detection, the classification method and fine counting based on Support Vector Machines (SVM) using global features are designed. Finally, the insect counting and recognition system is implemented on Raspberry PI. Six species of flying insects including bee, fly, mosquito, moth, chafer and fruit fly are selected to assess the effectiveness of the system. Compared with the conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and average classifying accuracy is 90.18% on Raspberry PI. The proposed system is easy-to-use and provides efficient and accurate recognition data, therefore, it can be used for intelligent agriculture applications. PMID:29747429

  18. Modeling of the First Layers in the Fly's Eye

    NASA Technical Reports Server (NTRS)

    Moya, J. A.; Wilcox, M. J.; Donohoe, G. W.

    1997-01-01

    Increased autonomy of robots would yield significant advantages in the exploration of space. The shortfalls of computer vision can, however, pose significant limitations on a robot's potential. At the same time, simple insects which are largely hard-wired have effective visual systems. The understanding of insect vision systems thus may lead to improved approaches to visual tasks. A good starting point for the study of a vision system is its eye. In this paper, a model of the sensory portion of the fly's eye is presented. The effectiveness of the model is briefly addressed by a comparison of its performance to experimental data.

  19. A Study about the Taboo of Rotation Timing for the Flapping Wing Flight

    NASA Astrophysics Data System (ADS)

    Wang, An-Bang; Hsueh, Chia-Hsien; Chen, Shih-Shen

    2004-11-01

The influence of rotation timing on the lift of flapping-wing flight has been experimentally investigated in this study. Since insects cannot extend and retract their wings as birds do, the rotation timing of the wings becomes the major factor affecting the flying lift of flapping-wing flight. The results reveal that rotation timing has a significant influence on the flying lift. The averaged flying lift increases with higher wing rotation velocity. Based on comparisons of flying lift, too-late A-rotation (the transition from the wing's downward motion to its upward motion) is the most serious taboo for the motion design of micro air vehicles with flapping wings. Too-late B-rotation (the transition from upward motion to downward motion) should also be avoided.

  20. Hand-writing motion tracking with vision-inertial sensor fusion: calibration and error correction.

    PubMed

    Zhou, Shengli; Fei, Fei; Zhang, Guanglie; Liu, Yunhui; Li, Wen J

    2014-08-25

The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial sensor and vision sensor fusion. Due to the low sampling rates of web-based vision sensors and the accumulation of errors in inertial sensors, ego-motion tracking with vision sensors is commonly afflicted by slow update rates, while motion tracking with inertial sensors suffers from rapid deterioration in accuracy over time. This paper starts with a discussion of developed algorithms for calibrating two relative rotations of the system using only one reference image. Next, stochastic noises associated with the inertial sensor are identified using Allan variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial sensor and vision sensor fusion. Compared with results from conventional sensor fusion models, we have shown that ego-motion tracking can be greatly enhanced using the proposed error correction model.
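The Allan variance analysis the abstract mentions reduces, in its simplest non-overlapping form, to averaging the stream in clusters of m samples and taking half the mean squared difference of successive cluster means. A minimal sketch (generic white-noise check, not the authors' full noise identification procedure):

```python
import numpy as np

def allan_variance(samples, m):
    """Non-overlapping Allan variance at cluster size m
    (averaging time tau = m * dt for sample period dt)."""
    x = np.asarray(samples, float)
    n = len(x) // m
    means = x[: n * m].reshape(n, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

# For pure white noise of variance sigma^2, AVAR falls as sigma^2 / m,
# which appears as a -1 slope on a log-log Allan deviation plot; this is
# how different stochastic error terms are told apart.
rng = np.random.default_rng(1)
noise = rng.standard_normal(200_000)
```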

  1. The role of passive avian head stabilization in flapping flight

    PubMed Central

    Pete, Ashley E.; Kress, Daniel; Dimitrov, Marina A.; Lentink, David

    2015-01-01

    Birds improve vision by stabilizing head position relative to their surroundings, while their body is forced up and down during flapping flight. Stabilization is facilitated by compensatory motion of the sophisticated avian head–neck system. While relative head motion has been studied in stationary and walking birds, little is known about how birds accomplish head stabilization during flapping flight. To unravel this, we approximate the avian neck with a linear mass–spring–damper system for vertical displacements, analogous to proven head stabilization models for walking humans. We corroborate the model's dimensionless natural frequency and damping ratios from high-speed video recordings of whooper swans (Cygnus cygnus) flying over a lake. The data show that flap-induced body oscillations can be passively attenuated through the neck. We find that the passive model robustly attenuates large body oscillations, even in response to head mass and gust perturbations. Our proof of principle shows that bird-inspired drones with flapping wings could record better images with a swan-inspired passive camera suspension. PMID:26311316
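The passive attenuation argument can be illustrated with the standard transmissibility formula for a base-excited mass-spring-damper; the parameter values below are hypothetical, not the fitted swan values:

```python
import math

def transmissibility(freq_ratio, zeta):
    """Head/body amplitude ratio |H| for a base-excited linear
    mass-spring-damper neck model, where r = forcing (flapping)
    frequency over natural frequency and zeta is the damping ratio:
    |H| = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2))."""
    r, z = freq_ratio, zeta
    num = 1.0 + (2.0 * z * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * z * r) ** 2
    return math.sqrt(num / den)

# With hypothetical values (r = 3, zeta = 0.3) the head sees only about
# a quarter of the body's vertical oscillation; passive attenuation
# requires flapping well above the neck's natural frequency (r > sqrt(2)).
```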

  2. Miniature curved artificial compound eyes

    PubMed Central

    Floreano, Dario; Pericet-Camara, Ramon; Viollet, Stéphane; Ruffier, Franck; Brückner, Andreas; Leitel, Robert; Buss, Wolfgang; Menouni, Mohsine; Expert, Fabien; Juston, Raphaël; Dobrzynski, Michal Karol; L’Eplattenier, Geraud; Recktenwald, Fabian; Mallot, Hanspeter A.; Franceschini, Nicolas

    2013-01-01

    In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories. PMID:23690574

  3. A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena

    PubMed Central

    Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu

    2015-01-01

The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for developing tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work makes contributions in three further aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis on the kinematics, the spatial structure, and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered evidence of a repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment. PMID:26083385

  4. Target detection in insects: optical, neural and behavioral optimizations.

    PubMed

    Gonzalez-Bellido, Paloma T; Fabian, Samuel T; Nordström, Karin

    2016-12-01

Motion vision provides important cues for many tasks. Flying insects, for example, may pursue small, fast moving targets for mating or feeding purposes, even when these are detected against self-generated optic flow. Since insects are small, with size-constrained eyes and brains, they have evolved to optimize their optical, neural and behavioral target visualization solutions. Indeed, even if evolutionarily distant insects display different pursuit strategies, target neuron physiology is strikingly similar. Furthermore, the coarse spatial resolution of the insect compound eye might actually be beneficial when it comes to detection of moving targets. In conclusion, tiny insects show higher than expected performance in target visualization tasks. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Bio-inspired optical rotation sensor

    NASA Astrophysics Data System (ADS)

    O'Carroll, David C.; Shoemaker, Patrick A.; Brinkworth, Russell S. A.

    2007-01-01

Traditional approaches to calculating self-motion from visual information in artificial devices have generally relied on object identification and/or correlation of image sections between successive frames. Such calculations are computationally expensive, and real-time digital implementation requires powerful processors. In contrast, flies arrive at essentially the same outcome, the estimation of self-motion, in a much smaller package using vastly less power. Despite the potential advantages and a few notable successes, few neuromorphic analog VLSI devices based on biological vision have been employed in practical applications to date. This paper describes an analog VLSI (aVLSI) hardware implementation of our recently developed adaptive model for motion detection. The chip integrates motion over a linear array of local motion processors to give a single voltage output. Although the device lacks on-chip photodetectors, it includes bias circuits to use currents from external photodiodes, and we have integrated it with a ring array of 40 photodiodes to form a visual rotation sensor. The ring configuration reduces pattern noise and, combined with the pixel-wise adaptive characteristic of the underlying circuitry, permits a robust output that is proportional to image rotational velocity over a large range of speeds and is largely independent of either mean luminance or the spatial structure of the image viewed. In principle, such devices could be used as an element of a velocity-based servo to replace or augment inertial guidance systems in applications such as mUAVs.
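The ring integration idea can be caricatured in a few lines: correlation-type local motion units around a closed ring all see the same shift under pure image rotation, so their summed output tracks rotational velocity (a toy digital sketch of the principle, not the aVLSI circuit):

```python
import numpy as np

def ring_rotation_estimate(frames):
    """Sum delay-and-correlate local motion outputs around a closed
    ring of photodetectors. frames: 2D array (time, detector index);
    the sign of the result gives the rotation direction."""
    f = np.asarray(frames, float)
    prev, curr = f[:-1], f[1:]
    # correlate each detector's previous sample with its neighbour's
    # current sample, minus the mirror-symmetric term
    left = prev * np.roll(curr, -1, axis=1)
    right = curr * np.roll(prev, -1, axis=1)
    return (left - right).sum() / (len(f) - 1)

# A bright point rotating one detector per frame around an 8-element ring
n, steps = 8, 8
frames = np.array([[1.0 if j == (i % n) else 0.0 for j in range(n)]
                   for i in range(steps)])
```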

  6. Nocturnal Visual Orientation in Flying Insects: A Benchmark for the Design of Vision-Based Sensors in Micro-Aerial Vehicles

    DTIC Science & Technology

    2011-03-09

    Technical horizon sensors: Over the past few years, a remarkable proliferation of designs for micro-aerial vehicles (MAVs) has occurred... possible elevations, it may severely degrade the performance of sensors by local saturation. Therefore it is necessary to find a method whereby the effect...

  7. Stroboscopic Vision as a Treatment for Space Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Somers, Jeffrey T.; Ford, George; Krnavek, Jody M.

    2007-01-01

Results obtained from space flight indicate that most space crews will experience some symptoms of motion sickness, causing significant impact on the operational objectives that must be accomplished to assure mission success. Based on the initial work of Melvill Jones, we have evaluated stroboscopic vision as a method of preventing motion sickness. Given that the data presented by Professor Melvill Jones were primarily post hoc results following a study not designed to investigate motion sickness, it is unclear how motion sickness results were actually determined. Building on these original results, we undertook a three-part study that was designed to investigate the effect of stroboscopic vision (either with a strobe light or LCD shutter glasses) on motion sickness using: (1) visual field reversal, (2) reading while riding in a car (with or without external vision present), and (3) making large pitch head movements during parabolic flight.

  8. Looking Back and Looking Forward: Reprising the Promise and Predicting the Future of Formation Flying and Spaceborne GPS Navigation Systems

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Dennehy, Neil

    2015-01-01

    A retrospective consideration of two 15-year-old Guidance, Navigation and Control (GN&C) technology 'vision' predictions is the focus of this paper. It presents a look-back analysis and critique of these late-1990s technology roadmaps, which outlined the future vision for two then-nascent but rapidly emerging GN&C technologies: 1) multi-spacecraft formation flying and 2) the spaceborne use and exploitation of global positioning system (GPS) signals to enable formation flying. This paper reprises the promise of formation flying and spaceborne GPS as depicted in the cited 1999 and 1998 papers. It discusses what caused that promise to go mostly unfulfilled and the reasons why the envisioned formation flying dream has yet to become a reality. Technology trends over the past few years are then identified, and a renewed government interest in spacecraft formation flying/cluster flight is highlighted. The authors conclude with a reality-tempered perspective, 15 years after the initial technology roadmaps were published, predicting a promising future of spacecraft formation flying technology development over the next decade.

  9. Stroboscopic Vision as a Treatment for Retinal Slip Induced Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Somers, J. T.; Ford, G.; Krnavek, J. M.; Hwang, E. J.; Leigh, R. J.; Estrada, A.

    2007-01-01

    Motion sickness in the general population is a significant problem driven by increasingly sophisticated modes of transportation, visual displays, and virtual reality environments. It is important to investigate non-pharmacological alternatives for the prevention of motion sickness for individuals who cannot tolerate the available anti-motion sickness drugs, or who are precluded from medication because of different operational environments. Based on the initial work of Melvill Jones, in which post hoc results indicated that motion sickness symptoms were prevented during visual reversal testing when stroboscopic vision was used to prevent retinal slip, we have evaluated stroboscopic vision as a method of preventing motion sickness in a number of different environments. Specifically, we have undertaken a five-part study designed to investigate the effect of stroboscopic vision (either with a strobe light or LCD shutter glasses) on motion sickness while: (1) using visual field reversal, (2) reading while riding in a car (with or without external vision present), (3) making large pitch head movements during parabolic flight, (4) during exposure to rough seas in a small boat, and (5) seated and reading in the cabin area of a UH-60 Black Hawk helicopter during 20 min of provocative flight patterns.

  10. Control of a Quadcopter Aerial Robot Using Optic Flow Sensing

    NASA Astrophysics Data System (ADS)

    Hurd, Michael Brandon

    This thesis focuses on the motion control of a custom-built quadcopter aerial robot using optic flow sensing. Optic flow sensing is a vision-based approach that can give a robot the ability to fly in global positioning system (GPS)-denied environments, such as indoor environments. In this work, optic flow sensors are used to stabilize the motion of the quadcopter robot, where an optic flow algorithm provides odometry measurements to the quadcopter's central processing unit to monitor the flight heading. The optic flow sensor and algorithm are capable of gathering and processing images at 250 frames/sec, and the sensor package weighs 2.5 g with a footprint of 6 cm2. The odometry value from the optic flow sensor is then used as feedback information in a simple proportional-integral-derivative (PID) controller on the quadcopter. Experimental results are presented to demonstrate the effectiveness of using optic flow for controlling the motion of the quadcopter aerial robot. The technique presented herein can be applied to other types of aerial robotic systems or unmanned aerial vehicles (UAVs), as well as unmanned ground vehicles (UGVs).
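    The feedback arrangement described, optic-flow odometry closing a PID loop, can be sketched as follows. This is a hedged toy model (unit-mass point dynamics, illustrative gains, a 4 ms update period matching the 250 frames/sec sensor rate), not the controller actually implemented in the thesis.

    ```python
    class PID:
        """Minimal PID controller; gains here are illustrative assumptions."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = None

        def update(self, error):
            self.integral += error * self.dt
            # derivative of the error; zero on the very first sample
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    def hover_hold(steps=400, dt=0.004):   # 250 frames/sec -> one update every 4 ms
        """Return a displaced quadcopter (unit-mass point model) to its hover
        point, using integrated optic-flow displacement as the feedback signal."""
        pid = PID(kp=9.0, ki=0.0, kd=6.0, dt=dt)
        position, velocity = 1.0, 0.0      # start 1 unit away from the hover point
        for _ in range(steps):
            odometry = position            # optic flow integrates to displacement
            thrust = pid.update(0.0 - odometry)
            velocity += thrust * dt        # double-integrator plant
            position += velocity * dt
        return abs(position)

    residual = hover_hold()                # settles close to the hover point
    ```

    With these (critically damped) gains the displacement decays smoothly toward zero within the simulated 1.6 s, illustrating why a low-latency flow sensor makes such a simple controller sufficient.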

  11. Assessing Dual Sensor Enhanced Flight Vision Systems to Enable Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Timothy J.; Severance, Kurt; Bailey, Randall E.; Williams, Steven P.; Harrison, Stephanie J.

    2016-01-01

    Flight deck-based vision system technologies, such as Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS), may serve as revolutionary crew/vehicle interface technologies to meet the challenges of the Next Generation Air Transportation System Equivalent Visual Operations (EVO) concept - that is, the ability to achieve the safety of current-day Visual Flight Rules (VFR) operations and maintain the operational tempos of VFR irrespective of the weather and visibility conditions. One significant challenge lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A motion-base simulator experiment was conducted to evaluate the operational feasibility, pilot workload and pilot acceptability of conducting straight-in instrument approaches with published vertical guidance to landing, touchdown, and rollout to a safe taxi speed in visibility as low as 300 ft runway visual range by use of onboard vision system technologies on a Head-Up Display (HUD) without need or reliance on natural vision. Twelve crews evaluated two methods of combining dual sensor (millimeter wave radar and forward looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs as they made approaches to runways with and without touchdown zone and centerline lights. In addition, the impact of adding SV to the dual sensor EFVS imagery on crew flight performance, workload, and situation awareness during extremely low visibility approach and landing operations was assessed. Results indicate that all EFVS concepts flown resulted in excellent approach path tracking and touchdown performance without any workload penalty. Adding SV imagery to EFVS concepts provided situation awareness improvements but no discernible improvements in flight path maintenance.

  12. Global versus local adaptation in fly motion-sensitive neurons

    PubMed Central

    Neri, Peter; Laughlin, Simon B

    2005-01-01

    Flies, like humans, experience a well-known consequence of adaptation to visual motion, the waterfall illusion. Direction-selective neurons in the fly lobula plate permit a detailed analysis of the mechanisms responsible for motion adaptation and their function. Most of these neurons are spatially non-opponent: they sum responses to motion in the preferred direction across their entire receptive field, and adaptation depresses responses by subtraction and by reducing contrast gain. When we adapted a small area of the receptive field to motion in its anti-preferred direction, we discovered that directional gain at unadapted regions was enhanced. This novel phenomenon shows that neuronal responses to the direction of stimulation in one area of the receptive field are dynamically adjusted to the history of stimulation both within and outside that area. PMID:16191636

  13. Hand-Writing Motion Tracking with Vision-Inertial Sensor Fusion: Calibration and Error Correction

    PubMed Central

    Zhou, Shengli; Fei, Fei; Zhang, Guanglie; Liu, Yunhui; Li, Wen J.

    2014-01-01

    The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial sensor and vision sensor fusion. Due to the low sampling rates supported by web-based vision sensors and the accumulation of errors in inertial sensors, ego-motion tracking with vision sensors is commonly afflicted by slow update rates, while motion tracking with inertial sensors suffers from rapid deterioration in accuracy over time. This paper starts with a discussion of the algorithms developed for calibrating two relative rotations of the system using only one reference image. Next, stochastic noises associated with the inertial sensor are identified using Allan variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial sensor and vision sensor fusion. Compared with results from conventional sensor fusion models, we show that ego-motion tracking can be greatly enhanced using the proposed error correction model. PMID:25157546
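    The benefit of this kind of fusion can be illustrated with a toy 1-D stand-in for the paper's EKF: fast inertial dead reckoning drifts under an unmodelled accelerometer bias, while sparse vision fixes applied through a fixed-gain correction keep the fused estimate bounded. All noise levels, rates, and gains below are illustrative assumptions, not values from the study.

    ```python
    import random

    def run(steps=2000, dt=0.005, fix_every=40, gain=0.2, seed=1):
        """Compare raw inertial dead reckoning against vision-corrected fusion
        for a stationary target (true position fixed at zero)."""
        rng = random.Random(seed)
        bias = 0.05                                   # unmodelled accelerometer bias
        true_pos = 0.0                                # target does not move
        dr_pos = dr_vel = 0.0                         # dead reckoning only
        fu_pos = fu_vel = 0.0                         # fused estimate
        for k in range(steps):
            meas = bias + rng.gauss(0.0, 0.02)        # noisy accelerometer sample
            dr_vel += meas * dt; dr_pos += dr_vel * dt
            fu_vel += meas * dt; fu_pos += fu_vel * dt
            if k % fix_every == 0:                    # sparse, low-rate vision fix
                vision = true_pos + rng.gauss(0.0, 0.01)
                innov = vision - fu_pos
                fu_pos += gain * innov                # correct position...
                fu_vel += gain * innov / (fix_every * dt)  # ...and velocity drift
        return abs(dr_pos - true_pos), abs(fu_pos - true_pos)

    drift_error, fused_error = run()
    ```

    The fixed-gain correction plays the role the Kalman gain would in the paper's EKF; even at a vision rate 40 times slower than the inertial rate, the fused error stays far below the pure dead-reckoning drift.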

  14. Haltere mechanosensory influence on tethered flight behavior in Drosophila.

    PubMed

    Mureli, Shwetha; Fox, Jessica L

    2015-08-01

    In flies, mechanosensory information from modified hindwings known as halteres is combined with visual information for wing-steering behavior. Haltere input is necessary for free flight, making it difficult to study the effects of haltere ablation under natural flight conditions. We thus used tethered Drosophila melanogaster flies to examine the relationship between halteres and the visual system, using wide-field motion or moving figures as visual stimuli. Haltere input was altered by surgically decreasing its mass, or by removing it entirely. Haltere removal does not affect the flies' ability to flap or steer their wings, but it does increase the temporal frequency at which they modify their wingbeat amplitude. Reducing the haltere mass decreases the optomotor reflex response to wide-field motion, and removing the haltere entirely does not further decrease the response. Decreasing the mass does not attenuate the response to figure motion, but removing the entire haltere does attenuate the response. When flies are allowed to control a visual stimulus in closed-loop conditions, haltereless flies fixate figures with the same acuity as intact flies, but cannot stabilize a wide-field stimulus as accurately as intact flies can. These manipulations suggest that the haltere mass is influential in wide-field stabilization, but less so in figure tracking. In both figure and wide-field experiments, we observe responses to visual motion with and without halteres, indicating that during tethered flight, intact halteres are not strictly necessary for visually guided wing-steering responses. However, the haltere feedback loop may operate in a context-dependent way to modulate responses to visual motion. © 2015. Published by The Company of Biologists Ltd.

  15. A Height Estimation Approach for Terrain Following Flights from Monocular Vision

    PubMed Central

    Campos, Igor S. G.; Nascimento, Erickson R.; Freitas, Gustavo M.; Chaimowicz, Luiz

    2016-01-01

    In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it remains unresolved for consumer-available systems. Virtually every mapping aircraft carries a camera; we therefore exploit this, using presently available hardware to extract the height information needed for terrain following flights. The proposed methodology uses optical flow to track features from videos obtained by the UAV, together with its motion information, to estimate the flying height. To determine whether a height estimate is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm showed good accuracy. PMID:27929424
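    The geometric core of such estimators is simple: for a downward-looking camera in level flight over flat terrain, a ground feature directly below sweeps past at ground_speed / height radians per second, so height follows from the known speed and the measured flow rate. A minimal sketch under those assumptions (function name and numbers are illustrative, not from the paper):

    ```python
    def height_from_flow(ground_speed, flow_rate):
        """Height above terrain from translational optic flow.

        Assumes level flight over flat ground with a downward-looking camera:
        flow_rate = ground_speed / height (rad/s), so height = speed / flow.
        """
        if flow_rate <= 0:
            raise ValueError("flow rate must be positive")
        return ground_speed / flow_rate

    # 12 m/s ground speed with features flowing at 0.8 rad/s -> 15 m altitude
    h = height_from_flow(12.0, 0.8)
    ```

    The paper's decision-tree step addresses exactly the cases where these assumptions break (weak texture, unreliable flow), gating which estimates are trusted.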

  16. Research on an autonomous vision-guided helicopter

    NASA Technical Reports Server (NTRS)

    Amidi, Omead; Mesaki, Yuji; Kanade, Takeo

    1994-01-01

    Integration of computer vision with on-board sensors to autonomously fly helicopters was researched. The key components developed were custom designed vision processing hardware and an indoor testbed. The custom designed hardware provided flexible integration of on-board sensors with real-time image processing resulting in a significant improvement in vision-based state estimation. The indoor testbed provided convenient calibrated experimentation in constructing real autonomous systems.

  17. Riemann tensor of motion vision revisited.

    PubMed

    Brill, M

    2001-07-02

    This note shows that the Riemann-space interpretation of motion vision developed by Barth and Watson is neither necessary for their results, nor sufficient to handle an intrinsic coordinate problem. Recasting the Barth-Watson framework as a classical velocity-solver (as in computer vision) solves these problems.
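    The classical velocity-solver the note appeals to can be sketched directly: each image location contributes a brightness-constancy constraint Ix*u + Iy*v + It = 0, and the patch's flow is the least-squares solution of the stack of constraints (the Lucas-Kanade-style formulation). The code below uses synthetic gradient triples and an illustrative function name; it is a generic stand-in, not Brill's or Barth and Watson's specific formulation.

    ```python
    def velocity_solver(gradients):
        """Least-squares flow (u, v) from (Ix, Iy, It) triples over a patch.

        Solves the normal equations of Ix*u + Iy*v = -It by Cramer's rule.
        """
        sxx = sxy = syy = sxt = syt = 0.0
        for ix, iy, it in gradients:
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
        det = sxx * syy - sxy * sxy
        if abs(det) < 1e-12:
            raise ValueError("aperture problem: patch gradients are degenerate")
        u = (-syy * sxt + sxy * syt) / det
        v = (sxy * sxt - sxx * syt) / det
        return u, v

    # gradients consistent with a true flow of (0.5, -0.25): it = -(ix*u + iy*v)
    g = [(1.0, 0.0, -0.5), (0.0, 1.0, 0.25), (1.0, 1.0, -0.25)]
    u, v = velocity_solver(g)   # recovers (0.5, -0.25)
    ```

    The degenerate-determinant branch is where the intrinsic coordinate problem shows up in practice: a patch with gradients along a single direction cannot constrain both components of velocity.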

  18. Increasing Reliability with Wireless Instrumentation Systems from Space Shuttle to 'Fly-By-Wireless'

    NASA Technical Reports Server (NTRS)

    Studor, George

    2004-01-01

    This slide presentation discusses some of the requirements to allow for "Fly by Wireless". Included in the discussion are: a review of new technologies by decades starting with the 1930's and going through the current decade, structural health monitoring, the requisite system designs, and the vision of flying by wireless.

  19. Fusion of Synthetic and Enhanced Vision for All-Weather Commercial Aviation Operations

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence, III

    2007-01-01

    NASA is developing revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next-generation air transportation system. A piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck during low visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions was not adversely impacted by the display concepts, although the addition of Enhanced Vision did not, of itself, provide an improvement in runway incursion detection.

  20. Ultra-Rapid Vision in Birds.

    PubMed

    Boström, Jannika E; Dimitrova, Marina; Canton, Cindy; Håstad, Olle; Qvarnström, Anna; Ödeen, Anders

    2016-01-01

    Flying animals need to accurately detect, identify and track fast-moving objects and these behavioral requirements are likely to strongly select for abilities to resolve visual detail in time. However, evidence of highly elevated temporal acuity relative to non-flying animals has so far been confined to insects while it has been missing in birds. With behavioral experiments on three wild passerine species, blue tits, collared and pied flycatchers, we demonstrate temporal acuities of vision far exceeding predictions based on the sizes and metabolic rates of these birds. This implies a history of strong natural selection on temporal resolution. These birds can resolve alternating light-dark cycles at up to 145 Hz (average: 129, 127 and 137, respectively), which is ca. 50 Hz over the highest frequency shown in any other vertebrate. We argue that rapid vision should confer a selective advantage in many bird species that are ecologically similar to the three species examined in our study. Thus, rapid vision may be a more typical avian trait than the famously sharp vision found in birds of prey.

  1. Motion Systems Role in Flight Simulators for Flying Training. Final Report for Period June 1977-June 1978.

    ERIC Educational Resources Information Center

    Cyrus, Michael L.

    This report reviews the literature as it relates to the use of platform motion systems in flight simulators for flying training. Motion is discussed in terms of its effect on compensatory, pursuit, and precognitive tasks, within both the simulator and transfer contexts. Although both skilled and unskilled behaviors are addressed, the former are…

  2. Binocular Interactions Underlying the Classic Optomotor Responses of Flying Flies

    PubMed Central

    Duistermars, Brian J.; Care, Rachel A.; Frye, Mark A.

    2012-01-01

    In response to imposed course deviations, the optomotor reactions of animals reduce motion blur and facilitate the maintenance of stable body posture. In flies, many anatomical and electrophysiological studies suggest that disparate motion cues stimulating the left and right eyes are not processed in isolation but rather are integrated in the brain to produce a cohesive panoramic percept. To investigate the strength of such inter-ocular interactions and their role in compensatory sensory–motor transformations, we utilize a virtual reality flight simulator to record wing and head optomotor reactions by tethered flying flies in response to imposed binocular rotation and monocular front-to-back and back-to-front motion. Within a narrow range of stimulus parameters that generates large contrast insensitive optomotor responses to binocular rotation, we find that responses to monocular front-to-back motion are larger than those to panoramic rotation, but are contrast sensitive. Conversely, responses to monocular back-to-front motion are slower than those to rotation and peak at the lowest tested contrast. Together our results suggest that optomotor responses to binocular rotation result from the influence of non-additive contralateral inhibitory as well as excitatory circuit interactions that serve to confer contrast insensitivity to flight behaviors influenced by rotatory optic flow. PMID:22375108

  3. Ultraviolet vision may be widespread in bats

    USGS Publications Warehouse

    Gorresen, P. Marcos; Cryan, Paul; Dalton, David C.; Wolf, Sandy; Bonaccorso, Frank

    2015-01-01

    Insectivorous bats are well known for their abilities to find and pursue flying insect prey at close range using echolocation, but they also rely heavily on vision. For example, at night bats use vision to orient across landscapes, avoid large obstacles, and locate roosts. Although lacking sharp visual acuity, the eyes of bats evolved to function at very low levels of illumination. Recent evidence based on genetics, immunohistochemistry, and laboratory behavioral trials indicated that many bats can see ultraviolet light (UV), at least at illumination levels similar to or brighter than those before twilight. Despite this growing evidence for potentially widespread UV vision in bats, the prevalence of UV vision among bats remains unknown and has not been studied outside of the laboratory. We used a Y-maze to test whether wild-caught bats could see reflected UV light and whether such UV vision functions at the dim lighting conditions typically experienced by night-flying bats. Seven insectivorous species of bats, representing five genera and three families, showed a statistically significant ‘escape-toward-the-light’ behavior when placed in the Y-maze. Our results provide compelling evidence of widespread dim-light UV vision in bats.

  4. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  5. Exploring Insect Vision

    ERIC Educational Resources Information Center

    Damonte, Kathleen

    2005-01-01

    A fly is buzzing around in the kitchen. You sneak up on it with a flyswatter, but just as you get close to it, it flies away. What makes flies and other insects so good at escaping from danger? The fact that insects have eyesight that can easily detect moving objects is one of the things that help them survive. In this month's Science Shorts,…

  6. Conditioning to colors: a population assay for visual learning in Drosophila.

    PubMed

    van Swinderen, Bruno

    2011-11-01

    Vision is a major sensory modality in Drosophila behavior, with more than one-half of the Drosophila brain devoted to visual processing. The mechanisms of vision in Drosophila can be studied in individuals and in populations of flies by using various paradigms. Although there has never been a widely used population assay for visual learning in Drosophila, some population paradigms have shown significant visual learning. These studies use colors as conditioned stimuli (CS) and shaking as the unconditioned stimulus (US). A simple version of the paradigm, conditioning to colors using a shaking device, is described here. A conditioning chamber, called a crab, is designed to center the flies after shaking by having them tumble down to the lowest point between joined glass tubes forming a V. Thus, vibration should be just strong enough to center most flies. After shaking, flies display a geotactic response and climb up either side of the V, and their choice of which side to climb is influenced by color displays on either side. The proportion of flies on either side determines the flies' natural preference or their learned avoidance of a color associated with shaking.

  7. The aerodynamics of free-flight maneuvers in Drosophila.

    PubMed

    Fry, Steven N; Sayaman, Rosalyn; Dickinson, Michael H

    2003-04-18

    Using three-dimensional infrared high-speed video, we captured the wing and body kinematics of free-flying fruit flies as they performed rapid flight maneuvers. We then "replayed" the wing kinematics on a dynamically scaled robotic model to measure the aerodynamic forces produced by the wings. The results show that a fly generates rapid turns with surprisingly subtle modifications in wing motion, which nonetheless generate sufficient torque for the fly to rotate its body through each turn. The magnitude and time course of the torque and body motion during rapid turns indicate that inertia, not friction, dominates the flight dynamics of insects.

  8. A computer program for an analysis of the relative motion of a space station and a free flying experiment module

    NASA Technical Reports Server (NTRS)

    Butler, J. H.

    1971-01-01

    A preliminary analysis of the relative motion of a free flying experiment module in the vicinity of a space station under the perturbative effects of drag and earth oblateness was made. A listing of a computer program developed for determining the relative motion of a module utilizing the Cowell procedure is presented, as well as instructions for its use.

  9. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the obtained data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among existing systems for 3D data acquisition based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages, such as high acquisition speed and the potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from two to four machine vision cameras for capturing video sequences of object motion. Original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms has been developed and tested, both for detecting, identifying and tracking similar targets and for marker-less object motion capture. The evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  10. Perception of linear horizontal self-motion induced by peripheral vision /linearvection/ - Basic characteristics and visual-vestibular interactions

    NASA Technical Reports Server (NTRS)

    Berthoz, A.; Pavard, B.; Young, L. R.

    1975-01-01

    The basic characteristics of the sensation of linear horizontal motion have been studied. Objective linear motion was induced by means of a moving cart. Visually induced linear motion perception (linearvection) was obtained by projecting moving images at the periphery of the visual field. Image velocity and luminance thresholds for the appearance of linearvection have been measured and are in the range of those for image motion detection (without sensation of self-motion) by the visual system. Latencies of onset are around 1 s, and short-term adaptation has been shown. The dynamic range of the visual analyzer, as judged by frequency analysis, is lower than that of the vestibular analyzer. Conflicting situations in which visual cues contradict vestibular and other proprioceptive cues show, in the case of linearvection, a dominance of vision, which supports the idea of an essential although not independent role of vision in self-motion perception.

  11. Three-Dimensional Motion Estimation Using Shading Information in Multiple Frames

    DTIC Science & Technology

    1989-09-01

    Three-Dimensional Motion Estimation Using Shading Information in Multiple Frames. Jean-Pierre Schott, MIT Artificial Intelligence... vision, 3-D structure, 3-D vision, shape from shading, multiple frames... motion and shading have been treated as two disjoint problems. On the one hand, researchers studying motion or structure from motion often assume

  12. Evaluation of Fused Synthetic and Enhanced Vision Display Concepts for Low-Visibility Approach and Landing

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence J., III; Wilz, Susan J.

    2009-01-01

    NASA is developing revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next generation air transportation system. A piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck during low-visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. Improvements in lateral path control performance were realized when the Head-Up Display concepts included a tunnel, independent of the imagery (enhanced vision or fusion of enhanced and synthetic vision) presented with it. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions was neither improved nor adversely impacted by the display concepts. The addition of Enhanced Vision may not, of itself, provide an improvement in runway incursion detection without being specifically tailored for this application.

  13. Optic flow-based collision-free strategies: From insects to robots.

    PubMed

    Serres, Julien R; Ruffier, Franck

    2017-09-01

    Flying insects are able to fly smartly in unpredictable environments. Flying insects have been found to have neurons inside their tiny brains that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff or landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw, respectively (i.e., they cancel rotational optic flow), in order to ensure pure translational optic flow between two successive saccades. Our survey focuses on the feedback loops using translational optic flow that insects employ for collision-free navigation. Optic flow is likely, over the next decade, to be one of the most important visual cues for explaining flying insects' behaviors during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
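    The speed-to-distance ratio described above is enough to reproduce classic tunnel-centering behavior in a few lines. The following sketch balances the translational optic flow seen on either side of a tunnel; the gain, geometry, and discrete update rule are illustrative assumptions, not a model taken from the survey.

    ```python
    def lateral_flows(speed, y, tunnel_width):
        """Translational optic flow (rad/s) on each side of a tunnel for an
        agent at lateral offset y from the centreline: flow = speed / distance."""
        left = speed / (tunnel_width / 2 + y)
        right = speed / (tunnel_width / 2 - y)
        return left, right

    def centering_step(speed, y, tunnel_width, gain=0.05):
        """Steer toward the side with weaker flow: balancing the two lateral
        flows drives the lateral offset back to zero."""
        left, right = lateral_flows(speed, y, tunnel_width)
        return y + gain * (left - right)   # flow difference is the error signal

    y = 0.3                                # start 0.3 m off-centre in a 1 m tunnel
    for _ in range(200):
        y = centering_step(1.0, y, 1.0)
    ```

    Note that neither speed nor wall distance is ever measured directly; only the two flow magnitudes are compared, which is exactly what makes translational optic flow attractive for lightweight collision-free navigation.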

  14. An Analysis of Helicopter Pilot Scan Techniques While Flying at Low Altitudes and High Speed

    DTIC Science & Technology

    2012-09-01

    Manager; SV, Synthetic Vision; TFH, Total Flight Hours; TOFT, Tactical Operational Flight Trainer; VFR, Visual Flight Rules; VMC, Visual Meteorological... Crognale, 2008). Recently, the use of synthetic vision (SV) and a heads-up display (HUD) has been a topic of discussion in the aviation community... Synthetic vision uses external cameras to provide the pilot with an enhanced view of the outside world, usually with the assistance of night vision

  15. Peripheral Vision Horizon Display (PVHD). Corrected Copy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    A Canadian invention, the peripheral vision horizon display (PVHD), shows promise in alleviating vertigo or disorientation in pilots flying under instrument conditions and in easing the piloting task when flying in weather or other conditions requiring close attention to aircraft attitude instruments. A diversity of research and applied work was being done to investigate and validate the benefits of the PVHD during the years immediately preceding this conference. Organizers of the conference were able to assemble a group of outstanding presenters representing the academic, industrial, and military communities. The theoretical foundation and applied use of the PVHD are discussed, and results from operational tests are presented.

  16. A sublethal dose of a neonicotinoid insecticide disrupts visual processing and collision avoidance behaviour in Locusta migratoria.

    PubMed

    Parkinson, Rachel H; Little, Jacelyn M; Gray, John R

    2017-04-20

    Neonicotinoids are known to affect insect navigation and vision; however, the mechanisms of these effects are not fully understood. A visual motion-sensitive neuron in the locust, the Descending Contralateral Movement Detector (DCMD), integrates visual information and is involved in eliciting escape behaviours. The DCMD receives coded input from the compound eyes and monosynaptically excites motorneurons involved in flight and jumping. We show that imidacloprid (IMD) impairs neural responses to visual stimuli at sublethal concentrations, and that these effects are sustained two and twenty-four hours after treatment. Most significantly, IMD disrupted bursting, a coding property important for motion detection. Specifically, IMD reduced the DCMD peak firing rate within bursts at ecologically relevant doses of 10 ng/g (ng IMD per g locust body weight). Effects on DCMD firing translate to deficits in collision avoidance behaviours: exposure to 10 ng/g IMD attenuates escape manoeuvres, while 100 ng/g IMD abolishes the ability to fly and walk. We show that, at ecologically relevant doses, IMD causes significant and lasting impairment of an important pathway involved in visual sensory coding and escape behaviours. These results show, for the first time, that a neonicotinoid pesticide directly impairs an important, taxonomically conserved, motion-sensitive visual network.

  17. Introductory review on `Flying Triangulation': a motion-robust optical 3D measurement principle

    NASA Astrophysics Data System (ADS)

    Ettl, Svenja

    2015-04-01

    'Flying Triangulation' (FlyTri) is a recently developed principle that allows motion-robust optical 3D measurement of rough surfaces. It combines a simple sensor with sophisticated algorithms: a single-shot sensor acquires 2D camera images, and from each camera image a 3D profile is generated. The resulting series of 3D profiles is aligned by algorithms alone, without relying on any external tracking device. The principle delivers real-time feedback of the measurement process, which enables an all-around measurement of objects. It has great potential for small-space acquisition environments, such as measuring the interior of a car, and for motion-sensitive measurement tasks, such as the intraoral measurement of teeth. This article gives an overview of the basic ideas and applications of FlyTri. The main challenges and their solutions are discussed. Measurement examples are also given to demonstrate the potential of the measurement principle.

  18. Pilot Fullerton examines SE-81-8 Insect Flight Motion Study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Pilot Fullerton examines Student Experiment 81-8 (SE-81-8) Insect Flight Motion Study taped to the airlock on aft middeck. Todd Nelson, a high school senior from Minnesota, won a national contest to fly his experiment on this particular flight. Moths, flies, and bees were studied in the near weightless environment.

  19. Loads and motions of an F-106B flying through thunderstorms

    NASA Technical Reports Server (NTRS)

    Winebarger, R. M.

    1986-01-01

    Data are presented on the loads and motions of a NASA F-106B airplane flying inside thunderstorms. No significant differences in piloting techniques were observed among the three pilots involved. The data indicate that airliners in normal operations occasionally encounter turbulence almost as severe as that encountered in these thunderstorm flights.

  20. Using parallel evolutionary development for a biologically-inspired computer vision system for mobile robots.

    PubMed

    Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J

    2005-01-01

    We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.

  1. Helicopter pilot estimation of self-altitude in a degraded visual environment

    NASA Astrophysics Data System (ADS)

    Crowley, John S.; Haworth, Loran A.; Szoboszlay, Zoltan P.; Lee, Alan G.

    2000-06-01

    The effect of night vision devices and degraded visual imagery on self-altitude perception is unknown. Thirteen Army aviators with normal vision flew five flights under various visual conditions in a modified AH-1 (Cobra) helicopter. Subjects estimated their altitude or flew to specified altitudes while performing a series of maneuvers. The results showed that subjects were better at detecting and controlling changes in altitude than at flying to or naming a specific altitude. In cruise flight and descent, the subjects tended to fly above the desired altitude, an error in the safe direction. While hovering, the direction of error was less predictable. In the low-level cruise flight scenario tested in this study, altitude perception was affected more by changes in image resolution than by changes in FOV or ocularity.

  2. Automatic human body modeling for vision-based motion capture system using B-spline parameterization of the silhouette

    NASA Astrophysics Data System (ADS)

    Jaume-i-Capó, Antoni; Varona, Javier; González-Hidalgo, Manuel; Mas, Ramon; Perales, Francisco J.

    2012-02-01

    Human motion capture has a wide variety of applications, and in vision-based motion capture systems a major issue is the human body model and its initialization. We present a computer vision algorithm for building a human body model skeleton in an automatic way. The algorithm is based on the analysis of the human shape. We decompose the body into its main parts by computing the curvature of a B-spline parameterization of the human contour. This algorithm has been applied in a context where the user is standing in front of a camera stereo pair. The process is completed after the user assumes a predefined initial posture so as to identify the main joints and construct the human model. Using this model, the initialization problem of a vision-based markerless motion capture system of the human body is solved.
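
The curvature analysis at the heart of this body decomposition can be sketched numerically for any sampled contour; this is a minimal illustrative version (numerical derivatives in place of the authors' B-spline machinery), not their implementation:

```python
import numpy as np

def contour_curvature(x, y):
    """Signed curvature of a sampled parametric contour (x(t), y(t)).

    kappa = (x' y'' - y' x'') / (x'^2 + y'^2)^(3/2)

    Derivatives are taken numerically with np.gradient; local extrema
    of |kappa| would mark candidate split points on a human silhouette.
    """
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Sanity check on a known shape: a circle of radius r has |kappa| = 1/r.
t = np.linspace(0.0, 2.0 * np.pi, 400)
kappa = contour_curvature(5.0 * np.cos(t), 5.0 * np.sin(t))
```

The formula is invariant to how the contour is parameterized, so uniform index spacing suffices; on a real silhouette one would smooth the contour (e.g. with a B-spline fit, as the authors do) before differentiating.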

  3. Figure–ground discrimination behavior in Drosophila. I. Spatial organization of wing-steering responses

    PubMed Central

    Fox, Jessica L.; Aptekar, Jacob W.; Zolotova, Nadezhda M.; Shoemaker, Patrick A.; Frye, Mark A.

    2014-01-01

    The behavioral algorithms and neural subsystems for visual figure–ground discrimination are not sufficiently described in any model system. The fly visual system shares structural and functional similarity with that of vertebrates and, like vertebrates, flies robustly track visual figures in the face of ground motion. This computation is crucial for animals that pursue salient objects under the high performance requirements imposed by flight behavior. Flies smoothly track small objects and use wide-field optic flow to maintain flight-stabilizing optomotor reflexes. The spatial and temporal properties of visual figure tracking and wide-field stabilization have been characterized in flies, but how the two systems interact spatially to allow flies to actively track figures against a moving ground has not. We took a systems identification approach in flying Drosophila and measured wing-steering responses to velocity impulses of figure and ground motion independently. We constructed a spatiotemporal action field (STAF) – the behavioral analog of a spatiotemporal receptive field – revealing how the behavioral impulse responses to figure tracking and concurrent ground stabilization vary for figure motion centered at each location across the visual azimuth. The figure tracking and ground stabilization STAFs show distinct spatial tuning and temporal dynamics, confirming the independence of the two systems. When the figure tracking system is activated by a narrow vertical bar moving within the frontal field of view, ground motion is essentially ignored despite comprising over 90% of the total visual input. PMID:24198267

  4. Vision-Mediated exploitation of a novel host plant by a tephritid fruit fly

    USDA-ARS?s Scientific Manuscript database

    Shortly after its introduction into the Hawaiian Islands around 1895, the polyphagous, invasive fruit fly Bactrocera cucurbitae (Coquillett)(Diptera:Tephritidae) was provided the opportunity to expand its host range to include a novel host, papaya (Carica papaya). It has been documented that female ...

  5. The Role of Vision and Mechanosensation in Insect Flight Control

    DTIC Science & Technology

    2012-01-01

    intensity. We used bumblebees (Bombus terrestris), honeybees ( Apis mellifera ), the common wasp (Vespa vulgaris), hornets (Vespa crabro) flies (Mousca...bees ( Apis mellifera L.). J. Exp. Biol. 209, 978-984. Beyeler, A., Zufferey, J.-C. and Floreano, D. (2009). Vision-based control of near- obstacle

  6. Measurement of three-dimensional posture and trajectory of lower body during standing long jumping utilizing body-mounted sensors.

    PubMed

    Ibata, Yuki; Kitamura, Seiji; Motoi, Kosuke; Sagawa, Koichi

    2013-01-01

    A method for measuring the three-dimensional posture and flying trajectory of the lower body during jumping motion, using body-mounted wireless inertial measurement units (WIMUs), is introduced. Each WIMU is composed of two kinds of three-dimensional (3D) accelerometers and gyroscopes with different dynamic ranges, plus one 3D geomagnetic sensor, to accommodate quick movement. Three WIMUs are mounted under the chest, right thigh, and right shank. Thin-film pressure sensors are connected to the shank WIMU and installed under the right heel and tiptoe to distinguish whether the body is grounded or airborne. Initial and final postures of the trunk, thigh, and shank at standing-still are obtained from gravitational acceleration and geomagnetism. The posture of the body is determined from the 3D direction of each segment, updated by numerical integration of the angular velocity. Flying motion is detected from the pressure sensors, and the 3D flying trajectory is derived by double integration of the trunk acceleration, applying the 3D velocity of the trunk at takeoff. Standing long jump experiments are performed, and the results show that the joint angles and flying trajectory agree with the actual motion measured by an optical motion capture system.
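
The trajectory step, double integration of trunk acceleration seeded with the takeoff velocity, can be sketched as follows; this is a minimal rectangle-rule version assuming gravity-compensated world-frame acceleration (names are illustrative, not from the paper):

```python
import numpy as np

def flying_trajectory(acc_world, v_takeoff, dt):
    """Integrate world-frame acceleration (N x 3, gravity already
    removed) twice to obtain the 3D flight path, starting from the
    takeoff velocity estimated just before the pressure sensors
    signal lift-off.
    """
    acc = np.asarray(acc_world, dtype=float)
    vel = v_takeoff + np.cumsum(acc, axis=0) * dt   # first integration
    pos = np.cumsum(vel, axis=0) * dt               # second integration
    return pos
```

Because integration error grows quadratically with time, this only works over a short airborne phase, which is precisely why the pressure sensors are used to bound the integration window.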

  7. Vision for the Future of the US National Strong-Motion Program

    USGS Publications Warehouse

    ,

    1997-01-01

    This document provides the requested vision for the future of the National Strong-Motion Program operated by the US Geological Survey. Options for operation of the program are presented in a companion document. Each of the three major charges of the EHRP program council pertaining to the vision document is addressed here. The 'Vision Summary', through a series of answers to specific questions, is intended to provide a complete synopsis of the committee's response to the program council charges. The vision for the future of the NSMP is presented as section III of the summary. Analysis and detailed discussion supporting the answers in the summary are presented in sections organized according to the charges of the program council. The mission for the program is adopted from that developed at the national workshop entitled 'Research Needs for Strong Motion Data to Support Earthquake Engineering', sponsored by the National Science Foundation.

  8. Aerodynamic performance of two-dimensional, chordwise flexible flapping wings at fruit fly scale in hover flight.

    PubMed

    Sridhar, Madhu; Kang, Chang-kwon

    2015-05-06

    Fruit flies have flexible wings that deform during flight. To explore the fluid-structure interaction of flexible flapping wings at fruit fly scale, we use a well-validated Navier-Stokes equation solver, fully coupled with a structural dynamics solver. The effects of chordwise flexibility on a two-dimensional hovering wing are studied. The resulting wing rotation is purely passive, due to the dynamic balance between aerodynamic loading, elastic restoring force, and inertial force of the wing. Hover flight is considered at a Reynolds number of Re = 100, equivalent to that of fruit flies. The thickness and density of the wing also correspond to those of a fruit fly wing. The wing stiffness and motion amplitude are varied to assess their influences on the resulting aerodynamic performance and structural response. The highest lift coefficient of 3.3 was obtained for the lowest-amplitude, highest-frequency motion (reduced frequency of 3.0) at the lowest-stiffness wing (frequency ratio of 0.7) within the range of the current study, although the corresponding power required was also the highest. Optimal efficiency was achieved for a lower reduced frequency of 0.3 and a frequency ratio of 0.35. Compared to the water tunnel scale, with water as the surrounding fluid instead of air, the resulting vortex dynamics and aerodynamic performance remained similar for the optimal-efficiency motion, while the structural response varied significantly. Despite these differences, the time-averaged lift scaled with the dimensionless shape deformation parameter γ. Moreover, the wing kinematics that resulted in the optimal-efficiency motion were closely aligned to fruit fly measurements, suggesting that fruit fly flight aims to conserve energy rather than to generate large forces.

  9. Rules to fly by: pigeons navigating horizontal obstacles limit steering by selecting gaps most aligned to their flight direction.

    PubMed

    Ros, Ivo G; Bhagavatula, Partha S; Lin, Huai-Ti; Biewener, Andrew A

    2017-02-06

    Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success.

  10. Bats Use Path Integration Rather Than Acoustic Flow to Assess Flight Distance along Flyways.

    PubMed

    Aharon, Gal; Sadot, Meshi; Yovel, Yossi

    2017-12-04

    Navigation can be achieved using different strategies from simple beaconing to complex map-based movement [1-4]. Bats display remarkable navigation capabilities, ranging from nightly commutes of several kilometers and up to seasonal migrations over thousands of kilometers [5]. Many bats have been suggested to fly along fixed routes termed "flyways," when flying from their roost to their foraging sites [6]. Flyways commonly stretch along linear landscape elements such as tree lines, hedges, or rivers [7]. When flying along a flyway, bats must estimate the distance they have traveled in order to determine when to turn. This can be especially challenging when moving along a repetitive landscape. Some bats, like Kuhl's pipistrelles, which we studied here, have limited vision [8] and were suggested to rely on bio-sonar for navigation. These bats could therefore estimate distance using three main sensory-navigation strategies, all of which we have examined: acoustic flow, acoustic landmarks, or path integration. We trained bats to fly along a linear flyway and land on a platform. We then tested their behavior when the platform was removed under different manipulations, including changing the acoustic flow, moving the start point, and adding wind. We found that bats do not require acoustic flow, which was hypothesized to be important for their navigation [9-15], and that they can perform the task without landmarks. Our results suggest that Kuhl's pipistrelles use internal self-motion cues, also known as path integration, rather than external information to estimate flight distance for at least dozens of meters when navigating along linear flyways. Copyright © 2017 Elsevier Ltd. All rights reserved.
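
Path integration in this sense amounts to accumulating self-motion estimates into a running position, with no reference to external cues; a minimal illustrative sketch (names and units are assumptions):

```python
import math

def path_integrate(steps):
    """Accumulate (distance, heading) self-motion estimates into a
    running 2D position -- no landmarks or acoustic flow involved.
    Headings are in radians, measured from the x axis.
    """
    x = y = 0.0
    for dist, heading in steps:
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y
```

Three 10 m legs straight along the flyway place the integrator 30 m from the start regardless of how the legs are sampled; the cost of the strategy is that odometry errors accumulate with distance, consistent with it working over "at least dozens of meters" rather than kilometers.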

  11. Rules to fly by: pigeons navigating horizontal obstacles limit steering by selecting gaps most aligned to their flight direction

    PubMed Central

    Ros, Ivo G.; Bhagavatula, Partha S.; Lin, Huai-Ti

    2017-01-01

    Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success. PMID:28163883

  12. Optical simulation of flying targets using physically based renderer

    NASA Astrophysics Data System (ADS)

    Cheng, Ye; Zheng, Quan; Peng, Junkai; Lv, Pin; Zheng, Changwen

    2018-02-01

    The simulation of aerial flying targets is widely needed in many fields. This paper proposes a physically based method for optical simulation of flying targets. In the first step, three-dimensional target models are built and the motion speed and direction are defined. Next, the material of the target's outward appearance is simulated. Then the illumination conditions are defined. Once all definitions are given, the settings are encoded in a description file. Finally, simulated results are generated by Monte Carlo ray tracing in a physically based renderer. Experiments show that this method is able to simulate materials, lighting, and motion blur for flying targets, and that it can generate convincing, high-quality simulation results.

  13. Dichromatic vision in a fruit bat with diurnal proclivities: the Samoan flying fox (Pteropus samoensis).

    PubMed

    Melin, Amanda D; Danosi, Christina F; McCracken, Gary F; Dominy, Nathaniel J

    2014-12-01

    A nocturnal bottleneck during mammalian evolution left a majority of species with two cone opsins, or dichromatic color vision. Primate trichromatic vision arose from the duplication and divergence of an X-linked opsin gene, and is long attributed to tandem shifts from nocturnality to diurnality and from insectivory to frugivory. Opsin gene variation and at least one duplication event exist in the order Chiroptera, suggesting that trichromatic vision could evolve under favorable ecological conditions. The natural history of the Samoan flying fox (Pteropus samoensis) meets these conditions: it is a large bat that consumes nectar and fruit and demonstrates strong diurnal proclivities. It also possesses a visual system that is strikingly similar to that of primates. To explore the potential for opsin gene duplication and divergence in this species, we sequenced the opsin genes of 11 individuals (19 X-chromosomes) from three South Pacific islands. Our results indicate the uniform presence of two opsins with predicted peak sensitivities of ca. 360 and 553 nm. This result fails to support a causal link between diurnal frugivory and trichromatic vision, although it remains plausible that the diurnal activities of P. samoensis have insufficient antiquity to favor opsin gene renovation.

  14. Response of Phlebotomine Sand Flies to Light-Emitting Diode-Modified Light Traps in Southern Egypt

    DTIC Science & Technology

    2007-04-01

    light. Only one study has been performed on a New World sand fly (Lutzomyia longipalpis) measuring spectral sensitivity with an electroretinogram... Lutzomyia longipalpis sandflies. Med. Vet. Entomol. 10: 372-374. Muir, L.E., M.J. Thorne, and D.H. Kay. 1992. Aedes aegypti (Diptera: Culicidae) vision

  15. Flight simulator platform motion and air transport pilot training

    NASA Technical Reports Server (NTRS)

    Lee, Alfred T.; Bussolari, Steven R.

    1987-01-01

    The effect of flight simulator platform motion on pilot performance and training was evaluated using subjective ratings and objective performance data from experienced B-727 pilots and from pilots with no prior heavy-aircraft flying experience, flying a B-727-200 simulator used by the FAA in upgrade and transition training for air carrier operations. The results for experienced pilots did not reveal any reliable effects of wide variations in platform motion design. On the other hand, motion variations significantly affected the behavior of pilots without heavy-aircraft experience. The effect was limited to pitch attitude control inputs during the early phase of landing training.

  16. Vision-Based UAV Flight Control and Obstacle Avoidance

    DTIC Science & Technology

    2006-01-01

    denoted it by Vb = (Vb1, Vb2, Vb3). Fig. 2 shows the block diagram of the proposed vision-based motion analysis and obstacle avoidance system. We denote...structure analysis often involve computation-intensive computer vision tasks, such as feature extraction and geometric modeling. Computation-intensive...First, we extract a set of features from each block. 2) Second, we compute the distance between these two sets of features. In conventional motion

  17. Color discrimination with broadband photoreceptors.

    PubMed

    Schnaitmann, Christopher; Garbers, Christian; Wachtler, Thomas; Tanimoto, Hiromu

    2013-12-02

    Color vision is commonly assumed to rely on photoreceptors tuned to narrow spectral ranges. In the ommatidium of Drosophila, the four types of so-called inner photoreceptors express different narrow-band opsins. In contrast, the outer photoreceptors have a broadband spectral sensitivity and were thought to exclusively mediate achromatic vision. Using computational models and behavioral experiments, we demonstrate that the broadband outer photoreceptors contribute to color vision in Drosophila. The model of opponent processing that includes the opsin of the outer photoreceptors scored the best fit to wavelength discrimination data. To experimentally uncover the contribution of individual photoreceptor types, we restored phototransduction of targeted photoreceptor combinations in a blind mutant. Dichromatic flies with only broadband photoreceptors and one additional receptor type can discriminate different colors, indicating the existence of a specific output comparison of the outer and inner photoreceptors. Furthermore, blocking interneurons postsynaptic to the outer photoreceptors specifically impaired color but not intensity discrimination. Our findings show that receptors with a complex and broad spectral sensitivity can contribute to color vision and reveal that chromatic and achromatic circuits in the fly share common photoreceptors. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Security Applications Of Computer Motion Detection

    NASA Astrophysics Data System (ADS)

    Bernat, Andrew P.; Nelan, Joseph; Riter, Stephen; Frankel, Harry

    1987-05-01

    An important area of application of computer vision is the detection of human motion in security systems. This paper describes the development of a computer vision system which can detect and track human movement across the international border between the United States and Mexico. Because of the wide range of environmental conditions, this application represents a stringent test of computer vision algorithms for motion detection and object identification. The desired output of this vision system is accurate, real-time locations for individual aliens and accurate statistical data on the frequency of illegal border crossings. Because most detection and tracking routines assume rigid-body motion, which is not characteristic of humans, new algorithms capable of reliable operation in our application are required. Furthermore, most current detection and tracking algorithms assume a uniform background against which motion is viewed; the urban environment along the US-Mexican border is anything but uniform. The system works in three stages: motion detection, object tracking, and object identification. We have implemented motion detection using simple frame differencing, maximum likelihood estimation, and mean and median tests, and are evaluating them for accuracy and computational efficiency. Due to the complex nature of the urban environment (background and foreground objects consisting of buildings, vegetation, vehicles, wind-blown debris, animals, etc.), motion detection alone is not sufficiently accurate. Object tracking and identification are handled by an expert system which takes shape, location, and trajectory information as input and determines if the moving object is indeed representative of an illegal border crossing.
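
Of the motion-detection methods the record names, simple frame differencing is the easiest to sketch; the threshold and array sizes below are illustrative, not from the paper:

```python
import numpy as np

def motion_mask(prev_frame, cur_frame, threshold=25):
    """Flag pixels whose grayscale intensity changed by more than
    `threshold` between consecutive frames -- the frame-differencing
    stage; tracking and identification would run on the result.
    """
    # Widen dtype before subtracting so uint8 arithmetic cannot wrap.
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# A static background yields an empty mask; an appearing bright patch
# lights up exactly the pixels it covers.
prev = np.zeros((8, 8), dtype=np.uint8)
cur = prev.copy()
cur[2:4, 2:4] = 200          # "object" appears in a 2x2 region
mask = motion_mask(prev, cur)
```

The paper's observation follows directly: any background change (wind-blown debris, animals, lighting) also trips the threshold, which is why differencing alone is insufficient and an expert system filters the candidates.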

  19. HP-9825A HFRMP trajectory processor (#TRAJ), detailed description. [relative motion of the space shuttle orbiter and a free-flying payload

    NASA Technical Reports Server (NTRS)

    Kindall, S. M.

    1980-01-01

    The computer code for the trajectory processor (#TRAJ) of the high fidelity relative motion program is described. The #TRAJ processor is a 12-degrees-of-freedom trajectory integrator (6 degrees of freedom for each of two vehicles) which can be used to generate digital and graphical data describing the relative motion of the Space Shuttle Orbiter and a free-flying cylindrical payload. A listing of the code, coding standards and conventions, detailed flow charts, and discussions of the computational logic are included.

  20. Formation Flying and Deformable Instruments

    NASA Astrophysics Data System (ADS)

    Rio, Yvon

    2009-05-01

    Astronomers have always attempted to build very stable instruments. They fight all that can cause mechanical deformation or image motion. This has led to well established technologies (autoguide, active optics, thermal control, tip/tilt correction), as well as observing methods based on the use of controlled motion (scanning, micro scanning, shift and add, chopping and nodding). Formation flying disturbs this practice. It is neither possible to reduce the relative motion to very small amplitudes, nor to control it at will. Some impacts on Simbol-X instrument design, and operation are presented.

  1. A Vision-Based Motion Sensor for Undergraduate Laboratories.

    ERIC Educational Resources Information Center

    Salumbides, Edcel John; Maristela, Joyce; Uy, Alfredson; Karremans, Kees

    2002-01-01

    Introduces an alternative method to determine the mechanics of a moving object that uses computer vision algorithms with a charge-coupled device (CCD) camera as a recording device. Presents two experiments, pendulum motion and terminal velocity, to compare results of the alternative and conventional methods. (YDS)

  2. Vision and air flow combine to streamline flying honeybees

    PubMed Central

    Taylor, Gavin J.; Luu, Tien; Ball, David; Srinivasan, Mandyam V.

    2013-01-01

    Insects face the challenge of integrating multi-sensory information to control their flight. Here we study a ‘streamlining’ response in honeybees, whereby honeybees raise their abdomen to reduce drag. We find that this response, which was recently reported to be mediated by optic flow, is also strongly modulated by the presence of air flow simulating a head wind. The Johnston's organs in the antennae were found to play a role in measuring the air speed that is used to control the streamlining response. The response to a combination of visual motion and wind is complex and can be explained by a model that incorporates a non-linear combination of the two stimuli. The use of visual and mechanosensory cues increases the strength of the streamlining response when the stimuli are present concurrently. We propose that this multisensory integration makes the response more robust to transient disturbances in either modality. PMID:24019053

  3. Model-based video segmentation for vision-augmented interactive games

    NASA Astrophysics Data System (ADS)

    Liu, Lurng-Kuo

    2000-04-01

    This paper presents an architecture and algorithms for model-based video object segmentation and its application to vision-augmented interactive games. We are especially interested in real-time, low-cost vision-based applications that can be implemented in software on a PC. We use different models for the background and for a player object. The object segmentation algorithm operates at two levels: pixel level and object level. At the pixel level, segmentation is formulated as a maximum a posteriori probability (MAP) estimation problem; the statistical likelihood of each pixel is calculated and used in the MAP problem. Object-level segmentation improves segmentation quality by utilizing information about the spatial and temporal extent of the object. The concept of an active region, defined from a motion histogram and trajectory prediction, is introduced to indicate the likelihood that a region belongs to a video object, for both background and foreground modeling; it also reduces the overall computational complexity. In contrast with other applications, the proposed video object segmentation system is able to create background and foreground models on the fly, even without introductory background frames. Furthermore, we apply different rates of self-tuning to the scene model so that the system can adapt to the environment when there is a scene change. We applied the proposed video object segmentation algorithms to several prototype virtual interactive games. In our prototype vision-augmented interactive games, a player can immerse himself/herself inside a game and virtually interact with other animated characters in real time, without being constrained by helmets, gloves, special sensing devices, or the background environment. Potential applications of the proposed algorithms include human-computer gesture interfaces and object-based video coding, such as MPEG-4 video coding.
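
The pixel-level MAP rule, picking the class that maximizes likelihood times prior, can be sketched with one-dimensional Gaussian intensity models; all parameters and names below are illustrative assumptions, not those of the paper:

```python
import math

def map_label(pixel, bg, fg, prior_fg=0.3):
    """Assign 'fg' or 'bg' by maximizing the posterior, which is
    proportional to likelihood * prior. Each class is modeled as a
    1D Gaussian (mean, std) over pixel intensity; comparison is done
    in log space for numerical stability.
    """
    def log_gauss(x, mean, std):
        return -0.5 * ((x - mean) / std) ** 2 - math.log(std)

    log_post_bg = log_gauss(pixel, *bg) + math.log(1.0 - prior_fg)
    log_post_fg = log_gauss(pixel, *fg) + math.log(prior_fg)
    return "fg" if log_post_fg > log_post_bg else "bg"
```

A pixel near the background mean stays background while one near the player's intensity model flips to foreground; the paper's object-level pass and active regions then correct pixels this per-pixel rule misclassifies.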

  4. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly

    PubMed Central

    Leonhardt, Aljoscha; Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics. PMID:29261684
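    The algorithmic account can be made concrete with a toy Hassenstein-Reichardt correlator, the classic delay-and-multiply motion detector associated with circuits like T4/T5. This is a minimal sketch, not the paper's detector model: when the displaced spot also reverses contrast (reverse phi), the opponent correlator's sign, and hence the signalled direction, inverts.

```python
def reichardt(a, b):
    """Summed output of a two-point Hassenstein-Reichardt correlator.

    a and b are contrast time series at two neighbouring photoreceptors
    (b to the right of a); the delay is one time step. Positive totals
    signal rightward motion, negative totals leftward motion.
    """
    return sum(a[t - 1] * b[t] - b[t - 1] * a[t] for t in range(1, len(a)))

# Standard phi: a bright spot steps from the left to the right receptor.
phi = reichardt([1, 0], [0, 1])           # positive: rightward
# Reverse phi: the spot reverses contrast as it steps rightward.
reverse_phi = reichardt([1, 0], [0, -1])  # negative: perceived leftward
```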

  5. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly.

    PubMed

    Leonhardt, Aljoscha; Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics.

  6. Vision-mediated exploitation of a novel host plant by a tephritid fruit fly.

    PubMed

    Piñero, Jaime C; Souder, Steven K; Vargas, Roger I

    2017-01-01

    Shortly after its introduction into the Hawaiian Islands around 1895, the polyphagous, invasive fruit fly Bactrocera (Zeugodacus) cucurbitae (Coquillett) (Diptera: Tephritidae) was provided the opportunity to expand its host range to include a novel host, papaya (Carica papaya). It has been documented that female B. cucurbitae rely strongly on vision to locate host fruit. Given that the papaya fruit is visually conspicuous in the papaya agro-ecosystem, we hypothesized that female B. cucurbitae used vision as the main sensory modality to find and exploit the novel host fruit. Using a comparative approach that involved a series of studies under natural and semi-natural conditions in Hawaii, we assessed the ability of female B. cucurbitae to locate and oviposit in papaya fruit using the sensory modalities of olfaction and vision alone and also in combination. The results of these studies demonstrate that, under a variety of conditions, volatiles emitted by the novel host do not positively stimulate the behavior of the herbivore. Rather, vision seems to be the main mechanism driving the exploitation of the novel host. Volatiles emitted by the novel host papaya fruit did not contribute in any way to the visual response of females. Our findings highlight the remarkable role of vision in the host-location process of B. cucurbitae and provide empirical evidence for this sensory modality as a potential mechanism involved in host range expansion.

  7. Vision-mediated exploitation of a novel host plant by a tephritid fruit fly

    PubMed Central

    2017-01-01

    Shortly after its introduction into the Hawaiian Islands around 1895, the polyphagous, invasive fruit fly Bactrocera (Zeugodacus) cucurbitae (Coquillett) (Diptera: Tephritidae) was provided the opportunity to expand its host range to include a novel host, papaya (Carica papaya). It has been documented that female B. cucurbitae rely strongly on vision to locate host fruit. Given that the papaya fruit is visually conspicuous in the papaya agro-ecosystem, we hypothesized that female B. cucurbitae used vision as the main sensory modality to find and exploit the novel host fruit. Using a comparative approach that involved a series of studies under natural and semi-natural conditions in Hawaii, we assessed the ability of female B. cucurbitae to locate and oviposit in papaya fruit using the sensory modalities of olfaction and vision alone and also in combination. The results of these studies demonstrate that, under a variety of conditions, volatiles emitted by the novel host do not positively stimulate the behavior of the herbivore. Rather, vision seems to be the main mechanism driving the exploitation of the novel host. Volatiles emitted by the novel host papaya fruit did not contribute in any way to the visual response of females. Our findings highlight the remarkable role of vision in the host-location process of B. cucurbitae and provide empirical evidence for this sensory modality as a potential mechanism involved in host range expansion. PMID:28380069

  8. Dendro-dendritic interactions between motion-sensitive large-field neurons in the fly.

    PubMed

    Haag, Juergen; Borst, Alexander

    2002-04-15

    For visual course control, flies rely on a set of motion-sensitive neurons called lobula plate tangential cells (LPTCs). Among these cells, the so-called CH (centrifugal horizontal) cells shape by their inhibitory action the receptive field properties of other LPTCs called FD (figure detection) cells specialized for figure-ground discrimination based on relative motion. Studying the ipsilateral input circuitry of CH cells by means of dual-electrode and combined electrical-optical recordings, we find that CH cells receive graded input from HS (large-field horizontal system) cells via dendro-dendritic electrical synapses. This particular wiring scheme leads to a spatial blur of the motion image on the CH cell dendrite, and, after inhibiting FD cells, to an enhancement of motion contrast. This could be crucial for enabling FD cells to discriminate object from self motion.

  9. Effect of light intensity on flight control and temporal properties of photoreceptors in bumblebees.

    PubMed

    Reber, Therese; Vähäkainu, Antti; Baird, Emily; Weckström, Matti; Warrant, Eric; Dacke, Marie

    2015-05-01

    To control flight, insects rely on the pattern of visual motion generated on the retina as they move through the environment. When light levels fall, vision becomes less reliable and flight control thus becomes more challenging. Here, we investigated the effect of light intensity on flight control by filming the trajectories of free-flying bumblebees (Bombus terrestris, Linnaeus 1758) in an experimental tunnel at different light levels. As light levels fell, flight speed decreased and the flight trajectories became more tortuous but the bees were still remarkably good at centring their flight about the tunnel's midline. To investigate whether this robust flight performance can be explained by visual adaptations in the bumblebee retina, we also examined the response speed of the green-sensitive photoreceptors at the same light intensities. We found that the response speed of the photoreceptors significantly decreased as light levels fell. This indicates that bumblebees have both behavioural (reduction in flight speed) and retinal (reduction in response speed of the photoreceptors) adaptations to allow them to fly in dim light. However, the more tortuous flight paths recorded in dim light suggest that these adaptations do not support flight with the same precision during the twilight hours of the day. © 2015. Published by The Company of Biologists Ltd.

  10. Observability/Identifiability of Rigid Motion under Perspective Projection

    DTIC Science & Technology

    1994-03-08

    Faugeras and S. Maybank. Motion from point matches: multiplicity of solutions. Int. J. of Computer Vision, 1990. [16] D. B. Gennery. Tracking known... sequences. Int. J. of Computer Vision, 1989. [37] S. Maybank. Theory of reconstruction from image motion. Springer Verlag, 1992. [38] Andrea ... defined in section 5; in this appendix we show a simple characterization which is due to Faugeras and Maybank [15, 37]. Theorem B.1. Let Q = UCV^T

  11. A review of flight simulation techniques

    NASA Astrophysics Data System (ADS)

    Baarspul, Max

    After a brief historical review of the evolution of flight simulation techniques, this paper first deals with the main areas of flight simulator applications. Next, it describes the main components of a piloted flight simulator. Because of the presence of the pilot-in-the-loop, the digital computer driving the simulator must solve the aircraft equations of motion in ‘real-time’. Solutions that meet the high computing power required by today's modern flight simulators are elaborated. The physical similarity between aircraft and simulator in cockpit layout, flight instruments, flying controls, etc. is discussed, based on the equipment and environmental cue fidelity required for training and research simulators. Visual systems play an increasingly important role in piloted flight simulation. The visual systems now available and most widely used are described, distinguishing between image generators and display devices. The characteristics of out-of-the-window visual simulation systems pertaining to the perceptual capabilities of human vision are discussed. Faithful reproduction of aircraft motion requires large travel, velocity and acceleration capabilities of the motion system. Different types and applications of motion systems, e.g. in airline training and research, are described. The principles of motion cue generation, based on the characteristics of the non-visual human motion sensors, are described. The complete motion system, consisting of the hardware and the motion drive software, is discussed. The principles of mathematical modelling of the aerodynamic, flight control, propulsion, landing gear and environmental characteristics of the aircraft are reviewed. An example of the identification of an aircraft mathematical model, based on flight and taxi tests, is presented. Finally, the paper deals with the hardware and software integration of the flight simulator components and the testing and acceptance of the complete flight simulator. Examples of the so-called ‘Computer Generated Checkout’ and ‘Proof of Match’ are presented. The concluding remarks briefly summarize the status of flight simulator technology and consider possibilities for future research.

  12. Integration of local motion is normal in amblyopia

    NASA Astrophysics Data System (ADS)

    Hess, Robert F.; Mansouri, Behzad; Dakin, Steven C.; Allen, Harriet A.

    2006-05-01

    We investigate the global integration of local motion direction signals in amblyopia, in a task where performance is equated between normal and amblyopic eyes at the single element level. We use an equivalent noise model to derive the parameters of internal noise and number of samples, both of which we show are normal in amblyopia for this task. This result is in apparent conflict with a previous study in amblyopes showing that global motion processing is defective in global coherence tasks [Vision Res. 43, 729 (2003)]. A similar discrepancy between the normalcy of signal integration [Vision Res. 44, 2955 (2004)] and anomalous global coherence form processing has also been reported [Vision Res. 45, 449 (2005)]. We suggest that these discrepancies for form and motion processing in amblyopia point to a selective problem in separating signal from noise in the typical global coherence task.
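    The equivalent noise model referred to above can be stated compactly. As a hedged sketch (the function and parameter names are ours, not the paper's), the observed threshold is modelled as the combined internal and external noise variance averaged over the number of pooled samples:

```python
import math

def equivalent_noise_threshold(sigma_ext, sigma_int, n_samples):
    """Predicted discrimination threshold under the equivalent noise model:
    threshold^2 = (internal variance + external variance) / number of samples.
    """
    return math.sqrt((sigma_int ** 2 + sigma_ext ** 2) / n_samples)
```

    Fitting this curve to thresholds measured at several external-noise levels yields the two parameters: at low external noise the internal noise dominates (flat thresholds), while at high external noise thresholds rise in proportion, and the position of the knee and the asymptote recover sigma_int and n_samples.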

  13. Pilot Fullerton examines SE-81-8 Insect Flight Motion Study

    NASA Image and Video Library

    1982-03-30

    STS003-23-178 (22-30 March 1982) --- Astronaut C. Gordon Fullerton, STS-3 pilot, examines Student Experiment 81-8 (SE-81-8) Insect Flight Motion Study taped to the airlock on aft middeck. Todd Nelson, a high school senior from Minnesota, won a national contest to fly his experiment on this particular flight. Moths, flies, and bees were studied in the near weightless environment. Photo credit: NASA

  14. Dynamic Estimation of Rigid Motion from Perspective Views via Recursive Identification of Exterior Differential Systems with Parameters on a Topological Manifold

    DTIC Science & Technology

    1994-02-15

    O. Faugeras. Three dimensional vision, a geometric viewpoint. MIT Press, 1993. [19] O. D. Faugeras and S. Maybank. Motion from point matches: multiplicity of solutions. Int. J. of Computer Vision, 1990. [20] O. D. Faugeras, Q. T. Luong, and S. J. Maybank. Camera self-calibration: theory and... Kalman filter-based algorithms for estimating depth from image sequences. Int. J. of Computer Vision, 1989. [41] S. Maybank. Theory of

  15. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

    The concept and design of a dynamic test platform for the development and evaluation of a robot vision system are discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi-laser/multi-detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform, where it can be subjected to a wide variety of simulated motions and thus examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process are treated separately, such as the structure, driving linkages, and motors and transmissions.

  16. Visually induced self-motion sensation adapts rapidly to left-right reversal of vision

    NASA Technical Reports Server (NTRS)

    Oman, C. M.; Bock, O. L.

    1981-01-01

    Three experiments were conducted using 15 adult volunteers with no overt oculomotor or vestibular disorders. In all experiments, left-right vision reversal was achieved using prism goggles, which permitted a binocular field of vision subtending approximately 45 deg horizontally and 28 deg vertically. In all experiments, circularvection (CV) was tested before and immediately after a period of exposure to reversed vision. After one to three hours of active movement while wearing vision-reversing goggles, 10 of 15 (stationary) human subjects viewing a moving stripe display experienced a self-rotation illusion in the same direction as seen stripe motion, rather than in the opposite (normal) direction, demonstrating that the central neural pathways that process visual self-rotation cues can undergo rapid adaptive modification.

  17. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse kinematics based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics, and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple-solution ambiguity of the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and of the incremental control strategy for the robotic manipulator.
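    The incremental idea, stepping the joints a bounded amount toward the instantaneous desired end-effector position instead of solving the full inverse kinematics at once, can be sketched on a planar two-link arm. This toy uses a Jacobian-transpose update with a joint-speed clamp; the arm, gains, and step limits are illustrative assumptions, not the paper's manipulator or controller.

```python
import math

def two_link_fk(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm (link lengths l1, l2)."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

def incremental_ik_step(q, target, gain=0.2, max_step=0.1, l1=1.0, l2=1.0):
    """One incremental IK step: a clamped Jacobian-transpose move toward target.

    Clamping each joint increment to max_step plays the role of the joint
    speed limits; repeated small steps track the target without ever having
    to choose among the multiple closed-form IK solutions.
    """
    q1, q2 = q
    x, y = two_link_fk(q1, q2, l1, l2)
    ex, ey = target[0] - x, target[1] - y
    # Jacobian of the planar arm at the current configuration
    j11 = -l1 * math.sin(q1) - l2 * math.sin(q1 + q2)
    j12 = -l2 * math.sin(q1 + q2)
    j21 = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    j22 = l2 * math.cos(q1 + q2)
    dq1 = gain * (j11 * ex + j21 * ey)  # J^T e: a descent direction on the error
    dq2 = gain * (j12 * ex + j22 * ey)
    clamp = lambda v: max(-max_step, min(max_step, v))
    return (q1 + clamp(dq1), q2 + clamp(dq2))
```

    Iterating this step from any reasonable configuration drives the end-effector to a reachable target while the joint increments stay bounded, which is the property the incremental scheme trades against a one-shot IK solve.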

  18. GVE-Based Dynamics and Control for Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    Breger, Louis; How, Jonathan P.

    2004-01-01

    Formation flying is an enabling technology for many future space missions. This paper presents extensions to the equations of relative motion expressed in Keplerian orbital elements, including new initialization techniques for general formation configurations. A new linear time-varying form of the equations of relative motion is developed from Gauss Variational Equations and used in a model predictive controller. The linearizing assumptions for these equations are shown to be consistent with typical formation flying scenarios. Several linear, convex initialization techniques are presented, as well as a general, decentralized method for coordinating a tetrahedral formation using differential orbital elements. Control methods are validated using a commercial numerical propagator.

  19. Length Contraction Should not be Independent of Time

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2013-10-01

    In the Special Theory of Relativity, length contraction along the direction of motion appears to be independent of time: whether a rocket flies for one second or for one year, its along-the-motion length contraction is the same, since the contraction factor C(v) = √(1 − v²/c²) depends only on the rocket's relativistic speed (v) and on the speed of light in vacuum (c). We find this unrealistic and incomplete. It would seem logical that the longer an object flies, the greater its length contraction should become. What about cosmic bodies that travel continuously: do they contract only once, or are they continuously contracting?
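    For reference, the quoted contraction factor is a one-liner to evaluate (a direct transcription of the formula, with c defaulting to the SI value for the speed of light in vacuum):

```python
def contraction_factor(v, c=299_792_458.0):
    """Lorentz contraction factor C(v) = sqrt(1 - v^2 / c^2)."""
    return (1.0 - (v / c) ** 2) ** 0.5
```

    Note that the factor depends on v alone, with no time argument, which is exactly the time-independence the record questions.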

  20. Neuropharmacology of vision in goldfish: a review.

    PubMed

    Mora-Ferrer, Carlos; Neumeyer, Christa

    2009-05-01

    The goldfish is one of the few animals exceptionally well analyzed in behavioral experiments and also in electrophysiological and neuroanatomical investigations of the retina. To get insight into the functional organization of the retina we studied color vision, motion detection and temporal resolution before and after intra-ocular injection of neuropharmaca with known effects on retinal neurons. Bicuculline, strychnine, curare, atropine, and dopamine D1- and D2-receptor antagonists were used. The results reviewed here indicate separate and parallel processing of L-cone contribution to different visual functions, and the influence of several neurotransmitters (dopamine, acetylcholine, glycine, and GABA) on motion vision, color vision, and temporal resolution.

  1. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms.

    PubMed

    Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan

    2016-04-22

    The development of image sensors and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measurement and does not add any mass to the measured object, unlike traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.

  2. Design and Validation of Exoskeleton Actuated by Soft Modules toward Neurorehabilitation-Vision-Based Control for Precise Reaching Motion of Upper Limb.

    PubMed

    Oguntosin, Victoria W; Mori, Yoshiki; Kim, Hyejong; Nasuto, Slawomir J; Kawamura, Sadao; Hayashi, Yoshikatsu

    2017-01-01

    We demonstrate the design, production, and functional properties of the Exoskeleton Actuated by Soft Modules (EAsoftM). Integrating the 3D-printed exoskeleton with passive joints to compensate for gravity and active joints to rotate the shoulder and elbow joints resulted in an ultra-light system that could assist planar reaching motion using a vision-based control law. The EAsoftM can support reaching motion with compliance, realized by the soft materials and pneumatic actuation. In addition, the vision-based control law enables precise control over the target reaching motion at the millimeter scale. Soft actuators developed for rehabilitation exercise have typically targeted relatively small motions, such as grasping, and one of the challenges has been to extend their use to wider-range reaching motion. The proposed EAsoftM offers one possible solution to this challenge by transmitting torque effectively along an exoskeleton that is anatomically aligned with the human body. The proposed integrated system will be an ideal solution for neurorehabilitation, where affordable, wearable, and portable systems must be customized for individuals with specific motor impairments.
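    The closed-loop principle, commanding a motion proportional to the visually measured error between the end-effector and the target, reduces to a proportional-control sketch. The gain and names are illustrative assumptions; the actual EAsoftM controller must additionally cope with the compliant pneumatic actuator dynamics.

```python
def visual_servo_step(pos, target, gain=0.2):
    """One proportional vision-based control step.

    pos and target are planar coordinates measured by the vision system;
    the commanded increment is proportional to the measured error.
    """
    return tuple(p + gain * (t - p) for p, t in zip(pos, target))
```

    Iterating the step shrinks the error geometrically (by a factor 1 - gain per step), which is how a low, compliant gain can still achieve millimeter-scale endpoint accuracy given enough time.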

  3. Design and Validation of Exoskeleton Actuated by Soft Modules toward Neurorehabilitation—Vision-Based Control for Precise Reaching Motion of Upper Limb

    PubMed Central

    Oguntosin, Victoria W.; Mori, Yoshiki; Kim, Hyejong; Nasuto, Slawomir J.; Kawamura, Sadao; Hayashi, Yoshikatsu

    2017-01-01

    We demonstrate the design, production, and functional properties of the Exoskeleton Actuated by Soft Modules (EAsoftM). Integrating the 3D-printed exoskeleton with passive joints to compensate for gravity and active joints to rotate the shoulder and elbow joints resulted in an ultra-light system that could assist planar reaching motion using a vision-based control law. The EAsoftM can support reaching motion with compliance, realized by the soft materials and pneumatic actuation. In addition, the vision-based control law enables precise control over the target reaching motion at the millimeter scale. Soft actuators developed for rehabilitation exercise have typically targeted relatively small motions, such as grasping, and one of the challenges has been to extend their use to wider-range reaching motion. The proposed EAsoftM offers one possible solution to this challenge by transmitting torque effectively along an exoskeleton that is anatomically aligned with the human body. The proposed integrated system will be an ideal solution for neurorehabilitation, where affordable, wearable, and portable systems must be customized for individuals with specific motor impairments. PMID:28736514

  4. Modulation frequency as a cue for auditory speed perception.

    PubMed

    Senna, Irene; Parise, Cesare V; Ernst, Marc O

    2017-07-12

    Unlike vision, the mechanisms underlying auditory motion perception are poorly understood. Here we describe an auditory motion illusion revealing a novel cue to auditory speed perception: the temporal frequency of amplitude modulation (AM-frequency), typical for rattling sounds. Naturally, corrugated objects sliding across each other generate rattling sounds whose AM-frequency tends to directly correlate with speed. We found that AM-frequency modulates auditory speed perception in a highly systematic fashion: moving sounds with higher AM-frequency are perceived as moving faster than sounds with lower AM-frequency. Even more interestingly, sounds with higher AM-frequency also induce stronger motion aftereffects. This reveals the existence of specialized neural mechanisms for auditory motion perception, which are sensitive to AM-frequency. Thus, in spatial hearing, the brain successfully capitalizes on the AM-frequency of rattling sounds to estimate the speed of moving objects. This tightly parallels previous findings in motion vision, where spatio-temporal frequency of moving displays systematically affects both speed perception and the magnitude of the motion aftereffects. Such an analogy with vision suggests that motion detection may rely on canonical computations, with similar neural mechanisms shared across the different modalities. © 2017 The Author(s).

  5. Flies and humans share a motion estimation strategy that exploits natural scene statistics

    PubMed Central

    Clark, Damon A.; Fitzgerald, James E.; Ales, Justin M.; Gohl, Daryl M.; Silies, Marion A.; Norcia, Anthony M.; Clandinin, Thomas R.

    2014-01-01

    Sighted animals extract motion information from visual scenes by processing spatiotemporal patterns of light falling on the retina. The dominant models for motion estimation exploit intensity correlations only between pairs of points in space and time. Moving natural scenes, however, contain more complex correlations. Here we show that fly and human visual systems encode the combined direction and contrast polarity of moving edges using triple correlations that enhance motion estimation in natural environments. Both species extract triple correlations with neural substrates tuned for light or dark edges, and sensitivity to specific triple correlations is retained even as light and dark edge motion signals are combined. Thus, both species separately process light and dark image contrasts to capture motion signatures that can improve estimation accuracy. This striking convergence argues that statistical structures in natural scenes have profoundly affected visual processing, driving a common computational strategy over 500 million years of evolution. PMID:24390225
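    The difference between pairwise and triple correlations can be demonstrated on a toy space-time stimulus (a sketch; the paper's correlators and stimuli are more elaborate): an even-order opponent correlator responds identically to moving light and dark edges, while an odd-order three-point correlator flips sign with contrast polarity, carrying the combined direction-and-polarity signature described above.

```python
def edge_field(n, polarity=+1):
    """Space-time contrast s[t][x] of an edge moving right one pixel per step.

    polarity=+1: light edge (bright region trails the edge); -1: dark edge.
    """
    return [[polarity * (0.5 if x < t else -0.5) for x in range(n)] for t in range(n)]

def pair_corr(s):
    """Opponent two-point correlator (HRC-style): even order, so it is
    blind to contrast polarity."""
    return sum(s[t - 1][x] * s[t][x + 1] - s[t - 1][x + 1] * s[t][x]
               for t in range(1, len(s)) for x in range(len(s) - 1))

def triple_corr(s):
    """A three-point (odd-order) correlator: its sign tracks edge polarity."""
    return sum(s[t][x] * s[t][x + 1] * s[t + 1][x + 1]
               for t in range(len(s) - 1) for x in range(len(s) - 1))
```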

  6. Two-photon calcium imaging from head-fixed Drosophila during optomotor walking behavior.

    PubMed

    Seelig, Johannes D; Chiappe, M Eugenia; Lott, Gus K; Dutta, Anirban; Osborne, Jason E; Reiser, Michael B; Jayaraman, Vivek

    2010-07-01

    Drosophila melanogaster is a model organism rich in genetic tools to manipulate and identify neural circuits involved in specific behaviors. Here we present a technique for two-photon calcium imaging in the central brain of head-fixed Drosophila walking on an air-supported ball. The ball's motion is tracked at high resolution and can be treated as a proxy for the fly's own movements. We used the genetically encoded calcium sensor, GCaMP3.0, to record from important elements of the motion-processing pathway, the horizontal-system lobula plate tangential cells (LPTCs) in the fly optic lobe. We presented motion stimuli to the tethered fly and found that calcium transients in horizontal-system neurons correlated with robust optomotor behavior during walking. Our technique allows both behavior and physiology in identified neurons to be monitored in a genetic model organism with an extensive repertoire of walking behaviors.

  7. Object tracking with stereo vision

    NASA Technical Reports Server (NTRS)

    Huber, Eric

    1994-01-01

    A real-time active stereo vision system incorporating gaze control and task directed vision is described. Emphasis is placed on object tracking and object size and shape determination. Techniques include motion-centroid tracking, depth tracking, and contour tracking.

  8. ROS-based ground stereo vision detection: implementation and experiments.

    PubMed

    Hu, Tianjiang; Zhao, Boxin; Tang, Dengqing; Zhang, Daibing; Kong, Weiwei; Shen, Lincheng

    This article concentrates on an open-source implementation of flying object detection in cluttered scenes, which is of significance for ground stereo-aided autonomous landing of unmanned aerial vehicles. The ground stereo vision guidance system is presented with details on system architecture and workflow. The Chan-Vese detection algorithm is further considered and implemented in the Robot Operating System (ROS) environment. A data-driven interactive scheme is developed to collect datasets for parameter tuning and performance evaluation. The outdoor flying-vehicle experiments capture sequential stereo image datasets and simultaneously record data from the pan-and-tilt unit, onboard sensors and differential GPS. Experimental results using the collected dataset validate the effectiveness of the published ROS-based detection algorithm.

  9. Construction, implementation and testing of an image identification system using computer vision methods for fruit flies with economic importance (Diptera: Tephritidae).

    PubMed

    Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang

    2017-07-01

    Many species of Tephritidae are damaging to fruit, which might negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and for quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification and manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 gives candidate identification results; users can then make a manual selection by comparing unidentified images with a subset of images corresponding to the automatic identification result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% to the species level in an independent multi-part image automatic identification test. The system is useful for users with or without specific expertise on Tephritidae in the task of rapid and effective identification of fruit flies. It brings the application of computer vision technology to fruit fly recognition much closer to production level. © 2016 Society of Chemical Industry.
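    The retrieve-then-verify workflow can be sketched as nearest-neighbour ranking over feature vectors. The feature vectors and species labels below are hypothetical examples; AFIS1.0's actual Gabor surface features and matcher are not reproduced here.

```python
def rank_candidates(query, gallery, k=3):
    """Rank gallery entries by cosine similarity to a query feature vector.

    gallery is a list of (label, feature_vector) pairs; returns the top-k
    labels, which a user would then verify against reference images.
    """
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / ((sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5))
    ranked = sorted(gallery, key=lambda item: cos(query, item[1]), reverse=True)
    return [label for label, _ in ranked[:k]]
```

    Presenting a short ranked list rather than a single hard decision is what lets the system balance automated speed against the accuracy of a final manual check.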

  10. Non-Native Chinese Language Learners' Attitudes towards Online Vision-Based Motion Games

    ERIC Educational Resources Information Center

    Hao, Yungwei; Hong, Jon-Chao; Jong, Jyh-Tsorng; Hwang, Ming-Yueh; Su, Chao-Ya; Yang, Jin-Shin

    2010-01-01

    Learning to write Chinese characters is often thought to be a very challenging and laborious task. However, new learning tools are being designed that might reduce learners' tedium. This study explores one such tool, an online program in which learners can learn Chinese characters through vision-based motion games. The learner's gestures are…

  11. Hovering of a jellyfish-like flying machine

    NASA Astrophysics Data System (ADS)

    Ristroph, Leif; Childress, Stephen

    2013-11-01

    Ornithopters, or flapping-wing aircraft, offer an alternative to helicopters in achieving maneuverability at small scales, although stabilizing such aerial vehicles remains a key challenge. Here, we present a hovering machine that achieves self-righting flight using flapping wings alone, without relying on additional aerodynamic surfaces and without feedback control. We design, construct, and test-fly a prototype that opens and closes four wings, resembling the motions of swimming jellyfish more so than any insect or bird. Lift measurements and high-speed video of free-flight are used to inform an aerodynamic model that explains the stabilization mechanism. These results show the promise of flapping-flight strategies beyond those that directly mimic the wing motions of flying animals.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, Y; Rahimi, A; Sawant, A

    Purpose: Active breathing control (ABC) has been used to reduce treatment margin due to respiratory organ motion by enforcing temporary breath-holds. However, in practice, even if the ABC device indicates constant lung volume during breath-hold, the patient may still exhibit minor chest motion. Consequently, therapists are given a false sense of security that the patient is immobilized. This study aims at quantifying such motion during ABC breath-holds by monitoring the patient chest motion using a surface photogrammetry system, VisionRT. Methods: A female patient with breast cancer was selected to evaluate chest motion during ABC breath-holds. During the entire course of treatment, the patient's chest surface was monitored by a surface photogrammetry system, VisionRT. Specifically, a user-defined region-of-interest (ROI) on the chest surface was selected for the system to track at a rate of ~3 Hz. The surface motion was estimated by rigid image registration between the current ROI image captured and a reference image. The translational and rotational displacements computed were saved in a log file. Results: A total of 20 fractions of radiation treatment were monitored by VisionRT. After removing noisy data, we obtained chest motion of 79 breath-hold sessions. Mean chest motion in the AP direction during breath-holds was 1.31 mm with 0.62 mm standard deviation. Of the 79 sessions, the patient exhibited motion ranging from 0-1 mm (30 sessions), 1-2 mm (37 sessions), 2-3 mm (11 sessions) and >3 mm (1 session). Conclusion: Contrary to popular assumptions, the patient is not completely still during ABC breath-hold sessions. In this particular case, the patient exhibited chest motion over 2 mm in 14 out of 79 breath-holds. Underestimating the treatment margin for radiation therapy with ABC could reduce treatment effectiveness due to geometric miss or overdose of critical organs.
    The senior author receives research funding from NIH, VisionRT, Varian Medical Systems and Elekta.
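    The per-session summary in the Results reduces to binning peak displacements and computing summary statistics. A minimal sketch follows; the displacement values below are synthetic, drawn to match the reported mean and SD, not the study's actual log files.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic per-session peak AP displacements (mm); the abstract reports
# mean 1.31 mm, SD 0.62 mm over 79 breath-hold sessions.
disp = rng.normal(1.31, 0.62, size=79).clip(min=0.0)

bins = [0, 1, 2, 3, np.inf]
counts, _ = np.histogram(disp, bins=bins)
for lo, hi, n in zip(bins[:-1], bins[1:], counts):
    label = f">{lo}" if hi == np.inf else f"{lo}-{hi}"
    print(f"{label} mm: {n} sessions")
print(f"mean = {disp.mean():.2f} mm, sd = {disp.std(ddof=1):.2f} mm")
```

    The same binning applied to real VisionRT log files would reproduce the 0-1 / 1-2 / 2-3 / >3 mm session counts quoted in the abstract.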

  13. Detecting target changes in multiple object tracking with peripheral vision: More pronounced eccentricity effects for changes in form than in motion.

    PubMed

    Vater, Christian; Kredel, Ralf; Hossner, Ernst-Joachim

    2017-05-01

    In the current study, dual-task performance is examined with multiple-object tracking as a primary task and target-change detection as a secondary task. The to-be-detected target changes in conditions of either change type (form vs. motion; Experiment 1) or change salience (stop vs. slowdown; Experiment 2), with changes occurring at either near (5°-10°) or far (15°-20°) eccentricities (Experiments 1 and 2). The aim of the study was to test whether changes can be detected solely with peripheral vision. By controlling for saccades and computing gaze distances, we could show that participants used peripheral vision to monitor the targets and, additionally, to perceive changes at both near and far eccentricities. Noticeably, gaze behavior was not affected by the actual target change. Detection rates as well as response times generally varied as a function of change condition and eccentricity, with faster detections for motion changes and near changes. However, in contrast to the effects found for motion changes, sharp declines in detection rates and increased response times were observed for form changes as a function of eccentricity. This result can be ascribed to properties of the visual system, namely the limited spatial acuity in the periphery and the comparatively preserved motion sensitivity of peripheral vision. These findings show that peripheral vision is functional for simultaneous target monitoring and target-change detection, as saccadic information suppression can be avoided and covert attention can be optimally distributed to all targets. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Real-time Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-Ur; Jobson, Daniel J.; Woodell, Glenn A.; Harrah, Steven D.

    2005-01-01

    Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However these conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function of the system is to enhance and fuse the sensor data in order to increase the information content and quality of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well for improving low-contrast range imagery typically seen during poor visibility conditions. In general, real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In this paper we give an overview of the EVS and its performance requirements for real-time enhancement and fusion and we discuss our current real-time Retinex implementations on DSPs.
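    The single-scale Retinex at the core of the enhancement step subtracts a log-domain Gaussian-blurred illumination estimate from the log image. Below is a minimal, non-real-time sketch of that idea; the scale constant and output normalization are illustrative choices, not the patented LaRC multiscale implementation.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """FFT-based Gaussian blur (wrap-around padding, fine for a sketch)."""
    h, w = img.shape
    y = np.fft.fftfreq(h)[:, None]
    x = np.fft.fftfreq(w)[None, :]
    # The Fourier transform of a Gaussian is a Gaussian.
    g = np.exp(-2 * (np.pi * sigma) ** 2 * (x**2 + y**2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * g))

def single_scale_retinex(img, sigma=30.0, eps=1e-6):
    """R(x,y) = log I(x,y) - log [G_sigma * I](x,y), rescaled to [0,1]."""
    r = np.log(img + eps) - np.log(gaussian_blur(img, sigma) + eps)
    return (r - r.min()) / (r.max() - r.min() + eps)

# Low-contrast synthetic "haze" image: flat midtone plus faint texture
img = 0.5 + 0.05 * np.random.rand(128, 128)
out = single_scale_retinex(img)
print(out.min(), out.max())   # enhanced image spans the full [0,1] range
```

    Real-time DSP versions of this computation hinge on making the large-kernel blur cheap, which is why the paper emphasizes specialized hardware.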

  15. Real-time enhanced vision system

    NASA Astrophysics Data System (ADS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Harrah, Steven D.

    2005-05-01

    Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However these conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function of the system is to enhance and fuse the sensor data in order to increase the information content and quality of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well for improving low-contrast range imagery typically seen during poor visibility conditions. In general, real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In this paper we give an overview of the EVS and its performance requirements for real-time enhancement and fusion and we discuss our current real-time Retinex implementations on DSPs.

  16. Adaptation of velocity encoding in synaptically coupled neurons in the fly visual system.

    PubMed

    Kalb, Julia; Egelhaaf, Martin; Kurtz, Rafael

    2008-09-10

    Although many adaptation-induced effects on neuronal response properties have been described, it is often unknown at what processing stages in the nervous system they are generated. We focused on fly visual motion-sensitive neurons to identify changes in response characteristics during prolonged visual motion stimulation. By simultaneous recordings of synaptically coupled neurons, we were able to directly compare adaptation-induced effects at two consecutive processing stages in the fly visual motion pathway. This allowed us to narrow the potential sites of adaptation effects within the visual system and to relate them to the properties of signal transfer between neurons. Motion adaptation was accompanied by a response reduction, which was somewhat stronger in postsynaptic than in presynaptic cells. We found that the linear representation of motion velocity degrades during adaptation to a white-noise velocity-modulated stimulus. This effect is caused by an increasingly nonlinear velocity representation rather than by an increase of noise and is similarly strong in presynaptic and postsynaptic neurons. In accordance with this similarity, the dynamics and the reliability of interneuronal signal transfer remained nearly constant. Thus, adaptation is mainly based on processes located in the presynaptic neuron or in more peripheral processing stages. In contrast, changes of transfer properties at the analyzed synapse or in postsynaptic spike generation contribute little to changes in velocity coding during motion adaptation.

  17. Discovering the flight autostabilizer of fruit flies by inducing aerial stumbles.

    PubMed

    Ristroph, Leif; Bergou, Attila J; Ristroph, Gunnar; Coumes, Katherine; Berman, Gordon J; Guckenheimer, John; Wang, Z Jane; Cohen, Itai

    2010-03-16

    Just as the Wright brothers implemented controls to achieve stable airplane flight, flying insects have evolved behavioral strategies that ensure recovery from flight disturbances. Pioneering studies performed on tethered and dissected insects demonstrate that the sensory, neurological, and musculoskeletal systems play important roles in flight control. Such studies, however, cannot produce an integrative model of insect flight stability because they do not incorporate the interaction of these systems with free-flight aerodynamics. We directly investigate control and stability through the application of torque impulses to freely flying fruit flies (Drosophila melanogaster) and measurement of their behavioral response. High-speed video and a new motion tracking method capture the aerial "stumble," and we discover that flies respond to gentle disturbances by accurately returning to their original orientation. These insects take advantage of a stabilizing aerodynamic influence and active torque generation to recover their heading to within 2 degrees in < 60 ms. To explain this recovery behavior, we form a feedback control model that includes the fly's ability to sense body rotations, process this information, and actuate the wing motions that generate corrective aerodynamic torque. Thus, like early man-made aircraft and modern fighter jets, the fruit fly employs an automatic stabilization scheme that reacts to short time-scale disturbances.
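    The feedback model described, delayed sensing of body rotation driving corrective torque against passive aerodynamic damping, can be sketched as a discrete-time simulation of the yaw recovery after a torque impulse. All constants below (inertia, damping, gain, latency, kick size) are invented order-of-magnitude stand-ins, not the paper's fitted parameters.

```python
import numpy as np

dt = 1e-4             # time step (s)
I = 1e-12             # yaw moment of inertia (kg m^2), order-of-magnitude
C = 5e-11             # passive aerodynamic damping coefficient (N m s)
K = 2e-9              # feedback gain on sensed heading error (N m / rad)
delay_steps = int(0.005 / dt)   # ~5 ms sensorimotor latency (assumed)

theta, omega = 0.0, 50.0        # rad, rad/s: initial kick from the impulse
history = [theta]
for step in range(int(0.06 / dt)):          # simulate 60 ms of recovery
    sensed = history[-delay_steps] if step >= delay_steps else 0.0
    torque = -C * omega - K * sensed        # damping + delayed correction
    omega += (torque / I) * dt              # Euler integration of I*dw/dt
    theta += omega * dt
    history.append(theta)

peak = max(abs(t) for t in history)
print(f"peak excursion {np.degrees(peak):.1f} deg, "
      f"final error {np.degrees(history[-1]):.1f} deg")
```

    With these toy numbers the heading excursion peaks and then decays back toward zero, illustrating how delayed proportional feedback plus passive damping yields the observed self-correcting "stumble" recovery.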

  18. A visual horizon affects steering responses during flight in fruit flies.

    PubMed

    Caballero, Jorge; Mazo, Chantell; Rodriguez-Pinto, Ivan; Theobald, Jamie C

    2015-09-01

    To navigate well through three-dimensional environments, animals must in some way gauge the distances to objects and features around them. Humans use a variety of visual cues to do this, but insects, with their small size and rigid eyes, are constrained to a more limited range of possible depth cues. For example, insects attend to relative image motion when they move, but cannot change the optical power of their eyes to estimate distance. On clear days, the horizon is one of the most salient visual features in nature, offering clues about orientation, altitude and, for humans, distance to objects. We set out to determine whether flying fruit flies treat moving features as farther off when they are near the horizon. Tethered flies respond strongly to moving images they perceive as close. We measured the strength of steering responses while independently varying the elevation of moving stimuli and the elevation of a virtual horizon. We found responses to vertical bars are increased by negative elevations of their bases relative to the horizon, closely correlated with the inverse of apparent distance. In other words, a bar that dips far below the horizon elicits a strong response, consistent with using the horizon as a depth cue. Wide-field motion also had an enhanced effect below the horizon, but this was only prevalent when flies were additionally motivated with hunger. These responses may help flies tune behaviors to nearby objects and features when they are too far off for motion parallax. © 2015. Published by The Company of Biologists Ltd.
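    The horizon-as-depth-cue logic rests on simple ground-plane geometry: for an eye at height h above flat ground, a feature whose base appears an angle α below the horizon sits at ground distance h/tan α, so apparent distance falls (and 1/distance rises) as the base dips farther below the horizon. A sketch with an assumed, illustrative eye height:

```python
import math

def ground_distance(height_m, depression_deg):
    """Distance to a point on the ground seen 'depression_deg' below the
    horizon, for an observer at 'height_m' (flat-ground approximation)."""
    return height_m / math.tan(math.radians(depression_deg))

# A fly hovering 0.1 m up (assumed): bars dipping farther below the
# horizon project from nearer ground locations, so 1/distance (and, per
# the paper, steering-response strength) grows with depression angle.
for angle in (1, 5, 15, 45):
    d = ground_distance(0.1, angle)
    print(f"{angle:>2} deg below horizon -> {d:6.3f} m away, 1/d = {1/d:.2f}")
```

    This inverse-distance relation is what the measured steering responses were found to track.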

  19. Visions of our Planet's Atmosphere, Land and Oceans: NASA/NOAA Electronic-Theater 2002. Spectacular Visualizations of our Blue Marble

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Starr, David (Technical Monitor)

    2002-01-01

    The NASA/NOAA Electronic Theater presents Earth science observations and visualizations in a historical perspective. Fly in from outer space to the 2002 Winter Olympic Stadium Site of the Olympic Opening and Closing Ceremonies in Salt Lake City. Fly in and through Olympic Alpine Venues using 1 m IKONOS "Spy Satellite" data. Go back to the early weather satellite images from the 1960s and see them contrasted with the latest US and international global satellite weather movies including hurricanes and "tornadoes". See the latest visualizations of spectacular images from NASA/NOAA remote sensing missions like Terra, GOES, TRMM, SeaWiFS, Landsat 7, including new 1-min GOES rapid scan image sequences of the Nov 9th 2001 Midwest tornadic thunderstorms, and have them explained. See how High-Definition Television (HDTV) is revolutionizing the way we communicate science. (In cooperation with the American Museum of Natural History in NYC.) See dust storms in Africa and smoke plumes from fires in Mexico. See visualizations featured on the covers of Newsweek, TIME, National Geographic, Popular Science and on National and International Network TV. New computer software tools allow us to roam and zoom through massive global images, e.g. Landsat tours of the US and Africa, showing desert and mountain geology as well as seasonal changes in vegetation. See animations of the polar ice packs and the motion of gigantic Antarctic icebergs from SeaWinds data. Spectacular new visualizations of the global atmosphere and oceans are shown. See vortexes and currents in the global oceans that bring up the nutrients to feed tiny algae and draw the fish, whales and fishermen. See how the ocean blooms in response to these currents and El Niño/La Niña climate changes. See the city lights, fishing fleets, gas flares and biomass burning of the Earth at night observed by the "night-vision" DMSP military satellite.

  20. Visions of our Planet's Atmosphere, Land and Oceans: NASA/NOAA Electronic Theater 2002

    NASA Technical Reports Server (NTRS)

    Hasler, Fritz; Starr, David (Technical Monitor)

    2002-01-01

    The NASA/NOAA Electronic Theater presents Earth science observations and visualizations in a historical perspective. Fly in from outer space to the 2002 Winter Olympic Stadium Site of the Olympic Opening and Closing Ceremonies in Salt Lake City. Fly in and through Olympic Alpine Venues using 1 m IKONOS "Spy Satellite" data. Go back to the early weather satellite images from the 1960s and see them contrasted with the latest US and international global satellite weather movies including hurricanes and "tornadoes". See the latest visualizations of spectacular images from NASA/NOAA remote sensing missions like Terra, GOES, TRMM, SeaWiFS, Landsat 7, including new 1-min GOES rapid scan image sequences of the Nov 9th 2001 Midwest tornadic thunderstorms, and have them explained. See how High-Definition Television (HDTV) is revolutionizing the way we communicate science. (In cooperation with the American Museum of Natural History in NYC.) See dust storms in Africa and smoke plumes from fires in Mexico. See visualizations featured on the covers of Newsweek, TIME, National Geographic, Popular Science and on National and International Network TV. New computer software tools allow us to roam and zoom through massive global images, e.g. Landsat tours of the US and Africa, showing desert and mountain geology as well as seasonal changes in vegetation. See animations of the polar ice packs and the motion of gigantic Antarctic icebergs from SeaWinds data. Spectacular new visualizations of the global atmosphere and oceans are shown. See vortexes and currents in the global oceans that bring up the nutrients to feed tiny algae and draw the fish, whales and fishermen. See how the ocean blooms in response to these currents and El Niño/La Niña climate changes. See the city lights, fishing fleets, gas flares and biomass burning of the Earth at night observed by the "night-vision" DMSP military satellite.

  1. Dynamical Systems and Motion Vision.

    DTIC Science & Technology

    1988-04-01

    Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. A.I. Memo No. 1037, April 1988: Dynamical Systems and Motion Vision, by Joachim Heel.

  2. Detecting Motion from a Moving Platform; Phase 3: Unification of Control and Sensing for More Advanced Situational Awareness

    DTIC Science & Technology

    2011-11-01

    Report RX-TY-TR-2011-0096-01 summarizes the development of a novel computer vision sensor based upon the biological vision system of the common housefly, Musca domestica.

  3. Measuring pilot workload in a motion base simulator. III - Synchronous secondary task

    NASA Technical Reports Server (NTRS)

    Kantowitz, Barry H.; Bortolussi, Michael R.; Hart, Sandra G.

    1987-01-01

    This experiment continues earlier research of Kantowitz et al. (1983) conducted in a GAT-1 motion-base trainer to evaluate choice-reaction secondary tasks as measures of pilot work load. The earlier work used an asynchronous secondary task presented every 22 sec regardless of flying performance. The present experiment uses a synchronous task presented only when a critical event occurred on the flying task. Both two- and four-choice visual secondary tasks were investigated. Analysis of primary flying-task results showed no decrement in error for altitude, indicating that the key assumption necessary for using a choice secondary task was satisfied. Reaction times showed significant differences between 'easy' and 'hard' flight scenarios as well as the ability to discriminate among flight tasks.

  4. Stable hovering of a jellyfish-like flying machine

    PubMed Central

    Ristroph, Leif; Childress, Stephen

    2014-01-01

    Ornithopters, or flapping-wing aircraft, offer an alternative to helicopters in achieving manoeuvrability at small scales, although stabilizing such aerial vehicles remains a key challenge. Here, we present a hovering machine that achieves self-righting flight using flapping wings alone, without relying on additional aerodynamic surfaces and without feedback control. We design, construct and test-fly a prototype that opens and closes four wings, resembling the motions of swimming jellyfish more so than any insect or bird. Measurements of lift show the benefits of wing flexing and the importance of selecting a wing size appropriate to the motor. Furthermore, we use high-speed video and motion tracking to show that the body orientation is stable during ascending, forward and hovering flight modes. Our experimental measurements are used to inform an aerodynamic model of stability that reveals the importance of centre-of-mass location and the coupling of body translation and rotation. These results show the promise of flapping-flight strategies beyond those that directly mimic the wing motions of flying animals. PMID:24430122

  5. The Function and Organization of the Motor System Controlling Flight Maneuvers in Flies.

    PubMed

    Lindsay, Theodore; Sustar, Anne; Dickinson, Michael

    2017-02-06

    Animals face the daunting task of controlling their limbs using a small set of highly constrained actuators. This problem is particularly demanding for insects such as Drosophila, which must adjust wing motion for both quick voluntary maneuvers and slow compensatory reflexes using only a dozen pairs of muscles. To identify strategies by which animals execute precise actions using sparse motor networks, we imaged the activity of a complete ensemble of wing control muscles in intact, flying flies. Our experiments uncovered a remarkably efficient logic in which each of the four skeletal elements at the base of the wing are equipped with both large phasically active muscles capable of executing large changes and smaller tonically active muscles specialized for continuous fine-scaled adjustments. Based on the responses to a broad panel of visual motion stimuli, we have developed a model by which the motor array regulates aerodynamically functional features of wing motion. VIDEO ABSTRACT. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Effects of Motion on Skill Acquisition in Future Simulators

    DTIC Science & Technology

    2006-05-01

    …performed by Jacobs (1976) concentrated on transfer of training under different motion conditions. Researchers used participants with no prior flying… Autogenic feedback training exercise is superior to promethazine for the treatment of motion sickness. Journal of Clinical Pharmacology, 40, 1154-1165. …motion in simulation was examined. A particular focus was paid to research on the effects of motion cueing on transfer of training from both ground…

  7. Spatiotemporal Processing in Crossmodal Interactions for Perception of the External World: A Review

    PubMed Central

    Hidaka, Souta; Teramoto, Wataru; Sugita, Yoichi

    2015-01-01

    Research regarding crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that multisensory information (vision, audition, tactile sensation, and so on) can perceptually interact in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with updates in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view regarding crossmodal interactions holds that vision is superior to audition in spatial processing, but audition is dominant over vision in temporal processing. Similarly, vision is considered to have dominant effects over the other sensory modalities (i.e., visual capture) in spatiotemporal processing. However, recent findings demonstrate that sound can have a driving effect on visual motion perception. Moreover, studies regarding perceptual associative learning report that, after an association is established between a sound sequence without spatial information and visual motion information, the sound sequence can trigger visual motion perception. Other sensory information, such as motor action or smell, has also exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that similar activation patterns can be observed in several brain areas, including the motion processing areas, between spatiotemporal information from different sensory modalities. Based on these findings, we suggest that multimodal information may mutually interact in spatiotemporal processing in the perception of the external world and that common underlying perceptual and neural mechanisms exist for spatiotemporal processing. PMID:26733827

  8. Visual Depth from Motion Parallax and Eye Pursuit

    PubMed Central

    Stroyan, Keith; Nawrot, Mark

    2012-01-01

    A translating observer viewing a rigid environment experiences "motion parallax," the relative movement upon the observer's retina of variously positioned objects in the scene. This retinal movement of images provides a cue to the relative depth of objects in the environment; however, retinal motion alone cannot mathematically determine the relative depth of the objects. Visual perception of depth from lateral observer translation uses both retinal image motion and eye movement. In (Nawrot & Stroyan, 2009, Vision Res. 49, p. 1969) we showed mathematically that the ratio of the rate of retinal motion over the rate of smooth eye pursuit determines depth relative to the fixation point in central vision. We also reported on psychophysical experiments indicating that this ratio is the important quantity for perception. Here we analyze the motion/pursuit cue for the more general, and more complicated, case when objects are distributed across the horizontal viewing plane beyond central vision. We show how the mathematical motion/pursuit cue varies with different points across the plane and with time as an observer translates. If the time-varying retinal motion and smooth eye pursuit are the only signals used for this visual process, it is important to know what it is mathematically possible to derive about depth and structure. Our analysis shows that the motion/pursuit ratio determines an excellent description of depth and structure in these broader stimulus conditions, provides a detailed quantitative hypothesis of these visual processes for the perception of depth and structure from motion parallax, and provides a computational foundation to analyze the dynamic geometry of future experiments. PMID:21695531
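    The central ratio can be checked against the underlying lateral-translation geometry: pursuit keeps the fixation point on the fovea, and objects beyond fixation slip on the retina at a rate set by their distance. In this simplified small-angle geometry the motion/pursuit ratio recovers depth beyond fixation as a fraction of object distance. This is a sketch consistent with, but far simpler than, the paper's analysis; the speeds and distances are invented.

```python
def motion_pursuit_ratio(v, f, D):
    """Forward geometry for lateral translation at speed v: pursuit keeps
    fixation at distance f; an object at distance D along the same line
    of sight slips on the retina at v*(1/f - 1/D) (small angles)."""
    pursuit = v / f                  # eye-rotation rate (rad/s)
    retinal = v * (1 / f - 1 / D)    # retinal image motion (rad/s)
    return retinal / pursuit

# The ratio equals (D - f)/D, the depth beyond fixation as a
# fraction of the object's distance, so depth is recoverable.
f, D = 1.0, 1.5                      # fixation at 1 m, object at 1.5 m
ratio = motion_pursuit_ratio(v=0.2, f=f, D=D)
print(ratio, (D - f) / D)            # both 1/3
```

    Algebraically, retinal/pursuit = 1 - f/D = (D - f)/D, which is why the cue is independent of translation speed v.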

  9. Turning behaviour depends on frictional damping in the fruit fly Drosophila.

    PubMed

    Hesselberg, Thomas; Lehmann, Fritz-Olaf

    2007-12-01

    Turning behaviour in the fruit fly Drosophila depends on several factors including not only feedback from sensory organs and muscular control of wing motion, but also the mass moments of inertia and the frictional damping coefficient of the rotating body. In the present study we evaluate the significance of body friction for yaw turning, and thus the limits of visually mediated flight control in Drosophila, by scoring tethered flies flying in a flight simulator on their ability to visually compensate a bias on a moving object and a visual background panorama at different simulated frictional dampings. We estimated the fly's natural damping coefficient from a numerical aerodynamic model based on both friction on the body and the flapping wings during saccadic turning. The model predicts a coefficient of 54 x 10^-12 N m s, which is more than 100 times larger than the value estimated from a previous study on the body alone. Our estimate suggests that friction plays a larger role for yaw turning in Drosophila than moments of inertia. The simulator experiments showed that visual performance of the fruit fly collapses near the physical conditions estimated for freely flying animals, which is consistent with the suggested role of the halteres for flight stabilization. However, kinematic analyses indicate that the measured loss of flight control might be due predominantly to the limited fine control in the fly's steering muscles below a threshold of 1-2 degrees stroke amplitude, rather than resulting from the limits of visual motion detection by the fly's compound eyes. We discuss the impact of these results and suggest that the elevated frictional coefficient permits freely flying fruit flies to passively terminate rotational body movements without producing counter-torque during the second half of the saccadic turning manoeuvre.
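    Passive termination of a saccade under friction-dominated dynamics reduces to first-order decay, I dω/dt = -Cω, with time constant τ = I/C. The sketch below uses the paper's damping coefficient together with an assumed order-of-magnitude yaw inertia and initial saccade rate (both invented for illustration):

```python
import math

C = 54e-12    # frictional damping coefficient from the paper (N m s)
I = 0.5e-12   # yaw moment of inertia, assumed order-of-magnitude (kg m^2)

# Passive decay after torque production stops: I*dw/dt = -C*w,
# so w(t) = w0 * exp(-t/tau) with tau = I/C.
tau = I / C
w0 = math.radians(1600)   # illustrative peak saccade yaw rate (rad/s)
for t_ms in (0, 5, 10, 20):
    w = w0 * math.exp(-(t_ms / 1000) / tau)
    print(f"t = {t_ms:2d} ms: yaw rate = {math.degrees(w):7.1f} deg/s")
print(f"time constant tau = {tau * 1000:.1f} ms")
```

    With these numbers τ is on the order of 10 ms, illustrating why a large frictional coefficient lets rotation die out passively within a saccade, without counter-torque.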

  10. NASA's Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Tyburski, Timothy

    2006-01-01

    A Bold Vision for Space Exploration includes: 1) Complete the International Space Station; 2) Safely fly the Space Shuttle until 2010; 3) Develop and fly the Crew Exploration Vehicle no later than 2012; 4) Return to the moon no later than 2020; 5) Extend human presence across the solar system and beyond; 6) Implement a sustained and affordable human and robotic program; 7) Develop supporting innovative technologies, knowledge, and infrastructures; and 8) Promote international and commercial participation in exploration.

  11. Ego-motion based on EM for bionic navigation

    NASA Astrophysics Data System (ADS)

    Yue, Xiaofeng; Wang, L. J.; Liu, J. G.

    2015-12-01

    Research has shown that flying insects such as bees can achieve efficient and robust flight control, and biologists have explored some biomimetic principles regarding how they control flight. Based on those basic studies and principles acquired from flying insects, this paper proposes a different solution for recovering ego-motion for low-level navigation. Firstly, a new type of entropy flow is provided to calculate the motion parameters. Secondly, the EKF, which has been used in navigation for some years to correct accumulated error, and Expectation-Maximization (EM), which is often used to estimate parameters, are combined to determine the ego-motion estimate of aerial vehicles. Numerical simulation in MATLAB has shown that this navigation system provides more accurate position estimates and a smaller mean absolute error than pure optical-flow navigation. This paper does pioneering work in applying bionic mechanisms to space navigation.
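    The EKF-corrects-accumulated-error idea can be sketched in one dimension: naive integration of noisy ego-motion estimates drifts, while a Kalman filter (the linear special case of the EKF) fuses the same measurements with a motion model. This sketch stands in for, and does not implement, the paper's entropy-flow front end; all states, noise levels and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.1, 200
true_v = 1.0                               # constant true velocity (m/s)
true_x = true_v * dt * np.arange(1, n + 1)

x = np.array([0.0, 0.8])                   # state: [position, velocity]
P = np.eye(2)                              # state covariance
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
Q = np.diag([1e-4, 1e-3])                  # process noise
H = np.array([[0.0, 1.0]])                 # we observe velocity only
R = np.array([[0.05]])                     # measurement noise

dead_reckon, est = 0.0, []
for k in range(n):
    z = true_v + rng.normal(0.0, 0.2)      # noisy ego-velocity measurement
    dead_reckon += z * dt                  # naive integration drifts
    x = F @ x                              # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                    # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

print("dead-reckoning error:", abs(dead_reckon - true_x[-1]))
print("filtered error:      ", abs(est[-1] - true_x[-1]))
```

    The filter's smoothing of the velocity estimate is what keeps the integrated position from wandering as fast as raw dead reckoning.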

  12. Crew and Display Concepts Evaluation for Synthetic / Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence J., III

    2006-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that strive to eliminate low-visibility conditions as a causal factor in civil aircraft accidents and replicate the operational benefits of clear-day flight operations, regardless of the actual outside visibility condition. Enhanced Vision System (EVS) technologies are analogous and complementary in many respects to SVS, with the principal difference being that EVS is an imaging sensor presentation, as opposed to a database-derived image. The use of EVS in civil aircraft is projected to increase rapidly as the Federal Aviation Administration recently changed the aircraft operating rules under Part 91, revising the flight visibility requirements for conducting operations to civil airports. Operators conducting straight-in instrument approach procedures may now operate below the published approach minimums when using an approved EVS that shows the required visual references on the pilot's Head-Up Display. An experiment was conducted to evaluate the complementary use of SVS and EVS technologies, specifically focusing on new techniques for integration and/or fusion of synthetic and enhanced vision technologies and crew resource management while operating under the newly adopted FAA rules, which provide operating credit for EVS. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying.

  13. Perceived spatial displacement of motion-defined contours in peripheral vision.

    PubMed

    Fan, Zhao; Harris, John

    2008-12-01

    The perceived displacement of motion-defined contours in peripheral vision was examined in four experiments. In Experiment 1, in line with Ramachandran and Anstis' finding [Ramachandran, V. S., & Anstis, S. M. (1990). Illusory displacement of equiluminous kinetic edges. Perception, 19, 611-616], the border between a field of drifting dots and a static dot pattern was apparently displaced in the same direction as the movement of the dots. When a uniform dark area was substituted for the static dots, a similar displacement was found, but this was smaller and statistically insignificant. In Experiment 2, the border between two fields of dots moving in opposite directions was displaced in the direction of motion of the dots in the more eccentric field, so that the location of a boundary defined by a diverging pattern is perceived as more eccentric, and that defined by a converging pattern as less eccentric. Two explanations for this effect (that the displacement reflects a greater weight given to the more eccentric motion, or that the region containing stronger centripetal motion components expands perceptually into that containing centrifugal motion) were tested in Experiment 3, by varying the velocity of the more eccentric region. The results favoured the explanation based on the expansion of an area in centripetal motion. Experiment 4 showed that the difference in perceived location was unlikely to be due to differences in the discriminability of contours in diverging and converging patterns, and confirmed that this effect is due to a difference between centripetal and centrifugal motion rather than to motion components in other directions. Our results provide new evidence for a bias towards centripetal motion in human vision, and suggest that the direction of motion-induced displacement of edges is not always the direction of an adjacent moving pattern.

  14. Wing-kinematics measurement and aerodynamics in a small insect in hovering flight.

    PubMed

    Cheng, Xin; Sun, Mao

    2016-05-11

    Wing motion of the small hovering fly Liriomyza sativae was measured using high-speed video, and the flows around the wings were computed numerically. The fly used a high wingbeat frequency (≈265 Hz) and a large stroke amplitude (≈182°); therefore, even though its wing length (R) was small (R ≈ 1.4 mm), the mean velocity of the wing reached ≈1.5 m/s, the same as that of an average-sized insect (R ≈ 3 mm). But the Reynolds number (Re) of the wing was still low (≈40), owing to the small wing size. Because of the large stroke amplitude, the outer parts of the wings performed a "clap and fling" motion. The mean lift coefficient was high, ≈1.85, several times larger than that of a cruising airplane. The partial "clap and fling" motion increased the lift by ≈7% compared with the case of no aerodynamic interaction between the wings. The fly mainly used the delayed-stall mechanism to generate the high lift. The lift-to-drag ratio is only 0.7 (for larger insects, with Re of about 100 or higher, the ratio is 1-1.2); that is, although the small fly can produce enough lift to support its weight, it must overcome a larger drag to do so.
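For a sense of scale, the quoted mean wing velocity and Reynolds number can be recovered from the measured kinematics; the mean chord and the reference radius below are assumptions for illustration (they are not given in this record):

```python
import math

# Measured kinematics from the record; chord length and reference
# radius are illustrative assumptions (not stated in the abstract).
f = 265.0                    # wingbeat frequency, Hz
phi = math.radians(182.0)    # stroke amplitude, rad
R = 1.4e-3                   # wing length, m
r_ref = 0.6 * R              # assumed reference radius along the span
c = 0.4e-3                   # assumed mean chord, m
nu = 1.5e-5                  # kinematic viscosity of air, m^2/s

# Mean wing speed: the reference point sweeps an arc of phi * r_ref
# twice per wingbeat period (downstroke plus upstroke).
U = 2.0 * phi * f * r_ref
Re = U * c / nu

# U ≈ 1.4 m/s and Re ≈ 38, consistent with the values quoted above
```

The wingtip itself moves faster (substitute `R` for `r_ref`); the ≈0.6R reference radius is a common convention for characterizing mean aerodynamic conditions over the span.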

  15. Use of 3D vision for fine robot motion

    NASA Technical Reports Server (NTRS)

    Lokshin, Anatole; Litwin, Todd

    1989-01-01

    An integration of 3-D vision systems with robot manipulators will allow robots to operate in a poorly structured environment by visually locating targets and obstacles. However, using computer vision for object acquisition makes the problem of overall system calibration even more difficult. Indeed, in CAD-based manipulation the control architecture has to find an accurate mapping between the 3-D Euclidean work space and the robot configuration space (joint angles). If stereo vision is involved, then one needs to map a pair of 2-D video images directly into the robot configuration space. Neural-network approaches aside, a common solution to this problem is to calibrate the vision system and the manipulator independently, and then tie them together via a common mapping into the task space. In other words, both vision and robot refer to some common absolute Euclidean coordinate frame via their individual mappings. This approach has two major difficulties. First, the vision system has to be calibrated over the total work space. Second, the absolute frame, which is usually quite arbitrary, has to be the same, to a high degree of precision, for both the robot and vision subsystem calibrations. The use of computer vision to allow robust fine-motion manipulation in a poorly structured world, currently in progress, is described along with preliminary results and the problems encountered.

  16. Small fruit flies sacrifice temporal acuity to maintain contrast sensitivity.

    PubMed

    Currea, John P; Smith, Joshua L; Theobald, Jamie C

    2018-06-05

    Holometabolous insects, like fruit flies, grow primarily during larval development. Scarce larval feeding is common in nature and generates smaller adults. Despite the importance of vision to flies, eye size scales proportionately with body size, and smaller eyes confer poorer vision due to smaller optics. Variable larval feeding, therefore, causes within-species differences in visual processing, which have gone largely unnoticed due to ad libitum feeding in the lab that results in generally large adults. Do smaller eyes have smaller ommatidial lenses, reducing sensitivity, or broader inter-ommatidial angles, reducing acuity? And to what extent might neural processes adapt to these optical challenges with temporal and spatial summation? To understand this in the fruit fly, we generated a distribution of body lengths (1.67-2.34 mm; n = 24) and eye lengths (0.33-0.44 mm; n = 24), resembling the distribution of wild-caught flies, by removing larvae from food during their third instar. We find smaller eyes (0.19 vs. 0.07 mm²) have substantially fewer (978 vs. 540; n = 45) and smaller ommatidia (222 vs. 121 μm²; n = 45) separated by slightly wider inter-ommatidial angles (4.5° vs. 5.5°; n = 34). This corresponds to a greater loss in contrast sensitivity (<50%) than spatial acuity (<20%). Using a flight arena and psychophysics paradigm, we find that smaller flies lose little spatial acuity (0.126 vs. 0.118 CPD; n = 45), and recover contrast sensitivity (2.22 for both; n = 65) by sacrificing temporal acuity (26.3 vs. 10.8 Hz; n = 112) at the neural level. Therefore, smaller flies sacrifice contrast sensitivity to maintain spatial acuity optically, but recover contrast sensitivity, almost completely, by sacrificing temporal acuity neurally.
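The link between inter-ommatidial angle and spatial acuity quoted above follows the usual Nyquist sampling argument; a minimal sketch using the textbook formula (this is the optical sampling limit, not the study's psychophysical estimate, which also reflects neural processing):

```python
def nyquist_acuity_cpd(delta_phi_deg):
    """Finest resolvable spatial frequency, in cycles per degree,
    for a sampling array with inter-receptor angle delta_phi:
    one full cycle needs at least two samples (Nyquist limit)."""
    return 1.0 / (2.0 * delta_phi_deg)

# Inter-ommatidial angles reported in the record
large_eye = nyquist_acuity_cpd(4.5)   # ≈ 0.111 CPD
small_eye = nyquist_acuity_cpd(5.5)   # ≈ 0.091 CPD
```

These optical limits sit close to the behavioral values quoted above (0.126 vs. 0.118 CPD), consistent with the abstract's conclusion that the flies lose relatively little spatial acuity.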

  17. Global Methods for Image Motion Analysis

    DTIC Science & Technology

    1992-10-01

    a variant of the same error function as in Adiv [2]. Another related approach was presented by Maybank [46, 45]. Nearly all researchers in motion...with an application to stereo vision. In Proc. 7th Intern. Joint Conference on AI, pages 674-679, Vancouver, 1981. [45] S. J. Maybank. Algorithm for...analysing optical flow based on the least-squares method. Image and Vision Computing, 4:38-42, 1986. [46] S. J. Maybank. A Theoretical Study of Optical

  18. Dynamic and predictive links between touch and vision.

    PubMed

    Gray, Rob; Tan, Hong Z

    2002-07-01

    We investigated crossmodal links between vision and touch for moving objects. In experiment 1, observers discriminated visual targets presented randomly at one of five locations on their forearm. Tactile pulses simulating motion along the forearm preceded visual targets. At short tactile-visual ISIs, discriminations were more rapid when the final tactile pulse and visual target were at the same location. At longer ISIs, discriminations were more rapid when the visual target was offset in the motion direction and were slower for offsets opposite to the motion direction. In experiment 2, speeded tactile discriminations at one of three random locations on the forearm were preceded by a visually simulated approaching object. Discriminations were more rapid when the object approached the location of the tactile stimulation and discrimination performance was dependent on the approaching object's time to contact. These results demonstrate dynamic links in the spatial mapping between vision and touch.

  19. FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision

    PubMed Central

    Botella, Guillermo; Martín H., José Antonio; Santos, Matilde; Meyer-Baese, Uwe

    2011-01-01

    Motion estimation is a low-level vision task that is especially relevant due to its wide range of applications in the real world. Many of the best motion estimation algorithms include some of the features found in mammalian vision, which would demand huge computational resources and are therefore not usually available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system and an analysis of the computational resources and performance of the applied algorithms. PMID:22164069
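The "image moments" primitive mentioned above can be illustrated with standard raw moments; the paper's orthogonal variant moments are a related but different construction not detailed in this record, so this sketch shows only the generic idea:

```python
import numpy as np

def raw_moment(img, p, q):
    """Raw image moment M_pq = sum over pixels of x^p * y^q * I(y, x)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return float(np.sum((xs ** p) * (ys ** q) * img))

img = np.zeros((8, 8))
img[2:5, 3:7] = 1.0                 # a bright 3x4 rectangle

m00 = raw_moment(img, 0, 0)         # total "mass": 12 bright pixels
cx = raw_moment(img, 1, 0) / m00    # centroid x = 4.5
cy = raw_moment(img, 0, 1) / m00    # centroid y = 3.0
```

Low-order moments like these give cheap region descriptors (area, centroid, orientation) that can be combined with per-pixel optical flow to form the kind of mid-level abstraction layer the abstract describes.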

  1. Discovering the flight autostabilizer of fruit flies by inducing aerial stumbles

    PubMed Central

    Ristroph, Leif; Bergou, Attila J.; Ristroph, Gunnar; Coumes, Katherine; Berman, Gordon J.; Guckenheimer, John; Wang, Z. Jane; Cohen, Itai

    2010-01-01

    Just as the Wright brothers implemented controls to achieve stable airplane flight, flying insects have evolved behavioral strategies that ensure recovery from flight disturbances. Pioneering studies performed on tethered and dissected insects demonstrate that the sensory, neurological, and musculoskeletal systems play important roles in flight control. Such studies, however, cannot produce an integrative model of insect flight stability because they do not incorporate the interaction of these systems with free-flight aerodynamics. We directly investigate control and stability through the application of torque impulses to freely flying fruit flies (Drosophila melanogaster) and measurement of their behavioral response. High-speed video and a new motion tracking method capture the aerial “stumble,” and we discover that flies respond to gentle disturbances by accurately returning to their original orientation. These insects take advantage of a stabilizing aerodynamic influence and active torque generation to recover their heading to within 2° in < 60 ms. To explain this recovery behavior, we form a feedback control model that includes the fly’s ability to sense body rotations, process this information, and actuate the wing motions that generate corrective aerodynamic torque. Thus, like early man-made aircraft and modern fighter jets, the fruit fly employs an automatic stabilization scheme that reacts to short time-scale disturbances. PMID:20194789
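A feedback-control model of this kind can be sketched as a delayed proportional-derivative regulator acting on toy yaw dynamics; the gains, delay, and units below are illustrative assumptions, not the fitted values from the study:

```python
def simulate(kp=5.0, kd=3.0, delay_steps=2, dt=0.01, steps=2000):
    """Toy closed-loop yaw recovery after an impulsive disturbance,
    with a short sensing delay (illustrative parameters only)."""
    theta, omega = 0.0, 5.0            # the impulse has just kicked the body rate
    buf = [(0.0, 0.0)] * delay_steps   # delayed sensory estimate of (theta, omega)
    trace = []
    for _ in range(steps):
        th_d, om_d = buf.pop(0)        # what the fly "senses" now
        buf.append((theta, omega))
        torque = -kp * th_d - kd * om_d  # corrective aerodynamic torque
        omega += dt * torque             # Euler integration of the yaw dynamics
        theta += dt * omega
        trace.append(theta)
    return trace

trace = simulate()
# heading is perturbed, then the PD feedback brings it back near zero
```

The derivative (rate-damping) term plays the role of the stabilizing aerodynamic and active damping described in the abstract; removing it, or lengthening the delay, makes the toy loop oscillate rather than settle.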

  2. Residual perception of biological motion in cortical blindness.

    PubMed

    Ruffieux, Nicolas; Ramon, Meike; Lao, Junpeng; Colombo, Françoise; Stacchi, Lisa; Borruat, François-Xavier; Accolla, Ettore; Annoni, Jean-Marie; Caldara, Roberto

    2016-12-01

    From birth, the human visual system shows a remarkable sensitivity for perceiving biological motion. This visual ability relies on a distributed network of brain regions and can be preserved even after damage to high-level ventral visual areas. However, it remains unknown whether this critical biological skill can withstand the loss of vision following bilateral striate damage. To address this question, we tested the categorization of human and animal biological motion in BC, a rare case of cortical blindness after anoxia-induced bilateral striate damage. The severity of his impairment, encompassing various aspects of vision (i.e., color, shape, face, and object recognition) and causing blind-like behavior, contrasts with a residual ability to process motion. We presented BC with static or dynamic point-light displays (PLDs) of human or animal walkers. These stimuli were presented either individually, or in pairs in two-alternative forced choice (2AFC) tasks. When confronted with individual PLDs, the patient was unable to categorize the stimuli, irrespective of whether they were static or dynamic. In the 2AFC task, BC exhibited appropriate eye movements towards diagnostic information, but performed at chance level with static PLDs, in stark contrast to his ability to efficiently categorize dynamic biological agents. This striking ability to categorize biological motion when top-down information is provided is important for at least two reasons. First, it emphasizes the importance of assessing patients' (visual) abilities across a range of task constraints, which can reveal potential residual abilities that may in turn represent a key feature for patient rehabilitation. Second, our findings reinforce the view that the neural network processing biological motion can operate efficiently despite severely impaired low-level vision, positing our natural predisposition for processing dynamicity in biological agents as a robust feature of human vision.

  3. Multiple Drosophila Tracking System with Heading Direction

    PubMed Central

    Sirigrivatanawong, Pudith; Arai, Shogo; Thoma, Vladimiros; Hashimoto, Koichi

    2017-01-01

    Machine vision systems have been widely used for image analysis, especially analysis that is beyond human ability. In biology, studies of behavior help scientists to understand the relationship between sensory stimuli and animal responses. This typically requires the analysis and quantification of animal locomotion. In our work, we focus on the analysis of the locomotion of the fruit fly Drosophila melanogaster, a widely used model organism in biological research. Our system consists of two components: fly detection and tracking. Our system provides the ability to extract a group of flies as the objects of concern and furthermore determines the heading direction of each fly. As each fly moves, the system states are refined with a Kalman filter to obtain the optimal estimate. For the tracking step, combining information such as position and heading direction with assignment algorithms gives successful tracking results. The use of heading direction increases the system's efficiency when dealing with identity loss and fly-swapping situations. The system can also operate on a variety of videos with different light intensities. PMID:28067800
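The assignment step described above (matching detections across frames using position plus heading) might be sketched as follows; the cost function, its weight, and the brute-force search are illustrative assumptions, not the paper's exact algorithm:

```python
import math
from itertools import permutations

def assign(prev, curr, w_heading=0.5):
    """Match flies between frames by minimizing the summed cost over
    all permutations (fine for a handful of flies). Each fly is
    (x, y, heading_rad); the cost mixes distance and heading change."""
    def cost(a, b):
        d = math.hypot(a[0] - b[0], a[1] - b[1])
        # smallest absolute heading difference, wrapped to [0, pi]
        dh = abs((a[2] - b[2] + math.pi) % (2 * math.pi) - math.pi)
        return d + w_heading * dh
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(curr))):
        c = sum(cost(prev[i], curr[j]) for i, j in enumerate(perm))
        if c < best_cost:
            best, best_cost = perm, c
    return best  # best[i] = index in curr matched to prev fly i

prev = [(0.0, 0.0, 0.0), (5.0, 5.0, 1.5)]
curr = [(5.2, 5.1, 1.4), (0.3, 0.1, 0.1)]
match = assign(prev, curr)   # fly 0 -> detection 1, fly 1 -> detection 0
```

Including the heading term is what resolves fly-swapping: when two flies pass close to each other, position alone is ambiguous but their headings usually still disambiguate the match. For larger groups, a polynomial-time solver such as the Hungarian algorithm would replace the permutation search.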

  4. DC-Powered Jumping Ring

    NASA Astrophysics Data System (ADS)

    Jeffery, Rondo N.; Amiri, Farhang

    2016-02-01

    The classroom jumping ring demonstration is nearly always performed using alternating current (AC), in which the ring jumps or flies off the extended iron core when the switch is closed. The ring jumps higher when cooled with liquid nitrogen (LN2). We have performed experiments using DC to power the solenoid and find similarities to and significant differences from the AC case. In particular, the ring does not fly off the core but rises a short distance and then falls back. Even when the ring jumps high, its rising and falling motion does not follow the simple vertical motion of a projectile. This indicates that there are additional forces on the ring in each part of its motion. Four possible stages of the motion of the ring under DC are identified, which result from the ring current changing direction during the jump in response to the changing magnetic flux through the moving ring.

  5. Convex optimisation approach to constrained fuel optimal control of spacecraft in close relative motion

    NASA Astrophysics Data System (ADS)

    Massioni, Paolo; Massari, Mauro

    2018-05-01

    This paper describes an interesting and powerful approach to the constrained fuel-optimal control of spacecraft in close relative motion. The proposed approach is well suited to problems with linear dynamic equations, and therefore fits the case of spacecraft flying in close relative motion particularly well. If the solution of the optimisation is approximated as a polynomial in the time variable, then the problem can be approached with a technique developed in the control engineering community, known as "Sum of Squares" (SOS), and the constraints can be reduced to bounds on the polynomials. Such a technique allows polynomial bounding problems to be rewritten as convex optimisation problems, at the cost of a certain amount of conservatism. The principles of the technique are explained and some applications related to spacecraft flying in close relative motion are shown.

  6. Letting Your Students "Fly" in the Classroom.

    ERIC Educational Resources Information Center

    Adams, Thomas

    1997-01-01

    Students investigate the concept of motion by making simple paper airplanes and flying them in the classroom. Students are introduced to conversion factors to calculate various speeds. Additional activities include rounding decimal numbers, estimating, finding averages, making bar graphs, and solving problems. Offers ideas for extension such as…

  7. Enabling Spacecraft Formation Flying in Any Earth Orbit Through Spaceborne GPS and Enhanced Autonomy Technologies

    NASA Technical Reports Server (NTRS)

    Bauer, F. H.; Bristow, J. O.; Carpenter, J. R.; Garrison, J. L.; Hartman, K. R.; Lee, T.; Long, A. C.; Kelbel, D.; Lu, V.; How, J. P.

    2000-01-01

    Formation flying is quickly revolutionizing the way the space community conducts autonomous science missions around the Earth and in space. This technological revolution will provide new, innovative ways for this community to gather scientific information, share this information between space vehicles and the ground, and expedite the human exploration of space. Once fully matured, this technology will result in swarms of space vehicles flying as a virtual platform and gathering significantly more and better science data than is possible today. Formation flying will be enabled through the development and deployment of spaceborne differential Global Positioning System (GPS) technology and through innovative spacecraft autonomy techniques. This paper provides an overview of the current status of the NASA/DoD/industry/university partnership working to bring formation flying technology to the forefront as quickly as possible, the hurdles that need to be overcome to achieve the formation flying vision, and the team's approach to transferring this technology to space. It also describes some of the formation flying testbeds, such as Orion, that are being developed to demonstrate and validate these innovative GPS sensing and formation control technologies.

  8. Analog "neuronal" networks in early vision.

    PubMed Central

    Koch, C; Marroquin, J; Yuille, A

    1986-01-01

    Many problems in early vision can be formulated in terms of minimizing a cost function. Examples are shape from shading, edge detection, motion analysis, structure from motion, and surface interpolation. As shown by Poggio and Koch [Poggio, T. & Koch, C. (1985) Proc. R. Soc. London, Ser. B 226, 303-323], quadratic variational problems, an important subset of early vision tasks, can be "solved" by linear, analog electrical, or chemical networks. However, in the presence of discontinuities, the cost function is nonquadratic, raising the question of designing efficient algorithms for computing the optimal solution. Recently, Hopfield and Tank [Hopfield, J. J. & Tank, D. W. (1985) Biol. Cybern. 52, 141-152] have shown that networks of nonlinear analog "neurons" can be effective in computing the solution of optimization problems. We show how these networks can be generalized to solve the nonconvex energy functionals of early vision. We illustrate this approach by implementing a specific analog network, solving the problem of reconstructing a smooth surface from sparse data while preserving its discontinuities. These results suggest a novel computational strategy for solving early vision problems in both biological and real-time artificial vision systems. PMID:3459172

  9. Are visual peripheries forever young?

    PubMed

    Burnat, Kalina

    2015-01-01

    The paper presents a concept of lifelong plasticity of peripheral vision. Central vision processing is accepted as critical and irreplaceable for normal perception in humans. While peripheral processing chiefly carries information about motion stimuli features and redirects foveal attention to new objects, it can also take over functions typical for central vision. Here I review the data showing the plasticity of peripheral vision found in functional, developmental, and comparative studies. Even though it is well established that afferent projections from central and peripheral retinal regions are not established simultaneously during early postnatal life, central vision is commonly used as a general model of development of the visual system. Based on clinical studies and visually deprived animal models, I describe how central and peripheral visual field representations separately rely on early visual experience. Peripheral visual processing (motion) is more affected by binocular visual deprivation than central visual processing (spatial resolution). In addition, our own experimental findings show the possible recruitment of coarse peripheral vision for fine spatial analysis. Accordingly, I hypothesize that the balance between central and peripheral visual processing, established in the course of development, is susceptible to plastic adaptations during the entire life span, with peripheral vision capable of taking over central processing.

  10. Visual Control for Multirobot Organized Rendezvous.

    PubMed

    Lopez-Nicolas, G; Aranda, M; Mezouar, Y; Sagues, C

    2012-08-01

    This paper addresses the problem of visual control of a set of mobile robots. In our framework, the perception system consists of an uncalibrated flying camera performing an unknown general motion. The robots are assumed to undergo planar motion considering nonholonomic constraints. The goal of the control task is to drive the multirobot system to a desired rendezvous configuration relying solely on visual information given by the flying camera. The desired multirobot configuration is defined with an image of the set of robots in that configuration without any additional information. We propose a homography-based framework relying on the homography induced by the multirobot system that gives a desired homography to be used to define the reference target, and a new image-based control law that drives the robots to the desired configuration by imposing a rigidity constraint. This paper extends our previous work, and the main contributions are that the motion constraints on the flying camera are removed, the control law is improved by reducing the number of required steps, the stability of the new control law is proved, and real experiments are provided to validate the proposal.

  11. Neural dynamics for landmark orientation and angular path integration

    PubMed Central

    Seelig, Johannes D.; Jayaraman, Vivek

    2015-01-01

    Summary Many animals navigate using a combination of visual landmarks and path integration. In mammalian brains, head direction cells integrate these two streams of information by representing an animal's heading relative to landmarks, yet maintaining their directional tuning in darkness based on self-motion cues. Here we use two-photon calcium imaging in head-fixed flies walking on a ball in a virtual reality arena to demonstrate that landmark-based orientation and angular path integration are combined in the population responses of neurons whose dendrites tile the ellipsoid body — a toroidal structure in the center of the fly brain. The population encodes the fly's azimuth relative to its environment, tracking visual landmarks when available and relying on self-motion cues in darkness. When both visual and self-motion cues are absent, a representation of the animal's orientation is maintained in this network through persistent activity — a potential substrate for short-term memory. Several features of the population dynamics of these neurons and their circular anatomical arrangement are suggestive of ring attractors — network structures proposed to support the function of navigational brain circuits. PMID:25971509

  12. Relative dynamics and motion control of nanosatellite formation flying

    NASA Astrophysics Data System (ADS)

    Pimnoo, Ammarin; Hiraki, Koju

    2016-04-01

    Orbit selection is a necessary factor in nanosatellite formation mission design; meanwhile, keeping the formation requires fuel. Therefore, the best orbit design for nanosatellite formation flying is the one that requires the minimum fuel consumption. The purpose of this paper is to analyse orbit selection with respect to minimum fuel consumption, to provide a convenient way to estimate the fuel needed to maintain nanosatellite formation flying, and to present a simplified method of formation control. The formation structure is disturbed by the J2 gravitational perturbation and by other perturbing accelerations such as atmospheric drag. First, Gauss' Variational Equations (GVE) are used to estimate the essential ΔV due to the J2 perturbation and atmospheric drag; this essential ΔV indicates which orbit is best with respect to minimum fuel consumption. Then, the Schweighart-Sedwick linear equations, which account for the J2 gravitational perturbation, are presented and used to estimate the fuel consumption required to maintain the formation structure. Finally, the relative dynamics of the motion are presented, along with a simplified motion control of the formation structure using GVE.
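For context, the dominant secular J2 effect that such ΔV budgets must counteract is the drift of the ascending node; a standard textbook computation (this is the classical first-order formula, not the paper's Schweighart-Sedwick model):

```python
import math

MU = 398600.4418   # km^3/s^2, Earth gravitational parameter
RE = 6378.137      # km, Earth equatorial radius
J2 = 1.08263e-3    # Earth's second zonal harmonic

def nodal_regression_deg_per_day(a_km, e, inc_deg):
    """Secular drift of the ascending node due to J2:
    dOmega/dt = -1.5 * n * J2 * (RE / p)^2 * cos(i)."""
    n = math.sqrt(MU / a_km ** 3)       # mean motion, rad/s
    p = a_km * (1.0 - e ** 2)           # semi-latus rectum, km
    rate = -1.5 * n * J2 * (RE / p) ** 2 * math.cos(math.radians(inc_deg))
    return math.degrees(rate) * 86400.0

# 700 km circular orbit at the sun-synchronous inclination:
# the node should drift eastward by about +0.986 deg/day
sso_drift = nodal_regression_deg_per_day(7078.0, 0.0, 98.19)
```

Because the drift depends on semi-major axis and inclination, two formation members on slightly different orbits drift apart secularly; the ΔV needed to cancel that differential drift is one of the quantities the GVE-based estimate in this paper provides.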

  13. Computer vision-based technologies and commercial best practices for the advancement of the motion imagery tradecraft

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Capel, David; Srinivasan, James

    2014-06-01

    Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing, and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing, Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a technology-application and commercial-adoption course that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to remedy existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services built on computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure, and following commercial best practices.

  14. Influence of restricted vision and knee joint range of motion on gait properties during level walking and stair ascent and descent.

    PubMed

    Demura, Tomohiro; Demura, Shin-ich

    2011-01-01

    Because elderly individuals experience marked declines in various physical functions (e.g., vision, joint function) simultaneously, it is difficult to clarify the individual effects of these functional declines on walking. However, by imposing vision and joint function restrictions on young men, the effects of these functional declines on walking can be clarified. The authors aimed to determine the effect of restricted vision and restricted range of motion (ROM) of the knee joint on gait properties while walking and ascending or descending stairs. Fifteen healthy young adults performed level walking and stair ascent and descent under control, vision restriction, and knee joint ROM restriction conditions. During level walking, walking speed and step width decreased, and double support time increased significantly, with vision and knee joint ROM restrictions. Stance time, step width, and walking angle increased only with knee joint ROM restriction. Stance time, swing time, and double support time were significantly longer in level walking, stair descent, and stair ascent, in that order. The gait changes under the vision and knee joint ROM restrictions were significantly larger than those under the control condition. In conclusion, vision and knee joint ROM restrictions affect gait during level walking and stair ascent and descent. This effect is marked in stair ascent with knee joint ROM restriction.

  15. Wing attachment position of fruit fly minimizes flight cost

    NASA Astrophysics Data System (ADS)

    Noest, Robert; Wang, Jane

    Flight is energetically costly, which means insects need ways to reduce their energy expenditure during sustained flight. Previous work has shown that insect muscles can recover some of the energy used to produce the flapping motion, and that the observed flapping motions are efficient for generating the force required to balance the weight. In this talk, we show that one morphological parameter, the wing attachment point on a fly, is suitably located to further reduce the cost of flight while allowing the fly to remain close to stable. We investigate why this is the case and attempt to find a general rule for the optimal location of the wing hinge. Our analysis is based on computations of flapping free flight together with Floquet stability analysis of periodic flight for descending, hovering, and ascending cases.

  16. Local motion adaptation enhances the representation of spatial structure at EMD arrays

    PubMed Central

    Lindemann, Jens P.; Egelhaaf, Martin

    2017-01-01

    Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion even holds under the dynamic flight conditions of insects. PMID:29281631
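The local motion detectors referenced above are commonly modeled as correlation-type (Hassenstein-Reichardt) elementary motion detectors. A minimal discrete-time sketch of the basic, non-adaptive correlator (illustrative only; the paper's model adds adaptation on top of this scheme):

```python
def reichardt_emd(left, right, delay=1):
    """Basic Hassenstein-Reichardt correlator over two photoreceptor
    signals: each half-detector multiplies the delayed signal of one
    receptor with the undelayed signal of its neighbour; subtracting
    the mirror-symmetric half makes the output direction-selective."""
    out = []
    for t in range(delay, len(left)):
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out

# A brightness pulse moving left-to-right gives a net positive response,
# the reverse direction a net negative one.
rightward = reichardt_emd([0, 1, 0, 0, 0], [0, 0, 1, 0, 0])
leftward = reichardt_emd([0, 0, 1, 0, 0], [0, 1, 0, 0, 0])
```

The adaptive variant studied in the paper would additionally attenuate these outputs during sustained constant-velocity motion while preserving sensitivity to rapid velocity changes.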

  17. Vision System Measures Motions of Robot and External Objects

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Matthies, Larry

    2008-01-01

    A prototype of an advanced robotic vision system both (1) measures its own motion with respect to a stationary background and (2) detects other moving objects and estimates their motions, all by use of visual cues. Like some prior robotic and other optoelectronic vision systems, this system is based partly on concepts of optical flow and visual odometry. Whereas prior optoelectronic visual-odometry systems have been limited to frame rates of no more than 1 Hz, a visual-odometry subsystem that is part of this system operates at a frame rate of 60 to 200 Hz, given optical-flow estimates. The overall system operates at an effective frame rate of 12 Hz. Moreover, unlike prior machine-vision systems for detecting motions of external objects, this system need not remain stationary: it can detect such motions while it is moving (even vibrating). The system includes a stereoscopic pair of cameras mounted on a moving robot. The outputs of the cameras are digitized, then processed to extract positions and velocities. The initial image-data-processing functions of this system are the same as those of some prior systems: Stereoscopy is used to compute three-dimensional (3D) positions for all pixels in the camera images. For each pixel of each image, optical flow between successive image frames is used to compute the two-dimensional (2D) apparent relative translational motion of the point transverse to the line of sight of the camera. The challenge in designing this system was to provide for utilization of the 3D information from stereoscopy in conjunction with the 2D information from optical flow to distinguish between motion of the camera pair and motions of external objects, compute the motion of the camera pair in all six degrees of translational and rotational freedom, and robustly estimate the motions of external objects, all in real time. 
To meet this challenge, the system is designed to perform the following image-data-processing functions: The visual-odometry subsystem (the subsystem that estimates the motion of the camera pair relative to the stationary background) utilizes the 3D information from stereoscopy and the 2D information from optical flow. It computes the relationship between the 3D and 2D motions and uses a least-mean-squares technique to estimate motion parameters. The least-mean-squares technique is suitable for real-time implementation when the number of external-moving-object pixels is smaller than the number of stationary-background pixels.
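The least-mean-squares step can be illustrated with a deliberately reduced, hypothetical case (one translational degree of freedom rather than the system's six): for a camera translating laterally at velocity t_x, each pixel's horizontal flow is approximately u_i = -f*t_x/Z_i, where Z_i is the stereo depth, so t_x follows from a one-parameter least-squares fit:

```python
def estimate_lateral_velocity(flows_u, depths, f=700.0):
    """Least-squares estimate of lateral camera velocity t_x from per-pixel
    horizontal flow u_i (px/s) and stereo depth Z_i (m), under the pure
    lateral-translation model u_i = -f * t_x / Z_i (f in pixels).
    Minimizing sum((u_i + f*t_x/Z_i)**2) gives the closed form below."""
    num = sum(u * (f / z) for u, z in zip(flows_u, depths))
    den = sum((f / z) ** 2 for z in depths)
    return -num / den

# synthetic check: flow generated by t_x = 0.5 m/s is recovered exactly
depths = [2.0, 4.0, 5.0, 10.0]
flows = [-700.0 * 0.5 / z for z in depths]
```

The actual system solves the analogous overdetermined problem in all six degrees of freedom; as the abstract notes, the estimate is reliable only while stationary-background pixels outnumber moving-object pixels.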

  18. Wing-kinematics measurement and aerodynamics in a small insect in hovering flight

    PubMed Central

    Cheng, Xin; Sun, Mao

    2016-01-01

    Wing-motion of hovering small fly Liriomyza sativae was measured using high-speed video and flows of the wings calculated numerically. The fly used high wingbeat frequency (≈265 Hz) and large stroke amplitude (≈182°); therefore, even if its wing-length (R) was small (R ≈ 1.4 mm), the mean velocity of wing reached ≈1.5 m/s, the same as that of an average-size insect (R ≈ 3 mm). But the Reynolds number (Re) of wing was still low (≈40), owing to the small wing-size. In increasing the stroke amplitude, the outer parts of the wings had a “clap and fling” motion. The mean-lift coefficient was high, ≈1.85, several times larger than that of a cruising airplane. The partial “clap and fling” motion increased the lift by ≈7%, compared with the case of no aerodynamic interaction between the wings. The fly mainly used the delayed stall mechanism to generate the high-lift. The lift-to-drag ratio is only 0.7 (for larger insects, Re being about 100 or higher, the ratio is 1–1.2); that is, although the small fly can produce enough lift to support its weight, it needs to overcome a larger drag to do so. PMID:27168523

  19. A vision-based system for measuring the displacements of large structures: Simultaneous adaptive calibration and full motion estimation

    NASA Astrophysics Data System (ADS)

    Santos, C. Almeida; Costa, C. Oliveira; Batista, J.

    2016-05-01

    The paper describes a kinematic model-based solution to estimate simultaneously the calibration parameters of the vision system and the full motion (6-DOF) of large civil engineering structures, namely long-deck suspension bridges, from a sequence of stereo images captured by digital cameras. Using an arbitrary number of images and assuming smooth structure motion, an Iterated Extended Kalman Filter is used to recursively estimate the projection matrices of the cameras and the structure's full motion (displacement and rotation) over time, helping to fulfil structural health monitoring requirements. Results related to the performance evaluation, obtained by numerical simulation and in real experiments, are reported. The real experiments were carried out in indoor and outdoor environments using a reduced structure model to impose controlled motions. In both cases, the results obtained with a minimum setup comprising only two cameras and four non-coplanar tracking points showed highly accurate on-line camera calibration and structure full-motion estimation.

  20. Evaluation of surveillance methods for monitoring house fly abundance and activity on large commercial dairy operations.

    PubMed

    Gerry, Alec C; Higginbotham, G E; Periera, L N; Lam, A; Shelton, C R

    2011-06-01

    Relative house fly, Musca domestica L., activity at three large dairies in central California was monitored during the peak fly activity period from June to August 2005 by using spot cards, fly tapes, bait traps, and Alsynite traps. Counts for all monitoring methods were significantly related at two of three dairies, with spot card counts significantly related to fly tape counts recorded the same week, and both spot card counts and fly tape counts significantly related to bait trap counts 1-2 wk later. Mean fly counts differed significantly between dairies, but a significant interaction between dairies sampled and monitoring methods used demonstrates that between-dairy comparisons are unwise. Estimate precision was determined by the coefficient of variability (CV = SE/mean). Using a CV = 0.15 as a desired level of estimate precision and assuming an integrated pest management (IPM) action threshold near the peak house fly activity measured by each monitoring method, house fly monitoring at a large dairy would require 12 spot cards placed in midafternoon shaded fly resting sites near cattle or seven bait traps placed in open areas near cattle. Software (FlySpotter; http://ucanr.org/sites/FlySpotter/download/) using computer vision technology was developed to count fly spots on a scanned image of a spot card to dramatically reduce time invested in monitoring house flies. Counts provided by the FlySpotter software were highly correlated to visual counts. The use of spot cards for monitoring house flies is recommended for dairy IPM programs.
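The precision criterion used here is easy to reproduce. A small sketch with hypothetical spot-card counts (the study's own counts and thresholds are as reported above):

```python
import math

def coefficient_of_variation(counts):
    """CV = SE / mean: standard error of the mean over the mean count."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return math.sqrt(var / n) / mean

def samples_for_target_cv(counts, target_cv=0.15):
    """Because the SE shrinks as sqrt(n), the sample size needed to reach
    a target CV is (sd / (target_cv * mean))**2, rounded up."""
    n = len(counts)
    mean = sum(counts) / n
    sd = math.sqrt(sum((c - mean) ** 2 for c in counts) / (n - 1))
    return math.ceil((sd / (target_cv * mean)) ** 2)

# hypothetical counts from eight spot cards at one dairy
counts = [120, 95, 140, 110, 80, 150, 105, 130]
```

The required number of cards grows quadratically with the relative spread of the counts, which is consistent with the study recommending more sampling units for the more variable methods.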

  1. Ride quality assessment. III - Questionnaire results of a second flight programme

    NASA Technical Reports Server (NTRS)

    Richards, L. G.; Jacobson, I. D.

    1977-01-01

    A questionnaire was completed by 861 passengers on regularly-scheduled flights of four commuter airlines. Four types of aircraft were involved. Questions assessed major demographic variables, attitudes toward flying, frequency of flying, experience of airsickness, and passenger perceptions of detailed aspects of the physical environment. Passengers also rated their overall comfort level and their willingness to fly again. Passengers perceive motion, noise, and seat factors as the primary determinants of their comfort. Rated comfort is strongly related to willingness to fly again. Incidence of airsickness was low. Sex differences in reactions to aspects of the environment were found.

  2. The hazard of spatial disorientation during helicopter flight using night vision devices.

    PubMed

    Braithwaite, M G; Douglass, P K; Durnford, S J; Lucas, G

    1998-11-01

    Night Vision Devices (NVDs) provide an enormous advantage to the operational effectiveness of military helicopter flying by permitting flight throughout the night. However, compared with daytime flight, many of the depth perception and orientational cues are severely degraded. These degraded cues predispose aviators to spatial disorientation (SD), which is a serious drawback of these devices. As part of an overall analysis of Army helicopter accidents to assess the impact of SD on military flying, we scrutinized the class A-C mishap reports involving night-aided flight from 1987 to 1995. The accidents were classified according to the role of SD by three independent assessors, with the SD group further analyzed to determine associated factors and possible countermeasures. Almost 43% of all SD-related accidents in this series occurred during flight using NVDs, whereas only 13% of non-SD accidents involved NVDs. An examination of the SD accident rates per 100,000 flying hours revealed a significant difference between the rate for day flying and the rate for flight using NVDs (mean rate for daytime flight = 1.66, mean rate for NVD flight = 9.00, p < 0.001). The most important factors associated with these accidents were related to equipment limitations, distraction from the task, and training or procedural inadequacies. SD remains an important source of attrition of Army aircraft. The more than fivefold increase in risk associated with NVD flight is of serious concern. The associated factors and suggested countermeasures should be urgently addressed.

  3. Naive Beliefs in Baseball: Systematic Distortion in Perceived Time of Apex for Fly Balls

    ERIC Educational Resources Information Center

    Shaffer, Dennis M.; McBeath, Michael K.

    2005-01-01

    When fielders catch fly balls they use geometric properties to optically maintain control over the ball. The strategy provides ongoing guidance without indicating precise positional information concerning where the ball is located in space. Here, the authors show that observers have striking misconceptions about what the motion of projectiles…

  4. Landing characteristics in waves of three dynamic models of flying boats

    NASA Technical Reports Server (NTRS)

    Benson, James M; Havens, Robert F; Woodward, David R

    1952-01-01

    Powered models of three different flying boats were landed in oncoming waves of various heights and lengths. The effects of varying the trim at landing, the deceleration after landing, and the size of the waves were determined. Data are presented on the motions and accelerations obtained during landings in rough water.

  5. The research on visual industrial robot which adopts fuzzy PID control algorithm

    NASA Astrophysics Data System (ADS)

    Feng, Yifei; Lu, Guoping; Yue, Lulin; Jiang, Weifeng; Zhang, Ye

    2017-03-01

    The control system of a six-degrees-of-freedom visual industrial robot, based on multi-axis motion control cards and a PC, was researched. For the variable, nonlinear characteristics of the industrial robot's servo system, an adaptive fuzzy PID controller was adopted, achieving a better control effect. In the vision system, a CCD camera acquires signals and sends them to a video processing card; after processing, the PC controls the motion of the six joints through the motion control cards. Experiments show that the manipulator can work together with the machine tool and vision system to grasp, process, and verify parts. This work bears on the manufacture of industrial robots.

  6. Progress in high-level exploratory vision

    NASA Astrophysics Data System (ADS)

    Brand, Matthew

    1993-08-01

    We have been exploring the hypothesis that vision is an explanatory process, in which causal and functional reasoning about potential motion plays an intimate role in mediating the activity of low-level visual processes. In particular, we have explored two of the consequences of this view for the construction of purposeful vision systems: Causal and design knowledge can be used to (1) drive focus of attention, and (2) choose between ambiguous image interpretations. An important result of visual understanding is an explanation of the scene's causal structure: How action is originated, constrained, and prevented, and what will happen in the immediate future. In everyday visual experience, most action takes the form of motion, and most causal analysis takes the form of dynamical analysis. This is even true of static scenes, where much of a scene's interest lies in how possible motions are arrested. This paper describes our progress in developing domain theories and visual processes for the understanding of various kinds of structured scenes, including structures built out of children's constructive toys and simple mechanical devices.

  7. Center of Mass Demonstration on the Fly

    ERIC Educational Resources Information Center

    Hazelrigg, Conner; Baker, Blane

    2015-01-01

    Center of mass (CM) is an important concept in physics, especially when studying extended bodies. For example, general motion of an extended body can be considered as the sum of the translational motion of the CM plus other types of motion about that CM. CM also can be regarded as a "balance point" so that a system supported at its CM…

  8. Smart vision chips: An overview

    NASA Technical Reports Server (NTRS)

    Koch, Christof

    1994-01-01

    This viewgraph presentation presents four working analog VLSI vision chips: (1) time-derivative retina, (2) zero-crossing chip, (3) resistive fuse, and (4) figure-ground chip; work in progress on computing motion and neuromorphic systems; and conceptual and practical lessons learned.

  9. Shuttlecock detection system for fully-autonomous badminton robot with two high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Masunari, T.; Yamagami, K.; Mizuno, M.; Une, S.; Uotani, M.; Kanematsu, T.; Demachi, K.; Sano, S.; Nakamura, Y.; Suzuki, S.

    2017-02-01

    Two high-speed video cameras are successfully used to detect the motion of a flying badminton shuttlecock. The shuttlecock detection system is applied to badminton robots that play badminton fully autonomously. The detection system measures the three-dimensional position and velocity of a flying shuttlecock and predicts the position where the shuttlecock will fall to the ground. The badminton robot moves quickly to that position and hits the shuttlecock back into the opponent's side of the court. In a badminton game there is a large audience, and some spectators move behind the flying shuttlecock; they constitute a kind of background noise that makes it difficult to detect the shuttlecock's motion. The present study demonstrates that such noise can be eliminated by stereo imaging with two high-speed cameras.
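The three-dimensional position measurement rests on standard two-camera triangulation; for a rectified stereo pair, depth follows directly from disparity. A generic sketch (the focal length and baseline below are hypothetical values, not the robot's actual calibration):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Rectified stereo: a point imaged with disparity d between the two
    cameras lies at depth Z = f * B / d (f in pixels, baseline B in metres)."""
    if disparity_px <= 0:
        raise ValueError("point at infinity or mismatched correspondence")
    return focal_px * baseline_m / disparity_px

# e.g. a 50 px disparity with f = 700 px and B = 0.12 m gives Z = 1.68 m
z = stereo_depth(50.0, 700.0, 0.12)
```

Tracking that 3D position over successive high-speed frames yields the velocity estimate from which the landing point is extrapolated.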

  10. Effect of inertia on laminar swimming and flying of an assembly of rigid spheres in an incompressible viscous fluid.

    PubMed

    Felderhof, B U

    2015-01-01

    A mechanical model of swimming and flying in an incompressible viscous fluid in the absence of gravity is studied on the basis of assumed equations of motion. The system is modeled as an assembly of rigid spheres subject to elastic direct interactions and to periodic actuating forces which sum to zero. Hydrodynamic interactions are taken into account in the virtual mass matrix and in the friction matrix of the assembly. An equation of motion is derived for the velocity of the geometric center of the assembly. The mean power is calculated as the mean rate of dissipation. The full range of viscosity is covered, so that the theory can be applied to the flying of birds, as well as to the swimming of fish or bacteria. As an example a system of three equal spheres moving along a common axis is studied.

  11. Effect of inertia on laminar swimming and flying of an assembly of rigid spheres in an incompressible viscous fluid

    NASA Astrophysics Data System (ADS)

    Felderhof, B. U.

    2015-11-01

    A mechanical model of swimming and flying in an incompressible viscous fluid in the absence of gravity is studied on the basis of assumed equations of motion. The system is modeled as an assembly of rigid spheres subject to elastic direct interactions and to periodic actuating forces which sum to zero. Hydrodynamic interactions are taken into account in the virtual mass matrix and in the friction matrix of the assembly. An equation of motion is derived for the velocity of the geometric center of the assembly. The mean power is calculated as the mean rate of dissipation. The full range of viscosity is covered, so that the theory can be applied to the flying of birds, as well as to the swimming of fish or bacteria. As an example a system of three equal spheres moving along a common axis is studied.

  12. Pixel-wise deblurring imaging system based on active vision for structural health monitoring at a speed of 100 km/h

    NASA Astrophysics Data System (ADS)

    Hayakawa, Tomohiko; Moko, Yushi; Morishita, Kenta; Ishikawa, Masatoshi

    2018-04-01

    In this paper, we propose a pixel-wise deblurring imaging (PDI) system based on active vision to compensate for the blur caused by high-speed one-dimensional motion between a camera and a target. The optical axis is controlled by back-and-forth motion of a galvanometer mirror to compensate for the motion. The high-spatial-resolution images captured by our system during high-speed motion are useful for efficient and precise visual inspection, such as visually judging abnormal parts of a tunnel surface to prevent accidents; hence, we applied the PDI system to structural health monitoring. By mounting the system on a vehicle in a tunnel, we confirmed significant improvement in image quality for submillimeter black-and-white stripes and real tunnel-surface cracks at a speed of 100 km/h.

  13. Helmet-mounted pilot night vision systems: Human factors issues

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.; Brickner, Michael S.

    1989-01-01

    Helmet-mounted displays of infrared imagery (forward-looking infrared (FLIR)) allow helicopter pilots to perform low level missions at night and in low visibility. However, pilots experience high visual and cognitive workload during these missions, and their performance capabilities may be reduced. Human factors problems inherent in existing systems stem from three primary sources: the nature of thermal imagery; the characteristics of specific FLIR systems; and the difficulty of using FLIR system for flying and/or visually acquiring and tracking objects in the environment. The pilot night vision system (PNVS) in the Apache AH-64 provides a monochrome, 30 by 40 deg helmet-mounted display of infrared imagery. Thermal imagery is inferior to television imagery in both resolution and contrast ratio. Gray shades represent temperature differences rather than brightness variability, and images undergo significant changes over time. The limited field of view, displacement of the sensor from the pilot's eye position, and monocular presentation of a bright FLIR image (while the other eye remains dark-adapted) are all potential sources of disorientation, limitations in depth and distance estimation, sensations of apparent motion, and difficulties in target and obstacle detection. Insufficient information about human perceptual and performance limitations restrains the ability of human factors specialists to provide significantly improved specifications, training programs, or alternative designs. Additional research is required to determine the most critical problem areas and to propose solutions that consider the human as well as the development of technology.

  14. Towards photorealistic and immersive virtual-reality environments for simulated prosthetic vision: integrating recent breakthroughs in consumer hardware and software.

    PubMed

    Zapf, Marc P; Matteucci, Paul B; Lovell, Nigel H; Zheng, Steven; Suaning, Gregg J

    2014-01-01

    Simulated prosthetic vision (SPV) in normally sighted subjects is an established way of investigating the prospective efficacy of visual prosthesis designs in visually guided tasks such as mobility. To perform meaningful SPV mobility studies in computer-based environments, a credible representation of both the virtual scene to navigate and the experienced artificial vision has to be established. It is therefore prudent to make optimal use of existing hardware and software solutions when establishing a testing framework. The authors aimed at improving the realism and immersion of SPV by integrating state-of-the-art yet low-cost consumer technology. The feasibility of body motion tracking to control movement in photo-realistic virtual environments was evaluated in a pilot study. Five subjects were recruited and performed an obstacle avoidance and wayfinding task using either keyboard and mouse, gamepad or Kinect motion tracking. Walking speed and collisions were analyzed as basic measures for task performance. Kinect motion tracking resulted in lower performance as compared to classical input methods, yet results were more uniform across vision conditions. The chosen framework was successfully applied in a basic virtual task and is suited to realistically simulate real-world scenes under SPV in mobility research. Classical input peripherals remain a feasible and effective way of controlling the virtual movement. Motion tracking, despite its limitations and early state of implementation, is intuitive and can eliminate between-subject differences due to familiarity to established input methods.

  15. A simple method to design non-collision relative orbits for close spacecraft formation flying

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Li, JunFeng; Jiang, FangHua; Bernelli-Zazzera, Franco

    2018-05-01

    A set of linearized relative motion equations of spacecraft flying on unperturbed elliptical orbits are specialized for particular cases, where the leader orbit is circular or equatorial. Based on these extended equations, we are able to analyze the relative motion regulation between a pair of spacecraft flying on arbitrary unperturbed orbits with the same semi-major axis in close formation. Given the initial orbital elements of the leader, this paper presents a simple way to design initial relative orbital elements of close spacecraft with the same semi-major axis, thus preventing collision under non-perturbed conditions. Considering the mean influence of J2 perturbation, namely secular J2 perturbation, we derive the mean derivatives of orbital element differences, and then expand them to first order. Thus the first order expansion of orbital element differences can be added to the relative motion equations for further analysis. For a pair of spacecraft that will never collide under non-perturbed situations, we present a simple method to determine whether a collision will occur when J2 perturbation is considered. Examples are given to prove the validity of the extended relative motion equations and to illustrate how the methods presented can be used. The simple method for designing initial relative orbital elements proposed here could be helpful to the preliminary design of the relative orbital elements between spacecraft in a close formation, when collision avoidance is necessary.
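For orientation, the circular-leader special case mentioned above reduces to the classical Clohessy-Wiltshire (Hill) closed-form solution, sketched below (this is the textbook unperturbed model, not the paper's extended elliptical-leader equations or its J2 correction):

```python
import math

def cw_state(t, n, x0, y0, z0, vx0, vy0, vz0):
    """Closed-form Clohessy-Wiltshire relative position about a circular
    leader orbit with mean motion n rad/s (x: radial, y: along-track,
    z: cross-track), from initial relative position and velocity."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 * (1 - c) / n) * vy0
    y = 6 * (s - n * t) * x0 + y0 - (2 * (1 - c) / n) * vx0 \
        + ((4 * s - 3 * n * t) / n) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z
```

A pure along-track offset with zero relative velocity is stationary in this model, the simplest non-colliding formation; a radial offset, by contrast, induces a secular along-track drift, which is why the choice of initial relative elements matters for collision avoidance.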

  16. Design, aerodynamics and autonomy of the DelFly.

    PubMed

    de Croon, G C H E; Groen, M A; De Wagter, C; Remes, B; Ruijsink, R; van Oudheusden, B W

    2012-06-01

    One of the major challenges in robotics is to develop a fly-like robot that can autonomously fly around in unknown environments. In this paper, we discuss the current state of the DelFly project, in which we follow a top-down approach to ever smaller and more autonomous ornithopters. The presented findings concerning the design, aerodynamics and autonomy of the DelFly illustrate some of the properties of the top-down approach, which allows the identification and resolution of issues that also play a role at smaller scales. A parametric variation of the wing stiffener layout produced a 5% more power-efficient wing. An experimental aerodynamic investigation revealed that this could be associated with an improved stiffness of the wing, while further providing evidence of the vortex development during the flap cycle. The presented experiments resulted in an improvement in the generated lift, allowing the inclusion of a yaw rate gyro, pressure sensor and microcontroller onboard the DelFly. The autonomy of the DelFly is expanded by achieving (1) an improved turning logic to obtain better vision-based obstacle avoidance performance in environments with varying texture and (2) successful onboard height control based on the pressure sensor.

  17. Virtual-reality techniques resolve the visual cues used by fruit flies to evaluate object distances.

    PubMed

    Schuster, Stefan; Strauss, Roland; Götz, Karl G

    2002-09-17

    Insects can estimate distance or time-to-contact of surrounding objects from locomotion-induced changes in their retinal position and/or size. Freely walking fruit flies (Drosophila melanogaster) use the received mixture of different distance cues to select the nearest objects for subsequent visits. Conventional methods of behavioral analysis fail to elucidate the underlying data extraction. Here we demonstrate first comprehensive solutions of this problem by substituting virtual for real objects; a tracker-controlled 360 degrees panorama converts a fruit fly's changing coordinates into object illusions that require the perception of specific cues to appear at preselected distances up to infinity. An application reveals the following: (1) en-route sampling of retinal-image changes accounts for distance discrimination within a surprising range of at least 8-80 body lengths (20-200 mm). Stereopsis and peering are not involved. (2) Distance from image translation in the expected direction (motion parallax) outweighs distance from image expansion, which accounts for impact-avoiding flight reactions to looming objects. (3) The ability to discriminate distances is robust to artificially delayed updating of image translation. Fruit flies appear to interrelate self-motion and its visual feedback within a surprisingly long time window of about 2 s. The comparative distance inspection practiced in the small fruit fly deserves utilization in self-moving robots.

  18. Ultra-Sensitive Electrostatic Accelerometers and Future Fundamental Physics Missions

    NASA Astrophysics Data System (ADS)

    Touboul, Pierre; Christophe, Bruno; Rodrigues, M.; Marque, Jean-Pierre; Foulon, Bernard

    Ultra-sensitive electrostatic accelerometers have in the last decade demonstrated their unique performance and reliability in orbit, leading to the success of the three Earth geodesy missions presently in operation. In the near future, space fundamental physics missions are in preparation and highlight the importance of this instrument for achieving new scientific objectives. Cornerstone of General Relativity, the Equivalence Principle may be violated, as predicted by attempts at Grand Unification. A verification experiment at a level of at least 10^-15 is the objective of the CNES-ESA mission MICROSCOPE, thanks to a differential accelerometer configuration with concentric cylindrical test masses. To achieve the numerous severe requirements of the mission, the instrument is also used to control the attitude and the orbital motion of the space laboratory, leading to a pure geodesic motion of the drag-free satellite. The performance of the accelerometer is a few tenths of a femto-g at the selected test frequency of about 10^-3 Hz, i.e. several orbit frequencies. Another important experimental research in gravity is the verification of the Einstein metric, in particular its dependence on the distance to the attractive body. The Gravity Advanced Package (GAP) is proposed for the future EJSM planetary mission, with the objective to verify this scale dependence of the gravitation law from Earth to Jupiter. This verification is performed, during the interplanetary cruise, by following precisely the satellite trajectory in the planet and Sun fields with an accurate measurement of the non-gravitational accelerations in order to evaluate the deviations from geodesic motion. Accelerations in the DC and very-low-frequency domain are concerned, and the natural bias of the electrostatic accelerometer is thus compensated down to 5×10^-11 m/s^2 thanks to a specific bias calibration device.
More ambitious, the dedicated mission Odyssey, proposed for Cosmic Vision, will fly in the Solar System beyond Saturn. Based on the same instrument, the scientific return will be enlarged by the better performance achievable on a dedicated satellite and by the larger distance to the Sun. Fly-by gravitational effects will also be carefully observed. Finally, gravitational sensors take advantage of similar instrument concepts, configurations and technologies to achieve pure free inertial masses, references for the LISA mission interferometer for the observation of gravity waves.

  19. Global motion perception is associated with motor function in 2-year-old children.

    PubMed

    Thompson, Benjamin; McKinlay, Christopher J D; Chakraborty, Arijit; Anstice, Nicola S; Jacobs, Robert J; Paudel, Nabin; Yu, Tzu-Ying; Ansell, Judith M; Wouldes, Trecia A; Harding, Jane E

    2017-09-29

    The dorsal visual processing stream that includes V1, motion sensitive area V5 and the posterior parietal lobe, supports visually guided motor function. Two recent studies have reported associations between global motion perception, a behavioural measure of processing in V5, and motor function in pre-school and school aged children. This indicates a relationship between visual and motor development and also supports the use of global motion perception to assess overall dorsal stream function in studies of human neurodevelopment. We investigated whether associations between vision and motor function were present at 2 years of age, a substantially earlier stage of development. The Bayley III test of Infant and Toddler Development and measures of vision including visual acuity (Cardiff Acuity Cards), stereopsis (Lang stereotest) and global motion perception were attempted in 404 2-year-old children (±4 weeks). Global motion perception (quantified as a motion coherence threshold) was assessed by observing optokinetic nystagmus in response to random dot kinematograms of varying coherence. Linear regression revealed that global motion perception was modestly, but statistically significantly associated with Bayley III composite motor (r^2 = 0.06, p < 0.001, n = 375) and gross motor scores (r^2 = 0.06, p < 0.001, n = 375). The associations remained significant when language score was included in the regression model. In addition, when language score was included in the model, stereopsis was significantly associated with composite motor and fine motor scores, but unaided visual acuity was not statistically significantly associated with any of the motor scores. These results demonstrate that global motion perception and binocular vision are associated with motor function at an early stage of development. Global motion perception can be used as a partial measure of dorsal stream function from early childhood. Copyright © 2017 Elsevier B.V. All rights reserved.
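The stimulus behind the coherence threshold is a random dot kinematogram (RDK). A minimal sketch of one update step (the parameter values are illustrative, not those of the study):

```python
import random

def rdk_step(positions, coherence, step=5.0, width=200.0, rng=random):
    """Advance one frame of an RDK: a `coherence` fraction of dots (on
    average) translate coherently rightward; the rest are replotted at
    random horizontal positions (noise). The motion coherence threshold
    is the lowest coherence at which the net direction is still detected."""
    out = []
    for x, y in positions:
        if rng.random() < coherence:
            out.append(((x + step) % width, y))       # signal dot
        else:
            out.append((rng.uniform(0.0, width), y))  # noise dot
    return out
```

In the study, detection was read out from optokinetic nystagmus rather than a verbal report, but the coherence manipulation is the same.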

  20. Trailing Vortex-Induced Loads During Close Encounters in Cruise

    NASA Technical Reports Server (NTRS)

    Mendenhall, Michael R.; Lesieutre, Daniel J; Kelly, Michael J.

    2015-01-01

    The trailing vortex induced aerodynamic loads on a Falcon 20G business jet flying in the wake of a DC-8 are predicted to provide a preflight estimate of safe trail distances during flight test measurements in the wake. Static and dynamic loads on the airframe flying in the near wake are shown at a matrix of locations, and the dynamic motion of the Falcon 20G during traverses of the DC-8 primary trailing vortex is simulated. Safe trailing distances for the test flights are determined, and optimum vortex traverse schemes are identified to moderate the motion of the trailing aircraft during close encounters with the vortex wake.

  1. Optic flow cues guide flight in birds.

    PubMed

    Bhagavatula, Partha S; Claudianos, Charles; Ibbotson, Michael R; Srinivasan, Mandyam V

    2011-11-08

    Although considerable effort has been devoted to investigating how birds migrate over large distances, surprisingly little is known about how they tackle so successfully the moment-to-moment challenges of rapid flight through cluttered environments [1]. It has been suggested that birds detect and avoid obstacles [2] and control landing maneuvers [3-5] by using cues derived from the image motion that is generated in the eyes during flight. Here we investigate the ability of budgerigars to fly through narrow passages in a collision-free manner, by filming their trajectories during flight in a corridor where the walls are decorated with various visual patterns. The results demonstrate, unequivocally and for the first time, that birds negotiate narrow gaps safely by balancing the speeds of image motion experienced by the two eyes, and that flight speed is regulated by monitoring the overall magnitude of this image motion. These findings have close parallels with those previously reported for flying insects [6-13], suggesting that some principles of visual guidance may be shared by all diurnal, flying animals. Copyright © 2011 Elsevier Ltd. All rights reserved.
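    The balancing strategy described above is often summarized as a pair of control laws: steer away from the eye experiencing faster image motion, and hold the total image motion near a set-point. A minimal sketch; the function names, gains, and set-point are hypothetical, not taken from the study.

    ```python
    def steering_command(left_flow, right_flow, k_turn=1.0):
        """Turn away from the eye seeing faster image motion (i.e., toward the
        farther wall). Positive output = turn to the right."""
        return k_turn * (left_flow - right_flow)

    def speed_command(left_flow, right_flow, flow_setpoint=2.0, k_speed=0.5):
        """Decelerate when total image motion exceeds a set-point (narrow gap
        ahead), accelerate when it falls below (open space)."""
        total = left_flow + right_flow
        return k_speed * (flow_setpoint - total)
    ```

    In a narrowing corridor both eyes see faster motion, so the speed command turns negative and the animal slows down, which is the behavior the study reports for both birds and insects.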

  2. Visual control of flight speed in Drosophila melanogaster.

    PubMed

    Fry, Steven N; Rohrseitz, Nicola; Straw, Andrew D; Dickinson, Michael H

    2009-04-01

    Flight control in insects depends on self-induced image motion (optic flow), which the visual system must process to generate appropriate corrective steering maneuvers. Classic experiments in tethered insects applied rigorous system identification techniques for the analysis of turning reactions in the presence of rotating pattern stimuli delivered in open-loop. However, the functional relevance of these measurements for visual free-flight control remains equivocal due to the largely unknown effects of the highly constrained experimental conditions. To perform a systems analysis of the visual flight speed response under free-flight conditions, we implemented a 'one-parameter open-loop' paradigm using 'TrackFly' in a wind tunnel equipped with real-time tracking and virtual reality display technology. Flies flying upwind were stimulated with sine gratings of varying temporal and spatial frequencies, and the resulting flight speed responses were measured. To control flight speed, the visual system of the fruit fly extracts linear pattern velocity robustly over a broad range of spatio-temporal frequencies. The speed signal is used for a proportional control of flight speed within locomotor limits. The extraction of pattern velocity over a broad spatio-temporal frequency range may require more sophisticated motion processing mechanisms than those identified in flies so far. In Drosophila, the neuromotor pathways underlying flight speed control may be suitably explored by applying advanced genetic techniques, for which our data can serve as a baseline. Finally, the high-level control principles identified in the fly can be meaningfully transferred into a robotic context, such as for the robust and efficient control of autonomous flying micro air vehicles.
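    The proportional control of flight speed described above can be caricatured as a closed-loop simulation: the fly adjusts its speed in proportion to the retinal slip of the pattern until the slip is nulled. The gain, limits, and units below are illustrative assumptions, not the paper's fitted values.

    ```python
    def flight_speed_update(current_speed, retinal_slip, gain=0.3,
                            v_min=0.0, v_max=1.0):
        """Proportional controller: the perceived pattern velocity (retinal slip
        of the surround) drives a speed correction, clipped to locomotor limits."""
        new_speed = current_speed + gain * retinal_slip
        return max(v_min, min(v_max, new_speed))

    # Closed loop: the slip the fly sees is the stimulus velocity minus its own speed.
    stimulus = 0.6   # grating velocity in hypothetical units (m/s)
    speed = 0.0
    for _ in range(50):
        slip = stimulus - speed
        speed = flight_speed_update(speed, slip)
    # speed has converged near the stimulus velocity
    ```

    The loop converges geometrically because each step multiplies the remaining slip by (1 − gain), which is the hallmark of a pure proportional controller operating within its limits.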

  3. The free-flight response of Drosophila to motion of the visual environment.

    PubMed

    Mronz, Markus; Lehmann, Fritz-Olaf

    2008-07-01

    In the present study we investigated the behavioural strategies with which freely flying fruit flies (Drosophila) control their flight trajectories during active optomotor stimulation in a free-flight arena. We measured forward, turning and climbing velocities of single flies using high-speed video analysis and estimated the output of a 'Hassenstein-Reichardt' elementary motion detector (EMD) array and the fly's gaze to evaluate flight behaviour in response to a rotating visual panorama. In a stationary visual environment, flight is characterized by flight saccades during which the animals turn on average 120 degrees within 130 ms. In a rotating environment, the fly's behaviour typically changes towards distinct, concentric circular flight paths where the radius of the paths increases with increasing arena velocity. The EMD simulation suggests that this behaviour is driven by a rotation-sensitive EMD detector system that minimizes retinal slip on each compound eye, whereas an expansion-sensitive EMD system with a laterally centred visual focus potentially helps to achieve centring response on the circular flight path. We developed a numerical model based on force balance between horizontal, vertical and lateral forces that allows predictions of flight path curvature at a given locomotor capacity of the fly. The model suggests that turning flight in Drosophila is constrained by the production of centripetal forces needed to avoid side-slip movements. At maximum horizontal velocity this force may account for up to 70% of the fly's body weight during yaw turning. Altogether, our analyses are widely consistent with previous studies on Drosophila free flight and those on the optomotor response under tethered flight conditions.
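    The 'Hassenstein-Reichardt' elementary motion detector simulated above is a delay-and-correlate scheme: each photoreceptor's delayed signal is multiplied with the undelayed signal of its neighbor, and two mirror-symmetric half-detectors are subtracted. A minimal discrete-time sketch, using a first-order low-pass filter as the delay; the time constants and stimuli are arbitrary.

    ```python
    def emd_response(left_signal, right_signal, tau=2.0, dt=1.0):
        """Hassenstein-Reichardt correlator over two receptor signals.
        Positive output = motion from the left receptor toward the right."""
        alpha = dt / (tau + dt)          # first-order low-pass coefficient
        lp_left = lp_right = 0.0
        out = []
        for l, r in zip(left_signal, right_signal):
            lp_left += alpha * (l - lp_left)     # delayed left channel
            lp_right += alpha * (r - lp_right)   # delayed right channel
            # subtract the two mirror-symmetric half-detectors
            out.append(lp_left * r - lp_right * l)
        return out

    # A brightness pulse traveling left -> right reaches the right receptor later.
    left = [0, 1, 0, 0, 0, 0]
    right = [0, 0, 1, 0, 0, 0]
    response = emd_response(left, right)
    ```

    Summed over time, the output is positive for left-to-right motion and flips sign when the stimulus direction reverses, which is what makes an array of such detectors direction selective.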

  4. Biologically based machine vision: signal analysis of monopolar cells in the visual system of Musca domestica.

    PubMed

    Newton, Jenny; Barrett, Steven F; Wilcox, Michael J; Popp, Stephanie

    2002-01-01

    Machine vision for navigational purposes is a rapidly growing field. Many abilities such as object recognition and target tracking rely on vision. Autonomous vehicles must be able to navigate in dynamic environments and simultaneously locate a target position. Traditional machine vision often fails to react in real time because of large computational requirements, whereas the fly achieves complex orientation and navigation with a relatively small and simple brain. Understanding how the fly extracts visual information and how neurons encode and process information could lead us to a new approach for machine vision applications. Photoreceptors in the Musca domestica eye that share the same spatial information converge into a structure called the cartridge. The cartridge consists of the photoreceptor axon terminals and monopolar cells L1, L2, and L4. It is thought that L1 and L2 cells encode edge related information relative to a single cartridge. These cells are thought to be equivalent to vertebrate bipolar cells, producing contrast enhancement and reduction of information sent to L4. Monopolar cell L4 is thought to perform image segmentation on the information input from L1 and L2 and also enhance edge detection. A mesh of interconnected L4s would correlate the output from L1 and L2 cells of adjacent cartridges and provide a parallel network for segmenting an object's edges. The focus of this research is to excite photoreceptors of the common housefly, Musca domestica, with different visual patterns. The electrical response of monopolar cells L1, L2, and L4 will be recorded using intracellular recording techniques. Signal analysis will determine the neurocircuitry used to detect and segment images.
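    The contrast enhancement attributed to L1/L2 above is essentially lateral inhibition across neighboring cartridges. A one-dimensional sketch: each cartridge's output is its photoreceptor input minus a fraction of its neighbors' mean, which flattens uniform regions and accentuates edges. The inhibition strength k is a made-up parameter.

    ```python
    def contrast_enhance(photoreceptors, k=0.5):
        """Center-surround sketch of L1/L2-style contrast enhancement:
        subtract a fraction k of the neighboring cartridges' mean response."""
        n = len(photoreceptors)
        out = []
        for i in range(n):
            left = photoreceptors[max(i - 1, 0)]      # clamp at the array edges
            right = photoreceptors[min(i + 1, n - 1)]
            out.append(photoreceptors[i] - k * (left + right) / 2)
        return out

    # A step edge in luminance: the response overshoots on the bright side of the
    # edge and undershoots on the dark side, while uniform regions stay flat.
    step = [1, 1, 1, 5, 5, 5]
    enhanced = contrast_enhance(step)
    ```

    The overshoot/undershoot pair at the edge is the "edge related information" a downstream segmentation stage (the L4 mesh in the abstract's hypothesis) could correlate across cartridges.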

  5. Free flight odor tracking in Drosophila: Effect of wing chemosensors, sex and pheromonal gene regulation

    PubMed Central

    Houot, Benjamin; Gigot, Vincent; Robichon, Alain; Ferveur, Jean-François

    2017-01-01

    The evolution of powered flight in insects had major consequences for global biodiversity and involved the acquisition of adaptive processes allowing individuals to disperse to new ecological niches. Flies use both vision and olfactory input from their antennae to guide their flight; chemosensors on fly wings have been described, but their function remains mysterious. We studied Drosophila flight in a wind tunnel. By genetically manipulating wing chemosensors, we show that these structures play an essential role in flight performance with a sex-specific effect. Pheromonal systems are also involved in Drosophila flight guidance: transgenic expression of the pheromone production and detection gene, desat1, produced low, rapid flight that was absent in control flies. Our study suggests that the sex-specific modulation of free-flight odor tracking depends on gene expression in various fly tissues including wings and pheromone-related tissues. PMID:28067325

  6. Fly-by-Wireless Update

    NASA Technical Reports Server (NTRS)

    Studor, George

    2010-01-01

    The presentation reviews what is meant by the term 'fly-by-wireless', common problems and motivation, provides recent examples, and examines NASA's future and basis for collaboration. The vision is to minimize cables and connectors and increase functionality across the aerospace industry by providing reliable, lower cost, modular, and higher performance alternatives to wired data connectivity to benefit the entire vehicle/program life-cycle. Focus areas are system engineering and integration methods to reduce cables and connectors, vehicle provisions for modularity and accessibility, and a 'tool box' of alternatives to wired connectivity.

  7. Comparative assessment of techniques for initial pose estimation using monocular vision

    NASA Astrophysics Data System (ADS)

    Sharma, Sumant; D'Amico, Simone

    2016-06-01

    This work addresses the comparative assessment of initial pose estimation techniques for monocular navigation to enable formation-flying and on-orbit servicing missions. Monocular navigation relies on finding an initial pose, i.e., a coarse estimate of the attitude and position of the space-resident object with respect to the camera, based on a minimum number of features from a three-dimensional computer model and a single two-dimensional image. The initial pose is estimated without the use of fiducial markers, without any range measurements or any a priori relative motion information. Prior work has been done to compare different pose estimators for terrestrial applications, but there is a lack of functional and performance characterization of such algorithms in the context of missions involving rendezvous operations in the space environment. Use of state-of-the-art pose estimation algorithms designed for terrestrial applications is challenging in space due to factors such as limited on-board processing power, low carrier-to-noise ratio, and high image contrasts. This paper focuses on performance characterization of three initial pose estimation algorithms in the context of such missions and suggests improvements.

  8. Recent advances in the development and transfer of machine vision technologies for space

    NASA Technical Reports Server (NTRS)

    Defigueiredo, Rui J. P.; Pendleton, Thomas

    1991-01-01

    Recent work concerned with real-time machine vision is briefly reviewed. This work includes methodologies and techniques for optimal illumination, shape-from-shading of general (non-Lambertian) 3D surfaces, laser vision devices and technology, high level vision, sensor fusion, real-time computing, artificial neural network design and use, and motion estimation. Two new methods that are currently being developed for object recognition in clutter and for 3D attitude tracking based on line correspondence are discussed.

  9. "The Flying Man": The Power of Visual Media in Social Education

    ERIC Educational Resources Information Center

    Pearcy, Mark

    2015-01-01

    The visual arts and media historically employed by teachers as a corollary to "traditional" social education can play a more vital role in promoting critical inquiry among students. The use of short films like "The Flying Man" (2013), a motion picture which depicts an almost mythic vigilante in a realistic world, can help…

  10. Multi-Purpose Avionic Architecture for Vision Based Navigation Systems for EDL and Surface Mobility Scenarios

    NASA Astrophysics Data System (ADS)

    Tramutola, A.; Paltro, D.; Cabalo Perucha, M. P.; Paar, G.; Steiner, J.; Barrio, A. M.

    2015-09-01

    Vision Based Navigation (VBNAV) has been identified as a valid technology to support space exploration because it can improve autonomy and safety of space missions. Several mission scenarios can benefit from the VBNAV: Rendezvous & Docking, Fly-Bys, Interplanetary cruise, Entry Descent and Landing (EDL) and Planetary Surface exploration. For some of them VBNAV can improve the accuracy in state estimation as additional relative navigation sensor or as absolute navigation sensor. For some others, like surface mobility and terrain exploration for path identification and planning, VBNAV is mandatory. This paper presents the general avionic architecture of a Vision Based System as defined in the frame of the ESA R&T study “Multi-purpose Vision-based Navigation System Engineering Model - part 1 (VisNav-EM-1)” with special focus on the surface mobility application.

  11. Computing motion using resistive networks

    NASA Technical Reports Server (NTRS)

    Koch, Christof; Luo, Jin; Mead, Carver; Hutchinson, James

    1988-01-01

    Recent developments in the theory of early vision are described which lead from the formulation of the motion problem as an ill-posed one to its solution by minimizing certain 'cost' functions. These cost or energy functions can be mapped onto simple analog and digital resistive networks. It is shown how the optical flow can be computed by injecting currents into resistive networks and recording the resulting stationary voltage distribution at each node. These networks can be implemented in CMOS VLSI circuits and represent plausible candidates for biological vision systems.
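    The mapping from cost minimization to a resistive network can be sketched in one dimension: measurement confidences act as conductances tying nodes to 'data' sources (injected currents), smoothness terms act as resistors between neighboring nodes, and the settled node voltages are the smoothed flow estimate. Below is a Jacobi-relaxation sketch of that idea; the cost function, weights, and data are illustrative, not the paper's circuit.

    ```python
    def relax_flow(measurements, confidence, smoothness=1.0, iterations=500):
        """Minimize sum_i c_i*(v_i - m_i)^2 + s*sum_i (v_i - v_{i+1})^2 by
        Jacobi relaxation: each node repeatedly moves to the conductance-weighted
        average of its data source and its neighbors, the discrete analog of a
        resistive chain settling to its stationary voltage distribution."""
        n = len(measurements)
        v = [0.0] * n
        for _ in range(iterations):
            new_v = []
            for i in range(n):
                num = confidence[i] * measurements[i]   # injected current
                den = confidence[i]                     # total conductance
                if i > 0:
                    num += smoothness * v[i - 1]
                    den += smoothness
                if i < n - 1:
                    num += smoothness * v[i + 1]
                    den += smoothness
                new_v.append(num / den)
            v = new_v
        return v

    # Sparse velocity measurements at the ends only: the smoothness resistors
    # fill in the flow where no measurement (zero confidence) is available.
    m = [1.0, 0.0, 0.0, 0.0, 1.0]
    c = [1.0, 0.0, 0.0, 0.0, 1.0]
    flow = relax_flow(m, c)
    ```

    With consistent measurements at both ends, the interior nodes relax to the same value, illustrating how the network regularizes the ill-posed flow problem by interpolating across unmeasured regions.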

  12. Distinct fMRI Responses to Self-Induced versus Stimulus Motion during Free Viewing in the Macaque

    PubMed Central

    Kaneko, Takaaki; Saleem, Kadharbatcha S.; Berman, Rebecca A.; Leopold, David A.

    2016-01-01

    Visual motion responses in the brain are shaped by two distinct sources: the physical movement of objects in the environment and motion resulting from one's own actions. The latter source, termed visual reafference, stems from movements of the head and body, and in primates from the frequent saccadic eye movements that mark natural vision. To study the relative contribution of reafferent and stimulus motion during natural vision, we measured fMRI activity in the brains of two macaques as they freely viewed >50 hours of naturalistic video footage depicting dynamic social interactions. We used eye movements obtained during scanning to estimate the level of reafferent retinal motion at each moment in time. We also estimated the net stimulus motion by analyzing the video content during the same time periods. Mapping the responses to these distinct sources of retinal motion, we found a striking dissociation in the distribution of visual responses throughout the brain. Reafferent motion drove fMRI activity in the early retinotopic areas V1, V2, V3, and V4, particularly in their central visual field representations, as well as lateral aspects of the caudal inferotemporal cortex (area TEO). However, stimulus motion dominated fMRI responses in the superior temporal sulcus, including areas MT, MST, and FST as well as more rostral areas. We discuss this pronounced separation of motion processing in the context of natural vision, saccadic suppression, and the brain's utilization of corollary discharge signals. SIGNIFICANCE STATEMENT Visual motion arises not only from events in the external world, but also from the movements of the observer. For example, even if objects are stationary in the world, the act of walking through a room or shifting one's eyes causes motion on the retina. This “reafferent” motion propagates into the brain as signals that must be interpreted in the context of real object motion. 
The delineation of whole-brain responses to stimulus versus self-generated retinal motion signals is critical for understanding visual perception and is of pragmatic importance given the increasing use of naturalistic viewing paradigms. The present study uses fMRI to demonstrate that the brain exhibits a fundamentally different pattern of responses to these two sources of retinal motion. PMID:27629710

  13. Distinct fMRI Responses to Self-Induced versus Stimulus Motion during Free Viewing in the Macaque.

    PubMed

    Russ, Brian E; Kaneko, Takaaki; Saleem, Kadharbatcha S; Berman, Rebecca A; Leopold, David A

    2016-09-14

    Visual motion responses in the brain are shaped by two distinct sources: the physical movement of objects in the environment and motion resulting from one's own actions. The latter source, termed visual reafference, stems from movements of the head and body, and in primates from the frequent saccadic eye movements that mark natural vision. To study the relative contribution of reafferent and stimulus motion during natural vision, we measured fMRI activity in the brains of two macaques as they freely viewed >50 hours of naturalistic video footage depicting dynamic social interactions. We used eye movements obtained during scanning to estimate the level of reafferent retinal motion at each moment in time. We also estimated the net stimulus motion by analyzing the video content during the same time periods. Mapping the responses to these distinct sources of retinal motion, we found a striking dissociation in the distribution of visual responses throughout the brain. Reafferent motion drove fMRI activity in the early retinotopic areas V1, V2, V3, and V4, particularly in their central visual field representations, as well as lateral aspects of the caudal inferotemporal cortex (area TEO). However, stimulus motion dominated fMRI responses in the superior temporal sulcus, including areas MT, MST, and FST as well as more rostral areas. We discuss this pronounced separation of motion processing in the context of natural vision, saccadic suppression, and the brain's utilization of corollary discharge signals. Visual motion arises not only from events in the external world, but also from the movements of the observer. For example, even if objects are stationary in the world, the act of walking through a room or shifting one's eyes causes motion on the retina. This "reafferent" motion propagates into the brain as signals that must be interpreted in the context of real object motion. 
The delineation of whole-brain responses to stimulus versus self-generated retinal motion signals is critical for understanding visual perception and is of pragmatic importance given the increasing use of naturalistic viewing paradigms. The present study uses fMRI to demonstrate that the brain exhibits a fundamentally different pattern of responses to these two sources of retinal motion. Copyright © 2016 the authors 0270-6474/16/369580-10$15.00/0.

  14. Aristotle, Motion, and Rhetoric.

    ERIC Educational Resources Information Center

    Sutton, Jane

    Aristotle rejects a world vision of changing reality as neither useful nor beneficial to human life, and instead he reaffirms both change and eternal reality, fuses motion and rest, and ends up with "well-behaved" changes. This concept of motion is foundational to his world view, and from it emerges his theory of knowledge, philosophy of…

  15. Incorporating Animation Concepts and Principles in STEM Education

    ERIC Educational Resources Information Center

    Harrison, Henry L., III; Hummell, Laura J.

    2010-01-01

    Animation is the rapid display of a sequence of static images that creates the illusion of movement. This optical illusion is often called perception of motion, persistence of vision, illusion of motion, or short-range apparent motion. The phenomenon occurs when the eye is exposed to rapidly changing still images, with each image being changed…

  16. SU-C-BRF-05: Design and Geometric Validation of An Externally and Internally Deformable, Programmable Lung Motion Phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, Y; Sawant, A

    Purpose: Most clinically-deployed strategies for respiratory motion management in lung radiotherapy (e.g., gating, tracking) use external markers that serve as surrogates for tumor motion. However, typical lung phantoms used to validate these strategies are rigid-exterior+rigid-interior or rigid-exterior+deformable-interior. Neither class adequately represents the human anatomy, which is deformable internally as well as externally. We describe the construction and experimental validation of a more realistic, externally- and internally-deformable, programmable lung phantom. Methods: The outer shell of a commercially-available lung phantom (RS-1500, RSD Inc.) was used. The shell consists of a chest cavity with a flexible anterior surface, and embedded vertebrae, rib-cage and sternum. A 3-axis platform was programmed with sinusoidal and six patient-recorded lung tumor trajectories. The platform was used to drive a rigid foam 'diaphragm' that compressed/decompressed the phantom interior. Experimental characterization consisted of mapping the superior-inferior (SI) and anterior-posterior (AP) trajectories of external and internal radiopaque markers with kV x-ray fluoroscopy and correlating these with optical surface monitoring using the in-room VisionRT system. Results: The phantom correctly reproduced the programmed motion as well as realistic effects such as hysteresis. The reproducibility of marker trajectories over multiple runs for sinusoidal as well as patient traces, as characterized by fluoroscopy, was within 0.4 mm RMS error for internal as well as external markers. The motion trajectories of internal and external markers as measured by fluoroscopy were found to be highly correlated (R=0.97). Furthermore, motion trajectories of arbitrary points on the deforming phantom surface, as recorded by the VisionRT system, also showed a high correlation with respect to the fluoroscopically-measured trajectories of internal markers (R=0.92). 
Conclusion: We have developed a realistic externally- and internally-deformable lung phantom that will serve as a valuable tool for clinical QA and motion management research. This work was supported through funding from the NIH and VisionRT Ltd. Amit Sawant has research funding from Varian Medical Systems, VisionRT and Elekta.

  17. Aging and Vision

    PubMed Central

    Owsley, Cynthia

    2010-01-01

    Given the increasing size of the older adult population in many countries, there is a pressing need to identify the nature of aging-related vision impairments, their underlying mechanisms, and how they impact older adults’ performance of everyday visual tasks. The results of this research can then be used to develop and evaluate interventions to slow or reverse aging-related declines in vision, thereby improving quality of life. Here we summarize salient developments in research on aging and vision over the past 25 years, focusing on spatial contrast sensitivity, vision under low luminance, temporal sensitivity and motion perception, and visual processing speed. PMID:20974168

  18. General principles in motion vision: color blindness of object motion depends on pattern velocity in honeybee and goldfish.

    PubMed

    Stojcev, Maja; Radtke, Nils; D'Amaro, Daniele; Dyer, Adrian G; Neumeyer, Christa

    2011-07-01

    Visual systems can undergo striking adaptations to specific visual environments during evolution, but they can also be very "conservative." This seems to be the case in motion vision, which is surprisingly similar in species as distant as honeybee and goldfish. In both visual systems, motion vision measured with the optomotor response is color blind and mediated by one photoreceptor type only. Here, we ask whether this is also the case if the moving stimulus is restricted to a small part of the visual field, and test what influence velocity may have on chromatic motion perception. Honeybees were trained to discriminate between clockwise- and counterclockwise-rotating sector disks. Six types of disk stimuli differing in green receptor contrast were tested using three different rotational velocities. When green receptor contrast was at a minimum, bees were able to discriminate rotation directions with all colored disks at slow velocities of 6 and 12 Hz contrast frequency but not with a relatively high velocity of 24 Hz. In the goldfish experiment, the animals were trained to detect a moving red or blue disk presented in a green surround. Discrimination ability between this stimulus and a homogeneous green background was poor when the M-cone type was not modulated, or only slightly so, at a high stimulus velocity (7 cm/s). However, discrimination was improved with slower stimulus velocities (4 and 2 cm/s). These behavioral results indicate that there is potentially an object motion system in both honeybee and goldfish, which is able to incorporate color information at relatively low velocities but is color blind with higher speed. We thus propose that both honeybees and goldfish have multiple subsystems of object motion, which include achromatic as well as chromatic processing.

  19. Surgical lesion of the anterior optic tract abolishes polarotaxis in tethered flying locusts, Schistocerca gregaria.

    PubMed

    Mappes, Martina; Homberg, Uwe

    2007-01-01

    Many insects can detect the polarization pattern of the blue sky and rely on polarization vision for sky compass orientation. In laboratory experiments, tethered flying locusts perform periodic changes in flight behavior under a slowly rotating polarizer even if one eye is painted black. Anatomical tracing studies and intracellular recordings have suggested that the polarization vision pathway in the locust brain involves the anterior optic tract and tubercle, the lateral accessory lobe, and the central complex of the brain. To investigate whether visual pathways through the anterior optic tract mediate polarotaxis in the desert locust, we transected the tract on one side and tested polarotaxis (1) with both eyes unoccluded and (2) with the eye of the intact hemisphere painted black. In the second group of animals, but not in the first group, polarotaxis was abolished. Sham operations did not impair polarotaxis. The experiments show that the anterior optic tract is an indispensable part of visual pathways mediating polarotaxis in the desert locust.

  20. Analysis of Formation Flying in Eccentric Orbits Using Linearized Equations of Relative Motion

    NASA Technical Reports Server (NTRS)

    Lane, Christopher; Axelrad, Penina

    2004-01-01

    Geometrical methods for formation flying design based on the analytical solution to Hill's equations have been previously developed and used to specify desired relative motions in near circular orbits. By generating relationships between the vehicles that are intuitive, these approaches offer valuable insight into the relative motion and allow for the rapid design of satellite configurations to achieve mission specific requirements, such as vehicle separation at perigee or apogee, minimum separation, or a specific geometrical shape. Furthermore, the results obtained using geometrical approaches can be used to better constrain numerical optimization methods, allowing them to converge to optimal satellite configurations faster. This paper presents a set of geometrical relationships for formations in eccentric orbits, where Hill's equations are not valid, and shows how these relationships can be used to investigate formation designs and how they evolve with time.
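    Hill's equations (the Clohessy-Wiltshire equations), whose analytical solution underlies the geometrical methods above, have a standard closed-form for a circular reference orbit. A sketch in the usual radial/along-track/cross-track frame; the mean motion and initial conditions below are illustrative numbers, not from the paper.

    ```python
    from math import sin, cos

    def cw_state(x0, y0, z0, vx0, vy0, vz0, n, t):
        """Closed-form Clohessy-Wiltshire (Hill) solution for relative position
        about a circular reference orbit. x = radial, y = along-track,
        z = cross-track; n = mean motion of the reference orbit (rad/s)."""
        s, c = sin(n * t), cos(n * t)
        x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
        y = (6 * (s - n * t) * x0 + y0
             - (2 / n) * (1 - c) * vx0 + (1 / n) * (4 * s - 3 * n * t) * vy0)
        z = z0 * c + (vz0 / n) * s
        return x, y, z

    # A pure along-track offset is an equilibrium: the deputy holds its position.
    n = 0.0011  # rad/s, roughly a 95-minute low Earth orbit
    x, y, z = cw_state(0.0, 100.0, 0.0, 0.0, 0.0, 0.0, n, 3000.0)
    ```

    The constant along-track separation is one of the intuitive geometric configurations such design methods exploit; nonzero radial offsets or velocities instead produce the familiar drifting 2:1 relative-motion ellipses.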

  1. Novel techniques for data decomposition and load balancing for parallel processing of vision systems: Implementation and evaluation using a motion estimation system

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.

    1989-01-01

    Computer vision systems employ a sequence of vision algorithms in which the output of an algorithm is the input of the next algorithm in the sequence. Algorithms that constitute such systems exhibit vastly different computational characteristics, and therefore, require different data decomposition techniques and efficient load balancing techniques for parallel implementation. However, since the input data for a task is produced as the output data of the previous task, this information can be exploited to perform knowledge based data decomposition and load balancing. Presented here are algorithms for a motion estimation system. The motion estimation is based on the point correspondence between the involved images which are a sequence of stereo image pairs. Researchers propose algorithms to obtain point correspondences by matching feature points among stereo image pairs at any two consecutive time instants. Furthermore, the proposed algorithms employ non-iterative procedures, which results in saving considerable amounts of computation time. The system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from consecutive time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters.

  2. Age and visual impairment decrease driving performance as measured on a closed-road circuit.

    PubMed

    Wood, Joanne M

    2002-01-01

    In this study the effects of visual impairment and age on driving were investigated and related to visual function. Participants were 139 licensed drivers (young, middle-aged, and older participants with normal vision, and older participants with ocular disease). Driving performance was assessed during the daytime on a closed-road driving circuit. Visual performance was assessed using a vision testing battery. Age and visual impairment had a significant detrimental effect on recognition tasks (detection and recognition of signs and hazards), time to complete driving tasks (overall course time, reversing, and maneuvering), maneuvering ability, divided attention, and an overall driving performance index. All vision measures were significantly affected by group membership. A combination of motion sensitivity, useful field of view (UFOV), Pelli-Robson letter contrast sensitivity, and dynamic acuity could predict 50% of the variance in overall driving scores. These results indicate that older drivers with either normal vision or visual impairment had poorer driving performance compared with younger or middle-aged drivers with normal vision. The inclusion of tests such as motion sensitivity and the UFOV significantly improve the predictive power of vision tests for driving performance. Although such measures may not be practical for widespread screening, their application in selected cases should be considered.

  3. Neurons Forming Optic Glomeruli Compute Figure–Ground Discriminations in Drosophila

    PubMed Central

    Aptekar, Jacob W.; Keleş, Mehmet F.; Lu, Patrick M.; Zolotova, Nadezhda M.

    2015-01-01

    Many animals rely on visual figure–ground discrimination to aid in navigation, and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, and yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure–ground stimuli containing a motion-defined edge elicit statistically similar behavioral responses to one another, and statistically distinct behavioral responses from ground motion alone. From studies in larger flies and other insect species, we hypothesized that the circuitry of the lobula—one of the four primary neuropiles of the fly optic lobe—performs this visual discrimination. Using calcium imaging of input dendrites, we then show that information encoded in cells projecting from the lobula to discrete optic glomeruli in the central brain group these sets of figure–ground stimuli in a homologous manner to the behavior; “figure-like” stimuli are coded similar to one another and “ground-like” stimuli are encoded differently. One cell class responds to the leading edge of a figure and is suppressed by ground motion. Two other classes cluster any figure-like stimuli, including a figure moving opposite the ground, distinctly from ground alone. This evidence demonstrates that lobula outputs provide a diverse basis set encoding visual features necessary for figure detection. PMID:25972183

  4. Neurons forming optic glomeruli compute figure-ground discriminations in Drosophila.

    PubMed

    Aptekar, Jacob W; Keleş, Mehmet F; Lu, Patrick M; Zolotova, Nadezhda M; Frye, Mark A

    2015-05-13

    Many animals rely on visual figure-ground discrimination to aid in navigation, and to draw attention to salient features like conspecifics or predators. Even figures that are similar in pattern and luminance to the visual surroundings can be distinguished by the optical disparity generated by their relative motion against the ground, and yet the neural mechanisms underlying these visual discriminations are not well understood. We show in flies that a diverse array of figure-ground stimuli containing a motion-defined edge elicit statistically similar behavioral responses to one another, and statistically distinct behavioral responses from ground motion alone. From studies in larger flies and other insect species, we hypothesized that the circuitry of the lobula--one of the four primary neuropiles of the fly optic lobe--performs this visual discrimination. Using calcium imaging of input dendrites, we then show that information encoded in cells projecting from the lobula to discrete optic glomeruli in the central brain groups these sets of figure-ground stimuli in a manner homologous to the behavior; "figure-like" stimuli are coded similarly to one another and "ground-like" stimuli are encoded differently. One cell class responds to the leading edge of a figure and is suppressed by ground motion. Two other classes cluster any figure-like stimuli, including a figure moving opposite the ground, distinctly from ground alone. This evidence demonstrates that lobula outputs provide a diverse basis set encoding visual features necessary for figure detection. Copyright © 2015 the authors.

  5. Physiology declines prior to death in Drosophila melanogaster.

    PubMed

    Shahrestani, Parvin; Tran, Xuan; Mueller, Laurence D

    2012-10-01

    For a period of 6-15 days prior to death, the fecundity and virility of Drosophila melanogaster fall significantly below those of same-aged flies that are not near death. It is likely that other aspects of physiology may decline during this period. This study attempts to document changes in two physiological characteristics prior to death: desiccation resistance and time-in-motion. Using individual fecundity estimates and previously described models, it is possible to accurately predict which flies in a population are near death at any given age; these flies are said to be in the "death spiral". In this study of approximately 7,600 females, we used cohort mortality data and individual fecundity estimates to dichotomize each of five replicate populations of same-aged D. melanogaster into "death spiral" and "non-spiral" groups. We then compared these groups for two physiological characteristics that decline during aging. We describe the statistical properties of a new multivariate test statistic that allows us to compare the desiccation resistance and time-in-motion for two populations chosen on the basis of their fecundity. This multivariate representation of the desiccation resistance and time-in-motion of spiral and non-spiral females was shown to be significantly different with the spiral females characterized by lower desiccation resistance and time spent in motion. Our results suggest that D. melanogaster may be used as a model organism to study physiological changes that occur when death is imminent.

  6. Flexible Wing Base Micro Aerial Vehicles: Vision-Guided Flight Stability and Autonomy for Micro Air Vehicles

    NASA Technical Reports Server (NTRS)

    Ettinger, Scott M.; Nechyba, Michael C.; Ifju, Peter G.; Wazak, Martin

    2002-01-01

    Substantial progress has been made recently towards designing, building, and test-flying remotely piloted Micro Air Vehicles (MAVs). We seek to complement this progress in overcoming the aerodynamic obstacles to flight at very small scales with a vision-based stability and autonomy system. The developed system is based on a robust horizon detection algorithm, which we discuss in greater detail in a companion paper. In this paper, we first motivate the use of computer vision for MAV autonomy, arguing that given current sensor technology, vision may be the only practical approach to the problem. We then briefly review our statistical vision-based horizon detection algorithm, which has been demonstrated at 30 Hz with over 99.9% correct horizon identification. Next, we develop robust schemes for the detection of extreme MAV attitudes, where no horizon is visible, and for the detection of horizon estimation errors due to external factors such as video transmission noise. Finally, we discuss our feedback controller for self-stabilized flight, and report results on vision-guided autonomous flights of duration exceeding ten minutes.

  7. Vision and dual IMU integrated attitude measurement system

    NASA Astrophysics Data System (ADS)

    Guo, Xiaoting; Sun, Changku; Wang, Peng; Lu, Huang

    2018-01-01

    To determine the relative attitude between two space objects on a rocking base, an integrated system based on vision and a dual IMU (inertial measurement unit) configuration is built. The system fuses the attitude information from vision with the angular rate measurements of the dual IMUs by extended Kalman filter (EKF) to obtain the relative attitude. One IMU (master) is attached to the measured moving object and the other (slave) to the rocking base. Because the output of an inertial sensor is relative to the inertial frame, the angular rate of the master IMU includes not only the motion of the measured object relative to the inertial frame but also that of the rocking base relative to the inertial frame, where the latter is redundant, harmful movement information for relative attitude measurement between the measured object and the rocking base. The slave IMU serves to remove the motion of the rocking base relative to the inertial frame from the master IMU measurement. The proposed integrated attitude measurement system is tested on a practical experimental platform, and experimental results with superior precision and reliability show the feasibility and effectiveness of the proposed system.
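
    The core idea — subtract the slave (base) gyro rate from the master rate, then use slow vision updates to bound integration drift — can be sketched for a single axis. This is an illustrative complementary-filter simplification of the fusion, not the paper's EKF; all names and parameters are hypothetical:

```python
import numpy as np

def fuse_relative_attitude(master_rate, slave_rate, vision_angle, dt=0.01, k=0.1):
    """Single-axis estimate of the angle of an object relative to a rocking base.

    master_rate : gyro rates (rad/s) sensed on the moving object.
    slave_rate  : gyro rates (rad/s) sensed on the rocking base; differencing
                  removes the base motion from the master measurement.
    vision_angle: slow, drift-free attitude samples (rad) from the camera,
                  time-aligned with the gyro samples; NaN where no frame exists.
    k           : gain pulling the integrated estimate toward the vision angle.
    """
    angle = 0.0
    estimates = []
    for wm, ws, va in zip(master_rate, slave_rate, vision_angle):
        angle += (wm - ws) * dt           # integrate the relative angular rate
        if not np.isnan(va):              # vision frame available: correct drift
            angle += k * (va - angle)
        estimates.append(angle)
    return np.array(estimates)
```

    The differencing step is what makes the estimate valid in the non-inertial frame of the base; the vision correction plays the role the EKF update plays in the paper's full system.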

  8. Vision sensor and dual MEMS gyroscope integrated system for attitude determination on moving base

    NASA Astrophysics Data System (ADS)

    Guo, Xiaoting; Sun, Changku; Wang, Peng; Huang, Lu

    2018-01-01

    To determine the relative attitude between objects on a moving base and the base reference system with a MEMS (Micro-Electro-Mechanical Systems) gyroscope, the motion of the base is redundant information that must be removed from the gyroscope measurement. Our strategy is to add an auxiliary gyroscope attached to the reference system: the master gyroscope senses the total motion, and the auxiliary gyroscope senses the motion of the moving base. By a generalized difference method, the relative attitude in a non-inertial frame can be determined from the dual gyroscopes. With the vision sensor suppressing the cumulative drift of the MEMS gyroscope, a vision and dual MEMS gyroscope integrated system is formed. Coordinate system definitions and spatial transforms are executed to fuse inertial and visual data from different coordinate systems, and a nonlinear filter algorithm, the cubature Kalman filter, is used to fuse the slow visual data with the fast inertial data. A practical experimental setup is built and used to validate the feasibility and effectiveness of the proposed attitude determination system in a non-inertial frame on a moving base.

  9. Helicopter human factors

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.

    1988-01-01

    The state-of-the-art helicopter and its pilot are examined using the tools of human-factors analysis. The significant role of human error in helicopter accidents is discussed; the history of human-factors research on helicopters is briefly traced; the typical flight tasks are described; and the noise, vibration, and temperature conditions typical of modern military helicopters are characterized. Also considered are helicopter controls, cockpit instruments and displays, and the impact of cockpit design on pilot workload. Particular attention is given to possible advanced-technology improvements, such as control stabilization and augmentation, FBW and fly-by-light systems, multifunction displays, night-vision goggles, pilot night-vision systems, night-vision displays with superimposed symbols, target acquisition and designation systems, and aural displays. Diagrams, drawings, and photographs are provided.

  10. "Fly-by-Wireless" and Wireless Sensors Update

    NASA Technical Reports Server (NTRS)

    Studor, George F.

    2009-01-01

    This slide presentation reviews the use of wires in the aerospace industry. The vision is to minimize cables and connectors and increase functionality across the aerospace industry by providing reliable, lower-cost, modular, and higher-performance alternatives to wired data connectivity, benefiting the entire vehicle and program.

  11. Medical Handbook for Pilots.

    ERIC Educational Resources Information Center

    Federal Aviation Administration (DOT), Washington, DC.

    This handbook provides information on an airline pilot's physical and mental status and related medical factors which may affect his/her performance. Contents include information on the physical examination for pilots, the flyer's environment, hypoxia, hyperventilation, gas in the body, the ears, alcohol, drugs and flying, carbon monoxide, vision,…

  12. Human ophthalmomyiasis interna caused by Hypoderma tarandi, Northern Canada.

    PubMed

    Lagacé-Wiens, Philippe R S; Dookeran, Ravi; Skinner, Stuart; Leicht, Richard; Colwell, Douglas D; Galloway, Terry D

    2008-01-01

    Human myiasis caused by bot flies of nonhuman animals is rare but may be increasing. The treatment of choice is laser photocoagulation or vitrectomy with larva removal and intraocular steroids. Ophthalmomyiasis caused by Hypoderma spp. should be recognized as a potentially reversible cause of vision loss.

  13. Texture dependence of motion sensing and free flight behavior in blowflies

    PubMed Central

    Lindemann, Jens P.; Egelhaaf, Martin

    2013-01-01

    Many flying insects exhibit an active flight and gaze strategy: purely translational flight segments alternate with quick turns called saccades. To generate such a saccadic flight pattern, the animals decide the timing, direction, and amplitude of the next saccade during the previous translatory intersaccadic interval. The information underlying these decisions is assumed to be extracted from the retinal image displacements (optic flow), which scale with the distance to objects during the intersaccadic flight phases. In an earlier study we proposed a saccade-generation mechanism based on the responses of large-field motion-sensitive neurons. In closed-loop simulations we achieved collision avoidance behavior in a limited set of environments but observed collisions in others. Here we show by open-loop simulations that the cause of this observation is the known texture dependence of elementary motion detection in flies, reflected also in the responses of large-field neurons as used in our model. We verified by electrophysiological experiments that this result is not an artifact of the sensory model. Even subtle changes in the texture may lead to qualitative differences in the responses of both our model cells and their biological counterparts in the fly's brain. Nonetheless, free flight behavior of blowflies is only moderately affected by such texture changes. This divergent texture dependence of motion-sensitive neurons and behavioral performance suggests either mechanisms that compensate for the texture dependence of the visual motion pathway at the level of the circuits generating the saccadic turn decisions or the involvement of a hypothetical parallel pathway in saccadic control that provides the information for collision avoidance independent of the textural properties of the environment. PMID:23335890
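
    The texture dependence of elementary motion detection discussed above can be reproduced with a minimal Hassenstein-Reichardt correlator. This is a generic textbook sketch, not the authors' large-field neuron model; the delay, spatial frequencies, and grid sizes are illustrative:

```python
import numpy as np

def reichardt_response(stimulus, tau=5):
    """Mean output of a 1-D array of Hassenstein-Reichardt detectors.

    stimulus : 2-D array (time x space) of luminance values.
    tau      : delay (in samples) applied to one arm of each detector.
    Each detector multiplies the delayed signal at one photoreceptor with
    the undelayed signal at its neighbour and subtracts the mirror-image
    term, yielding a direction-selective response.
    """
    delayed = np.roll(stimulus, tau, axis=0)
    resp = delayed[:, :-1] * stimulus[:, 1:] - stimulus[:, :-1] * delayed[:, 1:]
    return resp[tau:].mean()   # discard the wrap-around transient

# Two gratings with identical texture drifting in opposite directions, and a
# third drifting at the same velocity (0.2 px/sample) but with finer texture:
t = np.arange(200)[:, None]
x = np.arange(40)[None, :]
rightward = np.sin(0.5 * x - 0.1 * t)
leftward = np.sin(0.5 * x + 0.1 * t)
finer = np.sin(1.5 * x - 0.3 * t)
```

    The sign of the mean response flips with direction, but its magnitude changes with spatial frequency even at fixed velocity — the texture dependence that the abstract identifies as the cause of the model's collision failures.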

  14. A New Flying Wire System for the Tevatron

    NASA Astrophysics Data System (ADS)

    Blokland, Willem; Dey, Joseph; Vogel, Greg

    1997-05-01

    A new Flying Wires system replaces the old system to enhance the analysis of the beam emittance, improve the reliability, and handle the upcoming upgrades of the Tevatron. New VME data acquisition modules and timing modules allow for more bunches to be sampled more precisely. The programming language LabVIEW, running on a Macintosh computer, controls the VME modules and the nuLogic motion board that flies the wires. LabVIEW also analyzes and stores the data, and handles local and remote commands. The new system flies three wires and fits profiles of 72 bunches to a Gaussian function within two seconds. A new console application operates the flying wires from any control console. This paper discusses the hardware and software setup, the capabilities and measurement results of the new Flying Wires system.
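
    The per-bunch analysis described — reducing each wire-scan profile to a centre and width — can be sketched with a moments-based estimate of the kind commonly used to seed a full Gaussian fit. This is illustrative only, not the LabVIEW implementation, and the synthetic profile is hypothetical:

```python
import numpy as np

def fit_gaussian_moments(positions, signal):
    """Estimate (mu, sigma) of a Gaussian-like wire-scan profile by moments.

    A crude baseline is removed first; mu is then the signal-weighted mean
    wire position and sigma the weighted r.m.s. width -- the quantities
    a per-bunch Gaussian fit would refine for the emittance calculation.
    """
    s = signal - signal.min()                 # crude baseline subtraction
    w = s / s.sum()                           # normalised profile weights
    mu = np.sum(w * positions)
    sigma = np.sqrt(np.sum(w * (positions - mu) ** 2))
    return mu, sigma
```

    Repeating such a fit for 72 bunch profiles within two seconds, as the system does, is a modest load for any modern processor; the original system's constraint was the acquisition and motion hardware rather than the fit itself.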

  15. Instability of the perceived world while watching 3D stereoscopic imagery: A likely source of motion sickness symptoms

    PubMed Central

    Hwang, Alex D.; Peli, Eli

    2014-01-01

    Watching 3D content using a stereoscopic display may cause various discomforting symptoms, including eye strain, blurred vision, double vision, and motion sickness. Numerous studies have reported motion-sickness-like symptoms during stereoscopic viewing, but no causal linkage between specific aspects of the presentation and the induced discomfort has been explicitly proposed. Here, we describe several causes, in which stereoscopic capture, display, and viewing differ from natural viewing resulting in static and, importantly, dynamic distortions that conflict with the expected stability and rigidity of the real world. This analysis provides a basis for suggested changes to display systems that may alleviate the symptoms, and suggestions for future studies to determine the relative contribution of the various effects to the unpleasant symptoms. PMID:26034562

  16. Rotary acceleration of a subject inhibits choice reaction time to motion in peripheral vision

    NASA Technical Reports Server (NTRS)

    Borkenhagen, J. M.

    1974-01-01

    Twelve pilots were tested in a rotation device with visual simulation, alone and in combination with rotary stimulation, in experiments with variable levels of acceleration and variable viewing angles, in a study of the effect of S's rotary acceleration on the choice reaction time for an accelerating target in peripheral vision. The pilots responded to the direction of the visual motion by moving a hand controller to the right or left. Visual-plus-rotary stimulation required a longer choice reaction time, which was inversely related to the level of acceleration and directly proportional to the viewing angle.

  17. Development of a body motion interactive system with a weight voting mechanism and computer vision technology

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Chen, Chia-Tse; Shei, Hung-Jung; Lay, Yun-Long; Chiu, Chuang-Chien

    2012-09-01

    This study develops a body motion interactive system with computer vision technology. The application combines interactive games, art performance, and an exercise training system, using multiple image processing and computer vision techniques. The system calculates the color characteristics of an object and then performs color segmentation. When an action judgment may be wrong, the system avoids the error with a weight voting mechanism, which sets a condition score and weight value for each candidate action judgment and chooses the best judgment by weighted vote. Finally, the study estimated the reliability of the system in order to make improvements. The results showed that this method achieves good accuracy and stability in operating the human-machine interface of the sports training system.
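
    The weight voting mechanism — accumulating condition score times weight for each candidate action judgment and selecting the highest total — can be sketched as follows. The action labels, scores, and weights are hypothetical, not values from the paper:

```python
def weighted_vote(judgments):
    """Choose the best action judgment from several detector outputs.

    judgments : list of (action_label, condition_score, weight) tuples,
        e.g. from colour-segmentation, shape, and motion cues.
    Each candidate action accumulates score * weight; the highest total
    wins, which suppresses a single erroneous detector.
    """
    totals = {}
    for action, score, weight in judgments:
        totals[action] = totals.get(action, 0.0) + score * weight
    return max(totals, key=totals.get)

# One confident but low-weight cue mislabels the action; the vote recovers it:
votes = [("raise_arm", 0.9, 0.5), ("raise_arm", 0.7, 0.3), ("squat", 0.95, 0.2)]
```

    Here "raise_arm" totals 0.66 against 0.19 for "squat", so the single wrong judgment is outvoted.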

  18. 2013-2363

    NASA Image and Video Library

    2013-05-15

    (left to right) NASA Langley aerospace engineer Bruce Jackson briefs astronauts Rex Walheim and Gregory Johnson about the Synthetic Vision (SV) and Enhanced Vision (EV) systems in a flight simulator at the center's Cockpit Motion Facility. The astronauts were training to land the Dream Chaser spacecraft May 15th 2013. credit NASA/David C. Bowman

  19. A smart telerobotic system driven by monocular vision

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.; Maccato, A.; Wlczek, P.; Denney, B.; Scheerer, J.

    1994-01-01

    A robotic system that accepts autonomously generated motion and control commands is described. The system provides images from the monocular vision of a camera mounted on a robot's end effector, eliminating the need for traditional guidance targets that must be predetermined and specifically identified. The telerobotic vision system presents different views of the targeted object relative to the camera, based on a single camera image and knowledge of the target's solid geometry.

  20. An Integrated Calibration Technique for Stereo Vision Systems (PREPRINT)

    DTIC Science & Technology

    2010-03-01

    An integrated calibration technique for stereo vision systems has been developed. To demonstrate and evaluate this calibration technique, multiple Wii Remotes (Wiimotes) from Nintendo were used to form stereo vision systems to perform 3D motion capture in real time. This integrated technique is a two-step process. Many researchers have successfully dealt with the problem of camera calibration by taking images from a 2D…

  1. Vision: a moving hill for spatial updating on the fly.

    PubMed

    Stanford, Terrence R

    2015-02-02

    A recent study reveals a dynamic neural map that provides a continuous representation of remembered visual stimulus locations with respect to constantly changing gaze. This finding suggests a new mechanistic framework for understanding the spatiotemporal dynamics of goal-directed action. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Tracking the "Lizardman": Writing Rotten to Write Well.

    ERIC Educational Resources Information Center

    Polette, Keith

    1995-01-01

    Suggests that students can improve their writing by being instructed on how to write badly. Applies the criteria of testability, tunnel-vision, excessive vagueness, flying in the face of established fact, and hazy authority to tabloid newspaper stories. Discusses how students can write their own "rotten" tabloid stories by taking these…

  3. Human Ophthalmomyiasis Interna Caused by Hypoderma tarandi, Northern Canada

    PubMed Central

    Dookeran, Ravi; Skinner, Stuart; Leicht, Richard; Colwell, Douglas D.; Galloway, Terry D.

    2008-01-01

    Human myiasis caused by bot flies of nonhuman animals is rare but may be increasing. The treatment of choice is laser photocoagulation or vitrectomy with larva removal and intraocular steroids. Ophthalmomyiasis caused by Hypoderma spp. should be recognized as a potentially reversible cause of vision loss. PMID:18258079

  4. On the role of spatial phase and phase correlation in vision, illusion, and cognition

    PubMed Central

    Gladilin, Evgeny; Eils, Roland

    2015-01-01

    Numerous findings indicate that spatial phase bears important cognitive information. Distortion of phase affects the topology of edge structures and makes images unrecognizable. In turn, appropriately phase-structured patterns give rise to various illusions of virtual image content and apparent motion. Despite a large body of phenomenological evidence, not much is known yet about the role of phase information in the neural mechanisms of visual perception and cognition. Here, we are concerned with the analysis of the role of spatial phase in computational and biological vision, the emergence of visual illusions, and pattern recognition. We hypothesize that the fundamental importance of phase information for invariant retrieval of structural image features and motion detection promoted the development of phase-based mechanisms of neural image processing in the course of evolution of biological vision. Using an extension of the Fourier phase correlation technique, we show that core functions of the visual system such as motion detection and pattern recognition can be facilitated by the same basic mechanism. Our analysis suggests that the emergence of visual illusions can be attributed to the presence of coherently phase-shifted repetitive patterns as well as the effects of acuity compensation by saccadic eye movements. We speculate that biological vision relies on perceptual mechanisms effectively similar to phase correlation, and predict neural features of visual pattern (dis)similarity that can be used for experimental validation of our hypothesis of “cognition by phase correlation.” PMID:25954190

  5. On the role of spatial phase and phase correlation in vision, illusion, and cognition.

    PubMed

    Gladilin, Evgeny; Eils, Roland

    2015-01-01

    Numerous findings indicate that spatial phase bears important cognitive information. Distortion of phase affects the topology of edge structures and makes images unrecognizable. In turn, appropriately phase-structured patterns give rise to various illusions of virtual image content and apparent motion. Despite a large body of phenomenological evidence, not much is known yet about the role of phase information in the neural mechanisms of visual perception and cognition. Here, we are concerned with the analysis of the role of spatial phase in computational and biological vision, the emergence of visual illusions, and pattern recognition. We hypothesize that the fundamental importance of phase information for invariant retrieval of structural image features and motion detection promoted the development of phase-based mechanisms of neural image processing in the course of evolution of biological vision. Using an extension of the Fourier phase correlation technique, we show that core functions of the visual system such as motion detection and pattern recognition can be facilitated by the same basic mechanism. Our analysis suggests that the emergence of visual illusions can be attributed to the presence of coherently phase-shifted repetitive patterns as well as the effects of acuity compensation by saccadic eye movements. We speculate that biological vision relies on perceptual mechanisms effectively similar to phase correlation, and predict neural features of visual pattern (dis)similarity that can be used for experimental validation of our hypothesis of "cognition by phase correlation."
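
    The Fourier phase correlation technique that the authors extend — whitening the cross-power spectrum so that only phase survives, then locating the peak of its inverse transform — can be sketched for global translation recovery. This is the standard textbook formulation, not the authors' extension:

```python
import numpy as np

def phase_correlation(a, b):
    """Recover the integer cyclic translation taking image `a` to image `b`.

    The cross-power spectrum is normalised to unit magnitude, keeping only
    phase information; its inverse FFT is a sharp peak at the displacement.
    """
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrap-around peak indices to signed shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx
```

    Because the peak location is invariant to the amplitude spectrum, the same mechanism serves both motion detection (the shift between frames) and pattern matching (alignment of a template), which is the dual role the abstract emphasizes.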

  6. An optimal control strategy for two-dimensional motion camouflage with non-holonomic constraints.

    PubMed

    Rañó, Iñaki

    2012-07-01

    Motion camouflage is a stealth behaviour observed both in hover-flies and in dragonflies. Existing controllers for mimicking motion camouflage generate this behaviour on an empirical basis or without considering the kinematic motion restrictions present in animal trajectories. This study summarises our formal contributions to solve the generation of motion camouflage as a non-linear optimal control problem. The dynamics of the system capture the kinematic restrictions to motion of the agents, while the performance index ensures camouflage trajectories. An extensive set of simulations support the technique, and a novel analysis of the obtained trajectories contributes to our understanding of possible mechanisms to obtain sensor based motion camouflage, for instance, in mobile robots.

  7. 1 kHz 2D Visual Motion Sensor Using 20 × 20 Silicon Retina Optical Sensor and DSP Microcontroller.

    PubMed

    Liu, Shih-Chii; Yang, MinHao; Steiner, Andreas; Moeckel, Rico; Delbruck, Tobi

    2015-04-01

    Optical flow sensors have been a long running theme in neuromorphic vision sensors which include circuits that implement the local background intensity adaptation mechanism seen in biological retinas. This paper reports a bio-inspired optical motion sensor aimed towards miniature robotic and aerial platforms. It combines a 20 × 20 continuous-time CMOS silicon retina vision sensor with a DSP microcontroller. The retina sensor has pixels that have local gain control and adapt to background lighting. The system allows the user to validate various motion algorithms without building dedicated custom solutions. Measurements are presented to show that the system can compute global 2D translational motion from complex natural scenes using one particular algorithm: the image interpolation algorithm (I2A). With this algorithm, the system can compute global translational motion vectors at a sample rate of 1 kHz, for speeds up to ±1000 pixels/s, using less than 5 k instruction cycles (12 instructions per pixel) per frame. At 1 kHz sample rate the DSP is 12% occupied with motion computation. The sensor is implemented as a 6 g PCB consuming 170 mW of power.
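
    The global translational motion computation described above can be illustrated, in its small-shift limit, by solving the brightness-constancy normal equations over the whole image. This gradient-based sketch is a simplification related to, but not identical with, the image interpolation algorithm (I2A), and is not the DSP implementation:

```python
import numpy as np

def global_translation(f0, f1):
    """Estimate a small global shift (dx, dy) with f1(x, y) ~ f0(x - dx, y - dy).

    Linearising brightness constancy gives f1 - f0 ~ -dx*gx - dy*gy; the
    2x2 normal equations are then solved over all pixels, as in the
    small-shift limit of interpolation-based estimators such as I2A.
    Valid for sub-pixel to roughly one-pixel inter-frame motions.
    """
    gy, gx = np.gradient(f0.astype(float))    # spatial gradients of frame 0
    dt = (f1 - f0).astype(float)              # temporal difference
    A = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    b = np.array([np.sum(gx * dt), np.sum(gy * dt)])
    dx, dy = np.linalg.solve(A, -b)
    return dx, dy
```

    Because the estimate reduces to sums of pixelwise products, a few instructions per pixel suffice, which is consistent with the cycle budget the abstract reports for the DSP.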

  8. [Evaluation of Motion Sickness Induced by 3D Video Clips].

    PubMed

    Matsuura, Yasuyuki; Takada, Hiroki

    2016-01-01

    The use of stereoscopic images has been spreading rapidly. Nowadays, stereoscopic movies are nothing new to people. Stereoscopic systems date back to 280 B.C., when Euclid first recognized the concept of depth perception by humans. Despite the increase in the production of three-dimensional (3D) display products and many studies on stereoscopic vision, the effect of stereoscopic vision on the human body has been insufficiently understood. However, symptoms such as eye fatigue and 3D sickness have been concerns when viewing 3D films for a prolonged period of time; therefore, it is important to consider the safety of viewing virtual 3D contents as a contribution to society. It is generally explained to the public that accommodation and convergence are mismatched during stereoscopic vision and that this is the main reason for the visual fatigue and visually induced motion sickness (VIMS) during 3D viewing. We have devised a method to simultaneously measure lens accommodation and convergence, and used this simultaneous measurement device to characterize 3D vision. Fixation distance was compared between accommodation and convergence during the viewing of 3D films with repeated measurements. Time courses of these fixation distances and their distributions were compared in subjects who viewed 2D and 3D video clips. The results indicated that after 90 s of continuously viewing 3D images, the accommodative power does not correspond to the distance of convergence. In this paper, remarks on methods to measure the severity of motion sickness induced by viewing 3D films are also given. From the epidemiological viewpoint, it is useful to obtain novel knowledge for the reduction and/or prevention of VIMS. We should accumulate empirical data on motion sickness, which may contribute to the development of relevant fields in science and technology.

  9. Robust Motion Vision For A Vehicle Moving On A Plane

    NASA Astrophysics Data System (ADS)

    Moni, Shankar; Weldon, E. J.

    1987-05-01

    A vehicle equipped with a computer vision system moves on a plane. We show that, subject to certain constraints, the system can determine the motion of the vehicle (one rotational and two translational degrees of freedom) and the depth of the scene in front of the vehicle. The constraints include limits on the speed of the vehicle, presence of texture on the plane, and absence of pitch and roll in the vehicular motion. It is possible to decouple the problems of finding the vehicle's motion and the depth of the scene by using two rigidly connected cameras: one views a field with known depth (i.e., the ground plane) and estimates the motion parameters, and the other determines the depth map knowing the motion parameters. The motion is constrained to be planar to increase robustness. We use a least squares method to fit the vehicle motion to observed brightness gradients. With this method, no correspondence between image points needs to be established, and information from the entire image is used in calculating motion. The algorithm performs very reliably on real image sequences, and these results have been included. The results compare favourably to the performance of the algorithm of Negahdaripour and Horn [2], where six degrees of freedom are assumed.

  10. Development of a vision non-contact sensing system for telerobotic applications

    NASA Astrophysics Data System (ADS)

    Karkoub, M.; Her, M.-G.; Ho, M.-I.; Huang, C.-C.

    2013-08-01

    The study presented here describes a novel vision-based motion detection system for telerobotic operations such as distant surgical procedures. The system uses a CCD camera and image processing to detect the motion of a master robot or operator. Colour tags are placed on the arm and head of a human operator to detect the up/down and right/left motion of the head as well as the right/left motion of the arm, and the motion of the tags is used to actuate a slave robot or remote system. The colour tags' motion is determined through image processing using eigenvectors and colour system morphology, and the relative head, shoulder, and wrist rotation angles are obtained through inverse dynamics and coordinate transformation. A program transforms this motion data into motor control commands and transmits them to a slave robot or remote system through wireless internet. The system performed well even in complex environments, with errors that did not exceed 2 pixels and a response time of about 0.1 s. The results of the experiments are available at: http://www.youtube.com/watch?v=yFxLaVWE3f8 and http://www.youtube.com/watch?v=_nvRcOzlWHw

  11. Vision in Flies: Measuring the Attention Span

    PubMed Central

    Koenig, Sebastian; Wolf, Reinhard; Heisenberg, Martin

    2016-01-01

    A visual stimulus at a particular location of the visual field may elicit a behavior while at the same time equally salient stimuli in other parts do not. This property of visual systems is known as selective visual attention (SVA). The animal is said to have a focus of attention (FoA) which it has shifted to a particular location. Visual attention normally involves an attention span at the location to which the FoA has been shifted. Here the attention span is measured in Drosophila. The fly is tethered and hence has its eyes fixed in space. It can shift its FoA internally. This shift is revealed using two simultaneous test stimuli with characteristic responses at their particular locations. In tethered flight a wild type fly keeps its FoA at a certain location for up to 4s. Flies with a mutation in the radish gene, that has been suggested to be involved in attention-like mechanisms, display a reduced attention span of only 1s. PMID:26848852

  12. Vision in Flies: Measuring the Attention Span.

    PubMed

    Koenig, Sebastian; Wolf, Reinhard; Heisenberg, Martin

    2016-01-01

    A visual stimulus at a particular location of the visual field may elicit a behavior while at the same time equally salient stimuli in other parts do not. This property of visual systems is known as selective visual attention (SVA). The animal is said to have a focus of attention (FoA) which it has shifted to a particular location. Visual attention normally involves an attention span at the location to which the FoA has been shifted. Here the attention span is measured in Drosophila. The fly is tethered and hence has its eyes fixed in space. It can shift its FoA internally. This shift is revealed using two simultaneous test stimuli with characteristic responses at their particular locations. In tethered flight a wild type fly keeps its FoA at a certain location for up to 4s. Flies with a mutation in the radish gene, that has been suggested to be involved in attention-like mechanisms, display a reduced attention span of only 1s.

  13. Parallel implementation and evaluation of motion estimation system algorithms on a distributed memory multiprocessor using knowledge based mappings

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.

    1989-01-01

    Several techniques for static and dynamic load balancing in vision systems are presented. These techniques are novel in that they capture the computational requirements of a task by examining the data when they are produced. Furthermore, they can be applied to many vision systems because many algorithms in different systems are either the same or have similar computational characteristics. The techniques are evaluated by applying them to a parallel implementation of the algorithms of a motion estimation system on a hypercube multiprocessor. The motion estimation system consists of the following steps: (1) extraction of features; (2) stereo match of images at one time instant; (3) time match of images from different time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters. It is shown that the performance gains from these data decomposition and load balancing techniques are significant, while their overhead is minimal.
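    The static decomposition idea above can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's knowledge-based mapping: each task's cost is taken as known when its data are produced, and tasks are assigned to the least-loaded processor, heaviest first (a greedy longest-processing-time heuristic).

```python
# Illustrative static load balancing by data decomposition: assign each
# task to the currently least-loaded processor, heaviest tasks first.
import heapq

def balance(costs, n_procs):
    """Return a list of (load, proc_id, task_indices) tuples."""
    heap = [(0.0, p, []) for p in range(n_procs)]  # (load, proc id, tasks)
    heapq.heapify(heap)
    for i in sorted(range(len(costs)), key=lambda i: -costs[i]):
        load, p, tasks = heapq.heappop(heap)       # least-loaded processor
        tasks.append(i)
        heapq.heappush(heap, (load + costs[i], p, tasks))
    return sorted(heap, key=lambda t: t[1])

# Example: six image regions with measured feature-extraction costs,
# mapped onto two processors.
assignment = balance([5.0, 3.0, 2.0, 2.0, 1.0, 1.0], 2)
```

    With these example costs the two processors end up with equal loads, which is the point of weighting the decomposition by measured cost rather than by region count.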

  14. Student's experiment to fly on third Shuttle mission

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A spaceborne student experiment on insect motion during weightlessness scheduled to fly on the third flight of the space shuttle is described. The experiment will focus on the flight behavior in zero gravity of two species of flying insects with differing ratios of body mass to wing area, the velvetbean caterpillar moth and the honeybee drone. Ten insects of each species will be carried in separate canisters. The crew will remove the canisters from the storage locker and attach them to the mid-deck wall, where the insects will be observed and filmed by a data acquisition camera.

  15. A Comparison of the AVS-9 and the Panoramic Night Vision Goggle During Rotorcraft Hover and Landing

    NASA Technical Reports Server (NTRS)

    Szoboszlay, Zoltan; Haworth, Loran; Simpson, Carol; Rutkowski, Michael (Technical Monitor)

    2001-01-01

    The purpose of this flight test was to measure any differences in pilot-vehicle performance and pilot opinion between the use of the current generation AVS-9 Night Vision Goggle and one variant of the prototype Panoramic Night Vision Goggle (the PNVGII). The PNVGII has more than double the horizontal field-of-view of the AVS-9, but reduced image quality. The flight path of the AH-1S helicopter was used as a measure of pilot-vehicle performance. Also recorded were subjective measures of flying qualities, physical reserves of the pilot, situational awareness, and display usability. Pilot comments and data indicate that the benefit of the PNVGII's additional field-of-view is to some extent negated by its reduced image quality.

  16. Experimental and Analytic Evaluation of the Effects of Visual and Motion Simulation in SH-3 Helicopter Training. Technical Report 85-002.

    ERIC Educational Resources Information Center

    Pfeiffer, Mark G.; Scott, Paul G.

    A fly-only group (N=16) of Navy replacement pilots undergoing fleet readiness training in the SH-3 helicopter was compared with groups pre-trained on Device 2F64C with: (1) visual only (N=13); (2) no visual/no motion (N=14); and (3) one visual plus motion group (N=19). Groups were compared for their SH-3 helicopter performance in the transition…

  17. Real-Time Implementation of an Asynchronous Vision-Based Target Tracking System for an Unmanned Aerial Vehicle

    DTIC Science & Technology

    2007-06-01

    Chin Khoon Quek. “Vision Based Control and Target Range Estimation for Small Unmanned Aerial Vehicle.” Master’s Thesis, Naval Postgraduate School...December 2005. [6] Kwee Chye Yap. “Incorporating Target Mensuration System for Target Motion Estimation Along a Road Using Asynchronous Filter

  18. Range Image Flow using High-Order Polynomial Expansion

    DTIC Science & Technology

    2013-09-01

    included as a default algorithm in the OpenCV library [2]. The research of estimating the motion between range images, or range flow, is much more...Journal of Computer Vision, vol. 92, no. 1, pp. 1‒31. 2. G. Bradski and A. Kaehler. 2008. Learning OpenCV: Computer Vision with the OpenCV Library

  19. 76 FR 55109 - In the Matter of Certain DC-DC Controllers and Products Containing Same; Notice of Institution of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-06

    ... named the following respondents: VisionTek Products LLC (``VisionTek'') of Inverness, Illinois; uPI Semiconductor Corp. (``uPI'') of Taiwan; Sapphire Technology Limited (``Sapphire'') of Hong Kong; Advanced Micro...'') initial determination (``ID'') granting uPI's and Sapphire's joint motion to terminate the investigation...

  20. Algorithms and architectures for robot vision

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S.

    1990-01-01

    The scope of the current work is to develop practical sensing implementations for robots operating in complex, partially unstructured environments. A focus of this work is to develop object models and estimation techniques which are specific to the requirements of robot locomotion, approach and avoidance, and grasp and manipulation. Such problems have to date received limited attention in either computer or human vision - in essence, asking not only how perception is modeled in general, but also what the functional purpose of its underlying representations is. As in the past, researchers are drawing on ideas from both the psychological and machine vision literature. Of particular interest is the development of 3-D shape and motion estimates for complex objects given only partial and uncertain information that is incrementally accrued over time. Current studies consider the use of surface motion, contour, and texture information, with the longer range goal of developing a fused sensing strategy based on these sources and others.

  1. Comparison of Artificial Immune System and Particle Swarm Optimization Techniques for Error Optimization of Machine Vision Based Tool Movements

    NASA Astrophysics Data System (ADS)

    Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod

    2015-10-01

    In a conventional tool positioning technique, sensors embedded in the motion stages provide accurate tool position information. In this paper, a machine vision based system and image processing technique are described for measuring the motion of a lathe tool from two-dimensional sequential images captured using a charge coupled device camera with a resolution of 250 microns. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the value of the distance traversed by the tool calculated from these images. Errors in lathe tool movement due to the machine vision system, calibration, environmental factors, etc. were optimized using two soft computing techniques, namely artificial immune system (AIS) and particle swarm optimization (PSO). The results show better capability of AIS over PSO.
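    As a hedged sketch of how one of the two optimizers works, the following minimal particle swarm optimization loop minimizes a toy one-dimensional error function; the objective, gains, and swarm size are illustrative assumptions, not the error model or parameters used in the paper.

```python
# Minimal particle swarm optimization (PSO) on a 1-D search interval.
import random

def pso(err, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n)]      # particle positions
    v = [0.0] * n                                    # particle velocities
    pbest = x[:]                                     # personal bests
    gbest = min(pbest, key=err)                      # global best
    for _ in range(iters):
        for i in range(n):
            v[i] = (w * v[i]
                    + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))     # clamp to the interval
            if err(x[i]) < err(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=err)
    return gbest

# Toy calibration-error surface with a known minimum at 2.5
best = pso(lambda d: (d - 2.5) ** 2, 0.0, 10.0)
```

    The swarm converges on the minimum of the quadratic toy objective; a real calibration-error objective would replace the lambda.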

  2. How Magnus Bends the Flying Ball - Experimenting and Modeling

    NASA Astrophysics Data System (ADS)

    Timková, V.; Ješková, Z.

    2017-02-01

    Students are well aware of the deflection of sports balls when they are given a spin. A volleyball, tennis, or table tennis ball served with topspin experiences an additional downward force that makes the ball difficult to catch and return. In soccer, sidespin causes the ball to curve unexpectedly sideways, resulting in the so-called banana kick that can confuse the goalkeeper. These surprising effects can be used to capture students' interest in the physics behind the motion of sports balls. However, studying and analyzing the motion of a real ball kicked on a playing field is not an easy task. Instead of full-scale sports ball motion, simpler experiments can be designed and carried out in the classroom. Moreover, digital technologies available at schools enable students to collect experimental data easily and in a reasonable time. A mathematical model based on an analysis of the forces acting on the ball in flight can then be used to simulate the motion, and its parameters adjusted until the best correspondence with the measurements is found, so that students understand the basic physical principles of the motion.
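    A classroom model of this kind can be sketched in a few lines. The simulation below integrates gravity, quadratic drag, and a Magnus force perpendicular to the velocity for a topspin ball; the coefficients are illustrative assumptions, not values fitted in the article.

```python
# Minimal 2-D flight of a spinning ball: gravity, quadratic drag, and a
# Magnus force perpendicular to the velocity (topspin pushes the ball down).
import math

def simulate(v0, angle_deg, spin_lift=0.2, drag=0.01, dt=0.001, g=9.81):
    """Return the horizontal range of a ball launched from the ground."""
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        ax = -drag * speed * vx + spin_lift * vy   # drag + Magnus, x
        ay = -g - drag * speed * vy - spin_lift * vx  # gravity + drag + Magnus, y
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

# With these constants, topspin shortens the range relative to no spin.
range_topspin = simulate(20.0, 45.0)
range_nospin = simulate(20.0, 45.0, spin_lift=0.0)
```

    Comparing the simulated trajectory against video-tracked data and tuning `spin_lift` and `drag` is the "best correspondence" step described above.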

  3. Biomechanical analysis of the sidearm throwing motion for distance of a flying disc: a comparison of skilled and unskilled ultimate players.

    PubMed

    Sasakawa, Kei; Sakurai, Shinji

    2008-09-01

    Joint angles of the throwing limb were examined from the acceleration phase up until release for the sidearm throwing motion when using a flying disc. 17 individuals (ten skilled, seven unskilled) threw a disc as far as possible ten times. Throwing motions were recorded using three-dimensional high-speed videography. The initial condition of disc release and joint angle kinematics of the upper limb during the throwing motion were obtained. Mean (+/- standard deviation) throwing distance and disc spin rate were significantly greater for skilled throwers (51.4 +/- 6.6 m, 12.9 +/- 1.3 rps) than for unskilled throwers (29.5 +/- 7.6 m, 9.4 +/- 1.3 rps), although there was no significant difference in initial velocity of the disc between the two groups (skilled: 21.7 +/- 1.7 m/s; unskilled: 20.7 +/- 2.5 m/s). A marked difference in motion of supination/pronation of the forearm before disc release was identified, with the forearm supinated in the final acceleration phase leading up to disc release for the unskilled participants, while the forearm was pronated in the same phase for the skilled participants. These differences in joint kinematics could be related to differences in disc spin rate, and thus led to the substantial differences in throwing distance.

  4. Does the Owl Fly out of the Tree or Does the Owl Exit the Tree Flying? How L2 Learners Overcome Their L1 Lexicalization Biases

    ERIC Educational Resources Information Center

    Song, Lulu; Pulverman, Rachel; Pepe, Christina; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathy

    2016-01-01

    Learning a language is more than learning its vocabulary and grammar. For example, compared with English, Spanish uses many more path verbs such as "ascender" ("to move upward") and "salir" ("to go out"), and expresses manner of motion optionally. English, in contrast, has many manner verbs (e.g., "run,…

  5. Dynamics of contextual modulation of perceived shape in human vision

    PubMed Central

    Gheorghiu, Elena; Kingdom, Frederick A. A.

    2017-01-01

    In biological vision, contextual modulation refers to the influence of a surround pattern on either the perception of, or the neural responses to, a target pattern. One studied form of contextual modulation deals with the effect of a surround texture on the perceived shape of a contour, in the context of the phenomenon known as the shape aftereffect. In the shape aftereffect, prolonged viewing, or adaptation to a particular contour’s shape causes a shift in the perceived shape of a subsequently viewed contour. Shape aftereffects are suppressed when the adaptor contour is surrounded by a texture of similarly-shaped contours, a surprising result given that the surround contours are all potential adaptors. Here we determine the motion and temporal properties of this form of contextual modulation. We varied the relative motion directions, speeds and temporal phases between the central adaptor contour and the surround texture and measured for each manipulation the degree to which the shape aftereffect was suppressed. Results indicate that contextual modulation of shape processing is selective to motion direction, temporal frequency and temporal phase. These selectivities are consistent with one aim of vision being to segregate contours that define objects from those that form textured surfaces. PMID:28230085

  6. Micro-calibration of space and motion by photoreceptors synchronized in parallel with cortical oscillations: A unified theory of visual perception.

    PubMed

    Jerath, Ravinder; Cearley, Shannon M; Barnes, Vernon A; Jensen, Mike

    2018-01-01

    A fundamental function of the visual system is detecting motion, yet visual perception is poorly understood. Current research has determined that the retina and ganglion cells elicit responses for motion detection; however, the underlying mechanism is incompletely understood. Previously we proposed that retinogeniculo-cortical oscillations and photoreceptors work in parallel to process vision. Here we propose that motion could also be processed within the retina, and not in the brain as current theory suggests. In this paper, we discuss: 1) internal neural space formation; 2) primary, secondary, and tertiary roles of vision; 3) gamma as the secondary role; and 4) synchronization and coherence. Movement within the external field is instantly detected by primary processing within the space formed by the retina, providing a unified view of the world from an internal point of view. Our new theory begins to answer questions about: 1) perception of space, erect images, and motion; 2) the purpose of lateral inhibition; 3) the speed of visual perception; and 4) how peripheral color vision occurs without a large population of cones located peripherally in the retina. We explain that strong oscillatory activity influences brain activity and is necessary for: 1) visual processing, and 2) formation of the internal visuospatial area necessary for visual consciousness, which could allow rods to receive precise visual and visuospatial information, while retinal waves could link the lateral geniculate body with the cortex to form a neural space based on membrane potential oscillations and photoreceptors. We propose that vision is tripartite, with three components that allow a person to make sense of the world, terming them the "primary, secondary, and tertiary roles" of vision. Finally, we propose that gamma waves that are higher in strength and volume allow communication among the retina, thalamus, and various areas of the cortex; synchronization brings cortical faculties to the retina, while the thalamus is the link that couples the retina to the rest of the brain through gamma oscillation activity. This novel theory lays groundwork for further research by providing a theoretical understanding that expands upon the functions of the retina, photoreceptors, and retinal plexus to include the parallel processing needed to form the internal visual space that we perceive as the external world. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Robotic Attention Processing And Its Application To Visual Guidance

    NASA Astrophysics Data System (ADS)

    Barth, Matthew; Inoue, Hirochika

    1988-03-01

    This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system that was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using these attention skills was developed, involving detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough for the real-time vision processing tasks of playing a video 'pong' game, and later operating an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than a human's, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both the direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing for robotic attention processing.

  8. On the Geometry of Visual Correspondence

    DTIC Science & Technology

    1994-07-01

    from point and line matches. In Proc. International Conference on Computer Vision, pages 25-34, 1987. [11] O. Faugeras and S. Maybank. Motion from...image. Proceedings of the Royal Society, London B, 208:385-397, 1980. [23] S. Maybank. Theory of Reconstruction from Image Motion. Springer, Berlin

  9. Falcons pursue prey using visual motion cues: new perspectives from animal-borne cameras

    PubMed Central

    Kane, Suzanne Amador; Zamani, Marjon

    2014-01-01

    This study reports on experiments on falcons wearing miniature videocameras mounted on their backs or heads while pursuing flying prey. Videos of hunts by a gyrfalcon (Falco rusticolus), gyrfalcon (F. rusticolus)/Saker falcon (F. cherrug) hybrids and peregrine falcons (F. peregrinus) were analyzed to determine apparent prey positions on their visual fields during pursuits. These video data were then interpreted using computer simulations of pursuit steering laws observed in insects and mammals. A comparison of the empirical and modeling data indicates that falcons use cues due to the apparent motion of prey on the falcon's visual field to track and capture flying prey via a form of motion camouflage. The falcons also were found to maintain their prey's image at visual angles consistent with using their shallow fovea. These results should prove relevant for understanding the co-evolution of pursuit and evasion, as well as the development of computer models of predation and the integration of sensory and locomotion systems in biomimetic robots. PMID:24431144

  10. Falcons pursue prey using visual motion cues: new perspectives from animal-borne cameras.

    PubMed

    Kane, Suzanne Amador; Zamani, Marjon

    2014-01-15

    This study reports on experiments on falcons wearing miniature videocameras mounted on their backs or heads while pursuing flying prey. Videos of hunts by a gyrfalcon (Falco rusticolus), gyrfalcon (F. rusticolus)/Saker falcon (F. cherrug) hybrids and peregrine falcons (F. peregrinus) were analyzed to determine apparent prey positions on their visual fields during pursuits. These video data were then interpreted using computer simulations of pursuit steering laws observed in insects and mammals. A comparison of the empirical and modeling data indicates that falcons use cues due to the apparent motion of prey on the falcon's visual field to track and capture flying prey via a form of motion camouflage. The falcons also were found to maintain their prey's image at visual angles consistent with using their shallow fovea. These results should prove relevant for understanding the co-evolution of pursuit and evasion, as well as the development of computer models of predation and the integration of sensory and locomotion systems in biomimetic robots.

  11. Fly with Me: How Stanley Park High School Developed an Alternative Vision and Practice, as Told through the Narrative of Four Teachers

    ERIC Educational Resources Information Center

    Davies, Mike

    2018-01-01

    This article introduces texts by practitioners at Stanley Park High School, links these to articles about the school in the previous issue of "FORUM," and endorses the continuing commitment at Stanley Park to encouraging a thriving learning culture.

  12. Assessment and Development of Oculomotor Flying Skills by the Application of the Channel Theory of Vision.

    DTIC Science & Technology

    1983-11-04

    visual acuity in amblyopia, using steady-state visual evoked potentials. In J. E. Desmedt (Ed.), Visual evoked potentials in man: new developments... amblyopia by the evoked potential method. Ophthalmologica, 1977, 175, 159-164. 61. Regan, D. & Spekreijse, H. Auditory-visual interactions and the

  13. Rotary-wing flight test methods used for the evaluation of night vision devices

    NASA Astrophysics Data System (ADS)

    Haworth, Loran A.; Blanken, Christopher J.; Szoboszlay, Zoltan P.

    2001-08-01

    The U.S. Army Aviation mission includes flying helicopters at low altitude, at night, and in adverse weather. Night Vision Devices (NVDs) are used to supplement the pilot's visual cues for night flying. As the military requirement to conduct night helicopter operations has increased, quantifying the impact of helicopter flight operations with NVD technology in the Degraded Visual Environment (DVE) has become increasingly important. Aeronautical Design Standard-33 (ADS-33) was introduced to update rotorcraft handling qualities requirements and to quantify the impact of NVDs in the DVE. As reported in this paper, the flight test methodology in ADS-33 has been used by the handling qualities community to measure the impact of NVDs on task performance in the DVE. This paper provides the background and rationale behind the development of the ADS-33 flight test methodology for handling qualities in the DVE, as well as the test methodology developed for human factors assessment of NVDs in the DVE. Lessons learned, shortcomings, and recommendations for NVD flight test methodology are provided.

  14. Neural Action Fields for Optic Flow Based Navigation: A Simulation Study of the Fly Lobula Plate Network

    PubMed Central

    Borst, Alexander; Weber, Franz

    2011-01-01

    Optic flow based navigation is a fundamental way of visual course control described in many different species including man. In the fly, an essential part of optic flow analysis is performed in the lobula plate, a retinotopic map of motion in the environment. There, the so-called lobula plate tangential cells possess large receptive fields with different preferred directions in different parts of the visual field. Previous studies demonstrated an extensive connectivity between different tangential cells, providing, in principle, the structural basis for their large and complex receptive fields. We present a network simulation of the tangential cells, comprising most of the neurons studied so far (22 on each hemisphere) with all the known connectivity between them. On their dendrite, model neurons receive input from a retinotopic array of Reichardt-type motion detectors. Model neurons exhibit receptive fields much like their natural counterparts, demonstrating that the connectivity between the lobula plate tangential cells indeed can account for their complex receptive field structure. We describe the tuning of a model neuron to particular types of ego-motion (rotation as well as translation around/along a given body axis) by its ‘action field’. As we show for model neurons of the vertical system (VS-cells), each of them displays a different type of action field, i.e., responds maximally when the fly is rotating around a particular body axis. However, the tuning width of the rotational action fields is relatively broad, comparable to the one with dendritic input only. The additional intra-lobula-plate connectivity mainly reduces their translational action field amplitude, i.e., their sensitivity to translational movements along any body axis of the fly. PMID:21305019
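    The Reichardt-type motion detectors feeding the model neurons can be sketched in a few lines. This is a generic, hedged illustration of the correlation scheme (first-order low-pass delays, mirror-symmetric subtraction), not the simulation code of the paper.

```python
# Minimal Reichardt correlation detector: each photoreceptor signal is
# delayed by a first-order low-pass filter and cross-correlated with its
# undelayed neighbour; the two mirror half-detectors are subtracted to
# give a direction-selective output.
def reichardt(left, right, tau=0.7):
    """Return detector responses for two photoreceptor time series."""
    dl = dr = 0.0                         # low-pass (delayed) copies
    out = []
    for l, r in zip(left, right):
        out.append(dl * r - dr * l)       # delayed-left x right, minus mirror
        dl += (1.0 - tau) * (l - dl)      # update the low-pass filters
        dr += (1.0 - tau) * (r - dr)
    return out

# A pattern moving left-to-right reaches `right` one step after `left`,
# giving a net positive (preferred-direction) response; the reverse
# motion gives the opposite sign.
left = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]
right = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
preferred = sum(reichardt(left, right))
null = sum(reichardt(right, left))
```

    A retinotopic array of such detectors, summed on a model dendrite with direction-dependent weights, is the input stage the simulation study builds on.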

  15. Controlling free flight of a robotic fly using an onboard vision sensor inspired by insect ocelli

    PubMed Central

    Fuller, Sawyer B.; Karpelson, Michael; Censi, Andrea; Ma, Kevin Y.; Wood, Robert J.

    2014-01-01

    Scaling a flying robot down to the size of a fly or bee requires advances in manufacturing, sensing and control, and will provide insights into mechanisms used by their biological counterparts. Controlled flight at this scale has previously required external cameras to provide the feedback to regulate the continuous corrective manoeuvres necessary to keep the unstable robot from tumbling. One stabilization mechanism used by flying insects may be to sense the horizon or Sun using the ocelli, a set of three light sensors distinct from the compound eyes. Here, we present an ocelli-inspired visual sensor and use it to stabilize a fly-sized robot. We propose a feedback controller that applies torque in proportion to the angular velocity of the source of light estimated by the ocelli. We demonstrate theoretically and empirically that this is sufficient to stabilize the robot's upright orientation. This constitutes the first known use of onboard sensors at this scale. Dipteran flies use halteres to provide gyroscopic velocity feedback, but it is unknown how other insects such as honeybees stabilize flight without these sensory organs. Our results, using a vehicle of similar size and dynamics to the honeybee, suggest how the ocelli could serve this role. PMID:24942846
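    The control law reported above (torque proportional to the ocelli-estimated angular velocity of the light source) can be illustrated with a toy simulation. The first-order unstable attitude mode, the gains, and the assumption that the light bearing simply tracks the tilt angle are all illustrative, not the robot's actual dynamics.

```python
# Toy attitude loop: a velocity-unstable mode (lam > 0) is damped by
# torque proportional to the differentiated ocelli light-bearing signal.
def simulate_attitude(k_d=2.0, lam=0.5, dt=0.001, steps=5000, omega0=0.5):
    """Integrate tilt angle and angular velocity; return (theta, omega)."""
    theta, omega = 0.0, omega0
    prev_light = theta
    for _ in range(steps):
        light = theta                           # ocelli bearing ~ tilt angle
        omega_est = (light - prev_light) / dt   # differentiated ocelli signal
        prev_light = light
        torque = -k_d * omega_est               # torque prop. to angular velocity
        omega += (lam * omega + torque) * dt    # unstable without control
        theta += omega * dt
    return theta, omega

# With feedback the unstable mode decays; without it (k_d = 0) it diverges.
_, omega_controlled = simulate_attitude()
_, omega_open_loop = simulate_attitude(k_d=0.0)
```

    The damping term works through a one-step estimation delay, loosely analogous to the short sensor latency that makes velocity feedback stabilizing in the paper's analysis.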

  16. Application of aircraft navigation sensors to enhanced vision systems

    NASA Technical Reports Server (NTRS)

    Sweet, Barbara T.

    1993-01-01

    In this presentation, the applicability of various aircraft navigation sensors to enhanced vision system design is discussed. First, the accuracy requirements of the FAA for precision landing systems are presented, followed by the current navigation systems and their characteristics. These systems include Instrument Landing System (ILS), Microwave Landing System (MLS), Inertial Navigation, Altimetry, and Global Positioning System (GPS). Finally, the use of navigation system data to improve enhanced vision systems is discussed. These applications include radar image rectification, motion compensation, and image registration.

  17. Characterization of the Structure and Function of the Normal Human Fovea Using Adaptive Optics Scanning Laser Ophthalmoscopy

    NASA Astrophysics Data System (ADS)

    Putnam, Nicole Marie

    In order to study the limits of spatial vision in normal human subjects, it is important to look at and near the fovea. The fovea is the specialized part of the retina, the light-sensitive multi-layered neural tissue lining the inner surface of the human eye, where the cone photoreceptors are smallest (approximately 2.5 microns or 0.5 arcmin) and cone density reaches a peak. In addition, there is a 1:1 mapping from the photoreceptors to the brain in this central region of the retina. As a result, the best spatial sampling is achieved in the fovea, and it is the retinal location used for acuity and spatial vision tasks. However, vision is typically limited by the blur induced by the normal optics of the eye, which also limits clinical tests of foveal vision and foveal imaging. As a result, it is unclear what the perceptual benefit of extremely high cone density is. Cutting-edge imaging technology, specifically Adaptive Optics Scanning Laser Ophthalmoscopy (AOSLO), can be utilized to remove this blur, zoom in, and thereby visualize individual cone photoreceptors throughout the central fovea. This imaging, combined with simultaneous image stabilization and targeted stimulus delivery, expands our understanding of both the anatomical structure of the fovea on a microscopic scale and the placement of stimuli within this retinal area during visual tasks. The final step is to investigate the role of temporal variables in spatial vision tasks, since the eye is in constant motion even during steady fixation; to learn more about the fovea, it is important to study the effect of this motion on spatial vision tasks. This dissertation steps through many of these considerations, starting with a model of the foveal cone mosaic imaged with AOSLO. We then use this high resolution imaging to compare anatomical and functional markers of the center of the normal human fovea. Finally, we investigate the role of natural and manipulated fixational eye movements in foveal vision, specifically looking at a motion detection task, contrast sensitivity, and image fading.

  18. Control of self-motion in dynamic fluids: fish do it differently from bees.

    PubMed

    Scholtyssek, Christine; Dacke, Marie; Kröger, Ronald; Baird, Emily

    2014-05-01

    To detect and avoid collisions, animals need to perceive and control the distance and the speed with which they are moving relative to obstacles. This is especially challenging for swimming and flying animals that must control movement in a dynamic fluid without reference from physical contact with the ground. Flying animals primarily rely on optic flow to control flight speed and distance to obstacles. Here, we investigate whether swimming animals use self-motion control strategies similar to those of flying animals by directly comparing the trajectories of zebrafish (Danio rerio) and bumblebees (Bombus terrestris) moving through the same experimental tunnel. As the animals moved through the tunnel, black and white patterns on the walls produced (i) strong horizontal optic flow cues on both walls, (ii) weak horizontal optic flow cues on both walls and (iii) strong optic flow cues on one wall and weak optic flow cues on the other. We find that the mean speed of zebrafish does not depend on the amount of optic flow perceived from the walls. We further show that zebrafish, unlike bumblebees, move closer to the wall that provides the strongest visual feedback. This unexpected preference for strong optic flow cues may reflect an adaptation for self-motion control in water or in environments where visibility is limited. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  19. Vision and Vestibular System Dysfunction Predicts Prolonged Concussion Recovery in Children.

    PubMed

    Master, Christina L; Master, Stephen R; Wiebe, Douglas J; Storey, Eileen P; Lockyer, Julia E; Podolak, Olivia E; Grady, Matthew F

    2018-03-01

    Up to one-third of children with concussion have prolonged symptoms lasting beyond 4 weeks, and vision and vestibular dysfunction is common after concussion, but it is unknown whether such dysfunction predicts prolonged recovery. We sought to determine which vision or vestibular problems predict prolonged recovery in children. The study was a retrospective cohort of pediatric patients seen in a subspecialty pediatric concussion program; 432 patient records were abstracted, and the exposure of interest was the presence of vision or vestibular dysfunction upon presentation to the program. The main outcome of interest was time to clinical recovery, defined by discharge from clinical follow-up, including resolution of acute symptoms, resumption of normal physical and cognitive activity, and normalization of physical examination findings to functional levels. Study subjects were 5 to 18 years old (median = 14). A total of 378 of 432 subjects (88%) presented with vision or vestibular problems. A history of motion sickness was associated with vestibular dysfunction. Younger age, public insurance, and presence of headache were associated with later presentation for subspecialty concussion care. Vision and vestibular problems were associated within distinct clusters. Provocable symptoms with the vestibulo-ocular reflex (VOR) and smooth pursuits, and abnormal balance and accommodative amplitude (AA), predicted prolonged recovery time. Vision and vestibular problems predict prolonged concussion recovery in children. A history of motion sickness may be an important premorbid factor. Public insurance status may represent problems with disparities in access to concussion care. Vision assessments in concussion must include smooth pursuits, saccades, near point of convergence (NPC), and accommodative amplitude (AA). A comprehensive, multidomain assessment is essential to predict prolonged recovery time and enable active intervention with specific school accommodations and targeted rehabilitation.

  20. Clinical Tests of Ultra-Low Vision Used to Evaluate Rudimentary Visual Perceptions Enabled by the BrainPort Vision Device.

    PubMed

    Nau, Amy; Bach, Michael; Fisher, Christopher

    2013-01-01

    We evaluated whether existing ultra-low vision tests are suitable for measuring outcomes using sensory substitution. The BrainPort is a vision assist device coupling a live video feed with an electrotactile tongue display, allowing a user to gain information about their surroundings. We enrolled 30 adult subjects (age range 22-74) divided into two groups. Our blind group included 24 subjects (n = 16 males and n = 8 females, average age 50) with light perception or worse vision. Our control group consisted of six subjects (n = 3 males, n = 3 females, average age 43) with healthy ocular status. All subjects performed 11 computer-based psychophysical tests from three programs: Basic Assessment of Light Motion, Basic Assessment of Grating Acuity, and the Freiburg Vision Test, as well as a modified Tangent Screen. Assessments were performed at baseline and again using the BrainPort after 15 hours of training. Most tests could be used with the BrainPort. Mean success scores increased for all of our tests except contrast sensitivity. Increases were statistically significant for tests of light perception (8.27 ± 3.95 SE), time resolution (61.4% ± 3.14 SE), light localization (44.57% ± 3.58 SE), grating orientation (70.27% ± 4.64 SE), and white Tumbling E on a black background (2.49 logMAR ± 0.39 SE). Motion tests were limited by BrainPort resolution. Tactile-based sensory substitution devices are amenable to psychophysical assessments of vision, even though traditional visual pathways are circumvented. This study is one of many that will need to be undertaken to achieve a common outcomes infrastructure for the field of artificial vision.
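    The acuity outcomes above are reported in logMAR. As general background (not part of the study's methods): logMAR is the base-10 logarithm of the minimum angle of resolution, so an equivalent Snellen fraction can be recovered directly. A minimal sketch:

```python
def logmar_to_snellen(logmar: float) -> str:
    """Convert a logMAR acuity into the equivalent Snellen fraction.
    logMAR = log10(MAR), and Snellen 20/X corresponds to MAR = X / 20,
    hence X = 20 * 10**logMAR."""
    return f"20/{20 * 10 ** logmar:.0f}"
```

    For example, the 2.49 logMAR figure reported above corresponds to a Snellen denominator of roughly 20 * 10**2.49 ≈ 6200, far below typical chart acuities.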

  1. Contrast Sensitivity, First-Order Motion and Initial Ocular Following in Demyelinating Optic Neuropathy

    PubMed Central

    Rucker, Janet C.; Sheliga, Boris M.; FitzGibbon, Edmond J.; Miles, Frederick A.; Leigh, R. John

    2008-01-01

    The ocular following response (OFR) is a measure of motion vision elicited at ultra-short latencies by sudden movement of a large visual stimulus. We compared the OFR to vertical sinusoidal gratings (spatial frequency 0.153 cycles/° or 0.458 cycles/°) of each eye in a subject with evidence of left optic nerve demyelination due to multiple sclerosis (MS). The subject showed substantial differences in vision measured with stationary low-contrast Sloan letters (20/63 OD and 20/200 OS at 2.5% contrast) and the Lanthony Desaturated 15-hue color test (Color Confusion Index 1.11 OD and 2.14 OS). Compared with controls, all of the subject's OFR to increasing contrast showed a higher threshold. The OFR of each of the subject's eyes were similar for the 0.153 cycles/° stimulus, and psychophysical measurements of his ability to detect these moving gratings were also similar for each eye. However, with the 0.458 cycles/° stimulus, the subject's OFR was asymmetric and the affected eye showed decreased responses (smaller slope constant as estimated by the Naka-Rushton equation). These results suggest that, in this case, optic neuritis caused a selective deficit that affected parvocellular pathways mediating higher spatial frequencies, lower-contrast, and color vision, but spared the field-holding mechanism underlying the OFR to lower spatial frequencies. The OFR may provide a useful method to study motion vision in individuals with disorders affecting anterior visual pathways. PMID:16649097
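    The slope constant mentioned above comes from fitting contrast-response data with the Naka-Rushton equation. A minimal sketch of the common form R(c) = Rmax·c^n / (c^n + c50^n), with all parameter values here purely illustrative rather than taken from the study:

```python
def naka_rushton(c, r_max=1.0, c50=0.05, n=2.0):
    """Naka-Rushton contrast-response function: response grows with
    contrast c, reaches half of r_max at c = c50, and saturates at r_max."""
    return r_max * c ** n / (c ** n + c50 ** n)
```

    In this parameterization, the reduced responses reported for the affected eye would appear as a smaller fitted r_max (or a larger c50).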

  2. Mission-oriented requirements for updating MIL-H-8501. Volume 1: STI proposed structure. [military rotorcraft]

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Hoh, R. H.; Ferguson, S. W., III; Mitchell, D. G.; Ashkenas, I. L.; Mcruer, D. T.

    1985-01-01

    The structure of a new flying and ground handling qualities specification for military rotorcraft is presented. This preliminary specification structure is intended to evolve into a replacement for specification MIL-H-8501A. The new structure is designed to accommodate a variety of rotorcraft types, mission flight phases, flight envelopes, and flight environmental characteristics and to provide criteria for three levels of flying qualities, a systematic treatment of failures and reliability, both conventional and multiaxis controllers, and external vision aids which may also incorporate synthetic display content. Existing and new criteria were incorporated into the new structure wherever they could be substantiated.

  3. Predicting fruit fly's sensing rate with insect flight simulations.

    PubMed

    Chang, Song; Wang, Z Jane

    2014-08-05

    Without sensory feedback, flies cannot fly. Exactly how various feedback controls work in insects is a complex puzzle to solve. What do insects measure to stabilize their flight? How often and how fast must insects adjust their wings to remain stable? To gain insights into algorithms used by insects to control their dynamic instability, we develop a simulation tool to study free flight. To stabilize flight, we construct a control algorithm that modulates wing motion based on discrete measurements of the body-pitch orientation. Our simulations give theoretical bounds on both the sensing rate and the delay time between sensing and actuation. Interpreting our findings together with experimental results on fruit flies' reaction time and sensory motor reflexes, we conjecture that fruit flies sense their kinematic states every wing beat to stabilize their flight. We further propose a candidate for such a control involving the fly's haltere and first basalar motor neuron. Although we focus on fruit flies as a case study, the framework for our simulation and discrete control algorithms is applicable to studies of both natural and man-made fliers.
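    The paper's control algorithm modulates wing motion from discrete measurements of body pitch; the authors' model is not reproduced here, but the core idea, sampled feedback stabilizing an unstable mode despite a hold between samples, can be sketched as follows (the dynamics, gains, and rates below are illustrative assumptions, not the paper's values):

```python
def simulate(k, a=20.0, dt=1e-4, t_end=0.5, sample_period=5e-3, theta0=0.1):
    """Euler-integrate an unstable pitch mode d(theta)/dt = a*theta + u,
    with u = -k * theta_held, where theta is sampled once per 'wingbeat'
    (every sample_period seconds) and held between samples."""
    theta, held = theta0, theta0
    next_sample, t = 0.0, 0.0
    while t < t_end:
        if t >= next_sample:          # discrete, wingbeat-rate sensing
            held = theta              # sample-and-hold measurement
            next_sample += sample_period
        theta += dt * (a * theta - k * held)
        t += dt
    return theta
```

    Raising sample_period or adding extra delay between sensing and actuation eventually destabilizes the loop, which is the kind of bound the paper's simulations place on the sensing rate.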

  4. HP-9810A calculator programs for plotting the 2-dimensional motion of cylindrical payloads relative to the shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1976-01-01

    The HP-9810A calculator programs described provide the capability to generate HP-9862A plotter displays which depict the apparent motion of a free-flying cylindrical payload relative to the shuttle orbiter body axes by projecting the payload geometry into the orbiter plane of symmetry at regular time intervals.
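    The projection step described, dropping payload geometry onto the orbiter plane of symmetry, is an orthogonal projection. A minimal sketch assuming a plane through the origin with unit normal n (the particular normal used below is illustrative):

```python
def project_to_plane(p, n):
    """Orthogonally project 3-D point p onto the plane through the
    origin with unit normal n: subtract the component of p along n."""
    dot = sum(pi * ni for pi, ni in zip(p, n))
    return tuple(pi - dot * ni for pi, ni in zip(p, n))
```

    Applying this to every vertex of the payload geometry at each time step yields the 2-D silhouette sequence the plotter displays.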

  5. Analysis multi-agent with presence of the leader

    NASA Astrophysics Data System (ADS)

    Achmadi, Sentot; Marjono, Miswanto

    2017-12-01

    The phenomenon of swarming is often observed in nature when groups of living things move together from one place to another. By clustering, a group of animals can increase its effectiveness in searching for food and avoiding predators. A flock of geese also exhibits swarm behavior when flying, forming an inverted V-formation with one of the geese acting as the leader; the flight track of each member follows the leader's path at a certain distance. This article discusses mathematical modeling of the swarm phenomenon, namely optimal tracking control for a multi-agent model under the influence of a leader in two-dimensional space. The leader in this model is intended to track a specified path. First, the leader's motion control is designed to follow the predetermined path using the Tracking Error Dynamic method. The leader's path is then used to design the motion control of each agent so that it tracks the leader at a certain distance. Numerical simulation shows that the leader's trajectory can track the specified path and, similarly, that the motion of each agent can trace and follow the leader's path.
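    The leader-follower tracking described above can be illustrated with a toy discrete-time version (the paper's Tracking Error Dynamic design is not reproduced; the proportional gain and formation offset below are arbitrary assumptions):

```python
def follow_leader(leader_path, offset, gain=0.5):
    """At each time step the follower moves a fraction 'gain' of the way
    toward its formation slot: the leader's position plus a fixed offset."""
    x, y = 0.0, 0.0
    out = []
    for lx, ly in leader_path:
        tx, ty = lx + offset[0], ly + offset[1]   # desired slot behind the leader
        x += gain * (tx - x)
        y += gain * (ty - y)
        out.append((x, y))
    return out
```

    A purely proportional law like this leaves a small steady-state lag behind a moving formation slot; the cross-track error, by contrast, decays geometrically to zero.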

  6. Normal vision can compensate for the loss of the circadian clock

    PubMed Central

    Schlichting, Matthias; Menegazzi, Pamela; Helfrich-Förster, Charlotte

    2015-01-01

    Circadian clocks are thought to be essential for timing the daily activity of animals, and consequently increase fitness. This view was recently challenged for clock-less fruit flies and mice that exhibited astonishingly normal activity rhythms under outdoor conditions. Compensatory mechanisms appear to enable even clock mutants to live a normal life in nature. Here, we show that gradual daily increases/decreases of light in the laboratory suffice to provoke normally timed sharp morning (M) and evening (E) activity peaks in clock-less flies. We also show that the compound eyes, but not Cryptochrome (CRY), mediate the precise timing of M and E peaks under natural-like conditions, as CRY-less flies do and eyeless flies do not show these sharp peaks independently of a functional clock. Nevertheless, the circadian clock appears critical for anticipating dusk, as well as for inhibiting sharp activity peaks during midnight. Clock-less flies only increase E activity after dusk and not before the beginning of dusk, and respond strongly to twilight exposure in the middle of the night. Furthermore, the circadian clock responds to natural-like light cycles, by slightly broadening Timeless (TIM) abundance in the clock neurons, and this effect is mediated by CRY. PMID:26378222

  7. Perception of Motion in Statistically-Defined Displays

    DTIC Science & Technology

    1989-04-15

    psychophysical study before. He was paid $7.50/hour for his participation. Also, to insure high motivation, he received an additional one cent for every...correct response. This was the same motivational device used in the earlier work on motion discrimination (Ball and Sekuler, 1982). The observer...scientists, physiologists, and people interested in computer vision. Finally, one of the main motives for studying motion perception is a desire to

  8. Technical Note: A respiratory monitoring and processing system based on computer vision: prototype and proof of principle

    PubMed Central

    Atallah, Vincent; Escarmant, Patrick; Vinh‐Hung, Vincent

    2016-01-01

    Monitoring and controlling respiratory motion is a challenge for the accuracy and safety of therapeutic irradiation of thoracic tumors. Various commercial systems based on the monitoring of internal or external surrogates have been developed but remain costly. In this article we describe and validate Madibreast, an in‐house‐made respiratory monitoring and processing device based on optical tracking of external markers. We designed an optical apparatus to ensure real‐time submillimetric image resolution at 4 m. Using OpenCv libraries, we optically tracked high‐contrast markers set on patients' breasts. Validation of spatial and time accuracy was performed on a mechanical phantom and on human breast. Madibreast was able to track motion of markers up to a 5 cm/s speed, at a frame rate of 30 fps, with submillimetric accuracy on mechanical phantom and human breasts. Latency was below 100 ms. Concomitant monitoring of three different locations on the breast showed discrepancies in axial motion up to 4 mm for deep‐breathing patterns. This low‐cost, computer‐vision system for real‐time motion monitoring of the irradiation of breast cancer patients showed submillimetric accuracy and acceptable latency. It allowed the authors to highlight differences in surface motion that may be correlated to tumor motion. PACS number(s): 87.55.km PMID:27685116

  9. Technical Note: A respiratory monitoring and processing system based on computer vision: prototype and proof of principle.

    PubMed

    Leduc, Nicolas; Atallah, Vincent; Escarmant, Patrick; Vinh-Hung, Vincent

    2016-09-08

    Monitoring and controlling respiratory motion is a challenge for the accuracy and safety of therapeutic irradiation of thoracic tumors. Various commercial systems based on the monitoring of internal or external surrogates have been developed but remain costly. In this article we describe and validate Madibreast, an in-house-made respiratory monitoring and processing device based on optical tracking of external markers. We designed an optical apparatus to ensure real-time submillimetric image resolution at 4 m. Using OpenCv libraries, we optically tracked high-contrast markers set on patients' breasts. Validation of spatial and time accuracy was performed on a mechanical phantom and on human breast. Madibreast was able to track motion of markers up to a 5 cm/s speed, at a frame rate of 30 fps, with submillimetric accuracy on mechanical phantom and human breasts. Latency was below 100 ms. Concomitant monitoring of three different locations on the breast showed discrepancies in axial motion up to 4 mm for deep-breathing patterns. This low-cost, computer-vision system for real-time motion monitoring of the irradiation of breast cancer patients showed submillimetric accuracy and acceptable latency. It allowed the authors to highlight differences in surface motion that may be correlated to tumor motion. © 2016 The Authors.
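    The Madibreast OpenCV pipeline itself is not listed in the abstract; the essential loop of tracking a high-contrast marker, threshold the frame, take the centroid of bright pixels, convert frame-to-frame displacement to speed, can be sketched without any imaging library (the frame rate and pixel pitch below are illustrative, and the marker is assumed visible in every frame):

```python
def marker_centroid(frame, threshold=128):
    """Return the (row, col) centroid of pixels brighter than threshold."""
    pts = [(r, c) for r, row in enumerate(frame)
                  for c, v in enumerate(row) if v > threshold]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def track(frames, fps=30.0, pixel_mm=0.5, threshold=128):
    """Per-frame marker centroids and inter-frame speeds in mm/s."""
    cs = [marker_centroid(f, threshold) for f in frames]
    speeds = [((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5 * pixel_mm * fps
              for a, b in zip(cs, cs[1:])]
    return cs, speeds
```

    Running several such trackers on markers at different breast locations is what exposes the discrepancies in surface motion reported above.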

  10. Robot arm system for automatic satellite capture and berthing

    NASA Technical Reports Server (NTRS)

    Nishida, Shinichiro; Toriu, Hidetoshi; Hayashi, Masato; Kubo, Tomoaki; Miyata, Makoto

    1994-01-01

    Load control is one of the most important technologies for capturing and berthing free flying satellites by a space robot arm because free flying satellites have different motion rates. The performance of active compliance control techniques depends on the location of the force sensor and the arm's structural compliance. A compliance control technique for the robot arm's structural elasticity and a consideration for an end-effector appropriate for it are presented in this paper.
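    The abstract does not give the control law. As background, one standard compliance scheme is admittance control, which regulates contact force by commanding velocity proportional to the force error; a minimal one-axis sketch (every constant here is an illustrative assumption, and the environment is modeled as a linear spring):

```python
def press_to_force(f_des, k_env=1000.0, wall=0.1, b=50.0,
                   dt=1e-3, steps=5000):
    """Admittance control of one axis: command velocity proportional to
    the force error until the sensed contact force matches f_des.
    The environment is a spring of stiffness k_env starting at 'wall'."""
    x, f_meas = 0.0, 0.0
    for _ in range(steps):
        f_meas = max(0.0, k_env * (x - wall))   # contact force; zero in free space
        x += dt * (f_des - f_meas) / b          # admittance: force error -> velocity
    return x, f_meas
```

    Such a loop is gentle with a slowly drifting target, which is why compliance control is attractive for capturing a free-flying satellite.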

  11. Square tracking sensor for autonomous helicopter hover stabilization

    NASA Astrophysics Data System (ADS)

    Oertel, Carl-Henrik

    1995-06-01

    Sensors for synthetic vision are needed to extend the mission profiles of helicopters. A special task for various applications is the autonomous position hold of a helicopter above a ground fixed or moving target. As a proof of concept for a general synthetic vision solution a restricted machine vision system, which is capable of locating and tracking a special target, was developed by the Institute of Flight Mechanics of Deutsche Forschungsanstalt für Luft- und Raumfahrt e.V. (i.e., German Aerospace Research Establishment). This sensor, which is specialized to detect and track a square, was integrated in the fly-by-wire helicopter ATTHeS (i.e., Advanced Technology Testing Helicopter System). An existing model following controller for the forward flight condition was adapted for the hover and low speed requirements of the flight vehicle. The special target, a black square with a length of one meter, was mounted on top of a car. Flight tests demonstrated the automatic stabilization of the helicopter above the moving car by synthetic vision.

  12. The economics of motion perception and invariants of visual sensitivity.

    PubMed

    Gepshtein, Sergei; Tyukin, Ivan; Kubovy, Michael

    2007-06-21

    Neural systems face the challenge of optimizing their performance with limited resources, just as economic systems do. Here, we use tools of neoclassical economic theory to explore how a frugal visual system should use a limited number of neurons to optimize perception of motion. The theory prescribes that vision should allocate its resources to different conditions of stimulation according to the degree of balance between measurement uncertainties and stimulus uncertainties. We find that human vision approximately follows the optimal prescription. The equilibrium theory explains why human visual sensitivity is distributed the way it is and why qualitatively different regimes of apparent motion are observed at different speeds. The theory offers a new normative framework for understanding the mechanisms of visual sensitivity at the threshold of visibility and above the threshold and predicts large-scale changes in visual sensitivity in response to changes in the statistics of stimulation and system goals.

  13. Remote Safety Monitoring for Elderly Persons Based on Omni-Vision Analysis

    PubMed Central

    Xiang, Yun; Tang, Yi-ping; Ma, Bao-qing; Yan, Hang-chen; Jiang, Jun; Tian, Xu-yuan

    2015-01-01

    Remote monitoring service for elderly persons is important as the aged populations in most developed countries continue growing. To monitor the safety and health of the elderly population, we propose a novel omni-directional vision sensor based system, which can detect and track object motion, recognize human posture, and analyze human behavior automatically. In this work, we have made the following contributions: (1) we develop a remote safety monitoring system which can provide real-time and automatic health care for the elderly persons and (2) we design a novel motion history or energy images based algorithm for motion object tracking. Our system can accurately and efficiently collect, analyze, and transfer elderly activity information and provide health care in real-time. Experimental results show that our technique can improve the data analysis efficiency by 58.5% for object tracking. Moreover, for the human posture recognition application, the success rate can reach 98.6% on average. PMID:25978761
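    The "motion history or energy images" referred to above have a generic textbook form (shown here as an illustration, not the authors' exact algorithm): a per-pixel value that is refreshed to a maximum wherever the frame changes and decays everywhere else, so recent motion leaves a fading trail:

```python
def update_mhi(mhi, prev, cur, tau=10, thresh=20):
    """One motion-history-image update: pixels that changed by more than
    'thresh' between frames are set to tau; elsewhere the stored history
    value decays by 1 toward 0."""
    return [[tau if abs(c - p) > thresh else max(m - 1, 0)
             for m, p, c in zip(mr, pr, cr)]
            for mr, pr, cr in zip(mhi, prev, cur)]
```

    Tracking the bright (recent) regions of the history image gives object trajectories without processing every raw frame in full, which is one route to the efficiency gains the paper reports.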

  14. MER-DIMES : a planetary landing application of computer vision

    NASA Technical Reports Server (NTRS)

    Cheng, Yang; Johnson, Andrew; Matthies, Larry

    2005-01-01

    During the Mars Exploration Rovers (MER) landings, the Descent Image Motion Estimation System (DIMES) was used for horizontal velocity estimation. The DIMES algorithm combines measurements from a descent camera, a radar altimeter and an inertial measurement unit. To deal with large changes in scale and orientation between descent images, the algorithm uses altitude and attitude measurements to rectify image data to level ground plane. Feature selection and tracking is employed in the rectified data to compute the horizontal motion between images. Differences of motion estimates are then compared to inertial measurements to verify correct feature tracking. DIMES combines sensor data from multiple sources in a novel way to create a low-cost, robust and computationally efficient velocity estimation solution, and DIMES is the first use of computer vision to control a spacecraft during planetary landing. In this paper, the detailed implementation of the DIMES algorithm and the results from the two landings on Mars are presented.
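    DIMES's feature selection and tracking is more sophisticated than this, but the basic displacement estimate between two rectified images, exhaustive sum-of-squared-differences matching of a template patch over a small search window, can be sketched as follows (patch location, patch size, and search radius below are arbitrary):

```python
def ssd(a, b):
    """Sum of squared differences between two equal-size patches."""
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def patch(img, r, c, size):
    """Extract a size x size patch with top-left corner (r, c)."""
    return [row[c:c + size] for row in img[r:r + size]]

def match_shift(img0, img1, r, c, size, search):
    """Find the integer (dr, dc) shift of the patch at (r, c) in img0
    that best matches img1, by exhaustive SSD over a +/-search window."""
    ref = patch(img0, r, c, size)
    best, best_err = (0, 0), float("inf")
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            err = ssd(ref, patch(img1, r + dr, c + dc, size))
            if err < best_err:
                best, best_err = (dr, dc), err
    return best
```

    Scaling the recovered pixel shift by altitude and frame interval gives a horizontal velocity estimate, which DIMES then cross-checks against inertial measurements.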

  15. Design and optimal control of on-orbit servicing trajectory for target vehicle in non-coplanar elliptical orbit

    NASA Astrophysics Data System (ADS)

    Zhou, Wenyong; Yuan, Jianping; Luo, Jianjun

    2005-11-01

    Autonomous on-orbit servicing provides flexibility to space systems and has great value in both civil and military applications. When a satellite performs on-orbit servicing tasks, flying around is the basic type of motion. This paper is concerned with the design and control of a chaser satellite flying around a target spacecraft in a non-coplanar elliptical orbit for a long period. First, a mathematical model used to design a long-term flying-around trajectory is presented, which is applicable when the target spacecraft flies in an elliptical orbit. The conditions for keeping the target at the centre of the flying-around path are deduced. Considering safety and task requirements, a long-term flying-around trajectory is designed. Because perturbations and navigation errors can render the trajectory unstable and the mission impossible, a two-impulse control method is put forward. A genetic algorithm is used to minimize a cost function that accounts for fuel consumption and trajectory bias simultaneously. Simulation results indicate that the flying-around mathematical model and the trajectory control method can be used in the design and control of a long-term flying-around trajectory.
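    The paper's genetic algorithm and two-impulse cost function are not specified in the abstract. A generic real-coded GA over two impulse magnitudes, minimizing a toy fuel-plus-bias cost, might look like this (every constant, operator choice, and the cost itself are assumptions for illustration):

```python
import random

def ga_minimize(cost, bounds, pop=30, gens=60, mut=0.1, seed=1):
    """Tiny real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, clipping to bounds. Returns the best-ever candidate."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    P = [rand_ind() for _ in range(pop)]
    best = min(P, key=cost)
    for _ in range(gens):
        Q = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=cost)          # tournament winner 1
            b = min(rng.sample(P, 3), key=cost)          # tournament winner 2
            child = [(x + y) / 2 + rng.gauss(0, mut) for x, y in zip(a, b)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            Q.append(child)
        P = Q
        cand = min(P, key=cost)
        if cost(cand) < cost(best):
            best = cand
    return best
```

    With a cost that trades fuel (sum of impulse magnitudes) against a quadratic bias penalty, the GA settles near the analytic trade-off between the two terms.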

  16. Scalable Photogrammetric Motion Capture System "mosca": Development and Application

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2015-05-01

    A wide variety of applications, from industrial to entertainment, needs reliable and accurate 3D information about the motion of an object and its parts. The movement is often fast, as in vehicle motion, sport biomechanics, or the animation of cartoon characters. Motion capture systems based on different physical principles are used for these purposes. Vision-based systems have great potential for high accuracy and a high degree of automation owing to progress in image processing and analysis. A scalable, inexpensive motion capture system was developed as a convenient and flexible tool for solving various tasks requiring 3D motion analysis. It is based on photogrammetric techniques of 3D measurement and provides high-speed image acquisition, high accuracy of 3D measurements, and highly automated processing of captured data. Depending on the application, the system can be easily modified for different working areas from 100 mm to 10 m. The developed motion capture system uses two to four machine vision cameras to acquire video sequences of object motion. All cameras work synchronously at frame rates up to 100 frames per second under the control of a personal computer, providing the possibility of accurate calculation of the 3D coordinates of interest points. The system was used in a set of different application fields and demonstrated high accuracy and a high level of automation.
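    Calculating 3D coordinates of interest points from synchronized cameras reduces, for each image pair, to intersecting viewing rays. A planar two-camera version shows the linear algebra involved; full photogrammetric triangulation adds camera calibration and a least-squares solve over more rays (the camera positions and directions below are made up for the example):

```python
def intersect_rays(p1, d1, p2, d2):
    """Intersect two 2-D rays p_i + t_i * d_i, a planar stand-in for
    two-camera triangulation: solve t1*d1 - t2*d2 = p2 - p1 by Cramer's rule."""
    a, b = d1[0], -d2[0]
    c, d = d1[1], -d2[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    det = a * d - b * c                 # zero det would mean parallel rays
    t1 = (rx * d - b * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

    With real cameras the rays rarely meet exactly, so the 3-D point is taken as the least-squares closest point to all rays; synchronization at up to 100 fps keeps the rays consistent for a moving target.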

  17. 1st- and 2nd-order motion and texture resolution in central and peripheral vision

    NASA Technical Reports Server (NTRS)

    Solomon, J. A.; Sperling, G.

    1995-01-01

    STIMULI. The 1st-order stimuli are moving sine gratings. The 2nd-order stimuli are fields of static visual texture, whose contrasts are modulated by moving sine gratings. Neither the spatial slant (orientation) nor the direction of motion of these 2nd-order (microbalanced) stimuli can be detected by a Fourier analysis; they are invisible to Reichardt and motion-energy detectors. METHOD. For these dynamic stimuli, when presented both centrally and in an annular window extending from 8 to 10 deg in eccentricity, we measured the highest spatial frequency for which discrimination between +/- 45 deg texture slants and discrimination between opposite directions of motion were each possible. RESULTS. For sufficiently low spatial frequencies, slant and direction can be discriminated in both central and peripheral vision, for both 1st- and for 2nd-order stimuli. For both 1st- and 2nd-order stimuli, at both retinal locations, slant discrimination is possible at higher spatial frequencies than direction discrimination. For both 1st- and 2nd-order stimuli, motion resolution decreases 2-3 times more rapidly with eccentricity than does texture resolution. CONCLUSIONS. (1) 1st- and 2nd-order motion scale similarly with eccentricity. (2) 1st- and 2nd-order texture scale similarly with eccentricity. (3) The central/peripheral resolution fall-off is 2-3 times greater for motion than for texture.
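    A 2nd-order (contrast-modulated) stimulus of the kind described can be constructed by multiplying static zero-mean noise by a sinusoidal contrast envelope. Here is a 1-D sketch (luminance scaling and modulation depth chosen arbitrarily, not the study's values):

```python
import math
import random

def second_order_stimulus(width, sf, m, seed=0):
    """1-D slice of a contrast-modulated texture: binary +/-1 noise whose
    contrast envelope is a sinusoid of spatial frequency sf (cycles/sample).
    Mean luminance is constant, so no 1st-order (Fourier) grating exists."""
    rng = random.Random(seed)
    noise = [rng.choice((-1.0, 1.0)) for _ in range(width)]
    env = [1.0 + m * math.sin(2 * math.pi * sf * x) for x in range(width)]
    return [0.5 + 0.25 * n * e for n, e in zip(noise, env)]
```

    Because the noise is zero-mean, the envelope leaves no luminance grating for Fourier-based (1st-order) motion detectors; drifting the envelope phase across frames yields the moving 2nd-order stimulus, invisible to Reichardt and motion-energy detectors as the abstract notes.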

  18. Stroboscopic Vision as a Treatment for Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Somers, J. T.; Ford, G.; Krnavek, J. M.; Hwang, E. Y.; Kornilova, L. N.; Leigh, R. J.

    2006-01-01

    Results obtained from space flight indicate that most space crews will experience some symptoms of motion sickness causing significant impact on the operational objectives that must be accomplished to assure mission success. Based on the initial work of Melvill-Jones, we have evaluated stroboscopic vision as a method of preventing motion sickness. Methods: Nineteen subjects read text while making +/-20deg head movements in the horizontal plane at 0.2 Hz while wearing left-right reversing prisms during exposure to 4 Hz stroboscopic or normal room illumination. Testing was repeated using LCD shutter glasses as the stroboscopic source with an additional 19 subjects. Results: With the strobe, motion sickness was significantly lower than with normal room illumination. Results with the LCD shutter glasses were analogous to those observed with the environmental strobe. Conclusions: Stroboscopic illumination appears to be effective where retinal slip is a factor in eliciting motion sickness. Additional research is evaluating the glasses' efficacy for carsickness, sickness in parabolic flight, and seasickness. There is evidence from pilot studies showing that the glasses reduce saccade velocity to visually presented targets by approximately half of the normal values. It is interesting to note that adaptation to space flight may also slow saccade velocity.

  19. Effect of vision and stance width on human body motion when standing: implications for afferent control of lateral sway.

    PubMed

    Day, B L; Steiger, M J; Thompson, P D; Marsden, C D

    1993-09-01

    1. Measurements of human upright body movements in three dimensions have been made on thirty-five male subjects attempting to stand still with various stance widths and with eyes closed or open. Body motion was inferred from movements of eight markers fixed to specific sites on the body from the shoulders to the ankles. Motion of these markers was recorded together with motion of the point of application of the resultant of the ground reaction forces (centre of pressure). 2. The speed of the body (average from eight sites) was increased by closing the eyes or narrowing the stance width and there was an interaction between these two factors such that vision reduced body speed more effectively when the feet were closer together. Similar relationships were found for components of velocity both in the frontal and sagittal planes although stance width exerted a much greater influence on the lateral velocity component. 3. Fluctuations in position of the body were also increased by eye closure or narrowing of stance width. Again, the effect of stance width was more potent for lateral than for anteroposterior movements. In contrast to the velocity measurements, there was no interaction between vision and stance width. 4. There was a progressive increase in the amplitude of position and velocity fluctuations from markers placed higher on the body. The fluctuations in the position of the centre of pressure were similar in magnitude to those of the markers placed near the hip. The fluctuations in velocity of centre of pressure, however, were greater than of any site on the body. 5. Analysis of the amplitude of angular motion between adjacent straight line segments joining the markers suggests that the inverted pendulum model of body sway is incomplete. Motion about the ankle joint was dominant only for lateral movement in the frontal plane with narrow stance widths (< 8 cm). For all other conditions most angular motion occurred between the trunk and leg. 6. The large reduction in lateral body motion with increasing stance width was mainly due to a disproportionate reduction in the angular motion about the ankles and feet. A mathematical model of the skeletal structure has been constructed which offers some explanation for this specific reduction in joint motion. (ABSTRACT TRUNCATED AT 400 WORDS)
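    The single inverted-pendulum picture that the data only partly support can be linearized as I·θ̈ = mgh·θ − Kθ − Bθ̇: gravity topples, ankle stiffness K and damping B restore, and upright stance is stable only when K exceeds mgh. A quick numerical check (all parameter values illustrative, not fitted to these subjects):

```python
def sway(theta0=0.05, m=70.0, h=1.0, g=9.81, inertia=70.0,
         k=1000.0, b=100.0, dt=1e-3, t_end=10.0):
    """Semi-implicit Euler simulation of an inverted-pendulum sway model:
    gravity destabilizes (m*g*h*theta), ankle stiffness k and damping b
    restore. Returns the final sway angle in radians."""
    theta, omega = theta0, 0.0
    for _ in range(int(t_end / dt)):
        torque = (m * g * h - k) * theta - b * omega
        omega += dt * torque / inertia
        theta += dt * omega
    return theta
```

    With these numbers mgh ≈ 687 N·m/rad, so k = 1000 stabilizes while k = 500 lets the sway grow, a minimal illustration of why ankle mechanics alone cannot be the whole story when most angular motion occurs between trunk and leg.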

  20. Contact lenses and corrective flying spectacles in military aircrew--implications for flight safety.

    PubMed

    Partner, Andrew M; Scott, Robert A H; Shaw, Penny; Coker, William J

    2005-07-01

    Refractive devices used by aviators need to suit the aerospace environment or their failure can have serious implications. A relatively minor visual disability can result in loss of life and aircraft. We surveyed commonly occurring problems with the different types of refractive correction worn by Royal Air Force (RAF) aircrew over the previous 12 mo. We also asked if they had experienced any flight safety incidents (FSI) relating to their refractive correction. A retrospective anonymous questionnaire survey was given to 700 active aircrew occupationally graded as requiring corrective flying spectacles (CFS) or contact lenses (CL) for flying. 63% (443) of the questionnaires were completed. CL were worn by 53% of aircrew; 71% of them used daily disposable CL. CFS were worn by the remaining 47% of aircrew, 14% of whom used multifocal lenses. Of CFS wearers, 83% reported problems including misting, moving, discomfort, and conflict with helmet-mounted devices (HMD). CL-related ocular symptoms were reported in 67% of wearers including cloudy vision, dry eye, photophobia, red eyes, excessive mucus formation, CL movement, itching, and grittiness. No CL-related FSI were reported over the previous 12 mo compared with 5% CFS-related FSI (p < 0.001). The graded performance of CL for vision, comfort, handling, convenience, and overall satisfaction was significantly higher than for CFS. CFS are associated with problems in terms of comfort and safety. CL are well tolerated by aircrew, and deliver improved visual performance.

  1. "Fly-by-Wireless" Vehicles and Evaluations of ISA 100 Applications to Space-Flight

    NASA Technical Reports Server (NTRS)

    Studor, George F.

    2009-01-01

    "Fly-by-Wireless" (What is it?) Vision: To minimize cables and connectors and increase functionality across the aerospace industry by providing reliable, lower cost, modular, and higher performance alternatives to wired data connectivity to benefit the entire vehicle/program life-cycle. Focus Areas: 1. System Engineering and Integration to reduce cables and connectors. 2. Provisions for modularity and accessibility in the vehicle architecture. 3. Develop alternatives to wired connectivity (the "tool box"). NASA and Aerospace depend more and more on cost-effective solutions that can meet our requirements. ISA-100.11a is a promising new standard and NASA wants to evaluate it. NASA should be involved in understanding and contributing to other ISA-100 efforts that contribute to "Fly-by-Wireless" and its objectives. ISA can engage other aerospace groups that are working on similar goals and obtain more aerospace industry perspective.

  2. In-Space Inspection Technologies Vision

    NASA Technical Reports Server (NTRS)

    Studor, George

    2012-01-01

    Purpose: Assess In-Space NDE technologies and needs - current & future spacecraft. Discover & build on needs, R&D & NDE products in other industries and agencies. Stimulate partnerships in & outside NASA to move technologies forward cooperatively. Facilitate group discussion on challenges and opportunities of mutual benefit. Focus Areas: Miniaturized 3D Penetrating Imagers; Controllable Snake-arm Inspection Systems; Miniature Free-flying Micro-satellite Inspectors.

  3. Witnessing Evolution First Hand: A K-12 Laboratory Exercise in Genetics & Evolution Using "Drosophila"

    ERIC Educational Resources Information Center

    Heil, Caiti S. S.; Manzano-Winkler, Brenda; Hunter, Mika J.; Noor, Juliet K. F.; Noor, Mohamed A. F.

    2013-01-01

    We present a laboratory exercise that leverages student interest in genetics to observe and understand evolution by natural selection. Students begin with white-eyed fruit fly populations, to which they introduce a single advantageous variant (one male with red eyes). The superior health and vision associated with having the red-eye-color allele…

  4. Capture of visual direction in dynamic vergence is reduced with flashed monocular lines.

    PubMed

    Jaschinski, Wolfgang; Jainta, Stephanie; Schürer, Michael

    2006-08-01

    The visual direction of a continuously presented monocular object is captured by the visual direction of a closely adjacent binocular object, which questions the reliability of nonius lines for measuring vergence. This was shown by Erkelens, C. J., and van Ee, R. (1997a,b) [Capture of the visual direction: An unexpected phenomenon in binocular vision. Vision Research, 37, 1193-1196; Capture of the visual direction of monocular objects by adjacent binocular objects. Vision Research, 37, 1735-1745], who stimulated dynamic vergence by a counter-phase oscillation of two square random-dot patterns (one to each eye) that contained a smaller central dot-free gap (of variable width) with a vertical monocular line oscillating in phase with the random-dot pattern of the respective eye; subjects adjusted the motion-amplitude of the line until it was perceived as (nearly) stationary. With a continuously presented monocular line, we replicated capture of visual direction provided the dot-free gap was narrow: the adjusted motion-amplitude of the line was similar to the motion-amplitude of the random-dot pattern, although large vergence errors occurred. However, when we flashed the line for 67 ms at the moments of maximal and minimal disparity of the vergence stimulus, we found that the adjusted motion-amplitude of the line was smaller; thus, the capture effect appeared to be reduced with flashed nonius lines. Accordingly, we found that the objectively measured vergence gain was significantly correlated (r=0.8) with the motion-amplitude of the flashed monocular line when the separation between the line and the fusion contour was at least 32 min arc. In conclusion, if one wishes to estimate the dynamic vergence response with psychophysical methods, effects of capture of visual direction can be reduced by using flashed nonius lines.

  5. Visuomotor Transformation in the Fly Gaze Stabilization System

    PubMed Central

    Huston, Stephen J; Krapp, Holger G

    2008-01-01

    For sensory signals to control an animal's behavior, they must first be transformed into a format appropriate for use by its motor systems. This fundamental problem is faced by all animals, including humans. Beyond simple reflexes, little is known about how such sensorimotor transformations take place. Here we describe how the outputs of a well-characterized population of fly visual interneurons, lobula plate tangential cells (LPTCs), are used by the animal's gaze-stabilizing neck motor system. The LPTCs respond to visual input arising from both self-rotations and translations of the fly. The neck motor system however is involved in gaze stabilization and thus mainly controls compensatory head rotations. We investigated how the neck motor system is able to selectively extract rotation information from the mixed responses of the LPTCs. We recorded extracellularly from fly neck motor neurons (NMNs) and mapped the directional preferences across their extended visual receptive fields. Our results suggest that—like the tangential cells—NMNs are tuned to panoramic retinal image shifts, or optic flow fields, which occur when the fly rotates about particular body axes. In many cases, tangential cells and motor neurons appear to be tuned to similar axes of rotation, resulting in a correlation between the coordinate systems the two neural populations employ. However, in contrast to the primarily monocular receptive fields of the tangential cells, most NMNs are sensitive to visual motion presented to either eye. This results in the NMNs being more selective for rotation than the LPTCs. Thus, the neck motor system increases its rotation selectivity by a comparatively simple mechanism: the integration of binocular visual motion information. PMID:18651791

  6. Large Scale Structure From Motion for Autonomous Underwater Vehicle Surveys

    DTIC Science & Technology

    2004-09-01

    Govern the Formation of Multiple Images of a Scene and Some of Their Applications. MIT Press, 2001. [26] O. Faugeras and S. Maybank. Motion from point...Machine Vision Conference, volume 1, pages 384-393, September 2002. [69] S. Maybank and O. Faugeras. A theory of self-calibration of a moving camera

  7. Intelligence Surveillance And Reconnaissance Full Motion Video Automatic Anomaly Detection Of Crowd Movements: System Requirements For Airborne Application

    DTIC Science & Technology

    The collection of Intelligence, Surveillance, and Reconnaissance (ISR) Full Motion Video (FMV) is growing at an exponential rate, and the manual... intelligence for the warfighter. This paper will address the question of how automatic pattern extraction, based on computer vision, can extract anomalies in

  8. Motion Estimation Using the Single-row Superposition-type Planar Compound-like Eye

    PubMed Central

    Cheng, Chi-Cheng; Lin, Gwo-Long

    2007-01-01

    How can the compound eye of insects capture prey so accurately and quickly? This interesting issue is explored from the perspective of computer vision rather than biology. The focus is on evaluating the noise immunity of motion recovery using the single-row superposition-type planar compound-like eye (SPCE). The SPCE has a special symmetrical framework with a tremendous number of ommatidia, inspired by the compound eye of insects. The noise simulates possible ambiguity of image patterns caused by either environmental uncertainty or the low resolution of CCD devices. Results of extensive simulations indicate that this special visual configuration provides excellent motion estimation performance regardless of the magnitude of the noise. Even when the noise interference is severe, the SPCE dramatically reduces errors in recovering the ego-translation without any type of filter. In other words, the symmetrical, regular, and multiple vision sensing devices of the compound-like eye have a statistical averaging advantage that suppresses noise. This discovery lays a basic foundation, in engineering terms, for understanding the compound eye of insects.
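
    The claimed statistical-averaging advantage can be illustrated with a toy sketch (my own illustration, not the paper's SPCE model): the spread of the averaged reading from N ommatidium-like sensors with independent noise falls roughly as 1/sqrt(N).

```python
import random
import statistics

def mean_estimate_spread(n_sensors, trials=2000, noise_sd=1.0, seed=0):
    """Std dev across trials of the averaged reading of n_sensors,
    each observing a true signal of 0 plus independent Gaussian noise."""
    rng = random.Random(seed)
    means = [
        statistics.fmean(rng.gauss(0.0, noise_sd) for _ in range(n_sensors))
        for _ in range(trials)
    ]
    return statistics.stdev(means)
```

    With 100 sensors the spread is roughly one tenth of the single-sensor spread, which is the averaging advantage the abstract attributes to the many ommatidia.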

  9. Achieving the Earth Science Enterprise Vision for the 21st Century: Platform Challenges

    NASA Technical Reports Server (NTRS)

    Lemmerman, Loren; Komar, George (Technical Monitor)

    2001-01-01

    The observational architecture of the future ESE vision is dramatically different from that of today. The vision suggests observations from multiple orbits, collaborating space assets, and even seamless integration of space and other assets. Observations from GEO or from Libration points rather than from LEO suggest spacecraft carrying instruments with large deployable apertures. Minimization of launch costs suggests that these large apertures have long life, be extremely mass and volume efficient, and have low life cycle cost. Another significant challenge associated with high latitude orbits is high precision pointing and control. Finally, networks of spacecraft flying in predetermined constellations will be required either to apply complementary assets to an observation or to extend the virtual aperture beyond that attainable with a single spacecraft. These changes dictate development of new technology on several fronts, which are outlined in this paper. A section on high speed communications will outline requirements and approaches now envisioned. Sensorwebs will be developed from the viewpoint of work already begun for both space and terrestrial networks. Precision guidance, navigation and control will be addressed from the perspective of precision flying for repeat pass interferometry and extreme pointing stability for advanced altimetry. A separate section will address requirements for distributed systems. Large lightweight deployables will be discussed with an emphasis on inflatable technology and its predicted benefits for large aperture instruments. For each technology area listed, the current state-of-the-art, technological approaches for future development, and projected levels of performance are outlined.

  10. Relating binocular and monocular vision in strabismic and anisometropic amblyopia.

    PubMed

    Agrawal, Ritwick; Conner, Ian P; Odom, J V; Schwartz, Terry L; Mendola, Janine D

    2006-06-01

    To examine deficits in monocular and binocular vision in adults with amblyopia and to test the following 2 hypotheses: (1) Regardless of clinical subtype, the degree of impairment in binocular integration predicts the pattern of monocular acuity deficits. (2) Subjects who lack binocular integration exhibit the most severe interocular suppression. Seven subjects with anisometropia, 6 subjects with strabismus, and 7 control subjects were tested. Monocular tests included Snellen acuity, grating acuity, Vernier acuity, and contrast sensitivity. Binocular tests included Titmus stereo test, binocular motion integration, and dichoptic contrast masking. As expected, both groups showed deficits in monocular acuity, with subjects with strabismus showing greater deficits in Vernier acuity. Both amblyopic groups were then characterized according to the degree of residual stereoacuity and binocular motion integration ability, and 67% of subjects with strabismus compared with 29% of subjects with anisometropia were classified as having "nonbinocular" vision according to our criterion. For this nonbinocular group, Vernier acuity is most impaired. In addition, the nonbinocular group showed the most dichoptic contrast masking of the amblyopic eye and the least dichoptic contrast masking of the fellow eye. The degree of residual binocularity and interocular suppression predicts monocular acuity and may be a significant etiological mechanism of vision loss.

  11. A Study of Shuttlecock's Trajectory in Badminton.

    PubMed

    Chen, Lung-Ming; Pan, Yi-Hsiang; Chen, Yung-Jen

    2009-01-01

    The main purpose of this study was to construct and validate a motion equation for the flight of the badminton shuttlecock and to find the relationship between the air resistance force and the shuttlecock's speed. The method applied aerodynamic theory to construct a motion equation for a shuttlecock's flying trajectory under the effects of gravitational force and air resistance force. The result showed that the motion equation of a shuttlecock's flight trajectory could be constructed by determining the terminal velocity. The predicted shuttlecock trajectory fitted the measured data fairly well. The results also revealed that the drag force was proportional to the square of the shuttlecock's velocity. Furthermore, the angle and strength of a stroke could influence the trajectory. Finally, this study suggested that a scientific approach could be used to measure a shuttlecock's velocity objectively when testing the quality of shuttlecocks, replacing the traditional subjective method of the Badminton World Federation based on players striking shuttlecocks, and that the findings could be applied to improve professional knowledge in badminton player training. Key points: the motion equation of a shuttlecock's flying trajectory can be constructed by determining the terminal velocity; the air drag force is proportional to the square of the shuttlecock's velocity; and the angle and strength of a stroke influence the trajectory.
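
    The abstract's model (gravity plus a drag force proportional to the square of the speed, parameterized by the terminal velocity) can be integrated numerically. A minimal sketch, assuming an illustrative terminal velocity of 6.8 m/s (not a value from the paper):

```python
import math

def final_velocity(v0, angle_deg, vt=6.8, g=9.81, dt=1e-3, t_max=5.0):
    """Euler-integrate 2-D flight with quadratic drag. Writing the drag
    coefficient per unit mass as g/vt**2 makes vt the terminal velocity."""
    th = math.radians(angle_deg)
    vx, vy = v0 * math.cos(th), v0 * math.sin(th)
    k = g / vt ** 2                      # drag acceleration = k * speed**2
    t = 0.0
    while t < t_max:
        s = math.hypot(vx, vy)           # current speed
        vx -= k * s * vx * dt            # drag opposes the velocity
        vy -= (g + k * s * vy) * dt      # gravity plus drag
        t += dt
    return vx, vy
```

    Released from rest, the sketch settles to a vertical speed of vt, matching the paper's point that the trajectory is fixed once the terminal velocity is determined.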

  12. Formation Control for the MAXIM Mission

    NASA Technical Reports Server (NTRS)

    Luquette, Richard J.; Leitner, Jesse; Gendreau, Keith; Sanner, Robert M.

    2004-01-01

    Over the next twenty years, a wave of change is occurring in the space-based scientific remote sensing community. While the fundamental limits in the spatial and angular resolution achievable by spacecraft have been reached with today's technology, an expansive new technology base has appeared over the past decade in the area of Distributed Space Systems (DSS). A key subset of the DSS technology area is that which covers precision formation flying of space vehicles. Through precision formation flying, the baselines, previously defined by the largest monolithic structure that could fit in the largest launch vehicle fairing, are now virtually unlimited. Several missions, including the Micro-Arcsecond X-ray Imaging Mission (MAXIM) and the Stellar Imager, will drive the formation flying challenges to achieve unprecedented baselines for high resolution, extended-scene interferometry in the ultraviolet and X-ray regimes. This paper focuses on establishing the feasibility of formation control for the MAXIM mission. MAXIM formation flying requirements are on the order of microns, while Stellar Imager mission requirements are on the order of nanometers. This paper specifically addresses: (1) high-level science requirements for these missions and how they evolve into engineering requirements; and (2) the development of linearized equations of relative motion for a formation operating in an n-body gravitational field. Linearized equations of motion provide the groundwork for linear formation control designs.
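
    For the special case of a circular two-body reference orbit (a simpler setting than the n-body gravitational field the paper treats), the linearized equations of relative motion take the classic Clohessy-Wiltshire (Hill) form, with $n$ the mean motion of the reference orbit:

```latex
\ddot{x} - 2n\dot{y} - 3n^{2}x = 0, \qquad
\ddot{y} + 2n\dot{x} = 0, \qquad
\ddot{z} + n^{2}z = 0
```

    The n-body case replaces the constant coefficients with time-varying ones, but the same linear state-space structure is what enables the linear formation control designs the paper refers to.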

  13. Advanced Photogrammetry to Assess Lichen Colonization in the Hyper-Arid Namib Desert.

    PubMed

    Hinchliffe, Graham; Bollard-Breen, Barbara; Cowan, Don A; Doshi, Ashray; Gillman, Len N; Maggs-Kolling, Gillian; de Los Rios, Asuncion; Pointing, Stephen B

    2017-01-01

    The hyper-arid central region of the Namib Desert is characterized by quartz desert pavement terrain that is devoid of vascular plant cover. In this extreme habitat the only discernible surface covers are epilithic lichens that colonize exposed surfaces of quartz rocks. These lichens are highly susceptible to disturbance, and so field surveys have been limited due to concerns about disturbing this unusual desert feature. Here we present findings that illustrate how non-destructive surveys based upon advanced photogrammetry techniques can yield meaningful and novel scientific data on these lichens. We combined 'structure from motion' analysis, computer vision and GIS to create 3-dimensional point clouds from two-dimensional imagery. The data were robust in their application to estimating absolute lichen cover. An orange Stellarangia spp. assemblage had coverage of 22.8% of available substrate, whilst for a black Xanthoparmelia spp. assemblage coverage was markedly lower at 0.6% of available substrate. Hyperspectral signatures for both lichens were distinct in the near-infrared range, indicating that Xanthoparmelia spp. was likely under relatively more moisture stress than Stellarangia spp. at the time of sampling, and we postulate that albedo effects may have contributed to this in the black lichen. Further transformation of the data revealed a colonization preference for west-facing quartz surfaces, which coincides with the prevailing winds for marine fog that is the major source of moisture in this system. Furthermore, a three-dimensional 'fly through' of the lichen habitat was created to illustrate how the application of computer vision in microbiology has further potential as a research and education tool. We discuss how advanced photogrammetry could be applied in astrobiology using autonomous rovers to add quantitative ecological data for visible surface colonization on the surface of Mars.

  14. Advanced Photogrammetry to Assess Lichen Colonization in the Hyper-Arid Namib Desert

    PubMed Central

    Hinchliffe, Graham; Bollard-Breen, Barbara; Cowan, Don A.; Doshi, Ashray; Gillman, Len N.; Maggs-Kolling, Gillian; de Los Rios, Asuncion; Pointing, Stephen B.

    2017-01-01

    The hyper-arid central region of the Namib Desert is characterized by quartz desert pavement terrain that is devoid of vascular plant cover. In this extreme habitat the only discernible surface covers are epilithic lichens that colonize exposed surfaces of quartz rocks. These lichens are highly susceptible to disturbance, and so field surveys have been limited due to concerns about disturbing this unusual desert feature. Here we present findings that illustrate how non-destructive surveys based upon advanced photogrammetry techniques can yield meaningful and novel scientific data on these lichens. We combined ‘structure from motion’ analysis, computer vision and GIS to create 3-dimensional point clouds from two-dimensional imagery. The data were robust in their application to estimating absolute lichen cover. An orange Stellarangia spp. assemblage had coverage of 22.8% of available substrate, whilst for a black Xanthoparmelia spp. assemblage coverage was markedly lower at 0.6% of available substrate. Hyperspectral signatures for both lichens were distinct in the near-infrared range, indicating that Xanthoparmelia spp. was likely under relatively more moisture stress than Stellarangia spp. at the time of sampling, and we postulate that albedo effects may have contributed to this in the black lichen. Further transformation of the data revealed a colonization preference for west-facing quartz surfaces, which coincides with the prevailing winds for marine fog that is the major source of moisture in this system. Furthermore, a three-dimensional ‘fly through’ of the lichen habitat was created to illustrate how the application of computer vision in microbiology has further potential as a research and education tool. We discuss how advanced photogrammetry could be applied in astrobiology using autonomous rovers to add quantitative ecological data for visible surface colonization on the surface of Mars. PMID:29312153

  15. Real-time tracking using stereo and motion: Visual perception for space robotics

    NASA Technical Reports Server (NTRS)

    Nishihara, H. Keith; Thomas, Hans; Huber, Eric; Reid, C. Ann

    1994-01-01

    The state-of-the-art in computing technology is rapidly attaining the performance necessary to implement many early vision algorithms at real-time rates. This new capability is helping to accelerate progress in vision research by improving our ability to evaluate the performance of algorithms in dynamic environments. In particular, we are becoming much more aware of the relative stability of various visual measurements in the presence of camera motion and system noise. This new processing speed is also allowing us to raise our sights toward accomplishing much higher-level processing tasks, such as figure-ground separation and active object tracking, in real-time. This paper describes a methodology for using early visual measurements to accomplish higher-level tasks; it then presents an overview of the high-speed accelerators developed at Teleos to support early visual measurements. The final section describes the successful deployment of a real-time vision system to provide visual perception for the Extravehicular Activity Helper/Retriever robotic system in tests aboard NASA's KC135 reduced gravity aircraft.

  16. Vision-guided gripping of a cylinder

    NASA Technical Reports Server (NTRS)

    Nicewarner, Keith E.; Kelley, Robert B.

    1991-01-01

    The motivation for vision-guided servoing is taken from tasks in automated or telerobotic space assembly and construction. Vision-guided servoing requires the ability to perform rapid pose estimates and provide predictive feature tracking. Monocular information from a gripper-mounted camera is used to servo the gripper to grasp a cylinder. The procedure is divided into recognition and servo phases. The recognition stage verifies the presence of a cylinder in the camera field of view. Then an initial pose estimate is computed and uncluttered scan regions are selected. The servo phase processes only the selected scan regions of the image. Given the knowledge, from the recognition phase, that there is a cylinder in the image and knowing the radius of the cylinder, 4 of the 6 pose parameters can be estimated with minimal computation. The relative motion of the cylinder is obtained by using the current pose and prior pose estimates. The motion information is then used to generate a predictive feature-based trajectory for the path of the gripper.

  17. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, from the dynamic displacement responses measured by the MCS, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted after converting the displacements from the MCS to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI directly employing the MCS-measured displacements in FDD was performed and showed results identical to those of the conventional SI method.
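
    The displacement-to-acceleration conversion can be sketched with a second-order central difference (an assumed implementation; the paper does not state its conversion scheme):

```python
def accel_from_displacement(x, dt):
    """Estimate acceleration from sampled displacement x with step dt:
    a[i] ~ (x[i+1] - 2*x[i] + x[i-1]) / dt**2 (second central difference)."""
    return [(x[i + 1] - 2 * x[i] + x[i - 1]) / dt ** 2
            for i in range(1, len(x) - 1)]
```

    For a sinusoidal displacement A*sin(wt) this recovers the exact acceleration -w**2 * A*sin(wt) up to O(dt**2), which is what would feed the frequency domain decomposition step.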

  18. Phantom motion after effects--evidence of detectors for the analysis of optic flow.

    PubMed

    Snowden, R J; Milne, A B

    1997-10-01

    Electrophysiological recording from the extrastriate cortex of non-human primates has revealed neurons that have large receptive fields and are sensitive to various components of object or self movement, such as translations, rotations and expansion/contractions. If these mechanisms exist in human vision, they might be susceptible to adaptation that generates motion aftereffects (MAEs). Indeed, it might be possible to adapt the mechanism in one part of the visual field and reveal what we term a 'phantom MAE' in another part. The existence of phantom MAEs was probed by adapting to a pattern that contained motion in only two non-adjacent 'quarter' segments and then testing using patterns that had elements in only the other two segments. We also tested for the more conventional 'concrete' MAE by testing in the same two segments that had adapted. The strength of each MAE was quantified by measuring the percentage of dots that had to be moved in the opposite direction to the MAE in order to nullify it. Four experiments tested rotational motion, expansion/contraction motion, translational motion and a 'rotation' that consisted simply of the two segments that contained only translational motions of opposing direction. Compared to a baseline measurement where no adaptation took place, all subjects in all experiments exhibited both concrete and phantom MAEs, with the size of the latter approximately half that of the former. Adaptation to two segments that contained upward and downward motion induced the perception of leftward and rightward motion in another part of the visual field. This strongly suggests there are mechanisms in human vision that are sensitive to complex motions such as rotations.

  19. Direction selectivity of blowfly motion-sensitive neurons is computed in a two-stage process.

    PubMed Central

    Borst, A; Egelhaaf, M

    1990-01-01

    Direction selectivity of motion-sensitive neurons is generally thought to result from the nonlinear interaction between the signals derived from adjacent image points. Modeling of motion-sensitive networks, however, reveals that such elements may still respond to motion in a rather poor directionally selective way. Direction selectivity can be significantly enhanced if the nonlinear interaction is followed by another processing stage in which the signals of elements with opposite preferred directions are subtracted from each other. Our electrophysiological experiments in the fly visual system suggest that here direction selectivity is acquired in such a two-stage process. Images PMID:2251278
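
    The two-stage scheme described here can be sketched as a pair of mirror-symmetric multiplicative (Reichardt-type) half-detectors whose outputs are then subtracted; the stimulus and delay below are illustrative choices, not data from the paper.

```python
def delay(sig, d):
    """Shift a sampled signal d samples later (zero-padded at the front)."""
    return [0.0] * d + sig[:-d] if d else list(sig)

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def two_stage(s_left, s_right, d=2):
    """Stage 1: two mirror-symmetric half-detectors (delay-and-multiply).
    Stage 2: subtract the oppositely tuned half-detector outputs."""
    pref = correlate(delay(s_left, d), s_right)   # tuned to left-to-right motion
    null = correlate(delay(s_right, d), s_left)   # tuned to right-to-left motion
    return pref, null, pref - null

# A luminance pulse sweeping left to right: it reaches the right
# receptor two samples after the left one (illustrative stimulus).
pulse = [0, 1, 3, 6, 3, 1, 0]
s_left = pulse + [0.0] * 10
s_right = [0.0] * 2 + pulse + [0.0] * 8
```

    Each half-detector alone still responds weakly to its non-preferred direction; the second, subtractive stage is what yields a strongly signed, direction-selective output, mirroring the paper's conclusion.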

  20. Staying Healthy While You Travel (For Parents)

    MedlinePlus

    ... ear discomfort , travel (or motion) sickness, and diarrhea . Jet Lag When you fly across time zones, it ... for longer than usual. In addition to tiredness, jet lag can also cause an upset stomach and ...

  1. Flies compensate for unilateral wing damage through modular adjustments of wing and body kinematics

    PubMed Central

    Iwasaki, Nicole A.; Elzinga, Michael J.; Melis, Johan M.; Dickinson, Michael H.

    2017-01-01

    Using high-speed videography, we investigated how fruit flies compensate for unilateral wing damage, in which loss of area on one wing compromises both weight support and roll torque equilibrium. Our results show that flies control for unilateral damage by rolling their body towards the damaged wing and by adjusting the kinematics of both the intact and damaged wings. To compensate for the reduction in vertical lift force due to damage, flies elevate wingbeat frequency. Because this rise in frequency increases the flapping velocity of both wings, it has the undesired consequence of further increasing roll torque. To compensate for this effect, flies increase the stroke amplitude and advance the timing of pronation and supination of the damaged wing, while making the opposite adjustments on the intact wing. The resulting increase in force on the damaged wing and decrease in force on the intact wing function to maintain zero net roll torque. However, the bilaterally asymmetrical pattern of wing motion generates a finite lateral force, which flies balance by maintaining a constant body roll angle. Based on these results and additional experiments using a dynamically scaled robotic fly, we propose a simple bioinspired control algorithm for asymmetric wing damage. PMID:28163885

  2. Flies compensate for unilateral wing damage through modular adjustments of wing and body kinematics.

    PubMed

    Muijres, Florian T; Iwasaki, Nicole A; Elzinga, Michael J; Melis, Johan M; Dickinson, Michael H

    2017-02-06

    Using high-speed videography, we investigated how fruit flies compensate for unilateral wing damage, in which loss of area on one wing compromises both weight support and roll torque equilibrium. Our results show that flies control for unilateral damage by rolling their body towards the damaged wing and by adjusting the kinematics of both the intact and damaged wings. To compensate for the reduction in vertical lift force due to damage, flies elevate wingbeat frequency. Because this rise in frequency increases the flapping velocity of both wings, it has the undesired consequence of further increasing roll torque. To compensate for this effect, flies increase the stroke amplitude and advance the timing of pronation and supination of the damaged wing, while making the opposite adjustments on the intact wing. The resulting increase in force on the damaged wing and decrease in force on the intact wing function to maintain zero net roll torque. However, the bilaterally asymmetrical pattern of wing motion generates a finite lateral force, which flies balance by maintaining a constant body roll angle. Based on these results and additional experiments using a dynamically scaled robotic fly, we propose a simple bioinspired control algorithm for asymmetric wing damage.

  3. Limit-cycle-based control of the myogenic wingbeat rhythm in the fruit fly Drosophila

    PubMed Central

    Bartussek, Jan; Mutlu, A. Kadir; Zapotocky, Martin; Fry, Steven N.

    2013-01-01

    In many animals, rhythmic motor activity is governed by neural limit cycle oscillations under the control of sensory feedback. In the fruit fly Drosophila melanogaster, the wingbeat rhythm is generated myogenically by stretch-activated muscles and hence independently from direct neural input. In this study, we explored if generation and cycle-by-cycle control of Drosophila's wingbeat are functionally separated, or if the steering muscles instead couple into the myogenic rhythm as a weak forcing of a limit cycle oscillator. We behaviourally tested tethered flying flies for characteristic properties of limit cycle oscillators. To this end, we mechanically stimulated the fly's ‘gyroscopic’ organs, the halteres, and determined the phase relationship between the wing motion and stimulus. The flies synchronized with the stimulus for specific ranges of stimulus amplitude and frequency, revealing the characteristic Arnol'd tongues of a forced limit cycle oscillator. Rapid periodic modulation of the wingbeat frequency prior to locking demonstrates the involvement of the fast steering muscles in the observed control of the wingbeat frequency. We propose that the mechanical forcing of a myogenic limit cycle oscillator permits flies to avoid the comparatively slow control based on a neural central pattern generator. PMID:23282849

  4. Deep hierarchies in the primate visual cortex: what can we learn for computer vision?

    PubMed

    Krüger, Norbert; Janssen, Peter; Kalkan, Sinan; Lappe, Markus; Leonardis, Ales; Piater, Justus; Rodríguez-Sánchez, Antonio J; Wiskott, Laurenz

    2013-08-01

    Computational modeling of the primate visual system yields insights of potential relevance to some of the challenges that computer vision is facing, such as object recognition and categorization, motion detection and activity recognition, or vision-based navigation and manipulation. This paper reviews some functional principles and structures that are generally thought to underlie the primate visual cortex, and attempts to extract biological principles that could further advance computer vision research. Organized for a computer vision audience, we present functional principles of the processing hierarchies present in the primate visual system considering recent discoveries in neurophysiology. The hierarchical processing in the primate visual system is characterized by a sequence of different levels of processing (on the order of 10) that constitute a deep hierarchy in contrast to the flat vision architectures predominantly used in today's mainstream computer vision. We hope that the functional description of the deep hierarchies realized in the primate visual system provides valuable insights for the design of computer vision algorithms, fostering increasingly productive interaction between biological and computer vision research.

  5. Research on three-dimensional reconstruction method based on binocular vision

    NASA Astrophysics Data System (ADS)

    Li, Jinlin; Wang, Zhihui; Wang, Minjun

    2018-03-01

    As a hot and difficult issue in computer vision, binocular stereo vision is an important form of machine perception with broad application prospects in many fields, such as aerial mapping, visual navigation, motion analysis, and industrial inspection. In this paper, research is done into binocular stereo camera calibration, image feature extraction, and stereo matching. In the camera calibration module, the internal parameters of a single camera are obtained using the checkerboard calibration method of Zhang Zhengyou. For image feature extraction and stereo matching, the SURF operator (a local feature operator) and the SGBM algorithm (a global matching algorithm) are adopted respectively, and their performance is compared. After the feature points are matched, the correspondence between matching points and 3D object points can be built using the calibrated camera parameters, yielding the 3D information.
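
    Once the cameras are calibrated and features matched, recovering the 3D information for a rectified stereo pair reduces to the standard pinhole relations (a generic sketch with made-up parameter values, not code from the paper): disparity d = u_left - u_right and depth Z = f*B/d, with focal length f in pixels and baseline B.

```python
def project(X, Y, Z, f, cx, cy, baseline=0.0):
    """Pinhole projection; pass the baseline to project into the right camera."""
    return f * (X - baseline) / Z + cx, f * Y / Z + cy

def triangulate(u_left, v_left, u_right, f, baseline, cx, cy):
    """Recover (X, Y, Z) from a rectified stereo correspondence."""
    d = u_left - u_right              # disparity in pixels
    Z = f * baseline / d              # depth from stereo geometry
    X = (u_left - cx) * Z / f
    Y = (v_left - cy) * Z / f
    return X, Y, Z
```

    In practice the intrinsics (f, cx, cy) would come from Zhang's calibration and the per-pixel disparities from SGBM; a project-then-triangulate round trip recovers the original point.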

  6. Auditory opportunity and visual constraint enabled the evolution of echolocation in bats.

    PubMed

    Thiagavel, Jeneni; Cechetto, Clément; Santana, Sharlene E; Jakobsen, Lasse; Warrant, Eric J; Ratcliffe, John M

    2018-01-08

    Substantial evidence now supports the hypothesis that the common ancestor of bats was nocturnal and capable of both powered flight and laryngeal echolocation. This scenario entails a parallel sensory and biomechanical transition from a nonvolant, vision-reliant mammal to one capable of sonar and flight. Here we consider anatomical constraints and opportunities that led to a sonar rather than vision-based solution. We show that bats' common ancestor had eyes too small to allow for successful aerial hawking of flying insects at night, but an auditory brain design sufficient to afford echolocation. Further, we find that among extant predatory bats (all of which use laryngeal echolocation), those with putatively less sophisticated biosonar have relatively larger eyes than do more sophisticated echolocators. We contend that signs of ancient trade-offs between vision and echolocation persist today, and that non-echolocating, phytophagous pteropodid bats may retain some of the necessary foundations for biosonar.

  7. Coherent motion threshold measurements for M-cell deficit differ for above- and below-average readers.

    PubMed

    Solan, Harold A; Hansen, Peter C; Shelley-Tremblay, John; Ficarra, Anthony

    2003-11-01

    Research during the past 20 years has influenced the management of diagnosis and treatment of children identified as having learning-related vision problems. The intent of this study is to determine whether coherent motion threshold testing can distinguish better-than-average non-disabled (ND) readers from those who are moderately reading disabled (RD) among sixth-grade students. A sample of 23 better-than-average non-disabled readers (≥ 80th percentile) and 27 moderately disabled readers (≤ 32nd percentile) were identified using a standardized reading comprehension test. Each participant was tested for coherent motion threshold. Previous psychophysical and fMRI research with adults suggests that coherent motion threshold is a valid measure of magnocellular (M-cell) integrity. The average of two coherent motion threshold trials was significantly greater for moderately reading disabled subjects than for above-average readers (p < 0.01). The mean threshold percentage of dots required to observe lateral motion was 9.2% for moderately disabled readers and 4.6% for superior readers (p = 0.001). The outcome of this preliminary study provides an efficient procedure to identify sixth-grade students whose reading disability may be associated with an M-cell deficit. Our previous investigations involving visual processing, visual attention, and oculomotor therapy have resulted in significant improvements in reading comprehension, visual attention, and eye movements. It remains to be demonstrated whether vision therapy has an impact on the M-cell deficit, as measured with coherent motion threshold testing for moderately disabled readers.

  8. The Human Voice and the Silent Cinema.

    ERIC Educational Resources Information Center

    Berg, Charles M.

    This paper traces the history of motion pictures from Thomas Edison's vision in 1887 of an instrument that recorded body movements to the development of synchronized sound-motion films in the late 1920s. The first synchronized sound film was made and demonstrated by W. K. L. Dickson, an assistant to Edison, in 1889. The popular acceptance of…

  9. Teaching an Old Robot New Tricks: Learning Novel Tasks via Interaction with People and Things

    DTIC Science & Technology

    2003-06-01

    visions behind the Cog Project were to build a "robot baby", which could interact with people and objects, imitate the motions of its teachers, and even...though. A very elaborate animatronic motor controller can produce very life-like canned motion, although the controller itself bears little resemblance

  10. Leveraging Simulation Against the F-16 Flying Training Gap

    DTIC Science & Technology

    2005-11-01

    must leverage emerging simulation technology into combined flight training to counter mission employment complexity created by technology itself...two or more of these stand-alone simulators creates a mission training center (MTC), which when further networked create distributed mission...operations (DMO). Ultimately, the grand operational vision of DMO is to interconnect non-collocated users creating a “virtual” joint training environment

  11. Lonely Skies: Air-to-Air Training for a 5th Generation Fighter Force

    DTIC Science & Technology

    2015-06-01

    [Figure list: Missing Attitude Indicator; Lt James Doolittle during Blind Flight Test; An Early Link Trainer Cockpit] ...during visual flight because it deceived pilots about the actual aircraft attitude and acceleration. 1st Lt James Doolittle used Doctor David Meyers...flying pioneer and leader, Doolittle believed that pilots should learn to ignore their physical sense of motion while flying blind and to trust their

  12. Compound eye and ocellar structure for walking and flying modes of locomotion in the Australian ant, Camponotus consobrinus

    PubMed Central

    Narendra, Ajay; Ramirez-Esquivel, Fiorella; Ribi, Willi A.

    2016-01-01

    Ants are unusual among insects in that individuals of the same species within a single colony have different modes of locomotion and tasks. We know from walking ants that vision plays a significant role in guiding this behaviour, but we know surprisingly little about the potential contribution of visual sensory structures for a flying mode of locomotion. Here we investigate the structure of the compound eye and ocelli in pedestrian workers, alate females and alate males of an Australian ant, Camponotus consobrinus, and discuss the trade-offs involved in optical sensitivity and spatial resolution. Male ants have more but smaller ommatidia and the smallest interommatidial angles, which is most likely an adaptation to visually track individual flying females. Both walking and flying forms of ants have a similar proportion of specialized receptors sensitive to polarized skylight, but the absolute number of these receptors varies, being greatest in males. Ocelli are present only in the flying forms. Each ocellus consists of a bipartite retina with a horizon-facing dorsal retina, which contains retinula cells with long rhabdoms, and a sky-facing ventral retina with shorter rhabdoms. We discuss the implications of these and their potential for sensing the pattern of polarized skylight. PMID:26975481

  13. Compound eye and ocellar structure for walking and flying modes of locomotion in the Australian ant, Camponotus consobrinus.

    PubMed

    Narendra, Ajay; Ramirez-Esquivel, Fiorella; Ribi, Willi A

    2016-03-15

    Ants are unusual among insects in that individuals of the same species within a single colony have different modes of locomotion and tasks. We know from walking ants that vision plays a significant role in guiding this behaviour, but we know surprisingly little about the potential contribution of visual sensory structures for a flying mode of locomotion. Here we investigate the structure of the compound eye and ocelli in pedestrian workers, alate females and alate males of an Australian ant, Camponotus consobrinus, and discuss the trade-offs involved in optical sensitivity and spatial resolution. Male ants have more but smaller ommatidia and the smallest interommatidial angles, which is most likely an adaptation to visually track individual flying females. Both walking and flying forms of ants have a similar proportion of specialized receptors sensitive to polarized skylight, but the absolute number of these receptors varies, being greatest in males. Ocelli are present only in the flying forms. Each ocellus consists of a bipartite retina with a horizon-facing dorsal retina, which contains retinula cells with long rhabdoms, and a sky-facing ventral retina with shorter rhabdoms. We discuss the implications of these and their potential for sensing the pattern of polarized skylight.

  14. The Application of Leap Motion in Astronaut Virtual Training

    NASA Astrophysics Data System (ADS)

    Qingchao, Xie; Jiangang, Chao

    2017-03-01

    With the development of computer vision, virtual reality has been applied to astronaut training. As an advanced optical device for hand tracking, Leap Motion provides precise and fluid tracking of the hands, making it suitable as a gesture-input device in astronaut virtual training. This paper builds an astronaut virtual training system based on Leap Motion and establishes a mathematical model of hand occlusion, then analyses the ability of Leap Motion to handle occlusion. A virtual assembly simulation platform was developed for astronaut training, in which occluded gestures influence the recognition process. The experimental results can guide astronaut virtual training.

  15. A video-based system for hand-driven stop-motion animation.

    PubMed

    Han, Xiaoguang; Fu, Hongbo; Zheng, Hanlin; Liu, Ligang; Wang, Jue

    2013-01-01

    Stop-motion is a well-established animation technique but is often laborious and requires craft skills. A new video-based system can animate the vast majority of everyday objects in stop-motion style, more flexibly and intuitively. Animators can perform and capture motions continuously instead of breaking them into increments and shooting one still picture per increment. More important, the system permits direct hand manipulation without resorting to rigs, achieving more natural object control for beginners. The system's key component is two-phase keyframe-based capturing and processing, assisted by computer vision techniques. With this system, even amateurs can generate high-quality stop-motion animations.

  16. Flight Simulator Platform Motion and Air Transport Pilot Training

    NASA Technical Reports Server (NTRS)

    Lee, Alfred T.; Bussolari, Steven R.

    1989-01-01

    The influence of flight simulator platform motion on pilot training and performance was examined in two studies utilizing a B-727-200 aircraft simulator. The simulator, located at Ames Research Center, is certified by the FAA for upgrade and transition training in air carrier operations. Subjective ratings and objective performance of experienced B-727 pilots did not reveal any reliable effects of wide variations in platform motion design. Motion platform variations did, however, affect the acquisition of control skill by pilots with no prior heavy-aircraft flying experience. The effect was limited to pitch attitude control inputs during the early phase of landing training. Implications for the definition of platform motion requirements in air transport pilot training are discussed.

  17. A robust vision-based sensor fusion approach for real-time pose estimation.

    PubMed

    Assa, Akbar; Janabi-Sharifi, Farrokh

    2014-02-01

    Object pose estimation is of great importance to many applications, such as augmented reality, localization and mapping, motion capture, and visual servoing. Although many approaches based on a monocular camera have been proposed, only a few works have concentrated on applying multicamera sensor fusion techniques to pose estimation. Higher accuracy and enhanced robustness toward sensor defects or failures are some of the advantages of these schemes. This paper presents a new Kalman-based sensor fusion approach for pose estimation that offers higher accuracy and precision, and is robust to camera motion and image occlusion, compared to its predecessors. Extensive experiments are conducted to validate the superiority of this fusion method over currently employed vision-based pose estimation algorithms.
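    The sequential fusion at the heart of such Kalman-based schemes can be illustrated with a toy scalar example: each camera's measurement is folded into the state estimate in turn, weighted by its noise variance. This is a hypothetical minimal sketch, not the paper's algorithm; a real pose estimator tracks a 6-DOF state with full covariance matrices.

```python
def kalman_fuse(x, P, measurements):
    """Sequentially fuse scalar measurements into estimate x.

    x: prior state estimate; P: prior variance.
    measurements: iterable of (z, R) pairs, one per camera, where
    z is the observed value and R its noise variance.
    """
    for z, R in measurements:
        K = P / (P + R)        # Kalman gain: trust z more when R is small
        x = x + K * (z - x)    # correct the state toward the measurement
        P = (1.0 - K) * P      # fused estimate has reduced variance
    return x, P
```

Fusing a second, independent camera always shrinks the posterior variance, which is the intuition behind the accuracy and robustness gains claimed for multicamera schemes.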

  18. Relativistic Tennis with Photons: Frequency Up-Shifting, Light Intensification and Ion Acceleration with Flying Mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bulanov, S. V.; Esirkepov, T. Zh.; Kando, M.

    2011-01-04

    We formulate the Flying Mirror Concept for the relativistic interaction of ultra-intense electromagnetic waves with plasmas, and present its theoretical description along with the results of computer simulations and laboratory experiments. In collisionless plasmas, relativistic flying mirrors are thin, dense electron or electron-ion layers accelerated by high-intensity electromagnetic waves to velocities close to the speed of light in vacuum; in nonlinear media and in nonlinear vacuum, they are ionization fronts and refraction-index modulations induced by a strong electromagnetic wave. The reflection of an electromagnetic wave at the relativistic mirror changes its energy and frequency due to the double Doppler effect. In the co-propagating configuration, in the radiation-pressure-dominant regime, the energy of the electromagnetic wave is transferred to ion energy, providing a highly efficient acceleration mechanism. In the counter-propagating configuration, the frequency of the reflected wave is multiplied by a factor proportional to the gamma factor squared. If the relativistic mirror performs an oscillatory motion, as in the case of electron motion at the plasma-vacuum interface, the reflected light spectrum is enriched with high-order harmonics.
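    The counter-propagating frequency multiplication follows from applying the relativistic Doppler shift twice: the exact factor is (1 + β)/(1 − β), which approaches 4γ² as the mirror velocity nears c. A small illustrative sketch:

```python
import math

def doppler_factor(beta):
    """Exact frequency multiplication for light reflected off a
    counter-propagating mirror moving at velocity beta = v/c
    (double Doppler effect)."""
    return (1.0 + beta) / (1.0 - beta)

def doppler_factor_approx(beta):
    """Ultra-relativistic approximation: factor ~ 4 * gamma**2."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return 4.0 * gamma * gamma
```

For beta = 0.99 the exact factor is 199, and the 4γ² approximation is already within about 1% of it, which is why the abstract quotes the γ²-squared scaling.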

  19. Predator pursuit strategies: how do falcons and hawks chase prey?

    NASA Astrophysics Data System (ADS)

    Kane, Suzanne Amador; Zamani, Marjon; Fulton, Andrew; Rosenthal, Lee

    2014-03-01

    This study reports on experiments on falcons, goshawks and red-tailed hawks wearing miniature videocameras mounted on their backs or heads while pursuing flying or ground-based prey. Videos of hunts recorded by the raptors were analyzed to determine apparent prey positions on their visual fields during pursuits. These video data then were interpreted using computer simulations of pursuit steering laws observed in insects and mammals. A comparison of the empirical and modeling data indicates that falcons use cues due to the apparent motion of prey on the falcon's visual field to track and capture flying prey via a form of motion camouflage. The falcons also were found to maintain their prey's image at visual angles consistent with using their shallow fovea. Results for goshawks and red-tailed hawks were analyzed for a comparative study of how pursuits of ground-based prey by accipiters and buteos differ from those used by falcons chasing flying prey. These results should prove relevant for understanding the coevolution of pursuit and evasion, as well as the development of computer models of predation on flocks, and the integration of sensory and locomotion systems in biomimetic robots.

  20. Basic quantitative assessment of visual performance in patients with very low vision.

    PubMed

    Bach, Michael; Wilke, Michaela; Wilhelm, Barbara; Zrenner, Eberhart; Wilke, Robert

    2010-02-01

    A variety of approaches to developing visual prostheses are being pursued: subretinal, epiretinal, via the optic nerve, or via the visual cortex. This report presents a method of comparing their efficacy at genuinely improving visual function, starting at no light perception (NLP). A test battery (a computer program, Basic Assessment of Light and Motion [BaLM]) was developed in four basic visual dimensions: (1) light perception (light/no light), with an unstructured large-field stimulus; (2) temporal resolution, with single versus double flash discrimination; (3) localization of light, where a wedge extends from the center into four possible directions; and (4) motion, with a coarse pattern moving in one of four directions. Two- or four-alternative, forced-choice paradigms were used. The participants' responses were self-paced and delivered with a keypad. The feasibility of the BaLM was tested in 73 eyes of 51 patients with low vision. The light and time test modules discriminated between NLP and light perception (LP). The localization and motion modules showed no significant response for NLP but discriminated between LP and hand movement (HM). All four modules reached their ceilings in the acuity categories higher than HM. BaLM results systematically differed between the very-low-acuity categories NLP, LP, and HM. Light and time yielded similar results, as did localization and motion; still, for assessing the visual prostheses with differing temporal characteristics, they are not redundant. The results suggest that this simple test battery provides a quantitative assessment of visual function in the very-low-vision range from NLP to HM.

  1. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    PubMed

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  2. Feedforward ankle strategy of balance during quiet stance in adults

    PubMed Central

    Gatev, Plamen; Thomas, Sherry; Kepple, Thomas; Hallett, Mark

    1999-01-01

    We studied quiet stance investigating strategies for maintaining balance. Normal subjects stood with natural stance and with feet together, with eyes open or closed. Kinematic, kinetic and EMG data were evaluated and cross-correlated. Cross-correlation analysis revealed a high, positive, zero-phased correlation between anteroposterior motions of the centre of gravity (COG) and centre of pressure (COP), head and COG, and between linear motions of the shoulder and knee in both sagittal and frontal planes. There was a moderate, negative, zero-phased correlation between the anteroposterior motion of COP and ankle angular motion. Narrow stance width increased ankle angular motion, hip angular motion, mediolateral sway of the COG, and the correlation between linear motions of the shoulder and knee in the frontal plane. Correlations between COG and COP and linear motions of the shoulder and knee in the sagittal plane were decreased. The correlation between the hip angular sway in the sagittal and frontal planes was dependent on interaction between support and vision. Low, significant positive correlations with time lags of the maximum of cross-correlation of 250-300 ms were found between the EMG activity of the lateral gastrocnemius muscle and anteroposterior motions of the COG and COP during normal stance. Narrow stance width decreased both correlations whereas absence of vision increased the correlation with COP. Ankle mechanisms dominate during normal stance, especially in the sagittal plane. Narrow stance width decreased the role of the ankle and increased the role of hip mechanisms in the sagittal plane, while in the frontal plane both increased. The modulation pattern of the lateral gastrocnemius muscle suggests a central program of control of the ankle joint stiffness working to predict the loading pattern. PMID:9882761

  3. The Manchester Fly Facility: Implementing an objective-driven long-term science communication initiative.

    PubMed

    Patel, Sanjai; Prokop, Andreas

    2017-10-01

    Science communication is increasingly important for scientists, although research, teaching and administration activities tend to eat up our time already, and budgets for science communication are usually low. It appears impossible to combine all these tasks and, in addition, to develop engagement activities to a quality and impact that would make the efforts worth their while. Here we argue that these challenges are easier addressed when centering science communication initiatives on a long-term vision with a view to eventually forming outreach networks where the load can be shared whilst being driven to higher momentum. As one example, we explain the science communication initiative of the Manchester Fly Facility. It aims to promote public awareness of research using the model organism Drosophila, which is a timely, economic and most efficient experimental strategy to drive discovery processes in the biomedical sciences and must have a firm place in the portfolios of funding organisations. Although this initiative by the Manchester Fly Facility is sustained on a low budget, its long-term vision has allowed gradual development into a multifaceted initiative: (1) targeting university students via resources and strategies for the advanced training in fly genetics; (2) targeting the general public via science fairs, educational YouTube videos, school visits, teacher seminars and the droso4schools project; (3) disseminating and marketing strategies and resources to the public as well as fellow scientists via dedicated websites, blogs, journal articles, conference presentations and workshops - with a view to gradually forming networks of drosophilists that will have a greater potential to drive the science communication objective to momentum and impact. 
Here we explain the rationales and implementation strategies for our various science communication activities - which are similarly applicable to other model animals and other areas of academic science - and share our experiences and resources to provide ideas and readily available means to those who are actively engaging or intend to do so. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. Image Processing Occupancy Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Image Processing Occupancy Sensor, or IPOS, is a novel sensor technology developed at the National Renewable Energy Laboratory (NREL). The sensor is based on low-cost embedded microprocessors widely used by the smartphone industry and leverages mature open-source computer vision software libraries. Compared to traditional passive infrared and ultrasonic-based motion sensors currently used for occupancy detection, IPOS has shown the potential for improved accuracy and a richer set of feedback signals for occupant-optimized lighting, daylighting, temperature setback, ventilation control, and other occupancy and location-based uses. Unlike traditional passive infrared (PIR) or ultrasonic occupancy sensors, which infer occupancy based only on motion, IPOS uses digital image-based analysis to detect and classify various aspects of occupancy, including the presence of occupants regardless of motion, their number, location, and activity levels, as well as the illuminance properties of the monitored space. The IPOS software leverages the recent availability of low-cost embedded computing platforms, computer vision software libraries, and camera elements.

  5. Humanlike robot hands controlled by brain activity arouse illusion of ownership in operators

    PubMed Central

    Alimardani, Maryam; Nishio, Shuichi; Ishiguro, Hiroshi

    2013-01-01

    Operators of a pair of robotic hands report ownership of those hands when they hold an image of a grasp motion and watch the robot perform it. We present a novel body-ownership illusion that is induced merely by watching and controlling a robot's motions through a brain-machine interface. In past studies, body-ownership illusions were induced by the correlation of sensory inputs such as vision, touch and proprioception. In the illusion presented here, however, none of these sensations is integrated except vision. Our results show that during BMI operation of robotic hands, the interaction between motor commands and visual feedback of the intended motions is sufficient to incorporate the non-body limbs into one's own body. Our discussion focuses on the role of proprioceptive information in the mechanism of agency-driven illusions. We believe that our findings will contribute to the improvement of telepresence systems in which operators incorporate BMI-operated robots into their body representations. PMID:23928891

  6. Insect-like flapping wing mechanism based on a double spherical Scotch yoke.

    PubMed

    Galiński, Cezary; Zbikowski, Rafał

    2005-06-22

    We describe the rationale, concept, design and implementation of a fixed-motion (non-adjustable) mechanism for insect-like flapping wing micro air vehicles in hover, inspired by two-winged flies (Diptera). This spatial (as opposed to planar) mechanism is based on the novel idea of a double spherical Scotch yoke. The mechanism was constructed for two main purposes: (i) as a test bed for aeromechanical research on hover in flapping flight, and (ii) as a precursor design for a future flapping wing micro air vehicle. Insects fly by oscillating (plunging) and rotating (pitching) their wings through large angles, while sweeping them forwards and backwards. During this motion the wing tip approximately traces a "figure-of-eight" or a "banana" and the wing changes the angle of attack (pitching) significantly. The kinematic and aerodynamic data from free-flying insects are sparse and uncertain, and it is not clear what aerodynamic consequences different wing motions have. Since acquiring the necessary kinematic and dynamic data from biological experiments remains a challenge, a synthetic, controlled study of insect-like flapping is not only of engineering value, but also of biological relevance. Micro air vehicles are defined as flying vehicles approximately 150 mm in size (hand-held), weighing 50-100g, and are developed to reconnoitre in confined spaces (inside buildings, tunnels, etc.). For this application, insect-like flapping wings are an attractive solution and hence the need to realize the functionality of insect flight by engineering means. Since the semi-span of the insect wing is constant, the kinematics are spatial; in fact, an approximate figure-of-eight/banana is traced on a sphere. Hence a natural mechanism implementing such kinematics should be (i) spherical and (ii) generate mathematically convenient curves expressing the figure-of-eight/banana shape. 
The double spherical Scotch yoke design has property (i) by definition and achieves (ii) by tracing spherical Lissajous curves.
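    The "figure-of-eight on a sphere" described above corresponds to azimuth (sweep) and elevation (deviation) oscillating at a 1:2 frequency ratio, i.e. a spherical Lissajous curve. A hypothetical sketch of such a wing-tip path (the amplitudes and the 1:2 ratio are illustrative choices, not the mechanism's actual parameters):

```python
import math

def spherical_lissajous(t, sweep_amp=math.radians(60),
                        dev_amp=math.radians(15), f=1.0):
    """Wing-tip position on the unit sphere at time t (seconds).

    Azimuth phi oscillates at frequency f (fore-aft sweep) and
    elevation theta at 2f (up-down deviation), tracing a
    figure-of-eight on the sphere over one flapping cycle.
    """
    phi = sweep_amp * math.sin(2.0 * math.pi * f * t)
    theta = dev_amp * math.sin(2.0 * math.pi * 2.0 * f * t)
    # Cartesian coordinates of the tip on the unit sphere
    x = math.cos(theta) * math.cos(phi)
    y = math.cos(theta) * math.sin(phi)
    z = math.sin(theta)
    return x, y, z
```

Because the semi-span is constant, every sampled point lies on the unit sphere; doubling the elevation frequency relative to the sweep is what closes the path into the figure-of-eight.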

  7. Insect-like flapping wing mechanism based on a double spherical Scotch yoke

    PubMed Central

    Galiński, Cezary; Żbikowski, Rafał

    2005-01-01

    We describe the rationale, concept, design and implementation of a fixed-motion (non-adjustable) mechanism for insect-like flapping wing micro air vehicles in hover, inspired by two-winged flies (Diptera). This spatial (as opposed to planar) mechanism is based on the novel idea of a double spherical Scotch yoke. The mechanism was constructed for two main purposes: (i) as a test bed for aeromechanical research on hover in flapping flight, and (ii) as a precursor design for a future flapping wing micro air vehicle. Insects fly by oscillating (plunging) and rotating (pitching) their wings through large angles, while sweeping them forwards and backwards. During this motion the wing tip approximately traces a ‘figure-of-eight’ or a ‘banana’ and the wing changes the angle of attack (pitching) significantly. The kinematic and aerodynamic data from free-flying insects are sparse and uncertain, and it is not clear what aerodynamic consequences different wing motions have. Since acquiring the necessary kinematic and dynamic data from biological experiments remains a challenge, a synthetic, controlled study of insect-like flapping is not only of engineering value, but also of biological relevance. Micro air vehicles are defined as flying vehicles approximately 150 mm in size (hand-held), weighing 50–100 g, and are developed to reconnoitre in confined spaces (inside buildings, tunnels, etc.). For this application, insect-like flapping wings are an attractive solution and hence the need to realize the functionality of insect flight by engineering means. Since the semi-span of the insect wing is constant, the kinematics are spatial; in fact, an approximate figure-of-eight/banana is traced on a sphere. Hence a natural mechanism implementing such kinematics should be (i) spherical and (ii) generate mathematically convenient curves expressing the figure-of-eight/banana shape. 
The double spherical Scotch yoke design has property (i) by definition and achieves (ii) by tracing spherical Lissajous curves. PMID:16849181

  8. Efficient encoding of motion is mediated by gap junctions in the fly visual system.

    PubMed

    Wang, Siwei; Borst, Alexander; Zaslavsky, Noga; Tishby, Naftali; Segev, Idan

    2017-12-01

    Understanding the computational implications of specific synaptic connectivity patterns is a fundamental goal in neuroscience. In particular, the computational role of ubiquitous electrical synapses operating via gap junctions remains elusive. In the fly visual system, the cells in the vertical-system network, which play a key role in visual processing, primarily connect to each other via axonal gap junctions. This network therefore provides a unique opportunity to explore the functional role of gap junctions in sensory information processing. Our information theoretical analysis of a realistic VS network model shows that within 10 ms following the onset of the visual input, the presence of axonal gap junctions enables the VS system to efficiently encode the axis of rotation, θ, of the fly's ego motion. This encoding efficiency, measured in bits, is near-optimal with respect to the physical limits of performance determined by the statistical structure of the visual input itself. The VS network is known to be connected to downstream pathways via a subset of triplets of the vertical system cells; we found that because of the axonal gap junctions, the efficiency of this subpopulation in encoding θ is superior to that of the whole vertical system network and is robust to a wide range of signal to noise ratios. We further demonstrate that this efficient encoding of motion by this subpopulation is necessary for the fly's visually guided behavior, such as banked turns in evasive maneuvers. Because gap junctions are formed among the axons of the vertical system cells, they only impact the system's readout, while maintaining the dendritic input intact, suggesting that the computational principles implemented by neural circuitries may be much richer than previously appreciated based on point neuron models. Our study provides new insights as to how specific network connectivity leads to efficient encoding of sensory stimuli.

  9. Plasticity Beyond V1: Reinforcement of Motion Perception upon Binocular Central Retinal Lesions in Adulthood.

    PubMed

    Burnat, Kalina; Hu, Tjing-Tjing; Kossut, Małgorzata; Eysel, Ulf T; Arckens, Lutgarde

    2017-09-13

    Induction of a central retinal lesion in both eyes of adult mammals is a model for macular degeneration and leads to retinotopic map reorganization in the primary visual cortex (V1). Here we characterized the spatiotemporal dynamics of molecular activity levels in the central and peripheral representation of five higher-order visual areas, V2/18, V3/19, V4/21a, V5/PMLS, and area 7, as well as V1/17, in adult cats with central 10° retinal lesions (both sexes), by means of real-time PCR for the neuronal activity reporter gene zif268. The lesions elicited a similar, permanent reduction in activity in the center of the lesion projection zone of area V1/17, V2/18, V3/19, and V4/21a, but not in the motion-driven V5/PMLS, which instead displayed an increase in molecular activity at 3 months postlesion, independent of visual field coordinates. Also area 7 only displayed decreased activity in its LPZ in the first weeks postlesion and increased activities in its periphery from 1 month onward. Therefore we examined the impact of central vision loss on motion perception using random dot kinematograms to test the capacity for form from motion detection based on direction and velocity cues. We revealed that the central retinal lesions either do not impair motion detection or even result in better performance, specifically when motion discrimination was based on velocity discrimination. In conclusion, we propose that central retinal damage leads to enhanced peripheral vision by sensitizing the visual system for motion processing relying on feedback from V5/PMLS and area 7. SIGNIFICANCE STATEMENT Central retinal lesions, a model for macular degeneration, result in functional reorganization of the primary visual cortex. Examining the level of cortical reactivation with the molecular activity marker zif268 revealed reorganization in visual areas outside V1. 
Retinotopic lesion projection zones typically display an initial depression in zif268 expression, followed by partial recovery with postlesion time. Only the motion-sensitive area V5/PMLS shows no decrease, and even a significant activity increase at 3 months post-retinal lesion. Behavioral tests of motion perception found no impairment and even better sensitivity to higher random dot stimulus velocities. We demonstrate that the loss of central vision induces functional mobilization of motion-sensitive visual cortex, resulting in enhanced perception of moving stimuli. Copyright © 2017 the authors.

  10. On the Integration of Medium Wave Infrared Cameras for Vision-Based Navigation

    DTIC Science & Technology

    2015-03-01

    Visual Structure from Motion (VisualSFM) is an application that performs incremental SfM using images fed into it of a scene [20]...too drastically in between frames. When this happens, VisualSFM will begin creating a new model with images that do not fit to the old one. These new

  11. Vision sensing techniques in aeronautics and astronautics

    NASA Technical Reports Server (NTRS)

    Hall, E. L.

    1988-01-01

    The close relationship between sensing and other tasks in orbital space, and the integral role of vision sensing in practical aerospace applications, are illustrated. Typical space mission-vision tasks encompass the docking of space vehicles, the detection of unexpected objects, the diagnosis of spacecraft damage, and the inspection of critical spacecraft components. Attention is presently given to image functions, the 'windowing' of a view, the number of cameras required for inspection tasks, the choice of incoherent or coherent (laser) illumination, three-dimensional-to-two-dimensional model-matching, edge- and region-segmentation techniques, and motion analysis for tracking.

  12. VISIONS - Vista Star Formation Atlas

    NASA Astrophysics Data System (ADS)

    Meingast, Stefan; Alves, J.; Boui, H.; Ascenso, J.

    2017-06-01

    In this talk I will present the new ESO public survey VISIONS. Starting in early 2017 we will use the ESO VISTA survey telescope in a 550 h long programme to map the largest molecular cloud complexes within 500 pc in a multi-epoch program. The survey is optimized for measuring the proper motions of young stellar objects invisible to Gaia and mapping the cloud-structure with extinction. VISIONS will address a series of ISM topics ranging from the connection of dense cores to YSOs and the dynamical evolution of embedded clusters to variations in the reddening law on both small and large scales.

  13. 3D Holographic Observatory for Long-term Monitoring of Complex Behaviors in Drosophila

    NASA Astrophysics Data System (ADS)

    Kumar, S. Santosh; Sun, Yaning; Zou, Sige; Hong, Jiarong

    2016-09-01

    Drosophila is an excellent model organism for understanding cognitive function, aging, and neurodegeneration in humans. The effects of aging and other long-term dynamics on behavior serve as important biomarkers in identifying such changes to the brain. In this regard, we present a new imaging technique for lifetime monitoring of Drosophila in 3D, at spatial and temporal resolutions capable of resolving the motion of limbs and wings, using holographic principles. The developed system can monitor and extract various behavioral parameters, such as ethograms and spatial distributions, from a group of flies simultaneously. This technique can image complicated leg and wing motions of flies at a resolution that allows capturing specific landing responses from the same data set. Overall, this system provides a unique opportunity for high-throughput screening of behavioral changes in 3D over the long term in Drosophila.

  14. Taming Crowded Visual Scenes

    DTIC Science & Technology

    2014-08-12

    Nolan Warner, Mubarak Shah. Tracking in Dense Crowds Using Prominence and Neighborhood Motion Concurrence, IEEE Transactions on Pattern Analysis... of computer vision, computer graphics and evacuation dynamics by providing a common platform, and provides... areas that include Computer Vision, Computer Graphics, and Pedestrian Evacuation Dynamics. Despite the

  15. Detailed description of the HP-9825A HFRMP trajectory processor (TRAJ)

    NASA Technical Reports Server (NTRS)

    Kindall, S. M.; Wilson, S. W.

    1979-01-01

    The computer code for the trajectory processor of the HP-9825A High Fidelity Relative Motion Program is described in detail. The processor is a 12-degrees-of-freedom trajectory integrator which can be used to generate digital and graphical data describing the relative motion of the Space Shuttle Orbiter and a free-flying cylindrical payload. Coding standards and flow charts are given and the computational logic is discussed.

  16. Assessing the performance of structure-from-motion photogrammetry and terrestrial lidar 1 at reconstructing soil surface microtopography of naturally vegetated plots

    USDA-ARS?s Scientific Manuscript database

    Soil microtopography or soil roughness is a property of critical importance in many earth surface processes but is often difficult to measure. Advances in computer vision technologies have made image-based 3D depiction of the soil surface or Structure-from-Motion (SfM) available to many scientists ...

  17. Moving vehicles segmentation based on Gaussian motion model

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Fang, Xiang Z.; Lin, Wei Y.

    2005-07-01

    Moving object segmentation is a challenge in computer vision. This paper focuses on the segmentation of moving vehicles in dynamic scenes. We analyze the psychology of human vision and present a framework for segmenting moving vehicles on the highway. The proposed framework consists of two parts. First, we propose an adaptive background update method in which the background is updated according to the change of illumination conditions and can thus adapt sensitively to illumination changes. Second, we construct a Gaussian motion model to segment moving vehicles, in which the motion vectors of moving pixels are modeled as a Gaussian distribution and an on-line EM algorithm is used to update the model. The Gaussian distribution of the adaptive model is evaluated to determine which motion vectors result from moving vehicles and which from other moving objects such as waving trees. Finally, the pixels whose motion vectors result from moving vehicles are segmented. Experimental results on several typical scenes show that the proposed model detects moving vehicles correctly and is immune to interference from moving objects such as waving trees and from camera vibration.
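
The on-line update described above can be sketched as a single adaptive Gaussian over motion vectors. This is a simplified stand-in for the paper's on-line EM procedure: the class name, learning rate, and sigma threshold below are illustrative, not the authors' actual parameters.

```python
import numpy as np

np.random.seed(0)

class OnlineGaussianMotionModel:
    """Single 2-D Gaussian over pixel motion vectors, updated online;
    a simplified stand-in for the paper's on-line EM update."""

    def __init__(self, alpha=0.05):
        self.alpha = alpha        # learning rate of the online update
        self.mean = np.zeros(2)   # mean motion vector (dx, dy)
        self.var = np.ones(2)     # per-axis variance

    def update(self, v):
        """Fold one observed motion vector into the model."""
        v = np.asarray(v, dtype=float)
        self.mean = (1 - self.alpha) * self.mean + self.alpha * v
        self.var = (1 - self.alpha) * self.var + self.alpha * (v - self.mean) ** 2

    def is_vehicle(self, v, k=2.5):
        """Vectors within k sigma of the dominant motion are labeled
        'vehicle'; outliers (e.g. waving trees) fall outside the model."""
        z = np.abs(np.asarray(v, dtype=float) - self.mean) / np.sqrt(self.var)
        return bool(np.all(z < k))

model = OnlineGaussianMotionModel()
for _ in range(200):  # coherent rightward traffic flow, ~5 px/frame
    model.update([5.0, 0.0] + np.random.randn(2) * 0.3)

print(model.is_vehicle([5.1, 0.1]))  # consistent with the traffic motion
print(model.is_vehicle([0.0, 4.0]))  # inconsistent, e.g. a waving tree
```

In the full method a per-pixel decision like `is_vehicle` would feed the final segmentation mask.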

  18. Ageing vision and falls: a review.

    PubMed

    Saftari, Liana Nafisa; Kwon, Oh-Sang

    2018-04-23

    Falls are the leading cause of accidental injury and death among older adults. One of three adults over the age of 65 years falls annually. As the size of elderly population increases, falls become a major concern for public health and there is a pressing need to understand the causes of falls thoroughly. While it is well documented that visual functions such as visual acuity, contrast sensitivity, and stereo acuity are correlated with fall risks, little attention has been paid to the relationship between falls and the ability of the visual system to perceive motion in the environment. The omission of visual motion perception in the literature is a critical gap because it is an essential function in maintaining balance. In the present article, we first review existing studies regarding visual risk factors for falls and the effect of ageing vision on falls. We then present a group of phenomena such as vection and sensory reweighting that provide information on how visual motion signals are used to maintain balance. We suggest that the current list of visual risk factors for falls should be elaborated by taking into account the relationship between visual motion perception and balance control.

  19. Formation Design Strategy for SCOPE High-Elliptic Formation Flying Mission

    NASA Technical Reports Server (NTRS)

    Tsuda, Yuichi

    2007-01-01

    A new formation design strategy using simulated annealing (SA) optimization is presented. The SA algorithm is useful for surveying the whole solution space of optimum formations, taking into account realistic constraints composed of continuous and discrete functions. It is revealed that this method is applicable not only to circular orbits but also to high-elliptic orbit formation flying. The developed algorithm is first tested with a simple cart-wheel motion example and then applied to the formation design for SCOPE. SCOPE is the next-generation geomagnetotail observation mission planned by JAXA, utilizing formation flying technology in a highly elliptic orbit. A distinctive and useful heuristic is found by investigating the SA results, showing the effectiveness of the proposed design process.
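
A minimal sketch of simulated annealing over a mixed continuous/discrete design variable, in the spirit of the abstract; the toy cost function, the four "formation slots," and the cooling schedule are invented for illustration and bear no relation to the actual SCOPE objective.

```python
import math
import random

random.seed(1)

def cost(theta, slot):
    """Toy stand-in for a formation-design objective: a continuous phase
    angle theta plus a discrete slot choice (4 candidate formation slots,
    each adding its own fixed penalty)."""
    nominal = slot * math.pi / 2
    return math.sin(theta - nominal) ** 2 + 0.1 * slot

def anneal(steps=20000, T0=1.0, cooling=0.9995):
    theta = random.uniform(0.0, 2.0 * math.pi)
    slot = random.randrange(4)
    best = (cost(theta, slot), theta, slot)
    T = T0
    for _ in range(steps):
        # mixed move: jitter the continuous variable, occasionally
        # jump to another discrete slot
        t2 = theta + random.gauss(0.0, 0.1)
        s2 = random.randrange(4) if random.random() < 0.1 else slot
        dE = cost(t2, s2) - cost(theta, slot)
        if dE < 0 or random.random() < math.exp(-dE / T):
            theta, slot = t2, s2
            c = cost(theta, slot)
            if c < best[0]:
                best = (c, theta, slot)
        T *= cooling
    return best

c, theta, slot = anneal()
print(round(c, 3), slot)  # slot 0 carries the lowest discrete penalty
```

The Metropolis acceptance test is what lets SA survey the whole mixed solution space before the temperature freezes it into one basin.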

  20. Control of a free-flying robot manipulator system

    NASA Technical Reports Server (NTRS)

    Alexander, H.; Cannon, R. H., Jr.

    1985-01-01

    The goal of the research is to develop and test control strategies for a self-contained, free-flying space robot. Such a robot would perform operations in space similar to those currently handled by astronauts during extravehicular activity (EVA). The focus of the work is to develop and carry out a program of research with a series of physical Satellite Robot Simulator Vehicles (SRSVs), two-dimensionally freely mobile laboratory models of autonomous free-flying space robots such as might perform extravehicular functions associated with operation of a space station or repair of orbiting satellites. The development of the SRSV and of some of the controller subsystems is described. The two-link arm was fitted to the SRSV base, and researchers explored the open-loop characteristics of the arm and thruster actuators. Work began on building the software foundation necessary for use of the on-board computer, as well as hardware and software for a local vision system for target identification and tracking.

  1. A magnetic tether system to investigate visual and olfactory mediated flight control in Drosophila.

    PubMed

    Duistermars, Brian J; Frye, Mark

    2008-11-21

    It has been clear for many years that insects use visual cues to stabilize their heading in a wind stream. Many animals track odors carried in the wind. As such, visual stabilization of upwind tracking directly aids in odor tracking. But do olfactory signals directly influence visual tracking behavior independently from wind cues? Also, the recent deluge of research on the neurophysiology and neurobehavioral genetics of olfaction in Drosophila has motivated ever more technically sophisticated and quantitative behavioral assays. Here, we modified a magnetic tether system originally devised for vision experiments by equipping the arena with narrow laminar flow odor plumes. A fly is glued to a small steel pin and suspended in a magnetic field that enables it to yaw freely. Small diameter food odor plumes are directed downward over the fly's head, eliciting stable tracking by a hungry fly. Here we focus on the critical mechanics of tethering, aligning the magnets, devising the odor plume, and confirming stable odor tracking.

  2. Space Shuttle Strategic Planning Status

    NASA Technical Reports Server (NTRS)

    Norbraten, Gordon L.; Henderson, Edward M.

    2007-01-01

    The Space Shuttle Program is aggressively flying the Space Shuttle manifest for assembling the International Space Station and servicing the Hubble Space Telescope. Completing this flight manifest while concurrently transitioning to the Exploration architecture creates formidable challenges, the most notable of which is retaining critical skills within the Shuttle Program workforce. The Program must define a strategy that will allow safe and efficient fly-out of the Shuttle, while smoothly transitioning Shuttle assets (both human and facility) to support early flight demonstrations required in the development of NASA's Crew Exploration Vehicle (Orion) and Crew and Cargo Launch Vehicles (Ares I). The Program must accomplish all of this while maintaining the current level of resources. Therefore, it will be necessary to initiate major changes in operations and contracting. Overcoming these challenges will be essential for NASA to fly the Shuttle safely, accomplish the Vision for Space Exploration, and ultimately meet the national goal of maintaining a robust space program. This paper will address the Space Shuttle Program's strategy and its current status in meeting these challenges.

  3. Honey bees (Apis mellifera ligustica) swing abdomen to dissipate residual flying energy landing on a wall

    NASA Astrophysics Data System (ADS)

    Zhao, Jieliang; Huang, He; Yan, Shaoze

    2017-03-01

    Whether for insects or for aircraft, landing is one of the indispensable links in the verification of airworthiness safety. The mechanisms by which insects achieve a fast and stable landing remain unclear. An intriguing example is provided by honeybees (Apis mellifera ligustica), which use the swinging motion of their abdomen to dissipate residual flying energy and to achieve a smooth, stable, and quick landing. Using a high-speed camera, we observed that touchdown is initiated by honeybees extending their front legs or antennae and then landing softly on a wall. After touchdown, they swing the rest of their bodies until all flying energy is dissipated. We propose a simplified model with mass-spring dampers for the body of the honeybee and reveal the mechanism of flying energy transfer and dissipation in detail. Results demonstrate that body translation and abdomen swinging help honeybees dissipate residual flying energy and orchestrate smooth landings. The initial kinetic energy of flight is transformed into the kinetic energy of the abdomen's rotary movement, which is then converted into thermal energy over the swinging cycle. This strategy provides insight into the mechanics of insect flight and may inspire aerial vehicle designs with better landing performance.
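
The energy-dissipation idea can be illustrated with a damped rotary oscillator standing in for the swinging abdomen: the impact's kinetic energy enters the rotary mode and the damper bleeds it off as heat. All parameter values below are illustrative, not fitted honeybee data.

```python
# Damped rotary oscillator standing in for the abdomen's swinging motion;
# all parameter values are illustrative, not fitted honeybee data.
I = 1e-6      # rotary inertia of the abdomen (kg*m^2)
c = 5e-6      # rotary damping coefficient (N*m*s)
k = 1e-4      # torsional stiffness at the waist joint (N*m/rad)
dt, steps = 1e-4, 50000   # 5 s of simulated swinging

omega = 50.0  # angular rate imparted by the landing impact (rad/s)
theta = 0.0
E0 = 0.5 * I * omega ** 2  # rotary kinetic energy right after touchdown

for _ in range(steps):     # semi-implicit Euler integration
    alpha = (-c * omega - k * theta) / I
    omega += alpha * dt
    theta += omega * dt

E = 0.5 * I * omega ** 2 + 0.5 * k * theta ** 2
print(E / E0)  # fraction of the impact energy not yet dissipated
```

With these values the oscillator is underdamped (damping ratio 0.25), so the energy decays over several visible swing cycles rather than in a single overdamped collapse.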

  4. Context-dependent olfactory enhancement of optomotor flight control in Drosophila.

    PubMed

    Chow, Dawnis M; Frye, Mark A

    2008-08-01

    Sensing and following the chemical plume of food odors is a fundamental challenge faced by many organisms. For flying insects, the task is complicated by wind that distorts the plume and buffets the fly. To maintain an upwind heading, and thus stabilize their orientation in a plume, insects such as flies and moths make use of strong context-specific visual equilibrium reflexes. For example, flying straight requires the regulation of image rotation across the eye, whereas minimizing side-slip and avoiding a collision require regulation of image expansion. In flies, visual rotation stabilizes plume tracking, but rotation and expansion optomotor responses are controlled by separate visual pathways. Are olfactory signals integrated with optomotor responses in a manner dependent upon visual context? We addressed this question by investigating the effect of an attractive food odor on active optomotor flight control. Odorant caused flies both to increase aerodynamic power output and to steer straighter. However, when challenged with wide-field optic flow, odor resulted in enhanced amplitude rotation responses but reduced amplitude expansion responses. For both visual conditions, flies tracked motion signals more closely in odor, an indication of increased saliency. These results suggest a simple search algorithm by which olfactory signals improve the salience of visual stimuli and modify optomotor control in a context-dependent manner, thereby enabling an animal to fly straight up a plume and approach odiferous objects.

  5. A new goldfish model to evaluate pharmacokinetic and pharmacodynamic effects of drugs used for motion sickness in different gravity loads

    NASA Astrophysics Data System (ADS)

    Lathers, Claire M.; Mukai, Chiaki; Smith, Cedric M.; Schraeder, Paul L.

    2001-08-01

    This paper proposes a new goldfish model to predict pharmacodynamic/pharmacokinetic effects of drugs used to treat motion sickness administered in differing gravity loads. The assumption of these experiments is that the vestibular system is dominant in producing motion sickness and that the visual system is secondary or of small import in the production of motion sickness. Studies will evaluate the parameter of gravity and the contribution of vision to the role of the neurovestibular system in the initiation of motion sickness with and without pharmacologic agents. Promethazine will be studied first. A comparison of data obtained in different groups of goldfish will be done (normal vs. acutely and chronically bilaterally blinded vs. sham operated). Some fish will be bilaterally blinded 10 months prior to initiation of the experiment (designated the chronically bilaterally blinded group of goldfish) to evaluate the neuroplasticity of the nervous system and the associated return of neurovestibular function. Data will be obtained under differing gravity loads with and without a pharmacological agent for motion sickness. Experiments will differentiate pharmacological effects on vision vs. neurovestibular input to motion sickness. Comparison of data obtained in the normal fish and in acutely and chronically bilaterally blinded fish with those obtained in fish with intact and denervated otoliths will differentiate if the visual or neurovestibular system is dominant in response to altered gravity and/or drugs. Experiments will contribute to validation of the goldfish as a model for humans since plasticity of the central nervous system allows astronauts to adapt to the altered visual stimulus conditions of 0-g. Space motion sickness may occur until such an adaptation is achieved.

  6. Reconstructing the behavior of walking fruit flies

    NASA Astrophysics Data System (ADS)

    Berman, Gordon; Bialek, William; Shaevitz, Joshua

    2010-03-01

    Over the past century, the fruit fly Drosophila melanogaster has arisen as almost a lingua franca in the study of animal behavior, having been utilized to study questions in fields as diverse as sleep deprivation, aging, and drug abuse, amongst many others. Accordingly, much is known about what can be done to manipulate these organisms genetically, behaviorally, and physiologically. Most of the behavioral work on this system to this point has been experiments where the flies in question have been given a choice between some discrete set of pre-defined behaviors. Our aim, however, is simply to spend some time with a cadre of flies, using techniques from nonlinear dynamics, statistical physics, and machine learning in an attempt to reconstruct and gain understanding into their behavior. More specifically, we use a multi-camera set-up combined with a motion tracking stage in order to obtain long time-series of walking fruit flies moving about a glass plate. This experimental system serves as a test-bed for analytical, statistical, and computational techniques for studying animal behavior. In particular, we attempt to reconstruct the natural modes of behavior for a fruit fly through a data-driven approach in a manner inspired by recent work in C. elegans and cockroaches.

  7. Towards Guided Underwater Survey Using Light Visual Odometry

    NASA Astrophysics Data System (ADS)

    Nawaf, M. M.; Drap, P.; Royer, J. P.; Merad, D.; Saccone, M.

    2017-02-01

    A lightweight distributed visual odometry method adapted to an embedded hardware platform is proposed. The aim is to guide underwater surveys in real time. We rely on an image stream captured by a portable stereo rig attached to the embedded system. Captured images are analyzed on the fly to assess image quality in terms of sharpness and lightness, so that immediate actions can be taken accordingly. Images are then transferred over the network to another processing unit that computes the odometry. Relying on a standard ego-motion estimation approach, we speed up point matching between image quadruplets using a low-level point matching scheme based on the fast Harris operator and template matching that is invariant to illumination changes. We benefit from having the light source attached to the hardware platform to estimate an a priori rough depth belief following the law of light divergence over distance. The rough depth is used to limit the point correspondence search zone, as the zone depends linearly on disparity. A stochastic relative bundle adjustment is applied to minimize re-projection errors. The evaluation of the proposed method demonstrates the gain in computation time with respect to approaches that use more sophisticated feature descriptors. The built system opens promising areas for further development and integration of embedded computer vision techniques.
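
The depth-bounded correspondence search can be sketched with the standard stereo relation d = f·B/Z: bounding the rough depth belief bounds the disparity, shrinking the search zone. The focal length, baseline, and 30% depth uncertainty below are assumed values, not the paper's calibration.

```python
def disparity_window(z_rough, f_px=800.0, baseline_m=0.12, rel_err=0.3):
    """Turn a rough depth belief (here assumed to come from the light
    fall-off cue) into a disparity search interval using d = f * B / Z.
    Focal length, baseline and the 30% depth uncertainty are assumptions."""
    z_min = z_rough * (1.0 - rel_err)
    z_max = z_rough * (1.0 + rel_err)
    d_max = f_px * baseline_m / z_min  # nearer bound -> larger disparity
    d_min = f_px * baseline_m / z_max
    return d_min, d_max

d_min, d_max = disparity_window(2.0)  # rough depth belief of 2 m
print(round(d_min, 1), round(d_max, 1))
```

Here template matching only has to scan a window of roughly 32 pixels along the epipolar line instead of the full image row, which is where the computation-time gain comes from.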

  8. Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot

    PubMed Central

    Vanhoutte, Erik; Mafrica, Stefano; Ruffier, Franck; Bootsma, Reinoud J.; Serres, Julien

    2017-01-01

    For use in autonomous micro air vehicles, visual sensors must not only be small, lightweight and insensitive to light variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. Using an auto-adaptive bio-inspired Michaelis–Menten Auto-adaptive Pixel (M2APix) analog silicon retina, in this article, we present comparative tests of two optical flow calculation algorithms operating under lighting conditions from 6×10⁻⁷ to 1.6×10⁻² W·cm⁻² (i.e., from 0.2 to 12,000 lux for human vision). Contrast "time of travel" between two adjacent light-sensitive pixels was determined by thresholding and by cross-correlating the two pixels' signals, with measurement frequency up to 5 kHz for the 10 local motion sensors of the M2APix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding gave rise to a lower precision, especially due to a larger number of outliers at higher speeds. Compared to thresholding, cross-correlation also allowed for a higher rate of optical flow output (99 Hz and 1195 Hz, respectively) but required substantially more computational resources. PMID:28287484
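
The cross-correlation variant of the "time of travel" scheme can be sketched on synthetic pixel signals: delay the second pixel's signal, recover the delay from the correlation peak, and convert it to angular speed. The sampling rate matches the 5 kHz mentioned above, while the pixel pitch and edge shape are illustrative assumptions.

```python
import numpy as np

fs = 5000.0                  # sampling rate (Hz), matching the 5 kHz above
t = np.arange(0.0, 0.2, 1.0 / fs)
true_delay = 0.004           # contrast travel time between the two pixels (s)

def edge(tt):                # a passing contrast edge, modeled as a Gaussian
    return np.exp(-((tt - 0.05) / 0.01) ** 2)

p1 = edge(t)
p2 = edge(t - true_delay)    # the second pixel sees the edge 4 ms later

# "time of travel" = lag of the cross-correlation peak
lags = np.arange(-len(t) + 1, len(t))
xc = np.correlate(p2, p1, mode="full")
delay = lags[np.argmax(xc)] / fs

pixel_angle_deg = 4.0        # assumed angular pitch between the two pixels
flow = pixel_angle_deg / delay
print(delay, round(flow))    # roughly 0.004 s and 1000 deg/s
```

The thresholding alternative would instead timestamp each signal's crossing of a fixed level, which is cheaper but, as the abstract notes, more outlier-prone.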

  9. Effects of Sex and Gender on Adaptation to Space: Neurosensory Systems

    PubMed Central

    Cohen, Helen S.; Cerisano, Jody M.; Clayton, Janine A.; Cromwell, Ronita; Danielson, Richard W.; Hwang, Emma Y.; Tingen, Candace; Allen, John R.; Tomko, David L.

    2014-01-01

    Sex and gender differences have long been a research topic of interest, yet few studies have explored the specific differences in neurological responses between men and women during and after spaceflight. Knowledge in this field is limited due to the significant disproportion of sexes enrolled in the astronaut corps. Research indicates that general neurological and sensory differences exist between the sexes, such as those in laterality of amygdala activity, sensitivity and discrimination in vision processing, and neuronal cell death (apoptosis) pathways. In spaceflight, sex differences may include a higher incidence of entry and space motion sickness and of post-flight vestibular instability in female as opposed to male astronauts who flew on both short- and long-duration missions. Hearing and auditory function in crewmembers shows the expected hearing threshold differences between men and women, in which female astronauts exhibit better hearing thresholds. Longitudinal observations of hearing thresholds for crewmembers yield normal age-related decrements; however, no evidence of sex-related differences from spaceflight has been observed. The impact of sex and gender differences should be studied by making spaceflight accessible and flying more women into space. Only in this way will we know if increasingly longer-duration missions cause significantly different neurophysiological responses in men and women. PMID:25401941

  10. RoboLab and virtual environments

    NASA Technical Reports Server (NTRS)

    Giarratano, Joseph C.

    1994-01-01

    A useful adjunct to the manned space station would be a self-contained free-flying laboratory (RoboLab). This laboratory would have a robot operated under telepresence from the space station or ground. Long duration experiments aboard RoboLab could be performed by astronauts or scientists using telepresence to operate equipment and perform experiments. Operating the lab by telepresence would eliminate the need for life support such as food, water and air. The robot would be capable of motion in three dimensions, have binocular vision TV cameras, and two arms with manipulators to simulate hands. The robot would move along a two-dimensional grid and have a rotating, telescoping periscope section for extension in the third dimension. The remote operator would wear a virtual reality type headset to allow the superposition of computer displays over the real-time video of the lab. The operators would wear exoskeleton type arms to facilitate the movement of objects and equipment operation. The combination of video displays, motion, and the exoskeleton arms would provide a high degree of telepresence, especially for novice users such as scientists doing short-term experiments. The RoboLab could be resupplied and samples removed on other space shuttle flights. A self-contained RoboLab module would be designed to fit within the cargo bay of the space shuttle. Different modules could be designed for specific applications, i.e., crystal-growing, medicine, life sciences, chemistry, etc. This paper describes a RoboLab simulation using virtual reality (VR). VR provides an ideal simulation of telepresence before the actual robot and laboratory modules are constructed. The easy simulation of different telepresence designs will produce a highly optimum design before construction rather than the more expensive and time consuming hardware changes afterwards.

  11. Design and control of an embedded vision guided robotic fish with multiple control surfaces.

    PubMed

    Yu, Junzhi; Wang, Kai; Tan, Min; Zhang, Jianwei

    2014-01-01

    This paper focuses on the development and control issues of a self-propelled robotic fish with multiple artificial control surfaces and an embedded vision system. By virtue of the hybrid propulsion capability in the body plus the caudal fin and the complementary maneuverability in accessory fins, a synthesized propulsion scheme including a caudal fin, a pair of pectoral fins, and a pelvic fin is proposed. To achieve flexible yet stable motions in aquatic environments, a central pattern generator- (CPG-) based control method is employed. Meanwhile, a monocular underwater vision serves as sensory feedback that modifies the control parameters. The integration of the CPG-based motion control and the visual processing in an embedded microcontroller allows the robotic fish to navigate online. Aquatic tests demonstrate the efficacy of the proposed mechatronic design and swimming control methods. Particularly, a pelvic fin actuated sideward swimming gait was first implemented. It is also found that the speeds and maneuverability of the robotic fish with coordinated control surfaces were largely superior to that of the swimming robot propelled by a single control surface.
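
A CPG of the kind mentioned can be sketched as phase-coupled oscillators, one per control surface; each fin's oscillator is pulled toward the caudal oscillator's phase plus a fixed offset. The coupling law, gains, and phase biases below are invented for illustration, not the paper's controller.

```python
# One phase oscillator per control surface: caudal fin, left/right
# pectoral fins, pelvic fin. Coupling gain and phase biases are invented.
n = 4
freq = 1.0                           # shared swimming frequency (cycles/s)
phase_bias = [0.0, 0.5, 0.5, 0.25]   # target offsets vs. the caudal oscillator
coupling = 2.0
phases = [0.0, 0.1, 0.9, 0.3]        # arbitrary initial phases (cycles)
dt = 0.01

for _ in range(5000):
    new = []
    for i in range(n):
        # each oscillator is pulled toward the caudal phase plus its bias
        err = (phases[0] + phase_bias[i]) - phases[i]
        err -= round(err)            # wrap the error to [-0.5, 0.5] cycles
        new.append(phases[i] + dt * (freq + coupling * err))
    phases = new

rel = [(p - phases[0]) % 1.0 for p in phases]
print([round(r, 2) for r in rel])    # settles onto the programmed phase biases
```

The appeal of CPG control, as the abstract notes, is that sensory feedback (here, the underwater vision module) only needs to modulate a few slowly varying parameters such as `freq` or `phase_bias`, while the oscillators keep the fin motions smooth and coordinated.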

  12. Design and Control of an Embedded Vision Guided Robotic Fish with Multiple Control Surfaces

    PubMed Central

    Wang, Kai; Tan, Min; Zhang, Jianwei

    2014-01-01

    This paper focuses on the development and control issues of a self-propelled robotic fish with multiple artificial control surfaces and an embedded vision system. By virtue of the hybrid propulsion capability in the body plus the caudal fin and the complementary maneuverability in accessory fins, a synthesized propulsion scheme including a caudal fin, a pair of pectoral fins, and a pelvic fin is proposed. To achieve flexible yet stable motions in aquatic environments, a central pattern generator- (CPG-) based control method is employed. Meanwhile, a monocular underwater vision serves as sensory feedback that modifies the control parameters. The integration of the CPG-based motion control and the visual processing in an embedded microcontroller allows the robotic fish to navigate online. Aquatic tests demonstrate the efficacy of the proposed mechatronic design and swimming control methods. Particularly, a pelvic fin actuated sideward swimming gait was first implemented. It is also found that the speeds and maneuverability of the robotic fish with coordinated control surfaces were largely superior to that of the swimming robot propelled by a single control surface. PMID:24688413

  13. Time-Varying Expression of the Formation Flying along Circular Trajectories

    NASA Technical Reports Server (NTRS)

    Kawaguchi, Jun'ichiro

    2007-01-01

    Usually, the formation flying associated with circular orbits is discussed through the well-known Hill's or Clohessy-Wiltshire (C-W) equations of motion. This paper dares to present and discuss coordinates that may contain time-varying coefficients. The discussion presents how the controller's performance is affected by the selection of coordinates, and also looks at a special coordinate frame suitable for designating a target bin to which each spacecraft in the formation has only to be guided. It is revealed that the latter strategy may incorporate the J2 disturbance automatically.
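
For reference, the C-W equations mentioned above admit a closed-form in-plane solution, sketched here with illustrative numbers (a ~90-minute orbit and a 100 m radial offset); the classic condition vy0 = -2·n·x0 yields a bounded relative ellipse that repeats every orbit.

```python
import math

def cw_state(x0, y0, vx0, vy0, n, t):
    """Closed-form in-plane Clohessy-Wiltshire (Hill) solution about a
    circular reference orbit; x is radial, y along-track, n the chief's
    mean motion in rad/s."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4.0 - 3.0 * c) * x0 + (s / n) * vx0 + (2.0 / n) * (1.0 - c) * vy0
    y = (6.0 * (s - n * t)) * x0 + y0 - (2.0 / n) * (1.0 - c) * vx0 \
        + ((4.0 * s - 3.0 * n * t) / n) * vy0
    vx = 3.0 * n * s * x0 + c * vx0 + 2.0 * s * vy0
    vy = -6.0 * n * (1.0 - c) * x0 - 2.0 * s * vx0 + (4.0 * c - 3.0) * vy0
    return x, y, vx, vy

period = 5400.0                 # an illustrative ~90-minute orbit
n = 2.0 * math.pi / period      # chief's mean motion (rad/s)
x0 = 100.0                      # 100 m radial offset
vy0 = -2.0 * n * x0             # the classic bounded-ellipse condition
state = cw_state(x0, 0.0, 0.0, vy0, n, period)
print(state)                    # back to the initial state after one period
```

The paper's time-varying coordinates aim beyond this constant-coefficient circular-orbit case, but the closed-form solution is the baseline any such formulation must reduce to.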

  14. Flying over decades

    NASA Astrophysics Data System (ADS)

    Hoeller, Judith; Issler, Mena; Imamoglu, Atac

    Lévy flights have been extensively used in the past three decades to describe non-Brownian motion of particles. In this presentation I give an overview of how Lévy flights have been used across several disciplines, ranging from biology to finance to physics. In our publication we describe how a single electron spin 'flies' when captured in a quantum dot, using the central spin model. Finally, I motivate the use of Lévy flights for the description of anomalous diffusion in modern experiments, concretely the lifetimes of quasi-particles in Josephson junctions. Finished PhD at ETH in Spring 2015.

  15. Catching What We Can't See: Manual Interception of Occluded Fly-Ball Trajectories

    PubMed Central

    Bosco, Gianfranco; Delle Monache, Sergio; Lacquaniti, Francesco

    2012-01-01

    Control of interceptive actions may involve fine interplay between feedback-based and predictive mechanisms. These processes rely heavily on target motion information available when the target is visible. However, short-term visual memory signals as well as implicit knowledge about the environment may also contribute to elaborate a predictive representation of the target trajectory, especially when visual feedback is partially unavailable because other objects occlude the visual target. To determine how different processes and information sources are integrated in the control of the interceptive action, we manipulated a computer-generated visual environment representing a baseball game. Twenty-four subjects intercepted fly-ball trajectories by moving a mouse cursor and by indicating the interception with a button press. In two separate sessions, fly-ball trajectories were either fully visible or occluded for 750, 1000 or 1250 ms before ball landing. Natural ball motion was perturbed during the descending trajectory with effects of either weightlessness (0 g) or increased gravity (2 g) at times such that, for occluded trajectories, 500 ms of perturbed motion were visible before ball disappearance. To examine the contribution of previous visual experience with the perturbed trajectories to the interception of invisible targets, the order of visible and occluded sessions was permuted among subjects. Under these experimental conditions, we showed that, with fully visible targets, subjects combined servo-control and predictive strategies. Instead, when intercepting occluded targets, subjects relied mostly on predictive mechanisms based, however, on different types of information depending on previous visual experience. In fact, subjects without prior experience of the perturbed trajectories showed interceptive errors consistent with predictive estimates of the ball trajectory based on a priori knowledge of gravity. 
Conversely, the interceptive responses of subjects previously exposed to fully visible trajectories were compatible with the fact that implicit knowledge of the perturbed motion was also taken into account for the extrapolation of occluded trajectories. PMID:23166653
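
The a priori gravity prediction can be illustrated with a toy extrapolation of an occluded trajectory: from the last visible state, a 1 g internal model predicts the remaining flight time, and that prediction is wrong when the display actually switched to 2 g behind the occluder. The ball state and numbers below are invented for illustration, not taken from the study.

```python
import math

g = 9.81
y0, vy0 = 12.0, -3.0   # height (m) and vertical velocity (m/s) at occlusion

def time_to_land(y0, vy0, a):
    """Positive root of y0 + vy0*t - 0.5*a*t**2 = 0 for downward accel a."""
    return (vy0 + math.sqrt(vy0 ** 2 + 2.0 * a * y0)) / a

t_1g = time_to_land(y0, vy0, g)        # internal-model (1 g) prediction
t_2g = time_to_land(y0, vy0, 2.0 * g)  # actual landing if gravity doubled
print(round(t_1g, 2), round(t_2g, 2))  # the 1 g prior overestimates flight time
```

A timing error of this size (a few hundred milliseconds here) is the kind of systematic bias the abstract attributes to extrapolation from a priori knowledge of gravity.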

  16. Catching what we can't see: manual interception of occluded fly-ball trajectories.

    PubMed

    Bosco, Gianfranco; Delle Monache, Sergio; Lacquaniti, Francesco

    2012-01-01

    Control of interceptive actions may involve fine interplay between feedback-based and predictive mechanisms. These processes rely heavily on target motion information available when the target is visible. However, short-term visual memory signals as well as implicit knowledge about the environment may also contribute to elaborate a predictive representation of the target trajectory, especially when visual feedback is partially unavailable because other objects occlude the visual target. To determine how different processes and information sources are integrated in the control of the interceptive action, we manipulated a computer-generated visual environment representing a baseball game. Twenty-four subjects intercepted fly-ball trajectories by moving a mouse cursor and by indicating the interception with a button press. In two separate sessions, fly-ball trajectories were either fully visible or occluded for 750, 1000 or 1250 ms before ball landing. Natural ball motion was perturbed during the descending trajectory with effects of either weightlessness (0 g) or increased gravity (2 g) at times such that, for occluded trajectories, 500 ms of perturbed motion were visible before ball disappearance. To examine the contribution of previous visual experience with the perturbed trajectories to the interception of invisible targets, the order of visible and occluded sessions was permuted among subjects. Under these experimental conditions, we showed that, with fully visible targets, subjects combined servo-control and predictive strategies. Instead, when intercepting occluded targets, subjects relied mostly on predictive mechanisms based, however, on different types of information depending on previous visual experience. In fact, subjects without prior experience of the perturbed trajectories showed interceptive errors consistent with predictive estimates of the ball trajectory based on a priori knowledge of gravity. 
Conversely, the interceptive responses of subjects previously exposed to fully visible trajectories were compatible with the fact that implicit knowledge of the perturbed motion was also taken into account for the extrapolation of occluded trajectories.

  17. A Feasibility Study of View-independent Gait Identification

    DTIC Science & Technology

    2012-03-01

ice skates. For walking, the footprint records for single pixels form clusters that are well separated in space and time. (Any overlap of contact...Pattern Recognition 2007, 1-8. Cheng M-H, Ho M-F & Huang C-L (2008), "Gait Analysis for Human Identification Through Manifold Learning and HMM... Learning and Cybernetics 2005, 4516-4521. Moeslund T B & Granum E (2001), "A Survey of Computer Vision-Based Human Motion Capture", Computer Vision

  18. Binocular Vision-Based Position and Pose of Hand Detection and Tracking in Space

    NASA Astrophysics Data System (ADS)

    Jun, Chen; Wenjun, Hou; Qing, Sheng

Building on image segmentation, the CamShift target-tracking algorithm, and a stereo vision model of space, an improved algorithm based on frame differencing and a new spatial point-positioning model are proposed, and a binocular visual motion-tracking system was constructed to verify them. This solves the problem of detecting and tracking the spatial position and pose of the hand.
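The frame-differencing step underlying such motion tracking can be sketched in a few lines. Below is a minimal NumPy illustration on synthetic frames; the CamShift refinement and stereo point positioning from the record are omitted, and all names and values here are illustrative, not the authors' implementation:

```python
import numpy as np

def frame_difference_mask(prev, curr, thresh=25):
    """Binary motion mask from the absolute difference of two frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def motion_centroid(mask):
    """Centroid (row, col) of moving pixels, or None if nothing moved."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic example: a bright 10x10 "hand" patch moves 5 px to the right.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = np.zeros((100, 100), dtype=np.uint8)
prev[40:50, 40:50] = 200
curr[40:50, 45:55] = 200

mask = frame_difference_mask(prev, curr)
print(motion_centroid(mask))  # centroid of the changed pixels
```

In a full pipeline the mask would seed a CamShift search window per camera, and the matched 2-D positions would be triangulated into a 3-D point.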

  19. Evaluation of an organic light-emitting diode display for precise visual stimulation.

    PubMed

    Ito, Hiroyuki; Ogawa, Masaki; Sunaga, Shoji

    2013-06-11

    A new type of visual display for presentation of a visual stimulus with high quality was assessed. The characteristics of an organic light-emitting diode (OLED) display (Sony PVM-2541, 24.5 in.; Sony Corporation, Tokyo, Japan) were measured in detail from the viewpoint of its applicability to visual psychophysics. We found the new display to be superior to other display types in terms of spatial uniformity, color gamut, and contrast ratio. Changes in the intensity of luminance were sharper on the OLED display than those on a liquid crystal display. Therefore, such OLED displays could replace conventional cathode ray tube displays in vision research for high quality stimulus presentation. Benefits of using OLED displays in vision research were especially apparent in the fields of low-level vision, where precise control and description of the stimulus are needed, e.g., in mesopic or scotopic vision, color vision, and motion perception.

  20. Anisotropies in the perceived spatial displacement of motion-defined contours: opposite biases in the upper-left and lower-right visual quadrants.

    PubMed

    Fan, Zhao; Harris, John

    2010-10-12

    In a recent study (Fan, Z., & Harris, J. (2008). Perceived spatial displacement of motion-defined contours in peripheral vision. Vision Research, 48(28), 2793-2804), we demonstrated that virtual contours defined by two regions of dots moving in opposite directions were displaced perceptually in the direction of motion of the dots in the more eccentric region when the contours were viewed in the right visual field. Here, we show that the magnitude and/or direction of these displacements varies in different quadrants of the visual field. When contours were presented in the lower visual field, the direction of perceived contour displacement was consistent with that when both contours were presented in the right visual field. However, this illusory motion-induced spatial displacement disappeared when both contours were presented in the upper visual field. Also, perceived contour displacement in the direction of the more eccentric dots was larger in the right than in the left visual field, perhaps because of a hemispheric asymmetry in attentional allocation. Quadrant-based analyses suggest that the pattern of results arises from opposite directions of perceived contour displacement in the upper-left and lower-right visual quadrants, which depend on the relative strengths of two effects: a greater sensitivity to centripetal motion, and an asymmetry in the allocation of spatial attention. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. The Experience of Force: The Role of Haptic Experience of Forces in Visual Perception of Object Motion and Interactions, Mental Simulation, and Motion-Related Judgments

    ERIC Educational Resources Information Center

    White, Peter A.

    2012-01-01

    Forces are experienced in actions on objects. The mechanoreceptor system is stimulated by proximal forces in interactions with objects, and experiences of force occur in a context of information yielded by other sensory modalities, principally vision. These experiences are registered and stored as episodic traces in the brain. These stored…

  2. 3D Data Acquisition Platform for Human Activity Understanding

    DTIC Science & Technology

    2016-03-02

3D data. The support for the acquisition of such research instrumentation has significantly facilitated our current and future research and educate ... In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross validate...multimodality data acquisition, and address fundamental research problems of representation and invariant description of 3D data, human motion modeling and

  3. Drosophila learn efficient paths to a food source.

    PubMed

    Navawongse, Rapeechai; Choudhury, Deepak; Raczkowska, Marlena; Stewart, James Charles; Lim, Terrence; Rahman, Mashiur; Toh, Alicia Guek Geok; Wang, Zhiping; Claridge-Chang, Adam

    2016-05-01

Elucidating the genetic and neuronal bases for learned behavior is a central problem in neuroscience. A leading system for neurogenetic discovery is the vinegar fly Drosophila melanogaster; fly memory research has identified genes and circuits that mediate aversive and appetitive learning. However, methods to study adaptive food-seeking behavior in this animal have lagged decades behind rodent feeding analysis, largely due to the challenges presented by their small scale. There is currently no method to dynamically control flies' access to food. In rodents, protocols that use dynamic food delivery are a central element of experimental paradigms that date back to the influential work of Skinner. This method is still commonly used in the analysis of learning, memory, addiction, feeding, and many other subjects in experimental psychology. The difficulty of microscale food delivery means this is not a technique used in fly behavior. In the present manuscript we describe a microfluidic chip integrated with machine vision and automation to dynamically control defined liquid food presentations and sensory stimuli. Strikingly, repeated presentations of food at a fixed location produced improvements in path efficiency during food approach. This shows that improved path choice is a learned behavior. Active control of food availability using this microfluidic system is a valuable addition to the methods currently available for the analysis of learned feeding behavior in flies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Ommatidia of blow fly, house fly, and flesh fly: implication of their vision efficiency.

    PubMed

    Sukontason, Kabkaew L; Chaiwong, Tarinee; Piangjai, Somsak; Upakut, Sorawit; Moophayak, Kittikhun; Sukontason, Kom

    2008-06-01

This work aims to elucidate the number of ommatidia or facets (the outwardly visible units of each ommatidium) for compound eyes in blow flies [Chrysomya megacephala (F.), Chrysomya rufifacies (Macquart), Chrysomya nigripes (Aubertin), Lucilia cuprina (Wiedemann)], house flies (Musca domestica L.), and flesh flies (Liosarcophaga dux Thomson) by manual counts of the corneal spreads. The head of the fly in each species was soaked in 20% potassium hydroxide solution at room temperature for 7 days, and the clear compound eye was dissected into six small parts, each of which was placed onto a slide and flattened using a coverslip. Images of each part were obtained using a microscope connected to a computer. The printed images of each part were magnified, and the total number of ommatidia per eye was manually counted. For males, the mean number of ommatidia was statistically different among all flies examined: L. dux (6,032) > C. rufifacies (5,356) > C. nigripes (4,798) > C. megacephala (4,376) > L. cuprina (3,665) > M. domestica (3,484). Likewise, the mean number of facets in females was statistically different: L. dux (6,086) > C. megacephala (5,641) > C. rufifacies (5,208) > C. nigripes (4,774) > L. cuprina (3,608) > M. domestica (3,433). Scanning electron microscopy analysis of adult flies revealed sexual dimorphism in the compound eye. Male C. megacephala had large ommatidia in the upper two-thirds and small ommatidia in the lower one-third, whereas only small ommatidia were detected in females. A densely pustulate appearance was detected on the external surface of the corneal lens of the ommatidia of C. megacephala, C. rufifacies, and C. nigripes, while a mix of densely pustulate appearance and variable groove array length was detected in L. cuprina and M. domestica. The probable functions of the ommatidia are discussed with reference to other literature.

  5. Vision-based calibration of parallax barrier displays

    NASA Astrophysics Data System (ADS)

    Ranieri, Nicola; Gross, Markus

    2014-03-01

Static and dynamic parallax barrier displays have become very popular in recent years. Especially for single-viewer applications such as tablets, phones, and other hand-held devices, parallax barriers provide a convenient solution for rendering stereoscopic content. In our work we present a computer-vision-based calibration approach that relates the image layer and barrier layer of parallax barrier displays with unknown display geometry, for static or dynamic viewer positions, using homographies. We provide the math and methods to compose the required homographies on the fly and present a way to compute the barrier without the need for any iteration. Our GPU implementation is stable and general and can be used to reduce latency and increase the refresh rate of existing and upcoming barrier methods.
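The on-the-fly homography composition described here rests on a standard fact of projective geometry: chaining two homographies is the same as multiplying their 3x3 matrices once up front. A minimal NumPy sketch with made-up calibration matrices (in the actual system these would come from the vision-based calibration, not hand-written values):

```python
import numpy as np

def apply_homography(H, pts):
    """Map an Nx2 array of points through a 3x3 homography."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # lift to homogeneous
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # dehomogenize

# Hypothetical two-stage calibration: image layer -> screen plane -> barrier layer.
H_img_to_screen = np.array([[1.02, 0.01,  3.0],
                            [0.00, 0.98, -2.0],
                            [0.00, 0.00,  1.0]])
H_screen_to_barrier = np.array([[0.99, 0.00, 1.5],
                                [0.01, 1.01, 0.5],
                                [0.00, 0.00, 1.0]])

# Compose once; the combined map can then be applied per frame at low cost.
H_direct = H_screen_to_barrier @ H_img_to_screen

corners = np.array([[0.0, 0.0], [1920.0, 0.0], [1920.0, 1080.0], [0.0, 1080.0]])
via_two = apply_homography(H_screen_to_barrier,
                           apply_homography(H_img_to_screen, corners))
via_one = apply_homography(H_direct, corners)
print(np.allclose(via_two, via_one))  # True: composing matrices == chaining maps
```

Precomposing the chain is what makes iteration-free, per-frame barrier computation cheap, since only one matrix product per point remains at run time.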

  6. Head-Mounted Display Technology for Low Vision Rehabilitation and Vision Enhancement

    PubMed Central

    Ehrlich, Joshua R.; Ojeda, Lauro V.; Wicker, Donna; Day, Sherry; Howson, Ashley; Lakshminarayanan, Vasudevan; Moroi, Sayoko E.

    2017-01-01

Purpose To describe the various types of head-mounted display technology, their optical and human factors considerations, and their potential for use in low vision rehabilitation and vision enhancement. Design Expert perspective. Methods An overview of head-mounted display technology by an interdisciplinary team of experts drawing on key literature in the field. Results Head-mounted display technologies can be classified based on their display type and optical design. See-through displays such as retinal projection devices have the greatest potential for use as low vision aids. Devices vary in their relationship to the user’s eyes, field of view, illumination, resolution, color, stereopsis, effect on head motion, and user interface. These optical and human factors considerations are important when selecting head-mounted displays for specific applications and patient groups. Conclusions Head-mounted display technologies may offer advantages over conventional low vision aids. Future research should compare head-mounted displays with commonly prescribed low vision aids in order to assess their effectiveness in addressing the impairments and rehabilitation goals of diverse patient populations. PMID:28048975

  7. Micromotor-based on-off fluorescence detection of sarin and soman simulants.

    PubMed

    Singh, Virendra V; Kaufmann, Kevin; Orozco, Jahir; Li, Jinxing; Galarnyk, Michael; Arya, Gaurav; Wang, Joseph

    2015-06-30

    Self-propelled micromotor-based fluorescent "On-Off" detection of nerve agents is described. The motion-based assay utilizes Si/Pt Janus micromotors coated with fluoresceinamine toward real-time "on-the-fly" field detection of sarin and soman simulants.

  8. Head movements in low and high gravitoinertial force environments elicit motion sickness - Implications for space motion sickness

    NASA Technical Reports Server (NTRS)

    Lackner, James R.; Graybiel, Ashton

    1987-01-01

    Astronauts report that head movements in flight tend to bring on symptoms of space motion sickness (SMS). The effects of head movements in pitch, yaw, and roll (made both with normal vision and with eyes occluded) on susceptibility to motion sickness in the zero G phase of parabolic flight maneuvers were evaluated. The findings are clear-cut: pitch head movements are most provocative, yaw least provocative, and roll intermediate. These experiments suggest that SMS is not a unique nosological entity, but is the consequence of exposure to nonterrestrial force levels. Head movements during departures in either direction from 1 G elicit symptoms.

  9. Cerebral palsy characterization by estimating ocular motion

    NASA Astrophysics Data System (ADS)

    González, Jully; Atehortúa, Angélica; Moncayo, Ricardo; Romero, Eduardo

    2017-11-01

Cerebral palsy (CP) comprises a large group of motion and posture disorders caused during fetal or infant brain development. Sensory impairment is common in children with CP; approximately 40-75 percent present some form of vision problem or disability. An automatic characterization of cerebral palsy is presented here, based on estimating ocular motion during a gaze-pursuit task. Specifically, after automatically detecting the eye location, an optical flow algorithm tracks the eye motion following a pre-established visual assignment. Subsequently, the optical flow trajectories are characterized in the velocity-acceleration phase plane. Differences are quantified in a small set of patients aged four to ten years.
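The velocity-acceleration phase-plane characterization can be illustrated with finite differences on a tracked position trace. This sketch assumes a synthetic sinusoidal gaze signal sampled at 60 Hz; the optical-flow tracking front end from the record is not reproduced, and the feature shown is only one plausible phase-plane summary:

```python
import numpy as np

def phase_plane(positions, dt):
    """Velocity and acceleration traces from a 1-D gaze trajectory,
    via central finite differences (np.gradient)."""
    velocity = np.gradient(positions, dt)
    acceleration = np.gradient(velocity, dt)
    return velocity, acceleration

# Synthetic smooth-pursuit trace: 0.5 Hz sinusoid, 10 deg amplitude, 60 Hz sampling.
dt = 1.0 / 60.0
t = np.arange(0.0, 2.0, dt)
x = 10.0 * np.sin(2.0 * np.pi * 0.5 * t)   # gaze position, degrees

v, a = phase_plane(x, dt)

# A simple phase-plane feature one might compare across patient groups:
peak_speed = float(np.max(np.abs(v)))      # ~10*pi deg/s for this signal
print(round(peak_speed, 1))
```

Each (v, a) pair is one point in the phase plane; pursuit smoothness differences between groups would appear as differently shaped point clouds there.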

  10. Flagellar Cap Protein FliD Mediates Adherence of Atypical Enteropathogenic Escherichia coli to Enterocyte Microvilli

    PubMed Central

    Sampaio, Suely C. F.; Luiz, Wilson B.; Vieira, Mônica A. M.; Ferreira, Rita C. C.; Garcia, Bruna G.; Sinigaglia-Coimbra, Rita; Sampaio, Jorge L. M.; Ferreira, Luís C. S.

    2016-01-01

    The expression of flagella correlates with different aspects of bacterial pathogenicity, ranging from adherence to host cells to activation of inflammatory responses by the innate immune system. In the present study, we investigated the role of flagella in the adherence of an atypical enteropathogenic Escherichia coli (aEPEC) strain (serotype O51:H40) to human enterocytes. Accordingly, isogenic mutants deficient in flagellin (FliC), the flagellar structural subunit; the flagellar cap protein (FliD); or the MotAB proteins, involved in the control of flagellar motion, were generated and tested for binding to differentiated Caco-2 cells. Binding of the aEPEC strain to enterocytes was significantly impaired in strains with the fliC and fliD genes deleted, both of which could not form flagella on the bacterial surface. A nonmotile but flagellated MotAB mutant also showed impaired adhesion to Caco-2 cells. In accordance with these observations, adhesion of aEPEC strain 1711-4 to Caco-2 cells was drastically reduced after the treatment of Caco-2 cells with purified FliD. In addition, incubation of aEPEC bacteria with specific anti-FliD serum impaired binding to Caco-2 cells. Finally, incubation of Caco-2 cells with purified FliD, followed by immunolabeling, showed that the protein was specifically bound to the microvillus tips of differentiated Caco-2 cells. The aEPEC FliD or anti-FliD serum also reduced the adherence of prototype typical enteropathogenic, enterohemorrhagic, and enterotoxigenic E. coli strains to Caco-2 cells. In conclusion, our findings further strengthened the role of flagella in the adherence of aEPEC to human enterocytes and disclosed the relevant structural and functional involvement of FliD in the adhesion process. PMID:26831466

  11. Audiovisual associations alter the perception of low-level visual motion

    PubMed Central

    Kafaligonul, Hulusi; Oluk, Can

    2015-01-01

Motion perception is a pervasive aspect of vision and is affected both by the immediate pattern of sensory inputs and by prior experiences acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations and their subsequent influences on visual motion perception depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random dot motions isolating low-level pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level attention-based motion system and that early-level visual motion processing plays some role. PMID:25873869

  12. Vision technology/algorithms for space robotics applications

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar; Defigueiredo, Rui J. P.

    1987-01-01

The thrust of automation and robotics for space applications has been motivated by increased productivity, improved reliability, increased flexibility, higher safety, the automation of time-consuming tasks, increased productivity and performance of crew-accomplished tasks, and the performance of tasks beyond the capability of the crew. This paper reviews efforts currently in progress in the area of robotic vision. Both systems and algorithms are discussed. The evolution of future vision/sensing is projected to include the fusion of multisensors ranging from microwave to optical, with multimode capability covering position, attitude, recognition, and motion parameters. The key features of the overall system design will be small size and weight, fast signal processing, robust algorithms, and accurate parameter determination. These aspects of vision/sensing are also discussed.

  13. Bumblebees minimize control challenges by combining active and passive modes in unsteady winds

    NASA Astrophysics Data System (ADS)

    Ravi, Sridhar; Kolomenskiy, Dmitry; Engels, Thomas; Schneider, Kai; Wang, Chun; Sesterhenn, Jörn; Liu, Hao

    2016-10-01

    The natural wind environment that volant insects encounter is unsteady and highly complex, posing significant flight-control and stability challenges. It is critical to understand the strategies insects employ to safely navigate in natural environments. We combined experiments on free flying bumblebees with high-fidelity numerical simulations and lower-order modeling to identify the mechanics that mediate insect flight in unsteady winds. We trained bumblebees to fly upwind towards an artificial flower in a wind tunnel under steady wind and in a von Kármán street formed in the wake of a cylinder. Analysis revealed that at lower frequencies in both steady and unsteady winds the bees mediated lateral movement with body roll - typical casting motion. Numerical simulations of a bumblebee in similar conditions permitted the separation of the passive and active components of the flight trajectories. Consequently, we derived simple mathematical models that describe these two motion components. Comparison between the free-flying live and modeled bees revealed a novel mechanism that enables bees to passively ride out high-frequency perturbations while performing active maneuvers at lower frequencies. The capacity of maintaining stability by combining passive and active modes at different timescales provides a viable means for animals and machines to tackle the challenges posed by complex airflows.

  14. Reviews Book: Marie Curie: A Biography Book: Fast Car Physics Book: Beautiful Invisible Equipment: Fun Fly Stick Science Kit Book: Quantum Theory Cannot Hurt You Book: Chaos: The Science of Predictable Random Motion Book: Seven Wonders of the Universe Book: Special Relativity Equipment: LabVIEWTM 2009 Education Edition Places to Visit: Edison and Ford Winter Estates Places to Visit: The Computer History Museum Web Watch

    NASA Astrophysics Data System (ADS)

    2011-07-01

    WE RECOMMEND Fun Fly Stick Science Kit Fun fly stick introduces electrostatics to youngsters Special Relativity Text makes a useful addition to the study of relativity as an undergraduate LabVIEWTM 2009 Education Edition LabVIEW sets industry standard for gathering and analysing data, signal processing, instrumentation design and control, and automation and robotics Edison and Ford Winter Estates Thomas Edison's home is open to the public The Computer History Museum Take a walk through technology history at this computer museum WORTH A LOOK Fast Car Physics Book races through physics Beautiful Invisible The main subject of this book is theoretical physics Quantum Theory Cannot Hurt You A guide to physics on the large and small scale Chaos: The Science of Predictable Random Motion Book explores the mathematics behind chaotic behaviour Seven Wonders of the Universe A textual trip through the wonderful universe HANDLE WITH CARE Marie Curie: A Biography Book fails to capture Curie's science WEB WATCH Web clips to liven up science lessons

  15. Floral colours in a world without birds and bees: the plants of Macquarie Island.

    PubMed

    Shrestha, M; Lunau, K; Dorin, A; Schulze, B; Bischoff, M; Burd, M; Dyer, A G

    2016-09-01

We studied biotically pollinated angiosperms on Macquarie Island, a remote site in the Southern Ocean with a predominantly or exclusively dipteran pollinator fauna, in an effort to understand how flower colour affects community assembly. We compared a distinctive group of cream-green Macquarie Island flowers to the flora of likely source pools of immigrants and to a continental flora from a high latitude in the northern hemisphere. We used both dipteran and hymenopteran colour models and phylogenetically informed analyses to explore the chromatic component of community assembly. The species with cream-green flowers are very restricted in colour space models of both fly vision and bee vision and represent a distinct group that plays a very minor role in other communities. It is unlikely that such a community could form through random immigration from continental source pools. Our findings suggest that fly pollination has imposed a strong ecological filter on Macquarie Island, favouring floral colours that are rare in continental floras. This is one of the strongest demonstrations that plant-pollinator interactions play an important role in plant community assembly. Future work exploring colour choices by dipteran flower visitors would be valuable. © 2016 German Botanical Society and The Royal Botanical Society of the Netherlands.

  16. Ventral polarization vision in tabanids: horseflies and deerflies (Diptera: Tabanidae) are attracted to horizontally polarized light

    NASA Astrophysics Data System (ADS)

    Horváth, Gábor; Majer, József; Horváth, Loránd; Szivák, Ildikó; Kriska, György

    2008-11-01

    Adult tabanid flies (horseflies and deerflies) are terrestrial and lay their eggs onto marsh plants near bodies of fresh water because the larvae develop in water or mud. To know how tabanids locate their host animals, terrestrial rendezvous sites and egg-laying places would be very useful for control measures against them, because the hematophagous females are primary/secondary vectors of some severe animal/human diseases/parasites. Thus, in choice experiments performed in the field we studied the behavior of tabanids governed by linearly polarized light. We present here evidence for positive polarotaxis, i.e., attraction to horizontally polarized light stimulating the ventral eye region, in both males and females of 27 tabanid species. The novelty of our findings is that positive polarotaxis has been described earlier only in connection with the water detection of some aquatic insects ovipositing directly into water. A further particularity of our discovery is that in the order Diptera and among blood-sucking insects the studied tabanids are the first known species possessing ventral polarization vision and definite polarization-sensitive behavior with known functions. The polarotaxis in tabanid flies makes it possible to develop new optically luring traps being more efficient than the existing ones based on the attraction of tabanids by the intensity and/or color of reflected light.

  17. Ventral polarization vision in tabanids: horseflies and deerflies (Diptera: Tabanidae) are attracted to horizontally polarized light.

    PubMed

    Horváth, Gábor; Majer, József; Horváth, Loránd; Szivák, Ildikó; Kriska, György

    2008-11-01

    Adult tabanid flies (horseflies and deerflies) are terrestrial and lay their eggs onto marsh plants near bodies of fresh water because the larvae develop in water or mud. To know how tabanids locate their host animals, terrestrial rendezvous sites and egg-laying places would be very useful for control measures against them, because the hematophagous females are primary/secondary vectors of some severe animal/human diseases/parasites. Thus, in choice experiments performed in the field we studied the behavior of tabanids governed by linearly polarized light. We present here evidence for positive polarotaxis, i.e., attraction to horizontally polarized light stimulating the ventral eye region, in both males and females of 27 tabanid species. The novelty of our findings is that positive polarotaxis has been described earlier only in connection with the water detection of some aquatic insects ovipositing directly into water. A further particularity of our discovery is that in the order Diptera and among blood-sucking insects the studied tabanids are the first known species possessing ventral polarization vision and definite polarization-sensitive behavior with known functions. The polarotaxis in tabanid flies makes it possible to develop new optically luring traps being more efficient than the existing ones based on the attraction of tabanids by the intensity and/or color of reflected light.

  18. Machine vision for digital microfluidics

    NASA Astrophysics Data System (ADS)

    Shin, Yong-Jun; Lee, Jeong-Bong

    2010-01-01

    Machine vision is widely used in an industrial environment today. It can perform various tasks, such as inspecting and controlling production processes, that may require humanlike intelligence. The importance of imaging technology for biological research or medical diagnosis is greater than ever. For example, fluorescent reporter imaging enables scientists to study the dynamics of gene networks with high spatial and temporal resolution. Such high-throughput imaging is increasingly demanding the use of machine vision for real-time analysis and control. Digital microfluidics is a relatively new technology with expectations of becoming a true lab-on-a-chip platform. Utilizing digital microfluidics, only small amounts of biological samples are required and the experimental procedures can be automatically controlled. There is a strong need for the development of a digital microfluidics system integrated with machine vision for innovative biological research today. In this paper, we show how machine vision can be applied to digital microfluidics by demonstrating two applications: machine vision-based measurement of the kinetics of biomolecular interactions and machine vision-based droplet motion control. It is expected that digital microfluidics-based machine vision system will add intelligence and automation to high-throughput biological imaging in the future.

  19. Using flight simulators aboard ships: human side effects of an optimal scenario with smooth seas.

    PubMed

    Muth, Eric R; Lawson, Ben

    2003-05-01

    The U.S. Navy is considering placing flight simulators aboard ships. It is known that certain types of flight simulators can elicit motion adaptation syndrome (MAS), and also that certain types of ship motion can cause MAS. The goal of this study was to determine if using a flight simulator during ship motion would cause MAS, even when the simulator stimulus and the ship motion were both very mild. All participants in this study completed three conditions. Condition 1 (Sim) entailed "flying" a personal computer-based flight simulator situated on land. Condition 2 (Ship) involved riding aboard a U.S. Navy Yard Patrol boat. Condition 3 (ShipSim) entailed "flying" a personal computer-based flight simulator while riding aboard a Yard Patrol boat. Before and after each condition, participants' balance and dynamic visual acuity were assessed. After each condition, participants filled out the Nausea Profile and the Simulator Sickness Questionnaire. Following exposure to a flight simulator aboard a ship, participants reported negligible symptoms of nausea and simulator sickness. However, participants exhibited a decrease in dynamic visual acuity after exposure to the flight simulator aboard ship (T[25] = 3.61, p < 0.05). Balance results were confounded by significant learning and, therefore, not interpretable. This study suggests that flight simulators can be used aboard ship. As a minimal safety precaution, these simulators should be used according to current safety practices for land-based simulators. Optimally, these simulators should be designed to minimize MAS, located near the ship's center of rotation and used when ship motion is not provocative.

  20. Implied motion language can influence visual spatial memory.

    PubMed

    Vinson, David W; Engelen, Jan; Zwaan, Rolf A; Matlock, Teenie; Dale, Rick

    2017-07-01

    How do language and vision interact? Specifically, what impact can language have on visual processing, especially related to spatial memory? What are typically considered errors in visual processing, such as remembering the location of an object to be farther along its motion trajectory than it actually is, can be explained as perceptual achievements that are driven by our ability to anticipate future events. In two experiments, we tested whether the prior presentation of motion language influences visual spatial memory in ways that afford greater perceptual prediction. Experiment 1 showed that motion language influenced judgments for the spatial memory of an object beyond the known effects of implied motion present in the image itself. Experiment 2 replicated this finding. Our findings support a theory of perception as prediction.

  1. Dense depth maps from correspondences derived from perceived motion

    NASA Astrophysics Data System (ADS)

    Kirby, Richard; Whitaker, Ross

    2017-01-01

    Many computer vision applications require finding corresponding points between images and using the corresponding points to estimate disparity. Today's correspondence finding algorithms primarily use image features or pixel intensities common between image pairs. Some 3-D computer vision applications, however, do not produce the desired results using correspondences derived from image features or pixel intensities. Two examples are the multimodal camera rig and the center region of a coaxial camera rig. We present an image correspondence finding technique that aligns pairs of image sequences using optical flow fields. The optical flow fields provide information about the structure and motion of the scene, which are not available in still images but can be used in image alignment. We apply the technique to a dual focal length stereo camera rig consisting of a visible light-infrared camera pair and to a coaxial camera rig. We test our method on real image sequences and compare our results with the state-of-the-art multimodal and structure from motion (SfM) algorithms. Our method produces more accurate depth and scene velocity reconstruction estimates than the state-of-the-art multimodal and SfM algorithms.
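Once correspondences derived from the flow fields yield a disparity estimate, depth follows from the standard pinhole-stereo relation Z = f·B/d. A minimal sketch of that final conversion with toy numbers (the flow-field matching itself, which is the paper's contribution, is not reproduced here; all values are illustrative):

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Pinhole-stereo depth map: Z = f * B / d, valid where d > 0.
    Pixels with zero disparity are reported as infinitely far away."""
    with np.errstate(divide="ignore"):
        return np.where(disparity > 0.0,
                        focal_px * baseline_m / disparity,
                        np.inf)

# Hypothetical rig: 800 px focal length, 12 cm baseline.
focal_px, baseline_m = 800.0, 0.12

# Toy dense disparity map (px), as might come from flow-field correspondences.
disparity = np.array([[ 8.0, 16.0],
                      [32.0,  0.0]])

depth = depth_from_disparity(disparity, focal_px, baseline_m)
print(depth)  # larger disparity -> closer surface; d = 0 -> no depth
```

The inverse relationship between disparity and depth is why correspondence errors matter most for distant scene points, where a fraction of a pixel shifts the depth estimate substantially.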

  2. Visions of our Planet's Atmosphere, Land and Oceans: NASA/NOAA Electronic Theater 2002

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Starr, David (Technical Monitor)

    2001-01-01

    The NASA/NOAA Electronic Theater presents Earth science observations and visualizations in a historical perspective. Fly in from outer space to the Olympic Medals Plaza, the new Gateway Center, and the University of Utah Stadium Site of the Olympic Opening and Closing Ceremonies in Salt Lake City. Fly in and through the Park City, and Snow Basin sites of the 2002 Winter Olympic Alpine Venues using 1 m IKONOS "Spy Satellite" data. See the four seasons of the Wasatch Front as observed by Landsat 7 at 15m resolution and watch the trees turn color in the Fall, snow come and go in the mountains and the reservoirs freeze and melt. Go back to the early weather satellite images from the 1960s and see them contrasted with the latest US and international global satellite weather movies including hurricanes & "tornadoes". See the latest visualizations of spectacular images from NASA/NOAA remote sensing missions like Terra, GOES, TRMM, SeaWiFS, Landsat 7 including new 1 - min GOES rapid scan image sequences of Nov 9th 2001 Midwest tornadic thunderstorms and have them explained. See how High-Definition Television (HDTV) is revolutionizing the way we communicate science. (In cooperation with the American Museum of Natural History in NYC) See dust storms in Africa and smoke plumes from fires in Mexico. See visualizations featured on the covers of Newsweek, TIME, National Geographic, Popular Science & on National & International Network TV. New computer software tools allow us to roam & zoom through massive global images e.g. Landsat tours of the US, and Africa, showing desert and mountain geology as well as seasonal changes in vegetation. See animations of the polar ice packs and the motion of gigantic Antarctic Icebergs from SeaWinds data. Spectacular new visualizations of the global atmosphere & oceans are shown. See vortexes and currents in the global oceans that bring up the nutrients to feed tiny algae and draw the fish, whales and fisherman. 
See how the ocean blooms in response to these currents and El Nino/La Nina climate changes. See the city lights, fishing fleets, gas flares and bio-mass burning of the Earth at night observed by the "night-vision" DMSP military satellite. The demonstration is interactively driven by an SGI Octane Graphics Supercomputer with two CPUs, 4 Gigabytes of RAM and 0.5 Terabyte of disk using two projectors across a super-sized panoramic 48-foot screen. In addition, new HDTV technology will be demonstrated from a portable computer server.

  3. Visions of our Planet's Atmosphere, Land and Oceans: NASA/NOAA Electronic Theater 2002

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Starr, David (Technical Monitor)

    2002-01-01

    The NASA/NOAA Electronic Theater presents Earth science observations and visualizations in a historical perspective. Fly in from outer space to the Olympic Medals Plaza, the new Gateway Center, and the University of Utah Stadium Site of the Olympic Opening and Closing Ceremonies in Salt Lake City. Fly in and through the Park City, and Snow Basin sites of the 2002 Winter Olympic Alpine Venues using 1 m IKONOS "Spy Satellite" data. See the four seasons of the Wasatch Front as observed by Landsat 7 at 15m resolution and watch the trees turn color in the Fall, snow come and go in the mountains and the reservoirs freeze and melt. Go back to the early weather satellite images from the 1960s and see them contrasted with the latest US and international global satellite weather movies including hurricanes & "tornadoes". See the latest visualizations of spectacular images from NASA/NOAA remote sensing missions like Terra, GOES, TRMM, SeaWiFS, Landsat 7 including new 1 - min GOES rapid scan image sequences of Nov 9th 2001 Midwest tornadic thunderstorms and have them explained. See how High-Definition Television (HDTV) is revolutionizing the way we communicate science. (In cooperation with the American Museum of Natural History in NYC) See dust storms in Africa and smoke plumes from fires in Mexico. See visualizations featured on the covers of Newsweek, TIME, National Geographic, Popular Science & on National & International Network TV. New computer software tools allow us to roam & zoom through massive global images e.g. Landsat tours of the US, and Africa, showing desert and mountain geology as well as seasonal changes in vegetation. See animations of the polar ice packs and the motion of gigantic Antarctic Icebergs from SeaWinds data. Spectacular new visualizations of the global atmosphere & oceans are shown. See vortexes and currents in the global oceans that bring up the nutrients to feed tiny algae and draw the fish, whales and fisherman. 
See how the ocean blooms in response to these currents and El Nino/La Nina climate changes. See the city lights, fishing fleets, gas flares and biomass burning of the Earth at night observed by the "night-vision" DMSP military satellite. The demonstration is interactively driven by an SGI Octane Graphics Supercomputer with two CPUs, 4 Gigabytes of RAM and 0.5 Terabyte of disk using two projectors across a super-sized panoramic 48-foot screen. In addition, new HDTV technology will be demonstrated from a portable computer server.

  4. Tracking and Predicting Fine Scale Sea Ice Motion by Constructing Super-Resolution Images and Fusing Multiple Satellite Sensors

    DTIC Science & Technology

    2013-09-30

    Tracking and Predicting Fine Scale Sea Ice Motion by Constructing Super-Resolution Images...limited, but potentially provide more detailed data. Initial assessments have been made on MODIS data in terms of its suitability. While clouds obscure...estimates. Data from Aqua, Terra, and Suomi NPP satellites were investigated. Aqua and Terra are older satellites that fly the MODIS instrument

  5. An experimental investigation of interaction between projectiles and flames

    NASA Astrophysics Data System (ADS)

    Baryshnikov, A. S.; Basargin, I. V.; Bobashev, S. V.; Monakhov, N. A.; Popov, P. A.; Sakharov, V. A.; Chistyakova, M. V.

    2015-12-01

    This investigation is devoted to the influence of a heated gas region on the stability of a model in supersonic motion during free flight. The conditions for maximum influence on the aerodynamics of a body flying through an inhomogeneous heated region are ascertained.

  6. Image-based ranging and guidance for rotorcraft

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.

    1991-01-01

    This report documents the research carried out under NASA Cooperative Agreement No. NCC2-575 during the period Oct. 1988 - Dec. 1991. Primary emphasis of this effort was on the development of vision-based navigation methods for the rotorcraft nap-of-the-earth flight regime. A family of field-based ranging algorithms was developed during this research period. These ranging schemes are capable of handling both stereo and motion image sequences, and permit both translational and rotational camera motion. The algorithms require minimal computational effort and appear to be implementable in real time. A series of papers was presented on these ranging schemes, some of which are included in this report. A small part of the research effort was expended on synthesizing a rotorcraft guidance law that directly uses the vision-based ranging data. This work is discussed in the last section.

  7. Visual Motion Perception and Visual Attentive Processes.

    DTIC Science & Technology

    1988-04-01

    88-0551 Visual Motion Perception and Visual Attentive Processes. George Sperling, New York University. Grant AFOSR 85-0364... Sperling. HIPS: A Unix-based image processing system. Computer Vision, Graphics, and Image Processing, 1984, 25, 331-347. HIPS is the Human Information Processing Laboratory's Image Processing System. 1985 van Santen, Jan P. H., and George Sperling. Elaborated Reichardt detectors. Journal of the Optical

  8. Dimensional coordinate measurements: application in characterizing cervical spine motion

    NASA Astrophysics Data System (ADS)

    Zheng, Weilong; Li, Linan; Wang, Shibin; Wang, Zhiyong; Shi, Nianke; Xue, Yuan

    2014-06-01

    The cervical spine is a complicated part of the human body, and its movement takes diverse forms. The movements of the segments of the vertebrae are three-dimensional, reflected in changes of the angle between joints and in displacements in different directions. Under normal conditions, the cervical spine can flex, extend, laterally flex and rotate. Because there is no relative motion between measuring marks fixed on one segment of a cervical vertebra, a vertebra with three marked points can be treated as a rigid body. A body's motion in space can be decomposed into translational movement and rotational movement around a base point. This study concerns the calculation of the three-dimensional coordinates of marked points affixed to the cervical spine by an optical method. These measurements then allow the calculation of motion parameters for every spine segment. For this study, we chose a three-dimensional measurement method based on binocular stereo vision. The object with marked points is placed in front of two CCD cameras, mounted in parallel. Each shot yields two parallax images taken from the different cameras, and three-dimensional measurement is realized according to the principle of binocular vision. This paper describes the layout of the experimental system and the mathematical model used to obtain the coordinates.
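    The binocular principle described above can be sketched in a few lines. For rectified, parallel cameras the disparity between the two images yields depth directly; the parameter names here are illustrative, not taken from the paper.

```python
import numpy as np

def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    # Parallel (rectified) cameras: disparity d = u_left - u_right,
    # depth Z = f * B / d, and X, Y follow from the pinhole model.
    d = u_left - u_right
    Z = focal_px * baseline_m / d
    X = (u_left - cx) * Z / focal_px
    Y = (v - cy) * Z / focal_px
    return np.array([X, Y, Z])
```

Repeating this for the three marks on a vertebra gives the rigid-body pose from which the translational and rotational components can be separated.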

  9. Behavioural evidence of colour vision in free flying stingless bees.

    PubMed

    Spaethe, J; Streinzer, M; Eckert, J; May, S; Dyer, A G

    2014-06-01

    Colour vision was first demonstrated with behavioural experiments in honeybees 100 years ago. Since that time a wealth of quality physiological data has shown a highly conserved set of trichromatic colour receptors in most bee species. Despite the subsequent wealth of behavioural research on honeybees and bumblebees, there currently is a relative dearth of data on stingless bees, which are the largest tribe of the eusocial bees, comprising more than 600 species. In our first experiment we tested Trigona cf. fuscipennis, a stingless bee species from Costa Rica, in a field setting using the von Frisch method and show functional colour vision. In a second experiment with these bees, we use a simultaneous colour discrimination test designed for honeybees to enable a comparative analysis of relative colour discrimination. In a third experiment, we test in laboratory conditions Tetragonula carbonaria, an Australian stingless bee species, using a similar simultaneous colour discrimination test. Both stingless bee species show relatively poorer colour discrimination compared to honeybees and bumblebees; and we discuss the value of being able to use these behavioural methods to efficiently extend our current knowledge of colour vision and discrimination in different bee species.

  10. The Effects of Synthetic and Enhanced Vision Technologies for Lunar Landings

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Norman, Robert M.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Shelton, Kevin J.; Williams, Steven P.

    2009-01-01

    Eight pilots participated as test subjects in a fixed-base simulation experiment to evaluate advanced vision display technologies such as Enhanced Vision (EV) and Synthetic Vision (SV) for providing terrain imagery on flight displays in a Lunar Lander Vehicle. Subjects were asked to fly 20 approaches to the Apollo 15 lunar landing site with four different display concepts - Baseline (symbology only with no terrain imagery), EV only (terrain imagery from Forward Looking Infrared, or FLIR, and Light Detection and Ranging, or LIDAR, sensors), SV only (terrain imagery from onboard database), and Fused EV and SV concepts. As expected, manual landing performance was excellent (within a meter of landing site center) and not affected by the inclusion of EV or SV terrain imagery on the Lunar Lander flight displays. Subjective ratings revealed significant situation awareness improvements with the concepts employing EV and/or SV terrain imagery compared to the Baseline condition that had no terrain imagery. In addition, display concepts employing EV imagery (compared to the SV and Baseline concepts which had none) were significantly better for pilot detection of intentional but unannounced navigation failures since this imagery provided an intuitive and obvious visual methodology to monitor the validity of the navigation solution.

  11. Image Motion Detection And Estimation: The Modified Spatio-Temporal Gradient Scheme

    NASA Astrophysics Data System (ADS)

    Hsin, Cheng-Ho; Inigo, Rafael M.

    1990-03-01

    The detection and estimation of motion are generally involved in computing a velocity field of time-varying images. A completely new modified spatio-temporal gradient scheme to determine motion is proposed. This is derived by using gradient methods and properties of biological vision. A set of general constraints is proposed to derive motion constraint equations. The constraints are that the second directional derivatives of image intensity at an edge point in the smoothed image will be constant at times t and t + Δt. This scheme basically has two stages: spatio-temporal filtering, and velocity estimation. Initially, image sequences are processed by a set of oriented spatio-temporal filters which are designed using a Gaussian derivative model. The velocity is then estimated for these filtered image sequences based on the gradient approach. From a computational standpoint, this scheme offers at least three advantages over current methods. The greatest advantage of the modified spatio-temporal gradient scheme over the traditional ones is that an infinite number of motion constraint equations are derived instead of only one. Therefore, it solves the aperture problem without requiring any additional assumptions and is simply a local process. The second advantage is that because of the spatio-temporal filtering, the direct computation of image gradients (discrete derivatives) is avoided. Therefore the error in gradients measurement is reduced significantly. The third advantage is that during the processing of the motion detection and estimation algorithm, image features (edges) are produced concurrently with motion information. The reliable range of detected velocity is determined by parameters of the oriented spatio-temporal filters. 
Knowing the velocity sensitivity of a single motion detection channel, a multiple-channel mechanism for estimating image velocity, seldom addressed by other motion schemes in machine vision, can be constructed by appropriately choosing and combining different sets of parameters. By applying this mechanism, a great range of velocity can be detected. The scheme has been tested for both synthetic and real images. The results of simulations are very satisfactory.
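    A toy one-dimensional rendering of such a gradient scheme, assuming a derivative-of-Gaussian spatial filter and a simple frame difference for the temporal gradient (these choices and names are illustrative, not the authors' filters):

```python
import numpy as np

def gaussian_derivative_kernel(sigma):
    # Derivative-of-Gaussian filter, in the spirit of the Gaussian
    # derivative model mentioned in the abstract.
    radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    return -x / sigma**2 * g

def estimate_velocity_1d(frame0, frame1, sigma=2.0):
    # Gradient constraint Ix * v + It = 0, solved in least squares
    # over all pixels of a 1-D signal.
    Ix = np.convolve(frame0, gaussian_derivative_kernel(sigma), mode="same")
    It = frame1 - frame0
    return -np.sum(Ix * It) / np.sum(Ix * Ix)
```

Pooling the constraint over many filtered outputs, rather than a single pixel, is what lets gradient schemes of this family sidestep the aperture problem locally.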

  12. Motion-Corrected 3D Sonic Anemometer for Tethersondes and Other Moving Platforms

    NASA Technical Reports Server (NTRS)

    Bognar, John

    2012-01-01

    To date, it has not been possible to apply 3D sonic anemometers on tethersondes or similar atmospheric research platforms due to the motion of the supporting platform. A tethersonde module including both a 3D sonic anemometer and associated motion correction sensors has been developed, enabling motion-corrected 3D winds to be measured from a moving platform such as a tethersonde. Blimps and other similar lifting systems are used to support tethersondes, meteorological devices that fly on the tether of a blimp or similar platform. To date, tethersondes have been limited to making basic meteorological measurements (pressure, temperature, humidity, and wind speed and direction). The motion of the tethersonde has precluded the addition of 3D sonic anemometers, which can be used for high-speed flux measurements, thereby limiting what has been achieved to date with tethersondes. The tethersonde modules fly on a tether that can be constantly moving and swaying. This would introduce enormous error into the output of an uncorrected 3D sonic anemometer. The motion correction that is required must be implemented in a low-weight, low-cost manner to be suitable for this application. Until now, flux measurements using 3D sonic anemometers could only be made if the 3D sonic anemometer was located on a rigid, fixed platform such as a tower. This limited the areas in which they could be set up and used. The purpose of the innovation was to enable precise 3D wind and flux measurements to be made using tethersondes. In brief, a 3D accelerometer and a 3D gyroscope were added to a tethersonde module along with a 3D sonic anemometer. This combination allowed for the necessary package motions to be measured, which were then mathematically combined with the measured winds to yield motion-corrected 3D winds. At the time of this reporting, no tethersonde has been able to make any wind measurement other than a basic wind speed and direction measurement. 
The addition of a 3D sonic anemometer is unique, as is the addition of the motion-correction sensors.
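    The correction described can be caricatured in a few lines. This sketch assumes yaw-only attitude and an externally supplied platform velocity; the real module fuses full 3D accelerometer and gyroscope data, and all names here are mine.

```python
import numpy as np

def yaw_matrix(psi):
    # Rotation about the vertical axis; a full implementation would
    # also apply roll and pitch from the gyroscope/accelerometer data.
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def true_wind(wind_body, yaw, v_platform):
    # The anemometer senses air motion relative to the moving sensor
    # head; rotating into the earth frame and adding the platform
    # velocity recovers the true wind vector.
    return yaw_matrix(yaw) @ wind_body + v_platform
```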

  13. [Motion sickness in motion: from carsickness to cybersickness].

    PubMed

    Bos, J E; van Leeuwen, R B; Bruintjes, T D

    2018-01-01

    - Motion sickness is not a disorder, but a normal response to a non-normal situation in which movement plays a central role, such as car travel, sailing, flying, or virtual reality.- Almost anyone can suffer from motion sickness, as long as at least one of the organs of balance functions. If neither of the organs of balance functions the individual will not suffer from carsickness, seasickness, airsickness, nor from cybersickness. - 'Cybersickness' is a form of motion sickness that is stimulated by artificial moving images such as in videogames. Because we are now exposed more often and for longer periods of time to increasingly realistic artificial images, doctors will also encounter cases of motion sickness more often. - The basis for motion sickness is the vestibular system, which can be modulated by visual-vestibular conflicts, i.e. when the movements seen by the eyes are not the same as those experienced by the organs of balance.- Antihistamines can be effective against motion sickness in everyday situations such as car travel if taken before departure, but the effectiveness of medication for motion sickness is limited.

  14. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems.

    PubMed

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-12-17

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post-processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on the DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information.
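    The predict/update structure of such a sensor fusion filter can be sketched with a minimal linear Kalman filter. The paper's filter is an Extended Kalman Filter over full navigation states; the one-dimensional constant-velocity model and noise values below are purely illustrative.

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    # Predict with the motion model, then correct with the measurement.
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # state update
    P = (np.eye(len(x)) - K @ H) @ P    # covariance update
    return x, P
```

Each DGPS/Vision pseudo-measurement would enter through its own H and R; here a position-only measurement of a constant-velocity target suffices to show the mechanics.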

  15. Leveraging DMO’s Hi-Tech Simulation Against the F-16 Flying Training Gap

    DTIC Science & Technology

    2005-04-01

    39-49. 6 United States Department of Transportation, Airplance Simulator Qualification, Report AC No.120-40B (Washington, DC: Federal Aviation...and Motion Simulation Conference. Binghamton, NY: Singer-Simulation Products Division, 1976. United States Department of Transportation. Airplance

  16. Dynamic Metasurface Aperture as Smart Around-the-Corner Motion Detector.

    PubMed

    Del Hougne, Philipp; F Imani, Mohammadreza; Sleasman, Timothy; Gollub, Jonah N; Fink, Mathias; Lerosey, Geoffroy; Smith, David R

    2018-04-25

    Detecting and analysing motion is a key feature of Smart Homes and the connected sensor vision they embrace. At present, most motion sensors operate in line-of-sight Doppler shift schemes. Here, we propose an alternative approach suitable for indoor environments, which effectively constitute disordered cavities for radio frequency (RF) waves; we exploit the fundamental sensitivity of modes of such cavities to perturbations, caused here by moving objects. We establish experimentally three key features of our proposed system: (i) ability to capture the temporal variations of motion and discern information such as periodicity ("smart"), (ii) non line-of-sight motion detection, and (iii) single-frequency operation. Moreover, we explain theoretically and demonstrate experimentally that the use of dynamic metasurface apertures can substantially enhance the performance of RF motion detection. Potential applications include accurately detecting human presence and monitoring inhabitants' vital signs.

  17. LabVIEW application for motion tracking using USB camera

    NASA Astrophysics Data System (ADS)

    Rob, R.; Tirian, G. O.; Panoiu, M.

    2017-05-01

    The technical state of the contact line and also the additional equipment in electric rail transport is very important for realizing the repairing and maintenance of the contact line. During its functioning, the pantograph motion must stay in standard limits. Present paper proposes a LabVIEW application which is able to track in real time the motion of a laboratory pantograph and also to acquire the tracking images. An USB webcam connected to a computer acquires the desired images. The laboratory pantograph contains an automatic system which simulates the real motion. The tracking parameters are the horizontally motion (zigzag) and the vertically motion which can be studied in separate diagrams. The LabVIEW application requires appropriate tool-kits for vision development. Therefore the paper describes the subroutines that are especially programmed for real-time image acquisition and also for data processing.

  18. Variably Transmittive, Electronically-Controlled Eyewear

    NASA Technical Reports Server (NTRS)

    Chapman, John J. (Inventor); Glaab, Louis J. (Inventor); Schott, Timothy D. (Inventor); Howell, Charles T. (Inventor); Fleck, Vincent J. (Inventor)

    2013-01-01

    A system and method for flight training and evaluation of pilots comprises electronically activated vision restriction glasses that detect the pilot's head position and automatically darken and restrict the pilot's ability to see through the front and side windscreens when the pilot-in-training attempts to see out the windscreen. Thus, the pilot-in-training sees only within the aircraft cockpit, forcing him or her to fly by instruments in the most restricted operational mode.

  19. Uncovering the CH-53E Doppler Myth

    DTIC Science & Technology

    2008-01-01

    The Corps’ Changing Super Stallion 8 Helicopter Night Vision System 9 Heads-Up Display (HUD) 9 The Infamous Doppler 12 FUTURE SOLUTIONS 14 Universal... List of Illustrations... remains intimately familiar to those flying the Marine Corps’ CH-53E "Super Stallion" today. Fortunately, the CH-53Es flown throughout the world today

  20. Representation of visual gravitational motion in the human vestibular cortex.

    PubMed

    Indovina, Iole; Maffei, Vincenzo; Bosco, Gianfranco; Zago, Myrka; Macaluso, Emiliano; Lacquaniti, Francesco

    2005-04-15

    How do we perceive the visual motion of objects that are accelerated by gravity? We propose that, because vision is poorly sensitive to accelerations, an internal model that calculates the effects of gravity is derived from graviceptive information, is stored in the vestibular cortex, and is activated by visual motion that appears to be coherent with natural gravity. The acceleration of visual targets was manipulated while brain activity was measured using functional magnetic resonance imaging. In agreement with the internal model hypothesis, we found that the vestibular network was selectively engaged when acceleration was consistent with natural gravity. These findings demonstrate that predictive mechanisms of physical laws of motion are represented in the human brain.

  1. Stimulus factors in motion perception and spatial orientation

    NASA Technical Reports Server (NTRS)

    Post, R. B.; Johnson, C. A.

    1984-01-01

    The Malcolm horizon, or Peripheral Vision Horizon Device (PVHD), utilizes a large projected light stimulus as an attitude indicator in order to achieve a more compelling sense of roll than is obtained with smaller devices. The basic principle is that the larger stimulus is more similar to the visibility of a real horizon during roll, and does not require fixation and attention to the degree that smaller displays do. Successful implementation of such a device requires adjustment of the parameters of the visual stimulus so that its effects on motion perception and spatial orientation are optimized. With this purpose in mind, the effects of relevant image variables on the perception of object motion, self motion and spatial orientation are reviewed.

  2. Stroboscopic Goggles as a Countermeasure for Dynamic Visual Acuity and Landing Sickness After Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Rosenberg, M. J. F.; Kreutzberg, G. A.; Peters, B. T.; Reschke, M. F.

    2017-01-01

    Gravity transitions cause changes in the vestibulo-ocular reflex (VOR), which manifest as poor gaze control and a decrement in dynamic visual acuity (the ability to maintain gaze while in motion), both of which are caused by retinal slip. Retinal slip, the inability to keep an image focused on the retina, can drive or worsen sensory conflict, resulting in motion sickness (MS). Currently 100% of returning crewmembers report MS symptoms, which might affect their ability to perform mission-critical tasks immediately after landing. Reschke et al. (2007) demonstrated that stroboscopic vision goggles improve motion sickness onset and symptom severity in motion sickness driven by retinal slip.

  3. Validation of vision-based obstacle detection algorithms for low-altitude helicopter flight

    NASA Technical Reports Server (NTRS)

    Suorsa, Raymond; Sridhar, Banavar

    1991-01-01

    A validation facility in use at the NASA Ames Research Center is described which is aimed at testing vision-based obstacle detection and range estimation algorithms suitable for low-level helicopter flight. The facility is capable of processing hundreds of frames of calibrated multicamera 6 degree-of-freedom motion image sequences, generating calibrated multicamera laboratory images using convenient window-based software, and viewing range estimation results from different algorithms along with truth data using powerful window-based visualization software.

  4. Robot path planning using expert systems and machine vision

    NASA Astrophysics Data System (ADS)

    Malone, Denis E.; Friedrich, Werner E.

    1992-02-01

    This paper describes a system developed for the robotic processing of naturally variable products. In order to plan the robot motion path it was necessary to use a sensor system, in this case a machine vision system, to observe the variations occurring in workpieces and interpret this with a knowledge based expert system. The knowledge base was acquired by carrying out an in-depth study of the product using examination procedures not available in the robotic workplace and relates the nature of the required path to the information obtainable from the machine vision system. The practical application of this system to the processing of fish fillets is described and used to illustrate the techniques.

  5. Quantification of the relative contribution of the different right ventricular wall motion components to right ventricular ejection fraction: the ReVISION method.

    PubMed

    Lakatos, Bálint; Tősér, Zoltán; Tokodi, Márton; Doronina, Alexandra; Kosztin, Annamária; Muraru, Denisa; Badano, Luigi P; Kovács, Attila; Merkely, Béla

    2017-03-27

    Three major mechanisms contribute to right ventricular (RV) pump function: (i) shortening of the longitudinal axis with traction of the tricuspid annulus towards the apex; (ii) inward movement of the RV free wall; (iii) bulging of the interventricular septum into the RV and stretching the free wall over the septum. The relative contribution of the aforementioned mechanisms to RV pump function may change in different pathological conditions. Our aim was to develop a custom method to separately assess the extent of longitudinal, radial and anteroposterior displacement of the RV walls and to quantify their relative contribution to global RV ejection fraction using 3D data sets obtained by echocardiography. Accordingly, we decomposed the movement of the exported RV beutel wall in a vertex-based manner. The volumes of the beutels accounting for the RV wall motion in only one direction (either longitudinal, radial, or anteroposterior) were calculated at each time frame using the signed tetrahedron method. Then, the relative contribution of the RV wall motion along the three different directions to global RV ejection fraction was calculated either as the ratio of the given direction's ejection fraction to global ejection fraction or as the frame-by-frame RV volume change (∆V/∆t) along the three motion directions. The ReVISION (Right VentrIcular Separate wall motIon quantificatiON) method may contribute to a better understanding of the pathophysiology of RV mechanical adaptations to different loading conditions and diseases.
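    The signed tetrahedron method named above computes the volume of a closed, consistently oriented triangle mesh by summing the signed volumes of the tetrahedra each face forms with the origin. A minimal sketch (not the ReVISION implementation):

```python
import numpy as np

def mesh_volume(vertices, faces):
    # Each triangular face (a, b, c) and the origin form a tetrahedron
    # of signed volume dot(a, cross(b, c)) / 6; over a closed, outward-
    # oriented mesh the signed volumes sum to the enclosed volume.
    tri = vertices[faces]               # (n_faces, 3, 3)
    a, b, c = tri[:, 0], tri[:, 1], tri[:, 2]
    return np.einsum('ij,ij->', a, np.cross(b, c)) / 6.0
```

Applied per time frame to each directionally decomposed beutel, volume traces like this yield the per-direction ejection fractions the abstract describes.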

  6. Autonomous Navigation Results from the Mars Exploration Rover (MER) Mission

    NASA Technical Reports Server (NTRS)

    Maimone, Mark; Johnson, Andrew; Cheng, Yang; Willson, Reg; Matthies, Larry H.

    2004-01-01

    In January 2004, the Mars Exploration Rover (MER) mission landed two rovers, Spirit and Opportunity, on the surface of Mars. Several autonomous navigation capabilities were employed in space for the first time in this mission. In the Entry, Descent, and Landing (EDL) phase, both landers used a vision system called the Descent Image Motion Estimation System (DIMES) to estimate horizontal velocity during the last 2000 meters (m) of descent, by tracking features on the ground with a downlooking camera, in order to control retro-rocket firing to reduce horizontal velocity before impact. During surface operations, the rovers navigate autonomously using stereo vision for local terrain mapping and a local, reactive planning algorithm called Grid-based Estimation of Surface Traversability Applied to Local Terrain (GESTALT) for obstacle avoidance. In areas of high slip, stereo vision-based visual odometry has been used to estimate rover motion. As of mid-June, Spirit had traversed 3405 m, of which 1253 m were done autonomously; Opportunity had traversed 1264 m, of which 224 m were autonomous. These results have contributed substantially to the success of the mission and paved the way for increased levels of autonomy in future missions.

  7. The contribution of LM to the neuroscience of movement vision

    PubMed Central

    Zihl, Josef; Heywood, Charles A.

    2015-01-01

    The significance of early and sporadic reports in the 19th century of impairments of motion vision following brain damage was largely unrecognized. In the absence of satisfactory post-mortem evidence, impairments were interpreted as the consequence of a more general disturbance resulting from brain damage, the location and extent of which were unknown. Moreover, evidence that movement constituted a special visual perception and might be selectively spared was similarly dismissed. Such skepticism derived from a reluctance to acknowledge that the neural substrates of visual perception may not be confined to primary visual cortex. This view did not persist. First, it was realized that visual movement perception does not depend simply on the analysis of spatial displacements and temporal intervals, but represents a specific visual movement sensation. Second, persuasive evidence for functional specialization in extrastriate cortex, and notably the discovery of cortical area V5/MT, suggested a separate region specialized for motion processing. Shortly thereafter, the remarkable case of patient LM was published, providing compelling evidence for a selective and specific loss of movement vision. The case is reviewed here, along with an assessment of its contribution to visual neuroscience. PMID:25741251

  8. Piloted Evaluation of the H-Mode, a Variable Autonomy Control System, in Motion-Based Simulation

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Schutte, Paul C.; Williams, Ralph A.

    2008-01-01

    As aircraft become able to autonomously respond to a range of situations with performance surpassing human operators, we are compelled to look for new methods that help understand their use and guide the design of new, more effective forms of automation and interaction. The "H-mode" is one such method and is based on the metaphor of a well-trained horse. The concept allows the pilot to manage a broad range of control automation functionality, from augmented manual control to FMS-like coupling and automation-initiated actions, using a common interface system and an easily learned set of interaction skills. The interface leverages familiar manual control interfaces (e.g., the control stick) and flight displays through the addition of contextually dependent haptic-multimodal elements. The concept is relevant to manned and remotely piloted vehicles. This paper provides an overview of the H-mode concept followed by a presentation of the results from a recent evaluation conducted in a motion-based simulator. The evaluation focused on assessing the overall usability and flying qualities of the concept, with an emphasis on the effects of turbulence and cockpit motion. Because the H-mode results in interactions between traditional flying qualities and management of higher-level flight path automation, these effects are of particular interest. The results indicate that the concept may provide a useful complement or replacement to conventional interfaces, and that it remains usable in the presence of turbulence and motion.

  9. 3D morphology reconstruction using linear array CCD binocular stereo vision imaging system

    NASA Astrophysics Data System (ADS)

    Pan, Yu; Wang, Jinjiang

    2018-01-01

    A conventional binocular vision imaging system has a small field of view and cannot reconstruct the 3-D shape of a dynamic object. We present a linear array CCD binocular vision imaging system that uses different calibration and reconstruction methods. Building on the conventional binocular system, the linear array CCD binocular vision imaging system has a wider field of view, can reconstruct the 3-D morphology of objects in continuous motion, and yields accurate results. This research mainly introduces the composition and principle of the linear array CCD binocular vision imaging system, including the calibration, capture, matching and reconstruction stages. The system consists of two linear array cameras placed in a special arrangement and a horizontally moving platform that carries the objects. The internal and external parameters of the cameras are obtained by calibration in advance. The cameras then capture images of the moving objects, and the results are matched and 3-D reconstructed. Because the system can accurately measure the 3-D appearance of moving objects, this work is of significance for measuring the 3-D morphology of moving objects.
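
    The depth recovery underlying any rectified binocular setup reduces to triangulation from disparity, Z = f·B/d. A minimal sketch with hypothetical rig parameters (this generic pinhole relation stands in for, and is not, the paper's linear array CCD calibration):

```python
def depth_from_disparity(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a matched point from a rectified stereo pair:
    Z = f * B / d, with d the horizontal disparity in pixels."""
    d = x_left_px - x_right_px
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_m / d

# Hypothetical rig: 1000 px focal length, 0.2 m baseline; the feature
# appears 10 px further left in the right image -> roughly 20 m away
z = depth_from_disparity(1000.0, 0.2, 512.0, 502.0)
```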

  10. Visual cognition

    PubMed Central

    Cavanagh, Patrick

    2011-01-01

    Visual cognition, high-level vision, mid-level vision and top-down processing all refer to decision-based scene analyses that combine prior knowledge with retinal input to generate representations. The label “visual cognition” is little used at present, but research and experiments on mid- and high-level, inference-based vision have flourished, becoming in the 21st century a significant, if often understated, part of current vision research. How does visual cognition work? What are its moving parts? This paper reviews the origins and architecture of visual cognition and briefly describes some work in the areas of routines, attention, surfaces, objects, and events (motion, causality, and agency). Most vision scientists avoid being too explicit when presenting concepts about visual cognition, having learned that explicit models invite easy criticism. What we see in the literature is ample evidence for visual cognition, but few or only cautious attempts to detail how it might work. This is the great unfinished business of vision research: at some point we will be done with characterizing how the visual system measures the world, and we will have to return to the question of how vision constructs models of objects, surfaces, scenes, and events. PMID:21329719

  11. Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades.

    PubMed

    Orchard, Garrick; Jayawant, Ajinkya; Cohen, Gregory K; Thakor, Nitish

    2015-01-01

    Creating datasets for Neuromorphic Vision is a challenging task. A lack of available recordings from Neuromorphic Vision sensors means that data must typically be recorded specifically for dataset creation rather than collecting and labeling existing data. The task is further complicated by a desire to simultaneously provide traditional frame-based recordings to allow for direct comparison with traditional Computer Vision algorithms. Here we propose a method for converting existing Computer Vision static image datasets into Neuromorphic Vision datasets using an actuated pan-tilt camera platform. Moving the sensor rather than the scene or image is a more biologically realistic approach to sensing and eliminates timing artifacts introduced by monitor updates when simulating motion on a computer monitor. We present conversion of two popular image datasets (MNIST and Caltech101) which have played important roles in the development of Computer Vision, and we provide performance metrics on these datasets using spike-based recognition algorithms. This work contributes datasets for future use in the field, as well as results from spike-based algorithms against which future works can compare. Furthermore, by converting datasets already popular in Computer Vision, we enable more direct comparison with frame-based approaches.

  12. Autogenic Feedback Training Applications for Man in Space

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    Finding an effective treatment for the motion sickness-like symptoms that occur in space has become a high priority for NASA. This paper reviews the background research and procedures of an experiment designed to prevent space motion sickness in shuttle crewmembers. The preventive method used, Autogenic-Feedback Training (AFT), involves training subjects to voluntarily control several of their own physiological responses to environmental stressors. AFT has been used reliably to increase tolerance to motion sickness during ground-based tests in over 300 men and women under a variety of conditions that induce motion sickness, and preliminary evidence from space suggests that AFT may be an effective treatment for space motion sickness as well. Other applications of AFT described include: (1) a potential treatment for postflight orthostatic intolerance, a serious biomedical problem resulting from long-duration exposure to micro-g; and (2) improving pilot performance during emergency flying conditions.

  13. Linear State-Space Representation of the Dynamics of Relative Motion, Based on Restricted Three Body Dynamics

    NASA Technical Reports Server (NTRS)

    Luquette, Richard J.; Sanner, Robert M.

    2004-01-01

    Precision Formation Flying is an enabling technology for a variety of proposed space-based observatories, including the Micro-Arcsecond X-ray Imaging Mission (MAXIM) , the associated MAXIM pathfinder mission, Stellar Imager (SI) and the Terrestrial Planet Finder (TPF). An essential element of the technology is the control algorithm, requiring a clear understanding of the dynamics of relative motion. This paper examines the dynamics of relative motion in the context of the Restricted Three Body Problem (RTBP). The natural dynamics of relative motion are presented in their full nonlinear form. Motivated by the desire to apply linear control methods, the dynamics equations are linearized and presented in state-space form. The stability properties are explored for regions in proximity to each of the libration points in the Earth/Moon - Sun rotating frame. The dynamics of relative motion are presented in both the inertial and rotating coordinate frames.

  14. Orientation selectivity sharpens motion detection in Drosophila

    PubMed Central

    Fisher, Yvette E.; Silies, Marion; Clandinin, Thomas R.

    2015-01-01

    Detecting the orientation and movement of edges in a scene is critical to visually guided behaviors of many animals. What are the circuit algorithms that allow the brain to extract such behaviorally vital visual cues? Using in vivo two-photon calcium imaging in Drosophila, we describe direction selective signals in the dendrites of T4 and T5 neurons, detectors of local motion. We demonstrate that this circuit performs selective amplification of local light inputs, an observation that constrains motion detection models and confirms a core prediction of the Hassenstein-Reichardt Correlator (HRC). These neurons are also orientation selective, responding strongly to static features that are orthogonal to their preferred axis of motion, a tuning property not predicted by the HRC. This coincident extraction of orientation and direction sharpens directional tuning through surround inhibition and reveals a striking parallel between visual processing in flies and vertebrate cortex, suggesting a universal strategy for motion processing. PMID:26456048
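
    The Hassenstein-Reichardt Correlator invoked above has a very small computational core: delay one photoreceptor signal, multiply it with the undelayed signal of the neighboring receptor, and subtract the mirror-symmetric product. The toy signals and unit-sample delay below are illustrative (a real HRC uses a low-pass filter as the delay stage):

```python
import numpy as np

def hrc_response(a, b, delay):
    """Minimal discrete Hassenstein-Reichardt correlator for two
    neighboring photoreceptor signals a(t), b(t); a positive output
    signals motion in the a -> b direction."""
    da = np.roll(a, delay); da[:delay] = 0.0   # delayed copy of a
    db = np.roll(b, delay); db[:delay] = 0.0   # delayed copy of b
    return da * b - a * db

# A bright edge passing receptor a at t=5 and receptor b at t=7
a = np.zeros(20); a[5:] = 1.0
b = np.zeros(20); b[7:] = 1.0
r = hrc_response(a, b, delay=2)   # net positive: preferred direction
```

    Swapping the inputs (motion in the opposite direction) flips the sign of the summed response, which is the direction-opponent structure the model is known for.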

  15. Predictability and hierarchy in Drosophila behavior.

    PubMed

    Berman, Gordon J; Bialek, William; Shaevitz, Joshua W

    2016-10-18

    Even the simplest of animals exhibit behavioral sequences with complex temporal dynamics. Prominent among the proposed organizing principles for these dynamics has been the idea of a hierarchy, wherein the movements an animal makes can be understood as a set of nested subclusters. Although this type of organization holds potential advantages in terms of motion control and neural circuitry, measurements demonstrating this for an animal's entire behavioral repertoire have been limited in scope and temporal complexity. Here, we use a recently developed unsupervised technique to discover and track the occurrence of all stereotyped behaviors performed by fruit flies moving in a shallow arena. Calculating the optimally predictive representation of the fly's future behaviors, we show that fly behavior exhibits multiple time scales and is organized into a hierarchical structure that is indicative of its underlying behavioral programs and its changing internal states.

  16. Testing the FLI in the region of the Pallas asteroid family

    NASA Astrophysics Data System (ADS)

    Todorović, N.; Novaković, B.

    2015-08-01

    Computation of the fast Lyapunov indicator (FLI) is one of the most efficient numerical ways to characterize the dynamical nature of motion and to detect phase-space structures in a large variety of dynamical models. Despite its effectiveness, the FLI has mainly been used for symplectic maps or simple Hamiltonians, and it has never been used to study the dynamics of asteroids to any great extent. This research shows that the FLI can also be successfully applied to real (Solar system) dynamics. For this purpose, we focus on the main-belt region where the Pallas asteroid family is located. By using the full Solar system model, different sets of initial conditions and different integration times, we managed to visualize not only a large multiplet of resonances located in the region, but also their structures, chaotic boundaries, the stability islands therein, and the locations of their mutual interactions. In the end, we have identified some of the most dominant resonances present in the region and established a link between these resonances and the chaotic areas visible in our maps. We have illustrated that the FLI has once again shown its efficiency in detecting dynamical structures in the main belt, e.g. in the Pallas asteroid family, with surprisingly good clarity.
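
    The FLI itself is simple to state: propagate a tangent vector v alongside the orbit using the system's Jacobian (the variational equations) and record the running maximum of log ||v(t)||. A toy sketch on the Chirikov standard map, one of the symplectic maps for which the indicator was originally used, with purely illustrative initial conditions (this is not the paper's full Solar system model):

```python
import numpy as np

def fli_standard_map(x, p, k, n_iter):
    """Fast Lyapunov Indicator for the standard map
    p' = p + k*sin(x), x' = x + p': evolve a tangent vector v
    with the map's Jacobian and track max_t log ||v(t)||."""
    v = np.array([1.0, 1.0]) / np.sqrt(2.0)
    fli = 0.0
    for _ in range(n_iter):
        jac = np.array([[1.0 + k * np.cos(x), 1.0],   # d(x', p')/d(x, p)
                        [k * np.cos(x),       1.0]])
        p = p + k * np.sin(x)
        x = (x + p) % (2.0 * np.pi)
        v = jac @ v
        fli = max(fli, np.log(np.linalg.norm(v)))
    return fli

fli_chaotic = fli_standard_map(x=3.0, p=0.1, k=5.0, n_iter=200)
fli_regular = fli_standard_map(x=np.pi + 0.1, p=0.0, k=0.5, n_iter=200)
```

    On chaotic orbits the FLI grows linearly with the iteration count, while on regular orbits it grows only logarithmically, which is what makes the indicator such a fast discriminator.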

  17. Body saccades of Drosophila consist of stereotyped banked turns.

    PubMed

    Muijres, Florian T; Elzinga, Michael J; Iwasaki, Nicole A; Dickinson, Michael H

    2015-03-01

    The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing kinematics and a dynamically scaled robot to study the production of aerodynamic forces and moments. Although the size, degree and speed of the saccades vary, the dynamics of the maneuver are remarkably stereotypic. In executing a body saccade, flies perform a quick roll and counter-roll, combined with a slower unidirectional rotation around their yaw axis. Flies regulate the size of the turn by adjusting the magnitude of torque that they produce about these control axes, while maintaining the orientation of the rotational axes in the body frame constant. In this way, body saccades are different from escape responses in the same species, in which the roll and pitch component of banking is varied to adjust turn angle. Our analysis of the wing kinematics and aerodynamics showed that flies control aerodynamic torques during the saccade primarily by adjusting the timing and amount of span-wise wing rotation. © 2015. Published by The Company of Biologists Ltd.

  18. Multi-camera and structured-light vision system (MSVS) for dynamic high-accuracy 3D measurements of railway tunnels.

    PubMed

    Zhan, Dong; Yu, Long; Xiao, Jian; Chen, Tanglong

    2015-04-14

    Railway tunnel 3D clearance inspection is critical to guaranteeing railway operation safety. However, it is a challenge to inspect railway tunnel 3D clearance using a vision system, because both the spatial range and field of view (FOV) of such measurements are quite large. This paper summarizes our work on dynamic railway tunnel 3D clearance inspection based on a multi-camera and structured-light vision system (MSVS). First, the configuration of the MSVS is described. Then, the global calibration for the MSVS is discussed in detail. The onboard vision system is mounted on a dedicated vehicle and is expected to suffer from multiple degrees of freedom vibrations brought about by the running vehicle. Any small vibration can result in substantial measurement errors. In order to overcome this problem, a vehicle motion deviation rectifying method is investigated. Experiments using the vision inspection system are conducted with satisfactory online measurement results.

  19. Bifurcation theory applied to aircraft motions

    NASA Technical Reports Server (NTRS)

    Hui, W. H.; Tobak, M.

    1985-01-01

    Bifurcation theory is used to analyze the nonlinear dynamic stability characteristics of single-degree-of-freedom motions of an aircraft or a flap about a trim position. The bifurcation theory analysis reveals that when the bifurcation parameter, e.g., the angle of attack, is increased beyond a critical value at which the aerodynamic damping vanishes, a new solution representing finite-amplitude periodic motion bifurcates from the previously stable steady motion. The sign of a simple criterion, cast in terms of aerodynamic properties, determines whether the bifurcating solution is stable (supercritical) or unstable (subcritical). For the pitching motion of a flap-plate airfoil flying at supersonic/hypersonic speed, and for oscillation of a flap at transonic speed, the bifurcation is subcritical, implying either that exchanges of stability between steady and periodic motion are accompanied by hysteresis phenomena, or that potentially large aperiodic departures from steady motion may develop. On the other hand, for the rolling oscillation of a slender delta wing in subsonic flight (wing rock), the bifurcation is found to be supercritical. This and the predicted amplitude of the bifurcating periodic motion are in good agreement with experiments.
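
    The supercritical/subcritical dichotomy above is captured by the Hopf normal-form amplitude equation dr/dt = mu*r + a*r**3: for a < 0 (supercritical) a stable limit cycle of radius sqrt(-mu/a) appears, while for a > 0 (subcritical) amplitudes beyond the unstable radius run away, the large-departure scenario the abstract describes. A minimal numerical sketch with illustrative parameters and plain Euler stepping (not the paper's aerodynamic criterion):

```python
import numpy as np

def amplitude(mu, a, r0, dt=1e-3, n_steps=200_000):
    """Euler-integrate the Hopf normal-form amplitude equation
    dr/dt = mu*r + a*r**3; returns np.inf if the amplitude runs away
    (the subcritical departure case)."""
    r = r0
    for _ in range(n_steps):
        r += dt * (mu * r + a * r ** 3)
        if r > 1e6:
            return np.inf
    return r

r_super = amplitude(mu=0.25, a=-1.0, r0=0.05)  # settles near sqrt(0.25) = 0.5
r_sub = amplitude(mu=-0.1, a=1.0, r0=0.4)      # starts above sqrt(0.1): diverges
```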

  1. Dual Use of Image Based Tracking Techniques: Laser Eye Surgery and Low Vision Prosthesis

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.; Barton, R. Shane

    1994-01-01

    With a concentration on Fourier optics pattern recognition, we have developed several methods of tracking objects in dynamic imagery to automate certain space applications such as orbital rendezvous and spacecraft capture, or planetary landing. We are developing two of these techniques for Earth applications in real-time medical image processing. The first is warping of a video image, developed to evoke shift invariance to scale and rotation in correlation pattern recognition. The technology is being applied to compensation for certain field defects in low vision humans. The second is using the optical joint Fourier transform to track the translation of unmodeled scenes. Developed as an image fixation tool to assist in calculating shape from motion, it is being applied to tracking motions of the eyeball quickly enough to keep a laser photocoagulation spot fixed on the retina, thus avoiding collateral damage.

  2. Peripheral Visual Cues Contribute to the Perception of Object Movement During Self-Movement

    PubMed Central

    Rogers, Cassandra; Warren, Paul A.

    2017-01-01

    Safe movement through the environment requires us to monitor our surroundings for moving objects or people. However, identification of moving objects in the scene is complicated by self-movement, which adds motion across the retina. To identify world-relative object movement, the brain thus has to ‘compensate for’ or ‘parse out’ the components of retinal motion that are due to self-movement. We have previously demonstrated that retinal cues arising from central vision contribute to solving this problem. Here, we investigate the contribution of peripheral vision, commonly thought to provide strong cues to self-movement. Stationary participants viewed a large field of view display, with radial flow patterns presented in the periphery, and judged the trajectory of a centrally presented probe. Across two experiments, we demonstrate and quantify the contribution of peripheral optic flow to flow parsing during forward and backward movement. PMID:29201335

  3. High contrast sensitivity for visually guided flight control in bumblebees.

    PubMed

    Chakravarthi, Aravin; Kelber, Almut; Baird, Emily; Dacke, Marie

    2017-12-01

    Many insects rely on vision to find food, to return to their nest and to carefully control their flight between these two locations. The amount of information available to support these tasks is, in part, dictated by the spatial resolution and contrast sensitivity of their visual systems. Here, we investigate the absolute limits of these visual properties for visually guided position and speed control in Bombus terrestris. Our results indicate that the limit of spatial vision in the translational motion detection system of B. terrestris lies at 0.21 cycles deg^-1, with a peak contrast sensitivity of at least 33. In light of earlier findings, these results indicate that bumblebees have higher contrast sensitivity in the motion detection system underlying position control than in their object discrimination system. This suggests that bumblebees, and most likely also other insects, have different visual thresholds depending on the behavioral context.

  4. What constitutes an efficient reference frame for vision?

    PubMed Central

    Tadin, Duje; Lappin, Joseph S.; Blake, Randolph; Grossman, Emily D.

    2015-01-01

    Vision requires a reference frame. To what extent does this reference frame depend on the structure of the visual input, rather than just on retinal landmarks? This question is particularly relevant to the perception of dynamic scenes, when keeping track of external motion relative to the retina is difficult. We tested human subjects’ ability to discriminate the motion and temporal coherence of changing elements that were embedded in global patterns and whose perceptual organization was manipulated in a way that caused only minor changes to the retinal image. Coherence discriminations were always better when local elements were perceived to be organized as a global moving form than when they were perceived to be unorganized, individually moving entities. Our results indicate that perceived form influences the neural representation of its component features, and from this, we propose a new method for studying perceptual organization. PMID:12219092

  6. Landing Characteristics in Waves of Three Dynamic Models of Flying Boats

    NASA Technical Reports Server (NTRS)

    Benson, James M.; Havens, Robert F.; Woodward, David R.

    1947-01-01

    Powered models of three different flying boats were landed in oncoming waves of various heights and lengths. The resulting motions and accelerations were recorded to survey the effects of varying the trim at landing, the deceleration after landing, and the size of the waves. One of the models had an unusually long afterbody. The data for landings with normal rates of deceleration indicated that the most severe motions and accelerations were likely to occur at some period of the landing run subsequent to the initial impact. Landings made at abnormally low trims led to unusually severe bounces during the runout. The least severe behavior occurred after a landing in which the model was rapidly decelerated at about 0.4 g, in a simulation of the proposed use of braking devices. The severity of the landings increased with wave height and was at a maximum when the wavelength was of the order of one and one-half to twice the overall length of the model. The models with afterbodies of moderate length frequently bounced clear of the water into a stalled attitude at speeds below flying speed. The model with the long afterbody had less tendency to bounce from the waves and consequently showed less severe accelerations during the landing run than the models with moderate-length afterbodies.

  7. Perturbation analysis of 6DoF flight dynamics and passive dynamic stability of hovering fruit fly Drosophila melanogaster.

    PubMed

    Gao, Na; Aono, Hikaru; Liu, Hao

    2011-02-07

    Insects exhibit exquisite control of their flapping flight, capable of precise stability and steering maneuverability. Here we develop an integrated computational model to investigate the flight dynamics of insect hovering, based on coupling the equations of 6-degree-of-freedom (6DoF) motion with the Navier-Stokes (NS) equations. Unsteady aerodynamics is resolved by using a biology-inspired dynamic flight simulator that integrates models of realistic wing-body morphology and kinematics with a NS solver. We further develop a dynamic model to solve the rigid-body equations of 6DoF motion by using a 4th-order Runge-Kutta method. In this model, instantaneous forces and moments based on the NS solutions are represented in terms of Fourier series. With this model, we perform a systematic simulation-based analysis of the passive dynamic stability of a hovering fruit fly, Drosophila melanogaster, with a specific focus on the responses of state variables to six one-directional perturbation conditions during the latency period. Our results reveal that the flight dynamics of fruit fly hovering is not dynamically stable in the conventional sense of perturbations damping out in monotonous convergence. Instead, there exists a transient interval comprising an initial converging response, observed for all six perturbation variables, followed by a terminal instability in which at least one state variable tends to diverge after several wing-beat cycles. Furthermore, our results illustrate that a fruit fly does have sufficient time to apply some active mediation to sustain steady hovering before losing body attitude. Copyright © 2010 Elsevier Ltd. All rights reserved.
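
    The 4th-order Runge-Kutta update used for the 6DoF equations is the classical scheme; a generic single-step sketch follows (not the authors' NS-coupled simulator, and exercised here on a textbook test system rather than flight dynamics):

```python
import numpy as np

def rk4_step(f, t, y, dt):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y);
    y could be the 12-component rigid-body 6DoF state vector."""
    k1 = f(t, y)
    k2 = f(t + dt / 2.0, y + dt / 2.0 * k1)
    k3 = f(t + dt / 2.0, y + dt / 2.0 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Sanity check on a system with a known solution: y'' = -y,
# state y = [position, velocity], integrated over one period 2*pi
f = lambda t, y: np.array([y[1], -y[0]])
y = np.array([1.0, 0.0])
dt = 2.0 * np.pi / 1000.0
for i in range(1000):
    y = rk4_step(f, i * dt, y, dt)
# y should return close to the initial state [1, 0]
```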

  8. Psychophysical evidence for auditory motion parallax.

    PubMed

    Genzel, Daria; Schutte, Michael; Brimijoin, W Owen; MacNeilage, Paul R; Wiegrebe, Lutz

    2018-04-17

    Distance is important: From an ecological perspective, knowledge about the distance to either prey or predator is vital. However, the distance of an unknown sound source is particularly difficult to assess, especially in anechoic environments. In vision, changes in perspective resulting from observer motion produce a reliable, consistent, and unambiguous impression of depth known as motion parallax. Here we demonstrate with formal psychophysics that humans can exploit auditory motion parallax, i.e., the change in the dynamic binaural cues elicited by self-motion, to assess the relative depths of two sound sources. Our data show that sensitivity to relative depth is best when subjects move actively; performance deteriorates when subjects are moved by a motion platform or when the sound sources themselves move. This is true even though the dynamic binaural cues elicited by these three types of motion are identical. Our data demonstrate a perceptual strategy to segregate intermittent sound sources in depth and highlight the tight interaction between self-motion and binaural processing that allows assessment of the spatial layout of complex acoustic scenes.

  9. Vestibular nuclei and cerebellum put visual gravitational motion in context.

    PubMed

    Miller, William L; Maffei, Vincenzo; Bosco, Gianfranco; Iosa, Marco; Zago, Myrka; Macaluso, Emiliano; Lacquaniti, Francesco

    2008-04-01

    Animal survival in the forest, and human success on the sports field, often depend on the ability to seize a target on the fly. All bodies fall at the same rate in the gravitational field, but the corresponding retinal motion varies with apparent viewing distance. How then does the brain predict time-to-collision under gravity? A perspective context from natural or pictorial settings might afford accurate predictions of gravity's effects via the recovery of an environmental reference from the scene structure. We report that embedding motion in a pictorial scene facilitates interception of gravitational acceleration over unnatural acceleration, whereas a blank scene eliminates such bias. Functional magnetic resonance imaging (fMRI) revealed blood-oxygen-level-dependent correlates of these visual context effects on gravitational motion processing in the vestibular nuclei and posterior cerebellar vermis. Our results suggest an early stage of integration of high-level visual analysis with gravity-related motion information, which may represent the substrate for perceptual constancy of ubiquitous gravitational motion.

  10. Neural Circuit to Integrate Opposing Motions in the Visual Field.

    PubMed

    Mauss, Alex S; Pankova, Katarina; Arenz, Alexander; Nern, Aljoscha; Rubin, Gerald M; Borst, Alexander

    2015-07-16

    When navigating in their environment, animals use visual motion cues as feedback signals that are elicited by their own motion. Such signals are provided by wide-field neurons sampling motion directions at multiple image points as the animal maneuvers. Each one of these neurons responds selectively to a specific optic flow-field representing the spatial distribution of motion vectors on the retina. Here, we describe the discovery of a group of local, inhibitory interneurons in the fruit fly Drosophila key for filtering these cues. Using anatomy, molecular characterization, activity manipulation, and physiological recordings, we demonstrate that these interneurons convey direction-selective inhibition to wide-field neurons with opposite preferred direction and provide evidence for how their connectivity enables the computation required for integrating opposing motions. Our results indicate that, rather than sharpening directional selectivity per se, these circuit elements reduce noise by eliminating non-specific responses to complex visual information. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Visions of Our Planet's Atmosphere, Land and Oceans Electronic-Theater 2001

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The NASA/NOAA/AMS Electronic Theater presents Earth science observations and visualizations in a historical perspective. Fly in from outer space to Fredericton, New Brunswick. Drop in on the Kennedy Space Center and Park City, Utah, site of the 2002 Olympics, using 1 m IKONOS "Spy Satellite" data. Go back to the early weather satellite images from the 1960s and see them contrasted with the latest US and international global satellite weather movies, including hurricanes and tornadoes. See the latest spectacular images from NASA/NOAA and Canadian remote sensing missions like Terra, GOES, TRMM, SeaWiFS, Landsat 7, and Radarsat, visualized and explained. See how High Definition Television (HDTV) is revolutionizing the way we communicate science, in cooperation with the American Museum of Natural History in NYC. See dust storms in Africa and smoke plumes from fires in Mexico. See visualizations featured on Newsweek, TIME, National Geographic, and Popular Science covers and on national and international network TV. New visualization tools allow us to roam and zoom through massive global images, e.g., Landsat tours of the US, Africa, and New Zealand showing desert and mountain geology as well as seasonal changes in vegetation. See animations of the polar ice packs and the motion of gigantic Antarctic icebergs from SeaWinds data. Spectacular new visualizations of the global atmosphere and oceans are shown. See massive dust storms sweeping across Africa. See vortexes and currents in the global oceans that bring up the nutrients to feed tiny plankton and draw the fish, whales, and fishermen. See how the ocean blooms in response to these currents and to El Nino/La Nina climate changes. The demonstration is interactively driven by an SGI Onyx II graphics supercomputer with four CPUs, 8 gigabytes of RAM, and a terabyte of disk, with multiple projectors on a giant screen. See the city lights, fishing fleets, gas flares, and biomass burning of the Earth at night observed by the "night-vision" DMSP military satellite.

  12. Localization from Visual Landmarks on a Free-Flying Robot

    NASA Technical Reports Server (NTRS)

    Coltin, Brian; Fusco, Jesse; Moratto, Zack; Alexandrov, Oleg; Nakamura, Robert

    2016-01-01

    We present the localization approach for Astrobee, a new free-flying robot designed to navigate autonomously on board the International Space Station (ISS). Astrobee will conduct experiments in microgravity, as well as assist astronauts and ground controllers. Astrobee replaces the SPHERES robots currently operating on the ISS, which were limited to operating in a small cube because their localization system relied on triangulation from ultrasonic transmitters. Astrobee localizes with only monocular vision and an IMU, enabling it to traverse the entire US segment of the station. Features detected on a previously-built map, optical flow information, and IMU readings are all integrated into an extended Kalman filter (EKF) to estimate the robot pose. We introduce several modifications to the filter to make it more robust to noise. Finally, we extensively evaluate the behavior of the filter on a two-dimensional testing surface.

  13. Localization from Visual Landmarks on a Free-Flying Robot

    NASA Technical Reports Server (NTRS)

    Coltin, Brian; Fusco, Jesse; Moratto, Zack; Alexandrov, Oleg; Nakamura, Robert

    2016-01-01

    We present the localization approach for Astrobee, a new free-flying robot designed to navigate autonomously on the International Space Station (ISS). Astrobee will accommodate a variety of payloads and enable guest scientists to run experiments in zero-g, as well as assist astronauts and ground controllers. Astrobee will replace the SPHERES robots which currently operate on the ISS, whose use of fixed ultrasonic beacons for localization limits them to work in a 2 meter cube. Astrobee localizes with monocular vision and an IMU, without any environmental modifications. Visual features detected on a pre-built map, optical flow information, and IMU readings are all integrated into an extended Kalman filter (EKF) to estimate the robot pose. We introduce several modifications to the filter to make it more robust to noise, and extensively evaluate the localization algorithm.
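    The sensor-fusion step named here follows the standard EKF predict/update cycle: IMU readings drive the prediction, and map-landmark observations correct it. As a rough illustration only (Astrobee's real filter estimates the full 6-DoF pose from visual features; this toy version tracks a single axis), a sketch might look like:

```python
import numpy as np

class SimpleEKF:
    """Toy 1-D pose filter: IMU-driven prediction corrected by
    map-landmark position fixes. Illustrative only."""

    def __init__(self):
        self.x = np.zeros(2)          # state: [position, velocity]
        self.P = np.eye(2)            # state covariance

    def predict(self, accel, dt, q=1e-3):
        """Propagate state with an IMU acceleration over dt seconds."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt * dt, dt])
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + q * np.eye(2)

    def update_landmark(self, measured_pos, r=1e-2):
        """Correct the state with a position fix from a mapped landmark."""
        H = np.array([[1.0, 0.0]])    # we observe position only
        y = measured_pos - H @ self.x # innovation
        S = H @ self.P @ H.T + r      # innovation covariance
        K = self.P @ H.T / S          # Kalman gain (S is 1x1 here)
        self.x = self.x + (K * y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P
```

    Repeated prediction with zero acceleration plus fixes at a landmark position pulls the estimate to that position; the same predict/update structure generalizes to the full pose state with optical-flow and feature measurements.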

  14. Perception of object motion in three-dimensional space induced by cast shadows.

    PubMed

    Katsuyama, Narumi; Usui, Nobuo; Nose, Izuru; Taira, Masato

    2011-01-01

    Cast shadows can be salient depth cues in three-dimensional (3D) vision. Using a motion illusion in which a ball is perceived either to roll in depth along the bottom or to float in the frontal plane depending on the slope of the trajectory of its cast shadow, we investigated the cortical mechanisms underlying 3D vision based on cast shadows using fMRI techniques. When modified versions of the original illusion, in which the slope of the shadow trajectory (shadow slope) was changed in five steps from matching the ball trajectory to horizontal, were presented to participants, their perceived ball trajectory shifted gradually from rolling on the bottom to floating in the frontal plane as the shadow slope changed. This observation suggests that the perception of the ball trajectory in this illusion is strongly affected by the motion of the cast shadow. In the fMRI study, cortical activity during observation of movies of the illusion was investigated. We found that the bilateral posterior-occipital sulcus (POS) and right ventral precuneus showed activation related to the perception of the ball trajectory induced by the cast shadows in the illusion. Of these areas, the right POS may be involved in inferring the ball trajectory from the spatial relation between the ball and its shadow. Our results suggest that the posterior portion of the medial parietal cortex may be involved in 3D vision based on cast shadows. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. The role of vision in odor-plume tracking by walking and flying insects.

    PubMed

    Willis, Mark A; Avondet, Jennifer L; Zheng, Elizabeth

    2011-12-15

    The walking paths of male cockroaches, Periplaneta americana, tracking point-source plumes of female pheromone often appear similar in structure to those observed from flying male moths. Flying moths use visual-flow-field feedback of their movements to control steering and speed over the ground and to detect the wind speed and direction while tracking plumes of odors. Walking insects are also known to use flow field cues to steer their trajectories. Can the upwind steering we observe in plume-tracking walking male cockroaches be explained by visual-flow-field feedback, as in flying moths? To answer this question, we experimentally occluded the compound eyes and ocelli of virgin P. americana males, separately and in combination, and challenged them with different wind and odor environments in our laboratory wind tunnel. They were observed responding to: (1) still air and no odor, (2) wind and no odor, (3) a wind-borne point-source pheromone plume and (4) a wide pheromone plume in wind. If walking cockroaches require visual cues to control their steering with respect to their environment, we would expect their tracks to be less directed and more variable if they cannot see. Instead, we found few statistically significant differences among behaviors exhibited by intact control cockroaches or those with their eyes occluded, under any of our environmental conditions. Working towards our goal of a comprehensive understanding of chemo-orientation in insects, we then challenged flying and walking male moths to track pheromone plumes with and without visual feedback. Neither walking nor flying moths performed as well as walking cockroaches when there was no visual information available.

  16. A Stochastic Burst Follows the Periodic Morning Peak in Individual Drosophila Locomotion

    PubMed Central

    Lazopulo, Stanislav; Lopez, Juan A.; Levy, Paul; Syed, Sheyum

    2015-01-01

    Coupling between cyclically varying external light and an endogenous biochemical oscillator known as the circadian clock modulates a rhythmic pattern with two prominent peaks in the locomotion of Drosophila melanogaster. A morning peak appears around the time lights turn on and an evening peak appears just before lights turn off. The close association between the peaks and the external 12:12 hour light/dark photoperiod means that the respective morning and evening peaks of individual flies are well-synchronized in time and, consequently, feature prominently in population-averaged data. Here, we report on a brief but strong stochastic burst in fly activity that, in contrast to the morning and evening peaks, is detectable only in single-fly recordings. This burst was observed across 3 wild-type strains of Drosophila melanogaster. In a single-fly recording, the burst is likely to appear once, randomly within 0.5–5 hours after lights turn on, last for only 2–3 minutes, and yet show 5 times greater activity than the maximum of the morning peak when data are binned in 3-minute intervals. Owing to its variable timing and short duration, the burst is virtually undetectable in population-averaged data. We use a locally-built illumination system to study the burst and find that its incidence in a population correlates with light intensity, with ~85% of control flies showing the behavior at 8000 lux (1942 μW/cm2). Consistent with that finding, several mutant flies with impaired vision show a substantially reduced frequency of the burst. Additionally, we find that genetic ablation of the clock has an insignificant effect on burst frequency. Together, these data suggest that the pronounced burst is likely generated by a light-activated circuit that is independent of the circadian clock. PMID:26528813
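    The burst criterion described above (activity in a 3-minute bin exceeding roughly 5x the morning-peak level) is easy to sketch as a detector. The function below is a hypothetical illustration, not the authors' analysis code:

```python
def find_bursts(per_min_counts, morning_peak, bin_size=3, ratio=5.0):
    """Flag candidate burst bins: sum per-minute activity counts into
    bin_size-minute bins and return the indices of bins whose total
    exceeds ratio times the (identically binned) morning-peak level."""
    bins = [sum(per_min_counts[i:i + bin_size])
            for i in range(0, len(per_min_counts), bin_size)]
    return [i for i, b in enumerate(bins) if b > ratio * morning_peak]
```

    Applied per fly, such a detector finds the burst at a different random time in each recording, which is exactly why averaging across flies washes it out.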

  18. Enabling Spacecraft Formation Flying through Position Determination, Control and Enhanced Automation Technologies

    NASA Technical Reports Server (NTRS)

    Bristow, John; Bauer, Frank; Hartman, Kate; How, Jonathan

    2000-01-01

    Formation Flying is revolutionizing the way the space community conducts science missions around the Earth and in deep space. This technological revolution will provide new, innovative ways for the community to gather scientific information, share that information between space vehicles and the ground, and expedite the human exploration of space. Once fully matured, formation flying will result in numerous sciencecraft acting as virtual platforms and sensor webs, gathering significantly more and better science data than can be collected today. To achieve this goal, key technologies must be developed, including those that address the following basic questions posed by the spacecraft: Where am I? Where is the rest of the fleet? Where do I need to be? What do I have to do (and what am I able to do) to get there? The answers to these questions and the means to implement those answers will depend on the specific mission needs and formation configuration. However, certain critical technologies are common to most formations. These technologies include high-precision position and relative-position knowledge, including Global Positioning System (GPS) and celestial navigation; high degrees of spacecraft autonomy; inter-spacecraft communication capabilities; and targeting and control, including distributed control algorithms and high-precision control thrusters and actuators. This paper provides an overview of a selection of the current activities through which NASA/DoD/Industry/Academia are working to develop Formation Flying technologies as quickly as possible, the hurdles that need to be overcome to achieve our formation flying vision, and the team's approach to transfer this technology to space. It also describes several of the formation flying testbeds, such as Orion and University Nanosatellites, that are being developed to demonstrate and validate many of these innovative sensing and formation control technologies.

  19. The Verriest Lecture: Color lessons from space, time, and motion

    PubMed Central

    Shevell, Steven K.

    2012-01-01

    The appearance of a chromatic stimulus depends on more than the wavelengths composing it. The scientific literature has countless examples showing that spatial and temporal features of light influence the colors we see. Studying chromatic stimuli that vary over space, time or direction of motion has a further benefit beyond predicting color appearance: the unveiling of otherwise concealed neural processes of color vision. Spatial or temporal stimulus variation uncovers multiple mechanisms of brightness and color perception at distinct levels of the visual pathway. Spatial variation in chromaticity and luminance can change perceived three-dimensional shape, an example of chromatic signals that affect a percept other than color. Chromatic objects in motion expose the surprisingly weak link between the chromaticity of objects and their physical direction of motion, and the role of color in inducing an illusory motion direction. Space, time and motion – color’s colleagues – reveal the richness of chromatic neural processing. PMID:22330398

  20. Crowd motion segmentation and behavior recognition fusing streak flow and collectiveness

    NASA Astrophysics Data System (ADS)

    Gao, Mingliang; Jiang, Jun; Shen, Jin; Zou, Guofeng; Fu, Guixia

    2018-04-01

    Crowd motion segmentation and crowd behavior recognition are two active problems in computer vision, and a number of methods have been proposed to tackle them. Among these methods, flow dynamics is used to model crowd motion, with little consideration of collective properties. Moreover, traditional crowd behavior recognition methods treat local features and dynamic features separately and overlook the interconnection of topological and dynamical heterogeneity in complex crowd processes. Here, a crowd motion segmentation method and a crowd behavior recognition method are proposed based on streak flow and crowd collectiveness. The streak flow is adopted to reveal the dynamical property of crowd motion, and the collectiveness is incorporated to reveal the structural property. Experimental results show that the proposed methods improve crowd motion segmentation accuracy and crowd recognition rates compared with state-of-the-art methods.
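    Streak flow, unlike instantaneous optical flow, integrates motion over time: particles released at a point every frame are advected through the per-frame flow fields, and their positions trace a streakline. A minimal sketch of that advection step, under assumed array conventions (HxWx2 flow fields holding per-pixel dx/dy displacements); this is a generic illustration, not the paper's method:

```python
import numpy as np

def streaklines(flows, seed, n_frames):
    """Release a particle at `seed` (x, y) every frame and advect all live
    particles through the per-frame flow fields `flows` (each HxWx2, with
    dx/dy per pixel). Returns the final particle positions -- a streakline."""
    particles = []
    for t in range(n_frames):
        particles.append(np.array(seed, dtype=float))  # new release
        flow = flows[t]
        h, w = flow.shape[:2]
        for p in particles:                            # advect every particle
            x = int(np.clip(round(p[0]), 0, w - 1))
            y = int(np.clip(round(p[1]), 0, h - 1))
            p += flow[y, x]                            # move by local flow
    return particles
```

    With a constant rightward flow of 1 px/frame, the streakline is a trail of particles spaced by how long each has been advected, which is the temporal history that instantaneous flow discards.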

  1. USAF Test Pilot School. Flying Qualities Textbook, Volume 2 Part 2

    DTIC Science & Technology

    1986-04-01

    regime that precipitates entry into a PSG, spin, or deep stall condition (MIL-F-83691A, Reference 10.4, Paragraph 6.3.9). Notice two things about...motions may result after departure - the aircraft enters either a PSG, spin, or deep stall (of course, a PSG can progress into a spin or deep stall...gyration," "spin," and "deep stalls," used to define a departure. 10.3.1.3 Post-Stall Gyration. A post-stall gyration is an uncontrolled motion about one

  2. Integration across Time Determines Path Deviation Discrimination for Moving Objects

    PubMed Central

    Whitaker, David; Levi, Dennis M.; Kennedy, Graeme J.

    2008-01-01

    Background Human vision is vital in determining our interaction with the outside world. In this study we characterize our ability to judge changes in the direction of motion of objects–a common task which can allow us either to intercept moving objects, or else avoid them if they pose a threat. Methodology/Principal Findings Observers were presented with objects which moved across a computer monitor on a linear path until the midline, at which point they changed their direction of motion, and observers were required to judge the direction of change. In keeping with the variety of objects we encounter in the real world, we varied characteristics of the moving stimuli such as velocity, extent of motion path and the object size. Furthermore, we compared performance for moving objects with the ability of observers to detect a deviation in a line which formed the static trace of the motion path, since it has been suggested that a form of static memory trace may form the basis for these types of judgment. The static line judgments were well described by a ‘scale invariant’ model in which any two stimuli which possess the same two-dimensional geometry (length/width) result in the same level of performance. Performance for the moving objects was entirely different. Irrespective of the path length, object size or velocity of motion, path deviation thresholds depended simply upon the duration of the motion path in seconds. Conclusions/Significance Human vision has long been known to integrate information across space in order to solve spatial tasks such as judgment of orientation or position. Here we demonstrate an intriguing mechanism which integrates direction information across time in order to optimize the judgment of path deviation for moving objects. PMID:18414653

  3. Evaluation of simulation motion fidelity criteria in the vertical and directional axes

    NASA Technical Reports Server (NTRS)

    Schroeder, Jeffery A.

    1993-01-01

    An evaluation of existing motion fidelity criteria was conducted on the NASA Ames Vertical Motion Simulator. Experienced test pilots flew single-axis repositioning tasks in both the vertical and the directional axes. Using a first-order approximation of a hovering helicopter, tasks were flown with variations only in the filters that attenuate the commands to the simulator motion system. These filters had second-order high-pass characteristics, and the variations were made in the filter gain and natural frequency. The variations spanned motion response characteristics from nearly full math-model motion to fixed-base. Between configurations, pilots recalibrated their motion response perception by flying the task with full motion. Pilots subjectively rated the motion fidelity of subsequent configurations relative to this full motion case, which was considered the standard for comparison. The results suggested that the existing vertical-axis criterion was accurate for combinations of gain and natural frequency changes. However, if only the gain or the natural frequency was changed, the rated motion fidelity was better than the criterion predicted. In the vertical axis, the objective and subjective results indicated that a larger gain reduction was tolerated than the existing criterion allowed. The limited data collected in the yaw axis revealed that pilots had difficulty in distinguishing among the variations in the pure yaw motion cues.
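    A second-order high-pass motion filter of the kind varied in this study can be written as H(s) = K s^2 / (s^2 + 2*zeta*wn*s + wn^2): acceleration onsets pass through at gain K, while sustained commands are "washed out" so the platform drifts back toward neutral. A sketch with assumed parameter values (the study's actual gains and natural frequencies are not reproduced here):

```python
def washout(accel_cmds, dt, gain=0.8, wn=1.0, zeta=0.7):
    """Second-order high-pass motion (washout) filter
        H(s) = K s^2 / (s^2 + 2*zeta*wn*s + wn^2),
    integrated with a simple semi-implicit Euler step. Sustained input
    decays to zero output; transients pass through scaled by `gain`."""
    a0, a1 = wn * wn, 2.0 * zeta * wn
    x1 = x2 = 0.0                      # filter states
    out = []
    for u in accel_cmds:
        y = gain * (u - a0 * x1 - a1 * x2)   # output before state update
        x1 += dt * x2                        # integrate states
        x2 += dt * (u - a0 * x1 - a1 * x2)
        out.append(y)
    return out
```

    Feeding a step input shows both criterion parameters at work: `gain` sets the initial response and `wn` sets how quickly the cue is washed out, the two quantities varied between configurations in the experiment.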

  4. Perceived orientation of a runway model in nonpilots during simulated night approaches to landing.

    DOT National Transportation Integrated Search

    1977-07-01

    Illusions due to reduced visual cues at night have long been cited as contributing to the dangerous tendency of pilots to fly too low during night landing approaches. The cue of motion parallax (a difference in rate of apparent movement of objects in...

  5. Vision Assisted Navigation for Miniature Unmanned Aerial Vehicles (MAVs)

    DTIC Science & Technology

    2009-11-01

    commanded to orbit a target of known location. The error in target geolocation is shown for 200 frames with filtering (dashed line) and without (solid...so the performance of the filter was determined by using the estimated poses to solve a geolocation problem. An MAV flying at an altitude of 70 meters... geolocation as well as significantly reducing the short-term variance in the estimates based on the GPS/IMU alone. Due to the nature of the autopilot

  6. Mathematical modelling of animate and intentional motion.

    PubMed Central

    Rittscher, Jens; Blake, Andrew; Hoogs, Anthony; Stein, Gees

    2003-01-01

    Our aim is to enable a machine to observe and interpret the behaviour of others. Mathematical models are employed to describe certain biological motions. The main challenge is to design models that are both tractable and meaningful. In the first part we will describe how computer vision techniques, in particular visual tracking, can be applied to recognize a small vocabulary of human actions in a constrained scenario. Mainly the problems of viewpoint and scale invariance need to be overcome to formalize a general framework. Hence the second part of the article is devoted to the question whether a particular human action should be captured in a single complex model or whether it is more promising to make extensive use of semantic knowledge and a collection of low-level models that encode certain motion primitives. Scene context plays a crucial role if we intend to give a higher-level interpretation rather than a low-level physical description of the observed motion. A semantic knowledge base is used to establish the scene context. This approach consists of three main components: visual analysis, the mapping from vision to language and the search of the semantic database. A small number of robust visual detectors is used to generate a higher-level description of the scene. The approach together with a number of results is presented in the third part of this article. PMID:12689374

  7. Automated tracking of animal posture and movement during exploration and sensory orientation behaviors.

    PubMed

    Gomez-Marin, Alex; Partoune, Nicolas; Stephens, Greg J; Louis, Matthieu; Brembs, Björn

    2012-01-01

    The nervous functions of an organism are primarily reflected in the behavior it is capable of. Measuring behavior quantitatively, at high resolution and in an automated fashion, provides valuable information about the underlying neural circuit computation. Accordingly, computer-vision applications for animal tracking are becoming a key complementary toolkit to genetic, molecular and electrophysiological characterization in systems neuroscience. We present Sensory Orientation Software (SOS) to measure behavior and infer sensory experience correlates. SOS is a simple and versatile system to track body posture and motion of single animals in two-dimensional environments. In the presence of a sensory landscape, tracking the trajectory of the animal's sensors and its postural evolution provides a quantitative framework to study sensorimotor integration. To illustrate the utility of SOS, we examine the orientation behavior of fruit fly larvae in response to odor, temperature and light gradients. We show that SOS is suitable to carry out high-resolution behavioral tracking for a wide range of organisms including flatworms, fishes and mice. Our work contributes to the growing repertoire of behavioral analysis tools for collecting rich and fine-grained data to draw and test hypotheses about the functioning of the nervous system. By providing open access to our code and documenting the software design, we aim to encourage the adaptation of SOS by a wide community of non-specialists to their particular model organisms and questions of interest.

  8. Drogue tracking using 3D flash lidar for autonomous aerial refueling

    NASA Astrophysics Data System (ADS)

    Chen, Chao-I.; Stettner, Roger

    2011-06-01

    Autonomous aerial refueling (AAR) is an important capability for an unmanned aerial vehicle (UAV) to increase its flying range and endurance without increasing its size. This paper presents a novel tracking method that utilizes both 2D intensity and 3D point-cloud data acquired with a 3D Flash LIDAR sensor to establish relative position and orientation between the receiver vehicle and drogue during an aerial refueling process. Unlike classic, vision-based sensors, a 3D Flash LIDAR sensor can provide 3D point-cloud data in real time without motion blur, in the day or night, and is capable of imaging through fog and clouds. The proposed method segments out the drogue through 2D analysis and estimates the center of the drogue from 3D point-cloud data for flight trajectory determination. A level-set front propagation routine is first employed to identify the target of interest and establish its silhouette information. Sufficient domain knowledge, such as the size of the drogue and the expected operable distance, is integrated into our approach to quickly eliminate unlikely target candidates. A statistical analysis along with a random sample consensus (RANSAC) is performed on the target to reduce noise and estimate the center of the drogue after all 3D points on the drogue are identified. The estimated center and drogue silhouette serve as the seed points to efficiently locate the target in the next frame.
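    The RANSAC step can be illustrated in two dimensions: fit a circle through three randomly sampled points, count how many points agree with the fitted model, and keep the best one, so rim points dominate while stray noise points are ignored. The code below is a generic sketch of that idea, not the paper's implementation:

```python
import math
import random

def circumcircle(p1, p2, p3):
    """Center and radius of the circle through three 2-D points."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None                      # (near-)collinear sample, reject
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def ransac_circle(points, iters=200, tol=0.05, rng=random.Random(0)):
    """RANSAC: repeatedly fit a circle to 3 random points and keep the
    model with the most inliers (points whose distance to the center is
    within tol of the fitted radius)."""
    best, best_inliers = None, -1
    for _ in range(iters):
        fit = circumcircle(*rng.sample(points, 3))
        if fit is None:
            continue
        (ux, uy), r = fit
        inliers = sum(1 for (x, y) in points
                      if abs(math.hypot(x - ux, y - uy) - r) < tol)
        if inliers > best_inliers:
            best, best_inliers = fit, inliers
    return best
```

    The same consensus logic extends to the 3-D point cloud, where the model is the drogue's circular rim and the recovered center feeds the trajectory estimate.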

  9. Unchanging visions: the effects and limitations of ocular stillness

    PubMed Central

    Macknik, Stephen L.

    2017-01-01

    Scientists have pondered the perceptual effects of ocular motion, and those of its counterpart, ocular stillness, for over 200 years. The unremitting ‘trembling of the eye’ that occurs even during gaze fixation was first noted by Jurin in 1738. In 1794, Erasmus Darwin documented that gaze fixation produces perceptual fading, a phenomenon rediscovered in 1804 by Ignaz Paul Vital Troxler. Studies in the twentieth century established that Jurin's ‘eye trembling’ consisted of three main types of ‘fixational’ eye movements, now called microsaccades (or fixational saccades), drifts and tremor. Yet, owing to the constant and minute nature of these motions, the study of their perceptual and physiological consequences has met significant technological challenges. Studies starting in the 1950s and continuing to the present have attempted to study vision during retinal stabilization—a technique that consists of shifting any and all visual stimuli presented to the eye in such a way as to nullify all concurrent eye movements—providing a tantalizing glimpse of vision in the absence of change. No research to date has achieved perfect retinal stabilization, however, and so other work has devised substitute ways to counteract eye motion, such as by studying the perception of afterimages or of the entoptic images formed by retinal vessels, which are completely stable with respect to the eye. Yet other research has taken the alternative tack of controlling eye motion by behavioural instruction to fix one's gaze or to keep one's gaze still, during concurrent physiological and/or psychophysical measurements. Here, we review the existing data—from historical and contemporary studies that have aimed to nullify or minimize eye motion—on the perceptual and physiological consequences of perfect versus imperfect fixation. We also discuss the accuracy, quality and stability of ocular fixation, and the bottom–up and top–down influences that affect fixation behaviour.
This article is part of the themed issue ‘Movement suppression: brain mechanisms for stopping and stillness’. PMID:28242737

  10. The Effectiveness of Simulator Motion in the Transfer of Performance on a Tracking Task Is Influenced by Vision and Motion Disturbance Cues.

    PubMed

    Grundy, John G; Nazar, Stefan; O'Malley, Shannon; Mohrenshildt, Martin V; Shedden, Judith M

    2016-06-01

    To examine the importance of platform motion to the transfer of performance in motion simulators. The importance of platform motion in simulators for pilot training is strongly debated. We hypothesized that the type of motion (e.g., disturbance) contributes significantly to performance differences. Participants used a joystick to perform a target tracking task in a pod on top of a MOOG Stewart motion platform. Five conditions compared training without motion, with correlated motion, with disturbance motion, with disturbance motion isolated to the visual display, and with both correlated and disturbance motion. The test condition involved the full motion model with both correlated and disturbance motion. We analyzed speed and accuracy across training and test as well as strategic differences in joystick control. Training with disturbance cues produced critical behavioral differences compared to training without disturbance; motion itself was less important. Incorporation of disturbance cues is a potentially important source of variance between studies that do or do not show a benefit of motion platforms in the transfer of performance in simulators. Potential applications of this research include the assessment of the importance of motion platforms in flight simulators, with a focus on the efficacy of incorporating disturbance cues during training. © 2016, Human Factors and Ergonomics Society.

  11. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems

    PubMed Central

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-01-01

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter ones, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information. PMID:27999318
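    The core DGPS/Vision idea, one direction measured in two frames, can be reduced to a single-angle toy: the chief-to-deputy baseline from differential GPS (world frame) and the camera bearing to the deputy (body frame) differ by the chief's yaw. A hypothetical, level-flight-only sketch (the paper's algorithm estimates full attitude and fuses it in an EKF):

```python
import math

def heading_from_dgps_vision(baseline_ned, cam_azimuth_body):
    """Yaw estimate from one deputy sighting, assuming a level chief UAV.
    baseline_ned     : (north, east) chief->deputy vector from DGPS [m]
    cam_azimuth_body : bearing of the deputy in the chief's body frame [rad]
    The same direction expressed in the two frames differs by the yaw."""
    n, e = baseline_ned
    azimuth_ned = math.atan2(e, n)            # direction in the world frame
    yaw = azimuth_ned - cam_azimuth_body      # world-to-body rotation about z
    return math.atan2(math.sin(yaw), math.cos(yaw))  # wrap to [-pi, pi]
```

    A deputy due east that appears 90 degrees to the right of the nose implies the chief is facing north; this magnetometer-free attitude cue is what the virtual navigation sensor contributes to the fusion filter.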

  12. Apparent motion determined by surface layout not by disparity or three-dimensional distance.

    PubMed

    He, Z J; Nakayama, K

    1994-01-13

    The most meaningful events ecologically, including the motion of objects, occur in relation to or on surfaces. We run along the ground, cars travel on roads, balls roll across lawns, and so on. Even though there are other motions, such as the flight of birds, motion along surfaces is likely more frequent and more significant biologically. To examine whether events occurring in relation to surfaces have a preferred status in terms of visual representation, we asked whether the phenomenon of apparent motion would show a preference for motion attached to surfaces. Using a competitive three-dimensional motion paradigm, we found a preference to see motion between tokens placed within the same disparity plane as opposed to in different planes. Supporting our surface-layout hypothesis, the effect of disparity was eliminated either by slanting the tokens so that they were all seen within the same surface plane or by inserting a single slanted background surface upon which the tokens could rest. Additionally, a highly curved stereoscopic surface led to the perception of a more circuitous motion path defined by that surface, instead of the shortest path in three-dimensional space.

  13. Recent progress in millimeter-wave sensor system capabilities for enhanced (synthetic) vision

    NASA Astrophysics Data System (ADS)

    Hellemann, Karlheinz; Zachai, Reinhard

    1999-07-01

    Weather- and daylight-independent operation of modern traffic systems is strongly required for optimized and economic availability. In particular, helicopters, small aircraft, and military transport aircraft that frequently operate close to the ground need effective and affordable Enhanced Vision sensors. Technical progress in sensor technology and processing speed today makes new concepts realizable. Against this background, the paper reports on the improvements under development within the HiVision program at DaimlerChrysler Aerospace. A sensor demonstrator based on FMCW radar technology, with a high information update rate and operating in the mm-wave band, has been upgraded to improve performance and fitted for flight on an experimental basis. The results achieved so far demonstrate the capability to produce weather-independent enhanced vision. In addition, the demonstrator has been tested on board a high-speed ferry on the Baltic Sea, because fast vessels have a similar need for weather-independent operation and anti-collision measures. In the future, one sensor type may serve both 'worlds' and help make traffic safer and more efficient. The described demonstrator fills the technology gap between optical sensors (infrared) and standard pulse radars with its specific features, such as high-speed scanning and weather penetration, with the additional benefit of cost-effectiveness.
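
    In an FMCW radar, target range is recovered from the beat frequency between the transmitted frequency sweep and the received echo. A minimal sketch of that conversion follows; the sweep parameters in the example are generic assumptions, not those of the HiVision demonstrator:

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz, sweep_bandwidth_hz, sweep_time_s):
    """Range from the beat frequency of a linear FMCW sweep: the echo
    delay 2R/c shifts the echo by f_b = (B/T) * 2R/c relative to the
    transmitted chirp, so R = c * f_b * T / (2 * B)."""
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# e.g. for an assumed 150 MHz sweep over 1 ms,
# a 1 kHz beat corresponds to roughly 1 m of range
r = fmcw_range(1000.0, 150e6, 1e-3)
```

    Range resolution in this scheme is set by the sweep bandwidth, which is why mm-wave bands, where wide sweeps are available, suit enhanced-vision imaging.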

  14. Artifact mitigation of ptychography integrated with on-the-fly scanning probe microscopy

    DOE PAGES

    Huang, Xiaojing; Yan, Hanfei; Ge, Mingyuan; ...

    2017-07-11

    In this paper, we report our experience conducting ptychography simultaneously with X-ray fluorescence measurements in on-the-fly mode for efficient multi-modality imaging. We demonstrate that the periodic artifact inherent to the raster scan pattern can be mitigated using a sufficiently fine scan step size that provides an overlap ratio of >70%. This allows us to obtain transmitted phase-contrast images with enhanced spatial resolution from ptychography while maintaining fluorescence imaging with continuous-motion scans on pixelated grids. This capability will greatly improve the versatility and throughput of scanning-probe X-ray microscopy.
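
    The >70% overlap criterion ties the scan step size directly to the probe size. A small sketch using the common linear overlap definition, which is one of several conventions in the ptychography literature (the probe size in the example is an assumption):

```python
def linear_overlap(step_size, probe_diameter):
    """Overlap ratio between adjacent scan positions, using the linear
    definition 1 - step/diameter."""
    return 1.0 - step_size / probe_diameter

def max_step_for_overlap(probe_diameter, min_overlap=0.70):
    """Largest step size that still meets the overlap requirement."""
    return probe_diameter * (1.0 - min_overlap)

# e.g. a 50 nm probe needs scan steps of at most 15 nm for >=70% overlap
step_nm = max_step_for_overlap(50.0)
```

    Finer steps raise the overlap (and the redundancy the ptychographic reconstruction relies on) at the cost of more scan points per field of view.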

  15. The aerodynamics and control of free flight manoeuvres in Drosophila.

    PubMed

    Dickinson, Michael H; Muijres, Florian T

    2016-09-26

    A firm understanding of how fruit flies hover has emerged over the past two decades, and recent work has focused on the aerodynamic, biomechanical and neurobiological mechanisms that enable them to manoeuvre and resist perturbations. In this review, we describe how flies manipulate wing movement to control their body motion during active manoeuvres, and how these actions are regulated by sensory feedback. We also discuss how the application of control theory is providing new insight into the logic and structure of the circuitry that underlies flight stability. This article is part of the themed issue 'Moving in a moving medium: new perspectives on flight'. © 2016 The Author(s).
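
    One way control theory enters such analyses is through the effect of sensory delay on feedback stability. The sketch below is illustrative only: it models groundspeed regulation as delayed proportional feedback with an optional fast damping term, with the ~100 ms visual delay and all gains chosen for demonstration rather than fitted to fly data, and the fast damping pathway treated as effectively instantaneous:

```python
def speed_error(k_vision, k_damp, delay=0.10, dt=0.001, t_end=5.0):
    """Groundspeed error v(t) under delayed visual feedback plus fast
    damping:  dv/dt = -k_vision * v(t - delay) - k_damp * v(t).
    Forward-Euler integration with a constant pre-history of 1."""
    lag = int(round(delay / dt))
    v = [1.0] * (lag + 1)            # error starts at 1
    for _ in range(int(t_end / dt)):
        dv = -k_vision * v[-1 - lag] - k_damp * v[-1]
        v.append(v[-1] + dv * dt)
    return v
```

    With the delayed visual pathway alone at high gain (gain x delay exceeding pi/2), the loop develops growing oscillations; adding a sufficiently strong low-latency damping term restores stability. This is the qualitative trade the multimodal feedback architecture of flies is thought to exploit.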

  16. The aerodynamics and control of free flight manoeuvres in Drosophila

    PubMed Central

    Muijres, Florian T.

    2016-01-01

    A firm understanding of how fruit flies hover has emerged over the past two decades, and recent work has focused on the aerodynamic, biomechanical and neurobiological mechanisms that enable them to manoeuvre and resist perturbations. In this review, we describe how flies manipulate wing movement to control their body motion during active manoeuvres, and how these actions are regulated by sensory feedback. We also discuss how the application of control theory is providing new insight into the logic and structure of the circuitry that underlies flight stability. This article is part of the themed issue ‘Moving in a moving medium: new perspectives on flight’. PMID:27528778

  17. Vestibular selection criteria development. [Assessing susceptibility to motion sickness during orbital space flight]

    NASA Technical Reports Server (NTRS)

    Lackner, J. R.

    1981-01-01

    The experimental elicitation of motion sickness using a short-arm centrifuge or a rotating chair surrounded by a striped cylindrical enclosure failed to reveal any systematic group or consistent individual relationship between changes in heart rate, blood pressure, and body temperature and the appearance of symptoms of motion sickness. A study of the influence of vision on susceptibility to motion sickness during sudden-stop simulation shows that having the eyes open during any part of the sudden-stop assessment is more stressful than having them closed throughout the test. Subjects were found to be highly susceptible to motion sickness when tested in free fall and in high-force phases of flight. The effects of touch and pressure cues on body orientation during rotation and in parabolic flight are considered in terms of sensory as well as motor adaptation.

  18. Relative Navigation for Formation Flying of Spacecraft

    NASA Technical Reports Server (NTRS)

    Alonso, Roberto; Du, Ju-Young; Hughes, Declan; Junkins, John L.; Crassidis, John L.

    2001-01-01

    This paper presents a robust and efficient approach for relative navigation and attitude estimation of spacecraft flying in formation. The approach uses measurements from a new optical sensor that provides a line-of-sight vector from the master spacecraft to the secondary satellite. The overall system provides a novel, reliable, and autonomous relative navigation and attitude determination capability, employs relatively simple electronic circuits with modest digital signal processing requirements, and is fully independent of any external systems. Experimental calibration results, which are used to achieve accurate line-of-sight measurements, are presented. State estimation for formation flying is achieved through an optimal observer design. Also, because the rotational and translational motions are coupled through the observation vectors, three approaches are suggested to separate the two signals solely for stability analysis. Simulation and experimental results indicate that the combined sensor/estimator approach provides accurate relative position and attitude estimates.
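
    A line-of-sight sensor of this kind ultimately reports a unit vector toward the target in the sensor frame. Assuming a generic pinhole projection model (not the calibrated model of the paper's actual sensor), the measurement can be formed from focal-plane coordinates as:

```python
import math

def los_unit_vector(x_fp, y_fp, focal_length):
    """Line-of-sight unit vector in the sensor frame from the focal-plane
    coordinates (same units as focal_length) of the secondary
    satellite's image.  Generic pinhole model, illustrative only."""
    norm = math.sqrt(x_fp ** 2 + y_fp ** 2 + focal_length ** 2)
    return (x_fp / norm, y_fp / norm, focal_length / norm)
```

    Several such unit vectors to known beacon positions are what couple the rotational and translational states in the observation model, which is why the paper needs dedicated approaches to separate the two motions for stability analysis.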

  19. Comparing the aerodynamic forces produced by dragonfly forewings during inverted and non-inverted flight

    NASA Astrophysics Data System (ADS)

    Shumway, Nathan; Gabryszuk, Mateusz; Laurence, Stuart

    2017-11-01

    Experiments were conducted with live dragonflies to determine their wing kinematics during free flight. The motion of one forewing in two different tests, in one of which the dragonfly was inverted, was described using piecewise functions and simulated using the OVERTURNS Reynolds-averaged Navier-Stokes solver, which has been used in previous work to determine trim conditions for a fruit fly model. For the inverted dragonfly, the upstrokes were significantly longer than the downstrokes, the pitching amplitude was lower than that for right-side-up flight, and the flap amplitude was larger. Simulations of the kinematics of a single forewing are presented to determine how the forces differ between a dragonfly flying inverted and one flying right-side up. This work was supported by the United States Army Research Laboratory's Micro Autonomous Systems and Technology Collaborative Technology Alliance Project MCE-16-17 1.2.
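
    Describing wing motion with piecewise functions lets the upstroke and downstroke occupy unequal fractions of the stroke cycle, as observed for the inverted dragonfly. A minimal illustrative form follows; the half-cosine shape and the parameter values are assumptions, not the fitted kinematics from the experiments:

```python
import math

def flap_angle(t, period, upstroke_fraction, amplitude):
    """Flap angle (degrees) of a wing whose upstroke and downstroke
    take unequal fractions of the stroke period, each modelled as a
    half-cosine ramp between 0 and the full amplitude."""
    phase = (t / period) % 1.0
    if phase < upstroke_fraction:              # slower upstroke
        s = phase / upstroke_fraction
    else:                                      # faster downstroke
        s = 1.0 - (phase - upstroke_fraction) / (1.0 - upstroke_fraction)
    # s ramps 0 -> 1 over the upstroke and 1 -> 0 over the downstroke
    return amplitude * 0.5 * (1.0 - math.cos(math.pi * s))
```

    Setting upstroke_fraction above 0.5 reproduces the inverted-flight observation that upstrokes last longer than downstrokes while keeping the motion periodic and continuous.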

  20. Distance underestimation in virtual space is sensitive to gender but not activity-passivity or mode of interaction.

    PubMed

    Foreman, Nigel; Sandamas, George; Newson, David

    2004-08-01

    Four groups of undergraduates (half of each gender) experienced movement along a corridor containing three distinctive objects in a virtual environment (VE) with wide-screen projection. One group simulated walking along the virtual corridor using a proprietary step-exercise device. A second group moved along the corridor in conventional flying mode, depressing a keyboard key to initiate continuous forward motion. Two further groups observed the walking and flying participants by viewing their progress on the screen. Participants then had to walk along a real, equivalent but empty corridor and indicate the positions of the three objects. All groups underestimated distances in the real corridor, with the greatest underestimates occurring for the middle-distance object. Males' underestimations were significantly smaller than females' at all distances. However, there was no difference between active participants and passive observers, nor between the walking and flying conditions.
