The optic flow experienced by a rotating observer (in this case a fly). The direction and magnitude of optic flow at each location is represented by the direction and length of each arrow.
Optical flow or optic flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer (an eye or a camera) and the scene.^{[1]}^{[2]} The concept of optical flow was introduced by the American psychologist James J. Gibson in the 1940s to describe the visual stimulus provided to animals moving through the world.^{[3]} Gibson stressed the importance of optic flow for affordance perception, the ability to discern possibilities for action within the environment. Followers of Gibson and his ecological approach to psychology have further demonstrated the role of the optical flow stimulus for the perception of movement by the observer in the world; the perception of the shape, distance and movement of objects in the world; and the control of locomotion.^{[4]} The term optical flow is also used by roboticists, encompassing related techniques from image processing and control of navigation, including motion detection, object segmentation, time-to-contact information, focus-of-expansion calculations, luminance, motion-compensated encoding, and stereo disparity measurement.^{[5]}^{[6]}
Contents

1 Estimation
1.1 Methods for determination
2 Uses
3 Optical flow sensor
4 See also
5 References
6 External links
Estimation
Sequences of ordered images allow the estimation of motion as either instantaneous image velocities or discrete image displacements.^{[6]} Fleet and Weiss provide a tutorial introduction to gradient-based optical flow.^{[7]} John L. Barron, David J. Fleet, and Steven Beauchemin provide a performance analysis of a number of optical flow techniques, emphasizing the accuracy and density of measurements.^{[8]}
Optical flow methods try to calculate the motion between two image frames, taken at times t and t+\Delta t, at every voxel position. These methods are called differential since they are based on local Taylor series approximations of the image signal; that is, they use partial derivatives with respect to the spatial and temporal coordinates.
For a 2D+t dimensional case (3D or nD cases are similar) a voxel at location (x,y,t) with intensity I(x,y,t) will have moved by \Delta x, \Delta y and \Delta t between the two image frames, and the following brightness constancy constraint can be given:

I(x,y,t) = I(x+\Delta x, y + \Delta y, t + \Delta t)
Assuming the movement to be small, the image constraint at I(x,y,t) can be expanded with a Taylor series to give:

I(x+\Delta x,y+\Delta y,t+\Delta t) = I(x,y,t) + \frac{\partial I}{\partial x}\Delta x+\frac{\partial I}{\partial y}\Delta y+\frac{\partial I}{\partial t}\Delta t+H.O.T.
Truncating the higher-order terms, it follows from these equations that:

\frac{\partial I}{\partial x}\Delta x+\frac{\partial I}{\partial y}\Delta y+\frac{\partial I}{\partial t}\Delta t = 0
or

\frac{\partial I}{\partial x}\frac{\Delta x}{\Delta t}+\frac{\partial I}{\partial y}\frac{\Delta y}{\Delta t}+\frac{\partial I}{\partial t}\frac{\Delta t}{\Delta t} = 0
which results in

\frac{\partial I}{\partial x}V_x+\frac{\partial I}{\partial y}V_y+\frac{\partial I}{\partial t} = 0
where V_x,V_y are the x and y components of the velocity or optical flow of I(x,y,t) and \tfrac{\partial I}{\partial x}, \tfrac{\partial I}{\partial y} and \tfrac{\partial I}{\partial t} are the derivatives of the image at (x,y,t) in the corresponding directions. In the following, I_x, I_y and I_t are written for these derivatives.
Thus:

I_xV_x+I_yV_y=-I_t
or

\nabla I\cdot\vec{V} = -I_t
This is a single equation in two unknowns and cannot be solved as such; this indeterminacy is known as the aperture problem of optical flow algorithms. To find the optical flow, another set of equations is needed, given by some additional constraint. All optical flow methods introduce additional conditions for estimating the actual flow.
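To make the derivation concrete, here is a minimal NumPy sketch (the synthetic frames, the helper name `gradients`, and the finite-difference scheme are illustrative assumptions, not part of the original text) that estimates I_x, I_y and I_t from two frames; at each pixel these supply one linear equation in the two unknowns V_x and V_y:

```python
import numpy as np

def gradients(frame1, frame2):
    """Finite-difference estimates of the derivatives I_x, I_y, I_t."""
    Ix = np.gradient(frame1, axis=1)  # spatial derivative along x (columns)
    Iy = np.gradient(frame1, axis=0)  # spatial derivative along y (rows)
    It = frame2 - frame1              # temporal difference between frames
    return Ix, Iy, It

# Synthetic example: a horizontal intensity ramp shifted one pixel to the right.
f1 = np.tile(np.arange(8.0), (8, 1))
f2 = np.roll(f1, 1, axis=1)
Ix, Iy, It = gradients(f1, f2)
# At interior pixels Iy = 0, so the constraint I_x*V_x + I_y*V_y + I_t = 0
# reduces to V_x = -I_t / I_x, which recovers the one-pixel shift.
```

With I_y = 0 everywhere, a single pixel already determines V_x in this contrived case; in general each pixel supplies only one equation for two unknowns, which is exactly the aperture problem.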
Methods for determination

Phase correlation – inverse of normalized cross-power spectrum

Block-based methods – minimizing sum of squared differences or sum of absolute differences, or maximizing normalized cross-correlation

Differential methods of estimating optical flow, based on partial derivatives of the image signal and/or the sought flow field and higher-order partial derivatives, such as:

Lucas–Kanade method – regarding image patches and an affine model for the flow field

Horn–Schunck method – optimizing a functional based on residuals from the brightness constancy constraint, and a particular regularization term expressing the expected smoothness of the flow field

Buxton–Buxton method – based on a model of the motion of edges in image sequences^{[9]}

Black–Jepson method – coarse optical flow via correlation^{[6]}

General variational methods – a range of modifications/extensions of Horn–Schunck, using other data terms and other smoothness terms.

Discrete optimization methods – the search space is quantized, and then image matching is addressed through label assignment at every pixel, such that the corresponding deformation minimizes the distance between the source and the target image.^{[10]} The optimal solution is often recovered through max-flow min-cut theorem algorithms, linear programming or belief propagation methods.
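As an illustration of the Lucas–Kanade idea from the list above, the sketch below (the function name and the synthetic patch data are assumptions for illustration) stacks the constraint I_x V_x + I_y V_y = -I_t over all pixels of a patch and solves the resulting overdetermined system by least squares, under the Lucas–Kanade assumption that the flow is constant within the patch:

```python
import numpy as np

def lucas_kanade_patch(Ix, Iy, It):
    """Least-squares flow (Vx, Vy) for one patch, assuming constant flow:
    stack Ix*Vx + Iy*Vy = -It over every pixel and solve A v = b."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)  # N x 2 design matrix
    b = -It.ravel()                                 # N right-hand sides
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                                        # (Vx, Vy)

# Synthetic 5x5 patch whose temporal derivative is consistent
# with a true flow of (1.0, 0.5).
rng = np.random.default_rng(0)
Ix = rng.standard_normal((5, 5))
Iy = rng.standard_normal((5, 5))
It = -(Ix * 1.0 + Iy * 0.5)
v = lucas_kanade_patch(Ix, Iy, It)
```

The least-squares solve only succeeds when the patch contains gradients in more than one direction; on an edge or a uniform region the 2x2 normal matrix is rank-deficient, which is the aperture problem reappearing at the patch level.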
Many of these methods, in addition to the current state-of-the-art algorithms, are evaluated on the Middlebury Benchmark Dataset.^{[11]}
Uses
Motion estimation and video compression have developed as a major aspect of optical flow research. While the optical flow field is superficially similar to a dense motion field derived from the techniques of motion estimation, optical flow is the study not only of the determination of the optical flow field itself, but also of its use in estimating the three-dimensional nature and structure of the scene, as well as the 3D motion of objects and the observer relative to the scene, most approaches making use of the image Jacobian.
Optical flow was used by robotics researchers in many areas such as: object detection and tracking, image dominant plane extraction, movement detection, robot navigation and visual odometry.^{[5]} Optical flow information has been recognized as being useful for controlling micro air vehicles.^{[12]}
The application of optical flow includes the problem of inferring not only the motion of the observer and objects in the scene, but also the structure of objects and the environment. Since awareness of motion and the generation of mental maps of the structure of our environment are critical components of animal (and human) vision, the conversion of this innate ability to a computer capability is similarly crucial in the field of machine vision.^{[13]}
The optical flow vector of a moving object in a video sequence.
Consider a five-frame clip of a ball moving from the bottom left of a field of vision to the top right. Motion estimation techniques can determine that on a two-dimensional plane the ball is moving up and to the right, and vectors describing this motion can be extracted from the sequence of frames. For the purposes of video compression (e.g., MPEG), the sequence is now described as well as it needs to be. In the field of machine vision, however, whether the ball is moving to the right or the observer is moving to the left is unknowable yet critical information. Even if a static, patterned background were present in the five frames, we could not confidently state that the ball was moving to the right, because the pattern might be at infinite distance from the observer.
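The ball example above can be sketched with exhaustive block matching, the simplest of the motion estimation techniques mentioned earlier (the frame sizes, the 2x2 "ball", and the function name are illustrative assumptions): the patch containing the ball in frame 1 is compared against every candidate displacement within a small search window of frame 2, and the displacement with the lowest sum of squared differences wins.

```python
import numpy as np

def best_displacement(patch, frame2, top, left, search=2):
    """Exhaustive block matching: the integer (dy, dx) within +/-search
    that minimizes the sum of squared differences (SSD) against frame2."""
    h, w = patch.shape
    best_ssd, best_dv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > frame2.shape[0] or x + w > frame2.shape[1]:
                continue  # candidate window falls outside the frame
            ssd = np.sum((frame2[y:y + h, x:x + w] - patch) ** 2)
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_dv = ssd, (dy, dx)
    return best_dv

f1 = np.zeros((10, 10)); f1[6:8, 2:4] = 1.0  # bright "ball" at bottom left
f2 = np.zeros((10, 10)); f2[4:6, 4:6] = 1.0  # ball moved up and to the right
dv = best_displacement(f1[6:8, 2:4], f2, 6, 2)
# dv is (-2, 2): rows grow downward, so dy = -2 means upward motion.
```

As the surrounding text notes, this yields only the 2D motion vector; it cannot by itself distinguish a moving ball from a moving observer.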
Optical flow sensor
An optical flow sensor is a vision sensor capable of measuring optical flow or visual motion and outputting a measurement based on optical flow. Various configurations of optical flow sensors exist. One configuration is an image sensor chip connected to a processor programmed to run an optical flow algorithm. Another configuration uses a vision chip, which is an integrated circuit having both the image sensor and the processor on the same die, allowing for a compact implementation.^{[14]}^{[15]} An example of this is a generic optical mouse sensor used in an optical mouse. In some cases the processing circuitry may be implemented using analog or mixed-signal circuits to enable fast optical flow computation using minimal current consumption.
One area of contemporary research is the use of neuromorphic engineering techniques to implement circuits that respond to optical flow, and thus may be appropriate for use in an optical flow sensor.^{[16]} Such circuits may draw inspiration from biological neural circuitry that similarly responds to optical flow.
Optical flow sensors are used extensively in computer optical mice, as the main sensing component for measuring the motion of the mouse across a surface.
Optical flow sensors are also being used in robotics applications, primarily where there is a need to measure visual motion or relative motion between the robot and other objects in the vicinity of the robot. The use of optical flow sensors in unmanned aerial vehicles (UAVs), for stability and obstacle avoidance, is also an area of current research.^{[17]}
See also
References

^ Andrew Burton and John Radford (1978). Thinking in Perspective: Critical Essays in the Study of Thought Processes. Routledge.

^ David H. Warren and Edward R. Strelow (1985). Electronic Spatial Sensing for the Blind: Contributions from Perception. Springer.

^ Gibson, J.J. (1950). The Perception of the Visual World. Houghton Mifflin.

^ Royden, C. S.; Moore, K. D. (2012). "Use of speed cues in the detection of moving objects by moving observers". Vision research 59: 17–24.

^ ^{a} ^{b} Kelson R. T. Aires, Andre M. Santana, Adelardo A. D. Medeiros (2008). Optical Flow Using Color Information (PDF). ACM New York, NY, USA.

^ ^{a} ^{b} ^{c} S. S. Beauchemin, J. L. Barron (1995). The computation of optical flow. ACM New York, USA.

^ David J. Fleet and Yair Weiss (2006). "Optical Flow Estimation". In Paragios; et al. Handbook of Mathematical Models in Computer Vision (PDF). Springer.

^ John L. Barron, David J. Fleet, and Steven Beauchemin (1994). "Performance of optical flow techniques" (PDF). International Journal of Computer Vision (Springer) 12: 43–77.

^ Glyn W. Humphreys and

^ B. Glocker, N. Komodakis, G. Tziritas, N. Navab & N. Paragios (2008). Dense Image Registration through MRFs and Efficient Linear Programming (PDF). Medical Image Analysis Journal.

^ http://vision.middlebury.edu/flow/

^ Barrows G.L., Chahl J.S., and Srinivasan M.V., Biologically inspired visual sensing and flight control, Aeronautical Journal vol. 107, pp. 159–268, 2003.

^ Christopher M. Brown (1987). Advances in Computer Vision. Lawrence Erlbaum Associates.

^ Vision Chips, by Alireza Moini, Kluwer Academic Publishers, 2000

^ Analog VLSI and Neural Systems, by Carver Mead, 1989

^ Analog VLSI Circuits for the Perception of Visual Motion, by Alan Stocker, Wiley and Sons, 2006

^ Flying Insects and Robotics, Ed. by Floreano, Zufferey, and Srinivasan, Springer, 2006
External links

Finding Optic Flow

Art of Optical Flow article on fxguide.com (using optical flow in Visual Effects)

Optical flow evaluation and ground truth sequences.

Middlebury Optical flow evaluation and ground truth sequences.

mrfregistration.net – Optical flow estimation through MRF

The French Aerospace Lab: GPU implementation of a Lucas–Kanade based optical flow

CUDA Implementation by CUVI (CUDA Vision & Imaging Library)

Horn and Schunck Optical Flow: Online demo and source code of the Horn and Schunck method

TV-L1 Optical Flow: Online demo and source code of the Zach et al. method

Robust Optical Flow: Online demo and source code of the Brox et al. method