I. Introduction to optical flow

1 What is optical flow

Optical flow is the instantaneous velocity of the pixel motion that a moving object in space produces on the observed imaging plane. The optical flow method finds the correspondence between the previous frame and the current frame by exploiting the temporal changes of pixel intensities and the correlation between adjacent frames in an image sequence, and from this correspondence computes the motion of objects between adjacent frames. The instantaneous rate of change of gray level at a point on the two-dimensional image plane is usually defined as the optical flow vector. In a nutshell: optical flow is an instantaneous velocity, which is equivalent to the displacement of a target point when the time interval is small (such as between successive frames of a video).

2 Physical significance of optical flow

Generally speaking, optical flow is caused by the motion of the foreground object itself, the motion of the camera, or the joint motion of both in the scene.

When the human eye observes a moving object, the object forms a series of continuously changing images on the retina. This continuously changing information keeps “flowing” across the retina (i.e. the image plane), like a flow of light, which is why it is called optical flow. Optical flow expresses the change of the image; because it contains information about the target's motion, an observer can use it to determine how the target is moving.

Figure (1) shows the projection of the motion of an object in three-dimensional space onto the two-dimensional imaging plane. What is obtained is a two-dimensional vector describing the change of position; when the motion interval is extremely small, we usually regard it as a two-dimensional vector describing the instantaneous velocity of the point, u = (u, v), which is called the optical flow vector.



Figure (1) Projection of three-dimensional motion in a two-dimensional plane

3 Optical flow field

In space, motion can be described by a motion field, while on an image plane the motion of objects is reflected by the different gray-level distributions of the images in the sequence. When the motion field in space is transferred to the image, it is represented as an optical flow field.

The optical flow field is a two-dimensional vector field that reflects the trend of gray-level variation at each point of the image. It can be regarded as the instantaneous velocity field generated by the motion of gray-valued pixels on the image plane, and the information it carries is the instantaneous velocity vector of each image point.

The purpose of studying the optical flow field is to approximate, from the image sequence, the motion field that cannot be obtained directly. Ideally, the optical flow field corresponds to the motion field.



Figure (2) Vector field in three-dimensional space and its projection onto the two-dimensional plane



Figure (3) Visualized optical flow field of a real scene

In short: the optical flow field is the collection of the optical flow vectors of many image points.

When we calculate the optical flow of every pixel in an image, these vectors together form an optical flow field.

Constructing an optical flow field is an attempt to reconstruct the real-world motion field for use in motion analysis; a simple way to visualize such a field is sketched below.
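As a concrete illustration (not part of the original text), the MATLAB sketch below draws such an optical flow field as arrows over the first frame; it assumes a grayscale frame im1 and flow components u and v of the same size are already available, for example from one of the algorithms discussed later:

% Visualize an optical flow field (u, v) as a subsampled arrow plot over frame im1.
step = 8;                                          % subsample so the arrows stay readable
[h, w] = size(im1);
[X, Y] = meshgrid(1:step:w, 1:step:h);             % positions of the displayed vectors
figure; imshow(im1, []); hold on;
quiver(X, Y, u(1:step:h, 1:step:w), v(1:step:h, 1:step:w), 2, 'r');
title('Optical flow field (subsampled)');
hold off;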

4 Basic principles of the optical flow method

4.1 Basic assumptions

(1) Constant brightness. When the same target moves between frames, its brightness does not change. This is the fundamental assumption of the optical flow method (every variant of the method must satisfy it) and is used to derive the basic optical flow equation.

(2) Temporal continuity, or “small motion”. Changes over time do not cause drastic changes in the target's position, so the displacement between adjacent frames should be relatively small. This is also an indispensable assumption of the optical flow method.

4.2 Basic constraint equation

Consider the intensity I(x, y, t) of a pixel in the first frame (where t denotes the time dimension). Over a time interval dt the pixel moves by (dx, dy) into the next frame. Since it is the same pixel, the first assumption above says its intensity is unchanged before and after the movement, namely:

I(x, y, t) = I(x + dx, y + dy, t + dt)

Expanding the right-hand side in a first-order Taylor series, subtracting I(x, y, t) and dividing by dt gives the basic optical flow constraint equation

Ix·u + Iy·v + It = 0

where Ix, Iy and It are the partial derivatives of the image intensity with respect to x, y and t, and u = dx/dt, v = dy/dt are the two components of the optical flow vector.

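The terms of this constraint equation can be estimated directly from two consecutive frames. The following minimal MATLAB sketch (an illustration, not the code from this project) assumes two consecutive grayscale frames im1 and im2 have already been loaded and converted to double:

% Estimate the spatio-temporal derivatives Ix, Iy, It used in the constraint equation.
% im1, im2 are assumed to be consecutive grayscale frames of class double.
g    = fspecial('gaussian', 5, 1);      % light smoothing stabilizes the derivatives
im1s = imfilter(im1, g, 'replicate');
im2s = imfilter(im2, g, 'replicate');
[Ix, Iy] = gradient(im1s);              % spatial derivatives (central differences)
It = im2s - im1s;                       % temporal derivative between the two frames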
5 Several optical flow estimation algorithms

1) Gradient-based methods. The gradient method, also called the differential method, computes the pixel velocity vector from the spatio-temporal derivatives of the time-varying image gray level (or a filtered version of it), i.e. from the spatio-temporal gradient function. Because it is simple to compute and gives good results, this approach has been widely used and studied. Typical representatives are the Horn-Schunck algorithm and the Lucas-Kanade (LK) algorithm. The Horn-Schunck algorithm adds a global smoothness assumption to the basic optical flow constraint equation: it assumes that the optical flow varies smoothly over the whole image, i.e. that the object motion vectors are smooth or change only slowly. Many improved algorithms have been proposed on this basis. Nagel used an oriented smoothness constraint, in which a weighting matrix controls how strongly the gradient is smoothed in different directions, and Black and Anandan proposed a piecewise-smooth model for estimating multiple motions. A minimal Lucas-Kanade sketch follows.
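The sketch below is a minimal, single-scale Lucas-Kanade solver given only as an illustration of the gradient-based family (it is not the Lucas_Kanade function used by the GUI). It assumes the derivatives Ix, Iy, It from the earlier sketch and solves a small least-squares problem in a window around each pixel:

% Minimal single-level Lucas-Kanade: solve Ix*u + Iy*v = -It over a local window.
r = 2;                                   % window radius (5x5 window)
[h, w] = size(Ix);
u = zeros(h, w);  v = zeros(h, w);
for y = 1+r : h-r
    for x = 1+r : w-r
        ix = Ix(y-r:y+r, x-r:x+r);  ix = ix(:);
        iy = Iy(y-r:y+r, x-r:x+r);  iy = iy(:);
        it = It(y-r:y+r, x-r:x+r);  it = it(:);
        A = [ix iy];                     % one constraint equation per pixel in the window
        M = A' * A;                      % 2x2 structure tensor
        if rank(M) == 2                  % skip flat / aperture-problem regions
            d = M \ (-A' * it);          % least-squares solution for [u; v]
            u(y, x) = d(1);  v(y, x) = d(2);
        end
    end
end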

2) Matching-based methods

Matching-based optical flow computation includes feature-based and region-based methods. Feature-based methods continuously locate and track the main features of the target and are robust to large motions and brightness changes of the target; the problem is that the resulting optical flow is usually sparse, and feature extraction and accurate matching are difficult. Region-based methods first locate similar regions and then compute the optical flow from the displacement of those regions; this approach has been widely used in video coding, but the optical flow it produces is still not dense. In addition, it is difficult for either method to estimate optical flow with sub-pixel accuracy. A block-matching sketch of the region-based idea is given below.
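The following sketch illustrates the region-based idea only (it is not the matching code used in this project): for each block of the first frame it searches a small window in the second frame and keeps the displacement that minimizes the sum of squared differences. im1 and im2 are again assumed to be grayscale frames of class double.

% Block matching: one displacement per b-by-b block, exhaustive search of radius s.
b = 8;  s = 4;
[h, w] = size(im1);
u = zeros(floor(h/b), floor(w/b));  v = zeros(size(u));
for by = 1 : floor(h/b)
    for bx = 1 : floor(w/b)
        y0 = (by-1)*b + 1;  x0 = (bx-1)*b + 1;
        blk = im1(y0:y0+b-1, x0:x0+b-1);
        best = inf;
        for dy = -s : s
            for dx = -s : s
                yy = y0 + dy;  xx = x0 + dx;
                if yy < 1 || xx < 1 || yy+b-1 > h || xx+b-1 > w, continue; end
                cand = im2(yy:yy+b-1, xx:xx+b-1);
                ssd = sum((blk(:) - cand(:)).^2);   % sum of squared differences
                if ssd < best
                    best = ssd;  u(by, bx) = dx;  v(by, bx) = dy;
                end
            end
        end
    end
end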

3) Energy-based methods. Energy-based methods, also called frequency-based methods, must apply spatio-temporal filtering to the input images, i.e. integrate over space and time, in order to obtain a uniform flow field with accurate velocity estimates; this, however, reduces the temporal and spatial resolution of the flow. Frequency-based methods also tend to require a large amount of computation, and the reliability of their estimates is difficult to evaluate.

4) Phase-based approach

The phase-based approach was developed by Fleet and Jepson, who first proposed using phase information for optical flow computation. The phase information of an image is more reliable than its brightness information, so an optical flow field obtained from phase is more robust. The advantages of phase-based optical flow algorithms are that they apply to a wide range of image sequences and give more accurate velocity estimates. There are, however, some problems: first, although the phase-based model is reasonable to some extent, it has a high time complexity; second, phase-based methods can compute optical flow from two frames, but improving the estimation accuracy requires additional computation time; third, phase-based methods are sensitive to temporal aliasing in the image sequence.

5) Neurodynamic methods

The neurodynamic approach builds a neural-network model of visual motion perception that directly simulates the function and structure of the biological visual system.

Although neurodynamic methods for optical flow computation are still immature, research on them is of great significance. As research on biological vision continues to develop, neural methods will undoubtedly improve; perhaps the fundamental way forward for optical flow computation, and even for computer vision as a whole, lies in the introduction of neural mechanisms. Neural networks are one direction in which optical flow technology is developing.

6 Dense and sparse optical flow

Besides classifying optical flow methods by their underlying principles, they can also be divided into dense and sparse optical flow according to the density of the two-dimensional vectors in the resulting optical flow field.

Dense optical flow

Dense optical flow is an image registration method that performs point-by-point matching over an image or a specified region. It computes the offset of every point on the image to form a dense optical flow field, and with this dense field image registration can be performed at the pixel level, as in the sketch below.
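As a hypothetical illustration of pixel-level registration (not code from this project), the flow field can be used to warp the second frame back onto the grid of the first, assuming u and v map pixels of im1 to their positions in im2:

% Pixel-level registration with a dense flow field (u, v).
[h, w] = size(im1);
[X, Y] = meshgrid(1:w, 1:h);
warped   = interp2(im2, X + u, Y + v, 'linear', 0);   % resample im2 on im1's grid
residual = abs(im1 - warped);                          % per-pixel registration error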

The Horn-Schunck algorithm and most optical flow methods based on region matching belong to the category of dense optical flow.



Figure (4) Example of a dense optical flow field generated by a region-matching method

Because the optical flow vectors are dense, the registration result is clearly better than that of sparse optical flow registration. The drawback is equally obvious: since an offset must be computed for every point, the amount of computation is large and real-time performance is poor.

II. Source code

function varargout = optical_flow(varargin)
% OPTICAL_FLOW M-file for optical_flow.fig
%      OPTICAL_FLOW, by itself, creates a new OPTICAL_FLOW or raises the existing
%      singleton*.
%
%      H = OPTICAL_FLOW returns the handle to a new OPTICAL_FLOW or the handle to
%      the existing singleton*.
%
%      OPTICAL_FLOW('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in OPTICAL_FLOW.M with the given input arguments.
%
%      OPTICAL_FLOW('Property','Value',...) creates a new OPTICAL_FLOW or raises the
%      existing singleton*.  Starting from the left, property value pairs are
%      applied to the GUI before optical_flow_OpeningFunction gets called.  An
%      unrecognized property name or invalid value makes property application
%      stop.  All inputs are passed to optical_flow_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu.  Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES
 
% Copyright 2002-2003 The MathWorks, Inc.
 
% Edit the above text to modify the response to help optical_flow
 
% Last Modified by GUIDE v2.5 23-Jul-2012 15:10:21
 
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @optical_flow_OpeningFcn, ...
                   'gui_OutputFcn',  @optical_flow_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);

if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end
 
if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
 
 
% --- Executes just before optical_flow is made visible.
function optical_flow_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to optical_flow (see VARARGIN)
% Choose default command line output for optical_flow
handles.output = hObject;
 
% Update handles structure
guidata(hObject, handles);
 
% UIWAIT makes optical_flow wait for user response (see UIRESUME)
% uiwait(handles.figure1);
 
 
% --- Outputs from this function are returned to the command line.
function varargout = optical_flow_OutputFcn(hObject, eventdata, handles) 
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
 
% Get default command line output from handles structure
varargout{1} = handles.output;
 
 
% --- Executes on button press in pushbutton_start.
function pushbutton_start_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton_start (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global FilePath1
global FilePath2
global version
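% version encodes the algorithm chosen in uipanel_choice_SelectionChangeFcn:
%   0/1/2 -> Brox variants, 3 -> Horn-Schunck, 4 -> Lucas-Kanade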
 
if ( isequal(FilePath1,'No Picture') || isequal(FilePath2,'No Picture'))
    errordlg('Error selecting picture!','MATLAB error');
    return
end
 
switch version
    case {0, 1, 2}
        optic_flow_brox(FilePath1, FilePath2);
    case 3
        Horn_WJY(FilePath2, FilePath1);
    case 4
        Lucas_Kanade(FilePath2, FilePath1);
    otherwise
        errordlg('Selection algorithm error!','MATLAB error');
end
 
 
 
% --- Executes on button press in pushbutton_pic1.
function pushbutton_pic1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton_pic1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global FilePath1
[FileName,PathName] = uigetfile({'*.jpg';'*.bmp'},'Select the picture');
if ~isequal(FileName,0)
    FilePath1 = fullfile(PathName,FileName);
    set(handles.edit_lj1,'String',FilePath1);
    
    img=imread(FilePath1);
    axes(handles.axes_pic1);
    imshow(img);
end
guidata(hObject, handles);
 
 
% --- Executes on button press in pushbutton_pic2.
function pushbutton_pic2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton_pic2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global FilePath2
[FileName,PathName] = uigetfile({'*.jpg';'*.bmp'},'Select the picture');
if ~isequal(FileName,0)
    FilePath2 = fullfile(PathName,FileName);
    set(handles.edit_lj2,'String',FilePath2);
    
    img=imread(FilePath2);
    axes(handles.axes_pic2);
    imshow(img);
end
guidata(hObject, handles);
 
 
function edit_lj1_Callback(hObject, eventdata, handles)
% hObject    handle to edit_lj1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
 
% Hints: get(hObject,'String') returns contents of edit_lj1 as text
%        str2double(get(hObject,'String')) returns contents of edit_lj1 as a double
 
 
% --- Executes during object creation, after setting all properties.
function edit_lj1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit_lj1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
 
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
    set(hObject,'BackgroundColor','white');
else
    set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
 
 
 
function edit_lj2_Callback(hObject, eventdata, handles)
% hObject    handle to edit_lj2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
 
% Hints: get(hObject,'String') returns contents of edit_lj2 as text
%        str2double(get(hObject,'String')) returns contents of edit_lj2 as a double
 
 
% --- Executes during object creation, after setting all properties.
function edit_lj2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit_lj2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
 
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
    set(hObject,'BackgroundColor','white');
else
    set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
 
 
 
function edit_arithmetic_Callback(hObject, eventdata, handles)
% hObject    handle to edit_arithmetic (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
 
% Hints: get(hObject,'String') returns contents of edit_arithmetic as text
%        str2double(get(hObject,'String')) returns contents of edit_arithmetic as a double
 
 
% --- Executes during object creation, after setting all properties.
function edit_arithmetic_CreateFcn(hObject, eventdata, handles)
% hObject    handle to edit_arithmetic (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
 
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
    set(hObject,'BackgroundColor','white');
else
    set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
 
 
% --------------------------------------------------------------------
function uipanel_choice_SelectionChangeFcn(hObject, eventdata, handles)
% hObject    handle to uipanel_choice (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global version
if get(handles.radiobutton_horn,'Value')
    s = 1;
end
if get(handles.radiobutton_lk,'Value')
    s = 2;
end
if get(handles.radiobutton_brox,'Value')
    s = 3;
end
if get(handles.radiobutton_bs,'Value')
    s = 4;
end
if get(handles.radiobutton_bss,'Value')
    s = 5;
end
switch s
    case 1
        version = 3;
        set(handles.edit_arithmetic,'String','Horn-Schunck');
    case 2
        version = 4;
        set(handles.edit_arithmetic,'String','Lucas-Kanade');
    case 3
        version = 0;
        set(handles.edit_arithmetic,'String','Brox');
    case 4
        version = 1;
        set(handles.edit_arithmetic,'String','Brox Improved Version');
    case 5
        version = 2;
        set(handles.edit_arithmetic,'String','Brox improved version + Sift');
end
guidata(hObject, handles);
 
 
 
% --- Executes during object creation, after setting all properties.
function pushbutton_start_CreateFcn(hObject, eventdata, handles)
% hObject    handle to pushbutton_start (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
 
 
 
 
% --- Executes during object creation, after setting all properties.
function figure1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to figure1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
addpath('./Brox');
addpath('./Horn_Schunck');
addpath('./Lucas_Kanade');
global version
global FilePath1
global FilePath2
version = 3;
FilePath1 = 'No Picture';
FilePath2 = 'No Picture';
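Assuming the algorithm folders Brox, Horn_Schunck and Lucas_Kanade added to the path above sit next to optical_flow.m and its optical_flow.fig, the GUI can be started from the MATLAB command window:

>> optical_flow   % opens the GUI: load two frames, choose an algorithm, press start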

III. Operation results





IV. Notes

MATLAB version: 2014a