I. Introduction to LBP

LBP (Local Binary Pattern) is an operator used to describe the local texture features of an image. It has notable advantages such as rotation invariance and grayscale invariance. It was first proposed by T. Ojala, M. Pietikainen, and D. Harwood in 1994 for texture feature extraction; the extracted feature is the local texture of the image.

1. Description of LBP characteristics

The original LBP operator is defined in a 3×3 window. The center pixel of the window is taken as the threshold, and the gray values of its 8 neighboring pixels are compared with it. If a neighboring pixel value is greater than the center pixel value, that position is marked as 1; otherwise it is marked as 0. In this way, comparing the 8 points in the 3×3 neighborhood produces an 8-bit binary number (usually converted to a decimal number, the LBP code, 256 values in total), which is the LBP value of the window's center pixel; this value reflects the texture information of the region.
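The following is a minimal MATLAB sketch of this basic 3×3 operator. The helper name basic_lbp_sketch is hypothetical and is not part of the source code in Section II; it simply illustrates the thresholding and bit-packing described above.

% Minimal sketch of the basic 3x3 LBP operator (hypothetical helper, for illustration only)
function lbpMap = basic_lbp_sketch(img)
    img = double(img);
    [rows, cols] = size(img);
    lbpMap = zeros(rows, cols);
    % offsets of the 8 neighbours, clockwise starting at the top-left corner
    offsets = [-1 -1; -1 0; -1 1; 0 1; 1 1; 1 0; 1 -1; 0 -1];
    for r = 2:rows-1
        for c = 2:cols-1
            center = img(r, c);
            code = 0;
            for k = 1:8
                neighbour = img(r + offsets(k,1), c + offsets(k,2));
                code = bitshift(code, 1) + (neighbour >= center);  % bit = 1 if neighbour >= center
            end
            lbpMap(r, c) = code;   % decimal LBP code in 0..255
        end
    end
end

For a grayscale image I, lbpMap = basic_lbp_sketch(I) would give the per-pixel LBP codes that the rest of this post builds on.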

Improved versions of LBP: Since the original LBP was proposed, researchers have continuously proposed various improvements and optimizations.

(1) Circular LBP operator: The biggest defect of the basic LBP operator is that it only covers a small area within a fixed radius, which obviously cannot meet the needs of textures of different sizes and frequencies. In order to adapt to texture features of different scales and to meet the requirements of grayscale and rotation invariance, Ojala et al. improved the LBP operator by extending the 3×3 neighborhood to an arbitrary neighborhood and replacing the square neighborhood with a circular one. The improved LBP operator allows any number of pixels in a circular neighborhood of radius R; this yields LBP operators with P sampling points in a circular region of radius R.

(2) Rotation-invariant LBP operator: As can be seen from its definition, the LBP operator is grayscale invariant but not rotation invariant: rotating the image produces different LBP values. Maenpaa et al. extended the LBP operator and proposed a rotation-invariant LBP operator: the circular neighborhood is rotated repeatedly to obtain a series of initially defined LBP values, and the minimum of these values is taken as the LBP value of the neighborhood. Figure 2.5 shows the process of obtaining the rotation-invariant LBP; the number below each operator in the figure is its corresponding LBP value. After this processing, the eight LBP patterns shown in the figure all yield the rotation-invariant LBP value 15; in other words, the eight LBP patterns in the figure correspond to the rotation-invariant LBP pattern 00001111.

(3) LBP uniform (equivalence) patterns: An LBP operator can generate many different binary patterns. For an LBP operator with P sampling points in a circular region of radius R, 2^P patterns will be generated. Obviously, the number of binary patterns increases sharply as the number of sampling points in the neighborhood grows: for example, with 20 sampling points in a 5×5 neighborhood, there are 2^20 = 1,048,576 binary patterns. So many binary patterns are harmful to texture extraction, texture recognition, classification, and information access, and too many pattern types are detrimental to texture expression. For example, when the LBP operator is used for texture classification or face recognition, the statistical histogram of LBP patterns is usually used to express the image information, but too many pattern types make the amount of data too large and the histogram too sparse. Therefore, it is necessary to reduce the dimension of the original LBP patterns so that the image information is best represented with a reduced amount of data.

In order to solve the problem of too many binary patterns and to improve the statistics, Ojala proposed using a "Uniform Pattern" to reduce the number of pattern types of the LBP operator. According to Ojala et al., in actual images the vast majority of LBP patterns contain at most two transitions from 1 to 0 or from 0 to 1. Therefore, Ojala defined the "equivalence pattern" as follows: when the circular binary number corresponding to an LBP code transitions from 0 to 1 or from 1 to 0 at most twice, that binary code is called an equivalence (uniform) pattern class. For example, 00000000 (0 transitions), 00000111 (only one transition, from 0 to 1), and 10001111 (first from 1 to 0, then from 0 to 1, two transitions in total) are equivalence pattern classes. All patterns other than the equivalence pattern classes are grouped into one additional class, called the mixed pattern class, such as 10010111 (four transitions) (this is my personal understanding, I don't know if it is correct). A small sketch of the rotation-invariant mapping and the uniform-pattern test is given below.
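As a rough illustration of points (2) and (3), the sketch below (hypothetical helper names, assuming P = 8 sampling points) computes the rotation-invariant value of an 8-bit LBP code by taking the minimum over all circular bit rotations, and tests whether a code is a uniform (equivalence) pattern by counting its circular 0/1 transitions.

% Rotation-invariant mapping: minimum over the 8 circular rotations of the 8-bit code
function ri = rotation_invariant_lbp(code)
    ri = code;
    for k = 1:7
        % rotate the 8-bit code right by one bit
        code = bitor(bitshift(bitand(code, 1), 7), bitshift(code, -1));
        ri = min(ri, code);
    end
end

% Uniform (equivalence) pattern test: at most two 0/1 transitions in the circular code
function u = is_uniform(code)
    bits = bitget(code, 1:8);                     % the 8 bits of the code
    transitions = sum(abs(diff([bits bits(1)]))); % circular transition count
    u = (transitions <= 2);
end

For example, rotation_invariant_lbp(bin2dec('11110000')) returns 15, matching the 00001111 rotation-invariant pattern mentioned above, and is_uniform(bin2dec('10010111')) returns 0 (false), since that code has four transitions.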
With such an improvement, the number of binary patterns is greatly reduced without losing any information: the number of patterns drops from 2^P to P(P-1)+2, where P is the number of sampling points in the neighborhood. For the 8 sampling points in a 3×3 neighborhood, the number of binary patterns is reduced from 256 to 58, which reduces the dimension of the feature vector and the influence of high-frequency noise.

2. Principle of LBP features for detection

Clearly, the LBP operator described above produces an LBP "code" at every pixel. So after applying the original LBP operator to an image (which records the gray value of each pixel), the resulting original LBP feature is still "an image" (which records the LBP value of each pixel). As can be seen from the figure above, LBP is strongly robust to illumination. In applications of LBP such as texture classification and face analysis, the LBP map is generally not used directly as the feature vector for classification and recognition; instead, the statistical histogram of the LBP feature spectrum is used as the feature vector. The reason is that, as the analysis above shows, this "feature" is closely tied to position information: directly extracting such "features" from two images and comparing them produces a large error because of "misaligned positions". Researchers later found that an image can be divided into several sub-regions, LBP features can be extracted for each pixel in each sub-region, and a statistical histogram of the LBP features can then be built for each sub-region. In this way, each sub-region is described by a statistical histogram, and the whole image is described by several statistical histograms. For example, a 100×100-pixel image can be divided into 10×10 = 100 sub-regions (the regions can be divided in various ways), each of size 10×10 pixels; LBP features are extracted for each pixel in each sub-region, and statistical histograms are built. The image thus has 10×10 sub-regions and therefore 10×10 statistical histograms, and these 10×10 statistical histograms describe the image. The similarity between two images can then be judged with various similarity measurement functions.

3. Steps for extracting the LBP feature vector

(1) First, the detection window is divided into 16×16 cells.
(2) For each pixel in each cell, the gray values of its 8 neighboring pixels are compared with it. If a neighboring pixel value is greater than the center pixel value, that position is marked as 1; otherwise it is 0. In this way, the 8 points in the 3×3 neighborhood produce an 8-bit binary number by comparison, which is the LBP value of the window's center pixel.
(3) Then the histogram of each cell is computed, i.e. the frequency of occurrence of each number (assuming the LBP value is taken as a decimal number), and the histogram is normalized.
(4) Finally, the statistical histograms of all cells are concatenated into one feature vector, which is the LBP texture feature vector of the whole image. SVM or another machine-learning algorithm can then be used for classification. A minimal sketch of these steps is given after this list.
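The sketch below strings steps (1)-(4) together and compares two feature vectors with a chi-square distance, the same statistic used in the source code in Section II. The helper names are hypothetical; basic_lbp_sketch is the helper sketched earlier, and the cell size is assumed to divide the image size evenly.

% Steps (1)-(4): per-cell 256-bin LBP histograms concatenated into one feature vector
function feat = lbp_histogram_feature(img, cellSize)
    lbpMap = uint8(basic_lbp_sketch(img));            % per-pixel LBP codes, 0..255
    [rows, cols] = size(lbpMap);
    feat = [];
    for r = 1:cellSize:rows - cellSize + 1
        for c = 1:cellSize:cols - cellSize + 1
            cellBlock = lbpMap(r:r+cellSize-1, c:c+cellSize-1);
            h = imhist(cellBlock);                    % 256-bin histogram of this cell
            feat = [feat; h / (cellSize*cellSize)];   % normalise and concatenate
        end
    end
end

% Chi-square distance between two concatenated histogram vectors
function d = chi_square_distance(f1, f2)
    d = sum(((f1 - f2).^2) ./ (f1 + f2 + 1e-10));
end

For two grayscale images I1 and I2 of the same size, d = chi_square_distance(lbp_histogram_feature(I1, 16), lbp_histogram_feature(I2, 16)) gives a dissimilarity score; smaller values mean more similar textures.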

The principle of the LPQ (Local Phase Quantization) algorithm is as follows: if the smoothing (blur) function h(x) is centrally symmetric, its Fourier transform H(u) is real-valued, and for all u with H(u) ≥ 0 we have ∠G(u) = ∠F(u), where F(u) and G(u) are the Fourier transforms of the original image and the smoothed image, respectively. Therefore, at frequencies where H(u) ≥ 0, the phase of the image is invariant to the smoothing.

To ensure H(u) ≥ 0, a is chosen as a frequency not exceeding the first zero crossing, with value a = 1/winSize (winSize is an input parameter). The STFT of f(x) is computed at the four frequency points u1 = (a, 0), u2 = (0, a), u3 = (a, a), u4 = (a, -a). The real and imaginary parts at these four points are then assembled into the vector W = [Re{F(u1, x)}, Re{F(u2, x)}, Re{F(u3, x)}, Re{F(u4, x)}, Im{F(u1, x)}, Im{F(u2, x)}, Im{F(u3, x)}, Im{F(u4, x)}]^T.

The resulting LPQ transform at pixel x is Fx = W fx, where fx is the vector of pixel values in the local window around x. The coefficients are then analyzed statistically; if they are correlated, singular value decomposition is used to remove the correlation, and the decorrelated coefficients are quantized.
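The following is a minimal LPQ sketch under the assumptions stated above. The helper name lpq_sketch is hypothetical, winSize is assumed odd, and the SVD-based decorrelation step is omitted: the STFT at u1..u4 is computed by separable complex filtering, and the signs of the eight real/imaginary components are quantized into an 8-bit code per pixel.

% Minimal LPQ sketch: STFT at four low frequencies, then sign quantisation (no decorrelation)
function lpqMap = lpq_sketch(img, winSize)
    img = double(img);
    a = 1/winSize;                                % lowest non-zero frequency, a = 1/winSize
    x = -(winSize-1)/2 : (winSize-1)/2;           % window coordinates (winSize assumed odd)
    w0 = ones(size(x));                           % all-pass 1-D kernel (frequency 0)
    w1 = exp(-2*pi*1i*a*x);                       % 1-D complex exponential at frequency a
    % STFT at u1=(a,0), u2=(0,a), u3=(a,a), u4=(a,-a) via separable convolution
    F1 = conv2(w0.', w1, img, 'same');
    F2 = conv2(w1.', w0, img, 'same');
    F3 = conv2(w1.', w1, img, 'same');
    F4 = conv2(w1.', conj(w1), img, 'same');
    % Vector W = [Re F1..F4, Im F1..F4]; the sign of each component gives one bit of the code
    F = cat(3, real(F1), real(F2), real(F3), real(F4), ...
               imag(F1), imag(F2), imag(F3), imag(F4));
    lpqMap = zeros(size(img));
    for k = 1:8
        lpqMap = lpqMap + (F(:,:,k) >= 0) * 2^(k-1);   % 8-bit LPQ code per pixel
    end
end

A 256-bin histogram of lpqMap, built the same way as the LBP histograms above, would then serve as the LPQ feature vector.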

II. Some source code

function varargout = identification(varargin)
% IDENTIFICATION MATLAB code for identification.fig
%      IDENTIFICATION, by itself, creates a new IDENTIFICATION or raises the existing
%      singleton*.
%
%      H = IDENTIFICATION returns the handle to a new IDENTIFICATION or the handle to
%      the existing singleton*.
%
%      IDENTIFICATION('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in IDENTIFICATION.M with the given input arguments.
%
%      IDENTIFICATION('Property','Value',...) creates a new IDENTIFICATION or raises the
%      existing singleton*.  Starting from the left, property value pairs are
%      applied to the GUI before identification_OpeningFcn gets called.  An
%      unrecognized property name or invalid value makes property application
%      stop.  All inputs are passed to identification_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu.  Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help identification

% Last Modified by GUIDE v2.5 06-Aug-2021 03:03:30

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @identification_OpeningFcn, ...
                   'gui_OutputFcn',  @identification_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT


% --- Executes just before identification is made visible.
function identification_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to identification (see VARARGIN)

% Choose default command line output for identification
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes identification wait for user response (see UIRESUME)
% uiwait(handles.figure1);


% --- Outputs from this function are returned to the command line.
function varargout = identification_OutputFcn(hObject, eventdata, handles) 
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;


% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
[filename,pathname]=uigetfile({'*.jpg';'*.tif'},'file selector');
str=[pathname filename];
I=imread(str);
axes(handles.axes1);

imshow(I);

lbp_face=[];
[lbp_face,feature]=lbpfeaturevector2(I,100,20);
axes(handles.axes2);
imshow(feature);
load('fb_lbp_face.mat')
ss=[];
ss=LBP_face(:,:);
ref_labels=label;
ref_label=number_label;
L=zeros(1,size(ss,2));
text=[];
%d=sum((A-B).^2);

   for j=1:size(ss,2)
        w=0;
    for i=1:size(lbp_face,1) 
           %w=w+(h(i,1)-ss(i,j))^2;                                             % Euclidean distance
           %w=w-ss(i,j)*(log(h(i,1)+1e-100));                                   % log-likelihood statistic
           %w=w+min(ss(i,j),h(i,1));                                            % histogram intersection
           w=w+(((ss(i,j)-lbp_face(i,1)).^2)./(ss(i,j)+lbp_face(i,1)+(1e-10))); % chi-square statistic
    end
    w=sqrt(w);
    L(j)=w;
    end


axes(handles.axes3);
imshow(str);

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)


% use a 5*5 template to calculate the LBP eigenvalues
function out=calculatelbp2(mat)
[m,n]=size(mat);
%mat=double(mat);
in=mat;
r=m+4;
c=n+4;
%out=zeros(m,n);
A=zeros(r,c);
r0=3:r-2;
c0=3:c-2;
A(r0,c0)=in;
%alpha=2-sqrt(2);
alpha=0.5;
beta=1-alpha;
d0=A(r0,c0-2)-in;
d2=A(r0+2,c0)-in;
d4=A(r0,c0+2)-in;
d6=A(r0-2,c0)-in;
d1=alpha*A(r0+1,c0-1)+beta*A(r0+2,c0-2)-in;
d3=alpha*A(r0+1,c0+1)+beta*A(r0+2,c0+2)-in;
d5=alpha*A(r0-1,c0+1)+beta*A(r0-2,c0+2)-in;
d7=alpha*A(r0-1,c0-1)+beta*A(r0-2,c0-2)-in;



d=[d0(:),d1(:),d2(:),d3(:),d4(:),d5(:),d6(:),d7(:)];
code=2.^(7:-1:0)';
out=reshape((d>=0)*code,m,n);
out=mat2gray(out);
%imshow(out);

% method 2:
% [m,n]=size(mat);
% mat=[zeros(m,2) mat zeros(m,2)];
% mat=[zeros(2,n+4);mat;zeros(2,n+4)];
% A=zeros(5,5);
% p=zeros(1,8);
% k=0;
% lbp=zeros(size(mat));
% h=zeros(1,8);
% for i=3:(m+2)
%     for j=3:(n+4)
%         lbpnumber=0;
%         A=mat(i:i+4,j:j+4);
%         k=A(3,3);
%         p(2)=(A(1,1)+A(2,2))/2;
%         p(3)=A(1,3);
%         p(4)=(A(1,5)+A(2,4))/2;
%         p(1)=A(3,1);
%         p(5)=A(3,5);
%         p(8)=(A(5,1)+A(4,2))/2;
%         p(7)=A(5,3);
%         p(6)=(A(5,5)+A(5,4))/2;
%         h=(p>=k);
%         for q=1:8
%             lbpnumber=lbpnumber+(h(q)*2^(8-q));
%         end
%         lbp(i+2,j+2)=lbpnumber;

% build the LBP feature vector of an image
function [m,f]=lbpfeaturevector(mat,s,n)
[~,~,d]=size(mat);
if d==3
    mat=rgb2gray(mat);
end
mat=double(mat);
mat=imresize(mat,[100 100],'bicubic');   % normalized
k=calculatelbp(mat);
f=mat2gray(k);
%imshow(f);
k=uint8(k);
m=[];
for i=1:n:s
    for j=1:n:s
        A=k(i:(i+19),j:(j+19));
        h=[];
        h=imhist(A);
        m=[m;h/(n*n)];
    end
end
%     lbp(1:2,:)=[];
%     lbp((1+m):end,:)=[];
%     lbp(:,1:2)=[];
%     lbp(:,(1+n):end)=[];
% end
% imshow(lbp,[]);   % cast to double type

III. Operation results

IV. MATLAB version and references

1. MATLAB version: 2014a

2. References
[1] Cai Limei. MATLAB Image Processing: Theory, Algorithms and Case Analysis [M]. Tsinghua University Press, 2020.
[2] Yang Dan, Zhao Haibin, Long Zhe. Detailed Examples of MATLAB Image Processing [M]. Tsinghua University Press, 2013.
[3] Zhou Pin. MATLAB Image Processing and Graphical User Interface Design [M]. Tsinghua University Press, 2013.
[4] Liu Chenglong. Mastering MATLAB Image Processing [M]. Tsinghua University Press, 2015.