# LSSVM prediction model based on a bat-algorithm-improved least squares support vector machine (LSSVM)

Posted on Dec. 2, 2022, 11:45 a.m. by Tushar Chopra

Category: The code of life
Tag: MATLAB

## The characteristics of LSSVM

1. It also solves the dual problem, but the QP problem of the SVM is replaced by a set of linear equations (a consequence of the equality constraints in the optimization problem), which simplifies the solving process; this applies to both classification and regression in high-dimensional input spaces.
2. In essence it amounts to solving a linear matrix equation, combining regularization networks with the kernel version of Fisher discriminant analysis.
3. Sparse approximation (to compensate for the loss of sparseness in the solution) and robust regression (robust statistics) can be applied.
4. Bayesian inference can be used for parameter selection.
5. It can be extended to unsupervised learning: kernel PCA or density clustering.
6. It can be extended to recurrent neural networks.
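Point 1 can be made concrete. A minimal, self-contained Python sketch (illustrative only, not the post's MATLAB code; function names and the RBF bandwidth `sigma2` are assumptions) of LSSVM regression, where training reduces to solving one linear system instead of a QP:

```python
import numpy as np

def rbf_kernel(A, B, sigma2):
    # Gram matrix of the RBF kernel K(a, b) = exp(-||a - b||^2 / sigma2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma2)

def lssvm_fit(X, y, gamma, sigma2):
    # Solve the LSSVM KKT system (equality constraints -> linear system):
    # [ 0   1^T         ] [ b     ]   [ 0 ]
    # [ 1   K + I/gamma ] [ alpha ] = [ y ]
    n = len(y)
    K = rbf_kernel(X, X, sigma2)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma2):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b
```

Note that, unlike a standard SVM, every `alpha` is generally nonzero — which is exactly the loss of sparseness that point 3 addresses.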

```matlab
%% Initialization
clc
close all
clear
format long
tic
%% Import data
data = xlsread('1.xlsx');
[row, col] = size(data);
x = data(:, 1:col-1);
y = data(:, col);
set = 1;                        % number of test samples
row1 = row - set;               % number of training samples
train_x = x(1:row1, :);
train_y = y(1:row1, :);
test_x = x(row1+1:row, :);      % forecast input
test_y = y(row1+1:row, :);      % forecast target
train_x = train_x'; train_y = train_y';
test_x = test_x';   test_y = test_y';
%% Data normalization
[train_x, minx, maxx, train_yy, miny, maxy] = premnmx(train_x, train_y);
test_x = tramnmx(test_x, minx, maxx);
train_x = train_x'; train_yy = train_yy'; train_y = train_y';
test_x = test_x';   test_y = test_y';
%% Parameter initialization
eps = 10^(-6);
%% LSSVM parameters
type = 'f';                     % function estimation (regression)
kernel = 'RBF_kernel';
proprecess = 'proprecess';
lb = [0.01 0.02];               % lower bounds of parameters c and g
ub = [1000 100];                % upper bounds of parameters c and g
dim = 2;                        % dimension: the two optimized parameters c and g
SearchAgents_no = 20;           % number of search agents
Max_iter = 100;                 % maximum number of iterations
n = 10;                         % population size, typically 10 to 25
A = 0.25;                       % loudness (constant or decreasing)
r = 0.5;                        % pulse rate (constant or decreasing)
% This frequency range determines the scalings
Qmin = 0;                       % frequency minimum
Qmax = 2;                       % frequency maximum
% Iteration parameters
tol = 10^(-10);                 % stop tolerance
Leader_pos = zeros(1, dim);
Leader_score = inf;             % change this to -inf for maximization problems
% Initialize the positions of the search agents
for i = 1:SearchAgents_no
    Positions(i, 1) = ceil(rand(1) * (ub(1) - lb(1)) + lb(1));
    Positions(i, 2) = ceil(rand(1) * (ub(2) - lb(2)) + lb(2));
    Fitness(i) = Fun(Positions(i, :), train_x, train_yy, type, kernel, ...
                     proprecess, miny, maxy, train_y, test_x, test_y);
    v(i, :) = rand(1, dim);
end
[fmin, I] = min(Fitness);
best = Positions(I, :);
Convergence_curve = zeros(1, Max_iter);
```
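The bat-algorithm main loop itself is missing from the post's listing. For reference, here is a minimal Python sketch (illustrative, not the author's code; the function name `bat_optimize` and the local-walk step size are assumptions) of the standard update rules — per-bat frequency `Q`, velocity and position updates, plus loudness `A` and pulse rate `r`, matching the parameters initialized above:

```python
import numpy as np

def bat_optimize(fun, lb, ub, n=20, max_iter=100,
                 Qmin=0.0, Qmax=2.0, A=0.25, r=0.5, seed=0):
    # Minimal bat algorithm: each bat has a position x, a velocity v,
    # and draws an emission frequency Q each iteration.
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    x = lb + rng.random((n, dim)) * (ub - lb)
    v = np.zeros((n, dim))
    fit = np.array([fun(p) for p in x])
    best, fbest = x[fit.argmin()].copy(), fit.min()
    for _ in range(max_iter):
        Q = Qmin + (Qmax - Qmin) * rng.random(n)   # frequency update
        v += (x - best) * Q[:, None]               # velocity update
        cand = np.clip(x + v, lb, ub)              # position update
        # with probability (1 - r), do a local random walk around the best bat
        walk = rng.random(n) > r
        if walk.any():
            steps = 0.001 * rng.standard_normal((int(walk.sum()), dim))
            cand[walk] = np.clip(best + steps, lb, ub)
        fcand = np.array([fun(p) for p in cand])
        # accept an improving candidate only with probability A (loudness)
        accept = (fcand <= fit) & (rng.random(n) < A)
        x[accept], fit[accept] = cand[accept], fcand[accept]
        # track the global best regardless of acceptance
        gi = fcand.argmin()
        if fcand[gi] < fbest:
            best, fbest = cand[gi].copy(), fcand[gi]
    return best, fbest
```

In the MATLAB script this role is played by the omitted loop that refines `Positions` with `Fun(...)` as the fitness and records `Leader_pos`, `Leader_score` and `Convergence_curve`; here `fun` would be the cross-validation error of the LSSVM for a given `(c, g)` pair.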
```matlab
t = 0;  % loop counter
% Start the iterations -- bat algorithm
% (the main optimization loop is omitted in the original listing; it updates
% Positions, Fitness, Leader_pos/Leader_score and Convergence_curve)
%% Result analysis
plot(Convergence_curve, 'LineWidth', 2);
title(['Bat algorithm fitness curve (c = ', num2str(Leader_pos(1)), ...
       ', g = ', num2str(Leader_pos(2)), ', iterations = ', ...
       num2str(Max_iter), ')'], 'FontSize', 13);
xlabel('Generation');
ylabel('Error fitness');
bestc = Leader_pos(1);
bestg = Leader_pos(2);
RD = RD'
disp(['Prediction error of the bat-algorithm-optimized LSSVM = ', num2str(D)])
% figure
% plot(test_predict, ':og')
% hold on
% plot(test_y, '-*')
% legend('Prediction output', 'Expected output')
% title('Network prediction output', 'FontSize', 12)
% ylabel('Function output', 'FontSize', 12)
% xlabel('Sample', 'FontSize', 12)
figure
plot(train_predict, ':og')
hold on
plot(train_y, '-*')
legend('Prediction output', 'Expected output')
title('Prediction output', 'FontSize', 12)
ylabel('Function output', 'FontSize', 12)
xlabel('Sample', 'FontSize', 12)
toc  % elapsed time
```