Neural network – support vector machine

One, support vector machine (SVM)

The Support Vector Machine (SVM) was first proposed by Cortes and Vapnik in 1995. It offers distinct advantages in small-sample, nonlinear, and high-dimensional pattern recognition, and it can be extended to other machine learning problems such as function fitting (regression).
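As a quick illustration of the classification use case described above, the sketch below trains an SVM on two toy Gaussian clusters. It is an assumption added for clarity and relies on MATLAB's Statistics and Machine Learning Toolbox (fitcsvm / predict), which is not part of the original post.

% Minimal SVM classification sketch (assumes the Statistics and Machine
% Learning Toolbox; this example is not from the original post)
rng(1);                                      % reproducible toy data
X = [randn(50, 2) - 1; randn(50, 2) + 1];    % two Gaussian clusters
y = [-ones(50, 1); ones(50, 1)];             % class labels -1 / +1
model = fitcsvm(X, y, 'KernelFunction', 'rbf', 'Standardize', true);
labels = predict(model, [0.5 0.5; -0.8 -0.8]);   % classify two new points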

Two, the locust algorithm

1. Algorithm principle

2. Algorithm flow

Three, part of the code

%%% Designed and Developed by Dr. Gaurav Dhiman (http://dhimangaurav.com/) %%%
function [Score, Position, Convergence] = SOA(Search_Agents, Max_iterations, Lower_bound, Upper_bound, dimension, objective)
Position = zeros(1, dimension);
Score = inf;
Positions = init(Search_Agents, dimension, Upper_bound, Lower_bound);
Convergence = zeros(1, Max_iterations);
l = 0;
while l < Max_iterations
    for i = 1:size(Positions, 1)
        % Clamp agents that left the search space back onto the bounds
        Flag4Upper_bound = Positions(i, :) > Upper_bound;
        Flag4Lower_bound = Positions(i, :) < Lower_bound;
        Positions(i, :) = (Positions(i, :) .* (~(Flag4Upper_bound + Flag4Lower_bound))) + Upper_bound .* Flag4Upper_bound + Lower_bound .* Flag4Lower_bound;
        % Evaluate fitness and keep the best position found so far
        fitness = objective(Positions(i, :));
        if fitness < Score
            Score = fitness;
            Position = Positions(i, :);
        end
    end
    Fc = 2 - l * (2 / Max_iterations);   % control parameter, decreases linearly from 2 to 0
    for i = 1:size(Positions, 1)
        for j = 1:size(Positions, 2)
            r1 = rand();
            r2 = rand();
            A1 = 2 * Fc * r1 - Fc;
            C1 = 2 * r2;                 % computed but not used in this variant
            b = 1;
            ll = (Fc - 1) * rand() + 1;
            % Spiral movement of each agent around the current best position
            D_alphs = Fc * Positions(i, j) + A1 * (Position(j) - Positions(i, j));
            X1 = D_alphs * exp(b .* ll) .* cos(ll .* 2 * pi) + Position(j);
            Positions(i, j) = X1;
        end
    end
    l = l + 1;
    Convergence(l) = Score;
end
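The function above relies on an init helper that the post does not include. The sketch below is an assumption: a hypothetical init.m that scatters the search agents uniformly between scalar bounds, followed by an example call that minimizes a toy sphere function. The blogger's actual initializer, bounds, and objective may differ.

% init.m — hypothetical initializer (not included in the post); assumes scalar bounds
function Positions = init(Search_Agents, dimension, Upper_bound, Lower_bound)
Positions = rand(Search_Agents, dimension) .* (Upper_bound - Lower_bound) + Lower_bound;
end

% Example call (script or command window): minimize the 10-D sphere function
sphere = @(x) sum(x .^ 2);                               % toy objective, assumed here
[bestScore, bestPosition, curve] = SOA(30, 500, -100, 100, 10, sphere);
plot(curve);                                             % convergence curve of the best fitness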

Four, simulation results

Five, references; private message the blogger for the complete code