1. A brief introduction to the cuckoo search algorithm

The cuckoo algorithm, known in English as Cuckoo Search (CS), takes its name from the brood parasitism of cuckoos. A mother cuckoo lays eggs but does not raise her chicks; instead she deposits the eggs in the nests of other birds. If the host bird recognizes the foreign egg, it destroys it, so cuckoo chicks have evolved to imitate the calls of the host's own young, fooling less discerning hosts into raising them. Cuckoo Search was proposed by Xin-She Yang and Suash Deb in the 2009 paper "Cuckoo Search via Lévy Flights". It is a swarm-intelligence search technique that combines the cuckoo's nest parasitism with Lévy-flight movement patterns: candidate solutions (nests) are updated by heavy-tailed random walks, and the best nest found so far is kept to "incubate" the next generation. This combination yields an efficient global optimization method.

1. Nest parasitism of the cuckoo

2. Lévy flight

Figure 1. Schematic diagram of a simulated Lévy-flight path
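The heavy-tailed steps of a Lévy flight are usually simulated with Mantegna's algorithm, which is also what the `get_cuckoos` routine below uses. Here is a minimal Python/NumPy sketch of the idea (the function name, sample size, and seed are illustrative, not part of the article's code):

```python
import numpy as np
from math import gamma, sin, pi

def levy_steps(n, beta=1.5, seed=0):
    # Mantegna's algorithm: scale a normal numerator by sigma so that
    # u / |v|^(1/beta) follows a heavy-tailed, Levy-stable-like distribution
    rng = np.random.default_rng(seed)
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.standard_normal(n) * sigma
    v = rng.standard_normal(n)
    return u / np.abs(v) ** (1 / beta)

steps = levy_steps(10000)
```

Most of the drawn steps are small, but a few are very large; these rare long jumps are what let a Lévy flight explore far-away regions and escape local optima, as Figure 1 illustrates.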

3. Implementation process of the cuckoo search algorithm
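The overall process — Lévy-flight moves, greedy selection, and abandonment of discovered nests — can be outlined in a compact Python/NumPy sketch. This is an illustrative implementation under stated assumptions (a sphere objective, small population, and made-up parameter values), not the article's MATLAB code:

```python
import numpy as np
from math import gamma, sin, pi

def cuckoo_search(fun, lb, ub, n_nests=15, pa=0.25, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lb)
    beta = 1.5
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    nests = rng.uniform(lb, ub, size=(n_nests, dim))
    fitness = np.array([fun(x) for x in nests])
    k = np.argmin(fitness)
    best, fbest = nests[k].copy(), fitness[k]
    for _ in range(n_iter):
        # 1) Levy flights (Mantegna's algorithm) biased toward the best nest
        u = rng.standard_normal((n_nests, dim)) * sigma
        v = rng.standard_normal((n_nests, dim))
        step = u / np.abs(v) ** (1 / beta)
        new = nests + 0.01 * step * (nests - best) * rng.standard_normal((n_nests, dim))
        new = np.clip(new, lb, ub)                  # simple bounds
        new_fit = np.array([fun(x) for x in new])
        better = new_fit < fitness                  # greedy per-nest selection
        nests[better] = new[better]
        fitness[better] = new_fit[better]
        # 2) A fraction pa of eggs is discovered; those nests are rebuilt at random
        discovered = rng.random(n_nests) < pa
        nests[discovered] = rng.uniform(lb, ub, size=(int(discovered.sum()), dim))
        fitness[discovered] = [fun(x) for x in nests[discovered]]
        # 3) Keep the best nest found so far
        k = np.argmin(fitness)
        if fitness[k] < fbest:
            best, fbest = nests[k].copy(), fitness[k]
    return best, fbest

best, fbest = cuckoo_search(lambda x: float(np.sum(x ** 2)),
                            lb=np.array([-5.0, -5.0]), ub=np.array([5.0, 5.0]))
```

Step 1 corresponds to the Lévy flight, step 2 models the host bird discovering a fraction pa of the cuckoo eggs, and step 3 is the elitism that preserves the best nest between generations.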

2. Partial source code

function s=simplebounds(s,Lb,Ub)
  % Apply the lower bound
  ns_tmp=s;
  I=ns_tmp<Lb;
  ns_tmp(I)=Lb(I);
  
  % Apply the upper bounds 
  J=ns_tmp>Ub;
  ns_tmp(J)=Ub(J);
  % Update this new move
  s=ns_tmp;
end

function errorn = fun(x)
% Load data (the raw data matrix X is assumed to be in scope)
[n,m]=size(X);
for i=1:n
    y(i,1)=sum(X(1:i,1));
    y(i,2)=sum(X(1:i,2));
    y(i,3)=sum(X(1:i,3));
    y(i,4)=sum(X(1:i,4));
    y(i,5)=sum(X(1:i,5));
    y(i,6)=sum(X(1:i,6));
end
%% Network parameter initialization
a=0.2+x(1)/2;
b1=0.2+x(2)/2;
b2=0.2+x(3)/2;
b3=0.2+x(4)/2;
b4=0.2+x(5)/2;
b5=0.2+x(6)/2;
%% Learning rate initialization
u1=0.0015;
u2=0.0015;
u3=0.0015;
u4=0.0015;
u5=0.0015;
%% Weight and threshold initialization
t=1;
w11=a;
w21=-y(1,1);
w22=2*b1/a;
w23=2*b2/a;
w24=2*b3/a;
w25=2*b4/a;
w26=2*b5/a;
w31=1+exp(-a*t);
w32=1+exp(-a*t);
w33=1+exp(-a*t);
w34=1+exp(-a*t);
w35=1+exp(-a*t);
w36=1+exp(-a*t);
theta=(1+exp(-a*t))*(b1*y(1,2)/a+b2*y(1,3)/a+b3*y(1,4)/a+b4*y(1,5)/a+b5*y(1,6)/a-y(1,1));

kk=1;
%% Loop iteration
for j=1:10  % training epochs
    E(j)=0;
    for i=1:30
        %% Network output calculation
        t=i;
        LB_b=1/(1+exp(-w11*t));       % LB layer output
        LC_c1=LB_b*w21;               % LC layer outputs
        LC_c2=y(i,2)*LB_b*w22;
        LC_c3=y(i,3)*LB_b*w23;
        LC_c4=y(i,4)*LB_b*w24;
        LC_c5=y(i,5)*LB_b*w25;
        LC_c6=y(i,6)*LB_b*w26;
        LD_d=w31*LC_c1+w32*LC_c2+w33*LC_c3+w34*LC_c4+w35*LC_c5+w36*LC_c6;  % LD layer output
        theta=(1+exp(-w11*t))*(w22*y(i,2)/2+w23*y(i,3)/2+w24*y(i,4)/2+w25*y(i,5)/2+w26*y(i,6)/2-y(1,1));  % threshold
        ym=LD_d-theta;                % network output
        yc(i)=ym;
        error=ym-y(i,1);              % prediction error
        E(j)=E(j)+abs(error);         % error sum
        error1=error*(1+exp(-w11*t));
        error2=error*(1+exp(-w11*t));
        error3=error*(1+exp(-w11*t));
        error4=error*(1+exp(-w11*t));
        error5=error*(1+exp(-w11*t));
        error6=error*(1+exp(-w11*t));
        error7=(1/(1+exp(-w11*t)))*(1-1/(1+exp(-w11*t)))*(w21*error1+w22*error2+w23*error3+w24*error4+w25*error5+w26*error6);
        % Weight updates
        w22=w22-u1*error2*LB_b;
        w23=w23-u2*error3*LB_b;
        w24=w24-u3*error4*LB_b;
        w25=w25-u4*error5*LB_b;
        w26=w26-u5*error6*LB_b;
        w11=w11+a*t*error7;
    end
end
% Predict with the trained grey neural network
for i=1:10
    t=i;
    LB_b=1/(1+exp(-w11*t));       % LB layer output
    LC_c1=LB_b*w21;               % LC layer outputs
    LC_c2=y(i,2)*LB_b*w22;
    LC_c3=y(i,3)*LB_b*w23;
    LC_c4=y(i,4)*LB_b*w24;
    LC_c5=y(i,5)*LB_b*w25;
    LC_c6=y(i,6)*LB_b*w26;
    LD_d=w31*LC_c1+w32*LC_c2+w33*LC_c3+w34*LC_c4+w35*LC_c5+w36*LC_c6;  % LD layer output
    theta=(1+exp(-w11*t))*(w22*y(i,2)/2+w23*y(i,3)/2+w24*y(i,4)/2+w25*y(i,5)/2+w26*y(i,6)/2-y(1,1));  % threshold
    ym=LD_d-theta;                % network output
    yc(i)=ym;
end
yc=yc*10000;
y(:,1)=y(:,1)*10000;
% Calculate the projected monthly demand
for j=30:-1:2
    ys(j)=yc(j)-yc(j-1);
end
errorn=sum(abs(ys(2:30)-X(2:30,1)'*10000));
end

function nest=get_cuckoos(nest,best,Lb,Ub)
% Generate new solutions by Levy flights
n=size(nest,1);
beta=3/2;
sigma=(gamma(1+beta)*sin(pi*beta/2)/(gamma((1+beta)/2)*beta*2^((beta-1)/2)))^(1/beta);
for j=1:n
    s=nest(j,:);
    % This is a simple way of implementing Levy flights;
    % for standard random walks, use step=1.
    %% Levy flights by Mantegna's algorithm
    u=randn(size(s))*sigma;
    v=randn(size(s));
    step=u./abs(v).^(1/beta);
  
    % In the next equation, the difference factor (s-best) means that 
    % when the solution is the best solution, it remains unchanged.     
    stepsize=0.01*step.*(s-best);
    % Here the factor 0.01 comes from the fact that L/100 should be the
    % typical step size of walks/flights, where L is the typical length
    % scale; otherwise, Levy flights may become too aggressive, making
    % new solutions jump outside of the design domain (and thus wasting
    % evaluations).
    % Now the actual random walks or flights
    s=s+stepsize.*randn(size(s));
   % Apply simple bounds/limits
   nest(j,:)=simplebounds(s,Lb,Ub);
   fitness(j)=fun(nest(j,:));
end

% Find the current best
[fmin,K]=min(fitness);
best=nest(K,:);


3. Running results

4. MATLAB version and references

MATLAB version: 2014a

[1] Yang Baoyang, Yu Jizhou, Yang Shan. Intelligent Optimization Algorithms and Their MATLAB Examples (2nd Edition) [M]. Publishing House of Electronics Industry, 2016.
[2] Zhang Yan, Wu Shuigen. MATLAB Optimization Algorithm Source Code [M]. Tsinghua University Press, 2017.
[3] Zhou Pin. Design and Application of MATLAB Neural Networks [M]. Tsinghua University Press, 2013.
[4] Chen Ming. MATLAB Neural Network Principles and Worked Examples [M]. Tsinghua University Press, 2013.
[5] Fang Qingcheng. MATLAB R2016a Neural Network Design and Application: 28 Case Studies [M]. Tsinghua University Press, 2018.