I. Introduction
Any optimization problem can be converted into a function-optimization problem, which is why intelligent algorithms are so widely used. The artificial immune algorithm (AIA) is one such algorithm: it simulates Darwinian evolution based on the antibody–antigen processing mechanism of biological immune systems. In a biological system, antibodies evolve and ultimately eliminate the antigen; this process corresponds to the global optimization performed by the AIA.
Given the universality of function-optimization problems, in recent years many scholars have tested new algorithms on different benchmark functions, examining properties such as stability, generality, effectiveness, and global and local search ability. Function optimization (both single-objective and multi-objective) has therefore become a research focus. Based on the candidate solutions obtained on test functions, intelligent algorithms are continuously improved and their theoretical foundations gradually deepened, making the algorithms themselves more robust and readily applicable in engineering.
The artificial immune system has attracted great attention. Alongside it, various nature-inspired algorithms have been developed, such as the genetic algorithm (GA), differential evolution (DE), the artificial bee colony algorithm (ABC), and the fish swarm algorithm (FSA); these have been applied ever more widely to practical engineering problems, with a growing body of results.
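As a hedged illustration of the antibody-evolution idea described above, here is a minimal clonal-selection sketch in Python (the article's own code, below, is MATLAB). Antibodies are candidate solutions; affinity is the objective value (lower is better); the best antibodies are cloned and hypermutated each generation. The function name, parameters, and the sphere test objective are illustrative assumptions, not part of the original article.

```python
import random

def clonal_selection(objective, dim, bounds, pop_size=20, n_clones=5,
                     mutation_scale=0.1, generations=200, seed=0):
    """Minimal clonal-selection sketch: minimize `objective` over a box.

    Antibodies = candidate solutions. Each generation, the better half of
    the population is cloned and hypermutated; elitism keeps the best found.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    # initial antibody repertoire: random points inside the box
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)               # best (lowest objective) first
        survivors = pop[:pop_size // 2]       # keep the high-affinity antibodies
        offspring = []
        for ab in survivors:
            for _ in range(n_clones):
                # hypermutation: Gaussian perturbation, clamped to the box
                mutant = [min(hi, max(lo, x + rng.gauss(0.0, mutation_scale * (hi - lo))))
                          for x in ab]
                offspring.append(mutant)
        # survivors + clones compete; population size stays fixed
        pop = sorted(survivors + offspring, key=objective)[:pop_size]
    return pop[0], objective(pop[0])

sphere = lambda x: sum(v * v for v in x)      # stand-in "antigen": sphere function
best_ab, best_val = clonal_selection(sphere, dim=3, bounds=(-100.0, 0.0))
```

The elitist survivor step makes the best objective value non-increasing over generations, which mirrors the "antibodies evolve until the antigen is eliminated" description above.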
II. Source code
clc,clear,close all;
warning off
global popsize length min max N code;
N = 12;                  % number of decimal coded digits per chromosome
M = 100;                 % number of generations
popsize = 30;            % population size (initial parameter)
length = 10;             % binary coding bits of each gene
chromlength = N*length;  % string length (individual length): binary encoding length of a chromosome
pc = 0.7;                % crossover probability; a constant in this example. For a varying
                         % crossover probability, use an expression or a dedicated function,
                         % e.g. a value obtained by neural-network training.
pm = 0.3;                % mutation probability
bound = {-100.*ones(popsize,1), zeros(popsize,1)};
min = bound{1}; max = bound{2};
pop = initpop(popsize, chromlength);  % initialization: randomly generate the initial population
ymax = 500;              % fitness-value initialization
ysw_x = zeros(3,12);
% Capacitor C2 fault-type codes, one row per condition:
% code(1,:) normal; code(2,:) 50%; code(3,:) 150%
code = [0.8180 1.6201 14.8590 17.9706 24.0737 33.4498 43.3949 53.3849 63.3451 73.0295 79.6806 74.3230
        0.7791 1.2697 14.8682 26.2274 30.2779 39.4852 49.4172 59.4058 69.3676 79.0657 85.8789 81.0905
        0.8571 1.9871 13.4385 13.8463 20.4918 29.9230 39.8724 49.8629 59.8215 69.4926 75.9868 70.6706];
function [bestindividual, bestfit] = best(pop, fitvalue)
global popsize N length;
bestindividual = pop(1,:); bestfit = fitvalue(1);
for i = 2:popsize    % smaller fitness is better here, matching the ymax update in the main loop
    if fitvalue(i) < bestfit
        bestindividual = pop(i,:); bestfit = fitvalue(i);
    end
end

function [x] = calx(pop)
global min max;      % variable bounds set in the main script
N = 12; length = 10; % default binary length of each gene: 10
for j = 1:N          % decode gene j, then map it to [min(j), max(j)]
    temp(:,j) = decodechrom(pop, 1+(j-1)*length, length);
    x(:,j) = temp(:,j)/(2^length-1)*(max(j)-min(j)) + min(j);
end
for k = 1:M
    objvalue = calobjvalue(pop, i);          % objective values of the current population
    fitvalue = calfitvalue(objvalue);        % fitness values
    favg(k) = sum(fitvalue)/popsize;         % average fitness of generation k
    newpop = selection(pop, fitvalue);       % selection
    objvalue = calobjvalue(newpop, i);
    newpop = crossover(newpop, pc, k);       % crossover
    objvalue = calobjvalue(newpop, i);
    newpop = mutation(newpop, pm);           % mutation
    objvalue = calobjvalue(newpop, i);
    [bestindividual, bestfit] = best(newpop, fitvalue);  % best individual of generation k
    if bestfit < ymax
        ymax = bestfit;                      % track the best (smallest) fitness so far
    end
    pop = newpop;                            % advance to the next generation
end
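For readers without MATLAB at hand, the main evolutionary loop above (decode → fitness → selection → crossover → mutation → keep best) can be mirrored in Python. This is a hedged sketch: the helper names follow the MATLAB listing, but the implementations (roulette-wheel selection, single-point crossover, bit-flip mutation) are common textbook choices, not necessarily identical to the author's initpop/selection/crossover/mutation, and the sphere objective stands in for the unshown calobjvalue.

```python
import random

N_GENES, BITS = 12, 10     # matches N = 12 genes of length = 10 bits each
LO, HI = -100.0, 0.0       # matches bound = {-100*ones, zeros}

def initpop(popsize, chromlength, rng):
    return [[rng.randint(0, 1) for _ in range(chromlength)] for _ in range(popsize)]

def decode(chrom):
    """x(j) = bits_j / (2^BITS - 1) * (HI - LO) + LO, as in calx()."""
    xs = []
    for j in range(N_GENES):
        bits = chrom[j * BITS:(j + 1) * BITS]
        val = int("".join(map(str, bits)), 2)
        xs.append(val / (2 ** BITS - 1) * (HI - LO) + LO)
    return xs

def selection(pop, fitvalue, rng):
    """Roulette-wheel selection on (maximized) fitness."""
    total = sum(fitvalue)
    picks = []
    for _ in range(len(pop)):
        r, acc = rng.uniform(0.0, total), 0.0
        for chrom, f in zip(pop, fitvalue):
            acc += f
            if acc >= r:
                picks.append(chrom[:])
                break
        else:
            picks.append(pop[-1][:])  # guard against float round-off
    return picks

def crossover(pop, pc, rng):
    """Single-point crossover on adjacent pairs with probability pc."""
    out = pop[:]
    for i in range(0, len(out) - 1, 2):
        if rng.random() < pc:
            cut = rng.randrange(1, len(out[i]))
            out[i][cut:], out[i + 1][cut:] = out[i + 1][cut:], out[i][cut:]
    return out

def mutation(pop, pm, rng):
    """Bit-flip mutation; pm is spread over the bits of each chromosome."""
    for chrom in pop:
        for b in range(len(chrom)):
            if rng.random() < pm / len(chrom):
                chrom[b] ^= 1
    return pop

def run_ga(objective, popsize=30, generations=100, pc=0.7, pm=0.3, seed=1):
    rng = random.Random(seed)
    pop = initpop(popsize, N_GENES * BITS, rng)
    best_chrom, best_obj = None, float("inf")
    for _ in range(generations):
        objs = [objective(decode(c)) for c in pop]
        fit = [1.0 / (1.0 + o) for o in objs]   # lower objective -> higher fitness
        g = min(range(popsize), key=lambda idx: objs[idx])
        if objs[g] < best_obj:                  # track the best found, like ymax
            best_chrom, best_obj = pop[g][:], objs[g]
        pop = mutation(crossover(selection(pop, fit, rng), pc, rng), pm, rng)
    return decode(best_chrom), best_obj

best_x, best_val = run_ga(lambda x: sum(v * v for v in x))
```

The `1/(1+objective)` fitness mapping makes roulette selection favor low objective values while the loop, like the MATLAB script, separately tracks the smallest value seen.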
III. Operation results
IV. Note
MATLAB version: 2014a