
Genetic Algorithm (GA) is a branch of evolutionary computing. It is a computational model that simulates the process of biological evolution under Darwinian natural selection and genetic mechanisms, and it searches for an optimal solution by imitating that evolutionary process. The algorithm is simple, general, robust, and well suited to parallel processing.

2 Characteristics and Application of Genetic Algorithm

Genetic algorithm is a robust search algorithm that can be used for complex system optimization. Compared with traditional optimization algorithms, it has the following characteristics:

(1) The encoding of the decision variables is the object of operation. Traditional optimization algorithms usually work directly on the actual values of the decision variables, whereas genetic algorithm operates on an encoded form of the decision variables. This encoding makes it possible to borrow the biological concepts of chromosomes and genes in optimization calculations, to imitate the genetic and evolutionary mechanisms of organisms in nature, and to apply genetic operators conveniently.

(2) Fitness is used directly as search information. A traditional optimization algorithm not only needs the value of the objective function but often also requires that the derivative of the objective function exist in order to determine the search direction. Genetic algorithm needs only the fitness value derived from the objective function value to determine the next search range, without derivatives or other auxiliary information. Using the objective function value or individual fitness directly lets the algorithm concentrate the search on regions of the search space with higher fitness, which improves search efficiency.

(3) Search information from multiple points is used, giving implicit parallelism. Traditional optimization algorithms usually start the iterative search for the optimum from a single initial point in the solution space. A single point provides little search information, so the search efficiency is low and the search may stagnate at a local optimum.
Genetic algorithms instead start the search for the optimal solution from an initial population of many individuals rather than from a single individual. Applying selection, crossover, and mutation to the initial population produces a new generation that carries a large amount of population information. This information helps the algorithm avoid searching unnecessary points, escape local optima, and gradually approach the global optimum.

(4) Probabilistic search rules are used rather than deterministic ones. Traditional optimization algorithms often use deterministic search: the move from one search point to the next has a fixed direction and transfer rule. Such deterministic search may fail to reach the optimum and limits the scope of application of the algorithm. Genetic algorithm is an adaptive search technique whose selection, crossover, mutation, and other operations are all carried out probabilistically, which makes the search process flexible, lets it converge to the optimal solution with high probability, and gives it good global optimization ability. However, the crossover probability, mutation probability, and other parameters also affect the results and efficiency of the search, so choosing the parameters of a genetic algorithm is an important problem in its application.

In summary, genetic algorithm provides a general framework for solving complex system problems, because its overall search strategy and optimization mode do not depend on gradient information or other auxiliary knowledge; it requires only the objective function, which determines the search direction, and the corresponding fitness function.
It does not depend on the specific domain of the problem and is robust to the type of problem, so it is widely used in many fields, including function optimization, combinatorial optimization, production scheduling, automatic control, robotics, image processing (image restoration, image edge feature extraction, and so on), artificial life, genetic programming, and machine learning.

The Simple Genetic Algorithm (SGA) uses only the selection operator, the crossover operator, and the mutation operator. Because of its simple evolution process, it is the basis of other genetic algorithms.

3.1 Basic flow of genetic algorithm

A number of initial individuals, each encoded as a string of a certain length (the length is related to the precision required of the solution), are generated at random to form the initial population.

Each individual is evaluated by the fitness function; individuals with high fitness are selected to take part in the genetic operations, while individuals with low fitness are eliminated.

The individuals produced by the genetic operations (replication, crossover, mutation) together form a new generation of the population, and the process repeats until the stopping criterion (generation number GEN >= the preset maximum) is satisfied.

The best individual found in the offspring is taken as the result of the genetic algorithm.



Here GEN is the current generation number, M is the population size, and i indexes the individuals of the population.
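The flow above can be sketched as a minimal simple genetic algorithm. The example below is a Python sketch (the article's own source code is MATLAB); the one-max objective, bit-string length, and parameter values are illustrative assumptions, not taken from the article.

```python
import random

def sga(fitness, n_bits=16, pop_size=20, max_gen=50, pc=0.8, pm=0.01):
    """Minimal SGA: roulette selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for gen in range(max_gen):                      # stop criterion: gen >= max_gen
        # fitness-proportional (roulette-wheel) selection of parents
        scores = [fitness(ind) for ind in pop]
        parents = random.choices(pop, weights=scores, k=pop_size)
        # one-point crossover on randomly paired parents, with probability pc
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            if random.random() < pc:
                cut = random.randint(1, n_bits - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a, b]
        # bit-flip mutation: each gene flips independently with probability pm
        pop = [[1 - g if random.random() < pm else g for g in ind] for ind in children]
        best = max(pop + [best], key=fitness)       # keep the best individual found
    return best

random.seed(0)  # fixed seed so the sketch is reproducible
# illustrative objective: maximize the number of ones (shifted to stay positive)
best = sga(fitness=lambda ind: sum(ind) + 1)
```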

3.2 Implementation technology of genetic algorithm

The basic genetic algorithm (SGA) consists of the encoding, the fitness function, the genetic operators (selection, crossover, mutation), and the running parameters.

3.2.1 Encoding

(1) Binary encoding

The length of the binary-encoded string depends on the precision required of the solution. The encoding must ensure that every individual in the solution space can be represented.

Advantages: encoding and decoding are simple, and crossover and mutation are easy to implement.

Disadvantages: the strings become long when high precision is required.

(2) Other encoding methods

Gray code, floating-point encoding, symbolic encoding, multi-parameter encoding, and so on.
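The relation between string length and precision can be made concrete: to represent a variable in [a, b] to precision eps, the smallest n with 2^n - 1 >= (b - a) / eps bits suffice. The Python sketch below illustrates this; the interval and precision are illustrative assumptions.

```python
import math

def bits_needed(a, b, eps):
    """Smallest n with 2**n - 1 >= (b - a) / eps, so n bits resolve [a, b] to precision eps."""
    return math.ceil(math.log2((b - a) / eps + 1))

def decode(bits, a, b):
    """Map a binary string (list of 0/1 genes) back to a real value in [a, b]."""
    n = len(bits)
    k = int("".join(map(str, bits)), 2)   # integer value of the bit string
    return a + k * (b - a) / (2**n - 1)

n = bits_needed(-1.0, 2.0, 1e-4)   # 3 / 1e-4 = 30000 steps -> 15 bits
x = decode([1] * n, -1.0, 2.0)     # the all-ones string decodes to the upper bound b
```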

3.2.2 Fitness function

The fitness function should effectively reflect the gap between each chromosome and the chromosome of the problem's optimal solution.
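Since selection assumes nonnegative fitness, a minimization objective is usually transformed first. One common transform (a sketch, not from the article; the bound c_max is an illustrative assumption) is F(x) = c_max - f(x), clipped at zero:

```python
def fitness_from_objective(f, c_max):
    """For a minimization objective f, use F = c_max - f (clipped at 0),
    so smaller objective values receive larger, nonnegative fitness."""
    return lambda x: max(c_max - f(x), 0.0)

F = fitness_from_objective(lambda x: x * x, c_max=100.0)
# F(0) = 100.0 is best; F(20) = 0.0 because f(20) = 400 exceeds c_max
```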

3.2.3 Selection operator
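The most common selection operator in the SGA is fitness-proportional (roulette-wheel) selection: each individual is picked with probability proportional to its fitness. A minimal Python sketch (the population and fitness values below are illustrative assumptions):

```python
import random

def roulette_select(pop, fitnesses, k):
    """Roulette-wheel selection: individual i is chosen with probability
    fitnesses[i] / sum(fitnesses); assumes all fitness values are nonnegative."""
    total = sum(fitnesses)
    picks = []
    for _ in range(k):
        r = random.uniform(0, total)   # spin the wheel
        acc = 0.0
        for ind, fit in zip(pop, fitnesses):
            acc += fit
            if acc >= r:               # the slice containing r wins
                picks.append(ind)
                break
    return picks

chosen = roulette_select(["a", "b", "c"], [1.0, 2.0, 7.0], k=5)  # "c" is favored
```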



3.2.4 Crossover operator

Crossover refers to exchanging parts of the genes of two paired chromosomes in some way, thereby forming two new individuals. Crossover is an important feature that distinguishes genetic algorithm from other evolutionary algorithms and is the main method of generating new individuals. Before crossover, the individuals in the population must be paired, generally by random pairing.

Common crossover methods:

Single-point crossover

Two-point crossover (and multi-point crossover; the more crossover points there are, the more likely the individual's structure is to be destroyed, so multi-point crossover is generally not used)

Uniform crossover

Arithmetic crossover
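Single-point crossover, the simplest of these methods, can be sketched in a few lines of Python (illustrative parent strings):

```python
import random

def one_point_crossover(p1, p2):
    """Swap the tails of two equal-length parent strings at a random cut point."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

c1, c2 = one_point_crossover([0, 0, 0, 0], [1, 1, 1, 1])
# each child is a run of one parent's genes followed by the other's;
# taken together, the two children conserve all genes of the two parents
```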

3.2.5 Mutation operator

Mutation in genetic algorithm refers to replacing the gene values at some loci of an individual's chromosome string with other alleles of those loci, thereby forming a new individual.

In terms of the ability to generate new individuals, crossover is the main method and determines the global search ability of the genetic algorithm, while mutation is only an auxiliary method. Mutation is nevertheless an essential step, because it determines the local search ability of the algorithm. Together, the crossover and mutation operators carry out the global and local search of the search space, giving the genetic algorithm good search performance throughout the optimization process.
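For binary encoding, the standard mutation is a bit flip applied independently to each locus with a small probability. A Python sketch (the mutation rate and string length are illustrative):

```python
import random

def bit_flip_mutation(ind, pm=0.01):
    """Replace each gene with its other allele (0 <-> 1) independently
    with probability pm, forming a new individual."""
    return [1 - g if random.random() < pm else g for g in ind]

mutant = bit_flip_mutation([0] * 1000, pm=0.05)  # roughly 50 bits expected to flip
```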

3.2.6 Operating Parameters



4 Basic principles of genetic algorithm

4.1 Pattern Theorem



4.2 Building block hypothesis

Patterns (schemata) of low order and short defining length whose fitness is above the population average are called gene blocks or building blocks.

Building block hypothesis: under selection, crossover, mutation, and the other genetic operators, building blocks can be spliced together to form individual coding strings of higher fitness.

The building block hypothesis illustrates the basic idea behind solving all kinds of problems with genetic algorithms: better solutions can be produced by joining building blocks together directly.

II. Source code

clc;
close all
clear
load('data4.mat')
figure(1)
hold on
S=(S_coo(2)-0.5)*num_shange+(S_coo(1)+0.5);   % number of the grid cell containing the start point
E=(E_coo(2)-0.5)*num_shange+(E_coo(1)+0.5);   % number of the grid cell containing the end point
for i=1:num_shange
    for j=1:num_shange
        if sign(i,j)==1
            y=[i-1,i-1,i,i];
            x=[j-1,j,j,j-1];
            h=fill(x,y,'k');
            set(h,'facealpha',0.5)
        end
        s=(num2str((i-1)*num_shange+j));
        %text(j-0.95,i-0.5,s,'fontsize',6)
    end
end
axis([0 num_shange 0 num_shange])   % limit the bounds of the graph
plot(S_coo(2),S_coo(1),'p','markersize',10,'markerfacecolor','b','MarkerEdgeColor','m')   % draw the start point
plot(E_coo(2),E_coo(1),'o','markersize',10,'markerfacecolor','g','MarkerEdgeColor','c')   % draw the end point
set(gca,'YDir','reverse');   % flip the image vertically
for i=1:num_shange   % draw the grid lines
    plot([0 num_shange],[i-1 i-1],'k-');
    plot([i i],[0 num_shange],'k-');
end
PopSize=20;         % population size
OldBestFitness=0;   % previous best fitness value
gen=0;              % iteration counter
maxgen=20;          % maximum number of iterations
c1=0.5;             % cognitive coefficient
c2=0.7;             % social learning coefficient
w=0.96;             % inertia coefficient
w_min=0.5;
w_max=1;
%% initialize the paths
Group=ones(num_point,PopSize);   % population initialization
for i=1:PopSize
    p_lin=randperm(num_point)';          % random permutation of all grid cells
    index=find(p_lin==S); lin=p_lin(1);  % move the start cell S to the front
    p_lin(1)=p_lin(index); p_lin(index)=lin;
    Group(:,i)=p_lin;
    [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    while flag==1   % regenerate until the path is feasible
        p_lin=randperm(num_point)';
        index=find(p_lin==S); lin=p_lin(1);
        p_lin(1)=p_lin(index); p_lin(index)=lin;
        Group(:,i)=p_lin;
        [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    end
end
% draw one random initial route
route=Group(:,end)';
index1=find(route==E);
route_lin=route(1:index1);
for i=2:index1
    Q1=[mod(route_lin(i-1)-1,num_shange)+1-0.5,ceil(route_lin(i-1)/num_shange)-0.5];
    Q2=[mod(route_lin(i)-1,num_shange)+1-0.5,ceil(route_lin(i)/num_shange)-0.5];
    plot([Q1(1),Q2(1)],[Q1(2),Q2(2)],'b-.','LineWidth',3); hold on
end
title('Particle Swarm Optimization - Random Route');

figure(2)
hold on
for i=1:num_shange
    for j=1:num_shange
        if sign(i,j)==1
            y=[i-1,i-1,i,i];
            x=[j-1,j,j,j-1];
            h=fill(x,y,'k');
            set(h,'facealpha',0.5)
        end
        s=(num2str((i-1)*num_shange+j));
        text(j-0.95,i-0.5,s,'fontsize',6)
    end
end
axis([0 num_shange 0 num_shange])   % limit the bounds of the graph
plot(S_coo(2),S_coo(1),'p','markersize',10,'markerfacecolor','b','MarkerEdgeColor','m')   % draw the start point
plot(E_coo(2),E_coo(1),'o','markersize',10,'markerfacecolor','g','MarkerEdgeColor','c')   % draw the end point
set(gca,'YDir','reverse');   % flip the image vertically
for i=1:num_shange   % draw the grid lines
    plot([0 num_shange],[i-1 i-1],'k-');
    plot([i i],[0 num_shange],'k-');
end
for i=2:index1   % draw the same random route on figure 2
    Q1=[mod(route_lin(i-1)-1,num_shange)+1-0.5,ceil(route_lin(i-1)/num_shange)-0.5];
    Q2=[mod(route_lin(i)-1,num_shange)+1-0.5,ceil(route_lin(i)/num_shange)-0.5];
    plot([Q1(1),Q2(1)],[Q1(2),Q2(2)],'b-.','LineWidth',3)
end
% initial velocities
Velocity=zeros(num_point,PopSize);
for i=1:PopSize
    Velocity(:,i)=round(rand(1,num_point)'*num_point/10);
end
for i=1:PopSize
    EachPathDis(i)=PathDistance(Group(:,i)',E,num_shange);
end
IndivdualBest=Group;                % personal best position of each particle, i.e. the shortest path it has found
IndivdualBestFitness=EachPathDis;   % personal best fitness, i.e. the length of that path
[GlobalBestFitness,index]=min(EachPathDis);   % global best value and the corresponding index
%% search
while gen<maxgen
    w=w_max-(w_max-w_min)*gen/maxgen;   % inertia decreases as iterations increase
    gen=gen+1;
    for i=1:PopSize   % update the global best position, here a path
        GlobalBest(:,i)=Group(:,index);
    end
    pij_xij=GenerateChangeNums(Group,IndivdualBest);   % swap sequence toward the personal best
    pij_xij=HoldByOdds(pij_xij,c1);                    % keep swaps with probability c1
    pgj_xij=GenerateChangeNums(Group,GlobalBest);      % swap sequence toward the global best
    pgj_xij=HoldByOdds(pgj_xij,c2);                    % keep swaps with probability c2
    Velocity=HoldByOdds(Velocity,w);                   % keep velocity swaps with probability w
    Group=PathExchange(Group,Velocity);
    Group=PathExchange(Group,pij_xij);
    Group=PathExchange(Group,pgj_xij);
    for i=1:PopSize   % repair infeasible paths
        [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
        while flag==1
            p_lin=randperm(num_point)';
            index=find(p_lin==S); lin=p_lin(1);
            p_lin(1)=p_lin(index); p_lin(index)=lin;
            Group(:,i)=p_lin;
            [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
        end
    end
    for i=1:PopSize
        EachPathDis(i)=PathDistance(Group(:,i)',E,num_shange);
    end
    IsChange=EachPathDis<IndivdualBestFitness;
    IndivdualBest(:,find(IsChange))=Group(:,find(IsChange));   % update the personal best paths
    IndivdualBestFitness=IndivdualBestFitness.*(~IsChange)+EachPathDis.*IsChange;   % update the personal best path lengths
    [GlobalBestFitness,index]=min(IndivdualBestFitness);       % update the global best and record its index
    if GlobalBestFitness~=OldBestFitness   % record the new best route when the fitness improves
        OldBestFitness=GlobalBestFitness;
        best_route=IndivdualBest(:,index)';
    end
    BestFitness(gen)=GlobalBestFitness;
end
% draw the optimal route
index1=find(best_route==E);
route_lin=best_route(1:index1);
for i=2:index1
    Q1=[mod(route_lin(i-1)-1,num_shange)+1-0.5,ceil(route_lin(i-1)/num_shange)-0.5];
    Q2=[mod(route_lin(i)-1,num_shange)+1-0.5,ceil(route_lin(i)/num_shange)-0.5];
    plot([Q1(1),Q2(1)],[Q1(2),Q2(2)],'r','LineWidth',3)
end
% generate and draw two more random routes for comparison
for i=1:PopSize
    p_lin=randperm(num_point)';
    index=find(p_lin==S); lin=p_lin(1);
    p_lin(1)=p_lin(index); p_lin(index)=lin;
    Group(:,i)=p_lin;
    [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    while flag==1
        p_lin=randperm(num_point)';
        index=find(p_lin==S); lin=p_lin(1);
        p_lin(1)=p_lin(index); p_lin(index)=lin;
        Group(:,i)=p_lin;
        [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    end
end
route=Group(:,end)';
index3=find(route==E);
route_lin1=route(1:index3);
for i=2:index3
    Q1=[mod(route_lin1(i-1)-1,num_shange)+1-0.5,ceil(route_lin1(i-1)/num_shange)-0.5];
    Q2=[mod(route_lin1(i)-1,num_shange)+1-0.5,ceil(route_lin1(i)/num_shange)-0.5];
    plot([Q1(1),Q2(1)],[Q1(2),Q2(2)],'c-.','LineWidth',3); hold on
end
for i=1:PopSize
    p_lin=randperm(num_point)';
    index=find(p_lin==S); lin=p_lin(1);
    p_lin(1)=p_lin(index); p_lin(index)=lin;
    Group(:,i)=p_lin;
    [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    while flag==1
        p_lin=randperm(num_point)';
        index=find(p_lin==S); lin=p_lin(1);
        p_lin(1)=p_lin(index); p_lin(index)=lin;
        Group(:,i)=p_lin;
        [Group(:,i),flag]=deal_fun(Group(:,i),num_point,liantong_point,E,num_shange);
    end
end
route=Group(:,end)';
index2=find(route==E);
route_lin2=route(1:index2);
for i=2:index2
    Q1=[mod(route_lin2(i-1)-1,num_shange)+1-0.5,ceil(route_lin2(i-1)/num_shange)-0.5];
    Q2=[mod(route_lin2(i)-1,num_shange)+1-0.5,ceil(route_lin2(i)/num_shange)-0.5];
    plot([Q1(1),Q2(1)],[Q1(2),Q2(2)],'m-.','LineWidth',3); hold on
end

III. Operation results

IV. Note

MATLAB version: 2014a