I. A brief introduction to genetic algorithms

1 Introduction
2 Genetic algorithm theory
 2.1 Biological basis of genetic algorithms
 2.2 Theoretical basis of genetic algorithms
 2.3 Basic concepts of genetic algorithms
 2.4 The standard genetic algorithm
 2.5 Characteristics of genetic algorithms
 2.6 Improvement directions of genetic algorithms
3 Genetic algorithm flow
4 Key parameters

II. Introduction to simulated annealing algorithm

The idea of the simulated annealing (SA) algorithm was first proposed by Metropolis et al. in 1953; Kirkpatrick et al. first applied simulated annealing to combinatorial optimization problems in 1983 [1]. Simulated annealing is a stochastic optimization algorithm based on a Monte Carlo iterative solution strategy. Its starting point is the similarity between the annealing process of solid materials in physics and general combinatorial optimization problems, and its purpose is to provide an effective approximation algorithm for problems of NP (non-deterministic polynomial) complexity. It overcomes the tendency of other optimization procedures to fall into local minima, as well as their dependence on initial values.

Simulated annealing is a general-purpose optimization algorithm and an extension of local search. It differs from local search in that it accepts, with a certain probability, inferior solutions in the neighborhood whose objective values are larger. Theoretically, it is a global optimization algorithm. It is based on the similarity between solving an optimization problem and the annealing process of a physical system: it uses the Metropolis algorithm and appropriately controls the temperature-decrease process to realize simulated annealing and thereby solve the global optimization problem [2]. Applied to a minimization problem, the step length of each update is proportional to a parameter that plays the role of temperature. As in metal annealing, the temperature is first raised very high for faster minimization and then lowered slowly until the system stabilizes.

At present, research on simulated annealing is flourishing; both its theory and its applications are very active topics [3-7]. In particular, it has been widely used in engineering, for example in production scheduling, control engineering, machine learning, neural networks, pattern recognition, image processing, and structural optimization with discrete or continuous variables. It can effectively solve combinatorial optimization problems and complex function optimization problems that are difficult for conventional optimization methods.

The simulated annealing algorithm has very strong global search performance because, compared with common optimization search methods, it adopts several unique strategies. It requires essentially no knowledge of the search space or other auxiliary information: only a neighborhood structure needs to be defined, adjacent solutions are selected within that neighborhood, and the objective function is used to evaluate them. Moreover, the algorithm does not use deterministic rules but lets changes of probability guide its search direction. The probability it uses is only a tool to steer the search toward regions of better solutions; so although it appears to be a blind search method, it actually has a clear search direction.

2 Theory of simulated annealing algorithm

The simulated annealing algorithm is based on the similarity between the solution process of an optimization problem and the physical annealing process: the objective function being optimized corresponds to the internal energy of the metal, the state space of the independent variables corresponds to the state space of the metal's internal-energy configurations, and solving the problem means finding a state that minimizes the objective function value. Using the Metropolis algorithm and appropriately controlling the temperature-decrease process realizes simulated annealing and thereby solves the global optimization problem [8].

2.1 Physical annealing process

The core idea of the simulated annealing algorithm closely parallels thermodynamics. At high temperatures, the many molecules of a liquid move relatively freely with respect to one another. If the liquid cools slowly, the atoms lose their thermal mobility and can often arrange themselves into a pure crystal that is perfectly ordered in all directions over distances of millions of atomic spacings. For such a system, the crystal is the lowest-energy state, and all systems cooled slowly enough naturally reach it. If liquid metal is cooled rapidly, however, it does not reach this state but only a polycrystalline or amorphous state of higher energy. The essence of the process is therefore to cool slowly, buying enough time for the large number of atoms to redistribute before they lose mobility; this is necessary for the system to reach a low-energy state. Simply put, the physical annealing process consists of three parts: the heating process, the isothermal process, and the cooling process.

The purpose of the heating process is to enhance the thermal motion of particles and make them deviate from the equilibrium position. When the temperature is high enough, the solid will melt into liquid, thus eliminating the non-uniform state that may have existed in the system and allowing the subsequent cooling process to start with an equilibrium state. The melting process is related to the energy increase of the system, and the energy of the system also increases with the increase of temperature.

According to physics, for a closed system that stays at constant temperature while exchanging heat with its surroundings, the spontaneous change of the system state always proceeds in the direction of decreasing free energy; when the free energy reaches its minimum, the system reaches an equilibrium state.

The purpose of the cooling process is to weaken the thermal motion of the particles so that they gradually tend toward order and the system energy gradually decreases, yielding a low-energy crystal structure.

2.2 Principle of simulated annealing

The simulated annealing algorithm derives from the principle of solid annealing: the solid is heated to a sufficiently high temperature and then cooled slowly. As the temperature rises, the particles inside the solid become disordered and the internal energy increases; as the solid is cooled slowly, the particles become ordered, reaching an equilibrium state at every temperature and finally the ground state at room temperature, where the internal energy is minimal. The correspondence between the simulated annealing algorithm and the metal annealing process is shown in Table 7.1. According to the Metropolis criterion, the probability that a particle tends toward equilibrium at temperature T is exp(-ΔE/T), where E is the internal energy at temperature T and ΔE is its change. Mapping solid annealing onto a combinatorial optimization problem, the internal energy E is modeled as the objective function value and the temperature T becomes the control parameter; this yields the simulated annealing algorithm for combinatorial optimization. Starting from an initial solution x0 and an initial value of the control parameter T, the iteration "generate a new solution → compute the difference of the objective function → accept or discard" is repeated for the current solution while T is gradually reduced; the current solution when the algorithm terminates is the approximate optimal solution. This is a heuristic random search process of the Monte Carlo iterative solution method. The annealing process is controlled by the cooling schedule, which includes the initial value of the control parameter T and its decay factor K, the number of iterations at each value of T, and the stop condition.

2.3 Idea of the simulated annealing algorithm

The main idea of simulated annealing is to perform a random walk (i.e., random selection of points) in the search interval and use the Metropolis sampling criterion to make the random walk gradually converge to the optimal solution. Temperature is an important control parameter of the Metropolis algorithm; its magnitude can be regarded as controlling how fast the random process moves toward the local or global optimum. The Metropolis algorithm is an effective importance sampling method: when the system changes from one energy state to another, the energy changes from E1 to E2, and the change is accepted with probability exp(-(E2 - E1)/T) (taken as 1 when E2 ≤ E1). After a certain number of iterations, the system gradually tends toward a stable distribution. In importance sampling, a new state is accepted if it goes downhill (local optimization); if it goes uphill (global search), it is accepted with a certain probability. Starting from an initial solution and given a value of the control parameter, simulated annealing obtains a relatively optimal solution of the combinatorial optimization problem after a large number of solution transformations. The value of the control parameter T is then reduced and the Metropolis algorithm is executed again. When the control parameter T approaches zero, the global optimal solution of the combinatorial optimization problem can finally be obtained; the value of the control parameter must decay slowly. Since temperature is the key control parameter of the Metropolis algorithm, simulated annealing can be regarded as an iteration of the Metropolis algorithm under a decreasing control parameter T.
At the beginning, T is large and even markedly worse deteriorating solutions may be accepted; as T decreases, only mildly deteriorating solutions are accepted; finally, as T approaches 0, no deteriorating solutions are accepted at all. At infinitely high temperature, the system reaches the uniform distribution at once, accepting all proposed transitions. The smaller the decay of T per step, the longer it takes T to reach its end point; however, the Markov chain length can then be reduced to shorten the time needed to reach the quasi-equilibrium distribution.
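As a minimal sketch of the Metropolis acceptance rule described above (the function name and interface are illustrative, not part of the original source):

% Metropolis acceptance rule: decide whether to accept a candidate whose
% objective (energy) differs from the current one by deltaE at temperature T
function accept = metropolisAccept(deltaE, T)
if deltaE < 0
    accept = true;                     % better solution: always accept
else
    accept = rand < exp(-deltaE / T);  % worse solution: accept with prob exp(-dE/T)
end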

2.4 Characteristics of the simulated annealing algorithm

The simulated annealing algorithm has a wide range of application and high reliability in reaching the global optimal solution, and it is simple and easy to implement. Its search strategy helps it avoid getting trapped at local optima and improves the reliability of finding the global optimum. The algorithm is also very robust because, compared with ordinary optimization search methods, it adopts several unique techniques, mainly the following.

(1) It accepts deteriorating solutions with a certain probability. The simulated annealing algorithm introduces not only appropriate stochastic factors but also the natural mechanism of the physical annealing process. This mechanism allows the algorithm, during iteration, to accept not only points that make the objective function value "better" but also, with a certain probability, points that make it "worse". The states appearing during iteration are generated randomly, a later state is not required to be superior to the earlier one, and the acceptance probability gradually decreases as the temperature decreases. Many traditional optimization algorithms are deterministic: the transfer from one search point to the next follows fixed transfer rules and relations. Such determinism often leaves the search well short of the optimum, limiting the algorithm's range of application. Simulated annealing searches probabilistically, which makes the search process more flexible.

(2) It introduces algorithm control parameters. Control parameters analogous to the annealing temperature divide the optimization process into several stages and determine the criterion of random state selection at each stage; the acceptance function is given by the Metropolis algorithm with a simple mathematical model. The algorithm has two important steps. First, under each control parameter value, a neighboring random state is generated from the previous iterate, the acceptance criterion determined by the control parameter decides whether the new state is chosen, and a random Markov chain of a certain length is thus formed. Second, the control parameter is slowly reduced and the acceptance criterion tightened until the control parameter approaches zero and the state chain settles at the optimal state of the optimization problem, which improves the reliability of the global optimum.

(3) It makes few demands on the objective function. Traditional search algorithms usually need not only the objective function value but also auxiliary information such as its derivative to determine the search direction; when that information is unavailable, such algorithms fail. Simulated annealing needs no auxiliary information: only a neighborhood structure is defined, adjacent solutions are selected within it, and the objective function is used for evaluation.
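The effect of the temperature on the acceptance of deteriorating solutions follows directly from exp(-ΔE/T); a small illustrative script (the values are chosen for demonstration, not taken from the original source):

% Probability of accepting a fixed deterioration deltaE at falling temperatures
deltaE = 1;               % fixed increase of the objective value
T = [100 10 1 0.1];       % decreasing temperatures
p = exp(-deltaE ./ T);    % Metropolis acceptance probabilities
disp([T(:) p(:)])         % p falls from about 0.99 at T=100 to 4.5e-5 at T=0.1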

2.5 Improvement directions of the simulated annealing algorithm

The main goal in improving the simulated annealing algorithm is to raise its search efficiency while guaranteeing the required optimization quality [9-10]. Feasible approaches include: choosing an appropriate initial state; designing suitable state generation functions so that the candidate states show the required spread over the whole space or over a local region, as the search process demands; designing efficient annealing schedules; improving the temperature control scheme; using a parallel search structure; and designing suitable termination criteria. In addition, improvements can be realized by adding steps to the algorithm. The main methods are:

(1) Add a memory function. To avoid losing the best solution encountered so far because of the probabilistic acceptance step, a storage step can be added that keeps the best state found so far.

(2) Add a temperature-raising or reheating process. Raising the temperature at an appropriate point in the algorithm reactivates the acceptance probabilities of the states, adjusting the current state during the search and preventing the algorithm from stagnating at a local minimum (see the sketch after this list).

(3) For each current state, adopt a multiple-sampling strategy and accept the optimal state in the region with probability, instead of the single comparison of the standard simulated annealing algorithm.

(4) Combine with other search algorithms (such as genetic algorithms or immune algorithms), so that their respective advantages improve the efficiency and quality of the solution.
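A minimal sketch of improvements (1) and (2), with an assumed objective function and stagnation test (all names and constants here are illustrative):

% SA loop fragment with a memory function and reheating on stagnation
f = @(x) x.^2 + 10*sin(5*x);   % assumed one-dimensional objective
x = 5; xBest = x; fBest = f(x);
T = 10; stall = 0;
for k = 1:500
    xNew = x + 0.5*randn;                          % neighbor generation
    dE = f(xNew) - f(x);
    if dE < 0 || rand < exp(-dE/T), x = xNew; end  % Metropolis step
    if f(x) < fBest                                % (1) memory: keep best so far
        xBest = x; fBest = f(x); stall = 0;
    else
        stall = stall + 1;
    end
    if stall > 50                                  % (2) reheat when stagnating
        T = 2*T; stall = 0;
    end
    T = 0.98*T;                                    % cooling
end
fprintf('best x = %.4f, f = %.4f\n', xBest, fBest)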

3. Simulated annealing algorithm flow

The generation and acceptance of a new solution in the simulated annealing algorithm can be divided into three steps. (1) A generation function produces a new solution in the solution space from the current solution; to simplify the subsequent calculation and acceptance and reduce the algorithm's time consumption, the new solution is usually produced by a simple transformation of the current one. Note that the transformation used to generate new solutions determines the neighborhood structure of the current solution and therefore influences the choice of the cooling schedule. (2) Whether the new solution is accepted is judged by an acceptance criterion, most commonly the Metropolis criterion: if ΔE < 0, accept X' as the new current solution X; otherwise, accept X' as the new current solution X with probability exp(-ΔE/T). (3) When the new solution is accepted, it replaces the current solution; this only requires carrying out the transformation that turned the current solution into the new one and updating the objective function value. The current solution has then completed one iteration, and the next round can start from it. If the new solution is discarded, the next round continues from the original current solution.

The solution obtained by simulated annealing is independent of the initial solution state (the starting point of the iteration) and is asymptotically convergent; it has been proved to be an optimization algorithm that converges to the global optimum with probability 1. The algorithm is specified by three parts: solution space, objective function, and initial solution. The specific procedure is as follows [8]:

(1) Initialization: set the initial temperature T (sufficiently large), the initial solution state x0 (the starting point of the iteration), and the number of iterations L for each value of T;
(2) For k = 1, ..., L, do steps (3) to (6);
(3) Generate a new solution X';
(4) Compute the increment ΔE = E(X') - E(X), where E(·) is the evaluation function;
(5) If ΔE < 0, accept X' as the new current solution; otherwise accept X' as the new current solution with probability exp(-ΔE/T);
(6) If the termination condition is met, output the current solution as the optimal solution and end the program;
(7) Decrease T gradually with T → 0, and go to step (2).

The simulated annealing algorithm flow is shown in Figure 7.1, and a minimal code sketch of it follows.
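A self-contained MATLAB sketch of steps (1)-(7) (the objective function, neighborhood move, and parameter values are illustrative assumptions, not the berth-scheduling code of Section III):

% Simulated annealing following steps (1)-(7) above
f = @(x) x.^2 - 10*cos(2*pi*x) + 10;   % assumed test objective (1-D Rastrigin)
T = 100;                               % (1) initial temperature, sufficiently large
x = 10*rand - 5;                       % (1) initial solution state x0
L = 200;                               % (1) iterations per temperature value
Tmin = 1e-3;                           % termination threshold on T
while T > Tmin                         % (7) T decreases gradually toward 0
    for k = 1:L                        % (2) inner loop at constant T
        xNew = x + randn;              % (3) generate a new solution
        dE = f(xNew) - f(x);           % (4) increment of the evaluation function
        if dE < 0 || rand < exp(-dE/T) % (5) Metropolis acceptance criterion
            x = xNew;
        end
    end
    T = 0.9*T;                         % exponential cooling
end
fprintf('approximate optimum: x = %.4f, f(x) = %.4f\n', x, f(x))   % (6) output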
4 Key parameters

The simulated annealing algorithm is of high quality, quite general, and easy to implement. However, to obtain the optimal solution it usually requires a high initial temperature and a sufficient number of samples, so its optimization time is often long. In terms of algorithm structure, the new-state generation function, the initial temperature, the temperature update function, the Markov chain length, and the stop criterion are the main components that directly affect the optimization result.

(1) State generation function. The state generation function should be designed so that the generated candidate solutions spread over the entire solution space as far as possible. In general, it consists of two parts: the way candidate solutions are produced and the probability distribution over them. The generation mode is determined by the nature of the problem; candidates are usually generated with a certain probability within the neighborhood structure of the current state.

(2) Initial temperature. The temperature T plays a decisive role in the algorithm, directly controlling the course of the annealing. According to the acceptance criterion for random moves, the higher the initial temperature, the greater the probability of obtaining a high-quality solution, with a Metropolis acceptance rate of about 1; but too high an initial temperature increases the computation time. To this end, a group of states can be sampled uniformly and the variance of their objective values used as the initial temperature (see the sketch after this list).

(3) Temperature update function. The temperature update function modifies the temperature value in the outer loop. At present, the most commonly used one is the exponential cooling function T(n+1) = K × T(n), where 0 < K < 1 is a constant very close to 1.

(4) Markov chain length L. The Markov chain length is the number of iterative optimizations performed under isothermal conditions. The selection principle is that, once the decay function of the control parameter T has been chosen, L should be large enough to restore quasi-equilibrium at each value of the control parameter; typically L is 100 to 1000.

(5) Algorithm stop criterion. The stop criterion determines when the algorithm ends. One can simply set a final temperature Tf and terminate when T reaches Tf; however, the convergence theory of simulated annealing requires T to tend to zero, which is impractical. Commonly used stop criteria include: setting a threshold for the termination temperature, setting a threshold for the number of iterations, or stopping the search when the optimal value remains unchanged for several consecutive steps.
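A sketch of the initial-temperature heuristic and the exponential cooling function just described (the objective and sample size are assumptions for illustration):

% Estimate the initial temperature from uniformly sampled states,
% then build the exponential cooling sequence T(n+1) = K*T(n)
f  = @(x) x.^2 + 10*sin(5*x);   % assumed objective
xs = -10 + 20*rand(1, 100);     % 100 states sampled uniformly on [-10, 10]
T0 = var(f(xs));                % variance of the objective values as T0
K  = 0.95;                      % decay constant, 0 < K < 1 and close to 1
T  = T0 * K.^(0:49);            % temperature at each outer iteration
plot(T); xlabel('outer iteration n'); ylabel('temperature T(n)')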

III. Some source code

%Name: main.m
clear; clc; close all
%% Genetic algorithm parameters
popsize = 100;      % population size
MAXGEN = 50;        % maximum number of generations
vesselNum = 7;      % number of ships
berthNum = 2;       % number of berths
chromlength = 8;    % chromosome length
Pc = 0.6;           % crossover probability
Pm = 0.001;         % mutation probability
GGAP = 0.9;         % generation gap (selection probability)
%% Initialize the population (randomly generated)
pop = initGroup(popsize, chromlength, vesselNum, berthNum);
%% Output a random initial berth scheduling scheme and the ships' total time in port
disp('Berth scheduling scheme in initial population:')
OutputPath(pop);    % print the schedule
disp('Total time of ships in port for the initial random solution:')
objvalue = calobjvalue(pop, popsize, vesselNum, berthNum);
disp(objvalue(1))
disp('----------------------------------------------------------------')
%% Optimization
gen = 0;
figure; hold on; box on   % box on displays the boundary of the current axes
xlim([0, MAXGEN])         % range of the x axis
title('Optimization process')
xlabel('Number of iterations')
ylabel('Optimal solution for each iteration of ship time in port')
objvalue = calobjvalue(pop, popsize, vesselNum, berthNum);
preobjvalue = min(objvalue);
while gen < MAXGEN
    %% Compute the objective function values
    objvalue = calobjvalue(pop, popsize, vesselNum, berthNum);
    % line([X1 X2],[Y1 Y2]) draws a line from A(X1,Y1) to B(X2,Y2)
    line([gen-1, gen], [preobjvalue, min(objvalue)]);
    pause(0.0001)
    preobjvalue = min(objvalue);
    %% Fitness, selection, crossover, mutation, reinsertion
    fitvalue = fitness(objvalue, popsize);
    newpop = Selection(pop, fitvalue, GGAP);
    newpop = Crossover(newpop, Pc);
    newpop = Mutate(newpop, Pm);
    pop = Reins(pop, newpop, objvalue);
    gen = gen + 1;
end
%% Output the optimal berth scheduling scheme and total time in port
objvalue = calobjvalue(pop, popsize, vesselNum, berthNum);
[minobjvalue, minIndex] = min(objvalue);   % [Y,U] = min(A) returns the minimum Y and its index U
minpop = pop(minIndex, :);
disp('Optimal berth scheduling scheme:')
OutputPath(minpop);
disp('Optimal total time of ships in port:')
disp(min(objvalue));
disp('----------------------------------------------------------------')

%Name: calobjvalue.m
%% Compute the objective function value
% Input variables:
%   pop        population
%   popsize    population size
%   vesselNum  number of ships
%   berthNum   number of berths
% Output variables:
%   objvalue   total time in port for each chromosome
function objvalue = calobjvalue(pop, popsize, vesselNum, berthNum)
objvalue = zeros(popsize, 1);
for k = 1:popsize
    chromsome = pop(k, :);
    % shipInfo holds the scheduling data of one chromosome; its first
    % column stores the ship numbers taken from the chromosome
    shipInfo = zeros(vesselNum + berthNum - 1, 5);
    i = 1;
    while i <= length(chromsome)
        shipInfo(i, 1) = chromsome(i);
        i = i + 1;
    end
    % Ship data: column 1 is the ship number, column 2 the time of arrival,
    % column 3 the time of loading/unloading containers
    ship = [1  0 12
      2 4 10
      3 6 3
      4 9 8
      5 11 5
      6 18 12
      7 19 4];            
    matrix = ship(1:vesselNum, 1:3);
    % From here on, the shipInfo columns are: 1 ship number (0 is the
    % berth delimiter), 2 waiting time of the ship, 3 idle time of the
    % berth before the ship's operation, 4 departure time, 5 time in port
    i = 1;
    j = 1;
    berthNo = 1;
    while i <= length(chromsome)
        % Process the ships berth by berth in scheduling order; a ship
        % number of 0 is the delimiter between two berths
        vesselNo = shipInfo(i, 1);
        if vesselNo == 0
            j = 0;
            berthNo = berthNo + 1;
        else
            % Look up vesselNo in the matrix to extract the ship's
            % arrival time and loading/unloading operation time
            m = find(matrix(:, 1) == vesselNo);
            arrivetime = matrix(m, 2);   % arrival time
            loadtime = matrix(m, 3);     % loading/unloading operation time
            % Compute waiting time, berth idle time before the operation,
            % departure time and time in port
            if j == 1   % first ship at this berth
                shipInfo(i, 2) = 0;                       % waiting time
                shipInfo(i, 3) = arrivetime - 0;          % berth idle time
                shipInfo(i, 4) = arrivetime + loadtime;   % departure time
                shipInfo(i, 5) = loadtime;                % time in port
            else
                if arrivetime > shipInfo(i-1, 4)
                    % Arrival after the previous ship's departure:
                    % no waiting, but the berth is idle in between
                    shipInfo(i, 2) = 0;
                    shipInfo(i, 3) = arrivetime - shipInfo(i-1, 4);
                    shipInfo(i, 4) = arrivetime + loadtime;
                    shipInfo(i, 5) = loadtime;
                else
                    % Arrival before the previous ship departs:
                    % the ship must wait for the berth to become free
                    shipInfo(i, 2) = shipInfo(i-1, 4) - arrivetime;
                    shipInfo(i, 3) = 0;
                    shipInfo(i, 4) = shipInfo(i-1, 4) + loadtime;
                    shipInfo(i, 5) = shipInfo(i, 2) + loadtime;
                end
            end
        end
        i = i + 1;
        j = j + 1;
    end
    % Objective: total time in port over all ships
    objvalue(k) = sum(shipInfo(:, 5));
end

%Name: Crossover.m
%% Crossover
% Input variables:
%   pop1  selected individuals
%   Pc    crossover probability
% Output variables:
%   newpop  population after crossover
function newpop = Crossover(pop1, Pc)
Npop1 = size(pop1, 1);
newpop=pop1;
for i = 1:2:Npop1 - mod(Npop1, 2)
    if Pc >= rand   % cross with probability Pc
        [newpop(i, :), newpop(i+1, :)] = intercross(pop1(i, :), pop1(i+1, :));
    end
end

%% Partially matched crossover of two chromosomes
function [a, b] = intercross(a, b)
L = length(a);
% RANDSRC(M,N,ALPHABET) generates an M-by-N random matrix using the
% alphabet specified in ALPHABET
r1 = randsrc(1, 1, 1:L);
r2 = randsrc(1, 1, 1:L);
if r1 ~= r2
    a0 = a; b0 = b;
    s = min([r1, r2]);
    e = max([r1, r2]);
    for i = s:e
        a1 = a; b1 = b;
        a(i) = b0(i);
        b(i) = a0(i);
        % Repair the duplicates introduced by the swap
        x = find(a == a(i));
        y = find(b == b(i));
        i1 = x(x ~= i);
        i2 = y(y ~= i);
        if ~isempty(i1)
            a(i1) = a1(i);
        end
        if ~isempty(i2)
            b(i2) = b1(i);
        end
    end
end

%Name: Sus.m
%% SUS (Stochastic Universal Sampling)
% Input variables:
%   FitValue  fitness values
%   NSel      number of individuals to select
% Output variables:
%   NewChrIx  indices of the selected individuals
function NewChrIx = Sus(FitValue, NSel)
[Nind, ~] = size(FitValue);
cumfit = cumsum(FitValue);   % cumulative sum of the fitness values
% NSel equally spaced pointers with a common random offset
trials = cumfit(Nind) / NSel * (rand + (0:NSel-1)');
Mf = cumfit(:, ones(1, NSel));
Mt = trials(:, ones(1, Nind))';
[NewChrIx, ~] = find(Mt < Mf & [zeros(1, NSel); Mf(1:Nind-1, :)] <= Mt);
[~, shuf] = sort(rand(NSel, 1));
NewChrIx = NewChrIx(shuf);
% RWS (Roulette Wheel Selection) tends to have a large selection bias,
% while SUS statistically avoids it.
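A hypothetical usage of Sus inside a selection routine (Selection itself is not included in this excerpt, so the call pattern and values below are assumptions):

% Hypothetical example: select NSel parents from a population by SUS
FitValue = rand(100, 1) + 0.1;   % assumed column vector of fitness values
NSel = 90;                       % e.g. round(GGAP * popsize) individuals
idx = Sus(FitValue, NSel);       % row indices of the selected chromosomes
% parents = pop(idx, :);         % pop is the population matrix (assumed)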

IV. Operation results

V. MATLAB version and references

1. MATLAB version: 2014a

[1] Yang Baoyang, Yu Jizhou, Yang Shan. Intelligent Optimization Algorithm and Its MATLAB Example (2nd Edition) [M]. Publishing House of Electronics Industry, 2016.
[2] Zhang Yan, Wu Shuigen. MATLAB Optimization Algorithm Source Code [M]. Tsinghua University Press, 2017.