1. Application background of simulated annealing algorithm

The simulated annealing algorithm was proposed in 1982. Kirkpatrick et al. were the first to recognize the analogy between the annealing of solids and optimization problems, and they drew on Metropolis et al.'s simulation of the process by which a solid reaches thermal equilibrium at a fixed temperature. By embedding the Metropolis acceptance criterion in an iterative optimization procedure, they obtained an algorithm that mimics the solid annealing process, which came to be called the "simulated annealing algorithm".

Simulated annealing is a stochastic search algorithm well suited to large-scale combinatorial optimization problems. It has produced satisfactory results on the traveling salesman problem (TSP), VLSI circuit design, and other combinatorial optimization problems. Combining simulated annealing with other computational intelligence methods for modeling and optimizing complex systems has attracted growing attention and has gradually become an important direction of development.

2. Introduction to simulated annealing algorithm
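In brief, the algorithm repeatedly perturbs a candidate solution and accepts a worse candidate with probability exp(-ΔE/(kT)), so uphill moves are common at high temperature and rare once the system has cooled. A small numeric illustration in Python (the ΔE value is made up for illustration; the Boltzmann constant k = 1 and the initial temperature 8 match the source code further down):

```python
import math

k = 1.0        # Boltzmann constant, taken as 1 as in the source code below
delta_E = 0.5  # illustrative increase in the objective (a "worse" move)

# Acceptance probability exp(-dE/(k*T)) at a high and a low temperature
for T in (8.0, 0.5):
    p = math.exp(-delta_E / (k * T))
    print(f"T={T}: accept worse move with probability {p:.3f}")
```

At T = 8 the worse move is accepted with probability about 0.939, while at T = 0.5 the probability drops to about 0.368, which is what makes the search increasingly greedy as it cools.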

3. Parameters of simulated annealing algorithm

Simulated annealing is an optimization algorithm, but it cannot exist on its own: it needs an application. In that application, one or several parameters are to be optimized, and it is this parameter or parameter set whose search the temperature drives. If simulated annealing is applied to cluster analysis, for example, the quantity being optimized could be some index, some correlation measure, some distance, and so on.
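To make this concrete, here is a minimal simulated-annealing loop in Python. The cooling-table values (initial temperature 8, final temperature 3, decay factor 0.85, step factor 0.2, Markov chain length 10) match those used in the MATLAB source below, while the one-dimensional quadratic objective is only a stand-in for the BP network error:

```python
import math
import random

def simulated_annealing(objective, x0, t0=8.0, t_end=3.0, decay=0.85,
                        step=0.2, markov_len=10):
    """Minimize `objective` starting from x0 with simulated annealing."""
    x_cur, x_best = x0, x0
    t = t0
    while t > t_end:                 # cool until the final temperature
        for _ in range(markov_len):  # Markov chain at this temperature
            x_new = x_cur + step * (random.random() - 0.5)
            if objective(x_new) < objective(x_best):
                x_best = x_new       # record the best solution so far
            delta = objective(x_new) - objective(x_cur)
            # Metropolis rule: always accept improvements; accept a
            # worse move with probability exp(-delta/t)
            if delta < 0 or math.exp(-delta / t) > random.random():
                x_cur = x_new
        t *= decay                   # temperature update (cooling)
    return x_best

random.seed(0)
best = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=0.0)
print(best)
```

With this cooling table the outer loop runs only 7 times (the smallest n with 8·0.85ⁿ ≤ 3), so in practice the final temperature and decay factor directly control how much search the algorithm gets to do.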

II. Source code

clear all; clc; warning off
%% Load data (Exchange_rate.mat provides the series X)
load Exchange_rate.mat
x = []; y = [];
tr_len = 800;
num_input = 10;
for i = 1:length(X)-num_input
    x = [x; X(i:i+num_input-1)];
    y = [y; X(i+num_input)];
end
% Training set -- 800 samples
input_train = x(1:tr_len, :)';
output_train = y(1:tr_len)';
% Test set -- 52 samples
input_test = x(tr_len+1:end, :)';
output_test = y(tr_len+1:end)';
%% BP network setup
[inputnum, N] = size(input_train);       % number of input nodes
outputnum = size(output_train, 1);       % number of output nodes
hiddennum = 5;                           % number of hidden nodes
[inputn, inputps] = mapminmax(input_train, 0, 1);    % normalize inputs to [0,1]
[outputn, outputps] = mapminmax(output_train, 0, 1); % normalize outputs to [0,1]
net = newff(inputn, outputn, hiddennum); % build the network
%% SA algorithm parameter initialization
% Total number of weights and thresholds to optimize
nvar = inputnum*hiddennum + hiddennum + hiddennum*outputnum + outputnum;
%% SA algorithm main procedure
lb = -1*ones(nvar,1);       % lower bound
ub = ones(nvar,1);          % upper bound
% Cooling table parameters
MarkovLength = 10;          % Markov chain length
DecayScale = 0.85;          % attenuation (decay) parameter
StepFactor = 0.2;           % Metropolis step factor
Temperature0 = 8;           % initial temperature
TemperatureEnd = 3;         % final temperature
Boltzmann_con = 1;          % Boltzmann constant
AcceptPoints = 0.0;         % total number of Metropolis acceptances
% Random initialization within the parameter range
range = ub - lb;
Par_cur = rand(size(lb)).*range + lb;   % Par_cur is the current solution
Par_best_cur = Par_cur;
Par_best = rand(size(lb)).*range + lb;  % Par_best is the best solution found during cooling
% Anneal (cool) once per iteration until the stopping condition is met
t = Temperature0;
itr_num = 0;                % iteration counter
while t > TemperatureEnd
    itr_num = itr_num + 1;
    itr_num                 % display progress
    t = DecayScale*t;       % temperature update (cooling)
    for i = 1:MarkovLength
        % Randomly select the next point near the current point
        p = 0;
        while p == 0
            Par_new = Par_cur + StepFactor.*range.*(rand(size(lb)) - 0.5);
            % Reject points that trespass the bounds
            if sum(Par_new > ub) + sum(Par_new < lb) == 0
                p = 1;
            end
        end
        % Test whether the new solution is the global optimum so far
        if (objfun_BP(Par_best,inputnum,hiddennum,outputnum,net,inputn,outputn) > ...
                objfun_BP(Par_new,inputnum,hiddennum,outputnum,net,inputn,outputn))
            Par_best_cur = Par_best;  % retain the previous optimum
            Par_best = Par_new;       % this is the new optimum
        end
        % Metropolis procedure
        if (objfun_BP(Par_cur,inputnum,hiddennum,outputnum,net,inputn,outputn) - ...
                objfun_BP(Par_new,inputnum,hiddennum,outputnum,net,inputn,outputn) > 0)
            % The new solution is better: accept it
            Par_cur = Par_new;
            AcceptPoints = AcceptPoints + 1;
        else
            % The new solution is worse: accept it with probability
            % exp(-dE/(k*t)), using the current temperature t
            changer = -1*(objfun_BP(Par_new,inputnum,hiddennum,outputnum,net,inputn,outputn) ...
                - objfun_BP(Par_cur,inputnum,hiddennum,outputnum,net,inputn,outputn))/(Boltzmann_con*t);
            p1 = exp(changer);
            if p1 > rand
                Par_cur = Par_new;
                AcceptPoints = AcceptPoints + 1;
            end
        end
    end
end
%% Result: assign the optimal initial weights and thresholds to the network
% The BP network optimized by simulated annealing is used for prediction
x = Par_best';
w1 = x(1:inputnum*hiddennum);
B1 = x(inputnum*hiddennum+1 : inputnum*hiddennum+hiddennum);
w2 = x(inputnum*hiddennum+hiddennum+1 : inputnum*hiddennum+hiddennum+hiddennum*outputnum);
B2 = x(inputnum*hiddennum+hiddennum+hiddennum*outputnum+1 : ...
     inputnum*hiddennum+hiddennum+hiddennum*outputnum+outputnum);
net.iw{1,1} = reshape(w1, hiddennum, inputnum);
net.lw{2,1} = reshape(w2, outputnum, hiddennum);
net.b{1} = reshape(B1, hiddennum, 1);
net.b{2} = B2;
%% BP network training
net.trainParam.epochs = 100;
net.trainParam.lr = 0.1;
net.trainParam.mc = 0.8;      % momentum coefficient, between [0,1]
net.trainParam.goal = 0.001;
net = train(net, inputn, outputn);
%% Training-set prediction
BP_sim = sim(net, inputn);                       % network output
T_sim = mapminmax('reverse', BP_sim, outputps);  % inverse normalization
figure
plot(1:length(output_train), output_train, 'b-', 'linewidth', 1)
hold on
plot(1:length(T_sim), T_sim, 'r-.', 'linewidth', 1)
axis tight
xlabel('Training sample', 'FontSize', 12);
ylabel('Currency', 'FontSize', 12);
legend('Actual value', 'Predicted value');
string = {'SA-BP prediction'};
title(string);
%% Test-set prediction
inputn_test = mapminmax('apply', input_test, inputps);  % normalize test data
an = sim(net, inputn_test);
BPsim = mapminmax('reverse', an, outputps);
figure
plot(1:length(output_test), output_test, 'b-', 'linewidth', 1)
hold on
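The optimized parameter vector Par_best packs all the network's weights and thresholds, which is why nvar = inputnum*hiddennum + hiddennum + hiddennum*outputnum + outputnum. With the sizes used above (10 inputs, 5 hidden nodes, and, as the scalar target series implies, 1 output), the slicing works out as follows. A Python/NumPy sketch of the same bookkeeping (note that MATLAB's reshape is column-major while NumPy's default is row-major, so only the sizes, not the element order, are mirrored here):

```python
import numpy as np

inputnum, hiddennum, outputnum = 10, 5, 1  # sizes used in the source code
nvar = inputnum*hiddennum + hiddennum + hiddennum*outputnum + outputnum
print(nvar)  # 10*5 + 5 + 5*1 + 1 = 61 parameters

x = np.arange(nvar, dtype=float)  # stand-in for the optimized vector Par_best
# Split into input->hidden weights, hidden thresholds, hidden->output
# weights, and output thresholds, mirroring the MATLAB index arithmetic
i1 = inputnum*hiddennum
i2 = i1 + hiddennum
i3 = i2 + hiddennum*outputnum
w1 = x[:i1].reshape(hiddennum, inputnum)
b1 = x[i1:i2].reshape(hiddennum, 1)
w2 = x[i2:i3].reshape(outputnum, hiddennum)
b2 = x[i3:]
assert w1.size + b1.size + w2.size + b2.size == nvar
```

Every entry of x is consumed exactly once, so the four slices partition the 61 parameters with no gaps or overlaps.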

III. Operation results

IV. Remarks

MATLAB version: R2014a