1 A brief introduction to the simulated annealing algorithm

The idea of the simulated annealing algorithm (SA) was first proposed by Metropolis et al. in 1953; Kirkpatrick first used simulated annealing to solve combinatorial optimization problems in 1983 [1]. Simulated annealing is a stochastic optimization algorithm based on a Monte Carlo iterative solution strategy. Its starting point is the similarity between the annealing process of a physical solid and a general combinatorial optimization problem, and its purpose is to provide an effective approximate solution algorithm for problems of NP (non-deterministic polynomial) complexity. It overcomes the defects of other optimization methods, which are prone to falling into local minima and depend on the initial value. Simulated annealing is a general optimization algorithm and an extension of local search; it differs from local search in that it accepts, with a certain probability, inferior solutions in the neighborhood whose objective values are larger. Theoretically, it is a global optimization algorithm. Based on the similarity between solving an optimization problem and annealing a physical system, it uses the Metropolis algorithm and an appropriately controlled temperature-drop process to realize simulated annealing and thereby solve the global optimization problem [2]. Simulated annealing can be applied to minimization problems in which the step length of the updating process is proportional to a corresponding parameter that plays the role of temperature. As in metal annealing, the temperature is initially raised very high for faster minimization and then lowered slowly to stabilize. At present, simulated annealing is flourishing, and both its theory and its applications are popular research topics [3-7].
In particular, it has been widely used in engineering, for example in production scheduling, control engineering, machine learning, neural networks, pattern recognition, image processing, and the structural optimization of discrete/continuous variables. It can effectively solve combinatorial optimization problems and complex function optimization problems that are difficult for conventional optimization methods. Simulated annealing has very strong global search performance because, compared with common optimization search methods, it adopts several unique techniques. It requires essentially no knowledge of the search space or other auxiliary information: it only needs a neighborhood structure to be defined, selects adjacent solutions within that neighborhood, and evaluates them with the objective function. The algorithm does not use deterministic rules; instead, it uses changes in probability to guide its search direction, and the probability it uses is only a tool to steer the search toward regions of better solutions. Therefore, although it appears to be a blind search method, it actually has a clear search direction.

2 Theory of the simulated annealing algorithm

The solving process of simulated annealing is based on its similarity to the physical annealing process: the objective function to be optimized is equivalent to the internal energy of the metal, the state space of the optimization problem's independent variables is equivalent to the state space of the metal's internal energy, and solving the problem means searching for a state that minimizes the objective function value. Using the Metropolis algorithm and appropriately controlling the temperature-drop process realizes simulated annealing and thereby solves the global optimization problem [8].

2.1 Physical annealing process

The core idea of simulated annealing is very similar to a principle of thermodynamics. At high temperatures, a large number of molecules in a liquid move relatively freely with respect to one another. If the liquid is cooled slowly, the thermal mobility of the atoms is lost, and large numbers of atoms can often arrange themselves into a pure crystal, perfectly ordered in all directions over distances of millions of atomic spacings. For such a system, the crystal is the lowest-energy state, and any slowly cooled system can naturally reach it. If the liquid metal is cooled rapidly, however, it does not reach this state but only a polycrystalline or amorphous state of higher energy. The essence of the process is therefore to cool slowly, buying enough time for the atoms to redistribute before they lose their mobility, which is necessary for the system to reach a low-energy state. Simply put, the physical annealing process consists of three parts: a heating process, an isothermal process, and a cooling process.

The purpose of the heating process is to enhance the thermal motion of the particles and make them deviate from their equilibrium positions. When the temperature is high enough, the solid melts into a liquid, which eliminates any non-uniform state that may have existed in the system and allows the subsequent cooling process to start from an equilibrium state. The melting process increases the energy of the system, and the system's energy also increases with temperature.

According to physics, for a closed system that stays at constant temperature while exchanging heat with its surroundings, the spontaneous change of the system state always proceeds in the direction of decreasing free energy; when the free energy reaches its minimum, the system reaches an equilibrium state.

The purpose of the cooling process is to weaken the thermal motion of the particles so that they gradually tend toward order and the system energy gradually decreases, yielding a low-energy crystal structure.

2.2 Principle of simulated annealing

The simulated annealing algorithm is derived from the principle of solid annealing: the solid is heated to a sufficiently high temperature and then cooled slowly. As the temperature rises, the particles inside the solid become disordered and the internal energy increases; as the solid is cooled slowly, the particles become ordered, reaching equilibrium at every temperature and finally reaching the ground state at the lowest temperature, where the internal energy is minimal. The similarity between the simulated annealing algorithm and the metal annealing process is shown in Table 7.1. According to the Metropolis criterion, the probability that the particles approach equilibrium at temperature T is exp(-ΔE/T), where E is the internal energy at temperature T and ΔE is its change. Using solid annealing to simulate a combinatorial optimization problem, the internal energy E is mapped to the objective function value and the temperature T becomes a control parameter, which yields the simulated annealing algorithm for combinatorial optimization: starting from an initial solution x0 and an initial value of the control parameter T, repeat the iteration "generate a new solution → compute the objective-function difference → accept or discard" for the current solution while gradually decreasing T. The current solution when the algorithm terminates is an approximate optimal solution. This is a heuristic random search process based on the Monte Carlo iterative solution method. The annealing process is controlled by the cooling schedule, which comprises the initial value of the control parameter T and its decay factor K, the number of iterations at each value of T, and the stop condition.

2.3 Idea of the simulated annealing algorithm

The main idea of simulated annealing is to let a random walk (i.e., randomly selected points) in the search interval converge gradually to the optimal solution by applying the Metropolis sampling criterion. Temperature is an important control parameter of the Metropolis algorithm, and its magnitude can be regarded as controlling how fast the random process moves toward a local or the global optimal solution. Metropolis sampling is an effective importance-sampling method: when the system changes from one energy state E1 to another state E2, the change is accepted with probability exp(-(E2-E1)/T), and after a certain number of iterations the system gradually tends to a stable distribution. In this sampling, a downhill move is always accepted (local optimization), while an uphill move (global search) is accepted with a certain probability. Starting from some initial solution, the simulated annealing algorithm obtains, after a large number of solution transformations, the relative optimal solution of the combinatorial optimization problem at the given control-parameter value. The value of the control parameter T is then reduced and the Metropolis algorithm executed again; as T approaches zero, the global optimal solution of the combinatorial optimization problem is finally obtained. The value of the control parameter must decay slowly: simulated annealing can be regarded as iterating the Metropolis algorithm while the control parameter T decreases. At the beginning T is large, so even strongly deteriorating solutions can be accepted; as T decreases, only slightly deteriorating solutions are accepted; finally, as T approaches 0, no deteriorating solution is accepted at all. At infinitely high temperature the system is immediately uniformly distributed and accepts all proposed transformations. The smaller the decay of T per step, the longer it takes T to reach its end point; the Markov chain length can, however, be reduced to shorten the time needed to reach the quasi-equilibrium distribution.
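The Metropolis acceptance rule described above (always take a downhill move; take an uphill move of size ΔE at temperature T with probability exp(-ΔE/T)) can be sketched in a few lines. Python is used here for illustration; the function name and the injectable random source are our own, not part of the post's MATLAB code.

```python
import math
import random

def metropolis_accept(delta_e, T, rng=random.random):
    """Metropolis criterion: always accept an improving move (delta_e <= 0);
    accept a worsening move with probability exp(-delta_e / T)."""
    if delta_e <= 0:
        return True
    return rng() < math.exp(-delta_e / T)

# A downhill move is always taken; an uphill move of 5 units is very
# unlikely at T = 1 (probability exp(-5) ~ 0.007) but very likely at T = 1000.
always = metropolis_accept(-1.0, T=10.0)
```

Passing `rng` explicitly makes the rule deterministic for testing; in a real annealer it would simply be the global random stream.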

2.4 Characteristics of the simulated annealing algorithm

The simulated annealing algorithm has a wide range of application and high reliability in reaching the global optimal solution, and it is simple and easy to implement. Its search strategy helps it escape local optima and improves the reliability of finding the global optimum. The algorithm is very robust because, compared with ordinary optimization search methods, it adopts several unique techniques, mainly the following.

(1) It accepts deteriorating solutions with a certain probability. Simulated annealing introduces not only appropriate stochastic factors but also the natural mechanism of the physical annealing process. This mechanism lets the algorithm accept, during iteration, not only points that make the objective value "better" but also, with a certain probability, points that make it "worse". The states appearing during iteration are generated randomly, a later state is not required to be superior to the previous one, and the acceptance probability gradually decreases as the temperature drops. Many traditional optimization algorithms are deterministic: the transfer from one search point to the next follows fixed rules and relations. This determinism often leaves the search point far from the optimum and thus limits the scope of the algorithm. Simulated annealing searches probabilistically, which makes the search process more flexible.

(2) It introduces algorithm control parameters. Control parameters analogous to the annealing temperature are introduced; they divide the optimization process into stages and determine the criterion for random state selection at each stage.
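Point (1), that the probability of accepting a worse point shrinks as the temperature drops, is easy to see numerically. A Python sketch with an illustrative fixed uphill increment ΔE = 1 and a falling temperature ladder:

```python
import math

def acceptance_probability(delta_e, T):
    # Metropolis acceptance probability for an uphill move (delta_e > 0)
    return math.exp(-delta_e / T)

# The same worsening move, evaluated as the system cools:
probs = [acceptance_probability(1.0, T) for T in (100.0, 10.0, 1.0, 0.1)]
# probs falls monotonically; at T = 0.1 the move is accepted
# less than once in ten thousand trials.
```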
The acceptance function is given by the Metropolis algorithm as a simple mathematical model. The simulated annealing algorithm has two important steps: first, at each value of the control parameter, a neighboring random state is generated from the previous iteration point, the acceptance criterion determined by the control parameter decides whether this new state is chosen, and a random Markov chain of a certain length is thus formed. Second, the control parameter is slowly reduced and the acceptance criterion tightened until the control parameter approaches zero and the state chain stabilizes at the optimal state of the optimization problem, which improves the reliability of the global optimum found by the algorithm.

(3) It makes few demands on the objective function. Traditional search algorithms need not only the objective function value but also auxiliary information such as its derivative to determine the search direction; when this information is unavailable, such algorithms fail. Simulated annealing needs no auxiliary information: it only defines a neighborhood structure, selects adjacent solutions within it, and evaluates them with the objective function.

2.5 Improvement directions of the simulated annealing algorithm

Improving the search efficiency of simulated annealing while guaranteeing the required optimization quality is the main content of simulated annealing improvement [9-10]. Feasible schemes include: choosing an appropriate initial state; designing an appropriate state generation function so that the candidate states show the desired spatial dispersion or locality as the search requires; designing an efficient annealing schedule; improving the temperature control scheme; using a parallel search structure; designing a suitable algorithm termination criterion; and so on. In addition, simulated annealing can be improved by adding steps. The main methods are as follows: (1) Adding a memory function. To avoid losing the best solution encountered so far because of the probabilistic acceptance step, a storage step can be added to record the best state found. (2) Adding a heat-up or reheating process. Raising the temperature appropriately during the run can reactivate the acceptance probabilities of the states and adjust the current state, helping the algorithm avoid stagnating at a local minimum. (3) Using multiple sampling strategies for each current state, accepting the best state in the region with a certain probability, instead of the single comparison of the standard algorithm. (4) Combining with other search algorithms (such as genetic algorithms or immune algorithms) to exploit their advantages and improve the efficiency and quality of the solution.
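Improvement (1), the memory function, amounts to one extra comparison per iteration. A Python sketch over a hypothetical trajectory of (state, objective) pairs that an annealer might visit:

```python
# The walk below temporarily accepts worse states (probabilistic acceptance),
# but the memory slot never loses the best solution encountered.
trajectory = [(0.0, 5.0), (1.2, 2.5), (0.8, 3.9), (1.4, 1.1), (2.0, 4.0)]

best_state, best_value = trajectory[0]
for state, value in trajectory:
    if value < best_value:          # update the memory only on improvement
        best_state, best_value = state, value
# best_value stays 1.1 even though the walk ends at a worse state (2.0, 4.0)
```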

3 Simulated annealing algorithm flow

The generation and acceptance of new solutions in the simulated annealing algorithm can be divided into three steps. (1) A generation function produces a new solution in the solution space from the current solution; to simplify the subsequent calculation and acceptance and to reduce the algorithm's running time, the new solution is usually produced by a simple transformation of the current solution. Note that the transformation used to generate new solutions determines the neighborhood structure of the current solution and therefore influences the choice of cooling schedule. (2) Whether a new solution is accepted is judged by an acceptance criterion, most commonly the Metropolis criterion: if ΔE < 0, accept X' as the new current solution; otherwise, accept X' as the new current solution with probability exp(-ΔE/T). (3) When the new solution is accepted, it replaces the current solution; this only requires carrying out the transformation corresponding to the new solution and updating the objective function value. The current solution has then completed one iteration, and the next round can start from it. If the new solution is discarded, the next round continues from the original current solution. The solution obtained by simulated annealing is independent of the initial solution (the starting point of the iteration) and is asymptotically convergent; it has been proved to be an optimization algorithm that converges to the global optimum with probability 1. The simulated annealing algorithm consists of three parts: the solution space, the objective function, and the initial solution.
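The generate → accept → replace loop described above, wrapped in the cooling loop, can be sketched as follows. Python is used for brevity; the objective f(x) = x², the uniform neighbour move, and all parameter values are illustrative choices, not part of the algorithm itself.

```python
import math
import random

def simulated_annealing(f, x0, T0=100.0, K=0.95, L=100, Tmin=1e-3, seed=0):
    """Minimal SA sketch: L Metropolis steps per temperature, then T <- K*T."""
    rng = random.Random(seed)
    x, T = x0, T0
    while T > Tmin:                                # outer loop: cooling
        for _ in range(L):                         # inner loop: isothermal chain
            x_new = x + rng.uniform(-1.0, 1.0)     # (1) generate a new solution
            delta = f(x_new) - f(x)                # objective-function difference
            if delta < 0 or rng.random() < math.exp(-delta / T):
                x = x_new                          # (2) Metropolis acceptance
            # (3) otherwise keep the current solution and continue
        T *= K                                     # exponential cooling schedule
    return x

x_opt = simulated_annealing(lambda x: x * x, x0=10.0)
# x_opt lands near the global minimum at x = 0
```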
The specific process of the algorithm is as follows [8]:

(1) Initialization: set the initial temperature T (sufficiently large), the initial solution state x0 (the starting point of the iteration), and the number of iterations L at each value of T.
(2) For k = 1, ..., L, do steps (3) to (6).
(3) Generate a new solution X'.
(4) Compute the increment ΔE = E(X') - E(X), where E(X) is the evaluation function.
(5) If ΔE < 0, accept X' as the new current solution; otherwise accept X' as the new current solution with probability exp(-ΔE/T).
(6) If the termination condition is met, output the current solution as the optimal solution and end the program.
(7) Gradually decrease T; as T → 0, go to step (2).

The simulated annealing algorithm flow is shown in Figure 7.1.

4 Key parameters

Simulated annealing delivers high solution quality and is general and easy to implement. However, to obtain the optimal solution it usually requires a high initial temperature and a large number of samples, so its optimization time is often long. In terms of algorithm structure, the new-state generation function, the initial temperature, the de-temperature (cooling) function, the Markov chain length, and the stopping criterion are the main elements that directly affect the optimization result.

State generation function. The state generation function should be designed so that the candidate solutions it produces spread over the entire solution space as far as possible. It generally consists of two parts: the way candidate solutions are generated and the probability distribution over them. The generation mode is determined by the nature of the problem; candidates are usually generated with a certain probability within the neighborhood structure of the current state.

Initial temperature. The temperature T plays a decisive role in the algorithm; it directly controls the course of the annealing. By the acceptance criterion for random moves, the larger the initial temperature, the greater the probability of obtaining a high-quality solution, and the Metropolis acceptance rate is close to 1. However, too high an initial temperature increases the computation time. For this purpose, one can sample a group of states uniformly and use the variance of their objective values as the initial temperature.

De-temperature function. The de-temperature function is the temperature update function used to modify the temperature value in the outer loop. The most commonly used update is the exponential de-temperature function T(n+1) = K × T(n), where 0 < K < 1 is a constant very close to 1.

Selection of the Markov chain length L. The Markov chain length is the number of optimization iterations under isothermal conditions. The selection principle is that, given the decay function of the control parameter T, L should be chosen so that quasi-equilibrium can be restored at each value of the control parameter. Typically L is 100-1000.

Algorithm stopping criterion. The stopping rule determines when the algorithm ends. One can simply set a final temperature Tf and terminate when T = Tf. However, the convergence theory of simulated annealing requires T to tend to zero, which is impractical. Commonly used stopping criteria include: setting a threshold on the termination temperature, setting a threshold on the number of iterations, or stopping the search when the optimal value remains unchanged over consecutive steps.
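With the exponential de-temperature function T(n+1) = K × T(n), the number of outer-loop iterations needed to cool from T0 down to a stop threshold Tf follows directly from K^n ≤ Tf/T0, i.e. n = ⌈log(Tf/T0) / log K⌉. A small Python sketch with illustrative values shows why choosing K very close to 1 makes the run long:

```python
import math

def cooling_steps(T0, Tf, K):
    """Outer-loop iterations for T(n+1) = K*T(n) to fall from T0 to Tf."""
    return math.ceil(math.log(Tf / T0) / math.log(K))

n_fast = cooling_steps(100.0, 0.01, 0.90)   # K = 0.90: 88 temperature levels
n_slow = cooling_steps(100.0, 0.01, 0.99)   # K = 0.99: 917 temperature levels
```

Multiplying the level count by the Markov chain length L gives the total number of inner iterations, which is where the long optimization times mentioned above come from.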

II. Some source code

function varargout = simulatedannealing(varargin)
% SIMULATEDANNEALING M-file for simulatedannealing.fig
%      SIMULATEDANNEALING, by itself, creates a new SIMULATEDANNEALING or raises the existing
%      singleton*.
%
%      H = SIMULATEDANNEALING returns the handle to a new SIMULATEDANNEALING or the handle to
%      the existing singleton*.
%
%      SIMULATEDANNEALING('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in SIMULATEDANNEALING.M with the given input arguments.
%
%      SIMULATEDANNEALING('Property','Value',...) creates a new SIMULATEDANNEALING or raises the
%      existing singleton*.  Starting from the left, property value pairs are
%      applied to the GUI before simulatedannealing_OpeningFcn gets called.  An
%      unrecognized property name or invalid value makes property application
%      stop.  All inputs are passed to simulatedannealing_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu.  Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help simulatedannealing

% Last Modified by GUIDE v2.5 05-Apr-2021 02:02:51

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @simulatedannealing_OpeningFcn, ...
                   'gui_OutputFcn',  @simulatedannealing_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);

if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT


% --- Executes just before simulatedannealing is made visible.
function simulatedannealing_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to simulatedannealing (see VARARGIN)

%set(handles.init_temperatur_input,'String','input Tmax')
%s = sprintf('Cooling Rate %0.3f',get(handles.set_cooling_rate,'Value'));
%set(handles.cooling_rate_disp, 'String', s);
% Choose default command line output for simulatedannealing
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes simulatedannealing wait for user response (see UIRESUME)
% uiwait(handles.figure1);


% --- Outputs from this function are returned to the command line.
function varargout = simulatedannealing_OutputFcn(hObject, eventdata, handles) 
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;





function init_temperatur_input_Callback(hObject, eventdata, handles)

% Hints: get(hObject,'String') returns contents of init_temperatur_input as text
%        str2double(get(hObject,'String')) returns contents of init_temperatur_input as a double
s = str2double(get(hObject,'String'));
if isnan(s)
    set(hObject,'String','10000')
end
if s < 100 || s > 10000000
    set(hObject,'String','10000')
end

% --- Executes during object creation, after setting all properties.
function init_temperatur_input_CreateFcn(hObject, eventdata, handles)

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end



function min_temperatur_input_Callback(hObject, eventdata, handles)

% Hints: get(hObject,'String') returns contents of min_temperatur_input as text
%        str2double(get(hObject,'String')) returns contents of min_temperatur_input as a double
s=str2double(get(hObject,'String'));
if isnan(s)
    set(hObject,'String','1');
end
if s < 0 || s > 100
    set(hObject,'String','1');
end

% --- Executes during object creation, after setting all properties.
function min_temperatur_input_CreateFcn(hObject, eventdata, handles)

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end



function max_iterations_input_Callback(hObject, eventdata, handles)

% Hints: get(hObject,'String') returns contents of max_iterations_input as text
%        str2double(get(hObject,'String')) returns contents of max_iterations_input as a double
s = str2double(get(hObject,'String'));
if isnan(s)
    set(hObject,'String','500')
end
if s < 100 || s > 10000
    set(hObject,'String','500')
end


% --- Executes during object creation, after setting all properties.
function max_iterations_input_CreateFcn(hObject, eventdata, handles)

% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end


% --- Executes on selection change in set_init_solution.
function set_init_solution_Callback(hObject, eventdata, handles)

% Hints: contents = get(hObject,'String') returns set_init_solution contents as cell array
%        contents{get(hObject,'Value')} returns selected item from set_init_solution


% --- Executes during object creation, after setting all properties.
function set_init_solution_CreateFcn(hObject, eventdata, handles)

% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end


% --- Executes on slider movement.
function set_cooling_rate_Callback(hObject, eventdata, handles)

% Hints: get(hObject,'Value') returns position of slider
%        get(hObject,'Min') and get(hObject,'Max') to determine range of slider
s = sprintf('Cooling Rate %0.3f',get(hObject,'Value'));
set(handles.cooling_rate_disp, 'String', s);
guidata(hObject, handles);


% --- Executes during object creation, after setting all properties.
function set_cooling_rate_CreateFcn(hObject, eventdata, handles)

% Hint: slider controls usually have a light gray background.
if isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor',[.9 .9 .9]);
end


% --- Executes during object creation, after setting all properties.
function cooling_rate_disp_CreateFcn(hObject, eventdata, handles)


% --- Executes when entered data in editable cell(s) in Parameter.
function Parameter_CellEditCallback(hObject, eventdata, handles)
% hObject    handle to Parameter (see GCBO)
% eventdata  structure with the following fields (see UITABLE)
%	Indices: row and column indices of the cell(s) edited
%	PreviousData: previous data for the cell(s) edited
%	EditData: string(s) entered by the user
%	NewData: EditData or its converted form set on the Data property. Empty if Data was not changed
%	Error: error string when failed to convert EditData to appropriate value for Data
% handles    structure with handles and user data (see GUIDATA)
   
% --- Executes on button press in Start.
function Start_Callback(hObject, eventdata, handles)

% get init_solution_selection
Pm = get(handles.Parameterzahl, 'Value');
initsolution = get(handles.Parameter, 'Data');
initsolution = str2double(initsolution);
initsolution = initsolution(1:Pm,1);   % adjust according to the number of parameters

% get init_temperatur
inittemperatur = get(handles.init_temperatur_input,'String');
inittemperatur = str2double(inittemperatur);

% get cooling_rate
coolingrate = get(handles.set_cooling_rate,'Value');

% get max_iterations
maxiterations = get(handles.max_iterations_input,'String');
maxiterations = str2double(maxiterations);

% get min_temperatur
mintemperature = get(handles.min_temperatur_input,'String');
mintemperature = str2double(mintemperature);
if any(isnan(initsolution))
    %disp('invalid input');
    titledisp = sprintf('**invalid input, check Parameter input**');
    text(2,0.5,titledisp,'fontweight','bold','color','red');
else     
    axes(handles.axes1)
    [Xopt Xcurrent] = Sahaupt( inittemperatur ,coolingrate,...
      maxiterations,initsolution,mintemperature);
    set(handles.axes1,'XMinorTick','on')
end

guidata(hObject, handles);

% --- Executes on button press in evolution.
function evolution_Callback(hObject, eventdata, handles)
% hObject    handle to evolution (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)


% get init_solution_selection
Pme = get(handles.Parameterzahl, 'Value');
initsolution = get(handles.Parameter, 'Data');
initsolution = str2double(initsolution);
initsolution = initsolution(1:Pme,1);   % adjust according to the number of parameters


% get init_temperatur
inittemperatur = get(handles.init_temperatur_input,'String');
inittemperatur = str2double(inittemperatur);

% get cooling_rate
coolingrate = get(handles.set_cooling_rate,'Value');

% get max_iterations
maxiterations = get(handles.max_iterations_input,'String');
maxiterations = str2double(maxiterations);

% get min_temperatur
mintemperature = get(handles.min_temperatur_input,'String');
mintemperature = str2double(mintemperature);

% plot
if any(isnan(initsolution))
    %disp('invalid input');
    titledisp = sprintf('**invalid input, check Parameter input**');
    text(2,0.5,titledisp,'fontweight','bold','color','red');
else     
    axes(handles.axes2)
    Evolution( inittemperatur ,coolingrate,...
    maxiterations,initsolution,mintemperature);
    set(handles.axes2,'XMinorTick','on')
end

guidata(hObject, handles);




% --- Executes on selection change in Parameterzahl.
function Parameterzahl_Callback(hObject, eventdata, handles)
% hObject    handle to Parameterzahl (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hints: contents = get(hObject,'String') returns Parameterzahl contents as cell array
%        contents{get(hObject,'Value')} returns selected item from Parameterzahl


% --- Executes during object creation, after setting all properties.
function Parameterzahl_CreateFcn(hObject, eventdata, handles)
% hObject    handle to Parameterzahl (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end





guidata(hObject, handles);



III. Operation results

IV. MATLAB version and references

1. MATLAB version: 2014a

[1] Yang Baoyang, Yu Jizhou, Yang Shan. Intelligent Optimization Algorithms and MATLAB Examples (2nd Edition) [M]. Publishing House of Electronics Industry, 2016.
[2] Zhang Yan, Wu Shuigen. MATLAB Optimization Algorithm Source Code [M]. Tsinghua University Press, 2017.