One, Extreme learning machine

This paper introduces a new learning algorithm for single-hidden-layer feedforward networks (SLFNs) called the extreme learning machine (ELM). The algorithm randomly generates the connection weights between the input layer and the hidden layer, together with the thresholds of the hidden-layer neurons, and never adjusts them during training; only the number of hidden-layer neurons needs to be set to obtain a unique optimal solution. Compared with traditional training methods, this approach has the advantages of fast learning speed and good generalization performance.

A typical single-hidden-layer feedforward neural network is shown in the figure above. The input layer is fully connected to the hidden layer, and the hidden layer is fully connected to the output layer. The number of neurons in the input layer is determined by the number of features of each sample, and the number of neurons in the output layer by the number of sample classes.

Let the thresholds of the hidden-layer neurons be b = [b1, b2, …, bL]^T, where L is the number of hidden-layer neurons.

When the number of hidden-layer neurons equals the number of samples, equation (10) has a unique solution; that is, the training samples are approximated with zero error. In conventional learning algorithms, the input weights W and the thresholds b must be adjusted continually, but the research results show that such adjustment is unnecessary and that W and b can even be assigned arbitrarily: adjusting them costs time and brings little benefit. (This claim is suspect and probably out of context; it may rest on some premise.)
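The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of the standard ELM formulation, not the paper's own code: the input weights W and thresholds b are drawn at random and never adjusted, and only the output weights beta are solved by least squares through the Moore-Penrose pseudoinverse. All function and variable names here are my own.

```python
import numpy as np

def elm_train(X, T, n_hidden, rng=np.random.default_rng(0)):
    """Train an ELM: random fixed hidden layer, least-squares output weights."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights, never adjusted
    b = rng.standard_normal(n_hidden)                # random hidden thresholds, never adjusted
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights via the pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Fit a toy regression target in one shot -- no iterative weight adjustment
X = np.linspace(0, 1, 50).reshape(-1, 1)
T = np.sin(2 * np.pi * X)
W, b, beta = elm_train(X, T, n_hidden=30)
pred = elm_predict(X, W, b, beta)
```

Because training is a single linear solve rather than gradient descent, this is where ELM's speed advantage comes from.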

Two, Genetic algorithm

• Genetic Algorithm (GA) is an evolutionary algorithm whose basic principle is to emulate the law of "natural selection, survival of the fittest" in the biological world. It was first proposed by Professor J. Holland of the University of Michigan in the 1960s and 1970s.
• A genetic algorithm begins with a population that represents a set of potential solutions to a problem; the population consists of a certain number of individuals encoded as genes. The first step is therefore the mapping from phenotype to genotype, i.e. coding. After the initial population is generated, better and better approximate solutions are produced generation by generation according to the principle of survival of the fittest. In each generation, individuals are selected according to their fitness in the problem domain, and the genetic operators of natural genetics are applied to produce a population representing a new solution set. This process resembles natural evolution: each later generation is better adapted to the environment than the previous one, and the best individual of the last generation, after decoding, can serve as an approximate optimal solution to the problem.

• Genetic algorithms have three basic operations: selection, crossover, and mutation.
• (1) Selection. The purpose of selection is to pick the best individuals from the current population and give them the chance to reproduce as parents. According to the fitness of each individual, some excellent individuals are selected from the previous generation into the next generation by a fixed rule or method. Selection is based on the principle that fitter individuals have a higher probability of contributing one or more offspring to the next generation.
• (2) Crossover. New individuals are obtained through the crossover operation, which combines the characteristics of the parent individuals. Individuals in the population are paired at random and, with a given crossover probability, exchange a portion of the chromosome between them.
• (3) Mutation. For each individual in the population, the value of one or more genes at one or more loci is changed to another allele with the mutation probability. As in biology, the mutation probability is low; mutation provides the opportunity for new individuals to emerge.
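The three operators can be sketched in Python on bitstring chromosomes. This is a minimal illustration with my own function names, not code from any particular library: roulette-wheel selection, single-point crossover, and bit-flip mutation.

```python
import random
random.seed(1)

def select(pop, fitness):
    """Roulette-wheel selection: fitter individuals are copied with higher probability."""
    total = sum(fitness)
    picks = random.choices(pop, weights=[f / total for f in fitness], k=len(pop))
    return [ind[:] for ind in picks]

def crossover(pop, pc=0.6):
    """Single-point crossover on consecutive pairs, applied with probability pc."""
    out = pop[:]
    for i in range(0, len(pop) - 1, 2):
        if random.random() < pc:
            point = random.randrange(1, len(pop[i]))
            out[i] = pop[i][:point] + pop[i + 1][point:]
            out[i + 1] = pop[i + 1][:point] + pop[i][point:]
    return out

def mutate(pop, pm=0.001):
    """Flip each gene (bit) independently with the small probability pm."""
    return [[1 - g if random.random() < pm else g for g in ind] for ind in pop]

# One generation: 20 individuals of 10 bits, toy fitness = number of ones
pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
fitness = [sum(ind) for ind in pop]
new_pop = mutate(crossover(select(pop, fitness)), pm=0.01)
```

Note that roulette selection assumes non-negative fitness; minimization problems are usually handled by negating and shifting the objective first.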

The basic steps of genetic algorithm:

1) Coding: Before the GA search begins, the solution data in the solution space is represented as genotype string-structure data in the genetic space; different combinations of these strings constitute the different points of the search space.
2) Generation of the initial population: N initial strings are generated at random; each string is called an individual, and the N individuals constitute a population. The GA starts evolving from these N strings as its initial point.
3) Evaluation of fitness: Fitness indicates the quality of an individual or solution. The definition of the fitness function differs from problem to problem.

4) Selection: The purpose of selection is to pick superior individuals from the current population and give them the chance to serve as parents that reproduce the next generation. The genetic algorithm reflects this idea through the selection process. The principle of selection is that individuals with high fitness have a high probability of contributing one or more offspring to the next generation; selection embodies Darwinian survival of the fittest.
5) Crossover: Crossover is the most important genetic operation in a genetic algorithm. New individuals are obtained through the crossover operation, and the new individuals combine the characteristics of their parents. Crossover embodies the idea of information exchange.
6) Mutation: First, an individual is randomly selected from the population; with a certain probability, the value of a gene in the selected individual's string is changed at random. As in biology, the mutation probability in a GA is very low, usually a very small value.
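Putting the six steps together, here is a minimal self-contained Python GA (binary coding, roulette selection, single-point crossover, bit-flip mutation) for the same problem solved by the MATLAB program later in this post: minimizing f(x) = x^2 - 4x + 20 over [0, 10]. All names and parameter values are illustrative.

```python
import random
random.seed(0)

def decode(bits, lo=0.0, hi=10.0):
    """Map a 10-bit chromosome to a real number in [lo, hi] (1023 = 2^10 - 1)."""
    return lo + int("".join(map(str, bits)), 2) * (hi - lo) / (2 ** len(bits) - 1)

def f(x):
    return x ** 2 - 4 * x + 20   # minimum value 16 at x = 2

# Step 2: initial population of 20 individuals, 10 bits each (step 1: binary coding)
pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
for _ in range(50):
    fit = [-f(decode(ind)) for ind in pop]         # step 3: fitness = -f for minimization
    shift = min(fit)
    weights = [v - shift + 1e-9 for v in fit]      # shift so roulette weights are positive
    pop = [random.choices(pop, weights=weights)[0][:] for _ in range(20)]  # step 4: selection
    for i in range(0, 19, 2):                      # step 5: single-point crossover, pc = 0.6
        if random.random() < 0.6:
            p = random.randrange(1, 10)
            pop[i][p:], pop[i + 1][p:] = pop[i + 1][p:], pop[i][p:]
    for ind in pop:                                # step 6: bit-flip mutation, pm = 0.01
        for j in range(10):
            if random.random() < 0.01:
                ind[j] = 1 - ind[j]

best = min(pop, key=lambda ind: f(decode(ind)))
x_best = decode(best)                              # should approach x = 2
```

With 10-bit coding the resolution on [0, 10] is about 0.01, which bounds how closely the optimum x = 2 can be represented.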

Genetic Algorithm Toolbox:

• The Genetic Algorithm and Direct Search Toolbox embedded in MATLAB: GADS
• The Genetic Algorithm Toolbox from the University of Sheffield: GATBX
• The Genetic Algorithm Optimization Toolbox from North Carolina State University: GAOT

The initializega function:

The ga function:

Genetic algorithm optimization of the initial weights and thresholds of a BP neural network:
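The post does not show this part of the code. One common scheme, sketched here in Python/NumPy with illustrative names and a toy XOR task, is: each chromosome is the flattened vector of all weights and thresholds of a small network, fitness is the negative training error, and the best individual found by the GA is then used as the BP network's initial weights.

```python
import numpy as np
rng = np.random.default_rng(0)

# Toy data: XOR learned by a 2-3-1 network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
N_W = 2 * 3 + 3 + 3 * 1 + 1          # all weights and thresholds of a 2-3-1 net = 13

def unpack(chrom):
    """Split a flat chromosome into the network's weight matrices and thresholds."""
    W1 = chrom[:6].reshape(2, 3); b1 = chrom[6:9]
    W2 = chrom[9:12].reshape(3, 1); b2 = chrom[12:]
    return W1, b1, W2, b2

def mse(chrom):
    """Training error of the network encoded by chrom (fitness = -mse)."""
    W1, b1, W2, b2 = unpack(chrom)
    H = np.tanh(X @ W1 + b1)
    Y = 1 / (1 + np.exp(-(H @ W2 + b2)))
    return float(np.mean((Y - T) ** 2))

pop = rng.uniform(-2, 2, size=(40, N_W))         # real-coded population
for _ in range(100):
    err = np.array([mse(c) for c in pop])
    parents = pop[np.argsort(err)[:20]]           # truncation selection: keep best half
    kids = (parents[rng.integers(0, 20, 20)] +
            parents[rng.integers(0, 20, 20)]) / 2 # arithmetic crossover of random parents
    kids += rng.normal(0, 0.1, kids.shape)        # Gaussian mutation
    pop = np.vstack([parents, kids])              # parents survive, so the best never worsens

best = pop[np.argmin([mse(c) for c in pop])]      # use as the BP net's initial weights/thresholds
```

After this GA pre-search, ordinary BP gradient descent would start from `best` instead of from random weights, which reduces the risk of landing in a poor local minimum.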

Three, Code

% Genetic algorithm main program
% Name: genmain05.m
clear
clf
popsize = 20;                                % population size
chromlength = 10;                            % string length (individual length)
pc = 0.6;                                    % crossover probability
pm = 0.001;                                  % mutation probability
pop = initpop(popsize, chromlength);         % generate the initial population
for i = 1:20                                 % 20 is the number of iterations
    [objvalue] = calobjvalue(pop);           % compute the objective value
    fitvalue = calfitvalue(objvalue);        % compute the fitness of each individual
    [newpop] = selection(pop, fitvalue);     % selection (copy)
    [newpop] = crossover(pop, pc);           % crossover
    [newpop] = mutation(pop, pm);            % mutation
    [bestindividual, bestfit] = best(pop, fitvalue);  % individual with the largest fitness and its fitness value
    y(i) = -max(bestfit);
    n(i) = i;
    pop5 = bestindividual;
    x(i) = decodechrom(pop5, 1, chromlength) * 10 / 1023;  % decode to a real number in [0, 10]
    pop = newpop;
end
fplot('x^2-4*x+20', [0 10])
hold on
plot(x, y, 'r*')
hold off
[z, index] = min(y);  % best value of the y vector and its position; to find the maximum,
                      % use max here and modify the fitness function accordingly
x5 = x(index)         % x value corresponding to the best objective value
y = z

Four, For the references and code, private message the blogger.