Entropy, cross-entropy, Softmax, and the sigmoid function

Entropy

The entropy of a discrete random variable X measures the average uncertainty of its distribution:

H(X) = -\sum_i p(x_i) \log p(x_i)
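As a quick sanity check of the formula, here is a minimal NumPy sketch. The helper name `entropy` and the use of the natural logarithm are assumptions (the source does not fix a log base; use `np.log2` to get bits instead of nats):

```python
import numpy as np

def entropy(p):
    """H(X) = -sum_i p(x_i) * log p(x_i), in nats (natural log assumed)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0                       # treat 0 * log 0 as 0
    return -np.sum(p[nz] * np.log(p[nz]))

print(entropy([0.5, 0.5]))   # ~0.693 nats: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))   # ~0.325 nats: a biased coin is more predictable
```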

Cross-entropy

Cross-entropy measures how well a model distribution q describes samples drawn from the true distribution p:

H(p, q) = \sum_x p(x) \log\frac{1}{q(x)} = -\sum_x p(x) \log q(x)
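A sketch of the same quantity in NumPy, again assuming the natural log; the `eps` clipping is my own addition so that log q(x) stays finite when q(x) = 0:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_x p(x) * log q(x)."""
    p = np.asarray(p, dtype=float)
    q = np.clip(np.asarray(q, dtype=float), eps, 1.0)  # avoid log(0)
    return -np.sum(p * np.log(q))

p = [1.0, 0.0, 0.0]          # one-hot "true" distribution
q = [0.7, 0.2, 0.1]          # model's predicted distribution
print(cross_entropy(p, q))   # -log(0.7) ~ 0.357
```

For a one-hot p this reduces to the negative log-probability of the correct class, which is why cross-entropy is the standard classification loss.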

Softmax

Softmax turns a vector of real-valued scores z_1, …, z_n into a probability distribution:

S_i = \frac{e^{z_i}}{\sum_{j=1}^{n} e^{z_j}}
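A minimal NumPy sketch; subtracting max(z) before exponentiating is an added numerical-stability trick, not part of the formula (it cancels in the ratio, so the result is unchanged):

```python
import numpy as np

def softmax(z):
    """S_i = exp(z_i) / sum_j exp(z_j)."""
    z = np.asarray(z, dtype=float)
    e = np.exp(z - np.max(z))   # shift by max(z) to avoid overflow; ratio is unchanged
    return e / e.sum()

probs = softmax([2.0, 1.0, 0.1])
print(probs)         # ~[0.659 0.242 0.099]
print(probs.sum())   # 1.0, so the outputs form a probability distribution
```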

The sigmoid function


S(x) = \frac{1}{1 + e^{-x}}
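A corresponding NumPy one-liner; the last lines only illustrate the well-known fact that a two-class softmax over logits [x, 0] reduces to sigmoid(x):

```python
import numpy as np

def sigmoid(x):
    """S(x) = 1 / (1 + exp(-x)); maps any real x into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))                    # 0.5
print(sigmoid(np.array([-4.0, 4.0])))  # ~[0.018 0.982]

# Two-class softmax over [x, 0] equals sigmoid(x):
x = 2.0
print(sigmoid(x), np.exp(x) / (np.exp(x) + 1.0))  # both ~0.881
```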