An overview of Bayesian analysis

Bayesian analysis is a basic framework for the whole of machine learning. Middle-school textbooks say that probability is the frequency with which something happens, the so-called objective probability. Probability theory based on the Bayesian framework provides an answer from another angle: it assumes that probability is a personal, subjective notion, indicating how strongly we believe something will happen.

Prior probability and posterior probability

These can also be called forward probability and backward probability.

The prior probability is our judgment of the probability of an event before any conditions (information) are attached: for example, the probability that you are smart, or the probability that you have a disease.

The posterior probability is the probability that an event occurs once certain conditions are attached (a conditional probability). The condition is typically that another event is known to have occurred; this extra condition amounts to new information for judging the original event, so we can make a more reliable judgment, turning the prior probability into a posterior probability. For example, given that you went to college, the judged probability that you are smart changes; given that you tested positive, the judged probability that you have the disease changes. This way of thinking is the important thing to understand.
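To make this concrete, here is a minimal Python sketch of moving from a prior to a posterior, using the disease-test example; all the numbers (prevalence, test accuracy) are made up for illustration.

```python
# A minimal sketch of prior -> posterior updating, with made-up numbers:
# how P(disease) changes after observing a positive test result.

def posterior(prior, p_pos_given_disease, p_pos_given_healthy):
    """Bayes' rule: P(disease | positive test)."""
    p_positive = (p_pos_given_disease * prior
                  + p_pos_given_healthy * (1 - prior))
    return p_pos_given_disease * prior / p_positive

prior = 0.001            # P(disease): prior judgment, a rare disease
sensitivity = 0.99       # P(positive | disease), assumed
false_positive = 0.05    # P(positive | healthy), assumed

print(posterior(prior, sensitivity, false_positive))  # ~0.019, not 0.99
```

Even with an accurate test, the rare-disease prior keeps the posterior small, which is exactly the interplay between prior and evidence discussed above.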

Understanding Bayes' formula

P(A) is the prior probability, P(B) is the posterior probability, and P(B∣A) is the conditional probability; these three are the three elements of Bayesian statistics.
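For reference, the standard statement of Bayes' formula ties these three quantities together:

```latex
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
```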

Prior probability

The prior probability P(A) is of great significance in Bayesian statistics. It is the probability we assign to A before we obtain any evidence. This value is usually based on our previous common sense and has a certain subjective color. One interesting property is that if the prior probability is 1 or 0, then no matter how much evidence we add, we still get the same conditional probability: P(A) = 0 or 1 implies P(A∣B) = P(A) = 0 or 1. The first lesson this tells us is not to judge too soon, because much of the time we are then ignoring the posterior probability.
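A short self-contained Python sketch of this "extreme priors never move" property, with hypothetical likelihoods:

```python
# With an extreme prior (0 or 1), no amount of evidence moves the posterior.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """One step of Bayes' rule: returns P(A | B)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

for start in (0.0, 0.5, 1.0):
    p = start
    for _ in range(10):          # observe strong supporting evidence 10 times
        p = update(p, 0.9, 0.1)
    print(start, "->", round(p, 6))
# 0.0 -> 0.0    stuck at 0 forever
# 0.5 -> 1.0    a non-extreme prior is moved by the evidence
# 1.0 -> 1.0    stuck at 1 forever
```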

Posterior probability

The posterior probability P(B) usually corresponds to the event currently at hand (the new evidence), which also needs to be taken into consideration.

Conditional probability

P(B∣A) is the probability that B occurs given that A has occurred: taking the event A as 100% of the sample space, it is the probability that A and B occur together.

Distinguish this from P(AB), the probability that A and B occur at the same time: P(AB) is computed taking all events (the whole sample space) as 100%.
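A small counting example (made-up counts) shows the difference between the two denominators:

```python
# Joint P(AB) vs conditional P(B|A), with made-up counts over 100 people.
# A = "went to college", B = "is smart" (the running examples above).

total = 100
count_a = 40              # people in A
count_ab = 30             # people in both A and B

p_ab = count_ab / total           # joint: AB out of ALL events  -> 0.30
p_b_given_a = count_ab / count_a  # conditional: AB out of A only -> 0.75

print(p_ab, p_b_given_a)
```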

The meaning of Bayes’ theorem

If you focus too much on the special case (i.e., ignore the prior probability), you are likely to mistake noise for signal; if you cling to the prior probability and ignore the posterior probability, you become someone who ignores change and is stuck in a rut. Bayes' theorem tells us to look at both.

From a mathematical perspective

Put simply, the common-sense understanding above is this: given that event B has occurred, to find the probability that a particular cause produced this result, we must consider all possible causes.

Bayes' formula captures the proportion of this particular cause within the total of all causes: the numerator is this cause, and the denominator is the sum over all causes.
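In symbols, writing A₁, …, Aₙ for the possible causes, the denominator expands by the law of total probability (standard form, added for reference):

```latex
P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\, P(A_j)}
```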

Explanation 1 of Bayes' formula

A common-sense understanding of Bayes' formula