An overview of time series analysis

Purpose: To predict the future

Preparation: Describe the past and analyze patterns

Three commonly used models: seasonal decomposition, exponential smoothing method, ARIMA model

Software used: SPSS

Time series = time factor (year, quarter, month)+ numerical factor (body weight, GDP)

Time series come in two types: period series (continuous change, a cumulative process) / time-point series (measured at a point in time in each period)

Period series are additive: the sum reflects the total development over the interval (e.g. GDP). Time-point series are not additive; summing them is meaningless (e.g. body weight)

Grey prediction involves an accumulation step, so the grey prediction model is suitable only for period series

Time series decomposition

Concept

The Baidu Index shows public-opinion trends: index.baidu.com/v2/index.ht…

Example:

Hands-on operation in SPSS

STEP1:

Handling missing/invalid data:

  • If a missing value occurs at the beginning or end of the time series, it can simply be deleted.
  • If a missing value occurs in the middle of the series, it cannot be deleted (deletion would shift the original time series out of alignment); instead, replace the missing value (a small sketch follows the list).
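
A minimal sketch of this rule in Python/pandas (the column name `sales` and the monthly index are made up for illustration): interior gaps are replaced by interpolation, while leading/trailing gaps are dropped.

```python
import pandas as pd
import numpy as np

# Hypothetical monthly series with a missing value at the start and in the middle
idx = pd.date_range("2020-01", periods=8, freq="MS")
y = pd.Series([np.nan, 120, 130, np.nan, 150, 160, 170, 180], index=idx, name="sales")

y = y.interpolate(limit_area="inside")   # replace interior gaps (linear interpolation here)
y = y.dropna()                           # leading/trailing gaps can simply be deleted

print(y)
```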

STEP2:

Then run the seasonal decomposition (here an additive model): this produces four new variables (the seasonal adjustment factor SAF, the seasonally adjusted series SAS, the smoothed trend-cycle STC, and the error ERR)
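
SPSS does this through its Seasonal Decomposition dialog; as a hedged illustration of the same idea outside SPSS, `seasonal_decompose` from statsmodels splits a (synthetic) monthly series into trend-cycle, seasonal, and irregular components:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: trend + seasonality + noise (illustrative only)
rng = np.random.default_rng(0)
idx = pd.date_range("2015-01", periods=60, freq="MS")
y = pd.Series(100 + 0.5 * np.arange(60)
              + 10 * np.sin(2 * np.pi * np.arange(60) / 12)
              + rng.normal(0, 2, 60), index=idx)

res = seasonal_decompose(y, model="additive", period=12)
print(res.trend.dropna().head())    # smoothed trend-cycle (analogous to STC)
print(res.seasonal.head(12))        # seasonal factors (analogous to SAF)
print(res.resid.dropna().head())    # irregular/error component (analogous to ERR)
print((y - res.seasonal).head())    # seasonally adjusted series (analogous to SAS)
```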

STEP3:

Plot the four variables

To forecast, fit a regression line to the seasonally adjusted series T + C + I, then add the seasonal adjustment factor S back on to obtain the correct prediction
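
A hedged sketch of that forecasting recipe on synthetic data (all names and numbers below are illustrative, not from the source): fit a straight line to the seasonally adjusted series, extrapolate it, and add the seasonal factor of each forecast month back on.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: trend + seasonality + noise (illustrative only)
rng = np.random.default_rng(0)
idx = pd.date_range("2015-01", periods=60, freq="MS")
y = pd.Series(100 + 0.5 * np.arange(60)
              + 10 * np.sin(2 * np.pi * np.arange(60) / 12)
              + rng.normal(0, 2, 60), index=idx)

dec = seasonal_decompose(y, model="additive", period=12)
adjusted = y - dec.seasonal              # the seasonally adjusted series (T + C + I)

# Fit a straight line to the seasonally adjusted series
t = np.arange(len(y))
slope, intercept = np.polyfit(t, adjusted.values, 1)

# Forecast the next 12 months: extrapolate the line, then add the seasonal factors back
future_t = np.arange(len(y), len(y) + 12)
seasonal_factors = dec.seasonal.iloc[:12].values   # Jan..Dec factors (series starts in January)
forecast = intercept + slope * future_t + seasonal_factors
print(np.round(forecast, 1))
```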

☆ If time series decomposition cannot be carried out -> use the Expert Modeler to select an appropriate model

☆ Using the Expert Modeler

Built on the exponential smoothing and ARIMA families, it automatically finds the best-fitting model for the series and automatically detects outliers

Objective: full mastery of the exponential smoothing method and the ARIMA model is not required; being able to analyze the results produced by the Expert Modeler is enough

Exponential smoothing: Simple model

Limitation: it can forecast only one period ahead!

Linear trend model

Damped trend Model

Simple seasonal

Winters’ additive model

Winters’ multiplicative model
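
The lines above list the exponential smoothing variants the Expert Modeler chooses among. As a hedged illustration, statsmodels exposes the same family; the synthetic monthly data and the parameter choices below are assumptions, not from the source:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing, ExponentialSmoothing

rng = np.random.default_rng(1)
idx = pd.date_range("2016-01", periods=48, freq="MS")
y = pd.Series(200 + 1.5 * np.arange(48)
              + 15 * np.sin(2 * np.pi * np.arange(48) / 12)
              + rng.normal(0, 3, 48), index=idx)

# Simple exponential smoothing: the forecast is flat, effectively one period ahead
simple = SimpleExpSmoothing(y).fit()
print(simple.forecast(1))

# Winters' additive model: additive trend + additive seasonality (Holt-Winters)
winters = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=12).fit()
print(winters.forecast(12))
```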

☆ Expert Modeler: operating procedure

☆ Expert Modeler: analyzing the results

Descriptive regression and predictive regression

  • Predictive regression does not look at significance, only at prediction error.
  • Descriptive regression does look at significance.

Ten knowledge points:

ARIMA model

Stationary time series and white noise series

Covariance stationarity/time series stationarity:

Significance: stationarity is a desirable property; the mean and variance are constant, which makes modeling convenient. If the series is not stationary, differencing (or a similar transformation) can make it stationary
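
A hedged sketch of that last point: the ADF unit-root test in statsmodels checks stationarity, and first differencing is the usual transformation when the series fails the test (the random-walk data below are simulated for illustration).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
# A random walk is non-stationary; its first difference is white noise (stationary)
walk = pd.Series(np.cumsum(rng.normal(size=300)))

p_level = adfuller(walk)[1]                   # large p-value: cannot reject a unit root
p_diff = adfuller(walk.diff().dropna())[1]    # small p-value: the difference is stationary
print(f"p-value in levels: {p_level:.3f}, p-value after first differencing: {p_diff:.3f}")
```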

White noise sequence = disturbance term

Example:

  • A: the mean keeps increasing, so it is not constant; not stationary
  • C: the variance changes, so it is not constant; not stationary
  • D: there is a seasonal pattern and the variance is not constant; not stationary
  • E: downward trend, the mean decreases; not stationary
  • F: the means of the three segments differ; not stationary
  • H: there is a seasonal pattern and the variance is not constant; not stationary
  • I: upward trend; not stationary

The stationary ones are:

  • B: constant mean, constant variance, consistent fluctuation; the single outlier can be ignored
  • G: G looks a little odd and is not a textbook case

Difference equation

q: the disturbance term is lagged by up to q periods

Adding the first and the second gives the third, the autoregressive moving average model: AR plus MA yields ARMA

z denotes other variables; including it gives a bivariate time series model

The part marked in red is the second term of the difference equation

ARMA(p, q) model:
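
The formula itself appears to have been lost in extraction; the standard form of the ARMA(p, q) difference equation (my reconstruction, not copied from the source) is:

```latex
y_t = c + \phi_1 y_{t-1} + \dots + \phi_p y_{t-p}
        + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}
```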

Significance of the difference equation: the modulus of the roots of the characteristic equation of the ARMA(p, q) model determines whether the time series is stationary

The lag operator

The ARMA(p, q) model written with the lag operator:
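
The lag-operator form also seems to have been an image; in standard notation, with L y_t = y_{t-1} (my reconstruction):

```latex
\left(1 - \phi_1 L - \dots - \phi_p L^p\right) y_t
  = c + \left(1 + \theta_1 L + \dots + \theta_q L^q\right) \varepsilon_t
```

The series is stationary when all roots of the AR polynomial 1 - \phi_1 z - \dots - \phi_p z^p lie outside the unit circle, which is the modulus condition referred to above.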

From non-stationary to stationary: use differencing

AR(p) model == autoregressive model of order p

Important: the AR(p) model discussed here must be a stationary time series model. If the original data are not stationary, they must first be transformed into stationary data before modeling.

The characteristic equation determines stationarity; a non-stationary series must first be transformed into a stationary process

As k increases, the term blows up (the process explodes), so it is not stationary

Example:

MA(q) model == moving average model of order q

Relationship between MA model and AR model:

Significance: an AR model may have many parameters; after conversion into an MA model there are fewer to handle, which makes the analysis easier

Stationarity of MA(q): a finite-order MA(q) process is always stationary

The autoregressive moving average ARMA(p, q) model

Stationarity of the ARMA(p, q) model

The MA(q) part is always stationary, so only the AR(p) part needs to be checked

Stata software

(Software output) Determining the ARMA order: ACF, the autocorrelation function, and PACF, the partial autocorrelation function

Meaning: the ACF and PACF are used to determine the order of the ARMA(p, q) model; the data must be a stationary series.

The autocorrelation coefficient at lag s equals the covariance of observations s periods apart divided by the product of their standard deviations

Viewed as a function of the lag s, this is the autocorrelation function
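
In symbols (a standard definition, reconstructed rather than copied from the source), for a stationary series the lag-s autocorrelation is

```latex
\rho_s = \frac{\operatorname{Cov}(y_t, y_{t-s})}{\sqrt{\operatorname{Var}(y_t)\,\operatorname{Var}(y_{t-s})}} = \frac{\gamma_s}{\gamma_0}
```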

For white noise, γ_s = 0 at every lag s ≠ 0

The autocorrelation coefficient can also be obtained by estimating a regression coefficient
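
A hedged sketch of drawing the two order-determination plots with statsmodels (the AR(2) data below are simulated purely for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(3)
# Simulate a stationary AR(2) process: y_t = 0.6 y_{t-1} - 0.3 y_{t-2} + eps_t
eps = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + eps[t]

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(y, lags=20, ax=axes[0])    # tails off for an AR process
plot_pacf(y, lags=20, ax=axes[1])   # cuts off after lag p = 2
plt.show()
```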

☆ Example: judging the order from the two plots

For an ARMA(p, q) model both plots tail off, so the order is hard to judge by eye; SPSS will tell us the correct order, as shown below:

Estimation of ARMA model

Maximum likelihood estimation for ARMA model

After the order is chosen, the coefficients are computed by maximum likelihood estimation. The software does this automatically!

Model selection: the AIC and BIC criteria (smaller is better)

The fewer parameters the model has, the better, and the larger its maximized likelihood, the better; both criteria penalize parameters and reward likelihood, so the smaller the AIC and BIC, the better

Lagging n periods means n parameters, so adding lags increases the penalty

BIC selects the more parsimonious model (its penalty on the number of parameters is heavier than AIC's)

Significance: if there are ten candidate models, compute the BIC/AIC of each, compare, and select the model with the smallest value
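
A hedged sketch of that comparison in Python (the candidate orders and the synthetic data are mine, not from the notes): fit each candidate ARIMA order and keep the one with the smallest BIC.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
y = pd.Series(np.cumsum(rng.normal(size=200)))    # synthetic non-stationary series

results = []
for p in range(3):
    for q in range(3):
        fit = ARIMA(y, order=(p, 1, q)).fit()
        results.append(((p, 1, q), fit.aic, fit.bic))

best = min(results, key=lambda r: r[2])           # smallest BIC wins
print("best order by BIC:", best[0])
```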

Check whether residuals are white noise

Q test

Null hypothesis: the autocorrelation coefficients are all 0, i.e. the residuals are a white noise series

Construct the Q statistic and test this hypothesis; the software reports the p-value, from which the judgment is made directly
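
A hedged sketch of this residual check; statsmodels implements the Ljung-Box version of the Q test (the fitted order and the data below are illustrative only).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(5)
y = pd.Series(np.cumsum(rng.normal(size=300)))    # synthetic series

fit = ARIMA(y, order=(1, 1, 1)).fit()
lb = acorr_ljungbox(fit.resid, lags=[10])
print(lb)   # lb_pvalue > 0.05: cannot reject white noise, so the residuals pass
```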

☆ ARIMA(p, d, q) model: the differenced autoregressive moving average model

Example:

★ SARIMA (seasonal ARIMA) model: seasonal factors are taken into account
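
A hedged sketch of fitting a seasonal ARIMA; in statsmodels the SARIMAX class takes a regular order and a seasonal order (the (1,1,1)×(1,1,1,12) choice and the data are illustrative assumptions, not from the source).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
idx = pd.date_range("2012-01", periods=96, freq="MS")
y = pd.Series(50 + 0.3 * np.arange(96)
              + 8 * np.sin(2 * np.pi * np.arange(96) / 12)
              + rng.normal(0, 2, 96), index=idx)

model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(12))   # forecast the next seasonal cycle
```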

Operation in SPSS

The overall idea of time series modeling in SPSS:

Do some analysis of the results and interpret them. For example, for the Winters additive model:

【Note】

The population forecast here is not accurate: it simply keeps rising

Example 3: Shanghai Composite Index forecast

The Simple model

The Q test gives p = 0, so the residuals are not white noise and this model is not good. Reason: there are outliers, so the model is re-built after eliminating the outliers

After the outliers are handled, the model becomes ARIMA(0,1,14) and improves: the Q test gives p > 0.05, the residuals are white noise, so the model is acceptable

And the Simple model can forecast only one period ahead

ARIMA(0,1,14) can forecast only 14 periods ahead. ♥ Solution: append the 14 forecast periods to the earlier data and feed the extended series back in to forecast further (see the sketch below)
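
A hedged sketch of that rolling trick (the series is synthetic; the ARIMA(0,1,14) order follows the notes above): forecast a 14-period block, append it to the history, re-fit, and forecast the next block.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
history = pd.Series(np.cumsum(rng.normal(size=250)))   # synthetic index-like series

block, horizon = 14, 28
forecasts = []
while len(forecasts) < horizon:
    fit = ARIMA(history, order=(0, 1, 14)).fit()
    step = fit.forecast(block)                  # forecast the next 14 periods
    forecasts.extend(step.tolist())
    # append the forecasts to the data and re-fit to push the horizon further out
    history = pd.concat([history, step], ignore_index=True)

print(np.round(forecasts[:horizon], 2))
```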

Example 4: GDP growth forecast

Real forecasts are made in light of the context, not taken directly from the model

Forecasting with all the data (outliers excluded) -> poor fit and poor forecasts. Reason: the context was ignored; GDP-based performance assessment was abolished after 2014

Two do's of forecasting:

  • Combine the forecast with the background context;
  • Make reasonable assumptions.

Two don'ts of forecasting:

  • Don't force a model onto the data;
  • Don't present results without explanation.