A RoadMap to Time-Series Analysis

Eswara Prasad
Published in featurepreneur · May 17, 2021


Time-series analysis is a technique that unites finance and technology: it lets us automate many aspects of financial analysis with programming tools. In this article, I will walk you through the core concepts of time series. If you would like to see the Python implementation of these topics in the upcoming article, just jump into my repo here.

Introduction:

  • Time-series data is a series of data points or observations recorded at successive time intervals. In general, a time series is a sequence of data points taken at equally spaced time intervals; the frequency of the recorded data points may be hourly, daily, weekly, monthly, quarterly or annually (a minimal pandas sketch follows this list).
  • Time-series forecasting is the process of using a statistical model to predict future values of a time series based on past observations.
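
To make this concrete, here is a minimal sketch (using pandas and NumPy, with made-up monthly sales figures) of a time series represented as a pandas Series indexed by dates:

```python
import numpy as np
import pandas as pd

# Hypothetical example: 36 months of synthetic sales data at month-start frequency
index = pd.date_range(start="2018-01-01", periods=36, freq="MS")
values = 100 + 2 * np.arange(36) + np.random.normal(scale=5, size=36)
sales = pd.Series(values, index=index, name="sales")

print(sales.head())  # one observation per month, indexed by date
```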

Jargon to Know:

  • Trend — The long-term increasing or decreasing behaviour of the series over time. A trend can be increasing (upward), decreasing (downward), or horizontal (stationary).
  • Seasonality — Repeating patterns or cycles of behaviour over time. An example is the increase in water consumption in summer due to hot weather.
  • ETS Decomposition — ETS decomposition is used to separate the different components of a time series. The term ETS stands for Error, Trend and Seasonality.
  • Stationarity — A series is stationary when its statistical properties, such as the mean, remain constant over time. If past effects accumulate and the values drift towards infinity, stationarity is not met.
  • Differencing — Differencing is used to make a series stationary and to control the auto-correlations. Some analyses do not require differencing, and an over-differenced series can produce wrong estimates (see the sketch after this list).
  • Dependence — The association between observations of the same variable at prior time periods.
  • Noise — The variability in the observations that cannot be explained by the model.
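
To make these terms concrete, here is a small sketch using statsmodels and pandas on made-up monthly data: it decomposes the series into trend, seasonal and residual components with seasonal_decompose, then applies first-order differencing. The data and the additive-model choice are assumptions for illustration only.

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly series with an upward trend and a summer bump (assumed data)
index = pd.date_range("2018-01-01", periods=48, freq="MS")
series = pd.Series(
    [100 + 2 * i + 10 * int(i % 12 in (5, 6, 7)) for i in range(48)],
    index=index,
)

# Decompose the series into trend, seasonal and residual parts
result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head(12))

# First-order differencing: subtract the previous observation to help remove the trend
differenced = series.diff().dropna()
print(differenced.head())
```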

Once you have a fair understanding of these terms, you can jump into the core concepts of time series.

Smoothing Techniques:

The smoothing process is essential to reduce the noise present in a series and to bring out the true patterns that may be present over time. There are three important smoothing methods in time-series analysis.

  • Single Exponential Smoothing

Single Exponential Smoothing, SES for short, also called Simple Exponential Smoothing, is a time-series forecasting method for univariate data without a trend or seasonality.
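
A minimal SES sketch with statsmodels; the data and the smoothing level (alpha = 0.3) are made up for illustration:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# Hypothetical univariate series with no clear trend or seasonality (assumed data)
data = pd.Series([112, 118, 110, 115, 113, 119, 114, 116, 112, 117])

# smoothing_level (alpha) controls how strongly recent observations are weighted
model = SimpleExpSmoothing(data).fit(smoothing_level=0.3, optimized=False)
print(model.forecast(3))  # forecast the next 3 points
```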

  • Double Exponential Smoothing

Double Exponential Smoothing is an extension of Exponential Smoothing that explicitly adds support for trends in the univariate time series. Double Exponential Smoothing with an additive trend is classically referred to as Holt's linear trend model, named after its developer, Charles Holt.
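
A minimal sketch of Holt's linear trend model with statsmodels, again on made-up data:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import Holt

# Hypothetical series with an upward trend (assumed data)
data = pd.Series([10, 12, 13, 15, 17, 18, 20, 22, 23, 25])

# Holt's method adds a second smoothing equation for the trend component
model = Holt(data).fit()
print(model.forecast(3))
```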

  • Triple Exponential Smoothing

Triple Exponential Smoothing is an extension of Exponential Smoothing that explicitly adds support for seasonality to the univariate time series. This method is sometimes called Holt-Winters Exponential Smoothing, named for two contributors to the method: Charles Holt and Peter Winters.
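
A minimal Holt-Winters sketch with statsmodels; the synthetic monthly data and the additive trend/seasonal settings are assumptions for illustration:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly series with trend and yearly seasonality (assumed data)
index = pd.date_range("2018-01-01", periods=48, freq="MS")
data = pd.Series(
    50 + 0.5 * np.arange(48) + 10 * np.sin(2 * np.pi * np.arange(48) / 12),
    index=index,
)

# Additive trend and additive seasonality with a 12-month seasonal period
model = ExponentialSmoothing(data, trend="add", seasonal="add", seasonal_periods=12).fit()
print(model.forecast(12))
```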

Forecasting Models:

Standard machine learning algorithms do not fit naturally into this process: rather than predicting isolated values, we are forecasting the future values of a sequence. So there are algorithms designed especially for time series.

Autoregression (AR):

The autoregression (AR) method models the next step in the sequence as a linear function of the observations at prior time steps. The method is suitable for univariate time series without trend and seasonal components.
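
A minimal AR sketch using statsmodels' AutoReg on made-up data (the lag order of 2 is an arbitrary choice):

```python
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

# Hypothetical stationary series (assumed data)
data = pd.Series([3.1, 2.9, 3.3, 3.0, 3.2, 2.8, 3.1, 3.0, 3.2, 2.9, 3.1, 3.0])

# AR(2): the next value is modelled as a linear function of the two previous values
model = AutoReg(data, lags=2).fit()
print(model.params)  # intercept and lag coefficients
print(model.predict(start=len(data), end=len(data) + 2))  # forecast 3 steps ahead
```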

Moving Average(MA):

A moving average, also called a rolling or running average, is used to analyse time-series data by calculating averages of different subsets of the complete dataset. Since it involves taking the average of the data over a sliding window, it is also called a moving mean (MM) or rolling mean. Note that this rolling-window smoothing is related to, but distinct from, the MA component used in the ARMA/ARIMA models below, which models residual errors at prior time steps.
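
A minimal rolling-mean sketch with pandas (the window size of 3 and the data are arbitrary):

```python
import pandas as pd

# Hypothetical daily series (assumed data)
data = pd.Series([20, 22, 21, 23, 25, 24, 26, 27, 26, 28])

# 3-point rolling (moving) mean: each value is the average of a sliding window
print(data.rolling(window=3).mean())
```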

Autoregressive Moving Average (ARMA):

The Autoregressive Moving Average (ARMA) method models the next step in the sequence as a linear function of the observations and residual errors at prior time steps. It combines both Autoregression (AR) and Moving Average (MA) models.
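
A minimal ARMA sketch; recent statsmodels versions express an ARMA(p, q) model as an ARIMA with no differencing (d = 0). The data and the (2, 0, 1) order are illustrative assumptions:

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical stationary series (assumed data)
data = pd.Series([1.2, 0.8, 1.1, 0.9, 1.3, 1.0, 1.2, 0.9, 1.1, 1.0, 1.2, 0.9])

# ARMA(2, 1) written as ARIMA with d = 0 (no differencing)
model = ARIMA(data, order=(2, 0, 1)).fit()
print(model.forecast(3))
```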

Autoregressive Integrated Moving Average (ARIMA):

The Autoregressive Integrated Moving Average (ARIMA) method models the next step in the sequence as a linear function of the differenced observations and residual errors at prior time steps. It combines both Autoregression (AR) and Moving Average (MA) models as well as a differencing pre-processing step of the sequence to make the sequence stationary, called integration (I).
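
A minimal ARIMA sketch with statsmodels; the data and the (1, 1, 1) order are illustrative assumptions:

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical trending series (assumed data)
data = pd.Series([100, 104, 109, 113, 118, 124, 129, 135, 140, 147, 153, 160])

# order = (p, d, q): AR lags, degree of differencing (integration), MA lags
model = ARIMA(data, order=(1, 1, 1)).fit()
print(model.forecast(3))
```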

Seasonal Autoregressive Integrated Moving-Average (SARIMA):

The Seasonal Autoregressive Integrated Moving Average (SARIMA) method models the next step in the sequence as a linear function of the differenced observations, errors, differenced seasonal observations, and seasonal errors at prior time steps. It combines the ARIMA model with the ability to perform the same autoregression, differencing, and moving average modelling at the seasonal level.
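
A minimal SARIMA sketch using statsmodels' SARIMAX class without exogenous variables; the synthetic monthly data and the chosen orders are assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly series with trend and yearly seasonality (assumed data)
index = pd.date_range("2017-01-01", periods=60, freq="MS")
data = pd.Series(
    200 + np.arange(60) + 15 * np.sin(2 * np.pi * np.arange(60) / 12),
    index=index,
)

# seasonal_order = (P, D, Q, s): seasonal AR, differencing and MA terms plus the period s
model = SARIMAX(data, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
print(model.forecast(12))
```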

Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX):

The Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX) is an extension of the SARIMA model that also includes the modelling of exogenous variables. The SARIMAX method can also be used to model the subsumed models with exogenous variables, such as ARX, MAX, ARMAX, and ARIMAX. The method is suitable for univariate time series with trend and/or seasonal components and exogenous variables.
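
A minimal SARIMAX sketch; the sales and promotion data are made up, and note that forecasting also needs future values of the exogenous variable:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly sales with a promotion flag as an exogenous regressor (assumed data)
index = pd.date_range("2017-01-01", periods=48, freq="MS")
promo = pd.Series(np.random.binomial(1, 0.3, size=48), index=index)
values = 100 + np.arange(48) + 20 * promo.to_numpy() + np.random.normal(0, 3, 48)
sales = pd.Series(values, index=index, name="sales")

model = SARIMAX(sales, exog=promo, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)

# Forecasting requires the future values of the exogenous variable as well
future_promo = pd.Series([0, 1, 0], index=pd.date_range("2021-01-01", periods=3, freq="MS"))
print(model.forecast(3, exog=future_promo))
```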

Vector Autoregression Moving-Average (VARMA):

The Vector Autoregression Moving-Average (VARMA) method models the next step in each time series using an ARMA model. It is the generalization of ARMA to multiple parallel time series, e.g. multivariate time series. The method is suitable for multivariate time series without trend and seasonal components.
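
A minimal VARMA sketch with statsmodels' VARMAX class (no exogenous variables); the two random series and the (1, 1) order are assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX

# Hypothetical pair of related stationary series (assumed data)
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "series_a": rng.normal(10, 1, 100),
    "series_b": rng.normal(5, 0.5, 100),
})

# VARMA(1, 1): order = (AR lags, MA lags), estimated across both series jointly
model = VARMAX(df, order=(1, 1)).fit(disp=False)
print(model.forecast(5))
```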

Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX):

The Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX) is an extension of the VARMA model that also includes the modelling of exogenous variables. It is a multivariate version of the ARMAX method.
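
A minimal VARMAX sketch; the two endogenous series, the temperature regressor and the (1, 0) order are all made-up assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX

# Hypothetical two endogenous series plus one exogenous regressor (assumed data)
rng = np.random.default_rng(1)
endog = pd.DataFrame({
    "demand": rng.normal(50, 5, 100),
    "price": rng.normal(20, 2, 100),
})
exog = pd.Series(rng.normal(0, 1, 100), name="temperature")

model = VARMAX(endog, exog=exog, order=(1, 0)).fit(disp=False)

# Future exogenous values are needed to forecast ahead
future_exog = pd.Series(rng.normal(0, 1, 5), name="temperature")
print(model.forecast(5, exog=future_exog))
```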

Conclusion:

These are the major topics covered in time-series analysis. You can also apply deep learning to time-series analysis using LSTM models. We have given you just a glimpse of the topics in time series, and there is still more to cover, so keep exploring! If you would like to extend your knowledge of time series using Python, just jump into my repo here and enjoy learning.
