Thursday, November 21, 2019

Working Progress 3: Holt-Winters Model

Holt and Winters extended Holt's method to capture seasonality. The Holt-Winters seasonal method comprises the forecast equation and three smoothing equations: one for the level ℓ_t, one for the trend b_t, and one for the seasonal component s_t, with corresponding smoothing parameters α, β and γ. We use m to denote the frequency of the seasonality, i.e., the number of seasons in a year. For example, for quarterly data m = 4, and for monthly data m = 12.

Reference: https://otexts.com/fpp2/holt-winters.html
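A minimal sketch of fitting such a model, assuming statsmodels' ExponentialSmoothing and a pandas Series `series` of the traffic data (both assumptions; the post does not state the implementation used):

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def holt_winters_forecast(series: pd.Series, m: int = 12, horizon: int = 12) -> pd.Series:
    """Fit an additive Holt-Winters model (level, trend, seasonality) and forecast ahead."""
    model = ExponentialSmoothing(
        series,
        trend="add",         # Holt's linear trend, smoothed by beta
        seasonal="add",      # additive seasonal component, smoothed by gamma
        seasonal_periods=m,  # m = number of seasons per cycle (e.g. 12 for monthly data)
    )
    fit = model.fit()        # alpha, beta and gamma are estimated from the data
    return fit.forecast(horizon)
```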

The results were not as good as those from SARIMAX, so this model was discarded.

Working Progress 2: Vanilla RNN and LSTM for prediction

As discussed earlier, RNNs and LSTMs can be used to predict future time-series values. Here, we split the data into train and test sets, with the data from the year 2016 used for testing and the rest used for training. The data is windowed so that each X_i contains the previous n_steps values on which the next value depends, and Y_i is the value to be predicted. Only one recurrent layer was used, since this is a vanilla RNN.
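A minimal sketch of this windowing and a single-layer RNN, assuming Keras/TensorFlow 2.x; the window length, layer size, epochs and placeholder series are illustrative assumptions, not the values used in the experiment:

```python
import numpy as np
import tensorflow as tf

def make_windows(values, n_steps):
    """Split a 1-D series into (X, y) pairs: X holds the previous n_steps values,
    y is the value that immediately follows."""
    X, y = [], []
    for i in range(len(values) - n_steps):
        X.append(values[i:i + n_steps])
        y.append(values[i + n_steps])
    return np.array(X)[..., np.newaxis], np.array(y).reshape(-1, 1)  # X: (samples, n_steps, 1)

# Placeholder series; in the experiment this would be the daily page-view counts.
traffic = 500 + 100 * np.sin(np.arange(700) / 7.0)
n_steps = 30
X, y = make_windows(traffic, n_steps)

# Single recurrent layer ("vanilla" RNN); swap SimpleRNN for LSTM to get the LSTM variant.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_steps, 1)),
    tf.keras.layers.SimpleRNN(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)
```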
The result was an RMSE of 34.33, with the predicted values visualized below:


Working Progress 1: Working On SARIMAX model

Seasonal Autoregressive Integrated Moving Average, SARIMA or Seasonal ARIMA, is an extension of ARIMA that explicitly supports univariate time-series data with a seasonal component. It adds three new hyperparameters to specify the autoregression (AR), differencing (I) and moving average (MA) for the seasonal component of the series, as well as an additional parameter for the period of the seasonality.


We searched over possible combinations of (p, d, q, m) values for the given time series.
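A minimal sketch of such a search, assuming statsmodels' SARIMAX and AIC as the selection criterion; the search ranges below are illustrative, not the ones used in the original experiment:

```python
import itertools
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def grid_search_sarima(series: pd.Series, m: int = 12):
    """Try small (p, d, q) and seasonal (P, D, Q, m) grids and keep the lowest-AIC fit."""
    best_aic, best_cfg = float("inf"), None
    pdq = list(itertools.product(range(2), range(2), range(2)))
    seasonal = [(P, D, Q, m) for P, D, Q in itertools.product(range(2), repeat=3)]
    for order in pdq:
        for seasonal_order in seasonal:
            try:
                fit = SARIMAX(series, order=order, seasonal_order=seasonal_order,
                              enforce_stationarity=False,
                              enforce_invertibility=False).fit(disp=False)
                if fit.aic < best_aic:
                    best_aic, best_cfg = fit.aic, (order, seasonal_order)
            except Exception:
                continue  # skip configurations that fail to converge
    return best_cfg, best_aic
```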




After these computations for the series, we obtained (p, d, q, m) = (0, 1, 0, 12). The result was a Root Mean Square Error of 38.56, and the forecast for the upcoming year is visualized below:








For a better understanding of SARIMAX, refer to: https://machinelearningmastery.com/sarima-for-time-series-forecasting-in-python/
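For reference, a sketch of fitting the configuration reported above and scoring it with RMSE on a held-out slice; interpreting (0, 1, 0, 12) as a non-seasonal order of (0, 1, 0) with a seasonal period of 12 is an assumption, as is the train/test split:

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_and_score(train, test):
    """Fit SARIMAX on the training slice and report RMSE on the held-out slice."""
    fit = SARIMAX(train, order=(0, 1, 0), seasonal_order=(0, 1, 0, 12)).fit(disp=False)
    forecast = fit.forecast(steps=len(test))
    rmse = np.sqrt(mean_squared_error(test, forecast))
    return forecast, rmse
```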

Proposed Model: Ensemble Learning


As our first step, we plan to remove outliers/noise from the dataset, since they pull the forecasts away from the actual trend. We observed that each algorithm performs better on a particular type of time series. For example, for series with a stronger seasonal component, the Prophet approach gives better results; classical methods like ETS and ARIMA do better when the series has short-term dependencies, whereas more complex models like RNNs/LSTMs do better when there are long-term correlations in the series.
Thus we plan to use ensemble learning for web-traffic prediction. Ensemble learning combines multiple predictions (forecasts) from one or more methods to improve on the accuracy of any single prediction and to avoid possible overfitting (a sketch of the combining step follows the list below). The models that we would be working with are as follows –
1)     Ensembles of classical models-
·        Autoregressive (AR),
·        Moving Average (MA),
·        Autoregressive Moving Average (ARMA),
·        Autoregressive Integrated Moving Average (ARIMA), and
·        Seasonal Autoregressive Integrated Moving Average (SARIMA) models.
2)     Ensembles of LSTM models
3)     Ensembles of Prophet
Learn more about ensemble learning: https://towardsdatascience.com/ensemble-methods-in-machine-learning-what-are-they-and-why-use-them-68ec3f9fef5f
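The combining step itself can be as simple as a (weighted) average of the aligned base forecasts. A minimal sketch, with placeholder arrays standing in for the outputs of the models listed above:

```python
import numpy as np

def ensemble_forecast(forecasts, weights=None):
    """Weighted average of aligned forecast arrays; equal weights by default."""
    stacked = np.vstack(forecasts)            # shape: (n_models, horizon)
    if weights is None:
        weights = np.full(len(forecasts), 1.0 / len(forecasts))
    return np.average(stacked, axis=0, weights=weights)

# Placeholder forecasts for a 3-step horizon (hypothetical values).
arima_fc   = np.array([120.0, 118.0, 125.0])
lstm_fc    = np.array([115.0, 121.0, 130.0])
prophet_fc = np.array([118.0, 119.0, 127.0])
print(ensemble_forecast([arima_fc, lstm_fc, prophet_fc]))
```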

Previous Work 3: Prophet


Prophet is a procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well.


Reference: https://www.kaggle.com/headsortails/wiki-traffic-forecast-exploration-wtf-eda
Learn more about Prophet:  https://facebook.github.io/prophet/
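A minimal sketch of fitting Prophet to a page-view series, assuming the `prophet` package (formerly `fbprophet`); the date range and values below are placeholders:

```python
import pandas as pd
from prophet import Prophet   # `from fbprophet import Prophet` in older releases

# Prophet expects a dataframe with columns `ds` (date) and `y` (value).
df = pd.DataFrame({
    "ds": pd.date_range("2015-07-01", periods=550, freq="D"),
    "y": range(550),          # placeholder values; real data = daily page views
})

m = Prophet(yearly_seasonality=True, weekly_seasonality=True, daily_seasonality=False)
m.fit(df)
future = m.make_future_dataframe(periods=60)   # extend 60 days past the training data
forecast = m.predict(future)                   # columns include yhat, yhat_lower, yhat_upper
```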

Previous Work 2: RNN Model


The model was rebuilt from an RNN seq2seq model using an encoder-decoder architecture: a cuDNN GRU was used as the encoder and TensorFlow's GRUBlockCell as the decoder, with each decoder output fed back as the input to the next step until the end of the sequence.
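A simplified stand-in for this architecture, assuming Keras/TensorFlow 2.x rather than the TF1 cuDNN GRU / GRUBlockCell used originally; this training graph uses teacher forcing, while at inference the decoder output would be fed back step by step as described above:

```python
import tensorflow as tf

n_in, n_out, units = 60, 30, 64                     # illustrative window, horizon, state size

# Encoder: compress the input window into a single GRU state.
encoder_inputs = tf.keras.Input(shape=(n_in, 1))
_, state = tf.keras.layers.GRU(units, return_state=True)(encoder_inputs)

# Decoder: a GRU initialised with the encoder state emits the output sequence.
decoder_inputs = tf.keras.Input(shape=(n_out, 1))   # teacher-forced targets during training
decoder_seq = tf.keras.layers.GRU(units, return_sequences=True)(
    decoder_inputs, initial_state=state)
predictions = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1))(decoder_seq)

model = tf.keras.Model([encoder_inputs, decoder_inputs], predictions)
model.compile(optimizer="adam", loss="mse")
```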

Previous Work 1: ARIMA Model

ARIMA, short for ‘AutoRegressive Integrated Moving Average’, is a forecasting algorithm based on the idea that the information in the past values of the time series can alone be used to predict future values. ARIMA models are used because they can reduce a non-stationary series to a stationary series using a sequence of differencing steps. 
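A minimal sketch of this workflow, assuming statsmodels: an ADF test checks stationarity, and the differencing is handled by the d term of the ARIMA order (the order and horizon shown are illustrative):

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import adfuller

def fit_arima(series: pd.Series, order=(1, 1, 1)):
    """Report the ADF p-value, fit an ARIMA model, and forecast 30 steps ahead."""
    pvalue = adfuller(series.dropna())[1]   # p > 0.05 suggests the series is non-stationary
    print(f"ADF p-value: {pvalue:.3f}")
    fit = ARIMA(series, order=order).fit()  # the d in `order` applies the differencing
    return fit.forecast(steps=30)
```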

Working Progress 8: Random Forest

Random Forest: It is technically an ensemble method (based on the divide-and-conquer approach) of decision trees generated on a randomly ...