Data Science Asked by mayuc on June 27, 2021
I have run LSTM and SVR models on various datasets with sample sizes in the range of 1-4000, and the MAPE obtained with SVR was consistently lower than that obtained with LSTM. I was told the reverse should be true (that LSTM should perform better) but haven't found much information on this online. I would appreciate any feedback on this, and any links to articles or papers (so far I have only found widely varying opinions).
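For reference, the MAPE compared above is the mean absolute percentage error. A minimal sketch of how it is typically computed (the function name and sample values here are illustrative, not from the question):

```python
def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent."""
    return 100 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

# Example: per-point errors of 10%, 10%, and 0% average to roughly 6.67%
print(mape([100, 200, 400], [110, 180, 400]))
```

Note that MAPE is undefined when a true value is zero, which matters when comparing models on series that touch zero.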
As far as I know, recurrent neural networks (RNNs) are well suited to time-series problems. An LSTM (a type of RNN) captures long-term dependencies and predicts the next value after learning the pattern of the whole series. SVR, by contrast, treats each row of the training data as an independent sample and predicts the outcome without considering the preceding pattern.
For example: 10->20, 20->40, 30->60, and so on ...
An LSTM will try to understand the whole series and then predict the next value, whereas for SVR each row is an individual training sample, split into features and a target, and the prediction is based only on the mapping learnt from those samples.

That is why RNNs are generally better suited to time series.
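The distinction can be made concrete by looking at how a series must be reshaped before an SVR can use it at all. A minimal sketch (the helper `make_supervised` and the sample series are illustrative, not from the original answer): each lagged window becomes one feature row, and the model sees those rows as unordered, independent samples.

```python
import numpy as np

def make_supervised(series, window):
    """Frame a 1-D series as (features, target) rows for models like SVR."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])  # lagged values become the features
        y.append(series[i + window])    # the next value becomes the target
    return np.array(X), np.array(y)

series = [10, 20, 40, 80, 160, 320]
X, y = make_supervised(series, window=2)
# X[0] = [10, 20] with target y[0] = 40; shuffling the rows would not
# change what an SVR learns, whereas an LSTM consumes the series in order.
```

The resulting `(X, y)` pair can be passed directly to something like scikit-learn's `SVR.fit`, while an LSTM would instead be fed the sequence itself (or overlapping subsequences) so it can exploit the ordering.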
Please refer to: https://www.researchgate.net/post/how_I_can_do_sequence_to_sequence_prediction_using_SVM
For an explanation of framing time series as a supervised problem for SVR: https://machinelearningmastery.com/time-series-forecasting-supervised-learning/
Answered by vipin bansal on June 27, 2021
If you have enough data, LSTMs will generally outperform SVRs. However, in my experience SVRs perform better on most real-world problems because the datasets are small.
In the paper "E-Commerce Price Forecasting Using LSTM Neural Networks" by Houda Bakir, Ghassen Chniti, and Hédi Zaher, you can find a comparison in which LSTMs outperform SVRs, especially in terms of the variance of the RMSE across several time series.
Answered by fhaase on June 27, 2021