Data Science Asked on April 12, 2021
Someone gave me a tip to use a Kalman filter for my dataset. How time-intensive is it to get a good Kalman filter running, compared to simple interpolation methods like
df.fillna(method="")
which take basically no effort?
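For reference, a minimal sketch of the "no effort" pandas baselines mentioned above (the specific series here is made up for illustration):

```python
import numpy as np
import pandas as pd

# Toy series with a gap of missing values
s = pd.Series([1.0, np.nan, np.nan, 4.0, 5.0])

# Forward-fill: propagate the last observed value into the gap
print(s.ffill().tolist())        # [1.0, 1.0, 1.0, 4.0, 5.0]

# Linear interpolation between the surrounding observations
print(s.interpolate().tolist())  # [1.0, 2.0, 3.0, 4.0, 5.0]
```

Note that `df.fillna(method="ffill")` is deprecated in recent pandas versions in favour of `df.ffill()`.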
If one or two iterations are enough to get useful results that come very close to the real missing values, then I am willing to put in the effort to implement it. (Dataset length: 100,000 up to 200 million rows.)
If it needs to be tuned like a neural network, which can itself be costly in terms of time, isn't it better to simply use an LSTM?
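To make the comparison concrete, here is a minimal sketch of Kalman-filter-based imputation, assuming a simple local-level (random-walk) model; the noise variances `q` and `r` are hypothetical hand-picked values, not something from the question. Missing observations are handled by skipping the measurement update, and a backward (RTS smoother) pass refines the estimates using future data:

```python
import numpy as np

def kalman_impute(y, q=1e-2, r=1e-1):
    """Impute NaNs in a 1-D series with a local-level Kalman smoother.

    Assumed model (a choice for this sketch, not the only option):
        x_t = x_{t-1} + w_t,  w_t ~ N(0, q)   # hidden level
        y_t = x_t + v_t,      v_t ~ N(0, r)   # noisy observation
    At missing y_t the measurement update is skipped.
    """
    n = len(y)
    m = np.zeros(n)       # filtered means
    p = np.zeros(n)       # filtered variances
    m_pred = np.zeros(n)  # one-step-ahead predicted means
    p_pred = np.zeros(n)  # one-step-ahead predicted variances

    # --- forward (filter) pass ---
    m_prev, p_prev = np.nanmean(y), 1.0       # crude prior
    for t in range(n):
        m_pred[t], p_pred[t] = m_prev, p_prev + q   # predict
        if np.isnan(y[t]):
            m[t], p[t] = m_pred[t], p_pred[t]       # no observation: keep prediction
        else:
            k = p_pred[t] / (p_pred[t] + r)         # Kalman gain
            m[t] = m_pred[t] + k * (y[t] - m_pred[t])
            p[t] = (1 - k) * p_pred[t]
        m_prev, p_prev = m[t], p[t]

    # --- backward (RTS smoother) pass ---
    ms = m.copy()
    for t in range(n - 2, -1, -1):
        g = p[t] / p_pred[t + 1]                    # smoother gain
        ms[t] = m[t] + g * (ms[t + 1] - m_pred[t + 1])

    out = y.copy()
    out[np.isnan(y)] = ms[np.isnan(y)]
    return out

# Usage: a noisy random walk with a gap of 5 missing values
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0, 0.1, 200))
y = truth + rng.normal(0, 0.3, 200)
y[50:55] = np.nan
filled = kalman_impute(y)
```

With a fixed, simple model like this, the filter runs in a single O(n) pass per direction, so no iterative optimisation is needed; iterations only come in if you fit `q` and `r` via EM (as e.g. `pykalman` does), and even then a handful of EM iterations is typical.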