
Output of a network to its input, Keras

Data Science question, asked by Georgy Firsov on July 13, 2021

I’m trying to create a neural network in Keras for time series forecasting. I’ve built a concept, and now I’m not quite sure whether it is possible to implement it in Keras.

I have a potentially complicated network (it already has several layers) that receives $N$ values as input (i.e. $N$ consecutive values of a time series) and outputs a prediction of the next value.

The input values are then "shifted", and the predicted value is appended to the $N-1$ most recent values.

I’ve already built the network, but I’m stuck on feeding the predicted value back to the input.

It should look something like this:

 +------------+
 |            |
 V  +----+    |
--->|    |    |  passes it back as an input
    |    |    | (current inputs are shifted)
--->|    |    |
    | NN |----+---> yields the next value
... |    |
    |    |
--->|    |
    +----+
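
For reference, one common way to get this behaviour is to keep the model itself stateless and implement the feedback loop outside of it at prediction time. Below is a minimal sketch under that assumption; the names forecast, window and n_steps are mine, and the model is assumed to map a window of $N$ values (input shape (1, N)) to a single next value (output shape (1, 1)).

    import numpy as np
    from tensorflow import keras

    # Minimal sketch of the feedback loop at prediction time, assuming `model`
    # is an already-trained Keras model mapping a window of N values to the
    # next value (input shape (1, N), output shape (1, 1)).
    def forecast(model: keras.Model, window: np.ndarray, n_steps: int) -> np.ndarray:
        """Roll the model forward n_steps, feeding each prediction back in."""
        window = np.asarray(window, dtype=np.float32).copy()   # shape: (N,)
        preds = []
        for _ in range(n_steps):
            next_value = model.predict(window[None, :], verbose=0)[0, 0]
            preds.append(next_value)
            # Shift the window left and append the new prediction at the end
            window = np.concatenate([window[1:], [next_value]])
        return np.array(preds)

This keeps training unchanged (ordinary supervised windows) and only adds the autoregressive loop at inference, which sidesteps the callback issue mentioned below.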

I’ve tried to use the callbacks API, but it didn’t help, because there’s no way to invoke a callback for each value produced (except by using a batch size of one, which leads to exploding gradients).

What is the proper way to do this in Keras?


Moreover, this "hidden" NN is a bit more complicated than a plain sequential one. It consists of $M$ sequential sub-networks whose outputs are concatenated and followed by a single LSTM layer. Each of these $M$ sub-networks receives all $N$ inputs (the same ones).

I think it is important to mention this.
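
For illustration, a rough functional-API sketch of such an architecture might look like the following. The sizes of the Dense layers, as well as the concrete values of M and N, are placeholders I chose for the example; the only fixed idea is that every sub-network receives the same $N$ inputs, their outputs are concatenated and reshaped into a short sequence, and a single LSTM layer consumes that sequence.

    from tensorflow import keras
    from tensorflow.keras import layers

    N = 24   # length of the input window (placeholder value)
    M = 4    # number of parallel sub-networks (placeholder value)

    inputs = keras.Input(shape=(N,))

    # Each sub-network sees the same N inputs
    branches = []
    for i in range(M):
        x = layers.Dense(32, activation="relu", name=f"subnet_{i}_dense_1")(inputs)
        x = layers.Dense(16, activation="relu", name=f"subnet_{i}_dense_2")(x)
        branches.append(x)

    # Concatenate the M branch outputs and treat them as a sequence of M steps
    # so the LSTM gets the (timesteps, features) input it expects.
    merged = layers.Concatenate()(branches)     # shape: (batch, M * 16)
    sequence = layers.Reshape((M, 16))(merged)  # shape: (batch, M, 16)

    x = layers.LSTM(32)(sequence)
    output = layers.Dense(1)(x)                 # prediction of the next value

    model = keras.Model(inputs, output)
    model.compile(optimizer="adam", loss="mse")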
