Hi,
I was using the recurrent-neural-network-from-scratch code for my own research and ran into something I didn't quite understand.
In the “Recurrent Neural Network” demo.ipynb the time-series data is reshaped like this:
trainX = np.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
testX = np.reshape(testX, (testX.shape[0], 1, testX.shape[1]))
This gives shape (samples, 1, look_back). With look_back = 2 that becomes (N, 1, 2), i.e. 1 timestep with 2 features. In the text you explain that the expected shape is [samples, time_steps, features].
I would have expected look_back to correspond to time_steps, not to features, so something like:
trainX = trainX.reshape(trainX.shape[0], trainX.shape[1], 1)
testX = testX.reshape(testX.shape[0], testX.shape[1], 1)
which gives (samples, look_back, 1).
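To make the difference concrete, here is a minimal sketch (with a made-up toy array, not the notebook's actual data) showing the shapes produced by the two reshapes:

```python
import numpy as np

# Toy stand-in for trainX: 5 samples, look_back = 2
trainX = np.arange(10).reshape(5, 2)

# Reshape as in the notebook: (samples, 1, look_back)
# -> interpreted as 1 timestep with 2 features
a = np.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
print(a.shape)  # (5, 1, 2)

# Reshape I would have expected: (samples, look_back, 1)
# -> interpreted as 2 timesteps with 1 feature each
b = trainX.reshape(trainX.shape[0], trainX.shape[1], 1)
print(b.shape)  # (5, 2, 1)
```

Both arrays contain the same numbers; only the axis the RNN unrolls over changes.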
Does this need to be switched, or am I missing something?