Question about reshape in “Recurrent Neural Network” demo.ipynb #2

@alydebles1

Description

Hi,

I was adapting the recurrent-neural-network-from-scratch code for my own research and ran into something I didn't quite understand.

In the “Recurrent Neural Network” demo.ipynb the time-series data is reshaped like this:

trainX = np.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
testX  = np.reshape(testX,  (testX.shape[0], 1, testX.shape[1]))

This gives shape (samples, 1, look_back). With look_back = 2 that becomes (N, 1, 2), i.e. 1 timestep with 2 features. In the text you explain that the expected shape is [samples, time_steps, features].

I would have expected look_back to correspond to time_steps, not to features, so something like:

trainX = trainX.reshape(trainX.shape[0], trainX.shape[1], 1)
testX  = testX.reshape(testX.shape[0],  testX.shape[1],  1)

which gives (samples, look_back, 1).

Does this need to be switched, or am I missing something?
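To make the difference concrete, here is a minimal sketch of the two reshapes. The windowing helper and the toy series are illustrative (not taken from the notebook); only the reshape calls mirror the issue:

```python
import numpy as np

look_back = 2
series = np.arange(10, dtype=float)

# Illustrative sliding-window construction, as in typical
# look_back-style time-series prep (names are hypothetical).
trainX = np.array([series[i:i + look_back]
                   for i in range(len(series) - look_back)])
print(trainX.shape)  # (8, 2): (samples, look_back)

# Reshape as in the notebook: 1 timestep, look_back features
as_features = np.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
print(as_features.shape)  # (8, 1, 2)

# Alternative suggested above: look_back timesteps, 1 feature
as_timesteps = trainX.reshape(trainX.shape[0], trainX.shape[1], 1)
print(as_timesteps.shape)  # (8, 2, 1)
```

Both arrays hold the same numbers; they differ only in which axis the RNN unrolls over, which is exactly the semantic question being asked.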
