
Switch to PyTorch API (computational graph is broken!) #1

@Nikronic

Description


patch_loss = np.sum(
[self.MSE_loss(self.gram_matrix(self.get_patch(ly)), self.gram_matrix(self.get_patch(lp)))
for ly, lp in zip(y_vgg, details_outputs_vgg)])

Why on earth is there a numpy function in the forward pass of the loss computation? np.sum converts the per-layer losses to plain floats, which detaches them from the autograd graph, so no gradients flow through this term. Switch to PyTorch ops and get rid of the for-loop.
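A minimal sketch of the fix, staying entirely in torch so the sum remains on the autograd graph. Note this is an illustration, not the repo's actual code: gram_matrix and the surrounding class context are assumed, and self.get_patch is omitted since its definition is not shown here. The comprehension itself is harmless as long as each element is a torch tensor; torch.stack(...).sum() replaces np.sum.

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (B, C, H, W) -> (B, C, C) normalized Gram matrix (assumed helper).
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def patch_style_loss(y_vgg, details_outputs_vgg):
    # Keep everything as torch tensors: torch.stack preserves the graph,
    # unlike np.sum over a list of loss values.
    losses = torch.stack([
        F.mse_loss(gram_matrix(ly), gram_matrix(lp))
        for ly, lp in zip(y_vgg, details_outputs_vgg)
    ])
    return losses.sum()
```

If all VGG feature maps shared one shape, the loop could be fully vectorized by stacking the features first; with per-layer shapes differing, the comprehension stays, but the gradient now flows correctly.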

Metadata

Labels

bug (Something isn't working)
