A simple fix could be to replace the nn.Hardtanh operator with a custom layer that just calls torch.clamp instead. Some extra glue code around module.layers might still be needed, since torch.clamp expects a Tensor (i.e. not a Module). Hopefully, torch.onnx.export is smart enough to convert a torch.clamp() call into the appropriate Clip operation(s).
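A minimal sketch of what such a wrapper could look like, assuming we just swap Hardtanh modules in place; the names ClampLayer and replace_hardtanh are hypothetical, not part of this repo:

```python
import torch
import torch.nn as nn

class ClampLayer(nn.Module):
    """Hypothetical drop-in replacement for nn.Hardtanh that calls torch.clamp."""
    def __init__(self, min_val=-1.0, max_val=1.0):
        super().__init__()
        self.min_val = min_val
        self.max_val = max_val

    def forward(self, x):
        # Wrapping torch.clamp in a Module keeps code that iterates over
        # module children (e.g. module.layers) working unchanged.
        return torch.clamp(x, self.min_val, self.max_val)

def replace_hardtanh(module):
    # Recursively swap every nn.Hardtanh child for the clamp-based layer.
    for name, child in module.named_children():
        if isinstance(child, nn.Hardtanh):
            setattr(module, name, ClampLayer(child.min_val, child.max_val))
        else:
            replace_hardtanh(child)

model = nn.Sequential(nn.Linear(4, 4), nn.Hardtanh())
replace_hardtanh(model)
# The exporter should emit a Clip node for the clamp call.
torch.onnx.export(model, torch.randn(1, 4), "model.onnx")
```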