@leasunhy (Collaborator):
  • Use torch.optim.AdamW as the fallback Adam implementation.
  • Support selecting the fused versions of the optimizers (via --use-fused-optimizer); see the sketch below.
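A minimal sketch of how such a flag might select between the implementations, assuming a recent PyTorch on a CUDA device. The --use-fused-optimizer name comes from this PR; the argparse scaffolding, model, and learning rate are placeholders:

```python
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument("--use-fused-optimizer", action="store_true",
                    help="Select the fused CUDA implementation of the optimizer.")
args = parser.parse_args()

model = torch.nn.Linear(1024, 1024).cuda()  # fused AdamW requires CUDA params

if args.use_fused_optimizer:
    # Fused path: a single fused CUDA kernel per optimizer step.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4, fused=True)
else:
    # Foreach (multi-tensor) path: the stock eager fallback.
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4, foreach=True)
```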

Relative speed: custom_fused (only available for Adam) > fused > foreach.
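A rough micro-benchmark for checking the fused-vs-foreach part of that ordering, assuming a CUDA device and a recent PyTorch; custom_fused is this PR's own kernel and is not exercised here, and the tensor shapes, step count, and learning rate are arbitrary:

```python
import time

import torch

def time_optimizer_steps(opt_kwargs, n_steps=100):
    """Average wall time of AdamW.step() for the given implementation switches."""
    params = [torch.randn(1024, 1024, device="cuda", requires_grad=True)
              for _ in range(8)]
    for p in params:
        p.grad = torch.randn_like(p)  # synthetic gradients; values don't matter

    opt = torch.optim.AdamW(params, lr=1e-4, **opt_kwargs)
    opt.step()  # warm-up: initializes optimizer state, loads kernels
    torch.cuda.synchronize()

    start = time.perf_counter()
    for _ in range(n_steps):
        opt.step()
    torch.cuda.synchronize()
    return (time.perf_counter() - start) / n_steps

print("fused  :", time_optimizer_steps({"fused": True}))
print("foreach:", time_optimizer_steps({"foreach": True}))
```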
