build: unpin torch version #467
base: main
Conversation
```diff
 dependencies = [
-    "torch==2.7.0",
+    "torch>=2.7.0",
     "torchvision==0.22.0",
```
Bug: Unpinned torch conflicts with torch2.7-specific gptqmodel wheel
Unpinning torch to >=2.7.0 creates a version mismatch with the gptqmodel==4.0.0.dev0+cu126torch2.7 dependency (in the gptq extra). The gptqmodel wheel's version string explicitly indicates it was built for torch 2.7 (torch2.7 suffix), but users may now install torch 2.8+ while getting a gptqmodel binary compiled against torch 2.7's ABI. This can cause runtime errors or crashes when using GPTQ functionality due to binary incompatibility between the torch version and the gptqmodel extension.
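For illustration, a runtime guard along these lines could surface the mismatch early. This is a minimal sketch, not pruna's actual code: it assumes the gptqmodel wheel keeps encoding its torch build in the local version tag, and the function name is hypothetical.

```python
# Hypothetical guard (not part of pruna): warn when the installed torch does
# not match the torch version encoded in the gptqmodel wheel's version tag,
# e.g. "4.0.0.dev0+cu126torch2.7" -> built against torch 2.7.
import re
import warnings
from importlib.metadata import version


def warn_on_gptqmodel_torch_mismatch() -> None:
    gptq_version = version("gptqmodel")  # e.g. "4.0.0.dev0+cu126torch2.7"
    tag = re.search(r"torch(\d+\.\d+)", gptq_version)
    if tag is None:
        return  # wheel does not advertise the torch it was built against

    built_for = tag.group(1)                               # "2.7"
    installed = ".".join(version("torch").split(".")[:2])  # e.g. "2.8"
    if installed != built_for:
        warnings.warn(
            f"gptqmodel {gptq_version} was compiled against torch {built_for}, "
            f"but torch {installed} is installed; GPTQ functionality may fail "
            "or crash due to ABI incompatibility."
        )
```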
@ParagEkbote you looked at unpinning the torch version in another PR, would you like to review this?
In this PR, why are we aiming to unpin the `torch` version? We could also define version-specific extras (users would then install a matching pair explicitly, e.g. `pip install pruna[torch29]`):

```toml
[project.optional-dependencies]
torch29 = [
    "torch==2.9.0",
    "torchao==0.14.1",
]
torch28 = [
    "torch==2.8.0",
    "torchao==0.13.0",
]
```

Another point to be considered is that if we unpin `torch`, […]

WDYT? cc: @gsprochette
This PR has been inactive for 10 days and is now marked as stale.
@ParagEkbote I would like to keep a broad range of torch versions so that pruna is compatible with as many setups as possible, so I don't think bumping the torch pin on every minor release is the solution we want. What I understand is that the torchao quantizer in pruna would crash with newer versions of torch requiring newer versions of torchao, but it is also the only part of pruna requiring the torchao dependency, correct? So maybe we should make a torchao extra instead, and pin the torch and torchao versions there until we make them compatible with newer versions? The longer-term solution, I think, would be to have version ranges for both torch and torchao and warn the user about the version compatibilities, with a link to this table.
This PR is not stale
I understand, it is indeed a bit cumbersome for the long term. I think that creating a `torchao` extra […]. Also, #471 will expand the usage of `torchao`.
I think this approach can work, but would we need to set torch to a specified minimum version so that it supports torchao and the other libraries such as […]? WDYT? cc: @gsprochette
sharpenb left a comment:
Thanks for fixing it @ParagEkbote and @gsprochette!
@ParagEkbote my preferred option is to unpin torch and torchao, and I'll add a warning in torchao (and send a message in PR #471 to add a warning in the new torchao quantizer) so the user can refer to the compatibility table. I like this option because it gives more freedom and we can also directly see what needs to be fixed.
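For concreteness, the proposed warning could look something like the following minimal sketch. The table entries below are only the torch/torchao pairs mentioned earlier in this thread, and the helper name and table link are placeholders, not pruna's actual implementation.

```python
# Sketch of the proposed compatibility warning (illustrative only).
# The real mapping should mirror the linked compatibility table.
import warnings
from importlib.metadata import version

# Pairs taken from the extras proposed above; more entries would follow the table.
TORCH_TO_TORCHAO = {
    "2.8": "0.13",
    "2.9": "0.14",
}
COMPAT_TABLE_URL = "<link to the torch/torchao compatibility table>"  # placeholder


def warn_on_torchao_mismatch() -> None:
    torch_minor = ".".join(version("torch").split(".")[:2])
    torchao_minor = ".".join(version("torchao").split(".")[:2])
    expected = TORCH_TO_TORCHAO.get(torch_minor)
    if expected is not None and torchao_minor != expected:
        warnings.warn(
            f"torch {torch_minor} is expected to pair with torchao {expected}, "
            f"but torchao {torchao_minor} is installed. "
            f"See the compatibility table: {COMPAT_TABLE_URL}"
        )
```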
Ok, that works for me 👍
Description
This PR unpins the torch version from `torch==2.7.0` to `torch>=2.7.0`.
The version was pinned mainly to match the version the stable-fast wheels were built against, so merging this PR will cause instability in the stable-fast install (in the stable-fast and full extras) until we upload the new wheels (currently in testing).
Related Issue
Fixes #(issue number)
Type of Change
How Has This Been Tested?
Ran the tests on torch 2.9 and made sure we are getting the same results as before
Checklist
Additional Notes