
Conversation

@CuiYifeng
Contributor

Fallback _fused_rms_norm and _fused_rms_norm_backward to CPU for XPU backend.
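
For context, here is a minimal, hypothetical smoke test of what this fallback enables. It assumes an XPU device is available and that torch.nn.functional.rms_norm dispatches to aten::_fused_rms_norm on this build; with the fallback registered, the XPU call should run (via CPU) and match the CPU reference instead of failing with a missing-kernel error.

```python
import torch
import torch.nn.functional as F

# Hypothetical smoke test (assumes an XPU device and that F.rms_norm reaches
# aten::_fused_rms_norm on this build): with the CPU fallback in place, the
# XPU result should match the CPU reference numerically.
if torch.xpu.is_available():
    x_cpu = torch.randn(4, 128)
    w_cpu = torch.randn(128)

    x_xpu = x_cpu.to("xpu")
    w_xpu = w_cpu.to("xpu")

    ref = F.rms_norm(x_cpu, [128], weight=w_cpu, eps=1e-6)
    out = F.rms_norm(x_xpu, [128], weight=w_xpu, eps=1e-6)

    torch.testing.assert_close(out.cpu(), ref, rtol=1e-5, atol=1e-5)

    # The backward pass exercises _fused_rms_norm_backward via the same fallback.
    x_g = x_cpu.clone().to("xpu").requires_grad_(True)
    F.rms_norm(x_g, [128], weight=w_xpu, eps=1e-6).sum().backward()
```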

@CuiYifeng CuiYifeng requested review from Copilot and etaf December 23, 2025 08:45
Contributor

Copilot AI left a comment


Pull request overview

This PR adds CPU fallback support for fused RMS normalization operations on the XPU backend. When these operations are called on XPU devices, they will automatically fall back to CPU execution to ensure functionality until native XPU implementations are available.

  • Adds _fused_rms_norm and _fused_rms_norm_backward to the XPU fallback list
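
For readers unfamiliar with the mechanism, below is a minimal sketch of how a per-op boxed CPU fallback is typically registered in PyTorch using at::native::cpu_fallback. This is illustrative only: the actual PR adds the two ops to the XPU fallback list rather than writing a standalone registration, and the exact op name strings and registration site may differ.

```cpp
// Minimal sketch of per-op CPU fallback registration (illustrative, not the
// exact change in this PR). The boxed cpu_fallback copies XPU inputs to CPU,
// runs the CPU kernel, and copies the results back to the XPU device.
#include <ATen/native/CPUFallback.h>
#include <torch/library.h>

TORCH_LIBRARY_IMPL(aten, XPU, m) {
  m.impl("_fused_rms_norm",
         torch::CppFunction::makeFromBoxedFunction<&at::native::cpu_fallback>());
  m.impl("_fused_rms_norm_backward",
         torch::CppFunction::makeFromBoxedFunction<&at::native::cpu_fallback>());
}
```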


@CuiYifeng CuiYifeng requested a review from guangyey December 23, 2025 08:46
@CuiYifeng
Contributor Author

CuiYifeng commented Dec 23, 2025

I encountered a multiple-definition error for aoti_torch_xpu__fused_rms_norm when registering _fused_rms_norm in yaml/native/native_functions.yaml. This PR uses the CPU fallback to quickly restore functionality until that multiple-definition issue is resolved.

Contributor

@guangyey guangyey left a comment


Thanks. I’ve delayed these implementations listed in #1905 and plan to finish them in PT 2.11.

@github-actions

Performance outliers, please check!

  • 🟡 [80%, 90%), may be fluctuations

| Category | Model | Target vs. Baseline [Eager] | Target vs. Baseline [Inductor] |
| --- | --- | --- | --- |
| huggingface_float16_training | MT5ForConditionalGeneration | 0.914857 | 0.890362 |

@CuiYifeng
Contributor Author

> Thanks. I’ve delayed these implementations listed in #1905 and plan to finish them in PT 2.11.

Thanks for the reminder. Please note that #1905 also requires changes to stock PyTorch.

@CuiYifeng
Contributor Author

This PR has been converted to a draft due to some new findings.

@CuiYifeng CuiYifeng marked this pull request as draft December 24, 2025 02:35