
CPU error? #1

@nerdyrodent

Description


The Shivam fork works for me, but testing on CPU gives me this error:

...
  File "/home/nerdy/github/diffusers_shivam/src/diffusers/models/attention.py", line 273, in forward
    hidden_states = xformers.ops.memory_efficient_attention(query, key, value)
  File "/home/nerdy/anaconda3/envs/diffusers/lib/python3.9/site-packages/xformers/ops.py", line 568, in memory_efficient_attention
    op = AttentionOpDispatch.from_arguments(
  File "/home/nerdy/anaconda3/envs/diffusers/lib/python3.9/site-packages/xformers/ops.py", line 531, in op
    raise NotImplementedError(f"No operator found for this attention: {self}")
NotImplementedError: No operator found for this attention: AttentionOpDispatch(dtype=torch.float32, device=device(type='cpu'), k=40, has_dropout=False, attn_bias_type=<class 'NoneType'>, kv_len=4096, q_len=4096)

Do I need to compile xformers differently, maybe? I'm guessing this version behaves differently?
I installed the xformers revision suggested in the instructions: "pip install git+https://github.com/facebookresearch/xformers@1d31a3a#egg=xformers"
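For what it's worth, the traceback suggests xformers simply has no CPU kernel for `memory_efficient_attention` (the dispatcher sees `device=cpu` and finds no matching operator), rather than a build problem. A minimal workaround sketch, assuming a hypothetical `attention_fallback` wrapper (not part of diffusers) that drops to plain scaled dot-product attention when the tensors live on CPU:

```python
import torch

def attention_fallback(query, key, value):
    # Hypothetical helper: xformers' memory_efficient_attention only ships
    # GPU kernels, so on CPU we compute plain scaled dot-product attention
    # with stock PyTorch ops instead.
    if query.device.type == "cpu":
        scale = query.shape[-1] ** -0.5
        attn = torch.softmax(query @ key.transpose(-2, -1) * scale, dim=-1)
        return attn @ value
    # On GPU, defer to xformers as before (import lazily so CPU-only
    # environments without xformers still work).
    import xformers.ops
    return xformers.ops.memory_efficient_attention(query, key, value)
```

Calling this in place of the direct `xformers.ops.memory_efficient_attention(...)` call in `attention.py` would let the same code path run on both devices, at the cost of the memory savings on CPU.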
