
Question about running the code #19

@HenryFordham

Hello author, I ran your code as described and downloaded the various large models as well as your model.

However, running it produces the error below. I'm not sure what's going on; it looks like something goes wrong in the model's encode step. If you know the cause, please advise. Thank you.

sampling 50 steps using ddpm sampler
Traceback (most recent call last):
  File "inference_partition.py", line 160, in <module>
    main()
  File "inference_partition.py", line 141, in main
    preds, bpp = process(
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "inference_partition.py", line 56, in process
    "c_crossattn": [model.get_learned_conditioning([""] * n_samples)]
  File "/root/autodl-tmp/DiffEIC/ldm/models/diffusion/ddpm.py", line 677, in get_learned_conditioning
    c = self.cond_stage_model.encode(c)
  File "/root/autodl-tmp/DiffEIC/ldm/modules/encoders/modules.py", line 236, in encode
    return self(text)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/root/autodl-tmp/DiffEIC/ldm/modules/encoders/modules.py", line 213, in forward
    z = self.encode_with_transformer(tokens.to(next(self.model.parameters()).device))
  File "/root/autodl-tmp/DiffEIC/ldm/modules/encoders/modules.py", line 220, in encode_with_transformer
    x = self.text_transformer_forward(x, attn_mask=self.model.attn_mask)
  File "/root/autodl-tmp/DiffEIC/ldm/modules/encoders/modules.py", line 232, in text_transformer_forward
    x = r(x, attn_mask=attn_mask)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/open_clip/transformer.py", line 263, in forward
    x = q_x + self.ls_1(self.attention(q_x=self.ln_1(q_x), k_x=k_x, v_x=v_x, attn_mask=attn_mask))
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/open_clip/transformer.py", line 250, in attention
    return self.attn(
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/modules/activation.py", line 1275, in forward
    attn_output, attn_output_weights = F.multi_head_attention_forward(
  File "/root/miniconda3/envs/mfr/lib/python3.8/site-packages/torch/nn/functional.py", line 5438, in multi_head_attention_forward
    raise RuntimeError(f"The shape of the 2D attn_mask is {attn_mask.shape}, but should be {correct_2d_size}.")
RuntimeError: The shape of the 2D attn_mask is torch.Size([77, 77]), but should be (1, 1).
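For reference, here is a minimal sketch I put together (not from the repo; the dimensions are made up, only the sequence length matters). F.multi_head_attention_forward expects a 2D attn_mask of shape (L, S) where L is the query sequence length, so a (77, 77) mask being rejected in favor of (1, 1) suggests the token tensor reaches the attention layer with sequence length 1, i.e. in the wrong layout:

    # Minimal reproduction of the shape check behind the error above (independent of DiffEIC).
    # Toy dimensions; embed_dim / num_heads do not affect the check.
    import torch
    import torch.nn as nn

    attn = nn.MultiheadAttention(embed_dim=64, num_heads=4)       # batch_first=False: expects (L, N, E)
    mask = torch.empty(77, 77).fill_(float("-inf")).triu_(1)      # CLIP-style causal mask over 77 tokens

    ok = torch.randn(77, 1, 64)          # sequence-first layout: L=77 matches the (77, 77) mask
    attn(ok, ok, ok, attn_mask=mask)     # works

    bad = torch.randn(1, 77, 64)         # batch-first tensor fed to a sequence-first module: L=1
    try:
        attn(bad, bad, bad, attn_mask=mask)
    except RuntimeError as e:
        print(e)  # The shape of the 2D attn_mask is torch.Size([77, 77]), but should be (1, 1).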
