
Inference Input Dimension Requirements #26

@zhaoyuanyuan2011

Description


I'm trying to run inference using my own input images and masks. Here's the command:

python test.py --batch_size 1 --dataroot . --pf_warp_checkpoint checkpoints/dmvton_pf_warp.pt --pf_gen_checkpoint checkpoints/dmvton_pf_gen.pt

And here's the error message:

Traceback (most recent call last):
  File "/home/DM-VTON/test.py", line 155, in <module>
    main(opt)
  File "/home/DM-VTON/test.py", line 142, in main
    run_test_pf(
  File "/home/DM-VTON/test.py", line 50, in run_test_pf
    p_tryon, warped_cloth = pipeline(real_image, clothes, edge, phase="test")
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/DM-VTON/pipelines/dmvton_pipeline.py", line 41, in forward
    flow_out = self.warp_model(person, clothes, phase=phase)
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/DM-VTON/models/warp_modules/mobile_afwm.py", line 331, in forward
    x_warp, last_flow = self.aflow_net(
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/DM-VTON/models/warp_modules/mobile_afwm.py", line 258, in forward
    concat = torch.cat([x_warp, x_cond], 1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 31 but got size 28 for tensor number 1 in the list.

I have used nn.functional.interpolate to resize the mask so it has the same height and width as the cloth image.
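The mismatch (31 vs. 28 in `torch.cat`) suggests the person and cloth tensors entered the warp network's feature pyramid at different spatial sizes, so their downsampled feature maps no longer line up. Resizing the mask to match the cloth is not enough if the person image is a different size. A minimal sketch of normalizing all three inputs to one resolution before inference — assuming 256×192 (the VITON resolution DM-VTON models are commonly trained at); `resize_inputs` is a hypothetical helper, not part of the repo:

```python
import torch
import torch.nn.functional as F

def resize_inputs(person, clothes, mask, size=(256, 192)):
    """Resize all inputs (N, C, H, W) to one common resolution so the
    warp module's feature pyramid produces matching shapes per level."""
    person = F.interpolate(person, size=size, mode='bilinear', align_corners=False)
    clothes = F.interpolate(clothes, size=size, mode='bilinear', align_corners=False)
    # nearest keeps the mask binary instead of blending edge values
    mask = F.interpolate(mask, size=size, mode='nearest')
    return person, clothes, mask

# example: raw inputs with mismatched sizes
person = torch.randn(1, 3, 500, 375)
clothes = torch.randn(1, 3, 448, 336)
mask = torch.rand(1, 1, 448, 336)
person, clothes, mask = resize_inputs(person, clothes, mask)
print(person.shape, clothes.shape, mask.shape)
```

Resizing to the checkpoint's training resolution (rather than an arbitrary common size) also avoids off-by-a-few feature-map sizes from odd dimensions being halved repeatedly in the encoder.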
