Load model...
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Loading checkpoint shards: 0%| | 0/4 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/Users/jiayueyuan/Downloads/UniPose/inference.py", line 204, in
main(args)
File "/Users/jiayueyuan/Downloads/UniPose/inference.py", line 122, in main
model, image_processor = load_pretrained_model(
File "/Users/jiayueyuan/Downloads/UniPose/inference.py", line 39, in load_pretrained_model
model = PoseGPTFullMask.from_pretrained(
File "/opt/miniconda3/envs/unipose/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3850, in from_pretrained
) = cls._load_pretrained_model(
File "/opt/miniconda3/envs/unipose/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4284, in _load_pretrained_model
new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
File "/opt/miniconda3/envs/unipose/lib/python3.10/site-packages/transformers/modeling_utils.py", line 805, in _load_state_dict_into_meta_model
set_module_tensor_to_device(model, param_name, param_device, **set_module_kwargs)
File "/opt/miniconda3/envs/unipose/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 298, in set_module_tensor_to_device
raise ValueError(
ValueError: Trying to set a tensor of shape torch.Size([32000, 4096]) in "weight" (which has shape torch.Size([34132, 4096])), this looks incorrect.
How can I solve this?
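From the error it looks like the checkpoint shards still contain the base-size embedding (32000 rows), while the model built from the config expects the enlarged vocabulary (34132 rows) that includes the added special tokens. For reference, this is my understanding of how the embedding is normally grown after adding special tokens; a minimal sketch assuming a generic Hugging Face checkpoint and made-up token names, not the actual UniPose weights:

```python
# Minimal sketch, assuming a generic Hugging Face checkpoint and placeholder
# special tokens (the base model name and token strings below are illustrative,
# not the actual UniPose backbone or vocabulary).
from transformers import AutoModelForCausalLM, AutoTokenizer

base = "meta-llama/Llama-2-7b-hf"  # placeholder backbone

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Register the extra special tokens, then grow the embedding matrix to match
# the new vocabulary size. The added rows are randomly initialized, which is
# what the "Special tokens have been added..." warning refers to.
tokenizer.add_special_tokens({"additional_special_tokens": ["<pose>", "</pose>"]})
model.resize_token_embeddings(len(tokenizer))

print(model.get_input_embeddings().weight.shape)  # rows now equal len(tokenizer)
```

Do I need to resize or re-export the weights myself along these lines, or is there a UniPose checkpoint whose embedding already has the 34132 rows?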