Hello NV team,
Thanks a lot for your project!
I tried to export the VAD module to ONNX, and followed the guide to set up the environment and download the datasets and checkpoints.
However, while exporting the ONNX with

python export_no_prev.py /workspace/VAD/projects/configs/VAD/VAD_tiny_stage_2.py /workspace/VAD/ckpts/VAD_tiny.pth --launcher none --eval bbox --tmpdir tmp

I hit the following issue:
TypeError: z(): incompatible function arguments. The following argument types are supported:
1. (self: torch._C.Node, arg0: str, arg1: torch.Tensor) -> torch._C.Node
Invoked with: %1510 : Tensor = onnx::Constant(), scope:
bev_deploy.patch.inspect.AutoInspectModule::/projects.mmdet3d_plugin.VAD.VAD_transformer.VADPerceptionTransformer::transformer/projects.mmdet3d_plugin.VAD.modules.encoder.BEVFormerEncoder::encoder/projects.mmdet3d_plugin.VAD.modules.encoder.BEVFormerLayer::layers.0/projects.mmdet3d_plugin.VAD.modules.temporal_self_attention.TemporalSelfAttention::attentions.0
, 'value', 64
(Occurred when translating PythonOp).
That is, g.op() is being handed the Python int 64 where a tensor-valued argument is expected. After updating AV-Solutions/vad-trt/export_eval/bev_deploy/patch/bevformer/ms_deform_attn.py as follows:
# Fix: ensure all args are proper tensor nodes for ONNX
processed_args = []
for arg in args:
    if isinstance(arg, (int, float)):  # Python scalar rather than a graph value
        # Promote the scalar to an onnx::Constant tensor node
        processed_args.append(g.op("Constant", value_t=torch.tensor(arg)))
    else:
        processed_args.append(arg)
return g.op("custom_op::MultiScaleDeformableAttentionPlugin", *processed_args)
With that change, the TypeError went away and the export completed.
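I believe the error shown below comes from the ONNX checker; a minimal way to reproduce it would be something like the following (the model file name is a placeholder, not necessarily what export_no_prev.py writes):

import onnx

# Structural check on the exported graph; a call like this reproduces
# the ValidationError shown below (model path is illustrative).
model = onnx.load("vad_tiny_no_prev.onnx")
onnx.checker.check_model(model)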
However, the exported ONNX cannot be used for inference; the checker fails with:
onnx.onnx_cpp2py_export.checker.ValidationError: Nodes in a graph must be topologically sorted, however input '/positional_encoding/Unsqueeze_output_0' of node:
name: /positional_encoding/Expand OpType: Expand
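One workaround I am considering, but have not verified, is to topologically re-sort the exported graph with onnx-graphsurgeon (file names are placeholders):

import onnx
import onnx_graphsurgeon as gs

# Re-order the nodes so every node comes after the nodes producing its
# inputs, then save the sorted model (paths are illustrative).
graph = gs.import_onnx(onnx.load("vad_tiny_no_prev.onnx"))
graph.toposort()
graph.cleanup()
onnx.save(gs.export_onnx(graph), "vad_tiny_no_prev_sorted.onnx")

This would only reorder the existing nodes, though; it does not explain why the exporter emitted them out of order in the first place, so the patch above may not be the right fix.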
Could you please provide some guidance?
Thanks,
Kevin