ONNX Model Support #7

@ebowwa

Description

Feature Request: ONNX Model Support

Enable CoreMLPlayer to load and run ONNX (Open Neural Network Exchange) models in addition to CoreML models.

Motivation

Bridge the training → deployment pipeline:

  • Train models on GPU using edge-training platform
  • Export to ONNX format
  • Load directly in CoreMLPlayer for testing/validation
  • Deploy to iOS/Mac after validation

Benefits:

  • Cross-platform model compatibility
  • Access to wider model ecosystem (HuggingFace, PyTorch Hub)
  • Eliminates manual conversion steps during development
  • Aligns with existing edge-training export capabilities (NCNN, ONNX, CoreML)

Proposed Implementation

Option A: Runtime Conversion (Recommended)

```swift
// On import, convert ONNX → CoreML
// (convertONNXtoCoreML is the proposed converter, to be implemented)
func loadONNXModel(url: URL) throws -> VNCoreMLModel {
    let coremlModel = try convertONNXtoCoreML(url)
    return try VNCoreMLModel(for: coremlModel)
}
```

Pros:

  • Native ANE/GPU acceleration
  • No runtime dependencies
  • Consistent performance with pure CoreML

Cons:

  • Initial conversion delay
  • Some ONNX ops may not convert
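Whatever converter Option A uses, the resulting `.mlmodel` file still has to be compiled before Core ML can load it. A minimal sketch of that post-conversion step using Apple's `MLModel.compileModel(at:)` (the converter itself is out of scope here and assumed to have already written a `.mlmodel` to disk):

```swift
import CoreML
import Vision

// Sketch: load a converted .mlmodel for use with Vision.
// Core ML requires compiling .mlmodel → .mlmodelc before loading.
func loadConvertedModel(at mlmodelURL: URL) throws -> VNCoreMLModel {
    let compiledURL = try MLModel.compileModel(at: mlmodelURL)
    let model = try MLModel(contentsOf: compiledURL)
    return try VNCoreMLModel(for: model)
}
```

The compiled `.mlmodelc` lands in a temporary directory, so CoreMLPlayer would likely want to move it somewhere persistent to avoid recompiling on every import.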

Option B: Direct ONNX Runtime

```swift
// Run ONNX directly via an ONNX Runtime session
// (ONNXSession is a proposed wrapper type, to be implemented)
func loadONNXModel(url: URL) throws -> ONNXSession {
    return try ONNXSession(modelPath: url.path)
}
```

Pros:

  • No conversion overhead
  • Support for latest ONNX ops

Cons:

  • No ANE acceleration
  • Additional dependency
  • Slower inference

Acceptance Criteria

  • Load .onnx files via CoreMLModelView
  • Auto-convert to CoreML format on import (Option A)
  • Run inference with same DetectionView overlay
  • Display model metadata (ONNX opset version, inputs/outputs)
  • Handle conversion errors gracefully with user feedback
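For the metadata criterion, the input/output portion is straightforward to surface from Core ML once conversion succeeds; a sketch using `MLModelDescription` (note: the ONNX opset version is not preserved through conversion, so it would need to be read from the `.onnx` file before converting):

```swift
import CoreML

// Sketch: collect I/O metadata from a converted, compiled model for display
func describeModel(at compiledURL: URL) throws -> [String] {
    let model = try MLModel(contentsOf: compiledURL)
    let desc = model.modelDescription
    let inputs = desc.inputDescriptionsByName.map { "input \($0.key): \($0.value.type)" }
    let outputs = desc.outputDescriptionsByName.map { "output \($0.key): \($0.value.type)" }
    return inputs + outputs
}
```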
