feat: Real-time object detection with Meta Glasses (relay) #4

@ebowwa

Feature: Real-time Object Detection with Meta Glasses

Integrate CoreMLPlayer with the Meta Glasses Relay to run real-time computer vision on the live video stream.

The Vision

Meta Glasses → iPhone → Mac (CoreMLPlayer) → Display with overlays

Use Cases

  1. Hands-free AR - Wear glasses, see live detection on Mac display
  2. Accessibility - Real-time scene description/object localization for blind/low-vision
  3. Professional Tools - Technicians/doctors performing inspections with live CV feedback
  4. Research/Testing - Test models in real environments, collect training data

Technical Approach

  1. Use MacReceiver to get frames from Meta glasses
  2. Pass frames to CoreMLPlayer's detection pipeline
  3. Render detection boxes/labels over live video
  4. Add model selection (YOLO, SAM2, etc.)
  5. Optional: recording, annotation, audio feedback
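
Steps 1-3 above could be wired together roughly as follows. This is a minimal sketch: `MacReceiver` exists on the `mac-relay-yolo` branch, but its frame-callback API here is a hypothetical placeholder, and the Vision request stands in for CoreMLPlayer's existing detection pipeline.

```swift
import Vision
import CoreML

// Sketch: feed relayed frames into a Vision/Core ML detection request.
// The `onResults` callback would hand boxes to the overlay renderer.
final class RelayDetector {
    private let request: VNCoreMLRequest

    init(model: VNCoreMLModel,
         onResults: @escaping ([VNRecognizedObjectObservation]) -> Void) {
        request = VNCoreMLRequest(model: model) { req, _ in
            let boxes = req.results as? [VNRecognizedObjectObservation] ?? []
            onResults(boxes)
        }
        request.imageCropAndScaleOption = .scaleFill
    }

    // Call once per frame received from the iPhone relay
    // (e.g. from a MacReceiver frame callback).
    func process(_ pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                            orientation: .up)
        try? handler.perform([request])
    }
}
```

Model selection (step 4) then reduces to constructing a new `VNCoreMLModel` and swapping the request.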

Existing Infrastructure

  • ✅ Meta glasses relay (mac-relay-yolo branch)
  • ✅ CoreMLPlayer detection pipeline
  • ✅ Video rendering system
  • ✅ Model loading/switching

Main Challenges

  • Latency - Multi-hop adds delay (glasses→iPhone→Mac)
  • Performance - Real-time detection needs to be fast
  • UI - Overlaying detection boxes/labels cleanly on the live video feed
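
For the latency and performance challenges, one common mitigation is to never queue frames: if detection is still busy when a new relayed frame arrives, replace the pending frame so inference always runs on the newest one. A hedged sketch (type and method names are illustrative, not from the relay code):

```swift
import Dispatch
import CoreVideo

// Drop-stale-frames strategy: detection always sees the latest frame;
// any frame that arrives while inference is busy overwrites the pending one.
final class LatestFrameGate {
    private let queue = DispatchQueue(label: "detection")
    private var pending: CVPixelBuffer?
    private var busy = false

    func submit(_ frame: CVPixelBuffer,
                detect: @escaping (CVPixelBuffer) -> Void) {
        queue.async {
            self.pending = frame              // newest frame wins
            guard !self.busy else { return }  // detector will pick it up
            self.busy = true
            self.drain(detect)
        }
    }

    private func drain(_ detect: @escaping (CVPixelBuffer) -> Void) {
        guard let frame = pending else { busy = false; return }
        pending = nil
        detect(frame)   // synchronous inference on the detection queue
        drain(detect)   // loop until no newer frame is pending
    }
}
```

This keeps end-to-end latency bounded by one inference time plus the relay hops, at the cost of a lower effective detection frame rate.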
