Smart Glasses Demo w/ CoreViz SDK
Visual Personal Intelligence for Smart Glasses
A demo that captures moments automatically and uploads them to CoreViz for intelligent visual processing and personal super-memory. Features include search, object and face detection, visual question answering (VQA), and image similarity.
The demo app simulates syncing with smart glasses by capturing frames at regular intervals and uploading them.
- Automatic Frame Capture: Captures photos every 10 seconds when recording
- High-Quality Processing: 0.8 quality JPEG with full EXIF metadata
- Orientation Aware: Preserves camera orientation data for proper display
- Media Library Integration: Automatically saves captured frames locally
- Timeline View: Chronological organization of captured moments
- Memory Lane: Browse through your visual history by date
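A rough sketch of the 10-second capture loop described above, assuming expo-camera's `CameraView` ref and expo-media-library (the real logic lives in `app/(tabs)/index.tsx` and may be organized differently):

```typescript
import { useEffect, type RefObject } from 'react';
import { CameraView } from 'expo-camera';
import * as MediaLibrary from 'expo-media-library';

const CAPTURE_INTERVAL_MS = 10_000; // one frame every 10 seconds while recording

// Hypothetical hook: assumes camera and media library permissions are already granted
export function useAutoCapture(cameraRef: RefObject<CameraView | null>, isRecording: boolean) {
  useEffect(() => {
    if (!isRecording) return;

    const timer = setInterval(async () => {
      // JPEG at 0.8 quality with the EXIF block preserved (orientation, timestamps, ...)
      const photo = await cameraRef.current?.takePictureAsync({ quality: 0.8, exif: true });
      if (!photo) return;

      // Keep a local copy of the captured frame in the media library
      await MediaLibrary.saveToLibraryAsync(photo.uri);
    }, CAPTURE_INTERVAL_MS);

    return () => clearInterval(timer);
  }, [isRecording, cameraRef]);
}
```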
Seamless integration through uploads to a smart collection (requires a CoreViz account).
- Object detection, face detection, clustering
- Indexing of images for quick search and RAG question answering
- Once images are uploaded, the CoreViz interface allows searching, tagging, answering questions and finding similar shots with Visual AI
- Metadata Rich Uploads: Includes timestamp, dimensions, and EXIF data
- Batch Processing: Upload multiple frames efficiently
- Real-time Status: Track upload progress and success/failure states
- Cross-Platform: iOS, Android, and Web support via Expo
- Responsive Design: Adaptive UI for different screen sizes
- Haptic Feedback: Enhanced user interaction with tactile responses
- Dark/Light Mode: Automatic theme adaptation
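Theme adaptation, for instance, is handled by the themed components listed in the project structure (`components/ThemedText.tsx`); a minimal sketch of such a component, assuming React Native's `useColorScheme` hook (the actual component may differ):

```tsx
import { Text, useColorScheme, type TextProps } from 'react-native';

// Illustrative themed text: picks a text color from the current system theme
export function ThemedText({ style, ...rest }: TextProps) {
  const scheme = useColorScheme(); // 'light' | 'dark' | null
  const color = scheme === 'dark' ? '#ECEDEE' : '#11181C';

  return <Text style={[{ color }, style]} {...rest} />;
}
```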
- Node.js (v18 or later)
- Expo CLI (`npm install -g @expo/cli`)
- CoreViz.io account for API credentials
1. Clone the repository

   ```bash
   git clone https://github.com/wassgha/ai-glasses.git
   cd glasses.coreviz
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Start the development server

   ```bash
   npx expo start
   ```

4. Run on your device
   - iOS: Press `i` or scan the QR code with the Camera app
   - Android: Press `a` or scan the QR code with Expo Go
   - Web: Press `w` to open in the browser
To connect the app to your CoreViz account:

1. Get Your Credentials
   - Sign up at CoreViz.io
   - Navigate to Settings → API Keys
   - Create a new API key
   - Note your Entity ID from your dataset

2. Configure the App
   - Open the app and go to the Camera tab
   - Tap the ⚙️ Settings button
   - Enter your API Key and Entity ID
   - Tap Save to enable automatic uploads
Update `config/coreviz.ts` with your settings:

```typescript
export const COREVIZ_CONFIG = {
  API_BASE_URL: 'https://your-coreviz-endpoint.com',
  API_KEY: '', // Set via app UI
  ENTITY_ID: '', // Set via app UI
  UPLOAD_SETTINGS: {
    MAX_FILE_SIZE: 10 * 1024 * 1024, // 10MB
    SUPPORTED_FORMATS: ['image/jpeg', 'image/png', 'image/webp'],
    QUALITY: 0.8,
  }
};
```

- Grant Permissions: Allow camera and media library access
- Configure CoreViz: Set up your API credentials in settings
- Start Recording: Tap the record button in the Camera tab
- Automatic Capture: Frames are captured every 10 seconds
- Upload Process: Frames are automatically uploaded to CoreViz
- Browse Moments: Use the Timeline tab to view captured frames
- Date Organization: Frames are grouped by capture date
- Upload Status: See real-time upload progress and status
- Batch Operations: Upload multiple frames at once
- Countdown Timer: Visual indicator for next capture
- Upload Progress: Real-time progress tracking
- Error Handling: Automatic retry on failed uploads
- Memory Management: Keeps last 50 frames in memory
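The retry behaviour could look roughly like this sketch, assuming a hypothetical `uploadFrame` helper in `services/corevizUpload.ts` that throws on failure (the attempt count is illustrative):

```typescript
const MAX_UPLOAD_ATTEMPTS = 3; // illustrative retry budget

// Retry a failed upload a few times before marking the frame as failed.
async function uploadWithRetry(
  frameUri: string,
  uploadFrame: (uri: string) => Promise<void>, // hypothetical helper from services/corevizUpload.ts
): Promise<'uploaded' | 'failed'> {
  for (let attempt = 1; attempt <= MAX_UPLOAD_ATTEMPTS; attempt++) {
    try {
      await uploadFrame(frameUri);
      return 'uploaded';
    } catch (err) {
      if (attempt === MAX_UPLOAD_ATTEMPTS) {
        console.warn('Upload failed after retries', err);
      }
    }
  }
  return 'failed';
}
```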
```
app/
├── (tabs)/
│   ├── index.tsx              # Camera interface and recording
│   ├── explore.tsx            # Timeline and frame browser
│   └── _layout.tsx            # Tab navigation
├── _layout.tsx                # App root layout
└── +not-found.tsx             # 404 handling
components/
├── CorevizConfigModal.tsx     # API configuration UI
├── ParallaxScrollView.tsx     # Smooth scrolling interface
├── ThemedText.tsx             # Themed text components
└── ThemedView.tsx             # Themed view components
contexts/
└── FramesContext.tsx          # Global state management
services/
└── corevizUpload.ts           # CoreViz API integration
config/
└── coreviz.ts                 # API configuration
```
- Capture: Camera captures frame with metadata
- Store: Frame added to local context with timestamp
- Process: EXIF data extracted and preserved
- Upload: Frame uploaded to CoreViz with rich metadata
- Track: Upload status tracked and displayed
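A condensed sketch of how this flow might map onto the shared state in `contexts/FramesContext.tsx`, including the 50-frame memory cap mentioned earlier (field names are illustrative; the real context may differ):

```tsx
import { createContext, useContext, useState, type ReactNode } from 'react';

// Illustrative frame record and store shape
interface Frame {
  uri: string;
  capturedAt: number;
  exif?: Record<string, unknown>;
  uploadStatus: 'pending' | 'uploading' | 'uploaded' | 'failed';
}

interface FramesState {
  frames: Frame[];
  addFrame: (frame: Frame) => void;
  setStatus: (uri: string, status: Frame['uploadStatus']) => void;
}

const FramesContext = createContext<FramesState | null>(null);

export function FramesProvider({ children }: { children: ReactNode }) {
  const [frames, setFrames] = useState<Frame[]>([]);

  // Store: append the new frame and keep only the most recent 50 in memory
  const addFrame = (frame: Frame) =>
    setFrames((prev) => [...prev, frame].slice(-50));

  // Track: update upload status so the Timeline can show progress
  const setStatus = (uri: string, status: Frame['uploadStatus']) =>
    setFrames((prev) => prev.map((f) => (f.uri === uri ? { ...f, uploadStatus: status } : f)));

  return (
    <FramesContext.Provider value={{ frames, addFrame, setStatus }}>
      {children}
    </FramesContext.Provider>
  );
}

export const useFrames = () => {
  const ctx = useContext(FramesContext);
  if (!ctx) throw new Error('useFrames must be used inside FramesProvider');
  return ctx;
};
```

Screens such as the Timeline tab can then call `useFrames()` to render frames grouped by capture date.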
```bash
# Development
npm start              # Start Expo development server
npm run android        # Run on Android emulator
npm run ios            # Run on iOS simulator
npm run web            # Run in web browser

# Utilities
npm run lint           # Run ESLint
npm run reset-project  # Reset to clean state
```

- Expo SDK 53: Cross-platform development framework
- React Native 0.79: Mobile app framework
- TypeScript: Type-safe development
- Expo Camera: Camera API with EXIF support
- Expo Media Library: Local media storage
- React Navigation: Tab-based navigation
The app integrates with CoreViz.io using multipart form uploads:

```typescript
// Upload with rich metadata
const formData = new FormData();
formData.append('file', {
  uri: frameUri,
  type: 'image/jpeg',
  name: `glasses-frame-${timestamp}.jpg`,
});
formData.append('entityId', COREVIZ_CONFIG.ENTITY_ID);
formData.append('authToken', COREVIZ_CONFIG.API_KEY);
formData.append('exif', JSON.stringify({
  ...cameraExif,
  capturedAt: timestamp,
  source: 'wearable-glasses',
  deviceType: 'mobile-camera',
}));
```
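A minimal sketch of actually POSTing that payload. The `/upload` path is illustrative rather than a documented CoreViz endpoint; the real request lives in `services/corevizUpload.ts`:

```typescript
import { COREVIZ_CONFIG } from '../config/coreviz';

// Hypothetical helper that sends the multipart payload built above
async function sendToCoreviz(formData: FormData): Promise<void> {
  const response = await fetch(`${COREVIZ_CONFIG.API_BASE_URL}/upload`, {
    method: 'POST',
    body: formData, // fetch sets the multipart boundary header automatically
  });

  if (!response.ok) {
    throw new Error(`CoreViz upload failed with status ${response.status}`);
  }
}
```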
- Local Storage: Frames stored locally until upload
- Secure Upload: HTTPS-only API communication
- Credential Management: Secure storage of API keys
- Data Retention: Configurable frame retention policies
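For the credential management point above, one option is Expo's SecureStore; this is a hedged sketch and not necessarily how the app persists keys today:

```typescript
import * as SecureStore from 'expo-secure-store';

// Illustrative keys; persist what the user enters in the settings modal
export async function saveCorevizCredentials(apiKey: string, entityId: string) {
  await SecureStore.setItemAsync('coreviz_api_key', apiKey);
  await SecureStore.setItemAsync('coreviz_entity_id', entityId);
}

export async function loadCorevizCredentials() {
  return {
    apiKey: await SecureStore.getItemAsync('coreviz_api_key'),
    entityId: await SecureStore.getItemAsync('coreviz_entity_id'),
  };
}
```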
We welcome contributions! Please follow these steps:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Follow TypeScript best practices
- Use semantic commit messages
- Add tests for new features
- Update documentation as needed
This project is licensed under the MIT License - see the LICENSE file for details.
- CoreViz.io for providing the visual intelligence platform
- Expo Team for the excellent development framework
- React Native Community for continuous innovation
Built with ❤️ for the future of visual intelligence

