The Intelligent Framework for World-Building and Narrative Intelligence
Novellis is a privacy-first, local-LLM-powered workspace designed for storytellers, world-builders, and narrative designers. Transform static manuscripts into interactive, AI-powered knowledge bases while keeping your creative work entirely on your own machine.
Download Latest Release | Landing Page | Discussions
Important
Beta Release: Novellis is currently in public beta. We are actively refining our narrative engines and improving stability. Your feedback and bug reports are invaluable at this stage.
- Local-First Narrative AI: Seamlessly connect to Ollama for a 100% private, offline experience, or use cloud providers like OpenAI/Anthropic for intensive analysis.
- Narrative Intelligence & Timeline: Automatically visualize story arcs and temporal sequences across your entire novel.
- Visual Knowledge Graph: Map and explore complex relationships between characters, locations, and events in a dynamic graph view.
- AI Copilot & Ingestion: Ingest manuscripts to create a "brain" for your story that understands characters, plot points, and narrative consistency.
- Privacy-First Architecture: Your work stays where it belongs: on your hardware. No cloud sync required for core features.
Novellis is built with a commitment to the writing community:
- Free Community Edition: Professional-grade story-building tools will remain free forever for individuals.
- Professional Features: Deep-analysis tools (large-scale consistency checks, narrative arc clustering) are available in our Pro tier to support the project's continued development.
- Visit the Releases page.
- Download the installer for your system:
- macOS (arm64): For Apple Silicon (M1/M2/M3/M4).
- macOS (x64): For Intel-based Macs.
- Windows (x64): Standard Windows installer.
- Security Notice: Novellis is a community-built tool, so our binaries are currently unsigned.
- macOS: Right-click the app and select "Open" to bypass Gatekeeper.
- Windows: Click "More info" -> "Run anyway" if prompted by SmartScreen.
- (Optional) Install Ollama to run AI models locally for maximum privacy.
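If you do install Ollama, it can help to confirm the local server is reachable before pointing Novellis at it. The following is a minimal sketch (Python, standard library only) that queries Ollama's documented REST API at its default local address, `http://localhost:11434`, and lists any models you have pulled; adjust the URL if your Ollama instance runs elsewhere.

```python
# Minimal sketch: check that a local Ollama server is reachable and list its models.
# Assumes Ollama's default local endpoint (http://localhost:11434); change if needed.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of models available to the local Ollama install."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        payload = json.load(resp)
    return [model["name"] for model in payload.get("models", [])]

if __name__ == "__main__":
    try:
        models = list_local_models()
        print("Ollama is running. Local models:", ", ".join(models) or "(none pulled yet)")
    except OSError:
        print(f"Could not reach Ollama at {OLLAMA_URL} - is it installed and running?")
```

If the script reports no models, pull one with the Ollama CLI (for example, `ollama pull llama3`) before connecting Novellis for fully offline use.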
- Found a bug? Open an Issue.
- Have an idea? Join the Feature Requests discussion.
- Need help? Check our documentation on the Landing Page.
© 2026 ByteYI Labs. Built with ❤️ for storytellers.