# Devonz

Devonz is an AI development agent that helps you build full-stack applications through natural-language conversation. Originally built on bolt.diy, Devonz focuses on speed, efficiency, and a streamlined development experience.
## Table of Contents

| Section | Description |
|---|---|
| Key Features | Core capabilities and highlights |
| Tech Stack | Technologies used in the project |
| Installation | Getting started guide |
| Configuration | Environment variables and settings |
| AI Providers | All 19 supported AI providers |
| Project Structure | Codebase organization |
| Available Scripts | Development and build commands |
| Settings | App settings and features |
| Contributing | How to contribute |
## Key Features

### AI Capabilities

| Feature | Description |
|---|---|
| Natural Language Building | Describe what you want to build, and Devonz creates it |
| Multi-Provider Support | 19 AI providers including OpenAI, Anthropic, Google, Groq, Ollama, and more |
| Model Context Protocol (MCP) | Extend Devonz capabilities with MCP tools |
| Auto-Fix | Automatic error detection and fixing with terminal error detector |
### Development Environment

| Feature | Description |
|---|---|
| In-Browser Development | Full development environment powered by WebContainers |
| Real-time Preview | Instant preview of your applications |
| Terminal Access | Full terminal access within the browser |
| Code Editor | Integrated CodeMirror editor with syntax highlighting |
### Deployment Platforms

| Platform | Description |
|---|---|
| GitHub | Push directly to GitHub repositories |
| GitLab | Deploy to GitLab projects |
| Netlify | One-click deployment to Netlify |
| Vercel | Deploy to Vercel with ease |
### Integrations

| Integration | Description |
|---|---|
| Supabase | Database and authentication API routes (requires configuration) |
| Git | Built-in Git support for version control |
| Template Gallery | Pre-built templates for popular frameworks |
## Tech Stack

| Category | Technology |
|---|---|
| Framework | Remix + Vite |
| Language | TypeScript |
| Styling | UnoCSS + Tailwind CSS |
| UI Components | Radix UI, Headless UI |
| Animation | Framer Motion |
| AI SDK | Vercel AI SDK |
| Editor | CodeMirror |
| Terminal | xterm.js |
| WebContainers | StackBlitz WebContainer API |
## Installation

### Prerequisites

| Requirement | Version |
|---|---|
| Node.js | 18.18.0 or higher |
| pnpm | Latest (recommended) |
### Setup

1. **Clone the repository:**

   ```bash
   git clone -b stable https://github.com/zebbern/Devonz.git
   cd Devonz
   ```

2. **Install dependencies:**

   ```bash
   pnpm install
   ```

3. **Start the development server:**

   ```bash
   pnpm run dev
   ```

4. **Open in browser:** navigate to `http://localhost:5173`
## Configuration

### Environment Variables

Create a `.env.local` file in the project root:

```bash
# AI Provider API Keys
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key
GROQ_API_KEY=your_groq_key

# Local Provider URLs
OLLAMA_BASE_URL=http://127.0.0.1:11434
LMSTUDIO_BASE_URL=http://127.0.0.1:1234

# Deployment Integrations (Optional)
GITHUB_ACCESS_TOKEN=your_github_token
NETLIFY_AUTH_TOKEN=your_netlify_token
VERCEL_ACCESS_TOKEN=your_vercel_token
```

### In-App Setup

1. Click the Settings icon in the sidebar
2. Navigate to the Providers tab
3. Configure your preferred AI providers:
   - Cloud providers: OpenAI, Anthropic, Google, Groq, OpenRouter, etc.
   - Local providers: Ollama, LM Studio, OpenAI-compatible endpoints
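Before starting the dev server, it can help to confirm that the keys you expect are actually present. The sketch below is not part of Devonz — it simply parses a `.env.local`-style string and reports which of the documented cloud-provider keys are missing (key names taken from the example above):

```typescript
// Illustrative only: check a .env.local-style string for the provider
// keys documented above. Not part of the Devonz codebase.
const REQUIRED_KEYS = [
  "OPENAI_API_KEY",
  "ANTHROPIC_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
  "GROQ_API_KEY",
];

function missingKeys(envText: string, required: string[]): string[] {
  // Collect KEY names from KEY=value lines, ignoring comments and blanks.
  const present = new Set(
    envText
      .split("\n")
      .map((line) => line.trim())
      .filter((line) => line.length > 0 && !line.startsWith("#"))
      .map((line) => line.split("=")[0]),
  );
  return required.filter((key) => !present.has(key));
}

const example = `# AI Provider API Keys
OPENAI_API_KEY=your_openai_key
GROQ_API_KEY=your_groq_key`;

console.log(missingKeys(example, REQUIRED_KEYS));
// -> ["ANTHROPIC_API_KEY", "GOOGLE_GENERATIVE_AI_API_KEY"]
```

Only the keys for providers you actually enable need to be set; the check above is just a convenience.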
## AI Providers

### Cloud Providers

| Provider | Models | Features |
|---|---|---|
| OpenAI | GPT-4o, GPT-4 Turbo, GPT-3.5 | Chat, Vision |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus | Chat, Vision |
| Google | Gemini Pro, Gemini Ultra | Chat, Vision |
| Groq | LLaMA 3, Mixtral | Fast inference |
| OpenRouter | 100+ models | Model aggregation |
| Mistral | Mistral Large, Codestral | Chat, Code |
| Cohere | Command R+ | Chat, RAG |
| Deepseek | Deepseek Coder | Code generation |
| Amazon Bedrock | Claude, Titan | Enterprise |
| Together | Open source models | Chat, Code |
| Perplexity | Online models | Web search |
| HuggingFace | Open source models | Community models |
| xAI | Grok | Chat |
| GitHub | Copilot models | GitHub AI |
| Hyperbolic | Various models | Specialized inference |
| Moonshot | Moonshot models | Chinese LLM |
### Local Providers

| Provider | Description |
|---|---|
| Ollama | Run open-source models locally with model management |
| LM Studio | Local model inference with GUI |
| OpenAI-like | Any OpenAI-compatible API endpoint |
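All three local options speak the same wire format: an OpenAI-style `POST /v1/chat/completions`. As a rough sketch (not Devonz code — the endpoint path follows the OpenAI convention that Ollama and LM Studio both expose, and the model name is illustrative), a request to a local instance could be assembled like this:

```typescript
// Sketch of an OpenAI-compatible chat request against a local endpoint.
// Base URL matches the .env.local example; "llama3" is illustrative.
interface ChatRequest {
  url: string;
  body: {
    model: string;
    messages: { role: "system" | "user" | "assistant"; content: string }[];
  };
}

function buildChatRequest(baseUrl: string, model: string, prompt: string): ChatRequest {
  return {
    // OpenAI-compatible servers (Ollama, LM Studio, etc.) expose this path.
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    body: {
      model,
      messages: [{ role: "user", content: prompt }],
    },
  };
}

const req = buildChatRequest("http://127.0.0.1:11434", "llama3", "Hello!");
console.log(req.url); // -> http://127.0.0.1:11434/v1/chat/completions

// To actually send it (requires a running server):
// await fetch(req.url, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(req.body),
// });
```

Because the format is shared, pointing Devonz's "OpenAI-like" provider at any server that implements this endpoint should work the same way.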
## Project Structure

```
Devonz/
├── app/
│   ├── components/      # React components
│   │   ├── @settings/   # Settings panel (14 tabs)
│   │   ├── auth/        # Authentication components
│   │   ├── chat/        # Chat interface
│   │   ├── deploy/      # Deployment (GitHub, GitLab, Netlify, Vercel)
│   │   ├── editor/      # Code editor
│   │   ├── git/         # Git integration
│   │   ├── header/      # App header
│   │   ├── sidebar/     # Sidebar navigation
│   │   ├── ui/          # Shared UI components
│   │   └── workbench/   # Development workbench
│   ├── lib/             # Core libraries
│   │   ├── api/         # API utilities
│   │   ├── hooks/       # React hooks
│   │   ├── modules/     # Feature modules (llm with 19 providers)
│   │   ├── services/    # API services
│   │   ├── stores/      # State management (nanostores)
│   │   └── utils/       # Utility functions
│   ├── routes/          # Remix routes (39 API endpoints + pages)
│   ├── styles/          # Global styles
│   └── types/           # TypeScript types
├── docs/                # Documentation (mkdocs)
├── public/              # Static assets
├── scripts/             # Build scripts
└── supabase/            # Supabase folder (requires configuration)
```
## Available Scripts

### Development

| Command | Description |
|---|---|
| `pnpm run dev` | Start development server |
| `pnpm run build` | Build for production |
| `pnpm run start` | Run production build |
| `pnpm run preview` | Build and preview locally |

### Testing & Quality

| Command | Description |
|---|---|
| `pnpm test` | Run tests |
| `pnpm test:watch` | Run tests in watch mode |
| `pnpm run typecheck` | TypeScript type checking |
| `pnpm run lint` | ESLint check |
| `pnpm run lint:fix` | Auto-fix linting issues |

### Maintenance

| Command | Description |
|---|---|
| `pnpm run clean` | Clean build artifacts |
| `pnpm run prepare` | Set up husky git hooks |
## Settings

The settings panel is organized into 14 tabs:

| Tab | Description |
|---|---|
| Profile | User profile management |
| Providers | AI provider configuration (19 providers) |
| Features | Enable/disable features |
| MCP | Model Context Protocol tools |
| GitHub | GitHub integration settings |
| GitLab | GitLab integration settings |
| Netlify | Netlify deployment settings |
| Vercel | Vercel deployment settings |
| Supabase | Database integration settings |
| Event Logs | Application logs |
| Data | Import/export data |
| Notifications | Notification preferences |
| Project Memory | Project context storage |
| Settings | General settings |
### Model Context Protocol (MCP)

Devonz supports MCP tools for extending AI capabilities:
| Feature | Description |
|---|---|
| Custom MCP Servers | Configure custom MCP servers |
| Specialized Tools | Add specialized tools for your workflow |
| External Services | Extend AI reasoning with external services |
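MCP servers are typically registered with a launch command and arguments, following the convention used by most MCP clients. The snippet below is purely illustrative — the server name and workspace path are hypothetical, and Devonz's exact configuration schema (set through the MCP settings tab) may differ:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    }
  }
}
```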
## Updating

To update Devonz to the latest version:

```bash
# Save local changes
git stash

# Pull latest updates
git pull

# Update dependencies
pnpm install

# Restore local changes
git stash pop
```

If you run into dependency issues, try a clean reinstall:

```bash
# Remove dependencies
rm -rf node_modules pnpm-lock.yaml

# Clear cache
pnpm store prune

# Reinstall
pnpm install
```

## Contributing

We welcome contributions! Here's how to get started:
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit your changes: `git commit -m 'Add amazing feature'`
4. Push to the branch: `git push origin feature/amazing-feature`
5. Open a Pull Request
## Credits

| Credit | Description |
|---|---|
| bolt.diy | Original project foundation |
| StackBlitz WebContainers | In-browser development environment |
| Vercel AI SDK | AI capabilities |
## Links

| Link | URL |
|---|---|
| Repository | https://github.com/zebbern/Devonz |
| Original Project | https://github.com/stackblitz-labs/bolt.diy |
