Loadout

🚀 AI-powered project blueprint generator - Transform your ideas into complete React applications with comprehensive documentation, development roadmaps, and production-ready UI components.


✨ Overview

Loadout is a desktop application that uses advanced AI models to transform your project ideas into complete development blueprints. Simply describe what you want to build, and Loadout generates everything you need to start coding immediately.

What You Get

  • 📋 Product Requirements Document (PRD) - Detailed project specifications with goals, constraints, and success criteria
  • ✅ Development Checklist - Phase-based implementation roadmap with actionable tasks
  • 🧠 Technical Decisions Log - Documented assumptions and architectural choices
  • 🎨 Production-Ready UI - Complete React/Tailwind components with live preview
  • 💬 AI Chat Assistant - Refine and iterate on your generated components
  • 📦 Export Everything - Download your complete project as organized files

🎯 Key Features

Multi-Provider AI Support

  • OpenAI - GPT-4, GPT-4 Turbo, GPT-3.5 Turbo
  • Anthropic - Claude 3.5 Sonnet, Claude 3 Opus/Sonnet/Haiku
  • Ollama - Run models locally with no API key required

Advanced Capabilities

  • LangGraph Workflow - Sophisticated AI orchestration for consistent outputs
  • Streaming Responses - Real-time generation with progress tracking
  • Live UI Preview - Instant component rendering with hot reload
  • Chat Interface - Iterate on your UI with conversational AI
  • Project Management - Save, organize, and revisit all your ideas
  • Dark/Light Theme - Beautiful UI that's easy on the eyes

🚀 Quick Start

Prerequisites

  • Node.js 18+ and npm
  • At least one AI provider:
    • OpenAI API key (for GPT models)
    • Anthropic API key (for Claude models)
    • Ollama installed locally (for free local models)

Installation

# Clone the repository
git clone https://github.com/NoaheCampbell/loadout.git
cd loadout

# Install dependencies
npm install

# Start the application
npm run dev

First Run

  1. Click the settings icon to configure your AI provider
  2. Enter your API key (or select Ollama for local models)
  3. Create a new project and describe your idea
  4. Watch as Loadout generates your complete blueprint!

📖 How It Works

The LangGraph Workflow

graph LR
    A[Your Idea] --> B[Process & Refine]
    B --> C[Generate PRD]
    C --> D[Create Checklist]
    C --> E[Document Decisions]
    C --> F[Plan UI Architecture]
    D & E & F --> G[Generate Components]
    G --> H[Validate & Export]

Each step is powered by specialized AI agents that build upon previous outputs, ensuring consistency and completeness across all generated artifacts.
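
As a rough illustration of how such a workflow can be wired up, here is a minimal LangGraph sketch in TypeScript. The node names, state fields, and the draftPrd/draftChecklist/draftDecisions helpers are hypothetical stand-ins for Loadout's actual agents, not its real implementation:

import { StateGraph, Annotation, START, END } from "@langchain/langgraph";

// Hypothetical stand-ins for the LLM calls each agent would make.
const draftPrd = async (idea: string) => `# PRD for: ${idea}`;
const draftChecklist = async (prd: string) => `# Checklist derived from a ${prd.length}-char PRD`;
const draftDecisions = async (prd: string) => `# Decisions derived from a ${prd.length}-char PRD`;

// Shared state that each step reads from and extends.
const BlueprintState = Annotation.Root({
  idea: Annotation<string>(),
  prd: Annotation<string>(),
  checklist: Annotation<string>(),
  decisions: Annotation<string>(),
});

// PRD generation runs first; checklist and decisions branch off it.
const workflow = new StateGraph(BlueprintState)
  .addNode("generatePrd", async (s) => ({ prd: await draftPrd(s.idea) }))
  .addNode("generateChecklist", async (s) => ({ checklist: await draftChecklist(s.prd) }))
  .addNode("documentDecisions", async (s) => ({ decisions: await draftDecisions(s.prd) }))
  .addEdge(START, "generatePrd")
  .addEdge("generatePrd", "generateChecklist")
  .addEdge("generatePrd", "documentDecisions")
  .addEdge("generateChecklist", END)
  .addEdge("documentDecisions", END)
  .compile();

// await workflow.invoke({ idea: "A habit-tracking dashboard" })
// resolves once every branch has written its part of the blueprint.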

Workflow Visualization

Click the "Workflow" button in the header to see a detailed, interactive diagram of how Loadout processes your ideas. The visualization includes:

  • Real-time progress tracking during generation
  • Validation steps for each phase
  • Parallel processing indicators
  • Provider-specific features

πŸ—οΈ Project Structure

Your generated projects include:

your-project/
├── PRD.md                  # Product requirements document
├── CHECKLIST.md            # Development roadmap
├── DECISIONS.md            # Technical decisions log
├── src/
│   ├── App.tsx             # Main application component
│   ├── components/         # Generated UI components
│   ├── _setup.js           # Project setup instructions
│   └── index.html          # Preview HTML file
└── project.json            # Project metadata

πŸ“ Storage Locations

Projects are stored locally on your machine:

  • macOS: ~/Library/Application Support/Loadout/projects/
  • Windows: %APPDATA%/Loadout/projects/
  • Linux: ~/.config/Loadout/projects/
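
These are the default per-user data directories used by Electron-based desktop apps. Assuming Loadout resolves its storage location the same way (an assumption, not something this README states), the lookup would be roughly:

import { app } from "electron";
import * as path from "path";

// app.getPath("userData") maps to the per-OS locations listed above,
// e.g. ~/Library/Application Support/Loadout on macOS.
const projectsDir = path.join(app.getPath("userData"), "projects");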

πŸ› οΈ Development

# Run in development mode
npm run dev

# Build for production
npm run build

# Package for distribution
npm run dist

# Run tests
npm test

# Type checking
npm run typecheck

🔧 Configuration

AI Providers

Configure your preferred AI provider in Settings:

  1. OpenAI - Best for general-purpose generation
  2. Anthropic - Excellent for complex reasoning
  3. Ollama - Free local models, no internet required

Ollama Setup

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model
ollama pull llama2

# Loadout will auto-detect available models
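
Auto-detection can work against Ollama's local HTTP API, which lists installed models at GET /api/tags on the default port. A minimal TypeScript sketch (not necessarily how Loadout implements it):

// Query the local Ollama server for installed models.
const OLLAMA_URL = "http://localhost:11434";

async function listOllamaModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) throw new Error(`Ollama not reachable: HTTP ${res.status}`);
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name); // e.g. ["llama2:latest"]
}

// listOllamaModels().then(console.log).catch(console.error);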

🎨 Features in Detail

Chat Interface

  • Stream responses in real-time
  • Direct UI modifications - just ask for changes and they're applied automatically
  • Maintain context across conversations
  • Uses your currently selected AI model
  • Export chat history with projects

UI Generation

  • Multi-file component generation
  • Automatic import resolution
  • Tailwind CSS integration
  • Responsive design patterns
  • Accessibility considerations

Export Options

  • Complete project ZIP file
  • Individual file downloads
  • Copy-to-clipboard for quick sharing
  • Direct integration with code editors

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

Built with amazing open-source technologies, including React, Tailwind CSS, and LangGraph.

πŸ› Troubleshooting

Common Issues

API Key Not Working

  • Ensure your API key has the necessary permissions
  • Check your billing/usage limits
  • Verify the key is correctly entered in Settings

Ollama Connection Failed

  • Make sure Ollama is running (ollama serve)
  • Check if models are downloaded (ollama list)
  • Verify Ollama is accessible at http://localhost:11434

Generation Errors

  • Try a different AI model
  • Simplify your project description
  • Check the console for detailed error messages

📞 Support


Made with ❤️ by developers, for developers

Transform your ideas into reality with Loadout
