GitHub Repository: https://github.com/c0ffee0wl/llm-windows-setup
Automated installation script for Simon Willison's llm CLI tool and related AI/LLM command-line utilities for Windows environments.
Based on the llm-linux-setup project.
Looking for the Linux version? This is the Windows adaptation. For the original Linux version with additional features like AIChat (RAG), sandboxed shell execution, and more extensive documentation, see: llm-linux-setup - Linux/Debian/Ubuntu/Kali version
- Features
- What Gets Installed
- System Requirements
- Installation
- Updating
- Usage
- Configuration
- Troubleshooting
- Supported PowerShell Versions
- Documentation
- Differences from Linux Version
- License
- Contributing
- Credits
- Support
- Related Projects
- ✅ One-command installation - Run once to install everything
- ✅ Self-updating - Re-run to update all tools automatically
- ✅ Multi-PowerShell support - Works with both PowerShell 5.1 and PowerShell 7+
- ✅ Azure OpenAI integration - Configured for Azure Foundry
- ✅ AI command completion - Press Ctrl+N for intelligent command suggestions
- ✅ Automatic session recording - Terminal history captured for AI context
- ✅ AI-powered context retrieval - Query your command history with `context` or `llm --tool context`
- ✅ Smart admin handling - Only requires admin for Chocolatey installation
- llm - LLM CLI tool (fork with markdown markup enhancements, originally by Simon Willison - Documentation)
- Claude Code - Anthropic's agentic coding CLI
- OpenCode - AI coding agent for terminal
- uv - Modern Python package installer
- Python 3 - Via Chocolatey
- Node.js 22.x - Via Chocolatey (latest in 22 branch)
- Git - Via Chocolatey (if not already installed)
- jq - JSON processor
- llm-cmd - Command execution and management
- llm-cmd-comp - AI-powered command completion (powers Ctrl+N)
- llm-tools-sqlite - SQLite database tool
- llm-tools-context - Terminal history integration (exposes the `context` tool to AI)
- llm-fragments-site-text - Web page content extraction
- llm-fragments-pdf - PDF content extraction
- llm-fragments-github - GitHub repository integration
- llm-fragments-dir - Load all text files from a local directory recursively
- llm-jq - JSON processing tool
- llm-templates-fabric - Fabric prompt templates
- llm-gemini - Google Gemini models integration
- llm-vertex - Google Vertex AI integration (enterprise API)
- llm-openrouter - OpenRouter API integration
- llm-anthropic - Anthropic Claude models integration
- assistant.yaml - Custom assistant template with security/IT expertise configuration
- code.yaml - Code-only generation template (outputs clean, executable code without markdown)
- gitingest - Convert Git repositories to LLM-friendly text
- files-to-prompt - File content formatter for LLM prompts
- context - PowerShell history extraction for AI context retrieval
- AI-powered command completion (Ctrl+N)
- Custom llm wrapper with default assistant template
- Automatic PowerShell transcript logging for AI context
- Clipboard aliases (`pbcopy`, `pbpaste`) for macOS compatibility
- PATH configuration for all installed tools
- OS: Windows 10, Windows 11, or Windows Server 2016+
- PowerShell: 5.1 or higher (PowerShell 7+ supported)
- Internet: Required for installation and API access
- Disk Space: ~1GB for all tools and dependencies
- Admin Rights: Only required for initial Chocolatey installation
1. Clone the repository:

   ```powershell
   git clone https://github.com/c0ffee0wl/llm-windows-setup.git
   cd llm-windows-setup
   ```

2. Run the installation script:

   If Chocolatey is NOT installed (first-time installation):

   ```powershell
   # Run as Administrator (required for Chocolatey installation)
   .\Install-LlmTools.ps1
   ```

   If Chocolatey IS already installed:

   ```powershell
   # Can run as regular user
   .\Install-LlmTools.ps1
   ```

3. Follow the prompts:
   - You'll be asked if you want to configure Azure OpenAI (optional)
   - If yes, provide your Azure API key and resource URL
If you get an execution policy error, run:

```powershell
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
```

To update, simply re-run the installation script:

```powershell
cd llm-windows-setup
.\Install-LlmTools.ps1
```

The script will automatically:
- Check for script updates from the git repository
- Pull latest changes and re-execute if updates are found
- Update llm and all plugins
- Update custom templates (assistant.yaml)
- Update gitingest, files-to-prompt, Claude Code, and OpenCode
- Refresh PowerShell integration files
Note: Updates do not require Administrator privileges (unless you need to update Chocolatey packages). The git pull happens automatically during Phase 0.
```powershell
# Ask a question (uses assistant template by default)
llm "What is the capital of France?"

# Start an interactive chat session
llm chat "Let's discuss PowerShell"

# Use a specific model
llm -m azure/gpt-5 "Explain quantum computing"

# List available models
llm models list

# View installed plugins
llm plugins
```

Type a partial command or describe what you want in natural language, then press Ctrl+N:

```powershell
# Type: list all json files recursively
# Press Ctrl+N
# Result: Get-ChildItem -Recurse -Filter *.json
```

The AI will suggest and insert the command automatically.
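For reference, the Ctrl+N binding can be pictured as a PSReadLine key handler along these lines. This is a hypothetical sketch, not the shipped code; the real handler lives in `integration\llm-integration.ps1` and may differ:

```powershell
Set-PSReadLineKeyHandler -Chord 'Ctrl+n' -ScriptBlock {
    # Read the current command-line buffer
    $line = $null; $cursor = $null
    [Microsoft.PowerShell.PSConsoleReadLine]::GetBufferState([ref]$line, [ref]$cursor)

    # Ask the model to complete what was typed
    # (llm cmdcomp is provided by the llm-cmd-comp plugin)
    $suggestion = llm cmdcomp $line

    # Replace the buffer with the suggestion
    if ($suggestion) {
        [Microsoft.PowerShell.PSConsoleReadLine]::RevertLine()
        [Microsoft.PowerShell.PSConsoleReadLine]::Insert($suggestion)
    }
}
```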
The following models are configured (if you set up Azure OpenAI):

- `azure/gpt-4.1-mini` - GPT-4.1 Mini (default, recommended for most tasks)
- `azure/gpt-4.1` - GPT-4.1 (recommended for complex tasks)
- `azure/gpt-4.1-nano` - GPT-4.1 Nano (lightweight)
- `azure/gpt-5` - GPT-5 (most capable)
- `azure/gpt-5-mini` - GPT-5 Mini (advanced reasoning)
- `azure/gpt-5-nano` - GPT-5 Nano (advanced lightweight)
- `azure/o4-mini` - O4 Mini

To switch models for more complex tasks:

```powershell
llm models default azure/gpt-4.1
```

Clipboard helpers:

```powershell
# Copy to clipboard (like macOS pbcopy)
"Hello World" | pbcopy

# Paste from clipboard (like macOS pbpaste)
pbpaste
```

Additional tools:

```powershell
# Convert Git repositories to LLM-friendly text
gitingest https://github.com/user/repo
gitingest C:\path\to\local\repo

# Convert files to LLM-friendly format
files-to-prompt src\*.py

# Use OpenCode
opencode

# Use Claude Code
claude
```

PowerShell sessions are automatically logged via transcript recording. The AI can retrieve your terminal history for better context:
```powershell
# Show last command
context

# Show last 5 commands
context 5

# Show entire session history
context all

# Check transcript file location
$env:TRANSCRIPT_LOG_FILE

# Check transcript directory
$env:TRANSCRIPT_LOG_DIR
```

How it works:
- Each PowerShell session automatically starts transcript logging
- Transcripts are stored in `$env:TRANSCRIPT_LOG_DIR` (configurable during installation)
- The `context` command parses transcripts and extracts command history
- The `llm-tools-context` plugin exposes this to AI models for contextual assistance
- AI can call `context(N)` to retrieve the last N commands when helping with your tasks

Storage options:

- Temporary (default): `%TEMP%\PowerShell_Transcripts` - cleared on logout/reboot
- Permanent: `%USERPROFILE%\PowerShell_Transcripts` - survives reboots
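As an illustration of the parsing idea, extracting the last N commands from a transcript could look like the following. This is a simplified sketch, not the shipped `context.py` (which uses indentation-based parsing); it naively treats any line starting with a `PS ...>` prompt as a command:

```python
import re

# A line like "PS C:\Users\demo> Get-ChildItem" is treated as a command;
# everything else is assumed to be output.
PROMPT = re.compile(r"^PS [^>]*> (.+)$")

def last_commands(transcript_text, n=1):
    """Return the last n commands found in a PowerShell transcript."""
    commands = [m.group(1).strip()
                for line in transcript_text.splitlines()
                if (m := PROMPT.match(line))]
    return commands[-n:]

sample = """\
PS C:\\Users\\demo> Get-ChildItem -Recurse -Filter *.json
file1.json
PS C:\\Users\\demo> git status
On branch main
"""
print(last_commands(sample, 2))
# -> ['Get-ChildItem -Recurse -Filter *.json', 'git status']
```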
- `%APPDATA%\io.datasette.llm\` - LLM configuration directory
- `extra-openai-models.yaml` - Azure OpenAI model definitions
- `templates\assistant.yaml` - Custom assistant template (auto-installed)
- API keys stored securely via llm's key management
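For orientation, entries in `extra-openai-models.yaml` follow llm's OpenAI-compatible model format. The sketch below is illustrative only: the resource URL and deployment name are placeholders, and the installer generates the real file:

```yaml
# Illustrative sketch - Install-LlmTools.ps1 generates the real entries
- model_id: azure/gpt-4.1-mini      # name used with llm -m
  model_name: gpt-4.1-mini          # Azure deployment name (placeholder)
  api_base: https://YOUR-RESOURCE.openai.azure.com/   # placeholder endpoint
  api_key_name: azure               # key set via: llm keys set azure
```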
Located in the `integration\` subdirectory:

- `integration\llm-integration.ps1` - Unified integration for PS5 & PS7
This file is automatically sourced from your PowerShell profile.
The installation script adds integration to:
- PowerShell 5: `%USERPROFILE%\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1`
- PowerShell 7: `%USERPROFILE%\Documents\PowerShell\Microsoft.PowerShell_profile.ps1`
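Conceptually, the profile hook is a guarded dot-source of that file. The path below is a placeholder for wherever the repository was cloned; the installer writes the actual line:

```powershell
# Illustrative placeholder - Install-LlmTools.ps1 writes the actual path
$llmIntegration = Join-Path $env:USERPROFILE 'llm-windows-setup\integration\llm-integration.ps1'
if (Test-Path $llmIntegration) { . $llmIntegration }
```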
```powershell
llm models default azure/gpt-4.1
```

API key management:

```powershell
# Set Azure key
llm keys set azure

# View key storage path
llm keys path

# List all configured keys
llm keys
```
1. Restart your PowerShell session or reload your profile:

   ```powershell
   . $PROFILE
   ```

2. Verify llm is in PATH:

   ```powershell
   Get-Command llm
   ```

3. Test llm command completion manually:

   ```powershell
   llm cmdcomp "list files"
   ```

4. Check if PSReadLine is loaded:

   ```powershell
   Get-Module PSReadLine
   ```
1. Verify API key is set:

   ```powershell
   llm keys get azure
   ```

2. Check model configuration:

   ```powershell
   Get-Content $env:APPDATA\io.datasette.llm\extra-openai-models.yaml
   ```

3. Update the API base URL in the YAML file if needed
- Check if the installation completed successfully
- Verify PATH includes: `%USERPROFILE%\.local\bin`
- Restart PowerShell
- Re-run the installation script
- Ensure you're running PowerShell as Administrator
- Check your internet connection
- Verify your execution policy allows scripts: `Get-ExecutionPolicy`
- Install Chocolatey manually: https://chocolatey.org/install
The script configures npm to use user-level global installs. If you still get permission errors:
```powershell
npm config set prefix "$env:USERPROFILE\.npm-global"
```

Then add `%USERPROFILE%\.npm-global` to your PATH.
- PowerShell 5.1 (Windows PowerShell - included with Windows 10/11)
- PowerShell 7.x (PowerShell Core - cross-platform)
- LLM Documentation
- LLM Plugins Directory
- Pedantic Journal - LLM Guide
- Gitingest Documentation
- Files-to-Prompt
- Claude Code Documentation
- OpenCode Documentation
This Windows version differs from the Linux version in the following ways:
Note: The Linux version contains more extensive documentation, detailed usage examples, and additional features. Refer to its README for comprehensive guides on fragments, templates, RAG, and advanced workflows.
Windows:
- Uses Chocolatey for system packages (Python, Node.js, Git, jq)
- Smart admin privilege model (admin only required for Chocolatey on first run)
- User-scoped installations for everything else (pipx, uv, npm global)
Linux:
- Uses apt/dpkg for system packages
- rustup for Rust toolchain management (required for aichat/argc)
- nvm for Node.js version management (if repository version < 20)
- Root/sudo access required for system package installation
Windows:
- PowerShell 5.1 and 7+ support
- Single integration file (`llm-integration.ps1`) works for both PS versions
- Uses PSReadLine for Ctrl+N command completion
- Clipboard aliases: `pbcopy`/`pbpaste` functions
Linux:
- Bash and Zsh support
- Three-file integration pattern:
  - `llm-common.sh` - Shared configuration
  - `llm-integration.bash` - Bash-specific widgets
  - `llm-integration.zsh` - Zsh-specific widgets
- llm-zsh-plugin - Tab completion for llm commands (Zsh only)
- Clipboard aliases via `xsel` (macOS compatibility layer)
Windows:
- Uses the native PowerShell `Start-Transcript` cmdlet
- Creates `.txt` transcript files (UTF-16-LE or UTF-8 encoding)
- Stores in `%TEMP%\PowerShell_Transcripts` (temporary) or `%USERPROFILE%\PowerShell_Transcripts` (permanent)
- One transcript per PowerShell window/tab
- Parses transcript files with `context.py` (indentation-based parsing)
- Limitation: output of native commands (ping, git, etc.) is not always captured by PowerShell transcription
Linux:
- Uses asciinema for terminal recording (built from source for latest features)
- Creates `.cast` files (asciinema JSON format)
- Stores in `/tmp/session_logs/asciinema` (temporary) or `~/session_logs/asciinema` (permanent)
- Tmux/screen support: each pane/window gets an independent recording with unique session files
- Parses `.cast` files with the `context` script (regex-based prompt detection)
- Full command output capture, including native commands
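To illustrate the format, asciinema v2 `.cast` files are JSON Lines: a header object followed by `[timestamp, event_type, data]` events. The sketch below is a simplified illustration, not the shipped `context` script (which adds regex-based prompt detection); it just concatenates the recorded terminal output:

```python
import json

def cast_output(cast_text):
    """Concatenate the terminal output ("o" events) of an asciinema v2 cast."""
    lines = cast_text.strip().splitlines()
    header = json.loads(lines[0])              # e.g. {"version": 2, ...}
    events = [json.loads(l) for l in lines[1:]]
    return "".join(data for _, kind, data in events if kind == "o")

sample = '\n'.join([
    '{"version": 2, "width": 80, "height": 24}',
    '[0.1, "o", "$ echo hi\\r\\n"]',
    '[0.2, "o", "hi\\r\\n"]',
])
out = cast_output(sample)   # "$ echo hi\r\nhi\r\n"
```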
Windows (assistant.yaml):

- Tools: `context` only

Linux (assistant.yaml):

- Tools: `context` + `sandboxed_shell` (bubblewrap-based safe command execution)
The Linux version includes several additional tools not present in the Windows version:
- AIChat - All-in-one LLM CLI with built-in RAG (Retrieval-Augmented Generation)
- Built-in vector database for document querying
- Query codebases, documents, and knowledge bases: `llm rag mydocs`
- Supports multiple document sources (Git repos, URLs, PDFs, DOCX, directories)
- Auto-configured with Azure OpenAI settings
The Linux version includes several additional LLM plugins not available in the Windows version:
Fragment Plugins:
- llm-fragments-youtube-transcript - YouTube video transcript extraction with metadata
Tool Plugins:
- llm-tools-sandboxed-shell - Safe shell command execution (covered in Sandboxed Command Execution section below)
- llm-tools-patch - File manipulation tools (read, write, edit, multi_edit, info) with approval workflow
- llm-tools-llm-functions - Bridge for llm-functions framework (covered in Developer Tools section below)
- llm-tools-quickjs - JavaScript execution environment for AI
- llm-tools-sandboxed-shell - Execute shell commands safely using bubblewrap
- Isolated environment (read-only root, no network access, Linux namespaces)
- Built into the Linux assistant template by default
- Requires bubblewrap (automatically installed)
- Rust/Cargo toolchain - Required for building aichat and argc
- argc - Bash CLI framework and command runner (enables llm-functions integration)
- yek - Fast Rust-based repository to LLM-friendly text converter (230x faster than gitingest)
- poppler-utils - PDF text extraction (pdftotext)
- pandoc - Document converter (DOCX support for RAG)
Both versions share these core features:
- ✅ llm CLI tool - Simon Willison's LLM CLI
- ✅ Self-updating installation scripts - git pull before execution
- ✅ Azure OpenAI integration - Configured for Azure Foundry
- ✅ AI command completion - Ctrl+N intelligent command suggestions via `llm cmdcomp`
- ✅ Context system - Query terminal history with AI
- ✅ Custom templates - `assistant.yaml` (security/IT expertise) and `code.yaml` (clean code output)
- ✅ Core LLM plugins - gemini, openrouter, anthropic, vertex, jq, sqlite, cmd, cmd-comp, fabric templates
- ✅ Shared fragment plugins - site-text, pdf, github, dir (Linux has 1 additional: youtube-transcript)
- ✅ Shared tool plugins - sqlite, context (Linux has 3 additional: quickjs, patch, sandboxed-shell)
- ✅ Additional tools - gitingest, files-to-prompt, Claude Code, OpenCode
- ✅ Smart template application - Shell wrapper auto-applies assistant template
This installation script is provided as-is under the MIT License. Individual tools have their own licenses:
- llm: Apache 2.0
- See individual tool repositories for details
To modify or extend this installation:
- Fork the repository
- Make your changes
- Test on Windows 10/11
- Submit a pull request
- Simon Willison - llm CLI tool
- Dan Mackinlay - files-to-prompt fork
- Damon McMinn - llm-templates-fabric fork
- c0ffee0wl - Original llm-linux-setup project
For issues, questions, or suggestions:
- Open an issue: https://github.com/c0ffee0wl/llm-windows-setup/issues
- llm-linux-setup - Linux/Debian version