A debugging tool that integrates LLMs with a debugging server. It currently supports only Node.js, with plans to expand to other programming languages in the future.
- Node.js 20 or higher
- pnpm package manager
- An OpenAI API key, or a key from any OpenAI-compatible provider (e.g. OpenRouter)
Clone the git repository:

```bash
git clone https://github.com/frolleks/llm-debugger && cd llm-debugger
```

Install dependencies:

```bash
pnpm install
```

Create a `.env` file and add your OpenAI API key:

```env
OPENAI_API_KEY=your_api_key_here
```

Build the application:

```bash
pnpm build
```

- Configure your model (required before first use):

```bash
# Add a model
llm-debugger model:add <name> <baseURL> [options]
# Example: Add DeepSeek R1 as default model
llm-debugger model:add deepseek/deepseek-r1 https://openrouter.ai/api/v1 --default
# List configured models
llm-debugger model:list
```

- Start a Node.js debugging session:

```bash
node --inspect-brk <file.js>
```

- Launch the debugger:

```bash
# Use default model
llm-debugger start
# Or specify a model
llm-debugger start --model <model-name>
```
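To try the workflow end to end you need a script to attach to. Below is a minimal sketch of a buggy Node.js script you could pass to `node --inspect-brk`; the file name `buggy.js` and its contents are hypothetical examples, not part of this project.

```js
// buggy.js — hypothetical example target, not included in the repository.
// Running `node --inspect-brk buggy.js` starts the V8 inspector and pauses
// before the first line of user code, so a debugger can attach and step in.

function average(numbers) {
  let sum = 0;
  // Off-by-one: `<=` reads one element past the end, so `sum` becomes NaN.
  for (let i = 0; i <= numbers.length; i++) {
    sum += numbers[i];
  }
  return sum / numbers.length;
}

console.log(average([2, 4, 6])); // expected 4, prints NaN
```

With the script paused under `--inspect-brk`, run `llm-debugger start` (typically in a second terminal) so the configured model can drive the debugging session.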