llm-debugger

A debugging tool that integrates LLMs with a debugging server. It currently supports only Node.js, with plans to expand to other programming languages in the future.

Prerequisites

  • Node.js 20 or higher
  • pnpm package manager
  • An OpenAI API key, or a key for any OpenAI-compatible API provider (e.g. OpenRouter)

Installation

Clone the git repository:

git clone https://github.com/frolleks/llm-debugger && cd llm-debugger

Install dependencies:

pnpm install

Create a .env file and add your OpenAI API key:

OPENAI_API_KEY=your_api_key_here

Build the application:

pnpm build
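
The usage examples below invoke llm-debugger as a command. One way to put it on your PATH is to link the built package globally with pnpm; this is a sketch that assumes the package defines a bin entry named llm-debugger, which this README does not confirm:

# Sketch: assumes package.json exposes an llm-debugger bin entry; run from the repository root
pnpm link --global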

Usage

  1. Configure your model (required before first use):
# Add a model
llm-debugger model:add <name> <baseURL> [options]

# Example: add DeepSeek R1 as the default model
llm-debugger model:add deepseek/deepseek-r1 https://openrouter.ai/api/v1 --default

# List configured models
llm-debugger model:list
  2. Start a Node.js debugging session (a minimal example script is sketched after this list):
node --inspect-brk <file.js>
  3. Launch the debugger:
# Use default model
llm-debugger start

# Or specify a model
llm-debugger start --model <model-name>
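
As a quick end-to-end check, you can point the debugger at any small script. The file below is a hypothetical example (its name and contents are not part of this repository) containing an off-by-one bug for the LLM to investigate:

// example.js: hypothetical test script, not part of this repository
// Intended to sum the first `n` elements, but the loop bound is off by one.
function sumFirstN(numbers, n) {
  let total = 0;
  for (let i = 0; i <= n; i++) { // bug: should be `i < n`
    total += numbers[i];
  }
  return total;
}

// Expected 6 (1 + 2 + 3); actually prints 10 because numbers[3] is also added.
console.log(sumFirstN([1, 2, 3, 4], 3));

Start it paused on the first line with node --inspect-brk example.js, then run llm-debugger start in a second terminal, as in step 3.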
