Ollama Copilot

Proxy that allows you to use Ollama as a copilot, like GitHub Copilot.


Installation

Ollama

Ensure Ollama is installed:

curl -fsSL https://ollama.com/install.sh | sh

Or follow the manual installation instructions.

Models

To use the default model expected by ollama-copilot:

ollama pull codellama:code
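If you want to confirm the pull succeeded before wiring up the proxy, you can check the local model list. This is a sketch that assumes the ollama CLI is on your PATH and the daemon is running; the guard just prints a notice otherwise:

```shell
# Check that the default model is available locally.
if command -v ollama >/dev/null 2>&1; then
  ollama list | grep -q codellama && echo "codellama available" \
    || echo "codellama missing; run: ollama pull codellama:code"
else
  echo "ollama CLI not found on PATH"
fi
```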

DeepSeek

To use DeepSeek:

ollama-copilot -provider deepseek -token YOUR_DEEPSEEK_API_KEY -model deepseek-coder

Mistral

To use Mistral:

ollama-copilot -provider mistral -token YOUR_MISTRAL_API_KEY -model codestral-latest

Automatic Installation & Configuration

You can use the install.sh script to automate the build, installation, and configuration of ollama-copilot and your editors:

./install.sh

This script will:

  1. Build the binary and install it to ~/.local/bin/ollama-copilot.
  2. Optionally install and start a Systemd Service (allowing you to configure num-predict).
  3. Automatically detect and offer to configure Neovim, VSCode, and Zed.
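If you prefer not to run the script, step 1 amounts to roughly the following. This is a sketch, not the script itself: it assumes you are inside a clone of the repository with the Go toolchain installed, and mirrors the script's default install path of ~/.local/bin:

```shell
# Build the binary and copy it to the per-user bin directory.
if command -v go >/dev/null 2>&1 && [ -f go.mod ]; then
  go build -o ollama-copilot . \
    && mkdir -p "$HOME/.local/bin" \
    && cp ollama-copilot "$HOME/.local/bin/"
else
  echo "run this from a clone of the repository with Go installed"
fi
```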

Manual Installation

Binary

go install github.com/bernardo-bruning/ollama-copilot@latest

Running

Ensure your $PATH includes $HOME/go/bin or $GOPATH/bin. For example, in ~/.bashrc or ~/.zshrc:

export PATH="$HOME/go/bin:$GOPATH/bin:$PATH"
ollama-copilot

Or, if you are hosting Ollama in a container or on another host:

OLLAMA_HOST="http://192.168.133.7:11434" ollama-copilot
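Before pointing an editor at the proxy, it can help to verify that the Ollama API itself is reachable. The URL below is Ollama's default; substitute your own OLLAMA_HOST if it differs:

```shell
# /api/tags lists installed models; a successful response means Ollama is up.
curl -fs "${OLLAMA_HOST:-http://localhost:11434}/api/tags" >/dev/null \
  && echo "ollama reachable" \
  || echo "ollama not reachable at ${OLLAMA_HOST:-http://localhost:11434}"
```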

Configuration

You can configure the server using command-line flags:

| Flag | Default | Description |
|------|---------|-------------|
| -port | :11437 | HTTP port to listen on |
| -proxy-port | :11438 | HTTP proxy port |
| -port-ssl | :11436 | HTTPS port to listen on |
| -proxy-port-ssl | :11435 | HTTPS proxy port |
| -cert | | Certificate file path (*.crt) for custom TLS |
| -key | | Key file path (*.key) for custom TLS |
| -provider | ollama | Provider to run the LLM |
| -token | | Token to pass to the provider |
| -model | codellama:code | LLM model to use |
| -num-predict | 250 | Number of tokens to predict (25 recommended for copilot use) |
| -template | <PRE> {{.Prefix}} <SUF> {{.Suffix}} <MID> | Prompt template for fill-in-the-middle |
| -system | You are a helpful... | System prompt to guide the model |
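Putting a few flags together, a typical local invocation might look like the following (the values shown are illustrative, not required):

```shell
ollama-copilot \
  -port :11437 \
  -proxy-port-ssl :11435 \
  -model codellama:code \
  -num-predict 25
```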

Configure IDE (Manual)

Neovim

  1. Install copilot.vim
  2. Configure the proxy variables in your Vim configuration:
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false

VSCode

  1. Install the GitHub Copilot extension
  2. Sign in or sign up with your GitHub account
  3. Open your settings (settings.json) and insert:
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}

Zed

  1. Open settings (ctrl + ,)
  2. Set up edit completion proxying:
{
    "features": {
        "edit_prediction_provider": "copilot"
    },
    "show_completions_on_input": true,
    "edit_predictions": {
        "copilot": {
            "proxy": "http://localhost:11435",
            "proxy_no_verify": true
        }
    }
}

PyCharm / JetBrains

  1. Open Settings (Ctrl+Alt+S)
  2. Navigate to Appearance & Behavior > System Settings > HTTP Proxy
  3. Select Manual proxy configuration:
    • HTTP
    • Host name: localhost
    • Port number: 11435
  4. Navigate to Tools > Server Certificates
  5. Check Accept non-trusted certificates automatically
  6. Restart PyCharm

Emacs

(experimental)

  1. Install copilot-emacs
  2. Configure the proxy:
(use-package copilot
  :straight (:host github :repo "copilot-emacs/copilot.el" :files ("*.el"))  ;; if you don't use straight.el, install via your preferred package manager
  :ensure t
  ;; :hook (prog-mode . copilot-mode)
  :bind (("C-<tab>" . copilot-accept-completion))
  :config
  (setq copilot-network-proxy '(:host "127.0.0.1" :port 11435 :rejectUnauthorized :json-false)))

Roadmap

  • Enable completions APIs usage; fill in the middle.
  • Enable flexible model configuration (currently only codellama:code is supported).
  • Create self-installing functionality.
  • Auto-configure IDEs (Neovim, VSCode, Zed).
  • Documentation on how to use.
  • Windows setup
