Python FastAPI | Ollama
Install the binary from https://github.com/Reeyuki/PromptLama/releases, then run:

```shell
make install
make run
```

Available options for `.env`:
- Hostname to bind the server to.
- Port number to bind the server to.
- URL of the Ollama API. Defaults to `http://127.0.0.1:11434`.
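As a sketch, a `.env` covering the options above might look like the following. The variable names (`HOST`, `PORT`, `OLLAMA_URL`) are illustrative assumptions, not confirmed by this README; check the repository for the actual keys.

```shell
# Illustrative .env — variable names are assumptions
HOST=127.0.0.1                      # hostname to bind the server to
PORT=8000                           # port number to bind the server to
OLLAMA_URL=http://127.0.0.1:11434   # URL of the Ollama API (default shown)
```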
- Language dropdown to override the recognized language
- Set the temperature of the model
- Implement a model management UI
- Export chat history button
- Switch to Vite/npm
- Configure ESLint/TypeScript
- Get rid of the ffmpeg dependency