An Educational Project by Matias Nisperuza
Bert is now at 1.2.0, featuring:
- Fixed paste support
- Terminal color theme commands: `-color light` / `-color dark`
- Multiline paste mode: `/*paste`
Visit Bert's official GitHub page here: Bert CLI official Page
This software is licensed under Apache 2.0.
- For licensing terms, see LICENSE.
Bert CLI is a small educational dev-tool project I'm building and maintaining. Working on the CLI, website, and server has been a fun way for me to learn, break things, and improve my skills while building something cool.
Bert CLI has nothing to do with BERT (Bidirectional Encoder Representations from Transformers); Bert CLI is just a framework that hosts decoder models such as the Qwen 2.5/3 models from Alibaba or the LFM2 models from LiquidAI.
If you spot any issues, have ideas, or just want to give feedback, open an issue on GitHub. If you want to collaborate, send me an email; I'm always open to it.
```shell
# Install directly from PyPI:
pip install bert-cli
bert
```
Go to the Bert CLI official page.
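If the install fails, it may be worth confirming your interpreter version first; the requirements table below lists Python 3.8 as the minimum (3.10+ recommended). A minimal sketch of that check:

```shell
# Exit non-zero if the interpreter is older than Python 3.8 (Bert CLI's stated minimum).
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 8) else 1)' \
  && echo "Python version OK for bert-cli"
```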
Claim your token, then run `bert` and use:
```
/*token YOUR-TOKEN-HERE
```
Token keys may seem unreliable, I know, but to keep chats working properly within the memory system I built, a limit of 20,000 tokens per key seems fine. It's free and totally open.
I do store your data, but I'm not going to do anything with it; I only store it to track and assign an ID to each token. If you have any complaints, doubts, feedback, or issues, feel free to email me: mnisperuza1102@gmail.com
For more info visit: Bert CLI official Page
| Model | Base | VRAM | Features |
|---|---|---|---|
| Bert Nano | LiquidAI/LFM2-700M | ~2GB | Ultra-fast |
| Bert Mini | LiquidAI/LFM2-1.2B | ~4GB | Balanced |
| Bert Main | Qwen/Qwen3-1.7B | ~5GB | Thinking 🧠 |
| Bert Max | LiquidAI/LFM2-2.6B | ~8GB | Reasoning |
| Bert Coder | Qwen/Qwen2.5-Coder-1.5B-Instruct | ~4GB | Code |
| Bert Max-Coder | Qwen/Qwen2.5-Coder-3B-Instruct | ~8GB | Heavy Code |
```shell
bert --ver   # Show version
bert --info  # Show info
bert --del   # Remove Bert data (~/.bert)
bert --help  # Show help
```
Switch models:
```shell
bert nano      # Fastest (0.7B)
bert mini      # Balanced (1.2B)
bert main      # Flagship (1.7B)
bert max       # Most capable (2.6B)
bert coder     # Code-optimized (1.5B)
bert maxcoder  # Best for code (3B)
```
Change quantization:
```shell
bert int4  # Balanced ✅
bert int8  # High quality
bert fp16  # Best quality (all platforms)
bert fp32  # Full precision / CPU
```
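The quantization levels trade memory for quality roughly in proportion to bits per weight. As a back-of-the-envelope sketch (weights only; the KV cache and activations add more on top, which is why the VRAM figures in the model table run higher), here is the weight footprint of Bert Main (~1.7B parameters) at each precision:

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough memory for model weights alone: params x bits / 8, in GB."""
    return params_billion * bits_per_weight / 8

# Bert Main (Qwen/Qwen3-1.7B, ~1.7B parameters) at each precision level:
for name, bits in [("int4", 4), ("int8", 8), ("fp16", 16), ("fp32", 32)]:
    print(f"{name}: ~{weight_memory_gb(1.7, bits):.2f} GB")
# → int4: ~0.85 GB, int8: ~1.70 GB, fp16: ~3.40 GB, fp32: ~6.80 GB
```

This is why int4 is the balanced default: it cuts weight memory to a quarter of fp16 at a modest quality cost.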
Other commands:
```
/*help        # Show all commands
/*status      # Show current status
/*clear       # Clear screen
/*exit        # Exit Bert
-color light  # Switch to a bone-white background
-color dark   # Switch to a pitch-black background
/*paste       # Enable multiline paste
```
| Component | Minimum | Recommended |
|---|---|---|
| RAM | 8GB | 16GB+ |
| VRAM | 3GB | 6GB+ |
| Python | 3.8 | 3.10+ |
| Storage | 30GB | 40GB |
| Platform | INT4/INT8 | FP16 | FP32 |
|---|---|---|---|
| Linux | ✅ | ✅ | ✅ |
| Windows | ✅ * | ✅ | ✅ |
| macOS | ✅ | ✅ | ✅ |
Remove Bert data:
```shell
bert --del
```
- GitHub Issues: github.com/mnisperuza/bert-cli/issues
- Email: mnisperuza1102@gmail.com
When reporting issues, include:
- Bert version (`bert --ver`)
- Your OS (Windows/Linux/macOS)
- Error message
Thanks for using Bert CLI ❤️
