BERT CLI

An Educational Project by Matias Nisperuza

Give the repository a star if you find it cool or useful!🌟



Updates:

Bert is now at 1.2.0, featuring:

  • Fixed paste support
  • Terminal color theme commands: -color light / -color dark
  • Multiline paste mode: /*paste

Visit Bert's official GitHub page here: Bert CLI official Page

βš–οΈ Legal & Licensing

This software is licensed under the Apache License 2.0.

  • For licensing terms, see LICENSE.

Overview

About Bert:

Bert CLI

Bert CLI is a small educational dev tool project I’m building and maintaining. Working on the CLI, website, and server has been a fun way for me to learn, break things, and improve my skills while building something cool.

Bert CLI has nothing to do with BERT (Bidirectional Encoder Representations from Transformers); it is simply a framework that hosts decoder models such as the Qwen 2.5/3 models from Alibaba or the LFM2 models from LiquidAI.

If you spot any issues, have ideas, or just want to give feedback, open an issue on GitHub. If you want to collaborate, send me an email; I'm always open to it.

Bert CLI Preview


Quick Start

PyPI

# Directly from PyPI Package:

pip install bert-cli

Usage

Start Bert

bert

Claim your weekly token to start using Bert CLI

Go to the Bert CLI official Page.

Claim your token, then run bert and use:

/*token YOUR-TOKEN-HERE

Why Token Keys?

Token keys may seem inconvenient, I know, but I have found that, to keep chats working properly within the memory system I built, a limit of 20,000 tokens works well. It's free and totally open.

I do store your data, but I am not going to do anything with it; I only store it to track and assign an ID to each token. If you have any complaint, doubt, feedback, or issue, feel free to email me: mnisperuza1102@gmail.com

For more info visit: Bert CLI official Page


Models

| Model | Base | VRAM | Features |
|-------|------|------|----------|
| Bert Nano | LiquidAI/LFM2-700M | ~2GB | Ultra-fast |
| Bert Mini | LiquidAI/LFM2-1.2B | ~4GB | Balanced |
| Bert Main | Qwen/Qwen3-1.7B | ~5GB | Thinking 🧠 |
| Bert Max | LiquidAI/LFM2-2.6B | ~8GB | Reasoning |
| Bert Coder | Qwen/Qwen2.5-Coder-1.5B-Instruct | ~4GB | Code |
| Bert Max-Coder | Qwen/Qwen2.5-Coder-3B-Instruct | ~8GB | Heavy Code |
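If you script around Bert CLI, the table above can be captured as a simple mapping. This is a sketch for illustration only; `BERT_MODELS` and `base_model` are hypothetical names, not part of Bert CLI's actual API:

```python
# Hypothetical mapping of Bert CLI model aliases to their base checkpoints,
# approximate VRAM needs, and features, taken from the table above.
BERT_MODELS = {
    "nano":     ("LiquidAI/LFM2-700M", "~2GB", "Ultra-fast"),
    "mini":     ("LiquidAI/LFM2-1.2B", "~4GB", "Balanced"),
    "main":     ("Qwen/Qwen3-1.7B", "~5GB", "Thinking"),
    "max":      ("LiquidAI/LFM2-2.6B", "~8GB", "Reasoning"),
    "coder":    ("Qwen/Qwen2.5-Coder-1.5B-Instruct", "~4GB", "Code"),
    "maxcoder": ("Qwen/Qwen2.5-Coder-3B-Instruct", "~8GB", "Heavy Code"),
}

def base_model(name: str) -> str:
    """Return the Hugging Face model id behind a Bert CLI alias."""
    return BERT_MODELS[name][0]
```

The alias keys match the in-session switch commands (bert nano, bert maxcoder, etc.).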

Command Line Options

bert --ver      # Show version
bert --info     # Show info
bert --del      # Remove Bert data (~/.bert)
bert --help     # Show help

In-Session Commands

Switch Models:

bert nano       # Fastest (0.7B)
bert mini       # Balanced (1.2B)
bert main       # Flagship (1.7B)
bert max        # Most capable (2.6B)
bert coder      # Code-optimized (1.5B)
bert maxcoder   # The best for Code (3B)

Change Quantization:

bert int4       # Balanced ⭐ 
bert int8       # High quality 
bert fp16       # Best quality (all platforms)
bert fp32       # Full precision / CPU

Other Commands:

/*help          # Show all commands
/*status        # Show current status
/*clear         # Clear screen
/*exit          # Exit Bert
-color light    # Switch to bone-white background
-color dark     # Switch to pitch-black background
/*paste         # Enable multiline paste

System Requirements

| Component | Minimum | Recommended |
|-----------|---------|-------------|
| RAM | 8GB | 16GB+ |
| VRAM | 3GB | 6GB+ |
| Python | 3.8 | 3.10+ |
| Storage | 30GB | 40GB |
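A quick pre-flight check against these minimums could look like the sketch below. Bert CLI does not ship this helper; `preflight` is a hypothetical name, and it only covers the checks that the Python standard library can do portably (Python version and free storage):

```python
import shutil
import sys

def preflight() -> list:
    """Return a list of warnings for requirements from the table above."""
    problems = []
    # Python 3.8 is the documented minimum, 3.10+ recommended.
    if sys.version_info < (3, 8):
        problems.append("Python 3.8+ required")
    # ~30GB of storage is the documented minimum for model files.
    free_gb = shutil.disk_usage(".").free / 1e9
    if free_gb < 30:
        problems.append(f"Need ~30GB free storage, have {free_gb:.0f}GB")
    return problems
```

RAM and VRAM checks are left out because they need platform-specific or third-party tooling.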

Quantization Support

| Platform | INT4/INT8 | FP16 | FP32 |
|----------|-----------|------|------|
| Linux | ✅ | ✅ | ✅ |
| Windows | ✅* | ✅ | ✅ |
| macOS | ❌ | ✅ | ✅ |
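In a script, the INT4/INT8 column of this table translates to a one-line platform check. This is a hypothetical helper, not part of Bert CLI:

```python
import platform

def int_quant_supported() -> bool:
    """Per the support table: INT4/INT8 works on Linux and (with
    caveats) Windows, but not on macOS."""
    return platform.system() in ("Linux", "Windows")
```

FP16 and FP32 need no such check, since the table lists them as available on all three platforms.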

Uninstall

Remove Bert data:

bert --del

Support

When reporting issues, include:

  1. Bert version (bert --ver)
  2. Your OS (Windows/Linux/macOS)
  3. Error message

Thanks for Using Bert CLI ❀️