Python bindings for the LNMP (LLM Native Minimal Protocol) - an efficient, deterministic protocol designed for LLM context optimization.
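For intuition about the text format used throughout this README, an LNMP record is a semicolon-separated list of `F<id>=<value>` fields. The toy parser below is purely illustrative (the SDK parses via `lnmp.core.parse`, and the real grammar may be richer):

```python
def parse_fields(text: str) -> dict[int, str]:
    """Toy parser for the `F<id>=<value>;...` text form shown in this README.

    Illustrative only; use lnmp.core.parse in real code.
    """
    fields = {}
    for pair in text.strip().rstrip(";").split(";"):
        key, value = pair.split("=", 1)
        fields[int(key.lstrip("F"))] = value
    return fields

print(parse_fields("F12=14532;F7=1"))  # {12: '14532', 7: '1'}
```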
## Installation

```bash
pip install lnmp
```

## Quick Start

```python
import lnmp

# Parse LNMP text
record = lnmp.core.parse("F12=14532;F7=1")

# Encode to text
text = record.encode()

# Binary encoding
binary = record.encode_binary()

# Wrap with envelope
envelope = lnmp.envelope.wrap(record, source="my-service")

# Context scoring
score = lnmp.net.context_score(envelope)
print(f"Composite score: {score.composite}")

# Routing decision
if lnmp.net.should_send_to_llm(envelope, threshold=0.7):
    # Send to LLM
    pass
```
## API Reference

### `lnmp.core`

Parse and encode LNMP records.

```python
# Parse from text
record = lnmp.core.parse("F12=14532;F7=1")

# Encode to text
text = record.encode()

# Binary encoding/decoding
binary = record.encode_binary()
decoded = lnmp.core.decode_binary(binary)
```
### `lnmp.envelope`

Wrap records with operational metadata.

```python
envelope = lnmp.envelope.wrap(
    record,
    source="auth-service",
    timestamp_ms=1234567890,
    trace_id="trace-123",
)
```
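Conceptually, an envelope simply pairs a record with operational metadata. The sketch below illustrates that idea in plain Python; the field names are assumptions inferred from the arguments above, not the SDK's actual envelope structure:

```python
import time

def wrap(record, source, timestamp_ms=None, trace_id=None):
    """Illustrative stand-in for lnmp.envelope.wrap: attach metadata to a record."""
    return {
        "record": record,
        "source": source,
        "timestamp_ms": timestamp_ms if timestamp_ms is not None else int(time.time() * 1000),
        "trace_id": trace_id,
    }

env = wrap({"F12": "14532"}, source="auth-service", trace_id="trace-123")
print(env["source"])  # auth-service
```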
### `lnmp.net`

Context scoring and routing decisions.

```python
# Score envelope
score = lnmp.net.context_score(envelope)
print(score.composite)  # 0.0-1.0

# Routing decision
decision = lnmp.net.routing_decide(envelope)

# Helper
if lnmp.net.should_send_to_llm(envelope, threshold=0.7):
    send_to_llm(envelope)
```
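For intuition, the routing helper boils down to a threshold check over a weighted composite score. The sketch below is a toy version: the dimension names and weights are made up for illustration, and the real scoring lives in the Rust core:

```python
def composite(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Toy weighted composite over per-dimension scores in [0, 1]."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total

def should_send_to_llm(score: float, threshold: float = 0.7) -> bool:
    """Route to the LLM only when the composite score clears the threshold."""
    return score >= threshold

s = composite({"novelty": 0.9, "relevance": 0.6}, {"novelty": 0.5, "relevance": 0.5})
print(round(s, 2))  # 0.75
print(should_send_to_llm(s))  # True
```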
### `lnmp.llm`

High-level helpers for common workflows.

```python
result = lnmp.llm.normalize_and_route(
    "F12=14532",
    source="api-gateway",
    threshold=0.7,
)
if result["send_to_llm"]:
    process_with_llm(result["envelope"])
```
### `lnmp.embedding`

Vector operations and delta compression.

```python
base = [0.1, 0.2, 0.3]
updated = [0.1, 0.25, 0.3]
delta_info = lnmp.embedding.delta(base, updated)
print(delta_info["change_count"])
```
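As an illustration of what a vector delta can carry, changed positions can be stored sparsely so that mostly-unchanged embeddings compress well. This is an assumed scheme for intuition only; the SDK's actual delta format may differ:

```python
def delta(base: list[float], updated: list[float], eps: float = 1e-9) -> dict:
    """Toy sparse delta: record only the indices whose values changed."""
    changes = {
        i: new for i, (old, new) in enumerate(zip(base, updated))
        if abs(old - new) > eps
    }
    return {"change_count": len(changes), "changes": changes}

info = delta([0.1, 0.2, 0.3], [0.1, 0.25, 0.3])
print(info["change_count"])  # 1
```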
### `lnmp.spatial`

Spatial data encoding and streaming.

```python
# Encode 3D position
data = lnmp.spatial.encode_position3d(1.5, 2.5, 3.5)

# Decode
x, y, z = lnmp.spatial.decode_position3d(data)
```
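One plausible way to serialize a 3D position compactly is plain little-endian float32 packing. This sketch assumes that layout for illustration; it is not necessarily the SDK's wire format:

```python
import struct

def encode_position3d(x: float, y: float, z: float) -> bytes:
    """Pack three coordinates as little-endian float32 (12 bytes total)."""
    return struct.pack("<fff", x, y, z)

def decode_position3d(data: bytes) -> tuple[float, float, float]:
    """Unpack 12 bytes back into three float coordinates."""
    return struct.unpack("<fff", data)

data = encode_position3d(1.5, 2.5, 3.5)
print(decode_position3d(data))  # (1.5, 2.5, 3.5)
```

The values 1.5, 2.5, and 3.5 are exactly representable in float32, so this round-trip is lossless; arbitrary floats would lose precision beyond float32's 24-bit mantissa.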
### `lnmp.utils`

Utility functions for quantization, sanitization, and debugging.

```python
# Quantize vectors
quantized = lnmp.utils.quantize([0.1, 0.2, 0.3], "QInt8")

# Sanitize input
clean = lnmp.utils.sanitize("F12= 14532 ; F7=1")

# Debug explain
explained = lnmp.utils.debug_explain("F12=14532")
```
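For intuition, `"QInt8"`-style quantization typically maps floats to 8-bit integers via a scale factor. Below is a minimal symmetric-quantization sketch; the scheme is an assumption for illustration, not the SDK's exact algorithm:

```python
def quantize_int8(values: list[float]) -> tuple[list[int], float]:
    """Symmetric int8 quantization: scale by max |value| into [-127, 127]."""
    peak = max(abs(v) for v in values) or 1.0
    scale = peak / 127.0
    return [round(v / scale) for v in values], scale

def dequantize(q: list[int], scale: float) -> list[float]:
    """Recover approximate floats from quantized values and the scale."""
    return [v * scale for v in q]

q, scale = quantize_int8([0.1, 0.2, 0.3])
print(q)  # [42, 85, 127]
```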
## Complete Example

```python
import lnmp

# High-level workflow
result = lnmp.llm.normalize_and_route(
    "F12=14532;F7=1",
    source="my-service",
    trace_id="req-001",
    threshold=0.7,
)
print(f"Score: {result['score'].composite:.3f}")
print(f"Decision: {result['decision']}")

if result["send_to_llm"]:
    # Route to LLM
    send_to_llm(result["envelope"])
else:
    # Process locally
    process_locally(result["record"])
```
## Architecture

The Python SDK uses a two-layer architecture:

- **`lnmp-python`** (Rust layer): a native extension built with PyO3, exposing core LNMP functionality
- **`lnmp`** (Python layer): a Pythonic API wrapper providing a developer-friendly interface

This ensures:

- **Performance**: core operations run at native Rust speed
- **Determinism**: identical behavior across all LNMP SDKs
- **Developer UX**: a clean, Pythonic API
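As a generic illustration of this pattern, the Python layer can delegate to the native extension when it is importable. The module name and the pure-Python fallback below are assumptions made so the sketch is self-contained; they are not the actual lnmp source:

```python
# Sketch of a thin Python wrapper over a native extension.
try:
    import lnmp_python as _native  # hypothetical import name for the PyO3 layer
except ImportError:
    _native = None  # fall back to pure Python below (illustration only)

def parse(text: str):
    """Pythonic entry point; delegates to the Rust core when present."""
    if _native is not None:
        return _native.parse(text)
    # Pure-Python fallback, used here only so the sketch runs standalone.
    return dict(pair.split("=", 1) for pair in text.rstrip(";").split(";"))

print(parse("F12=14532;F7=1"))
```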
## Examples

See the `examples/` directory for more:

- `basic_usage.py` - Simple parse/encode operations
- `complete_workflow.py` - Full LLM routing workflow
- (More examples in the repository)
## Development

```bash
# Clone repository
git clone https://github.com/lnmplang/lnmp-sdk-python
cd lnmp-sdk-python

# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install maturin and dev dependencies
pip install maturin pytest pytest-cov

# Build and install
maturin develop

# Run tests
export PYTHONPATH=$PWD:$PYTHONPATH
pytest tests/ -v
```
## Releases

Releases are automated via GitHub Actions:

```bash
# 1. Update version in pyproject.toml and Cargo.toml
# 2. Commit and push changes
git add pyproject.toml Cargo.toml
git commit -m "chore: bump version to 0.5.8"
git push origin main

# 3. Create and push version tag
git tag v0.5.8
git push origin v0.5.8

# GitHub Actions will automatically:
# - Run tests on all platforms
# - Build wheels for Linux, macOS, Windows
# - Publish to PyPI
# - Create GitHub Release
```

See `.github/WORKFLOW.md` for detailed workflow documentation.
## Contributing

Contributions are welcome! Please see `CONTRIBUTING.md` for guidelines.

## License

MIT License - see `LICENSE` for details.

## Version

Current version: 0.5.7

Synchronized with the LNMP Rust crate version for consistency.