Langent is a RAG (Retrieval-Augmented Generation) framework that transforms your local workspace into a 3D cosmic nebula of knowledge. It combines vector embeddings (ChromaDB) with knowledge graphs (Neo4j) to provide a deeply connected AI experience.
- Security: Removed all hardcoded credentials, added Cypher injection prevention, optional API key auth
- Logging: Replaced all `print()` calls with the structured `logging` module
- Testing: Comprehensive pytest suite with 40+ test cases
- CI/CD: GitHub Actions for lint + test across Python 3.10/3.11/3.12
- Tooling: ruff (lint), mypy (type check), Dockerfile for containerized deployment
- API: Pydantic request models, FastAPI dependency injection, health check endpoint
- Bug fixes: Fixed `${}` interpolation bug in workflows, fixed MCP server method call error
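The Cypher injection prevention mentioned above typically comes down to parameterized queries. A minimal sketch of the idea, where the query text and function name are illustrative and not Langent's actual code:

```python
def entity_search(keyword: str) -> tuple[str, dict]:
    """Build a parameterized Cypher query (illustrative, not Langent's code)."""
    # User input is passed as a driver-side parameter ($kw) rather than
    # spliced into the query text, so it cannot alter the statement itself.
    query = "MATCH (n:Entity) WHERE n.name CONTAINS $kw RETURN n LIMIT 25"
    return query, {"kw": keyword}

# Even a hostile keyword stays inert data rather than executable Cypher:
query, params = entity_search("x' MATCH (a) DETACH DELETE a //")
print("DELETE" in query)  # False: the attack never reaches the query text
```

The query string and parameter dict would then be handed to the Neo4j driver, which binds `$kw` safely on the server side.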
- Auto-Ingestion: Automatically scans and indexes MD, PDF, TXT, CSV, JSON, and YAML files.
- Hybrid RAG: Merges semantic vector search with graph-based relationship traversal for superior context.
- 3D Nebula Visualizer: Explore your knowledge base in an interactive Three.js 3D environment.
- Knowledge Linking: Automatically discovers and creates relationships between your documents and entities.
- MCP Integration: Built-in support for Model Context Protocol (MCP) to plug into Claude Desktop, Antigravity, and more.
- LangGraph Workflows: Agent state machines with RAG + Graph reasoning pipeline.
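The hybrid RAG idea above, combining vector similarity with graph relevance, can be sketched as a simple score fusion. The function name, score weighting, and input shapes are illustrative assumptions, not Langent's internals:

```python
def hybrid_merge(
    vector_hits: list[tuple[str, float]],
    graph_hits: list[tuple[str, float]],
    alpha: float = 0.7,
) -> list[tuple[str, float]]:
    """Fuse semantic-search and graph-traversal scores (illustrative sketch)."""
    scores: dict[str, float] = {}
    for doc_id, score in vector_hits:
        scores[doc_id] = scores.get(doc_id, 0.0) + alpha * score
    for doc_id, score in graph_hits:
        # Chunks reachable through graph relationships get a boost even when
        # their raw embedding similarity is modest.
        scores[doc_id] = scores.get(doc_id, 0.0) + (1 - alpha) * score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranked = hybrid_merge([("doc_a", 0.9), ("doc_b", 0.6)],
                      [("doc_b", 1.0), ("doc_c", 0.8)])
print(ranked[0][0])  # doc_b wins: present in both retrieval signals
```

Weighted fusion like this is one common way to let graph context promote documents that pure embedding similarity would rank lower.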
```
git clone https://github.com/AlexAI-MCP/langent.git
cd langent
pip install -e .
cp .env.example .env
```

Edit `.env` to set your `LANGENT_WORKSPACE` (the folder containing your documents).

```
langent ingest --workspace ./samples  # Start with sample data
langent serve                         # Open http://localhost:8000
```

To run in Docker instead:

```
docker build -t langent .
docker run -p 8000:8000 -v ./workspace:/app/workspace langent
```

For development:

```
# Install with dev dependencies
pip install -e ".[dev]"

# Run tests
pytest tests/ -v

# Lint
ruff check langent/ tests/

# Type check
mypy langent/
```

Langent automates the journey from raw files to an interactive 3D knowledge universe.
- Gather Data (Workspace): Drop your PDFs, Markdown notes, CSV spreadsheets, and research papers into your `LANGENT_WORKSPACE` folder.
- Chunking: Langent automatically breaks these large files into smaller, semantically meaningful chunks (300-500 tokens).
- Vectorization (ChromaDB):
  - Using local embedding models (e.g., `all-MiniLM-L6-v2`), each chunk is transformed into a high-dimensional vector.
  - These vectors are stored in ChromaDB for lightning-fast semantic retrieval.
- 3D Projection:
  - Langent uses UMAP to project high-dimensional vectors into a 3D point cloud.
  - Semantically similar points cluster together, forming knowledge constellations.
Result: Your messy folder becomes a beautiful, searchable, and navigable 3D cosmic map.
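The chunk-to-point-cloud pipeline above can be sketched end to end. This is an illustrative stand-in, not Langent's implementation: whitespace splitting replaces a real tokenizer, random vectors replace `all-MiniLM-L6-v2` embeddings, and PCA replaces UMAP for the 3D projection.

```python
import numpy as np

def chunk_text(text: str, max_tokens: int = 400, overlap: int = 50) -> list[str]:
    # Whitespace tokens stand in for a real tokenizer; Langent targets
    # 300-500 token chunks with some overlap between windows.
    words = text.split()
    step = max_tokens - overlap
    return [" ".join(words[i:i + max_tokens]) for i in range(0, len(words), step)]

def project_to_3d(vectors: np.ndarray) -> np.ndarray:
    # PCA via SVD as a dependency-free stand-in for UMAP: keep the top
    # three principal directions of the centered embedding matrix.
    centered = vectors - vectors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:3].T

chunks = chunk_text("word " * 1000, max_tokens=400, overlap=50)
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(chunks), 384))  # 384 dims, like all-MiniLM-L6-v2
points = project_to_3d(embeddings)
print(len(chunks), points.shape)  # 3 chunks -> one 3D point per chunk
```

Each row of `points` becomes one "star" in the nebula; the overlap between chunk windows helps neighboring chunks share context.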
To enable Knowledge Graph features, you need a Neo4j instance.
- Option A: Docker (Recommended)
- Option A: Docker (Recommended)

  ```
  docker run \
    --name langent-neo4j \
    -p 7474:7474 -p 7687:7687 \
    -e NEO4J_AUTH=neo4j/your_password \
    neo4j:latest
  ```

- Option B: Neo4j Desktop

  Install Neo4j Desktop, create a local project, and update your `.env`:

  ```
  NEO4J_URI="bolt://localhost:7687"
  NEO4J_USER="neo4j"
  NEO4J_PASSWORD="your_password"
  ```
Set an API key in your `.env` to protect write endpoints:

```
API_KEY="your-secret-api-key"
```

Then pass it via the `X-API-Key` header for protected endpoints (`/api/ingest`, `/api/link`, `/api/graph`).
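A client call against a protected endpoint can be sketched as follows. Only the `X-API-Key` header and the `/api/ingest` path come from the docs above; the JSON payload shape is a hypothetical assumption:

```python
import json
import urllib.request

API_KEY = "your-secret-api-key"  # must match API_KEY in your .env

def build_ingest_request(base_url: str, api_key: str,
                         payload: dict) -> urllib.request.Request:
    # The API key travels in the X-API-Key header; the payload shape
    # ({"workspace": ...}) is an illustrative guess, not a documented schema.
    return urllib.request.Request(
        f"{base_url}/api/ingest",
        data=json.dumps(payload).encode("utf-8"),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_ingest_request("http://localhost:8000", API_KEY,
                           {"workspace": "./samples"})
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(resp.status)
except OSError:
    # Reached when the server is not running; the request shape is the point.
    print("server not reachable")
```

Requests without the header (or with the wrong key) should be rejected by the protected endpoints.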
Langent acts as an MCP server, allowing AI agents like Claude to use your workspace as long-term memory.
Add the following to your `mcp_config.json`:

```
"langent": {
  "command": "python",
  "args": ["-m", "langent.server.mcp_server"],
  "env": {
    "LANGENT_WORKSPACE": "/path/to/your/workspace"
  }
}
```

Once connected via MCP, you can talk to your workspace as if it's an intelligent entity.
- Data Ingestion:
"Use Langent's MCP tools to index the new documents in my workspace."
- Semantic Search:
"Search the nebula for content related to 'AI future strategy' in my workspace and summarize it."
- Graph Insight:
"Analyze the graph for the key keywords most strongly connected to my research topic, 'AI agents', and turn it into a report."
Once you run `langent serve`, navigate to http://localhost:8000.
- Points (Star Dust): Each point represents a chunk of your documents.
- Nodes (Planets): Entities extracted into the Neo4j Graph.
- Lines (Cosmic Strings): Relationships between graph entities and semantic links.
- Controls:
- Left Click: Select a point to see its original content.
- Right Click / Drag: Rotate the universe.
- Scroll: Zoom in/out of the knowledge cluster.
- Search Bar: Type a keyword to highlight matching stars.
This project is licensed under the Apache License 2.0.
Created by Alex AI


