Simple, opinionated cloud drive service with file uploads, sharing, search, and AI-powered document summarization/chat.
This repository contains three main folders:
- `backend/` — Go API (REST + GraphQL) with PostgreSQL, Redis, and MinIO/Azure integration.
- `frontend/` — React + TypeScript + Vite single-page app.
- `llm/` — Python gRPC service for summarization and document Q&A (FAISS + HuggingFace + Ollama).
- Start the infra containers (Postgres, Redis, MinIO; Kafka optional):

```sh
cd backend
docker compose up -d
```

- Configure the env files:

```sh
cp backend/.env.example backend/.env
cp frontend/.env.example frontend/.env
cp llm/.env.example llm/.env
# Edit values as needed (DB, Redis, STORAGE_PROVIDER, LLM token)
```

- Run migrations and start the backend:

```sh
cd backend
make migrateUp
make watch   # or: air
```

- Start the frontend:

```sh
cd frontend
npm i
npm run dev
```

- (Optional) Start the LLM service:

```sh
cd llm
docker compose up -d             # starts the Ollama runtime
python -m venv venv
source venv/bin/activate         # Windows: venv\Scripts\activate
pip install -r requirements.txt
python server.py
```

```mermaid
graph TB
    WEB[Web Client] --> API[Backend API]
    API -->|stores| PG[(PostgreSQL)]
    API -->|cache/pubsub| RD[(Redis)]
    API -->|storage| ST[MinIO/Azure]
    API -->|gRPC| LLM[LLM Service]
    LLM --> FAISS[FAISS Indexes]
```
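The backend-to-LLM edge in the diagram is an authenticated gRPC call. As a hand-testing sketch only: assuming `grpcurl` is installed, the LLM service has server reflection enabled, and the token travels under an `authorization` metadata key (all three are assumptions, not guaranteed by this repo), you could probe it like this:

```sh
# Assumptions: grpcurl installed, server reflection enabled on the LLM service,
# and "authorization" is the metadata key the service checks.
grpcurl -plaintext \
  -H "authorization: Bearer $LLM_GRPC_TOKEN" \
  localhost:50051 list
```

If reflection is off, `list` fails even with a valid token; fall back to the method names documented in `llm/README.md`.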
- `backend/README.md` — backend details: tech, ports, migrations, run instructions
- `frontend/README.md` — frontend details: tech, scripts, env
- `llm/README.md` — LLM gRPC service: API methods, ports, env
- Backend API: `http://localhost:8080`
- PostgreSQL: `5432`
- Redis: `6379`
- MinIO: `9000` (API), `9001` (console)
- LLM gRPC: `50051`
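If a quick-start step fails, a few liveness checks against the default ports can narrow it down (a sketch assuming the stock Postgres, Redis, and netcat client tools are on your PATH):

```sh
curl -i http://localhost:8080                    # backend API reachable
pg_isready -h localhost -p 5432                  # PostgreSQL accepting connections
redis-cli -p 6379 ping                           # expect: PONG
curl -f http://localhost:9000/minio/health/live  # MinIO liveness probe
nc -z localhost 50051 && echo "LLM gRPC up"      # port check only, no auth
```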
- Use the `.env.example` files in each folder as templates.
- The LLM service expects a bearer token in gRPC metadata; set `LLM_GRPC_TOKEN` in both the backend and LLM `.env` files.
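For illustration only, the two values this README calls out might look as follows; every other key should follow the `.env.example` templates, and both sample values here are placeholders:

```sh
# Placeholder values; the real key list lives in each folder's .env.example.
STORAGE_PROVIDER=minio    # or: azure
LLM_GRPC_TOKEN=change-me  # must match between backend/.env and llm/.env
```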