A simple chat interface for Mistral AI's language models. Built with Next.js, Tailwind CSS, and the official Mistral AI TypeScript SDK.
Disclaimer: This project is not production-ready. It is for local use only.
- ✅ Chat Completion - Have natural conversations with Mistral AI's language models
- ✅ Streaming - See responses appear in real-time as they're generated
- ✅ Context - Messages are preserved across the conversation for contextual awareness
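Context preservation works by resending the full message history with every request, since the chat API itself is stateless. A minimal sketch of that bookkeeping (the types and helper names here are illustrative, not the project's actual code; the streaming call itself would go through the official Mistral AI TypeScript SDK):

```typescript
// Sketch: the chat API is stateless, so the client keeps the whole
// conversation and resends it on each turn.

type Role = "user" | "assistant";
interface Message {
  role: Role;
  content: string;
}

// Append one completed turn (user prompt + assistant reply) to the history.
function withTurn(history: Message[], user: string, assistant: string): Message[] {
  return [
    ...history,
    { role: "user", content: user },
    { role: "assistant", content: assistant },
  ];
}

// Build the request payload for the next turn: the full history plus the
// new prompt. This object is what would be passed to the SDK's chat call.
function nextRequest(history: Message[], prompt: string) {
  return {
    model: "mistral-small-latest",
    messages: [...history, { role: "user", content: prompt }],
  };
}

let history: Message[] = [];
history = withTurn(history, "Hi!", "Hello! How can I help?");
const req = nextRequest(history, "What did I just say?");
console.log(req.messages.length); // both prior messages plus the new prompt
```

Because the whole history rides along on every request, the model can answer follow-up questions that refer back to earlier turns.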
- Clone this repository:

```sh
git clone https://github.com/delmaass/mistral-chat.git && cd mistral-chat
```

- Copy `.env` into `.env.local` and set your Mistral AI API key:

```sh
cp .env .env.local
```

- Install dependencies & run the development server:

```sh
npm i && npm run dev
```

Open http://localhost:3000 with your browser to see the result.
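After copying, `.env.local` should contain your key. The variable name below is an assumption for illustration; use whichever name the repository's `.env` template declares:

```sh
# .env.local (variable name assumed; check the .env template)
MISTRAL_API_KEY=your-api-key-here
```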
- Join Mistral AI - Indeed
- Authentication - Add user authentication and API key management
- Context & Server Session - Persist context with server-side session
- Conversations - Persist conversations and switch between them
