Code Capsules Base Agent

An Express API that exposes a Base AI Agent. Built with TypeScript and LangChain.

Overview

This API provides a LangChain-based Base Agent with RAG (Retrieval-Augmented Generation) capabilities. It allows for the use of a Redis vector store and supports streaming responses for real-time chat interactions.

Features

RAG (Retrieval-Augmented Generation)

Add contextual knowledge to the agent via:

  • Plain text input
  • URLs (automatically crawled and processed)

Documents are embedded and stored in Redis for semantic search.

Streaming Responses

Real-time streaming chat responses using Server-Sent Events (SSE) for better user experience.

Integration

Code Capsules Integration

The following endpoints enable interaction from the Code Capsules Agent Capsule UI:

  • POST /api/chat/message/stream - Stream chat responses
  • GET /api/chat/history - Retrieve chat history

Authentication: These endpoints use a lightweight security model:

  • Headers: X-CC-API-KEY, X-CC-EMAIL
  • Environment: INTERNAL_API_KEY

The API validates X-CC-API-KEY against INTERNAL_API_KEY and uses X-CC-EMAIL as the user identifier. These are not required for the agent to function but enable Code Capsules chatbot integration.
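
For illustration, a minimal TypeScript client sketch that supplies these headers (assuming Node 18+ with a global fetch; the base URL and key values are placeholders):

const BASE_URL = process.env.AGENT_URL ?? "http://localhost:3000"; // placeholder base URL

async function fetchChatHistory(): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/api/chat/history`, {
    headers: {
      "X-CC-API-KEY": process.env.INTERNAL_API_KEY ?? "your-secret-key", // validated against INTERNAL_API_KEY
      "X-CC-EMAIL": "user@example.com", // used as the user identifier
    },
  });
  if (!res.ok) throw new Error(`History request failed: ${res.status}`);
  return res.json();
}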

Setup

Link Repo to an Agent Capsule in Code Capsules

To create your own Agent Capsule in Code Capsules with this functionality:

  1. Create a new repo from this template
  2. Create an AI Agent Capsule
  3. Link your new repo to the capsule
  4. Mark the capsule as 'using a Code Capsules template' and enter your provider, model, and API key details
  5. Finish the setup and wait for the capsule to build

Vector Store Setup (Redis)

By default, Code Capsules Agent templates handle vector store logic using a Redis instance. This is NOT recommended for production, as the instance data is not persistent: if the Redis instance restarts, all vector data is lost.

Via Code Capsules

To link a vector store through Code Capsules:

  1. Create a new Redis Capsule
  2. Go to the Details tab of your new Redis Capsule
  3. Copy the Connection string
  4. Go to the Config tab of your Agent Capsule
  5. Edit the environment variables using Text Editor and add the following:
REDIS_URL=your_copied_connection_string
  6. Save the changed environment variables
  7. Wait for the capsule to restart. Once complete, your Agent Capsule will now use this Redis Capsule as its vector store.

Warning

Please note: Agent Capsule templates can use either a Redis or an in-memory vector store. When using the in-memory option (i.e. no Redis Capsule is linked), it is recommended to scale the Agent Capsule as follows to ensure optimal performance:

  • CPU 25%
  • Memory 1GB
  • Replicas 1

To do this, visit the Agent Capsule Scale page.

Via local/other implementations

To link a vector store:

  1. Start a new Redis instance using Docker.
  2. Add the following as an environment variable to your agent:
REDIS_URL=your_redis_instance_url
  3. Your Agent Capsule will now use this Redis instance as its vector store.
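
As a quick sanity check, a minimal sketch (assuming the node-redis client package; the template may use a different client internally) for verifying the REDIS_URL connection before pointing the agent at it:

import { createClient } from "redis";

// Connects to the Redis instance referenced by REDIS_URL and issues a PING.
const client = createClient({ url: process.env.REDIS_URL });
client.on("error", (err) => console.error("Redis error:", err));

async function checkRedis(): Promise<void> {
  await client.connect();
  console.log("PING ->", await client.ping()); // expect "PONG"
  await client.quit();
}

checkRedis();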

Architecture

Diagram

[Architecture diagram]

  • Vector Store: Uses Redis as a vector database, with an in-memory vector store as a fallback, to support RAG functionality.
  • History: Non-persistent Redis or In-Memory store. Can be easily expanded to be persistent.
  • Embeddings: Uses Hugging Face Transformers embeddings model to embed text into a vector space locally.
  • Agent: The core LangChain agent that handles the chat logic.
  • API: API exposed via Express.js.

Agent Tools

The Base Agent has access to the following tools:

1. retrieve

  • Purpose: RAG (Retrieval-Augmented Generation) tool for retrieving contextual information
  • Description: Performs similarity search in the Redis vector store to find relevant documents
  • Input: Query string
  • Output: Retrieved documents with source metadata and content
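
Conceptually, the tool boils down to a similarity search against the vector store. A hedged sketch (not the template's actual implementation; `vectorStore` is assumed to be an already-initialized LangChain vector store, Redis or in-memory):

import type { VectorStore } from "@langchain/core/vectorstores";

// Roughly what "retrieve" does: find the closest documents for a query and
// return their content together with source metadata.
async function retrieve(vectorStore: VectorStore, query: string): Promise<string> {
  const docs = await vectorStore.similaritySearch(query, 4); // top 4 matches (k is illustrative)
  return docs
    .map((doc) => `Source: ${doc.metadata?.source ?? "unknown"}\n${doc.pageContent}`)
    .join("\n---\n");
}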

Installation

npm install

Environment Variables

For local testing, create a .env file:

# App Configuration
PORT=3000
APP_URL=localhost:3000
APP_NAME=my-agent
DEV_MODE=false

# Security
# This is generated by Code Capsules on creation of the Agent capsule
INTERNAL_API_KEY=your-secret-key

# LLM Provider (Required)
PROVIDER_API_KEY=your-google-api-key
PROVIDER_NAME=google-genai
PROVIDER_MODEL=gemini-2.0-flash

# Vector Store (Optional, but recommended)
REDIS_URL=redis://localhost:6379

# Host documentation (Optional)
SHOW_DOCUMENTATION=true
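
A small sketch of how these variables might be loaded and checked at startup (assuming dotenv; the template's own config loader may differ):

import "dotenv/config";

// Fail fast if the required provider settings are missing.
const required = ["PROVIDER_NAME", "PROVIDER_MODEL", "PROVIDER_API_KEY"] as const;
for (const name of required) {
  if (!process.env[name]) throw new Error(`Missing required environment variable: ${name}`);
}

const port = Number(process.env.PORT ?? 3000);
const redisUrl = process.env.REDIS_URL; // optional: without it, the in-memory vector store is used
console.log(`Starting ${process.env.APP_NAME ?? "agent"} on port ${port}`, { redisUrl });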

Scripts

To run

npm start

API Endpoints

Chat

  • POST /api/chat/message - Send a message (non-streaming)

    • Body: { message: string }
    • Response: Complete chat response
    • Auth: Required (X-CC-API-KEY, X-CC-EMAIL)
  • POST /api/chat/message/stream - Send a message (SSE streaming)

    • Body: { message: string }
    • Response: Server-Sent Events stream
    • Auth: Required (X-CC-API-KEY, X-CC-EMAIL)
  • GET /api/chat/history - Get chat history for user

    • Response: Array of previous messages
    • Auth: Required (X-CC-API-KEY, X-CC-EMAIL)
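
A minimal client sketch for the streaming endpoint (assuming Node 18+ global fetch; the exact SSE payload format depends on the template, so the raw stream is printed as-is):

async function streamMessage(message: string): Promise<void> {
  const res = await fetch("http://localhost:3000/api/chat/message/stream", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-CC-API-KEY": process.env.INTERNAL_API_KEY ?? "your-secret-key",
      "X-CC-EMAIL": "user@example.com",
    },
    body: JSON.stringify({ message }),
  });
  if (!res.ok || !res.body) throw new Error(`Stream request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true })); // raw "data:" SSE lines
  }
}

streamMessage("Hello, agent!");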

Context (RAG)

  • POST /api/rag/text - Add text context to vector store

    • Body: { text: string }
    • Response: Confirmation of context addition
    • Auth: Required (X-CC-API-KEY, X-CC-EMAIL)
  • POST /api/rag/url - Add context from URL to vector store

    • Body: { url: string }
    • Response: Confirmation of context addition
    • Auth: Required (X-CC-API-KEY, X-CC-EMAIL)
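
For example, adding plain-text context could look like this (same placeholder base URL and headers as above):

// Top-level await assumes an ES module context.
await fetch("http://localhost:3000/api/rag/text", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-CC-API-KEY": process.env.INTERNAL_API_KEY ?? "your-secret-key",
    "X-CC-EMAIL": "user@example.com",
  },
  body: JSON.stringify({ text: "Code Capsules is a platform for deploying applications." }),
});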

Documentation (only visible if SHOW_DOCUMENTATION env var is set to true)

  • GET /api-docs - Swagger UI (public, no auth required)
  • GET /swagger.json - Swagger JSON (public, no auth required)

Customization

Change Auth

Edit src/middleware/auth-user.ts to change API authorization. By default, this uses the headers described above (X-CC-API-KEY, X-CC-EMAIL) and validates the API key against INTERNAL_API_KEY.
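
As a reference point, a hypothetical middleware of the kind auth-user.ts implements (the real file may attach the user identity differently):

import type { Request, Response, NextFunction } from "express";

export function authUser(req: Request, res: Response, next: NextFunction): void {
  const apiKey = req.header("X-CC-API-KEY");
  const email = req.header("X-CC-EMAIL");

  // Reject requests whose key does not match INTERNAL_API_KEY or that omit the user email.
  if (!apiKey || apiKey !== process.env.INTERNAL_API_KEY || !email) {
    res.status(401).json({ error: "Unauthorized" });
    return;
  }

  (req as Request & { userEmail?: string }).userEmail = email; // hypothetical property name
  next();
}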

Change LLM Provider and/or Model

When using Code Capsules, changing the provider, model, and API key is easy via the Agent Capsule 'Config' page. For other implementations, edit the PROVIDER_NAME, PROVIDER_MODEL, and PROVIDER_API_KEY environment variables according to your requirements.

Supported providers include:

  • google-genai (default)
  • openai
  • anthropic
  • mistralai
  • groq
  • cohere
  • cerebras
  • xai
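
For reference, a hedged sketch of how the default google-genai provider maps onto these variables (other providers follow the same pattern with their own LangChain packages, e.g. @langchain/openai; the template's provider factory may wire this differently):

import { ChatGoogleGenerativeAI } from "@langchain/google-genai";

// PROVIDER_MODEL and PROVIDER_API_KEY come from the environment variables described above.
const model = new ChatGoogleGenerativeAI({
  model: process.env.PROVIDER_MODEL ?? "gemini-2.0-flash",
  apiKey: process.env.PROVIDER_API_KEY,
});

const reply = await model.invoke("Say hello in one sentence."); // top-level await assumes an ES module
console.log(reply.content);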

Modify System Prompt

Update src/modules/agent/config/system-prompt.ts to change agent behavior, personality, or instructions. When using Code Capsules, ensure that the Agent Capsule is rebuilt and deployed to see your changes.

Add or Remove Tools

Add New Tools:

  1. Create a new tool class in src/modules/agent/tools/implementations/ extending BaseTool
  2. Implement the tool using LangChain's tool() function
  3. Register the tool in src/modules/agent/tools/tools-manager.ts
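
For step 2 above, a hypothetical tool defined with LangChain's tool() helper might look like this (the BaseTool wrapper and tools-manager registration are template-specific and not shown):

import { tool } from "@langchain/core/tools";
import { z } from "zod";

// A trivial example tool; replace the name, description, and logic with your own.
export const echoTool = tool(
  async ({ text }) => `You said: ${text}`,
  {
    name: "echo",
    description: "Echoes the provided text back to the user.",
    schema: z.object({ text: z.string().describe("Text to echo") }),
  }
);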

Remove Tools:

  1. Remove the tool import from src/modules/agent/tools/implementations/index.ts
  2. Remove the tool registration from src/modules/agent/tools/tools-manager.ts

When using Code Capsules, ensure that the Agent Capsule is rebuilt and deployed to see your changes.

Change Vector Store

The RAG module currently uses Redis or an in-memory vector store. To switch to another vector store (Pinecone, Weaviate, etc.), modify src/modules/rag/services/vector-store.service.ts.

Tech Stack

  • Runtime: Node.js + TypeScript
  • Framework: Express 5
  • LLM Framework: LangChain
  • LLM Provider: Google Generative AI
  • Vector Store: Redis (or Memory for fallback)
  • Documentation: Swagger/OpenAPI

Development Notes

  • CORS is enabled for all origins (*)
  • Request size limit: 10MB
  • Swagger docs are publicly accessible at /api-docs
