
🏗️ Serverless Chat Application – Deployment Architecture

AWS · Serverless · OpenAI · License: MIT

This repository showcases the deployment architecture and infrastructure setup for a serverless AI-powered chat application, leveraging AWS services, Amazon Bedrock, and OpenAI LLMs for intelligent conversational experiences.
The design prioritizes scalability, low latency, and minimal operational overhead using a fully managed cloud-native approach.

Architecture Overview

The architecture consists of the following components:

Frontend – S3 + CloudFront

  • A static web application hosted in Amazon S3.
  • Distributed globally through Amazon CloudFront for high performance, caching, and SSL-secured access.
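As a rough sketch of how this tier wires together, the following builds the S3 static-website configuration and a CloudFront origin block pointing at the bucket's website endpoint. The bucket name and region are placeholders, not values from this repo; in a real deployment these dicts would be passed to boto3's `put_bucket_website` and `create_distribution` calls.

```python
# Hypothetical bucket name; the actual frontend bucket is not documented here.
BUCKET = "chat-app-frontend-example"

def s3_website_config() -> dict:
    """Static-website hosting configuration for the frontend bucket."""
    return {
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    }

def cloudfront_origin(bucket: str, region: str = "us-east-1") -> dict:
    """CloudFront origin block targeting the S3 website endpoint."""
    return {
        "Id": f"s3-{bucket}",
        # Website-endpoint hostname format varies slightly by region.
        "DomainName": f"{bucket}.s3-website-{region}.amazonaws.com",
        "CustomOriginConfig": {
            "HTTPPort": 80,
            "HTTPSPort": 443,
            # S3 website endpoints only speak HTTP; CloudFront terminates SSL.
            "OriginProtocolPolicy": "http-only",
        },
    }
```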

API Gateway

  • Serves as the entry point for all client requests.
  • Manages routing, authentication, and rate-limiting to backend services.

AWS Lambda – Business Logic Layer

  • Contains the core logic for handling chat interactions.
  • Processes user input, integrates with both OpenAI and Bedrock models, and manages responses.
  • Implements lightweight orchestration and memory handling.
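A minimal sketch of such a handler, assuming the standard API Gateway proxy integration: it parses the JSON body, delegates to a stubbed model call, and returns a proxy-format response. The `generate_reply` stub stands in for the Bedrock/OpenAI invocation described in the next section.

```python
import json

def generate_reply(message: str) -> str:
    # Placeholder for the actual Bedrock/OpenAI call.
    return f"Echo: {message}"

def lambda_handler(event: dict, context=None) -> dict:
    """Chat handler for an API Gateway Lambda proxy integration."""
    try:
        body = json.loads(event.get("body") or "{}")
        message = body["message"]
    except (json.JSONDecodeError, KeyError):
        return {"statusCode": 400,
                "body": json.dumps({"error": "expected JSON body with 'message'"})}
    reply = generate_reply(message)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"reply": reply}),
    }
```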

LLM Engines – Bedrock & OpenAI

  • Amazon Bedrock: Provides access to foundation models for enterprise-grade AI.
  • OpenAI LLM (e.g., GPT-4): Enhances conversational capabilities with advanced natural-language understanding and reasoning.
  • Lambda dynamically routes requests to the most appropriate model based on context or configuration.

S3 Storage – Memory Layer

  • Stores conversation history and session metadata for persistence.
  • Enables contextual continuity across chat sessions.
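A plausible shape for this layer, assuming one JSON object per session (the key layout is a guess, not the repo's actual scheme): history is kept as a JSON list of turns, and in the deployed app the string would round-trip through S3 via boto3 `get_object`/`put_object`. The helpers are kept pure here so they run without AWS access.

```python
import json

def session_key(session_id: str) -> str:
    """S3 object key for one session's history (layout is an assumption)."""
    return f"sessions/{session_id}/history.json"

def append_turn(history_json: str, role: str, content: str) -> str:
    """Append a chat turn to a JSON-encoded history and re-serialize it.

    In production the string is fetched from and written back to S3;
    an empty string stands for a session with no stored history yet.
    """
    history = json.loads(history_json) if history_json else []
    history.append({"role": role, "content": content})
    return json.dumps(history)
```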
