An NLIP web server with a JavaScript client for displaying products and handling multimodal chat interactions.
- Overview
- Features
- Quick Start
- Documentation
- Project Structure
- Architecture
- Configuration
- Contributing
- License
nlip_web is a web application that demonstrates the NLIP (Natural Language Interface Protocol) through three main features:
- Product Search: Search products across multiple e-commerce stores simultaneously
- Text Chat: Conversational interface using local LLMs via Ollama
- Image Chat: Multimodal chat interface supporting text and image inputs
The project uses FastAPI for the backend, React/Vite for modern frontends, and vanilla JavaScript for simple chat interfaces. All communication uses the NLIP protocol for structured message exchange.
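As a rough illustration of what such an exchange might look like, the snippet below posts a single text message to the local text chat server (port 8010, see Quick Start). The endpoint path and the message field names are assumptions for illustration only; check static/nlip.js and nlip_web/text_chat.py for the schema this project actually uses.

```python
# Illustrative sketch only: the "/nlip/" route and the field names below
# are assumptions, not confirmed by this repository.
import requests

message = {
    "format": "text",         # assumed NLIP message format label
    "subformat": "english",   # assumed subformat label
    "content": "Find me a laptop under $800",
}

# The text chat server from the Quick Start listens on port 8010.
response = requests.post("http://localhost:8010/nlip/", json=message)
print(response.json())
```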
- Website Modules: Product search modules for various stores
- Product Display: Product tables and cards
- Multi-Store Support: Amazon, Walmart, Staples, Newegg, etc.
- Text Chat: Text chat interface
- Image Chat: Multimodal chat with image support
- Session Management: Conversation state tracking
- Python 3.10+
- Poetry (Python package manager)
- Node.js 18+ (for React client, if using)
- Ollama running locally (for chat features)
- Chrome/Chromium (for web scraping)
```bash
# Clone the repository
git clone https://github.com/nlip-project/nlip_web.git
cd nlip_web

# Install Python dependencies
poetry install
poetry add beautifulsoup4 selenium requests

# Install client dependencies (if using React frontend)
cd client
npm install
```

Text Chat Server (Port 8010):
```bash
poetry run python scripts.py chat --port 8010
# Or: poetry run python nlip_web/text_chat.py
# Access at http://localhost:8010
```

Image Chat Server (Port 8020):
```bash
poetry run python scripts.py image --port 8020
# Or: poetry run python nlip_web/image_chat.py
# Access at http://localhost:8020
```

Product Search (CLI):
```bash
./run_main_module.sh
# Or: python3 -m website_modules.main_module "search term"
```

Access the interfaces in your browser:

- Text Chat: http://localhost:8010 or http://localhost:8010/static/text_chat.html
- Image Chat: http://localhost:8020 or http://localhost:8020/static/image_chat.html
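If you want to exercise the image chat server without the browser page, the heavily hedged sketch below sends text plus a base64-encoded image in one request. The submessage layout is an assumption about how a multimodal NLIP message might be structured, not this repository's confirmed schema; see static/image_script.js for what the client really sends.

```python
# Heavily hedged sketch: the route and the submessage structure are
# assumptions; consult static/image_script.js for the real format.
import base64
import requests

with open("photo.jpg", "rb") as f:          # any local JPEG
    encoded = base64.b64encode(f.read()).decode("ascii")

message = {
    "format": "text",
    "subformat": "english",
    "content": "What is in this picture?",
    "submessages": [
        {"format": "binary", "subformat": "jpeg", "content": encoded},
    ],
}

response = requests.post("http://localhost:8020/nlip/", json=message)
print(response.json())
```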
Documentation is available for all major components:
- Text Chat Development Guide - Development guide for the text chat interface
- Image Chat Development Guide - Development guide for the image chat interface
- Website Modules - Guide for creating and using store modules
- Architecture - System architecture overview
- Contributing - Guidelines for contributors
The primary goal is to present products clearly in the browser: the system receives product data as JSON, parses it, and renders it as product tables and cards.
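For a concrete sense of that flow, here is a minimal sketch with a made-up payload; the field names are hypothetical, not the actual schema emitted by website_modules/main_module.py, and the real client renders HTML tables and cards rather than printing text.

```python
# Hypothetical product payload -- field names are illustrative only.
import json

products_json = """
[
  {"store": "Walmart", "title": "Wireless Mouse",      "price": "19.99"},
  {"store": "Newegg",  "title": "Mechanical Keyboard", "price": "49.99"}
]
"""

# Parse the JSON and print a crude text table; the JavaScript client
# performs the equivalent step before building HTML rows and cards.
for product in json.loads(products_json):
    print(f"{product['store']:<8} {product['title']:<22} ${product['price']}")
```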
```
nlip_web/
├── nlip_web/                 # Python backend modules
│   ├── genai.py              # Ollama client interface
│   ├── text_chat.py          # Text chat server (port 8010)
│   ├── image_chat.py         # Image chat server (port 8020)
│   ├── vite_chat.py          # Product search server (port 8030)
│   └── nlip_ext.py           # NLIP extensions and session management
│
├── website_modules/          # E-commerce scraping modules
│   ├── main_module.py        # Orchestrates all store searches
│   ├── staples_ca_module.py
│   ├── newegg_ca_module.py
│   └── ... (other store modules)
│
├── static/                   # Static HTML/JS files
│   ├── text_chat.html        # Text chat interface
│   ├── image_chat.html       # Image chat interface
│   ├── chat_script.js        # Text chat client
│   ├── image_script.js       # Image chat client
│   └── nlip.js               # NLIP protocol library
│
├── client/                   # React frontend (if present)
│   └── src/                  # React components
│
└── Documentation files
    ├── README.md             # This file
    ├── ARCHITECTURE.md       # Architecture overview
    └── CONTRIBUTING.md       # Contributing guidelines
```
The system consists of three main applications:
- Text Chat: Simple HTML interface connecting to Ollama for text conversations
- Image Chat: HTML interface supporting text and image inputs for multimodal AI
- Product Search: Web scraping system that searches multiple e-commerce stores in parallel (see the sketch below)
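The sketch below illustrates the parallel-search idea only; the function name, its signature, and the thread-based approach are assumptions for illustration, not the actual implementation in website_modules/main_module.py.

```python
# Illustration of searching several stores concurrently; search_store is a
# hypothetical stand-in for the real per-store scraping modules.
from concurrent.futures import ThreadPoolExecutor

def search_store(store: str, query: str) -> list[dict]:
    # A real module would drive Selenium/BeautifulSoup here; this stub
    # just returns a placeholder result for the given store.
    return [{"store": store, "title": f"{query} (sample)", "price": "0.00"}]

stores = ["amazon", "walmart", "staples", "newegg"]

with ThreadPoolExecutor(max_workers=len(stores)) as pool:
    futures = [pool.submit(search_store, s, "laptop") for s in stores]
    results = [item for f in futures for item in f.result()]

print(results)
```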
All applications use the NLIP protocol for structured communication and maintain session state through correlator tokens.
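As a sketch of how a client might keep that state, the snippet below remembers a correlator token returned by the server and attaches it to the next message. The token field name and where it appears in the reply are assumptions; see nlip_web/nlip_ext.py and static/nlip.js for the actual handling.

```python
# Sketch only: the "correlator" field name and the "/nlip/" route are
# assumptions, not the repository's confirmed API.
import requests

BASE_URL = "http://localhost:8010/nlip/"
correlator = None  # no session established yet

def send(text: str) -> dict:
    global correlator
    message = {"format": "text", "subformat": "english", "content": text}
    if correlator:
        # Attach the token so the server can tie this message to the
        # existing conversation.
        message["correlator"] = correlator
    reply = requests.post(BASE_URL, json=message).json()
    correlator = reply.get("correlator", correlator)  # remember the token
    return reply

print(send("Hello!"))
print(send("What did I just ask you?"))  # continues the same session
```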
For detailed architecture information, see ARCHITECTURE.md.
| Variable | Default | Description |
|---|---|---|
| `LOCAL_PORT` | 8010 (text), 8020 (image), 8030 (products) | Server port |
| `CHAT_MODEL` | granite3-moe (text), llava (image) | Ollama model name |
| `CHAT_HOST` | localhost | Ollama server host |
| `CHAT_PORT` | 11434 | Ollama server port |
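As an illustration of how these settings are typically consumed, the snippet below reads the same variables with the documented defaults. This mirrors what the servers are assumed to do internally; confirm the exact behaviour in nlip_web/text_chat.py and nlip_web/genai.py.

```python
# Assumed pattern for picking up configuration; the variable names and
# defaults come from the table above, not from inspected server code.
import os

LOCAL_PORT = int(os.environ.get("LOCAL_PORT", "8010"))      # server port
CHAT_MODEL = os.environ.get("CHAT_MODEL", "granite3-moe")   # Ollama model
CHAT_HOST = os.environ.get("CHAT_HOST", "localhost")        # Ollama host
CHAT_PORT = int(os.environ.get("CHAT_PORT", "11434"))       # Ollama port

print(f"Serving on port {LOCAL_PORT}; using {CHAT_MODEL} "
      f"at http://{CHAT_HOST}:{CHAT_PORT}")
```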
Before using chat features, ensure Ollama is running:
```bash
# Install Ollama (if not already installed)
# Visit https://ollama.ai/ for installation instructions

# Pull required models
ollama pull granite3-moe   # For text chat
ollama pull llava          # For image chat

# Verify Ollama is running
curl http://localhost:11434/api/tags
```

We welcome contributions. Please see CONTRIBUTING.md for guidelines on:
- Code style and standards
- How to submit pull requests
- Adding new store modules
- Improving documentation
- Reporting issues
Apache-2.0 License - see LICENSE file for details.
- NLIP Server - Required dependency
- NLIP SDK - Development tools
For questions, issues, or contributions:
- Open an issue on GitHub
- Review the documentation in each component's directory
- Check the troubleshooting sections in component documentation