This repository contains boilerplate code and basic examples of different ways to create an AI agent that uses custom tools and a locally run LLM (via Ollama). You can easily modify it to use any other model as well.
Examples:
- LangGraph Agent - Uses the LangGraph library and related LangChain libraries for a basic ReAct agent with simple tools.
- Strands Agent - Uses the Strands Agents library and strands-agents-tools.
- More to come.
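Conceptually, every example above follows the same tool-calling loop: the model either answers directly or emits a structured tool call, the runtime executes the matching tool, and the result is fed back to the model. Below is a dependency-free sketch of that loop; the message format and the `fake_model` stub are illustrative only, not the API of any of the libraries listed:

```python
# Minimal, library-free sketch of the tool-calling loop that agent
# frameworks such as LangGraph implement for you. `fake_model` is a
# stand-in stub, not a real LLM call.

def add(a: int, b: int) -> int:
    """A custom tool the agent can call."""
    return a + b

TOOLS = {"add": add}

def fake_model(messages):
    """Stub model: requests the `add` tool once, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "add", "args": {"a": 2, "b": 3}}}
    result = next(m for m in messages if m["role"] == "tool")["content"]
    return {"content": f"The answer is {result}"}

def run_agent(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_model(messages)
        call = reply.get("tool_call")
        if call is None:                     # model answered directly
            return reply["content"]
        result = TOOLS[call["name"]](**call["args"])  # execute the tool
        messages.append({"role": "tool", "content": result})
```

For example, `run_agent("What is 2 + 3?")` returns "The answer is 5" after one tool round-trip.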
This project uses uv as its package manager. Install it with
pip install uv
then run uv --help to check that it is installed correctly.
Install Ollama, then pull a model with tool support (see https://ollama.com/blog/tool-support and https://ollama.com/search?c=tools):
ollama run qwen3:4b
or
ollama run llama3.2
This opens a chat prompt where you can test the model, or exit with /bye.
Start the Ollama server:
ollama serve
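By default the Ollama server listens on http://localhost:11434. A quick way to check from Python that it is reachable (the URL is Ollama's default; the helper name is ours):

```python
import urllib.request
import urllib.error

def ollama_running(url: str = "http://localhost:11434") -> bool:
    """Return True if an HTTP server responds at `url` (Ollama's default address)."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, make sure `ollama serve` is running before starting any of the agent examples.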
Install the Python dependencies:
uv add langgraph langchain_core langchain langchain-ollama
Comment/uncomment the code in main.py to run a specific agent example.
Then run the main module with
uv run -m main
or
.\.venv\Scripts\python.exe main.py
or, to run just the LangGraph example,
uv run -m langgraph_chat.chat_assitant