
feat: Add LlamaIndex Agent Integration #284

Open
tonycdr-prog wants to merge 1 commit into openagents-org:develop from tonycdr-prog:feat/llamaindex-integration

Conversation

@tonycdr-prog

Summary

Implements LlamaIndex agent runner as requested in #264, following the existing integration pattern.

What is LlamaIndex?

LlamaIndex is a data framework for LLM applications, excelling at:

  • Retrieval Augmented Generation (RAG)
  • Document Q&A and knowledge bases
  • Multi-document reasoning
  • Structured data extraction

Changes

  • New LlamaIndexAgentRunner class in src/openagents/agents/llamaindex_agent.py
  • Tool conversion utilities (see the sketch after this list):
    • openagents_tool_to_llamaindex() - convert OpenAgents tools to LlamaIndex FunctionTool
    • llamaindex_tool_to_openagents() - convert LlamaIndex tools to OpenAgents tools
  • Supports multiple agent interfaces: chat(), query(), achat(), aquery()
  • Async-first design with sync fallback
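
The conversion helpers are intended to work in both directions. The snippet below is a minimal sketch of how they might be used; the import path and the exact signatures are assumptions based on the module layout above (each helper is assumed to take a single tool object and return its counterpart).

```python
from llama_index.core.tools import FunctionTool
# Assumed import path for the new helpers, based on the exports described above
from openagents.agents import (
    openagents_tool_to_llamaindex,
    llamaindex_tool_to_openagents,
)

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

# Wrap a plain Python function as a LlamaIndex FunctionTool
li_tool = FunctionTool.from_defaults(fn=add, name='add', description='Add two integers')

# Expose the LlamaIndex tool to OpenAgents agents...
oa_tool = llamaindex_tool_to_openagents(li_tool)

# ...and convert an OpenAgents tool back into a FunctionTool for a LlamaIndex agent
li_tool_again = openagents_tool_to_llamaindex(oa_tool)
```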

Usage

Basic Agent

from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from openagents.agents import LlamaIndexAgentRunner

# Create LlamaIndex agent (`tools` is a list of FunctionTool objects defined elsewhere)
llm = OpenAI(model='gpt-4')
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

# Connect to OpenAgents
runner = LlamaIndexAgentRunner(
    llamaindex_agent=agent,
    agent_id='knowledge-agent'
)
runner.start(network_host='localhost', network_port=8600)
runner.wait_for_stop()
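
The runner drives the agent through its standard LlamaIndex interfaces (chat()/achat(), query()/aquery()), so the same agent object can also be exercised directly before it is handed to the runner:

```python
# Plain LlamaIndex call, independent of OpenAgents
response = agent.chat('What can you help with?')
print(str(response))
```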

With RAG Pipeline

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool
from llama_index.llms.openai import OpenAI
from openagents.agents import LlamaIndexAgentRunner

# Build index from documents
documents = SimpleDirectoryReader('data').load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Create tool from query engine
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name='knowledge_base',
    description='Search the knowledge base for information'
)

# Create agent with RAG tool
llm = OpenAI(model='gpt-4')
agent = ReActAgent.from_tools([query_tool], llm=llm)
runner = LlamaIndexAgentRunner(
    llamaindex_agent=agent,
    agent_id='rag-agent'
)
runner.start(network_host='localhost', network_port=8600)
runner.wait_for_stop()

Reference

Follows the pattern established in:

  • src/openagents/agents/langchain_agent.py

Closes #264

Implements LlamaIndex agent runner as requested in openagents-org#264.

## Changes
- Add `LlamaIndexAgentRunner` class in `src/openagents/agents/llamaindex_agent.py`
- Support for LlamaIndex RAG-focused agents (ReActAgent, OpenAIAgent, etc.)
- Tool conversion utilities:
  - `openagents_tool_to_llamaindex()` - convert OpenAgents tools to FunctionTool
  - `llamaindex_tool_to_openagents()` - convert LlamaIndex tools to OpenAgents
- Supports both chat() and query() interfaces
- Async-first with sync fallback (sketched below)
- Event filtering support
- Export new classes from `openagents.agents`
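
The async-first behaviour can be pictured roughly as follows. This is a sketch only, not the PR's actual implementation: it assumes the runner prefers the agent's native async methods and falls back to running the sync ones in a worker thread.

```python
import asyncio

async def run_agent(agent, message: str) -> str:
    """Rough sketch of async-first dispatch with a sync fallback."""
    if hasattr(agent, 'achat'):
        return str(await agent.achat(message))                    # native async chat
    if hasattr(agent, 'aquery'):
        return str(await agent.aquery(message))                   # native async query
    if hasattr(agent, 'chat'):
        return str(await asyncio.to_thread(agent.chat, message))  # sync chat off the event loop
    return str(await asyncio.to_thread(agent.query, message))     # sync query fallback
```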

## Usage
```python
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
from openagents.agents import LlamaIndexAgentRunner

llm = OpenAI(model='gpt-4')
agent = ReActAgent.from_tools(tools, llm=llm)

runner = LlamaIndexAgentRunner(
    llamaindex_agent=agent,
    agent_id='knowledge-agent'
)
runner.start(network_host='localhost', network_port=8600)
```

Closes openagents-org#264

vercel bot commented Feb 3, 2026

@Bandit-AI is attempting to deploy a commit to the Raphael's projects Team on Vercel.

A member of the Team first needs to authorize it.
