
Conversation

royisme (Owner) commented Nov 7, 2025

Summary

  • add the llama-agents dependency and build a reusable FunctionCallingAgent factory
  • expose Neo4j knowledge and memory operations as asynchronous tools and session manager utilities
  • provide FastAPI endpoints for creating agent sessions and driving multi-turn conversations with retrieval and memory updates
  • ensure pytest includes the repository root and src directory on sys.path so codebase_rag imports resolve (a rough sketch follows below)
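
As an illustration of that last bullet, a minimal sys.path setup in tests/conftest.py could look like the following sketch; the actual file in this PR may differ.

# tests/conftest.py (sketch; the PR's version may differ)
import sys
from pathlib import Path

REPO_ROOT = Path(__file__).resolve().parents[1]
SRC_DIR = REPO_ROOT / "src"

# Prepend both the repository root and src/ so `codebase_rag` and
# `src.codebase_rag` imports resolve the same way under pytest.
for path in (str(REPO_ROOT), str(SRC_DIR)):
    if path not in sys.path:
        sys.path.insert(0, path)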

Testing

  • pytest (fails: ImportError: cannot import name 'PackBuilder' from 'src.codebase_rag.services.pipeline')

Codex Task


chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment on lines 132 to 181
async def process_message(
    self,
    session_id: str,
    message: str,
    *,
    auto_save_memories: bool = False,
) -> Dict[str, Any]:
    async with self._lock:
        session = self._sessions.get(session_id)

    if session is None:
        raise KeyError(f"Session '{session_id}' not found")

    session.chat_history.append(_to_chat_message(MessageRole.USER, message))

    conversation_payload = [
        {"role": msg.role.value, "content": msg.content}
        for msg in session.chat_history
    ]

    call_kwargs = {
        "chat_history": session.chat_history,
        "metadata": {
            "project_id": session.project_id,
            "auto_save_memories": auto_save_memories,
            "conversation": conversation_payload,
        },
    }

    if hasattr(session.agent, "achat"):
        response = await session.agent.achat(message, **call_kwargs)  # type: ignore[attr-defined]
    elif hasattr(session.agent, "chat"):
        response = session.agent.chat(message, **call_kwargs)  # type: ignore[attr-defined]
    elif hasattr(session.agent, "arun"):
        response = await session.agent.arun(message, **call_kwargs)  # type: ignore[attr-defined]
    else:
        raise AttributeError("FunctionCallingAgent does not expose a chat method")

    reply_text = _extract_response_text(response)
    session.chat_history.append(_to_chat_message(MessageRole.ASSISTANT, reply_text))

    tool_events = _extract_tool_events(response)
    session.tool_events.extend(tool_events)

    task_record = {
        "user_message": message,
        "assistant_reply": reply_text,
        "tools_used": tool_events,
    }
    session.task_trace.append(task_record)

P1: Guard per-session state while processing messages

Message handling fetches the AgentSession under a lock and then releases the lock before mutating chat_history, tool_events, and invoking the agent. If two HTTP requests hit the same session concurrently, both coroutines will read the same pre‑message history, append their messages without coordination, and call the underlying FunctionCallingAgent simultaneously. This can interleave tool calls and append replies out of order, leaving the stored chat history and task trace inconsistent. Consider locking per session (or holding the lock through the entire method) so only one message is processed at a time for each session.
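
A minimal sketch of the per-session locking this comment suggests, assuming the session registry shown in the snippet above (the lock-registry helper is illustrative, not the PR's code):

import asyncio
from typing import Any, Dict


class AgentSessionManager:
    def __init__(self) -> None:
        self._sessions: Dict[str, Any] = {}
        self._registry_lock = asyncio.Lock()                # guards the dicts below
        self._session_locks: Dict[str, asyncio.Lock] = {}   # one lock per session

    async def _lock_for(self, session_id: str) -> asyncio.Lock:
        async with self._registry_lock:
            return self._session_locks.setdefault(session_id, asyncio.Lock())

    async def process_message(self, session_id: str, message: str) -> Dict[str, Any]:
        lock = await self._lock_for(session_id)
        async with lock:  # held for the whole turn, so concurrent requests serialize per session
            session = self._sessions.get(session_id)
            if session is None:
                raise KeyError(f"Session '{session_id}' not found")
            # ...append the user message, invoke the agent, record tool events,
            # and append the reply exactly as in the snippet above...
            return {"session_id": session_id, "echo": message}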


royisme requested a review from Copilot on November 7, 2025 05:08

Copilot AI (Contributor) left a comment


Pull Request Overview

This PR introduces a LlamaIndex-based agent orchestration layer to provide multi-turn conversational capabilities with integrated knowledge retrieval and memory management. The key changes include pytest path configuration fixes, agent tooling infrastructure, session management, and FastAPI endpoint exposure.

  • Adds sys.path configuration to resolve src/ imports during pytest execution
  • Implements async tools for Neo4j knowledge queries and memory operations
  • Provides session-based agent conversation management with tool activity tracking
  • Exposes RESTful endpoints for agent session lifecycle and message processing (a rough sketch of the endpoint shape follows below)
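
A minimal sketch of what such endpoints could look like; the route paths, request models, manager method names, and the agent_session_manager import below are assumptions, not the PR's actual agent_routes.py:

from fastapi import APIRouter, HTTPException
from pydantic import BaseModel

# Assumed import; the PR exposes a singleton manager from the agents package,
# but its exact name may differ.
from codebase_rag.services.agents import agent_session_manager

router = APIRouter(prefix="/agent", tags=["agent"])


class CreateSessionRequest(BaseModel):
    project_id: str


class MessageRequest(BaseModel):
    message: str
    auto_save_memories: bool = False


@router.post("/sessions")
async def create_session(payload: CreateSessionRequest) -> dict:
    session_id = await agent_session_manager.create_session(payload.project_id)
    return {"session_id": session_id}


@router.post("/sessions/{session_id}/messages")
async def send_message(session_id: str, payload: MessageRequest) -> dict:
    try:
        return await agent_session_manager.process_message(
            session_id, payload.message, auto_save_memories=payload.auto_save_memories
        )
    except KeyError:
        raise HTTPException(status_code=404, detail="Session not found")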

Reviewed Changes

Copilot reviewed 10 out of 11 changed files in this pull request and generated 4 comments.

Summary per file:

tests/conftest.py: Adds both the repository root and the src directory to sys.path for proper import resolution
src/codebase_rag/services/utils/metrics.py: Adds a backward-compatible MetricsCollector alias and fixes docstring punctuation
src/codebase_rag/services/pipeline/__init__.py: Exports PackBuilder and pack_builder from the code service module
src/codebase_rag/services/agents/tools.py: Defines async tool wrappers for knowledge graph queries and memory operations
src/codebase_rag/services/agents/session_manager.py: Implements AgentSessionManager for multi-turn conversations with tool orchestration
src/codebase_rag/services/agents/base.py: Provides a FunctionAgent factory with a default toolset and system prompt
src/codebase_rag/services/agents/__init__.py: Package initialization exposing agent components and the singleton manager
src/codebase_rag/services/__init__.py: Adds the agents module to the services package exports
src/codebase_rag/core/routes.py: Registers agent routes with the FastAPI application
src/codebase_rag/api/agent_routes.py: Implements REST endpoints for agent session management and messaging


# Knowledge Pipeline module initialization
"""Pipeline service package exports."""

from codebase_rag.services.code import PackBuilder, pack_builder

Copilot AI Nov 7, 2025


The import path uses codebase_rag.services.code but this should be src.codebase_rag.services.code based on the project structure shown in conftest.py, or it should use a relative import like from ..code import PackBuilder, pack_builder since this is within the services package.

Suggested change
-from codebase_rag.services.code import PackBuilder, pack_builder
+from ..code import PackBuilder, pack_builder

Comment on lines +22 to +33
    if not neo4j_knowledge_service._initialized:  # type: ignore[attr-defined]
        async with _knowledge_lock:
            if not neo4j_knowledge_service._initialized:  # type: ignore[attr-defined]
                await neo4j_knowledge_service.initialize()


async def _ensure_memory_ready() -> None:
    """Initialize the memory store when first accessed."""

    if not memory_store._initialized:  # type: ignore[attr-defined]
        async with _memory_lock:
            if not memory_store._initialized:  # type: ignore[attr-defined]

Copilot AI Nov 7, 2025


[nitpick] Accessing private _initialized attribute with type ignore suggests fragile coupling. Consider adding a public is_initialized() method to the service class for cleaner encapsulation.

Suggested change
-    if not neo4j_knowledge_service._initialized:  # type: ignore[attr-defined]
-        async with _knowledge_lock:
-            if not neo4j_knowledge_service._initialized:  # type: ignore[attr-defined]
-                await neo4j_knowledge_service.initialize()
-async def _ensure_memory_ready() -> None:
-    """Initialize the memory store when first accessed."""
-    if not memory_store._initialized:  # type: ignore[attr-defined]
-        async with _memory_lock:
-            if not memory_store._initialized:  # type: ignore[attr-defined]
+    if not neo4j_knowledge_service.is_initialized():
+        async with _knowledge_lock:
+            if not neo4j_knowledge_service.is_initialized():
+                await neo4j_knowledge_service.initialize()
+async def _ensure_memory_ready() -> None:
+    """Initialize the memory store when first accessed."""
+    if not memory_store.is_initialized():
+        async with _memory_lock:
+            if not memory_store.is_initialized():
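
Service-side, the public accessor this comment asks for could be as small as the following sketch (the class name and attribute are assumptions based on the snippet above):

class Neo4jKnowledgeService:
    def __init__(self) -> None:
        self._initialized = False

    def is_initialized(self) -> bool:
        """Public, read-only view of the private initialization flag."""
        return self._initialized

    async def initialize(self) -> None:
        # ...connect to Neo4j, create indexes, etc....
        self._initialized = True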

"session_id": self.session_id,
"project_id": self.project_id,
"metadata": self.metadata,
"turns": len(self.chat_history) // 2,

Copilot AI Nov 7, 2025


[nitpick] The magic number 2 assumes each turn consists of exactly one user message and one assistant message. Consider adding a named constant like MESSAGES_PER_TURN = 2 or computing turns by counting user messages only for clarity and maintainability.

Suggested change
-    "turns": len(self.chat_history) // 2,
+    "turns": sum(1 for msg in self.chat_history if msg.role == MessageRole.USER),

royisme merged commit b3f01bf into main on Nov 7, 2025
9 checks passed