
PostHog with pydantic-ai streaming raises TypeError: 'async_generator' #393

@noelw-a11y

Description

When using PostHog with pydantic-ai's streaming functionality, a TypeError is raised because the response object returned by PostHog's wrapped client does not support the asynchronous context manager protocol that pydantic-ai expects.

A repository reproducing the issue is available here:
https://github.com/noelw-a11y/posthog-pydantic/tree/main

We have observed this issue with other providers, such as OpenAI and Anthropic, as well.

Environment

  • Python version: 3.12.2
  • posthog version: >=7.3.1
  • openai version: >=2.13.0
  • pydantic-ai version: >=1.33.0

Steps to Reproduce

See the README in the reproduction repository:
https://github.com/noelw-a11y/posthog-pydantic/tree/main
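Roughly, the failing setup looks like the sketch below. This is a paraphrase, not the exact repro code: the API key placeholders and model name are assumptions, and the pydantic-ai class names vary across releases (e.g. OpenAIModel vs OpenAIChatModel).

import asyncio

from posthog import Posthog
from posthog.ai.openai import AsyncOpenAI  # PostHog's wrapped OpenAI client
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

posthog = Posthog('phc_...', host='https://us.i.posthog.com')

# The wrapped client captures LLM analytics events via the PostHog client.
client = AsyncOpenAI(api_key='sk-...', posthog_client=posthog)

model = OpenAIModel('gpt-4o-mini', provider=OpenAIProvider(openai_client=client))
agent = Agent(model)


async def test_streaming() -> None:
    # Fails here with:
    # TypeError: 'async_generator' object does not support the
    # asynchronous context manager protocol
    async with agent.run_stream(
        'Write a short 3-sentence story about a robot learning to code.'
    ) as result:
        async for text in result.stream_text():
            print(text)


asyncio.run(test_streaming())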

Expected Behavior

Streaming should work correctly, allowing pydantic-ai to iterate over the response chunks.

Actual Behavior

A TypeError is raised:

TypeError: 'async_generator' object does not support the asynchronous context manager protocol

Error Traceback

Traceback (most recent call last):
  File "main.py", line 182, in <module>
    main()
  File "main.py", line 175, in main
    asyncio.run(test_streaming())
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/asyncio/base_events.py", line 685, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "main.py", line 79, in test_streaming
    async with agent.run_stream('Write a short 3-sentence story about a robot learning to code.') as result:
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../pydantic_ai/agent/abstract.py", line 525, in run_stream
    async with node.stream(graph_ctx) as stream:
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../pydantic_ai/_agent_graph.py", line 450, in stream
    async with ctx.deps.model.request_stream(
  File "/Users/noelwilson/.pyenv/versions/3.12.2/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../pydantic_ai/models/openai.py", line 496, in request_stream
    async with response:
TypeError: 'async_generator' object does not support the asynchronous context manager protocol

Root Cause Analysis

The issue occurs at pydantic_ai/models/openai.py:496 where the code attempts to use:

async with response:

However, when using posthog.ai.openai.AsyncOpenAI, the response object is an async generator that doesn't implement __aenter__ and __aexit__ methods required for the async context manager protocol.
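The openai SDK's own streaming call returns an AsyncStream object that supports both async for and async with, whereas PostHog's wrapper yields chunks from a plain async generator, which only supports async for. For illustration only, one possible shape of a fix inside the wrapper is a small adapter exposing both protocols; the AsyncStreamAdapter name and design here are hypothetical, not PostHog's API.

from typing import Any, AsyncIterator


class AsyncStreamAdapter:
    """Hypothetical sketch: wrap an async generator so it also satisfies the
    async context manager protocol that pydantic-ai uses (async with response:)."""

    def __init__(self, agen: AsyncIterator[Any]) -> None:
        self._agen = agen

    def __aiter__(self) -> AsyncIterator[Any]:
        # Delegate chunk iteration to the wrapped generator.
        return self._agen

    async def __aenter__(self) -> 'AsyncStreamAdapter':
        return self

    async def __aexit__(self, *exc_info: object) -> None:
        # Close the wrapped generator so any cleanup/analytics code in the
        # wrapper still runs when the context manager exits.
        aclose = getattr(self._agen, 'aclose', None)
        if aclose is not None:
            await aclose()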

Workaround

Using the standard openai.AsyncOpenAI client (without PostHog) works correctly. The issue only occurs when PostHog's wrapper is used.
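For comparison, pointing the same agent at the unwrapped client streams without error. This is a sketch with the same caveats on class names and placeholders as above.

from openai import AsyncOpenAI
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Plain OpenAI client, no PostHog wrapper.
client = AsyncOpenAI(api_key='sk-...')
model = OpenAIModel('gpt-4o-mini', provider=OpenAIProvider(openai_client=client))
agent = Agent(model)  # agent.run_stream(...) streams as expected here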

Additional Context

  • Non-streaming calls (using agent.run()) appear to work correctly with PostHog
  • The issue is specific to streaming functionality (agent.run_stream())
  • A minimal reproduction repository is available at: https://github.com/noelw-a11y/posthog-pydantic/tree/main

Related Code Location

The error originates from:

  • pydantic_ai/models/openai.py, line 496: async with response:
  • The response comes from PostHog's wrapped AsyncOpenAI client's streaming method
