How do you use Sentry?
Sentry Saas (sentry.io)
Version
2.59.0
Steps to Reproduce
`PydanticAIIntegration` is incompatible with `pydantic-ai` ≥ 1.93. In 1.93, `AbstractAgent.run_stream_events` was changed from an async generator to a regular function that returns an `AgentEventStream` async context manager (see pydantic/pydantic-ai abstract.py#L1064-L1176 and the `AgentEventStream` class with `__aenter__` / `__aexit__`). Pydantic AI's own `VercelAIAdapter` now enters it with `async with self.agent.run_stream_events(...) as stream:` (_adapter.py:530-549).
Sentry's wrapper for `run_stream_events` is itself defined as `async def ... yield`, which makes it an async-generator function. So `Agent.run_stream_events(...)` after patching always returns an `async_generator` object regardless of what the original returns, and `AgentEventStream`'s context-manager protocol is hidden behind the wrapper.
`sentry_sdk/integrations/pydantic_ai/patches/agent_run.py`:

```python
def _create_streaming_events_wrapper(
    original_func: "Callable[..., Any]",
) -> "Callable[..., Any]":
    @wraps(original_func)
    async def wrapper(self: "Any", *args: "Any", **kwargs: "Any") -> "Any":
        try:
            async for event in original_func(self, *args, **kwargs):
                yield event
        except Exception as exc:
            ...

    return wrapper
```
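As a standalone illustration of why this shape is the problem (not sentry-sdk code): any `async def` containing `yield` is an async-generator function, so calling it produces an `async_generator` object no matter what the wrapped callable would have returned:

```python
# Standalone illustration, not sentry-sdk code: the mere presence of `yield`
# inside `async def` makes the function an async-generator function, so the
# wrapped callable's return shape (e.g. a context manager) is always lost.
import inspect


async def wrapper():
    yield  # makes this an async-generator function


print(inspect.isasyncgenfunction(wrapper))  # True
res = wrapper()
print(type(res).__name__)         # async_generator
print(hasattr(res, "__aexit__"))  # False, so `async with res:` raises TypeError
```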
The last commit to `agent_run.py` (#5947, 2026-04-10) predates pydantic-ai 1.93.0 (released 2026-05-09), so this case hasn't been considered yet.
Minimal repro:

```python
import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration
from pydantic_ai import Agent
from pydantic_ai.models.test import TestModel
from pydantic_ai.ui.vercel_ai import VercelAIAdapter

sentry_sdk.init(dsn="<any-valid-dsn>", integrations=[PydanticAIIntegration()])

agent = Agent(model=TestModel(), name="repro")

# Direct call after patching:
res = agent.run_stream_events("hello")
print(type(res).__name__)         # async_generator (should be AgentEventStream)
print(hasattr(res, "__aexit__"))  # False

# The real-world failure path:
# VercelAIAdapter does `async with agent.run_stream_events(...) as stream:`
# which raises TypeError: 'async_generator' object does not support the
# asynchronous context manager protocol (missed __aexit__ method)
```
Versions of relevant packages:

- sentry-sdk==2.59.0
- pydantic-ai==1.93.0 / pydantic-ai-slim==1.93.0
- Python 3.14
Expected Result
`Agent.run_stream_events(...)` should still return an `AgentEventStream` async context manager after the Sentry patch, so callers that use `async with agent.run_stream_events(...)` (including pydantic-ai's own `VercelAIAdapter.run_stream_native`) continue to work.
Actual Result
`Agent.run_stream_events(...)` returns an `async_generator` object. Any caller using `async with` on the return value raises:

```
TypeError: 'async_generator' object does not support the asynchronous context manager protocol (missed __aexit__ method)

  File ".../pydantic_ai/ui/_adapter.py", line 531, in stream_events
    async with self.agent.run_stream_events(
               ~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        output_type=output_type,
    ...
```
This breaks all streaming chat through `VercelAIAdapter` (and presumably the other UI adapters that use the same pattern) whenever the Sentry integration is enabled. pydantic-ai's `transform_stream` catches this internally and emits an SSE error chunk via `on_error`, so the failure surfaces to clients as "Something went wrong while the assistant was responding" without a useful traceback in Sentry.
Suggested direction
`_create_streaming_events_wrapper` needs to return whatever shape the wrapped function returns. With the 1.93 API that means returning the `AgentEventStream` itself (its `__aenter__` / `__aexit__` and `__aiter__` need to be honoured), or wrapping it in a delegating context manager that drives the original through the same protocol. The current `async def ... yield` shape is only correct for the pre-1.93 async-iterator API.
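One possible shape, as a rough sketch rather than a patch (`_DelegatingEventStream`, `FakeEventStream`, and `FakeAgent` below are illustrative stand-ins, not sentry-sdk or pydantic-ai names): make the wrapper a plain `def` that returns a delegating async context manager, so the 1.93 `AgentEventStream` protocol is preserved and `__aexit__` gives the integration a natural place to capture errors:

```python
# Rough sketch only: _DelegatingEventStream, FakeEventStream and FakeAgent are
# illustrative stand-ins, not actual sentry-sdk or pydantic-ai names.
import asyncio
from functools import wraps
from typing import Any, Callable


class _DelegatingEventStream:
    """Delegates the async-context-manager and async-iterator protocols to the
    wrapped stream; __aexit__ is where error capture / span finishing could go."""

    def __init__(self, inner: Any) -> None:
        self._inner = inner

    async def __aenter__(self) -> "_DelegatingEventStream":
        self._stream = await self._inner.__aenter__()
        return self

    async def __aexit__(self, exc_type: Any, exc: Any, tb: Any) -> Any:
        # e.g. capture exc here when it is not None, then delegate cleanup
        return await self._inner.__aexit__(exc_type, exc, tb)

    def __aiter__(self) -> Any:
        return self._stream.__aiter__()


def _create_streaming_events_wrapper(original_func: Callable[..., Any]) -> Callable[..., Any]:
    # A plain `def`, not `async def ... yield`: calling it preserves the
    # original's return shape instead of forcing an async_generator.
    @wraps(original_func)
    def wrapper(self: Any, *args: Any, **kwargs: Any) -> Any:
        return _DelegatingEventStream(original_func(self, *args, **kwargs))

    return wrapper


# Demonstration with stand-ins for AgentEventStream / Agent:
class FakeEventStream:
    def __init__(self, events: list) -> None:
        self._events = events

    async def __aenter__(self) -> "FakeEventStream":
        return self

    async def __aexit__(self, *exc_info: Any) -> bool:
        return False

    def __aiter__(self) -> Any:
        return self._gen()

    async def _gen(self) -> Any:
        for event in self._events:
            yield event


class FakeAgent:
    def run_stream_events(self, prompt: str) -> FakeEventStream:
        return FakeEventStream(["part_start", "part_end"])


FakeAgent.run_stream_events = _create_streaming_events_wrapper(FakeAgent.run_stream_events)


async def main() -> list:
    collected = []
    # `async with` works again after patching:
    async with FakeAgent().run_stream_events("hello") as stream:
        async for event in stream:
            collected.append(event)
    return collected


print(asyncio.run(main()))  # ['part_start', 'part_end']
```

A version check (e.g. `inspect.isasyncgenfunction(original_func)`) could keep the existing generator-shaped wrapper for pre-1.93 releases and only use the delegating shape for 1.93+.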
Happy to follow up with a PR if helpful.