OpenAI server_is_overloaded stream errors are not retried #25884

@johnwaldo

Description

OpenAI-compatible streams can fail with a transient overload event instead of normal text:

{"type":"error","error":{"type":"service_unavailable_error","code":"server_is_overloaded","message":"Our servers are currently overloaded. Please try again later."}}

When that shape is surfaced as a stream error, it should enter the existing session retry flow instead of stopping the turn as a final error.
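A minimal sketch of how the error event could be classified as retryable before it is promoted to a final turn error (the `StreamErrorEvent` shape and `isRetryableStreamError` name are illustrative, not OpenCode's actual internals):

```typescript
// Hypothetical event shape for an OpenAI-compatible stream error payload.
interface StreamErrorEvent {
  type: string;
  error?: {
    type?: string;
    code?: string;
    message?: string;
  };
}

// Transient provider conditions that should route into the session retry flow
// rather than terminate the turn. Only the code from this issue is listed;
// a real implementation would likely cover other transient codes too.
const RETRYABLE_CODES = new Set(["server_is_overloaded"]);

function isRetryableStreamError(event: StreamErrorEvent): boolean {
  if (event.type !== "error" || event.error === undefined) return false;
  return (
    event.error.type === "service_unavailable_error" ||
    (event.error.code !== undefined && RETRYABLE_CODES.has(event.error.code))
  );
}

// The exact payload from this issue parses as retryable:
const overload: StreamErrorEvent = JSON.parse(
  '{"type":"error","error":{"type":"service_unavailable_error","code":"server_is_overloaded","message":"Our servers are currently overloaded. Please try again later."}}'
);
```

With a check like this in the stream error path, the event would enter the existing retry countdown instead of ending the session.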

Plugins

None required to reproduce the underlying provider error shape.

OpenCode version

1.14.33

Steps to reproduce

  1. Use an OpenAI-compatible model during a provider overload window.
  2. The provider returns the stream error above.
  3. The session stops instead of showing the existing retry countdown/status.

Screenshot and/or share link

N/A

Operating System

macOS

Terminal

Any
