feat: add MiniMax provider support #2260

@jxnl

Description

MiniMax provides an OpenAI-compatible API, and its M2.7 model is worth supporting as a first-class provider.

What's needed

  • from_minimax() factory in instructor/providers/minimax/
  • MINIMAX_TOOLS and MINIMAX_JSON modes in Mode enum
  • Wire into handle_response_model and handle_reask_kwargs in processing/response.py
  • JSON mode via system prompt (MiniMax does not support response_format)
  • Strip <think>...</think> tags from JSON mode output (reasoning models emit these)
  • from_provider("minimax/...") support in auto_client.py
  • Integration tests against real API (requires MINIMAX_API_KEY)
  • Docs at docs/integrations/minimax.md
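The JSON-mode items above (driving JSON output via a system prompt and stripping `<think>` tags) could look roughly like this sketch; the helper names are illustrative, not the actual instructor internals:

```python
import json
import re

# Reasoning models wrap chain-of-thought in <think>...</think>; strip it
# before parsing, since it is not part of the JSON payload.
THINK_TAG_RE = re.compile(r"<think>.*?</think>", re.DOTALL)

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> blocks emitted by MiniMax reasoning models."""
    return THINK_TAG_RE.sub("", text).strip()

def json_mode_system_prompt(schema: dict) -> str:
    """Build the system prompt that stands in for response_format,
    which MiniMax does not support."""
    return (
        "You must respond only with valid JSON matching this schema:\n"
        + json.dumps(schema)
    )

def parse_json_mode_output(raw: str) -> dict:
    """Strip reasoning tags, then parse the remaining JSON."""
    return json.loads(strip_think_tags(raw))

raw = '<think>User wants a user object.</think>{"name": "Ada", "age": 36}'
print(parse_json_mode_output(raw))  # {'name': 'Ada', 'age': 36}
```

The non-greedy `.*?` with `re.DOTALL` keeps the regex from swallowing everything between the first `<think>` and the last `</think>` when multiple reasoning blocks appear.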

Notes

  • API base URL: https://api.minimax.io/v1
  • Primary models: MiniMax-M2.7, MiniMax-M2.7-highspeed
  • Uses standard OpenAI SDK with custom base URL — no separate SDK needed

Metadata


Labels

    enhancement (New feature or request), python (Pull requests that update python code), size:L (This PR changes 100-499 lines, ignoring generated files)
