Model ID case normalization breaks third-party provider compatibility #729

@dfwqdyl-ui

Description:

DeepSeek TUI (v0.8.11) normalizes model IDs to lowercase before sending them to the API, which breaks compatibility with third-party API providers that require case-sensitive model names.

Steps to reproduce:

  1. Configure default_text_model = "DeepSeek-V4-Pro" in config.toml
  2. Run deepseek model resolve "DeepSeek-V4-Pro"
  3. Observe the resolved model name

Actual behavior:

$ deepseek model resolve "DeepSeek-V4-Pro"
deepseek-v4-pro

The model ID is lowercased to deepseek-v4-pro, regardless of what was configured.

Expected behavior:

The model ID should be passed through as-is to the API. If a provider requires DeepSeek-V4-Pro (exact case), the TUI should respect that.

Why this matters:

Third-party API providers (OpenAI-compatible proxies, self-hosted gateways, etc.) often enforce exact model name matching. If the TUI lowercases DeepSeek-V4-Pro to deepseek-v4-pro, the provider rejects the request with a "model not found" error. The base_url config allows pointing to a different endpoint, but without case-preserving model IDs it's unusable for providers with case-sensitive routing.

Possible fix:

Add a passthrough or raw provider mode (or a config flag like preserve_model_case = true) that sends the model ID exactly as configured in default_text_model, without any normalization or provider-specific mapping.
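A minimal sketch of what the resolution logic could look like with such a flag. All names here (`resolveModel`, `Config`, `ALIASES`) are hypothetical and not taken from the deepseek-tui codebase; this only illustrates the proposed behavior:

```typescript
// Hypothetical config shape; `preserve_model_case` is the proposed flag.
interface Config {
  default_text_model: string;
  preserve_model_case?: boolean;
}

// Illustrative alias table mapping normalized names to canonical IDs.
const ALIASES: Record<string, string> = {
  "deepseek-v4-pro": "deepseek-v4-pro",
};

function resolveModel(id: string, config: Config): string {
  // Proposed: with the flag set, send the configured ID exactly as-is,
  // skipping normalization and provider-specific mapping entirely.
  if (config.preserve_model_case) {
    return id;
  }
  // Current behavior: lowercase before alias lookup, which is what
  // breaks case-sensitive third-party providers.
  const normalized = id.toLowerCase();
  return ALIASES[normalized] ?? normalized;
}
```

With `preserve_model_case = true`, `resolveModel("DeepSeek-V4-Pro", config)` would return `"DeepSeek-V4-Pro"` unchanged; without it, the current lowercasing path returns `"deepseek-v4-pro"`.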

Environment:

  • OS: Windows 10/11 x64
  • Version: deepseek-tui 0.8.11
  • Installation: npm global


Labels: bug (Something isn't working), enhancement (New feature or request)
