Description:
DeepSeek TUI (v0.8.11) normalizes model IDs to lowercase before sending them to the API, which breaks compatibility with third-party API providers that require case-sensitive model names.
Steps to reproduce:
- Configure default_text_model = "DeepSeek-V4-Pro" in config.toml
- Run deepseek model resolve "DeepSeek-V4-Pro"
- Observe the resolved model name
Actual behavior:
$ deepseek model resolve "DeepSeek-V4-Pro"
deepseek-v4-pro
The model ID is lowercased to deepseek-v4-pro, regardless of what was configured.
Expected behavior:
The model ID should be passed through as-is to the API. If a provider requires DeepSeek-V4-Pro (exact case), the TUI should respect that.
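With pass-through resolution, the same command would echo the configured ID unchanged, e.g.:
$ deepseek model resolve "DeepSeek-V4-Pro"
DeepSeek-V4-Pro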
Why this matters:
Third-party API providers (OpenAI-compatible proxies, self-hosted gateways, etc.) often enforce exact model name matching. If the TUI lowercases DeepSeek-V4-Pro to deepseek-v4-pro, the provider rejects the request with a "model not found" error. The base_url config allows pointing to a different endpoint, but without case-preserving model IDs it's unusable for providers with case-sensitive routing.
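For context, a setup along these lines is where the lowercasing breaks things (the gateway URL is illustrative, not from an actual deployment):

```toml
# config.toml -- illustrative example; the base_url value is hypothetical
# The provider behind base_url routes requests on the exact model string.
base_url = "https://my-gateway.example.com/v1"
default_text_model = "DeepSeek-V4-Pro"
# The TUI currently sends "deepseek-v4-pro", which such a gateway
# rejects with a "model not found" error.
```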
Possible fix:
Add a passthrough or raw provider mode (or a config flag like preserve_model_case = true) that sends the model ID exactly as configured in default_text_model, without any normalization or provider-specific mapping.
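A sketch of what the config-flag variant could look like (preserve_model_case is the suggestion above, not an existing option in 0.8.11):

```toml
# config.toml -- proposed option, does not exist yet
default_text_model = "DeepSeek-V4-Pro"
preserve_model_case = true  # send the model ID exactly as written above, no normalization
```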
Environment:
- OS: Windows 10/11 x64
- Version: deepseek-tui 0.8.11
- Installation: npm global