Git provider
GitHub Cloud
System Info
Deployment: GitHub Actions (uses: qodo-ai/pr-agent@v0.33)
Model configured: gemini/gemini-2.5-flash
Also tested: o4-mini (OpenAI) → works correctly
API keys:
- GEMINI_API_KEY (from GitHub Secrets)
- OPENAI_KEY (only for testing, works fine)
Notes:
- Gemini API key works correctly outside PR-Agent:
  - GET /v1beta/models works
  - generateContent works
- The issue only happens inside PR-Agent
Bug details
I am using PR-Agent in GitHub Actions with (minimal workflow sketch below):
- config.model: gemini/gemini-2.5-flash
- GEMINI_API_KEY from GitHub Secrets
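For reference, a sketch of the workflow I use. This is an approximate reconstruction rather than a verbatim copy: the trigger events and step layout are from memory; the `uses:` line, the model name, and the env var names match what I actually run.

```yaml
# Approximate reconstruction of my workflow (sketch, not verbatim).
name: PR Agent
on:
  pull_request:
  issue_comment:
jobs:
  pr_agent:
    runs-on: ubuntu-latest
    steps:
      - uses: qodo-ai/pr-agent@v0.33
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
          config.model: "gemini/gemini-2.5-flash"
```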
The Gemini API key is valid:
- listing models via API works
- generateContent works via curl (an in-runner version of these checks is sketched below)
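To rule out a mangled or missing secret, the same two checks that pass for me locally can be run inside the Action runner itself. This step is my own diagnostic suggestion, not part of PR-Agent; the endpoints are the public Gemini REST API.

```yaml
# Hypothetical diagnostic step: proves the secret arrives in the runner intact.
- name: Verify GEMINI_API_KEY inside the runner
  env:
    GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
  run: |
    # Same check as my local "GET /v1beta/models"
    curl -sf -H "x-goog-api-key: ${GEMINI_API_KEY}" \
      "https://generativelanguage.googleapis.com/v1beta/models" > /dev/null \
      && echo "list models: OK"
    # Minimal generateContent call against the model PR-Agent is configured with
    curl -sf -X POST \
      -H "x-goog-api-key: ${GEMINI_API_KEY}" \
      -H "Content-Type: application/json" \
      -d '{"contents":[{"parts":[{"text":"ping"}]}]}' \
      "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent" > /dev/null \
      && echo "generateContent: OK"
```

If both checks print OK in the job log, the secret wiring is fine and the failure is downstream of the environment.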
However, inside PR-Agent:
- It tries gemini/gemini-2.5-flash
- Fails with: API_KEY_INVALID (generativelanguage.googleapis.com)
- Then falls back to o4-mini
- That fallback fails with: Incorrect API key provided: dummy_key
This happens for:
- PR review
- Code suggestions
Important observation:
Switching to OpenAI (o4-mini with OPENAI_KEY) works correctly; the only delta between the two runs is sketched after the list below.
So:
- same workflow
- same PR-Agent version
- same environment
→ OpenAI works
→ Gemini fails
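For contrast, the only difference in the working OpenAI run is the model name and the key env var; everything else in the workflow is identical (again a sketch, excerpted from the same step):

```yaml
# Working variant (sketch): same workflow, only these env entries differ.
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          OPENAI_KEY: ${{ secrets.OPENAI_KEY }}
          config.model: "o4-mini"
```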
This suggests an issue in Gemini integration or auth handling inside PR-Agent / LiteLLM.
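One way to narrow this down would be a step that calls LiteLLM directly in the same job, bypassing PR-Agent's key handling entirely. This is a hypothetical diagnostic, not something I have in my workflow; it relies on LiteLLM's documented behavior that the gemini/ provider reads GEMINI_API_KEY from the environment.

```yaml
# Hypothetical step: if this direct LiteLLM call succeeds while PR-Agent
# fails in the same job, the bug is in how PR-Agent hands the key to
# LiteLLM rather than in LiteLLM or the key itself.
- name: Isolate LiteLLM
  env:
    GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
  run: |
    pip install litellm
    python - <<'EOF'
    import litellm
    resp = litellm.completion(
        model="gemini/gemini-2.5-flash",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)
    EOF
```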
This behavior looks related to:
Expected behavior:
- If GEMINI_API_KEY is valid, PR-Agent should use Gemini successfully
- It should not fall back to OpenAI with dummy_key
Relevant log output
Failed to generate prediction with gemini/gemini-2.5-flash
Generating prediction with o4-mini
Incorrect API key provided: dummy_key
Failed to generate prediction with any model of ['gemini/gemini-2.5-flash', 'o4-mini']