
[anthropic] Full API response/stream chunks are logged at INFO level and crash on dump failure #2921

@APilotAdmin

Description


Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched existing issues in Dify and Dify Official Plugins, including closed ones.
  • I confirm that I am submitting this report in English (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please submit issues in English; otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

1.13.3

Plugin version

0.3.10

Cloud or Self Hosted

Cloud

Steps to reproduce

Plugin

models/anthropic

Describe the bug

In models/anthropic/models/llm/llm.py, the full Anthropic API response and every streaming chunk are logged at INFO level via response.model_dump_json(...) / chunk.model_dump_json().

This causes two problems:

  1. Log noise. Production logs are flooded with the entire serialized response (including long completions) for every request, and one line per streaming chunk. At default INFO verbosity this is unusable and bloats log storage.
  2. Unhandled serialization errors. model_dump_json() can raise for certain payloads (e.g. unexpected/new event types the SDK model can't serialize cleanly). Because the call is unguarded, a serialization failure aborts the whole request instead of just dropping a debug line.
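The failure mode in point 2 can be illustrated with a minimal, self-contained sketch. `BrokenChunk` is a hypothetical stand-in for an SDK model whose `model_dump_json()` raises (as in the `PydanticSerializationError` below), not the real Anthropic SDK type:

```python
import logging

class BrokenChunk:
    """Stand-in for a stream event whose serialization fails."""
    def model_dump_json(self, **kwargs):
        raise TypeError("'MockValSer' object is not an instance of 'SchemaSerializer'")

def log_chunk_unguarded(chunk):
    # Current behavior: model_dump_json() is evaluated eagerly as a log
    # argument, so the exception escapes and aborts the whole request.
    logging.info("Anthropic API Stream Response Chunk: %s", chunk.model_dump_json())

def log_chunk_guarded(chunk):
    # Proposed behavior: a dump failure degrades to a placeholder line.
    try:
        logging.debug("Anthropic API Stream Response Chunk: %s", chunk.model_dump_json())
    except Exception as e:
        logging.debug("Anthropic API Stream Response Chunk: <dump failed: %s, err: %s>",
                      type(chunk).__name__, e)

chunk = BrokenChunk()
try:
    log_chunk_unguarded(chunk)
    unguarded_survived = True
except TypeError:
    unguarded_survived = False

log_chunk_guarded(chunk)   # no exception propagates
print(unguarded_survived)  # False: the unguarded call aborted
```

Note that lowering the log level alone is not enough: the dump is evaluated before the logging call dispatches, so the guard is needed even at DEBUG.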

Expected behavior

  • Raw response/chunk dumps should be at DEBUG level, not INFO, so operators can opt in via log level.
  • The model_dump_json calls should be wrapped so a dump failure degrades gracefully and doesn't break the generation path.

Suggested fix

```python
# non-stream path
try:
    logging.debug(f"Anthropic API Response: {response.model_dump_json(indent=2)}")
except Exception as e:
    logging.debug(f"Anthropic API Response: <dump failed: {e}>")

# stream path
try:
    logging.debug(f"Anthropic API Stream Response Chunk: {chunk.model_dump_json()}")
except Exception as e:
    logging.debug(f"Anthropic API Stream Response Chunk: <dump failed: {type(chunk).__name__}, err: {e}>")
```

### ✔️ Error log

```
Failed to transform agent message: req_id: 828c7a92ab PluginInvokeError: {"args":{},"error_type":"Exception","message":"read llm model failed: request failed: req_id: efa4e3301c PluginInvokeError: {\"args\":{},\"error_type\":\"PydanticSerializationError\",\"message\":\"Error serializing to JSON: TypeError: 'MockValSer' object is not an instance of 'SchemaSerializer'\"}"}
```
