### Self Checks

### Dify version

1.13.3

### Plugin version

0.3.10

### Cloud or Self Hosted

Cloud

### Steps to reproduce

Plugin: models/anthropic

### Describe the bug
In `models/anthropic/models/llm/llm.py`, the full Anthropic API response and every streaming chunk are logged at `INFO` level via `response.model_dump_json(...)` / `chunk.model_dump_json()`. Paraphrased (the exact source lines may differ slightly):
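```python
# Paraphrased from llm.py; message wording inferred from the suggested fix below.

# non-stream path: dumps the entire response, including the full completion text
logging.info(f"Anthropic API Response: {response.model_dump_json(indent=2)}")

# stream path: one log line per streaming chunk
logging.info(f"Anthropic API Stream Response Chunk: {chunk.model_dump_json()}")
```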
This causes two problems:
- Log noise. Production logs are flooded with the entire serialized response (including long completions) for every request, and one line per streaming chunk. At default `INFO` verbosity this is unusable and bloats log storage.
- Unhandled serialization errors. `model_dump_json()` can raise for certain payloads (e.g. unexpected/new event types the SDK model can't serialize cleanly). Because the call is unguarded, a serialization failure aborts the whole request instead of just dropping a debug line.
Expected behavior
- Raw response/chunk dumps should be at `DEBUG` level, not `INFO`, so operators can opt in via log level.
- The `model_dump_json` calls should be wrapped so a dump failure degrades gracefully and doesn't break the generation path.
Suggested fix
```python
# non-stream path
try:
    logging.debug(f"Anthropic API Response: {response.model_dump_json(indent=2)}")
except Exception as e:
    logging.debug(f"Anthropic API Response: <dump failed: {e}>")

# stream path
try:
    logging.debug(f"Anthropic API Stream Response Chunk: {chunk.model_dump_json()}")
except Exception as e:
    logging.debug(f"Anthropic API Stream Response Chunk: <dump failed: {type(chunk).__name__}, err: {e}>")
```
### ✔️ Error log
```
Failed to transform agent message: req_id: 828c7a92ab PluginInvokeError: {"args":{},"error_type":"Exception","message":"read llm model failed: request failed: req_id: efa4e3301c PluginInvokeError: {\"args\":{},\"error_type\":\"PydanticSerializationError\",\"message\":\"Error serializing to JSON: TypeError: 'MockValSer' object is not an instance of 'SchemaSerializer'\"}"}
```