When using Ollama (URL http://localhost:11434), the response only shows the last streamed chunk/token/word.
It looks like each streamed token/chunk is replacing the response message instead of being appended to the previously streamed chunks/tokens.
This is on latest master (commit 952ee5f) checked out a few minutes ago.
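For illustration, here is a minimal sketch of the difference between the buggy behavior and the expected one, using simulated NDJSON chunks in the shape Ollama's `/api/generate` endpoint streams them (the chunk contents below are made up; no server is contacted):

```python
import json

# Simulated streamed lines; each carries a "response" fragment,
# mirroring the shape of Ollama's streaming /api/generate output.
chunks = [
    '{"response": "Hello"}',
    '{"response": ", "}',
    '{"response": "world"}',
]

def render_buggy(lines):
    # Observed behavior: each chunk REPLACES the message,
    # so only the last fragment survives.
    message = ""
    for line in lines:
        message = json.loads(line)["response"]
    return message

def render_expected(lines):
    # Expected behavior: each chunk is APPENDED,
    # yielding the full concatenated response.
    message = ""
    for line in lines:
        message += json.loads(line)["response"]
    return message

print(render_buggy(chunks))     # only the final fragment
print(render_expected(chunks))  # the complete message
```

With the chunks above, the buggy path yields just `world` while the expected path yields `Hello, world`, matching what the report describes.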