Bug: Ollama source stream issue #587

@LoopControl

Description

When using Ollama (URL http://localhost:11434), the response shows only the last streamed chunk/token/word.

It looks like the response message is being replaced with the last streamed token/chunk instead of having each new chunk appended to the previously streamed ones.

This is on latest master (commit 952ee5f) checked out a few minutes ago.
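For context, a minimal sketch of the append-vs-replace distinction, assuming the client consumes Ollama's newline-delimited JSON stream from `/api/generate` (each line carries a `"response"` token and a `"done"` flag); the `accumulate_chunks` helper and the sample stream are illustrative, not code from this repository:

```python
import json

def accumulate_chunks(chunks):
    """Build the full reply from streamed Ollama-style JSON lines."""
    message = ""
    for line in chunks:
        data = json.loads(line)
        # Correct behavior: append each streamed token.
        # The bug described above behaves as if this were
        # `message = data.get("response", "")`, keeping only the last chunk.
        message += data.get("response", "")
        if data.get("done"):
            break
    return message

# Hypothetical sample stream in Ollama's streaming format.
stream = [
    '{"response": "Hello", "done": false}',
    '{"response": ", ", "done": false}',
    '{"response": "world", "done": true}',
]
print(accumulate_chunks(stream))
```

With the append the output is `Hello, world`; with the buggy assignment, only the final chunk (`world`) would remain in the displayed message.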

Metadata

Labels: bug (Something isn't working)
