[BUG]: Ollama update preventing thinking models from showing thoughts (regular chat and embed widget) #4576

@shatfield4

Description

How are you running AnythingLLM?

Docker (local)

What happened?

  • Ollama version 0.12.6 has issues displaying thoughts from thinking-capable LLMs
  • The last Ollama version that worked on my machine was 0.11.4 (macOS 14, Apple Silicon)
  • This prevents thoughts from showing in the embed chat widget and in regular chats inside AnythingLLM
  • It also causes glitching in the frontend UI, where the message appears to be blank
  • Suspecting that Ollama changed the response format for LLM thoughts, so we will need to parse it in a new way
  • We will need to handle this in a backward-compatible way, since not all AnythingLLM users will be on the latest version of Ollama
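A backward-compatible parser could normalize both response shapes into one structure. This is only a sketch of the approach, not AnythingLLM's actual implementation: it assumes (unconfirmed in this issue) that newer Ollama builds return thoughts in a separate `message.thinking` field, while older builds inline them in `message.content` wrapped in `<think>...</think>` tags. The function name `extractThought` is hypothetical.

```javascript
// Hypothetical sketch: normalize an Ollama chat message so thinking text is
// handled the same way regardless of Ollama version.
// Assumption: new-style responses carry a `thinking` field; old-style
// responses embed <think>...</think> inside `content`.
function extractThought(message) {
  const content = message.content ?? "";

  // New-style: dedicated `thinking` field alongside `content`.
  if (typeof message.thinking === "string" && message.thinking.length > 0) {
    return { thought: message.thinking, content };
  }

  // Old-style: <think>...</think> block embedded in the content string.
  const match = /<think>([\s\S]*?)<\/think>/.exec(content);
  if (match) {
    return {
      thought: match[1].trim(),
      content: content.replace(match[0], "").trim(),
    };
  }

  // No thought present at all (non-thinking model, or thoughts disabled).
  return { thought: "", content };
}
```

Checking the `thinking` field first means an upgraded Ollama is handled correctly even if it ever also echoed tags in `content`, while older versions fall through to the tag-stripping path unchanged.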

Relevant issues

connect #4535
connect #4561
connect #4564

Are there known steps to reproduce?

No response
