### How are you running AnythingLLM?

Docker (local)

### What happened?

- Ollama version 0.12.6 has issues displaying thoughts from thinking-capable LLMs. The last Ollama version that worked on my machine was 0.11.4 (macOS 14, Apple Silicon).
- This prevents thoughts from showing in the embed chat widget and in regular chats inside AnythingLLM.
- It also causes glitching in the frontend UI, where the message appears to be blank.
- Suspecting that Ollama changed the response format for LLM thoughts, so we will need to parse it in a new way.
- We will need to handle this in a backward-compatible way, since not all AnythingLLM users will be on the latest version of Ollama (see the sketch at the end of this issue).

### Relevant issues

connect #4535
connect #4561
connect #4564

### Are there known steps to reproduce?

_No response_
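
---

Since the fix likely lives in the Ollama provider's response parsing, here is a minimal TypeScript sketch of how both formats could be normalized. The shape of the new format (a separate `thinking` field on the chat message) and the old format (inline `<think>...</think>` tags in `content`) are assumptions about Ollama's behavior, and the type/function names (`OllamaChatMessage`, `normalizeThoughts`) are hypothetical, not AnythingLLM's actual API.

```typescript
// Hypothetical normalizer; field and type names are assumptions,
// not AnythingLLM's real internals.
interface OllamaChatMessage {
  role: string;
  content: string;
  thinking?: string; // assumed: newer Ollama versions return thoughts here
}

interface NormalizedMessage {
  thought: string | null;
  text: string;
}

// Matches the legacy inline format where thoughts are embedded in content.
const THINK_TAG_RE = /<think>([\s\S]*?)<\/think>/;

function normalizeThoughts(message: OllamaChatMessage): NormalizedMessage {
  // Assumed newer Ollama behavior: thinking arrives as a separate field,
  // and content holds only the final answer.
  if (typeof message.thinking === "string" && message.thinking.length > 0) {
    return { thought: message.thinking, text: message.content };
  }

  // Assumed older Ollama behavior: thoughts arrive inline as
  // <think>...</think> inside content.
  const match = message.content.match(THINK_TAG_RE);
  if (match) {
    return {
      thought: match[1].trim(),
      text: message.content.replace(THINK_TAG_RE, "").trim(),
    };
  }

  // Non-thinking model, or no thoughts emitted.
  return { thought: null, text: message.content };
}

// Example: both shapes normalize to the same structure.
normalizeThoughts({ role: "assistant", content: "42", thinking: "Reasoning..." });
// -> { thought: "Reasoning...", text: "42" }
normalizeThoughts({ role: "assistant", content: "<think>Reasoning...</think>42" });
// -> { thought: "Reasoning...", text: "42" }
```

Checking the separate field first and falling back to tag parsing would keep older Ollama versions working without a version check; streaming responses would need the same normalization applied per accumulated chunk.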