This is really cool. Does this support streaming responses?
From the docs and code it appears not to, but it would be really nice to have. Something like how RubyLLM supports:
```ruby
chat.ask('what is the capital of North Carolina') do |chunk|
  broadcast_to_frontend chunk
end
```
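
For a bit more context, here is roughly what I have in mind on the consuming side. This is only a sketch: `broadcast_to_frontend` above and the channel name below are placeholders for whatever push mechanism the app uses (ActionCable in this example), and `chunk.content` is how RubyLLM exposes the streamed text.

```ruby
# Sketch only: "chat_updates" and the payload shape are made up for illustration.
chat.ask('what is the capital of North Carolina') do |chunk|
  # Each chunk arrives as the model streams; push its text straight to the client.
  ActionCable.server.broadcast('chat_updates', { content: chunk.content })
end
```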