For chat to be useful for testing, there should be a configuration option for setting the LLM system prompt.
A couple of ideas:
- A cogwheel button at the top of the chat view that opens a text-area dialog
- A menu item for LLM configuration that supports a list of predefined system prompts
There should also be storage to persist the system prompts and settings.
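The persistence layer could be as simple as a JSON file holding named prompts plus the active selection. A minimal sketch, assuming a file-based store; all names here (`PromptStore`, the file path, the JSON layout) are illustrative, not an existing API:

```python
# Illustrative sketch: persist named system prompts and the active
# selection as a JSON file. Class and field names are assumptions.
import json
from pathlib import Path


class PromptStore:
    """Stores named system prompts and remembers which one is active."""

    def __init__(self, path: Path):
        self.path = path
        self.data = {"prompts": {}, "active": None}
        if path.exists():
            # Reload previously saved prompts and settings.
            self.data = json.loads(path.read_text())

    def add_prompt(self, name: str, text: str) -> None:
        self.data["prompts"][name] = text
        self._save()

    def set_active(self, name: str) -> None:
        if name not in self.data["prompts"]:
            raise KeyError(name)
        self.data["active"] = name
        self._save()

    def active_prompt(self):
        name = self.data["active"]
        return self.data["prompts"].get(name) if name else None

    def _save(self) -> None:
        self.path.write_text(json.dumps(self.data, indent=2))
```

Both the cogwheel dialog and the menu item could then read and write through the same store, so the predefined-prompt list and the ad-hoc text area stay in sync.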