Conversation
- Custom prompts were being saved but not used in Ask feature
- Modified sendMessage function to retrieve custom prompts from localStorage and user presets
- Integrated custom prompts into system message before sending to AI
- Added console logging for debugging custom prompt application
- Fixes issue where personalized prompts were ignored in Ask responses

Resolves custom prompts not being applied to LLM calls while using Ask feature on macOS v0.2.3
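The change summarized above could look roughly like the following sketch. This is an illustration based only on the PR description, not the project's actual code: `buildSystemPrompt`, the storage keys `customPrompt` and `userPresets`, and the log format are all assumptions.

```javascript
// Illustrative sketch of the described fix: pull the user's custom prompt
// and presets from storage and fold them into the system message.
// Function and key names here are hypothetical.
function buildSystemPrompt(basePrompt, storage) {
  const customPrompt = storage.getItem('customPrompt');
  const presets = JSON.parse(storage.getItem('userPresets') ?? '[]');

  let systemPrompt = basePrompt;
  if (customPrompt) {
    systemPrompt += `\n\nUser's custom instructions:\n${customPrompt}`;
    // Debug logging, as mentioned in the PR summary.
    console.log('[Ask] Applied custom prompt:', customPrompt);
  }
  for (const preset of presets) {
    systemPrompt += `\n${preset}`;
  }
  return systemPrompt;
}

// Minimal stand-in for window.localStorage so the sketch runs in Node.
const storage = new Map();
storage.getItem = (k) => (storage.has(k) ? storage.get(k) : null);
storage.setItem = (k, v) => storage.set(k, String(v));

storage.setItem('customPrompt', 'Always answer in bullet points.');
const prompt = buildSystemPrompt('You are a helpful assistant.', storage);
console.log(prompt.includes('Always answer in bullet points.')); // → true
```

The key point of the fix is that the stored prompt is read at send time and appended to the system message, rather than being saved and then ignored.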
Hi @sundeep8967, thanks for your enthusiasm for fixing issue #111. However, I'll be closing this pull request for a couple of key reasons.

Firstly, as a heads-up on our contribution process, I was already assigned to this ticket and actively working on a solution. In the future, please check for an existing assignee before starting work to avoid duplicated effort.

Secondly, the technical approach in this PR is not aligned with our current architecture. We have recently deprecated direct localStorage access in favor of a centralized repository pattern. You can find our design pattern doc here.

We appreciate your interest in contributing. If you'd like to help, please feel free to pick up an unassigned issue, especially those tagged with "good first issue" or "help wanted". Thanks for your understanding.
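For context on the maintainer's objection, a centralized repository pattern typically means components depend on a small storage interface instead of calling `localStorage` directly. The sketch below is a hypothetical illustration of that idea; `PromptRepository` and its method names are not taken from the project.

```javascript
// Hypothetical sketch of a centralized repository replacing direct
// localStorage access. The storage backend is injected, so it can be
// localStorage in the renderer, a file, or an in-memory stub in tests.
class PromptRepository {
  constructor(storage) {
    this.storage = storage;
  }
  getCustomPrompt() {
    return this.storage.getItem('customPrompt') ?? '';
  }
  saveCustomPrompt(prompt) {
    this.storage.setItem('customPrompt', prompt);
  }
}

// In-memory storage stub so the sketch runs outside a browser.
const memoryStorage = (() => {
  const data = new Map();
  return {
    getItem: (k) => (data.has(k) ? data.get(k) : null),
    setItem: (k, v) => data.set(k, String(v)),
  };
})();

const repo = new PromptRepository(memoryStorage);
repo.saveCustomPrompt('Answer concisely.');
console.log(repo.getCustomPrompt()); // → 'Answer concisely.'
```

The benefit over scattered `localStorage` calls is a single place to change the storage backend and an interface that is easy to mock in tests.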
name: Pull Request
about: Fix: Apply custom prompts to LLM calls in Ask feature
Summary of Changes
Fixes the bug where custom prompts are saved but not applied to LLM calls while using the "Ask" feature.
Related Issue
Custom prompts were being saved, but the Ask feature only used hardcoded system prompts, ignoring the user's personalized context.
Solution
Modified the sendMessage function in src/features/listen/renderer.js to:
- retrieve custom prompts from localStorage and user presets
- integrate the custom prompt into the system message before sending to the AI
- add console logging for debugging custom prompt application

Environment