Optimize spans buffer insertion with eviction during insert #6
base: master
Conversation
Enhance the Redis buffer system to support dedicated queue routing for different models, improving performance isolation and reducing buffer processing contention across high-volume models.

Key improvements:
- Add PendingBufferRouter for per-model queue assignment
- Implement RedisBufferRouter with customizable queue functions
- Extract model metadata from Redis keys for routing decisions
- Add comprehensive test coverage for custom queue assignments

This optimization allows critical models like Group to use dedicated processing queues while maintaining backward compatibility for models without custom queue assignments.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
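For readers following along, here is a minimal sketch of how a dedicated queue for the Group model might be registered with the new router. The registration method name (`assign_queue`), the module paths, and the queue name are assumptions for illustration; the diff below only shows that a method on `RedisBufferRouter` stores a `generate_queue` callback keyed by `_get_model_key(model=...)`.

```python
# Illustrative sketch only -- names marked "assumed" are not taken from the diff.
from sentry.buffer.redis import RedisBufferRouter  # assumed module path
from sentry.models.group import Group  # assumed import path


def group_queue(model_key: str) -> str | None:
    # Send Group buffer processing to a dedicated Celery queue so high-volume
    # Group increments don't contend with other models' buffer flushes.
    return "buffers.process_pending_group"  # assumed queue name


router = RedisBufferRouter()
# Method name "assign_queue" is assumed; the diff shows the registration body
# as: self._routers[_get_model_key(model=model)] = generate_queue
router.assign_queue(model=Group, generate_queue=group_queue)
```

Models without a registered callback keep the default pending buffer and queue, which is the backward-compatibility behavior the summary mentions.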
Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.
Pull Request Overview
Copilot reviewed 3 out of 3 changed files in this pull request and generated 8 comments.
```python
def __init__(self, incr_batch_size: int) -> None:
    self.incr_batch_size = incr_batch_size
    self.default_pending_buffer = PendingBuffer(self.incr_batch_size)
    # map of model_key to PendingBufferValue
```
Copilot AI · Nov 14, 2025
Remove trailing whitespace at the end of the comment.
Suggested change: `# map of model_key to PendingBufferValue` (the same comment with the trailing whitespace removed)
```python
self.default_pending_buffer = PendingBuffer(self.incr_batch_size)
# map of model_key to PendingBufferValue
self.pending_buffer_router: dict[str, PendingBufferValue] = dict()

```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
```python
self.pending_buffer_router[model_key] = PendingBufferValue(
    model_key=model_key, pending_buffer=pending_buffer, generate_queue=generate_queue
)

```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
```python
if model_key is not None and model_key in self.pending_buffer_router:
    return self.pending_buffer_router[model_key].pending_buffer
return self.default_pending_buffer

```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
```python
if generate_queue is not None:
    return generate_queue(model_key)
return None

```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
```python
def __init__(self) -> None:
    # map of model_key (generated from _get_model_key function) to queue name
    self._routers: dict[str, ChooseQueueFunction] = dict()

```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
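As an aside, the fragment above declares `self._routers: dict[str, ChooseQueueFunction]`, and an earlier fragment calls `generate_queue(model_key)` and treats `None` as "no custom queue". A callable alias consistent with that usage is sketched below; this is an inference, not the PR's actual definition:

```python
from typing import Callable, Optional

# Inferred, not from the diff: a callback that maps the model key derived
# from a Redis buffer key to a queue name, or None to fall back to the
# default processing queue.
ChooseQueueFunction = Callable[[str], Optional[str]]
```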
| """ | ||
| key = _get_model_key(model=model) | ||
| self._routers[key] = generate_queue | ||
|
|
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
```diff
- return f"b:k:{model._meta}:{md5}"
+ model_key = _get_model_key(model=model)
+ return f"b:k:{model_key}:{md5}"
```
Copilot AI · Nov 14, 2025
Remove trailing whitespace from empty line.
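The diff above changes the buffer key to embed `model_key` (`b:k:{model_key}:{md5}`), which is what lets the router recover model metadata from a raw Redis key, as the summary describes. A rough sketch of that extraction follows; the helper name and parsing details are illustrative, not the PR's implementation:

```python
def _extract_model_key(redis_key: str) -> str | None:
    # Hypothetical helper, not from the PR: recover the model key from a
    # buffer key of the form "b:k:{model_key}:{md5}" so the router can pick
    # a per-model pending buffer and queue.
    parts = redis_key.split(":", 2)
    if len(parts) != 3 or parts[0] != "b" or parts[1] != "k":
        return None
    # Strip the trailing md5 segment; everything before the last colon is
    # treated as the model key.
    model_key, _, _md5 = parts[2].rpartition(":")
    return model_key or None
```

Anything that does not match the expected prefix yields `None`, which, per the lookup fragment earlier, would fall back to the default pending buffer.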
Test 2
Replicated from ai-code-review-evaluation/sentry-copilot#10