Database Performance Optimizations #4
base: db-cleanup-baseline
Conversation
… deadlocks on MySQL (#80329)
* Split subquery when cleaning annotations
* update comment
* Raise batch size, now that we pay attention to it
* Iterate in batches
* Separate cancellable batch implementation to allow for multi-statement callbacks, add overload for single-statement use
* Use split-out utility in outer batching loop so it respects context cancellation
* guard against empty queries
* Use SQL parameters
* Use same approach for tags
* drop unused function
* Work around parameter limit on sqlite for large batches
* Bulk insert test data in DB
* Refactor test to customise test data creation
* Add test for catching SQLITE_MAX_VARIABLE_NUMBER limit
* Turn annotation cleanup test to integration tests
* lint

Co-authored-by: Sofia Papagiannaki <[email protected]>
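To make the batching approach in the commit list concrete, here is a minimal sketch of a delete-by-IDs helper. It assumes a plain `database/sql` handle rather than whatever store abstraction the PR actually uses; the names `deleteByIDs` and `asAny` mirror the diff, but the signatures, package name, and the `sqliteParameterLimit` constant are illustrative assumptions.

```go
package cleanup

import (
	"context"
	"database/sql"
	"fmt"
	"strconv"
	"strings"
)

// Assumed default for SQLITE_MAX_VARIABLE_NUMBER.
const sqliteParameterLimit = 999

// asAny widens a slice of IDs into []any so it can be passed as query args.
func asAny(ids []int64) []any {
	args := make([]any, len(ids))
	for i, id := range ids {
		args[i] = id
	}
	return args
}

// deleteByIDs deletes one batch of rows by primary key using a parameterized
// IN clause, falling back to inlined integer literals when the batch would
// exceed SQLite's parameter limit. It returns the number of rows affected.
func deleteByIDs(ctx context.Context, db *sql.DB, table string, ids []int64) (int64, error) {
	if len(ids) == 0 {
		return 0, nil // guard against empty queries
	}

	var query string
	var args []any
	if len(ids) > sqliteParameterLimit {
		// Too many "?" placeholders for SQLite: inline the integer IDs.
		literals := make([]string, len(ids))
		for i, id := range ids {
			literals[i] = strconv.FormatInt(id, 10)
		}
		query = fmt.Sprintf("DELETE FROM %s WHERE id IN (%s)", table, strings.Join(literals, ","))
	} else {
		placeholders := "?" + strings.Repeat(",?", len(ids)-1)
		query = fmt.Sprintf("DELETE FROM %s WHERE id IN (%s)", table, placeholders)
		args = asAny(ids)
	}

	res, err := db.ExecContext(ctx, query, args...)
	if err != nil {
		return 0, err
	}
	return res.RowsAffected()
}
```

In the PR itself, a helper like this would be driven by an outer batching loop that fetches at most `AnnotationCleanupJobBatchSize` IDs at a time and stops when the context is cancelled.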
cursor review
    // Build one "?" placeholder per ID for the parameterized IN clause.
    placeholders := "?" + strings.Repeat(",?", len(ids)-1)
    sql = fmt.Sprintf(`DELETE FROM %s WHERE id IN (%s)`, table, placeholders)
    args = asAny(ids)
}
Bug: SQLite parameter limit check compares the wrong value
The SQLite parameter limit check compares the configured AnnotationCleanupJobBatchSize rather than the actual number of IDs being deleted. When the batch size is small but len(ids) exceeds 999 (SQLite's parameter limit), the code still takes the parameterized-query path and the query fails. The condition should be len(ids) > sqliteParameterLimit instead of r.cfg.AnnotationCleanupJobBatchSize > sqliteParameterLimit.
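A minimal sketch of the suggested change, assuming code shaped like the diff excerpt above; the config field and constant names come from the review comment and are not verified against the repository.

```go
package cleanup

// SQLite's default SQLITE_MAX_VARIABLE_NUMBER (assumed here; it is
// configurable at build time).
const sqliteParameterLimit = 999

// useInlineIDs decides, per batch, whether to inline integer IDs instead of
// using "?" placeholders. The flagged code branched on the configured
// AnnotationCleanupJobBatchSize; the suggested fix branches on the number of
// IDs actually being deleted in the current batch.
func useInlineIDs(ids []int64) bool {
	return len(ids) > sqliteParameterLimit
}
```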
| r.log.Error("Tags to clean", "count", len(ids), "ids", ids, "cond", cond, "err", err) | ||
|
|
||
| x, y := r.deleteByIDs(ctx, "annotation_tag", ids) | ||
| r.log.Error("cleaned tags", "count", len(ids), "affected", x, "err", y) |
Bug: Routine cleanup progress is logged at ERROR level
Several logging statements use Error() instead of Debug() for normal operational information. Lines 534, 537, 554, 557, 576, and 579 log routine cleanup progress at ERROR level even when no error has occurred, which produces false error alerts in logging systems; these calls should be downgraded to DEBUG or removed.
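A minimal sketch of the suggested pattern, using log/slog as a stand-in for Grafana's leveled logger; the function and message names are illustrative, not the actual call sites listed above.

```go
package cleanup

import "log/slog"

// logCleanupResult reserves Error for actual failures and reports routine
// cleanup progress at Debug level, rather than logging everything as an error.
func logCleanupResult(logger *slog.Logger, count int, affected int64, err error) {
	if err != nil {
		logger.Error("failed to clean annotation tags", "count", count, "err", err)
		return
	}
	logger.Debug("cleaned annotation tags", "count", count, "affected", affected)
}
```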
This pull request has been automatically marked as stale because it has not had activity in the last 30 days. It will be closed in 2 weeks if no further activity occurs. Please feel free to give a status update or ping for review. Thank you for your contributions!
Test 7
Replicated from ai-code-review-evaluation/grafana-cursor#7
Note
Refactors annotation cleanup to fetch and delete IDs in batches (avoiding MySQL deadlocks), updates integration tests with bulk inserts and large-batch scenarios, and runs the cleanup loop every minute.
* Helpers: `fetchIDs`, `deleteByIDs`, `untilDoneOrCancelled`, `asAny`; adds `annotation_tag` orphan cleanup.
* Tests: use `InsertMulti`, add `annotation_tag` rows, and parameterize `AnnotationCleanupJobBatchSize`.

Written by Cursor Bugbot for commit 81c45bf. Configure here.
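As a closing illustration of the batching loop mentioned in the note above, here is a rough sketch of what a cancellable helper like `untilDoneOrCancelled` might look like; the signature and behaviour are guesses based on the commit messages, not the PR's actual code.

```go
package cleanup

import "context"

// untilDoneOrCancelled repeatedly invokes batchWork until it reports that no
// work remains, an error occurs, or the context is cancelled. Keeping each
// batch small avoids the long-running deletes that can deadlock on MySQL,
// while still honouring shutdown.
func untilDoneOrCancelled(ctx context.Context, batchWork func(context.Context) (done bool, err error)) error {
	for {
		if err := ctx.Err(); err != nil {
			return err
		}
		done, err := batchWork(ctx)
		if err != nil {
			return err
		}
		if done {
			return nil
		}
	}
}
```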