[WIP] hermetic build: Non binary #858
base: main
Conversation
Signed-off-by: Haoyu Sun <[email protected]>
Walkthrough
This PR introduces hermetic builds with multi-architecture support by adding platform-specific prefetching configuration, RPM/pip dependency locking, PyTorch wheel management, and conditional container build logic. It updates Tekton pipelines, Makefile targets, and dependencies to support reproducible builds with prefetched artifacts and extended execution timeouts.
Changes
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Areas requiring extra attention:
Possibly related PRs
Suggested reviewers
Pre-merge checks and finishing touches
❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 2
🧹 Nitpick comments (3)
README.md (1)
18-94: Consider fixing markdown list indentation for consistency. Markdownlint reports unordered list indentation inconsistencies in the table of contents. While not critical, consistent formatting improves maintainability.
Expected indentation is 2 spaces per level, but some entries use 4 spaces. You can run `markdownlint-cli2 --fix README.md` to automatically correct these, or adjust manually:
- Lines 19, 55-57, 90-94: Reduce indentation to match expected levels (2/4/6 spaces instead of 4/8/12)
scripts/remove_torch_deps.sh (2)
6-6: Consider adding `set -u` for stricter error handling. While `set -e` is present, adding `set -u` would catch undefined variable references. Apply this diff:
-set -e
+set -eu
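For illustration, a minimal sketch of the stricter mode; the `INPUT_FILE` reference is hypothetical and only demonstrates the failure behaviour `set -u` adds:

```bash
#!/usr/bin/env bash
# Minimal sketch: with `set -eu`, referencing an unset variable aborts the script
# instead of silently expanding to an empty string. INPUT_FILE is illustrative only.
set -eu

echo "Processing: ${INPUT_FILE}"          # fails here immediately if INPUT_FILE is unset
cp "$INPUT_FILE" "${INPUT_FILE}.backup"
```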
16-19: Consider warning if backup already exists. The script silently overwrites any existing backup file. Consider adding a check to warn users.
Apply this diff:
 # Create a backup
 BACKUP_FILE="${INPUT_FILE}.backup"
+if [ -f "$BACKUP_FILE" ]; then
+  echo "Warning: Overwriting existing backup: $BACKUP_FILE"
+fi
 cp "$INPUT_FILE" "$BACKUP_FILE"
 echo "Created backup: $BACKUP_FILE"
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`uv.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (12)
- .tekton/lightspeed-stack-pull-request.yaml (2 hunks)
- .tekton/lightspeed-stack-push.yaml (2 hunks)
- Containerfile (1 hunks)
- Makefile (2 hunks)
- README.md (9 hunks)
- pyproject.toml (1 hunks)
- requirements.hermetic.txt (1 hunks)
- requirements.torch.txt (1 hunks)
- rpms.in.yaml (1 hunks)
- rpms.lock.yaml (1 hunks)
- scripts/remove_torch_deps.sh (1 hunks)
- ubi.repo (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
pyproject.toml
📄 CodeRabbit inference engine (CLAUDE.md)
pyproject.toml: Configure pylint with `source-roots = "src"`
Exclude `src/auth/k8s.py` from pyright type checking
Files:
pyproject.toml
🧠 Learnings (4)
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use `uv sync --group dev --group llslibdev` to install development dependencies
Applied to files:
Containerfile
pyproject.toml
📚 Learning: 2025-08-18T11:45:59.961Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:77-77
Timestamp: 2025-08-18T11:45:59.961Z
Learning: torch==2.7.1 is available on PyPI and is a valid version that can be used in dependency specifications.
Applied to files:
requirements.torch.txt
📚 Learning: 2025-08-18T10:56:55.349Z
Learnt from: matysek
Repo: lightspeed-core/lightspeed-stack PR: 292
File: pyproject.toml:0-0
Timestamp: 2025-08-18T10:56:55.349Z
Learning: The lightspeed-stack project intentionally uses a "generic image" approach, bundling many dependencies directly in the base runtime image to work for everyone, rather than using lean base images with optional dependency groups.
Applied to files:
README.md
📚 Learning: 2025-11-24T16:58:04.410Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-11-24T16:58:04.410Z
Learning: Use Python package manager `uv` with `uv run` prefix for all development commands
Applied to files:
requirements.hermetic.txt
🪛 markdownlint-cli2 (0.18.1)
README.md
19-19: Unordered list indentation
Expected: 2; Actual: 4
(MD007, ul-indent)
55-55: Unordered list indentation
Expected: 6; Actual: 12
(MD007, ul-indent)
56-56: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
57-57: Unordered list indentation
Expected: 6; Actual: 12
(MD007, ul-indent)
90-90: Unordered list indentation
Expected: 2; Actual: 4
(MD007, ul-indent)
91-91: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
92-92: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
93-93: Unordered list indentation
Expected: 4; Actual: 8
(MD007, ul-indent)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: build-pr
- GitHub Check: e2e_tests (ci)
- GitHub Check: e2e_tests (azure)
🔇 Additional comments (19)
pyproject.toml (1)
90-94: LGTM! Grouped PyTorch source configuration. The addition of `group = "llslibdev"` correctly restricts the PyTorch CPU index to the llslibdev dependency group, aligning with the hermetic build strategy introduced in this PR.
requirements.torch.txt (1)
1-24: LGTM! Autogenerated PyTorch CPU requirements for hermetic builds. This file correctly provides pinned torch==2.7.1+cpu with integrity hashes for reproducible, hermetic builds across multiple platforms.
README.md (1)
940-985: LGTM! Comprehensive Konflux hermetic build documentation. The new Konflux section clearly documents the hermetic build workflow and the dependency update process, and provides actionable commands for maintaining lock files. This aligns well with the hermetic build infrastructure introduced in this PR.
ubi.repo (1)
1-62: LGTM! Standard UBI 9 repository configuration. This YUM repository configuration correctly defines the BaseOS, AppStream, and CodeReady Builder repositories for UBI 9, with appropriate GPG verification enabled and debug/source repos disabled by default.
rpms.in.yaml (1)
1-4: LGTM! Clean RPM dependency specification for multi-arch builds. The package list (gcc, jq, patch) and architecture configuration (x86_64, aarch64) align well with the hermetic build strategy and multi-platform support introduced in this PR.
rpms.lock.yaml (1)
1-355: LGTM! Autogenerated RPM lockfile for hermetic builds. This lockfile provides reproducible, verified package resolution for multi-arch builds. It should be regenerated using `rpm-lockfile-prototype` when dependencies change, as documented in the README.
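A hedged sketch of that regeneration step follows; the exact `rpm-lockfile-prototype` installation method and flags should be taken from the README and the tool's own docs rather than this example:

```bash
# Assumed workflow for refreshing the RPM lockfile after editing rpms.in.yaml.
# The default output file name (rpms.lock.yaml) is an assumption about the tool's behaviour.
rpm-lockfile-prototype rpms.in.yaml    # resolve packages for the listed architectures
git diff rpms.lock.yaml                # review the regenerated lockfile before committing
```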
.tekton/lightspeed-stack-push.yaml (4)
27-30: LGTM! ARM64 platform support added for multi-arch builds. The addition of `linux-c6gd2xlarge/arm64` enables ARM64 builds alongside x86_64, aligning with the multi-architecture lockfiles (rpms.lock.yaml, requirements.aarch64.txt) introduced in this PR.
35-36: LGTM! Hermetic mode enabled for air-gapped builds. Setting `hermetic: 'true'` enforces network isolation during builds, ensuring reproducibility with prefetched dependencies.
632-634: Extended timeouts for hermetic multi-arch builds. The 4-hour timeout accommodates the increased duration of hermetic builds with dependency prefetching, multi-architecture compilation, and comprehensive security scanning. This aligns with the expanded build complexity introduced in this PR.
31-36: Verify all referenced requirements files exist before merging. The prefetch configuration references four requirements files (requirements.x86_64.txt, requirements.aarch64.txt, requirements.hermetic.txt, requirements.torch.txt) that must exist at pipeline execution time. Confirm these files are present in the repository or are generated by the build system (e.g., via `make konflux-requirements` or a similar build target) before this configuration is used.
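As a quick local sanity check (a suggestion, not something in the PR), the presence of all four files can be verified before the pipeline runs:

```bash
#!/usr/bin/env bash
# Fail fast if any prefetch requirements file referenced by prefetch-input is missing.
set -eu
for f in requirements.x86_64.txt requirements.aarch64.txt \
         requirements.hermetic.txt requirements.torch.txt; do
  [ -f "$f" ] || { echo "missing prefetch input: $f" >&2; exit 1; }
done
echo "all prefetch requirements files present"
```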
Containerfile (2)
25-25: LGTM! The glob pattern `requirements.*.txt` correctly matches all platform-specific and hermetic requirements files needed for the multi-architecture build.
32-36: Verify requirements files exist for all target architectures. The hermetic build path uses `$(uname -m)` to select architecture-specific requirements files. Ensure that requirements.x86_64.txt, requirements.aarch64.txt, and requirements.torch.txt exist in the repository root for all supported build architectures.
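A minimal sketch of the selection pattern being described; this is not the actual Containerfile step, and the install command is an assumption (the real build may use uv or different pip flags):

```bash
# Illustrative only: pick the per-architecture lockfile via uname -m, then install it
# together with the torch-only requirements file.
ARCH="$(uname -m)"                     # x86_64 or aarch64 on the supported builders
pip install --no-cache-dir \
    -r "requirements.${ARCH}.txt" \
    -r requirements.torch.txt
```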
.tekton/lightspeed-stack-pull-request.yaml (3)
30-33: LGTM! The addition of `linux-c6gd2xlarge/arm64` correctly adds ARM64 platform support for multi-architecture builds. The c6gd2xlarge instance type is appropriate for ARM64 builds.
635-637: Verify 4-hour timeout is appropriate. The pipeline and task timeouts are set to 4 hours, which is significantly longer than typical CI builds. While multi-arch builds with hermetic prefetching can be time-consuming, verify this duration is necessary and consider whether it can be optimized.
Monitor actual build times to determine if this timeout can be reduced in future iterations.
36-37: Verify `allow_binary: false` compatibility with PyTorch wheels. The `prefetch-input` configuration sets `"allow_binary": "false"` for pip dependencies. If this parameter restricts downloads to source distributions only, it may conflict with the PyTorch wheels referenced in requirements.torch.txt. Clarify whether cachi2's `allow_binary` setting permits wheel downloads, or whether the configuration should be adjusted to `"allow_binary": "true"` or removed to allow mixed binary/source packages for this use case.
scripts/remove_torch_deps.sh (1)
48-55: LGTM! The atomic file replacement using a temporary file and the final diff output are well-implemented. The `|| true` on the diff command correctly prevents the script from failing when differences are found.
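For reference, the pattern being praised looks roughly like this; variable names and the `grep` stand-in are illustrative, not the script's actual logic:

```bash
#!/usr/bin/env bash
# Sketch of the temp-file-then-move pattern plus a diff that never trips `set -e`.
set -eu
INPUT_FILE="${1:?usage: $0 <requirements-file>}"

cp "$INPUT_FILE" "${INPUT_FILE}.backup"
TMP_FILE="$(mktemp)"
grep -v '^torch==' "$INPUT_FILE" > "$TMP_FILE"      # stand-in for the real awk filter
mv "$TMP_FILE" "$INPUT_FILE"                        # atomic replace of the original file
diff "${INPUT_FILE}.backup" "$INPUT_FILE" || true   # diff exits 1 on differences; || true keeps set -e happy
```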
Makefile (3)
7-8: No action required. PyTorch version 2.7.1 is available on PyPI and supports CPU wheels.
104-109: Verify duplicate target definitions in Makefile. The review comment references a potential duplicate `konflux-requirements` target definition, but the provided code snippet shows only one definition (lines 104-109). Manual verification is needed to confirm whether duplicate target definitions exist elsewhere in the Makefile, as repository access for automated verification was unavailable.
104-109: The `--torch-backend cpu`, `--emit-index-url`, and `--no-deps` flags are all valid `uv pip compile` options.
`--torch-backend` is a documented uv feature for specifying PyTorch backend indexes (cpu, cu128, etc.). `--emit-index-url` is documented for including index URLs in generated requirements. `--no-deps` is a standard pip compatibility flag supported by uv.
The design here (generating platform-specific requirements with full dependencies, removing torch from those files, then generating a separate torch-only file with `--no-deps`) is intentional and correct. PyTorch dependencies not needed for the platform builds are isolated in the separate requirements.torch.txt file.
Likely an incorrect or invalid review comment.
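To make that flow concrete, a hedged sketch of what such a target body could look like; the real `konflux-requirements` recipe may use different inputs and flags, and `requirements.torch.in` is a hypothetical one-line file pinning torch, introduced here only for illustration:

```bash
# Illustrative recipe body, not the actual Makefile target: compile per-platform
# lockfiles, strip torch from them, then emit a torch-only file with --no-deps.
uv pip compile pyproject.toml --torch-backend cpu --emit-index-url \
    --python-platform x86_64-unknown-linux-gnu -o requirements.x86_64.txt
uv pip compile pyproject.toml --torch-backend cpu --emit-index-url \
    --python-platform aarch64-unknown-linux-gnu -o requirements.aarch64.txt
./scripts/remove_torch_deps.sh requirements.x86_64.txt
./scripts/remove_torch_deps.sh requirements.aarch64.txt
uv pip compile requirements.torch.in --no-deps --torch-backend cpu \
    --emit-index-url -o requirements.torch.txt
```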
uv==0.8.15
pip==24.2
🧩 Analysis chain
🌐 Web query:
uv Python package manager latest stable version December 2025
💡 Result:
Latest stable uv release (Dec 1, 2025): 0.9.13 — released Nov 26, 2025. [1][2]
Sources:
[1] GitHub releases — astral-sh/uv (0.9.13).
[2] uv docs (installation / version pin example referencing 0.9.13).
🌐 Web query:
pip latest stable version December 2025
💡 Result:
The latest stable pip version (as of December 1, 2025) is 25.3. [1][2]
Sources:
[1] PyPI — pip release (pip 25.3 upload details).
[2] bootstrap.pypa.io pip zipapp index listing pip-25.3.
🏁 Script executed:
# Check if pyproject.toml exists and examine dependencies
find . -name "pyproject.toml" -type f | head -5
Repository: lightspeed-core/lightspeed-stack
Length of output: 208
Update pinned versions to current stable releases.
The pinned versions are outdated. As of December 2025:
- uv==0.8.15 should be updated to 0.9.13 (current stable)
- pip==24.2 should be updated to 25.3 (current stable)
Update these to ensure security patches and improvements are included while maintaining hermetic build reproducibility.
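If the bump is applied by hand, a one-liner along these lines would do it (GNU sed assumed; a convenience sketch, not part of the PR):

```bash
# Bump the hermetic tooling pins in place (GNU sed syntax for -i).
sed -i -e 's/^uv==.*/uv==0.9.13/' \
       -e 's/^pip==.*/pip==25.3/' requirements.hermetic.txt
```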
🤖 Prompt for AI Agents
In requirements.hermetic.txt around lines 1 to 2 the pinned package versions are
outdated; update uv from 0.8.15 to 0.9.13 and pip from 24.2 to 25.3 to match
current stable releases—edit the two lines to replace the versions accordingly
and save the file so the hermetic requirements reference the updated, stable
versions.
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next # Skip this line
}

# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / {
    in_torch_section = 0
    # Fall through to print this line
}
Potential issue with empty lines in torch section.
The awk script exits the torch section when encountering a line that doesn't start with 4 spaces. This includes empty lines, which might appear between the torch== line and its hash lines in some requirements files.
Consider updating the pattern to handle empty lines:
 # If we are in torch section and line starts with 4 spaces, skip it
 in_torch_section == 1 && /^    / {
     next # Skip this line
 }
+# If we are in torch section and line is empty, skip it
+in_torch_section == 1 && /^$/ {
+    next # Skip empty line
+}
+
 # If we are in torch section and line does NOT start with 4 spaces, exit torch section
-in_torch_section == 1 && !/^    / {
+in_torch_section == 1 && !/^    / && !/^$/ {
     in_torch_section = 0
     # Fall through to print this line
 }
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
# If we are in torch section and line starts with 4 spaces, skip it
in_torch_section == 1 && /^    / {
    next # Skip this line
}

# If we are in torch section and line is empty, skip it
in_torch_section == 1 && /^$/ {
    next # Skip empty line
}

# If we are in torch section and line does NOT start with 4 spaces, exit torch section
in_torch_section == 1 && !/^    / && !/^$/ {
    in_torch_section = 0
    # Fall through to print this line
}
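A quick way to sanity-check the patched rules against the empty-line edge case (purely illustrative input; the real script wraps these rules in a larger awk program):

```bash
# Feed a tiny requirements snippet with a blank line inside the torch block through
# the patched rules; only the unrelated numpy pin should survive.
printf 'torch==2.7.1+cpu \\\n\n    --hash=sha256:abc\nnumpy==2.0.0\n' | awk '
/^torch==/                                 { in_torch_section = 1; next }
in_torch_section == 1 && /^    /           { next }   # hash/continuation lines
in_torch_section == 1 && /^$/              { next }   # blank lines inside the block
in_torch_section == 1 && !/^    / && !/^$/ { in_torch_section = 0 }
{ print }
'
# Expected output: numpy==2.0.0
```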
🤖 Prompt for AI Agents
In scripts/remove_torch_deps.sh around lines 33 to 42, the awk logic treats any
line that doesn't start with four spaces (including blank lines) as the end of
the torch section; update the condition so blank lines are ignored (i.e., do not
terminate the section). Change the exit check to only leave the torch section
when the line is non-empty and does not start with four spaces (for example: if
in_torch_section == 1 && $0 !~ /^    / && $0 !~ /^$/ then set in_torch_section =
0), and ensure blank lines inside the torch block are either skipped or allowed
to fall through as part of the section so hash lines remain associated with the
torch entry.
Description
do not merge
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit
New Features
Documentation
Chores