…ld toolkit versions

Introduces a new function `cleanup_old_versions` that removes outdated toolkit versions, retaining only the specified number of recent versions. This function is called in the main workflow to ensure that old versions do not accumulate, facilitating a cleaner download process for the current release.
📦 Python package built successfully!
Codecov Report

✅ All modified and coverable lines are covered by tests.

```diff
@@            Coverage Diff             @@
##             main      #72      +/-   ##
==========================================
+ Coverage   73.50%   73.71%   +0.21%
==========================================
  Files          93       93
  Lines        5253     5261       +8
  Branches      764      764
==========================================
+ Hits         3861     3878      +17
+ Misses       1147     1138       -9
  Partials      245      245
```

Flags with carried forward coverage won't be shown.
📝 Walkthrough

Adds a new function cleanup_old_versions(base_path: str, current_release_name: str, versions_to_keep: int = 2) to the cache-downloader script. The function lists subdirectories under base_path, excludes the current release, sorts directories by modification time (newest first), and removes older ones beyond the keep count using shutil.rmtree with error handling and logging. main() now validates RELEASE_NAME (exits non-zero if missing) and calls cleanup_old_versions before starting downloads. New unit tests exercise multiple edge cases including non-existent base path, mtime ordering, current-release exclusion, nested contents, and rmtree failures.

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Main as "cache-downloader:main"
    participant Cleanup as "cleanup_old_versions"
    participant FS as "Filesystem (os/scandir/stat)"
    participant Shutil as "shutil.rmtree"
    participant Downloader as "Downloader logic"
    Main->>Cleanup: call cleanup_old_versions(base_path, release_name)
    Cleanup->>FS: list entries in base_path
    FS-->>Cleanup: return entries (files & dirs with mtimes)
    Cleanup->>Cleanup: filter directories, exclude current_release_name
    Cleanup->>Cleanup: sort by mtime (newest first) and select old ones
    loop for each old_dir
        Cleanup->>Shutil: rmtree(old_dir)
        alt success
            Shutil-->>Cleanup: removed
        else failure
            Shutil-->>Cleanup: raise OSError (caught & logged)
        end
    end
    Cleanup-->>Main: return
    Main->>Downloader: proceed with downloads
```
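The behavior described in the walkthrough and diagram can be sketched as follows. This is a minimal reconstruction from the summary, not the PR's actual implementation: the function name and parameters come from the walkthrough, while the body (including the use of os.scandir, hinted at by the diagram) is an assumption.

```python
import logging
import os
import shutil

logger = logging.getLogger(__name__)


def cleanup_old_versions(base_path, current_release_name, versions_to_keep=2):
    """Remove old version directories under base_path, keeping the newest few."""
    if not os.path.isdir(base_path):
        return  # nothing to clean (covers the non-existent base path test case)
    with os.scandir(base_path) as entries:
        # Keep only directories, and never touch the current release
        version_dirs = [e for e in entries
                        if e.is_dir() and e.name != current_release_name]
    # Newest first by modification time
    version_dirs.sort(key=lambda e: e.stat().st_mtime, reverse=True)
    # Everything past the first versions_to_keep entries is considered old
    for entry in version_dirs[versions_to_keep:]:
        try:
            shutil.rmtree(entry.path)
            logger.info("Removed old toolkit version: %s", entry.path)
        except OSError as exc:
            logger.warning("Failed to remove %s: %s", entry.path, exc)
```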
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

🚥 Pre-merge checks: ✅ 3 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (3 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@dockerfiles/cache-downloader/cache-downloader.py`:
- Around line 110-117: The code uses versions_to_keep in a slice without
validation, so negative values cause unexpected deletion; before the existing
check (the line with "if len(version_dirs) <= versions_to_keep:"), validate and
normalize versions_to_keep (e.g., ensure it is an integer and clamp to a
non-negative value: versions_to_keep = max(0, int(versions_to_keep))) or raise a
clear ValueError for invalid input; reference the variables and loop in this
block (version_dirs, versions_to_keep, the sort lambda, and the for entry in
version_dirs[versions_to_keep:]) so the fix is applied immediately before the
slicing that removes old versions.
- Around line 86-123: Update the signature of cleanup_old_versions to include an
explicit return type (-> None) and switch filesystem ops to pathlib.Path APIs:
treat base_path as a Path (or wrap Path(base_path)), use Path.exists()/is_dir()
instead of os.path.isdir, iterate with Path.iterdir() and filter .is_dir() and
.name != current_release_name, and use .stat().st_mtime for sorting; keep the
existing behavior for skipping current_release_name, limiting with
versions_to_keep, logging, and shutil.rmtree removal handling in the same
places.
- Line 137: Add a check before calling cleanup_old_versions in the main function
or wherever cleanup_old_versions(BASE_PATH, release_name) is called to verify if
release_name is not None. If release_name is None (indicating RELEASE_NAME is
unset), exit early or raise an error instead of proceeding with the cleanup to
avoid deleting unintended version directories.
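Taken together, the first two comments above ask for a pathlib-based function with an explicit `-> None` return type and validation of `versions_to_keep` before the slice. A hedged sketch of that suggested shape (the logging setup is assumed, and the body is illustrative rather than the PR's actual code):

```python
import logging
import shutil
from pathlib import Path

logger = logging.getLogger(__name__)


def cleanup_old_versions(base_path: str, current_release_name: str,
                         versions_to_keep: int = 2) -> None:
    """pathlib variant with the validation the review asks for."""
    # Guard the slice below: a negative keep count would index from the
    # end of the list and delete recent versions instead of old ones.
    if not isinstance(versions_to_keep, int) or versions_to_keep < 0:
        raise ValueError(f"versions_to_keep must be a non-negative integer, "
                         f"got {versions_to_keep!r}")
    base = Path(base_path)
    if not base.is_dir():
        return
    version_dirs = sorted(
        (d for d in base.iterdir()
         if d.is_dir() and d.name != current_release_name),
        key=lambda d: d.stat().st_mtime,
        reverse=True,  # newest first
    )
    for old_dir in version_dirs[versions_to_keep:]:
        try:
            shutil.rmtree(old_dir)
            logger.info("Removed old toolkit version: %s", old_dir)
        except OSError as exc:
            logger.warning("Failed to remove %s: %s", old_dir, exc)
```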
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 9b6cd832-7617-4552-9a1d-26ccfa86e843
📒 Files selected for processing (2)
dockerfiles/cache-downloader/cache-downloader.py
tests/unit/test_cache_downloader.py
🚀 Review App Deployment Started
…E_NAME

Enhances the `cleanup_old_versions` function by adding a check that the `versions_to_keep` parameter is non-negative, raising a ValueError if it is not. Additionally, the main function now verifies that the `RELEASE_NAME` environment variable is set, exiting with an error message if it is missing.
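The `RELEASE_NAME` check this commit describes might look like the following. The guard function and its message text are illustrative assumptions, not the PR's actual code:

```python
import os
import sys


def require_release_name() -> str:
    """Abort before any destructive cleanup if RELEASE_NAME is unset."""
    release_name = os.getenv("RELEASE_NAME")
    if not release_name:
        print("Error: RELEASE_NAME environment variable is not set",
              file=sys.stderr)
        sys.exit(1)  # non-zero exit, as the commit message describes
    return release_name
```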
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@tests/unit/test_cache_downloader.py`:
- Around line 9-21: Replace os.path-based path construction in the test module
spec creation with pathlib.Path: build the path to
"dockerfiles/cache-downloader/cache-downloader.py" using Path(__file__).parent /
".." / ".." / "dockerfiles" / "cache-downloader" / "cache-downloader.py" (or
chained parents) when calling importlib.util.spec_from_file_location for _spec,
and then remove unused import os; keep the rest (_spec,
importlib.util.module_from_spec, _spec.loader.exec_module(_mod)) unchanged.
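The pattern this comment refers to, loading a script that is not on sys.path via importlib, could look like this. The helper name is mine, and the commented-out path is the one named in the review:

```python
import importlib.util
from pathlib import Path


def load_module_from_path(name, path):
    """Load a Python file as a module without it being on sys.path."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# In the test module this would become something like:
# _mod = load_module_from_path(
#     "cache_downloader",
#     Path(__file__).parent / ".." / ".."
#     / "dockerfiles" / "cache-downloader" / "cache-downloader.py",
# )
```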
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 87a455a4-3a6d-4192-9361-dc4673b00ce2
📒 Files selected for processing (2)
dockerfiles/cache-downloader/cache-downloader.py
tests/unit/test_cache_downloader.py
…h pathlib

Refactors the `cleanup_old_versions` function to use `pathlib.Path` for file path operations, improving readability and consistency. The function now checks whether the base path is a directory with `Path.is_dir()` and iterates through version directories with `Path.iterdir()`. This simplifies the code and matches the project's conventions for file handling.
…ns-in-toolkit-cache-downloader
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@dockerfiles/cache-downloader/cache-downloader.py`:
- Around line 147-150: The code currently calls cleanup_old_versions(BASE_PATH,
release_name) before verifying the TOOLKIT_INDEX_BUCKET_NAME env var; add a
guard that checks toolkit_index_bucket_name (the value from
os.getenv("TOOLKIT_INDEX_BUCKET_NAME")) and aborts (raise/exit/log error) if it
is empty/None before performing any destructive action, and only call
cleanup_old_versions(...) and submit_downloading(python_versions, release_name,
toolkit_index_bucket_name) after the env var is validated; reference the
toolkit_index_bucket_name variable, cleanup_old_versions, submit_downloading,
BASE_PATH, and release_name when implementing the check.
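The ordering this comment asks for, validating every required environment variable before the destructive cleanup runs, can be sketched as below. `validated_env` is a hypothetical helper, and the commented-out call names are taken from the comment itself:

```python
import os
import sys


def validated_env(name: str) -> str:
    """Return a required environment variable, aborting before anything
    destructive has had a chance to run if it is missing or empty."""
    value = os.getenv(name)
    if not value:
        print(f"Error: {name} is not set", file=sys.stderr)
        sys.exit(1)
    return value

# Suggested shape of main(): validate first, act second.
# release_name = validated_env("RELEASE_NAME")
# toolkit_index_bucket_name = validated_env("TOOLKIT_INDEX_BUCKET_NAME")
# cleanup_old_versions(BASE_PATH, release_name)
# submit_downloading(python_versions, release_name, toolkit_index_bucket_name)
```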
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 4058afc5-b18e-4c85-b01b-068baef50154
📒 Files selected for processing (1)
dockerfiles/cache-downloader/cache-downloader.py
Introduces a new function `cleanup_old_versions` that removes outdated toolkit versions, retaining only the specified number of recent versions. This function is called in the main workflow to ensure that old versions do not accumulate, facilitating a cleaner download process for the current release.

Summary by CodeRabbit
New Features
Bug Fixes
Tests