DM-46307: execute butler housekeeping scripts at remote DF for multisite processing #186
Conversation
Codecov Report

✅ All tests successful. No failed tests found.

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##             main     #186      +/-   ##
==========================================
- Coverage   82.58%   82.57%    -0.02%
==========================================
  Files          41       44        +3
  Lines        3492     3672      +180
  Branches      355      359        +4
==========================================
+ Hits         2884     3032      +148
- Misses        518      549       +31
- Partials       90       91        +1
```

View full report in Codecov by Sentry.
Force-pushed from cbcc4c7 to 9ad298d.
The code responsible for creating the submit directory was rather generic, so I factored it out of ``_init_submission_driver()`` and added it to ``bps_utils`` as a general-purpose function.
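As a rough illustration, the factored-out helper could look like the sketch below; the function name, signature, and template placeholders are assumptions made for this example, not the actual ``bps_utils`` API.

```python
import time
from pathlib import Path


def create_submit_path(template: str, **search_opts: str) -> Path:
    """Create the submit directory from a path template.

    Hypothetical helper illustrating the kind of general-purpose function
    factored out of _init_submission_driver(); the real name and signature
    in bps_utils may differ.
    """
    # Expand placeholders (e.g., a timestamp) so each submission gets
    # its own directory.
    timestamp = time.strftime("%Y%m%dT%H%M%SZ", time.gmtime())
    submit_path = Path(template.format(timestamp=timestamp, **search_opts))
    submit_path.mkdir(parents=True, exist_ok=True)
    return submit_path
```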
Moved the "business logic" out of ``_init_submission_driver()`` into a separate function to make the driver's structure more similar to that of the other drivers.
Validating a BPS configuration for a regular workflow and for a workflow executing a script remotely requires different sets of validation tests. However, the validation tests performed on the BPS config in ``init_submission()`` were hard-coded and there was no mechanism for customizing them. Modified this function so that the validation tests it performs can be explicitly specified by the caller.
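A minimal sketch of the idea, assuming validators are plain callables that raise on failure; the real ``init_submission()`` signature and validator interface may differ.

```python
from collections.abc import Callable, Iterable


def init_submission(
    config: dict, *, validators: Iterable[Callable[[dict], None]] = ()
) -> dict:
    """Run only the requested validation tests before the common setup.

    Hypothetical signature sketching how the checks could be passed in
    explicitly instead of being hard-coded inside the function.
    """
    for validate in validators:
        # Each validator is expected to raise if the config is invalid.
        validate(config)
    # ... the rest of the common submission initialization ...
    return config
```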
Added a new command that allows the user to submit custom scripts for execution.
Some unit tests for ``_init_submission_driver()`` stopped working after the refactoring. Updated them so they pass again.
Added a new workflow attribute, ``bps_iscustom``, so any BPS plugin can easily distinguish between regular payload workflows and custom ones intended to run ad hoc scripts.
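For illustration only: a plugin could branch on the new attribute as in the snippet below. The attribute name ``bps_iscustom`` comes from this PR; the function itself and its use are hypothetical.

```python
def describe_workflow(workflow) -> str:
    """Return a short label for logging, based on the new attribute.

    bps_iscustom is the attribute introduced in this PR; this helper is
    just an example of how a plugin might tell the two kinds apart.
    """
    if getattr(workflow, "bps_iscustom", False):
        return "custom workflow running an ad hoc script"
    return "regular payload workflow"
```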
Added a validator that ensures ``customJob`` is defined in the BPS config when submitting a custom job.
Force-pushed from ced56c0 to 03b03c7.
Some suggestions. Merge approved
Based on the reviewer's input, I fixed some spelling errors in the documentation and included additional pieces of information that the user might be interested in.
Some new docstrings had typos and other shortcomings (e.g., a missing return value description); others were missing entirely. Fixed all of those.
While the custom job validator checked that the required section exists in the BPS config, it did not check that the required entry in that section, ``executable``, is present. It now checks for both.
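A sketch of what the strengthened check could look like, assuming the config behaves like a nested mapping; the real validator operates on the actual BPS config object and may report errors differently.

```python
def validate_custom_job(config: dict) -> None:
    """Ensure a custom-job submission defines what it needs to run.

    Illustrative only: checks both the customJob section and its
    executable entry, mirroring the behavior described above.
    """
    section = config.get("customJob")
    if section is None:
        raise KeyError("BPS config is missing the 'customJob' section")
    if "executable" not in section:
        raise KeyError("'customJob' section is missing the 'executable' entry")
```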
Made some minor edits to the log messages displayed during BPS submission to make them either more specific or stylistically consistent.
The same snippet responsible for logging memory usage was repeated multiple times in the ``drivers.py`` module. Factored it out into an auxiliary function to avoid the repetition.
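A rough sketch of such a helper using only the standard library; the actual function in ``drivers.py`` may rely on different tooling and have a different name.

```python
import logging
import resource

_LOG = logging.getLogger(__name__)


def _log_peak_memory(stage: str) -> None:
    """Log the peak resident set size reached so far.

    Hypothetical stand-in for the auxiliary function factored out of
    drivers.py. Note that ru_maxrss is reported in KiB on Linux.
    """
    peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    _LOG.info("Peak memory usage after %s: %.1f MiB", stage, peak_kib / 1024)
```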
Force-pushed from 03b03c7 to 7fb8495.
Added an option that allows the user to specify the compute site when calling ``bps submitcmd``.
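For example, an invocation might then look like ``bps submitcmd --compute-site <compute_site> custom_job.yaml``; note that the option name is an assumption (chosen to mirror the existing ``--compute-site`` option of ``bps submit``) and the config file name is only a placeholder.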
Checklist
doc/changes