Fix Blobstore failure response #101
Merged
Changes from all commits · 3 commits
@@ -0,0 +1,12 @@
+#!/bin/sh
+
+# TODO may want to make the image an env var or argument
+
+# See https://github.com/kbaseapps/kb_sdk_actions/blob/master/bin/kb-sdk for source
+
+# Cache the group for the docker file
+if [ ! -e $HOME/.kbsdk.cache ] ; then
+docker run -i -v /var/run/docker.sock:/var/run/docker.sock --entrypoint ls ghcr.io/kbase/kb_sdk_patch-develop:br-0.0.4 -l /var/run/docker.sock|awk '{print $4}' > $HOME/.kbsdk.cache
+fi
+
+exec docker run -i --rm -v $HOME:$HOME -w $(pwd) -v /var/run/docker.sock:/var/run/docker.sock -e DSHELL=$SHELL --group-add $(cat $HOME/.kbsdk.cache) ghcr.io/kbase/kb_sdk_patch-develop:br-0.0.4 $@
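For context (not part of this PR's code): the `ls -l /var/run/docker.sock | awk '{print $4}'` pipeline in the wrapper above records the group that owns the Docker socket, which is then handed to `docker run --group-add` so the containerized kb-sdk can reach the Docker daemon. A minimal Python sketch of the equivalent host-side lookup, assuming a typical Linux machine with the socket at its default path (the wrapper itself performs the lookup inside the container):

import grp
import os

DOCKER_SOCK = "/var/run/docker.sock"  # assumed default Docker socket path

st = os.stat(DOCKER_SOCK)
# `ls -l` prints the owning group in column 4; grp resolves the same group name from the gid.
group_name = grp.getgrgid(st.st_gid).gr_name
print(f"group to pass to --group-add: {group_name} (gid {st.st_gid})")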
@@ -0,0 +1,16 @@
+#!/bin/sh
+
+# TODO may want to make the image an env var or argument
+
+# See https://github.com/kbaseapps/kb_sdk_actions/blob/master/bin/make_testdir for source
+
+# Disable the default `return 1` when creating `test_local`
+set +e
+
+# Cache the group for the docker file
+if [ ! -e $HOME/.kbsdk.cache ] ; then
+docker run -i -v /var/run/docker.sock:/var/run/docker.sock --entrypoint ls ghcr.io/kbase/kb_sdk_patch-develop:br-0.0.4 -l /var/run/docker.sock|awk '{print $4}' > $HOME/.kbsdk.cache
+fi
+
+exec docker run -i --rm -v $HOME:$HOME -u $(id -u) -w $(pwd) -v /var/run/docker.sock:/var/run/docker.sock -e DUSER=$USER -e DSHELL=$SHELL --group-add $(cat $HOME/.kbsdk.cache) ghcr.io/kbase/kb_sdk_patch-develop:br-0.0.4 test
+exit
@@ -8,7 +8,7 @@ service-language:
 python

 module-version:
-0.2.1
+0.2.2

 owners:
 [gaprice, scanon, tgu2]
@@ -61,9 +61,9 @@
 # state. A method could easily clobber the state set by another while
 # the latter method is running.
 ######################################### noqa
-VERSION = "0.2.0"
+VERSION = "0.2.2"
 GIT_URL = "https://github.com/kbaseapps/DataFileUtil.git"
-GIT_COMMIT_HASH = "579a0928d7f4ce3713e31adfea216815110ecd5e"
+GIT_COMMIT_HASH = "0d855560a37f8787ab3bb67e464b9c94b299764e"

 #BEGIN_CLASS_HEADER
@@ -308,9 +308,8 @@
 try:
 err = json.loads(response.content)['error'][0]
 except Exception:
-# this means shock is down or not responding.
-log("Couldn't parse response error content from Shock: " +
-response.content)
+# this means the Blobstore is down or not responding.
+log("Couldn't parse response error content from the Blobstore: " + response.text)
 response.raise_for_status()
 raise ShockException(errtxt + str(err))

Review comment on lines -311 to +312: This is the bugfix.
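To illustrate why this is the bugfix (an editorial sketch, not code from the PR): `response.content` is bytes, so the old fallback's `str + response.content` concatenation raised a TypeError under Python 3 and the real Blobstore error was never logged; `response.text` is a decoded str and concatenates cleanly. A small self-contained demonstration using a stand-in requests.Response:

import requests

resp = requests.Response()
resp.status_code = 502
resp._content = b"upstream Blobstore unavailable"  # a non-JSON error body, set directly for the demo

msg = "Couldn't parse response error content from the Blobstore: "
try:
    print(msg + resp.content)     # old code path: str + bytes
except TypeError as err:
    print("old behaviour:", err)  # the TypeError masked the real Blobstore error
print(msg + resp.text)            # fixed path: .text is a decoded str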
@@ -636,7 +635,7 @@
 # note that the unit tests cannot easily test this. Be careful with changes here
 with requests.get(config['kbase-endpoint'] + '/shock-direct', allow_redirects=False) as r:
 if r.status_code == 302:
-log('Using direct shock url for transferring files')
+log('Using direct Blobstore url for transferring files')
 self.shock_effective = r.headers['Location']
 log('Shock url: ' + self.shock_effective)
 self.handle_url = config['handle-service-url']
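A hedged sketch of the redirect probe above (the helper name and the fallback argument are illustrative, not from the PR); it assumes a config dict with a 'kbase-endpoint' key and the requests library, as in the diff:

import requests

def resolve_blobstore_url(config, default_url):
    """Prefer a direct Blobstore URL if /shock-direct answers with a 302 redirect."""
    with requests.get(config['kbase-endpoint'] + '/shock-direct',
                      allow_redirects=False) as r:
        if r.status_code == 302:
            return r.headers['Location']  # direct Blobstore URL
    return default_url  # otherwise keep the configured URL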
@@ -696,10 +695,10 @@
 shock_id = params.get('shock_id')
 handle_id = params.get('handle_id')
 if not shock_id and not handle_id:
-raise ValueError('Must provide shock ID or handle ID')
+raise ValueError('Must provide Blobstore ID or handle ID')
 if shock_id and handle_id:
 raise ValueError(
-'Must provide either a shock ID or handle ID, not both')
+'Must provide either a Blobstore ID or handle ID, not both')

 shock_url = self.shock_effective
 if handle_id:
@@ -716,8 +715,7 @@
 self.mkdir_p(os.path.dirname(file_path))
 node_url = shock_url + '/node/' + shock_id
 r = requests.get(node_url, headers=headers, allow_redirects=True)
-errtxt = ('Error downloading file from shock ' +
-'node {}: ').format(shock_id)
+errtxt = f'Error downloading file from Blobstore node {shock_id}: '
 self.check_shock_response(r, errtxt)
 resp_obj = r.json()
 size = resp_obj['data']['file']['size']
@@ -727,7 +725,7 @@
 attributes = resp_obj['data']['attributes']
 if os.path.isdir(file_path):
 file_path = os.path.join(file_path, node_file_name)
-log('downloading shock node ' + shock_id + ' into file: ' + str(file_path))
+log('downloading Blobstore node ' + shock_id + ' into file: ' + str(file_path))
 with open(file_path, 'wb') as fhandle:
 with requests.get(node_url + '?download_raw', stream=True,
 headers=headers, allow_redirects=True) as r:
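A simplified sketch of the download flow in the two hunks above: fetch the node metadata first, then stream the raw file with `?download_raw`. The helper name is illustrative, error handling via check_shock_response is omitted, and the parameter names mirror the diff:

import requests

def download_node(shock_url, shock_id, headers, file_path):
    node_url = shock_url + '/node/' + shock_id
    meta = requests.get(node_url, headers=headers, allow_redirects=True).json()
    size = meta['data']['file']['size']  # size reported in the node metadata
    with open(file_path, 'wb') as fhandle:
        # stream the raw bytes rather than loading the whole file into memory
        with requests.get(node_url + '?download_raw', stream=True,
                          headers=headers, allow_redirects=True) as r:
            for chunk in r.iter_content(chunk_size=1024 * 1024):
                fhandle.write(chunk)
    return size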
@@ -863,11 +861,11 @@
 headers = {'Authorization': 'Oauth ' + token}
 file_path = params.get('file_path')
 if not file_path:
-raise ValueError('No file(s) provided for upload to Shock.')
+raise ValueError('No file(s) provided for upload to the Blobstore.')
 pack = params.get('pack')
 if pack:
 file_path = self._pack(file_path, pack)
-log('uploading file ' + str(file_path) + ' into shock node')
+log('uploading file ' + str(file_path) + ' into Blobstore node')
 with open(os.path.abspath(file_path), 'rb') as data_file:
 # Content-Length header is required for transition to
 # https://github.com/kbase/blobstore
@@ -879,7 +877,7 @@
 self.shock_effective + '/node', headers=headers, data=mpe,
 stream=True, allow_redirects=True)
 self.check_shock_response(
-response, ('Error trying to upload file {} to Shock: '
+response, ('Error trying to upload file {} to Blobstore: '
 ).format(file_path))
 shock_data = response.json()['data']
 shock_id = shock_data['id']
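The comment in the upload hunk notes that the Blobstore requires a Content-Length header. MultipartEncoder from requests_toolbelt knows its total size, so requests sends Content-Length rather than chunked transfer encoding while still streaming the file from disk. A hedged sketch of that pattern; the helper name and the 'upload' form-field name are assumptions, since the encoder construction is not part of this diff:

import os
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

def upload_file(shock_url, token, file_path):
    headers = {'Authorization': 'Oauth ' + token}
    with open(os.path.abspath(file_path), 'rb') as data_file:
        # the encoder reports a total length, so requests sets Content-Length
        mpe = MultipartEncoder(
            fields={'upload': (os.path.basename(file_path), data_file)})  # field name assumed
        headers['Content-Type'] = mpe.content_type
        response = requests.post(shock_url + '/node', headers=headers, data=mpe,
                                 stream=True, allow_redirects=True)
    response.raise_for_status()
    return response.json()['data']['id']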
@@ -889,7 +887,7 @@
 'size': shock_data['file']['size']}
 if params.get('make_handle'):
 out['handle'] = self.make_handle(shock_data, token)
-log('uploading done into shock node: ' + shock_id)
+log('uploading done into Blobstore node: ' + shock_id)
 #END file_to_shock

 # At some point might do deeper type checking...
@@ -958,8 +956,10 @@
 String, parameter "unpack" of String
 :returns: instance of list of type "UnpackFilesResult" (Output
 parameters for the unpack_files function. file_path - the path to
-the unpacked file, or in the case of archive files, the path to
-the original archive file.) -> structure: parameter "file_path" of
+either a) the unpacked file or b) in the case of archive files,
+the path to the original archive file, possibly uncompressed, or
+c) in the case of regular files that don't need processing, the
+path to the input file.) -> structure: parameter "file_path" of
 String
 """
 # ctx is the context object
@@ -1206,15 +1206,13 @@
 header = {'Authorization': 'Oauth {}'.format(token)}
 source_id = params.get('shock_id')
 if not source_id:
-raise ValueError('Must provide shock ID')
+raise ValueError('Must provide Blobstore ID')
 mpdata = MultipartEncoder(fields={'copy_data': source_id})
 header['Content-Type'] = mpdata.content_type

 with requests.post(self.shock_url + '/node', headers=header, data=mpdata,
 allow_redirects=True) as response:
-self.check_shock_response(
-response, ('Error copying Shock node {}: '
-).format(source_id))
+self.check_shock_response(response, f'Error copying Blobstore node {source_id}: ')
 shock_data = response.json()['data']
 shock_id = shock_data['id']
 out = {'shock_id': shock_id, 'handle': None}
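A minimal sketch of the node-copy call above: the Blobstore creates a copy of an existing node when it receives a multipart form with a copy_data field. The helper name is illustrative, and error handling via check_shock_response is replaced by raise_for_status here:

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

def copy_node(shock_url, token, source_id):
    mpdata = MultipartEncoder(fields={'copy_data': source_id})
    header = {'Authorization': 'Oauth {}'.format(token),
              'Content-Type': mpdata.content_type}
    with requests.post(shock_url + '/node', headers=header, data=mpdata,
                       allow_redirects=True) as response:
        response.raise_for_status()
        return response.json()['data']['id']  # id of the new copy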
@@ -1270,11 +1268,10 @@
 header = {'Authorization': 'Oauth {}'.format(token)}
 source_id = params.get('shock_id')
 if not source_id:
-raise ValueError('Must provide shock ID')
+raise ValueError('Must provide Blobstore ID')
 with requests.get(self.shock_url + '/node/' + source_id + '/acl/?verbosity=full',
 headers=header, allow_redirects=True) as res:
-self.check_shock_response(
-res, 'Error getting ACLs for Shock node {}: '.format(source_id))
+self.check_shock_response(res, f'Error getting ACLs for Blobstore node {source_id}: ')
 owner = res.json()['data']['owner']['username']
 if owner != ctx['user_id']:
 out = self.copy_shock_node(ctx, params)[0]
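A sketch of the ownership check above: fetch the node's ACLs, compare the owner to the calling user, and fall back to copying the node (see the copy sketch above) when the caller is not the owner. The helper name is illustrative:

import requests

def owns_node(shock_url, token, source_id, user_id):
    header = {'Authorization': 'Oauth {}'.format(token)}
    with requests.get(shock_url + '/node/' + source_id + '/acl/?verbosity=full',
                      headers=header, allow_redirects=True) as res:
        res.raise_for_status()
        owner = res.json()['data']['owner']['username']
    return owner == user_id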
@@ -1292,8 +1289,7 @@
 # meh
 with requests.get(self.shock_url + '/node/' + source_id,
 headers=header, allow_redirects=True) as r:
-errtxt = ('Error downloading attributes from shock ' +
-'node {}: ').format(source_id)
+errtxt = f'Error downloading attributes from Blobstore node {source_id}: '
 self.check_shock_response(r, errtxt)
 out = {'shock_id': source_id,
 'handle': self.make_handle(r.json()['data'], token)}
@@ -1503,7 +1499,7 @@
 del ctx
 wsver = Workspace(self.ws_url).ver()
 with requests.get(self.shock_url, allow_redirects=True) as resp:
-self.check_shock_response(resp, 'Error contacting Shock: ')
+self.check_shock_response(resp, 'Error contacting the Blobstore: ')
 shockver = resp.json()['version']
 #END versions
I know that GitHub Actions is transitioning ubuntu-latest from 22.04 to 24.04. However, since the current ubuntu-latest now points to 24.04, does it make sense to continue using ubuntu-latest, or could there be potential issues to consider?
I just copied the test.yml file from another SDK repo that was already updated. I don't have strong feelings about using 22 or 24, but I do think we should pin the version. That said, I really doubt that pinning 22 vs. 24 will make a difference.
If you want to use 22, I'm fine with that, or we can keep it as is.
Oh, you already approved. OK, I'll merge, and I can change it to 22 if you think that's important.