Add course log fetch CLI script #1
base: master
Conversation
 * @param string $keyname S3 keyname.
 * @return void
 */
public function download_file($filepath, $keyname) {
This method should probably return $result to allow for better error handling.
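A minimal sketch of what the suggestion might look like, assuming the class wraps an AWS SDK for PHP `S3Client` in a `$this->client` property and stores the bucket name in `$this->bucket` (both internal names are assumptions, not confirmed by the diff):

```php
/**
 * Download a file from S3 to a local path.
 *
 * @param string $filepath Local destination path.
 * @param string $keyname S3 keyname.
 * @return \Aws\Result SDK result object, returned so callers can inspect it.
 */
public function download_file($filepath, $keyname) {
    // getObject with 'SaveAs' streams the object body straight to disk.
    $result = $this->client->getObject([
        'Bucket' => $this->bucket,
        'Key'    => $keyname,
        'SaveAs' => $filepath,
    ]);
    return $result;
}
```

Returning the `\Aws\Result` lets the caller check metadata (e.g. `ContentLength`) instead of assuming success from a `void` return.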
Doesn't an exception get thrown in that case?
For example, calling upload_file with a nonexistent bucket:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>NoSuchBucket</Code><Message>The specified bucket does not exist</Mes (truncated...)
NoSuchBucket (client): The specified bucket does not exist - <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>NoSuchBucket</Code><Message>The specified bucket does not exist</Message><BucketName>nswdet-evet-testa</BucketName><RequestId>9C25F4F3620A3ED4</RequestId><HostId>P9hlNIS8/PS6BFO4D6dhn7SDPFWSL1VqAWyvTMFC9G8T00rlUNTc5txFzZDIcfc88o0PwlPJm7w=</HostId></Error>
Backtrace:
* line 101 of /local/aws/sdk/Aws/WrappedHttpHandler.php: call to Aws\WrappedHttpHandler->parseError()
* line 203 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to Aws\WrappedHttpHandler->Aws\{closure}()
* line 174 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise::callHandler()
* line 40 of /local/aws/sdk/GuzzleHttp/Promise/RejectedPromise.php: call to GuzzleHttp\Promise\Promise::GuzzleHttp\Promise\{closure}()
* line 47 of /local/aws/sdk/GuzzleHttp/Promise/TaskQueue.php: call to GuzzleHttp\Promise\RejectedPromise::GuzzleHttp\Promise\{closure}()
* line 96 of /local/aws/sdk/GuzzleHttp/Handler/CurlMultiHandler.php: call to GuzzleHttp\Promise\TaskQueue->run()
* line 123 of /local/aws/sdk/GuzzleHttp/Handler/CurlMultiHandler.php: call to GuzzleHttp\Handler\CurlMultiHandler->tick()
* line 246 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Handler\CurlMultiHandler->execute()
* line 223 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->invokeWaitFn()
* line 267 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->waitIfPending()
* line 225 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->invokeWaitList()
* line 267 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->waitIfPending()
* line 225 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->invokeWaitList()
* line 62 of /local/aws/sdk/GuzzleHttp/Promise/Promise.php: call to GuzzleHttp\Promise\Promise->waitIfPending()
* line 59 of /local/aws/sdk/Aws/AwsClientTrait.php: call to GuzzleHttp\Promise\Promise->wait()
* line 78 of /local/aws/sdk/Aws/AwsClientTrait.php: call to Aws\AwsClient->execute()
* line 92 of /admin/tool/s3logs/classes/client/s3_client.php: call to Aws\AwsClient->__call()
* line 198 of /admin/tool/s3logs/classes/task/process_logs.php: call to tool_s3logs\client\s3_client->upload_file()
* line 137 of /admin/tool/task/cli/schedule_task.php: call to tool_s3logs\task\process_logs->execute()
So the check

if (!$s3url) {
    throw new \moodle_exception('s3uploadfailed', 'tool_s3logs', '');
}

will never be reached.
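Since the SDK surfaces failures as exceptions rather than a falsy return, the upload call could be wrapped instead. A hedged sketch (the `$filepath`/`$keyname` variable names are assumptions about the calling code):

```php
try {
    $s3url = $s3client->upload_file($filepath, $keyname);
} catch (\Aws\S3\Exception\S3Exception $e) {
    // Re-throw as a Moodle exception so the failure is reported consistently.
    throw new \moodle_exception('s3uploadfailed', 'tool_s3logs', '', null, $e->getMessage());
}
```

This keeps the `s3uploadfailed` error string in play while catching the actual failure mode shown in the backtrace above.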
$s3client = new s3_client();
$tempdir = make_temp_directory('s3logs_download');
$tempfilepath = tempnam($tempdir, 's3logs_');
$s3client->download_file($tempfilepath, $s3logkey);
We probably shouldn't open a file handle unless we have successfully got the file from S3.
$tempfilepath = tempnam($tempdir, 's3logs_');
$s3client->download_file($tempfilepath, $s3logkey);
$loghandle = fopen($tempfilepath, 'r');
return $loghandle;
Maybe default the log handle to false and only update it if the S3 get was good.
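Combining both review points, the download helper might look like this sketch, assuming the S3 client throws an SDK exception on failure (as the backtrace earlier in the thread suggests):

```php
// Default to false so callers can distinguish a failed download.
$loghandle = false;
$tempdir = make_temp_directory('s3logs_download');
$tempfilepath = tempnam($tempdir, 's3logs_');
try {
    $s3client->download_file($tempfilepath, $s3logkey);
    // Only open the handle once the S3 download has succeeded.
    $loghandle = fopen($tempfilepath, 'r');
} catch (\Aws\S3\Exception\S3Exception $e) {
    // Download failed: $loghandle stays false for the caller to check.
    debugging('Failed to download ' . $s3logkey . ': ' . $e->getMessage());
}
return $loghandle;
```

The caller then checks `if ($loghandle === false)` before reading, instead of operating on a handle to an empty temp file.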
$currentlogcount++;

echo "\nDownloading $s3logkey ($currentlogcount/$totallogs) \n";
$loghandle = self::download_s3_log_file($s3logkey);
Do we want this loop to continue if we can't get one of the files, or are we happy for the whole thing to explode?
Whole thing to explode, IMO. I'd rather have atomic failure.
global $DB;

foreach ($courseids as $courseid) {
    $records = $DB->get_records(
Be very careful calling get_records on the standard log store. LTU, for example, has about a billion rows and is 180GB on disk.
This really should use get_recordset and fetch results in pages. In Moodle 3.5 (and maybe 3.4) you won't need to page with get_recordset; see MDL-60174.
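A sketch of the suggested pattern, assuming the same table and conditions as the original get_records call (the table name and `process_record` helper here are illustrative placeholders):

```php
// get_recordset keeps a server-side cursor instead of loading every
// matching row into memory at once, which matters on huge log tables.
$rs = $DB->get_recordset('logstore_standard_log', ['courseid' => $courseid]);
foreach ($rs as $record) {
    process_record($record); // Handle one row at a time.
}
$rs->close(); // Recordsets must always be closed when done.
```

Unlike get_records, the recordset never materialises the full result set, so memory use stays flat regardless of how many log rows a course has.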
Do you think it's worth it given the scale we're dealing with here, since we're only pulling one course's logs at a time?