
Better handling of dataset not found error #204

Open
Tracked by #233
msm-code opened this issue May 28, 2020 · 2 comments
Labels
type:bug (Something isn't working), zone:backend (Backend oriented tasks)
Milestone
v1.5.0

Comments

@msm-code
Contributor

Environment information

  • Mquery version (from the /status page): 1.2.0
  • Ursadb version (from the /status page): 1.3.2+1125ee5
  • Installation method: Other (k8s)

Reproduction Steps

Start database compacting.
Run a query at just the right moment (ideally a long-running one).
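
For step 1, compaction can also be started by hand; a minimal example, assuming a stock deployment with ursadb's ursacli client available:

compact all;

This merges datasets into fewer, larger ones, which is exactly what invalidates previously listed dataset IDs mid-query.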

Expected behaviour

The query completes, possibly without testing all datasets if some of them were compacted away in the meantime (this should be counted as an error somewhere).

Actual behaviour (the bug)

Query ends with a failed status, without returning any results.

[28/05/2020 18:02:49][ERROR] Failed to execute task.
Traceback (most recent call last):
  File "/app/daemon.py", line 311, in __process_task
    self.__search_task(job)
  File "/app/daemon.py", line 99, in __search_task
    raise RuntimeError(result["error"])
RuntimeError: ursadb failed: Invalid dataset specified in query
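
For context, the raise at daemon.py line 99 (reconstructed from the traceback above; not verbatim source) treats every ursadb error as fatal for the whole job:

# Approximate shape of the failing path in __search_task; query_ursadb's
# result format here is an assumption based on the traceback.
result = query_ursadb(query, dataset_id)
if "error" in result:
    # Any error, including the "Invalid dataset specified in query" caused
    # by compacting, aborts the whole job.
    raise RuntimeError(result["error"])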
msm-code added the type:bug, status:up for grabs, and priority:low labels May 28, 2020
dskwhitehat removed their assignment Jun 10, 2020
msm-code mentioned this issue Nov 28, 2021
msm-code added this to the v1.3.0 milestone Nov 30, 2021
msm-code modified the milestones: v1.3.0, v1.4.0 Nov 26, 2022
@msm-code
Contributor Author

Moving to v1.4.0 since it's low priority (not easy to trigger with normal usage).

@msm-code
Contributor Author

msm-code commented Jan 26, 2023

Still thinking about it (maybe it's not as easy as I thought).

  • It's easy to cancel the whole processing (as we're doing now).
  • It's easy to ignore this error.
  • But we should probably continue processing the query and, at the same time, let the user know that some of the processed files are no longer available. This doesn't seem easy, because we don't support "non-critical errors". The closest option would be to increment the number of failed files, but we don't know how many files were in the dataset that just vanished.

My current idea: pass the dataset size to query_ursadb, and increase the number of failed files when the dataset is missing.
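
A minimal sketch of that idea (handle_dataset_result and the result format are illustrative assumptions; only query_ursadb's name comes from this thread):

from typing import Any, Dict

def handle_dataset_result(
    result: Dict[str, Any],
    dataset_file_count: int,
    files_errored: int,
) -> int:
    """Return the job's updated failed-file counter after one dataset."""
    if "error" in result:
        if "Invalid dataset" in result["error"]:
            # The dataset was compacted away between listing and querying:
            # count all of its files as failed and keep the job running.
            return files_errored + dataset_file_count
        # Every other ursadb error stays fatal, as in the current code.
        raise RuntimeError(result["error"])
    return files_errored

The caller would look up dataset_file_count before sending the query, so the number is still known even after the dataset disappears.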

msm-cert added the zone:backend label and removed the good first issue label Sep 16, 2024
msm-cert modified the milestones: v1.4.0, v1.5.0 Sep 29, 2024