
Evaluate the analysis engine against the analysis we currently have #21

Open
hellais opened this issue Dec 13, 2022 · 1 comment
hellais commented Dec 13, 2022

This is about setting up some form of evaluation criteria for the OONI Data analysis engine and measuring some key metrics to assess how well it's doing its job.
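As a rough illustration of the kind of metrics such an evaluation might track, here is a minimal sketch (all names hypothetical, not part of the actual engine) that scores the engine's blocking verdicts against a ground-truth set using precision, recall, and F1:

```python
# Hypothetical sketch: score engine verdicts against a ground truth.
# Both inputs are sets of measurement IDs considered "blocked".

def evaluate(engine_flags: set, ground_truth: set) -> dict:
    tp = len(engine_flags & ground_truth)   # flagged and truly blocked
    fp = len(engine_flags - ground_truth)   # flagged but not actually blocked
    fn = len(ground_truth - engine_flags)   # blocked but missed by the engine
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}
```

For example, if the engine flags measurements {1, 2, 3} and the ground truth is {2, 3, 4}, this yields precision, recall, and F1 of 2/3 each.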

@hellais hellais self-assigned this Oct 6, 2023

hellais commented Oct 16, 2023

An initial evaluation has been done by focusing only on the DNS level anomalies. The findings from this initial investigation have been documented as part of an internal presentation made to the team here: https://docs.google.com/presentation/d/1rw7a02lpTj4CcguAz_nbqzNzkACKdFvMzTLrairAQ70/edit.

The approach we followed was to restrict the comparison to a location where we have good ground truth, which in this case was Russia, where we can easily build ground truth from the official blocklists.
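Deriving ground truth from an official blocklist could look something like the following sketch (the field names and the flat list-of-dicts shape are assumptions for illustration; in practice the blocklist would also need to be matched against the measurement's timestamp):

```python
# Hypothetical sketch: label measurements as blocked/unblocked using an
# official blocklist. Each measurement is a dict with "id" and "domain";
# the time dimension of the blocklist is elided here for brevity.

def label_measurements(measurements: list, blocklist_domains: set) -> dict:
    labels = {}
    for m in measurements:
        # A measurement is ground-truth "blocked" when its domain is listed.
        labels[m["id"]] = m["domain"] in blocklist_domains
    return labels
```

The resulting labels can then be compared against the engine's DNS-level anomaly verdicts for the same measurements.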

Some ML based analysis was also done as part of this.

The next steps for this issue involve:

  • extending it to support more than just DNS based anomalies
  • expanding the ground truth to other countries (optional)
  • writing up the findings in a report
  • sharing the report with some field experts for collecting feedback
  • setting up the analysis so it can easily be reproduced as we move forward with the analysis engine
