
[Reporting] Stabilize CSV export tests #112204

Merged (12 commits into elastic:master, Sep 27, 2021)

Conversation

@tsullivan (Member) commented Sep 15, 2021:

A CSV Export issue was confirmed by changing the test server's reporting settings to their defaults. Under the default settings (no artificially low maxSizeBytes setting), most CSV Export tests ran against 4675 documents. That amount of data exposed weak performance in Elasticsearch's _scroll API, and the Reporting tests were failing, essentially due to shard failures.

This PR stabilizes the CSV tests by adjusting them to use a smaller date range, so that searches return a smaller total number of records. This goal comes from here.

@tsullivan tsullivan changed the title from "Reportingg/csv flaky test" to "[Reporting] Address Flaky CSV test" on Sep 15, 2021
@elastic elastic deleted a comment from kibanamachine Sep 15, 2021
@elastic elastic deleted a comment from kibanamachine Sep 15, 2021
@elastic elastic deleted a comment from kibanamachine Sep 20, 2021
@tsullivan tsullivan force-pushed the reportingg/csv-flaky-test branch 4 times, most recently from be8350d to c3dd0ba Compare September 21, 2021 01:00
- const fromTime = 'Apr 27, 2019 @ 23:56:51.374';
- const toTime = 'Aug 23, 2019 @ 16:18:51.821';
+ const fromTime = 'Jun 20, 2019 @ 00:00:00.000';
+ const toTime = 'Jun 25, 2019 @ 00:00:00.000';
@tsullivan (Member, Author) commented:

The earlier time range created a search that matched 4675 hits; the new range matches 720.

A contributor replied:

I was only able to reliably reproduce the buggy behaviour with larger CSV exports. I am concerned that if we reduce the number of hits from the search we might hide issues like this in future.

I understand this runs against the changes here that introduced a full snapshot check, but perhaps we could do both tests? One checking a full snapshot (high resolution) and one checking a large export count (with lower resolution snapshot, first 10 and last 10 lines as we had it before).

Let me know what you think!
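
For illustration, a minimal sketch of the lower-resolution check suggested above (first 10 and last 10 lines plus a row count). The helper name and the expectSnapshot/expect calls are assumptions about the test setup, not the actual FTR code:

// Sketch only: snapshot just the edges of the exported CSV so a very large
// export can be verified without a huge full-text snapshot.
function getCsvEdges(csv: string, edgeSize = 10) {
  const lines = csv.trim().split('\n');
  return {
    firstLines: lines.slice(0, edgeSize),
    lastLines: lines.slice(-edgeSize),
    totalRows: lines.length - 1, // minus the header row
  };
}

// Hypothetical usage inside a test:
//   const { firstLines, lastLines, totalRows } = getCsvEdges(csvText);
//   expectSnapshot([...firstLines, ...lastLines].join('\n')).toMatch();
//   expect(totalRows).to.be(4675);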


describe('Generation from Job Params', () => {
@tsullivan (Member, Author) commented Sep 21, 2021:

All of this removed code used the deprecated export type, and was moved to the new test file: x-pack/test/reporting_api_integration/reporting_and_security/generate_csv_discover_deprecated.ts

@@ -165,23 +164,21 @@ export default function ({ getService }: FtrProviderContext) {
describe('Discover: Generate CSV report', () => {
it('does not allow user that does not have the role-based privilege', async () => {
const res = await reportingAPI.generateCsv(
@tsullivan (Member, Author) commented:

Decided to move the username and password parameters to the end of the argument list. This will be useful for more csv_searchsource testing in later PRs.
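
As a reference, a rough sketch of what the reworked call shape could look like; the parameter names, types, and defaults here are assumptions for illustration, not the actual reportingAPI service code:

// Sketch: credentials moved to the end of the argument list so most call
// sites can omit them and rely on defaults.
interface CsvJobParams {
  searchSource: Record<string, unknown>;
  title: string;
  browserTimezone?: string;
}

async function generateCsv(
  jobParams: CsvJobParams,
  username = 'test_user',
  password = 'changeme'
): Promise<{ jobParams: CsvJobParams; auth: { username: string; password: string } }> {
  // The real service would POST these job params to the Reporting generation
  // endpoint using the given credentials; returning them keeps the sketch self-contained.
  return { jobParams, auth: { username, password } };
}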

@tsullivan tsullivan marked this pull request as ready for review September 21, 2021 03:30
@tsullivan tsullivan requested review from a team as code owners September 21, 2021 03:30
@elasticmachine (Contributor) commented:

Pinging @elastic/kibana-app-services (Team:AppServices)

@elasticmachine (Contributor) commented:

Pinging @elastic/kibana-reporting-services (Team:Reporting Services)

@tsullivan tsullivan marked this pull request as draft September 21, 2021 03:32
@tsullivan tsullivan marked this pull request as ready for review September 21, 2021 23:53
@elastic elastic deleted a comment from kibanamachine Sep 21, 2021
@@ -345,6 +345,15 @@ export class CsvGenerator {
break;
}

// TODO check for shard failures, log them and add a warning if found
A contributor replied:

👍🏻
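
For readers following along, a hedged sketch of what the shard-failure check flagged by the TODO above might look like; the logger shape and warning wording are assumptions, not the implemented behavior:

// Sketch: inspect the _shards section of each search/scroll response and
// surface failures as a warning instead of silently producing a short CSV.
interface ShardStats {
  total: number;
  successful: number;
  failed: number;
}

function warnOnShardFailures(
  shards: ShardStats | undefined,
  logger: { warn: (msg: string) => void }
): string | undefined {
  if (shards && shards.failed > 0) {
    const warning = `Search had ${shards.failed} of ${shards.total} shards fail; the CSV may be incomplete.`;
    logger.warn(warning);
    return warning; // a caller could attach this to the report output as a warning
  }
  return undefined;
}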

@jloleysens (Contributor) left a comment:

This is looking great, @tsullivan! I left a few comments that I'd like to get your thoughts on before merging.

});

it('generates a report from a new search with data: discover:searchFieldsFromSource', async () => {
await setFieldsFromSource(true);
await PageObjects.discover.clickNewSearchButton();
- await PageObjects.reporting.setTimepickerInDataRange();
+ await PageObjects.reporting.setTimepickerInEcommerceDataRange();
A contributor commented:

Just after this line it might be useful to add an assertion against the number of hits returned by the search:

expect(await PageObjects.discover.getHitCount()).to.equal('4,675');

That way, if this fails in future, we will know that at least the Discover search is probably working as expected.

@tsullivan (Member, Author) replied:

> I was only able to reliably reproduce the buggy behaviour with larger CSV exports. I am concerned that if we reduce the number of hits from the search we might hide issues like this in future.
>
> I understand this runs against the changes here that introduced a full snapshot check, but perhaps we could do both tests? One checking a full snapshot (high resolution) and one checking a large export count (with lower resolution snapshot, first 10 and last 10 lines as we had it before).

As much as I don't like to use Reporting tests for finding issues with other domains or services, I agree with you since it seems there aren't any other features (or tests) that use _scroll the way that Reporting uses it.
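
For background, this is roughly the access pattern at issue: Reporting pages through every hit with the _scroll API. A sketch assuming the v7-style @elastic/elasticsearch client (responses wrapped in a body property); the index name, page size, and query are placeholders:

import { Client } from '@elastic/elasticsearch';

// Sketch: walk all hits with _scroll, the pattern whose shard failures are
// discussed above. Not the actual CsvGenerator code.
async function scrollAllHits(client: Client, index: string): Promise<unknown[]> {
  const allHits: unknown[] = [];
  let response = await client.search({
    index,
    scroll: '30s',
    size: 1000,
    body: { query: { match_all: {} } },
  });
  let scrollId = response.body._scroll_id;
  while (response.body.hits.hits.length > 0) {
    allHits.push(...response.body.hits.hits);
    response = await client.scroll({ scroll: '30s', body: { scroll_id: scrollId } });
    scrollId = response.body._scroll_id;
  }
  await client.clearScroll({ body: { scroll_id: scrollId } });
  return allHits;
}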

@jloleysens (Contributor) commented:

@elasticmachine merge upstream

@jloleysens (Contributor) left a comment:

Thanks for addressing my feedback, @tsullivan!

> As much as I don't like to use Reporting tests for finding issues with other domains or services, I agree with you since it seems there aren't any other features (or tests) that use _scroll the way that Reporting uses it.

That concern makes sense to me; perhaps this is an improvement we should follow up on with the ES side (i.e., adding tests there for larger scrolls). We could open an issue about it. What do you think?

@kibanamachine (Contributor) commented:

💚 Build Succeeded

Metrics [docs]

✅ unchanged

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@tsullivan tsullivan merged commit cb37ae8 into elastic:master Sep 27, 2021
tsullivan added a commit to tsullivan/kibana that referenced this pull request Sep 27, 2021
* [Reporting] Stabilize CSV export tests

* add debugging logging for results metadata

* restore accidentally deleted tests

* restore "large export" test

* remove redundant availability test

* do not filter and re-save

* fix getHitCount

* fix large export test

* skip large export test :(

Co-authored-by: Kibana Machine <[email protected]>
@tsullivan tsullivan deleted the reportingg/csv-flaky-test branch September 27, 2021 20:03
@tsullivan (Member, Author) replied:

> That concern makes sense to me; perhaps this is an improvement we should follow up on with the ES side (i.e., adding tests there for larger scrolls). We could open an issue about it. What do you think?

I'm not sure we need an issue to follow up on things right now, since it looks like the teams are working actively on resolving the issue. Here is another issue where the work is happening: https://github.com/elastic/machine-learning-qa/issues/1125

tsullivan added a commit that referenced this pull request Sep 27, 2021
* [Reporting] Stabilize CSV export tests (#112204)

* [Reporting] Stabilize CSV export tests

* add debugging logging for results metadata

* restore accidentally deleted tests

* restore "large export" test

* remove redundant availability test

* do not filter and re-save

* fix getHitCount

* fix large export test

* skip large export test :(

Co-authored-by: Kibana Machine <[email protected]>

* update snapshots for discover tests

* update test snapshots

Co-authored-by: Kibana Machine <[email protected]>
@dnhatn (Member) commented Sep 27, 2021:

We have added a test that verifies scroll and search_after requests against a large index.

@sophiec20 sophiec20 added the (Deprecated) Feature:Reporting Use Reporting:Screenshot, Reporting:CSV, or Reporting:Framework instead label Aug 21, 2024
Labels

(Deprecated) Feature:Reporting (Use Reporting:Screenshot, Reporting:CSV, or Reporting:Framework instead)
release_note:skip (Skip the PR/issue when compiling release notes)
v7.16.0
v8.0.0