
[Bugfix] Fix cache maximum size settings not working properly with pluggable caching #16636

Open · wants to merge 3 commits into main

Conversation

peteralfonsi (Contributor)

Description

Fixes a bug (#16631) where max-size settings for cache implementations were not applied correctly when pluggable caching is enabled.

Cache implementations like OpenSearchOnHeapCache had been changed so that a max size value passed in through the cache config overrides the value from the implementation's own setting. This allows the TieredSpilloverCache to pass different sizes into the cache config when constructing its segments. However, the IndicesRequestCache (IRC) was also putting its default value of 1% of heap size into the cache config, and this default overrode the setting values even when pluggable caching was on.

This PR fixes the bug by having the IRC put a max size value into the config only when pluggable caching is off.

Adds unit tests around this. Also tested manually with different combinations of settings.
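The precedence logic described above can be sketched as follows. This is a simplified, hypothetical model (the class and method names are illustrative, not the actual OpenSearch API): the cache honors a config-supplied max size when one is present, otherwise it falls back to its own setting, and the request-cache layer injects its default only when pluggable caching is disabled.

```java
// Simplified model of the fix; names are illustrative, not OpenSearch's real API.
import java.util.OptionalLong;

class CacheSizeResolution {
    static final long IRC_DEFAULT_BYTES = 1024;  // stand-in for "1% of heap"
    static final long SETTING_BYTES = 4096;      // value from the cache's own setting

    // Models OpenSearchOnHeapCache: a max size passed via the cache config
    // overrides the cache's own setting (this is what lets the
    // TieredSpilloverCache size individual segments differently).
    static long resolveMaxSize(OptionalLong configMaxSize, long settingMaxSize) {
        return configMaxSize.orElse(settingMaxSize);
    }

    // Models the IndicesRequestCache after the fix: inject the default
    // max size into the config only when pluggable caching is off.
    static OptionalLong configMaxSize(boolean pluggableCachingEnabled) {
        return pluggableCachingEnabled ? OptionalLong.empty()
                                       : OptionalLong.of(IRC_DEFAULT_BYTES);
    }
}
```

With pluggable caching on, the config carries no max size and the cache's own setting wins; with it off, the IRC default applies as before, preserving the pre-pluggable-caching behavior.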

Related Issues

Resolves #16631

Check List

  • Functionality includes testing.
  • [N/A] API changes companion pull request created, if applicable.
  • [N/A] Public documentation issue/PR created, if applicable.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.

Signed-off-by: Peter Alfonsi <[email protected]>

✅ Gradle check result for 3d1fe73: SUCCESS


codecov bot commented Nov 14, 2024

Codecov Report

Attention: Patch coverage is 87.09677% with 4 lines in your changes missing coverage. Please review.

Project coverage is 72.08%. Comparing base (10873f1) to head (3d1fe73).
Report is 10 commits behind head on main.

Files with missing lines                                    Patch %   Lines
...va/org/opensearch/indices/IndicesRequestCache.java       86.95%    3 Missing ⚠️
.../opensearch/common/cache/service/CacheService.java       80.00%    0 Missing and 1 partial ⚠️
Additional details and impacted files
@@             Coverage Diff              @@
##               main   #16636      +/-   ##
============================================
- Coverage     72.15%   72.08%   -0.08%     
- Complexity    65145    65162      +17     
============================================
  Files          5315     5318       +3     
  Lines        303573   303833     +260     
  Branches      43925    43962      +37     
============================================
- Hits         219039   219003      -36     
- Misses        66587    66925     +338     
+ Partials      17947    17905      -42     

☔ View full report in Codecov by Sentry.

@peteralfonsi added the backport 2.x (Backport to 2.x branch) label on Nov 14, 2024

❌ Gradle check result for 469006a: FAILURE

Please examine the workflow log, locate, and copy-paste the failure(s) below, then iterate to green. Is the failure a flaky test unrelated to your change?


❌ Gradle check result for 76ad2ef: FAILURE


@peteralfonsi (Author)

Flaky tests: #14568, #16015

Labels
  • backport 2.x — Backport to 2.x branch
  • bug — Something isn't working
  • Search — Search query, autocomplete ...etc
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[BUG] Cache maximum size settings don't work as expected when pluggable caching is on
2 participants