
Redis with Large Cached Content Fails Resulting in Memory Leak #81

Closed
0xEmma opened this issue Apr 28, 2024 · 2 comments · Fixed by #83

Comments

0xEmma commented Apr 28, 2024

Using Redis v7.2.4

Files of ≥512 MB fail with:

Impossible to set value into Redis, write tcp 10.69.42.43:57726->10.69.42.42:6379: write: connection reset by peer

This is most likely a result of Redis's 512 MB cap on string values; files of ~900 MB get stuck partway through the download on cached handlers.

Large files (a few GB) result in an HTTP timeout that never lets the download complete, and cause a memory leak that consumes all of the system's memory.

Suggested fix:
- If the Redis write fails, bypass the cache.
- On first load, send the response and cache it in a non-blocking manner.
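The suggested fix could be sketched roughly as below. This is a minimal illustration, not the project's actual code: `cacheSet`, `serve`, and the `maxBulkLen` parameter are hypothetical stand-ins, with `maxBulkLen` modeling Redis's `proto-max-bulk-len` cap (512 MB by default).

```go
package main

import (
	"errors"
	"fmt"
)

// cacheSet stands in for a Redis SET. The maxBulkLen parameter models
// Redis's proto-max-bulk-len cap; a real implementation would call the
// Redis client here instead of checking the length locally.
func cacheSet(key string, value []byte, maxBulkLen int) error {
	if len(value) >= maxBulkLen {
		return errors.New("value exceeds proto-max-bulk-len")
	}
	// real client write to Redis would go here
	return nil
}

// serve sends the response body first, then attempts the cache write in
// a goroutine, so a failed or slow write never blocks the client or
// keeps the response pinned in memory waiting on Redis.
func serve(key string, body []byte, send func([]byte), maxBulkLen int) {
	send(body) // respond immediately; never wait on the cache

	go func() {
		if err := cacheSet(key, body, maxBulkLen); err != nil {
			// bypass the cache on failure instead of failing the request
			fmt.Printf("cache bypassed for %q: %v\n", key, err)
		}
	}()
}

func main() {
	// tiny cap (16 bytes) so the bypass path triggers in this demo
	serve("/big-file", make([]byte, 32), func(b []byte) {
		fmt.Printf("sent %d bytes to client\n", len(b))
	}, 16)
}
```

The key property is that the client always receives the body, whether or not the cache write succeeds.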

0xEmma (Author) commented Apr 28, 2024

Setting proto-max-bulk-len in the Redis config fixes the issue for files that don't exceed the HTTP timeout (beyond a long delay before the download starts).
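For reference, the workaround amounts to raising Redis's per-string cap in redis.conf (the value below is an example; pick one large enough for your biggest cached file):

```
# redis.conf — raise the per-string cap from the 512mb default
proto-max-bulk-len 2gb
```

The same setting can be changed at runtime with `CONFIG SET proto-max-bulk-len 2gb`, though that does not persist across restarts unless written back to the config.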

However, the HTTP timeout still occurs for larger files and causes a memory leak.

darkweak (Collaborator) commented

Hello @0xEmma, IMHO you may use the max_body_bytes directive to prevent these files from being cached.
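If the cache is configured through a Caddyfile, the directive might be placed along these lines. This is a hypothetical sketch only; the directive name comes from the comment above, but its exact placement and value syntax should be checked against the project's configuration reference:

```
# hypothetical placement — consult the project's docs for exact syntax
cache {
    max_body_bytes 524288000  # skip caching bodies over ~500 MB
}
```

Responses larger than the limit would then be served directly without ever being written to Redis, sidestepping both the 512 MB string cap and the timeout.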

@darkweak darkweak linked a pull request May 30, 2024 that will close this issue