Hi all, I am crawling a single index page of about 50,000 links and then crawling each one of those pages. I get the following warnings from crawlee:
Here's the VM's memory usage: it is using 820 MB of 1.9 GB. It looks like crawlee is always trying to reserve 50% of memory. Ideally it would use 75% of the total available memory (~1.5 GB). What's the correct way to get crawlee to use more of the available memory in this situation? Thanks so much! Nick
Replies: 2 comments
-
See the Snapshotter documentation
-
Ahh, I missed the environment variable! Thank you @LeMoussel!
From the Snapshotter documentation:
Memory becomes overloaded if its current use exceeds the maxUsedMemoryRatio option. It’s computed using the total memory available to the container when running on the Apify platform and a quarter of total system memory when running locally. Max total memory when running locally may be overridden by using the CRAWLEE_MEMORY_MBYTES environment variable.
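A minimal sketch of the fix discussed above: export `CRAWLEE_MEMORY_MBYTES` in the shell before starting the crawler, so Crawlee treats that figure as total memory instead of the local default (a quarter of system memory). The value 1536 is just 75% of this VM's ~2 GB; adjust it for your machine.

```shell
# Declare ~75% of a 2 GB VM (1536 MB) as the memory available to Crawlee,
# overriding the local default of a quarter of total system memory.
export CRAWLEE_MEMORY_MBYTES=1536

# Confirm the variable is set in this shell before launching the crawler.
echo "CRAWLEE_MEMORY_MBYTES=${CRAWLEE_MEMORY_MBYTES}"
```

Start the crawler process from the same shell (for example `node main.js`, where `main.js` is a placeholder for your crawler's entry point) so the variable is inherited by the Node.js process.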