
Fatal JavaScript out of memory: Reached heap limit #150

Open
omnicortex opened this issue Feb 1, 2025 · 4 comments
omnicortex commented Feb 1, 2025

Describe the bug

When I run the following command to crawl with infinite depth:

./single-file-x86_64-linux https://candycancook.com/ --crawl-links=true --crawl-max-depth=0 --crawl-replace-URLs=true

I get the out-of-memory error below:


[589930:0x5ef5e3929000]   654249 ms: Mark-Compact 1408.4 (1413.7) -> 1398.2 (1407.5) MB, pooled: 0 MB, 10.50 / 0.00 ms  (average mu = 0.997, current mu = 0.795) allocation failure; scavenge might not succeed
[589930:0x5ef5e3929000]   654281 ms: Mark-Compact (reduce) 1418.6 (1427.9) -> 1418.5 (1420.9) MB, pooled: 0 MB, 2.18 / 0.00 ms  (+ 4.5 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 8 ms) (average mu = 0.994, cu

<--- JS stacktrace --->



#
# Fatal JavaScript out of memory: Reached heap limit
#
==== C stack trace ===============================

    ./single-file-x86_64-linux(+0x22a4163) [0x5ef5bde03163]
    ./single-file-x86_64-linux(+0x22a3a2b) [0x5ef5bde02a2b]
    ./single-file-x86_64-linux(+0x229ef48) [0x5ef5bddfdf48]
    ./single-file-x86_64-linux(+0x22f51dc) [0x5ef5bde541dc]
    ./single-file-x86_64-linux(+0x24a9827) [0x5ef5be008827]
    ./single-file-x86_64-linux(+0x24a79d3) [0x5ef5be0069d3]
    ./single-file-x86_64-linux(+0x24eefc1) [0x5ef5be04dfc1]
    ./single-file-x86_64-linux(+0x22a317a) [0x5ef5bde0217a]
    ./single-file-x86_64-linux(+0x4e2e3bc) [0x5ef5c098d3bc]
    ./single-file-x86_64-linux(+0x393eac7) [0x5ef5bf49dac7]
    ./single-file-x86_64-linux(+0x3a1b75e) [0x5ef5bf57a75e]
    ./single-file-x86_64-linux(+0x39f2827) [0x5ef5bf551827]
    ./single-file-x86_64-linux(+0x39271fd) [0x5ef5bf4861fd]
    ./single-file-x86_64-linux(+0x39a0b3c) [0x5ef5bf4ffb3c]
    ./single-file-x86_64-linux(+0x3a25474) [0x5ef5bf584474]
    ./single-file-x86_64-linux(+0x3965bf1) [0x5ef5bf4c4bf1]
    ./single-file-x86_64-linux(+0x3965bb9) [0x5ef5bf4c4bb9]
    ./single-file-x86_64-linux(+0x38dac30) [0x5ef5bf439c30]
    ./single-file-x86_64-linux(+0x3a361d5) [0x5ef5bf5951d5]
    /lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca) [0x7242a522a1ca]
    /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x8b) [0x7242a522a28b]
    ./single-file-x86_64-linux(+0x2277029) [0x5ef5bddd6029]
Trace/breakpoint trap (core dumped)

To Reproduce
Steps to reproduce the behavior:

  1. Run
./single-file-x86_64-linux https://candycancook.com/ --crawl-links=true --crawl-max-depth=0 --crawl-replace-URLs=true
  2. See error

Expected behavior

The crawl should complete without running out of memory.

Screenshots

None

Environment

  • OS: Ubuntu 24.04
  • Browser: Chrome
  • Version: 2.0.73

Additional context

@gildas-lormeau
Owner

Unfortunately I cannot reproduce the issue because https://candycancook.com is unresponsive/down. Can it be reproduced with another website?

@omnicortex
Author

@gildas-lormeau I just ran the command again, and https://candycancook.com/ works on my end. I can also access https://candycancook.com/ with Firefox. I have only run SingleFile on a couple of sites, and this is the only one that produced an error.

@gildas-lormeau
Owner

@omnicortex Maybe it's a random bug in Deno. I'll do a release soon with the updated Deno runtime.

@gildas-lormeau gildas-lormeau transferred this issue from gildas-lormeau/SingleFile Feb 6, 2025
@omnicortex
Author

@gildas-lormeau I will try to run a couple more tests to see if we can find out what's going on.
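For anyone hitting the same crash while waiting on a fix, two possible workarounds (not confirmed by the maintainer): bound the crawl depth so memory stays finite, or try raising the V8 heap limit above the ~1.4 GB it hit in the log. Whether the Deno-compiled binary honors the `DENO_V8_FLAGS` environment variable is an assumption here; the depth value `3` is arbitrary.

```shell
# Workaround 1 (sketch): cap the crawl depth instead of crawling
# infinitely (--crawl-max-depth=0), which bounds how much state is held.
./single-file-x86_64-linux https://candycancook.com/ \
  --crawl-links=true --crawl-max-depth=3 --crawl-replace-URLs=true

# Workaround 2 (sketch): raise the V8 old-space heap limit to 4 GB.
# ASSUMPTION: the Deno-compiled binary reads the DENO_V8_FLAGS env var.
DENO_V8_FLAGS=--max-old-space-size=4096 \
  ./single-file-x86_64-linux https://candycancook.com/ \
  --crawl-links=true --crawl-max-depth=0 --crawl-replace-URLs=true
```

Neither addresses the underlying leak, if there is one; they only delay or avoid hitting the heap limit.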
