Describe the bug
When I run the command to crawl with infinite depth, like this:
./single-file-x86_64-linux https://candycancook.com/ --crawl-links=true --crawl-max-depth=0 --crawl-replace-URLs=true
I get the out-of-memory error below:
[589930:0x5ef5e3929000]   654249 ms: Mark-Compact 1408.4 (1413.7) -> 1398.2 (1407.5) MB, pooled: 0 MB, 10.50 / 0.00 ms  (average mu = 0.997, current mu = 0.795) allocation failure; scavenge might not succeed
[589930:0x5ef5e3929000]   654281 ms: Mark-Compact (reduce) 1418.6 (1427.9) -> 1418.5 (1420.9) MB, pooled: 0 MB, 2.18 / 0.00 ms  (+ 4.5 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 8 ms) (average mu = 0.994, cu

<--- JS stacktrace --->

#
# Fatal JavaScript out of memory: Reached heap limit
#
==== C stack trace ===============================

./single-file-x86_64-linux(+0x22a4163) [0x5ef5bde03163]
./single-file-x86_64-linux(+0x22a3a2b) [0x5ef5bde02a2b]
./single-file-x86_64-linux(+0x229ef48) [0x5ef5bddfdf48]
./single-file-x86_64-linux(+0x22f51dc) [0x5ef5bde541dc]
./single-file-x86_64-linux(+0x24a9827) [0x5ef5be008827]
./single-file-x86_64-linux(+0x24a79d3) [0x5ef5be0069d3]
./single-file-x86_64-linux(+0x24eefc1) [0x5ef5be04dfc1]
./single-file-x86_64-linux(+0x22a317a) [0x5ef5bde0217a]
./single-file-x86_64-linux(+0x4e2e3bc) [0x5ef5c098d3bc]
./single-file-x86_64-linux(+0x393eac7) [0x5ef5bf49dac7]
./single-file-x86_64-linux(+0x3a1b75e) [0x5ef5bf57a75e]
./single-file-x86_64-linux(+0x39f2827) [0x5ef5bf551827]
./single-file-x86_64-linux(+0x39271fd) [0x5ef5bf4861fd]
./single-file-x86_64-linux(+0x39a0b3c) [0x5ef5bf4ffb3c]
./single-file-x86_64-linux(+0x3a25474) [0x5ef5bf584474]
./single-file-x86_64-linux(+0x3965bf1) [0x5ef5bf4c4bf1]
./single-file-x86_64-linux(+0x3965bb9) [0x5ef5bf4c4bb9]
./single-file-x86_64-linux(+0x38dac30) [0x5ef5bf439c30]
./single-file-x86_64-linux(+0x3a361d5) [0x5ef5bf5951d5]
/lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca) [0x7242a522a1ca]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x8b) [0x7242a522a28b]
./single-file-x86_64-linux(+0x2277029) [0x5ef5bddd6029]
Trace/breakpoint trap (core dumped)
To Reproduce
Steps to reproduce the behavior: run the command above.
Expected behavior
The crawl should complete without this error.
Screenshots
None
Environment
Additional context
Unfortunately I cannot reproduce the issue because https://candycancook.com is unresponsive/down. Can it be reproduced with another website?
@gildas-lormeau I just ran the command again, and https://candycancook.com/ works on my end. I can also access https://candycancook.com/ with Firefox. I have only run SingleFile on a couple of sites, and this is the only one that produces the error.
@omnicortex Maybe it's a random bug in Deno. I'll do a release soon with the updated Deno runtime.
@gildas-lormeau I will try to run a couple more tests to see if we can find out what's going on.
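For anyone investigating similar crawler memory growth: a minimal Python sketch of a depth-bounded breadth-first crawl. This is purely illustrative and is not SingleFile's implementation; the link graph is hypothetical data. It shows why a visited set matters when depth is unbounded (`--crawl-max-depth=0`): without it, link cycles keep re-enqueueing pages until the heap is exhausted.

```python
from collections import deque

# Hypothetical link graph standing in for a website; note the
# cycle "/" -> "/a" -> "/" that an unbounded crawl must handle.
LINKS = {
    "/": ["/a", "/b"],
    "/a": ["/", "/c"],
    "/b": ["/a"],
    "/c": [],
}

def crawl(start, max_depth=None):
    """Breadth-first crawl; max_depth=None means unlimited depth.
    The visited set keeps the frontier finite even with cycles."""
    visited = {start}
    frontier = deque([(start, 0)])
    order = []
    while frontier:
        url, depth = frontier.popleft()
        order.append(url)
        if max_depth is not None and depth >= max_depth:
            continue  # don't follow links past the depth limit
        for link in LINKS.get(url, []):
            if link not in visited:
                visited.add(link)
                frontier.append((link, depth + 1))
    return order

print(crawl("/"))               # unlimited depth, terminates via visited set
print(crawl("/", max_depth=1))  # bounded depth stops after direct links
```

If the real crawler deduplicates URLs the same way, the heap growth here would point at page content being retained rather than the frontier itself.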