Hello,
First, I love this package, thank you so much for your hard work.
I would love to see a feature that limits the number of scans. Some sites are so big that the scan lasts forever and usually ends up crashing. It would be great to have a limit so the scan stops after X URLs.
For example, a new argument:
--limit=500
With this, linkinator would scan the first 500 URLs it finds, then stop and report the results for those 500 URLs.
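To make the idea concrete, here is a rough sketch of the semantics I have in mind. This is illustrative TypeScript only, not linkinator's internals; the scanLimit parameter is a hypothetical stand-in for the proposed --limit value.

// A minimal breadth-first crawl that stops dequeuing once scanLimit
// URLs have been visited, regardless of how many links remain queued.
async function crawl(start: string, scanLimit = 500): Promise<string[]> {
  const origin = new URL(start).origin;
  const queue: string[] = [start];
  const seen = new Set<string>([start]);
  const scanned: string[] = [];

  while (queue.length > 0 && scanned.length < scanLimit) {
    const url = queue.shift()!;
    scanned.push(url);
    try {
      const html = await (await fetch(url)).text();
      // Naive href extraction; a real crawler would parse the HTML properly.
      for (const m of html.matchAll(/href="([^"]+)"/g)) {
        const next = new URL(m[1], url).toString();
        // Stay within the same origin, and never enqueue a URL twice.
        if (new URL(next).origin === origin && !seen.has(next)) {
          seen.add(next);
          queue.push(next);
        }
      }
    } catch {
      // Unreachable or non-HTML URLs still count toward the limit.
    }
  }
  return scanned; // at most scanLimit URLs, no matter how big the site is
}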
pjoulot changed the title from "Add a limit" to "Add a limit for the number of scanned URLs" on Dec 11, 2023.
Hey, thanks for the kind words! For a project like this, I'm usually really looking for comprehensiveness. I'm fine with it scanning for a long time as long as it can tell me nothing is broken.
Taking a step back: how are you trying to use it? The recurse option is meant to kinda limit the damage by keeping things shallow, and even with it enabled, it's really only supposed to scan within the same domain. Tell me more about the site you're trying to scan?
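For reference, this is the kind of shallow, same-domain scan I mean (example.com is just a placeholder URL):

npx linkinator https://example.com --recurse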