
Other - Memory usage optimization #237

Open
api0cradle opened this issue Mar 7, 2023 · 2 comments


@api0cradle

Describe the feature request or bug or other
I am trying to run a rather big wordlist (500 MB+) and the python process ends up eating all my memory (32 GB+).
It seems the code holds all results in memory until the run completes. A suggestion would be to add some sort of chunking based on the size of the file, or maybe just a set number of lines for each "task".
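For illustration, a minimal sketch of the chunking idea, assuming a hypothetical resolve() callable for the per-name lookup and plain csv output (the actual dnsrecon code paths differ):

```python
import csv
from itertools import islice

def iter_chunks(path, chunk_size=10_000):
    """Yield lists of at most chunk_size names from the wordlist,
    so only one chunk is ever held in memory."""
    with open(path, encoding="utf-8", errors="ignore") as fh:
        while True:
            chunk = [line.strip() for line in islice(fh, chunk_size)]
            if not chunk:
                return
            yield chunk

def brute_in_chunks(wordlist, resolve, csv_path):
    """Resolve one chunk at a time and stream results to disk
    instead of accumulating everything until the run ends."""
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        for chunk in iter_chunks(wordlist):
            for name in chunk:
                row = resolve(name)  # hypothetical lookup; returns a row or None
                if row:
                    writer.writerow(row)
```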

To Reproduce
Steps to reproduce the behaviour:

  1. Run the tool like this: `python dnsrecon.py -d domainwhatever.com -D first_level_subdomains_wordlist.txt -t brt -c output.csv`
  2. Memory usage goes through the roof

Expected behaviour
A more efficient way of handling bigger wordlists that does not consume that much RAM.

Screenshots
(screenshot of the python process's memory usage omitted)

System Information (System that tool is running on):

  • OS: Linux in WSL (Also tested on Ubuntu)


@api0cradle (Author) commented Mar 7, 2023

I did try to change the brute_domain() function (https://github.com/darkoperator/dnsrecon/blob/master/dnsrecon/cli.py#L424); however, due to the threading, I was not able to come up with a quick fix.
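For what it's worth, one way to keep the concurrency while bounding memory would be to submit one chunk at a time to a thread pool and drain its results before reading the next chunk. A rough sketch, with resolve() again standing in for the real per-name lookup (not the actual brute_domain() internals):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def brute_chunked_threaded(chunks, resolve, writer, workers=50):
    """Process chunks sequentially; within a chunk, resolve names
    concurrently and write each result as it completes, so at most
    one chunk's futures and results are alive at any time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for chunk in chunks:
            futures = [pool.submit(resolve, name) for name in chunk]
            for fut in as_completed(futures):
                row = fut.result()
                if row:
                    writer.writerow(row)
```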

@L1ghtn1ng (Collaborator) commented

With the latest changes in git, does this still happen?
