Feature Request: Add Proxy Support to Scrape-Up

To improve the robustness and efficiency of Scrape-Up, I propose adding proxy support. This feature will help users scrape data from the internet more effectively by reducing the risk of being blocked by websites.
Benefits:
Avoid IP Blocking
Access Geo-Restricted Content
Improve Scraping Performance
Implementation Details
Prepare a Proxy List File: Create a text file with a list of proxy addresses, one per line.
Call the Function: Use the get_valid_proxy function to retrieve a valid proxy from the list.
Implement in Scraping Logic: Use the retrieved proxy in your web scraping requests to help distribute the load and avoid IP blocking.
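The steps above could be sketched as follows. The function name `get_valid_proxy` comes from the proposal; everything else (the `load_proxies` helper, the injectable `is_alive` checker, and the proxy addresses) is a hypothetical illustration, not an existing Scrape-Up API:

```python
import urllib.request

def load_proxies(path):
    """Read one proxy address per line, skipping blanks and comments."""
    with open(path) as f:
        return [ln.strip() for ln in f if ln.strip() and not ln.startswith("#")]

def get_valid_proxy(proxies, is_alive=None, test_url="http://example.com", timeout=5):
    """Return the first proxy that answers a test request, or None.

    `is_alive` can be injected (e.g. for testing); by default it issues a
    real HTTP request through the proxy and checks for a 200 response.
    """
    if is_alive is None:
        def is_alive(proxy):
            handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
            opener = urllib.request.build_opener(handler)
            try:
                with opener.open(test_url, timeout=timeout) as resp:
                    return resp.status == 200
            except OSError:  # covers URLError, timeouts, connection resets
                return False
    for proxy in proxies:
        if is_alive(proxy):
            return proxy
    return None
```

A caller would then pass the returned proxy into its scraping requests, falling back to a direct connection (or raising) when `get_valid_proxy` returns `None`.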
I believe adding this feature will significantly enhance the usability and effectiveness of Scrape-Up. I am happy to help implement it.
Add Screenshots
When we scrape websites, each request is sent from our own known IP address, and making too many requests from the same IP can get us blocked by the website. Routing requests through a pool of proxy addresses spreads them across multiple IPs and reduces the chance of being blocked.
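The rotation described above might look like the following minimal sketch. The proxy addresses and the `next_proxy`/`fetch` helpers are hypothetical, shown only to illustrate round-robin rotation:

```python
from itertools import cycle
import urllib.request

# Hypothetical proxy pool; in practice this would come from the proxy list file.
PROXIES = ["http://1.2.3.4:8080", "http://5.6.7.8:3128"]
_pool = cycle(PROXIES)

def next_proxy():
    """Round-robin over the pool so consecutive requests leave from different IPs."""
    return next(_pool)

def fetch(url, timeout=5):
    """Fetch `url` through the next proxy in the rotation."""
    proxy = next_proxy()
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    return opener.open(url, timeout=timeout)
```

Because `itertools.cycle` repeats the pool indefinitely, every request picks the next address and the load is spread evenly across all proxies.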
Record
I agree to follow this project's Code of Conduct
I'm a GSSoC'24 contributor
I want to work on this issue
Hi there! Thanks for opening this issue. We appreciate your contribution to this open-source project. We aim to respond or assign your issue as soon as possible.