I've found that sites are less likely to block you from scraping them if you vary the user agent you send with your scrape requests. Being able to set one agent is good, but being able to set 10 or 20 and have the application choose one at random when requesting a target URL would help reduce the chances of being blocked.
"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2227.1 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; AS; rv:11.0) like Gecko",
"Mozilla/5.0 (compatible, MSIE 11, Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko"
]`
function getRandomUserAgent() { return userAgentList[Math.floor(Math.random()*userAgentList.length)]; }
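To show how the random pick could be used per request, here is a minimal sketch that sends the chosen agent as the `User-Agent` header. It assumes `node-fetch` as the HTTP client and a `scrape(targetUrl)` helper purely for illustration; the actual client and call site in the application may differ.

```js
// Hypothetical usage sketch: attach a randomly chosen user agent to each request.
// "node-fetch" and scrape() are assumptions for illustration, not the app's real API.
const fetch = require("node-fetch");

async function scrape(targetUrl) {
  const response = await fetch(targetUrl, {
    headers: { "User-Agent": getRandomUserAgent() },
  });
  return response.text();
}
```

With this approach each call to `scrape()` can present a different user agent, which is the rotation behavior requested above.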