Since most of the websites being scraped have strict rate limits, it might be better to replace the plain `requests` calls used in `BaseDataScrapper` and `BaseDataSetScrapper` with `requests-cache`, so that identical API calls are cached and we avoid losing time to rate limits after the initial requests. It should use the sqlite backend (the default) for persistent storage of API responses, so that refetching the same data going forward is much faster.
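A minimal sketch of what the swap could look like, assuming the scrapers currently call `requests.get()` directly. Only the `requests_cache.CachedSession` calls are the library's real API; the class layout and the `fetch`, `cache_name`, and `expire_after` names are illustrative assumptions, not the repo's actual code:

```python
# Sketch: drop-in replacement of requests with requests-cache (sqlite backend).
# Names other than the requests-cache API itself are hypothetical.
import requests_cache


class BaseDataScrapper:
    def __init__(self, cache_name: str = "scraper_cache", expire_after: int = 86400):
        # CachedSession behaves like requests.Session, but persists responses
        # to <cache_name>.sqlite so repeated calls never hit the remote API.
        self.session = requests_cache.CachedSession(
            cache_name=cache_name,
            backend="sqlite",
            expire_after=expire_after,  # seconds; None would cache indefinitely
        )

    def fetch(self, url: str, **kwargs):
        # Same signature as requests.get(); caching is transparent to callers.
        response = self.session.get(url, **kwargs)
        response.raise_for_status()
        return response
```

Because the sqlite backend writes to disk, the cache survives across runs, which is what makes refetching previously seen endpoints fast even after a restart.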