Enabling the ability to scrape with multiple spiders at once in real time.

The alternative would be to write an API that uses `requests` to send these requests one by one asynchronously and then combine the results, which I feel is a little inelegant and resource-intensive. Built-in support would be nice.
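The client-side workaround described above could be sketched roughly as follows. This is a minimal sketch, not an existing feature: it fans out one scrapyrt request per spider from a thread pool and merges the resulting item lists. The `/crawl.json?spider_name=...&url=...` endpoint shape follows scrapyrt's documented API, but the host, port, and spider names are placeholders, and the standard library is used here in place of `requests` to keep the sketch self-contained.

```python
import json
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlencode
from urllib.request import urlopen

# Placeholder address for a locally running scrapyrt instance.
SCRAPYRT = "http://localhost:9080/crawl.json"

def crawl_one(spider_name, url):
    """Ask scrapyrt to run a single spider against a single URL."""
    query = urlencode({"spider_name": spider_name, "url": url})
    with urlopen(f"{SCRAPYRT}?{query}") as resp:
        return json.load(resp)

def merge_items(responses):
    """Combine the 'items' lists from several scrapyrt responses."""
    combined = []
    for resp in responses:
        combined.extend(resp.get("items", []))
    return combined

def crawl_many(jobs):
    """jobs: iterable of (spider_name, url) pairs, crawled concurrently."""
    with ThreadPoolExecutor() as pool:
        responses = pool.map(lambda job: crawl_one(*job), jobs)
        return merge_items(responses)
```

As the issue notes, this means one HTTP round trip per spider plus client-side merging, which is exactly the overhead built-in batch support would avoid.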
It sounds interesting. I think some sort of batch processing would be good here. In your example it will be difficult to know which spider should crawl which URL, but maybe we could support something like this.
@pawelmhm Requesting the ability to scrape with multiple spiders asynchronously in a single call, instead of overwhelming the server with a separate request per spider.