I want to use Celery to build a distributed crawler.
The Celery workers would grab the data and store it in my MySQL database,
but the MySQL plan I bought allows only 50 connections (a limit set by the database provider),
so I can't run 100 workers writing to the database at the same time.
Is there a way for the workers to grab the data and return it to the master, so that only the master writes it to the database?
Does anyone have a solution?
Thank you.
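To make the question concrete, here is a minimal stdlib sketch of the pattern I mean, independent of Celery itself. `fetch` stands in for the worker's crawl step, and the in-memory `db_rows` list stands in for the master's single MySQL connection; all names and URLs here are illustrative, not my real setup:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Stand-in for a worker's crawl step: download + parse the page,
    # then RETURN the rows instead of writing them to MySQL.
    return {"url": url, "title": f"title of {url}"}

def crawl_and_store(urls, max_workers=100):
    db_rows = []  # stand-in for the master's single MySQL connection
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Many "workers" fetch pages in parallel...
        for row in pool.map(fetch, urls):
            # ...but only this loop (the "master") ever touches the DB,
            # so a handful of connections is enough no matter how many
            # workers are fetching.
            db_rows.append(row)
    return db_rows

rows = crawl_and_store([f"http://example.com/{i}" for i in range(10)])
```

In Celery terms, I imagine this would mean the task returns the parsed data (via a result backend) and the dispatching process collects the results and does all the INSERTs itself.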