Celery is distributed. If I run 100 workers, will all 100 workers connect to the database?

I want to use Celery to build a distributed crawler.

The Celery workers grab the data and store it in my MySQL database,

but the MySQL database I bought only allows 50 connections (a limit set by the database provider),

so I can't start 100 workers grabbing data at the same time?

Is there a way for the workers to grab the data and return it to the master, so that the master can store it in the database?

Do you have a solution?

Thank you

Mar. 19, 2021

Push the crawled data into a Redis queue (a List), then start a single slow-consumer process that pops items off the queue and writes them to MySQL. That way only the consumer holds a database connection, no matter how many workers you run.
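A minimal sketch of that pattern, assuming each Celery task has a Redis client available. The names (`enqueue_item`, `drain`, the `crawl:results` queue key) are illustrative, not from any library; in production `pop` would wrap `redis.brpop` and `store` would execute an INSERT over one persistent MySQL connection:

```python
import json

def enqueue_item(redis_conn, item, queue="crawl:results"):
    """Called inside each Celery task: push one crawled item onto the Redis list.
    Workers never touch MySQL, so the 50-connection limit is irrelevant to them."""
    redis_conn.lpush(queue, json.dumps(item))

def drain(pop, store, limit=None):
    """Slow-consumer loop run in a single separate process.
    `pop` returns the next JSON string from the queue, or None when empty;
    `store` writes one decoded item to MySQL over a single connection."""
    count = 0
    while limit is None or count < limit:
        raw = pop()
        if raw is None:
            break
        store(json.loads(raw))
        count += 1
    return count
```

Because `drain` consumes items one at a time over one connection, the database only ever sees a single client, while any number of workers keep filling the Redis list concurrently.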
