A problem with mass requests in a Node crawler

My requirement is that Node needs to crawl an interface several times in a short period. Each response is about 5 MB, and the resulting data is stored in MongoDB.

I have made several attempts now:

  1. when the number of requests is small (around 20), everything is fine
  2. when there are too many requests (200+), the data can be fetched, but cannot be saved reliably
  3. when the number of requests exceeds 500, memory overflows immediately

I would like to ask: is there a way to keep the number of concurrent asynchronous requests below 20?

Mar.28,2021

Concurrency control. A Node crawler that does no concurrency control is bound to overflow memory.

Use the async npm package; async.mapLimit controls concurrency.

Or roll your own wheel: about 15 lines of JavaScript are enough to implement concurrency control.
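As a sketch of that "simple wheel": a hand-rolled `mapLimit(items, limit, worker)` with the same shape as `async.mapLimit`, keeping at most `limit` promises in flight. The `fetchPage`/`saveToMongo` names in the usage comment are hypothetical placeholders for your crawl and Mongo-insert functions, not anything from the original post.

```javascript
// Minimal concurrency limiter: run `worker` over every item in `items`,
// but never keep more than `limit` workers running at the same time.
function mapLimit(items, limit, worker) {
  return new Promise((resolve, reject) => {
    const results = new Array(items.length);
    let next = 0;   // index of the next item to start
    let active = 0; // workers currently in flight
    let done = 0;   // workers finished

    function launch() {
      // Start workers until we hit the limit or run out of items.
      while (active < limit && next < items.length) {
        const i = next++;
        active++;
        Promise.resolve(worker(items[i]))
          .then((res) => {
            results[i] = res; // keep results in input order
            active--;
            done++;
            if (done === items.length) resolve(results);
            else launch(); // a slot freed up, start the next item
          })
          .catch(reject);
      }
    }

    if (items.length === 0) resolve(results);
    else launch();
  });
}

// Hypothetical usage for the crawler described in the question:
// await mapLimit(urls, 20, async (url) => {
//   const data = await fetchPage(url);   // your HTTP request
//   await saveToMongo(data);             // your Mongo insert
// });
```

Because each 5 MB response is saved and released before the next request in that slot starts, memory stays bounded by roughly `limit` responses instead of all 500 at once.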
