My requirement is that a Node.js script needs to fetch data from an API many times within a short period. Each response is about 5 MB, and the resulting data is stored in MongoDB.
I have made several attempts so far:
- when the number of requests is small (about 20), everything works fine
- when there are too many requests (200+), the data can be fetched, but saving it to MongoDB stalls or fails
- when the number of requests exceeds 500, the process runs out of memory and crashes
Is there a way to limit the number of async requests in flight so that no more than 20 run at the same time?