The task now: use PHP's curl functions to read IDs from a JSON file, then loop over them with curl to grab the data for each one.
That part by itself is not hard. The problem is that the JSON contains far too many entries (about 100,000) to fetch in one run. At the moment I manually copy 20-30 entries out of the large JSON file, run curl on them, and repeat, which is clearly not workable for more than 100,000 entries.
The general idea: can the script read 20 entries at a time in order and process them, stopping after the Nth batch? Or, on each run, process the first 20 entries and delete them from the file once they have been executed? (A sketch of the second approach is at the end of this post.)
The JSON data is very simple. Below is just a small example; the real values are English strings:
"11111","22222","33333","44444","55555","66666"
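For reference, a minimal sketch of loading that list in PHP, assuming the values sit in a file (the name ids.json is only a placeholder) and can be decoded as a JSON array once wrapped in brackets:

<?php
// Minimal loading sketch. ids.json is a hypothetical file name; the sample
// data above has no surrounding brackets, so add them before decoding.
$raw = trim(file_get_contents('ids.json'));
if ($raw !== '' && $raw[0] !== '[') {
    $raw = '[' . $raw . ']';
}
$ids = json_decode($raw, true);
if (!is_array($ids)) {
    exit("Could not decode ids.json\n");
}
echo count($ids) . " IDs loaded\n";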
To put it simply, each value from the JSON above is appended to a URL, and the URLs are fetched with curl in a loop. For example:
http://xxoo.com/id/11111
http://xxoo.com/id/22222
http://xxoo.com/id/33333
...
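A single fetch could look like the sketch below; the base URL is taken from the example above, and the function name fetch_one is only for illustration:

<?php
// Fetch one ID with curl and return the response body (false on failure).
function fetch_one($id) {
    $ch = curl_init('http://xxoo.com/id/' . urlencode($id));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't hang on a slow response
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}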
Large batches cannot be crawled because the site runs on a virtual (shared) host, so the plan is to grab 20-30 at a time until all 100,000+ entries have been fetched.
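To make the question concrete, here is a minimal batch sketch of the "process the first 20 and delete them afterwards" idea. The file name, batch size, and the sleep() pause are assumptions, not requirements:

<?php
// Each run takes the first 20 IDs from ids.json (hypothetical name),
// fetches them one by one, and writes the remaining IDs back to the file,
// so re-running the script (e.g. from cron) works through the whole list.
$batchSize = 20;
$file = 'ids.json';

$ids = json_decode(file_get_contents($file), true);
if (!is_array($ids) || count($ids) === 0) {
    exit("Nothing left to fetch\n");
}

$batch = array_splice($ids, 0, $batchSize); // take the first 20, keep the rest in $ids

foreach ($batch as $id) {
    $ch = curl_init('http://xxoo.com/id/' . urlencode($id));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);
    curl_close($ch);

    // TODO: store or process $body here
    echo $id . ': ' . ($body === false ? 'failed' : strlen($body) . " bytes") . "\n";

    sleep(1); // small pause so the shared host is not hammered
}

// Persist the IDs that have not been fetched yet ("delete after execution").
file_put_contents($file, json_encode($ids));

If rewriting ids.json on every run is a concern, the other idea from above also works: keep a numeric offset in a small progress file and take the next slice with array_slice($ids, $offset, 20), so the original list never has to be modified.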