Communication between processes in Python 3 using Queue

Problem description

I'm using inter-process communication in Python 3. I crawled some proxies and need to check their availability with a validator function, so I opened a process pool to run validator, which puts the working proxies into a Queue. At first I wanted to append them to a list, but there is no communication between the processes, so that didn't work.
Then I call q.get() once the processes have joined, but when I run it the program hangs with no response.
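For context, the list approach fails because each worker in the pool is a separate process with its own copy of the parent's memory, so appends made inside a worker never reach the parent. A minimal sketch (hypothetical names, not the code from the post):

from multiprocessing import Pool

results = []

def work(n):
    results.append(n)      # modifies the worker process's own copy only
    return len(results)

if __name__ == "__main__":
    with Pool(4) as pool:
        pool.map(work, range(8))
    print(results)         # prints [] in the parent process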

Related code

This fetches the proxies and finally appends them all to proxyList (screenshot in the original post; the code is reproduced below).

Queue version (screenshot in the original post)

Pipe version (screenshot in the original post)

Code:

import re
import requests
from multiprocessing import Pool, Queue

proxyList = []

def getProxy():
    # url, headers and proxies are defined elsewhere in the script (omitted from the post)
    r = requests.get(url, headers = headers, proxies = proxies)
    ips = re.findall(r'"PROXY_IP":"([\d.]+)"', r.text)
    ports = re.findall(r'"PROXY_PORT":"(\w+)"', r.text)

    for i, p in zip(ips, ports):
        p = int(p, 16)                    # the port is hex-encoded
        ip = "http://" + i + ":" + str(p)
        proxyList.append(ip)
    print("proxies:")
    print(proxyList)
# getProxy()

def validator(proxy, cc):  # note: the parameter is named cc, but the body uses the global name q
    url = "https://www.baidu.com"
    try:
        r = requests.get(url,
            headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36"},
            proxies = {
                "http": proxy,
                "https": proxy,
            }, timeout = 5)
        if (r.status_code == requests.codes.ok):  # if OK, print the proxy and q.put() it
            print("valid proxy:", proxy)
            q.put(proxy)
        else:
            print("failed:", proxy)
    except Exception as e:
        print("error:", proxy)


if __name__ == "__main__":
    print("start!")
    getProxy()
    p = Pool(5)
    q = Queue()
    for proxy in proxyList:
        p.apply_async(validator, args = (proxy, q))
    p.close()
    p.join()
    print(q.get())  # read from the queue; this is where it hangs
    print("over!")

What result do you expect? What error message do you actually see?

I want to use multiple processes to check the availability of the proxies and then return the ones that are available. I am also a little puzzled by the way I start the process pool:

for proxy in proxyList:
    p.apply_async(validator, args = (proxy,q))

Although I declared a pool of several processes above, will all of them be used to handle a single proxy at the same time? What I want is for the processes to split the whole proxy list among themselves and verify availability together, so that the total time is shortened.
The question now is how to return the available proxies, and why Queue.get() gets stuck.
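On the first question: each apply_async call submits one task, and the pool hands the tasks out to its worker processes, so the workers check different proxies concurrently rather than all of them working on the same one. A quick way to see this (hypothetical check function, just for illustration):

import os
import time
from multiprocessing import Pool

def check(item):
    time.sleep(0.5)        # stand-in for validating one proxy
    return "%s handled by pid %s" % (item, os.getpid())

if __name__ == "__main__":
    with Pool(4) as pool:
        results = [pool.apply_async(check, (i,)) for i in range(8)]
        for r in results:
            print(r.get())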


Because a Queue has limited capacity, you should consume while you produce (the producer/consumer pattern), so that Queue.put() does not block when the queue fills up. Also note that a plain multiprocessing.Queue cannot be passed to a Pool worker through apply_async: the task fails with a RuntimeError that you only see if you call .get() on the returned AsyncResult, so nothing is ever put into the queue and your q.get() blocks forever. Use Manager().Queue() instead, as the example below does.

The following example demonstrates how to use the ping command to check several domain names / IPs at the same time.

# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE
from multiprocessing import Pool, Manager


def ping(host, q):
    # ping 4 times, waiting at most 1 second for each reply
    p = Popen(['ping', '-c', '4', '-W', '1', host], stdout=PIPE, stderr=PIPE)
    p.communicate()
    q.put([host, p.returncode == 0 and 'good' or 'bad'])


if __name__ == '__main__':
    hosts = [
        'www.baidu.com',
        'www.taobao.com',
        'www.bad123host.com',
        '1.2.3.4',
    ]
    m = Manager()
    q = m.Queue()
    p = Pool(3)
    for host in hosts:
        p.apply_async(ping, (host, q))
    p.close()

    for i in range(len(hosts)):
        item = q.get()
        print(f'{i:03d} {item[0]} is {item[1]}')
    p.join()
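Applied back to the proxy checker, one possible adaptation of the main block (assuming the asker's getProxy, proxyList and validator from above, and that validator puts into the queue argument it receives rather than a global q) would be:

from multiprocessing import Pool, Manager

if __name__ == "__main__":
    getProxy()
    m = Manager()
    q = m.Queue()          # unlike multiprocessing.Queue(), this can be passed to Pool workers
    p = Pool(5)
    for proxy in proxyList:
        p.apply_async(validator, args=(proxy, q))
    p.close()
    p.join()
    valid = []
    while not q.empty():   # drain whatever the workers put in
        valid.append(q.get())
    print(valid)

Draining with q.empty() after join() avoids calling q.get() more times than there are valid proxies; calling q.get() blindly, as in the original code, blocks forever if nothing was ever put into the queue.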