2018-10-15 16:57:47 [scrapy.core.engine] INFO: Spider opened
2018-10-15 16:57:47 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-10-15 16:57:47 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6034
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1,2-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1,2-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1,2,3-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.scraper] ERROR: Error downloading <POST https://www.lagou.com/gongsi/0-1,2,3-0-0.json>
Traceback (most recent call last):
File "/home/shenjianlin/.local/lib/python3.4/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/shenjianlin/.local/lib/python3.4/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
response = yield method(request=request, spider=spider)
TypeError: process_request() got an unexpected keyword argument "spider"
2018-10-15 16:57:47 [scrapy.core.engine] INFO: Closing spider (finished)
2018-10-15 16:57:47 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{"downloader/exception_count": 6,
Here is the downloader middleware I wrote:

import base64

class LagouMiddlerware():
    def __init__(self):
        proxy_host = "w5.t.16yun.cn"
        proxy_port = "6469"
        self.username = "16ISMSBW"
        self.password = "162267"
        # requests-style proxies dict; Scrapy does not read this attribute
        self.proxies = {"https": "https://{}:{}/".format(proxy_host, proxy_port)}
        self.proxy_server = "https://w5.t.16yun.cn:6469"
        self.proxy_authorization = "Basic " + base64.urlsafe_b64encode(
            bytes((self.username + ":" + self.password), "ascii")).decode("utf8")

    def process_request(self, request):
        request.meta["proxy"] = self.proxy_server
        request.headers["Proxy-Authorization"] = self.proxy_authorization
I set up a dynamic proxy in this middleware, but every request fails with the error above. What is the reason?
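For reference, the likely cause suggested by the traceback: Scrapy invokes the hook as method(request=request, spider=spider), so a downloader middleware's process_request must accept a spider argument as well. A minimal corrected sketch of just that method, keeping the rest of the class unchanged:

    # Scrapy's downloader middleware API passes both the request and the spider
    def process_request(self, request, spider):
        request.meta["proxy"] = self.proxy_server
        request.headers["Proxy-Authorization"] = self.proxy_authorization
        return None  # returning None lets the request continue through the download chain

The middleware also has to be enabled in settings.py; the dotted path below is an assumption, so adjust it to match the actual project layout:

    DOWNLOADER_MIDDLEWARES = {
        "myproject.middlewares.LagouMiddlerware": 543,
    }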