Why does Scrapy automatically shut down after crawling only part of the content?

2018-09-19 11:58:25 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.zhihu.com/question/265749263> (referer: https://www.zhihu.com/question/265749263/answer/298529974)
2018-09-19 11:58:25 [scrapy.extensions.logstats] INFO: Crawled 20 pages (at 5 pages/min), scraped 10 items (at 5 items/min)
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/24252236>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/24252236>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/264756735>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/264756735>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/264756735>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/264756735>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/60410308>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/264756735>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/23846254>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/265749263>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/265749263>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/265749263>
None
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/265749263>
None
2018-09-19 11:58:25 [scrapy.core.engine] INFO: Closing spider (finished)
2018-09-19 11:58:25 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{"downloader/request_bytes": 7823,
 "downloader/request_count": 20,
 "downloader/request_method_count/GET": 20,
 "downloader/response_bytes": 7113,
 "downloader/response_count": 20,
 "downloader/response_status_count/400": 20,
 "dupefilter/filtered": 9,
 "finish_reason": "finished",
 "finish_time": datetime.datetime(2018, 9, 19, 3, 58, 25, 849646),
 "item_scraped_count": 28,
 "log_count/DEBUG": 202,
 "log_count/INFO": 10,
 "request_depth_max": 3,
 "response_received_count": 20,
 "scheduler/dequeued": 20,
 "scheduler/dequeued/memory": 20,
 "scheduler/enqueued": 20,
 "scheduler/enqueued/memory": 20,
 "start_time": datetime.datetime(2018, 9, 19, 3, 55, 14, 766276)}
2018-09-19 11:58:25 [scrapy.core.engine] INFO: Spider closed (finished)
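
For context: in the stats above, scheduler/enqueued and scheduler/dequeued are both 20 and finish_reason is "finished", which is what Scrapy reports when the scheduler simply runs out of pending requests. Below is a minimal sketch of a parse callback that keeps the crawl going by yielding follow-up requests in addition to items; the spider name, selectors, and item fields are illustrative assumptions, not the code from this project.

import scrapy

class ZhihuSpider(scrapy.Spider):
    name = "zhihu"  # hypothetical spider name for illustration
    start_urls = ["https://www.zhihu.com/question/265749263"]

    def parse(self, response):
        # Yield one item per question page (fields are placeholders).
        yield {
            "url": response.url,
            "title": response.css("h1::text").get(),
        }
        # Also yield follow-up requests; otherwise the scheduler drains
        # and the spider closes with finish_reason "finished".
        for href in response.css("a[href*='/question/']::attr(href)").getall():
            yield response.follow(href, callback=self.parse)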

start_urls

  
2018-09-19 11:58:25 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.zhihu.com/question/265749263>  None
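
Those "Scraped from ... None" lines print whatever object came back out of the item pipeline; seeing None there usually means a process_item method handles the item but never returns it. A minimal sketch of a pipeline that returns the item (the class name and processing step are placeholders, not this project's actual pipeline):

class ZhihuPipeline:
    def process_item(self, item, spider):
        # Store or transform the item here as needed.
        # The object returned from this method is what appears after
        # "Scraped from ..." in the log; returning nothing logs "None".
        return item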


Jul. 01, 2021

Have you solved this problem?
