After a scrapy-redis spider has finished a crawl, re-running it crawls nothing: the dupefilter remembers every URL it has already seen. If you rename the spider it crawls again, and if you change the name back the deduplication kicks in again. I do still need the dupefilter, but during debugging some data gets recorded and I want to crawl those URLs again.

How can I delete the crawled-URL records previously stored by the dupefilter?
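For context, scrapy-redis keeps its deduplication state in a Redis set keyed per spider, using the default key template `%(spider)s:dupefilter` (and `%(spider)s:requests` for the scheduler queue). The sketch below only demonstrates how those key names are derived; the spider name `myspider` is a placeholder, and no Redis connection is made:

```python
# scrapy-redis default key templates (from scrapy_redis defaults):
# the dupefilter fingerprints live in a Redis SET, the request queue
# under a separate key, both namespaced by the spider's name.
SCHEDULER_DUPEFILTER_KEY = "%(spider)s:dupefilter"
SCHEDULER_QUEUE_KEY = "%(spider)s:requests"

spider_name = "myspider"  # placeholder spider name (assumption)

dupefilter_key = SCHEDULER_DUPEFILTER_KEY % {"spider": spider_name}
queue_key = SCHEDULER_QUEUE_KEY % {"spider": spider_name}

print(dupefilter_key)  # myspider:dupefilter
print(queue_key)       # myspider:requests

# Deleting the dupefilter set clears the recorded fingerprints so the
# same URLs can be scheduled again, e.g. from redis-cli:
#   DEL myspider:dupefilter
```

If the key name differs in your project, check the `SCHEDULER_DUPEFILTER_KEY` setting (or `DUPEFILTER_KEY` on older versions) in your Scrapy settings.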