Python requests crawler monitoring tool?

Hi folks, is there a tool that can monitor the status of crawler processes: whether they are running or stopped, the number of processes, data quality, log collection, and so on? Ideally with a web interface.
My crawlers are written with requests plus multiprocessing, and I want a monitoring tool to watch over them; a sketch of the setup follows.
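For context, this is a minimal sketch of the kind of crawler I am running (the URLs, worker count, and function names are placeholders, not my real setup):

import multiprocessing
import requests

# Placeholder target pages; the real crawler builds its URL list elsewhere.
URLS = [f"https://example.com/page/{i}" for i in range(1, 9)]

def fetch(url):
    # Each worker process fetches one page; the real version parses the
    # response and writes results to a database.
    resp = requests.get(url, timeout=10)
    return url, resp.status_code, len(resp.content)

if __name__ == "__main__":
    # Four worker processes; the number is arbitrary for this sketch.
    with multiprocessing.Pool(processes=4) as pool:
        for url, status, size in pool.imap_unordered(fetch, URLS):
            print(f"{url} -> HTTP {status}, {size} bytes")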
If they were written with Scrapy I could use scrapyd to monitor their status, but how can I monitor requests-based crawlers? (A rough sketch of what I have in mind follows.)
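To make the requirement concrete, what I imagine such a tool doing under the hood is a per-process heartbeat that a web page could poll. A rough sketch, assuming a local Redis server; the key name "crawler:status" is my own made-up convention, not part of any existing tool:

import json
import os
import time

import redis

# Assumes a Redis server on localhost:6379 (an assumption for this sketch).
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def report_status(state, pages_done):
    # Each worker process stores one hash field, keyed by its PID.
    r.hset("crawler:status", str(os.getpid()), json.dumps({
        "state": state,            # e.g. "running" or "stopped"
        "pages_done": pages_done,  # simple throughput counter
        "updated_at": time.time(), # lets a dashboard flag stale workers
    }))

def read_statuses():
    # What a monitoring web page would poll and render.
    return {pid: json.loads(raw)
            for pid, raw in r.hgetall("crawler:status").items()}

if __name__ == "__main__":
    report_status("running", 42)
    print(read_statuses())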
The kind of monitoring interface I have in mind is shown in the figure below.
