Optimizing read-out of data from a SQL statement executed with pymysql

I use pymysql to connect to a MySQL database and extract data. When the result set runs to 500,000+ rows, reading it is quite slow. Apart from optimizing the SQL statement itself, what can I do to speed this up?
Here I use pd.read_sql; reading directly with cur.execute(sql) followed by cur.fetchall() is just as slow. Are there any better methods?
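The fetchall variant mentioned above is essentially this (a minimal sketch; host, port, credentials and database name are placeholders, and the SELECT stands in for the full query shown in the code further down):

import pymysql

conn = pymysql.connect(host="127.0.0.1", port=3306,
                       user="xxx", password="xxx",
                       database="xxx", charset="utf8")
cur = conn.cursor()
cur.execute("select * from xiaoshou_infor_all_month")  # same kind of SELECT as below
rows = cur.fetchall()   # buffers all 500,000+ rows in client memory at once
cur.close()
conn.close()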

import pymysql
import pandas as pd

# connect through the locally forwarded port
# (`server` is the SSH tunnel object created elsewhere in the script)
conn = pymysql.connect(host="127.0.0.1",
                       port=server.local_bind_port,
                       user="xxx",
                       password="xxx",
                       database="xxx",   # schema that contains xiaoshou_infor_all_month
                       charset="utf8")
sql = """
select * from xiaoshou_infor_all_month
where (vc_warehouse like 'SHW%' or vc_warehouse like 'NJ%')
and vc_material_id in ('1000002',
                       '6005219',
                       '1001554',
                       '1000066',
                       '1001423',
                       '6005277',
                       '6004797',
                       '6004292')"""
# read the whole result set into a single DataFrame
df = pd.read_sql(sql, conn, coerce_float=True)
conn.close()
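For comparison, two adjustments that are commonly suggested for reads of this size are pulling the result in chunks (pd.read_sql with chunksize) and streaming rows through an unbuffered server-side cursor (pymysql.cursors.SSCursor) instead of buffering everything with fetchall(). A minimal sketch of both, assuming the same tunnelled connection and query as above; the chunksize value and the database name are placeholders:

import pandas as pd
import pymysql
import pymysql.cursors

# An unbuffered (server-side) cursor streams rows from the server instead of
# loading the whole result set into client memory before fetchall() returns.
conn = pymysql.connect(host="127.0.0.1",
                       port=server.local_bind_port,   # tunnelled port, as above
                       user="xxx",
                       password="xxx",
                       database="xxx",
                       charset="utf8",
                       cursorclass=pymysql.cursors.SSCursor)

sql = """select * from xiaoshou_infor_all_month
         where (vc_warehouse like 'SHW%' or vc_warehouse like 'NJ%')
           and vc_material_id in ('1000002', '6005219', '1001554', '1000066',
                                  '1001423', '6005277', '6004797', '6004292')"""

# Option 1: let pandas pull the result in chunks; each chunk is a DataFrame of
# at most `chunksize` rows, so chunks can be processed (or concatenated) as they arrive.
chunks = pd.read_sql(sql, conn, coerce_float=True, chunksize=50000)
df = pd.concat(chunks, ignore_index=True)

# Option 2: iterate the cursor directly instead of calling fetchall(),
# processing rows as they stream in.
with conn.cursor() as cur:
    cur.execute(sql)
    for row in cur:
        pass   # process each row here

conn.close()

Note that with SSCursor the current result has to be fully consumed (or the cursor closed) before another query can run on the same connection, which is why the sketch reads Option 1 to completion before starting Option 2.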