The project uses the Spring Boot framework, with MyBatis as the persistence layer and MySQL as the database. The business table has a composite unique index over four fields. Batch inserts and deletes on this table are physical (no logical/soft deletion). Once a batch delete reaches roughly 100,000 rows, deleting with `<foreach>` becomes very slow, apparently because of the index maintenance cost. How should this be optimized? The mapper implementation is as follows:
<delete id="delete">
    delete from table
    where (a, b, c, d) in
    <foreach collection="list" item="item" index="index" open="(" separator="," close=")">
        (#{item.a}, #{item.b}, #{item.c}, #{item.d})
    </foreach>
</delete>
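For two keys in the list, the mapper above renders SQL of roughly this shape (values shown inline for readability; the real statement binds them as parameters):

```sql
DELETE FROM table
WHERE (a, b, c, d) IN (
    ('a1', 'b1', 'c1', 'd1'),
    ('a2', 'b2', 'c2', 'd2')
);
```

MySQL does not always use an index efficiently for such row-constructor `IN` lists, which may contribute to the slowdown beyond index maintenance alone.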
Even when the delete is split into smaller batches, removing 100,000 rows still takes more than a minute (the composite unique index covers the four fields a, b, c, d).
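For reference, the batching mentioned above can be sketched in plain Java. This is a minimal illustration of splitting the key list into fixed-size chunks; the `Key` record and the `mapper.delete(chunk)` call site are assumptions for illustration, not part of the original mapper:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedDelete {
    // Hypothetical holder for the four indexed columns of one row.
    record Key(String a, String b, String c, String d) {}

    /** Split a list into consecutive sublists of at most chunkSize elements. */
    static <T> List<List<T>> partition(List<T> list, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < list.size(); i += chunkSize) {
            chunks.add(list.subList(i, Math.min(i + chunkSize, list.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Key> keys = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            keys.add(new Key("a" + i, "b" + i, "c" + i, "d" + i));
        }
        int statements = 0;
        for (List<Key> chunk : partition(keys, 1000)) {
            // In the real service layer this would be: mapper.delete(chunk);
            statements++;
        }
        System.out.println(statements); // 2500 keys in chunks of 1000 -> 3 statements
    }
}
```

Even with this kind of chunking, each statement still carries a large row-constructor `IN` list, which is the part the question is asking how to optimize.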