problem description
when I use python3"s redis for caching, the structure is hset, but when the data reaches 10000, I need to store these 10000 pieces of data into the database. When I delete the hset, of redis, the new request is not added to the redis. It will be added to the redis only after all the 10000 pieces of data have been stored.
the environmental background of the problem and what methods you have tried
I tried adding a temporary variable in the middle as a transition (it still didn't work, or I did it the wrong way).
related codes
rds.hset(chn_idx, uid, data)        # cache each request in the hash; flush to MySQL later
ualen = int(rds.hlen(chn_idx))
if ualen > 10000:
    keyData = rds.hgetall(chn_idx)  # copy everything out before deleting
    rds.delete(chn_idx)
    for uid, infos in keyData.items():
        ...                         # write each entry to MySQL
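
For reference, here is a minimal sketch of the behaviour I am after, assuming redis-py; save_to_mysql is a placeholder for the real INSERT logic, and FLUSH_THRESHOLD / cache_request are names made up for the example. The idea is to RENAME the full hash to a temporary key (RENAME is atomic in Redis), so the very next hset recreates chn_idx and new requests keep flowing while a background thread drains the copy:

import threading
import uuid

import redis

rds = redis.Redis(decode_responses=True)
FLUSH_THRESHOLD = 10000

def save_to_mysql(uid, infos):
    ...  # placeholder for the real INSERT logic

def flush_to_mysql(tmp_key):
    # Runs in a background thread, so the main path keeps serving requests.
    for uid, infos in rds.hgetall(tmp_key).items():
        save_to_mysql(uid, infos)
    rds.delete(tmp_key)

def cache_request(chn_idx, uid, data):
    rds.hset(chn_idx, uid, data)
    if rds.hlen(chn_idx) > FLUSH_THRESHOLD:
        tmp_key = f"{chn_idx}:flushing:{uuid.uuid4().hex}"
        try:
            # RENAME atomically moves the whole hash aside; the next hset
            # recreates chn_idx, so new requests are never blocked on MySQL.
            rds.rename(chn_idx, tmp_key)
        except redis.exceptions.ResponseError:
            return  # another request already renamed the key; nothing to do
        threading.Thread(target=flush_to_mysql, args=(tmp_key,), daemon=True).start()

The try/except covers the race where two requests cross the threshold at the same time: only the first RENAME succeeds, and the loser simply returns.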
what result do you expect? what error do you actually see?
I expect new requests to still be added to Redis after rds.delete(chn_idx); the hash should only act as an intermediate cache, and no data should be lost. What is causing the current behaviour? Is it that the hash holds so much data that the delete takes a long time? Or is it something else?
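
To rule the delete itself in or out, a quick timing probe around it should settle that part:

import time

t0 = time.monotonic()
rds.delete(chn_idx)
print(f"delete took {time.monotonic() - t0:.4f}s")

In my experience, DEL on a hash with 10,000 small fields usually finishes in milliseconds; if it does measure slow, rds.unlink(chn_idx) (Redis 4.0+) frees the memory in a background thread instead of blocking.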