pymongo stops partway through a batch insert: the loop never finishes and never raises an error.

#!/usr/bin/python3
# encoding=utf-8

import db  # local helper module providing the source collections
from pymongo import MongoClient

user = db.get_db_user()
recharge = db.get_db_recharge()
consume = db.get_db_consume()
client = MongoClient("localhost", 27017)
db = client.test  # note: rebinds the name "db" from the helper module to the test database
col_new = db.new


lookup1 = {"$lookup": {
        "from": "recharge",
        "localField": "_id",
        "foreignField": "uid",
        "as": "recharge"
}}
lookup2 = {"$lookup": {
        "from": "consume",
        "localField": "_id",
        "foreignField": "uid",
        "as": "consume"
}}
# $lookup: join recharge and consume on uid
replaceRoot1 = {"$replaceRoot": {"newRoot": {"$mergeObjects": [{"$arrayElemAt": ["$recharge", 0]}, "$$ROOT"]}}}
replaceRoot2 = {"$replaceRoot": {"newRoot": {"$mergeObjects": [{"$arrayElemAt": ["$consume", 0]}, "$$ROOT"]}}}
# $replaceRoot / $mergeObjects / $arrayElemAt / $$ROOT: merge the first joined element into the root document
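# Illustration with hypothetical documents: if the user document ($$ROOT) is
#   {"_id": 1, "nickname": "a"}
# and the first element of its joined recharge array is
#   {"uid": 1, "amount": 10},
# then $mergeObjects yields
#   {"uid": 1, "amount": 10, "_id": 1, "nickname": "a"}
# (on a field-name clash the later input, $$ROOT, wins) and $replaceRoot
# promotes that merged document to be the new root.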
project = {"$project": {
        "_id": 1,
        "nickname": 1,
        "phone": 1,
        "regDate": 1,
        "lastLogin": 1,
        "balance": 1,
        "totalRecharge": 1,
        "jcode": 1,
        "consume.amount": 1,
        "consume.consumeDate": 1,
        "recharge.real": 1,
        "recharge.amount": 1,
        "recharge.from": 1,
        "recharge.rechargeDate": 1,
        "recharge.tradeNo": 1
}}
# $project: select the output fields
pipeline = [lookup1, lookup2, replaceRoot1, replaceRoot2, project]
result = user.aggregate(pipeline)
num = 0
for i in result:
    num = num + 1
    # col_new.insert_one(i)
    print(num)
    print(i)
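
For reference, the per-document insert_one above (currently commented out) can also be written as a batched write with insert_many; a minimal sketch against the same variables, with an assumed buffer of 1000 documents per call:

batch = []
for doc in user.aggregate(pipeline):
    batch.append(doc)
    if len(batch) == 1000:  # illustrative batch size, not from the original code
        col_new.insert_many(batch)
        batch = []
if batch:  # flush whatever is left over
    col_new.insert_many(batch)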

Both printing the elements of result and inserting them into the new collection stop at the 101st element; the loop just hangs with no error, and I don't know why.
There is no problem with the number of documents in the user collection, even if the pipeline conditions drop a few of them!
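
For what it's worth, 101 is the default size of the first batch a MongoDB cursor returns, so stopping right at 101 suggests the hang happens when the driver requests the second batch from the server. A minimal sketch for probing that, assuming pymongo's batchSize and allowDiskUse keyword arguments for aggregate:

# ask for a different first-batch size and allow the server to spill to disk;
# if the loop now stalls at a different count, the getMore is what is blocking
result = user.aggregate(pipeline, batchSize=500, allowDiskUse=True)
for num, doc in enumerate(result, start=1):
    print(num)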

The key point is that there was no such problem yesterday, and the code was even pushed to GitHub as a backup. Today all I did was add one field to $project, and now dropping yesterday's new collection and rewriting it never succeeds.

May 12, 2021

@ ou Dou is a vegetarian

I left it running all night. This morning I found it had got stuck: it wasn't printing anymore and an error had been reported, yet there are 12020 documents in the new collection.
