I exported a batch of JSON data from MongoDB that needs to be transferred to MySQL. The exported JSON cannot be written to MySQL directly, so I want to convert the data to a pandas DataFrame and then write it to SQL through the DataFrame:
import pandas as pd
from sqlalchemy import create_engine
import json

def json_to_sql(jsonfile, sqlname):
    # read the line-delimited JSON export into a DataFrame
    df = pd.read_json(jsonfile, lines=True)
    conn = create_engine("mysql+mysqlconnector://root:123456@localhost:3306/ScientificName?charset=utf8")
    # serialize the nested "chromosomes" field to a JSON string so it fits a text column
    for i in range(len(df)):
        df.at[i, "chromosomes"] = json.dumps(df["chromosomes"][i][0], ensure_ascii=False)
    print(df["chromosomes"][0])
    df.to_sql(name=sqlname, con=conn, if_exists="append", index=False)

json_to_sql("/Downloads/JSON-20180118/speciesHengduan.json", "speciesHengduan")
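The read-flatten-write flow above can be sketched end to end. This is a minimal sketch that swaps in an in-memory SQLite database and inline sample data (both my assumptions, purely for demonstration, since the original targets MySQL) so it runs without a database server:

```python
import json
import sqlite3
from io import StringIO

import pandas as pd

# Hypothetical line-delimited JSON, mimicking a mongoexport dump
raw = '{"name": "a", "chromosomes": [{"n": 12}]}\n{"name": "b", "chromosomes": [{"n": 24}]}'
df = pd.read_json(StringIO(raw), lines=True)

# Serialize the nested list field to a JSON string so it fits a text column
df["chromosomes"] = df["chromosomes"].apply(lambda v: json.dumps(v[0], ensure_ascii=False))

# to_sql accepts a DBAPI connection for SQLite; MySQL would need a SQLAlchemy engine
conn = sqlite3.connect(":memory:")
df.to_sql(name="species", con=conn, if_exists="append", index=False)
print(pd.read_sql("SELECT * FROM species", conn).to_dict("records"))
```

If this version works but the MySQL one fails, the problem is in the connection rather than in the DataFrame conversion.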
Running it always raises the following error; the MySQL version is 8.0.11:
NotSupportedError: (mysql.connector.errors.NotSupportedError) Authentication plugin "caching_sha2_password" is not supported (Background on this error at: http://sqlalche.me/e/tw8g)
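For context on that error: MySQL 8.0 made caching_sha2_password the default authentication plugin, and older mysql-connector-python releases do not support it. The sketch below lists the workarounds I would try, under the assumption that the credentials and database name from the question are correct; verify each against your own setup:

```python
# 1) Switch the SQLAlchemy driver to PyMySQL (pip install pymysql cryptography),
#    which can negotiate caching_sha2_password:
pymysql_url = "mysql+pymysql://root:123456@localhost:3306/ScientificName?charset=utf8"

# 2) Or keep mysql-connector and request the legacy plugin explicitly:
#    create_engine(url, connect_args={"auth_plugin": "mysql_native_password"})

# 3) Or change the account on the server side so it uses the legacy plugin:
#    ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY '123456';

print(pymysql_url)
```

Upgrading mysql-connector-python to a release that supports caching_sha2_password is another option if you want to keep the original URL.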