We need to count user operations. The data volume is too large to keep in MySQL (we would lose data there), so we are moving to Elasticsearch. I've run into a problem; here is the document structure:
{
  "_index": "test_vpn_operation-2018-04-08",
  "_type": "vpn_operation",
  "_id": "AWKiw45UfMhbLGkLP-v3",
  "_score": null,
  "_source": {
    "app_version": "2.0.0",
    "dateline": 1523149062,
    "channel": "pc",
    "edition": "jichu",
    "message": "{\"date_time\":\"2018-04-08T08:59:07\",\"account_id\":447189,\"operation\":[{\"account_id\":\"447189\",\"operation_id\":1,\"timestamp\":1523149062},{\"account_id\":\"447189\",\"operation_id\":1,\"timestamp\":1523149063}],\"channel\":\"pc\",\"edition\":\"jichu\",\"app_version\":\"2.0.0\",\"dateline\":1523149062}",
    "type": "vpn_operation",
    "path": "/data/wwwroot/vpnApi/elklog/vpn_operation_record_20180408",
    "@timestamp": "2018-04-08T00:59:07.722Z",
    "account_id": 447189,
    "date_time": "2018-04-08T08:59:07",
    "@version": "1",
    "host": "JG-otter",
    "operation": [
      {
        "operation_id": 1,
        "account_id": "447189",
        "timestamp": 1523149062
      },
      {
        "operation_id": 1,
        "account_id": "447189",
        "timestamp": 1523149063
      }
    ]
  },
  "fields": {
    "date_time": [
      1523177947000
    ],
    "@timestamp": [
      1523149147722
    ]
  },
  "highlight": {
    "message": [
      "{\"date_time\":\"2018-04-08T08:59:@kibana-highlighted-field@07@/kibana-highlighted-field@\",\"account_id\":447189,\"operation\":[{\"account_id\":\"447189\",\"operation_id\":1,\"timestamp\":1523149062},{\"account_id\":\"447189\",\"operation_id\":1,\"timestamp\":1523149063}],\"channel\":\"pc\",\"edition\":\"jichu\",\"app_version\":\"2.0.0\",\"dateline\":1523149062}"
    ]
  },
  "sort": [
    1523149147722
  ]
}
I originally intended to aggregate on operation.operation_id to get counts similar to a MySQL GROUP BY, but the results were not accurate, because a usable operation_id field does not really exist under operation. Is there any way to achieve this kind of requirement, or how do I need to adjust the structure to make it work?
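One adjustment I am considering (a sketch only, not yet verified; the index and type names are copied from the document above): with the default object mapping, Elasticsearch flattens the operation array, so the elements lose their boundaries. Mapping operation as a nested type should keep each array element as its own hidden document:

```json
PUT test_vpn_operation-2018-04-08
{
  "mappings": {
    "vpn_operation": {
      "properties": {
        "operation": {
          "type": "nested",
          "properties": {
            "operation_id": { "type": "long" },
            "account_id":   { "type": "keyword" },
            "timestamp":    { "type": "long" }
          }
        }
      }
    }
  }
}
```

Then the GROUP BY-style count would be a terms aggregation wrapped in a nested aggregation, something like:

```json
GET test_vpn_operation-*/_search
{
  "size": 0,
  "aggs": {
    "ops": {
      "nested": { "path": "operation" },
      "aggs": {
        "by_operation_id": {
          "terms": { "field": "operation.operation_id" }
        }
      }
    }
  }
}
```

Note the mapping cannot be changed on an existing index, so this would mean reindexing (or just applying it to the next day's index, since the index name is date-suffixed). Does this look like the right direction?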