
[request] Data too large, ...., which is larger than the limit of [7436828672/6.9gb]

Elasticsearch | by hongsir | published 2020-01-08 | views: 2654

Scenario: group by the field "topic", count distinct values per group, and return the top 10 — similar to a SQL
count(distinct) ... top 10
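The SQL analogy the query is aiming for might be sketched like this (hypothetical table and column names, taken from the field names used below):

```sql
-- Top 10 topics by number of distinct topic_id values
SELECT topic, COUNT(DISTINCT topic_id) AS g2
FROM topic
GROUP BY topic
ORDER BY g2 DESC
LIMIT 10;
```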

GET topic/_search
{
  "size": 0,
  "query": {
    "match_all": {}
  },
  "aggregations": {
    "g1": {
      "terms": {
        "field": "topic",
        "size": 10,
        "order": {
          "g2": "desc"
        }
      },
      "aggregations": {
        "g2": {
          "cardinality": {
            "field": "topic_id"
          }
        }
      }
    }
  }
}
 
The data volume is fairly large, and when I order by the g2 metric the request fails with:
{
  "type": "circuit_breaking_exception",
  "reason": "[request] Data too large, data for [<reused_arrays>] would be [7572367360/7gb], which is larger than the limit of [7436828672/6.9gb]",
  "bytes_wanted": 7572367360,
  "bytes_limit": 7436828672,
  "durability": "TRANSIENT"
}
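For context: this error comes from Elasticsearch's [request] circuit breaker, which rejects a request when its estimated per-request memory would exceed the configured limit (by default 60% of the JVM heap). One stopgap — a sketch of a possible mitigation, not a definitive fix — is to raise that limit temporarily via the cluster settings API; reducing the aggregation's memory footprint (for example a lower precision_threshold on the cardinality aggregation, or a larger heap) is generally the sounder route:

```json
PUT _cluster/settings
{
  "transient": {
    "indices.breaker.request.limit": "75%"
  }
}
```

Note that raising the breaker only moves the ceiling; if the node genuinely lacks the memory, the request can still push the JVM toward an OutOfMemoryError.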
