Elasticsearch jdbc importing data from MySQL — the shell script dies partway through.

A 5-billion-row table stopped after importing 3 billion rows,
a 1-billion-row table stopped after 600 million,
a 40-million-row table stopped after 30 million.
Only one small 200k-row table made it all the way through without dying.
This is the import script:
#!/bin/bash
set -e

bin=/opt/elasticsearch-jdbc-2.3.3.0/bin
lib=/opt/elasticsearch-jdbc-2.3.3.0/lib
echo '{
"type" : "jdbc",
"jdbc" : {
"url" : "jdbc:mysql://localhost:3306/info?characterEncoding=utf8&useSSL=true",
"user" : "root",
"password" : "root",
"sql" : "select id AS id_infoes,number AS number_infoes,name AS name_infoes,age AS age_infoes,group AS group_infoes from infoes;",
"index": "data",
"type": "infoes"
}
}' | /usr/share/jdk1.8.0_101/bin/java \
-cp "${lib}/*" \
-Dlog4j.configurationFile=${bin}/log4j2.xml \
org.xbib.tools.Runner \
org.xbib.tools.JDBCImporter
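One suspect at these row counts: by default the MySQL JDBC driver buffers the entire result set in the importer's heap, so a single unbounded SELECT over billions of rows can kill the JVM midway. elasticsearch-jdbc has a `fetchsize` setting for this; a sketch of the relevant config fragment, assuming the 2.3.3.0 importer accepts `"min"` as the MySQL streaming value (it maps to `Integer.MIN_VALUE` on the driver):

```json
"jdbc" : {
  "url" : "jdbc:mysql://localhost:3306/info?characterEncoding=utf8&useSSL=true",
  "user" : "root",
  "password" : "root",
  "fetchsize" : "min",
  "sql" : "..."
}
```

With streaming enabled the importer iterates the result set row by row instead of materializing it, at the cost of holding the MySQL connection open for the whole run.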

[INFO ][indices.store            ] [Marvel Man] updating indices.store.throttle.max_bytes_per_sec from [100mb] to [10gb], note, type is [merge]
indices.store.throttle.max_bytes_per_sec is set to 10gb.
While the script is running, the system load is very high, around 25-30.
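Until the logs reveal the actual cause of the abort, a common workaround for imports this size is to range over the primary key, so each importer run is small and a crash only loses one batch. A rough sketch with made-up `step`/`max` numbers, assuming `id` is a dense auto-increment column:

```shell
#!/bin/bash
# Hypothetical sketch: split one giant SELECT into id-ranged batches so a
# mid-run crash loses only the current batch, not the whole import.
step=100000000            # rows per batch (made-up number)
max=5000000000            # roughly the highest id in the table (made-up)
start=0
batches=0
while [ "$start" -lt "$max" ]; do
  end=$((start + step))
  echo "batch: where id >= $start and id < $end"
  # Here you would pipe a config whose "sql" carries this WHERE clause into
  # org.xbib.tools.JDBCImporter, and check its exit status before advancing.
  batches=$((batches + 1))
  start=$end
done
echo "total batches: $batches"
```

Recording the last completed `start` value in a file would also let an aborted run resume where it left off instead of starting over.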

Xargin


..take a look at the log directory...
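To expand on that: the abort reason (often an OutOfMemoryError or a dropped connection) usually lands in the importer's logs, wherever `log4j2.xml` points them. A quick sketch for pulling the most recent error lines; the default path here is an assumption based on the install prefix above:

```shell
#!/bin/bash
# Hypothetical helper: show the last error lines from the importer's logs.
# The default log directory is assumed; log4j2.xml decides the real location.
last_errors() {
  local logdir="${1:-/opt/elasticsearch-jdbc-2.3.3.0/logs}"
  grep -hiE 'error|exception|outofmemory' "$logdir"/*.log 2>/dev/null | tail -n 20
}
last_errors   # on the real box, prints the most recent failure lines, if any
```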
