
elastic-spark ClassNotFound: EsSpark

The error is as follows:
java.lang.ClassNotFoundException: org.elasticsearch.spark.rdd.EsSpark$$anonfun$doSaveToEs$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:66)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:71)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:97)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
at org.apache.spark.scheduler.Task.run(Task.scala:90)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)


I wrote an elasticsearch-spark demo, shown below:
```scala
package com.sydney.dream.elasticspark

import org.elasticsearch.spark._
import org.apache.spark.{SparkConf, SparkContext}

/**
  * org.elasticsearch.spark._ must be imported explicitly;
  * it enriches every RDD with the saveToEs method.
  */
object ElasticSparkFirstDemo {
    def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
            .setAppName("ElasticSparkFirstDemo")
            .set("es.nodes", "172.18.18.114")
            .set("es.port", "9200")
            .set("es.index.auto.create", "true")
        val sc = new SparkContext(conf)
        val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
        val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
        sc.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs")
        sc.stop()
    }
}
```
 
 
The pom file is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>spark</artifactId>
        <groupId>com.sydney.dream</groupId>
        <version>1.0.0</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.sydney.dream</groupId>
    <artifactId>ElasticSpark</artifactId>
    <dependencies>
        <!--<dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch-hadoop</artifactId>
            <version>5.5.0</version>
        </dependency>-->
       <dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch-spark-20_2.10</artifactId>
            <version>5.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>2.2.0</version>
        </dependency>
        <!--<dependency>
            <groupId> org.apache.storm</groupId>
            <artifactId>storm-core</artifactId>
            <version>1.0.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>log4j-over-slf4j</artifactId>
                </exclusion>
            </exclusions>
        </dependency>-->
    </dependencies>

   <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.6</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <classpathPrefix>lib/</classpathPrefix>
                            <mainClass>com.sydney.dream.elasticspark.ElasticSparkFirstDemo</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <version>2.10</version>
                <executions>
                    <execution>
                        <id>copy-dependencies</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-dependencies</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${project.build.directory}/lib</outputDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
           <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>2.15.2</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <!--
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer" />
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>-->
        </plugins>
    </build>
</project>
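One likely cause of the ClassNotFoundException above is that the elasticsearch-spark classes end up only on the driver's classpath, never on the executors'. Re-enabling the maven-shade-plugin that is commented out in this pom would bundle those classes into the application jar; spark-core should then be marked provided, since the cluster supplies it. A sketch of that dependency change (my suggestion, not part of the original post):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.2.0</version>
    <!-- Provided by the cluster at runtime; keeps it out of the shaded jar -->
    <scope>provided</scope>
</dependency>
```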
 
 
Submitted via spark-submit:
 spark-submit --class com.sydney.dream.elasticspark.ElasticSparkFirstDemo --master yarn --deploy-mode client --executor-memory 5G --num-executors 10 --jars /home/ldl/sparkdemo/ElasticSpark-1.0.0.jar  /home/ldl/sparkdemo/lib/activation-1.1.1.jar /home/ldl/sparkdemo/lib/antlr4-runtime-4.5.3.jar /home/ldl/sparkdemo/lib/aopalliance-repackaged-2.4.0-b34.jar /home/ldl/sparkdemo/lib/apacheds-i18n-2.0.0-M15.jar /home/ldl/sparkdemo/lib/apacheds-kerberos-codec-2.0.0-M15.jar /home/ldl/sparkdemo/lib/api-asn1-api-1.0.0-M20.jar /home/ldl/sparkdemo/lib/api-util-1.0.0-M20.jar /home/ldl/sparkdemo/lib/avro-1.7.7.jar /home/ldl/sparkdemo/lib/avro-ipc-1.7.7.jar /home/ldl/sparkdemo/lib/avro-ipc-1.7.7-tests.jar /home/ldl/sparkdemo/lib/base64-2.3.8.jar /home/ldl/sparkdemo/lib/bcprov-jdk15on-1.51.jar /home/ldl/sparkdemo/lib/chill_2.10-0.8.0.jar /home/ldl/sparkdemo/lib/chill-java-0.8.0.jar /home/ldl/sparkdemo/lib/commons-beanutils-1.7.0.jar /home/ldl/sparkdemo/lib/commons-beanutils-core-1.8.0.jar /home/ldl/sparkdemo/lib/commons-cli-1.2.jar /home/ldl/sparkdemo/lib/commons-codec-1.8.jar /home/ldl/sparkdemo/lib/commons-collections-3.2.2.jar /home/ldl/sparkdemo/lib/commons-compiler-3.0.0.jar /home/ldl/sparkdemo/lib/commons-compress-1.4.1.jar /home/ldl/sparkdemo/lib/commons-configuration-1.6.jar /home/ldl/sparkdemo/lib/commons-crypto-1.0.0.jar /home/ldl/sparkdemo/lib/commons-digester-1.8.jar /home/ldl/sparkdemo/lib/commons-httpclient-3.1.jar /home/ldl/sparkdemo/lib/commons-io-2.4.jar /home/ldl/sparkdemo/lib/commons-lang-2.6.jar /home/ldl/sparkdemo/lib/commons-lang3-3.5.jar /home/ldl/sparkdemo/lib/commons-math3-3.4.1.jar /home/ldl/sparkdemo/lib/commons-net-2.2.jar /home/ldl/sparkdemo/lib/compress-lzf-1.0.3.jar /home/ldl/sparkdemo/lib/curator-client-2.6.0.jar /home/ldl/sparkdemo/lib/curator-framework-2.6.0.jar /home/ldl/sparkdemo/lib/curator-recipes-2.6.0.jar /home/ldl/sparkdemo/lib/gson-2.2.4.jar /home/ldl/sparkdemo/lib/guava-16.0.1.jar /home/ldl/sparkdemo/lib/hk2-api-2.4.0-b34.jar 
/home/ldl/sparkdemo/lib/hk2-locator-2.4.0-b34.jar /home/ldl/sparkdemo/lib/hk2-utils-2.4.0-b34.jar /home/ldl/sparkdemo/lib/htrace-core-3.0.4.jar /home/ldl/sparkdemo/lib/httpclient-4.3.6.jar /home/ldl/sparkdemo/lib/httpcore-4.3.3.jar /home/ldl/sparkdemo/lib/ivy-2.4.0.jar /home/ldl/sparkdemo/lib/jackson-annotations-2.6.5.jar /home/ldl/sparkdemo/lib/jackson-core-2.6.5.jar /home/ldl/sparkdemo/lib/jackson-core-asl-1.9.13.jar /home/ldl/sparkdemo/lib/jackson-databind-2.6.5.jar /home/ldl/sparkdemo/lib/jackson-jaxrs-1.9.13.jar /home/ldl/sparkdemo/lib/jackson-mapper-asl-1.9.13.jar /home/ldl/sparkdemo/lib/jackson-module-paranamer-2.6.5.jar /home/ldl/sparkdemo/lib/jackson-xc-1.9.13.jar /home/ldl/sparkdemo/lib/janino-3.0.0.jar /home/ldl/sparkdemo/lib/javassist-3.18.1-GA.jar /home/ldl/sparkdemo/lib/javax.annotation-api-1.2.jar /home/ldl/sparkdemo/lib/javax.inject-2.4.0-b34.jar /home/ldl/sparkdemo/lib/java-xmlbuilder-1.0.jar /home/ldl/sparkdemo/lib/javax.servlet-api-3.1.0.jar /home/ldl/sparkdemo/lib/javax.ws.rs-api-2.0.1.jar /home/ldl/sparkdemo/lib/jaxb-api-2.2.2.jar /home/ldl/sparkdemo/lib/jcl-over-slf4j-1.7.16.jar /home/ldl/sparkdemo/lib/jersey-client-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-common-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-container-servlet-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-container-servlet-core-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-guava-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-media-jaxb-2.22.2.jar /home/ldl/sparkdemo/lib/jersey-server-2.22.2.jar /home/ldl/sparkdemo/lib/jets3t-0.9.3.jar /home/ldl/sparkdemo/lib/jetty-util-6.1.26.jar /home/ldl/sparkdemo/lib/json4s-ast_2.10-3.2.11.jar /home/ldl/sparkdemo/lib/json4s-core_2.10-3.2.11.jar /home/ldl/sparkdemo/lib/json4s-jackson_2.10-3.2.11.jar /home/ldl/sparkdemo/lib/jsr305-1.3.9.jar /home/ldl/sparkdemo/lib/jul-to-slf4j-1.7.16.jar /home/ldl/sparkdemo/lib/kryo-shaded-3.0.3.jar /home/ldl/sparkdemo/lib/leveldbjni-all-1.8.jar /home/ldl/sparkdemo/lib/log4j-1.2.17.jar /home/ldl/sparkdemo/lib/lz4-1.3.0.jar 
/home/ldl/sparkdemo/lib/mail-1.4.7.jar /home/ldl/sparkdemo/lib/metrics-core-3.1.2.jar /home/ldl/sparkdemo/lib/metrics-graphite-3.1.2.jar /home/ldl/sparkdemo/lib/metrics-json-3.1.2.jar /home/ldl/sparkdemo/lib/metrics-jvm-3.1.2.jar /home/ldl/sparkdemo/lib/minlog-1.3.0.jar /home/ldl/sparkdemo/lib/mx4j-3.0.2.jar /home/ldl/sparkdemo/lib/netty-3.9.9.Final.jar /home/ldl/sparkdemo/lib/netty-all-4.0.43.Final.jar /home/ldl/sparkdemo/lib/objenesis-2.1.jar /home/ldl/sparkdemo/lib/oro-2.0.8.jar /home/ldl/sparkdemo/lib/osgi-resource-locator-1.0.1.jar /home/ldl/sparkdemo/lib/paranamer-2.3.jar /home/ldl/sparkdemo/lib/parquet-column-1.8.1.jar /home/ldl/sparkdemo/lib/parquet-common-1.8.1.jar /home/ldl/sparkdemo/lib/parquet-encoding-1.8.1.jar /home/ldl/sparkdemo/lib/parquet-format-2.3.0-incubating.jar /home/ldl/sparkdemo/lib/parquet-jackson-1.8.1.jar /home/ldl/sparkdemo/lib/protobuf-java-2.5.0.jar /home/ldl/sparkdemo/lib/py4j-0.10.4.jar /home/ldl/sparkdemo/lib/pyrolite-4.13.jar /home/ldl/sparkdemo/lib/RoaringBitmap-0.5.11.jar /home/ldl/sparkdemo/lib/slf4j-api-1.7.16.jar /home/ldl/sparkdemo/lib/slf4j-log4j12-1.7.16.jar /home/ldl/sparkdemo/lib/snappy-java-1.1.2.6.jar /home/ldl/sparkdemo/lib/stax-api-1.0-2.jar /home/ldl/sparkdemo/lib/stream-2.7.0.jar /home/ldl/sparkdemo/lib/univocity-parsers-2.2.1.jar /home/ldl/sparkdemo/lib/unused-1.0.0.jar /home/ldl/sparkdemo/lib/validation-api-1.1.0.Final.jar /home/ldl/sparkdemo/lib/xbean-asm5-shaded-4.4.jar /home/ldl/sparkdemo/lib/xercesImpl-2.9.1.jar /home/ldl/sparkdemo/lib/xml-apis-1.3.04.jar /home/ldl/sparkdemo/lib/xmlenc-0.52.jar /home/ldl/sparkdemo/lib/xz-1.0.jar /home/ldl/sparkdemo/lib/zookeeper-3.4.6.jar
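A ClassNotFoundException on `EsSpark$$anonfun$doSaveToEs$1` during task deserialization usually means the executors never received the elasticsearch-spark jar. Note that `--jars` takes a comma-separated list; in the command above only the application jar is passed to `--jars`, the first space-separated path becomes the application jar, and everything after it is treated as program arguments, so none of the lib jars reach the executor classpath. A minimal corrected submission might look like the sketch below, assuming the dependency plugin copied the artifact to lib/ under its standard name (elasticsearch-spark-20_2.10-5.5.0.jar):

```shell
# Ship the elasticsearch-spark jar to the executors via a
# comma-separated --jars list; the application jar is the first
# positional argument after the options.
spark-submit \
  --class com.sydney.dream.elasticspark.ElasticSparkFirstDemo \
  --master yarn \
  --deploy-mode client \
  --executor-memory 5G \
  --num-executors 10 \
  --jars /home/ldl/sparkdemo/lib/elasticsearch-spark-20_2.10-5.5.0.jar \
  /home/ldl/sparkdemo/ElasticSpark-1.0.0.jar
```

Alternatively, enabling the commented-out maven-shade-plugin in the pom and submitting the resulting fat jar avoids managing `--jars` altogether.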
 

Community Daily, Issue 44 (2017-09-11)

1. Visualizing a Spring Boot application with ELK

http://t.cn/RpM5eM4

2. Many developers who don't work with Java are unfamiliar with how Elasticsearch and Logstash handle automatic garbage collection; here is an introduction to the Java GC system

http://t.cn/RpMf7Ve

3. Handling denormalized data with Elasticsearch.

http://t.cn/RpMpNC5

Editor: cyberdak

Archive: https://elasticsearch.cn/article/267

Subscribe: https://tinyletter.com/elastic-daily
 

Community Daily, Issue 43 (2017-09-10)

1. Handling relationships between entities with Elasticsearch.
http://t.cn/Rpt082p
2. Tracking and monitoring Linux systems with ELK and the Auditbeat module.
http://t.cn/Rpt0nBT
3. Analyzing network packets with Wireshark, Elasticsearch, and Kibana.
http://t.cn/RptTsyy

Editor: 至尊宝
Archive: https://elasticsearch.cn/article/266
Subscribe: https://tinyletter.com/elastic-daily

Community Daily, Issue 42 (2017-09-09)

1. Are you using the bool query syntax correctly?
http://t.cn/RpGg46z

2. Sometimes scripts are not the best choice:
http://t.cn/RpGg9kw

3. A step-by-step guide to deploying Elasticsearch with Docker:
http://t.cn/RpGeJ5U

Editor: bsll
Archive: https://elasticsearch.cn/article/264
Subscribe: https://tinyletter.com/elastic-daily
 

Filebeat SSL certificate configuration fails with "ERR Failed to publish events"

Hello everyone:
 
While configuring SSL certificate encryption for Filebeat I got the error below; if anyone has run into this, please help!
 
ERR Failed to publish events caused by: read tcp 192.168.1.57:56182->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
 
I have tested the ELK server with "telnet IP Port" and confirmed the connection is OK.
 
The Filebeat configuration file is as follows (Windows environment):
filebeat.prospectors:
- input_type: log                  # input type "log"
  paths:
    - D:\Wireshark_Log\*           # the log files to push

output.logstash:
  hosts: ["192.168.1.249:5043"]    # the receiving Logstash
  tls:
    certificate_authorities:
      - C:\filebeat-5.5.0-windows-x86_64\ssl\logstash\192.168.1.249.crt
  ssl.certificate:
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.crt
  ssl.certificate:
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.key

Below is the Filebeat error log:
2017-09-08T14:14:57+08:00 ERR Failed to publish events caused by: read tcp 192.168.1.57:56202->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
2017-09-08T14:14:57+08:00 INFO Error publishing events (retrying): read tcp 192.168.1.57:56202->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
2017-09-08T14:15:19+08:00 INFO Non-zero metrics in the last 30s: filebeat.harvester.closed=1 filebeat.harvester.open_files=-1 filebeat.harvester.running=-1 libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=323 libbeat.logstash.published_but_not_acked_events=5
2017-09-08T14:15:49+08:00 INFO No non-zero metrics in the last 30s

2017-09-08: Thanks to medcl for the help; revised again as follows:
filebeat.prospectors:
- input_type: log                  # input type "log"
  paths:
    - D:\Wireshark_Log\*           # the log files to push

output.logstash:
  hosts: ["192.168.1.249:5043"]    # the receiving Logstash
  ssl:                             # <=== newer versions apparently use "ssl"
    certificate_authorities:
      - C:\filebeat-5.5.0-windows-x86_64\ssl\logstash\192.168.1.249.crt
  ssl.certificate:
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.crt
  ssl.key:                         # <=== corrected to "ssl.key"
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.key

Below is the Filebeat error log:
2017-09-08T15:40:23+08:00 INFO Non-zero metrics in the last 30s: filebeat.harvester.open_files=1 filebeat.harvester.running=1 filebeat.harvester.started=1 libbeat.logstash.publish.read_bytes=5120 libbeat.logstash.publish.write_bytes=660 libbeat.publisher.published_events=20
2017-09-08T15:40:29+08:00 ERR Connecting error publishing events (retrying): x509: certificate is valid for 192.168.1.57, not 192.168.1.249
2017-09-08T15:40:53+08:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_bytes=1024 libbeat.logstash.publish.write_bytes=132
2017-09-08T15:41:01+08:00 ERR Connecting error publishing events (retrying): x509: certificate is valid for 192.168.1.57, not 192.168.1.249
2017-09-08T15:41:23+08:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_bytes=1024 libbeat.logstash.publish.write_bytes=132
2017-09-08T15:41:53+08:00 INFO No non-zero metrics in the last 30s
Which means:
the certificate is valid for 192.168.1.57, not 192.168.1.249. I don't quite understand this...
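A plausible reading (my interpretation, not confirmed in the thread): Filebeat verifies that the certificate presented by the Logstash server matches the address it connects to, 192.168.1.249, but the server is presenting the certificate issued for 192.168.1.57. The fix would be to give Logstash a certificate issued for its own address. A sketch with OpenSSL, assuming a self-signed setup (file names are illustrative):

```shell
# Generate a self-signed certificate whose subject names the Logstash
# host, so Filebeat's hostname check against 192.168.1.249 can pass.
# Note: modern TLS stacks match IP addresses against a subjectAltName
# IP entry, which may additionally require an OpenSSL config file or
# -addext on OpenSSL 1.1.1+.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout logstash.key -out logstash.crt \
  -subj "/CN=192.168.1.249"
```

The generated pair would then go into the Logstash beats input, and the same logstash.crt is what `certificate_authorities` on the Filebeat side should trust in a self-signed setup.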
继续阅读 »
各位朋友大家好:
 
進行配置「Filebeat」證書加密 出現錯誤如下,有哪位朋友遇過此問題可以幫幫忙!
 
ERR Failed to publish events caused by: read tcp 192.168.1.57:56182->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
 
使用過「telnet IP Port」測試「ELK」服務器,確認通訊協議 OK !
 
Filebeat 配置如檔如下:  (Windows 環境)
filebeat.prospectors:
- input_type: log #輸入 type「log」
paths:
- D:\Wireshark_Log\* #指定推送日誌「Log」文件

output.logstash:
hosts: ["192.168.1.249:5043"] #指定接收Logstash
tls:
certificate_authorities:
- C:\filebeat-5.5.0-windows-x86_64\ssl\logstash\192.168.1.249.crt
ssl.certificate:
- C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.crt
ssl.certificate:
- C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.key

以下是「FileBeat」錯誤 日誌
2017-09-08T14:14:57+08:00 ERR Failed to publish events caused by: read tcp 192.168.1.57:56202->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
2017-09-08T14:14:57+08:00 INFO Error publishing events (retrying): read tcp 192.168.1.57:56202->192.168.1.249:5043: wsarecv: An existing connection was forcibly closed by the remote host.
2017-09-08T14:15:19+08:00 INFO Non-zero metrics in the last 30s: filebeat.harvester.closed=1 filebeat.harvester.open_files=-1 filebeat.harvester.running=-1 libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_bytes=323 libbeat.logstash.published_but_not_acked_events=5
2017-09-08T14:15:49+08:00 INFO No non-zero metrics in the last 30s

 2017.09.08 感謝 medcl 兄弟幫忙,再次修改如下:
```
filebeat.prospectors:
- input_type: log                 # input type "log"
  paths:
    - D:\Wireshark_Log\*          # log files to ship

output.logstash:
  hosts: ["192.168.1.249:5043"]   # the receiving Logstash
  ssl:                            # <=== newer versions apparently use "ssl"
    certificate_authorities:
      - C:\filebeat-5.5.0-windows-x86_64\ssl\logstash\192.168.1.249.crt
  ssl.certificate:
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.crt
  ssl.key:                        # <=== corrected to "ssl.key"
    - C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.key
```
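For reference, my understanding of the Filebeat 5.x layout is that the TLS options are flat `ssl.*` keys directly under `output.logstash`, with `certificate_authorities` taking a list but `certificate` and `key` each taking a single path, not a list. A sketch using the paths from the post (worth double-checking against the 5.5 reference docs):

```
output.logstash:
  hosts: ["192.168.1.249:5043"]
  ssl.certificate_authorities:
    - C:\filebeat-5.5.0-windows-x86_64\ssl\logstash\192.168.1.249.crt
  ssl.certificate: C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.crt
  ssl.key: C:\filebeat-5.5.0-windows-x86_64\ssl\filebeat\192.168.1.57.key
```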

Filebeat error log:
2017-09-08T15:40:23+08:00 INFO Non-zero metrics in the last 30s: filebeat.harvester.open_files=1 filebeat.harvester.running=1 filebeat.harvester.started=1 libbeat.logstash.publish.read_bytes=5120 libbeat.logstash.publish.write_bytes=660 libbeat.publisher.published_events=20
2017-09-08T15:40:29+08:00 ERR Connecting error publishing events (retrying): x509: certificate is valid for 192.168.1.57, not 192.168.1.249
2017-09-08T15:40:53+08:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_bytes=1024 libbeat.logstash.publish.write_bytes=132
2017-09-08T15:41:01+08:00 ERR Connecting error publishing events (retrying): x509: certificate is valid for 192.168.1.57, not 192.168.1.249
2017-09-08T15:41:23+08:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_bytes=1024 libbeat.logstash.publish.write_bytes=132
2017-09-08T15:41:53+08:00 INFO No non-zero metrics in the last 30s
The message says the certificate is valid for 192.168.1.57, not 192.168.1.249. I don't quite understand this part...
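The x509 error at the end means Filebeat verified the certificate presented by Logstash against the address it dialed (192.168.1.249), but that certificate was issued for 192.168.1.57. So it looks like either the client certificate was installed on the Logstash side, or the server certificate was generated for the wrong IP. One way out is to regenerate the Logstash server certificate so it names 192.168.1.249. A minimal sketch with OpenSSL (filenames are placeholders; `-addext` needs OpenSSL 1.1.1 or later):

```shell
# Regenerate the Logstash-side server certificate so it is valid for the
# IP that Filebeat connects to (192.168.1.249). x509 verification matches
# the dialed address against the cert's CN / subjectAltName.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout logstash.key -out logstash.crt \
  -subj "/CN=192.168.1.249" \
  -addext "subjectAltName=IP:192.168.1.249"

# Confirm the IP is actually in the certificate before deploying it:
openssl x509 -in logstash.crt -noout -text | grep -A1 "Subject Alternative Name"
```

Then the beats input on the Logstash side would point its `ssl_certificate` / `ssl_key` at the new pair, and Filebeat's `ssl.certificate_authorities` at `logstash.crt` (or at the CA that signed it, if you use a proper CA instead of a self-signed cert).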

Offline meetups are coming again. Folks in Changsha, Wuhan, Guangzhou, and Shenzhen, come sign up!

Elastic offline meetups are here yet again! This round's schedule is:

 
If you are in one of these cities, hurry up and sign up. You can register to give a talk or simply to attend, and helping to organize is even more welcome ✌️.
 
Solid technical content, free to attend, no fees (as always)!

Confirmed talks (speaker sign-ups welcome):
  • Elastic - Medcl - What's new in Elastic Stack 6.0
  • Building a site search engine quickly with a crawler and Elasticsearch
  • 芒果TV - 刘波涛 - Mango TV's logging journey
  • 尚德机构 - 白凡 - Optimizing Douyu's search engine under high throughput
  • 腾讯 - 姜国强 - A time-series database service built on ES
  • 中信银行信用卡中心 - 陈刚 - ES's road to containerization
  • 中投证券 - 尉晋洪 - ELK for business monitoring in the securities industry
  • 网易 - ELK in NetEase Cangbaoge (藏宝阁)
  • 网易 - An overview of NetEase's ELK systems
  • 数说故事 - 吴文杰 - ElasticSearch with OLAP in Datastory
  • 酷狗 - 钟旺 - An ES-based music search engine
  • Vivo - 杨振涛 - Elasticsearch best practices at vivo

 
To sign up as a speaker or to sponsor a venue, add WeChat: medcl123

Recaps from the first half of the year:

Oh, and the registration link: http://elasticsearch.mikecrm.com/O6o0yq3
Seats are limited!

Community Daily, Issue 41 (2017-09-08)

1. The ten most common Elasticsearch interview questions, with answers.
http://t.cn/Rp7Z2Zg

2. Make your Elasticsearch more secure in 5 minutes!
http://t.cn/R9SQJlK
3. Hands-on with complex data storage patterns in Elasticsearch.
http://t.cn/RphsfBJ

Editor: laoyang360
Archive: https://www.elasticsearch.cn/article/260
Subscribe: https://tinyletter.com/elastic-daily
 

Community Daily, Issue 40 (2017-09-07)

1. A deep dive into Elasticsearch heap sizing: http://t.cn/RpPxEEO
2. Build a full Elastic Stack with one command: http://t.cn/RpPxDo4
3. How to monitor Elasticsearch performance: http://t.cn/RpPJh7u

Editor: 金桥
Archive: https://elasticsearch.cn/article/259
Subscribe: https://tinyletter.com/elastic-daily

[360 - Beijing] Elasticsearch Engineer

Responsibilities:
1. Configuration management and optimization of Elasticsearch clusters;
2. Elasticsearch-related development.
Requirements:
1. Bachelor's degree or above; familiar with Java and Linux;
2. Familiar with Elasticsearch, with hands-on or development experience;
3. Background in search technology, NoSQL, or Hadoop is a plus.
If interested, send your resume to noh1122@163.com, QQ: 2472659680

Community Daily, Issue 39 (2017-09-06)

1. How to build a search framework on Java and Elasticsearch
http://t.cn/RNDj4cz

2. Youzan's search engine in practice
Engineering: http://t.cn/RNDHO1W
Algorithms: http://t.cn/RqjlyIR

3. China Minsheng Bank's ELK selection and basic usage
http://t.cn/RND80vn

Editor: 江水
Archive: https://elasticsearch.cn/article/257
Subscribe: https://tinyletter.com/elastic-daily

Community Daily, Issue 38 (2017-09-05)

1. An ES outage caused by cluster expansion, and how it was fixed. Might this apply to you?
http://t.cn/RC7iMym
2. Disk optimization strategies and advice for ES 5.0 and later.
http://t.cn/RN0g6bh
3. How to define the lifecycle of an ES cluster? eBay's approach is worth a look!
http://t.cn/RXlwIqQ

Editor: 叮咚光军
Archive: https://elasticsearch.cn/article/256
Subscribe: https://tinyletter.com/elastic-daily
 

Community Daily, Issue 37 (2017-09-04)

1. Searching Elasticsearch data with SQL:
http://t.cn/RNWqqUx

2. Using Elasticsearch for site-wide search? Learn from Targetprocess's optimizations (a VPN may be required):
http://t.cn/RNW5sWH

3. Using Elasticsearch for data analytics? Then take a look at Kibi, a smarter data platform (a VPN may be required):
http://t.cn/RNWVW6h

Editor: cyberdak
Archive: https://elasticsearch.cn/article/255
Subscribe: https://tinyletter.com/elastic-daily
 

Community Daily, Issue 36 (2017-09-03)

1. How Snaptrip improved customer experience with Elasticsearch:
http://t.cn/RNKFisR
2. A step-by-step guide to installing Elasticsearch v5.5.0 on Windows:
http://t.cn/RNKFCpg
3. Elastic-Job-Cloud, Dangdang's Mesos-based job cloud:
http://t.cn/RfCBHZt

Editor: 至尊宝
Archive: https://elasticsearch.cn/article/254
Subscribe: https://tinyletter.com/elastic-daily

Community Daily, Issue 35 (2017-09-02)

1. Are you clear on ES's various relationship-indexing and query patterns?
http://t.cn/RNXpc2Y
2. A step-by-step guide to collecting data from the web with Lassie and importing it into ES
http://t.cn/RNXWeGk
3. How to use Painless to meet your customization needs: this article gets you started.
http://t.cn/RNXjWeG

Editor: bsll
Archive: https://elasticsearch.cn/article/253
Subscribe: https://tinyletter.com/elastic-daily
 

ES nodes dropping out of the cluster during GC

Recently I've noticed that ES nodes drop out of the cluster during GC. I changed the configuration to the values below, but after a while nodes dropped out again:

```
discovery.zen.fd.ping_timeout: 60s
discovery.zen.fd.ping_interval: 10s
discovery.zen.fd.ping_retries: 10
```

How should these values be set? And are there any other solutions?
 
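Widening the fault-detection timeouts above only masks the symptom: if an old-generation GC pause approaches the fault-detection window, the node can still drop. The usual root-cause fix is to keep GC pauses short in the first place, starting with heap sizing. A minimal sketch for ES 5.x+ (`jvm.options`; on 2.x the equivalent is the `ES_HEAP_SIZE` environment variable; the 8g figure is purely illustrative):

```
## jvm.options (illustrative values, adjust to your hardware)
# Set min and max heap equal to avoid resize pauses,
# keep the heap at or below 50% of physical RAM, and
# stay under ~31 GB so compressed oops remain enabled.
-Xms8g
-Xmx8g
```

It is also worth checking the node's GC logs (or the `_nodes/stats/jvm` API) to confirm how long the old-gen collections actually run before tuning the discovery settings any further.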