Java SparkContext error: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator

Date: 2018-04-27 05:15:19

Tags: java maven apache-spark

This is my first foray into Spark with Java. With either Spark 1.X (tried 1.5.0) or 2.X (tried 2.2.0), Java 1.8, and Scala 2.10, the following error occurs:

JavaSparkContext sc = new JavaSparkContext(sparkConf);

Exception in thread "main" java.lang.NoSuchMethodError: 
io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V
    at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:120)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:107)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:70)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:450)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:246)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at KMeansMP.main(KMeansMP.java:38)
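
For context, the driver setup that triggers the call above is roughly the following (a simplified sketch; the local[*] master and surrounding code are illustrative, not the exact KMeansMP source):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class KMeansMP {
    public static void main(String[] args) {
        // Minimal driver setup; the JavaSparkContext constructor below is where the error is thrown.
        SparkConf sparkConf = new SparkConf()
                .setAppName("KMeansMP")
                .setMaster("local[*]");   // local mode assumed for illustration
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        // ... clustering logic would follow here ...
        sc.stop();
    }
}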

I assumed it was a library mismatch, but I could not isolate the exact incompatibility. Here is the relevant part of the pom.xml:

<properties>
    <spark.version>2.2.0</spark.version>
</properties>

..

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>

Any hints from Java veterans who have run into this would be appreciated.

1 Answer:

Answer 0 (score: 2):

Both spark-core and giraph-core depend on netty-all, and the version pulled in by giraph-core is older than the one Spark was built against, which is why the PooledByteBufAllocator constructor is missing at runtime. You need to exclude netty-all from giraph-core:

<dependencies>
    <dependency>
        <groupId>org.apache.giraph</groupId>
        <artifactId>giraph-core</artifactId>
        <version>1.1.0-hadoop2</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>${spark.version}</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
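
To verify which netty artifacts end up on the classpath before and after the exclusion, you can inspect the dependency tree with the standard Maven dependency plugin, for example:

mvn dependency:tree -Dincludes=io.netty

After adding the exclusion, only the netty-all version brought in transitively by spark-core should remain.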