Maven artifactId for hadoop-core in Hadoop 2.2.0

Date: 2014-03-12 23:55:53

Tags: maven, hadoop, ant, hadoop2

I am migrating my application from Hadoop 1.0.3 to Hadoop 2.2.0, and my Maven build had hadoop-core marked as a dependency. Since hadoop-core does not exist for Hadoop 2.2.0, I tried replacing it with hadoop-client and hadoop-common, but I am still getting this error about ant.filter. Can anybody suggest which artifact to use?

Previous config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.3</version>
</dependency>

New config:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>

Error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project event: Compilation failure: Compilation failure:

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[180,59] cannot find symbol

[ERROR] symbol: class StringInputStream

[ERROR] location: class com.intel.event.EventContext
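
Note that the missing package is from Apache Ant, not Hadoop: org.apache.tools.ant.filters.StringInputStream is an Ant class, which presumably reached the compile classpath transitively through hadoop-core 1.0.3 and is no longer present with the 2.2.0 artifacts. One possible workaround, assuming the code genuinely needs the Ant class, is to declare Ant explicitly; the version below is only illustrative:

<!-- Explicitly pull in Ant, which provides org.apache.tools.ant.filters.StringInputStream -->
<dependency>
    <groupId>org.apache.ant</groupId>
    <artifactId>ant</artifactId>
    <version>1.9.4</version>
</dependency>

Alternatively, replacing StringInputStream with a java.io.ByteArrayInputStream over the string's bytes removes the need for Ant entirely.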

3 Answers:

Answer 0 (score: 5)

Our application mainly depends on the HDFS API. When we migrated to Hadoop 2.X, we were surprised by the changes in the dependencies. We started adding the dependencies one at a time. Today we depend on the following core libraries:

hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-common-2.2.0
hadoop-hdfs-2.2.0
hadoop-mapreduce-client-core-2.2.0

Apart from these, we also depend on test libraries. Depending on your needs, you may want to include hadoop-hdfs and hadoop-mapreduce-client in your dependencies along with hadoop-common.
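
As a minimal POM sketch (not the answerer's actual build file), the libraries listed above could be declared like this, with the version factored into a property:

<properties>
    <hadoop.version>2.2.0</hadoop.version>
</properties>

<dependencies>
    <!-- Core Hadoop 2.2.0 libraries from the list above -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-annotations</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-auth</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>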

Answer 1 (score: 0)

Try with these artifacts; they worked fine for me on my sample wordcount project:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>

Answer 2 (score: 0)

The Maven dependencies can be taken from this link. As for the hadoop-core dependency: hadoop-core was the artifact name for Hadoop 1.X, and simply changing its version to 2.X will not help. Moreover, using a Hadoop 1.X dependency in a Hadoop 2.X project will produce an error like:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4

So it is suggested not to use it. I have been using the following dependencies in my Hadoop projects:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>

You can try these.