I am new to Hadoop. Yesterday I followed the book and used JUnit to unit test the weather data example, but I ran into a problem.
Here is my pom file:
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.9.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.9.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>1.2.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.mrunit</groupId>
      <artifactId>mrunit</artifactId>
      <version>1.1.0</version>
      <classifier>hadoop2</classifier>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-minicluster</artifactId>
      <version>2.9.0</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
Here is the error:
java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected
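The test is based on the book's MaxTemperatureMapper MRUnit example. Roughly, it looks like the sketch below (the mapper is inlined and simplified here for illustration, so it may differ from my actual code in details):

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Test;

public class MaxTemperatureMapperTest {

  // Simplified stand-in for the book's MaxTemperatureMapper, written
  // against the new org.apache.hadoop.mapreduce API.
  static class MaxTemperatureMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      String line = value.toString();
      String year = line.substring(15, 19);                            // year field of the NCDC record
      int airTemperature = Integer.parseInt(line.substring(87, 92));   // signed temperature
      context.write(new Text(year), new IntWritable(airTemperature));
    }
  }

  @Test
  public void processesValidRecord() throws IOException {
    Text value = new Text(
        "0043011990999991950051518004+68750+023550FM-12+0382" +        // year 1950
        "99999V0203201N00261220001CN9999999N9-00111+99999999999");     // temperature -0011

    new MapDriver<LongWritable, Text, Text, IntWritable>()
        .withMapper(new MaxTemperatureMapper())
        .withInput(new LongWritable(0), value)
        .withOutput(new Text("1950"), new IntWritable(-11))
        .runTest();
  }
}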
Looking forward to your replies, and thanks in advance!
Answer 0 (score: 0)
Try a different version of the jar.
I changed the old jar, mrunit-0.9.0-incubating-hadoop2.jar, to mrunit-1.1.0.jar.

Old jar (the old mapred API, where Mapper is an interface):

public interface Mapper<K1, V1, K2, V2> extends JobConfigurable, Closeable { }

New jar (the new mapreduce API, where Mapper is a class):

public class Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT> {}

I just switched the jars from hadoop1 to hadoop2, and then all of my unit test cases ran successfully. https://issues.apache.org/jira/browse/MRUNIT-156
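A quick way to double-check which jar a conflicting class is actually loaded from (a generic JVM trick, not something specific to MRUnit; the class name WhichJar is just for illustration) is to print its code source:

public class WhichJar {
  public static void main(String[] args) {
    // Prints the jar that TaskInputOutputContext was actually loaded from,
    // which makes a stray hadoop-core 1.x on the classpath easy to spot.
    System.out.println(
        org.apache.hadoop.mapreduce.TaskInputOutputContext.class
            .getProtectionDomain().getCodeSource().getLocation());
  }
}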
Answer 1 (score: 0)
I just found out that there is no hadoop-core artifact in Hadoop 2.x, so the correct pom is:
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.9.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-core</artifactId>
  <version>2.9.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.9.0</version>
</dependency>
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>1.1.0</version>
  <classifier>hadoop2</classifier>
  <scope>test</scope>
</dependency>
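With these dependencies all of the MapReduce classes come from the Hadoop 2 artifacts, so an MRUnit test written against the new org.apache.hadoop.mapreduce API no longer hits the class-versus-interface clash. As a rough illustration, here is a minimal reducer test sketch (the reducer below is a stand-in modeled on the book's max-temperature example, not code taken from the question):

import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.junit.Test;

public class MaxTemperatureReducerTest {

  // Stand-in reducer that picks the maximum temperature per year, written
  // against the new org.apache.hadoop.mapreduce API provided by
  // hadoop-mapreduce-client-core.
  static class MaxTemperatureReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int max = Integer.MIN_VALUE;
      for (IntWritable value : values) {
        max = Math.max(max, value.get());
      }
      context.write(key, new IntWritable(max));
    }
  }

  @Test
  public void returnsMaximumIntegerInValues() throws IOException {
    new ReduceDriver<Text, IntWritable, Text, IntWritable>()
        .withReducer(new MaxTemperatureReducer())
        .withInput(new Text("1950"),
                   Arrays.asList(new IntWritable(10), new IntWritable(5)))
        .withOutput(new Text("1950"), new IntWritable(10))
        .runTest();
  }
}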