Hadoop 2.7.3 on my Mac is installed at:
/usr/local/Cellar/hadoop/2.7.3
I wrote a demo in Java to read a file from HDFS:
import java.io.*;
import java.net.URI;
import java.net.URISyntaxException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
public class HDFSTest {
    public static void main(String[] args) throws IOException, URISyntaxException {
        String file = "hdfs://localhost:9000/hw1/customer.tbl";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(file), conf);
        Path path = new Path(file);
        FSDataInputStream in_stream = fs.open(path);
        BufferedReader in = new BufferedReader(new InputStreamReader(in_stream));
        String s;
        while ((s = in.readLine()) != null) {
            System.out.println(s);
        }
        in.close();
        fs.close();
    }
}
When I compile the Java file, it fails as shown below:
hero:Documents yaopan$ javac HDFSTest.java
HDFSTest.java:8: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
HDFSTest.java:10: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FSDataInputStream;
^
HDFSTest.java:12: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FSDataOutputStream;
^
HDFSTest.java:14: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileSystem;
^
I know the cause is that the Hadoop jars cannot be found. How do I configure the classpath?
Answer 0 (score: 1)

Find the jar file named hadoop-common-2.7.3.jar under your installation (i.e. under /usr/local/Cellar/hadoop/2.7.3) and put it on the classpath, or supply it directly to javac on the command line:

javac -cp "/PATH/hadoop-common-2.7.3.jar" HDFSTest.java

(replace /PATH with the appropriate path)
Answer 1 (score: 0)

Just add the Hadoop jars to the classpath. I installed HBase with Homebrew under /usr/local/Cellar/hbase/1.2.2 and added all the jars under /usr/local/Cellar/hbase/1.2.2/libexec/lib to the classpath:

1. Edit .bash_profile:

sudo vim ~/.bash_profile

2. Add the classpath:

#set hbase lib path
export CLASSPATH=$CLASSPATH:/usr/local/Cellar/hbase/1.2.2/libexec/lib/*

Save and exit (:wq).
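Note that editing the profile only affects shells that re-read it. A small sketch of the follow-up steps, assuming the export above was added to ~/.bash_profile (substitute your own install path for the HBase one shown in the answer):

```shell
# Reload the profile so the export takes effect in the current shell
source ~/.bash_profile

# Confirm the wildcard entry is present before compiling
echo "$CLASSPATH"

# javac/java now pick up every jar in that directory via the trailing /*
javac HDFSTest.java
```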