I am a beginner with Pig. Following the wiki, I wrote a UDF that converts the words in a file to uppercase.
$ cat UPPER.java

```java
package com.bigdata.myUdf;

import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;
import org.apache.pig.impl.util.WrappedIOException;

public class UPPER extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        try {
            String str = (String) input.get(0);
            return str.toUpperCase();
        } catch (Exception e) {
            throw WrappedIOException.wrap("Caught exception processing input row ", e);
        }
    }
}
```
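A side note on the exception handling: `WrappedIOException` is deprecated in newer Pig releases, where throwing a plain `IOException` with a cause is preferred. The core logic of the UDF can also be checked without any Pig dependency; the `toUpper` helper below is hypothetical and only mirrors what `exec` does to the string it extracts from the tuple:

```java
import java.io.IOException;

public class UpperSketch {
    // Mirrors UPPER.exec minus the Pig Tuple handling: null in, null out;
    // otherwise uppercase the string, wrapping any failure in an IOException.
    static String toUpper(String str) throws IOException {
        if (str == null) {
            return null;
        }
        try {
            return str.toUpperCase();
        } catch (Exception e) {
            // Plain IOException replaces the deprecated WrappedIOException.wrap(...)
            throw new IOException("Caught exception processing input row", e);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(toUpper("hello")); // prints HELLO
        System.out.println(toUpper(null));    // prints null
    }
}
```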
$ cat /home/hduser/lab/mydata/myscript.pig

```pig
REGISTER /home/hduser/software/myUdfs/UPPER.jar;
std_det = LOAD '/pigdata/udf1.txt' USING PigStorage(',') as (name:chararray);
B = FOREACH std_det GENERATE com.bigdata.myUdf.UPPER(name);
dump B;
```
But when I run it, I get an error:
```
$ java -cp com.bigdata.myUdf.UPPER.jar org.apache.pig.Main -x local /home/hduser/lab/mydata/myscript.pig
Error: Could not find or load main class org.apache.pig.Main
```
$ cat .bashrc

```shell
export PIG_INSTALL=/home/hduser/software/pig
export PATH="${PATH}:${PIG_INSTALL}/bin"
export PIG_CLASSPATH=$HADOOP_CONF_DIR:${PIG_INSTALL}:.
export CLASSPATH=.:${PIG_CLASSPATH}
```
The Pig script is at: /home/hduser/lab/mydata/myscript.pig
The jar file is at: /home/hduser/software/myUdfs/UPPER.jar
Please help me understand what I am doing wrong. Thanks in advance.

UPDATE: I followed Shivashakti's instructions. The script now runs, but it produces no output:
```
$ pig -x local myScript.pig
15/01/05 04:47:57 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
15/01/05 04:47:57 INFO pig.ExecTypeProvider: Picked LOCAL as the ExecType
2015-01-05 04:47:57,920 [main] INFO org.apache.pig.Main - Apache Pig version 0.14.0 (r1640057) compiled Nov 16 2014, 18:02:05
2015-01-05 04:47:57,921 [main] INFO org.apache.pig.Main - Logging error messages to: /home/hduser/lab/piglog/pig_1420462077918.log
2015-01-05 04:47:57,959 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - user.name is deprecated. Instead, use mapreduce.job.user.name
2015-01-05 04:47:58,314 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-01-05 04:47:58,315 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2015-01-05 04:47:58,318 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2015-01-05 04:47:58,463 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-01-05 04:47:59,070 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-01-05 04:47:59,227 [main] INFO org.apache.pig.Main - Pig script completed in 2 seconds and 505 milliseconds (2505 ms)
```
Answer 0 (score: 2)
Can you try the following steps?
1. Download the three jar files (pig-0.11.1.jar, hadoop-common-0.21.0.jar, and piggybank.jar) from the links below:
http://www.java2s.com/Code/Jar/p/Downloadpig0111jar.htm
http://www.java2s.com/Code/Jar/h/Downloadhadoopcommon0210jar.htm
http://www.java2s.com/Code/Jar/p/Downloadpiggybankjar.htm
2. Add all three jar files to your classpath:

```shell
export CLASSPATH=/tmp/pig-0.11.1.jar:/tmp/hadoop-common-0.21.0.jar:/tmp/piggybank.jar
```
3. From the current directory, create the directory structure com/bigdata/myUdf/ (it must match the package name):

```shell
$ mkdir -p com/bigdata/myUdf/
```
4. Compile the UPPER.java file. Make sure JAVA_HOME is set correctly and that all three jars above are on the classpath, or compilation will fail:

```shell
$ javac UPPER.java
```
5. Move the compiled UPPER.class file into the com/bigdata/myUdf/ folder:

```shell
$ mv UPPER.class com/bigdata/myUdf/
```
6. Create a jar file named UPPER.jar:

```shell
$ jar -cvf UPPER.jar com/
```
7. REGISTER UPPER.jar in your Pig script and run:

```shell
$ pig -x local myscript.pig
```
After running the above command, you should get the expected output.
Example

input:
```
hello
world
```

myscript.pig:
```pig
REGISTER UPPER.jar;
std_det = LOAD 'input' USING PigStorage(',') as (name:chararray);
B = FOREACH std_det GENERATE com.bigdata.myUdf.UPPER(name);
dump B;
```

Output:
```
(HELLO)
(WORLD)
```
Sample commands:

```shell
$ ls
UPPER.java  input  myscript.pig
$ mkdir -p com/bigdata/myUdf/
$ javac UPPER.java
$ mv UPPER.class com/bigdata/myUdf/
$ jar -cvf UPPER.jar com/
$ pig -x local myscript.pig
```
Answer 1 (score: 0)
The error indicates that the Apache Pig jars are not on the classpath. `-cp com.bigdata.myUdf.UPPER.jar` does not include the required jars; it only includes 'UPPER.jar'. You can read about how to set the classpath here.

P.S. I think you should run the `pig` command from the command line rather than invoking Pig the way you did, but I haven't used it that way myself, so this is just a guess.
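To illustrate this point: when launching Pig via `java` directly, `-cp` must list the Pig jar itself in addition to the UDF jar. The sketch below only builds and prints such a command line; the two paths come from the question, and the jar name `pig-0.14.0.jar` is an assumption about where a Pig 0.14 install keeps its main jar:

```shell
# Hypothetical layout: PIG_INSTALL and UDF_JAR are taken from the question,
# and pig-0.14.0.jar is assumed to sit at the top of the Pig install.
PIG_INSTALL=/home/hduser/software/pig
UDF_JAR=/home/hduser/software/myUdfs/UPPER.jar

# Both jars must appear on -cp, separated by ':' -- not just UPPER.jar.
CP="${PIG_INSTALL}/pig-0.14.0.jar:${UDF_JAR}"
echo "java -cp ${CP} org.apache.pig.Main -x local /home/hduser/lab/mydata/myscript.pig"
```

In practice the `pig -x local` wrapper does this jar bookkeeping for you, which is why it is the easier route.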