I have set up gridgain-hadoop-os-6.6.2.zip and followed the steps described in docs/hadoop_readme.pdf. GridGain starts fine with the bin/ggstart.sh command, and I am now trying to run a simple wordcount job on GridGain with hadoop-2.2.0, using the command:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/*-mapreduce-examples-*.jar wordcount /input /output
Steps I have tried:
Step 1: Extracted hadoop-2.2.0 and gridgain-hadoop-os-6.6.2.zip into /usr/local and renamed the GridGain folder to "gridgain".
Step 2: Set JAVA_HOME, exported GRIDGAIN_HOME=/usr/local/gridgain, and set the hadoop-2.2.0 paths:
# Set Hadoop-related environment variables
export HADOOP_PREFIX=/usr/local/hadoop-2.2.0
export HADOOP_HOME=/usr/local/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/usr/local/hadoop-2.2.0
export HADOOP_COMMON_HOME=/usr/local/hadoop-2.2.0
export HADOOP_HDFS_HOME=/usr/local/hadoop-2.2.0
export YARN_HOME=/usr/local/hadoop-2.2.0
export HADOOP_CONF_DIR=/usr/local/hadoop-2.2.0/etc/hadoop
export GRIDGAIN_HADOOP_CLASSPATH='/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*'
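(These exports are assumed to live in ~/.bashrc; after editing it I reload the shell and spot-check a couple of the variables:)
source ~/.bashrc
echo $GRIDGAIN_HOME
echo $HADOOP_HOME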
Step 3: Ran the command bin/setup-hadoop.sh and answered Y to every prompt.
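(I run it from the GridGain install directory; the piped form below is only a sketch for running it unattended, and assumes the script asks nothing but yes/no questions:)
cd $GRIDGAIN_HOME
bin/setup-hadoop.sh            # answer Y at each prompt
# yes | bin/setup-hadoop.sh    # unattended variant (assumption)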
Step 4: Started GridGain with the command bin/ggstart.sh.
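(To confirm the node is actually up I check the running JVMs; seeing GridCommandLineStartup in jps is what I expect, matching the output shown further below:)
jps
# 3529 GridCommandLineStartup   <- GridGain node running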
Step 5: Created the directory and uploaded a file with:
hadoop fs -mkdir /input
hadoop fs -copyFromLocal $HADOOP_HOME/README.txt /input/WORD_COUNT_ME.txt
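(A quick sanity check that the file actually landed in HDFS:)
hadoop fs -ls /input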
Step 6: Running this command gives me an error:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/*-mapreduce-examples-*.jar wordcount /input /output
I get the following error:
15/02/22 12:49:13 INFO Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
15/02/22 12:49:13 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_091ebfbd-2993-475f-a506-28280dbbf891_0002
15/02/22 12:49:13 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/hduser/.staging/job_091ebfbd-2993-475f-a506-28280dbbf891_0002
java.lang.NullPointerException
at org.gridgain.client.hadoop.GridHadoopClientProtocol.processStatus(GridHadoopClientProtocol.java:329)
at org.gridgain.client.hadoop.GridHadoopClientProtocol.submitJob(GridHadoopClientProtocol.java:115)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
And the error on the GridGain console:
sLdrId=a0b8610bb41-091ebfbd-2993-475f-a506-28280dbbf891, userVer=0, loc=true, sampleClsName=java.lang.String, pendingUndeploy=false, undeployed=false, usage=0]], taskClsName=o.g.g.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask, sesId=e129610bb41-091ebfbd-2993-475f-a506-28280dbbf891, startTime=1424589553332, endTime=9223372036854775807, taskNodeId=091ebfbd-2993-475f-a506-28280dbbf891, clsLdr=sun.misc.Launcher$AppClassLoader@1bdcbb2, closed=false, cpSpi=null, failSpi=null, loadSpi=null, usage=1, fullSup=false, subjId=091ebfbd-2993-475f-a506-28280dbbf891], jobId=f129610bb41-091ebfbd-2993-475f-a506-28280dbbf891]]
java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/JobContext
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2585)
at java.lang.Class.getConstructor0(Class.java:2885)
at java.lang.Class.getConstructor(Class.java:1723)
at org.gridgain.grid.hadoop.GridHadoopDefaultJobInfo.createJob(GridHadoopDefaultJobInfo.java:107)
at org.gridgain.grid.kernal.processors.hadoop.jobtracker.GridHadoopJobTracker.job(GridHadoopJobTracker.java:959)
at org.gridgain.grid.kernal.processors.hadoop.jobtracker.GridHadoopJobTracker.submit(GridHadoopJobTracker.java:222)
at org.gridgain.grid.kernal.processors.hadoop.GridHadoopProcessor.submit(GridHadoopProcessor.java:188)
at org.gridgain.grid.kernal.processors.hadoop.GridHadoopImpl.submit(GridHadoopImpl.java:73)
at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask.run(GridHadoopProtocolSubmitJobTask.java:54)
at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolSubmitJobTask.run(GridHadoopProtocolSubmitJobTask.java:37)
at org.gridgain.grid.kernal.processors.hadoop.proto.GridHadoopProtocolTaskAdapter$Job.execute(GridHadoopProtocolTaskAdapter.java:95)
at org.gridgain.grid.kernal.processors.job.GridJobWorker$2.call(GridJobWorker.java:484)
at org.gridgain.grid.util.GridUtils.wrapThreadLoader(GridUtils.java:6136)
at org.gridgain.grid.kernal.processors.job.GridJobWorker.execute0(GridJobWorker.java:478)
at org.gridgain.grid.kernal.processors.job.GridJobWorker.body(GridJobWorker.java:429)
at org.gridgain.grid.util.worker.GridWorker.run(GridWorker.java:151)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Failed to load class: org.apache.hadoop.mapreduce.JobContext
at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClass(GridHadoopClassLoader.java:125)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 20 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.JobContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClassExplicitly(GridHadoopClassLoader.java:196)
at org.gridgain.grid.kernal.processors.hadoop.GridHadoopClassLoader.loadClass(GridHadoopClassLoader.java:106)
... 21 more
Please help me out here.
Edit: here is some additional diagnostic output:
raj@ubuntu:~$ hadoop classpath
/usr/local/hadoop-2.2.0/etc/hadoop:/usr/local/hadoop-2.2.0/share/hadoop/common/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/common/*:/usr/local/hadoop-2.2.0/share/hadoop/hdfs:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/*:/usr/local/hadoop-2.2.0/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/yarn/*:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce/lib/*:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce/*:/usr/local/hadoop-2.2.0/contrib/capacity-scheduler/*.jar
raj@ubuntu:~$ jps
3529 GridCommandLineStartup
3646 Jps
raj@ubuntu:~$ echo $GRIDGAIN_HOME
/usr/local/gridgain
raj@ubuntu:~$ echo $HADOOP_HOME
/usr/local/hadoop-2.2.0
raj@ubuntu:~$ hadoop version
Hadoop 2.2.0
Subversion https://svn.apache.org/repos/asf/hadoop/common -r 1529768
Compiled by hortonmu on 2013-10-07T06:28Z
Compiled with protoc 2.5.0
From source with checksum 79e53ce7994d1628b240f09af91e1af4
This command was run using /usr/local/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar
raj@ubuntu:~$ cd /usr/local/hadoop-2.2.0/share/hadoop/mapreduce
raj@ubuntu:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce$ ls
hadoop-mapreduce-client-app-2.2.0.jar hadoop-mapreduce-client-hs-2.2.0.jar hadoop-mapreduce-client-jobclient-2.2.0-tests.jar lib
hadoop-mapreduce-client-common-2.2.0.jar hadoop-mapreduce-client-hs-plugins-2.2.0.jar hadoop-mapreduce-client-shuffle-2.2.0.jar lib-examples
hadoop-mapreduce-client-core-2.2.0.jar hadoop-mapreduce-client-jobclient-2.2.0.jar hadoop-mapreduce-examples-2.2.0.jar sources
raj@ubuntu:/usr/local/hadoop-2.2.0/share/hadoop/mapreduce$
Answer 0 (score: 2)
I configured exactly the versions you mention (gridgain-hadoop-os-6.6.2.zip + hadoop-2.2.0) and the "wordcount" sample works fine.
[UPD after analyzing the question author's logs:]
Raju, thanks for the detailed logs. The cause of the problem is that you incorrectly set the env variables
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
You explicitly set all of these variables to the ${HADOOP_HOME} value, which is the mistake. This causes GG to compose an incorrect Hadoop classpath, as can be seen in the following GG node log:
+++ HADOOP_PREFIX=/usr/local/hadoop-2.2.0
+++ [[ -z /usr/local/hadoop-2.2.0 ]]
+++ '[' -z /usr/local/hadoop-2.2.0 ']'
+++ HADOOP_COMMON_HOME=/usr/local/hadoop-2.2.0
+++ HADOOP_HDFS_HOME=/usr/local/hadoop-2.2.0
+++ HADOOP_MAPRED_HOME=/usr/local/hadoop-2.2.0
+++ GRIDGAIN_HADOOP_CLASSPATH='/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*:/usr/local/hadoop-2.2.0/lib/*'
So, to fix the problem, please do not set unnecessary env variables. JAVA_HOME and HADOOP_HOME are sufficient; nothing else is needed.
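(A sketch of the fix under that advice: clear the extra variables from the current shell, remove their exports from wherever they are set, e.g. ~/.bashrc, and then restart the GridGain node:)
unset HADOOP_PREFIX HADOOP_MAPRED_HOME HADOOP_COMMON_HOME HADOOP_HDFS_HOME YARN_HOME HADOOP_CONF_DIR GRIDGAIN_HADOOP_CLASSPATH
cd /usr/local/gridgain
bin/ggstart.sh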
Answer 1 (score: 0)
The solution is to not set the other hadoop-related environment variables. It is enough to set only:
JAVA_HOME, HADOOP_HOME and GRIDGAIN_HOME
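(For reference, a minimal environment along those lines; the JDK path here is an assumption, adjust it to your own installation:)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64   # assumed JDK location
export HADOOP_HOME=/usr/local/hadoop-2.2.0
export GRIDGAIN_HOME=/usr/local/gridgain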