Running a HIPI MapReduce program

Date: 2014-01-12 18:21:19

Tags: java apache hadoop

I am trying to run the HIPI MapReduce example (Downloader). I have added the hipi jar to the build path, but I get an error at execution time.

My command looks like:

hadoop jar Downloader.jar Downloader  ./hipi/hipi.txt ./hipi/output.hib 1

My input file hipi.txt contains three URLs.
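For reference, a minimal list file can be sketched as follows; the URLs below are placeholders (the actual three URLs are not given in the post), and the HIPI Downloader example expects one image URL per line:

```shell
# Create a minimal hipi.txt with one image URL per line
# (placeholder URLs -- substitute your own).
cat > hipi.txt <<'EOF'
http://example.com/image1.jpg
http://example.com/image2.jpg
http://example.com/image3.jpg
EOF

# Sanity-check: three lines, three URLs.
wc -l < hipi.txt
```

The file then has to be copied into HDFS (e.g. under ./hipi/) before the job can read it.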

Error log:

    Output HIB: ./hipi/
    14/01/12 02:39:08 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    Found host successfully: 0
    Tried to get 1 nodes, got 1
    14/01/12 02:39:09 INFO input.FileInputFormat: Total input paths to process : 1
    First n-1 nodes responsible for 3 images
    Last node responsible for 3 images
    14/01/12 02:39:10 INFO mapred.JobClient: Running job: job_201401050058_0010
    14/01/12 02:39:12 INFO mapred.JobClient:  map 0% reduce 0%
    14/01/12 02:40:10 INFO mapred.JobClient: Task Id : attempt_201401050058_0010_m_000000_0, Status : FAILED
    Error: java.lang.ClassNotFoundException: hipi.imagebundle.HipiImageBundle
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at Downloader$DownloaderMapper.map(Downloader.java:61)
        at Downloader$DownloaderMapper.map(Downloader.java:1)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop
    attempt_201401050058_0010_m_000000_0: Temp path: ./hipi/0.hib.tmp
    14/01/12 02:40:18 INFO mapred.JobClient: Task Id : attempt_201401050058_0010_m_000000_1, Status : FAILED
    Error: java.lang.ClassNotFoundException: hipi.imagebundle.HipiImageBundle
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at Downloader$DownloaderMapper.map(Downloader.java:61)
        at Downloader$DownloaderMapper.map(Downloader.java:1)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop
    attempt_201401050058_0010_m_000000_1: Temp path: ./hipi/0.hib.tmp
    14/01/12 02:40:27 INFO mapred.JobClient: Task Id : attempt_201401050058_0010_m_000000_2, Status : FAILED
    Error: java.lang.ClassNotFoundException: hipi.imagebundle.HipiImageBundle
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at Downloader$DownloaderMapper.map(Downloader.java:61)
        at Downloader$DownloaderMapper.map(Downloader.java:1)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:140)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop
    attempt_201401050058_0010_m_000000_2: Temp path: ./hipi/0.hib.tmp
    14/01/12 02:40:44 INFO mapred.JobClient: Job complete: job_201401050058_0010
    14/01/12 02:40:44 INFO mapred.JobClient: Counters: 7
    14/01/12 02:40:44 INFO mapred.JobClient:   Job Counters
    14/01/12 02:40:44 INFO mapred.JobClient:     Failed map tasks=1
    14/01/12 02:40:44 INFO mapred.JobClient:     Launched map tasks=4
    14/01/12 02:40:44 INFO mapred.JobClient:     Data-local map tasks=4
    14/01/12 02:40:44 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=61598
    14/01/12 02:40:44 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
    14/01/12 02:40:44 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    14/01/12 02:40:44 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0

1 Answer:

Answer 0 (score: 1):

You would be better off using the command provided on the HIPI website; click HERE to visit it. This is the useful command:

  

./runDownloader.sh /hdfs/path/to/list.txt /hdfs/path/to/output.hib 100

Depending on which Hadoop version you have, the path names in the build.xml file shipped with the HIPI package (used to build the jar file) will be incorrect. For example, I downloaded hadoop-1.2.1, but the HIPI library targets a much older version. Out of the box, the HIPI team provides this:

<project basedir="." default="all">

  <target name="setup">
  <property name="hadoop.home" value="/hadoop/hadoop-0.20.1" />
  <property name="hadoop.version" value="0.20.1" />
  <property name="hadoop.classpath" value="${hadoop.home}/hadoop-${hadoop.version}-core.jar" />
  <property name="metadata.jar" value="3rdparty/metadata-extractor-2.3.1.jar" />
  </target>
  ...

For me, hadoop.classpath was incorrect. I had to change it to:

<target name="setup">
    <property name="hadoop.home" value="/your/path/hadoop-1.2.1" />
    <property name="hadoop.version" value="1.2.1" />
    <property name="hadoop.classpath" value="${hadoop.home}/hadoop-core-${hadoop.version}.jar" />
    <property name="metadata.jar" value="3rdparty/metadata-extractor-2.3.1.jar" />
  </target>

I only needed to move the "core" part of the jar name.
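The renaming reflects how the file name of the Hadoop core jar changed between release lines. As a quick sketch (install path and version here are hypothetical, not from the original answer), the two `hadoop.classpath` patterns expand to:

```shell
# Hypothetical install location and version -- adjust to your setup.
HADOOP_HOME=/your/path/hadoop-1.2.1
HADOOP_VERSION=1.2.1

# Pattern in the stock HIPI build.xml (matches Hadoop 0.20.x naming,
# e.g. hadoop-0.20.1-core.jar):
OLD_JAR="$HADOOP_HOME/hadoop-$HADOOP_VERSION-core.jar"

# Pattern matching the jar that Hadoop 1.x actually ships,
# e.g. hadoop-core-1.2.1.jar:
NEW_JAR="$HADOOP_HOME/hadoop-core-$HADOOP_VERSION.jar"

echo "$OLD_JAR"
echo "$NEW_JAR"
# A quick existence check tells you which naming your install uses:
# ls "$NEW_JAR"
```

If `ls` fails for both patterns, the jar lives somewhere else in your install and `hadoop.classpath` must point there instead.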

After this, you should be able to run:

  

./runDownloader.sh /hdfs/path/to/list.txt /hdfs/path/to/output.hib 100

Assuming you know the paths in your Hadoop file system, this should successfully build using your build.xml file and run the program.
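One side note, beyond what the original answer covers: a ClassNotFoundException for hipi.imagebundle.HipiImageBundle thrown inside the map task usually means the HIPI classes never reached the task JVMs, even if they were on the client's build path. If rebuilding does not fix it, a common workaround is Hadoop's generic `-libjars` option, sketched below with a hypothetical jar path. Note the caveat: `-libjars` is only parsed when the driver implements `Tool`/uses `GenericOptionsParser`, and the warning at the top of the log suggests this Downloader does not, so the jar may instead need to be bundled into Downloader.jar itself (which the HIPI build does).

```shell
# Hypothetical path to the HIPI jar -- adjust to where you built it.
HIPI_JAR=/path/to/hipi-0.0.1.jar

# -libjars ships the listed jars to every task JVM. Shown as an
# assembled command string only; run it on a cluster where the
# driver supports generic options.
CMD="hadoop jar Downloader.jar Downloader -libjars $HIPI_JAR ./hipi/hipi.txt ./hipi/output.hib 1"
echo "$CMD"
```

Alternatively, `jar tf Downloader.jar` lets you confirm whether the hipi/imagebundle classes were actually packed into the job jar.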