EOFException when running Nutch generate on Hadoop

Date: 2015-05-18 08:48:09

Tags: hadoop nutch

I am running Nutch 2.3 on Hadoop 2.5.2, with HBase 0.98.12 over Gora 0.6. When the Nutch generate step executes, Hadoop throws an EOFException. Any suggestions are welcome.

  

2015-05-18 15:22:06,578 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1362)) - map 100% reduce 0%
2015-05-18 15:22:13,697 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1362)) - map 100% reduce 50%
2015-05-18 15:22:14,720 INFO [main] mapreduce.Job (Job.java:printTaskEvents(1441)) - Task Id : attempt_1431932258783_0006_r_000001_0, Status : FAILED
Error: java.io.EOFException
    at org.apache.avro.io.BinaryDecoder.ensureBounds(BinaryDecoder.java:473)
    at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:128)
    at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:423)
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:229)
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
    at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:206)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:177)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:148)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:139)
    at org.apache.hadoop.io.serializer.avro.AvroSerialization$AvroDeserializer.deserialize(AvroSerialization.java:127)
    at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:146)
    at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
    at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:302)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

     

2015-05-18 15:22:21,901 INFO [main] mapreduce.Job (Job.java:printTaskEvents(1441)) - Task Id : attempt_1431932258783_0006_r_000001_1, Status : FAILED
Error: java.io.EOFException (identical stack trace to attempt _0 above)

     

2015-05-18 15:22:28,986 INFO [main] mapreduce.Job (Job.java:printTaskEvents(1441)) - Task Id : attempt_1431932258783_0006_r_000001_2, Status : FAILED
Error: java.io.EOFException (identical stack trace to attempt _0 above)

     

2015-05-18 15:22:37,078 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1362)) - map 100% reduce 100%
2015-05-18 15:22:37,109 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1375)) - Job job_1431932258783_0006 failed with state FAILED due to: Task failed task_1431932258783_0006_r_000001
Job failed as tasks failed. failedMaps:0 failedReduces:1

     

2015-05-18 15:22:37,256 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1380)) - Counters: 50
    File System Counters
        FILE: Number of bytes read=22
        FILE: Number of bytes written=232081
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=612
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=1
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
    Job Counters
        Failed reduce tasks=4
        Launched map tasks=1
        Launched reduce tasks=5
        Rack-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=10399
        Total time spent by all reduces in occupied slots (ms)=23225
        Total time spent by all map tasks (ms)=10399
        Total time spent by all reduce tasks (ms)=23225
        Total vcore-seconds taken by all map tasks=10399
        Total vcore-seconds taken by all reduce tasks=23225
        Total megabyte-seconds taken by all map tasks=10648576
        Total megabyte-seconds taken by all reduce tasks=23782400
    Map-Reduce Framework
        Map input records=1
        Map output records=1
        Map output bytes=32
        Map output materialized bytes=62
        Input split bytes=612
        Combine input records=0
        Combine output records=0
        Reduce input groups=0
        Reduce shuffle bytes=14
        Reduce input records=0
        Reduce output records=0
        Spilled Records=1
        Shuffled Maps=1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=175
        CPU time spent (ms)=6860
        Physical memory (bytes) snapshot=628305920
        Virtual memory (bytes) snapshot=3198902272
        Total committed heap usage (bytes)=481820672
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=0
2015-05-18 15:22:37,266 ERROR [main] crawl.GeneratorJob (GeneratorJob.java:run(310)) - GeneratorJob: java.lang.RuntimeException: job failed: name=[t2]generate: 1431933684-12185, jobid=job_1431932258783_0006
    at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:54)
    at org.apache.nutch.crawl.GeneratorJob.run(GeneratorJob.java:213)
    at org.apache.nutch.crawl.GeneratorJob.generate(GeneratorJob.java:241)
    at org.apache.nutch.crawl.GeneratorJob.run(GeneratorJob.java:308)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.nutch.crawl.GeneratorJob.main(GeneratorJob.java:316)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

     

Error running: /usr/pro/nutch2.3/deploy/bin/nutch generate -D mapred.reduce.tasks=2 -D mapred.child.java.opts=-Xmx1000m -D mapred.reduce.tasks.speculative.execution=false -D mapred.map.tasks.speculative.execution=false -D mapred.compress.map.output=true -topN 50000 -noNorm -noFilter -adddays 0 -crawlId t2 -batchId 1431933684-12185

2 Answers:

Answer 0 (score: 4)

I ran into exactly the same problem with the same configuration. My issue was solved by adding the following property
<property>
  <name>io.serializations</name>
  <value>org.apache.hadoop.io.serializer.WritableSerialization</value>
  <description>A list of serialization classes that can be used for
  obtaining serializers and deserializers.</description>
</property>

to nutch-site.xml. Thanks to http://quabr.com/26180364/cant-run-nutch2-on-hadoop2-nutch-2-x-hadoop-2-4-0-hbase-0-94-18-gora-0-5
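The underlying mechanism (as the stack trace suggests) is a serialization mismatch: the reducer input was written in one wire format but Hadoop picked Avro's deserializer, which reads past the end of the buffer and hits EOF. The toy sketch below is not Nutch or Avro code; it only illustrates, with stdlib streams, how a reader that expects a different encoding than the writer produced ends in an EOFException:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.IOException;

public class EofDemo {
    // Returns true when a reader expecting 8 bytes hits end-of-stream
    // after the writer emitted only 4 -- the same class of failure as a
    // deserializer decoding data written in a different serialization format.
    static boolean readerOverruns() {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            new DataOutputStream(buf).writeInt(42);        // writer side: 4 bytes
            DataInputStream in = new DataInputStream(
                    new ByteArrayInputStream(buf.toByteArray()));
            in.readLong();                                 // reader side: demands 8 bytes
            return false;
        } catch (EOFException e) {
            return true;                                   // ran off the end of the stream
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("EOFException raised: " + readerOverruns());
    }
}
```

Pinning io.serializations to WritableSerialization makes the reducer decode the keys with the same format they were written in, which is why the property above fixes the job.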

Answer 1 (score: 0)

Following this process may solve your problem!

Edit ivy.xml (note: this is a very important step):

<dependency org="org.apache.gora" name="gora-hbase" rev="0.6.1" conf="*->default" />

<dependency org="org.apache.solr" name="solr-solrj" rev="4.1.0" conf="*->default" />

Add this line:

<dependency org="org.apache.hbase" name="hbase-common" rev="0.98.8-hadoop2" conf="*->default" />

Go to Stack/apache-nutch-2.3.1/conf and edit gora.properties:

gora.datastore.default=org.apache.gora.hbase.store.HBaseStore

Edit hbase.xml:

<configuration>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <!-- Here you have to set the path where you want HBase to store its built-in ZooKeeper files. -->
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>hdfs://localhost:9000/zookeeper</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>

Edit nutch-site.xml:

<configuration>
  <property>
    <name>http.agent.name</name>
    <value>NutchSpider</value>
  </property>
  <property>
    <name>storage.data.store.class</name>
    <value>org.apache.gora.hbase.store.HBaseStore</value>
    <description>Default class for storing data</description>
  </property>
  <property>
    <name>plugin.includes</name>
    <value>protocol-http|urlfilter-regex|parse-(html|tika)|index-(basic|anchor)|indexer-solr|scoring-opic|urlnormalizer-(pass|regex|basic)</value>
  </property>
</configuration>
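Since a single typo in these name/value pairs silently breaks the store binding, a quick sanity check can help. The sketch below is a hypothetical stdlib-only helper (not part of Nutch) that parses a Hadoop-style configuration XML and returns the value registered under a given property name:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class NutchConfCheck {
    // Hypothetical helper: returns the <value> for a given <name> inside a
    // Hadoop-style configuration XML string, or null if the property is absent.
    static String lookup(String xml, String wanted) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String name = p.getElementsByTagName("name").item(0).getTextContent().trim();
            if (name.equals(wanted)) {
                return p.getElementsByTagName("value").item(0).getTextContent().trim();
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        String conf = "<configuration><property>"
                + "<name>storage.data.store.class</name>"
                + "<value>org.apache.gora.hbase.store.HBaseStore</value>"
                + "</property></configuration>";
        System.out.println(lookup(conf, "storage.data.store.class"));
    }
}
```

Pointing it at your edited nutch-site.xml should report org.apache.gora.hbase.store.HBaseStore for storage.data.store.class before you rebuild.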

Clean and rebuild Nutch: run ant clean, then ant runtime.