I am currently running the indexing process on the Terrier v3.5 platform with the Hadoop MapReduce framework, and I have a problem with the connection between them, so I need your help to figure out how to configure Terrier with Hadoop correctly. This is what I get:
hduser@master-HP:/home/hpmaster/Bureau/workspacefe/mossiba/bin$ ./trec_terrier.sh -i -H
Setting TERRIER_HOME to /home/hpmaster/Bureau/workspacefe/mossiba
Setting JAVA_HOME to /usr
INFO - Term-partitioned Mode, 26 reducers creating one inverted index.
INFO - Copying terrier share/ directory (/home/hpmaster/Bureau/workspacefe/mossiba/share) to shared storage area (hdfs://master:54310/tmp/1235281319-terrier.share)
INFO - Copying classpath to job
org.terrier.utility.io.WrappedIOException: Cannot HadoopUtility.makeTerrierJob
org.terrier.utility.io.WrappedIOException: Cannot HadoopUtility.makeTerrierJob
at org.terrier.utility.io.HadoopUtility.makeTerrierJob(HadoopUtility.java:176)
at org.terrier.utility.io.HadoopPlugin$JobFactory.makeTerrierJob(HadoopPlugin.java:122)
at org.terrier.utility.io.HadoopPlugin$DirectJobFactory.newJob(HadoopPlugin.java:137)
at org.terrier.applications.HadoopIndexing.main(HadoopIndexing.java:160)
at org.terrier.applications.TrecTerrier.run(TrecTerrier.java:371)
at org.terrier.applications.TrecTerrier.applyOptions(TrecTerrier.java:564)
at org.terrier.applications.TrecTerrier.main(TrecTerrier.java:235)
Caused by: java.io.FileNotFoundException: File file:/usr/lib/tools.jar does not exist.
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:372)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:192)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1231)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1207)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1179)
at org.terrier.utility.io.HadoopUtility.saveClassPathToJob(HadoopUtility.java:241)
at org.terrier.utility.io.HadoopUtility.makeTerrierJob(HadoopUtility.java:174)
... 6 more
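Judging from the FileNotFoundException, JAVA_HOME resolves to /usr, so when Terrier copies the classpath to the job it looks for $JAVA_HOME/lib/tools.jar at /usr/lib/tools.jar, which does not exist there. The sketch below summarizes the environment as the script reports it; the JDK path in the commented line is only a hypothetical example of where tools.jar might actually live on a machine like mine, not a confirmed fix:

```shell
# Environment as picked up by trec_terrier.sh (taken from the log above):
export TERRIER_HOME=/home/hpmaster/Bureau/workspacefe/mossiba
export JAVA_HOME=/usr   # Terrier then expects $JAVA_HOME/lib/tools.jar

# Hypothetical: pointing JAVA_HOME at a full JDK directory instead,
# so that $JAVA_HOME/lib/tools.jar exists (exact path varies by install):
# export JAVA_HOME=/usr/lib/jvm/java-6-sun

# Re-run Hadoop-based indexing after adjusting the environment:
./bin/trec_terrier.sh -i -H
```

Is pointing JAVA_HOME at the JDK directory the right way to configure this, or is there a Terrier property I should set instead?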