I am trying to get Solr 4.7.2 to work with HDFS (using Hadoop 2.4.1).
Here is the solrconfig.xml:
<?xml version='1.0' encoding='UTF-8' ?>
<config>
  <luceneMatchVersion>LUCENE_47</luceneMatchVersion>
  <lockType>hdfs</lockType>
  <unlockOnStartup>true</unlockOnStartup>
  <writeLockTimeout>20000</writeLockTimeout>
  <commitLockTimeout>10000</commitLockTimeout>
  <directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
    <str name="solr.hdfs.home">hdfs://localhost:54310/solr</str>
    <bool name="solr.hdfs.blockcache.enabled">true</bool>
    <int name="solr.hdfs.blockcache.slab.count">1</int>
    <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
    <int name="solr.hdfs.blockcache.blocksperbank">16384</int>
    <bool name="solr.hdfs.blockcache.read.enabled">true</bool>
    <bool name="solr.hdfs.blockcache.write.enabled">true</bool>
    <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
    <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">16</int>
    <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">192</int>
  </directoryFactory>
  <requestHandler name='standard' class='solr.StandardRequestHandler' default='true' />
  <requestHandler name='/update' class='solr.UpdateRequestHandler' />
  <requestHandler name='/admin/' class='org.apache.solr.handler.admin.AdminHandlers' />
  <admin>
    <defaultQuery>*:*</defaultQuery>
  </admin>
</config>
After starting the Solr server, it throws this exception:
hdp1: org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Error opening new searcher
I tried to debug it and found the following error in the Jetty log:
Caused by: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out:
NativeFSLock@hdfs:/localhost:54310/solr/hdp1/data/index/HdfsDirectory@1d5821fa lockFactory=NativeFSLockFactory@hdfs:/localhost:54310/solr/hdp1/data/index-write.lock: java.io.FileNotFoundException: hdfs:/localhost:54310/solr/hdp1/data/index/HdfsDirectory@1d5821fa lockFactory=NativeFSLockFactory@hdfs:/localhost:54310/solr/hdp1/data/index-write.lock (No such file or directory)
It is able to create the directory on HDFS, but somehow fails when creating/accessing the index.
Answer 0 (score: 0)
You need to set -Dsolr.lock.type=hdfs.
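A minimal sketch of how that property can be passed, assuming the Solr 4.x example layout started with the bundled Jetty (start.jar); the HDFS URL below is taken from the configuration in the question:

# Start the Solr 4.x example with the HDFS lock type as a JVM system property
cd example
java -Dsolr.lock.type=hdfs -jar start.jar

# The "Solr on HDFS" documentation passes the other HDFS settings the same way
# instead of hard-coding them in solrconfig.xml; they are redundant here because
# the directory factory and solr.hdfs.home are already set in the config above:
# java -Dsolr.directoryFactory=HdfsDirectoryFactory \
#      -Dsolr.lock.type=hdfs \
#      -Dsolr.data.dir=hdfs://localhost:54310/solr \
#      -Dsolr.updatelog=hdfs://localhost:54310/solr \
#      -jar start.jar

Note that the system property only takes effect if solrconfig.xml reads it: the stock example configs reference it as <lockType>${solr.lock.type:native}</lockType> inside <indexConfig>. So an alternative is to move the <lockType>hdfs</lockType> element in the config above into an <indexConfig> block rather than leaving it at the top level, which would explain why the server is still falling back to NativeFSLockFactory in the stack trace.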