Embedded Elasticsearch fails to start, causing SonarQube startup failure: 'Connection refused: /0:0:0:0:0:0:0:1:9001'

Date: 2019-03-27 06:56:28

Tags: elasticsearch, sonarqube

SonarQube is started as the sonarqube OS account, but it fails to start because the embedded Elasticsearch fails to come up. I configured SonarQube following the official recommendations and stopped the firewall, but it still would not start. I tried to capture the ES connection with tcpdump but found nothing. With the log level set to TRACE, I can see that ES fails with 'connection refused'.
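
For reference, a capture along these lines would be the usual way to watch that traffic (a sketch; the interface and port are my assumptions, 9001 being SonarQube's default embedded search port):

# hypothetical capture on the loopback interface; no packets were seen
tcpdump -i lo -nn port 9001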

Environment: CentOS 6.8, SonarQube 7.6

OS settings:

/etc/sysctl.conf

vm.max_map_count = 262144
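
Editing /etc/sysctl.conf alone does not change the running kernel; a minimal sketch to reload the file and verify the live value:

# reload /etc/sysctl.conf and confirm the setting took effect (run as root)
sysctl -p
sysctl vm.max_map_count   # expected: vm.max_map_count = 262144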

/etc/security/limits.conf

sonarqube  hard nofile 65536
sonarqube  soft nofile 65536
sonarqube  hard nproc 2048
sonarqube  soft nproc 2048
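
limits.conf only applies to new login sessions, so it is worth checking what the sonarqube account actually gets; a minimal sketch:

# print the effective open-file and process limits for the sonarqube user
su - sonarqube -c 'ulimit -n; ulimit -u'   # expect 65536 and 2048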

SonarQube settings (sonar.properties):

sonar.jdbc.username=root
sonar.jdbc.password=123456
sonar.jdbc.url=jdbc:mysql://10.12.34.22:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance&useSSL=false
sonar.web.javaOpts=-Xmx2048m -Xms2048m -XX:+HeapDumpOnOutOfMemoryError
sonar.search.javaOpts=-Xms4096m -Xmx4096m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true
sonar.search.javaAdditionalOpts=-Djava.net.preferIPv4Stack=true -Dbootstrap.system_call_filter=false -Dbootstrap.memory_lock=false
sonar.log.level.app=TRACE
sonar.log.level.web=TRACE
sonar.log.level.ce=TRACE
sonar.log.level.es=TRACE
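
Note that -Djava.net.preferIPv4Stack=true appears in both sonar.search.javaOpts and sonar.search.javaAdditionalOpts, which should keep the ES JVM off IPv6 sockets entirely. A sketch to confirm which addresses the embedded ES actually listens on:

# list listeners on the embedded search port (9001 is the SonarQube default)
netstat -tlnp | grep 9001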

sonar.log

2019.03.26 17:32:17 WARN  app[][o.s.a.p.AbstractProcessMonitor] Process exited with exit value [es]: 143
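
Exit value 143 is 128 + 15, i.e. the ES process ended on SIGTERM; most likely the SonarQube launcher stopped it after it failed to become operational, so this line is a consequence of the failure rather than its cause. Bash can map the status back to a signal name:

# 143 - 128 = 15 (SIGTERM)
kill -l 143   # prints: TERM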

es.log

2019.03.26 16:22:23 TRACE es[][o.e.n.Node] observer: sampled state rejected by predicate (cluster uuid: _na_
version: 0
state uuid: mcJmnjowSf-vFWNWtfjVHQ
from_diff: false
meta data version: 0
blocks: 
   _global_:
      1,state not recovered / initialized, blocks READ,WRITE,METADATA_READ,METADATA_WRITE      2,no master, blocks WRITE,METADATA_WRITE
nodes: 
   {sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}, local
routing_table (version 0):
routing_nodes:
-----node_id[QNyovcldTryY5TaZOf9U0g][V]
---- unassigned
). adding listener to ClusterService
2019.03.26 16:22:23 DEBUG es[][o.e.c.s.ClusterService] processing [initial_join]: took [4ms] no change in cluster_state
2019.03.26 16:22:23 TRACE es[][o.e.n.Node] observer: postAdded - predicate rejected state (cluster uuid: _na_
version: 0
state uuid: mcJmnjowSf-vFWNWtfjVHQ
from_diff: false
meta data version: 0
blocks: 
   _global_:
      1,state not recovered / initialized, blocks READ,WRITE,METADATA_READ,METADATA_WRITE      2,no master, blocks WRITE,METADATA_WRITE
nodes: 
   {sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}, local
routing_table (version 0):
routing_nodes:
-----node_id[QNyovcldTryY5TaZOf9U0g][V]
---- unassigned
)
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] resolved host [127.0.0.1] to [127.0.0.1:9001]
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] resolved host [[::1]] to [[::1]:9001]
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] [1] sending to {sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] [1] opening connection to [{#zen_unicast_[::1]_0#}{lfCkZQIPT2yqfF8LAU3TIg}{0:0:0:0:0:0:0:1}{[::1]:9001}]
2019.03.26 16:22:23 TRACE es[][o.e.t.T.tracer] [1][internal:discovery/zen/unicast] sent to [{sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}] (timeout: [3.7s])
2019.03.26 16:22:23 TRACE es[][o.e.t.T.tracer] [1][internal:discovery/zen/unicast] received request
2019.03.26 16:22:23 TRACE es[][o.e.t.TaskManager] register 1 [direct] [internal:discovery/zen/unicast] []
2019.03.26 16:22:23 TRACE es[][o.e.t.TaskManager] unregister task for id: 1
2019.03.26 16:22:23 TRACE es[][o.e.t.T.tracer] [1][internal:discovery/zen/unicast] sent response
2019.03.26 16:22:23 TRACE es[][o.e.t.T.tracer] [1][internal:discovery/zen/unicast] received response from [{sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}]
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] [1] received response from {sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}: [ping_response{node [{sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}], id[1], master [null],cluster_state_version [-1], cluster_name[sonarqube]}, ping_response{node [{sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}], id[2], master [null],cluster_state_version [-1], cluster_name[sonarqube]}]
2019.03.26 16:22:23 TRACE es[][o.e.t.n.ESLoggingHandler] [id: 0x6a453264] REGISTERED
2019.03.26 16:22:23 TRACE es[][o.e.t.n.ESLoggingHandler] [id: 0x6a453264] CONNECT: /0:0:0:0:0:0:0:1:9001
2019.03.26 16:22:23 TRACE es[][o.e.t.n.ESLoggingHandler] [id: 0x6a453264] CLOSE
2019.03.26 16:22:23 TRACE es[][o.e.t.n.ESLoggingHandler] [id: 0x6a453264] UNREGISTERED
2019.03.26 16:22:23 TRACE es[][o.e.d.z.UnicastZenPing] [1] failed to ping {#zen_unicast_[::1]_0#}{lfCkZQIPT2yqfF8LAU3TIg}{0:0:0:0:0:0:0:1}{[::1]:9001}
org.elasticsearch.transport.ConnectTransportException: [][[::1]:9001] connect_timeout[3s]
    at org.elasticsearch.transport.netty4.Netty4Transport.connectToChannels(Netty4Transport.java:362) ~[?:?]
    at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:570) ~[elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:117) ~[elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:351) ~[elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.discovery.zen.UnicastZenPing$PingingRound.getOrConnect(UnicastZenPing.java:398) ~[elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.discovery.zen.UnicastZenPing$3.doRun(UnicastZenPing.java:507) [elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:638) [elasticsearch-5.6.3.jar:5.6.3]
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-5.6.3.jar:5.6.3]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_201]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_201]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_201]
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /0:0:0:0:0:0:0:1:9001
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:?]
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:352) ~[?:?]
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:632) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:544) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:498) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458) ~[?:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[?:?]
    ... 1 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:?]
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:352) ~[?:?]
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:632) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:544) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:498) ~[?:?]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458) ~[?:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[?:?]
    ... 1 more
2019.03.26 16:22:24 TRACE es[][o.e.d.z.UnicastZenPing] [1] sending to {sonarqube}{QNyovcldTryY5TaZOf9U0g}{bOIXQl_KQumKoaXIOxvFMg}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}
2019.03.26 16:22:24 TRACE es[][o.e.t.T.tracer] [2][internal:discovery/zen/unicast] sent to
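
Reading the trace: UnicastZenPing resolves both 127.0.0.1 and [::1] to port 9001 (likely the two localhost entries from /etc/hosts), the IPv4 ping round-trips fine, but the connection to the IPv6 loopback is refused because nothing is listening there. A hedged sketch to confirm (nc option syntax varies between implementations):

# the [::1] unicast host probably comes from the localhost mapping in /etc/hosts
grep -w localhost /etc/hosts
# probe both loopback addresses on the embedded ES port
nc -zv 127.0.0.1 9001   # expected to connect
nc -zv ::1 9001         # expected: Connection refused, matching the trace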

0 Answers:

No answers yet