We are upgrading from SonarQube 5.6 to 6.7.1. I have created a new folder for 6.7.1 and can run and uninstall SonarQube from it. We added the new plugins from the Marketplace and updated the configuration files for 6.7.1. With the embedded database, everything is fine. However, as soon as we switch it over to SQL Server, the service fails.
Here are the changes to the sonar.properties file:
sonar.jdbc.url=jdbc:sqlserver://[SQL Server FQDN];databaseName=sonar;integratedSecurity=false
sonar.jdbc.username=sonar_user
sonar.jdbc.password=[password]
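In case it helps, here is a minimal standalone check of that same connection string outside of SonarQube (a sketch only; the bracketed host and password are placeholders, and it assumes Microsoft's mssql-jdbc driver jar is on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sanity-checks the sonar.jdbc.* settings independently of SonarQube.
// Assumes Microsoft's mssql-jdbc driver is on the classpath; the host
// and password below are placeholders, not real values.
public class SonarDbCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://[SQL Server FQDN];databaseName=sonar;integratedSecurity=false";
        try (Connection conn = DriverManager.getConnection(url, "sonar_user", "[password]");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT @@VERSION")) {
            if (rs.next()) {
                System.out.println("Connected: " + rs.getString(1));
            }
        }
    }
}

If this fails with a connection error, the problem would be on the SQL Server or network side rather than in SonarQube itself.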
Here is the TRACE-level output from sonar.log:
--> Wrapper Started as Service
Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2018.02.15 09:38:52 INFO app[][o.s.a.AppFileSystem] Cleaning or creating temp directory C:\sonarqube-6.7.1\temp
2018.02.15 09:38:52 TRACE app[][o.s.a.NodeLifecycle] tryToMoveTo from INIT to STARTING => true
2018.02.15 09:38:52 TRACE app[][o.s.a.p.Lifecycle] tryToMoveTo es from INIT to STARTING => true
2018.02.15 09:38:52 INFO app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2018.02.15 09:38:52 INFO app[][o.s.a.p.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [C:\sonarqube-6.7.1\elasticsearch]: C:\Program Files\Java\jre1.8.0_91\bin\java -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+AlwaysPreTouch -server -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -Djdk.io.permissionsUseCanonicalPath=true -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Dlog4j.skipJansi=true -Xms512m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -Delasticsearch -Des.path.home=C:\sonarqube-6.7.1\elasticsearch -cp lib/* org.elasticsearch.bootstrap.Elasticsearch -Epath.conf=C:\sonarqube-6.7.1\temp\conf\es
2018.02.15 09:38:52 TRACE app[][o.s.a.p.Lifecycle] tryToMoveTo es from STARTING to STARTED => true
2018.02.15 09:38:52 INFO app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
2018.02.15 09:38:53 TRACE app[][o.e.p.PluginsService] plugin loaded from classpath [- Plugin information:
Name: org.elasticsearch.transport.Netty4Plugin
Description: classpath plugin
Version: NA
Native Controller: false
* Classname: org.elasticsearch.transport.Netty4Plugin]
2018.02.15 09:38:53 INFO app[][o.e.p.PluginsService] no modules loaded
2018.02.15 09:38:53 INFO app[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [force_merge], size [1], queue size [unbounded]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_started], core [1], max [2], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [listener], size [1], queue size [unbounded]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [index], size [1], queue size [200]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [refresh], core [1], max [1], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [generic], core [4], max [128], keep alive [30s]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [warmer], core [1], max [1], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search], size [2], queue size [1k]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [flush], core [1], max [1], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_store], core [1], max [2], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [management], core [1], max [5], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [get], size [1], queue size [1k]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [bulk], size [1], queue size [200]
2018.02.15 09:38:53 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [snapshot], core [1], max [1], keep alive [5m]
2018.02.15 09:38:53 DEBUG app[][o.e.c.n.IfConfig] configuration:
lo
Software Loopback Interface 1
inet 127.0.0.1 netmask:255.0.0.0 broadcast:127.255.255.255 scope:host
inet6 ::1 prefixlen:128 scope:host
UP MULTICAST LOOPBACK mtu:-1 index:1
net0
WAN Miniport (L2TP)
MULTICAST mtu:-1 index:2
net1
WAN Miniport (SSTP)
MULTICAST mtu:-1 index:3
net2
WAN Miniport (IKEv2)
MULTICAST mtu:-1 index:4
net3
WAN Miniport (PPTP)
MULTICAST mtu:-1 index:5
ppp0
WAN Miniport (PPPOE)
MULTICAST mtu:-1 index:6
eth0
WAN Miniport (IP)
MULTICAST mtu:-1 index:7
eth1
WAN Miniport (IPv6)
MULTICAST mtu:-1 index:8
eth2
WAN Miniport (Network Monitor)
MULTICAST mtu:-1 index:9
eth3
Microsoft Kernel Debug Network Adapter
MULTICAST mtu:-1 index:10
ppp1
RAS Async Adapter
MULTICAST mtu:-1 index:11
eth4
Microsoft Hyper-V Network Adapter
inet 10.50.8.41 netmask:255.255.255.0 broadcast:10.50.8.255 scope:site
inet6 fe80::9407:eb78:32b6:4195 prefixlen:64 scope:link
hardware 00:15:5D:02:44:6E
UP MULTICAST mtu:1500 index:12
net4
Microsoft ISATAP Adapter
inet6 fe80::5efe:a32:829 prefixlen:128 scope:link
hardware 00:00:00:00:00:00:00:E0
POINTOPOINT mtu:1280 index:13
eth5
WAN Miniport (IP)-WFP Native MAC Layer LightWeight Filter-0000
MULTICAST mtu:-1 index:14
eth6
WAN Miniport (IP)-QoS Packet Scheduler-0000
MULTICAST mtu:-1 index:15
eth7
WAN Miniport (IPv6)-WFP Native MAC Layer LightWeight Filter-0000
MULTICAST mtu:-1 index:16
eth8
WAN Miniport (IPv6)-QoS Packet Scheduler-0000
MULTICAST mtu:-1 index:17
eth9
WAN Miniport (Network Monitor)-WFP Native MAC Layer LightWeight Filter-0000
MULTICAST mtu:-1 index:18
eth10
WAN Miniport (Network Monitor)-QoS Packet Scheduler-0000
MULTICAST mtu:-1 index:19
eth11
Microsoft Hyper-V Network Adapter-WFP Native MAC Layer LightWeight Filter-0000
MULTICAST mtu:-1 index:20
eth12
Microsoft Hyper-V Network Adapter-QoS Packet Scheduler-0000
MULTICAST mtu:-1 index:21
eth13
Microsoft Hyper-V Network Adapter-WFP 802.3 MAC Layer LightWeight Filter-0000
MULTICAST mtu:-1 index:22
2018.02.15 09:38:55 TRACE app[][o.e.i.b.HierarchyCircuitBreakerService] parent circuit breaker with settings [parent,type=PARENT,limit=22708224/21.6mb,overhead=1.0]
2018.02.15 09:38:55 TRACE app[][o.e.i.b.request] creating ChildCircuitBreaker with settings [request,type=MEMORY,limit=19464192/18.5mb,overhead=1.0]
2018.02.15 09:38:55 TRACE app[][o.e.i.b.fielddata] creating ChildCircuitBreaker with settings [fielddata,type=MEMORY,limit=19464192/18.5mb,overhead=1.03]
2018.02.15 09:38:55 TRACE app[][o.e.i.b.in_flight_requests] creating ChildCircuitBreaker with settings [in_flight_requests,type=MEMORY,limit=32440320/30.9mb,overhead=1.0]
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Module execution: 66ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] TypeListeners creation: 8ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Scopes creation: 4ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Converters creation: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Binding creation: 4ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Private environment creation: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Injector construction: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Binding initialization: 3ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Binding indexing: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Collecting injection requests: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Binding validation: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Static validation: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Instance member validation: 1ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Provider verification: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Static member injection: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Instance injection: 0ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.i.i.Stopwatch] Preloading singletons: 1ms
2018.02.15 09:38:56 DEBUG app[][o.e.c.t.TransportClientNodesService] node_sampler_interval[5s]
2018.02.15 09:38:56 DEBUG app[][i.n.c.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 2
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] Platform: Windows
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] -Dio.netty.noUnsafe: false
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] Java version: 8
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Buffer.address: available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] direct buffer constructor: available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Bits.unaligned: available, true
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] sun.misc.Unsafe: available
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.tmpdir: C:\Windows\system32\config\systemprofile\AppData\Local\Temp (java.io.tmpdir)
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.noPreferDirect: false
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.maxDirectMemory: 32440320 bytes
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.CleanerJava6] java.nio.ByteBuffer.cleaner(): available
2018.02.15 09:38:56 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.noKeySetOptimization: false
2018.02.15 09:38:56 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
2018.02.15 09:38:56 DEBUG app[][i.n.u.i.PlatformDependent] org.jctools-core.MpscChunkedArrayQueue: available
2018.02.15 09:38:56 TRACE app[][i.n.c.n.NioEventLoop] instrumented a special java.util.Set into: sun.nio.ch.WindowsSelectorImpl@69555c83
2018.02.15 09:38:56 TRACE app[][i.n.c.n.NioEventLoop] instrumented a special java.util.Set into: sun.nio.ch.WindowsSelectorImpl@6cba52b5
2018.02.15 09:38:56 DEBUG app[][o.e.c.t.TransportClientNodesService] adding address [{#transport#-1}{eLdEZAQ9SOatzX-UBU1KBg}{127.0.0.1}{127.0.0.1:9001}]
2018.02.15 09:38:56 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.processId: 804 (auto-detected)
2018.02.15 09:38:56 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv4Stack: false
2018.02.15 09:38:56 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv6Addresses: false
2018.02.15 09:38:56 DEBUG app[][i.netty.util.NetUtil] Loopback interface: lo (Software Loopback Interface 1, 127.0.0.1)
2018.02.15 09:38:56 DEBUG app[][i.netty.util.NetUtil] Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
2018.02.15 09:38:56 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.machineId: 00:15:5d:ff:fe:02:44:6e (auto-detected)
2018.02.15 09:38:56 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
2018.02.15 09:38:56 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.maxRecords: 4
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 0
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 0
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
2018.02.15 09:38:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
2018.02.15 09:38:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.allocator.type: pooled
2018.02.15 09:38:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 65536
2018.02.15 09:38:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
2018.02.15 09:38:57 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xd5120a31] REGISTERED
2018.02.15 09:38:57 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xd5120a31] CONNECT: /127.0.0.1:9001
2018.02.15 09:38:58 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xd5120a31] CLOSE
2018.02.15 09:38:58 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xd5120a31] UNREGISTERED
2018.02.15 09:39:00 DEBUG app[][o.e.c.t.TransportClientNodesService] failed to connect to node [{#transport#-1}{eLdEZAQ9SOatzX-UBU1KBg}{127.0.0.1}{127.0.0.1:9001}], ignoring...
org.elasticsearch.transport.ConnectTransportException: [][127.0.0.1:9001] connect_timeout[30s]
at org.elasticsearch.transport.netty4.Netty4Transport.connectToChannels(Netty4Transport.java:362)
at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:570)
at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:117)
at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:351)
at org.elasticsearch.client.transport.TransportClientNodesService$SimpleNodeSampler.doSample(TransportClientNodesService.java:407)
at org.elasticsearch.client.transport.TransportClientNodesService$NodeSampler.sample(TransportClientNodesService.java:357)
at org.elasticsearch.client.transport.TransportClientNodesService.addTransportAddresses(TransportClientNodesService.java:198)
at org.elasticsearch.client.transport.TransportClient.addTransportAddress(TransportClient.java:319)
at org.sonar.application.process.EsProcessMonitor.addHostToClient(EsProcessMonitor.java:186)
at org.sonar.application.process.EsProcessMonitor.buildTransportClient(EsProcessMonitor.java:177)
at org.sonar.application.process.EsProcessMonitor.getTransportClient(EsProcessMonitor.java:160)
at org.sonar.application.process.EsProcessMonitor.checkStatus(EsProcessMonitor.java:134)
at org.sonar.application.process.EsProcessMonitor.checkOperational(EsProcessMonitor.java:93)
at org.sonar.application.process.EsProcessMonitor.isOperational(EsProcessMonitor.java:78)
at org.sonar.application.process.SQProcess.refreshState(SQProcess.java:162)
at org.sonar.application.process.SQProcess$EventWatcher.run(SQProcess.java:221)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: no further information: /127.0.0.1:9001
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:352)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:632)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at java.lang.Thread.run(Unknown Source)
Caused by: java.net.ConnectException: Connection refused: no further information
... 10 common frames omitted
2018.02.15 09:39:00 DEBUG app[][o.s.a.p.EsProcessMonitor] Connected to Elasticsearch node: [127.0.0.1:9001]
2018.02.15 09:39:01 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x81cb7c2f] REGISTERED
2018.02.15 09:39:01 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x81cb7c2f] CONNECT: /127.0.0.1:9001
2018.02.15 09:39:02 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x81cb7c2f] CLOSE
2018.02.15 09:39:02 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x81cb7c2f] UNREGISTERED
2018.02.15 09:39:02 DEBUG app[][o.e.c.t.TransportClientNodesService] failed to connect to node [{#transport#-1}{eLdEZAQ9SOatzX-UBU1KBg}{127.0.0.1}{127.0.0.1:9001}], ignoring...
org.elasticsearch.transport.ConnectTransportException: [][127.0.0.1:9001] connect_timeout[30s]
at org.elasticsearch.transport.netty4.Netty4Transport.connectToChannels(Netty4Transport.java:362)
at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:570)
at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:117)
at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:351)
at org.elasticsearch.client.transport.TransportClientNodesService$SimpleNodeSampler.doSample(TransportClientNodesService.java:407)
at org.elasticsearch.client.transport.TransportClientNodesService$NodeSampler.sample(TransportClientNodesService.java:357)
at org.elasticsearch.client.transport.TransportClientNodesService$ScheduledNodeSampler.run(TransportClientNodesService.java:390)
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:569)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: no further information: /127.0.0.1:9001
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:352)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:632)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
... 1 common frames omitted
Caused by: java.net.ConnectException: Connection refused: no further information
... 10 common frames omitted
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd] REGISTERED
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd] CONNECT: /127.0.0.1:9001
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] ACTIVE
2018.02.15 09:39:07 TRACE app[][o.e.i.b.request] [request] Adjusted breaker by [16440] bytes, now [16440]
2018.02.15 09:39:07 DEBUG app[][i.n.b.AbstractByteBuf] -Dio.netty.buffer.bytebuf.checkAccessible: true
2018.02.15 09:39:07 DEBUG app[][i.n.u.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@2662f46f
2018.02.15 09:39:07 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxCapacityPerThread: 32768
2018.02.15 09:39:07 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxSharedCapacityFactor: 2
2018.02.15 09:39:07 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.linkCapacity: 16
2018.02.15 09:39:07 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.ratio: 8
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] [length: 39, request id: 1, type: request, version: 5.0.0, action: internal:tcp/handshake] WRITE: 45B
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] FLUSH
2018.02.15 09:39:07 TRACE app[][o.e.i.b.request] [request] Adjusted breaker by [-16440] bytes, now [0]
2018.02.15 09:39:07 TRACE app[][o.e.t.T.tracer] [1][internal:tcp/handshake] sent to [{#transport#-1}{eLdEZAQ9SOatzX-UBU1KBg}{127.0.0.1}{127.0.0.1:9001}] (timeout: [null])
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] [length: 19, request id: 1, type: response, version: 5.0.0] READ: 25B
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] READ COMPLETE
2018.02.15 09:39:07 TRACE app[][o.e.i.b.request] [request] Adjusted breaker by [16440] bytes, now [16440]
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] [length: 47, request id: 2, type: request, version: 5.6.3, action: cluster:monitor/nodes/liveness] WRITE: 53B
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] FLUSH
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] [length: 150, request id: 2, type: response, version: 5.6.3] READ: 156B
2018.02.15 09:39:07 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0xe776a2fd, L:/127.0.0.1:51152 - R:/127.0.0.1:9001] READ COMPLETE
2018.02.15 09:39:07 TRACE app[][o.e.i.b.request] [request] Adjusted breaker by [-16440] bytes, now [0]
... [snipped due to length for post] ...
2018.02.15 09:39:20 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x5f03831f, L:/127.0.0.1:51162 ! R:/127.0.0.1:9001] UNREGISTERED
<-- Wrapper Stopped
Any help you can provide would be greatly appreciated!
Thanks!