I am setting up WSO2 API Manager 1.10.x with DAS 3.0.1 to publish API statistics using MySQL. My API Manager deployment has a cluster of gateway worker nodes on separate VMs. I enabled API Manager analytics through the UI following this document: http://mail.wso2.org/mailarchive/dev/2016-March/060905.html and I also manually enabled analytics for the gateway worker nodes following this document: http://blog.rukspot.com/2016/05/configure-wso2-apim-analytics-using-xml.html
After completing the setup I restarted all the servers and everything came up fine. However, when I send requests to an API, the gateway worker logs show no statistics being published to the DAS receiver, and there is no data in the DAS summary tables either. What do I need to do to get the API Manager gateway worker nodes to publish statistics to DAS? Am I missing something in the configuration?
I do see the following exception in DAS (which I suspect is related to the gateway worker nodes not publishing statistics):
[2017-05-31 17:02:46,660] INFO {org.wso2.carbon.event.processor.manager.core.internal.CarbonEventManagementService} - Starting polling event receivers
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/xerial/snappy/SnappyInputStream
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1292)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:874)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:815)
at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1429)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1421)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.ClassNotFoundException: org.xerial.snappy.SnappyInputStream cannot be found by spark-core_2.10_1.4.2.wso2v1
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Configuration (api-manager.xml):
<APIUsageTracking>
    <Enabled>true</Enabled>
    <DASServerURL>{tcp://10.14.3.93:7614}</DASServerURL>
    <DASRestApiURL>10.14.3.93:9446</DASRestApiURL>
    <SkipEventReceiverConnection>false</SkipEventReceiverConnection>
    <PublisherClass>org.wso2.carbon.apimgt.usage.publisher.APIMgtUsageDataBridgeDataPublisher</PublisherClass>
    <PublishResponseMessageSize>false</PublishResponseMessageSize>
</APIUsageTracking>
Answer 0 (score: 0)
It looks like the org.xerial.snappy.snappy-java_1.1.1.7.jar in the plugins directory has a problem with its OSGi headers. Download the jar from the Maven repository, copy it to the {DAS_HOME}/repository/components/lib directory, and restart the server. (Jars placed in components/lib are converted to OSGi bundles by the server on startup, which works around the broken headers of the bundled copy.)
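The steps above can be sketched as a short shell script. This is a minimal sketch, not an official procedure: the DAS_HOME path is a hypothetical example you must adjust to your installation, and the download URL assumes the standard Maven Central repository layout for snappy-java 1.1.1.7 (the version named in the answer).

```shell
#!/bin/sh
# Hypothetical install location -- change to match your environment.
DAS_HOME=/opt/wso2das-3.0.1

VERSION=1.1.1.7
JAR="snappy-java-${VERSION}.jar"
# Standard Maven Central layout: groupId org.xerial.snappy, artifactId snappy-java.
JAR_URL="https://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/${VERSION}/${JAR}"
TARGET_DIR="${DAS_HOME}/repository/components/lib"

# Fetch the jar and place it where DAS converts plain jars into OSGi bundles.
curl -fsSL -o "${TARGET_DIR}/${JAR}" "${JAR_URL}"

# Restart DAS so the new bundle is picked up.
"${DAS_HOME}/bin/wso2server.sh" restart
```

After the restart, the `NoClassDefFoundError: org/xerial/snappy/SnappyInputStream` should no longer appear when the Spark scripts run.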