WSO2 DAS 3.0.0 with API Manager 1.9.0 not working

Date: 2015-12-08 07:54:41

Tags: oracle wso2 wso2-am wso2-das

I am trying to replace BAM with DAS 3.0.0 for WSO2 API Manager 1.9.0/1.9.1, using Oracle for the WSO2AM_STATS_DB.

I am following http://blog.rukspot.com/2015/09/publishing-apim-runtime-statistics-to.html

In the DAS carbon console I can see the data in the Data Explorer tables ORG_WSO2_APIMGT_STATISTICS_REQUEST and ORG_WSO2_APIMGT_STATISTICS_RESPONSE.

But the data is not stored in Oracle, so I cannot see the statistics in the API Manager publisher. It keeps saying "Data publishing is enabled. Generate some traffic to see statistics."

I am getting the following error in the logs:

[2015-12-08 13:00:00,022]  INFO {org.wso2.carbon.analytics.spark.core.AnalyticsTask} -  Executing the schedule task for: APIM_STAT_script for tenant id: -1234
[2015-12-08 13:00:00,037]  INFO {org.wso2.carbon.analytics.spark.core.AnalyticsTask} -  Executing the schedule task for: Throttle_script for tenant id: -1234
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/xerial/snappy/SnappyInputStream
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:274)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
        at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
        at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
        at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
        at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
        at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1291)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:874)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:815)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:799)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1426)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.lang.ClassNotFoundException: org.xerial.snappy.SnappyInputStream cannot be found by spark-core_2.10_1.4.1.wso2v1
        at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
        at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
        at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
        at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 15 more
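The "Caused by" line shows that the embedded Spark bundle (spark-core_2.10_1.4.1.wso2v1) cannot resolve org.xerial.snappy.SnappyInputStream, which suggests no snappy-java jar is visible to it. As a minimal diagnostic sketch (DAS_HOME and the default path are assumptions — point them at your actual installation), you can check whether any snappy-java jar exists anywhere under the DAS install:

```shell
# Sketch: count snappy-java jars under the DAS installation.
# DAS_HOME is an assumed variable; adjust it to your environment.
DAS_HOME="${DAS_HOME:-$PWD/wso2das-3.0.0}"
matches=$(find "$DAS_HOME" -name 'snappy-java*.jar' 2>/dev/null | wc -l)
echo "snappy-java jars found: $matches"
```

If this reports 0, the NoClassDefFoundError above is expected, and adding the jar (as the answer below describes) is the likely fix.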

Am I missing something? Can someone help me solve this problem?

Thanks in advance.

1 Answer:

Answer 0 (score: 4)

Move all the libraries (jars) into the project's /WEB-INF/lib. All libraries/jars under /WEB-INF/lib will then be on the classpath.

Add the snappy-java jar file, and it works fine.
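As a sketch of that fix (all paths and the jar version here are assumptions — adjust them to your installation; for an OSGi-based WSO2 server, repository/components/lib is the usual drop-in directory for non-OSGi jars), copy the snappy-java jar into the server's lib directory and restart:

```shell
# Sketch, assuming DAS_HOME points at the WSO2 DAS 3.0.0 install and a
# snappy-java jar (version is an example) sits in the current directory.
DAS_HOME="${DAS_HOME:-$PWD/wso2das-3.0.0}"
LIB="$DAS_HOME/repository/components/lib"
mkdir -p "$LIB"
if cp snappy-java-1.1.1.7.jar "$LIB/" 2>/dev/null; then
  echo "copied snappy-java into $LIB"
else
  echo "snappy-java jar not found in $PWD - download it first"
fi
```

After copying the jar, restart the server so the class becomes visible to the Spark bundle.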