Option 'schema' not specified when setting up WSO2 AM 1.10.x with DAS 3.1.0

Date: 2017-05-26 19:36:58

Tags: wso2 wso2-am wso2-das

I am trying to set up WSO2 API Manager 1.10.0 with DAS 3.1.0. DAS will use MySQL 5.7.18. I ran mysql5.7.sql from the DAS package to create the database schema in MySQL. I also downloaded mysql-connector-java-5.1.35-bin.jar and copied it into the repository/components/lib directory.

I enabled Configure Analytics in API Manager and saved the configuration successfully. I can see that API Manager communicates with DAS without any problems.

However, in the DAS carbon log I see an exception like this:

TID: [-1234] [] [2017-05-26 15:30:00,368] ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} -  Error while executing the scheduled task for the script: APIM_STAT_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask}
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query create temporary table APIRequestSummaryData using CarbonJDBC options (dataSource "WSO2AM_STATS_DB", tableName "API_REQUEST_SUMMARY")
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
    at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Option 'schema' not specified
    at scala.sys.package$.error(package.scala:27)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.getOrElse(ddl.scala:150)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:113)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:760)
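For context, the error comes from the `create temporary table ... using CarbonJDBC` statement in the APIM_STAT_SCRIPT analytics script: in DAS 3.1.0 the CarbonJDBC relation provider requires a `schema` option, which the script shipped for older versions does not supply. A sketch of what such a definition would need to look like is below; the column list is illustrative only, not the real API_REQUEST_SUMMARY definition, which must match the table created by the stats DB scripts:

```sql
-- Sketch of a DAS 3.1.0 CarbonJDBC temporary-table definition.
-- The "schema" option (required in 3.1.0) maps Spark columns to the
-- underlying RDBMS table; the columns shown here are placeholders.
CREATE TEMPORARY TABLE APIRequestSummaryData
USING CarbonJDBC
OPTIONS (
    dataSource "WSO2AM_STATS_DB",
    tableName  "API_REQUEST_SUMMARY",
    schema     "api STRING, api_version STRING, total_request_count INT"
);
```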

How can I fix this? Thanks.

2 Answers:

Answer 0 (score: 1)

API Manager 1.10 and DAS 3.1.0 are not compatible with each other. It will not work unless you customize the databases and CApps.

You can use API Manager 2.1 with DAS 3.1.0, or API Manager 1.10 with DAS 3.0.x.

Answer 1 (score: 0)

It turned out that I needed to import the correct schema-creation scripts from the /dbscripts/stat/sql folder into the DAS database I had set up here.
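For example, the import can be done from the mysql client; the path and database name below are assumptions based on a typical setup, so adjust them to your environment:

```sql
-- Run from the mysql client as a user with DDL rights on the stats DB.
-- The script path follows the usual distribution layout; adjust as needed.
USE WSO2AM_STATS_DB;
SOURCE /path/to/dbscripts/stat/sql/mysql.sql;
```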