Getting an empty string as the version in an Analytics temporary table in WSO2 AM Analytics

Posted: 2018-08-07 09:25:16

Tags: analytics wso2-am

When the "APIM_LATENCY_BREAKDOWN_STATS" script runs in WSO2 AM Analytics 2.1.0, it fails with the following error:

ERROR {org.apache.spark.executor.Executor} -  Exception in task 0.0 in stage 178869.0 (TID 161807) {org.apache.spark.executor.Executor} java.sql.BatchUpdateException: ORA-01400: cannot insert NULL into ("WSO2_STATDB"."API_EXE_TME_DAY_SUMMARY"."VERSION")
    at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:11190)
    at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:244)
    at org.apache.spark.sql.jdbc.carbon.package$CarbonJDBCWrite$.savePartition(carbon.scala:149)
    at org.apache.spark.sql.jdbc.carbon.package$CarbonJDBCWrite$$anonfun$saveTable$1.apply(carbon.scala:72)
    at org.apache.spark.sql.jdbc.carbon.package$CarbonJDBCWrite$$anonfun$saveTable$1.apply(carbon.scala:71)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

While examining the data in the "API_EXECUTION_TME_DAY_SUMMARY_FINAL" temporary table, I noticed that some records have an empty string as the version (the version is not NULL).

I don't understand how the version could have been updated to an empty value. When I invoke the API without specifying an API version (with a default version set), the default version is updated correctly in the table above.
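One detail worth noting about the ORA-01400 above: Oracle stores an empty VARCHAR2 string as NULL, so a row whose version is `''` in the source data violates a NOT NULL constraint on insert, even though most other databases treat `''` and NULL as distinct values. The sketch below is only an illustration of that distinction, using SQLite as a stand-in (the real table lives in Oracle); the table name is taken from the error message, but the sample rows are hypothetical:

```python
import sqlite3

# In-memory stand-in for the summary table from the error message.
# Note: in Oracle, inserting '' into a NOT NULL column raises ORA-01400,
# because Oracle collapses the empty string to NULL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE API_EXE_TME_DAY_SUMMARY (api TEXT, version TEXT)")
conn.executemany(
    "INSERT INTO API_EXE_TME_DAY_SUMMARY VALUES (?, ?)",
    [("orders", "1.0.0"), ("orders", ""), ("inventory", None)],  # hypothetical rows
)

# In SQLite (and standard SQL), '' and NULL are distinct, so this query
# finds only the empty-string rows the question describes, not the NULL one:
empty = conn.execute(
    "SELECT api FROM API_EXE_TME_DAY_SUMMARY WHERE version = ''"
).fetchall()
print(empty)  # [('orders',)]
```

A query like the `WHERE version = ''` filter above (run against the staging data before the Spark script writes to Oracle) may help locate which records carry the empty version.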

Do you know how the version could become empty, or how I can reproduce this error?

0 Answers:

No answers yet.