I'm using Starschema's JDBC driver to connect Pentaho to BigQuery. I can successfully pull data from BigQuery into Pentaho, but I cannot write data from Pentaho back into BigQuery. Inserting rows into BigQuery throws an exception, which suggests the operation may not be supported. How can I resolve this?
Error message:
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Because of an error, this step can't continue:
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleException:
2017/10/30 14:27:43 - Table output 2.0 - Error inserting row into table [TableID] with values: [A], [I], [G], [1], [2016-02-18], [11], [2016-02-18-12.00.00.123456], [GG], [CB], [132], [null], [null], [null]
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:385)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:125)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Table output 2.0 - at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Table output 2.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1321)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:262)
2017/10/30 14:27:43 - Table output 2.0 - ... 3 more
2017/10/30 14:27:43 - Table output 2.0 - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 - at net.starschema.clouddb.jdbc.BQPreparedStatement.executeUpdate(BQPreparedStatement.java:317)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1288)
2017/10/30 14:27:43 - Table output 2.0 - ... 4 more
2017/10/30 14:27:43 - BigQuery_rwa-tooling - Statement canceled!
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt -
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:750)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelQuery(Database.java:732)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:299)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.Trans.stopAll(Trans.java:1889)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2915)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:139)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at net.starschema.clouddb.jdbc.BQStatementRoot.cancel(BQStatementRoot.java:113)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:744)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ... 7 more
2017/10/30 14:27:43 - Table output 2.0 - Signaling 'output done' to 0 output rowsets.
2017/10/30 14:27:43 - BigQuery_prID - No commit possible on database connection [BigQuery_prID]
Answer (score: 1)
It looks like you may be trying to do this via legacy SQL, which does not support DML statements (INSERT/UPDATE/DELETE).
Standard SQL does support DML, but those statements are intended primarily for bulk table operations rather than row-oriented inserts; ingesting data through individual DML INSERTs is discouraged. See the quotas in the DML reference documentation for details.
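If you do issue a DML INSERT, it has to run as standard SQL. Below is a minimal sketch, using only the Python standard library, of the request body you would POST to BigQuery's `jobs.query` REST endpoint; the project, dataset, table, and column names are placeholders, not taken from the question.

```python
import json

# A standard-SQL DML INSERT submitted via the jobs.query REST API.
# useLegacySql must be False, because legacy SQL rejects DML statements.
query_request = {
    "kind": "bigquery#queryRequest",
    "useLegacySql": False,
    "query": (
        "INSERT INTO `my-project.my_dataset.my_table` (code, qty) "
        "VALUES ('A', 132)"
    ),
}

# Serialized body to POST to:
# https://bigquery.googleapis.com/bigquery/v2/projects/{project}/queries
body = json.dumps(query_request)
```

Even when this works, each such statement counts against the per-table DML quota, which is why it is a poor fit for row-at-a-time ETL output.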
Ingestion is better done through BigQuery's streaming inserts or batch loading via load jobs, but since these mechanisms live outside the query language, you will likely need to go beyond the JDBC driver.
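As a concrete illustration of the streaming path, here is a stdlib-only sketch that builds the request body for the `tabledata.insertAll` streaming-insert endpoint. The schema (`code`, `qty`) is hypothetical; in practice you would POST this body with OAuth credentials, e.g. via the google-cloud-bigquery client's `insert_rows_json`.

```python
import json
import uuid

def insert_all_payload(rows):
    """Build the request body for BigQuery's tabledata.insertAll
    streaming-insert endpoint. Each row carries an insertId so that
    BigQuery can de-duplicate retried requests."""
    return {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [{"insertId": str(uuid.uuid4()), "json": row} for row in rows],
    }

# Two rows matching a hypothetical (code, qty) schema.
payload = insert_all_payload([
    {"code": "A", "qty": 132},
    {"code": "B", "qty": 11},
])

# Serialized body to POST to:
# https://bigquery.googleapis.com/bigquery/v2/projects/{project}/datasets/{dataset}/tables/{table}/insertAll
body = json.dumps(payload)
```

Unlike DML, streaming inserts are designed for high-volume row-oriented writes, which matches what a Pentaho Table Output step is actually doing.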