[Amazon](500310) Invalid operation: Assert

Time: 2017-12-27 12:10:45

Tags: amazon-s3 pyspark apache-spark-sql amazon-redshift

I am using spark-redshift with PySpark to query Redshift data for processing.
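Roughly, the read looks like this (a minimal sketch; the JDBC URL, query, and tempdir below are placeholders, not my real values):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("redshift-read").getOrCreate()

    # spark-redshift UNLOADs the query result to the S3 tempdir,
    # then reads the unloaded files back into a DataFrame.
    df = (spark.read
          .format("com.databricks.spark.redshift")
          .option("url", "jdbc:redshift://host:5439/db?user=xxx&password=yyy")
          .option("query", "select x, y from table_name")
          .option("tempdir", "s3a://bucket/tmp/")
          .load())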

If I run the query on Redshift directly (e.g., in SQL Workbench), it works fine. But spark-redshift unloads the data to S3 and then retrieves it, and when I run it through Spark it throws the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o124.save.
: java.sql.SQLException: [Amazon](500310) Invalid operation: Assert
Details: 
 -----------------------------------------------
  error:  Assert
  code:      1000
  context:   !AmLeaderProcess - 
  query:     583860
  location:  scheduler.cpp:642
  process:   padbmaster [pid=31521]
  -----------------------------------------------;
    at com.amazon.redshift.client.messages.inbound.ErrorResponse.toErrorException(ErrorResponse.java:1830)
    at com.amazon.redshift.client.PGMessagingContext.handleErrorResponse(PGMessagingContext.java:822)
    at com.amazon.redshift.client.PGMessagingContext.handleMessage(PGMessagingContext.java:647)
    at com.amazon.jdbc.communications.InboundMessagesPipeline.getNextMessageOfClass(InboundMessagesPipeline.java:312)
    at com.amazon.redshift.client.PGMessagingContext.doMoveToNextClass(PGMessagingContext.java:1080)
    at com.amazon.redshift.client.PGMessagingContext.getErrorResponse(PGMessagingContext.java:1048)
    at com.amazon.redshift.client.PGClient.handleErrorsScenario2ForPrepareExecution(PGClient.java:2524)
    at com.amazon.redshift.client.PGClient.handleErrorsPrepareExecute(PGClient.java:2465)
    at com.amazon.redshift.client.PGClient.executePreparedStatement(PGClient.java:1420)
    at com.amazon.redshift.dataengine.PGQueryExecutor.executePreparedStatement(PGQueryExecutor.java:370)
    at com.amazon.redshift.dataengine.PGQueryExecutor.execute(PGQueryExecutor.java:245)
    at com.amazon.jdbc.common.SPreparedStatement.executeWithParams(Unknown Source)
    at com.amazon.jdbc.common.SPreparedStatement.execute(Unknown Source)
    at com.databricks.spark.redshift.JDBCWrapper$$anonfun$executeInterruptibly$1.apply(RedshiftJDBCWrapper.scala:108)
    at com.databricks.spark.redshift.JDBCWrapper$$anonfun$executeInterruptibly$1.apply(RedshiftJDBCWrapper.scala:108)
    at com.databricks.spark.redshift.JDBCWrapper$$anonfun$2.apply(RedshiftJDBCWrapper.scala:126)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
Caused by: com.amazon.support.exceptions.ErrorException: [Amazon](500310) Invalid operation: Assert

The generated query:

UNLOAD ('SELECT "x","y" FROM (select x,y from table_name where
((load_date=20171226 and hour>=16) or (load_date between 20171227 and
20171226) or (load_date=20171227 and hour<=16))) ') TO 's3:s3path' WITH
CREDENTIALS 'aws_access_key_id=xxx;aws_secret_access_key=yyy' ESCAPE
MANIFEST

What is going wrong here, and how can I fix it?

1 Answer:

Answer 0: (score: 0)

Assert errors usually occur when Redshift gets a data type wrong while interpreting a query. A typical case is the two parts of a UNION query, where column N in one part is a varchar while the same column in the other part is an integer or NULL. Your assert error may likewise come from data arriving from different nodes (as in a union query). Try adding an explicit data type to every column, e.g. x::integer, as in the sketch below.
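For example, you could push the casts into the query string you hand to spark-redshift (a sketch only; I am assuming x is an integer and y a varchar, so adjust the casts to your table's real types, and the connection options are placeholders):

    # Cast every selected column explicitly so the unloaded data has one
    # unambiguous type. integer/varchar here are assumptions, not known types.
    query_with_casts = """
        select x::integer, y::varchar
        from table_name
        where ((load_date = 20171226 and hour >= 16)
            or (load_date between 20171227 and 20171226)
            or (load_date = 20171227 and hour <= 16))
    """

    df = (spark.read
          .format("com.databricks.spark.redshift")
          .option("url", "jdbc:redshift://host:5439/db?user=xxx&password=yyy")  # placeholder
          .option("query", query_with_casts)
          .option("tempdir", "s3a://bucket/tmp/")                               # placeholder
          .load())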