I have a column larger than the varchar(max) data type, which as I understand it is the largest data type AWS Glue works with, and I get the error "String length exceeds DDL length" when trying to load my table because of it. I'm fine with truncating the column, since it isn't that important, but I can't figure out how to do that in Glue. I know that if I connect to the database with psql from an EC2 instance, I can use TRUNCATECOLUMNS as a flag on the COPY command, and I can in fact load the table successfully that way. However, my boss insists the job be done with Glue, so I'm looking for a way to truncate the column within a Glue script. I've been through a lot of documentation but can't find anything like this. Thanks.
Here is the working code, for anyone else who runs into this and needs a full reference. Note that varchar(65535) is the maximum number of characters a column can hold in Redshift:
// UDF that truncates any string longer than 29,999 characters
val truncColUdf = udf((str: String) => if (str.length > 29999) str.substring(0, 29999) else str)
// Read the source table from the Glue Data Catalog
val datasource30 = glueContext.getCatalogSource(database = "database", tableName = "entry", redshiftTmpDir = "", transformationContext = "datasource30").getDynamicFrame()
// Convert to a Spark DataFrame and truncate the "message" column
val revDF30 = datasource30.toDF()
  .withColumn("message", truncColUdf(col("message")))
// Convert back to a DynamicFrame and map the columns to Redshift types
val truncDynamicFrame30 = DynamicFrame(revDF30, glueContext)
val applymapping30 = truncDynamicFrame30.applyMapping(mappings = Seq(("id", "bigint", "id", "bigint"), ("message", "string", "message", "varchar(65535)"), ("state", "string", "state", "varchar(256)"), ("created_at", "timestamp", "created_at", "timestamp"), ("depth", "int", "depth", "int")), caseSensitive = false, transformationContext = "applymapping30")
val resolvechoice30 = applymapping30.resolveChoice(choiceOption = Some(ChoiceOption("make_cols")), transformationContext = "resolvechoice30")
val dropnullfields30 = resolvechoice30.dropNulls(transformationContext = "dropnullfields30")
// Write the result to Redshift through the catalog JDBC connection
val datasink30 = glueContext.getJDBCSink(catalogConnection = "databaseConnection", options = JsonOptions("""{"dbtable": "entry", "database": "database"}"""), redshiftTmpDir = args("TempDir"), transformationContext = "datasink30").writeDynamicFrame(dropnullfields30)
Here is an example of a row being read:
01,"<p>Here is the message where the quotations are in case of commas within the message, like so.</p>",active,2017-08-27 23:38:40,1
Answer 0 (score: 0)
Convert the DynamicFrame to a Spark DataFrame, then use a user-defined function to truncate the column values (Scala):
import com.amazonaws.services.glue.DynamicFrame
import org.apache.spark.sql.functions._
val truncColUdf = udf((str: String) => if (str.length > 20) str.substring(0, 20) else str)
val truncDataFrame = dynamicFrame.toDF()
.select("text_long")
.withColumn("text_short", truncColUdf(col("text_long")))
.withColumn("text_short_length", length(col("text_short")))
truncDataFrame.show(5, false)
val truncDynamicFrame = DynamicFrame(truncDataFrame, glueContext)
...
//write to sink
Output:
+-----------------------+--------------------+-----------------+
|text_long |text_short |text_short_length|
+-----------------------+--------------------+-----------------+
|I'd rather not answer |I'd rather not answe|20 |
|Agree |Agree |5 |
|Custom Answer Favorable|Custom Answer Favora|20 |
|Agree |Agree |5 |
|Sometimes |Sometimes |9 |
+-----------------------+--------------------+-----------------+
Answer 1 (score: 0)
You can pass "TRUNCATECOLUMNS" in the "extracopyoptions" parameter of the DynamicFrameWriter: https://aws.amazon.com/premiumsupport/knowledge-center/sql-commands-redshift-glue-job/
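For example, applied to the sink from the question it might look like the sketch below. This is a hedged example, not tested against a live cluster: "extracopyoptions" is the documented Glue option whose value is appended to Redshift's COPY command, and the connection, database, and table names are simply the ones used in the question:

```scala
// Sketch: the same JDBC sink as in the question, but with TRUNCATECOLUMNS
// passed via "extracopyoptions" so that Redshift's COPY silently truncates
// oversized values instead of failing with "String length exceeds DDL length".
val datasink = glueContext.getJDBCSink(
  catalogConnection = "databaseConnection",
  options = JsonOptions(
    """{"dbtable": "entry", "database": "database", "extracopyoptions": "TRUNCATECOLUMNS"}"""),
  redshiftTmpDir = args("TempDir"),
  transformationContext = "datasink"
).writeDynamicFrame(dropnullfields30)
```

If this option works for your setup, the UDF-based truncation step becomes unnecessary, since the truncation happens on the Redshift side during COPY.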