Strange Spark SQL behavior when selecting a column

Date: 2019-04-04 07:37:11

Tags: apache-spark apache-spark-sql

I have a dataframe from which I want to select a column. Strangely, I am getting the following exception:

org.apache.spark.sql.AnalysisException: cannot resolve '`products.n_52`' given input columns: [products.objectives, products.q_4, n_8, n_52, q_52, npv_4, n_4, b_1, products.qpv_8, qpv_13, q_4, b_26, products.npv_1, products.qpv_52, products.q_26, products.q_52, npv_52, n_13, products.n_4, products.qpv_1, products.b_8, npv_8, qpv_4, b_13, b_4, qpv_8, q_26, b_8, products.qpv_26, n_26, products.qpv_13, qpv_52, npv_13, products.b_1, products.b_13, products.n_13, q_13, products.b_4, n_1, q_8, products.q_8, qpv_26, products.npv_52, products.b_52, products.npv_13, products.npv_26, products.n_1, products.npv_4, qpv_1, npv_1, products.npv_8, products.qpv_4, custid, products.n_52, products.n_26, products.q_1, b_52, products.product, products.n_8, npv_26, q_1, products.q_13, products.b_26];;

Spark tells me that the column products.n_52 is not among the given input columns, yet I can clearly see that column in the list.

Has anyone run into a similar issue?

In case it helps, this is how I pass the column to explode:

import pyspark.sql.functions as F
df = df.withColumn("arr", F.explode(F.col("products.n_52")))
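
For reference, a minimal sketch of one way this exact message can arise (an assumption, since the post does not show the schema): after a join or a select that flattens a struct, a column can end up literally named products.n_52, with the dot as part of the name. F.col("products.n_52") is then parsed as a reference to field n_52 inside a struct called products and fails to resolve; backtick-quoting the name makes Spark treat the whole string as a single column name.

import pyspark.sql.functions as F
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical dataframe whose column name literally contains a dot.
df = spark.createDataFrame([([1, 2, 3],)], ["products.n_52"])

# This fails with the same AnalysisException, because the dotted name
# is parsed as struct access (struct products -> field n_52):
# df.withColumn("arr", F.explode(F.col("products.n_52")))

# Backticks force Spark to resolve the whole string as one column name:
df = df.withColumn("arr", F.explode(F.col("`products.n_52`")))
df.show()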

0 Answers:

No answers yet