When DataFrame.withColumn() was implemented, the Spark development team forgot to check whether the column name is already in use.
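For readers without access to the table below, here is a minimal sketch that reproduces the same behavior (hypothetical data; it assumes a Spark 1.3-era spark-shell where sqlContext is predefined; later releases changed withColumn to replace an existing column of the same name):

import sqlContext.implicits._

// Hypothetical stand-in for the Hive table used below.
val df = Seq((1, 10), (2, 20)).toDF("user_id", "tablename")

// withColumn appends a second "tablename" instead of replacing the first.
val df2 = df.withColumn("tablename", df("tablename") + 1)
df2.columns             // Array(user_id, tablename, tablename)
df2.select("tablename") // fails with the same AnalysisException as below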
To start with:
val res = sqlContext.sql("select * from tag.tablename where dt>20150501 limit 1").withColumnRenamed("tablename","tablename")
res.columns
which shows:
res6: Array[String] = Array(user_id, service_type_id, tablename, dt)
Then:
val res1 = res.withColumn("tablename",res("tablename")+1)
res1.columns
which shows:
res7: Array[String] = Array(user_id, service_type_id, tablename, dt, tablename)
Incidentally, res1.show still works, because show renders columns positionally and never has to resolve the duplicated name.
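The duplicate is also visible at the schema level (a quick check in the same session):

res1.printSchema() // the "tablename" field appears twice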
The bug shows up here:
res1.select("tablename")
org.apache.spark.sql.AnalysisException: Ambiguous references to tablename: (tablename#48,List()),(tablename#53,List());
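A workaround sketch (the replacement names here are hypothetical): toDF assigns new names positionally, so giving every column a unique name makes the select unambiguous.

val disambiguated = res1.toDF("user_id", "service_type_id", "tablename_old", "dt", "tablename_new")
disambiguated.select("tablename_new")

Depending on the version, selecting through the original Column object, res1.select(res("tablename")), may also resolve, since it carries the expression ID (tablename#48) rather than the bare name.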