spark-sql reads the wrong number of columns from a Hive table

Time: 2018-10-25 07:28:44

Tags: hive apache-spark-sql hive-metastore

When I read a Hive table from the spark-sql command line to compute a report, I get the error below:

spark-sql does not read the Hive table's columns correctly.

Looking at the log below, the key point is that the column "org_code" cannot be resolved in table cms_organization_info, and the only input column reported for cms_organization_info is: [cms_organization_info.channel_id]

But when I run the SQL `desc cms_organization_info`, I can see that the table does have the column "org_code".
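For reference, a minimal way to narrow this down is to run the same DESCRIBE in both the spark-sql and Hive CLIs and, if they disagree, ask Spark to re-read the table metadata. This is only a diagnostic sketch: the table name is taken from the question, the database prefix may need adjusting, and stale cached metadata is an assumed possible cause, not a confirmed one.

```sql
-- In the spark-sql CLI: list the columns Spark resolves for this table
-- (table name taken from the question; prefix with the database if needed).
DESCRIBE cms_organization_info;

-- Run the same DESCRIBE in the Hive CLI / beeline and compare the output.
-- If Hive shows org_code but spark-sql does not, Spark may be reading a
-- stale copy of the table metadata.

-- Ask Spark to invalidate its cached metadata and re-read the schema from
-- the Hive metastore before re-running the report query.
REFRESH TABLE cms_organization_info;
```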

Thank you very much for any help in solving this problem!

================ Error log ================

24-10-2018 23:35:14 CST h2m_real_time_overdueNotPayPerDay INFO - Time taken: 0.139 seconds
24-10-2018 23:35:18 CST h2m_real_time_overdueNotPayPerDay INFO - Error in query: cannot resolve '`org_code`' given input columns: [cms_organization_info.channel_id]; line 56 pos 16;
24-10-2018 23:35:18 CST h2m_real_time_overdueNotPayPerDay INFO - 'InsertIntoTable 'UnresolvedRelation `report`.`overdueNotPayPerDay`, false, false
24-10-2018 23:35:18 CST h2m_real_time_ov

================================

0 Answers:

No answers yet.