How do I read a SQL file in pyspark?

Time: 2019-07-03 13:08:21

Tags: sql apache-spark pyspark pyspark-sql spark-hive

I have been trying to run this code, which is supposed to create a table from a SQL file containing the table schema and values, using pyspark. I can't make sense of the error. Please help me out.

-------------------- SQL file ------------------------


--
-- Dumping data for table `user_details`
--

CREATE TABLE IF NOT EXISTS `user_details` (
  `user_id` int(11) NOT NULL AUTO_INCREMENT,
  `username` varchar(255) DEFAULT NULL,
  `first_name` varchar(50) DEFAULT NULL,
  `last_name` varchar(50) DEFAULT NULL,
  `gender` varchar(10) DEFAULT NULL,
  `password` varchar(50) DEFAULT NULL,
  `status` tinyint(10) DEFAULT NULL,
  PRIMARY KEY (`user_id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=10001 ;

INSERT INTO `user_details` (`user_id`, `username`, `first_name`, `last_name`, `gender`, `password`, `status`) VALUES
(1, 'rogers63', 'david', 'john', 'Female', 'e6a33eee180b07e563d74fee8c2c66b8', 1),
(2, 'mike28', 'rogers', 'paul', 'Male', '2e7dc6b8a1598f4f75c3eaa47958ee2f', 1),
(3, 'rivera92', 'david', 'john', 'Male', '1c3a8e03f448d211904161a6f5849b68', 1),
(4, 'ross95', 'maria', 'sanders', 'Male', '62f0a68a4179c5cdd997189760cbcf18', 1),
(5, 'paul85', 'morris', 'miller', 'Female', '61bd060b07bddfecccea56a82b850ecf', 1),
(6, 'smith34', 'daniel', 'michael', 'Female', '7055b3d9f5cb2829c26cd7e0e601cde5', 1),
(7, 'james84', 'sanders', 'paul', 'Female', 'b7f72d6eb92b45458020748c8d1a3573', 1),
(8, 'daniel53', 'mark', 'mike', 'Male', '299cbf7171ad1b2967408ed200b4e26c', 1),
(9, 'brooks80', 'morgan', 'maria', 'Female', 'aa736a35dc15934d67c0a999dccff8f6', 1),
(10, 'morgan65', 'paul', 'miller', 'Female', 'a28dca31f5aa5792e1cefd1dfd098569', 1);
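For reference, Spark's SQL parser does not accept the MySQL-specific parts of this DDL, such as int(11), AUTO_INCREMENT, ENGINE=MyISAM and the inline PRIMARY KEY clause. Below is a minimal sketch, not taken from the question, of what a Spark-SQL-compatible version of the same schema could look like, assuming Spark 2.x with Hive support; the app name is illustrative.

# Minimal sketch (assumption, not from the original file): a Spark-SQL-compatible
# version of the `user_details` schema. The MySQL-only clauses (int(11),
# AUTO_INCREMENT, ENGINE=MyISAM, inline PRIMARY KEY) are dropped because
# Spark's parser rejects them.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("user-details-schema")   # illustrative app name
         .enableHiveSupport()              # lets CREATE TABLE persist in the metastore
         .getOrCreate())

spark.sql("""
CREATE TABLE IF NOT EXISTS user_details (
    user_id    INT,
    username   STRING,
    first_name STRING,
    last_name  STRING,
    gender     STRING,
    password   STRING,
    status     TINYINT
)
""")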

This is what I have been trying, but I am getting an error:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("asdasd").set("spark.driver.memory", "1g")
sc = SparkContext(conf=conf)

sqlContext = SQLContext(sc)

with open("/home/data/Downloads/Sample-SQL-File-10-Rows.sql") as fr:
    query = fr.read()
results = sqlContext.sql(query)

Thanks in advance.
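A minimal sketch of one way this is commonly handled, assuming Spark 2.x where SparkSession replaces SQLContext: sqlContext.sql() / spark.sql() parses only a single statement, so a multi-statement .sql file has to be split and run statement by statement, and each statement must be valid Spark SQL, which for this dump also means rewriting the CREATE TABLE as sketched above. The naive semicolon split below is an assumption and has not been tested against the actual file; the path is reused from the question and the app name is illustrative.

# Minimal sketch, assuming Spark 2.x: run a multi-statement .sql file one
# statement at a time, because spark.sql() parses exactly one statement.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("load-sql-dump")         # illustrative app name
         .enableHiveSupport()
         .getOrCreate())

with open("/home/data/Downloads/Sample-SQL-File-10-Rows.sql") as fr:
    script = fr.read()

# Drop '--' comment lines, then split naively on ';'. This is only safe while
# no string literal in the dump contains a semicolon.
sql_lines = [ln for ln in script.splitlines() if not ln.lstrip().startswith("--")]
for statement in "\n".join(sql_lines).split(";"):
    if statement.strip():
        spark.sql(statement)  # each statement must be valid Spark SQL

spark.sql("SELECT * FROM user_details").show()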

0 Answers:

No answers