fetchSize error:
TypeError: jdbc() got an unexpected keyword argument 'fetchSize'
I tried to read with:
mydf = spark.read.jdbc(url, table, numPartitions=20, column=partitionColumn, lowerBound=0, upperBound=1000, fetchSize=10000, properties=properties)
Answer 0 (score: 0)
I am not sure about the performance, but try the following code; it does not throw that error:
df = (spark.read.format("jdbc")
      .option("url", url)
      .option("dbtable", "mytable")
      .option("user", user)
      .option("password", password)
      .option("numPartitions", "100")
      .option("fetchsize", "10000")
      .option("partitionColumn", "id")
      .option("lowerBound", "1")
      .option("upperBound", "1000000")
      .load())
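Alternatively, the original spark.read.jdbc() call can be kept if fetchsize is moved into the properties dict. The jdbc() method's Python signature only accepts url, table, column, lowerBound, upperBound, numPartitions, predicates, and properties; anything else, including the fetch size, must go through properties, which is forwarded to the JDBC driver as connection properties. A minimal sketch (user/password values are placeholders; the commented-out read call assumes the same url, table, and partitionColumn as in the question):

```python
# Build the JDBC connection properties; "fetchsize" belongs here,
# not as a keyword argument to jdbc() -- that is what caused the TypeError.
properties = {
    "user": "myuser",        # placeholder credentials
    "password": "mypassword",
    "fetchsize": "10000",    # rows fetched per round trip to the database
}

# The read itself (commented out here since it needs a live Spark session
# and database):
# mydf = spark.read.jdbc(url, table,
#                        column=partitionColumn,
#                        lowerBound=0, upperBound=1000,
#                        numPartitions=20,
#                        properties=properties)
```

Note that JDBC connection property values are strings, so "10000" is passed quoted, matching the .option("fetchsize", "10000") form in the answer above.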