I am trying to import Spark SQL, but the import fails to compile. I am not sure what mistake I am making; I am just a beginner.
package MySource

import java.sql.{DriverManager, ResultSet}
import org.apache.spark.sql.SparkSession
import java.util.Properties

object MyCalc {
  def main(args: Array[String]): Unit = {
    println("This is my first Spark")

    //val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val spark = SparkSession
      .builder()
      .appName("SparkSQL")
      //.master("YARN")
      .master("local[*]")
      //.enableHiveSupport()
      //.config("spark.sql.warehouse.dir", "file:///c:/temp")
      .getOrCreate()

    import spark.sqlContext.implicits._
  }
}
Error:(3, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession

Error:(15, 17) not found: value SparkSession
val spark = SparkSession
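
This error normally means the spark-sql artifact is missing from the compile classpath, since org.apache.spark.sql.SparkSession lives in spark-sql, not spark-core. Below is a minimal sketch of the dependency declaration, assuming an sbt project; the Spark and Scala versions are placeholders and should be matched to your environment:

// build.sbt -- minimal sketch; the versions here are assumptions
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  // spark-sql provides org.apache.spark.sql.SparkSession
  "org.apache.spark" %% "spark-sql"  % "2.3.0"
)

After adding the dependency and reloading/reimporting the project in the IDE, the import org.apache.spark.sql.SparkSession line should resolve.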