I am trying to load a CSV as a DataFrame of a custom object type:
case class Geom(attributes: Map[String, Any])

I tried:
import session.implicits._

implicit val mapEncoder = org.apache.spark.sql.Encoders.kryo[Map[String, Any]]
implicit val geomEncoder = org.apache.spark.sql.Encoders.product[Geom]

val sparkSQLGeometryRDD = session.read
  .option("delimiter", "\t")
  .option("inferSchema", "true")
  .option("header", "true")
  .csv("src\\main\\resources\\TexasPostCodes.txt")
  //.as[MyObjEncoded]//(encoder)
  .persist()

val columns = sparkSQLGeometryRDD.schema.fieldNames
//sparkSQLGeometryRDD.show()

val mappedDF = sparkSQLGeometryRDD
  .map(x => x.getValuesMap[Any](columns.toList))
  .map(x => Geom(x))
  .show
But it throws this exception:

Exception in thread "main" java.lang.ClassNotFoundException: scala.Any
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)

Can someone help me figure out what is wrong with my code?

Update: After moving the case class and the encoders out of the method, it works fine.
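For reference, here is a minimal sketch of the layout that update describes, with the case class and the implicit encoders defined at the top level instead of inside a method. It assumes Spark 2.x and a SparkSession named session; the GeomLoader object and its load method are illustrative names, not from the original code, and it uses a kryo encoder for Geom (the post used Encoders.product[Geom]) because a product encoder is unlikely to be derivable for a Map[String, Any] field.

import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

// Defined at the top level of the file, not inside a method,
// which is the change the update above describes.
case class Geom(attributes: Map[String, Any])

object GeomLoader {
  // Kryo encoders for both types. The original post used Encoders.product[Geom];
  // kryo is used here as an assumption, since product-encoder derivation for a
  // Map[String, Any] field does not work with Spark's reflection-based encoders.
  implicit val mapEncoder: Encoder[Map[String, Any]] = Encoders.kryo[Map[String, Any]]
  implicit val geomEncoder: Encoder[Geom] = Encoders.kryo[Geom]

  // Illustrative helper: read the tab-delimited CSV and wrap every row in a Geom.
  def load(session: SparkSession, path: String) = {
    val df = session.read
      .option("delimiter", "\t")
      .option("inferSchema", "true")
      .option("header", "true")
      .csv(path)
      .persist()

    val columns = df.schema.fieldNames.toList

    df.map(_.getValuesMap[Any](columns)) // Row -> Map[String, Any]
      .map(Geom(_))                      // Map -> Geom, encoded with kryo
  }
}

// Usage, e.g. from main:
// val geoms = GeomLoader.load(session, "src\\main\\resources\\TexasPostCodes.txt")
// geoms.show()

A case class declared inside a method carries a reference to its enclosing scope, which is a common reason Spark's reflection-based encoder derivation fails; defining it at the top level, as in the sketch, avoids that.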