I am using Jena with Spark. When deploying to the cluster I run into a strange problem (it does not happen in local dev mode, because there I do not need to build an uber jar).

When deploying to the cluster, I get the following exception:
Caused by: org.apache.jena.shared.NoReaderForLangException: Reader not found: JSON-LD
at org.apache.jena.rdf.model.impl.RDFReaderFImpl.getReader(RDFReaderFImpl.java:61)
at org.apache.jena.rdf.model.impl.ModelCom.read(ModelCom.java:305)
1 - I would like to know, generally speaking, where an error like org.apache.jena.shared.NoReaderForLangException: Reader not found: JSON-LD can come from? Again, this same code works perfectly in local mode.
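For context on my own working hypothesis (an assumption, not something I have confirmed): as far as I understand, Jena registers its subsystems, and through them the RIOT readers such as the JSON-LD one, via Java's ServiceLoader, which discovers providers from files under META-INF/services/ on the classpath. If an uber jar drops or clobbers those files, the registry that RDFReaderFImpl.getReader consults can end up without a JSON-LD entry. A generic sketch of the ServiceLoader mechanism, using the JDK's own java.sql.Driver service interface purely for illustration:

```scala
import java.util.ServiceLoader

object ServiceLoaderDemo extends App {
  // The JVM scans every META-INF/services/java.sql.Driver file on the
  // classpath and lazily instantiates the providers listed inside.
  val drivers = ServiceLoader.load(classOf[java.sql.Driver])

  // If the registration files were discarded during jar assembly, this
  // iterator is simply empty -- nothing fails until a lookup by name does.
  val it = drivers.iterator()
  while (it.hasNext) println(it.next().getClass.getName)
}
```

This is why the problem would only surface with the uber jar: in local dev mode every dependency jar keeps its own META-INF/services/ files intact.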
2 - Here is the assembly strategy from my build.sbt:
lazy val entellectextractorsmappers = project
  .settings(
    commonSettings,
    mainClass in assembly := Some("entellect.extractors.mappers.NormalizedDataMapper"),
    assemblyMergeStrategy in assembly := {
      case "application.conf" => MergeStrategy.concat
      case "reference.conf"   => MergeStrategy.concat
      case PathList("META-INF", "services", "org.apache.jena.system.JenaSubsystemLifecycle") => MergeStrategy.concat
      case PathList("META-INF", "services", "org.apache.spark.sql.sources.DataSourceRegister") => MergeStrategy.concat
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case x                             => MergeStrategy.first
    },
    dependencyOverrides += "com.fasterxml.jackson.core"   % "jackson-core"             % "2.9.5",
    dependencyOverrides += "com.fasterxml.jackson.core"   % "jackson-databind"         % "2.9.5",
    dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.5",
    dependencyOverrides += "org.apache.jena"              % "apache-jena"              % "3.8.0",
    libraryDependencies ++= Seq(
      "org.apache.jena"  %  "apache-jena"          % "3.8.0",
      "edu.isi"          %  "karma-offline"        % "0.0.1-SNAPSHOT",
      "org.apache.spark" %  "spark-core_2.11"      % "2.3.1" % "provided",
      "org.apache.spark" %  "spark-sql_2.11"       % "2.3.1" % "provided",
      "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.1"
      //"com.datastax.cassandra" % "cassandra-driver-core" % "3.5.1"
    ))
  .dependsOn(entellectextractorscommon)
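One packaging hypothesis I would like to rule out (again, an assumption on my side): the catch-all case PathList("META-INF", xs @ _*) => MergeStrategy.discard discards every META-INF/services/ file except the two I list explicitly, and I am not sure whether Jena or its JSON-LD dependency relies on additional provider files beyond org.apache.jena.system.JenaSubsystemLifecycle. A sketch of a broader merge rule that keeps and de-duplicates all ServiceLoader registration files, assuming sbt-assembly's MergeStrategy.filterDistinctLines:

```scala
assemblyMergeStrategy in assembly := {
  case "application.conf" => MergeStrategy.concat
  case "reference.conf"   => MergeStrategy.concat
  // Keep ALL ServiceLoader registration files, merging duplicates
  // line-by-line, instead of enumerating them one by one.
  case PathList("META-INF", "services", _*) => MergeStrategy.filterDistinctLines
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}
```

With this rule there is no need to know up front which service interfaces each library registers.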
.dependsOn(entellectextractorscommon)
Any tips on using Jena with Spark would be great, as would any other hints on what can cause Reader not found: JSON-LD, when it happens, or what it means from the library's point of view. That way I can trace back what in my packaging is causing it.