Many warnings with spark.read.json in Spark 2.3; "subclasses resulted in no possible candidates"

Time: 2018-11-12 14:59:03

Tags: json apache-spark

I know we can start spark-shell with the log level set to error (a minimal sketch of that follows the log below), but is there an explanation for these warnings? The first few warnings make sense, since the paths in the "identical plugin" warnings follow a symlink pointing to /usr/local/bin/spark.

scala> val people = spark.read.json("/path/to/people.json").show
18/11/12 09:41:33 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/bin/spark/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/share/spark-2.3.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar."
18/11/12 09:41:33 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/local/bin/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/local/share/spark-2.3.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
18/11/12 09:41:37 WARN Query: Query for candidates of org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics and subclasses resulted in no possible candidates
Cannot add `SDS`.`SD_ID` as referenced FK column for `TBLS`
org.datanucleus.exceptions.NucleusException: Cannot add `SDS`.`SD_ID` as referenced FK column for `TBLS`
    at org.datanucleus.store.rdbms.key.ForeignKey.setColumn(ForeignKey.java:232)
    at org.datanucleus.store.rdbms.key.ForeignKey.addColumn(ForeignKey.java:207)
    at org.datanucleus.store.rdbms.table.TableImpl.getExistingForeignKeys(TableImpl.java:1057)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/11/12 09:41:40 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
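
For reference, a minimal sketch of the log-level workaround mentioned above, assuming the stock spark-shell where spark is the pre-created SparkSession; the JSON path is a placeholder. This only raises the console log threshold (messages printed outside log4j, such as the JDBC driver notice, may still appear) and does not explain or fix the warnings themselves:

scala> spark.sparkContext.setLogLevel("ERROR")  // from here on, only ERROR and above are printed

scala> val people = spark.read.json("/path/to/people.json")
scala> people.show()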

0 Answers:

No answers