Exception in thread "main" java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver

Date: 2019-05-03 19:42:04

Tags: oracle apache-spark connection

I am trying to read a table from Oracle 11g XE using a Spark SBT project in IntelliJ, but it fails with the error below. Here is the log:
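For context, the read the stack trace points to (`Ex16_ReadFromOracleDb.scala:16`) would look roughly like the sketch below. The URL, table name, and credentials are placeholders, and the Maven coordinates in the comment are an assumption (Oracle began publishing `ojdbc8` to Maven Central with the 19.x releases; for 11g XE the driver jar is typically downloaded from Oracle directly). The `ClassNotFoundException` means the Oracle JDBC driver jar is not on the application classpath at all.

```scala
// Minimal sketch of a Spark JDBC read from Oracle; all connection
// details below are hypothetical placeholders, not the asker's values.
import org.apache.spark.sql.SparkSession

object Ex16_ReadFromOracleDb {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ReadFromOracleDb")
      .master("local[*]")
      .getOrCreate()

    // This line is what throws ClassNotFoundException unless the driver
    // jar is on the classpath, e.g. via build.sbt (assumed coordinates):
    //   libraryDependencies += "com.oracle.database.jdbc" % "ojdbc8" % "19.3.0.0"
    // or by dropping Oracle's ojdbc jar into the project's lib/ directory.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@localhost:1521:XE") // placeholder URL
      .option("driver", "oracle.jdbc.driver.OracleDriver")
      .option("dbtable", "MY_TABLE")                        // placeholder table
      .option("user", "system")                             // placeholder user
      .option("password", "password")                       // placeholder password
      .load()

    df.show()
    spark.stop()
  }
}
```

This sketch requires a running Oracle instance and the driver jar, so it is illustrative rather than directly runnable here.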

19/05/04 00:54:02 INFO SharedState: Warehouse path is 'file:/E:/Workspace/dataframe-examples/spark-warehouse/'.
Exception in thread "main" java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:38)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:78)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:78)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:78)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
    at dsm.ex.Ex16_ReadFromOracleDb$.main(Ex16_ReadFromOracleDb.scala:16)
    at dsm.ex.Ex16_ReadFromOracleDb.main(Ex16_ReadFromOracleDb.scala)
19/05/04 00:54:04 INFO SparkContext: Invoking stop() from shutdown hook
19/05/04 00:54:04 INFO SparkUI: Stopped Spark web UI at http://192.168.159.1:4040
19/05/04 00:54:04 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/04 00:54:04 INFO MemoryStore: MemoryStore cleared
19/05/04 00:54:04 INFO BlockManager: BlockManager stopped
19/05/04 00:54:04 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/04 00:54:04 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/04 00:54:04 INFO SparkContext: Successfully stopped SparkContext
19/05/04 00:54:04 INFO ShutdownHookManager: Shutdown hook called
19/05/04 00:54:04 INFO ShutdownHookManager: Deleting directory C:\Users\Rupesh\AppData\Local\Temp\spark-350e148e-ec94-494e-9b3c-72608fee54b4

0 Answers

No answers yet