Getting an error when connecting to an Oracle database using Spark / Scala

Date: 2017-09-08 14:54:44

Tags: oracle scala apache-spark dataframe toad

I am new to Scala. I have a text file that I am trying to read and load into a dataframe, and then write into a database. While writing to the database I get the error given below, even though the same credentials connect successfully in Toad. Any help would be appreciated.

text.txt

TYPE,CODE,SQ_CODE,RE_TYPE,VERY_ID,IN_DATE,DATE
"F","000544","2017002","OP","95032015062763298","20150610","20150529"
"F","000544","2017002","LD","95032015062763261","20150611","20150519"
"F","000544","2017002","AK","95037854336743246","20150611","20150429"   

val sparkSession = SparkSession.builder().master("local").appName("IT_DATA").getOrCreate()

Driver=oracle.jdbc.driver.OracleDriver
Url=jdbc:oracle:thin:@xxx.com:1521/DATA00.WORLD
username=xxx
password=xxx
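
The post does not show how these settings reach the Scala code; a minimal sketch, assuming they live in a properties file (the db.properties file name and the loading step are only illustrative, not from the post):

import java.io.FileInputStream
import java.util.Properties

// Hypothetical loading step: reads the four settings shown above from a
// properties file and trims the values, since a stray space around the
// password would be sent to the server as part of the credential.
val conf = new Properties()
conf.load(new FileInputStream("db.properties"))

val Driver   = conf.getProperty("Driver").trim
val Url      = conf.getProperty("Url").trim
val username = conf.getProperty("username").trim
val password = conf.getProperty("password").trim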

val dbProp = new java.util.Properties
dbProp.setProperty("driver", Driver)
dbProp.setProperty("user", username)
dbProp.setProperty("password", password)

// Create dataframe object
val df = sparkSession.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true")
  .option("location", "/xx/xx/xx/xx/test.csv")
  .option("delimiter", ",")
  .option("dateFormat", "yyyyMMdd")
  .load().cache()

df.write.mode("append").jdbc(Url, TableTemp, dbProp)

df.show

+-----+-------+---------+---------+-------------------+---------+-------------+
| TYPE|   CODE|  SQ_CODE| RE_TYPE |            VERY_ID|  IN_DATE|      DATE   |
+-----+-------+---------+---------+-------------------+---------+-------------+
|   F | 000544|  2017002|      OP |  95032015062763298| 20150610|   2015-05-29|
|   F | 000544|  2017002|      LD |  95032015062763261| 20150611|   2015-05-19|
|   F | 000544|  2017002|      AK |  95037854336743246| 20150611|   2015-04-29|
+-----+-------+---------+---------+-------------------+---------+-------------+

Error

java.sql.SQLException: ORA-01017: invalid username/password; logon denied
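
ORA-01017 means the username/password that actually reached the server was rejected. A quick way to rule out the Spark layer is to open a plain JDBC connection with the very same values; a minimal sketch, reusing the Driver, Url, username and password identifiers from above:

import java.sql.DriverManager

// Plain-JDBC sanity check (sketch): if this also fails with ORA-01017, the
// problem is in the credential values themselves (case sensitivity, stray
// whitespace); if it succeeds, look at how the values are handed to
// Spark's jdbc writer.
Class.forName(Driver)
val conn = DriverManager.getConnection(Url, username, password)
println(conn.getMetaData.getDatabaseProductVersion)
conn.close()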

0 Answers:

No answers yet