Error when connecting PySpark to Oracle SQL?

Date: 2017-08-30 11:43:35

Tags: oracle pyspark oracle-sqldeveloper pyspark-sql

I am trying to set up a connection between PySpark and Oracle SQL so that I can load tables that way. I am using the following code:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext, Row
import os

spark_config = SparkConf().setMaster("local").setAppName("Project_SQL")
sc = SparkContext(conf = spark_config)
sqlctx = SQLContext(sc)

os.environ['SPARK_CLASSPATH'] = "C:\Program Files (x86)\Oracle\SQL Developer 4.0.1\jdbc\lib.jdbc6.jar"


df = sqlctx.read.format("jdbc").options(url="jdbc:oracle:thin:@<>:<>:<>"
                                   , driver = "oracle.jdbc.driver.OracleDriver"
                                   , dbtable = "account"
                                   , user="...."
                                   , password="...").load()

But I get the following error:

An error occurred while calling o29.load.:
java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection
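
For reference, this is the overall shape of the JDBC read I am trying to get working, sketched with the Spark 2.x SparkSession API instead of SQLContext; the jar path, host, port, and SID below are placeholders, not my real values:

# Launched with the Oracle JDBC driver jar on the classpath, e.g. (jar path is a placeholder):
#   spark-submit --jars C:\path\to\ojdbc6.jar --driver-class-path C:\path\to\ojdbc6.jar read_oracle.py
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local")
         .appName("Project_SQL")
         .getOrCreate())

# host, port and SID in the thin URL are placeholders (format: host:port:SID)
df = (spark.read.format("jdbc")
      .option("url", "jdbc:oracle:thin:@dbhost:1521:ORCL")
      .option("driver", "oracle.jdbc.driver.OracleDriver")
      .option("dbtable", "account")
      .option("user", "....")
      .option("password", "....")
      .load())

df.show(5)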

Can anyone help me resolve this? Do you think it could be because of a firewall?
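
Since the error looks like a plain connectivity failure, one check I can think of is whether the listener port is even reachable from my machine; a minimal sketch of such a check (host and port are placeholders, 1521 is just the default Oracle listener port):

import socket

# Placeholder connection details; replace with the real host and listener port.
host, port = "dbhost", 1521

try:
    # A bare TCP connection attempt; if this fails, a firewall or a wrong
    # host/port is more likely than anything on the Spark side.
    with socket.create_connection((host, port), timeout=5):
        print("TCP connection to the listener succeeded")
except OSError as exc:
    print("TCP connection failed:", exc)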

0 Answers:

No answers yet.