Can't assign requested address: Service 'sparkDriver' failed

Date: 2019-11-02 10:15:07

Tags: postgresql scala apache-spark

I have a simple project that connects a Postgres database to Spark. My project looks like this:

import org.apache.spark.sql.SparkSession

object Connector extends App {

  runJdbcDatasetExample()

  private def runJdbcDatasetExample(): Unit = {

    val spark = SparkSession
      .builder()
      .appName("Spark SQL basic project")
      .config("spark.master", "local")
      .getOrCreate()

    val jdbcDF = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:5432")
      .option("dbtable", "schema.tablename")
      .option("user", "postgres")
      .option("password", "root")
      .load()
  }
}

I have a Postgres database running on port 5432. However, when I run the project, I get this:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address

I don't know how to fix this. Any help is appreciated!
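The exception is thrown while the Spark driver itself starts up, before any JDBC work happens, so the database settings are not the cause. The error message points at the usual workaround: setting `spark.driver.bindAddress` explicitly. A minimal sketch of the same builder with the driver pinned to loopback, assuming the problem is that Spark cannot resolve the machine's hostname to a bindable address (the `127.0.0.1` values below are assumptions, not values from the question):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: same builder as in the question, but with the driver's bind
// address set explicitly, as the exception message suggests.
val spark = SparkSession
  .builder()
  .appName("Spark SQL basic project")
  .config("spark.master", "local")
  .config("spark.driver.bindAddress", "127.0.0.1") // assumed fix, per the error text
  .config("spark.driver.host", "127.0.0.1")
  .getOrCreate()
```

Separately, note that the JDBC URL in the question uses the MySQL scheme (`jdbc:mysql://`) against a Postgres port; once the driver starts, the read would likely need a URL of the form `jdbc:postgresql://localhost:5432/<database>` instead.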

0 Answers:

No answers