Livy REST Spark java.io.FileNotFoundException:

Date: 2019-06-02 11:48:51

Tags: apache-spark hadoop pyspark livy

I'm new to big data and tried to run a Spark job through Apache Livy. Submitting the same job with the spark-submit command line works fine, but through Livy I get the exception below.

  • Command line:

      

    curl -X POST --data '{"file": "/user/romain/spark-examples.jar", "className": "org.apache.spark.examples.SparkPi"}' -H 'Content-Type: application/json' http://localhost:8998/batches

  • Livy log:

    2019-06-01 00:43:19,160 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Exception in thread "main" java.io.FileNotFoundException: File hdfs://localhost:9000/home/spark-2.4.3-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.4.3.jar does not exist.
        at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:795)
        .......
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
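
Because the "file" path in the request has no URI scheme, Spark (launched by Livy, typically on YARN) resolves it against the cluster's default filesystem (fs.defaultFS), here hdfs://localhost:9000, so the jar is looked up in HDFS rather than on the local disk. A quick way to check where the jar actually is, using the paths from the log and the request above:

    # Does the jar exist in HDFS at the path Spark tried to read?
    hadoop fs -ls hdfs://localhost:9000/home/spark-2.4.3-bin-hadoop2.7/examples/jars/

    # Does it exist at the HDFS path used in the Livy request?
    hadoop fs -ls /user/romain/spark-examples.jar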

2 answers:

Answer 0 (score: 0)

As @cricket_007's comment suggested, this was solved by running hadoop fs -copyFromLocal to copy the jar into HDFS.
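
A minimal sketch of that fix, assuming the jar sits at the local path shown in the log and is copied to the HDFS path used in the original request:

    # Copy the jar from the local filesystem into HDFS
    hadoop fs -mkdir -p /user/romain
    hadoop fs -copyFromLocal /home/spark-2.4.3-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.4.3.jar /user/romain/spark-examples.jar

    # Resubmit the batch through Livy; the path now resolves in HDFS
    curl -X POST \
      --data '{"file": "/user/romain/spark-examples.jar", "className": "org.apache.spark.examples.SparkPi"}' \
      -H 'Content-Type: application/json' \
      http://localhost:8998/batches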

Answer 1 (score: 0)

If the file is on the local machine, try using 'file': 'local:<path/to/file>'.
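
For example, pointing Livy at the jar from the local Spark install (the path is taken from the log above; depending on the Livy setup, local paths may also need to be allowed via livy.file.local-dir-whitelist in livy.conf):

    # 'local:' tells Spark the jar already exists at this path on every node,
    # so nothing is uploaded from HDFS
    curl -X POST \
      --data '{"file": "local:/home/spark-2.4.3-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.4.3.jar", "className": "org.apache.spark.examples.SparkPi"}' \
      -H 'Content-Type: application/json' \
      http://localhost:8998/batches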