File for Spark application not found in CRD definition

Date: 2019-01-14 16:36:59

Tags: scala apache-spark kubernetes

I am using the Spark Operator on Kubernetes. My application needs a file in order to run. However, when I declare the file next to the jar file in the `files` section of the custom object definition, I get an error indicating that the file cannot be found:

Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 0
    at example.SparkExample$.main(SparkExample.scala:37)
    at example.SparkExample.main(SparkExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:846)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:194)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:921)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:932)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The exception points to SparkExample.scala:37, where I read the program arguments:

  val dataFromFile = readFile(spark.sparkContext, args(0))
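An `ArrayIndexOutOfBoundsException: 0` here means `args` is empty, i.e. no program arguments reached the driver. A minimal sketch (the object and method names are hypothetical, not from the original code) of guarding that access so the failure mode is explicit:

```scala
object ArgCheck {
  // Return the first program argument, failing with a clear message instead
  // of an ArrayIndexOutOfBoundsException when no arguments were passed.
  def firstArg(args: Array[String]): String =
    args.headOption.getOrElse(
      sys.error("Expected the data file path as the first program argument"))
}
```

With this in place, `args(0)` would be replaced by `ArgCheck.firstArg(args)` in `main`.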

What is wrong with the custom object definition or with the way the arguments are set?

The custom object definition is as follows:

apiVersion: sparkoperator.k8s.io/v1alpha1
kind: SparkApplication
metadata:
  name: spark-example
  namespace: default
spec:
  type: Scala
  image: gcr.io/ynli-k8s/spark:v2.4.0-SNAPSHOT
  mainClass: example.SparkExample
  mainApplicationFile: http://ip:8089/spark_k8s_airflow.jar
  mode: cluster
  deps:
    files:
      - http://ip:8089/jar_test_data.txt
  driver:
    coreLimit: 1000m
    cores: 0.1
    labels:
      version: 2.4.0
    memory: 1024m
    serviceAccount: default
  executor:
    cores: 1
    instances: 1
    labels:
      version: 2.4.0
    memory: 1024m
  imagePullPolicy: Always
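A likely cause: the spec above never passes the file path to the application, so `args` is empty in the driver. The SparkApplication CRD supports an `arguments` list (hedged: field name as in the spark-on-k8s-operator API; verify against your operator version), and files listed under `deps.files` are downloaded into each container's Spark working directory, so they are commonly referenced by bare file name. A sketch of the relevant fragment:

```yaml
spec:
  mainClass: example.SparkExample
  mainApplicationFile: http://ip:8089/spark_k8s_airflow.jar
  # Passed to the application's main method as args(0)
  arguments:
    - jar_test_data.txt
  deps:
    files:
      - http://ip:8089/jar_test_data.txt
```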

1 answer:

Answer 0 (score: 0)

The answer is in this issue: I simply added the arguments to the application and changed the way the file is loaded into the Spark application.
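One way to "change how the file is loaded", sketched under the assumption that the file is shipped via `deps.files` (which Spark distributes like `spark.files`): resolve the local copy with `SparkFiles.get` instead of relying on a URL or absolute path. This is a hypothetical illustration, not the exact fix from the linked issue.

```scala
import org.apache.spark.SparkFiles

object LoadDistributedFile {
  // Files distributed through deps.files are downloaded into each
  // container's Spark work directory; SparkFiles.get resolves the
  // local absolute path of the downloaded copy by its bare file name.
  def resolve(fileName: String): String = SparkFiles.get(fileName)
}
```

The returned path can then be handed to `readFile(spark.sparkContext, ...)` in place of the raw argument.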