I tried the following:
>>./spark-shell --jars /home/my_path/my_jar.jar
In the shell, I then tried to import the package:
scala> import com.vertica.spark._
<console>:23: error: object vertica is not a member of package com
import com.vertica.spark._
It does not work. I also tried removing the leading slash (/) from the jar path:
>>./spark-shell --jars home/my_path/my_jar.jar
Still the same, although there was a warning:
20/04/21 22:34:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://ubuntu:4040
Spark context available as 'sc' (master = local[*], app id = local-1587488711233).
Spark session available as 'spark'.
Welcome to
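Whatever the warning means, one quick sanity check for the path attempts above (a minimal sketch using plain Java I/O on the same path as in the question) is to verify from inside the shell that the file actually exists:

scala> new java.io.File("/home/my_path/my_jar.jar").exists

If this prints false, spark-shell was handed a path that does not resolve, which would explain the missing package on its own.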
On the other hand, if I start the shell and add the same jar path with :require, the import succeeds:
scala> :require /home/my_path/my_jar.jar
Added '/home/my_path/my_jar.jar' to classpath.
scala> import com.vertica.spark._
import com.vertica.spark._
What am I missing when adding the jar through spark-shell itself?
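One way to narrow this down (a minimal sketch, assuming a Spark 2.x shell where SparkContext.listJars() is available) is to ask the running context which jars it has actually registered; a jar passed successfully via --jars should appear in the list:

scala> sc.listJars().foreach(println)

If the list comes back empty, the --jars flag never took effect, which points at the invocation rather than at the jar itself.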
Answer 0 (score: 0)
This problem may be caused by a Hadoop native-library issue. Try adding the lines below to your bashrc and sourcing it:
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
and
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
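To apply the change in the current session and check whether the native libraries now load (a sketch assuming HADOOP_HOME is set and the standard hadoop CLI is on the PATH):

source ~/.bashrc
hadoop checknative -a

checknative reports, for each native component (hadoop, zlib, snappy, and so on), whether a native implementation was found, so the NativeCodeLoader warning should disappear on the next spark-shell start if this was indeed the cause.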