SSH: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

Date: 2019-10-10 15:02:15

Tags: java maven apache-spark hadoop

I am new to working with servers. On my own computer, Apache Spark runs without problems; I usually run the code from IntelliJ.

I tried to run the project on an external server over SSH, but I get this error:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$assertOnDriver(SparkSession.scala:1086)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:902)
        at com.p53.main(p53.java:42)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMainV2.main(AppMainV2.java:131)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 8 more

When I run Spark from the terminal on the server (/usr/local/spark/bin/spark-shell), it works fine.

My pom dependencies are:

<dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.4.3</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
    </dependencies>

My pom plugins:

<plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.0.2</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <archive>
                        <manifest>
                            <mainClass>Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>

I know I am doing something wrong or missing something, but I just can't figure out where the problem is.

1 Answer:

Answer 0 (score: 0):

You need to set SPARK_DIST_CLASSPATH. FSDataInputStream is a Hadoop class, so the error means the Hadoop jars are not on Spark's classpath on the server; SPARK_DIST_CLASSPATH is how a Spark build without bundled Hadoop is told where to find them.
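A minimal sketch of the usual fix (assuming Spark is installed under /usr/local/spark as in the question, and that the hadoop command is available on the server's PATH):

    # /usr/local/spark/conf/spark-env.sh
    # `hadoop classpath` prints the classpath of the local Hadoop installation;
    # Spark appends SPARK_DIST_CLASSPATH to its own classpath at startup.
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)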

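Note also that the stack trace shows the program being launched directly from IntelliJ's runner (com.intellij.rt.execution.application.AppMainV2) rather than through spark-submit. Submitting through the Spark scripts lets Spark assemble the classpath for you; a hypothetical invocation (the jar path is a placeholder, the main class comes from the stack trace) would look like:

    /usr/local/spark/bin/spark-submit \
        --class com.p53 \
        --master local[*] \
        target/myapp.jar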