Hi, I am working on a Spark Streaming project. It runs successfully in my local environment, but whenever I submit it to the YARN cluster it throws this exception:
Caused by: java.io.IOException: No FileSystem for scheme: viewfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2795)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2809)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:98)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2852)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2834)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:186)
at org.apache.spark.deploy.yarn.Client$$anonfun$7.apply(Client.scala:123)
at org.apache.spark.deploy.yarn.Client$$anonfun$7.apply(Client.scala:123)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:123)
at org.apache.spark.deploy.yarn.Client.<init>(Client.scala:69)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1226)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
... 7 more
I have already gone through the other threads related to this problem, but none of them solved my issue. The Spark version is 2.1.1 and the project is written in Java. Here is what I have tried so far:
SparkConf conf = new SparkConf().setAppName("test");
JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(5));
// Register the FileSystem implementations explicitly, in case the shaded jar
// lost the META-INF/services entries that normally map scheme -> class.
Configuration hadoopConfig = jsc.sparkContext().hadoopConfiguration();
hadoopConfig.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
hadoopConfig.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
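Since the stack trace complains about the viewfs scheme specifically, the same pattern suggests an explicit binding for viewfs as well. A sketch, assuming `org.apache.hadoop.fs.viewfs.ViewFileSystem` (the implementation shipped in hadoop-common) is on the cluster classpath:

```xml
<!-- core-site.xml fragment: bind the viewfs scheme explicitly. -->
<!-- ViewFileSystem ships in hadoop-common; verify the class exists -->
<!-- in your Hadoop distribution before relying on this. -->
<property>
  <name>fs.viewfs.impl</name>
  <value>org.apache.hadoop.fs.viewfs.ViewFileSystem</value>
</property>
```

The same mapping can also be set from code with `hadoopConfig.set("fs.viewfs.impl", ...)`, or per job via `--conf spark.hadoop.fs.viewfs.impl=...` on `spark-submit` (Spark copies `spark.hadoop.*` properties into the Hadoop configuration).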
And here is the packaging setup (maven-shade-plugin) from my pom.xml:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <shadedArtifactAttached>false</shadedArtifactAttached>
    <outputFile>${project.build.directory}/${project.artifactId}-${project.version}-shaded.jar</outputFile>
    <artifactSet>
      <includes>
        <include>*:*</include>
      </includes>
      <excludes>
        <exclude>commons-cli:*</exclude>
        <exclude>commons-logging:*</exclude>
        <exclude>com.fasterxml.jackson.core:*</exclude>
        <exclude>log4j:log4j</exclude>
        <exclude>org.apache.commons:commons-lang3</exclude>
        <exclude>org.slf4j:*</exclude>
      </excludes>
    </artifactSet>
    <filters>
      <filter>
        <artifact>*:*</artifact>
        <excludes>
          <exclude>META-INF/*.SF</exclude>
          <exclude>META-INF/*.DSA</exclude>
          <exclude>META-INF/*.RSA</exclude>
        </excludes>
      </filter>
    </filters>
    <transformers>
      <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
      <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer"/>
      <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer">
        <addHeader>false</addHeader>
      </transformer>
      <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>reference.conf</resource>
      </transformer>
    </transformers>
  </configuration>
</plugin>