I have a Spark-Scala program, and when I open it this message is displayed:
E325: ATTENTION
Found a swap file by the name ".AppFilms.scala.swp"
owned by: root dated: Fri Dec 2 09:21:07 2016
file name: ~root/projectFilms/src/main/scala/AppFilms.scala
modified: YES
user name: root host name: sandbox.hortonworks.com
process ID: 20488
While opening file "AppFilms.scala"
dated: Wed Dec 7 12:52:47 2016
NEWER than swap file!
(1) Another program may be editing the same file. If this is the case,
be careful not to end up with two different instances of the same
file when making changes. Quit, or continue with caution.
(2) An edit session for this file crashed.
If this is the case, use ":recover" or "vim -r AppFilms.scala"
to recover the changes (see ":help recovery").
If you did this already, delete the swap file ".AppFilms.scala.swp"
to avoid this message.
"AppFilms.scala" 79L, 2250C
Press ENTER or type command to continue
I understand what this message means, and I simply dismiss it. However, when I run my complete program with sbt, it fails with this error:
Not enough arguments received.
16/12/07 12:54:06 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
I launched spark-shell and tried running the same code from the command line; that works fine and I can display the results. I am using Spark 1.4.1, Scala 2.11 and sbt 0.13. Please, can you give me an answer?
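For reference, I am not posting my whole file, but the argument handling at the top of my main looks roughly like the sketch below (the object name, paths and RDD operations here are illustrative assumptions, not my exact code):

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch only: the real AppFilms.scala differs in detail.
object AppFilms {
  def main(args: Array[String]): Unit = {
    // A check like this is what prints "Not enough arguments received.",
    // so the input/output paths must be passed on the command line.
    if (args.length < 2) {
      System.err.println("Not enough arguments received.")
      System.exit(1)
    }

    val conf = new SparkConf().setAppName("AppFilms")
    val sc = new SparkContext(conf)

    // args(0): input path, args(1): output path (assumed here)
    val films = sc.textFile(args(0))
    films.saveAsTextFile(args(1))

    sc.stop()
  }
}

If that is the cause, the arguments also have to be passed through sbt, for example sbt "run /path/to/input /path/to/output" (the run command and its arguments inside one quoted string), because a bare sbt run passes no arguments to main.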