I have a Scala jar file named SGA.jar. Inside there is a class named org/SGA/MainTest, which performs some graph operations using the underlying SGA.jar logic, like so:
package org.SGA

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD
import java.io._
import scala.util._

object MainTest {

  def initialize() : Unit = {
    println("Initializing")
  }

  def perform(collection : Iterable[String]) : Unit = {
    val conf = new SparkConf().setAppName("maintest")
    val sparkContext = new SparkContext(conf)
    sparkContext.setLogLevel("ERROR")

    val edges = sparkContext.parallelize(collection.toList).map(_.split(" ")).map { edgeCoordinates =>
      new Edge(edgeCoordinates(0).toLong, edgeCoordinates(1).toLong, edgeCoordinates(2).toDouble)
    }

    println("Creating graph")
    val graph : Graph[Any, Double] = Graph.fromEdges(edges, 0)
    println("Graph created")
    // ...
  }
}
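Each input line is expected to describe one weighted edge as srcId dstId weight — that is what the split/toLong/toDouble chain above assumes. To illustrate, a minimal sketch of parsing a single line into a GraphX Edge; the sample line is hypothetical, not taken from data1.txt:

import org.apache.spark.graphx.Edge

object EdgeParseExample extends App {
  // Hypothetical sample line; the "srcId dstId weight" format is inferred
  // from the parsing code above.
  val line = "1 2 0.5"
  val parts = line.split(" ")
  val edge = Edge(parts(0).toLong, parts(1).toLong, parts(2).toDouble)
  println(edge) // Edge(1,2,0.5)
}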
SGA.jar is embedded into scalaWrapper.jar, a Java wrapper around the Scala SGA.jar and the necessary datasets. Its folder structure looks like this:
scalaWrapper.jar
| META-INF
| | MANIFEST.MF
| scalawrapper
| | datasets
| | | data1.txt
| | jars
| | | SGA.jar
| | FileParser.java
| | FileParser.class
| | WrapperClass.java
| | WrapperClass.class
| .classpath
| .project
The FileParser class essentially converts the data available in the text files into usable structures and is not detailed here. The main class, however, is WrapperClass:
package scalawrapper;

import scala.collection.*;
import scala.collection.Iterable;
import java.util.List;
import org.SGA.*;

public class WrapperClass {
    public static void main(String[] args) {
        FileParser fileparser = new FileParser();
        String filepath = "/scalawrapper/datasets/data1.txt";

        MainTest.initialize();

        List<String> list = fileparser.Parse(filepath);
        Iterable<String> scalaIterable = JavaConversions.collectionAsScalaIterable(list);

        MainTest.perform(scalaIterable);
    }
}
SGA.jar is built via SBT, while the Java jar is developed in and exported from Eclipse. When running locally (in which case the SparkConf is extended with .setMaster("local[*]").set("spark.executor.memory", "7g") to facilitate local execution), there are no problems and the code behaves as expected.
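For clarity, a minimal sketch of what that local variant looks like; the runLocal flag is a hypothetical stand-in for however the two modes are actually switched:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: `runLocal` is an assumed flag, not part of the original code.
object LocalConfExample {
  def makeContext(runLocal: Boolean): SparkContext = {
    val conf = new SparkConf().setAppName("maintest")
    if (runLocal) {
      // Local execution: use all cores and a fixed executor memory.
      conf.setMaster("local[*]").set("spark.executor.memory", "7g")
    }
    new SparkContext(conf) // on the cluster, the master comes from spark-submit
  }
}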
The problem arises when scalaWrapper.jar is expected to run on an EMR cluster. The cluster is defined as 1 master node + 4 worker nodes, plus a Spark application step:
Main class : None
Arguments : spark-submit --deploy-mode cluster --class scalawrapper.WrapperClass --executor-memory 17g --executor-cores 16 --driver-memory 17g s3://scalaWrapperCluster/scalaWrapper.jar
The execution fails with:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt1/yarn/usercache/hadoop/filecache/10/__spark_libs__1619195545177535823.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/04/22 16:56:43 INFO SignalUtils: Registered signal handler for TERM
19/04/22 16:56:43 INFO SignalUtils: Registered signal handler for HUP
19/04/22 16:56:43 INFO SignalUtils: Registered signal handler for INT
19/04/22 16:56:43 INFO SecurityManager: Changing view acls to: yarn,hadoop
19/04/22 16:56:43 INFO SecurityManager: Changing modify acls to: yarn,hadoop
19/04/22 16:56:43 INFO SecurityManager: Changing view acls groups to:
19/04/22 16:56:43 INFO SecurityManager: Changing modify acls groups to:
19/04/22 16:56:43 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hadoop); groups with view permissions: Set(); users with modify permissions: Set(yarn, hadoop); groups with modify permissions: Set()
19/04/22 16:56:44 INFO ApplicationMaster: Preparing Local resources
19/04/22 16:56:44 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1555952041027_0001_000001
19/04/22 16:56:44 INFO ApplicationMaster: Starting the user application in a separate Thread
19/04/22 16:56:44 INFO ApplicationMaster: Waiting for spark context initialization...
19/04/22 16:56:44 ERROR ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: org/SGA/MainTest
java.lang.NoClassDefFoundError: org/SGA/MainTest
at scalawrapper.WrapperClass.main(WrapperClass.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: java.lang.ClassNotFoundException: org.SGA.MainTest
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
Note that WrapperClass.java:20 corresponds to MainTest.initialize();.
This kind of exception seems to be quite common, judging by the many fixes I attempted (example), but none of them solved my problem. I tried including the Scala libraries used to build SGA.jar inside scalaWrapper.jar, eliminating static fields, and hunting for mistakes in the project definitions, but with no luck.
Answer 0 (score: 0)
I solved the problem by uploading SGA.jar to S3 separately and adding it to spark-submit via the --jars parameter:
spark-submit --deploy-mode cluster --jars s3://scalaWrapperCluster/SGA.jar --class scalawrapper.WrapperClass --executor-memory 17g --executor-cores 16 --driver-memory 17g s3://scalaWrapperCluster/scalaWrapper.jar
Note that the original functionality in scalaWrapper.jar (including the embedded SGA.jar) was not changed; it is the separately uploaded SGA.jar that gets executed. This works because the standard JVM classloader cannot load classes from a jar nested inside another jar, whereas --jars places SGA.jar directly on the driver and executor classpaths.
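Flattening the nested jar at build time achieves the same visibility: with sbt-assembly, the org/SGA classes end up at the top level of the submitted jar instead of inside jars/SGA.jar. A minimal build.sbt sketch, in which the project name, versions, and merge strategy are assumptions rather than the actual project settings:

// Hypothetical build.sbt sketch (requires sbt-assembly in project/plugins.sbt,
// e.g. addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")).
name := "scalaWrapper"
scalaVersion := "2.11.12"

// Spark itself is provided by the EMR cluster at runtime.
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "2.4.0" % "provided"

// Discard duplicate manifest entries when merging the wrapper and SGA classes.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}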