spark-submit cannot use the dependencies bundled in my jar file

Date: 2019-07-19 01:38:54

Tags: maven apache-spark spark-submit

I built a jar file that includes org.apache.spark.sql.kafka010 (using spark-sql-kafka-0-10_2.11:2.4.3).
However, when I run it with the spark-submit command below, the following error occurs.

./bin/spark-submit --class MYCLASS --master local[*] MYJAR.jar

Exception in thread "main" org.apache.spark.sql.AnalysisException: Failed to find data source: kafka. Please deploy the application as per the deployment section of "Structured Streaming + Kafka Integration Guide".;

Here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>*</groupId>
  <artifactId>*</artifactId>
  <version>1.0-SNAPSHOT</version>

  <name>mt</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-lang3 -->
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.9</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>2.4.3</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-catalyst_2.11</artifactId>
      <version>2.4.3</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-core -->
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
      <version>2.12.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.4.3</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
      <version>2.4.3</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.4.3</version>
    </dependency>
  </dependencies>

  <build>
    <pluginManagement><!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
      <plugins>
        <plugin>
          <artifactId>maven-assembly-plugin</artifactId>
          <configuration>
            <archive>
              <manifest>
                <mainClass>*</mainClass>
              </manifest>
            </archive>
            <descriptorRefs>
              <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
          </configuration>
          <executions>
            <execution>
              <id>make-assembly</id> <!-- this is used for inheritance merges -->
              <phase>package</phase> <!-- bind to the packaging phase -->
              <goals>
                <goal>single</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
        <plugin>
          <artifactId>maven-shade-plugin</artifactId>
          <version>3.2.1</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
              <configuration>
                <transformers>
                  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass>*</mainClass>
                  </transformer>
                </transformers>
              </configuration>
            </execution>
          </executions>
        </plugin>
        <!-- clean lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#clean_Lifecycle -->
        <plugin>
          <artifactId>maven-clean-plugin</artifactId>
          <version>3.1.0</version>
        </plugin>
        <!-- default lifecycle, jar packaging: see https://maven.apache.org/ref/current/maven-core/default-bindings.html#Plugin_bindings_for_jar_packaging -->
        <plugin>
          <artifactId>maven-resources-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.8.0</version>
          <configuration>
            <source>1.8</source>
            <target>1.8</target>
            <encoding>UTF-8</encoding>
          </configuration>
        </plugin>
        <plugin>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.22.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <version>3.0.2</version>
          <configuration>
            <archive>
              <manifest>
                <addClasspath>true</addClasspath>
                <classpathPrefix>lib/</classpathPrefix>
                <mainClass>*</mainClass>
              </manifest>
            </archive>
          </configuration>
        </plugin>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <version>2.5.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-deploy-plugin</artifactId>
          <version>2.8.2</version>
        </plugin>
        <!-- site lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#site_Lifecycle -->
        <plugin>
          <artifactId>maven-site-plugin</artifactId>
          <version>3.7.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-project-info-reports-plugin</artifactId>
          <version>3.0.0</version>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-dependency-plugin</artifactId>
          <executions>
            <execution>
              <id>copy-dependencies</id>
              <phase>package</phase>
              <goals>
                <goal>copy-dependencies</goal>
              </goals>
              <configuration>
                <outputDirectory>lib/</outputDirectory>
                <overWriteReleases>false</overWriteReleases>
                <overWriteSnapshots>false</overWriteSnapshots>
                <overWriteIfNewer>true</overWriteIfNewer>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>

I know that I can run it fine with the --packages option, but I do not want to use that option.
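
For reference, the working invocation with --packages would presumably look like the sketch below (MYCLASS and MYJAR.jar are the same placeholders as in the command above):

./bin/spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.3 --class MYCLASS --master local[*] MYJAR.jar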

When I use --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.3, many dependency jars (including spark-sql-kafka-0-10_2.11:2.4.3) are pulled in and the job runs.
But I do not understand why running spark-submit with just my jar file causes the error.
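
One way to narrow this down (a sketch, assuming a Unix shell and that MYJAR.jar is the assembled jar from above) is to check whether the fat jar really contains the Kafka source classes and the service-registration file that Spark uses to discover data sources by name:

# Are the Kafka source classes bundled?
jar tf MYJAR.jar | grep org/apache/spark/sql/kafka010

# Spark looks up the "kafka" source via this ServiceLoader file; when several
# dependencies each ship their own copy and the assembly keeps only one of them,
# the Kafka provider entry can be missing from the merged jar.
unzip -p MYJAR.jar META-INF/services/org.apache.spark.sql.sources.DataSourceRegister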

1 Answer:

Answer 0 (score: 0)

You only need to download the 'spark-sql-kafka-0-10_2.11' jar file and place it in the SPARK_HOME/jars folder. You can download the jar from the following link: https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.4.3/spark-sql-kafka-0-10_2.11-2.4.3.jar
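
For example, assuming SPARK_HOME points at the Spark 2.4.3 installation used above and wget is available, the steps would look roughly like this:

# Download the connector jar (same URL as above) and drop it into Spark's jars folder
wget https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.4.3/spark-sql-kafka-0-10_2.11-2.4.3.jar
cp spark-sql-kafka-0-10_2.11-2.4.3.jar "$SPARK_HOME/jars/"

# Then submit again without --packages
./bin/spark-submit --class MYCLASS --master local[*] MYJAR.jar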