Execute a Scala jar file in Azure Data Factory

Date: 2018-05-28 13:05:50

Tags: scala azure apache-spark azure-data-factory

Here is the code I want to execute:

SimpleApp.scala



package test

import java.sql.DriverManager
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf


object SimpleApp {
  def main(args: Array[String]) {
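    // Configure Spark to run on the cluster's YARN resource manager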
    val conf = new SparkConf().setAppName("bouh").setMaster("yarn")
    val sc = new SparkContext(conf)


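    // Connection details for the Azure SQL Database (values are placeholders)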
    val jdbcHostname = "servername.database.windows.net" 
    val jdbcPort = 1433
    val jdbcDatabase = "database"
    val jdbc_url = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase};encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=60;"

    val jdbcUsername = "user"
    val jdbcPassword = "password"

    val connection = DriverManager.getConnection(jdbc_url, jdbcUsername, jdbcPassword)
    val statement = connection.createStatement

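    // Read the ids to delete from the .txt files in blob storage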
    val rdd = sc.textFile("wasbs://dev@hdinsight.blob.core.windows.net/folder/*.txt")

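    // Bring all ids to the driver and run the stored procedure once per id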
    rdd.collect().foreach(
      (Id: String) => {
        statement.execute(s"EXEC delete_item_by_id @Id = '${Id}'")
      }
    )
  }
}

I compiled it with IntelliJ IDEA (following this guide: https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-create-standalone-application).
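For reference, a minimal sbt build for this kind of standalone project could look like the sketch below. The question does not show the build file, so the use of sbt and all of the versions here are assumptions:

name := "test"
version := "1.0"
scalaVersion := "2.11.8"

// Spark is provided by the HDInsight cluster at runtime, so it stays out of the jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

// JDBC driver used by SimpleApp; it must be packaged into the jar
// (for example with sbt-assembly) or already be present on the cluster
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.2.2.jre8"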

Now I am trying to execute it on Azure Data Factory. I created this job:
{
    "name": "pipeline1",
    "properties": {
        "activities": [
            {
                "name": "Spark1",
                "type": "HDInsightSpark",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false
                },
                "typeProperties": {
                    "rootPath": "dev/apps/spikes",
                    "entryFilePath": "test.jar",
                    "className": "SimpleApp",
                    "sparkJobLinkedService": {
                        "referenceName": "linkedServiceStorageBlobHDI",
                        "type": "LinkedServiceReference"
                    }
                },
                "linkedServiceName": {
                    "referenceName": "linkedServiceHDI",
                    "type": "LinkedServiceReference"
                }
            }
        ]
    }
}

But the execution fails with this error:

18/05/28 12:52:53 ERROR ApplicationMaster: Uncaught exception: java.lang.ClassNotFoundException: SimpleApp
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:621)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:379)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:245)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:749)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:71)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:70)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:70)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:747)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
18/05/28 12:52:53 INFO ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.ClassNotFoundException: SimpleApp)
18/05/28 12:52:53 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: Uncaught exception: java.lang.ClassNotFoundException: SimpleApp)
18/05/28 12:52:53 INFO ApplicationMaster: Deleting staging directory adl://home/user/livy/.sparkStaging/application_1527060048715_0507
18/05/28 12:52:53 INFO ShutdownHookManager: Shutdown hook called

I understand that the class cannot be found, but how can I fix it? Does the problem come from my Scala script or from the Azure Data Factory job?

EDIT: if I open test.jar, I can see all the expected files and folders, and I do find SimpleApp.class in the /test folder (test being the name of my package). I also tried "className": "test.SimpleApp" in ADF, but I still get the same error.

1 Answer:

Answer 0: (score: 0)

You can try opening the jar and checking the path of the SimpleApp class.
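As an illustration, one way to do that check from code is to list the class entries in the jar (running jar tf test.jar from a shell gives the same listing). This is only a sketch and assumes test.jar sits in the working directory:

import java.util.jar.JarFile
import scala.collection.JavaConverters._

object InspectJar {
  def main(args: Array[String]): Unit = {
    // An entry like "test/SimpleApp.class" means the fully qualified name is
    // "test.SimpleApp", which is what "className" in the HDInsightSpark
    // activity has to match.
    new JarFile("test.jar").entries().asScala
      .filter(_.getName.endsWith(".class"))
      .foreach(entry => println(entry.getName))
  }
}

If the listing really shows test/SimpleApp.class, then "className": "test.SimpleApp" is the right fully qualified name, and the remaining suspects would likely be how the jar was packaged or how rootPath and entryFilePath point at it, rather than the class name itself.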