Azure Databricks: Installing Maven libraries on a cluster through the API causes an error (Library resolution failed. Cause: java.lang.RuntimeException)

Date: 2019-11-26 09:03:22

Tags: azure databricks azure-databricks

I am trying to install some Maven libraries on an existing (or newly created) Azure Databricks cluster through the REST API from Python.

Cluster details:

  • Python 3
  • 5.5 LTS (includes Apache Spark 2.4.3, Scala 2.11)
  • Node type: Standard_D3_v2
    import json
    import requests

    # TOKEN (a personal access token) and cluster_id are assumed to be defined earlier.
    spark_submit_packages = "org.apache.spark:spark-streaming-kafka-0-8-assembly_2.11:2.4.3," \
                            "com.databricks:spark-redshift_2.11:3.0.0-preview1," \
                            "org.postgresql:postgresql:9.3-1103-jdbc3," \
                            "com.amazonaws:aws-java-sdk:1.11.98," \
                            "com.amazonaws:aws-java-sdk-core:1.11.98," \
                            "com.amazonaws:aws-java-sdk-sns:1.11.98," \
                            "org.apache.hadoop:hadoop-aws:2.7.3," \
                            "com.amazonaws:aws-java-sdk-s3:1.11.98," \
                            "com.databricks:spark-avro_2.11:4.0.0," \
                            "com.microsoft.azure:azure-data-lake-store-sdk:2.0.11," \
                            "org.apache.hadoop:hadoop-azure-datalake:3.0.0-alpha2," \
                            "com.microsoft.azure:azure-storage:3.1.0," \
                            "org.apache.hadoop:hadoop-azure:2.7.2"

    install_lib_url = "https://westus.azuredatabricks.net/api/2.0/libraries/install"

    # Turn the comma-separated coordinates into the payload shape the Libraries API expects.
    packages = spark_submit_packages.split(",")
    maven_packages = []
    for pack in packages:
        maven_packages.append({"maven": {"coordinates": pack}})

    headers = {"Authorization": "Bearer {}".format(TOKEN)}
    headers["Content-Type"] = "application/json"

    data = {
        "cluster_id": cluster_id,
        "libraries": maven_packages
    }

    res = requests.post(install_lib_url, headers=headers, data=json.dumps(data))
    _response = res.json()
    print(json.dumps(_response))

The response is the expected empty JSON. However, this API call sometimes produces the following error in the UI and the library installation fails:

Library resolution failed. Cause: java.lang.RuntimeException: commons-httpclient:commons-httpclient download failed.
    at com.databricks.libraries.server.MavenInstaller.$anonfun$resolveDependencyPaths$5(MavenLibraryResolver.scala:253)
    at scala.collection.MapLike.getOrElse(MapLike.scala:131)
    at scala.collection.MapLike.getOrElse$(MapLike.scala:129)
    at scala.collection.AbstractMap.getOrElse(Map.scala:63)
    at com.databricks.libraries.server.MavenInstaller.$anonfun$resolveDependencyPaths$4(MavenLibraryResolver.scala:253)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
    at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:75)
    at scala.collection.TraversableLike.map(TraversableLike.scala:238)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
    at scala.collection.AbstractTraversable.map(Traversable.scala:108)
    at com.databricks.libraries.server.MavenInstaller.resolveDependencyPaths(MavenLibraryResolver.scala:249)
    at com.databricks.libraries.server.MavenInstaller.doDownloadMavenPackages(MavenLibraryResolver.scala:455)
    at com.databricks.libraries.server.MavenInstaller.$anonfun$downloadMavenPackages$2(MavenLibraryResolver.scala:381)
    at com.databricks.backend.common.util.FileUtils$.withTemporaryDirectory(FileUtils.scala:431)
    at com.databricks.libraries.server.MavenInstaller.$anonfun$downloadMavenPackages$1(MavenLibraryResolver.scala:380)
    at com.databricks.logging.UsageLogging.$anonfun$recordOperation$4(UsageLogging.scala:417)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:239)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:234)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:231)
    at com.databricks.libraries.server.MavenInstaller.withAttributionContext(MavenLibraryResolver.scala:57)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:276)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:269)
    at com.databricks.libraries.server.MavenInstaller.withAttributionTags(MavenLibraryResolver.scala:57)
    at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:398)
    at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:337)
    at com.databricks.libraries.server.MavenInstaller.recordOperation(MavenLibraryResolver.scala:57)
    at com.databricks.libraries.server.MavenInstaller.downloadMavenPackages(MavenLibraryResolver.scala:379)
    at com.databricks.libraries.server.MavenInstaller.downloadMavenPackagesWithRetry(MavenLibraryResolver.scala:137)
    at com.databricks.libraries.server.MavenInstaller.resolveMavenPackages(MavenLibraryResolver.scala:113)
    at com.databricks.libraries.server.MavenLibraryResolver.resolve(MavenLibraryResolver.scala:44)
    at com.databricks.libraries.server.ManagedLibraryManager$GenericManagedLibraryResolver.resolve(ManagedLibraryManager.scala:263)
    at com.databricks.libraries.server.ManagedLibraryManagerImpl.$anonfun$resolvePrimitives$1(ManagedLibraryManagerImpl.scala:193)
    at com.databricks.libraries.server.ManagedLibraryManagerImpl.$anonfun$resolvePrimitives$1$adapted(ManagedLibraryManagerImpl.scala:188)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at com.databricks.libraries.server.ManagedLibraryManagerImpl.resolvePrimitives(ManagedLibraryManagerImpl.scala:188)
    at com.databricks.libraries.server.ManagedLibraryManagerImpl$ClusterStatus.installLibs(ManagedLibraryManagerImpl.scala:772)
    at com.databricks.libraries.server.ManagedLibraryManagerImpl$InstallLibTask$1.run(ManagedLibraryManagerImpl.scala:473)
    at com.databricks.threading.NamedExecutor$$anon$1.$anonfun$run$1(NamedExecutor.scala:317)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:239)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:234)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:231)
    at com.databricks.threading.NamedExecutor.withAttributionContext(NamedExecutor.scala:256)
    at com.databricks.threading.NamedExecutor$$anon$1.run(NamedExecutor.scala:317)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Is this caused by installing multiple Maven libraries in a single API call? (But the API does expect a list :|)

(Screenshot: cluster libraries page)

Edit: This issue also occurs when the cluster is restarted. Say I have manually installed about 10 Maven libraries on the cluster and every installation succeeded; when I restart the cluster, even those previously successful installations fail.
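Note that /api/2.0/libraries/install only queues the request; resolution happens asynchronously on the cluster, which is why the call returns an empty JSON body even when the install later fails. Below is a minimal sketch for checking per-library status programmatically, reusing the TOKEN and cluster_id from above; the cluster-status endpoint and field names are taken from the Libraries API 2.0 documentation, so verify them against your workspace.

    import json
    import requests

    # Ask the Libraries API for the per-library install status on the cluster.
    status_url = "https://westus.azuredatabricks.net/api/2.0/libraries/cluster-status"
    headers = {"Authorization": "Bearer {}".format(TOKEN)}

    res = requests.get(status_url, headers=headers, params={"cluster_id": cluster_id})
    for lib in res.json().get("library_statuses", []):
        # Statuses include PENDING, RESOLVING, INSTALLING, INSTALLED and FAILED.
        if lib.get("status") == "FAILED":
            print(json.dumps(lib.get("library")), lib.get("messages"))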

1 Answer:

Answer 0 (score: 0)

I received the following response from the Azure support team:

  It seems there is a problem with the specific Maven jar (org.apache.hadoop:hadoop-azure-datalake:3.0.0-alpha2).

Workaround:
   1. Download the jar from the Maven repository.
   2. Upload it to DBFS.
   3. Create the library on the cluster from the jar in DBFS.
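A minimal sketch of that workaround over the REST API, reusing the TOKEN and cluster_id from the question. The Maven Central URL and the DBFS path are illustrative assumptions, and /api/2.0/dbfs/put with inline contents only accepts small payloads (roughly 1 MB), so larger jars should be uploaded with the DBFS streaming endpoints (create/add-block/close) or the databricks CLI.

    import base64
    import json
    import requests

    base_url = "https://westus.azuredatabricks.net/api/2.0"
    headers = {"Authorization": "Bearer {}".format(TOKEN)}

    # 1. Download the jar from a Maven repository (URL follows the standard Maven layout;
    #    adjust it if you use a different mirror or version).
    jar_url = ("https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-azure-datalake/"
               "3.0.0-alpha2/hadoop-azure-datalake-3.0.0-alpha2.jar")
    jar_bytes = requests.get(jar_url).content

    # 2. Upload it to DBFS. dbfs/put with inline base64 contents only works for small files;
    #    for bigger jars use dbfs/create + dbfs/add-block + dbfs/close or the databricks CLI.
    dbfs_path = "/FileStore/jars/hadoop-azure-datalake-3.0.0-alpha2.jar"
    requests.post(base_url + "/dbfs/put", headers=headers, data=json.dumps({
        "path": dbfs_path,
        "contents": base64.b64encode(jar_bytes).decode("utf-8"),
        "overwrite": True,
    }))

    # 3. Install the uploaded jar on the cluster as a jar library instead of a Maven coordinate.
    requests.post(base_url + "/libraries/install", headers=headers, data=json.dumps({
        "cluster_id": cluster_id,
        "libraries": [{"jar": "dbfs:" + dbfs_path}],
    }))

Installing the artifact as a jar library skips Maven dependency resolution entirely, so the transitive commons-httpclient download that fails in the stack trace above is never attempted for it.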