Connecting to Blob storage: "no credentials found for them in the configuration"

Date: 2018-09-13 17:03:49

Tags: azure databricks

I am using a Databricks notebook backed by a Spark cluster and am running into a problem when trying to connect to Azure Blob storage. I followed the link and tried the "Access Azure Blob storage directly - Set up an account access key" section. This step gives me no error:

spark.conf.set(
  "fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net",
  "<your-storage-account-access-key>")

But when I try to do an "ls" on the directory, I get an error message:

dbutils.fs.ls("wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")

shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Unable to access container <container name> in account <storage account name>core.windows.net using anonymous credentials, and no credentials found for them in the configuration.

If there is a better way to do this, please suggest it as well. Thanks.

1 Answer:

Answer 0: (score: 1)

1) You need to pass the **storage account name** and **key** when setting up the configuration. You can find both in the Azure portal.

spark.conf.set(
 "fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net",
 "<your-storage-account-access-key>")

2) When doing the ls, you also need to include the **container name** and **directory name** in the path.

dbutils.fs.ls("wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")

Hope this will resolve your issue!
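
As for a better way: the Databricks guide on Azure Blob storage also describes mounting the container to DBFS, so the account key does not have to be set in every session. A rough sketch, assuming a Python notebook and with every <...> value as a placeholder:

# Mount the container once; the mount is then visible to notebooks in the workspace.
dbutils.fs.mount(
  source = "wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/<your-mount-name>",
  extra_configs = {"fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net": "<your-storage-account-access-key>"})

# List the directory through the mount point instead of the wasbs URI.
dbutils.fs.ls("/mnt/<your-mount-name>/<your-directory-name>")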