I'm trying to create a dataset in ADF using the Azure SDK for Python, but unfortunately I'm hitting the error message below. I'm not sure what is wrong with the following code.
dsOut_name = 'POC_DatasetName'
ds_ls ="AzureBlobStorage"
output_blobpath = '/tempdir'
df_name = 'pipeline1'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print_item(dsOut)
Error Message: SerializationError: Unable to build a model: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get', DeserializationError: Unable to deserialize to object: type, AttributeError: 'str' object has no attribute 'get'
Please help.
Answer (score: 1)
I can reproduce your issue. The line ds_ls = "AzureBlobStorage" is the problem: the dataset's linked_service_name must be a LinkedServiceReference object, not a plain string (that is what the SerializationError is complaining about), so it should be ds_ls = LinkedServiceReference(reference_name=ls_name).
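Applied to the snippet in the question, the fix would look like this (a minimal sketch; it assumes a linked service named 'AzureBlobStorage' already exists in your factory, and that adf_client, rg_name, and df_name are set up as in your code):

from azure.mgmt.datafactory.models import LinkedServiceReference, AzureBlobDataset

ls_name = 'AzureBlobStorage'  # name of an existing linked service in the factory (assumed)
ds_ls = LinkedServiceReference(reference_name=ls_name)  # a reference object, not a plain string

dsOut_name = 'POC_DatasetName'
output_blobpath = '/tempdir'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)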
You can refer to my complete working sample below.
Make sure your service principal has an RBAC role (e.g. Owner or Contributor) under Access control (IAM) on the data factory, and that you have completed all the Prerequisites.
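If you want to sanity-check that role assignment from Python before creating anything, one quick probe (a sketch; it reuses the credentials, subscription_id, and rg_name variables from the sample below) is to list the factories in the resource group:

from azure.mgmt.datafactory import DataFactoryManagementClient

# Assumes credentials, subscription_id, and rg_name are defined as in the full sample below
adf_client = DataFactoryManagementClient(credentials, subscription_id)
for factory in adf_client.factories.list_by_resource_group(rg_name):
    print(factory.name)  # an authorization error here points to a missing RBAC role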
My package versions:
azure-mgmt-datafactory 0.6.0
azure-mgmt-resource 3.1.0
azure-common 1.1.23
Code:
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *

subscription_id = '<subscription-id>'
ls_name = 'storageLinkedService'
rg_name = '<group-name>'
df_name = '<datafactory-name>'

# Authenticate as the service principal
credentials = ServicePrincipalCredentials(client_id='<client id of the service principal>',
                                          secret='<secret of the service principal>',
                                          tenant='<tenant-id>')
resource_client = ResourceManagementClient(credentials, subscription_id)
adf_client = DataFactoryManagementClient(credentials, subscription_id)

# Create an Azure Storage linked service in the factory
storage_string = SecureString(value='DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>')
ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)

# Reference the linked service by name -- this object, not a bare string,
# is what AzureBlobDataset expects for linked_service_name
ds_ls = LinkedServiceReference(reference_name=ls_name)

# Create an Azure blob dataset (output)
dsOut_name = 'ds_out'
output_blobpath = '<container name>/<folder name>'
dsOut_azure_blob = AzureBlobDataset(linked_service_name=ds_ls, folder_path=output_blobpath)
dsOut = adf_client.datasets.create_or_update(rg_name, df_name, dsOut_name, dsOut_azure_blob)
print(dsOut)
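To confirm the dataset actually landed in the factory, a quick follow-up check (a sketch reusing the variables above) is to list the datasets:

# Lists the datasets now present in the factory; 'ds_out' should appear in the output
for ds in adf_client.datasets.list_by_factory(rg_name, df_name):
    print(ds.name)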