Python Simple-Salesforce: changing "concurrencyMode"

Date: 2019-09-16 16:22:36

Tags: python salesforce simple-salesforce

I'm using Python's simple-salesforce package to perform a bulk upload. I'm seeing some inconsistent response errors, and I believe they can be resolved by changing 'concurrencyMode' to 'Serial'.

I don't see that option in the documentation. Does anyone know whether the source code can be updated to add that parameter to the request? I've tried updating the headers in api.py and bulk.py, but with no luck.

Thanks

1 Answer:

Answer 0 (score: 0)

simple-salesforce uses the Salesforce Bulk API 1.0; its bulk methods work by POSTing to https://<salesforce_instance>/services/async/<api_version>/job. In bulk.py, a job is created like this:

    def _create_job(self, operation, object_name, external_id_field=None):
        """ Create a bulk job

        Arguments:

        * operation -- Bulk operation to be performed by job
        * object_name -- SF object
        * external_id_field -- unique identifier field for upsert operations
        """
        payload = {
            'operation': operation,
            'object': object_name,
            'contentType': 'JSON'
        }

This generates the following XML payload:

    <jobInfo
       xmlns="http://www.force.com/2009/06/asyncapi/dataload">
     <operation>...</operation>
     <object>...</object>
     <contentType>JSON</contentType>
    </jobInfo>

To explicitly request a serial job, you need to add a concurrencyMode element to the jobInfo in the request. The snippet should be:

    <jobInfo
       xmlns="http://www.force.com/2009/06/asyncapi/dataload">
     <operation>...</operation>
     <object>...</object>
     <concurrencyMode>Serial</concurrencyMode>
     <contentType>JSON</contentType>
    </jobInfo>

Change _create_job so the payload has the extra element:

    def _create_job(self, operation, object_name, external_id_field=None):
        """ Create a serial bulk job

        Arguments:

        * operation -- Bulk operation to be performed by job
        * object_name -- SF object
        * external_id_field -- unique identifier field for upsert operations
        """
        payload = {
            'operation': operation,
            'object': object_name,
            'concurrencyMode': 'Serial',
            'contentType': 'JSON'
        }
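To illustrate the difference between the two payloads without touching the library, here is a minimal sketch. The helper function `build_job_payload` is hypothetical (it is not part of simple-salesforce's API); it simply mirrors the dict built inside `_create_job`, with an optional flag to inject the serial mode:

```python
def build_job_payload(operation, object_name, serial=False):
    """Build a Bulk API 1.0 jobInfo payload dict.

    Hypothetical helper mirroring the payload constructed inside
    simple-salesforce's _create_job; `serial=True` adds the
    concurrencyMode element described above.
    """
    payload = {
        'operation': operation,
        'object': object_name,
        'contentType': 'JSON',
    }
    if serial:
        # Salesforce Bulk API 1.0 accepts 'Parallel' (default) or 'Serial'
        payload['concurrencyMode'] = 'Serial'
    return payload
```

With `serial=True` the dict gains the `concurrencyMode: Serial` entry, matching the patched `_create_job` above; with the default `serial=False` it matches the library's stock behavior.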
