When trying to insert multiple entities into Azure Table Storage, I get the following error:
com.microsoft.azure.storage.table.TableServiceException: Bad Request
at com.microsoft.azure.storage.table.TableBatchOperation$1.postProcessResponse(TableBatchOperation.java:525)
at com.microsoft.azure.storage.table.TableBatchOperation$1.postProcessResponse(TableBatchOperation.java:433)
at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:146)
Here is the Java code for the batch insert:
public BatchInsertResponse batchInsert(BatchInsertRequest request) {
    BatchInsertResponse response = new BatchInsertResponse();
    String erpName = request.getErpName();
    HashMap<String, List<TableEntity>> tableNameToEntityMap = request.getTableNameToEntityMap();
    HashMap<String, List<TableEntity>> errorMap = new HashMap<String, List<TableEntity>>();
    HashMap<String, List<TableEntity>> successMap = new HashMap<String, List<TableEntity>>();
    CloudTable cloudTable = null;
    for (Map.Entry<String, List<TableEntity>> entry : tableNameToEntityMap.entrySet()) {
        try {
            cloudTable = azureStorage.getTable(entry.getKey());
        } catch (Exception e) {
            e.printStackTrace();
        }
        // Define a batch operation.
        TableBatchOperation batchOperation = new TableBatchOperation();
        List<TableEntity> value = entry.getValue();
        for (int i = 0; i < value.size(); i++) {
            TableEntity entity = value.get(i);
            batchOperation.insertOrReplace(entity);
            if (i != 0 && i % batchSize == 0) {
                try {
                    cloudTable.execute(batchOperation);
                    batchOperation.clear();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
        // Execute the remaining operations.
        try {
            cloudTable.execute(batchOperation);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    return response;
}
The code above works fine if I set batchSize to 10, but if I set it to 100 or 1000 it throws a Bad Request error.
Please help me fix this error. I am using Spring Boot and Azure Storage Java SDK version 4.3.0.
Answer 0 (score: 2)
As Aravind mentioned, a 400 error usually means there is something wrong with your data. According to the linked documentation on entity group transactions, an entity batch transaction will fail if one or more of the following conditions is not met:
- All entities subject to operations as part of the transaction must have the same PartitionKey value.
- An entity can appear only once in the transaction, and only one operation may be performed against it.
- The transaction can include at most 100 entities, and its total payload may be no greater than 4 MB in size.
- All entities in the transaction are subject to the limitations of the Table service data model.
Please check your entities against these four rules and make sure you are not violating any of them.
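In particular, the 100-entity limit matches the symptom: because the code adds each entity to the batch before the i % batchSize check, the batch executed at i = 100 already holds 101 operations, and a batchSize of 1000 exceeds the limit outright. Below is a minimal sketch of the chunking logic, assuming the same SDK 4.3.0 types used in the question (CloudTable, TableBatchOperation, TableEntity); the method name insertInBatches and the MAX_BATCH_SIZE constant are illustrative, not part of the asker's code. It also groups entities by PartitionKey first, since a single batch may only touch one partition:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import com.microsoft.azure.storage.StorageException;
    import com.microsoft.azure.storage.table.CloudTable;
    import com.microsoft.azure.storage.table.TableBatchOperation;
    import com.microsoft.azure.storage.table.TableEntity;

    // Sketch only: group entities by PartitionKey and submit at most 100 operations per batch.
    private static final int MAX_BATCH_SIZE = 100;

    public void insertInBatches(CloudTable cloudTable, List<TableEntity> entities) throws StorageException {
        // Rule 1: every entity in a single batch must share the same PartitionKey.
        Map<String, List<TableEntity>> byPartition = new HashMap<>();
        for (TableEntity entity : entities) {
            byPartition.computeIfAbsent(entity.getPartitionKey(), k -> new ArrayList<>()).add(entity);
        }

        // Rule 3: a batch can contain at most 100 entities, so flush in chunks.
        for (List<TableEntity> partitionEntities : byPartition.values()) {
            TableBatchOperation batch = new TableBatchOperation();
            for (TableEntity entity : partitionEntities) {
                batch.insertOrReplace(entity);
                if (batch.size() == MAX_BATCH_SIZE) {
                    cloudTable.execute(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                // Flush the remaining (fewer than 100) operations.
                cloudTable.execute(batch);
            }
        }
    }

Logging StorageException.getExtendedErrorInformation() (or rethrowing) instead of calling e.printStackTrace() should also reveal exactly which rule the service rejected.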