mapreduce fails with the message "The request to API call datastore_v3.Put() was too large."

Asked: 2014-04-12 07:50:25

Tags: google-app-engine mapreduce

I am running a mapreduce job over more than 50 million User records.

For each user I read two other datastore entities and then stream each player's stats to BigQuery.

My first dry run (with streaming to BigQuery disabled) failed with the stack trace below.

/_ah/pipeline/handleTask
com.google.appengine.tools.cloudstorage.NonRetriableException: com.google.apphosting.api.ApiProxy$RequestTooLargeException: The request to API call datastore_v3.Put() was too large.
    at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:121)
    at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:166)
    at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:157)
    at com.google.appengine.tools.pipeline.impl.backend.AppEngineBackEnd.tryFiveTimes(AppEngineBackEnd.java:196)
    at com.google.appengine.tools.pipeline.impl.backend.AppEngineBackEnd.saveWithJobStateCheck(AppEngineBackEnd.java:236)

I googled this error, and the only explanation I found was that the Mapper itself can be too large to serialize, but our Mapper carries hardly any data at all.

/**  
 *  Adds stats for a player via streaming api.
 */
public class PlayerStatsMapper extends Mapper<Entity, Void, Void> {

    private static Logger log = Logger.getLogger(PlayerStatsMapper.class.getName());

    private static final long serialVersionUID = 1L;

    private String dataset;
    private String table;

    private transient GbqUtils gbq;

    public PlayerStatsMapper(String dataset, String table) {
        gbq = Davinci.getComponent(GbqUtils.class);

        this.dataset = dataset;
        this.table = table;
    }

    private void readObject(java.io.ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        log.info("IOC reinitializing due to deserialization.");
        gbq = Davinci.getComponent(GbqUtils.class);
    }

    @Override
    public void beginShard() {
    }

    @Override
    public void endShard() {
    }

    @Override
    public void map(Entity value) {
        if (!value.getKind().equals("User")) {
            log.severe("Expected a User but got a " + value.getKind());
            return;
        }

        User user = new User(1, value);

        List<Map<String, Object>> rows = new LinkedList<Map<String, Object>>();
        List<PlayerStats> playerStats = readPlayerStats(user.getUserId());
        addRankings(user.getUserId(), playerStats);

        for (PlayerStats ps : playerStats) {
            rows.add(ps.asMap());
        }

        // if (rows.size() > 0)
        //     gbq.insert(dataset, table, rows);
    }

    // ... private methods omitted

}
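As a sanity check on the "mapper too large to serialize" theory, the serialized size of a mapper instance can be measured with a quick round-trip through Java serialization. Below is a minimal, self-contained sketch; `ToyMapper` is a hypothetical stand-in with the same field shape as the mapper above (two small String fields, one transient reference), since the real `PlayerStatsMapper` depends on framework classes:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for the mapper: two small fields, one transient one.
class ToyMapper implements Serializable {
    private static final long serialVersionUID = 1L;
    private String dataset = "mydataset";
    private String table = "playerstats";
    private transient Object heavyClient = new Object(); // skipped by serialization
}

public class SerializedSizeCheck {
    // Returns the number of bytes Java serialization produces for obj.
    static int serializedSize(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.size();
    }

    public static void main(String[] args) throws IOException {
        System.out.println("serialized size: " + serializedSize(new ToyMapper()) + " bytes");
    }
}
```

If a mapper with this field layout serializes to a few hundred bytes, as expected, the oversized `datastore_v3.Put()` payload is unlikely to come from the mapper's own state.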

The mapreduce job is started with this code:
    MapReduceSettings settings = new MapReduceSettings().setWorkerQueueName("mrworker");
    settings.setBucketName(gae.getAppName() + "-playerstats");

    // @formatter:off  <I, K, V, O, R>
    MapReduceSpecification<Entity, Void, Void, Void, Void> spec = 
            MapReduceSpecification.of("Enque player stats", 
                    new DatastoreInput("User", shardCountMappers),
                    new PlayerStatsMapper(dataset, "playerstats"),          
                    Marshallers.getVoidMarshaller(),
                    Marshallers.getVoidMarshaller(), 
                    NoReducer.<Void, Void, Void> create(), 
                    NoOutput.<Void, Void> create(1));
    // @formatter:on
    String jobId = MapReduceJob.start(spec, settings);
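When the streaming insert in `map()` is re-enabled, a common precaution against oversized request payloads is to split the rows into fixed-size batches before each insert call. A minimal sketch of such a partition helper (generic, stdlib-only; the batch size of 500 is an arbitrary assumption, and `GbqUtils.insert(dataset, table, rows)` from the question would be called once per batch):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class BatchHelper {
    // Splits rows into consecutive batches of at most batchSize elements.
    static <T> List<List<T>> partition(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = Arrays.asList(1, 2, 3, 4, 5);
        // With batchSize 2 this yields [[1, 2], [3, 4], [5]].
        System.out.println(partition(rows, 2));
    }
}
```

In the mapper, this would replace the single `gbq.insert(dataset, table, rows)` call with a loop over `partition(rows, 500)`.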

1 Answer:

Answer 0 (score: 0)

I solved this by downgrading to appengine-mapreduce-0.2.jar, which we had used before. The appengine-mapreduce-0.5.jar used above simply did not work for us.

After going back to 0.2, the pipeline console at /_ah/pipeline/list also started working again!

Has anyone else run into similar problems with 0.5?