How does Pig instantiate UDF objects?

Date: 2016-05-04 13:06:53

Tags: java hadoop apache-pig pig-udf

Can someone tell me how Pig instantiates UDF objects? I built a pipeline with Pig to process some data and deployed it on a multi-node Hadoop cluster. I want to save all the intermediate results produced after each step of the pipeline, so I wrote a UDF in Java that opens an HTTP connection at initialization and transfers data in exec. It also closes the connection in the object's finalize.

My script can be simplified as follows:

REGISTER MyPackage.jar;
DEFINE InterStore test.InterStore('localhost', '58888');
DEFINE Clean      test.Clean();

raw = LOAD 'mydata';
cleaned = FILTER (FOREACH raw GENERATE FLATTEN(Clean(*))) BY NOT ($0 MATCHES '');
cleaned = FOREACH cleaned GENERATE FLATTEN(InterStore(*));
named = FOREACH cleaned GENERATE $1 AS LocationID, $2 AS AccessCount;
named = FOREACH named GENERATE FLATTEN(InterStore(*)) AS (LocationID, AccessCount);
grp = GROUP named BY LocationID;
grp = FOREACH grp GENERATE FLATTEN(InterStore(*)) AS (group, named:{(LocationID, AccessCount)});
sum = FOREACH grp GENERATE group AS LocationID, SUM(named.AccessCount) AS TotalAccesses;
sum = FOREACH sum GENERATE FLATTEN(InterStore(*)) AS (LocationID, TotalAccesses);
ordered = ORDER sum BY TotalAccesses DESC;
STORE ordered INTO 'result';

The code for InterStore can be simplified as follows:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

class InterStore extends EvalFunc<Tuple>{
  HttpURLConnection con;  // Avoid re-establishing the connection in exec
  public InterStore(String ip, String port) throws IOException
  {
    URL url = new URL("http://" + ip + ':' + port);
    con = (HttpURLConnection)url.openConnection();
    con.setRequestMethod("PUT");
    con.setDoOutput(true);
    con.setDoInput(true);
  }
  @Override
  public Tuple exec(Tuple input) throws IOException
  {
    con.getOutputStream().write((input.toDelimitedString(",") + '\n').getBytes());
    return input;
  }
  @Override
  protected void finalize() throws Throwable
  {
    con.getOutputStream().close();
    int respcode = con.getResponseCode();
    BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
    System.out.printf("Resp Code:%d, %s\n", respcode, in.readLine());
    in.close();
  }
}
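Note that nothing above guarantees finalize() ever runs: a Hadoop task JVM may exit without running finalizers, so the connection can be dropped mid-stream. For reference, here is a minimal sketch (a hypothetical LazySender class, not part of the original UDF) of the lazy-open / explicit-close pattern, where the stream is opened on first use and closed from an explicit method; in a Pig UDF that explicit close would go in EvalFunc.finish(), which Pig calls after the last tuple.

```java
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical sketch: open the stream lazily on the first write and close it
// from an explicit, idempotent method instead of relying on finalize().
class LazySender {
  interface StreamFactory { OutputStream open() throws IOException; }

  private final StreamFactory factory;
  private OutputStream out;  // null until the first send()

  LazySender(StreamFactory factory) { this.factory = factory; }

  void send(String line) throws IOException {
    if (out == null) {
      out = factory.open();  // lazy open: runs on the task node, not the client
    }
    out.write((line + "\n").getBytes("UTF-8"));
  }

  // In a Pig UDF, call this from EvalFunc.finish(), which Pig invokes after
  // the last tuple has been processed; finalize() carries no such guarantee.
  void close() throws IOException {
    if (out != null) {
      out.flush();
      out.close();
      out = null;  // idempotent: a second close() is a no-op
    }
  }
}
```

With the real UDF, the factory would wrap url.openConnection(); for local testing, any OutputStream (e.g. a ByteArrayOutputStream) can stand in.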

However, I found that the HTTP connection fails to transfer data successfully the way it does in local mode. How should I handle this?

1 Answer:

Answer 0 (score: 0)

Is there a service listening on 'localhost', '58888'?

Note that localhost is different on each execution node, so you probably want to do this:

%default LHOST 'localhost'

and use this variable as the parameter (it can then be overridden at launch time, e.g. with pig -param LHOST=<host>):

DEFINE InterStore test.InterStore('$LHOST', '58888');

In general, I would add some printouts in the UDF and double-check the arguments passed to it, and I would test the connection (e.g. ping the host and check that the port is reachable from the Hadoop nodes).
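The reachability check suggested above can be scripted rather than done by hand. A small sketch (a hypothetical PortCheck helper; the default host and port just mirror the question) that attempts a TCP connect with a timeout, to be run on each Hadoop node:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hypothetical helper: returns true if host:port accepts a TCP connection
// within timeoutMs. Run on each Hadoop node to verify the UDF's target
// endpoint is actually reachable from there.
class PortCheck {
  static boolean reachable(String host, int port, int timeoutMs) {
    try (Socket s = new Socket()) {
      s.connect(new InetSocketAddress(host, port), timeoutMs);
      return true;
    } catch (IOException e) {
      return false;  // refused, timed out, or unresolvable host
    }
  }

  public static void main(String[] args) {
    String host = args.length > 0 ? args[0] : "localhost";
    int port = args.length > 1 ? Integer.parseInt(args[1]) : 58888;
    System.out.println(host + ":" + port + " reachable: "
        + reachable(host, port, 2000));
  }
}
```

If this prints false on a worker node while the pipeline's target service is up, the UDF's failure is a plain networking problem, not a Pig one.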