Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask (state=08S01, code=2)

Asked: 2019-02-15 07:13:03

Tags: hive

I have been running this Hive job successfully, but since yesterday it fails with an error after the mapper stage completes. The query and the log are below:

INSERT INTO TABLE zong_dwh.TEMP_P_UFDR_imp6 
SELECT
  from_unixtime(begin_time+5*3600,'yyyy-MM-dd') AS Date1,
  from_unixtime(begin_time+5*3600,'HH') AS Hour1, 
  MSISDN AS MSISDN,
  A.prot_type AS Protocol, 
  B.protocol as Application,
  host AS Domain,
  D.browser_name AS browser_type, 
  cast (null as varchar(10)) as media_format, 
  C.ter_type_name_en as device_category, 
  C.ter_brand_name as device_brand, 
  rat as session_technology, 
  case 
    when rat=1 then Concat(mcc,mnc,lac,ci) 
    when rat=2  then Concat(mcc,mnc,lac,sac) 
    when rat=6 then concat(mcc,mnc,eci) 
  end AS Actual_Site_ID,
  sum(coalesce(L4_DW_THROUGHPUT,0)+coalesce(L4_UL_THROUGHPUT,0)) as total_data_volume,
  sum(coalesce(TCP_UL_RETRANS_WITHPL,0)/coalesce(TCP_DW_RETRANS_WITHPL,1)) AS retrans_rate, 
  sum(coalesce(DATATRANS_UL_DURATION,0) + coalesce(DATATRANS_DW_DURATION,0)) as duration, 
  count(sessionkey) as usage_quantity,
  round(sum(L4_DW_THROUGHPUT)/1024/1024,4)/sum(end_time*1000+end_time_msel-begin_time*1000-begin_time_msel) AS downlink_throughput,
  round(sum(L4_UL_THROUGHPUT)/1024/1024,4)/sum(end_time*1000+end_time_msel-begin_time*1000-begin_time_msel) as uplink_throughput 
from 
  ps.detail_ufdr_http_browsing_17923 A 
  INNER JOIN ps.dim_protocol B ON   B.protocol_id=A.prot_type 
  INNER JOIN ps.dim_terminal C on substr(A.imei,1,8)=C.tac 
  inner join ps.dim_browser_type D on A.browser_type=D.browser_type_id  
Group by
  from_unixtime(begin_time+5*3600,'yyyy-MM-dd'),
  from_unixtime(begin_time+5*3600,'HH'),MSISDN,
  prot_type,
  B.protocol,
  host,
  D.browser_name,
  cast (null as varchar(10)),
  C.ter_type_name_en,
  C.ter_brand_name,
  rat,
  case 
    when rat=1 then Concat(mcc,mnc,lac,ci)  
    when rat=2  then Concat(mcc,mnc,lac,sac) 
    when rat=6 then concat(mcc,mnc,eci)  
  end;

The log:

Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"2019-02-11","_col1":"05","_col2":"3002346407","_col3":146,"_col4":"","_col5":null,"_col6":null,"_col7":"35538908","_col8":6,"_col9":"","_col10":"","_col11":"","_col12":"0ED1102"},"value":{"_col0":75013,"_col1":4.0,"_col2":2253648000,"_col3":5,"_col4":0}}
  at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:256)
  at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:182)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1769)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:176)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"2019-02-11","_col1":"05","_col2":"3002346407","_col3":146,"_col4":"","_col5":null,"_col6":null,"_col7":"35538908","_col8":6,"_col9":"","_col10":"","_col11":"","_col12":"0ED1102"},"value":{"_col0":75013,"_col1":4.0,"_col2":2253648000,"_col3":5,"_col4":0}}
  at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
  ... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public org.apache.hadoop.io.Text org.apache.hadoop.hive.ql.udf.UDFConv.evaluate(org.apache.hadoop.io.Text,org.apache.hadoop.io.IntWritable,org.apache.hadoop.io.IntWritable) on object org.apache.hadoop.hive.ql.udf.UDFConv@2e2f720 of class org.apache.hadoop.hive.ql.udf.UDFConv with arguments {:org.apache.hadoop.io.Text, 16:org.apache.hadoop.io.IntWritable, 10:org.apache.hadoop.io.IntWritable} of size 3
  at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:1034)
  at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
  at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:193)
  at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
  at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
  at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:104)
  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
  at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:1019)
  at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:821)
  at org.apache.hadoop.hive.ql.exec.GroupByOperator.processKey(GroupByOperator.java:695)
  at org.apache.hadoop.hive.ql.exec.GroupByOperator.process(GroupByOperator.java:761)
  at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:235)
  ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:1010)
  ... 18 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
  at org.apache.hadoop.hive.ql.udf.UDFConv.evaluate(UDFConv.java:160)
  ... 23 more
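For context: the innermost Caused by shows Hive's built-in conv() function (UDFConv) failing with ArrayIndexOutOfBoundsException: 0, and the argument list {"":Text, 16:IntWritable, 10:IntWritable} indicates the first argument was an empty string, i.e. effectively conv('', 16, 10). The query shown here never calls conv() directly, so the call presumably comes from somewhere else in the job (an assumption; for example a view or earlier transformation over these tables). A defensive pattern that avoids handing conv() a zero-length string might look like the following sketch, where cell_hex is a hypothetical column standing in for whatever value reaches conv():

  -- Sketch only: cell_hex is a placeholder name, not a column from the
  -- query above. NULLIF maps the empty string to NULL, which conv()
  -- propagates as NULL instead of crashing inside UDFConv.
  SELECT conv(NULLIF(trim(cell_hex), ''), 16, 10) AS cell_dec
  FROM ps.detail_ufdr_http_browsing_17923;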

0 Answers:

No answers yet.