Hive: Python UDF gives "Hive Runtime Error while closing operators"

I have a requirement to convert a few date-format columns into boolean conditions based on some business logic, but the call to the Python script from Hive fails. Below is the script I wrote to convert the date format for one sample column:
import sys

def getYearMonthFromStringDate(dt):
    year = 0
    month = 0
    try:
        ss = dt.split('-')
        year = ss[0]
        month = ss[1]
    except ValueError:
        print "Error parsing date string %s" % dt
    return int(year)*100 + int(month)

for line in sys.stdin:
    tempArr = line.split('\t')
    accountgl0s = tempArr[0]
    agl0 = getYearMonthFromStringDate(accountgl0s)
    output_list = [accountgl0s, ag10]
    print '\t'.join(output_list)
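For reference, here is a sketch of the same script with the apparent problems fixed: `ag10` in `output_list` should be `agl0` (as written it raises a NameError), the integer result must be converted to a string before `'\t'.join`, `ss[1]` raises an IndexError (not the ValueError being caught) when the split yields fewer than two parts, and the trailing newline should be stripped from each input line. This version is written for Python 3 (the original uses Python 2 `print` statements):

```python
import sys

def get_year_month(dt):
    """Convert a 'YYYY-MM-DD' string to the integer YYYYMM; 0 on parse failure."""
    try:
        year, month = dt.split('-')[:2]
        return int(year) * 100 + int(month)
    except (ValueError, IndexError):
        # Write diagnostics to stderr so they don't pollute the TRANSFORM output.
        sys.stderr.write("Error parsing date string %s\n" % dt)
        return 0

def main():
    for line in sys.stdin:
        # Hive feeds tab-separated columns terminated by '\n'; strip the
        # newline so the last field does not carry it into the output.
        fields = line.rstrip('\n').split('\t')
        accountgl0s = fields[0]
        agl0 = get_year_month(accountgl0s)  # note: 'agl0', not 'ag10'
        # join() requires strings, so convert the integer result.
        print('\t'.join([accountgl0s, str(agl0)]))

if __name__ == '__main__':
    main()
```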
I add the file to the distributed cache with the following command:
add file /folder/date.py
Now I call this Python script with TRANSFORM on the column accountgl0s of my Hive table:
Input column accountgl0s = '2016-10-01'
select transform(accountgl0) using 'python date.py' as (accountgl0s,agl0) from sample;
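Because Hive's ScriptOperator only surfaces a generic close error, the underlying failure is easier to see by replaying the script's per-line logic against a fake stdin, the way TRANSFORM would feed it a row. In this sketch (which corrects the `ag10` typo to `agl0` so execution reaches the next bug), the `'\t'.join(...)` call still crashes because the computed value is an integer:

```python
import io

def simulate_transform(stdin_text):
    # Replays the posted per-line logic against a fake stdin, mimicking how
    # Hive's TRANSFORM pipes tab-separated rows to the script.
    out = []
    for line in io.StringIO(stdin_text):
        tempArr = line.split('\t')
        accountgl0s = tempArr[0]
        ss = accountgl0s.split('-')
        agl0 = int(ss[0]) * 100 + int(ss[1])
        out.append('\t'.join([accountgl0s, agl0]))  # bug: agl0 is an int
    return out

try:
    simulate_transform('2016-10-01\n')
except TypeError as exc:
    # str.join() refuses non-string items; the script dies, and Hive reports
    # only the generic "Error 20003" operator-close failure.
    print('script crashes with:', exc)
```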
My expected output is 2016-10-01 201610, but instead I get the following error:
Error: java.lang.RuntimeException: Hive Runtime Error while closing operators
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:217)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:557)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
... 8 more
FAILED: Execution Error, return code 20003 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. An error occurred when trying to close the Operator running your custom script.
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
Above these error messages, the output contains a URL, something like Starting Job = job_1480642810315_083110, Tracking URL = http://hadoop-test15.int.xxxxxx.com:8088/proxy/application_148012122315_081130/. Visit that URL and you can find details about the Python error message. –
Why are you writing a custom Python script to split a string? – gobrewers14