Using Spark 1.5.2 with Hive 1.2, I have an external Hive table stored in Parquet format. I wrote a .py script that selects a DataFrame from my_table, performs some transformations, and then tries to write the result back to the original table — but I cannot overwrite the Parquet Hive table from PySpark.
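The script's flow is roughly the following sketch (the table name and the placeholder transformation are hypothetical, and the pyspark imports are deferred into the function so the sketch can be read without a Spark installation):

```python
def overwrite_table_sketch(table_name="my_table"):
    """Sketch of the read-transform-write-back flow (Spark 1.5 HiveContext API).

    Names here (my_table, the `value` column) are placeholders, not the
    real schema. Requires a running Hive metastore to actually execute.
    """
    from pyspark import SparkContext
    from pyspark.sql import HiveContext

    sc = SparkContext(appName="overwrite-sketch")
    sqlContext = HiveContext(sc)

    df = sqlContext.table(table_name)              # read the existing Hive table
    transformed = df.filter(df.value.isNotNull())  # placeholder transformation

    # This is the step that fails for a Parquet-backed table with
    # "Cannot insert overwrite into table that is also being read from".
    transformed.write.insertInto(table_name, overwrite=True)
    sc.stop()
```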
I have tried the following approaches:
df.write.insertInto('table_name', overwrite=True)
This raises the following error:
pyspark.sql.utils.AnalysisException: Cannot insert overwrite into table that is also being read from.
df.write.mode('overwrite').parquet('my_path')
df.write.parquet('my_path', mode='overwrite')
df.write.save('my_path', format='parquet', mode='overwrite')
All of these appear to raise this error:
ERROR Client fs/client/fileclient/cc/client.cc:1802 Thread: 620 Open failed for file /my_path/part-r-00084-9, LookupFid error No such file or directory(2)
2016-04-26 16:47:17,0942 ERROR JniCommon fs/client/fileclient/cc/jni_MapRClient.cc:2488 Thread: 620 getBlockInfo failed, Could not open file /my_path/part-r-00084-9
16/04/26 16:47:17 WARN DAGScheduler: Creating new stage failed due to exception - job: 16
Note that method 1 above works fine when the table's file format is ORC, but it throws the error above for Parquet.
Any suggestions would be greatly appreciated!