You can use withColumn to create a column holding the value you want to sum, and then aggregate. For example:
from pyspark.sql import SparkSession, functions as F, types as T

# A SparkSession replaces the sc/sqlContext pair used in older Spark versions
spark = SparkSession.builder.getOrCreate()

schema = T.StructType([
    T.StructField('key', T.IntegerType(), True),
    T.StructField('col1', T.StringType(), True),
    T.StructField('col2', T.StringType(), True)
])

data = [
    (1, 'ABC', 'DEF'),
    (1, 'DEF', 'XYZ'),
    (1, 'DEF', 'GHI')
]

df = spark.createDataFrame(data, schema)

# Flag each row that matches the condition with 1 (else 0), then sum the flags per key
result = df.withColumn('value', F.when((df.col1 == 'ABC') | (df.col2 == 'XYZ'), 1).otherwise(0)) \
    .groupBy('key') \
    .agg(F.sum('value').alias('sum'))

result.show(100, False)
This prints the following result:
+---+---+
|key|sum|
+---+---+
|1 |2 |
+---+---+
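As an aside, you can skip the intermediate column and sum the conditional expression directly inside agg. A minimal sketch, assuming the same df as above:

# Sum the 1/0 flag expression inline, without adding a 'value' column first
result = df.groupBy('key') \
    .agg(F.sum(F.when((df.col1 == 'ABC') | (df.col2 == 'XYZ'), 1).otherwise(0)).alias('sum'))
result.show(100, False)

This gives the same output, since summing the 1/0 flags is equivalent to counting the matching rows per key.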
Thank you very much! withColumn helped me, and I'm now able to run the sum.. :) – Renu