This is very frustrating. My Spark version is 2.1.1 and my Scala version is 2.11, and I get "value reduceByKey is not a member of org.apache.spark.rdd.RDD".
import org.apache.spark.SparkContext._
import com.mufu.wcsa.component.dimension.{DimensionKey, KeyTrait}
import com.mufu.wcsa.log.LogRecord
import org.apache.spark.rdd.RDD
object PV {

  def stat[C <: LogRecord, K <: DimensionKey](statTrait: KeyTrait[C, K], logRecords: RDD[C]): RDD[(K, Int)] = {
    // this is the line the compiler rejects (PV.scala:25 in the error below)
    val t = logRecords.map(record => (statTrait.getKey(record), 1)).reduceByKey((x, y) => x + y)
    t
  }
}
I get this error:
at 1502387780429
[ERROR] /Users/lemanli/work/project/newcma/wcsa/wcsa_my/wcsavistor/src/main/scala/com/mufu/wcsa/component/stat/PV.scala:25: error: value reduceByKey is not a member of org.apache.spark.rdd.RDD[(K, Int)]
[ERROR] val t = logRecords.map(record =>(statTrait.getKey(record),1)).reduceByKey((x,y) => x + y)
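For background: reduceByKey is not declared on RDD itself. It lives on PairRDDFunctions and only becomes reachable through an implicit conversion in the RDD companion object, roughly the following (paraphrased from Spark 2.x source, shown here only for illustration). Because the generic key type K carries no ClassTag, the conversion never applies and the compiler reports that reduceByKey is not a member.

// Paraphrased from Spark 2.x's RDD companion object, not part of this project.
// reduceByKey is defined on PairRDDFunctions; this conversion only applies
// when ClassTag instances for K and V are available at the call site.
implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
    (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null): PairRDDFunctions[K, V] =
  new PairRDDFunctions(rdd)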
The bound is defined by this trait:
trait KeyTrait[C <: LogRecord, K <: DimensionKey] {
  def getKey(c: C): K
}
It compiles after the following change, thanks.
// needs: import scala.reflect.ClassTag
def stat[C <: LogRecord, K <: DimensionKey : ClassTag : Ordering](statTrait: KeyTrait[C, K], logRecords: RDD[C]): RDD[(K, Int)] = {
  val t = logRecords.map(record => (statTrait.getKey(record), 1)).reduceByKey((x, y) => x + y)
The key type also needs to provide an Ordering[T]:
object ClientStat extends KeyTrait[DetailLogRecord, ClientStat] {
  implicit val clientStatSorting = new Ordering[ClientStat] {
    override def compare(x: ClientStat, y: ClientStat): Int = x.key.compare(y.key)
  }

  def getKey(detailLogRecord: DetailLogRecord): ClientStat = new ClientStat(detailLogRecord)
}
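As a sanity check, here is a minimal self-contained sketch of the same pattern. SimpleRecord, SimpleKey, and the local[*] master are hypothetical stand-ins, not part of the original project: once the key type parameter carries a ClassTag (and an Ordering, if you also need ordered operations), reduceByKey resolves.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

// Hypothetical stand-ins for DetailLogRecord / ClientStat, for illustration only.
object ReduceByKeyDemo {
  final case class SimpleRecord(client: String)
  final case class SimpleKey(key: String)

  // Plays the role of ClientStat's clientStatSorting.
  implicit val simpleKeyOrdering: Ordering[SimpleKey] = Ordering.by((k: SimpleKey) => k.key)

  // Same shape as stat: the ClassTag (and Ordering) context bounds let the
  // rddToPairRDDFunctions conversion apply, so reduceByKey now compiles.
  def countByKey[C, K: ClassTag: Ordering](getKey: C => K, records: RDD[C]): RDD[(K, Int)] =
    records.map(r => (getKey(r), 1)).reduceByKey(_ + _)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
    val records = sc.parallelize(Seq(SimpleRecord("a"), SimpleRecord("b"), SimpleRecord("a")))
    countByKey((r: SimpleRecord) => SimpleKey(r.client), records).collect().foreach(println)
    sc.stop()
  }
}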