
I recently started working with Spark. Currently I am testing a bipartite graph with different vertex and edge types (Spark GraphX with multiple edge types).

From the research I have done, to get different kinds of edges in GraphX, some of them carrying properties, I need to subclass a common edge property type.

Here is a code snippet:

scala> trait VertexProperty 
defined trait VertexProperty 

scala> case class paperProperty(val paperid: Long, val papername: String, val doi: String, val keywords: String) extends VertexProperty 
defined class paperProperty 

scala> case class authorProperty(val authorid: Long, val authorname: String) extends VertexProperty 
defined class authorProperty 

scala> val docsVertces: RDD[(VertexId, VertexProperty)] = docs.rdd.map(x => (x(0).asInstanceOf[VertexId],paperProperty(x(0).asInstanceOf[VertexId],x(1).asInstanceOf[String],x(2).asInstanceOf[String],x(3).asInstanceOf[String]))) 
docsVertces: org.apache.spark.rdd.RDD[(org.apache.spark.graphx.VertexId, VertexProperty)] = MapPartitionsRDD[23] at map at <console>:47 

scala> val authorVertces: RDD[(VertexId, VertexProperty)] = authors.rdd.map(x => (x(0).asInstanceOf[VertexId],authorProperty(x(0).asInstanceOf[Long],x(1).asInstanceOf[String]))) 
authorVertces: org.apache.spark.rdd.RDD[(org.apache.spark.graphx.VertexId, VertexProperty)] = MapPartitionsRDD[24] at map at <console>:41 

scala> val vertices = VertexRDD(docsVertces ++ authorVertces) 
vertices: org.apache.spark.graphx.VertexRDD[VertexProperty] = VertexRDDImpl[28] at RDD at VertexRDD.scala:57 

scala> 

However, I am failing with the edges.

scala> class EdgeProperty() 
defined class EdgeProperty 

scala> case class authorEdgeProperty(val doccount: Long) extends EdgeProperty() 
defined class authorEdgeProperty 

scala> case class citeEdgeProperty() extends EdgeProperty() 
defined class citeEdgeProperty 

scala> // edges using subclasses will not work; we need one consistent superclass 

scala> val docauthoredges = docauthor.map(x => Edge(x(0).asInstanceOf[VertexId],x(1).asInstanceOf[VertexId],  authorEdgeProperty(x(1).asInstanceOf[Long]))) 
docauthoredges: org.apache.spark.sql.Dataset[org.apache.spark.graphx.Edge[authorEdgeProperty]] = [srcId: bigint, dstId: bigint ... 1 more field] 

scala> val docciteedges = doccites.map(x => Edge(x(0).asInstanceOf[VertexId],x(1).asInstanceOf[VertexId], citeEdgeProperty())) 
docciteedges: org.apache.spark.sql.Dataset[org.apache.spark.graphx.Edge[citeEdgeProperty]] = [srcId: bigint, dstId: bigint ... 1 more field] 

scala> docauthoredges.unionAll(docciteedges) 
<console>:52: error: type mismatch; 
found : org.apache.spark.sql.Dataset[org.apache.spark.graphx.Edge[citeEdgeProperty]] 
required: org.apache.spark.sql.Dataset[org.apache.spark.graphx.Edge[authorEdgeProperty]] 
     docauthoredges.unionAll(docciteedges) 
          ^

scala> 

I tried casting the edges to their parent class, and I am receiving the following message:

scala> val docauthoredges = docauthor.map(x => Edge(x(0).asInstanceOf[VertexId],x(1).asInstanceOf[VertexId],   authorEdgeProperty(x(1).asInstanceOf[Long]).asInstanceOf[EdgeProperty])) 
java.lang.UnsupportedOperationException: No Encoder found for EdgeProperty 
- field (class: "EdgeProperty", name: "attr") 
- root class: "org.apache.spark.graphx.Edge" 
    at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:598) 
... 

Any help would be appreciated.

Answer


Your question is somewhat moot, because GraphX doesn't support Datasets and both edges and vertices should be passed as RDDs. But for the sake of argument:

  • You get the first exception because distributed data structures in Spark are invariant. Don't use asInstanceOf to upcast; just annotate the type instead (see the plain-RDD sketch right after this list).
  • You get the second exception because Datasets are further restricted by their use of Encoders. All objects in a Dataset have to use the same Encoder, and in this case the only one that will work is a binary encoder, which is not implicitly accessible for user-defined classes.
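As a side note on the first point: with plain RDDs, which is what GraphX expects anyway, the type annotation alone is enough, because no Encoder is involved. A minimal sketch, reusing the question's classes and assuming the docauthor/doccites column layout from the question's code:

import org.apache.spark.graphx.Edge 
import org.apache.spark.rdd.RDD 

// The ": EdgeProperty" ascription upcasts each attribute without asInstanceOf, 
// so both RDDs share the element type Edge[EdgeProperty] and union compiles. 
val docauthoredges: RDD[Edge[EdgeProperty]] = docauthor.rdd.map(x => 
    Edge(x.getLong(0), x.getLong(1), authorEdgeProperty(x.getLong(1)): EdgeProperty)) 

val docciteedges: RDD[Edge[EdgeProperty]] = doccites.rdd.map(x => 
    Edge(x.getLong(0), x.getLong(1), citeEdgeProperty(): EdgeProperty)) 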

Combining these two pieces:

import org.apache.spark.sql.{Dataset, Encoders} 

sealed trait EdgeProperty 

case class AuthorEdgeProperty(val doccount: Long) extends EdgeProperty 
case class CiteEdgeProperty() extends EdgeProperty 

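// Map to the supertype and pass the binary Kryo encoder explicitly, 
// since no implicit Encoder is available for a user-defined trait. 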
val docauthoredges: Dataset[EdgeProperty] = spark.range(10) 
    .map(AuthorEdgeProperty(_): EdgeProperty)(Encoders.kryo[EdgeProperty]) 

val docciteedges: Dataset[EdgeProperty] = spark.range(5) 
    .map(_ => CiteEdgeProperty(): EdgeProperty)(Encoders.kryo[EdgeProperty]) 

val edges: Dataset[EdgeProperty] = docauthoredges.union(docciteedges) 

Convert to an RDD to make it usable with GraphX:

edges.rdd
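
Alternatively, the plain-RDD edges sketched after the bullet list above can be combined with the vertices RDD built in the question to assemble the graph directly; a hypothetical final step:

import org.apache.spark.graphx.Graph 

// The union type-checks because both edge RDDs share the element type 
// Edge[EdgeProperty]; vertices is the VertexRDD[VertexProperty] from the question. 
val graph: Graph[VertexProperty, EdgeProperty] = 
    Graph(vertices, docauthoredges.union(docciteedges)) 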