spark-betweenness
How to print sortedVBC?
Hi, I am trying to print sortedVBC. The .jar file builds without errors, but nothing is shown when I run it. This is my source code:
val graph = Graph(vertices, edges, defaultVertex)
val k = 3
val kBCGraph = KBetweenness.run(graph, k)
val verticesBetweenness = kBCGraph.vertices.collect()
val sortedVBC = verticesBetweenness.sortWith((x,y) => x._1 < y._1)
println(s"${sortedVBC(0)._1} should equal (1L)")
Best and thanks in advance
Hi, Can you please describe what isn't working correctly here? Are the VBC values wrong, or sorted wrong? Are you getting anything at all? Is the resulting graph empty? Maybe the original graph you are supplying is empty? NOTE - you are sorting by vertex ids, not by VBC values (a GraphX vertex is an (id, value) pair, so you need to sort on *._2, not *._1). Please attach the graph, or a sample of it, so we can test whether this works correctly. You can also try running some of the tests here to get a feel for how to use the library on some example data.
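To illustrate the sorting point above, here is a minimal sketch in plain Scala (with made-up vertex ids and betweenness values standing in for the collected kBCGraph.vertices) of ordering the (id, value) pairs by the betweenness value rather than the vertex id:

```scala
// Hypothetical collected output: (vertexId, kBetweenness) pairs.
val verticesBetweenness = Array((1L, 0.0), (2L, 6.0), (3L, 4.0))

// Sort descending by the betweenness value (._2), not the vertex id (._1).
val sortedVBC = verticesBetweenness.sortWith((x, y) => x._2 > y._2)

// Interpolating an expression needs ${...}; a bare $sortedVBC(0)._1
// would print the array's toString followed by the literal "(0)._1".
sortedVBC.foreach { case (id, vbc) => println(s"vertex $id -> VBC $vbc") }
```

The same `case`-based pattern works on the real collected array, since `collect()` on a GraphX vertex RDD returns an Array of (VertexId, attribute) tuples.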
Hi, I can now add your library to my project without errors, but I still do not get any result when I run it on my graph. Here I attach a sample of it along with my source code:
val sc = new SparkContext(new SparkConf().setAppName("Spark Count"))
val vertexArray = Array(
  (1L, ("Alice", 28)),
  (2L, ("Bob", 27)),
  (3L, ("Charlie", 65)),
  (4L, ("David", 42)),
  (5L, ("Ed", 55)),
  (6L, ("Fran", 50))
)
val edgeArray = Array(
  Edge(2L, 1L, 7), Edge(2L, 4L, 2),
  Edge(3L, 2L, 4), Edge(3L, 6L, 3),
  Edge(4L, 1L, 1), Edge(5L, 2L, 2),
  Edge(5L, 3L, 8), Edge(5L, 6L, 3)
)
val vertexRDD: RDD[(Long, (String, Int))] = sc.parallelize(vertexArray)
val edgeRDD: RDD[Edge[Int]] = sc.parallelize(edgeArray)
val graph: Graph[(String, Int), Int] = Graph(vertexRDD, edgeRDD)
val kBCGraph = KBetweenness.run(graph, 2)
val verticesBetweenness = kBCGraph.vertices.collect()
val sortedVBC = verticesBetweenness.sortWith((x,y) => x._1 < y._1)
println(s"${sortedVBC(0)._1}")
Did you manage to get it to work? If not, maybe you are not on the recommended Scala/Spark versions listed in the README?