predictionio-template-java-ecom-recommender
Update Algorithm.java
Fixes the "empty collection" bug.
Thanks @felipexw. Could you port this to the Apache version of PIO here: https://github.com/apache/incubator-predictionio ? Create the PR against the develop branch. Future releases will come through Apache. A little more explanation of the error conditions this solves would be great.
When querying for recommendations for a user (u6), I got the following error in response:
Query:
{ "userEntityId": "u6", "number": 4 }
Stack Trace:
java.lang.RuntimeException: empty collection
at org.example.recommendation.Algorithm.getRecentProductFeatures(Algorithm.java:256)
at org.example.recommendation.Algorithm.predict(Algorithm.java:186)
at org.example.recommendation.Algorithm.predict(Algorithm.java:35)
at org.apache.predictionio.controller.PAlgorithm.predictBase(PAlgorithm.scala:76)
at org.apache.predictionio.workflow.ServerActor$$anonfun$24$$anonfun$25.apply(CreateServer.scala:490)
at org.apache.predictionio.workflow.ServerActor$$anonfun$24$$anonfun$25.apply(CreateServer.scala:489)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.immutable.List.map(List.scala:285)
at org.apache.predictionio.workflow.ServerActor$$anonfun$24.apply(CreateServer.scala:489)
at org.apache.predictionio.workflow.ServerActor$$anonfun$24.apply(CreateServer.scala:467)
at spray.routing.ApplyConverterInstances$$anon$22$$anonfun$apply$1.apply(ApplyConverterInstances.scala:25)
at spray.routing.ApplyConverterInstances$$anon$22$$anonfun$apply$1.apply(ApplyConverterInstances.scala:24)
at spray.routing.ConjunctionMagnet$$anon$1$$anon$2$$anonfun$happly$1$$anonfun$apply$1.apply(Directive.scala:38)
at spray.routing.ConjunctionMagnet$$anon$1$$anon$2$$anonfun$happly$1$$anonfun$apply$1.apply(Directive.scala:37)
at spray.routing.directives.BasicDirectives$$anon$1.happly(BasicDirectives.scala:26)
at spray.routing.ConjunctionMagnet$$anon$1$$anon$2$$anonfun$happly$1.apply(Directive.scala:37)
at spray.routing.ConjunctionMagnet$$anon$1$$anon$2$$anonfun$happly$1.apply(Directive.scala:36)
at spray.routing.directives.BasicDirectives$$anon$2.happly(BasicDirectives.scala:79)
at spray.routing.Directive$$anon$7$$anonfun$happly$4.apply(Directive.scala:86)
at spray.routing.Directive$$anon$7$$anonfun$happly$4.apply(Directive.scala:86)
at spray.routing.directives.BasicDirectives$$anon$3$$anonfun$happly$1.apply(BasicDirectives.scala:92)
at spray.routing.directives.BasicDirectives$$anon$3$$anonfun$happly$1.apply(BasicDirectives.scala:92)
at spray.routing.directives.ExecutionDirectives$$anonfun$detach$1$$anonfun$apply$7$$anonfun$apply$1.apply$mcV$sp(ExecutionDirectives.scala:89)
at spray.routing.directives.ExecutionDirectives$$anonfun$detach$1$$anonfun$apply$7$$anonfun$apply$1.apply(ExecutionDirectives.scala:89)
at spray.routing.directives.ExecutionDirectives$$anonfun$detach$1$$anonfun$apply$7$$anonfun$apply$1.apply(ExecutionDirectives.scala:89)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.UnsupportedOperationException: empty collection
at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1370)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.first(RDD.scala:1367)
at org.apache.spark.api.java.JavaPairRDD.first(JavaPairRDD.scala:221)
at org.example.recommendation.Algorithm.getRecentProductFeatures(Algorithm.java:234)
... 34 more
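For context: the root cause is that Spark's RDD.first() throws UnsupportedOperationException("empty collection") when called on an empty RDD, which is what happens at Algorithm.java:234 inside getRecentProductFeatures when the user has no recent events. A typical fix for this class of bug is to guard the first()/take() call or fall back to a default. The Spark-free sketch below (method names are illustrative, not the template's actual code) reproduces the failure mode with a plain List and shows the guarded variant:

```java
import java.util.Collections;
import java.util.List;

public class EmptyFirstDemo {
    // Stand-in for RDD.first(): throws on an empty collection,
    // just as Spark's RDD.first() does.
    static <T> T first(List<T> xs) {
        if (xs.isEmpty()) {
            throw new UnsupportedOperationException("empty collection");
        }
        return xs.get(0);
    }

    // Guarded variant: return a fallback instead of throwing,
    // mirroring the kind of guard a fix in getRecentProductFeatures needs.
    static <T> T firstOrDefault(List<T> xs, T fallback) {
        return xs.isEmpty() ? fallback : xs.get(0);
    }

    public static void main(String[] args) {
        List<String> empty = Collections.emptyList();
        boolean threw = false;
        try {
            first(empty); // reproduces the "empty collection" failure
        } catch (UnsupportedOperationException e) {
            threw = true;
        }
        System.out.println(threw);                         // true
        System.out.println(firstOrDefault(empty, "none")); // none
    }
}
```

In actual Spark code the equivalent guard is to check JavaRDD.isEmpty() (or use take(1) and inspect the result) before calling first().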
@felipexw Is this issue addressed in this PR?