nghiadinhhieu

8 comments by nghiadinhhieu

Thanks for replying. I ran the Kafka example in local mode. When I debug it, it hits an error in the function `public static StreamingContext GetOrCreate(string checkpointPath, Func<StreamingContext> creatingFunc)` ==> Error at code line: return...
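For reference, a minimal sketch of how that `GetOrCreate` overload is typically invoked, modeled on the layout of the Mobius Kafka example; the checkpoint path and batch duration below are placeholder assumptions, and exact constructor signatures may vary between Mobius versions:

```csharp
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Streaming;

class GetOrCreateSketch
{
    static void Main()
    {
        const string checkpointPath = @"c:\temp\checkpoint"; // placeholder path (assumption)

        // Restores a StreamingContext from the checkpoint if one exists,
        // otherwise runs the factory delegate to build a fresh context.
        StreamingContext ssc = StreamingContext.GetOrCreate(checkpointPath, () =>
        {
            var sc = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
            var context = new StreamingContext(sc, 2000L); // 2-second batches, in ms (assumed)
            context.Checkpoint(checkpointPath);
            return context;
        });

        ssc.Start();
        ssc.AwaitTermination();
    }
}
```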

I am using the latest Mobius version from https://github.com/Microsoft/Mobius. All unit tests pass, but at runtime I hit some errors in the streaming context. I tried not to manually create...

This is my command line: `sparkclr-submit.cmd --master local[*] --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug` And here are my Spark JVM logs: [spark.txt](https://github.com/Microsoft/Mobius/files/623669/spark.txt)

Yes, I've already set those parameters; here is my code: `var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example")); const string topicName = "test"; var topicList = new List<string> { topicName }; var kafkaParams...`
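To show the shape of what usually follows that truncated snippet, here is a hedged sketch based on the Mobius examples/Streaming/Kafka sample; the broker address, offsets, and batch interval are placeholder assumptions, and the `KafkaUtils.CreateDirectStream` parameter list may differ slightly across Mobius versions:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Streaming;

class KafkaDirectStreamSketch
{
    static void Main()
    {
        const string checkpointPath = @"c:\temp\checkpoint";           // placeholder (assumption)
        var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
        const string topicName = "test";
        var topicList = new List<string> { topicName };

        var kafkaParams = new Dictionary<string, string>
        {
            { "metadata.broker.list", "localhost:9092" },              // placeholder broker address
            { "auto.offset.reset", "smallest" }                        // assumed setting
        };
        var perTopicPartitionOffsets = new Dictionary<string, long>(); // empty => default starting offsets

        var ssc = StreamingContext.GetOrCreate(checkpointPath, () =>
        {
            var context = new StreamingContext(sparkContext, 2000L);   // batch interval in ms (assumed)
            context.Checkpoint(checkpointPath);

            // Receiver-less ("direct") Kafka stream of raw key/value byte[] pairs.
            var stream = KafkaUtils.CreateDirectStream(context, topicList, kafkaParams, perTopicPartitionOffsets);
            stream.ForeachRDD(rdd => Console.WriteLine("Records in batch: " + rdd.Count()));

            return context;
        });

        ssc.Start();
        ssc.AwaitTermination();
    }
}
```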

Yes, in the Spark log I can't see any errors, but when I debug the project I get 2 errors at 2 code lines: 1. SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", new object[] { checkpointPath...
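If it helps to see exactly what the bridge re-throws on the .NET side, one option is a small wrapper around the user-level call; this is only a hypothetical diagnostic sketch (the helper name is made up, not part of the Mobius API):

```csharp
using System;
using Microsoft.Spark.CSharp.Streaming;

static class StreamingDiagnostics
{
    // Hypothetical helper: wraps the user-level GetOrCreate call so the
    // exception re-thrown from the C#-to-JVM bridge (CallConstructor) is
    // printed in full, inner exceptions and JVM stack trace included.
    public static StreamingContext CreateWithDiagnostics(string checkpointPath, Func<StreamingContext> factory)
    {
        try
        {
            return StreamingContext.GetOrCreate(checkpointPath, factory);
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine(ex.ToString());
            throw;
        }
    }
}
```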

Here is my console log in local mode: ... at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)...

I tested that my Hadoop configs work well. I think my issue relates to calling the JVM StreamingContext functions like this: "SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext"" and the function: SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { }); I...

Yeah, I upgraded to the version 2 release, but the issue above is still there. Thanks