
Streaming Error

Open nghiadinhhieu opened this issue 8 years ago • 18 comments

Hello all, I ran the Pi and WordCount samples and they complete correctly, but when I run the Kafka example it shows these errors:

[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - org.apache.spark.SparkException: Could not parse Master URL: ''
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
[2016-11-29 15:23:53,370] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -



    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90



Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.SparkContext when called with 1 parameters ([Index=1, Type=JvmObjectReference, Value=3], )
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
    at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateSparkContext(ISparkConfProxy conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 64
    at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName, String sparkHome, SparkConf conf) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 144
    at Microsoft.Spark.CSharp.Core.SparkContext..ctor(String master, String appName) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Core\SparkContext.cs:line 112
    at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 23
[2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 3, checkCount: 10
[2016-11-29 15:23:53,493] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 3 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
Exception caught: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
[CSharpRunner.main] closing CSharpBackend

    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)

Requesting to close all call back sockets.
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)

    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)

[CSharpRunner.main] Return CSharpBackend code -532462766
    at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)

    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)

Please help me.

nghiadinhhieu · Nov 29 '16 08:11

The error message is "Could not parse Master URL: ''". Are you passing a valid value for the master config setting? Can you share that value?
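For context, Spark accepts master URLs of the form local, local[N], spark://host:port, or yarn; an empty string is rejected exactly as in the log above. A rough, illustrative Python sketch of that validation (not Spark's actual parser, just enough to show why '' fails):

```python
import re

def parse_master_url(master):
    """Illustrative check mirroring common master URL formats Spark accepts.

    This is NOT Spark's real parsing code -- only a sketch to show that an
    empty/blank master string can never match any accepted format.
    """
    patterns = [
        r"^local$",                    # single-threaded local mode
        r"^local\[(\*|\d+)\]$",        # local mode with N threads (or all cores)
        r"^spark://[^\s:]+:\d+$",      # standalone cluster master host:port
        r"^yarn(-client|-cluster)?$",  # YARN deployment modes
    ]
    for p in patterns:
        if re.match(p, master):
            return master
    raise ValueError("Could not parse Master URL: '%s'" % master)
```

So the question above is really: where does the driver get its master value from, and why is it empty by the time SparkContext is constructed?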

skaarthik · Nov 29 '16 19:11

Thanks for replying. I run the Kafka example in local mode. When I debug, it hits an error in the function public static StreamingContext GetOrCreate(string checkpointPath, Func<StreamingContext> creatingFunc), at this line: return new StreamingContext(SparkCLREnvironment.SparkCLRProxy.CreateStreamingContext(checkpointPath));

Here is the Call Stack for debug:

Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.StreamingContextIpcProxy(string checkpointPath) Line 64 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(string checkpointPath) Line 91 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<Microsoft.Spark.CSharp.Streaming.StreamingContext> creatingFunc) Line 84 C#

Exception: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}

I've already created the checkpointPath in HDFS: const string checkpointPath = "hdfs://192.168.10.32:9000/checkpoint"

Here is my app.config:

<appender name="ConsoleAppender" type="log4net.Appender.ConsoleAppender">
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="[%date] [%thread] [%-5level] [%logger] - %message%newline" />
  </layout>
</appender>
<!--** Uncomment the following setting to run Spark driver executable in **debug** mode ** -->
<!--** Setting the port number is optional and needed only to override the default debug port number (5567) -->
<!--** In debug mode, the driver is not launched by CSharpRunner but launched from VS or command prompt not configured for SparkCLR ** -->
<!--** CSharpBackend should be launched in debug mode as well and the port number from that should be used below ** -->
<!--** Command to launch CSharpBackend in debug mode is "sparkclr-submit.cmd debug <port number - optional>" ** -->
<!--** If port number is not specified default debug port number will be used **-->
<!--********************************************************************************************************-->


  <add key="CSharpBackendPortNumber" value="5567"/>


<!--********************************************************************************************************-->
<!--** Uncomment the following setting to override the location of CSharpWorker.exe to use  ** -->
<!--** when running Spark in **local** or ** YARN ** modes ** -->
<!--** If this setting is not used, CSharpWorker.exe will be used from default location - location of driver exe ** -->
<!--********************************************************************************************************-->


  <add key="CSharpWorkerPath" value="C:\Mobius-master\examples\Streaming\Kafka\bin\Debug\CSharpWorker.exe"/>


<!-- *** Settings for Mobius in Linux *** -->

<!--********************************************************************************************************-->
<!--** Uncomment the following setting to use Mobius in Linux - ** CentOS, Fedora or OS X or similar distros **  ** -->
<!--** This setting uses the application layout settings recommended at http://www.mono-project.com/docs/getting-started/application-deployment/#layout-recommendation ** -->
<!--** Make sure CSharpWorker.sh.exe is available in the same location as your Mobius driver application ** -->
<!--** For more instructions refer to https://github.com/Microsoft/Mobius/blob/master/notes/linux-instructions.md#instructions-1 **-->
<!--********************************************************************************************************-->

<!-- for Spark in ** local ** mode -->
<!--
  <add key="CSharpWorkerPath" value="/path/to/mobius/driver/application/CSharpWorker.sh.exe"/>
-->

<!-- for Spark in ** YARN ** mode -->
<!--
  <add key="CSharpWorkerPath" value="CSharpWorker.sh.exe"/>
-->

I also ran this example in standalone cluster mode: sparkclr-submit.cmd --master spark://192.168.10.24:7077 --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug

and it shows these errors:

[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:313)
    at scala.None$.get(Option.scala:311)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSharpBackendHandler.scala:167)
    at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest(CSharpBackendHandler.scala:103)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:30)
    at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpBackendHandler.scala:27)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)

[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-11-30 10:13:08,284] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -



    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 90



Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
    at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\StreamingContextIpcProxy.cs:line 64
    at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 91
    at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String checkpointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Streaming\StreamingContext.cs:line 84
    at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check begin : weakReferences.Count = 24, checkCount: 10
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - Stop releasing as exceeded allowed checkCount: 10
[2016-11-30 10:13:12,468] [3] [DEBUG] [Microsoft.Spark.CSharp.Interop.Ipc.WeakObjectManagerImpl] - check end : released 0 garbage, remain 24 alive, used 0 ms : release garbage used 0 ms, store alive used 0 ms
Exception caught: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
[CSharpRunner.main] closing CSharpBackend

    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)

Requesting to close all call back sockets.
    at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)

[CSharpRunner.main] Return CSharpBackend code -532462766
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)

    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)

nghiadinhhieu · Nov 30 '16 02:11

Which version of Mobius are you using? Do not manually create the checkpoint directory. If the directory exists, Spark will try to load a checkpoint from it; I guess that is what is failing for you.

If not creating the checkpoint directory does not help, avoid using checkpoints until you figure out the root cause.

skaarthik · Nov 30 '16 06:11

I am using the latest Mobius version from https://github.com/Microsoft/Mobius. All unit tests pass, but at runtime I encounter some errors in the streaming context.

I tried not manually creating the checkpoint directory, and I get a new error: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}

Here is the Call Stack:

Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(bool isStatic, object classNameOrJvmObjectReference, string methodName, object[] parameters) Line 135 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(string className, object[] parameters) Line 46 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy.DirectKafkaStream(System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 226 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.KafkaUtils.CreateDirectStream(Microsoft.Spark.CSharp.Streaming.StreamingContext ssc, System.Collections.Generic.List<string> topics, System.Collections.Generic.Dictionary<string, string> kafkaParams, System.Collections.Generic.Dictionary<string, long> fromOffsets) Line 95 C#
SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main.AnonymousMethod__0() Line 45 C#
Microsoft.Spark.CSharp.Adapter.dll!Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(string checkpointPath, System.Func<Microsoft.Spark.CSharp.Streaming.StreamingContext> creatingFunc) Line 79 C#
SparkClrKafka.exe!Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(string[] args) Line 39 C#

nghiadinhhieu · Nov 30 '16 08:11

Can you share Spark's JVM logs? There is a failure in the JVM that triggers the Mobius failure. A web search should turn up how to set log4j settings for Spark.

skaarthik · Nov 30 '16 20:11

This is my command line: sparkclr-submit.cmd --master local[*] --conf spark.local.dir=c:\temp --exe SparkClrKafka.exe C:\Mobius-master\examples\Streaming\Kafka\bin\Debug

And here is my Spark JVM log:

spark.txt

nghiadinhhieu · Dec 01 '16 04:12

Can you confirm that you updated the code to set the topic name and broker list? https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L24 https://github.com/Microsoft/Mobius/blob/master/examples/Streaming/Kafka/Program.cs#L28

skaarthik · Dec 01 '16 05:12

Yes, I've already set these parameters, here is my code:

var sparkContext = new SparkContext(new SparkConf().SetAppName("SparkCLRKafka Example"));
const string topicName = "test";
var topicList = new List<string> { topicName };
var kafkaParams = new Dictionary<string, string> // refer to http://kafka.apache.org/documentation.html#configuration
{
    {"metadata.broker.list", "192.168.10.135:9092"},
    {"auto.offset.reset", "smallest"}
};

nghiadinhhieu · Dec 01 '16 07:12

If you have set valid Kafka parameters, I cannot think of a reason for the failure. There are no exceptions in the Spark logs you shared, so it is not clear what is happening. Set the log level to DEBUG and see if you find anything useful.

skaarthik · Dec 02 '16 05:12

Yes, in the Spark log I cannot see any errors, but when I run the project under the debugger I get 2 errors at 2 code lines:

  1. SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", new object[] { checkpointPath }); ==> even if I let Spark create the checkpointPath, it encounters errors once the path exists. The exception is: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )"}

  2. JvmObjectReference jhelper = SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { }); The exception is: {"JVM method execution failed: Constructor failed for class org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper when called with no parameters"}

I really don't know why; maybe one of your configs could pinpoint the problem.

I found this in the Spark DEBUG logs:

[CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - Javassist: unavailable
[CSharpBackend] DEBUG io.netty.util.internal.PlatformDependent - You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes. Please check the configuration for better performance.

Could this cause the errors?

Thanks a lot.

nghiadinhhieu · Dec 02 '16 07:12

Since you are able to run Pi and WordCount, I doubt the failure has anything to do with CSharpBackend or the DEBUG messages copied above.

To keep the investigation simple, try to run your driver program using a non-debug Mobius setup and Spark local mode (instead of a remote Spark cluster), and share the entire output you see in the console.
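For reference, the Spark log level mentioned earlier can usually be raised by editing conf/log4j.properties in the Spark installation directory. A typical fragment for Spark 1.x (exact path and appender names may differ by version and setup):

```
# conf/log4j.properties -- raise the root logger from INFO to DEBUG
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

This is a config fragment only; restart the driver after changing it so the new level takes effect.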

skaarthik · Dec 03 '16 01:12

Here is my console log in local mode:

...
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:313)
    at scala.None$.get(Option.scala:311)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
    ... 26 more
methods:
public void org.apache.spark.streaming.api.java.JavaStreamingContext.start()
public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop()
public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop(boolean)
public void org.apache.spark.streaming.api.java.JavaStreamingContext.stop(boolean,boolean)
public org.apache.spark.streaming.StreamingContextState org.apache.spark.streaming.api.java.JavaStreamingContext.getState()
public void org.apache.spark.streaming.api.java.JavaStreamingContext.close()
public org.apache.spark.streaming.api.java.JavaPairDStream org.apache.spark.streaming.api.java.JavaStreamingContext.union(org.apache.spark.streaming.api.java.JavaPairDStream,java.util.List)
public org.apache.spark.streaming.api.java.JavaDStream org.apache.spark.streaming.api.java.JavaStreamingContext.union(org.apache.spark.streaming.api.java.JavaDStream,java.util.List)
public org.apache.spark.api.java.JavaSparkContext org.apache.spark.streaming.api.java.JavaStreamingContext.sc()
public org.apache.spark.streaming.api.java.JavaDStream org.apache.spark.streaming.api.java.JavaStreamingContext.transform(java.util.List,org.apache.spark.api.java.function.Function2)
public org.apache.spark.streaming.StreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.ssc()
public void org.apache.spark.streaming.api.java.JavaStreamingContext.awaitTermination()
public void org.apache.spark.streaming.api.java.JavaStreamingContext.awaitTermination(long)
public void org.apache.spark.streaming.api.java.JavaStreamingContext.checkpoint(java.lang.String)
public static java.lang.String[] org.apache.spark.streaming.api.java.JavaStreamingContext.jarOfClass(java.lang.Class)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.hadoop.conf.Configuration,org.apache.spark.streaming.api.java.JavaStreamingContextFactory)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.spark.streaming.api.java.JavaStreamingContextFactory)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.hadoop.conf.Configuration,org.apache.spark.streaming.api.java.JavaStreamingContextFactory,boolean)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.spark.api.java.function.Function0)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.spark.api.java.function.Function0,org.apache.hadoop.conf.Configuration,boolean)
public static org.apache.spark.streaming.api.java.JavaStreamingContext org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(java.lang.String,org.apache.spark.api.java.function.Function0,org.apache.hadoop.conf.Configuration)
public org.apache.spark.api.java.JavaSparkContext org.apache.spark.streaming.api.java.JavaStreamingContext.sparkContext()
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.socketStream(java.lang.String,int,org.apache.spark.api.java.function.Function,org.apache.spark.storage.StorageLevel)
public org.apache.spark.streaming.api.java.JavaDStream org.apache.spark.streaming.api.java.JavaStreamingContext.binaryRecordsStream(java.lang.String,int)
public void org.apache.spark.streaming.api.java.JavaStreamingContext.addStreamingListener(org.apache.spark.streaming.scheduler.StreamingListener)
public boolean org.apache.spark.streaming.api.java.JavaStreamingContext.awaitTerminationOrTimeout(long)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.socketTextStream(java.lang.String,int)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.socketTextStream(java.lang.String,int,org.apache.spark.storage.StorageLevel)
public org.apache.spark.streaming.api.java.JavaDStream org.apache.spark.streaming.api.java.JavaStreamingContext.textFileStream(java.lang.String)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.rawSocketStream(java.lang.String,int,org.apache.spark.storage.StorageLevel)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.rawSocketStream(java.lang.String,int)
public org.apache.spark.streaming.api.java.JavaPairInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.fileStream(java.lang.String,java.lang.Class,java.lang.Class,java.lang.Class,org.apache.spark.api.java.function.Function,boolean,org.apache.hadoop.conf.Configuration)
public org.apache.spark.streaming.api.java.JavaPairInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.fileStream(java.lang.String,java.lang.Class,java.lang.Class,java.lang.Class,org.apache.spark.api.java.function.Function,boolean)
public org.apache.spark.streaming.api.java.JavaPairInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.fileStream(java.lang.String,java.lang.Class,java.lang.Class,java.lang.Class)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.actorStream(akka.actor.Props,java.lang.String)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.actorStream(akka.actor.Props,java.lang.String,org.apache.spark.storage.StorageLevel)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.actorStream(akka.actor.Props,java.lang.String,org.apache.spark.storage.StorageLevel,akka.actor.SupervisorStrategy)
public org.apache.spark.streaming.api.java.JavaInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.queueStream(java.util.Queue,boolean,org.apache.spark.api.java.JavaRDD)
public org.apache.spark.streaming.api.java.JavaInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.queueStream(java.util.Queue,boolean)
public org.apache.spark.streaming.api.java.JavaDStream org.apache.spark.streaming.api.java.JavaStreamingContext.queueStream(java.util.Queue)
public org.apache.spark.streaming.api.java.JavaReceiverInputDStream org.apache.spark.streaming.api.java.JavaStreamingContext.receiverStream(org.apache.spark.streaming.receiver.Receiver)
public org.apache.spark.streaming.api.java.JavaPairDStream org.apache.spark.streaming.api.java.JavaStreamingContext.transformToPair(java.util.List,org.apache.spark.api.java.function.Function2)
public void org.apache.spark.streaming.api.java.JavaStreamingContext.remember(org.apache.spark.streaming.Duration)
public final void java.lang.Object.wait(long,int) throws java.lang.InterruptedException
public final native void java.lang.Object.wait(long) throws java.lang.InterruptedException
public final void java.lang.Object.wait() throws java.lang.InterruptedException
public boolean java.lang.Object.equals(java.lang.Object)
public java.lang.String java.lang.Object.toString()
public native int java.lang.Object.hashCode()
public final native java.lang.Class java.lang.Object.getClass()
public final native void java.lang.Object.notify()
public final native void java.lang.Object.notifyAll()
args:
argType: java.lang.String, argValue: hdfs://192.168.10.32:9000/checkpoint
[2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.util.NoSuchElementException: None.get
    at scala.None$.get(Option.scala:313)
    at scala.None$.get(Option.scala:311)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:108)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:146)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstruct

orAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingC onstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.spark.api.csharp.CSharpBackendHandler.handleMethodCall(CSh arpBackendHandler.scala:167) at org.apache.spark.api.csharp.CSharpBackendHandler.handleBackendRequest (CSharpBackendHandler.scala:103) at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpB ackendHandler.scala:30) at org.apache.spark.api.csharp.CSharpBackendHandler.channelRead0(CSharpB ackendHandler.scala:27) at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChanne lInboundHandler.java:105) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(Abst ractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(Abstra ctChannelHandlerContext.java:294) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToM essageDecoder.java:103) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(Abst ractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(Abstra ctChannelHandlerContext.java:294) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessage Decoder.java:244) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(Abst ractChannelHandlerContext.java:308) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(Abstra ctChannelHandlerContext.java:294) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChanne lPipeline.java:846) at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(Abstra ctNioByteChannel.java:131) at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.jav a:511) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEve ntLoop.java:468) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.ja va:382) at 
io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThread EventExecutor.java:111) at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorato r.run(DefaultThreadFactory.java:137) at java.lang.Thread.run(Thread.java:745)

[2016-12-03 12:13:24,936] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
[2016-12-03 12:13:24,951] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 93

Unhandled Exception: System.Exception: JVM method execution failed: Constructor failed for class org.apache.spark.streaming.api.java.JavaStreamingContext when called with 1 parameters ([Index=1, Type=String, Value=hdfs://192.168.10.32:9000/checkpoint], )
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 135
    at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor(String className, Object[] parameters) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Interop\Ipc\JvmBridge.cs:line 46
    at Microsoft.Spark.CSharp.Proxy.Ipc.StreamingContextIpcProxy..ctor(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\StreamingContextIpcProxy.cs:line 64
    at Microsoft.Spark.CSharp.Proxy.Ipc.SparkCLRIpcProxy.CreateStreamingContext(String checkpointPath) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Proxy\Ipc\SparkCLRIpcProxy.cs:line 91
    at Microsoft.Spark.CSharp.Streaming.StreamingContext.GetOrCreate(String checkpointPath, Func`1 creatingFunc) in C:\Mobius-master\csharp\Adapter\Microsoft.Spark.CSharp\Streaming\StreamingContext.cs:line 84
    at Microsoft.Spark.CSharp.Examples.SparkClrKafkaExample.Main(String[] args) in C:\Mobius-master\examples\Streaming\Kafka\Program.cs:line 39
Exception caught: An existing connection was forcibly closed by the remote host

java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.UnpooledUnsafeDirectByteBuf.setBytes(UnpooledUnsafeDirectByteBuf.java:447)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:242)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
[CSharpRunner.main] closing CSharpBackend
Requesting to close all call back sockets.
[CSharpRunner.main] Return CSharpBackend code -532462766

nghiadinhhieu avatar Dec 03 '16 05:12 nghiadinhhieu

Looks like something went wrong when reading the checkpoint path. Have you ruled out any issue with HADOOP_HOME or winutils.exe by using HDFS paths in the word count or [word count streaming](https://github.com/Microsoft/Mobius/blob/master/notes/running-mobius-app.md#hdfswordcount-example-streaming) examples?
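To make that check concrete, here is a minimal sketch of an HDFS sanity test via the Mobius adapter (a hypothetical standalone program, not one of the bundled examples; the HDFS URI is the one from the error log and should be replaced with your namenode address):

```csharp
// Hypothetical HDFS sanity check (sketch): submit with sparkclr-submit.
// If HADOOP_HOME or winutils.exe is misconfigured, this should fail long
// before any streaming or Kafka code is involved.
using System;
using Microsoft.Spark.CSharp.Core;

class HdfsSanityCheck
{
    static void Main(string[] args)
    {
        // Master and other settings normally come from the submit command line.
        var sc = new SparkContext(new SparkConf().SetAppName("HdfsSanityCheck"));

        // Path assumed for illustration -- point it at any small file on your HDFS.
        var count = sc.TextFile("hdfs://192.168.10.32:9000/some/input.txt").Count();
        Console.WriteLine("line count: " + count);

        sc.Stop();
    }
}
```

If this batch job works against HDFS but the streaming checkpoint path still fails, the Hadoop client setup is probably not the culprit.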

skaarthik avatar Dec 03 '16 05:12 skaarthik

I tested my Hadoop config and it works well. I think my issue relates to the JVM calls made for the StreamingContext, like these: `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.api.java.JavaStreamingContext", ...)` and `SparkCLRIpcProxy.JvmBridge.CallConstructor("org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper", new object[] { })`. I cannot figure out the problem here; maybe it is in my environment setup. Thanks for helping me.
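For context, here is a sketch of how that constructor call is reached from user code, based on the `StreamingContext.GetOrCreate(String, Func`1)` frame in the stack trace (the batch interval, app name, and Kafka wiring below are assumptions for illustration, not the exact example code):

```csharp
// Hypothetical sketch of the failing call path, not the actual Kafka example.
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Streaming;

class KafkaCheckpointSketch
{
    static void Main(string[] args)
    {
        const string checkpointPath = "hdfs://192.168.10.32:9000/checkpoint";

        // GetOrCreate first asks the JVM to rebuild a JavaStreamingContext from
        // the checkpoint directory; the None.get failure in the log happens
        // inside that JVM constructor, before the creating function runs.
        var ssc = StreamingContext.GetOrCreate(checkpointPath, () =>
        {
            // The master URL normally comes from sparkclr-submit; the earlier
            // "Could not parse Master URL: ''" error suggests it never reached
            // the JVM, so the submit command line is worth double-checking.
            var sc = new SparkContext(new SparkConf().SetAppName("SparkClrKafka"));
            var context = new StreamingContext(sc, 2000L); // batch interval (assumed)
            context.Checkpoint(checkpointPath);
            // ... create the Kafka DStream and transformations here ...
            return context;
        });

        ssc.Start();
        ssc.AwaitTermination();
    }
}
```

Since both failures surface in `JvmBridge.CallConstructor`, comparing the constructor arguments the bridge sends with the overloads listed in the error output may help narrow down which side drops the configuration.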

nghiadinhhieu avatar Dec 06 '16 09:12 nghiadinhhieu

Unfortunately I am not able to reproduce this error. I will try out a few things during the holidays to see if I can get the same error for further investigation.

skaarthik avatar Dec 10 '16 00:12 skaarthik

@nghiadinhhieu - are you still having this issue?

skaarthik avatar Jan 28 '17 09:01 skaarthik

Yeah, I upgraded to the version 2 release, but the issue above is still there. Thanks

nghiadinhhieu avatar Feb 08 '17 02:02 nghiadinhhieu

Hi @nghiadinhhieu have you had any luck in solving this issue?

Mrsevic avatar Mar 08 '17 09:03 Mrsevic