
Python API

Liorba opened this issue 8 years ago • 5 comments

Hey, after seeing the Spark Summit Europe talk on Magellan, you said a Python API would be available.

Is there still a plan to add it? Thanks,

Liorba avatar Dec 24 '15 12:12 Liorba

Hi @Liorba, I am in the process of cutting a 1.0.4 release against Spark 1.5, and the Python API will be available once that happens. Due to a class-loading issue before Spark 1.5, the Python API was not functional unless you patched Spark, but with 1.0.4 that should be fixed. In fact, the Python API is what is holding up the 1.0.4 release: some implementation changes on the Python side of Spark 1.5 introduced an issue in Magellan's Python code, which I am currently debugging. I'll let you know as soon as 1.0.4 is cut (which should be any day now).

halfabrane avatar Dec 24 '15 15:12 halfabrane
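
For anyone planning to try the Python API when 1.0.4 lands, here is a minimal sketch of what usage would presumably look like from PySpark. It assumes the Python API mirrors the Scala side's "magellan" data source and that the Magellan package is added at launch (e.g. via `--packages` with the coordinates matching your Spark/Scala version); the app name and path below are placeholders.

```python
# Sketch only: assumes Magellan 1.0.4 is on the classpath, e.g. launched with
#   pyspark --packages <magellan coordinates for your Spark/Scala version>
# and that the Python API exposes the same "magellan" data source as Scala.
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="magellan-python-smoke-test")
sqlContext = SQLContext(sc)

# Read a directory of ESRI shapefiles into a DataFrame ("/data/polygons" is a placeholder path).
polygons = sqlContext.read.format("magellan").load("hdfs:///data/polygons")
polygons.printSchema()
polygons.show(5)

sc.stop()
```

The class-loading fix mentioned above matters here because the data source is resolved on the JVM side; before Spark 1.5, that resolution did not work from the Python gateway without patching Spark.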

Thanks @halfabrane for your quick response, really looking forward to trying the new version.

Liorba avatar Dec 28 '15 09:12 Liorba

Hello,

Any update on when this will be ready?

Thank you

robbyki avatar Jan 14 '16 15:01 robbyki

Hi @halfabrane, great news. I'm looking forward to trying this out as well!

GISDev01 avatar Jan 17 '16 22:01 GISDev01

Hi,

While importing the pyspark package on my server, the SparkContext is getting killed. The error is given below; please let me know how to resolve it.

16/03/17 02:28:47 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
	at py4j.Gateway.invoke(Gateway.java:214)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
	at py4j.GatewayConnection.run(GatewayConnection.java:209)
	at java.lang.Thread.run(Thread.java:745)

thenakulchawla avatar Mar 17 '16 07:03 thenakulchawla
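
The error above looks like a cluster-side failure rather than a Magellan problem: "Yarn application has already ended" usually means the YARN application master never started, and the real cause is in the YARN application logs rather than in the driver output. A minimal check, independent of Magellan, is sketched below; the memory settings and app name are placeholders, and if this also fails the issue is in the YARN setup (memory limits, HADOOP_CONF_DIR, queue permissions), not in the import.

```python
# Minimal YARN smoke test without Magellan, for Spark 1.x style yarn-client mode.
# If SparkContext creation fails here too, check the application master logs
# on the cluster; the settings below are illustrative placeholders.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setMaster("yarn-client")
        .setAppName("yarn-smoke-test")
        .set("spark.executor.memory", "1g")
        .set("spark.yarn.am.memory", "512m"))

sc = SparkContext(conf=conf)
print(sc.parallelize(range(10)).sum())  # trivial job to confirm the context works
sc.stop()
```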