
[Docs] `FlyteScopedUserException 'JavaPackage' object is not callable` if spark plugin is not enabled

Open jasonlai1218 opened this issue 1 year ago • 3 comments

Description

When executing a Spark task, if the flyte binary deployment does not have the spark plugin enabled, the following error occurs:

FlyteScopedUserException: 'JavaPackage' object is not callable
 
TypeError: Encountered error while executing workflow 'wf':
  Error encountered while executing 'wf':
  'JavaPackage' object is not callable

But it is difficult to diagnose this cause from the error message alone. I think it should be documented in the troubleshooting guide.
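For reference, a minimal Spark task along these lines is enough to hit the error (the function names and spark_conf values below are placeholders, not my actual workflow). The Python code itself is valid; when the backend does not have the spark plugin enabled, building or using the Spark session is what raises the error:

import flytekit
from flytekit import task, workflow
from flytekitplugins.spark import Spark

@task(
    task_config=Spark(
        spark_conf={
            "spark.driver.memory": "1g",
            "spark.executor.memory": "1g",
        }
    )
)
def count_partitions(n: int) -> int:
    # Fails here with "'JavaPackage' object is not callable" when the
    # backend spark plugin is not enabled.
    sess = flytekit.current_context().spark_session
    return sess.sparkContext.parallelize(range(n)).getNumPartitions()

@workflow
def wf(n: int = 10) -> int:
    return count_partitions(n=n)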

Are you sure this issue hasn't been raised already?

  • [X] Yes

Have you read the Code of Conduct?

  • [X] Yes

jasonlai1218 avatar Feb 03 '24 11:02 jasonlai1218

🤖

Hey there, @jasonlai1218! 👋 I'm Dosu, a bot here to lend a hand with bugs, answer your questions, and guide you on becoming a contributor while we wait for a human maintainer to jump in. Just so you know, I'm here to make things a bit smoother for you. I see you've run into an issue with the Spark Task and the FlyteScopedUserException. I'm diving into it now and will get back to you with a more detailed answer shortly!

dosubot[bot] avatar Feb 03 '24 11:02 dosubot[bot]

#self-assign

MortalHappiness avatar Feb 03 '24 11:02 MortalHappiness

🤖

Thank you for bringing this to our attention. The error message you're seeing, 'JavaPackage' object is not callable, is indeed not very informative about the underlying issue, which is that the Spark plugin is not enabled.

As you've suggested, it would be beneficial to have this information included in the troubleshooting documentation. This would help users understand the need to enable the Spark plugin when executing Spark tasks.

The Spark plugin is registered with Flyte's plugin machinery in the plugin's init() function, which is what makes it available to flytepropeller. Here is the relevant code snippet:

func init() {
	if err := sparkOp.AddToScheme(scheme.Scheme); err != nil {
		panic(err)
	}

	pluginmachinery.PluginRegistry().RegisterK8sPlugin(
		k8s.PluginEntry{
			ID:                  sparkTaskType,
			RegisteredTaskTypes: []pluginsCore.TaskType{sparkTaskType},
			ResourceToWatch:     &sparkOp.SparkApplication{},
			Plugin:              sparkResourceHandler{},
			IsDefault:           false,
		})
}

In this code, sparkOp.AddToScheme(scheme.Scheme) registers the Spark operator's API types with the Kubernetes client scheme. If that fails, the function panics and execution stops.

Then, pluginmachinery.PluginRegistry().RegisterK8sPlugin() is called to register the Spark plugin. The k8s.PluginEntry struct is filled with the necessary information:

  • ID is set to sparkTaskType, which is the identifier for the Spark task type.
  • RegisteredTaskTypes is set to an array containing sparkTaskType, meaning this plugin can handle tasks of type sparkTaskType.
  • ResourceToWatch is set to &sparkOp.SparkApplication{}, meaning this plugin will watch for changes in SparkApplication resources.
  • Plugin is set to sparkResourceHandler{}, which is the handler for Spark resources.
  • IsDefault is set to false, meaning this plugin is not a default plugin.

You can find this code in the flyteplugins/go/tasks/plugins/k8s/spark/spark.go file.
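Note that this registration only makes the plugin available inside flytepropeller; a deployment still has to enable it in the task-plugin configuration. For a flyte-binary install this is done in the Helm values, roughly as sketched below (the keys follow the K8s Spark plugin setup docs, but they may vary by chart and Flyte version, so please verify against your deployment):

configuration:
  inline:
    tasks:
      task-plugins:
        enabled-plugins:
          - container
          - sidecar
          - k8s-array
          - spark   # without this entry, spark tasks are not handled by the Spark plugin
        default-for-task-types:
          container: container
          container_array: k8s-array
          spark: spark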

I hope this helps clarify the issue. If you have any further questions, please don't hesitate to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] avatar Feb 03 '24 11:02 dosubot[bot]