Running with a JSON configuration file does not work
Search before asking
- [X] I had searched in the issues and found no similar issues.
What happened
Running the sample configuration file fails with an error.
SeaTunnel Version
2.1.3
SeaTunnel Config
{
"env" : {
"spark.app.name" : "SeaTunnel",
"spark.executor.instances" : 2,
"spark.executor.cores" : 1,
"spark.executor.memory" : "1g"
},
"source" : [
{
"result_table_name" : "my_dataset",
"plugin_name" : "Fake"
}
],
"transform" : [],
"sink" : [
{
"plugin_name" : "Console"
}
]
}
Running Command
./bin/start-seatunnel-spark.sh --master local -c ~/Downloads/test.conf --deploy-mode client
Error Exception
log4j:WARN No appenders could be found for logger (org.apache.seatunnel.core.base.config.ConfigBuilder).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.seatunnel.shade.com.typesafe.config.ConfigException$WrongType: /Users/chenhu/Downloads/test.conf: 9: source has type list of LIST rather than list of OBJECT
at org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getHomogeneousWrappedList(SimpleConfig.java:452)
at org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getObjectList(SimpleConfig.java:460)
at org.apache.seatunnel.shade.com.typesafe.config.impl.SimpleConfig.getConfigList(SimpleConfig.java:465)
at org.apache.seatunnel.core.spark.SparkStarter.lambda$getPluginIdentifiers$9(SparkStarter.java:306)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at org.apache.seatunnel.core.spark.SparkStarter.getPluginIdentifiers(SparkStarter.java:312)
at org.apache.seatunnel.core.spark.SparkStarter.getConnectorJarDependencies(SparkStarter.java:221)
at org.apache.seatunnel.core.spark.SparkStarter.buildCommands(SparkStarter.java:155)
at org.apache.seatunnel.core.spark.SparkStarter.main(SparkStarter.java:109)
Flink or Spark Version
Spark 3.3 on Hadoop 3
Java or Scala Version
Java 8
Screenshots
No response
Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
The plugin configuration has some custom handling, which may be related to this issue.
The format might need to look like this:
{
"env": {
"spark.app.name": "SeaTunnel",
"spark.executor.instances": 2,
"spark.executor.cores": 1,
"spark.executor.memory": "1g"
},
"source": [{
"Fake": {
"result_table_name": "my_dataset",
"plugin_name": "Fake"
}
}],
"transform": [],
"sink": [{
"Console": {}
}]
}
Not working; the same error occurs.
AFAIK, the following config works:
{
"env" : {
"spark.app.name" : "SeaTunnel",
"spark.executor.instances" : 2,
"spark.executor.cores" : 1,
"spark.executor.memory" : "1g"
},
"source" : {
"fake": {
"result_table_name": "my_dataset"
}
},
"transform" : {},
"sink" : {
"console": {}
}
}
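Saved as test.conf, it should run with the same command used in the report:

./bin/start-seatunnel-spark.sh --master local -c ~/Downloads/test.conf --deploy-mode client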
The error only occurs when there are multiple plugins in one section.
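For illustration, "multiple plugins in one section" means falling back to the list form, which is the shape that fails here. A minimal sketch with two Fake sources (the table names are made up):

"source" : [
  {
    "plugin_name" : "Fake",
    "result_table_name" : "dataset_a"
  },
  {
    "plugin_name" : "Fake",
    "result_table_name" : "dataset_b"
  }
]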
@chenhu
{
"env" : {
"spark.app.name" : "SeaTunnel",
"spark.executor.instances" : 2,
"spark.executor.cores" : 1,
"spark.executor.memory" : "1g"
},
"source" : [
{
"result_table_name" : "my_dataset",
"plugin_name" : "Fake"
}
],
"transform" : [],
"sink" : [
{
"plugin_name" : "Console"
}
]
}
Please change the file name to test.json and retry.
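Something like this, reusing the path and command from the original report:

mv ~/Downloads/test.conf ~/Downloads/test.json
./bin/start-seatunnel-spark.sh --master local -c ~/Downloads/test.json --deploy-mode client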
This issue has been automatically marked as stale because it has not had any activity for 30 days. It will be closed in the next 7 days if no further activity occurs.
This issue has been closed because it has not received a response for a long time. You can reopen it if you encounter similar problems in the future.