[BUG] %%pretty with Chinese character error
Describe the bug
When using the %%pretty function, if there are Chinese characters in the table, the table cannot be displayed properly and the following error is returned:
An internal error was encountered. Please file an issue at https://github.com/jupyter-incubator/sparkmagic Error: Expected DF rows to be uniform width (581)
To Reproduce
```
%%pretty
df.show()
```
Versions:
- SparkMagic
- Livy (if you know it)
- Spark
I experience the same issue.
I experience the same issue too. When I want to display Chinese characters, it returns this error: Expected DF rows to be uniform width (804)
Thanks for any help, it really confuses me.
This issue should be fixed by this PR, which I just released as part of the 0.20.4 release.
I'm marking this as resolved for now, but please let me know if this is not the case after you upgrade.
Thanks for your reply! But after upgrading to the 0.20.4 release, I still hit the same problem: Expected DF rows to be uniform width (11) but found | a| 你好| (9)
I found the output is correct on the Livy server, but the notebook display returns the error.
Maybe you can use this dataframe to reproduce the problem:

```
%%pretty
df = spark.createDataFrame([("a","你好"),("b","你好")],("key","value"))
df.show(5)
```
@devstein
And these are my versions: Spark 2.4.5, SparkMagic 0.20.4. I don't know the Livy version.
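For context on the numbers in that error: full-width (CJK) characters occupy two display columns in the ASCII table that df.show() prints, but Python's len() counts each of them as a single character, so any check that equates character count with display width can flag rows containing Chinese text. A minimal sketch of the difference, using only the standard-library unicodedata module (this illustrates the suspected cause; it is not sparkmagic's actual parsing code):

```python
import unicodedata

def display_width(text):
    # Count full-width ('F') and wide ('W') East Asian characters as two
    # columns and everything else as one; this approximates how a terminal
    # or monospaced notebook output renders the text.
    return sum(2 if unicodedata.east_asian_width(ch) in ("F", "W") else 1
               for ch in text)

print(len("你好"))            # -> 2 characters
print(display_width("你好"))  # -> 4 display columns
```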
Thanks for the code snippet @baixinzxl. I will investigate once I have bandwidth in the coming weeks. Contributions are welcome if you want to dive into the code!
Thanks, take your time~ I guess the problem may be related to Chinese encoding differences between Livy and Jupyter; hope it helps.
Sorry to disturb you, but I wonder if there are any findings about the problem? Looking forward to hearing from you @devstein ~ thank you ~
Hey @baixinzxl I haven't forgotten about this. I've been stretched for time and have tried to tackle this twice without success. The relevant code is in this file if you want to take a stab at it!
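For anyone who wants to take a stab at it, one possible direction is to make the uniform-width check aware of display columns rather than raw character counts. The sketch below is only a starting point under that assumption; rows_are_uniform is a hypothetical helper, not part of sparkmagic's code:

```python
import unicodedata

def display_width(text):
    # Full-width ('F') and wide ('W') code points render as two columns.
    return sum(2 if unicodedata.east_asian_width(ch) in ("F", "W") else 1
               for ch in text)

def rows_are_uniform(lines):
    # Hypothetical check: treat the show() output as uniform when every
    # line occupies the same number of display columns.
    return len({display_width(line) for line in lines}) <= 1
```

Whether character count or display width is the right measure depends on how Spark pads the cells in show(), so a real fix may need to handle both cases.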