GDScript: Reintroduce binary tokenization on export
This adds back a feature available in 3.x: exporting GDScript files in a binary form by converting the tokens recognized by the tokenizer into a data format.
It is enabled by default on export but can be manually disabled. The format helps with loading times, since the tokens are easily reconstructed, and with hiding the source code, since recovering it would require a specialized tool. Code comments are not stored in this format.
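Godot's actual implementation lives in the C++ GDScript tokenizer and its real on-disk format is not shown here. The following toy Python sketch only illustrates the general idea: tokens become compact `(type, length, text)` records, and comments are simply never written, so they cannot be recovered from the exported data. Python's own `tokenize` module stands in for the GDScript tokenizer.

```python
import io
import struct
import tokenize

def to_binary_tokens(source: str) -> bytes:
    """Toy sketch: serialize a script's tokens into a compact binary blob.
    Python's tokenizer is a stand-in for GDScript's; the record layout
    is invented for illustration, not Godot's real format."""
    out = bytearray()
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.COMMENT:
            continue  # comments are not stored in the binary format
        text = tok.string.encode("utf-8")
        # Each record: 1-byte token type, 2-byte little-endian length, raw text.
        out += struct.pack("<BH", tok.type, len(text)) + text
    return bytes(out)

def from_binary_tokens(blob: bytes) -> list[tuple[int, str]]:
    """Reconstruct (token type, token text) pairs from the blob."""
    tokens, pos = [], 0
    while pos < len(blob):
        tok_type, length = struct.unpack_from("<BH", blob, pos)
        pos += 3
        tokens.append((tok_type, blob[pos:pos + length].decode("utf-8")))
        pos += length
    return tokens

script = "x = 1  # secret note\nprint(x)\n"
texts = [text for _, text in from_binary_tokens(to_binary_tokens(script))]
assert "# secret note" not in texts  # the comment never reaches the export
assert "x" in texts and "print" in texts
```

Note how reconstruction is trivial once the record layout is known, which is why the PR description claims obscurity (a specialized tool is needed) rather than real protection.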
The --test command can also include a --use-binary-tokens flag, which runs the GDScript tests with the binary format instead of the regular source code by converting them in memory before the tests run.
Besides the regular option to export GDScript as binary tokens, this also includes a compression option on top of it. The binary format needs to encode some information which generally makes it bigger than the source text. This option reduces that difference by using Zstandard compression on the buffer.
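The size trade-off described above can be sketched in a few lines. The PR uses Zstandard; since zstd bindings are not in Python's standard library, this sketch substitutes zlib purely to illustrate the principle, and the "token buffer" is fake data standing in for the real binary encoding.

```python
import zlib

# Stand-in for a script: binary token data tends to be larger than the
# source text it encodes, but it is also highly repetitive, so it
# compresses well. (The PR uses Zstandard; zlib is used here only
# because it ships with Python.)
source_text = b"func _ready():\n\tprint('hello')\n" * 100
token_buffer = source_text * 2  # pretend the binary encoding doubled the size

compressed = zlib.compress(token_buffer, level=6)

# Decompression restores the exact buffer, so loading loses nothing.
assert zlib.decompress(compressed) == token_buffer
# The compressed buffer ends up smaller than even the original source text.
assert len(compressed) < len(source_text)
```

The same reasoning applies to the real format: compression does not change what the tokens encode, it only shrinks the on-disk footprint that the binary encoding would otherwise inflate.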
See https://github.com/godotengine/godot-proposals/issues/8605.
But what happened with the intermediate representation proposal, is there any work done there? Why bring back this useless tokenization if there is zero protection for the source code, given that gdsdecomp can recover everything in a second?
> Besides the regular option to export GDScript as binary tokens, this also includes a compression option on top of it. The binary format needs to encode some information which generally makes it bigger than the source text. This option reduces that difference by using Zstandard compression on the buffer.
It would be great to see an exported project size comparison.
To be clear, this is unrelated to the IR proposal. The proposal itself will take some time to solidify because it needs approval from the maintainers, and not everyone thinks it's feasible.
This PR is a palliative to avoid having plain source code export while we wait for that or figure out something else, because a lot of people have been complaining about this issue.
> This PR is a palliative to avoid having plain source code export while we wait for that or figure out something else, because a lot of people have been complaining about this issue.
But returning tokenization does not solve this problem, because scripts are still easily opened with the gdsdecomp tool.
> But returning tokenization does not solve this problem, because scripts are still easily opened with the gdsdecomp tool.
That won't change in any real way with any solution; any compiled/parsed/processed format will be vulnerable to that. It's just a matter of degrees, and this will be the biggest improvement over plain source against the vast majority of "attacks".
> any compiled/parsed/processed format
You can't run decompiled code because it's gibberish, but in the case of GDScript tokenization you absolutely can revert it back to its original form and run it without any issues; this is why this tokenization is useless. I had high hopes for the intermediate representation format, but after almost two months it turned out that no work had even begun.
I'd suggest reading up on this and getting a realistic perspective. If you genuinely believe that decompilation isn't a realistic way of reverse engineering code, then you are not going to have realistic expectations of the outcomes of these things.
> but after almost two months it turned out that no work had even begun
Please be realistic about the time it takes to undertake such a major project, and see the explanations of the expectations given above by vnen. Just because it hasn't happened yet doesn't mean it won't, especially after just a few months.
If you'd rather wait longer with no changes and no performance improvements, then that's your opinion, but I don't think it's reasonable to wait for a feature that doesn't clash with this just to get slightly better obfuscation (not that obfuscation is the main goal here anyway).
(Also, if you're going to block me, please don't react to my comments; that's just petty. I don't want to have to block you back...)
Reducing loading times and applying zstd compression is a good reason to reintroduce binary tokenization on its own merits.
I don't know if I'll have much success reviewing the code.
@MichaelWengren you can decompile and modify anything, be it .NET or anything else. Eventually, instead of an IR, what will most likely happen is GDScript files optionally compiled to binary (C) per platform. That will make it practically impossible to see or modify the code.
Updated to set the compressed mode as the default.
Some accidental reverts, see above
I thought git push --force-with-lease would prevent this. I'll have to be more careful from now on.
Thanks!
Just wanted to say I really appreciate this commit. I don't see obscuring the code as a security feature, but making the files smaller and faster to load is great. Most importantly, having comments stripped is very much a value add for a lot of developers: it's easy to leak things accidentally there, since many devs used to compiled languages would assume comments are not user-visible.
About source safety (reverse-engineering protection, obfuscation, etc.), is there any thread to follow?
@azur-wolf See:
- godotengine/godot-proposals#4220
There are also proposals that can make decompilation more difficult, although they are not directly intended for this:
- godotengine/godot-proposals#8605
- godotengine/godot-proposals#6031 (AOT option)
- godotengine/godot-proposals#3069