
[BUG] File Size (Parsing Errors)

ImInTheICU opened this issue 2 years ago · 10 comments

Describe the bug: When uploading a 2 MB file, I got stuck on the parsing screen.

Expected behavior: To obfuscate the file.

To Reproduce: Upload a file 2 MB in size.

Screenshots: (screenshot attached)

Additional context: None; I just wish it worked with medium-sized files.

ImInTheICU · Aug 17 '22 23:08

(Update: still on the parsing screen; it's now been 24 minutes.)

ImInTheICU · Aug 17 '22 23:08

This is a known issue, and I am working on fixing it. My parser (actually mostly the tokenizer) is really slow.

levno-710 · Aug 18 '22 07:08
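For context, here is a minimal Lua sketch of the kind of hot loop that commonly makes tokenizers slow on large inputs. This is illustrative only, not the actual Prometheus tokenizer; `readIdentSlow` and `readIdentFast` are hypothetical names:

```lua
-- Slow pattern: each `..` copies the whole accumulated string, so reading
-- an n-character token costs O(n^2) time in total.
local function readIdentSlow(source, pos)
    local text = ""
    while pos <= #source and source:sub(pos, pos):match("[%w_]") do
        text = text .. source:sub(pos, pos) -- allocates a new string each time
        pos = pos + 1
    end
    return text, pos
end

-- Faster pattern: locate the end of the token once with an anchored find,
-- then take a single substring.
local function readIdentFast(source, pos)
    local first, last = source:find("^[%w_]+", pos)
    if not first then return nil, pos end
    return source:sub(first, last), last + 1
end
```

On a 2 MB input, the difference between these two shapes can easily be the difference between seconds and tens of minutes.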

> This is a known issue, and I am working on fixing it. My parser (actually mostly the tokenizer) is really slow.

This is also a problem in other obfuscators. ByteLuaObfuscator, another really good one, can only handle 500 KB before failing with an out-of-memory error.

ImInTheICU · Aug 18 '22 07:08

> This is a known issue, and I am working on fixing it. My parser (actually mostly the tokenizer) is really slow.

Also, do you know when you will push a fix? We're still waiting for the random strings mentioned in another ticket. I'm not sure what you would call this... a ticket, a problem, an issue? It's not really an issue, it was more of a suggestion, so I will call it a ticket. (Sorry if that came out rude, I didn't mean it like that.)

ImInTheICU · Aug 18 '22 07:08

I am currently not able to work on Prometheus a lot, because I am in an internship where I have to work full time, but I will try to address these issues as fast as possible.

levno-710 · Aug 18 '22 08:08

> I am currently not able to work on Prometheus a lot, because I am in an internship where I have to work full time, but I will try to address these issues as fast as possible.

That sucks.

ImInTheICU · Aug 18 '22 08:08

> I am currently not able to work on Prometheus a lot, because I am in an internship where I have to work full time, but I will try to address these issues as fast as possible.

Also, do you think there's a way around this 500 KB limit?

ImInTheICU · Aug 18 '22 08:08

The only way that I think would be possible would be to rewrite the parser and tokenizer.

levno-710 · Aug 18 '22 09:08
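A minimal sketch of what such a rewrite could look like: a pattern-driven loop where each token class is matched once with an anchored Lua pattern, so the tokenizer does one allocation per token rather than per character. The token classes here are deliberately simplified (no string escapes, single-character operators only), and this is not actual Prometheus code:

```lua
-- Each entry is { anchored pattern, token kind }.
local patterns = {
    { "^%s+",          "whitespace" },
    { "^[%a_][%w_]*",  "name"       },
    { "^%d+%.?%d*",    "number"     },
    { '^"[^"]*"',      "string"     }, -- simplified: no escape handling
    { "^%p",           "symbol"     }, -- simplified: single-char operators only
}

local function tokenize(source)
    local tokens, pos = {}, 1
    while pos <= #source do
        local matched = false
        for _, p in ipairs(patterns) do
            -- `^` anchors the match at `pos`, so find() never scans ahead.
            local first, last = source:find(p[1], pos)
            if first then
                if p[2] ~= "whitespace" then
                    tokens[#tokens + 1] = { kind = p[2], text = source:sub(first, last) }
                end
                pos = last + 1
                matched = true
                break
            end
        end
        if not matched then
            error(("unexpected character at byte %d"):format(pos))
        end
    end
    return tokens
end
```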

Or to split the entire file into multiple chunks that require each other.

levno-710 · Aug 18 '22 09:08
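A rough sketch of that chunk-splitting idea (hypothetical helpers `splitIntoChunks` and `emitLoader`, not an existing Prometheus feature; it assumes the input is already split at top-level statement boundaries and that the chunks don't share local variables, which would first have to be promoted to a shared table):

```lua
-- Split a list of top-level statements into fixed-size chunks of source text.
local function splitIntoChunks(statements, chunkSize)
    local chunks = {}
    for i = 1, #statements, chunkSize do
        local part = {}
        for j = i, math.min(i + chunkSize - 1, #statements) do
            part[#part + 1] = statements[j]
        end
        chunks[#chunks + 1] = table.concat(part, "\n")
    end
    return chunks
end

-- Emit a tiny loader that requires each obfuscated chunk in order.
-- Assumes the chunks are written out as chunk1.lua, chunk2.lua, ...
local function emitLoader(chunkCount)
    local lines = {}
    for i = 1, chunkCount do
        lines[#lines + 1] = ('require("chunk%d")'):format(i)
    end
    return table.concat(lines, "\n")
end
```

Each chunk would then be tokenized, parsed, and obfuscated separately, keeping every individual parse well under the size that stalls the tokenizer.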

> Or to split the entire file into multiple chunks that require each other.

That could work, but it's kind of cheap; rewriting the tokenizer and parser is needed. Thanks!

ImInTheICU · Aug 18 '22 09:08

Closing this issue, as #83 is the same.

levno-710 · Sep 06 '22 07:09