
feat: update all tokenizers to latest versions

Open · niieani opened this issue 1 year ago · 1 comment

I've done some heavy optimization of gpt-tokenizer and was curious how the latest version compares on speed. I've upgraded all the tokenizers and devDependencies and updated the results in the README.md.

I've also created some benchmarks of my own to confirm the results: it's now the fastest and most memory-efficient tokenizer, and it's pure JavaScript. Happy to say it's even faster than WASM or native Node bindings to Rust. 😄
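A micro-benchmark like the ones described above can be sketched roughly as follows. This is a hypothetical harness, not the repo's actual benchmark code; the `naiveTokenize` stand-in, sample text, and iteration counts are all illustrative, and in practice you would pass in a real tokenizer such as gpt-tokenizer's `encode`:

```javascript
// Hypothetical micro-benchmark harness: measures average time per call
// of a given tokenize function, with a warm-up pass so JIT compilation
// doesn't skew the measurement.
function benchmark(name, tokenize, text, iterations = 1000) {
  for (let i = 0; i < 50; i++) tokenize(text); // warm-up
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) tokenize(text);
  const nsPerCall = Number(process.hrtime.bigint() - start) / iterations;
  return { name, usPerCall: nsPerCall / 1000 };
}

// Stand-in tokenizer for illustration only; a real run would use
// something like gpt-tokenizer's encode() here.
const naiveTokenize = (text) => text.split(/\s+/);

const sample = 'The quick brown fox jumps over the lazy dog. '.repeat(40);
const result = benchmark('naive-split', naiveTokenize, sample);
console.log(`${result.name}: ${result.usPerCall.toFixed(2)} µs/call`);
```

For memory-footprint comparisons you would additionally sample `process.memoryUsage().heapUsed` before and after loading each tokenizer, ideally in separate processes so the measurements don't contaminate each other.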


niieani · Sep 23 '24