
NPM package

Open atgctg opened this issue 2 years ago • 7 comments

Would it be possible to add a wasm target and make tiktoken available for Node.js projects?

I'm currently relying on gpt-3-encoder but would prefer to use tiktoken for performance reasons.

atgctg avatar Jan 22 '23 09:01 atgctg

Thanks for the suggestion! I'm not currently planning on implementing this, but it is likely that at some point we will.

If other people encountering this also have this feature request, please thumbs up the original post.

Note that the third party gpt-3-encoder library will not work exactly right for any of the Codex or GPT-3.5 series models, including code-cushman-001, code-davinci-002, text-davinci-002, text-davinci-003, etc. It will also not work at all for e.g. the recent embeddings models, like text-embedding-ada-002

hauntsaninja avatar Feb 01 '23 20:02 hauntsaninja

Just published a new version of tiktoken that includes a mapping from model to tokeniser. Anything not using r50k is liable to be incorrect (sometimes subtly, sometimes majorly, sometimes majorly but you won't notice) with the third party gpt-3-encoder library: https://github.com/openai/tiktoken/blob/main/tiktoken/model.py#L7
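[Editor's note] The linked mapping can be condensed into a small sketch. This is an illustrative excerpt, not the full table, and reflects the mapping as it stood around the time of this thread:

```javascript
// Sketch of the model -> tokeniser mapping from tiktoken/model.py
// (excerpt only; the real table covers many more models).
const MODEL_TO_ENCODING = {
  "gpt-3.5-turbo": "cl100k_base",
  "text-embedding-ada-002": "cl100k_base",
  "text-davinci-003": "p50k_base",
  "text-davinci-002": "p50k_base",
  "code-davinci-002": "p50k_base",
  "davinci": "r50k_base", // gpt-3-encoder is only correct for r50k models
};

function encodingForModel(model) {
  const enc = MODEL_TO_ENCODING[model];
  if (!enc) throw new Error(`Unknown model: ${model}`);
  return enc;
}
```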

hauntsaninja avatar Feb 03 '23 21:02 hauntsaninja

Hello, I've been working on JS bindings for tiktoken, found here: https://github.com/dqbd/tiktoken. The core methods are implemented; some methods are still missing for now.

You can install it as such:

npm install tiktoken

If it is desired, I could try to wrangle the changes and create an upstream PR.

Note: Couldn't secure the tiktoken NPM package name, as it is currently owned by @gmpetrov

EDIT: Pure JS port is also available as

npm install js-tiktoken
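[Editor's note] Under the hood, all of these libraries implement greedy byte-pair-encoding merges. The toy sketch below shows the core merge loop on characters with a made-up three-entry merge table; real tokenisers operate on bytes and load large, model-specific rank tables:

```javascript
// Toy greedy BPE merge loop. The ranks map is an invented example
// vocabulary; a lower rank means the pair is merged earlier.
const ranks = new Map([
  ["lo", 0],
  ["low", 1],
  ["er", 2],
]);

function bpe(word) {
  // Start from individual characters.
  let parts = Array.from(word);
  while (true) {
    // Find the adjacent pair with the lowest merge rank.
    let best = -1;
    let bestRank = Infinity;
    for (let i = 0; i < parts.length - 1; i++) {
      const rank = ranks.get(parts[i] + parts[i + 1]);
      if (rank !== undefined && rank < bestRank) {
        bestRank = rank;
        best = i;
      }
    }
    if (best === -1) break; // no more merges apply
    parts.splice(best, 2, parts[best] + parts[best + 1]);
  }
  return parts;
}

console.log(bpe("lower")); // → [ 'low', 'er' ]
```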

dqbd avatar Feb 19 '23 14:02 dqbd

Hey, I also made a simple alternative for tiktoken on Node.js! A lot of features are still missing, but I plan to make it a 1:1 port of the Python version.

Link: https://github.com/ceifa/tiktoken-node

Because it relies on a native Node addon instead of WebAssembly, it is 5-6x faster than @dqbd's approach.

ceifa avatar Mar 16 '23 23:03 ceifa

If you just want to get the tokens and USD consumed by messages, you can try it : )

npm install gpt-tokens
import { GPTTokens } from 'gpt-tokens'

const gptTokens = new GPTTokens({
    model   : 'gpt-3.5-turbo',
    messages: [
        { 'role': 'system', 'content': 'You are a helpful assistant' },
        { 'role': 'user', 'content': '' },
    ],
})

// 18
console.log('Tokens: ', gptTokens.usedTokens)
// 0.000036
console.log('USD: ', gptTokens.usedUSD)
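[Editor's note] The count of 18 above lines up with the chat-format overhead described in OpenAI's cookbook for gpt-3.5-turbo-0301: roughly 4 framing tokens per message plus 3 tokens priming the reply. A sketch of that accounting, using a stand-in word-count "encoder" (a real implementation would call one of the tokenisers above; word counting only happens to match the real tokeniser on this short example):

```javascript
// Stand-in encoder: one "token" per whitespace-separated word.
// Real code would use a BPE tokeniser such as tiktoken.
const countTokens = (s) => (s ? s.trim().split(/\s+/).length : 0);

function chatTokens(messages) {
  let total = 0;
  for (const { role, content } of messages) {
    total += 4; // framing tokens per message (gpt-3.5-turbo-0301)
    total += countTokens(role) + countTokens(content);
  }
  return total + 3; // every reply is primed with 3 extra tokens
}

console.log(chatTokens([
  { role: "system", content: "You are a helpful assistant" },
  { role: "user", content: "" },
])); // → 18
```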

Cainier avatar Apr 17 '23 10:04 Cainier

Could someone help me understand the pros and cons of using Xenova/text-embedding-ada-002 with Transformers.js vs one of the other projects listed above?

  • https://huggingface.co/Xenova/text-embedding-ada-002

metaskills avatar Sep 16 '23 12:09 metaskills

There is another one, https://github.com/niieani/gpt-tokenizer, which seems to be a full Node port rather than a wrapper around the Python tiktoken.

seyfer avatar Sep 25 '23 15:09 seyfer