feat(ui): add Token Estimator link to footer
Summary
This PR adds a "Token Estimator" link to the footer of the Gitingest web frontend. The link points to https://gitingest.com/tokencount, allowing users to easily access a tool for estimating token counts in pasted text.
Details
- UI Change:
  - Added a new link labeled Token Estimator (with a counter icon) to the left column of the footer, alongside the Extension and Python package links.
  - The link opens in a new tab and is styled consistently with the other resource links.
  - No changes were made to the CLI or Python package.
- Why:
  - Addresses and closes #318 by making token estimation more visible and accessible to users.
- How to test:
  - Start the server (`cd src && uvicorn server.main:app --reload`).
  - Open the app in your browser.
  - Scroll to the footer and verify the "Token Estimator" link appears and opens https://gitingest.com/tokencount in a new tab.
Screenshot
Closes #318
@HmbleCreator Thanks, but this is just the link. It seems to me that https://github.com/cyclotruc/gitingest/issues/318 would require actually implementing the backend route for this new feature. Do you think you can work on this?
Thanks for pointing that out! Yes, I’d be happy to implement the backend route for token estimation as part of #318. I can add an endpoint (e.g. /api/estimate-tokens) that accepts raw text, estimates the token count using tiktoken or a similar utility, and returns the result. Let me know if you have a preferred response schema or want it namespaced differently; I'd be happy to get started.
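For concreteness, here is a minimal sketch of what that endpoint could look like; the route path, request/response models, and encoding choice are assumptions rather than a final design:

```python
# Hypothetical sketch of the proposed /api/estimate-tokens endpoint.
from fastapi import FastAPI
from pydantic import BaseModel
import tiktoken

app = FastAPI()


class TokenEstimateRequest(BaseModel):
    text: str


class TokenEstimateResponse(BaseModel):
    token_count: int


@app.post("/api/estimate-tokens", response_model=TokenEstimateResponse)
async def estimate_tokens(payload: TokenEstimateRequest) -> TokenEstimateResponse:
    # cl100k_base is the encoding used by GPT-3.5/GPT-4; picked here as a default.
    encoding = tiktoken.get_encoding("cl100k_base")
    return TokenEstimateResponse(token_count=len(encoding.encode(payload.text)))
```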
Are we only gonna have support for OpenAI models or other models also, like open source models available via HF?
The token counting functionality uses two efficient tokenizer libraries:
- `tiktokenizer` for OpenAI models - provides fast and accurate token counting for GPT models
- `autotiktokenizer` for non-OpenAI models - enables efficient token counting across a wide range of open-source models
This dual approach ensures we can provide fast and accurate token counting regardless of the model being used, while keeping the implementation lightweight and efficient.
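As a rough illustration of that dual approach (the dispatch helper, the model identifiers, and the use of the `tiktoken` and `autotiktokenizer` packages here are assumptions, not the actual PR code):

```python
# Illustrative sketch only: assumes the tiktoken and autotiktokenizer packages;
# the dispatch logic and model names below are hypothetical.
import tiktoken
from autotiktokenizer import AutoTikTokenizer


def count_tokens(text: str, model: str = "gpt-4") -> int:
    """Count tokens for OpenAI models via tiktoken, falling back to a
    Hugging Face tokenizer wrapped by autotiktokenizer for other models."""
    try:
        encoding = tiktoken.encoding_for_model(model)  # OpenAI models
    except KeyError:
        # Unknown to tiktoken: treat `model` as a Hugging Face repo id and
        # build a tiktoken-compatible encoder from its tokenizer.
        encoding = AutoTikTokenizer.from_pretrained(model)
    return len(encoding.encode(text))


# Usage (model names are examples):
print(count_tokens("Hello, world!"))                               # OpenAI path
print(count_tokens("Hello, world!", "mistralai/Mistral-7B-v0.1"))  # HF path
```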
I'll test it out in a few hours.
I see that both the logic and the routing are defined and implemented within the server entrypoint; maybe move this to the "routers/" folder.
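For illustration, moving the route into its own module could look roughly like this; the file name and the endpoint body are assumptions based on a typical FastAPI routers/ layout, not the PR's actual code:

```python
# src/server/routers/token_estimate.py (hypothetical module name)
from fastapi import APIRouter
from pydantic import BaseModel
import tiktoken

router = APIRouter()


class TokenEstimateRequest(BaseModel):
    text: str


@router.post("/api/estimate-tokens")
async def estimate_tokens(payload: TokenEstimateRequest) -> dict[str, int]:
    # Default encoding chosen for illustration; the real route may differ.
    encoding = tiktoken.get_encoding("cl100k_base")
    return {"token_count": len(encoding.encode(payload.text))}
```

The entrypoint would then only need an `app.include_router(...)` call for this module, keeping route definitions out of the server's main file.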
Hi! I wanted to ask—does the UI theme/design you use for Gitingest have a name or is it based on a particular design system? I liked the style; it’s clean, modern, and playful! Additionally, I wanted to express my gratitude for your guidance and patience over the past two to three days. As a newbie, it was a great experience learning about FastAPI, backend/frontend separation, API best practices, and collaborative open-source workflows. I learned a great deal and truly appreciate the opportunity to contribute! Thanks again!
Hi! I wanted to ask—does the UI theme/design you use for Gitingest have a name or is it based on a particular design system?
Yes, neobrutalism, you can find it at neobrutalism.com.
I wanted to express my gratitude for your guidance and patience over the past two to three days. As a newbie, it was a great experience learning about FastAPI, backend/frontend separation, API best practices, and collaborative open-source workflows. I learned a great deal and truly appreciate the opportunity to contribute! Thanks again!
You're welcome :)
Hey, is there any issue with the merging, or any issue with the code?
@HmbleCreator I'm sorry, we're late on this one.
We currently have some urgent topics and we don't know if we still want to add a dedicated tiktoken feature to the frontend app.
I'll put this in draft for now.
This pull request has merge conflicts that must be resolved before it can be merged.
This pull request has resolved merge conflicts and is ready for review.
This pull request has merge conflicts that must be resolved before it can be merged.
Hi there! We haven’t seen activity on this pull request for 45 days, so I’m marking it as stale. If you’d like to keep it open, please leave a comment within 10 days. Thanks!
This pull request has resolved merge conflicts and is ready for review.
Hi there! We haven’t seen activity on this pull request for 45 days, so I’m marking it as stale. If you’d like to keep it open, please leave a comment within 10 days. Thanks!
Hi there! We haven’t heard anything for 10 days, so I’m closing this pull request. Feel free to reopen if you’d like to continue working on it. Thanks!