
Doc : PromptHelper - need more info on parameters

Open · ezawadzki opened this issue 2 years ago

It would be great to have more information about the parameters of PromptHelper:

[screenshot: PromptHelper parameters]

  • Definition of each parameter, its impact, and what it is used for
  • The unit it is measured in (number of characters? tokens? etc.)
  • Min / max values?

Optionally, we could also provide external links to documentation on LLM theory.

ezawadzki avatar Mar 24 '23 11:03 ezawadzki

Agreed, we should definitely make the documentation clearer. Will take a todo here.

Disiok avatar Mar 25 '23 01:03 Disiok

Hello, have you figured out the purpose of these parameters? I want to control the number of tokens used by each query. Which parameters are mainly used for that? @ezawadzki

mingxin-yang avatar Mar 30 '23 03:03 mingxin-yang

@mingxin-yang I guess it's the number of characters, but I'm not sure...

[Edit @mingxin-yang]: a token = a chunk of text (Source)

ezawadzki avatar Mar 31 '23 09:03 ezawadzki
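
(Editorial note: to illustrate why "characters" and "tokens" differ, here is a minimal sketch using the tiktoken library; the specific encoding name is an assumption, since the tokenizer actually used depends on the model.)

```python
# Sketch: tokens vs. characters, using tiktoken (assumed tokenizer).
import tiktoken

text = "LlamaIndex splits documents into chunks measured in tokens."

# cl100k_base is the encoding used by gpt-3.5-turbo / gpt-4 models.
encoding = tiktoken.get_encoding("cl100k_base")
tokens = encoding.encode(text)

print(f"characters: {len(text)}")    # counts individual characters
print(f"tokens:     {len(tokens)}")  # typically far fewer than the character count
```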

The documentation has been updated with more info on token parameters. All parameters related to chunk sizes are measured in tokens. Furthermore, the prompt helper isn't really user-facing anymore; instead, you can configure the service context directly.

Check the docs out! https://gpt-index.readthedocs.io/en/latest/core_modules/supporting_modules/service_context.html#configuring-the-service-context

logan-markewich avatar Jul 21 '23 22:07 logan-markewich
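
(Editorial note: for readers landing here later, a minimal sketch of setting these token-measured parameters through the service context, assuming the legacy `ServiceContext` / `PromptHelper` API referenced in the linked docs; parameter names may differ across versions.)

```python
from llama_index import ServiceContext, PromptHelper

# All sizes below are measured in tokens, not characters.
prompt_helper = PromptHelper(
    context_window=4096,      # total token window of the LLM
    num_output=256,           # tokens reserved for the LLM's response
    chunk_overlap_ratio=0.1,  # fraction of overlap between text chunks
    chunk_size_limit=None,    # optional hard cap on chunk size
)

service_context = ServiceContext.from_defaults(
    chunk_size=1024,              # chunk size used when splitting documents (tokens)
    prompt_helper=prompt_helper,
)
```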