rank_llm
P4 - Better estimate of the max word length for passage truncation in prompts
For example, this could live in a small helper function shared by both the LRL and RankGPT prompt builders. Something like:

num_words = (context_size - num_output_tokens(current_window_size)) * 0.75
max_word_length = num_words / current_window_size   (where current_window_size = rank_end - rank_start)
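
A minimal sketch of such a helper, assuming hypothetical names (estimate_max_word_length, words_per_token); the actual prompt builders would pass in their own context size, reserved output tokens, and window bounds. The 0.75 words-per-token factor is the rough heuristic from the formula above:

```python
def estimate_max_word_length(context_size: int,
                             num_output_tokens: int,
                             window_size: int,
                             words_per_token: float = 0.75) -> int:
    """Estimate how many words each passage may keep before truncation.

    context_size: total token budget of the model's context window.
    num_output_tokens: tokens reserved for the model's output
        (this typically grows with the current window size).
    window_size: number of passages in the current sliding window,
        i.e. rank_end - rank_start.
    """
    # Roughly 0.75 words per token is a common heuristic for English text.
    num_words = (context_size - num_output_tokens) * words_per_token
    # Split the remaining word budget evenly across the passages in the window.
    return int(num_words // window_size)


# Example: a 4096-token context, 20 passages per window, ~100 output tokens reserved.
print(estimate_max_word_length(context_size=4096, num_output_tokens=100, window_size=20))
```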