
Support for PaLM API usage stats

Open · victordibia opened this issue 2 years ago · 0 comments

What

In many cases it is useful to track the token usage and associated cost of each query. Model provider APIs handle this differently (some include the information in the generate-query response), so the task is to return accurate usage stats in a unified format across all model providers.

Work Items

  • [x] OpenAI/AzureOpenAI
  • [x] HuggingFace
  • [ ] PaLM
    • [ ] Extend the response from the PaLM API to include usage information.
    • [ ] Standardize on a usage format for all APIs.
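
One possible unified shape is the OpenAI-style usage object (`prompt_tokens`, `completion_tokens`, `total_tokens`), since the OpenAI/AzureOpenAI integrations already return it. A minimal sketch of such a shared type (the `Usage` class name is hypothetical, not part of llmx today):

```python
from dataclasses import dataclass


@dataclass
class Usage:
    """Unified usage stats, mirroring the OpenAI usage fields."""

    prompt_tokens: int = 0
    completion_tokens: int = 0
    total_tokens: int = 0
```

Each provider adapter would then be responsible for mapping its native response fields into this one structure.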

The PaLM API response provides the following fields:

"metadata": {
    "tokenMetadata": {
      "input_token_count": {
        "total_tokens": integer,
        "total_billable_characters": integer
      },
      "output_token_count": {
        "total_tokens": integer,
        "total_billable_characters": integer
      }
    }
  }
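
Mapping that metadata into a unified, OpenAI-style usage dict could look like the sketch below. The helper name `palm_usage_from_metadata` is hypothetical; it assumes the response `metadata` dict has the `tokenMetadata` shape shown above, with missing fields defaulting to zero:

```python
def palm_usage_from_metadata(metadata: dict) -> dict:
    """Convert PaLM tokenMetadata into a unified usage dict (hypothetical helper)."""
    token_meta = metadata.get("tokenMetadata", {})
    # Pull total_tokens from the input/output counts, defaulting to 0 if absent.
    prompt = token_meta.get("input_token_count", {}).get("total_tokens", 0)
    completion = token_meta.get("output_token_count", {}).get("total_tokens", 0)
    return {
        "prompt_tokens": prompt,
        "completion_tokens": completion,
        "total_tokens": prompt + completion,
    }
```

Note that PaLM reports `total_billable_characters` alongside token counts; if billing is character-based, cost estimation may want to surface that field as well.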

victordibia · Sep 14 '23 16:09