
Add `--verbose` or similar to return equivalent of log response

Open tonydewan opened this issue 5 months ago • 0 comments

I'd like to be able to pass an option to the chat command to make it return the full logged (JSON) object, or at least something structured that includes `conversation_id` and other details. E.g.:

```
llm -m 4 "Hello, how are you" --no-stream --verbose
> {
    "id": "",
    "model": "gpt-4",
    "prompt": "Hello, how are you",
    "system": null,
    "prompt_json": {
      "messages": [
        {
          "role": "user",
          "content": "Hello, how are you"
        }
      ]
    },
    "options_json": {},
    "response": "I'm an AI, so I don't have feelings, but I'm here and ready to help you. How can I assist you today?",
    "response_json": {
      "id": "",
      "choices": [
        {
          "finish_reason": "stop",
          "index": 0,
          "message": {
            "content": "I'm an AI, so I don't have feelings, but I'm here and ready to help you. How can I assist you today?",
            "role": "assistant"
          }
        }
      ],
      "created": 000000000,
      "model": "gpt-4-0613",
      "object": "chat.completion",
      "usage": {
        "completion_tokens": 29,
        "prompt_tokens": 12,
        "total_tokens": 41
      }
    },
    "conversation_id": "000000000",
    "duration_ms": 2188,
    "datetime_utc": "2024-01-31T21:48:45.963664",
    "conversation_name": "Hello, how are you",
    "conversation_model": "gpt-4"
  }
```

I'm running multiple concurrent requests and would like the option to continue a given conversation by passing its `conversation_id`. To do that now, I need to fetch the last N logs (e.g. `llm logs list --json -n 10`) and determine which one matches based on the response text. This works okay, but it is error-prone and slow.
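For context, here's roughly what that lookup looks like today. This is just a sketch, assuming `jq` is available, that the entries returned by `llm logs list --json` expose the same `response` and `conversation_id` fields as the logged object above, and that `--cid` is the right flag for continuing a specific conversation:

```sh
# The response text we got back from the concurrent request we want to continue.
RESPONSE="I'm an AI, so I don't have feelings, but I'm here and ready to help you. How can I assist you today?"

# Pull the last 10 log entries and find the conversation_id whose response
# matches (field names assumed from the logged object shown above).
CID=$(llm logs list --json -n 10 \
  | jq -r --arg resp "$RESPONSE" '.[] | select(.response == $resp) | .conversation_id' \
  | head -n 1)

# Continue that specific conversation by id.
llm --cid "$CID" "Thanks! Can you help with one more thing?"
```

A `--verbose` (or `--json`) option on the prompt command would make that whole lookup step unnecessary.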

tonydewan · Jan 31 '24 21:01