
[BUG] LLMGenerationActions.generate_value() fails if the value is not a number

Open uvnikgupta opened this issue 1 year ago • 4 comments

The return statement of generate_value(), return literal_eval(value), raises an exception whenever the value is anything other than a number or another valid Python literal (e.g. a bare, unquoted string). Modifying it to return value resolves the issue. Is this expected behavior? If yes, can someone please tell me how to extract string values from user utterances?
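For reference, the failure is easy to reproduce with ast.literal_eval directly, outside of NeMo-Guardrails (the file name below is just an example):

from ast import literal_eval

literal_eval("42")            # -> 42: numbers parse fine
literal_eval('"report.pdf"')  # -> 'report.pdf': quoted strings parse fine
literal_eval("report.pdf")    # ValueError: a bare string is not a valid literal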

I am using the following colang flow:

# Extract the FILE NAME from the user's request. If the FILE NAME is not specified just say "unknown" without giving any explanation
$file_name = ...

and my prompt content is:

content: |-
      """
      {{ general_instructions }}
      """

      # This is how a conversation between a user and the bot can go:
      {{ sample_conversation }}

      # This is how the bot thinks:
      {{ examples }}

      # This is the current conversation between the user and the bot:
      {{ sample_conversation | first_turns(2) }}
      {{ history | colang }}
      # {{ instructions }}
      ${{ var_name }} =

uvnikgupta avatar Jun 07 '24 01:06 uvnikgupta

@uvnikgupta: What LLM are you using? Can you also try instructing the LLM to wrap the value in double quotes?

drazvan avatar Jun 07 '24 06:06 drazvan

I am using llama3 and mixtral. I can try to "request" the LLM to add the double quotes, but I am concerned that it may not always work. I feel the better resolution would be to remove the literal_eval. Is there any particular reason to keep it?

uvnikgupta avatar Jun 07 '24 11:06 uvnikgupta

Asking the LLM to # Always enclose your response within "" works for llama3 and GPT-3.5, but is hit or miss for mixtral.
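i.e. the flow from above becomes something like:

# Extract the FILE NAME from the user's request. If the FILE NAME is not specified just say "unknown" without giving any explanation. Always enclose your response within ""
$file_name = ...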

uvnikgupta avatar Jun 07 '24 12:06 uvnikgupta

The reason for keeping it is that this way the LLM can produce any primitive Python expression. It's useful to be able to let the LLM produce a list of strings, for example, or even a tuple. But it might make sense to enable a "fallback to string" if the literal_eval fails.
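Something along these lines, for example (just a sketch of the idea, not the actual implementation; the real logic lives in LLMGenerationActions.generate_value()):

from ast import literal_eval

def parse_generated_value(raw: str):
    # Hypothetical helper, for illustration only.
    try:
        # Accept any Python literal: numbers, quoted strings,
        # lists, tuples, dicts, booleans, None, ...
        return literal_eval(raw)
    except (ValueError, SyntaxError):
        # Fallback to string: treat the raw LLM output as plain text,
        # so a bare value like report.pdf no longer raises.
        return raw.strip()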

drazvan avatar Jun 07 '24 12:06 drazvan