
Validation error for PromptTemplate __root__ Invalid format specifier (type=value_error)

Open lucinex opened this issue 2 years ago • 4 comments

Getting this error whenever there is some combination of { [ ' in the string_text while building prompts. Is there a workaround for this?

```
ValidationError: 1 validation error for PromptTemplate
__root__
  Invalid format specifier (type=value_error)
```

Error in get_answer coroutine:

```
Traceback (most recent call last):
  File "/app/src/chatbot/query_gpt.py", line 272, in context_calling
    chat_prompt_with_context = self.build_chat_prompt(queries, context_flag=True)
  File "/app/src/chatbot/query_gpt.py", line 250, in build_chat_prompt
    assistant_history_prompt = AIMessagePromptTemplate.from_template(
  File "/usr/local/lib/python3.8/site-packages/langchain/prompts/chat.py", line 67, in from_template
    prompt = PromptTemplate.from_template(template)
  File "/usr/local/lib/python3.8/site-packages/langchain/prompts/prompt.py", line 130, in from_template
    return cls(input_variables=list(sorted(input_variables)), template=template)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for PromptTemplate
__root__
  Invalid format specifier (type=value_error)
```
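For context, a minimal sketch that reproduces the error (assuming the legacy langchain.prompts API shown in the traceback; the example text is illustrative, not taken from the original code):

```python
from langchain.prompts import PromptTemplate

# Literal braces in the text are parsed as format fields, so validation fails.
text = "Reply with JSON like {'existingText': string // text that is being replaced.}"
PromptTemplate.from_template(text)  # raises ValidationError: Invalid format specifier
```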

lucinex avatar Apr 12 '23 12:04 lucinex

Hi @lucinex , did you find a resolution to the above error, I am having the same problem. Thank you!

srishino1016 avatar May 05 '23 09:05 srishino1016

@lucinex @srishino1016 can you post the text of the prompt template you are using? Maybe it will give more insight into why this is happening.

I faced the same issue, but it was because I was using the .from_template() method while my prompt template had no placeholders.

paul-tharun avatar Jun 15 '23 09:06 paul-tharun

Hi, I'm having the same issue.

```
ValidationError: 1 validation error for PromptTemplate
__root__
  Invalid format specifier ' string  // text that is being replaced.
        'updateText': string  // text that is replacing the existing text.
' for object of type 'str' (type=value_error)
```

It is raised on this part of the code:

```python
messages=[
    HumanMessagePromptTemplate.from_template(exampleprompt)
],
```

And this is the content of exampleprompt:

You will generate a new text based on the user's input and the given existing text;                                
however, you should keep the original format and structure as possible, updating only                              
the needed part of existing text and keeping the rest of the text.

    
The output should be a markdown code snippet formatted in the following schema, including the leading and trailing 
"```json" and "```":

```json
{
        'existingText': string  // text that is being replaced.
        'updateText': string  // text that is replacing the existing text.
}
```

Example

User Request: Cambia el depósito a dos meses de renta
Existing Text: El Arrendatario entrega al Arrendador a la firma del presente Contrato, la cantidad de $15,000.00 
M.N. (Quince mil pesos 00/100 M.N.)
Output: El Arrendatario entrega al Arrendador a la firma del presente Contrato, la cantidad de $30,000.00 M.N. 
(Treinta mil pesos 00/100 M.N.)

User Request: Cambia la cantidad mensual de la renta a $80,000 pesos
Existing Text: TERCERA. Renta. El Arrendatario ... judicial previa. 
Output: 

@paul-tharun, you mention that your issue was due to using .from_template(), which is the same case as mine. How did you solve this?

fernando-m1 avatar Jun 15 '23 17:06 fernando-m1

@fernando-m1 Hey, the issue is that you are trying to use from_template without any placeholder.

The text being used is just a message and not a template.

This is a template: Tell me a joke about {topic}

This is only a message: Tell me a joke about cats

The underlying issue is that the from_template method treats the text as a format-string template, so the braces around the JSON in your text are parsed as placeholders and format specifiers.

To fix this you can create the HumanMessage object directly:

```python
from langchain.schema import HumanMessage

messages = [HumanMessage(content=exampleprompt)]
```
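A usage sketch building on the snippet above (exampleprompt is from the earlier comments; the second message and its {request} placeholder are illustrative): the plain HumanMessage can be mixed with real templates in ChatPromptTemplate.from_messages, and only the templated entries are parsed for placeholders.

```python
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate
from langchain.schema import HumanMessage

# Literal JSON braces are safe here because HumanMessage content is never parsed.
exampleprompt = 'Reply as JSON: {"existingText": "...", "updateText": "..."}'

chat_prompt = ChatPromptTemplate.from_messages([
    HumanMessage(content=exampleprompt),                                   # passed through as-is
    HumanMessagePromptTemplate.from_template("User Request: {request}"),   # real placeholder
])
print(chat_prompt.format_messages(request="Change the deposit to two months of rent"))
```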

paul-tharun avatar Jun 16 '23 06:06 paul-tharun

I faced this issue earlier and solved it by replacing { with [. It's a minor workaround, but the last solution seems the most useful one. Closing this issue here. Sorry, I was out of touch with the online community.

lucinex avatar Jun 18 '23 04:06 lucinex

To give an example of Lucinex's fix:

Prompt that causes error

    Example JSON object:

    [
    {
    "type": "some type", ...

Prompt that works

...and still outputs valid JSON:

    Example JSON object:

    [
    [ // note
    "type": "some type", ...

I imagine there's a cleaner way to do this, but this worked for the moment.
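For what it's worth, one cleaner option (a sketch relying on standard Python format-string escaping rather than anything LangChain-specific) is to keep the braces but double them, so they are emitted literally while real placeholders keep working:

```python
from langchain.prompts import PromptTemplate

# '{{' and '}}' come out as literal braces; {type_name} remains a real placeholder.
template = """Example JSON object:

[
{{
    "type": "{type_name}", ...
}}
]"""

prompt = PromptTemplate.from_template(template)
print(prompt.format(type_name="some type"))
```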

jelling avatar Sep 06 '23 14:09 jelling

I also had this issue with the snippet below (maybe there is something out of the box that handles this?):

```python
import json

from langchain.prompts import ChatPromptTemplate

def convert_openai_to_chat_template(input_data):
    chat = [(item["role"], item["content"]) for item in input_data]
    chat_template = ChatPromptTemplate.from_messages(chat)
    return chat_template.format_messages()

if __name__ == "__main__":
    cat = "dog"
    dict_tst = {"some": [cat]}
    dict_str = json.dumps(dict_tst)
    # The braces from json.dumps end up in the message content and are
    # parsed as template placeholders, so this raises:
    chat = [{"role": "user", "content": "cat" + dict_str}]
    convert_openai_to_chat_template(chat)
```

```
*** pydantic.error_wrappers.ValidationError: 1 validation error for PromptTemplate
__root__
  Invalid format specifier (type=value_error)
```

@paul-tharun's suggestion works:

```python
from langchain.schema import AIMessage, HumanMessage, SystemMessage

messages = [
    SystemMessage(content="dd"),
    HumanMessage(content="dd"),
    AIMessage(content="dd"),
    HumanMessage(content="dd"),
]
```
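Building on the snippet above, here is a sketch of a helper that sidesteps template parsing entirely (convert_openai_to_messages and ROLE_TO_MESSAGE are hypothetical names, not LangChain APIs) by constructing the message objects directly:

```python
import json

from langchain.schema import AIMessage, HumanMessage, SystemMessage

# Hypothetical mapping from OpenAI-style roles to LangChain message classes.
ROLE_TO_MESSAGE = {"system": SystemMessage, "user": HumanMessage, "assistant": AIMessage}

def convert_openai_to_messages(input_data):
    # Build message objects directly, so braces in content never hit the template parser.
    return [ROLE_TO_MESSAGE[item["role"]](content=item["content"]) for item in input_data]

if __name__ == "__main__":
    dict_str = json.dumps({"some": ["dog"]})
    print(convert_openai_to_messages([{"role": "user", "content": "cat" + dict_str}]))
```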

I might suggest adding a formatting option that does not require placeholders. I'm new to the codebase, though, and am having trouble navigating the docs.

big-c-note avatar Sep 14 '23 14:09 big-c-note