
add .save() and .load() functions to Program class to (de)serialize partially executed programs

Open · Sam1320 opened this issue on May 19, 2023

Is your feature request related to a problem? Please describe.

Storing a non-executed program to a text file and loading it back works fine. But for partially executed programs with custom variables, I needed an ugly workaround; otherwise I would get the error

Error in program:  Can't set a property of a non-existing variable: conversation[-1].user_text

when trying to continue the chat after loading the text file and converting it to a program.

Describe the solution you'd like

Ideally we could just call program.save(filename) and program.load(filename) and have it work, without worrying about storing and loading program.variables() separately.
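
Something like this (hypothetical API; these methods do not exist yet):

prompt = prompt(user_text='Hello there :)')
prompt.save('prompt.json')  # would persist the template text and the variable state

prompt = guidance.Program.load('prompt.json')  # would restore both in one call
prompt = prompt(user_text='I want to travel to the moon')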

Describe alternatives you've considered

I was trying to do this:

import guidance
guidance.llm = guidance.llms.OpenAI('gpt-3.5-turbo')
guidance.llm.cache.clear()

prompt = guidance(
'''{{#system~}}
You are a helpful assistant
{{~/system}}
{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{set 'this.user_text' (await 'user_text')}}
{{~/user}}
{{#assistant~}}
{{gen 'this.ai_text' temperature=0 max_tokens=300}}
{{~/assistant}}
{{~/geneach}}''', stream=False, silent=True)

prompt = prompt(user_text='Hello there :)')

with open('prompt.txt', 'w') as f:
    f.write(str(prompt))

with open('prompt.txt', 'r') as f:
    prompt = f.read()

prompt = guidance(prompt)

prompt = prompt(user_text='I want to travel to the moon')

But got the error mentioned above.

My solution was:


import json

def save_prompt(prompt, filename):
    variables = prompt.variables()
    # The llm object is not JSON-serializable, so drop it before dumping.
    del variables['llm']
    to_store = {'text': str(prompt), 'variables': variables}
    with open(filename, 'w') as f:
        json.dump(to_store, f)
    
def load_prompt(filename):
    with open(filename, 'r') as f:
        loaded = json.load(f)
    # Re-feed the saved variables as keyword arguments so the program
    # can pick up where it left off.
    prompt = guidance(loaded['text'], **loaded['variables'])
    return prompt

Note that the del variables['llm'] is necessary because the llm object is not serializable.
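
Usage then looks like this:

prompt = prompt(user_text='Hello there :)')
save_prompt(prompt, 'prompt.json')

prompt = load_prompt('prompt.json')
prompt = prompt(user_text='I want to travel to the moon')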

Sam1320 · May 19, 2023

Thanks for noting this. I agree we should have direct support for serializing not just the template but the variable state as well. I can think of two approaches here:

  1. Create a new format, e.g. JSON, that stores both the program string and the variables.
  2. Extend the template format a bit so that the variables can be embedded in the template itself. Then we have only one string format.

I lean towards number 2, because we can already use the set command like {{set 'var_name' literal_value}}. But I think the grammar might need a bit of enhancing to ensure complex literal values work (they might already; I just have not tried).
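
For illustration, a program persisted under option 2 might start with a line like this (untested, and complex literals like this are exactly where the grammar may need work):

{{set 'conversation' [{'user_text': 'Hello there :)', 'ai_text': 'Hi! How can I help?'}]}}

followed by the rest of the original template unchanged.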

Anyone have more thoughts?

slundberg · May 25, 2023

I'm evaluating Guidance at the moment for a project, and persistence of prompt-with-state is something that would be very helpful to me. My preference here is option 1 for the following reasons:

  • It would make it easier to 'hack' the persisted state in code, for example to filter out certain data as part of a data loading pipeline in something like Azure Synapse Pipelines (see the sketch after this list).
  • It would open the door to querying the state using a standard JSON toolchain. This would be useful for automating safety evaluations as you productionize a prompt.
  • It might reduce the attack surface for injection attacks since there wouldn't be yet another escaping mechanism.
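
For example, with the JSON layout from the workaround above, a pipeline step could filter the persisted state using only the standard library (a sketch; 'user_text' stands in for whatever fields need to be dropped):

import json

with open('prompt.json') as f:
    state = json.load(f)

# Drop sensitive fields from the variable state before the next stage.
state['variables'] = {k: v for k, v in state['variables'].items()
                      if k != 'user_text'}

with open('prompt_filtered.json', 'w') as f:
    json.dump(state, f)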

I'm not entirely sure how typical my requirements are though! (And I could always implement my own persistence if the chosen approach is option 2).

jgeldart · May 27, 2023

I think having a single string representation is simpler and more elegant; in fact, I intuitively assumed option 2 was already implemented. Introducing a JSON format might also open the door to a gradual increase in bloat and unnecessary complexity over time. By contrast, expressing everything in the template forces a simple feature design.

Sam1320 · May 27, 2023

This has not had much activity in months, and I am a bit stuck on this. Any word on approaches here? I am going to have to figure out a hack soon.

nhorton · Aug 9, 2023

@Sam1320

Your solution works.

Please note that when you call the prompt again and pass the input, you will also need to add the argument await_missing=True

That made it work for me.
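
That is, with the helpers from above:

prompt = load_prompt('prompt.json')
prompt = prompt(user_text='I want to travel to the moon', await_missing=True)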

syncware-ai · Aug 22, 2023

Great to know it helped. It's surprising that after all these months a solution for this feature still has not been merged.

Sam1320 · Aug 22, 2023

Hey, sorry it took us so long (please see this announcement). The new version works a little differently: objects are immutable, and it's easy to take partial completions and do whatever you want with them.
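
Roughly, the new style looks like this (see the docs for full details; the capture name here is just carried over from the example above):

from guidance import models, system, user, assistant, gen

lm = models.OpenAI('gpt-3.5-turbo')
with system():
    lm += 'You are a helpful assistant'
with user():
    lm += 'Hello there :)'
with assistant():
    lm += gen('ai_text', max_tokens=300)

# Each += returns a new immutable state, so any intermediate state can be
# kept around, and str(lm) and lm['ai_text'] can be persisted however you like.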

marcotcr · Nov 14, 2023