
Unable to import module when running from a read-only filesystem

Open pthimon opened this issue 2 years ago • 5 comments

When importing guidance from a read-only filesystem, there are a couple of issues:

  • Opening the log.txt file (in https://github.com/microsoft/guidance/blob/main/guidance/_utils.py#L10) will always raise an exception, since the file can't be opened for writing. Could this be switched to the standard Python logging module instead?

  • Setting caching=False still tries to create the cache directory. This can be worked around by setting os.environ['XDG_CACHE_HOME'] = '/tmp' (or somewhere else that is definitely writable) before importing, but that isn't ideal.
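The workaround in the second bullet can be sketched as follows. This is just the environment-variable override described above, not a supported guidance API; it assumes the cache directory is derived from XDG_CACHE_HOME at import time, and that /tmp (or the platform temp dir) is writable, as it is on AWS Lambda.

```python
import os
import tempfile

# Point XDG_CACHE_HOME at a writable directory *before* importing guidance,
# so the cache directory it creates on import lands somewhere writable.
# On AWS Lambda, /tmp is the only writable path.
os.environ['XDG_CACHE_HOME'] = tempfile.gettempdir()

# import guidance  # safe to import now; its cache goes under the temp dir
```

The override must run before the first `import guidance`, since the directory is created as a side effect of the import.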

System info:

  • OS: AWS Lambda Python 3.9
  • Guidance Version: 0.0.47

pthimon avatar May 19 '23 15:05 pthimon

Hey @pthimon, interesting that you're running it in AWS Lambda. How do you manage the stateless computation? I'm having trouble storing, loading, and continuing a partially executed program.

Sam1320 avatar May 19 '23 16:05 Sam1320

Hi @Sam1320 I'm just storing the partial template string, and then reloading it when the next request comes in.

The simplest example would be:

initial_template = '''{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{await 'user_message'}}
{{~/user}}

{{#assistant~}}
{{gen 'assistant_message' temperature=0.8 max_tokens=500}}
{{~/assistant}}
{{~/geneach}}'''

template = load_template(user_id)  # from a db
if not template:
  template = initial_template
chat = guidance(template)
output_template = chat(user_message=msg) # msg is from request
save_template(str(output_template), user_id)  # saves to db, can save output_template.variables() if you need them

Obviously the template would be more complicated in a real example!

pthimon avatar May 19 '23 17:05 pthimon

Hey @pthimon, thanks for the reply! How do I save the variables()? I am doing the following:

import guidance
guidance.llm = guidance.llms.OpenAI('gpt-3.5-turbo')
guidance.llm.cache.clear()

prompt = guidance(
'''{{#system~}}
You are a helpful assistant
{{~/system}}
{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{set 'this.user_text' (await 'user_text')}}
{{~/user}}
{{#assistant~}}
{{gen 'this.ai_text' temperature=0 max_tokens=300}}
{{~/assistant}}
{{~/geneach}}''', stream=False, silent=True)

prompt = prompt(user_text='Hello there :)')

with open('prompt.txt', 'w') as f:
    f.write(str(prompt))

with open('prompt.txt', 'r') as f:
    prompt = f.read()

prompt = guidance(prompt)

prompt = prompt(user_text='I want to travel to the moon')

But get the error:

AssertionError: Can't set a property of a non-existing variable: conversation[-1].user_text

Error in program:  Can't set a property of a non-existing variable: conversation[-1].user_text

Sam1320 avatar May 19 '23 18:05 Sam1320

Removing the this. prefix from 'this.user_text' and 'this.ai_text' makes it run. That means the messages are no longer stored on the conversation array entries, but for my use case I just parsed what I needed from the generated template.
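For reference, the adjusted template would look like the sketch below: identical to the original, with only the `this.` prefix dropped so `user_text` and `ai_text` become top-level variables instead of properties of each conversation entry (the `guidance(...)` call itself is unchanged and omitted here).

```python
# Template with the `this.` prefix removed from the set/gen targets,
# which avoids the "Can't set a property of a non-existing variable" error.
fixed_template = '''{{#system~}}
You are a helpful assistant
{{~/system}}
{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{set 'user_text' (await 'user_text')}}
{{~/user}}
{{#assistant~}}
{{gen 'ai_text' temperature=0 max_tokens=300}}
{{~/assistant}}
{{~/geneach}}'''
```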

pthimon avatar May 19 '23 18:05 pthimon

Experiencing this same issue — would love the option to either disable log.txt, or choose my own home for it!

j6k4m8 avatar Jun 05 '23 23:06 j6k4m8

Looks like the log issue is fixed in https://github.com/microsoft/guidance/pull/106 and the caching issue has a workaround, so I'm marking this as resolved.

pthimon avatar Jul 17 '23 10:07 pthimon