Unable to import module when running from a read-only filesystem
When importing guidance from a read-only filesystem, there are a couple of issues:
- Opening the log.txt file (in https://github.com/microsoft/guidance/blob/main/guidance/_utils.py#L10) will always raise an exception as it can't be opened for writing. Can this be switched to use the standard Python `logging` module instead?
- Setting `caching=False` still tries to create the cache directory. This can be worked around by setting `os.environ['XDG_CACHE_HOME'] = '/tmp'` (or somewhere that is definitely writeable) before importing, as sketched below, but that isn't ideal.
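A minimal sketch of that workaround (on Lambda, /tmp is the only writeable path):

import os

# Point the XDG cache at a writeable location *before* importing guidance,
# since the cache directory is created at import time.
os.environ['XDG_CACHE_HOME'] = '/tmp'

import guidance  # import deliberately placed after the env var is set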
System info:
- OS: AWS Lambda Python 3.9
- Guidance Version: 0.0.47
Hey @pthimon, interesting that you are running it in AWS Lambda. How do you manage the stateless computation? I am having trouble storing, loading, and continuing a partially executed program.
Hi @Sam1320 I'm just storing the partial template string, and then reloading it when the next request comes in.
The simplest example would be:
initial_template = '''{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{await 'user_message'}}
{{~/user}}
{{#assistant~}}
{{gen 'assistant_message' temperature=0.8 max_tokens=500}}
{{~/assistant}}
{{~/geneach}}'''
template = load_template(user_id)  # from a db
if not template:
    template = initial_template
chat = guidance(template)
output_template = chat(user_message=msg)  # msg is from request
save_template(str(output_template), user_id)  # saves to db, can save output_template.variables() if you need them
Obviously the template would be more complicated in a real example!
Hey @pthimon, thanks for the reply!
How do I save the variables()?
I am doing the following:
import guidance
guidance.llm = guidance.llms.OpenAI('gpt-3.5-turbo')
guidance.llm.cache.clear()
prompt = guidance(
'''{{#system~}}
You are a helpful assistant
{{~/system}}
{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{set 'this.user_text' (await 'user_text')}}
{{~/user}}
{{#assistant~}}
{{gen 'this.ai_text' temperature=0 max_tokens=300}}
{{~/assistant}}
{{~/geneach}}''', stream=False, silent=True)
prompt = prompt(user_text='Hello there :)')
with open('prompt.txt', 'w') as f:
    f.write(str(prompt))
with open('prompt.txt', 'r') as f:
    prompt = f.read()
prompt = guidance(prompt)
prompt = prompt(user_text='I want to travel to the moon')
But I get the error:
AssertionError: Can't set a property of a non-existing variable: conversation[-1].user_text
Error in program: Can't set a property of a non-existing variable: conversation[-1].user_text
Removing the `this.` prefix from `'this.user_text'` and `'this.ai_text'` makes it run. That means you don't get the conversation stored in the `conversation` array, but for my use case I just parsed what I needed from the generated template.
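For reference, a sketch of the same prompt with the `this.` prefixes removed (everything else unchanged); as noted above, the turns then end up in top-level `user_text` / `ai_text` variables rather than in the `conversation` array:

# Same program as above, but setting top-level variables instead of this.* fields
prompt = guidance(
'''{{#system~}}
You are a helpful assistant
{{~/system}}
{{~#geneach 'conversation' stop=False~}}
{{#user~}}
{{set 'user_text' (await 'user_text')}}
{{~/user}}
{{#assistant~}}
{{gen 'ai_text' temperature=0 max_tokens=300}}
{{~/assistant}}
{{~/geneach}}''', stream=False, silent=True)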
Experiencing this same issue — would love the option to either disable log.txt, or choose my own home for it!
Looks like the log issue was fixed in https://github.com/microsoft/guidance/pull/106 and the caching issue has a workaround, so marking this as resolved.