harvey1992

Results: 9 comments by harvey1992

@dosu you didn't actually use the custom prompt in the test generation.

@jjmachan The default instructions for prompting test generation make the LLM produce questions that read as if they were written for an exam. I wanted to modify the instructions so they would output...

Could you show me a simple example?

@jjmachan I am doing something similar to this from the documentation:

```py
# generator with openai models
generator_llm = ChatOpenAI(model="gpt-3.5-turbo-16k")
critic_llm = ChatOpenAI(model="gpt-4")
embeddings = OpenAIEmbeddings()

generator = TestsetGenerator.from_langchain(
    generator_llm,
    ...
```
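For reference, the documented example that snippet follows continues roughly as below. This is a sketch based on the ragas 0.1-era quickstart; the import paths, `test_size`, and the `distributions` values are assumptions filled in for illustration, and `documents` is expected to be a list of LangChain documents loaded elsewhere:

```py
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from ragas.testset.generator import TestsetGenerator
from ragas.testset.evolutions import simple, reasoning, multi_context

# generator and critic models plus embeddings, as in the quickstart
generator_llm = ChatOpenAI(model="gpt-3.5-turbo-16k")
critic_llm = ChatOpenAI(model="gpt-4")
embeddings = OpenAIEmbeddings()

generator = TestsetGenerator.from_langchain(
    generator_llm,
    critic_llm,
    embeddings,
)

# documents: a list of LangChain Document objects loaded beforehand
testset = generator.generate_with_langchain_docs(
    documents,
    test_size=10,
    distributions={simple: 0.5, reasoning: 0.25, multi_context: 0.25},
)
```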

@jjmachan I modified the open-source code and it now works for my needs. Thanks.

@fschuh I updated the evolution file and generator file to allow custom prompts to be passed via arguments.

@CosaroLisa I don't want to post a ton of code here, but in the evolution file, I created a setter to set the prompt for the given evolution type and...
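To give a rough idea of the change described above, here is a minimal sketch of what such a setter might look like. The class and attribute names are hypothetical stand-ins for the evolution-file internals, not the actual ragas API or the exact patch:

```py
# Illustrative sketch only: names are hypothetical, not upstream ragas code.
from dataclasses import dataclass


@dataclass
class Evolution:
    """Simplified stand-in for an evolution type in the testset generator."""
    seed_question_prompt: str = "Generate an exam-style question from the context."

    def set_question_prompt(self, prompt: str) -> None:
        # Setter added in the fork so callers can override the default
        # instructions for this evolution type before generation runs.
        self.seed_question_prompt = prompt


# Usage: override the default instructions before generating the testset.
simple = Evolution()
simple.set_question_prompt(
    "Write a conversational question a real user would ask about the context."
)
```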

@dosu Is there a way to customize the prompt using TestsetGenerator.with_openai() and generator.generate_with_langchain_docs()?