Dhar Rawal

Results 36 comments of Dhar Rawal

"I'm assuming MyModule.load_compiled_model() translates to just .load()" - correct. Does this help? ![image](https://github.com/stanfordnlp/dspy/assets/114010652/ea6693c2-4d4e-4ecf-826e-8ac8f540b1b1)

@arnavsinghvi11 - I think I have a clue! Is teacher.predictors() aggregating across all candidate programs? If I generate 13 candidate programs from the 1st iteration and each program has 3...

Looking at the code... Below is the comment for named_sub_modules(), where it picks up the number of predictors:

```python
def named_sub_modules(self, type_=None, skip_compiled=False) -> Generator[tuple[str, "BaseModule"], None, None]:
    """Find all...
```
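To make the discussion concrete, here is a minimal, hypothetical sketch of how a recursive named_sub_modules() walk could enumerate predictors and honor a skip_compiled flag. This is illustrative only, not the actual dspy implementation; the `_compiled` attribute and the toy Predictor/Program classes are assumptions:

```python
class BaseModule:
    """Hypothetical sketch of a recursive named_sub_modules() walk
    (not the real dspy code)."""

    def named_sub_modules(self, type_=None, skip_compiled=False, name="self"):
        # Yield this module if it matches the requested type filter.
        if type_ is None or isinstance(self, type_):
            yield name, self
        # Recurse into attributes that are themselves modules.
        for attr, value in self.__dict__.items():
            if isinstance(value, BaseModule):
                # Optionally skip submodules flagged as compiled.
                if skip_compiled and getattr(value, "_compiled", False):
                    continue
                yield from value.named_sub_modules(
                    type_, skip_compiled, name=f"{name}.{attr}"
                )

class Predictor(BaseModule):
    pass

class Program(BaseModule):
    def __init__(self):
        self.p1 = Predictor()
        self.p2 = Predictor()
        self.p2._compiled = True  # pretend this one came from compilation

prog = Program()
print([n for n, _ in prog.named_sub_modules(Predictor)])
print([n for n, _ in prog.named_sub_modules(Predictor, skip_compiled=True)])
# -> ['self.p1', 'self.p2'] then ['self.p1']
```

Under this reading, a walk over a program that aggregates compiled candidates would over-count predictors unless the compiled subtrees are skipped.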

Setting skip_compiled=True works a bit too well :) It gets rid of ALL the predictors. But at least we are on the right track. Notice how it found 5 candidate...

@arnavsinghvi11 - I think I found the issue and the solution. There is no need to mess with skip_compiled. A compiled program has candidate programs, so to use it as...
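If that reading is right, a teacher can be derived from a compiled program by selecting its best-scoring candidate instead of touching skip_compiled. A minimal sketch of the selection step, where `select_teacher` and the (score, program) pair layout are hypothetical names, not the dspy API:

```python
def select_teacher(candidate_programs):
    """Pick the best candidate from a compiled program's candidates.

    candidate_programs: list of (score, program) pairs -- an assumed
    layout for illustration, not dspy's actual structure.
    """
    best_score, best_program = max(candidate_programs, key=lambda pair: pair[0])
    return best_program

candidates = [(0.42, "prog_a"), (0.61, "prog_b"), (0.55, "prog_c")]
print(select_teacher(candidates))  # -> prog_b
```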

You can now write code like this:

```python
for i in range(3):
    if i == 0:
        teacher_module, _ = MyModule.load_compiled_model()
    optimizer = BootstrapFewShotWithRandomSearch(
        metric=my_metric,
        teacher_settings=dict({"lm": teacher_lm}),
    )
    student_module = MyModule().activate_assertions()
    ...
```
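The pattern the loop is driving at - each round's compiled student becoming the next round's teacher - can be sketched abstractly. Everything here (`compile_round`, the scores, the boost model) is an invented stand-in, not a dspy call:

```python
def compile_round(student_score, teacher_score=None):
    """Toy stand-in for one optimizer pass: the student improves,
    and a stronger teacher gives a bigger boost."""
    boost = 0.1 if teacher_score is None else 0.1 + 0.5 * teacher_score
    return min(1.0, student_score + boost)

teacher = None
score = 0.2  # fresh student's starting quality
for i in range(3):
    score = compile_round(score, teacher)  # bootstrap this round's student
    teacher = score                        # compiled student teaches next round
print(round(score, 3))  # 0.925 after three rounds
```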

Hey guys, please review PR https://github.com/stanfordnlp/dspy/pull/772 that tackles this issue with a much better redesign. Almost done...

@arnavsinghvi11 - I see the problem now. It's probably the same amount of work to fix as creating a program (my program is too complex). **Revised thinking:** 1. Make reduce_fn, size, and deterministic...