arnavsinghvi11

Results: 157 comments by arnavsinghvi11

[HFModel](https://github.com/stanfordnlp/dspy/blob/b2816c4a35e3144a06752423de1afb8e68e1005f/dsp/modules/hf.py) is the alternative to TGI for hosting local models, but it is not recommended since it lacks the extensive coverage/support that TGI offers (especially for quantization).
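For reference, a minimal sketch of both routes (the model name, port, and URL are illustrative):

```python
import dspy

# Hosting a local model in-process via HFModel (loads weights with transformers):
lm = dspy.HFModel(model="meta-llama/Llama-2-7b-hf")

# The recommended route: point DSPy at a running TGI server instead,
# which brings better serving and quantization support:
lm = dspy.HFClientTGI(model="meta-llama/Llama-2-7b-hf", port=8080, url="http://localhost")

dspy.settings.configure(lm=lm)
```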

Hi @bhargavanubavam, DSPy can indeed support answer retrieval, but I'm curious whether you only wanted to pass the PDF in as input. Currently, it might be easier to load...

@JamesHWade just following up on this PR. Feel free to close this one and open the new one without conflicts! Thanks

Hi @smcdowellfactor, could you try setting `dspy.settings(log_openai_usage=False)` manually and testing whether this resolves it? You can also comment out the logging configuration in [gpt3.py](https://github.com/stanfordnlp/dspy/blob/01455bd20f5770819aff9edc0b43915203f71cbd/dsp/modules/gpt3.py#L5) for your use case.
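A minimal sketch of that workaround, assuming the flag is applied through `settings.configure` (the exact entry point may differ):

```python
import dspy

# Assumption: the flag mentioned above is applied via settings.configure,
# disabling the OpenAI usage logging.
dspy.settings.configure(log_openai_usage=False)
```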

Hi @poppingtonic, this error happens because your dev_set is not populated correctly with `dspy.Example` objects. Try following this [documentation](https://dspy-docs.vercel.app/docs/building-blocks/data). Instead of creating a `dspy.Prediction` object, you'd want to create...
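For reference, a minimal sketch of a correctly built dev set (the fields here are illustrative):

```python
import dspy

# Each dev entry is a dspy.Example (not a dspy.Prediction), with its
# input fields marked via with_inputs.
dev_set = [
    dspy.Example(question="What is the capital of France?", answer="Paris").with_inputs("question"),
    dspy.Example(question="Who wrote Hamlet?", answer="William Shakespeare").with_inputs("question"),
]
```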

> dsp.settings.trace

Yeah, I believe we should set it to `[]` going forward, since there are [checks in assertion.py](https://github.com/stanfordnlp/dspy/issues/432). It seems that the optimization logic also sets [trace = []...
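A minimal sketch of what that would look like from user code, assuming the trace is reset through `settings.configure`:

```python
import dspy

# Reset the trace to an empty list so downstream checks (e.g. in
# assertion.py) see a list rather than None.
dspy.settings.configure(trace=[])
```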

@Demontego VertexAI would be great. Feel free to open a PR for it! You'll find this documentation useful for [integrating an LM within DSPy](https://dspy-docs.vercel.app/docs/deep-dive/language_model_clients/custom-lm-client) or following the format of existing...
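For reference, a hedged skeleton of what such a client could look like, following the custom-LM docs linked above; the internals here are assumptions for illustration, not the final implementation:

```python
from dsp.modules.lm import LM


class GoogleVertexAI(LM):
    """Skeleton of a custom DSPy LM client; the actual Vertex AI SDK call
    is left as a placeholder to fill in."""

    def __init__(self, model: str, **kwargs):
        super().__init__(model)
        self.provider = "googlevertexai"
        self.kwargs = {**self.kwargs, **kwargs}

    def basic_request(self, prompt: str, **kwargs):
        # Placeholder: call the Vertex AI SDK here and shape its response.
        response = {"prompt": prompt, "choices": [{"text": ""}]}
        self.history.append({"prompt": prompt, "response": response, "kwargs": kwargs})
        return response

    def __call__(self, prompt: str, only_completed=True, return_sorted=False, **kwargs):
        # DSPy expects a list of completion strings back.
        response = self.basic_request(prompt, **kwargs)
        return [choice["text"] for choice in response["choices"]]
```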

> Hey both, would it be an idea to name this googlevertexai.py? And the LM `GoogleVertexAI`? Inspired by the azurecognitivesearch.py file, but I also see we have hf_client / hf_server....

Tagging @CShorten here. I think `self.co` is supposed to be `self.google`, right? :) That might fix up the error, since I don't believe there's a generation yet.
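For illustration, a hedged sketch of that rename, assuming the client wraps the `google.generativeai` SDK (the surrounding structure is illustrative, not the PR's actual code):

```python
import google.generativeai as genai


class Google:
    """Sketch only; the class structure here is an assumption."""

    def __init__(self, model: str, api_key: str):
        genai.configure(api_key=api_key)
        # `self.co` was likely carried over from the Cohere client;
        # `self.google` matches what the rest of this client references.
        self.google = genai.GenerativeModel(model)

    def basic_request(self, prompt: str):
        return self.google.generate_content(prompt)
```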