Lance Martin

Results: 87 comments of Lance Martin

Yes, but for that case you could likely just prompt the LLM directly. Unless you still want all the configurability of the tool (report structure, etc.)? We could add...

> Support MCP to add local knowledge / other MCP services as part of search looks good to enhance. I'm creating a separate issue for this.

This would be a great...

> > Support MCP to add local knowledge / other MCP services as part of search looks good to enhance
>
> I'm creating a separate...

> I've added **EXA** as a search API, and **arXiv** and **PubMed** as separate tools. It's super easy to integrate.

Yes, do you mind creating a PR? These are nice...

Thanks for the Exa PR! I had minor comments. Let's also add arXiv and PubMed, and please include them in the README.

Hi! I have a simpler version below that uses Ollama. The main differences are that 1) it skips the planning phase and 2) it doesn't perform section writing in parallel....

I added the `init_chat_model` API for provider selection, so you can select Ollama. Keep in mind that you will want a model that can perform tool-calling (required for structured output). https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html#init-chat-model

Yes, will do now after resolving merge conflicts.