draft: ollama: add Ollama context option
Summary
Adds support for Ollama's `context` parameter to enable short-term conversational memory between requests. This addresses GitHub discussion #973.
Changes
- Add a `WithContext([]int)` option function to the Ollama client
- Add comprehensive example demonstrating usage patterns
- Add test coverage for the new functionality
- Add documentation explaining context usage with Ollama
Usage
```go
// Use the context from a previous response for follow-up requests.
llm, err := ollama.New(
	ollama.WithModel("llama2"),
	ollama.WithContext(contextFromPreviousResponse),
)
if err != nil {
	log.Fatal(err)
}
```
Notes
- Context enables the model to maintain state between requests
- Particularly useful for follow-up questions that refer back to earlier exchanges
- Context is model-specific and should only be reused with the same model
- Provides foundation for future enhancements like context extraction from responses
Test Plan
- [x] Unit tests for the `WithContext()` option
- [x] Example code demonstrating usage patterns
- [x] Documentation coverage
- [ ] Integration tests with running Ollama server (requires manual testing)
Addresses discussion #973