
[Bug Report] ErnieLLM doesn't support multiple messages, and the example is stale

Open · SpikeWong opened this issue 1 year ago · 0 comments

// GenerateContent implements the Model interface.
func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error) { //nolint: lll, cyclop, whitespace

	if o.CallbacksHandler != nil {
		o.CallbacksHandler.HandleLLMGenerateContentStart(ctx, messages)
	}

	opts := &llms.CallOptions{}
	for _, opt := range options {
		opt(opts)
	}

	// Assume we get a single text message
	msg0 := messages[0]
	part := msg0.Parts[0]

In the erniellm implementation, GenerateContent only picks the first message. But the erniellm example passes more than one message, so of course only the first message gets processed.

content := []llms.MessageContent{
		llms.TextParts(schema.ChatMessageTypeSystem, "You are a company branding design wizard."),
		llms.TextParts(schema.ChatMessageTypeHuman, "What would be a good company name a company that makes colorful socks?"),
	}
	completion, err := llm.GenerateContent(ctx, content, llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		fmt.Print(string(chunk))
		return nil
	}))
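
What I would expect instead is for every message (and every text part) to be consumed. Below is a minimal sketch of that loop, not the library's actual code; the chatMessage type and flattenMessages helper are hypothetical names I picked for illustration.

package erniesketch

import (
	"strings"

	"github.com/tmc/langchaingo/llms"
)

// chatMessage is a hypothetical stand-in for whatever message type the
// Ernie client request ultimately needs; it is not a langchaingo type.
type chatMessage struct {
	Role    string
	Content string
}

// flattenMessages walks every incoming message instead of only messages[0],
// concatenating the text parts of each one into a single chat turn.
func flattenMessages(messages []llms.MessageContent) []chatMessage {
	out := make([]chatMessage, 0, len(messages))
	for _, msg := range messages {
		var sb strings.Builder
		for _, part := range msg.Parts {
			// Only text parts are handled here; other part types would
			// need their own mapping.
			if text, ok := part.(llms.TextContent); ok {
				sb.WriteString(text.Text)
			}
		}
		out = append(out, chatMessage{
			Role:    string(msg.Role),
			Content: sb.String(),
		})
	}
	return out
}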

Also, in the latest code version, llms.TextParts as used in this example no longer works, so the example is stale.
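
For reference, here is what I believe the example should look like against the current API. This is my best guess, assuming the role constants now live in the llms package (llms.ChatMessageTypeSystem / llms.ChatMessageTypeHuman) rather than in schema:

	// Same example as above, but using the llms role constants instead of schema.
	content := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeSystem, "You are a company branding design wizard."),
		llms.TextParts(llms.ChatMessageTypeHuman, "What would be a good company name for a company that makes colorful socks?"),
	}
	completion, err := llm.GenerateContent(ctx, content,
		llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}))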

I would also like to raise a concern about the frequent breaking changes.

SpikeWong · Feb 02 '24 05:02