Christopher Speller

Showing 38 issues by Christopher Speller

It would be nice if the user got some kind of message saying the AI could work better if they added the GitHub plugin or connected their account. Particularly...

enhancement
help wanted

Token counting is a bit hard to get right. The current implementation for OpenAI is an approximation, see: https://github.com/mattermost/mattermost-plugin-ai/blob/master/server/ai/openai/openai.go#L296. Other providers have worse ones. Implement better counting (or figure out another solution)... A sketch of one option follows below.

enhancement
help wanted
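
One option, if accuracy matters more than avoiding a dependency, is to count tokens with a local tokenizer rather than an approximation. This is only a sketch: it assumes the third-party github.com/pkoukk/tiktoken-go package, and the exact calls (`EncodingForModel`, `Encode`) should be verified against that library's documentation.

```go
package main

import (
	"fmt"
	"log"

	"github.com/pkoukk/tiktoken-go"
)

// countTokens returns a tokenizer-based count for text as the given model
// would see it, instead of a characters-per-token approximation.
func countTokens(model, text string) (int, error) {
	enc, err := tiktoken.EncodingForModel(model)
	if err != nil {
		return 0, err
	}
	// No special tokens allowed or disallowed for this rough sketch.
	return len(enc.Encode(text, nil, nil)), nil
}

func main() {
	n, err := countTokens("gpt-4", "Summarize the thread above in three bullet points.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%d tokens\n", n)
}
```

This only covers OpenAI-style models; other providers would still need their own counting or a conservative safety margin.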

How can we allow users to reuse prompts? Provide a prompt library for them to access?

enhancement
question

Currently the LLM does not have access to file contents even though the MM server adds the extracted content. This is not straightforward, as you need to balance what the... A rough context-budget sketch follows below.

enhancement
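
As a starting point for that balancing act, the extracted content could be trimmed to whatever token budget remains after the rest of the prompt. This is only a sketch: it uses the rough four-characters-per-token heuristic, and the helper and parameter names (`fitToBudget`, `promptTokens`, `reservedForAnswer`) are invented, not anything that exists in the plugin today.

```go
package main

import (
	"fmt"
	"strings"
)

// fitToBudget trims extracted file content so that it, plus the rest of the
// prompt, stays under a rough token budget. Uses the common ~4 chars/token
// heuristic purely for illustration; a real tokenizer would be more accurate.
func fitToBudget(fileContent string, promptTokens, contextWindow, reservedForAnswer int) string {
	budgetTokens := contextWindow - promptTokens - reservedForAnswer
	if budgetTokens <= 0 {
		return ""
	}
	budgetChars := budgetTokens * 4
	if len(fileContent) <= budgetChars {
		return fileContent
	}
	// Cut at a line boundary where possible so the LLM sees coherent text.
	truncated := fileContent[:budgetChars]
	if i := strings.LastIndexByte(truncated, '\n'); i > 0 {
		truncated = truncated[:i]
	}
	return truncated + "\n[file truncated]"
}

func main() {
	extracted := strings.Repeat("line of extracted file text\n", 2000)
	fmt.Println(len(fitToBudget(extracted, 1500, 8192, 1024)))
}
```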

The current content extraction in the MM server for PDFs doesn't work very well, which creates some issues when the LLM tries to understand the files. Is there an alternative... One possible alternative is sketched below.

enhancement
help wanted
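
One possible alternative, if an external dependency is acceptable, is shelling out to poppler's pdftotext, which tends to handle layout better than naive extraction. The sketch below assumes the binary is installed on the server host; nothing here reflects how the MM server's extraction is actually wired up.

```go
package main

import (
	"bytes"
	"fmt"
	"log"
	"os/exec"
)

// extractPDFText shells out to poppler's pdftotext, writing the result to
// stdout ("-"). Requires pdftotext to be available on the host, which may
// not be acceptable for every deployment.
func extractPDFText(path string) (string, error) {
	var out bytes.Buffer
	cmd := exec.Command("pdftotext", "-layout", path, "-")
	cmd.Stdout = &out
	if err := cmd.Run(); err != nil {
		return "", fmt.Errorf("pdftotext failed: %w", err)
	}
	return out.String(), nil
}

func main() {
	text, err := extractPDFText("meeting-notes.pdf")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(text)
}
```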

Not sure exactly what is required here. Do we just need to modify our prompts, asking the model to answer in a specific language? Or maybe we need to localize the... A minimal prompt-level sketch follows below.

enhancement
help wanted
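
If prompt-level instructions turn out to be enough, the change could be as small as injecting the user's locale into the system prompt. A minimal sketch, with an invented helper name and a hard-coded locale map; whether the built-in prompt templates themselves also need translating is the open part of the question.

```go
package main

import "fmt"

// withResponseLanguage appends a language instruction to the system prompt
// based on the user's locale.
func withResponseLanguage(systemPrompt, locale string) string {
	languages := map[string]string{
		"de": "German",
		"es": "Spanish",
		"fr": "French",
	}
	lang, ok := languages[locale]
	if !ok {
		return systemPrompt
	}
	return fmt.Sprintf("%s\nAlways respond in %s.", systemPrompt, lang)
}

func main() {
	fmt.Println(withResponseLanguage("You are a helpful assistant for Mattermost.", "de"))
}
```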

Currently the audio summarization functionality will fail if the meeting is too long (over 25 MB). The first step to fixing this is being able to split up longer recordings and... A splitting sketch follows below.

enhancement
help wanted
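
For the splitting step, one low-effort approach is ffmpeg's segment muxer, invoked from the server. The sketch below is an assumption about how this could look (the chunk length, output naming, and `splitRecording` helper are all made up), not how the plugin currently handles recordings.

```go
package main

import (
	"fmt"
	"log"
	"os/exec"
	"path/filepath"
)

// splitRecording cuts a long recording into fixed-length chunks with ffmpeg's
// segment muxer so each piece stays under the transcription API's size limit.
// Each chunk can then be transcribed separately and the transcripts joined.
func splitRecording(input, outDir string, chunkSeconds int) error {
	pattern := filepath.Join(outDir, "chunk-%03d.mp3")
	cmd := exec.Command("ffmpeg",
		"-i", input,
		"-f", "segment",
		"-segment_time", fmt.Sprint(chunkSeconds),
		"-c", "copy",
		pattern,
	)
	return cmd.Run()
}

func main() {
	// 10-minute chunks keep typical meeting audio well under 25 MB.
	if err := splitRecording("meeting.mp3", "/tmp/chunks", 600); err != nil {
		log.Fatal(err)
	}
}
```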

The MM Webapp does not export AdvancedCreateComment directly. Instead it sets a bunch of incorrect parameters and only allows changes to `placeholder` and `onSubmit`. See: https://github.com/mattermost/mattermost/blob/master/webapp/channels/src/plugins/exported_create_post.tsx#L15 This caused...

Adding LLM capabilities to https://github.com/mattermost/mattermost-plugin-playbooks/ seems like a natural fit. What can we do to enhance the playbooks experience with LLMs? A good start might be to automatically fill in...

Add a feature to the postbox where, after the user enters a post, they can ask which channel the LLM thinks the post should go in. We can supply the... A hypothetical prompt-building sketch follows below.
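
Assuming the truncated sentence refers to supplying a list of candidate channels, a first cut could be a prompt that lists each channel's name and purpose and asks the model to pick one. Everything below (the `Channel` struct, the helper name, the wording) is hypothetical; the real channel types live in the Mattermost model package.

```go
package main

import (
	"fmt"
	"strings"
)

// Channel is a minimal stand-in for the channel metadata the plugin would
// actually have available.
type Channel struct {
	Name    string
	Purpose string
}

// buildChannelSuggestionPrompt asks the LLM to pick the best channel for a
// draft post from a supplied candidate list.
func buildChannelSuggestionPrompt(draft string, candidates []Channel) string {
	var b strings.Builder
	b.WriteString("Given the draft post below, pick the single most appropriate channel from the list and answer with its name only.\n\nChannels:\n")
	for _, c := range candidates {
		fmt.Fprintf(&b, "- %s: %s\n", c.Name, c.Purpose)
	}
	fmt.Fprintf(&b, "\nDraft post:\n%s\n", draft)
	return b.String()
}

func main() {
	channels := []Channel{
		{Name: "developers", Purpose: "Engineering discussion"},
		{Name: "marketing", Purpose: "Campaigns and announcements"},
	}
	fmt.Println(buildChannelSuggestionPrompt("The nightly build is failing on the bundle step.", channels))
}
```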