
Feature Request: MiniCPM 2.6 model support?

Open ttamoud opened this issue 1 year ago • 2 comments

Prerequisites

  • [X] I am running the latest code. Mention the version if possible as well.
  • [X] I carefully followed the README.md.
  • [X] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [X] I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

I'd like to begin by expressing my sincere gratitude for your outstanding contributions. Your efforts have been instrumental in supporting and advancing the open-source community.

It would be fantastic to have support for an 8-billion-parameter vision model that can truly rival the performance of leading proprietary models.

Motivation

A SOTA open-source VLM with only 8B parameters; a piece of art that rivals top models.


Possible Implementation

No response

ttamoud avatar Aug 10 '24 20:08 ttamoud

https://ollama.com/xuxx/minicpm2.6

An Ollama installation package is available for Windows, and the MiniCPM 2.6 model can be run through it.
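Assuming Ollama is installed, the community build linked above could be fetched and run from the command line roughly like this (the model tag `xuxx/minicpm2.6` is taken from the URL above and may change; this is a sketch, not an official llama.cpp workflow):

```shell
# Pull the community-published MiniCPM 2.6 build from the Ollama registry
ollama pull xuxx/minicpm2.6

# Start an interactive chat session with the pulled model
ollama run xuxx/minicpm2.6
```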

xuhongming251 avatar Aug 11 '24 13:08 xuhongming251

https://github.com/ggerganov/llama.cpp/pull/8967

rick-github avatar Aug 11 '24 16:08 rick-github

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar Sep 26 '24 01:09 github-actions[bot]