
Feature Request: Support for Deepseek Janus-Pro-7B & Janus-1.3B

YorkieDev opened this issue 11 months ago

Prerequisites

  • [x] I am running the latest code. Mention the version if possible as well.
  • [x] I carefully followed the README.md.
  • [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [x] I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

DeepSeek recently released Janus-Pro-7B and Janus-1.3B, two multimodal models that are currently supported in Transformers (a minimal loading sketch is included below).

Resources: Janus GitHub
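
For reference, here is a minimal sketch of loading the released weights through the standard Transformers entry point, which is the usual starting point when mapping a new architecture onto llama.cpp's GGUF conversion. The repo id `deepseek-ai/Janus-Pro-7B` and the use of `trust_remote_code=True` follow the Hugging Face model card; the Janus-specific chat/image processor lives in DeepSeek's `janus` package on GitHub and is omitted here, so treat this as an illustrative assumption rather than a complete inference example.

```python
# Minimal sketch: load Janus-Pro-7B via Hugging Face Transformers.
# Assumes the model card's repo id and custom modeling code; the
# multimodal processor from DeepSeek's `janus` package is not shown.
import torch
from transformers import AutoModelForCausalLM

model_path = "deepseek-ai/Janus-Pro-7B"  # or "deepseek-ai/Janus-1.3B"

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    trust_remote_code=True,     # Janus ships custom modeling code
    torch_dtype=torch.bfloat16,
)
model.eval()

# Inspecting the config and module tree is a common first step when
# writing a conversion path for llama.cpp (convert_hf_to_gguf.py).
print(model.config)
print(model)
```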

Motivation

Adding them to llama.cpp would enable efficient local inference, expanding support for state-of-the-art multimodal AI. Would love to see this integrated—appreciate all the great work!

Possible Implementation

No response

YorkieDev avatar Jan 29 '25 14:01 YorkieDev

Yes, Janus-Pro-1B would also be a nice model to run in the browser using WebAssembly.

bil-ash avatar Feb 02 '25 14:02 bil-ash

+1

Forevery1 avatar Feb 14 '25 19:02 Forevery1

Same here, this one is a must. It is not yet available for Open WebUI or on Macs.

Akossimon avatar Mar 07 '25 15:03 Akossimon

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar Apr 22 '25 01:04 github-actions[bot]