
Investigate integrating a local language model

Open · fwcd opened this issue 1 year ago · 1 comment

Highly quantized language models that can run locally are becoming increasingly popular, with even Chrome shipping a Gemini Nano model in its latest canary builds. Models like Phi-3-mini already achieve impressive performance for their comparatively small size and support cross-platform inference via a Rust library named candle.

It would be cool if we could bundle such a model with D2, e.g. as a command and/or as a Conversator.
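A minimal sketch of what such an integration point could look like. Note that all names here (`Conversator`, `LocalModelConversator`) are illustrative, not D2's actual API, and the backend is stubbed rather than wired to a real model:

```rust
// Hypothetical interface mirroring the Conversator idea: given the
// conversation so far, produce a reply.
trait Conversator {
    fn converse(&mut self, history: &[String]) -> String;
}

// Stub standing in for a locally running quantized model (e.g. a
// Phi-3-mini loaded through candle or llama.cpp bindings). A real
// backend would tokenize the history, run inference, and detokenize
// the sampled output instead of echoing.
struct LocalModelConversator {
    system_prompt: String,
}

impl Conversator for LocalModelConversator {
    fn converse(&mut self, history: &[String]) -> String {
        // Placeholder logic: reference the last message rather than
        // performing real inference.
        let last = history.last().map(String::as_str).unwrap_or("");
        format!("[{}] re: {}", self.system_prompt, last)
    }
}

fn main() {
    let mut bot = LocalModelConversator { system_prompt: "d2".into() };
    let reply = bot.converse(&["hello".into()]);
    println!("{}", reply); // prints "[d2] re: hello"
}
```

Exposing this both as a one-shot command and as a `Conversator` would let the same backend serve explicit invocations and ambient conversation.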

fwcd · Jul 07 '24 12:07

llama.cpp and llama.swift would be worth investigating, though the latter may primarily target Apple platforms.

fwcd · Jul 07 '24 12:07