Matt Hamann

Results: 43 issues by Matt Hamann

I recently upgraded to the latest version of pinojs, which pulls in this library. In the previous version of pino, I could do something like this: ``` log.info('Uhoh, an error...

:robot: I have created a release *beep* *boop*

---

## [0.0.19](https://github.com/llama-farm/llamafarm/compare/v0.0.18...v0.0.19) (2025-12-01)

### Features

* **designer:** better ux for day 2 users ([#509](https://github.com/llama-farm/llamafarm/issues/509)) ([7cbbafe](https://github.com/llama-farm/llamafarm/commit/7cbbafea1a0d6ef80c9e79bad318184f49ea169b))

### Bug Fixes

* **api:** Support...

autorelease: pending
Review effort 1/5

If a previous version of LlamaFarm is present on the machine in the default data dir location (i.e. `.llamafarm`), the app doesn't seem to flow through the process whereby the...

component::installer

LF Version: 0.0.18

When attempting to start LlamaFarm, the error dialog that appears on start-up when something goes wrong ends up directly behind the splash screen. Most users won't know...

component::designer

This is more of a note for docs, but the `build-essential` apt dependency needs to be installed on Ubuntu in order for the universal runtime to start properly. It should...
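For the docs, the corresponding install step on Ubuntu is the standard one:

```shell
# build-essential provides the compiler toolchain (gcc, g++, make, libc dev
# headers) that the universal runtime needs to start on Ubuntu.
sudo apt-get update
sudo apt-get install -y build-essential
```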

documentation
component::runtime

### **PR Type**

Enhancement

___

### **Description**

- Add multimodal message support for images, videos, and audio files
- Implement base64 encoding for media files in CLI and API
- ...

Review effort 3/5

We need to add first-class support for image input and output within LlamaFarm pipelines and the assistant runtime. This will allow users to build and run workflows that accept images...

When the server URL is a local machine address, check for Docker support and show a friendly, helpful error message if Docker is unavailable....

component::cli

In order to prevent scary warnings when running the binaries downloaded from GitHub, it's important that we properly sign them using a Microsoft code signing certificate.
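For reference, signing a binary with SignTool (part of the Windows SDK) looks roughly like this; the binary name and timestamp server below are placeholders, not project specifics:

```shell
# /fd SHA256  - file digest algorithm
# /a          - auto-select the best available signing certificate
# /tr + /td   - RFC 3161 timestamp server and its digest algorithm
signtool sign /fd SHA256 /a /tr http://timestamp.digicert.com /td SHA256 lf.exe
```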

component::cli

Currently, certain LlamaFarm CLI commands (e.g., `lf projects run`, `lf projects chat`, `lf pipelines execute`) assume that the necessary background services (e.g., local API server, vector DB, model runtime) are...
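A preflight probe before such commands could look like the sketch below, assuming a Node.js CLI; the default URL and the `/health` endpoint are assumptions, not LlamaFarm's actual defaults:

```javascript
// Check whether the local API server is reachable before running a command
// that depends on it. Returns true when the health endpoint answers OK,
// false when the server is down or unreachable.
async function ensureServerRunning(baseUrl = 'http://127.0.0.1:8000') {
  try {
    const res = await fetch(`${baseUrl}/health`); // built-in fetch, Node 18+
    return res.ok;
  } catch {
    return false; // connection refused, DNS failure, etc.
  }
}
```

A CLI command handler could await this and either auto-start the services or print a clear "run `lf` to start the server first"-style hint instead of failing with an opaque connection error.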