Results: 14 comments by Yoni

Turning off "Avoid crossing perimeters" works for me. On: ![image](https://github.com/prusa3d/PrusaSlicer/assets/22361729/ba14ca0b-0d28-4365-818b-3c71405be37f) Off: ![image](https://github.com/prusa3d/PrusaSlicer/assets/22361729/b4ad378a-bea5-42e4-9fb2-20bc5b971471) But what's weird is that in vase mode it still generates seams: ![image](https://github.com/prusa3d/PrusaSlicer/assets/22361729/2d9d0b65-1ec0-47fb-8f2f-49e3a04e877e) but then it's gone when using...

This is so random: I tried with 0 retraction, and the seams are gone. 0.5 mm retraction: ![image](https://github.com/prusa3d/PrusaSlicer/assets/22361729/09739e38-c65d-4a3e-80c9-47321dbecb03) 0 mm retraction: ![image](https://github.com/prusa3d/PrusaSlicer/assets/22361729/34544b99-1ebe-416b-9e54-e06363db9fa3)

Did you also turn off the "Avoid crossing perimeters" parameter? If I turn it on, even with 0 retraction the weird "seams" are still there. But you're probably right, this thing is inherent....

Please also add that function calling also depends on the AI platform. Even if the model can suggest function calls, if the AI platform doesn't support them then it won't...
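To illustrate the point, here is a minimal sketch of gating tool definitions on platform support before building a chat request. The `platform_supports_tools` flag and the payload shape are illustrative assumptions, not any platform's real API:

```python
# Hypothetical guard: only attach tool definitions when the serving
# platform advertises function-calling support. The flag and field
# names here are assumptions for illustration.
def build_chat_request(messages, tools, platform_supports_tools):
    request = {"model": "any-model", "messages": messages}
    if platform_supports_tools and tools:
        # Only platforms that implement a tools API will honor this field.
        request["tools"] = tools
    return request

req = build_chat_request(
    messages=[{"role": "user", "content": "What's the weather?"}],
    tools=[{"type": "function", "function": {"name": "get_weather"}}],
    platform_supports_tools=False,
)
print("tools" in req)  # False: the model may suggest calls, but the platform gate wins
```

Even a model trained for function calling cannot have its suggestions executed if the serving layer never forwards or interprets the `tools` field.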

Please correct me if I'm wrong. I think the documentation is misleading, or it's a typo, I'm not sure. From what I see, the error is what it says it is: it's not implemented....

If you want to experiment with it, I've replaced the implementation in [my fork](https://github.com/yonitjio/LocalAI/commit/93545fa863d0ed46b3ccb0c37697c65cebd1b64b#diff-8a8d2f30427af18ec5f0755016c713867761f757794c5859792fb1868abe7bd4). I replaced the `llama.cpp` parts in `grpc-server.cpp` with a more recent server example from [llama.cpp](https://github.com/ggerganov/llama.cpp) along...

For now, the solution is to use another backend. I just checked: there is still no embedding code in `grpc-server.cpp`.

I don't know if this helps since I don't use Extended OpenAI Conversation. In my case, this occurs if the array items type is `object`. I "fixed" it with using...

This should be merged with issue https://github.com/mudler/LocalAI/issues/1617

In my case, this is due to the Mistral tokenizer falling back to the fast tokenizer, which left `sp_model` missing; installing `sentencepiece` solved it for me. But then I get an error...
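A small sketch of the kind of check involved, assuming the missing-`sp_model` symptom comes down to `sentencepiece` not being importable (the helper name is made up):

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# Slow, sp_model-based tokenizers need sentencepiece; without it, a
# library may silently fall back to the fast tokenizer instead.
if not has_module("sentencepiece"):
    print("Missing dependency; try: pip install sentencepiece")
```

Checking this up front makes the fallback visible instead of surfacing later as a confusing missing-attribute error.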