Getting "Buffer Validation: newBufferWith must not exceed 256 MB" error when trying to load a 2.5 GB model
I downloaded the Google iOS sample of MediaPipe and tried to load my model, which is 2.5 GB in size.
import MediaPipeTasksGenAI

private var inference: LlmInference! = {
    // slm1.bin is my model
    let path = Bundle.main.path(forResource: "slm1", ofType: "bin")!
    let llmOptions = LlmInference.Options(modelPath: path)
    return LlmInference(options: llmOptions)
}()
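For context, once the model loads, the instance above is meant to be used roughly like this. This is a sketch assuming the documented generateResponse(inputText:) call from MediaPipeTasksGenAI, which throws:

do {
    // Hypothetical prompt; uses the `inference` property defined above.
    let answer = try inference.generateResponse(inputText: "Tell me a joke")
    print(answer)
} catch {
    print("Inference failed: \(error)")
}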
The project builds successfully, but on app launch I get the following error:
-[MTLDebugDevice newBufferWithBytes:length:options:]:670: failed assertion `Buffer Validation
newBufferWith*:length 0x1f400000 must not exceed 256 MB.
I came to know that a single MTLBuffer is limited to a maximum length of 256 MB. If a total allocation of more than 256 MB is needed, multiple buffers can be allocated and the data split among them, but I don't know how to do that with this SDK.
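For reference, here is a minimal sketch of that general Metal technique: splitting one large blob across several MTLBuffers that each stay under the device limit. makeChunkedBuffers is a hypothetical helper, not a MediaPipe API; for it to help with LlmInference, the chunking would have to happen inside the SDK, since the assertion shows the SDK itself making the oversized newBufferWithBytes call.

import Metal
import Foundation

// Sketch of the general Metal technique, not a MediaPipe API:
// split one large blob across several MTLBuffers so each chunk
// stays under the device limit (see MTLDevice.maxBufferLength).
func makeChunkedBuffers(from data: Data,
                        device: MTLDevice,
                        maxChunkSize: Int = 256 * 1024 * 1024) -> [MTLBuffer] {
    let chunkSize = min(maxChunkSize, device.maxBufferLength)
    var buffers: [MTLBuffer] = []
    var offset = 0
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        while offset < data.count {
            let length = min(chunkSize, data.count - offset)
            let base = raw.baseAddress!.advanced(by: offset)
            if let buffer = device.makeBuffer(bytes: base,
                                              length: length,
                                              options: .storageModeShared) {
                buffers.append(buffer)
            }
            offset += length
        }
    }
    return buffers
}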
Hi @Pratik-7i,
This is already on our roadmap. While we have not yet tested the iOS interface with other large models, we are actively working to ensure compatibility with them, and this functionality will be available soon. Regarding the scenario you are currently facing, please allow us some time to determine whether we can assist with it.
Thank you!!
Hi @priankakariatyml,
Do you know a way to allocate multiple buffers and distribute data among them in the SDK? Any recommendations would be greatly appreciated.
Thank you!!
Thanks @kuaashish. We actually need to integrate a large model into our live project. Can you tell us when this functionality might be available?
Hi @Pratik-7i,
We are unable to provide an exact date at this time. However, rest assured that it will be available soon. We will keep you informed of any updates through the same thread.
Thank you!!
Thanks @kuaashish. We will be waiting for an update.
Any update on this?
Encountered the same error in the iOS sample with a model larger than 2.5 GB.
Any update??? Almost 3 months have passed...