Does this example work with a live camera?
What do we need to change to make this example work with a live camera?
Hi,
We have a live camera example here: https://github.com/Unity-Technologies/sentis-samples/blob/main/DepthEstimationSample/Assets/Scripts/InferenceWebcam.cs
Sorry, I meant the current Barracuda repo. Do you have an example for live object recognition, or is it not possible?
The latest Barracuda version is called Sentis. You can try this one: https://huggingface.co/unity/sentis-yolotinyv7
Thanks. The URL you pasted (https://huggingface.co/unity/sentis-yolotinyv7) classifies a static image or frames from a pre-recorded video. I tried replacing the video texture with a webcam texture for live classification, and I have already attached the RunYolo script to the main camera. I only get a blank white screen, so I want to know whether this is the right approach. Is there any example available for live/real-time classification with bounding boxes?
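For reference, swapping the VideoPlayer texture for a WebCamTexture is broadly the right idea. Below is a minimal sketch of how the per-frame feed could look, assuming the Sentis 1.x API (ModelLoader, WorkerFactory, TextureConverter); exact class names may differ in your Sentis version, and the box-decoding part is left to the RunYOLO sample. The 640 input size is an assumption about the model.

```csharp
using UnityEngine;
using Unity.Sentis;

// Sketch: feed a live WebCamTexture into a Sentis model each frame.
// Assumes Sentis 1.x API; adapt names to your installed version.
public class WebcamYoloSketch : MonoBehaviour
{
    public ModelAsset modelAsset;   // the yolotinyv7 model asset
    const int inputSize = 640;      // assumed model input resolution

    WebCamTexture webcamTexture;
    IWorker worker;

    void Start()
    {
        // Start the default device camera.
        webcamTexture = new WebCamTexture();
        webcamTexture.Play();

        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    void Update()
    {
        if (!webcamTexture.didUpdateThisFrame) return;

        // Convert the current camera frame to a tensor and run inference.
        using Tensor input = TextureConverter.ToTensor(webcamTexture, inputSize, inputSize, 3);
        worker.Execute(input);
        var output = worker.PeekOutput() as TensorFloat;
        // ...decode boxes/classes from `output` and draw them,
        //    the same way the RunYOLO sample does for video frames...
    }

    void OnDisable()
    {
        worker?.Dispose();
        webcamTexture?.Stop();
    }
}
```

A blank white screen usually means the display material/RawImage is not being given the webcam texture (or the camera has not started yet), rather than an inference problem, so check that the texture you render is the same WebCamTexture you pass to the model.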