[Feature Request] Add camera sharing support
This is a feature request. For example, I'd like to take pictures in my VM using my iDevice's built-in cameras. I'm not sure it's even possible, and I don't expect you to actually add camera support to the VM, but if it is possible, that would be great. If not, then instead of camera support, please add microphone support. I'm not sure whether UTM already has it, but if it doesn't, mic support would let users properly use a sound recorder inside the VM. Again, I don't really expect you to implement all of these features. I have many more feature requests that would make UTM a better app, but I don't want to spam the issues section with all of them.
Essentially, take the device's webcam and funnel the signal into a virtual USB video device for the guest OS? This may be slow, considering the speed of the virtual processor and the lack of hardware-accelerated video.
Will virtual USB video devices be added in UTM v2.0? I think this feature would be very cool. Which parts of the code need to be modified? I'd like to have a look and try modifying it.
I'm going to boldly venture that this would likely necessitate a fully open-source implementation of something like Flexihub/USB Network Gate (USB over IP).
Thank you so much already!
Yes please, it would be amazing!
I don't know if this is useful, but with some options to ffmpeg, I was able to send my camera feed over IP to my VM (Linux has an option to import a stream as a proper camera). The main issues I had were that (1) the camera is always on, and (2) there's a small delay (~2 seconds). I don't know whether these limitations are acceptable to anyone, though.
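For reference, here's a minimal sketch of that ffmpeg-over-IP setup. The device index, port, and `HOST_IP` are placeholders for your own setup, and the exact flags may need tuning:

```shell
# On the macOS host: capture the default camera (AVFoundation device 0)
# and serve it as a low-latency MPEG-TS stream over TCP.
ffmpeg -f avfoundation -framerate 30 -i "0" \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f mpegts "tcp://0.0.0.0:5000?listen"

# In the Linux guest: load v4l2loopback to create a virtual camera
# device, then feed the incoming stream into it.
sudo modprobe v4l2loopback video_nr=10 card_label="Host Camera"
ffmpeg -i tcp://HOST_IP:5000 -f v4l2 -pix_fmt yuv420p /dev/video10
```

Guest applications that use V4L2 should then list the loopback device as a regular camera. The always-on limitation comes from the host-side capture running continuously.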
Also, instead of creating a virtual USB device, why not modify gstreamer directly? Is there a reason it can't be added there?
Actually, from searching online, gstreamer already has support, though you might need gst-plugins-bad.
EDIT: Oh, now I know why we need a USB solution: SPICE doesn't support camera input, even though gstreamer does.
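A quick way to verify the GStreamer-side support mentioned above (element names assumed; `avfvideosrc` is the macOS AVFoundation capture element shipped in gst-plugins-bad):

```shell
# Confirm the AVFoundation capture plugin is installed.
gst-inspect-1.0 avfvideosrc

# Preview the host camera through a GStreamer pipeline.
gst-launch-1.0 avfvideosrc ! videoconvert ! autovideosink
```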
To virtualize USB devices, you may have to use IOUSBHost.framework. The problem is that this requires the com.apple.usb.hostcontrollerinterface entitlement, which must be requested from Apple, even if you don't plan on distributing the app.
I expanded my Python script into a full-fledged Go application: https://github.com/DUOLabs333/av-forward. You run the server on the macOS host and the client in the Linux guest (it's the same application, just compiled separately for Linux and macOS). On the host, you need ffmpeg; in the guest, you need v4l2loopback and ffmpeg (both should be packaged by most distributions). It works over IP on the local network.
It's still very beta, and it's my first non-trivial Go application, so any bug reports will be appreciated. There are two known issues: a slight (~0.1 s) delay in the camera, and a noticeable lack of quality. I'm working on both.
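Since it's one Go codebase built for both sides, cross-compiling follows the usual Go routine. A sketch (the output binary names and target architectures are my own placeholders; the guest additionally needs the v4l2loopback module loaded):

```shell
git clone https://github.com/DUOLabs333/av-forward
cd av-forward
GOOS=darwin GOARCH=arm64 go build -o av-forward-host .   # runs on the macOS host
GOOS=linux  GOARCH=amd64 go build -o av-forward-guest .  # copy this into the guest

# Inside the guest, before starting the client:
sudo modprobe v4l2loopback
```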
I found that CPU usage increases dramatically when streaming, probably because ffmpeg can't use the GPU. So fixing the latency will partly depend on improving GPU support (likely by transitioning to gfxstream).
I'm planning to start working on a USB driver. This will allow for greater compatibility: instead of having to make a virtual camera for each platform, each platform just needs to support USB cameras, which is almost always the case.
However, to make the USB device without having to deal with DriverKit, I'm using IOUSBHostControllerInterface. According to here, this will not work on versions <12. I'm on 11, and I get the "Unable to connect to the kernel" error. Can someone check the code I wrote and see whether you get any error message? driver.txt
To run:
- Rename to .m
- Run:
```shell
gcc driver.m -lobjc -framework IOUSBHost -framework Foundation && ./a.out
```
Hey, I was trying to get the camera working in my UTM VM and found this. I'm running a macOS VM on my M1 Mac. Will this work for it, and if so, can you please list the steps to follow for UTM?
No, it currently only works with Linux guests. I'm planning to rewrite it to use a USBHost driver on macOS, so it should be compatible with all OSes that support USB cameras. However, I need to finish up implementing Vulkan first.
If anyone is on a version > 11, can you test the permission check, to see whether it's only possible on >11 or whether I have to set up a provisioning profile?
I've hit a blocker on the Vulkan implementation, which gives me time to actually work on the driver.
It seems that making a USB driver requires a provisioning profile, which can only be obtained through Xcode. However, I don't have enough space to download the 11 GB tool, let alone install it. I'll probably have to buy a flash drive to get this working.
I got Xcode installed and set up, and I can build the app with a provisioning profile. However, now I get "User doesn't have permission to launch the app (managed networks)", and the com.apple.developer.usb.host-controller-interface entitlement is not found in the provisioning profile. Maybe someone more familiar with this could jump in?
EDIT: It seems that the free profile doesn't allow you to use com.apple.developer.usb.host-controller-interface. If so, then I can't do this at all.
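For anyone debugging the same thing, you can check whether a profile or a signed binary actually carries the entitlement (file paths below are placeholders):

```shell
# A provisioning profile is a CMS-signed plist; dump it and search for
# the USB entitlement.
security cms -D -i "MyProfile.provisionprofile" | grep -i usb

# Inspect the entitlements actually embedded in a signed binary.
codesign -d --entitlements :- ./a.out
```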
+1
@DUOLabs333 I want to do the same thing as you and make a virtual USB device. For the entitlement issue, you can disable SIP to bypass it. I'd like to know whether you've made any progress on the virtual USB device creation. Are there any docs that can be referred to? Thanks!
I already have SIP disabled. I never did figure out how to make virtual devices without Apple's permission.
🤷🏻‍♂️🤦🏻‍♂️
why?
I'm not sure if anyone still cares, but I found out that you can create a virtual camera for macOS (and Windows) with https://github.com/webcamoid/akvirtualcamera. My tool (which I rewrote, again; I went back to the streaming approach, since it seems like creating a virtual USB device is virtually (heh) impossible on macOS) could use that to support macOS. I don't use non-Linux VMs, so I can't test it; however, if someone is willing to open a PR, I'll merge it in.
@DUOLabs333, thanks!
This issue was about sharing the host macOS camera with a virtual macOS guest, as I remember. Anyway,
> however, if someone is willing to open a PR
That would be awesome.
> it seems like creating a virtual USB device is virtually (heh) impossible on macOS
How do you think software like VirtualHere does this, since they seem to have a working macOS client?
> To virtualize USB devices, you may have to use IOUSBHost.framework. The problem is that this requires the com.apple.usb.hostcontrollerinterface entitlement, which must be requested from Apple, even if you don't plan on distributing the app.
I'm guessing they can do this because they won a (painfully long) battle against Apple and got the entitlement for their app?
@DUOLabs333, this is a very interesting problem that I'd like to tackle. If you have any (more) insights to share, they would be highly appreciated!
> How do you think software like VirtualHere does this, since they seem to have a working macOS client?
Given that there is almost no documentation for this, it is either (1) trial and error, (2) reading through what little source code is available online, or (3) working directly with Apple. My bet is on (1).
> @DUOLabs333, this is a very interesting problem that I'd like to tackle. If you have any (more) insights to share, they would be highly appreciated!
The best I can give you is the code from my initial attempts at piecing together the documentation to build the driver. It is nowhere near done, but at least it can act as a starting point.