ffmpeg ("lenscorrection" or "fisheye") with docker-installed Motioneye?
Hello all
Question:
How do I process an RTSP camera stream with some ffmpeg directives and then use motioneye to display it please?
Background:
On either of two Linux machines, in each case using Dockge and a Docker Compose YAML, motionEye is running perfectly.
Out of the nine cameras I am displaying, one is an EZViz doorbell camera with a fisheye lens. I have been able to convert the fisheye image to a rectilinear (Cartesian) one using ffplay and an ffmpeg filter string like this:
$ ffplay -i rtsp://user:password@ip_addr:554/Streaming/Channels/101 -vf lenscorrection=k1=-0.450:k2=0.1
(The results of my experiments are shown here.)
I hope to be able to display the corrected image in motioneye but I don't know how to approach the problem.
1 - can it be done?
2 - would it be something that gets handled within the Docker Compose file (I think so) or somehow within motionEye itself (I don't think so)?
3 - can you point me towards a tutorial please, or otherwise illustrate how I might achieve my aims? I am not particularly knowledgeable, although I am a long-time Linux user. (By way of example, I can sudo my way around an OS, I can follow instructions to edit scripts and the like, and I can git clone and compile locally, but originating my own ten-line bash script or C program would be entirely beyond me!)
Thanks!
EB
I wonder if there are other (better?) ways to do such a thing, but based on my understanding of the options, you could run ffmpeg as a kind of service that provides the transformed video stream, and then configure that stream in MotionEye as an ordinary network camera. I mean something like what is described here: https://ffmpeg-api.com/learn/ffmpeg/recipe/live-streaming#ffmpeg-for-live-streaming
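A sketch of what that ffmpeg service might look like, untested and with assumptions: it supposes a local RTSP server (e.g. MediaMTX) is listening on port 8554, and the `/doorbell` path name is made up. ffmpeg pulls the camera stream, applies the same lenscorrection filter as in the ffplay experiment, re-encodes, and publishes the result locally for MotionEye to pick up:

```
# Pull the doorbell stream, undo the fisheye distortion, and
# re-publish to a local RTSP server that MotionEye can consume.
# Hostname "localhost" and path "/doorbell" are assumptions.
ffmpeg -rtsp_transport tcp \
  -i rtsp://user:password@ip_addr:554/Streaming/Channels/101 \
  -vf lenscorrection=k1=-0.450:k2=0.1 \
  -c:v libx264 -preset veryfast -tune zerolatency -an \
  -f rtsp rtsp://localhost:8554/doorbell
```

MotionEye would then be pointed at rtsp://&lt;host&gt;:8554/doorbell as a network camera. Note the filter output has to be re-encoded (here with libx264), which costs some CPU compared with passing the original stream through untouched.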
How you set up that ffmpeg process to run is up to you, I guess. Since you are already using Docker for ME, it would be reasonable to have Docker Compose run both. I'm not versed enough with Docker to suggest a config, but I wouldn't expect it to be too difficult. To keep it running steadily, you'd eventually want to wrap that ffmpeg process in some sort of script (or use a container restart policy) that restarts the stream transformation automatically if it gets killed for whatever reason.
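For what it's worth, here is one way that Compose file could look. This is an untested sketch: the image tags (bluenviron/mediamtx, linuxserver/ffmpeg, ccrisan/motioneye), the service names, and the `/doorbell` path are all my assumptions, and the camera credentials are placeholders. The `restart: unless-stopped` policy takes the place of a hand-rolled restart script:

```yaml
# Sketch only - image tags, credentials and paths are assumptions.
services:
  mediamtx:
    image: bluenviron/mediamtx:latest   # lightweight RTSP server to host the corrected stream
    restart: unless-stopped
    ports:
      - "8554:8554"

  fisheye-fix:
    image: linuxserver/ffmpeg:latest    # entrypoint is ffmpeg, so "command" holds its arguments
    restart: unless-stopped             # Docker restarts ffmpeg if it dies
    depends_on:
      - mediamtx
    command: >
      -rtsp_transport tcp
      -i rtsp://user:password@ip_addr:554/Streaming/Channels/101
      -vf lenscorrection=k1=-0.450:k2=0.1
      -c:v libx264 -preset veryfast -an
      -f rtsp rtsp://mediamtx:8554/doorbell

  motioneye:
    image: ccrisan/motioneye:master-amd64
    restart: unless-stopped
    ports:
      - "8765:8765"
    volumes:
      - ./motioneye:/etc/motioneye
```

In MotionEye you would then add a network camera pointing at rtsp://mediamtx:8554/doorbell (or the host's IP and port 8554 from outside the Compose network).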
@zagrim it hadn't occurred to me to approach it as a problem where I could pre-process and then feed ME ...a nice idea. Thanks!
I'm a Docker beginner, but your suggestion sounds pretty feasible. I'll start experimenting (at least with Docker, when I cock it all up, I can just start again) and I'll also look at the URL you showed me.
Did you ever find a solution to your problem? I have a similar challenge, but I am running the motionEye add-on for Home Assistant on an SBC, so I don't know if I can even do what you were looking into.
Sorry for the late reply: no, I ended up doing everything in Frigate in such a way as to not generate much of a burden on the host hardware, and hence I have stopped using MotionEye now.