Unified camera support
Problem
I've been following the Klipper ecosystem for some time and see a few missing features:
- ability to have timelapse
- ability to have a picture of a print after the print completes
- a way to support RTSP cameras (pretty common for Xiaomi cameras using unofficial software)
- a way to have a consistent configuration of cameras across all frontends
- have Moonraker, in some cases, produce a video stream/snapshot for use in Mainsail/Fluidd
- provide an out-of-the-box experience: if a camera is configured in Moonraker, it just works regardless of type, whether a USB-connected V4L device, an RTSP camera, or a simple IP camera that outputs an h264 stream
- Moonraker runs whatever is needed to provide output: live555, mjpg-streamer, or anything else that can transform the output into the desired form
Today:
- each frontend defines its own configuration of cameras
- support for different camera types differs between frontends, and you have to configure cameras multiple times
Design
Camera configuration
```ini
[camera default]
type = mjpg / v4l / rtsp
snapshot_url =    # used by mjpg
stream_url =      # used by mjpg or rtsp
device =          # used by v4l
fps =             # can be provided in each case to transform the stream (?)
resolution =      # can be provided in each case to transform the stream (?)
flip_horizontal = false | true
flip_vertical = false | true
proxy = false | true  # whether the camera is accessed directly or via Moonraker; by default Moonraker tries to expose direct access
```
```ini
[camera my_mjpg]
type = mjpg
snapshot_url = http://my-webcam/snapshot
# stream_url: set if available; then the adaptive `mjpg-streamer` is used,
# which requests snapshots fps times per second
stream_url = http://my-webcam/stream
fps = 30

[camera my_v4l]
type = v4l
device = /dev/video0
resolution = 1920x1080

[camera my_rtsp]
type = rtsp
stream_url = rtsp://my-camera:12315/stream.ch
```
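For illustration, here is a minimal sketch of how such `[camera <name>]` sections could be parsed into plain dicts. The section and option names follow the proposal above; the loader itself, including its defaults, is an assumption, not Moonraker's actual config code:

```python
import configparser

# Camera types from the proposal above
CAMERA_TYPES = {"mjpg", "v4l", "rtsp"}

def load_cameras(text: str) -> dict:
    """Parse [camera <name>] sections into plain dicts, applying the
    proposed defaults for the flip/proxy options. Hypothetical sketch."""
    # interpolation=None so values containing '%' survive untouched
    parser = configparser.ConfigParser(interpolation=None)
    parser.read_string(text)
    cameras = {}
    for section in parser.sections():
        if not section.startswith("camera "):
            continue
        name = section.split(" ", 1)[1]
        opts = dict(parser.items(section))
        if opts.get("type") not in CAMERA_TYPES:
            raise ValueError(f"camera {name}: unknown type {opts.get('type')!r}")
        # proposed defaults: no flipping, direct access exposed
        opts.setdefault("flip_horizontal", "false")
        opts.setdefault("flip_vertical", "false")
        opts.setdefault("proxy", "false")
        cameras[name] = opts
    return cameras
```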
Camera API
The frontend can request `/server/camera/list`, which returns all available cameras with as many different access patterns as possible, so the frontend can pick the most efficient one:
- if h264 is available, we should prefer an mkv/mp4/ts container, as this is the least amount of data to transfer and allows a stream copy
- if mjpg is available, we provide a stream of jpg frames
- we try our best to transform the stream into the desired output
- an added benefit is that `octoprint_compat` can also use the camera configuration, from which e.g. Cura can present the mjpg stream
```json
{
  "default": {
    "type": "mjpg/v4l/rtsp",
    "stream_video_url": "a url to a video output if available",
    "stream_mjpg_url": "a url to a mjpg output (always available but comes with a performance/transfer penalty)",
    "snapshot_url": "a url to a single frame capture",  # always available
    "flip_horizontal": "false|true",  # only provided if the frontend must perform the transformation
    "flip_vertical": "false|true"     # only provided if the frontend must perform the transformation
  }
}
```
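To make the mapping concrete, here is a sketch of how one `/server/camera/list` entry could be derived from a camera's configuration. The field names mirror the proposed response; the mapping logic and the transcode/proxy URLs are invented placeholders, not an existing API:

```python
def camera_list_entry(cfg: dict) -> dict:
    """Map one parsed [camera] section to a hypothetical list entry."""
    cam_type = cfg["type"]
    entry = {"type": cam_type}
    if cam_type == "rtsp":
        # h264 video is available directly; mjpg/snapshot would be transcoded
        entry["stream_video_url"] = cfg["stream_url"]
        entry["stream_mjpg_url"] = "/server/camera/transcode/mjpg"      # hypothetical
        entry["snapshot_url"] = "/server/camera/transcode/snapshot"     # hypothetical
    elif cam_type == "mjpg":
        entry["stream_mjpg_url"] = cfg.get("stream_url")
        entry["snapshot_url"] = cfg["snapshot_url"]
    elif cam_type == "v4l":
        # the raw device must be wrapped by a streamer Moonraker launches
        entry["stream_mjpg_url"] = "/server/camera/stream"              # hypothetical
        entry["snapshot_url"] = "/server/camera/snapshot"               # hypothetical
    # flips are reported only when the frontend must apply them itself
    for flag in ("flip_horizontal", "flip_vertical"):
        if cfg.get(flag) == "true":
            entry[flag] = True
    return entry
```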
Snapshot Design
The intent of this feature is to hook into `[history]` and provide a post-print capture that can be accessed via `[history]` later:
```ini
[history_snapshot]
camera = my_v4l
snapshot_delay = 5  # capture 5s after the print finishes
```
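As a sketch of how the capture step could work, assuming `ffmpeg` is installed and the camera is a V4L device (the function names and the exact pipeline are my assumptions):

```python
import asyncio

def snapshot_cmd(device: str, out_path: str) -> list:
    """Build an ffmpeg command that grabs a single frame from a V4L device."""
    return [
        "ffmpeg", "-y",
        "-f", "v4l2", "-i", device,  # read from the V4L device
        "-frames:v", "1",            # capture exactly one frame
        out_path,
    ]

async def capture_history_snapshot(device: str, out_path: str, delay: float = 5):
    """Wait `snapshot_delay` seconds after the print ends, then capture."""
    await asyncio.sleep(delay)  # let the toolhead move away first
    proc = await asyncio.create_subprocess_exec(*snapshot_cmd(device, out_path))
    await proc.wait()
    return proc.returncode == 0
```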
Timelapse Design
The intent of this feature is to create a timelapse and attach it to `[history]`. I guess quite similar to https://github.com/FrYakaTKoP/moonraker/blob/dev-timelapse/docs/configuration.md:
```ini
[timelapse]
camera: my_v4l
enabled: True
autorender: True
constant_rate_factor: 23
output_framerate: 30
output_path: ~/timelapse/
frame_path: /tmp/timelapse/
time_format_code: %Y%m%d_%H%M
pixelformat: yuv420p
extraoutputparams:
```
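A sketch of what the `autorender` step could boil down to: a single ffmpeg invocation wiring the options above into flags. The file layout and frame naming pattern are assumptions:

```python
import time

def render_cmd(cfg: dict, frame_pattern: str = "frame_%05d.jpg") -> list:
    """Build an ffmpeg command that renders captured frames into an mp4,
    mapping the proposed [timelapse] options onto ffmpeg flags."""
    out_name = time.strftime(cfg.get("time_format_code", "%Y%m%d_%H%M")) + ".mp4"
    cmd = [
        "ffmpeg", "-y",
        "-framerate", cfg.get("output_framerate", "30"),        # output_framerate
        "-i", cfg["frame_path"].rstrip("/") + "/" + frame_pattern,
        "-c:v", "libx264",
        "-crf", cfg.get("constant_rate_factor", "23"),          # constant_rate_factor
        "-pix_fmt", cfg.get("pixelformat", "yuv420p"),          # pixelformat
    ]
    extra = cfg.get("extraoutputparams", "")
    if extra:
        cmd += extra.split()                                    # extraoutputparams
    cmd.append(cfg["output_path"].rstrip("/") + "/" + out_name)
    return cmd
```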
Impact on History API
Each of the mentioned features extends the History API with this information. The only required change on the frontend is to consume a `camera` key and show any of the provided data. Moonraker will happily serve it in a form where either `<img>` (for a snapshot) or `<video>` (for a timelapse) can be used. No other frontend configuration is truly needed.
```json
{
  "jobs": [
    {
      "end_time": 1623432959.359852,
      "filament_used": 0,
      "filename": "wait_print.gcode",
      "metadata": {
        "size": 450,
        "modified": 1623173866.6310892,
        "slicer": "Unknown",
        "gcode_start_byte": 8,
        "gcode_end_byte": 450
      },
      "camera": {
        "thumbnail_url": "/server/thumbnails/...jpg",
        "snapshot_url": "/server/history_snapshot/my-print-132153242.jpg",
        "timelapse_url": "/server/history_snapshot/my-print-321321312.mp4"
      },
      "print_duration": 0,
      "status": "completed",
      "start_time": 1623432911.2004614,
      "total_duration": 48.258318522945046,
      "job_id": "000068",
      "exists": true
    }
  ]
}
```
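On the frontend side, consuming the `camera` key could be as simple as this sketch. The preference order (timelapse video first, then snapshot, then thumbnail) is my assumption:

```python
def pick_media(job: dict):
    """Return (element, url): which HTML element to render and its source,
    based on the proposed `camera` key of a history job."""
    cam = job.get("camera") or {}
    if "timelapse_url" in cam:
        return ("video", cam["timelapse_url"])  # <video> for the timelapse
    if "snapshot_url" in cam:
        return ("img", cam["snapshot_url"])     # <img> for the snapshot
    if "thumbnail_url" in cam:
        return ("img", cam["thumbnail_url"])    # fall back to the thumbnail
    return (None, None)
```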
This would be a fantastic addition. 👍🏼
I've created a Telegram bot for this. I'm using OpenCV for capturing frames from cams (mjpeg/h264 streams supported). I could try to move my codebase into Moonraker as another plugin if someone is interested in it.
Is there any progress in this direction? Is it possible to connect cameras via RTSP? And how are cameras connected at the moment, and which cameras are supported?
@Suhanovd Moonraker does not handle any camera-related stuff (besides the timelapse component, which is not yet in the master branch), so it highly depends on the OS you installed Moonraker on. As an example, MainsailOS uses mjpg_streamer, which is the same as OctoPi uses, so compatibility should be the same.
RTSP is a whole other story. Since web browsers have no native support for RTSP, you would either need a special plugin embedded in the frontend (Flash, ActiveX, ...) or have to transcode the RTSP stream into something the browser accepts without any trickery. The problem with the latter is that it introduces delay (and a big CPU overhead too). If your IP cam supports streaming to a html5
Btw. I would like to see something like unified camera support controlled by Moonraker, but AFAIK nobody is working on it and I have other priorities.