Self-hosted call.element.io support (like it's done for Jitsi)
Your use case
What would you like to do?
I would like to configure my own Element Call deployment on the server side, so that all of the clients pick it up instead of call.element.io.
Why would you like to do it?
To facilitate self-hosted calls.
How would you like to achieve it?
Ideally, the Element Call service would be advertised through the server; but a .well-known approach, like it's done for Jitsi, would work too.
Have you considered any alternatives?
Yes. The alternative is to rebuild, repackage, and publish all of the Element clients myself. Doable, but it feels a bit unreasonable.
Additional context
https://github.com/element-hq/element-call/issues/2228 https://github.com/element-hq/element-desktop/issues/1566
Bumping this up...
I pieced together the documentation to get an Element Call server going. I have tested calls between Element X Android <> Element X Android, and Element X Android <> Element Web (nightly).
Here are my instructions. If you find something that needs to change, do point it out.
Proxy setup
DNS names
I am assuming that the domain name is mydomain.com. Make sure the following DNS names resolve to a public IP/FQDN (a quick check is sketched after the list):
- call.mydomain.com
- sfu.mydomain.com
- sfu-jwt.mydomain.com
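A quick sanity check once the records are published (dig ships with dnsutils/bind-utils; the names follow the mydomain.com example above):

$ for h in call sfu sfu-jwt; do dig +short $h.mydomain.com; done

Each name should print at least one public address (or a CNAME chain ending in one).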
Nginx proxy config
The following assumes that the Element Call Docker containers are running at 192.168.1.2. This config forwards requests to Element Call / LiveKit running via Docker.
server {
    server_name call.mydomain.com;

    ssl_certificate /etc/letsencrypt/live/mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mydomain.com/privkey.pem;
    ssl_dhparam /etc/ssl/dhparam.pem;
    listen 443 ssl http2;
    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains";
    ssl_protocols TLSv1.2;
    ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
    ssl_prefer_server_ciphers on;

    proxy_buffer_size 128k;
    proxy_buffers 4 256k;
    proxy_busy_buffers_size 256k;

    location / {
        proxy_pass http://192.168.1.2:8093;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Server $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
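One caveat before the next block: the $connection_upgrade variable used for the sfu WebSocket proxying below is not an Nginx built-in. It is conventionally defined once at the http level (e.g. in nginx.conf or a conf.d include) with a map like this:

map $http_upgrade $connection_upgrade {
    # pass the client's Upgrade header through; close the connection if absent
    default upgrade;
    ''      close;
}

Without it, the LiveKit signalling WebSocket will fail to upgrade.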
server {
    server_name sfu.mydomain.com;

    ssl_certificate /etc/letsencrypt/live/mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mydomain.com/privkey.pem;
    ssl_dhparam /etc/ssl/dhparam.pem;
    listen 443 ssl http2;
    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains";
    ssl_protocols TLSv1.2;
    ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
    ssl_prefer_server_ciphers on;

    proxy_buffer_size 128k;
    proxy_buffers 4 256k;
    proxy_busy_buffers_size 256k;

    gzip_min_length 1k;
    gzip_buffers 4 16k;
    gzip_comp_level 2;
    gzip_types text/plain application/javascript application/x-javascript text/css application/xml text/javascript application/x-httpd-php image/jpeg image/gif image/png application/wasm;
    gzip_vary off;
    gzip_disable "MSIE [1-6]\.";

    error_page 405 =200 $uri;
    default_type application/wasm;

    location / {
        proxy_pass http://192.168.1.2:7880;
        # WebSocket upgrade for the LiveKit signalling connection
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Server $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
server {
    server_name sfu-jwt.mydomain.com;

    ssl_certificate /etc/letsencrypt/live/mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mydomain.com/privkey.pem;
    ssl_dhparam /etc/ssl/dhparam.pem;
    listen 443 ssl http2;
    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains";
    ssl_protocols TLSv1.2;
    ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
    ssl_prefer_server_ciphers on;

    proxy_buffer_size 128k;
    proxy_buffers 4 256k;
    proxy_busy_buffers_size 256k;

    location / {
        proxy_pass http://192.168.1.2:8881;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Server $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
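After reloading Nginx, a minimal smoke test from outside (a sketch; it assumes the hostnames above already resolve and the certificates are in place):

$ sudo nginx -t && sudo nginx -s reload
$ curl -sI https://call.mydomain.com | head -n 1   # expect an HTTP 200 from Element Call
$ curl -s https://sfu.mydomain.com                 # LiveKit typically answers a plain GET with "OK"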
Docker setup
Element call
Use the following docker compose:
$ cat docker-compose.yaml
networks:
  lkbackend:

services:
  element-call:
    image: ghcr.io/element-hq/element-call:latest
    container_name: element-call
    hostname: element-call
    ports:
      - 8093:8080
    volumes:
      - /home/ubuntu/dockerdata/volumes/elementcall/config.json:/app/config.json
    restart: unless-stopped
    networks:
      - lkbackend

  jwt-service:
    image: ghcr.io/element-hq/lk-jwt-service:latest-ci
    container_name: lk-jwt-service
    hostname: lk-jwt-service
    ports:
      - 8881:8080
    environment:
      # key/secret pair must match the "keys" entry in livekit.yaml
      - LIVEKIT_SECRET=somestrongstring
      - LIVEKIT_URL=wss://sfu.mydomain.com:443
      - LIVEKIT_KEY=devkey
    deploy:
      restart_policy:
        condition: on-failure
    networks:
      - lkbackend

  livekit:
    image: livekit/livekit-server:latest
    command: --dev --config /etc/livekit.yaml
    restart: unless-stopped
    volumes:
      - /home/ubuntu/dockerdata/volumes/elementcall/backend/livekit.yaml:/etc/livekit.yaml
    # host networking so the RTC port range is reachable without per-port mappings
    network_mode: "host"

  redis:
    image: redis:6-alpine
    command: redis-server /etc/redis.conf
    ports:
      - 6379:6379
    volumes:
      - /home/ubuntu/dockerdata/volumes/elementcall/backend/redis.conf:/etc/redis.conf
    networks:
      - lkbackend
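Once the stack is up, a quick smoke test against the published ports. The /sfu/get path is the endpoint Element Call queries on the JWT service for LiveKit credentials; an unauthenticated POST should get an HTTP error back, not a refused connection:

$ docker compose up -d
$ curl -sI http://192.168.1.2:8093 | head -n 1    # Element Call web app
$ curl -s http://192.168.1.2:7880                 # LiveKit; typically replies "OK"
$ curl -s -o /dev/null -w '%{http_code}\n' -X POST http://192.168.1.2:8881/sfu/get   # lk-jwt-service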
Here are the supporting files:
config.json
$ cat /home/ubuntu/dockerdata/volumes/elementcall/config.json
{
  "default_server_config": {
    "m.homeserver": {
      "base_url": "https://matrix.mydomain.com",
      "server_name": "mydomain.com"
    }
  },
  "livekit": {
    "livekit_service_url": "https://sfu-jwt.mydomain.com"
  }
}
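If the volume mount worked, the container serves this file at its web root, so you can verify it end to end through the proxy:

$ curl -s https://call.mydomain.com/config.json   # should echo the JSON above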
livekit.yaml
$ cat /home/ubuntu/dockerdata/volumes/elementcall/backend/livekit.yaml
port: 7880
bind_addresses:
  - "0.0.0.0"
rtc:
  tcp_port: 7881
  port_range_start: 50100
  port_range_end: 50200
  use_external_ip: false
turn:
  enabled: false
  domain: localhost
  cert_file: ""
  key_file: ""
  tls_port: 5349
  udp_port: 443
  external_tls: true
keys:
  devkey: "somestrongstring"
logging:
redis.conf
$ cat /home/ubuntu/dockerdata/volumes/elementcall/backend/redis.conf
bind 0.0.0.0
protected-mode yes
port 6379
timeout 0
tcp-keepalive 300
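To confirm Redis is reachable from the Docker host (redis-cli ships with the redis package, or use the one inside the container):

$ redis-cli -h 192.168.1.2 ping   # should reply PONG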
Synapse change
Set the following in homeserver.yaml:

listeners:
  - port: 8008
    tls: false
    type: http
    x_forwarded: true
    resources:
      - names: [client, federation, openid]
        compress: false

# https://github.com/element-hq/element-call/issues/2005
serve_server_wellknown: true
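With serve_server_wellknown enabled, Synapse answers the server well-known itself; assuming it is reachable at matrix.mydomain.com, you can check that a small JSON document with an m.server key comes back:

$ curl -s https://matrix.mydomain.com/.well-known/matrix/server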
Firewall/Router port forwards
For Livekit
From https://docs.livekit.io/home/self-hosting/ports-firewall/:
- Forward UDP ports 50100-50200 to the Docker host at 192.168.1.2
- Forward TCP port 7881 to the Docker host at 192.168.1.2
For Element Call
- Forward TCP port 443 to the Nginx server
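A rough reachability check from outside the network (TCP only; nc cannot meaningfully probe the UDP range):

$ nc -vz call.mydomain.com 443   # Nginx / Element Call
$ nc -vz sfu.mydomain.com 7881   # LiveKit TCP fallback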
@rajil, thanks. My question was not about setting up Element Call - that's done, no problems - a bit differently from your approach, but I believe it's a matter of taste.
My question is about making the mobile & desktop apps use the self-hosted version, because for now they use call.element.io without any possibility to change that in the settings. I'm sorry if my original post was misleading.
@alexander-potemkin The nightly version of Element X Android already allows changing away from call.element.io. And so does the nightly version of Element Desktop.
@rajil, my hopes were for Element "Classic"... Anyway, thanks for letting me know! If it's not too much to ask, could you please let me know how it's handled? Does it follow .well-known or DNS directives? Or does it rely on the end user to apply the changes manually?
Element Classic is no longer being developed, AFAIK. I had to apply the change manually in the EX client settings (not sure if there is a .well-known for it).
Thank you. My questions remain then:
- it's impractical to expect staff to change settings on their own
- Element X is not there yet in terms of functionality
It seems that .well-known support is on the way
> It seems that .well-known support is on the way
As it's a commit to element-call, that's the only service affected; if they push it to the JS SDK, it might affect the desktop clients, but not the mobile ones. Unless I'm missing something.
Mobile is also supported.
> I pieced together the documentation to get an Element Call server going. I have tested calls between Element X Android <> Element X Android, and Element X Android <> Element Web (nightly).
> Here are my instructions. If you find something that needs to change, do point it out.
I simply want to send you a big THANK YOU! Given the current lack of docs, I'd never have been able to set up EC today without your post. Awesome starting point!
Adding some comments after I did the described setup:
If the Element Call VM is behind a firewall like pfSense and is not directly exposed to the internet, you need to change this line in livekit.yaml to:
use_external_ip: true
With this setting, LiveKit will detect the external IP address via STUN. Having this set to "false" won't work in this scenario.
Furthermore, some .well-known files must be changed/added:
.well-known/matrix/client must be extended with this:
"org.matrix.msc4143.rtc_foci": [
{
"type": "livekit",
"livekit_service_url": "https://sfu-jwt.mydomain.com"
}
]
.well-known/element/element.json must be introduced, so Element X picks up the changed call server:
{"call":{"widget_url":"https://call.mydomain.com"}}
The Element Web config.json on the server that serves Element Web (usually the same machine that runs Synapse) must be extended with these:

"features": {
  "feature_video_rooms": true,
  "feature_new_room_decoration_ui": true,
  "feature_group_calls": true,
  "feature_element_call_video_rooms": true,
  [...]
},
"element_call": {
  "url": "https://call.mydomain.com",
  "participant_limit": 8,
  "brand": "Element Call",
  "use_exclusively": true
},
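To double-check which config Element Web actually loads, fetch it from wherever Element Web is hosted (app.mydomain.com below is just a placeholder for your Element Web host):

$ curl -s https://app.mydomain.com/config.json | jq '.element_call'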
Element Desktop will, for now, not pick up the information from .well-known/element/element.json. I hope they will implement this in the future. To make Element Desktop work with Element Call by default, you need to copy the config.json that you use for Element Web into each desktop installation. The locations are described here.
I found that Element Call is currently using VP8 as the codec. This really drains the battery on Apple devices (iPhone/iPad), which get very hot during calls. VP8 is not a hardware-accelerated codec there, so all the work must be done in software.
I've filed an issue here, but meanwhile you can force LiveKit to use H.264, which solves all these problems:
In /opt/element-call/volumes/elementcall/backend/livekit.yaml add this section:
room:
  enabled_codecs:
    - mime: video/h264
    - mime: audio/opus
This forces LiveKit to use H.264, and everything is back to normal in the Apple world: iPhones no longer get hot, and battery use for a 15-minute video call on an iPhone 14 is down to 3% now.