
Embed video playback

alltheseas opened this issue 1 year ago • 4 comments

user story

As a Damus media enjoyer, I would like to be able to play videos in Damus, so that I do not have to click on a link and view the video in the browser.

acceptance criteria

  1. video links are embedded in the note (e.g. video.mp4 etc.)
  2. user has the option to disable embed auto-loading (see low data mode damus-io/notedeck#195)

alltheseas commented Apr 08 '24 05:04

Popular customer request

alltheseas commented Dec 02 '24 20:12

Here’s a **concise, high-performance, self-contained** plan that covers **HLS (.m3u8) livestreams** and **MP4** across **Android, Linux, macOS, Windows** from a Rust app.

---

# Architecture (Rust core + thin platform adapters)

* **Rust core** defines a small `PlayerAdapter` trait and owns business logic, analytics, retry, etc.
* **Android (must):** **ExoPlayer / AndroidX Media3** behind a tiny **Kotlin + JNI** shim.

  * HW decode via **MediaCodec**, first-class HLS/LL-HLS, captions, ABR.
  * Ship as Gradle deps inside your AAB/APK (no user installs).
* **Desktop (Linux/macOS/Windows):** **GStreamer (Rust bindings)** *or* **libmpv**.

  * Both support MP4 + HLS (TS/fMP4), HW decode, subtitles.
  * Bundle the runtime/libs with your app to avoid system deps.

```rust
pub trait PlayerAdapter {
    fn init(&mut self, surface: NativeSurfaceHandle);
    fn set_source(&mut self, url_or_path: &str, is_live: bool);
    fn play(&mut self);
    fn pause(&mut self);
    fn seek_ms(&mut self, pos: i64);
    fn set_volume(&mut self, vol: f32);
    fn release(&mut self);
}
```
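
A minimal sketch of how app code might drive any adapter through this trait (the `play_url` helper and the `.m3u8` heuristic are illustrative, not part of the plan above):

```rust
/// Hypothetical call site in the Rust core: works against any PlayerAdapter.
fn play_url(player: &mut dyn PlayerAdapter, url: &str) {
    // Crude heuristic: treat HLS playlists as live. A real app would inspect
    // the playlist (#EXT-X-PLAYLIST-TYPE / #EXT-X-ENDLIST) or note metadata.
    let is_live = url.ends_with(".m3u8");
    player.set_source(url, is_live);
    player.play();
}
```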

# Performance notes

* Hardware decoding paths:

  * Android: MediaCodec (ExoPlayer)
  * macOS: VideoToolbox (via GStreamer or mpv)
  * Windows: D3D11VA/DXVA2 (via GStreamer or mpv)
  * Linux: VAAPI/NVDEC (via GStreamer or mpv)
* Low-latency HLS (optional): if your stream publishes `#EXT-X-PART` + blocking playlist reload, set a small live offset (see the sketch after this list):

  * Android/ExoPlayer: `MediaItem.LiveConfiguration` (targetOffsetMs ≈ 1000–2500)
  * GStreamer: tune hlsdemux buffering and playlist reload; keep segments 2–6 s, parts 200–500 ms at the origin/CDN.
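
One way to carry that knob through the shared API (a sketch; `LiveConfig` and its field are assumed names, not an existing type). Each adapter would map it onto its platform's setting, e.g. ExoPlayer's `MediaItem.LiveConfiguration` on Android:

```rust
/// Hypothetical live-tuning hint passed alongside set_source() for live URLs.
#[derive(Debug, Clone, Copy)]
pub struct LiveConfig {
    /// Desired distance from the live edge, in milliseconds.
    /// ~1000–2500 ms is a reasonable LL-HLS range per the notes above.
    pub target_offset_ms: u32,
}

impl Default for LiveConfig {
    fn default() -> Self {
        Self { target_offset_ms: 1500 }
    }
}
```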

# Packaging (self-contained, no user installs)

* Android: include Media3/ExoPlayer in Gradle; done.
* Linux: ship GStreamer/libmpv `.so` files + needed plugins in AppImage/Flatpak.
* macOS: place `.dylib`/frameworks inside your `.app` bundle; codesign.
* Windows: include `.dll`s in your installer/MSIX; add to app-local PATH.
* Include only needed codecs/plugins to keep size lean (H.264/AAC, MP4, HLS demux).

# Rendering

* Android: pass a Surface/SurfaceView/TextureView from your Rust UI (e.g., winit + ndk-glue) to the Kotlin adapter.
* Desktop:

  * GStreamer: glimagesink to your window, or appsink → upload to a wgpu texture (sketched below).
  * libmpv: use the render API (successor to the old opengl-cb) to draw into your GL/Vulkan surface.
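
A minimal sketch of the appsink route (assumes the appsink's caps were set to tightly-packed RGBA; the wgpu upload itself is left to the renderer):

```rust
use gstreamer as gst;
use gstreamer_app as gst_app;

/// Register a callback that fires for every decoded frame.
/// This is the simple CPU-copy path; zero-copy needs platform surface sharing.
fn attach_frame_callback(appsink: &gst_app::AppSink) {
    appsink.set_callbacks(
        gst_app::AppSinkCallbacks::builder()
            .new_sample(|sink| {
                let sample = sink.pull_sample().map_err(|_| gst::FlowError::Eos)?;
                let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
                let map = buffer.map_readable().map_err(|_| gst::FlowError::Error)?;
                // RGBA bytes for one frame; hand them to the render thread
                // (e.g. via a channel) for upload into a wgpu texture.
                let _rgba: &[u8] = map.as_slice();
                Ok(gst::FlowSuccess::Ok)
            })
            .build(),
    );
}
```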

# Error handling & telemetry

* Unify player events (state, errors, buffering %, dropped frames, bitrate) behind your trait; a sketch of such a schema follows this list.
* Android: expose ExoPlayer analytics; Desktop: map GStreamer/mpv messages to the same schema.
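
A sketch of what that unified schema could look like (assumed names; the real set of variants would follow whatever your UI and telemetry need):

```rust
/// Events emitted by every adapter, whatever the backend.
#[derive(Debug, Clone)]
pub enum PlayerEvent {
    StateChanged(PlaybackState),
    Buffering { percent: u8 },
    BitrateChanged { bits_per_sec: u64 },
    DroppedFrames { count: u64 },
    Error { message: String },
}

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum PlaybackState {
    Idle,
    Buffering,
    Playing,
    Paused,
    Ended,
}
```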

# What you get

* Best-in-class Android performance (ExoPlayer/MediaCodec).
* Portable desktop playback with HW acceleration (GStreamer or mpv).
* Single Rust API your app calls everywhere.
* Zero extra installs for users: you bundle all media bits.
* Supports both HLS (.m3u8) livestreams and MP4 VOD out of the box.

If you want, I can drop in a tiny repo layout (Cargo + Gradle + CMake for JNI) showing the trait, the Kotlin wrapper, and a GStreamer adapter skeleton.

alltheseas commented Oct 16 '25 17:10

Below is a minimal, self-contained repo scaffold for a Rust app that plays HLS (.m3u8) and MP4 across Android, Linux, macOS, and Windows. It uses a Rust core with a PlayerAdapter trait, plus:

* Android: Kotlin + ExoPlayer (Media3) via a tiny JNI bridge to a Rust cdylib
* Desktop: GStreamer (Rust bindings)

## 0) Repo layout

```
rust-video-cross/
├─ Cargo.toml                         # Workspace
├─ README.md
├─ player-core/                       # Rust core (trait & shared types)
│  ├─ Cargo.toml
│  └─ src/lib.rs
├─ adapter-desktop-gst/               # Desktop adapter (GStreamer)
│  ├─ Cargo.toml
│  └─ src/main.rs                     # demo binary for desktop
├─ adapter-android/                   # Android native glue (Rust + JNI)
│  ├─ Cargo.toml
│  ├─ src/lib.rs                      # cdylib exposing JNI fns
│  └─ build.rs (optional)
└─ android/                           # Android app (Gradle)
   ├─ app/
   │  ├─ build.gradle
   │  ├─ src/main/AndroidManifest.xml
   │  ├─ src/main/java/com/example/rustplayer/
   │  │  ├─ RustPlayer.kt             # ExoPlayer wrapper
   │  │  └─ NativeBridge.kt           # JNI calls into Rust
   │  └─ src/main/cpp/CMakeLists.txt  # optional (not needed if Rust-only)
   ├─ build.gradle
   └─ settings.gradle
```
## 1) Workspace Cargo.toml

```toml
[workspace]
members = [
  "player-core",
  "adapter-desktop-gst",
  "adapter-android",
]
resolver = "2"
```
## 2) player-core (Rust core trait)

player-core/Cargo.toml

```toml
[package]
name = "player-core"
version = "0.1.0"
edition = "2021"

[lib]
name = "player_core"
path = "src/lib.rs"
```

player-core/src/lib.rs

```rust
#[derive(Debug, Clone, Copy)]
pub struct NativeSurfaceHandle(pub usize); // platform-specific handle

pub trait PlayerAdapter {
    fn init(&mut self, surface: Option<NativeSurfaceHandle>);
    fn set_source(&mut self, url_or_path: &str, is_live: bool);
    fn play(&mut self);
    fn pause(&mut self);
    fn seek_ms(&mut self, pos: i64);
    fn set_volume(&mut self, vol: f32);
    fn release(&mut self);
}

// Common helpers/types can live here as well (errors, events, etc.)
```
## 3) Desktop adapter (GStreamer)

adapter-desktop-gst/Cargo.toml

```toml
[package]
name = "adapter-desktop-gst"
version = "0.1.0"
edition = "2021"

[dependencies]
anyhow = "1"
ctrlc = "3"
glib = "0.20"
gstreamer = "0.23"
gstreamer-player = "0.23"
player-core = { path = "../player-core" }
```

adapter-desktop-gst/src/main.rs (demo player)

```rust
use anyhow::Context;
use gstreamer as gst;
use gstreamer_player as gst_player;

use gst::prelude::*;
use player_core::{NativeSurfaceHandle, PlayerAdapter};

struct GstPlayer {
    player: gst_player::Player,
}

impl GstPlayer {
    fn new() -> anyhow::Result<Self> {
        gst::init()?;
        // No explicit video renderer/signal dispatcher: gst_player picks a
        // default video sink and dispatches signals on its own thread.
        let player = gst_player::Player::new(
            None::<gst_player::PlayerVideoRenderer>,
            None::<gst_player::PlayerSignalDispatcher>,
        );
        Ok(Self { player })
    }
}

impl PlayerAdapter for GstPlayer {
    fn init(&mut self, _surface: Option<NativeSurfaceHandle>) {}
    fn set_source(&mut self, url_or_path: &str, _is_live: bool) {
        // gst_player expects a URI (use file:// for local files);
        // play() below takes care of preparing the pipeline.
        self.player.set_uri(Some(url_or_path));
    }
    fn play(&mut self) { self.player.play(); }
    fn pause(&mut self) { self.player.pause(); }
    fn seek_ms(&mut self, pos: i64) {
        self.player.seek(gst::ClockTime::from_mseconds(pos as u64));
    }
    fn set_volume(&mut self, vol: f32) { self.player.set_volume(vol as f64); }
    fn release(&mut self) { self.player.stop(); }
}

fn main() -> anyhow::Result<()> {
    let uri = std::env::args().nth(1).context("usage: cargo run -- <uri>")?;
    let mut p = GstPlayer::new()?;
    p.init(None);
    p.set_source(&uri, uri.ends_with(".m3u8"));
    p.play();

    // Block on a GLib main loop until Ctrl-C.
    let main = glib::MainLoop::new(None, false);
    ctrlc::set_handler({ let m = main.clone(); move || m.quit() }).ok();
    main.run();
    Ok(())
}
```

Desktop deps: install GStreamer runtime on dev boxes (e.g., Linux: gstreamer1.0-plugins-base|good|bad + gstreamer1.0-libav). For distribution, bundle only needed plugins/dylibs with your app.

## 4) Android adapter (Rust + JNI + Kotlin ExoPlayer)

### 4.1 Rust cdylib exposing JNI

adapter-android/Cargo.toml

```toml
[package]
name = "adapter-android"
version = "0.1.0"
edition = "2021"

[lib]
crate-type = ["cdylib"]

[dependencies]
jni = "0.21"
player-core = { path = "../player-core" }
```

adapter-android/src/lib.rs

```rust
use jni::objects::{JClass, JObject, JString};
use jni::sys::jboolean;
use jni::JNIEnv;
use player_core::NativeSurfaceHandle;

#[no_mangle]
pub extern "system" fn Java_com_example_rustplayer_NativeBridge_init(
    _env: JNIEnv,
    _cls: JClass,
    surface: JObject,
) {
    // Convert Java Surface -> native handle if you need to track it in Rust.
    // Many apps keep playback fully in Kotlin (ExoPlayer) and use Rust for
    // control/state only.
    let _handle = if surface.is_null() { None } else { Some(NativeSurfaceHandle(0)) };
}

#[no_mangle]
pub extern "system" fn Java_com_example_rustplayer_NativeBridge_setSource(
    mut env: JNIEnv,
    _cls: JClass,
    url: JString,
    is_live: jboolean,
) {
    // jni 0.21 needs a mutable JNIEnv for string access, hence `mut env`.
    let _url: String = env.get_string(&url).unwrap().into();
    let _is_live = is_live != 0;
    // In the simplest design, forward to Kotlin via method calls or keep logic in Kotlin.
}

#[no_mangle]
pub extern "system" fn Java_com_example_rustplayer_NativeBridge_release(
    _env: JNIEnv,
    _cls: JClass,
) {
}
```

Note: This shows the JNI surface. A common pattern is to put all playback on the Kotlin side (ExoPlayer) and use Rust to store app state, analytics, and high-level commands (optionally calling back Kotlin via JNI if needed). If you prefer Rust-driven control, you can retain a global handle to a Kotlin RustPlayer instance via global refs.
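
A sketch of that global-ref pattern (assumed naming; `PLAYER_REF` and `retain_player` are illustrative, and error handling is minimal):

```rust
use std::sync::OnceLock;

use jni::objects::{GlobalRef, JObject};
use jni::JNIEnv;

/// Process-wide handle to the Kotlin RustPlayer instance.
/// GlobalRef keeps the object alive across JNI calls and threads.
static PLAYER_REF: OnceLock<GlobalRef> = OnceLock::new();

/// Call once from an init entry point with the Kotlin player object.
fn retain_player(env: &mut JNIEnv, player: JObject) -> jni::errors::Result<()> {
    let global = env.new_global_ref(player)?;
    let _ = PLAYER_REF.set(global);
    Ok(())
}
```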

### 4.2 Android Gradle project

android/settings.gradle

```groovy
pluginManagement {
  repositories { gradlePluginPortal(); google(); mavenCentral() }
}
dependencyResolutionManagement {
  repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
  repositories { google(); mavenCentral() }
}
rootProject.name = "RustVideoCross"
include(":app")
```

android/build.gradle

```groovy
buildscript {
  dependencies { classpath "com.android.tools.build:gradle:8.5.0" }
}
```

android/app/build.gradle

```groovy
plugins {
  id "com.android.application"
  id "org.jetbrains.kotlin.android" version "1.9.24"
}

android {
  namespace "com.example.rustplayer"
  compileSdk 34
  defaultConfig {
    applicationId "com.example.rustplayer"
    minSdk 24
    targetSdk 34
    versionCode 1
    versionName "0.1"
    ndk { abiFilters "arm64-v8a", "armeabi-v7a" }
  }
  buildTypes { release { minifyEnabled false } }
}

dependencies {
  def media3 = "1.4.1"
  implementation "androidx.media3:media3-exoplayer:$media3"
  implementation "androidx.media3:media3-exoplayer-hls:$media3"
  implementation "androidx.media3:media3-ui:$media3"
  implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:1.8.1"
}
```

android/app/src/main/AndroidManifest.xml

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <application android:label="RustVideoCross" android:allowBackup="true">
    <activity android:name=".MainActivity" android:exported="true">
      <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
      </intent-filter>
    </activity>
  </application>
</manifest>
```

android/app/src/main/java/com/example/rustplayer/NativeBridge.kt

```kotlin
package com.example.rustplayer

object NativeBridge {
  init { System.loadLibrary("adapter_android") } // libadapter_android.so
  @JvmStatic external fun init(surface: android.view.Surface?)
  @JvmStatic external fun setSource(url: String, isLive: Boolean)
  @JvmStatic external fun release()
}
```

android/app/src/main/java/com/example/rustplayer/RustPlayer.kt

```kotlin
package com.example.rustplayer

import android.content.Context
import android.view.Surface
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

class RustPlayer(private val context: Context) {
  private var player: ExoPlayer? = null

  fun init(surface: Surface?) {
    player = ExoPlayer.Builder(context).build().apply {
      setVideoSurface(surface)
    }
  }

  fun setSource(url: String, isLive: Boolean) {
    val item = MediaItem.Builder().setUri(url).build()
    player?.setMediaItem(item)
    player?.prepare()
    player?.playWhenReady = true
  }

  fun play() { player?.play() }
  fun pause() { player?.pause() }
  fun seekMs(ms: Long) { player?.seekTo(ms) }
  fun setVolume(v: Float) { player?.volume = v }
  fun release() { player?.release(); player = null }
}
```

MainActivity (not shown) should create a SurfaceView/PlayerView, call RustPlayer.init(surface), then setSource(url, isLive).

## 5) Build & Run

### Desktop (Linux/macOS/Windows)

```sh
# Install the GStreamer runtime on the dev machine first (varies by OS)
cargo run -p adapter-desktop-gst -- "https://example.com/stream.m3u8"
```

### Android

```sh
# Build the Rust cdylib for Android (arm64):
cargo install cargo-ndk
cargo ndk -t arm64-v8a -o android/app/src/main/jniLibs build -p adapter-android --release

# Then build the Android app:
cd android && ./gradlew :app:installDebug
```
## 6) Packaging notes (self-contained)

* Android: the ExoPlayer artifacts are in your APK; the Rust .so is inside jniLibs/.
* Desktop: bundle only the GStreamer libs/plugins you actually need (H.264/AAC, MP4, HLS) into your installer/AppImage/.app.
* Keep ABR/LL-HLS knobs on the origin/CDN (segments 2–6 s; parts 200–500 ms) and set the live offset in the platform adapter.

This scaffold gives you:

* A single Rust trait API you can call everywhere
* Best-in-class Android perf via ExoPlayer/MediaCodec
* Portable desktop playback via GStreamer
* No extra installs for users (you bundle the bits)

Want me to add a minimal MainActivity (with a PlayerView) and a sample LL-HLS URL toggle?

alltheseas commented Oct 16 '25 17:10

MP4 video playback

GStreamer vs libmpv

- Licensing & distro fit
  - GStreamer is LGPL (core, base, and the -good plugin set are fine for commercial redistribution).
  - libmpv is GPLv2+ in its default build; shipping it means the whole binary must comply, which may be a blocker unless the app stays GPL-compatible.
- Integration model
  - GStreamer lets you wire pipelines (e.g., souphttpsrc → qtdemux → …) and hook appsink for raw RGBA upload into wgpu: maximum control, but more plumbing (see the sketch after this list).
  - libmpv exposes a higher-level player; you talk to it via the render API, and it pushes frames to an OpenGL/Vulkan/WGPU backend with minimal code, but you live with mpv's playback model.
- Dependency footprint
  - GStreamer means bundling a plugin set tailored per platform; size is larger, but you can cherry-pick codecs and HTTP modules.
  - libmpv pulls in FFmpeg + mpv libs; the overall bundle is typically smaller and easier to ship, but the codec set is "all or nothing".
- Feature surface & extensibility
  - GStreamer excels when you need custom filters, analytics taps, subtitle muxing, alternate audio tracks, or future non-MP4 formats: pipelines are composable.
  - libmpv already implements a deep playback feature set (quality selection, AB-loop, stats, scripts) but is harder to bend beyond what mpv exposes.
- Hardware acceleration & platform maturity
  - Both use platform decoders (VAAPI/V4L2, NVDEC, VideoToolbox, D3D11VA). GStreamer's backends are mature and configurable per element; you choose the fallbacks.
  - libmpv piggybacks on FFmpeg's hwaccel and mpv heuristics: fewer knobs, but it often "just works", assuming GPU drivers cooperate.
- Maintenance overhead
  - GStreamer requires more in-house knowledge (caps negotiation, the pipeline state machine, plugin updates). Strong community, but the documentation is broad.
- UI/UX hooks
  - With GStreamer you own the UI: overlays, custom controls, and telemetry can be integrated directly in Rust/egui.
  - libmpv handles timing and sync internally; you overlay controls on top, but fine-grained UI states (buffering %, stream stats) require querying mpv properties.
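
To make the "wire pipelines" point concrete, here is a minimal sketch using the gstreamer-rs textual pipeline API (element names are standard plugins; error handling and the appsink/wgpu hookup are elided):

```rust
use gstreamer as gst;

/// Build a hand-wired MP4-over-HTTP playback pipeline.
/// decodebin auto-plugs the demuxer/decoders (qtdemux, H.264, ...);
/// an explicit souphttpsrc is what you would tune for exotic hosters.
fn build_pipeline(url: &str) -> Result<gst::Element, gst::glib::Error> {
    let desc = format!(
        "souphttpsrc location={url} ! decodebin ! videoconvert ! autovideosink"
    );
    // Caller sets the returned pipeline to gst::State::Playing to start.
    gst::parse::launch(&desc)
}
```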

Performance-wise they're very close: the bottleneck is whether you hit hardware decode and keep frame copies off the CPU. The deciding factors look like this:

- Decode path: both wrap the same platform codecs (VideoToolbox, NVDEC/D3D11VA/VAAPI). If you enable those, decode throughput is essentially identical.
- Frame delivery into egui/wgpu: this is where the difference shows up.
  - With GStreamer, the simplest Rust integration pulls RGBA frames through appsink, so you end up copying into CPU memory and uploading to a texture each frame. Fine for 720p, but 1080p60+ starts to hurt. You can avoid that by wiring zero-copy (dmabuf/VAAPI surfaces on Linux, CVPixelBuffer on macOS, DX11 textures on Windows), but that's extra plumbing.
  - libmpv exposes its render API and happily drives GL/Vulkan/ANGLE surfaces itself; you just share a texture handle. That keeps frame copies off the CPU out of the box, so peak throughput tends to be better without extra engineering.
- Startup/buffering: both do HTTP range requests and buffering well. GStreamer gives more knobs if you need to tune souphttpsrc for exotic hosters; libmpv uses FFmpeg/mpv heuristics that are already optimized for progressive MP4.
- Telemetry/controls: GStreamer lets you tap pipeline stats anywhere (useful for QoS metrics). libmpv gives you mpv's stats via properties; less granular, but good enough for diagnosing drops.

If you're willing to invest in zero-copy surfaces with GStreamer, the two are effectively tied. Without that work, libmpv's render API usually wins on raw playback smoothness because it avoids the extra CPU upload; the sketch below shows what that per-frame upload costs in code.
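
For reference, the per-frame CPU upload being discussed looks roughly like this on the wgpu side (a sketch; texture creation and the frame source are assumed, and API names follow wgpu ~0.19):

```rust
/// Upload one decoded RGBA frame into an existing wgpu texture.
/// This memcpy-per-frame is exactly the cost zero-copy paths avoid.
fn upload_frame(
    queue: &wgpu::Queue,
    texture: &wgpu::Texture,
    rgba: &[u8],
    width: u32,
    height: u32,
) {
    queue.write_texture(
        wgpu::ImageCopyTexture {
            texture,
            mip_level: 0,
            origin: wgpu::Origin3d::ZERO,
            aspect: wgpu::TextureAspect::All,
        },
        rgba,
        wgpu::ImageDataLayout {
            offset: 0,
            bytes_per_row: Some(4 * width), // tightly packed RGBA8
            rows_per_image: Some(height),
        },
        wgpu::Extent3d { width, height, depth_or_array_layers: 1 },
    );
}
```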

alltheseas commented Oct 17 '25 05:10